Comment Re: What a win for xAI (Score 1) 51
I was actually in college in the 1990s, but yes, a middle schooler today with Python on a Raspberry Pi and a fairly simple GPS module could do this.
I didn't say it wasn't abhorrent or alarming. I'm presenting the scenario that this task of "defend this three dimensional coordinate box" doesn't require AI.
Yes, it did. The beacon signals weren't that good back then, and neither were the sensors. I had the same problem in the fake robot battles I was involved in.
The answer turned out to be a solution not from Defense industries, but from Genie Garage Door Openers.
The robot doesn't care. The robot's job isn't foreign policy. The robot's job is "here's a box defined by this coordinate cloud; defend it."
Like I said, I programmed it for a fighting robot back in the 1990s. It ain't that complex, and with today's drone factory ships, the Navy can now output this level of AI in killbots at a rate of 10,000 a day.
> You can't fix it by "not letting stupid people breed", you have to fix it through not letting people become stupid
This sentence seems somewhat self-contradictory. Despite decades of trying to make it otherwise, intelligence seems to remain primarily inherited from parents/ancestors.
Socioeconomic status, education, opportunities, etc. have shown no ability to improve IQ. Nutrition only matters in the sense of malnutrition. So environmental factors can reduce IQ, but they can't do anything to raise it.
Attempting to "fix" it, which we have been doing in the first world for a while now, seems to be causing average intelligence to drop precipitously: the peak average IQ in most nations now lies 30-70 years in the past. I think we just have to give up on the idea that this is something to "fix" per se, and let people make their own choices as individuals.
Kill decisions are simple in comparison: Stay within your predefined geofence, kill anything that moves that isn't transmitting Friend beacon. We don't need AI for that, I coded a form of it in both Basic and Forth back in the 1990s.
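A minimal Python sketch of the rule described above: stay inside the geofence, treat anything without a valid friend beacon as a target. Every name, coordinate, and code here is hypothetical and purely illustrative; the rolling-code friend check is modeled on the garage-door-opener idea mentioned earlier, not on any real system.

```python
# Illustrative sketch only -- the geofence bounds and beacon codes are
# made up, and "track" is assumed to be a dict from some sensor feed.
GEOFENCE = {"lat": (40.0, 40.01), "lon": (-75.01, -75.0), "alt": (0.0, 120.0)}

def inside_geofence(lat, lon, alt):
    """True if a position falls inside the predefined 3D coordinate box."""
    return (GEOFENCE["lat"][0] <= lat <= GEOFENCE["lat"][1]
            and GEOFENCE["lon"][0] <= lon <= GEOFENCE["lon"][1]
            and GEOFENCE["alt"][0] <= alt <= GEOFENCE["alt"][1])

def is_friend(beacon_code, friend_codes):
    """Garage-door-opener style check: friend only if the transmitted
    code is in the currently accepted set."""
    return beacon_code in friend_codes

def classify(track, friend_codes):
    """Return 'ignore', 'friend', or 'engage' for one sensor track."""
    if not inside_geofence(track["lat"], track["lon"], track["alt"]):
        return "ignore"   # outside the box: not our problem
    if is_friend(track.get("beacon"), friend_codes):
        return "friend"   # transmitting a valid friend beacon
    return "engage"       # inside the box, no valid beacon
```

No learning, no inference, no "AI": three comparisons and a set lookup, which is why a 1990s microcontroller (or Basic, or Forth) could run it.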
And if they don't, some other startup will.
> The most important part which is wrong however is the idea that people who never contributed much would easily find economically equivalent work.
There is a name for that opinion: luddism.
Every single new technology shifts jobs and work, every single time people fantasize about permanent structural unemployment, and every single time people simply move into new types of work; there is no such thing as structural unemployment. Luddites back then could never imagine a world where less than 95% of people worked in agriculture, and today's neo-Luddites can't imagine a world without hordes of graphic artists, paralegal functionaries, music techs, and such, but it's coming regardless.
Fundamentally, what people are willing to pay for always comes down to work done by others. Things that are automated, at best, shape how people work and which work attracts more or less pay. Just as a shovel helps you dig, a DNN helps you slop out boilerplate text with errors, so you don't need to pay as many people to do it.
These two fundamentals never change when a new tool or technology is used:
* Some people get more productive to some degree using the new tools.
* That frees up labor to do new things, and the economy as a whole grows, because the new things have value.
It's possible there will be a growing field of content curation; DNNs and LLMs basically explode when their outputs are fed back in as their inputs. So instead of making tons of derivative works de novo, many creative types will instead become art critics, helping to curate an input set to train the machines that synthesize generated works.
Paying taxes is also negative growth, so that part doesn't even matter.
The reason "AI" causes zero gain is that it's not "AI": it's not people, it's not independent economic actors with agency.
The name "AI" is nothing but a marketing gimmick, a lie.
What we have are just tools, helpers. And like any tool, the best they can be is productivity enhancers for people. But what these tools excel at is mostly economically unimportant work: shoddy art, boilerplate text, remixed music. So they make something with nearly no impact require fewer people.
Even if every single B-grade graphic artist, musician, and every contract and legal functionary gets put out of work by these babble generators, no real economic impact would be felt, because those fields never contributed much, so the people freed from such work would quickly find economically equivalent work of some kind.
If there is to be any gain from the whole overinvestment bubble, it will be improved search engine answers. Like today, they will still be questionable, full of random errors, and politically slanted by the local jurisdiction. But they will be slightly better for what they are. Just like with search engines, people who are adept at prompting queries and filtering through the results will be slightly more productive than people who are not.
But it will take an awful long time for their meagre gain to measurably exceed the insane overinvestment lost in "AI".
Or time literacy. Permanent != 3 years.
If hardware doesn't run Linux super well, the hardware is garbage. Simple as.
Don't be so sure that Republicans aren't Communists and Democrats aren't Fascists.
And apparently "faster than ever before" rests on a data set less than 4 centuries old, out of 4.5 billion years. It's science reporting like this that makes people distrust science.
And the total number of layoffs of American tech workers in 2022-2025 was in the hundreds of thousands.
"Virtual" means never knowing where your next byte is coming from.