Comment Re:Publicly funded research is broken (Score 2) 64
s'okay. Biology is the only scientific discipline where division and multiplication are the same thing.
They are continuing to follow fads and introduce new "features" that nobody wants, and that can't easily be uninstalled
They need to stop adding this crap, and provide easy ways to uninstall it
They need to use every tool they have, including AI, to find bugs and security weaknesses
If something is good, people will choose it voluntarily and even pay for it
If something is installed by default and can't be removed, it's likely not good
We need a reliable OS
The USPS was designed for letters. Now letter volume is way down and package volume is increasing rapidly. Small post offices that were designed for letters are overwhelmed with packages. Delivery vehicles are small. To properly handle high package volume, a lot of facilities would need to be upgraded or replaced
Does it work?
Kids are smart and really good at finding workarounds
The likely outcome is political theater, where politicians claim success while kids get creative
The other likely outcome is annoyance and failure when the tech goes wrong
uh, no. You didn't win.
Places like Bell Labs were more like university research centers than corporate departments running on a mandatory-overtime grind. They were not expected to turn a profit as business units of the company, because their job was to lay the groundwork for technology that the other business units could then adapt into products. The return on the investment paid into running them took years or even decades to realize. Without the pressure to turn quarterly or even annual profits, they weren't working their researchers to the bone, and they fostered a culture of internships that brought college students into their ranks as researchers, perpetuating the institutional knowledge.
Instead of using AI to "increase productivity" by quickly generating bloated, inefficient, bug-ridden, insecure slop, the better use of the tools is to find bugs, security weaknesses and unhandled edge conditions. AI research should focus on creating better code: bug-free, efficient and secure, with all edge cases handled.
*ow!*
uh, found it...
Not only have I seen that, but I have experienced it.
My socket set and ratchet aren't trying to convince me to be in a relationship with them, to be in love with them, to be something of an equal to them.
Even our pets, living beings capable of expressing themselves, are not able to communicate at our level.
Large language model AI is attempting to spoof being human, to mimic being us. There are already examples of people becoming very, VERY upset when their AI-boyfriend or AI-girlfriend is taken away by companies revising the AI's standards and interaction rules. This is unhealthy. The relationship needs to remain that of tool user and tool, because anything more than that is one-sided and subject to terrible abuse by anyone who manages to co-opt that system.
Immature tech is immature
AI tech is making real, rapid and exciting progress, but is still immature.
The hypemongers make outrageous fantasy claims.
The tech sometimes works great, sometimes mediocre, and sometimes fails catastrophically.
Anyone who believes that the tech is perfected deserves what they get.
WTF?
This is one of the stupidest things I've seen
How stupid can stupid get before it all collapses into a black hole of stupidity?
I support AI research and believe that it will help us solve previously intractable problems in science, engineering, medicine and maybe even economics.
I also believe that OpenAI has made a series of tactical errors that are turning the public against AI.
While other labs kept the tech in the lab, OpenAI released it to the general public. This resulted in excitement and a few fun things, but also a tremendous amount of slop and scams. Meanwhile, pundits and hypemongers constantly and publicly claimed that AI will replace all jobs. When all the public sees is slop, scams and fear of job loss, the angry response is not surprising.
Then, OpenAI tried to continue their push into pop culture by introducing "adult mode". This waste of electricity serves no useful purpose, and it only amplified OpenAI's pop-culture positioning.
Now, it appears that they realize they may have made a bad choice, and are dropping adult mode. This is a good step, but they need to go further.
The proper use of AI is as a tool, not as a friend, lover or therapist, and especially not as an addiction
The Butlerian Jihad cannot come soon enough. Machines should be useful tools serving us, where our emotional connections to them are still based on tool-user and tool.
I've been a Reddit user for many years. Even before the release of ChatGPT, I knew that I was training a robot.
I liked this idea, because I believed my posts had value and that the robot's training would be better if it included them.
The music labels know that they are unnecessary and headed for the dustbin of history, so they tried to extort some money from rich companies.
Music labels are evil and deserve to die
This is a terrible way to look at new and exciting tech
The better strategy is for people to evaluate the new tools and determine which ones work for them and which ones don't
Someday somebody has got to decide whether the typewriter is the machine, or the person who operates it.