Because we just don't have *enough* languages.
Let's throw this one out and watch it sink too.
You're putting a charge into a highway. The cars will be able to use a tiny fraction of that. The rest will dissipate as heat. I'd love to see the efficiency studies on this one.
...is still AOK, completely unregulated and tacitly encouraged!
To kick their own chip development efforts into high gear. Which they will now do.
The fact of the matter is that, regardless of consequences, AI cannot and will not be a completely controlled thing.
The USA is a country that can't stop gun violence, can't stop drug use, and can't enforce reasonable antitrust laws. While they can hire experts to make recommendations, few in government are technically savvy enough to grasp the full implications of ever-improving AI over the next decade or two.
In the end, when AI has stealthily replaced most governmental functions and officials start realizing, more and more, that they are just figureheads while AI makes decisions behind the scenes, there may be some faltering, ineffective pushback.
It won't make any difference.
As usual, anti-porn activists should mind their own fucking business.
Like a LOT more than on-premises hosting.
Capex vs Opex arguments only appeal to fools, MBAs and other types who have no idea how a company actually runs.
And this is why we only deal with offshore sites.
Remember kids, the writ of the USA is not universal. With a VPN, you can essentially tell the moralistic whingers to fuck right the hell off.
Unless they stop being a bunch of cowardly little pussies, incorporate and move ops offshore and continue on their merry way.
But they probably won't. They'll keep their noses firmly in the asses of the moralists in the USA's government.
And go bankrupt
Do you really think foreign governments, three letter security agencies or any military organizations anywhere are going to pay the slightest attention to some law made in the USA?
If so, I have a bridge to sell you.
As someone whose degree is in psychology, I'll say "yes" to this one.
Feel free to add a few verifiable facts or a semblance of reasoning to that statement.
> Just because Xanax helps your anxiety doesn't mean your anxiety is caused by "chemical imbalance",
Uh, you're completely wrong. Sometimes. The problem with anxiety disorders is that there is no "root cause." Everyday things that most people ignore cause huge amounts of anxiety in some people because their neurological biasing is just faulty. No amount of "talk therapy" is going to fix that. It simply can't be rationalized away.
As an amusing example, I suggest you find your local meth addict going through a paranoid delusional episode and try to "talk" them out of it. The effects of neurological biasing will become quite obvious. You can try it with drunks too. Equally ineffective.
Yes, these are extreme examples and the biasing agents are external. Know what? THAT DOESN'T MATTER. Neurochemistry doesn't care about the source.
They know that the country with the most useful AI will win. Period. Full stop.
AI is a race we dare not lose. We can't stop. Or even slow down.
So here's the thing though. ChatGPT and other LLMs mimic the part of our cognition that's best described as "learning by rote." Humans do this with years of play. LLMs do this by being trained on text. In each case, neural nets are set up to make rapid jumps along the highest-weighted probability path, with some randomness and minimal processing thrown in. It's the most computationally cheap method for doing most of what humans do (walking, seeing, talking) and what ChatGPT does (talking). Most of what humans consider "conscious intelligence" exists to train the parts of your brain that are automatic (i.e., like ChatGPT).
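That "highest-weighted probability path with some randomness" can be sketched in a few lines. This is an illustrative toy, not any real model's API: temperature-scaled softmax over next-token scores, then a weighted random draw, which is the basic shape of how LLM decoders pick tokens.

```python
import math
import random

def sample_next_token(logits, temperature=0.8):
    """Pick a token index: mostly the highest-weighted path, with some randomness.

    Lower temperature pushes the choice toward pure argmax; higher
    temperature flattens the distribution and adds more randomness.
    """
    # Softmax with temperature (max-subtraction for numerical stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Weighted random choice along the probability distribution.
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy "vocabulary" of 3 tokens where token 0 is the dominant continuation.
logits = [5.0, 1.0, 0.5]
picks = [sample_next_token(logits) for _ in range(1000)]
print(picks.count(0))  # token 0 wins the vast majority of draws, but not all
```

The point of the sketch: nothing in that loop checks whether the chosen token is *true*, only whether it is probable, which is the cheap-but-automatic behavior described above.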
The computationally expensive part is what LLMs do not do: verifying facts against sensory data, rule-based processing, accessing and checking curated, accurate data, internally modeling real-world rules with self-correction, and most importantly, having a top-level neural-net layer that coordinates all of these things. Generally we call this "thinking."
The hard parts haven't been done yet, but they will be, and soon. So, right now LLMs are not AGI, but we'll fill in the missing pieces soon enough as the picture of what intelligence is and isn't becomes clearer.