
Comment Re:Computers don't "feel" anything (Score 1) 33

It's different from humans in that human opinions, expertise and intelligence are rooted in their experience. Good or bad, and inconsistent as it is, that's far, far more stable than AI. If you've ever tried to work on a long-running task with generative AI, the crash in performance as the context rots is very, very noticeable, and it's intrinsic to the technology. Work with a human long enough and you will see the faults in his reasoning, sure, but it's just as good or bad as it was at the beginning.

Comment Re:What is a 'token'? (Score 2) 21

A "token" is a substring. They're usually parts of words or whole short words.

"Processing" "tokens" is fundamentally what an LLM does.

Simplified: it takes input text, tokenizes it (splits it up according to the same rules as the corpus), maps that to a huge sparse network of vectors that serve as a lossy representation of the tokenized training corpus, and then plays "pick the next most likely token" to respond.
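To make that concrete, here's a toy sketch in Python. The vocabulary and bigram counts are made up purely for illustration; a real LLM learns a BPE-style vocabulary from its corpus and predicts with a transformer, not a lookup table:

```python
# Toy sketch: tokenize, then "pick the next most likely token".
# VOCAB and BIGRAMS are hypothetical hand-rolled stand-ins for what
# a real model learns from its training corpus.
VOCAB = ["un", "believ", "able", "token", "s", " ", "the", "cat", "sat"]

def tokenize(text):
    """Greedy longest-match split of text into known substrings."""
    tokens = []
    i = 0
    while i < len(text):
        for cand in sorted(VOCAB, key=len, reverse=True):
            if text.startswith(cand, i):
                tokens.append(cand)
                i += len(cand)
                break
        else:
            tokens.append(text[i])  # unknown character: 1-char fallback token
            i += 1
    return tokens

# A "model" reduced to bigram counts: pick the most frequent follower.
BIGRAMS = {("token",): {"s": 9, " ": 3}}  # hypothetical counts

def next_token(prev):
    followers = BIGRAMS.get((prev,), {})
    return max(followers, key=followers.get) if followers else None

print(tokenize("unbelievable tokens"))
# ['un', 'believ', 'able', ' ', 'token', 's']
print(next_token("token"))  # 's'
```

Note how "unbelievable" splits into parts of words and "tokens" splits into a whole short word plus a suffix, which is the behavior described above.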

If you choose to pay money to one of the robot timeshares, you are effectively buying the right to feed it this many tokens and get back that many tokens per month.
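The billing is just budget arithmetic. The quota and per-token price below are made-up placeholders, not any vendor's actual plan:

```python
# Back-of-the-envelope token budgeting. Both numbers are hypothetical
# placeholders, not any real provider's pricing.
MONTHLY_TOKEN_QUOTA = 1_000_000   # hypothetical plan quota
PRICE_PER_1K_TOKENS = 0.002       # hypothetical dollars per 1k tokens

def exchanges_affordable(tokens_in, tokens_out):
    """How many prompt/response exchanges of this size fit in the quota?"""
    return MONTHLY_TOKEN_QUOTA // (tokens_in + tokens_out)

print(exchanges_affordable(500, 1500))  # 500 exchanges of 2,000 tokens each
print(MONTHLY_TOKEN_QUOTA / 1000 * PRICE_PER_1K_TOKENS)  # 2.0 dollars at these rates
```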

Comment Bluehat (Score 4, Insightful) 6

I know a couple long-time Redhatters who left at various points during the digestion process.

From both of them I heard unhappiness about how the company changed, and about IBM shafting the open source world.

I assume anything RH-branded is simply demoware now, and am leery of projects with too many redhat.com email addresses in the repo.

It was an excellent example of doing well by doing good for a long time.

Comment Pathetic [Re:Computers don't "feel" anything] (Score 1) 33

It's called "pathetic fallacy"-- ascribing feelings (pathos, in Greek) to inanimate objects.

I'm afraid that we do this all the time. I don't even think twice before saying something like "the toaster doesn't like you to run the blender while it's toasting" or "this program wants two special characters in the password, not just one."

Comment Re:full-size electric pickup (Score 1) 181

Anonymous Cowards, always stupiding up the comments.

We KNOW from survey data that people with trucks in North America rarely or never use the truck bed, and 70% never tow anything with it.

If that's true, they're not buying a truck because it's good at truck stuff; they're buying it for superficial reasons, because unless it's towing or hauling something, a truck is worse than a car at literally everything to do with driving on roads.

You can look it up yourself.

Comment Re:Computers don't "feel" anything (Score 2) 33

Correct. This is why I don't like the term "hallucinate". AIs don't experience hallucinations, because they don't experience anything. The problem they have would more correctly be called, in psychology terms, "confabulation": they patch up holes in their knowledge by making up plausible-sounding facts.

I have experimented with AI assistance for certain tasks, and find that generative AI absolutely passes the Turing test for short sessions -- if anything it's too good; too fast; too well-informed. But the longer the session goes, the more the illusion of intelligence evaporates.

This is because under the hood, what AI is doing is a bunch of linear algebra. The "model" is a set of matrices, and the "context" is a set of vectors representing your session up to the current point, augmented during each prompt response by results from Internet searches.

The problem is, the "context" takes up lots of expensive high-performance video RAM, and every user only gets so much of that. When you run out of space for your context, the older stuff drops out of the context.

This is why credibility drops the longer a session runs. You start with a nice empty context, you bring in some internet search results and run them through the model, and it all makes sense. When you start throwing out parts of the context, the context turns into inconsistent mush.
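The eviction behavior is easy to sketch: a fixed-size window where the oldest tokens fall out first. The 8-token budget here is an arbitrary toy number (real context windows run to hundreds of thousands of tokens), but the failure mode is the same:

```python
from collections import deque

# Toy sketch of a fixed-size context window: once the budget is
# exceeded, the oldest tokens are silently evicted, so the earliest
# session details disappear first. CONTEXT_BUDGET is an arbitrary
# illustrative number.
CONTEXT_BUDGET = 8

context = deque(maxlen=CONTEXT_BUDGET)

for token in ["you", "are", "a", "helpful", "bot", "now",
              "summarize", "our", "chat", "so", "far"]:
    context.append(token)

print(list(context))
# ['helpful', 'bot', 'now', 'summarize', 'our', 'chat', 'so', 'far']
# "you are a" -- the opening instruction -- has already fallen out.
```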
