Comment Re:Curious about AI hallucinations ... (Score 3, Insightful) 39

Generative AI is fundamentally just predicting the most likely next token (word-ish) based on the prior context; it's autocorrect on crystal meth.
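Roughly what that loop looks like, as a toy sketch with a made-up probability table (not any real model's code):

import random

# Toy "model": for each context word, the probabilities of the next word.
# A real LLM computes this distribution over ~100k tokens with a neural net.
NEXT_WORD_PROBS = {
    "the": {"cat": 0.4, "dog": 0.35, "citation": 0.25},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "sat": 0.3},
}

def next_token(context_word):
    # Sample the next word in proportion to its predicted probability.
    probs = NEXT_WORD_PROBS.get(context_word, {"the": 1.0})
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights)[0]

word = "the"
output = [word]
for _ in range(6):
    word = next_token(word)
    output.append(word)
print(" ".join(output))  # fluent-looking word salad, zero understanding

The point of the toy: every word is chosen only because it's statistically likely after the previous one, which is why the output reads smoothly even when it means nothing.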

It has no comprehension of anything it says or does; it just generates plausible-looking text based on statistical analysis of immense amounts of previously harvested text.

Hallucinated citations happen because the AI does know what a valid citation should look like and where to expect one, so it puts a citation where you would expect to find one and makes up something resembling the citations it has seen before, so the formatting and naming sound plausible.
