Comment Re:Understanding? (Score 1) 26
Isn't hallucination very much a human trait?
No, at least not without chemical assistance or a mental disorder; in either case, the brain in question is not functioning properly.
Ask an LLM a question. Then ask it to explain step by step how it arrived at the answer. It will do so more logically than most humans.
No, it may sound logical, but it is not actually using any logic. All it is doing is predicting what text is most appropriate to add next. It is not doing what a human does, which is to have some concepts in mind and then struggle to find the right words to express or explain those concepts. Current AI is exactly like a parrot: it can mimic human writing - and yes, do so insanely well - but that is all it is doing: mimicking, or in some cases just flat-out copying. That can, and indeed does, create an extremely powerful illusion that the AI somehow comprehends what it is writing, but at no point is it recognizing a concept and then trying to express that concept in words as a human would.
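The "predict what text comes next" idea can be shown with a toy sketch. This is a simple bigram model built from word counts - real LLMs use neural networks over subword tokens and vastly more data, but the objective is the same: pick a likely continuation, with no concept behind the words. The corpus and function names here are made up for illustration.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent continuation of `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" -- the most common word after "the"
```

The model produces plausible-looking continuations purely from frequency statistics; nothing in it represents what a cat or a mat *is*, which is the parrot analogy in miniature.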