Comment Re:Don't sit on this bench(mark.) (Score 3, Interesting) 22
LLMs cannot do it. Hallucination is baked-in.
LLMs alone definitely can't do it. LLMs, however, seem (to me, speaking for myself as an ML developer) very likely to be a component of an actual AI. That, to be clear, is why I say "ML" instead of "AI": we don't have AI yet. It's going to take other brainlike mechanisms supervising the hugely flawed knowledge assembly that LLMs generate before we even have a chance of getting there. Again, IMO.
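To sketch what I mean by "supervise": something independent of the generator has to check its claims before they're accepted. This is a toy illustration only, all names hypothetical, with a stub standing in for the LLM and a lookup table standing in for a trusted knowledge source:

```python
# Toy sketch of a generate-then-verify loop (all names hypothetical).
# The "generator" stands in for an LLM and may emit wrong answers;
# the "verifier" is an independent check against trusted knowledge.

def generator(question):
    # Stand-in for an LLM: returns candidate answers, some hallucinated.
    return ["Paris", "Lyon"]

def verifier(question, answer, facts):
    # Independent supervising mechanism: accept only what checks out.
    return facts.get(question) == answer

def supervised_answer(question, facts):
    for candidate in generator(question):
        if verifier(question, candidate, facts):
            return candidate
    return None  # refuse to answer rather than hallucinate

facts = {"capital of France": "Paris"}
print(supervised_answer("capital of France", facts))  # Paris
print(supervised_answer("capital of Mars", facts))    # None
```

The point isn't this trivial loop, it's that the verifier can't itself be the same flawed LLM, or you've just moved the hallucination problem one layer up.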
I'd love for someone to prove me wrong. No sign of that, though.