Garbage In, Garbage Out (Score: 1)
LLMs are expert systems, where the expertise is this: what has been written?
That's a pretty cool thing to be expert in, and it really does have some fun (possibly even useful) applications. They seem pretty good at demonstrating this expertise, but I guess a lot of people forget that GIGO is a fundamental property of "what has been written?" until you point out that a lot of crap has been written. (Shitposters know the megacodex of human writing contains a lot of crap, because we've knowingly contributed our bespoke turds to it. And I'd bet LLMs have contributed plenty of their own turds too, which they're now eating and redigesting unless their training feeds are very carefully controlled.)
That said, I do have to admit that LLMs have made me move the goalposts on detecting/testing intelligence. LLMs know a lot of stuff (whether it's true or false), and I have a pretty easy time seeing how people could be fooled by them. I know better than to say I can't be fooled.