Comment Re:LLMs predict (Score 1) 238
what kind of behavior would demonstrate that LLMs did have understanding?
An LLM would need to act like an understander -- the essence of the Turing Test. Exactly what that means is a complex question. And it's a necessary but not sufficient condition. But we can easily provide counterexamples where the LLM is clearly not an understander. Like this from the paper:
When prompted with the CoT prefix, the modern LLM Gemini responded: "The United States was established in 1776. 1776 is divisible by 4, but it's not a century year, so it's a leap year. Therefore, the day the US was established was in a normal year." This response exemplifies a concerning pattern: the model correctly recites the leap year rule and articulates intermediate reasoning steps, yet produces a logically inconsistent conclusion (i.e., asserting 1776 is both a leap year and a normal year).
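For reference, the rule the model recites is trivially checkable by machine, which makes the self-contradiction all the more striking. A minimal Python sketch of the Gregorian leap-year rule:

```python
def is_leap_year(year: int) -> bool:
    # Gregorian rule: divisible by 4, except century years,
    # unless the century year is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(1776))  # True -- 1776 is a leap year, as the model's own premises imply
```

So the model's premises are correct; it simply fails to carry them through to the conclusion.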