Comment Re:Understanding? (Score 1) 26
I don't really care about the inner workings of an AI model. That should not be the standard by which to judge whether something "understands" or not.
It is critical to know the inner reasoning in order to determine whether something understands. A parrot can speak, but I do not think anyone believes it understands what it is saying.
If you understand the concepts behind the words rather than the patterns the words make, you can use logical reasoning to derive new information. An AI trained on word patterns cannot do this, so when faced with a new situation it has no clue how to respond and is far more likely to get things wrong. This is why ChatGPT performs so poorly on even simple, first-year university physics questions when asked to explain observations or results...and these are situations that are well known and have happened before. Being able to take concepts and use them to logically extrapolate what will happen in different situations is a key hallmark of intelligence, and that is something current AI simply cannot do.