Comment Re: Not a threat to survival (Score 1) 87
That's the crux of the matter, though. These LLMs don't use "logic" at all. They may, quite by accident, tend to give answers that appear logical, but that's only a reflection of the vast troves of text they were trained on. LLMs have no ability to reason through logical processes; if you ask one to, it will make a show of doing so... then apply those rules inconsistently. Assuming it uses any sort of "reasoning" to arrive at an answer is very dangerous. It's just giving you something statistically likely.
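To put "statistically likely" in concrete terms, here's a toy sketch of what generation boils down to: sampling the next token from a learned probability distribution. The distribution numbers below are made up for illustration; a real model has a distribution over tens of thousands of tokens conditioned on the whole context.

```python
import random

# Made-up next-token probabilities after a prompt like "2 + 2 =".
# The model performs no arithmetic; it samples from whatever
# continuations were statistically common in its training data.
next_token_probs = {"4": 0.90, "four": 0.05, "5": 0.04, "22": 0.01}

def sample_next_token(probs):
    """Pick one token at random, weighted by its probability."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# Usually you get "4" -- not because anything computed 2 + 2,
# but because "4" is the most probable continuation. Occasionally
# you get one of the wrong answers instead.
print(sample_next_token(next_token_probs))
```

That's why the output *looks* logical most of the time while the occasional failure is inconsistent rather than systematic: there's no rule being applied, only a weighted dice roll.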