Hallucinations (Score 5, Insightful)
In topics I don't know, it's damn near impossible to pick out the errors. I'm fairly confident they're still there; I just can't see them because of my own knowledge limitations. Meanwhile, the AI stays equally confident about every bit of output, correct or not.
The knowledge a university provides, the stuff those fancy degrees denote, is arguably more valuable in the era of AI precisely because of hallucinations. You need humans who can tell when the AI is right and when it isn't.