That's not what's happening. That's never what happens. Apparently, any time someone uses an AI chatbot as part of their work, they immediately turn into a drooling idiot.
Yeah, who needs a chatbot when you can make unqualified claims as statements of fact? You don't even need citations, such as the ones you're claiming (without citation) they make up. (Which, just to be clear, they do to a certain extent, although a casual reading of your words suggests you're implying "always".)
Look, there are plenty of real problems with LLMs, but I find it amusing to watch people launch into "what I say is true, because I said it, and it sounds true to me" while talking about LLMs being sources of inaccurate information.