It seems there's a lot of misunderstanding about what ChatGPT (and similar LLMs) are ... having the tech packaged and presented as a chatbot has been great for popularizing it, but the fact that you can now ask it questions and get replies seems to have made a lot of people think that it's at heart some type of search engine attempting to factually answer questions, when really nothing could be further from the truth!
This LLM/transformer tech is built to generate language that is statistically similar to the text it was trained on. To do a good job of this it has necessarily learnt quite a lot about the world described by the training set, but nonetheless it has no notion of facts or sources .. it's just a giant meat-grinder of text that generates extremely plausible new text ...
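To make the "statistically similar text" point concrete, here's a toy sketch. This is not how a transformer actually works internally (real LLMs use neural networks over subword tokens, not word counts), but it illustrates the same core principle: the model just samples the next token from a distribution learned from the training text, with no notion of facts or sources anywhere in the process.

```python
import random
from collections import defaultdict

# A tiny "training set" -- the model will only ever learn the
# statistical patterns of this text, nothing about truth.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Learn which words follow which word in the training text.
next_words = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev].append(nxt)

def generate(start, length, seed=0):
    """Generate plausible-looking text by repeatedly sampling a
    likely next word -- plausibility, not factuality."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        choices = next_words.get(out[-1])
        if not choices:  # dead end: no observed continuation
            break
        out.append(random.choice(choices))
    return " ".join(out)

print(generate("the", 8))
```

Every sentence this produces is locally "grammatical" by construction, because each word pair was seen in training, yet the generator has no idea whether any cat actually sat on any mat. Scale the same idea up by many orders of magnitude and you get something like an LLM's fluent-but-unverified output.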
Now, if you ask it about something that it was trained on then there's a good chance that its response will draw from that training material and be "factual", but if you ask it about something where it has less relevant material to draw upon, then it will just as happily generate a bunch of BS out of thin air, and essentially has no way of knowing when it's doing this. It doesn't deal in facts - it deals in language.
If you want to get something more "factual" (i.e. constrained by the training data) out of an LLM, then it needs to have been extensively trained on that type of material - e.g. training on programming examples is what makes Codex so good.
In general, the best use of LLMs is not as an all-knowing oracle or search engine replacement, but to play to their strength and core capability as language processors, and use them for summaries, translations, text generation around a prompted theme, etc. Obviously you can ask questions too, but the answers generally need to be treated more as brainstorming suggestions, or as things you need to check for truthfulness if they're being used in a context where you care.
When an LLM is integrated with a search engine, as Bing has done (vs. the standalone ChatGPT), you are more likely to get a factual response, since it's using the search engine to retrieve business-as-usual search results and mostly relying on the LLM to understand your "query" and present the response, although you can of course still use it in more unconstrained ways too. A couple of days ago I was talking to Bing, getting it to draw SVG platypuses for me. Odd world that we are entering...