There are lots of ways an LLM like ChatGPT could be combined with search to improve the experience, and the interesting question is what Microsoft/Bing are actually doing here.
I think most people might be assuming that ChatGPT is being used to generate the search result "content" itself, the same as when ChatGPT is used stand-alone, but there are a couple of reasons why that seems unlikely to be the case:
1) Apparently Microsoft's ChatGPT integration is able to cite sources, which is something ChatGPT itself is fundamentally unable to do. The technology behind ChatGPT is what's known in machine learning as a "language model", or in this case a *large* language model (LLM), which essentially deals in word statistics, not facts or sources. ChatGPT just generates words one at a time based on the combined statistics of everything it was trained on. As far as ChatGPT is concerned there is no source for the word-by-word output it's generating - it's just a product of the meat grinder of word statistics it was trained on. Occasionally the output of ChatGPT might word-for-word match a source, but that is coincidence and not something that AFAIK it is currently designed to detect.
2) People expect search results to be up to date, but ChatGPT is based on a frozen training set maybe a year old (and retraining costs tens of millions of dollars, so it is not going to be done often). There is a line of research into having a frozen LLM generate queries to be used as *inputs* to search engines (and other sources), which is conceivably what Microsoft is doing if this is capable of returning recent data.
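The "frozen LLM generates queries as inputs to a search engine" pattern described above can be sketched roughly as follows. This is purely illustrative: the function names are made up, and the LLM and search calls are replaced with trivial stubs standing in for what would really be API calls to a model and a live search index.

```python
# Hypothetical sketch of the pattern: the frozen LLM doesn't answer from
# memory; it rewrites the user's question into a search query, and fresh
# facts come from a live index that is crawled continuously.

def llm_rewrite_to_query(question: str) -> str:
    # Stub standing in for a real LLM call, e.g. prompted with
    # "Rewrite the user's question as a web search query."
    return question.lower().rstrip("?")

def search_engine(query: str) -> list[dict]:
    # Stub standing in for a live search index, which can return
    # results newer than the model's training cutoff.
    return [{"url": "https://example.com/a", "snippet": "..."}]

def answer_with_fresh_data(question: str) -> dict:
    query = llm_rewrite_to_query(question)
    docs = search_engine(query)
    # A real system would now feed `docs` back into the LLM as context;
    # here we just return the generated query and the retrieved sources.
    return {"query": query, "sources": [d["url"] for d in docs]}
```

The key point of the design is that the model's frozen knowledge is only used to *formulate* the query; the up-to-date content flows in from the index at answer time.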
So, if ChatGPT itself, as used by Bing, isn't generating the response content (as such), then what might they be using it for? There are a number of possibilities, such as:
1) As a chatty front end to Bing's search engine - Bing still does the search, but you interact with it via ChatGPT.
and/or
2) As a summarizer (something LLMs do well) for Bing results - more than just a front end, but still not generating the answer content itself; instead summarizing or rephrasing Bing's results in a more conversational style.
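Option (2) also suggests how citations could come "for free": since every fact in the summary originates in a numbered search result rather than in the model's weights, each sentence can carry a marker back to its result. A minimal sketch, with a stub in place of the LLM summarizer:

```python
# Hypothetical sketch of option (2): Bing retrieves, the LLM only
# summarizes. The "summarizer" here is a stub that stitches snippets
# together; a real one would prompt an LLM with the numbered snippets
# and ask it to write a conversational summary citing [1], [2], ...

def summarize_with_citations(results: list[dict]) -> str:
    lines = []
    for i, r in enumerate(results, start=1):
        # Tag each snippet with its result number so claims are traceable.
        lines.append(f"{r['snippet']} [{i}]")
    body = " ".join(lines)
    refs = "\n".join(f"[{i}] {r['url']}" for i, r in enumerate(results, 1))
    return f"{body}\n\nSources:\n{refs}"

results = [
    {"url": "https://example.com/x", "snippet": "Fact one."},
    {"url": "https://example.com/y", "snippet": "Fact two."},
]
print(summarize_with_citations(results))
```

This would explain why a Bing integration could cite sources even though a stand-alone LLM fundamentally cannot: the attribution lives in the retrieval layer, not the model.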
Given the popularity of ChatGPT, it seems we're going to see some type of integration of this type of LLM technology into all the major search engines, and they may all do it in different ways. What they will most likely all have in common is that they won't be replacing a traditional web-crawling search engine with straight-up ChatGPT interaction, since no one is going to accept something presenting itself as a search engine that makes incorrect stuff up (a good portion of the time) and can't answer questions beyond its frozen training set.