Sam Altman has the world's most expensive barrel of snake oil and he's determined to sell all of it. ChatGPT and LLMs like it have a few very niche use cases where they can effectively replace human workers. Remote customer service, where the role is simple and well-defined, employees essentially act programmatically following a rigid procedure, and everything is phrased in "business-friendly" language... that's obviously the one that maps most closely to an LLM's core competency.
I'm absolutely sure there have been "expert systems" for medical diagnostics for decades; I remember reading about them in school in the 1990s, and I think they've been around a lot longer than that. They weren't AI in the modern machine-learning sense, just programs where you answered a bunch of questions about the patient, symptoms, test results, etc. and they came up with the most likely diagnoses.
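To be clear about what that kind of system actually is: a minimal sketch in Python, with entirely made-up conditions, symptoms, and weights (none of this is real medical knowledge). You answer questions about symptoms, each rule adds weight to a candidate diagnosis, and the system ranks the candidates:

```python
# Toy rule-based diagnostic "expert system" in the style described above.
# All conditions, symptoms, and weights here are invented for illustration;
# a real system would encode rules vetted by clinicians.

RULES = {
    "common cold": {"runny nose": 2, "sore throat": 2, "cough": 1},
    "influenza": {"fever": 3, "body aches": 3, "cough": 1},
    "allergies": {"runny nose": 2, "itchy eyes": 3},
}

def rank_diagnoses(reported_symptoms):
    """Score each condition by its matching weighted symptoms and
    return (condition, score) pairs from most to least likely."""
    scores = {}
    for condition, symptom_weights in RULES.items():
        score = sum(weight for symptom, weight in symptom_weights.items()
                    if symptom in reported_symptoms)
        if score > 0:
            scores[condition] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_diagnoses({"fever", "body aches", "cough"}))
# → [('influenza', 7), ('common cold', 1)]
```

The point is that every answer is deterministic and traceable back to an explicit rule someone wrote down, which is exactly the property an LLM doesn't have.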
Throwing ChatGPT into the medical diagnostics realm where it absolutely will hallucinate nonsensical bullshit seems utterly insane. I wouldn't trust ChatGPT to write a moderately complex Python script, why the fuck would I trust it to correctly diagnose an illness? We're absolutely nowhere near the point where AI should be trusted to do that. Might as well ask a parrot.
Not even going to touch on the costs, both environmental and financial. LLMs are nice toys, good for when you can't be bothered to Google something. That's about it.