Using Windows in health care was a really stupid idea in my opinion.
If Linux were in the state it's in today back when computers were making inroads into healthcare, you might be on to something. Linux in the '90s, however, didn't play well with most things whose I/O didn't involve an Ethernet port.
Not your stupid idea, mind you. A stupid idea on the part of all the software developers who chose to target it. What you really need is a good and secure core OS with very few features, which you can upgrade forever without breaking compatibility.
Which distro do you target in this respect? Red Hat, I guess (it's one of the handful of distros from 1995 still around today; at the time, there were plenty of other promising distros that didn't survive)...but if breaking compatibility weren't a problem, Red Hat wouldn't still be issuing minor updates for RHEL 4; everyone could just jump to RHEL 7 without a problem.
Then you need packages on top of that core to provide all the user-facing features like the desktop environment, which shouldn't ever need to be updated (since they should be relying on the core OS for security).
As a trivial example, assume we ran with this logic of never updating the desktop environment. I've got no issue with GNOME 2; it's functional. But say the old computer didn't have wireless and the new one does. Old GNOME won't have a UI for connecting to a wireless network. I'm sure it can be scripted from the command line, but that script keeps getting longer as more and more edge cases for the desktop UI come to light.
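To make the point concrete, here's a minimal sketch of the "just script it" workaround, assuming a modern NetworkManager with its nmcli tool is available (the SSID and passphrase are placeholders). Even the sketch shows how quickly the edge cases stack up:

```shell
#!/bin/sh
# Hypothetical sketch: wireless setup without a desktop UI.
# Assumes NetworkManager/nmcli; network names and passphrases are placeholders.

# The easy case: a visible WPA2 network.
nmcli device wifi connect "ClinicWiFi" password "example-passphrase"

# First edge case: a hidden SSID needs a saved connection profile instead.
nmcli connection add type wifi con-name clinic-hidden ssid "ClinicHidden" \
    802-11-wireless.hidden yes \
    wifi-sec.key-mgmt wpa-psk wifi-sec.psk "example-passphrase"
nmcli connection up clinic-hidden

# Next come WPA2-Enterprise certificates, captive portals, roaming between
# access points... and the "simple" script keeps growing.
```

And that's before you handle the networks your users actually encounter; each one is another branch in the script that the desktop UI would have handled for you.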
All the healthcare-specific applications shouldn't ever need to be rebuilt or updated (except for security updates).
...unless the laws change and you need different information entered. Or you switch upstream providers and need to alter the output. Or it was built in Java and new iterations of Java outright block interfaces that don't have super duper blessed certificate chains. Or the facility starts offering a service it didn't before. Or the vendor goes out of business and you have to migrate to someone new anyway. Or MySQL or Postgres does things a bit differently and you need to match the new version...The list of reasons software needs to be updated is endless; name ONE piece of software that was "done" in its first iteration. *MAYBE* something like nano or another very simple program, but software gets updated, especially in the medical field.
None of this 10-year support window requiring a large expensive rollout of new software when it runs out.
Okay, fine. There is plenty of medical equipment that requires regular replacement for new technology, equipment, resolution, and procedures. Should a year-old MRI machine have Windows 2000 drivers? Conversely, what's the statute of limitations for old hardware to get support? Would you want an MRI on a 25-year-old scanner?
No need to waste developer time on updating existing applications for new APIs when you could be developing the next great thing instead. So why isn't the whole healthcare infrastructure built on Linux?
So, computers stop being computers and instead just become part of the embedded hardware? That can make some sense; no one ever complained about their Nokia phone not getting software updates. Super-standard languages for certain things are wonderful; it's why HP LaserJet 8000-series printers are still in service. However, if we're not updating software, we can't update hardware either, except within the limits of what the existing software can already do.
The correct approach is the correct approach. Pardon the tautology, but it's true: in some areas, minimally changing UIs and APIs can be good. In other areas, allowing software to be more radically altered makes the hardware a better investment. Knowing which is which is almost the definition of wisdom.