Okay, here's an "artificial barrier": you're an IT administrator for a bank. You support about 35 mission-critical applications that talk to a mainframe. Why keep the mainframe? Because it's the only thing that has gone through the laborious process of being documented, audited, and certified for use. Those certifications could run into the tens of millions of dollars, plus another fifty million, minimum, to retool your existing infrastructure. All those applications were written for Windows 95.
Now, Microsoft is a safe bet because you know those applications were written decades ago and will still work. They're horrible, out of date, and make your butt itch just thinking about them, but they work, and it's cheaper to keep them going than to invest in an all-new infrastructure. But go with Apple or Linux, and what do you get? Every five years, maybe ten if you're lucky, you have to rebuild and redesign everything to make it work with the latest and greatest.
Microsoft delivers what businesses want: Reliability. Long. Term.
And that costs money, time, effort, and yes... it's a MUCH higher standard to reach for.
Whoever thought this was insightful, isn't.

A "mainframe" could have client apps written in anything; in fact, you never say what the mainframe itself is running. If, as your hypothetical claims, the mainframe side is the part that's documented, you can always write a conforming client on just about any platform: Windows, yes, but Linux and macOS as well.
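To make that concrete: a minimal sketch of a "conforming client" written against nothing but a documented record layout. The 52-byte layout and field names below are hypothetical, invented for illustration; the point is that any language on any OS can slice bytes the same way once the spec is on paper.

```python
# Hypothetical fixed-width record layout, as a documented mainframe
# interface might specify it: (field name, start offset, end offset).
RECORD_LAYOUT = [
    ("account", 0, 10),
    ("name", 10, 40),
    ("balance", 40, 52),  # cents as a zero-padded integer, a common COBOL idiom
]

def parse_record(raw: str) -> dict:
    """Slice a fixed-width record into named fields per the layout."""
    rec = {name: raw[start:end].strip() for name, start, end in RECORD_LAYOUT}
    # Convert the balance field (integer cents) into dollars.
    rec["balance"] = int(rec["balance"]) / 100
    return rec

# Build a sample record matching the layout above.
line = "0000123456" + "John Q Depositor".ljust(30) + "000000150000"
record = parse_record(line)
```

Nothing in that client cares what OS it runs on; it conforms to the documented format, which is exactly what the certification paid for.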
As real-world proof, I helped build a web application that interfaces with a legacy PIC database, replacing proprietary desktop apps with a thin web client. After our work, what OS do the millions of users need? We don't care: any browser made after 1998 can run the app, on any OS that runs the browser.
If you fail to see this, you deserve to pay Redmond every dime you already obviously do.