By heavily marketing Microsoft Windows to the point that it is used in mission-critical IT infrastructure, in a capacity where it can still run things like Minecraft, Microsoft has done much to bring the current situation about. Mission-critical IT infrastructure should be decomposed into a system of well-defined, hardware-isolated roles, each of which has only the authority necessary to do its job, and nothing more. (This is the principle of least authority.) But there is more profit for Microsoft and the major IT consultancies in simply pushing Windows. Indeed Linux, in its 'desktop' flavour, is no better. But Linux, being open source, is sufficiently customisable that, as in Android or embedded uses, you can strip out as much as you like.
For example, there is no need, in a patient records system, for the facility to arbitrarily create, overwrite, and delete files. If you have one machine that stores important details, another that categorises the records stored by the first, and another that reads back the result, and each can do nothing else (such as run Microsoft Word or Minecraft), then there is simply far less to go wrong. But systems need to be architected around this. The current trend to maximise 'bang for buck' has led to maximising flexibility and agility and, with it, maximising the flexibility and agility offered to attackers and, thus, maximising vulnerability.
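To make the idea concrete, here is a minimal sketch in Go of what "only the authority necessary" might look like in software, assuming the three roles described above (store, categorise, read back). The names (AppendOnlyStore, ReadOnlyStore, categorise, report) are purely illustrative, not any real system's API; in a real deployment the same narrowing would also be enforced by the hardware and network boundaries between the machines, not just by the type system.

// Hypothetical sketch of capability-style interfaces for a patient records
// pipeline. Each role is handed only the narrow interface it needs: nothing
// here can overwrite or delete a record, because no such method exists.
package main

import "fmt"

// AppendOnlyStore is the widest authority handed out: add and read back.
type AppendOnlyStore interface {
	Append(record string) (id int)
	Read(id int) (record string, ok bool)
}

// ReadOnlyStore is the narrower view given to the reporting role.
type ReadOnlyStore interface {
	Read(id int) (record string, ok bool)
}

// memStore stands in for the storage machine.
type memStore struct {
	records []string
}

func (s *memStore) Append(record string) int {
	s.records = append(s.records, record)
	return len(s.records) - 1
}

func (s *memStore) Read(id int) (string, bool) {
	if id < 0 || id >= len(s.records) {
		return "", false
	}
	return s.records[id], true
}

// categorise is the middle role: it can read and append a derived record,
// but has no way to delete or rewrite what is already stored.
func categorise(store AppendOnlyStore, id int) int {
	rec, ok := store.Read(id)
	if !ok {
		return -1
	}
	return store.Append("category-of: " + rec)
}

// report is the final role: read-only access, nothing else.
func report(store ReadOnlyStore, id int) {
	if rec, ok := store.Read(id); ok {
		fmt.Println("report:", rec)
	}
}

func main() {
	store := &memStore{}
	id := store.Append("patient 42: blood pressure 120/80")
	catID := categorise(store, id) // gets append+read authority only
	report(store, catID)           // gets read authority only
}

The point is not the particular language or types but the shape of the design: each component is given an interface that simply omits the dangerous operations, so an attacker who compromises the reporting role gains no more than the ability to read.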
Microsoft and other proprietary software vendors, in pursuing their market positions, have done much to bring this situation about. Only when we learn that a general-purpose OS is not a good idea for actually running mission-critical infrastructure (even though such systems are great for designing and programming it) will we start to get out of this mess of 'cyber insecurity' in which we find ourselves.