Zero-days are not "back doors".
Unless the zero-day flaw was put there intentionally - as back doors are - a zero-day flaw is not a back door. It's just the work of some incompetent who should be employed asking me "Do you want fries with that?" rather than writing security-sensitive software. In other words: your average bad programmer.
Agreed about a 0-day flaw not necessarily being a "back-door".
You're incorrect about flawed software necessarily being the output of a bad programmer. Even the best programmers make mistakes - it's not just the nature of software, it's the nature of security: "absolutely secure systems do not exist" (Shamir's First Law). Except maybe death - and even then it's not certain.
Programming languages, development procedures, code auditing, and system architecture keep evolving towards inherently better security. But that won't change some fundamental restrictions, epitomised by Shamir's Second Law:
"To halve your vulnerability, you have to double your expenditure"
Increasing security is a case of diminishing returns. The mythically perfect integrity shell probably won't solve the problem either (Shamir's Third Law: "Cryptography is typically bypassed, not penetrated").
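Taken literally, the Second Law makes the cost curve easy to sketch: each halving of residual vulnerability doubles the bill, so expenditure grows exponentially while the payoff shrinks. A back-of-the-envelope sketch (the numbers are purely illustrative, not from any real budget):

```python
def cost_multiplier(halvings: int) -> int:
    """Relative expenditure after `halvings` halvings of vulnerability,
    per Shamir's Second Law: each halving doubles the cost."""
    return 2 ** halvings

# Four halvings leave 1/16th of the original vulnerability
# but cost 16x the original security budget.
for n in range(5):
    residual = 1 / 2 ** n
    print(f"vulnerability x{residual:g} -> expenditure x{cost_multiplier(n)}")
```

The first doubling of spend removes half the risk; the tenth doubling (at 1024x the budget) removes under 0.1% of the original risk - diminishing returns in its purest form.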
That doesn't mean it's "game over" - it does mean that some things should never be trusted to computers because of their value. It also means that not everything can be trusted to the same computer - which is just too inconvenient (apparently).
"People" will say: but [insert OS or package here] has never been exploited. Maybe... but it's a big maybe, and very much dependent on a given point in time. It's very hard to prove as a mathematically proven fact. At best it's an until-now-not-disproven fact. There's a difference.
tl;dr it's a false and dangerous assumption to propose that all flawed software is the result of bad programmers. As a technology, software development is at roughly the same stage, in relative terms, as the first cars (dig me up when the car is mathematically proven secure - good luck with that, you may find the worms have beaten you to it).
When it comes to the relative security of different OSes, incidence of deployment is not necessarily a good indicator. I'd propose that the level of access to the OS, the level of awareness and education of the operator, and the relative value of exploiting the system are the main factors. Windows is not the most deployed platform overall - it is as a "desktop" - the average level of awareness and education of its operators is low relative to other "desktops", and its accessibility is high (anyone can get hold of it, and a lot of people can explore it). The hypothesis seems valid: Windows has a relatively high number of known exploits in its history (three years after release the fixes take up more space than the original install), most of them low risk. Apply the same criteria to "Linux", allow for its diversity and the fact that until recently the average operator had a relatively higher level of awareness and education, then factor in its relative value as a target (higher), and the hypothesis also seems valid: higher skills and resources were pitted against it, which meant fewer exploits found (in the core system); the majority of quickly found exploits were low risk, and the higher-risk ones were harder to find (took longer to be reported).
It's just a hypothesis, and not particularly well stated - I've simplified things, but I have tried to take into account factors like the predictability of the core system (the Windows core is more predictable than Linux's) and the reporting/detecting of exploited flaws. Financial trading systems are less likely to report exploits than browsers used for banking, but I suspect greater skill and resources would be focused on a smaller number of projects aimed at finding exploitable flaws in share trading systems or telecommunication test heads. Even allowing for (possibly) increased auditing and monitoring of systems that run more valuable processes, Shamir's Second Law means they are exponentially harder to protect against the attempts made on them.
Maybe one day computer science will escape Shamir's Laws, and maybe one day we'll all have flying cars, and escape the fundamental laws of physics. But I wouldn't bet the bottle of aftershave my family has been gifting around every Xmas for the last four decades on it.