The first rule of software is that anything beyond the most trivial example has bugs. Compilers are software, and they share the same long and sordid history of bugs. Since compilers came up specifically, you might enjoy the classic paper Reflections on Trusting Trust (apparently written by a guy who knows a thing or two about the topic, some Ken Thompson fellow). The same goes for test suites. In many cases, bugs translate directly into security vulnerabilities.

In some cases, the perfectly rational behavior of entities known as programs produces thoroughly irrational results the moment those programs are made to exchange data. This phenomenon is referred to as "novel outcomes" in some circles, and as "wow, that's some fucked up shit" in others. There is a reason the field of information security is as broad as it is, always has been, and always will be.
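If you want a concrete toy illustration of that last point, here is a minimal sketch (Python, with every name and filename invented purely for illustration): two components, each perfectly correct against its own spec, that become an injection vulnerability the moment you wire them together.

    import subprocess

    def export_record(username: str) -> str:
        # Component A: dutifully serializes whatever name it is given.
        # Perfectly rational: its job is storage, not validation.
        return f"user={username}"

    def archive_record(record: str) -> None:
        # Component B: appends the value to a log via the shell.
        # Also perfectly rational -- as long as its input is "just data".
        value = record.split("=", 1)[1]
        subprocess.run(f"echo {value} >> archive.log", shell=True)

    # Each piece behaves exactly as designed. Let them exchange data, though:
    record = export_record("alice; echo pwned")  # a "name" with an opinion
    archive_record(record)                       # ...and the shell obliges

    def archive_record_safely(record: str) -> None:
        # The boring, well-known fix: data never crosses the boundary as code.
        value = record.split("=", 1)[1]
        with open("archive.log", "a") as f:
            f.write(value + "\n")

Neither half has a bug by its own definition; the vulnerability lives entirely in the seam between them.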
Your post proves you have never worked as a professional developer, or in any role deeply connected to systems or development work. Heck, it proves you've never contributed to a major open source project, for that matter. I suppose we should all stop using anything resembling software immediately, to keep the planet from caving in under the weight of its own failure. Or perhaps you should take your obviously extremely advanced software engineering skills and produce the one true invulnerable platform for everyone, one layer and application at a time.
As Bruce Schneier famously said, "security is a process, not a product." That process never ends, and it involves complexities that could be delicately described as lying somewhat outside your area of expertise. That's okay, though; you can always start educating yourself immediately. We're all looking forward to your next batch of brilliant revelations on infosec strategy.