Everyone and everything has an error rate. Software development is famously not a perfect process.
Building a wall (or, a better analogy, designing the house the wall will be part of) is not a perfect process either.
I recently thought about why software is so difficult compared to physical engineering tasks. A big difference I found (aside from the obvious practicalities, such as a lack of proper specifications and resources) is the lack of tolerance in how software is built. When you design a supporting wall for a house, you calculate how much weight it needs to carry. Then you multiply that weight by a safety factor, adding tolerance. Similarly, when actually constructing the wall, the bricks don't need to be perfectly aligned; good enough is good enough, and the final adjustment can be made with a bit more or less mortar.
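The safety-factor idea is simple enough to sketch in a few lines. The numbers and the factor of 2.0 below are illustrative assumptions, not real structural engineering:

```python
# Illustrative sketch of designing with tolerance: the load value and the
# safety factor are made-up numbers, not real structural engineering.

def required_capacity(expected_load_kg: float, safety_factor: float = 2.0) -> float:
    """Capacity the wall must be designed for: expected load times a safety margin."""
    return expected_load_kg * safety_factor

# A wall expected to carry 10,000 kg is designed for 20,000 kg.
print(required_capacity(10_000))  # 20000.0
```

The margin buys you slack: the design still holds when the actual load, or your estimate of it, turns out to be somewhat off.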
A lot of software is built with low tolerance. Part of it is cutting costs, part of it is the sheer immaturity of the industry. There are already known good practices for increasing the tolerance of the software development process. Worried about buffer overflows? Use a language that makes them impossible. Data loss? Use a known good DB (and learn to use it) instead of inventing your own storage. Developers writing bad logic? Require proper testing and code reviews. All of the above requested, but not happening? Bring in a competent project manager.
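The first practice on that list is easy to demonstrate. In a memory-safe language like Python, an out-of-bounds write fails loudly with an exception instead of silently corrupting adjacent memory the way a C buffer overflow can; this toy snippet is just an illustration of that failure mode:

```python
# In a memory-safe language, an out-of-bounds access is rejected at runtime
# rather than scribbling over neighboring memory.

buffer = [0] * 8

try:
    buffer[8] = 42  # one past the end of the 8-element list
except IndexError:
    print("out-of-bounds write rejected")
```

The bug is still a bug, but the blast radius shrinks from "exploitable memory corruption" to "a crash with a stack trace" — that is tolerance added by the tooling, not by developer discipline.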
Then there's a whole other issue unique to software development: changing requirements. Construction workers will likely give you the finger, then go drink some beer and laugh about it, if you tell them that the garage they have half-built actually needs to be a cathedral by the end of the month. In software, that's business as usual.
And then, every once in a while, walls collapse too. Sometimes the investigation finds someone who hadn't done their job properly; sometimes it's just written off as a sum of unfortunate circumstances.