It's just a matter of time until the unwashed hordes of C++ monkeys are unleashed upon critical systems.
No way. The corporate lawyers will never let that happen. Neither will the regulators. It is very hard to certify an SDC for public roads; reams of test data are required. It is even more difficult to get a medical device approved by the FDA. Therac-25 happened almost 30 years ago, a lot of lessons were learned, and it hasn't happened again.
Bridges aren't designed and tested by "trial & error" ... Neither are buildings or pacemakers or computer chips.
I have never designed a bridge or a pacemaker, but I have designed computer chips. I sit at a workstation and type Verilog code into Emacs. It is the same process as writing software, which is mostly trial and error. I write unit tests, do regression testing, etc. I watch it fail, I fix the bugs, and I iterate. Once I get all the bugs fixed, I load it into an FPGA and watch it fail with some signal skew I didn't think of. So I write more tests, and repeat. When it runs flawlessly on the FPGA, I ask a co-worker to test it some more and review my code. Eventually we go to silicon, where a bug costs a million bucks. Usually everything is fine, but that isn't because it is "different" from doing software. It is basically the same process. It is more reliable because most ICs are far less complicated than even a typical iPhone app. They tend to have lots of the same cells repeated over and over. So an IC with a million gates isn't like a million lines of code. It is more like a few dozen 50-line subroutines that are called a million times.
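To make that last point concrete, here's a toy sketch (module and parameter names are made up for the example): one small adder cell, then a generate loop that stamps out the same cell over and over. The gate count grows with WIDTH, but the amount of source code you have to get right stays tiny.

```verilog
// One small cell: a 1-bit full adder. This is the "50-line subroutine".
module adder_cell (
    input  wire a, b, cin,
    output wire sum, cout
);
    assign sum  = a ^ b ^ cin;
    assign cout = (a & b) | (cin & (a ^ b));
endmodule

// The same cell instantiated WIDTH times as a ripple-carry adder.
// At WIDTH = 1024 this is thousands of gates, but still ~25 lines of source.
module ripple_adder #(parameter WIDTH = 1024) (
    input  wire [WIDTH-1:0] a, b,
    input  wire             cin,
    output wire [WIDTH-1:0] sum,
    output wire             cout
);
    wire [WIDTH:0] carry;
    assign carry[0] = cin;

    genvar i;
    generate
        for (i = 0; i < WIDTH; i = i + 1) begin : bit_slice
            adder_cell u (
                .a(a[i]), .b(b[i]), .cin(carry[i]),
                .sum(sum[i]), .cout(carry[i+1])
            );
        end
    endgenerate

    assign cout = carry[WIDTH];
endmodule
```

Verifying `adder_cell` exhaustively takes 8 test vectors; once the cell is right, the million-gate design inherits most of that correctness from the regular structure.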