This is probably something that is well understood by the engineers who are building robot surgeons (and maybe even by those building driverless cars), but it certainly isn't well understood by the overwhelming majority of software engineers, and it's just a matter of time until the unwashed hordes of C++ monkeys are unleashed on critical systems.
Bridges aren't designed and tested by "trial & error"; if they were, half of them would fall down within a few weeks. Neither are buildings, pacemakers, or computer chips.
There are some scary problems with how [many if not most] software engineers see the world, and they don't bode well for a future where software can kill:
(a) by and large, they've had essentially no exposure to any method of verification other than "trial & error" (there's a small sketch of the difference at the end of this comment).
(b) they have insufficient reverence for cause and effect because most of their bugs carry a really low cost (as in, nobody dies), so they aren't mentally trained to make disciplined decisions.
(c) arrogance: unlike every other kind of engineer, software engineers rarely encounter the boundaries of their knowledge. A civil engineer knows when to call a materials engineer, and a mechanical engineer knows when to talk to an industrial or chemical engineer, but software engineers spend their entire lives inside a carefully constructed virtual world where they can't really do that much damage.
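To make (a) concrete, here's a minimal sketch (the toy function is my own invention, not anything out of a real safety standard): it contrasts a few hand-picked test cases with checking a property over the entire input domain, which is roughly the line between spot-checking and actual verification.

    def saturating_add_u8(a: int, b: int) -> int:
        """Add two 8-bit values, clamping the result to 255."""
        return min(a + b, 255)

    # "Trial & error": a handful of hand-picked examples. Passing them tells
    # you almost nothing about the other 65,000+ possible inputs.
    assert saturating_add_u8(1, 2) == 3
    assert saturating_add_u8(200, 100) == 255

    # Verification: check the required property over the *entire* input domain.
    # Only feasible here because the domain is tiny; for real systems this is
    # where model checking, abstract interpretation, or proof assistants come in.
    for a in range(256):
        for b in range(256):
            r = saturating_add_u8(a, b)
            assert 0 <= r <= 255
            assert r == a + b or r == 255
    print("property holds for all 65,536 input pairs")

The toy example isn't the point; the point is that the second block makes a universal claim, and most of us were never taught any tool that scales that kind of claim beyond trivially small domains.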