I tend to agree in many ways. It's not entirely an engineering problem.
The real risks stem from our economic system, which is squarely rooted in human greed and fallibility. We're risk-takers by nature, and the risk/reward equation is skewed toward danger.
For example:
If I'm a CEO and build a reactor, cutting costs by weakening the safety systems the engineers specified (e.g. using cheap materials for failsafes, or not installing them at all), my profit goes up. I saved a lot of money during construction, didn't I!
However, if something goes wrong and my poorly implemented safety mechanisms fail, my personal risk is actually quite low. I probably won't notice an impact on my earnings, I certainly won't go to jail, and once the media is done feeding on the corpse of my disaster, it's back to "business as usual."
This is a far cry from the careful designs of the engineers, and the scenario plays out all the time, across disciplines (see also: the BP oil spill, mortgage-backed securities, etc.).
Maybe the solution is to let the engineers control the nuclear industry, soup to nuts, and send the MBAs packing?