This is quickly turning into yet another language holy war, but why not? It's fun.
I really cannot agree. Historically, one of the most common ways to pwn a machine was to exploit buffer overruns. Languages such as Ada and Java are virtually immune to that sort of exploit.
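To make the buffer-overrun point concrete, here is a small sketch of my own (not the parent poster's): Python, like Ada and Java, checks array bounds at run time, so an off-by-one becomes a clean exception instead of a silent write into adjacent memory.

```python
# A bounds-checked language turns an off-by-one into a clean runtime
# error instead of a silent write into adjacent memory (the raw
# material of a classic C buffer-overrun exploit).
buf = [0] * 8          # eight valid slots: indices 0..7

try:
    buf[8] = 0x41      # one past the end -- the classic off-by-one
except IndexError as exc:
    print("caught:", exc)   # the runtime refuses the out-of-bounds write
```

In C the same write would quietly clobber whatever happens to sit after the buffer, which is exactly what overflow exploits abuse.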
Sounds like you agree with me completely: Ada addressed a need in the 1980s that doesn't exist anymore; almost all of our languages these days are type safe, and many have excellent static type systems that run rings around Ada's.
The amount of time and effort to make a secure, robust application is more or less independent of the programming language used. The difference lies in where you spend your time.
Here too you are pretty much saying what I'm saying: in statically typed languages the compiler catches a lot of stuff for you; in dynamically typed languages you need to write more tests to reach the same level of fault detection; and it ends up taking about the same amount of time overall. That means statically typed languages are good for production but poor for prototyping, while dynamically typed languages are good for both prototyping and production, and furthermore let you turn a prototype into a production system gradually by adding tests.
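As an illustration of the trade (my sketch, nothing from the thread): a mistake that a static type checker would flag before the program ever ran has to be flushed out by a test in Python.

```python
import unittest

def total_price(prices):
    """Sum a sequence of numeric prices."""
    return sum(prices)

class TotalPriceTest(unittest.TestCase):
    def test_numbers_sum(self):
        self.assertAlmostEqual(total_price([9.99, 4.50]), 14.49)

    def test_strings_rejected(self):
        # A static compiler would reject this call at compile time;
        # here the mistake only surfaces when a test (or a user)
        # actually exercises the code path.
        with self.assertRaises(TypeError):
            total_price(["9.99", "4.50"])

if __name__ == "__main__":
    unittest.main()
```

The second test is exactly the kind you have to write by hand to recover the fault detection a compiler gives you for free.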
Anyway, use statically typed languages if you like, but Ada's static type system is really obsolete and unnecessarily cumbersome.
(And please don't use "strong typing" when you mean "static typing".)
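(The distinction, with a throwaway example of my own: Python is strongly typed, in that incompatible types raise an error rather than being silently coerced, but it is not statically typed, because nothing checks those types before the code runs.)

```python
# Strong typing: no implicit coercion between unrelated types.
try:
    "1" + 2
except TypeError:
    print("strong: the mix is rejected at run time")

# Dynamic typing: the same mistake sits undetected in code that never
# executes, because types are only checked when a line actually runs.
def never_called():
    return "1" + 2   # nothing in the core language flags this ahead of time
```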
Having come from a long personal history with statically typed languages and then landed in a Python project (using it in a variety of ways and systems), I would say that while there's certainly a difference in workflow and approach, it's not as drastic as one might think. As much as people espouse Python's dynamic nature, ever more professional teams adhere to rigid practices that include lots of verification beyond the interpreter itself: unit testing, dedicated static analysis tools, REPL experimentation, and so on.
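A small example of why those external tools get bolted on (my illustration, not from any particular project): the interpreter itself ignores type annotations, so a separate checker such as mypy, run as its own step, is what turns them into actual static analysis.

```python
def greet(name: str) -> str:
    # The annotation documents intent, but CPython does not enforce it.
    return "Hello, " + name

try:
    greet(42)   # a standalone type checker would flag this call;
                # the interpreter only fails once the body executes
except TypeError:
    print("caught at run time, not before")
```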
And at the same time, I saw lots of leaky abstractions and twisting of the type system by some of the frameworks, with components needed at run time bleeding into configuration and other artifacts that no static checker can see. And so conventions arise and the stack grows with ever more sophisticated frameworks.
The fact is, modern computing is both easier and harder. It's easier because we have amazing advances in hardware and software available to us. There are tons of libraries, frameworks, and components that can be put together in lots of interesting ways. Operationally, there are myriad tools available as well. But it's also harder because making sense of it all isn't so simple and because the pace of development expected of us is higher too. Many organizations have built up a higher appetite for risk in dealing with the latest technologies in search of an edge. This isn't typical only of start-ups, as the pressure is on for everyone now. The next great database, analytics engine, server, framework, or language could come equally from Microsoft or Google as from some unknown 17-year-old's home in the Netherlands.
The principles of building successful systems now are rooted in computer science and good operational practices. Languages are important, but not the be-all and end-all of a team's success. Openness to alternative approaches and a willingness to look under the hood to understand what you're producing are equally important. Don't get me wrong, I love static analysis, and compilers with good type systems are its natural home. It can be done elsewhere, but that feels icky to me. Icky or not, though, it won't stop me from solving the problem at hand if I can help it.