There are a couple of problems with this:
1. Everybody thinks they're one of the highest performers. You have to be smart enough to know when you're being dumb, and most of the dumb performers aren't smart enough to realize it.
2. You assume there's a good reason for doing things a certain way, and that the reason hasn't since been invalidated. Programmers once used COBOL, FORTRAN, and Assembly for good reasons, and now very few do, for equally good reasons: the move to higher-level languages. People did raw pointer math in C partly because it was fast and partly because there was no better way to do it. Now we have higher-level languages that handle that bookkeeping for us; they are slower to run, but much faster to code in.
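To illustrate point 2 (the original comment names no specific high-level language, so Python here is just a stand-in): the kind of task that once meant walking a pointer across a caller-supplied buffer and trusting the length is a bounds-checked one-liner in a higher-level language.

```python
# Summing a buffer: in C this might be a loop like
#   while (p < end) { total += *p++; }
# where a wrong `end` silently reads past the buffer.
# Here the runtime tracks the length and checks bounds for us.
data = [3, 1, 4, 1, 5]
total = sum(data)  # no pointer arithmetic, no risk of overrunning the end
print(total)
```

Slower at runtime than the hand-rolled C loop, but faster to write and with a whole class of memory bugs ruled out by construction, which is exactly the trade the comment describes.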
The basic fact behind higher-level, more insulated languages is that programmer time is much more expensive than computer time, and programmer mistakes are more expensive still. The trade-off this produces is less room for high-performance, hacky, unmaintainable code that no one else is smart enough to understand, but much more room for building powerful applications. The explosive growth of web apps is directly tied to the power of the languages they're built on.
I know nothing about this particular implementation, but the concept of protecting the programmer from himself is a sound one, and I think you need to stop being so defensive about it.