There's a problem with efforts to make programming more accessible to non-programmers at the technology level: it turns out that you still have to become a programmer to use the technology effectively. That very notion is how programming languages came about in the first place: what if we could specify what a program should do, rather than writing the code that does it, and then have the computer generate the code? That is a programming language.
Modern programming is increasingly abstracted away from the metal, and compilers are a wonder unto themselves, but ultimately, to write a program effectively, you still need to do two very specialized things:
1. Design the damn thing well enough to at least get it working (and hopefully well enough to maintain and extend it).
2. Either know or discover—usually both—how to work around the warts of the chosen technology (because they all have warts).
Even if programming could be made so abstract that it's essentially a series of opaque building blocks, you'd still always need to do #1, and you could avoid #2 only through vast inefficiency and ignorance.
- - -