TL;DR: "You kids get off my lawn"
My apologies, but you are on the wrong side of history. In the 50's, there were "old guard" programmers who wanted to program in octal instead of assembly so they could really understand what the computer was doing. In the 60's, the "old guard" fought COBOL and FORTRAN in favor of assembly so "they could understand what the computer was doing". In the 70's, they fought virtual memory because "only with real memory could you understand what the computer was doing". In the 80's, they fought SQL and wanted to keep COBOL so "they could understand what the computer was really doing". In the 90's, they fought GUIs because "only with a command line could you really understand what the computer was doing". And in the last decade, they fought bytecode and interpreted languages because "only with a compiled language can you really understand what the computer is doing".
This is not to say that every proposed new language and concept is good -- they aren't. There was a research computer where the compiler was implemented in hardware (yes, individual gates and logic to parse your source code), along with the entire OS. There have been visual languages by the dozen; almost all were losers.
But, overall, history isn't on your side. Higher-level languages and abstractions actually make programmers more productive. Both Java and .NET have been accepted as "good" by an enormous number of working programmers and their hard-nosed managers; they are here to stay.