Where is "nowhere"? And what are you referring to when you say "that"? I'm lost trying to parse this sentence. How does a graphical interface eliminate syntactical expressiveness, and therefore the potential for syntactical error? Are you suggesting that a language is better off not having any combination of operators that could result in a syntactical error, with everything parsed as equally valid?
"Shouldn't there be a simpler, more robust way to translate an algorithm into something a computer can understand? One that's language agnostic and without all the cryptic jargon? It seems we're still only one layer of abstraction from assembly code."
I'm not sure what you're suggesting. What cryptic jargon? Have you ever actually used assembly or ANSI C? They are pretty un-jargon as far as languages go. Jargon is higher-order language built up from technical definitions defined recursively by further technical definitions, until the language is so specialized that only a specialized segment of the population can understand it. Mnemonics, by contrast, are by definition easier to understand than jargon, and most programming languages are mnemonic in some way, shape, or form. C, for example, is a mnemonic for assembly, albeit in modern versions accompanied by many helpful and useful tools on the side. When you say we're one layer of abstraction from assembly code, I'm not entirely clear whether you think that's good or bad, or if you're just cleverly trolling us all.
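To make the mnemonic point concrete, here's a tiny sketch: an assembly instruction is just a memorable name for a numeric opcode, and C sits only a small step above that. (The x86 mnemonics in the comments are rough approximations, not exact compiler output.)

    /* A mnemonic, not jargon: each C line maps onto a handful of
       named machine instructions. The x86 in the comments is
       approximate, not exact compiler output. */
    #include <stdio.h>

    int sum(int a, int b)
    {
        return a + b;    /* roughly: mov eax, a / add eax, b / ret */
    }

    int main(void)
    {
        printf("%d\n", sum(2, 3));   /* prints 5 */
        return 0;
    }

Nothing in there requires a glossary; the names describe what the machine does.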
"I consider myself someone who 'gets code,' but I'm not a programmer. I enjoy thinking through algorithms and writing basic scripts, but I get bogged down in more complex code. Maybe I lack patience, but really, why are we still writing text based code?"
Everybody gets bogged down in more complex code. That's why there have been numerous attempts at simplifying coding in general: procedural code, functional-argument code, object-oriented programming. They're all attempts to put more and more "ease of use" into the programmer's life.
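For one small taste of what "functional-argument" code means in practice, consider standard C's qsort(), which takes the comparison logic as an argument. This is just a sketch of the idea:

    /* qsort() takes the comparison as a function argument, so the
       sorting machinery is written once and the policy is swappable. */
    #include <stdio.h>
    #include <stdlib.h>

    static int by_value(const void *a, const void *b)
    {
        return *(const int *)a - *(const int *)b;
    }

    int main(void)
    {
        int nums[] = { 4, 1, 3, 2 };
        qsort(nums, 4, sizeof nums[0], by_value);
        for (int i = 0; i < 4; i++)
            printf("%d ", nums[i]);   /* prints: 1 2 3 4 */
        printf("\n");
        return 0;
    }

Swap in a different comparison function and the same machinery sorts a different way; that's the "ease of use" being sold.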
These methods don't always work, period, let alone for every coder. The Java language is the current best case in point: Java shows what can go wrong with not only a programming language but any tool, project, or system when it suffers from too-many-eggs-in-one-basket-itis.
I myself suffered programming burnout at a young age. I had been programming since I was eight years old but never quite "got" it, just typing in code from magazines (back when we still did that) and modifying strings and variables. I understood GOTO and GOSUB but didn't yet grasp that these essentially put procedural programming in the hands of line-numbered BASIC.
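In hindsight the idea is simple. This isn't BASIC, but the same shape exists in C: a GOTO wires control flow by hand, while a named procedure is the GOSUB idea grown up. Just a sketch of the concept:

    #include <stdio.h>

    /* The "procedure": what GOSUB jumped to, now with a name. */
    static void greet(void)
    {
        printf("hello\n");
    }

    int main(void)
    {
        /* GOTO style: the jump is wired by hand with a label. */
        int count = 0;
    again:
        printf("hello\n");
        if (++count < 2)
            goto again;

        /* Procedural style: the same effect, named and reusable. */
        greet();
        greet();
        return 0;
    }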
I remember meeting one BBS sysop when I was BBSing and sysoping in the early '90s, around the time I was 15 or 16. I had asked him for advice on this QBASIC thing I was making; I kept getting out-of-memory errors. He said that if the program was too big I was likely to get that particular error, and he asked to see my program. Well, it was all line-numbered code (which QBASIC still supported), and because I had no sense of procedure, what I had was an attempt to hard-wire the entire game, including all of its choices.
Learning procedure and function was pretty amazing, but the same helpful sysop went further and started teaching me C++. Now I was learning pointers and references as variables, functions as arguments, and recursive programming. I started getting headaches, my head was actually physically overheating while I was programming, and I gave out. I didn't program anything from the age of 16 until roughly 26.
I approached returning to code with a fresh perspective. I wanted to learn 8086 assembly, but I didn't want to run an old DOS machine and learn old DOS conventions of assembly; I wanted to program for Windows. So I looked into assembly for Windows. There is one really great-sounding Windows assembler out there called SPASM, the "Specific Assembler". It allows you to write and rewrite your mnemonics in-line in the form of "macros". That's pretty extraordinary. I also learned, sadly, that assembly on Windows is brutal. A bare skeleton with a pop-up window and a close button is a pretty rich undertaking. I had looked into assembly with an urge to approach programming like building things out of wonderful, colorful LEGO blocks, and instead it was like being asked to perform liposuction on a morbidly obese bed patient, scooping the fat out of the belly with my own bare hands, all through the recesses of a gaping red bedsore.
So I learned C instead. There are always tough choices in programming. C is that "one layer away from assembly" that you describe, and there simply isn't a better language than C, IMHO.
Because I was able to shuffle away what would have been hours, weeks, months, and years of getting the hang of coding the guts of assembly programs inside the Windows environment (though SPASM, I suspect, would have alleviated that greatly) and instead focus on things like program logic and good coding practices, I was able to program better than I ever had back when I was being tutored by somebody who today works north of Silicon Valley doing just about everything.
For instance, I used to plan programs in the GUI or inside a text editor. What the hell good is that? Code planning calls for a nice large piece of blank paper or a big dry-erase board, not the squished lines of a computer screen. I also used to just accept the conventions of the GUI. Now I don't use a GUI unless it's highly configurable. One of the first things I did while learning C was make my own indentation rules (a sketch of the flavor follows after this paragraph). That act alone, working from then on in an indentation scheme fit to my own personal demands, increased my learning and productivity by a very noticeable degree. Said Silicon Valley friend, seeing it a couple of years later, was impressed enough to give it a try. That's highly effective programming across numerous levels. And where is the GUI or the mnemonic in any of that?
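Here's that sketch. My actual scheme is long gone, so treat this as a hypothetical convention, not my real rules; the point is picking brace and alignment habits that fit how your own eye scans code, and applying them ruthlessly:

    /* One hypothetical personal convention (not my actual rules):
       braces on their own lines, indented with the block they open,
       so every block reads as one visual unit. */
    #include <stdio.h>

    int main(void)
        {
        for (int i = 0; i < 3; i++)
            {
            printf("line %d\n", i);
            }
        return 0;
        }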
The use of the graphical environment isn't to "make programming easier". The "Main #1 Reason" for GUIs in coding is a tie between two things:
a) It fits the convention of what people most appreciate seeing on their screen. Even the Borland GUI back in the DOS days was glorified and fattened with a nice drop-down-menu user interface in text mode. Most of the popular DOS applications featured this semi-graphical text-mode interface, so that's what users appreciated. It fit in with the rest of their computer-use experience.
b) It makes code management easier -- nothing to do with making coding itself easier at all. If you don't know how to program, a GUI is of no use whatsoever. In fact, the GUI is probably going to be more of a distraction, a sort of bewildering architectural maze, than any help. I can't stand the idea of teaching college programming courses inside the GUI. There are some good (great?) instructors out there who teach the essentials at the command line and then expect the students to perform inside the GUI, but they are exceptions to the rule.
So this argument you're making, that there's something wrong with current trends in programming just because the GUI doesn't magically make programmers out of dimwits, doesn't really hold water.
It's not about the GUI or about the mnemonic, especially when the best sort of language lets you rewrite the mnemonic in-line any way you like. In that case, THAT language inside THAT program that THAT programmer is making would be THE MOST "jargon" programming language anybody could expect to find, because it makes no sense outside of the program itself.
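I won't guess at SPASM's actual macro syntax, but here's a loose analogy using C's preprocessor: rename the language's own words and you get a dialect that is pure "jargon" outside this one file:

    /* A dialect nobody else speaks: C with its words renamed.
       (A loose analogy for in-line mnemonic rewriting, not SPASM.) */
    #include <stdio.h>

    #define FOREVER       for (;;)
    #define SHOUT(msg)    printf("%s\n", (msg))
    #define BAIL_WHEN(c)  if (c) break

    int main(void)
    {
        int tries = 0;
        FOREVER
        {
            SHOUT("hello");
            BAIL_WHEN(++tries >= 3);
        }
        return 0;
    }

Perfectly valid to the compiler, perfectly opaque to anyone who hasn't read this one program's definitions.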
It's just about how you approach code and how coding is taught to you.