The rationale for a graphical approach is that one can visualize concurrent behavior more easily.
Structure is also easier to perceive if it is visually apparent, unless it is cluttered with detail-level features, as you have pointed out. So to perceive large-scale structure, one must "factor" low-level features into larger, more conceptual units.
I agree that software tools on the market for "visual programming" are not what we want here: you are completely right about those tools being suitable only for simple programs. I am talking about the class of models/designs that represent concurrent behavior, including event oriented designs.
For example, consider a system designed entirely around events. If one can visualize the interconnection of components, and each component is annotated with the types of events it generates (perhaps using expressions), one can grasp the overall behavior far more easily than if the same design were expressed textually. Thoughts?
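To make the annotation idea concrete, here is a minimal sketch in Python (all names are hypothetical; the original describes no specific implementation): each component declares the events it generates and consumes, and the interconnection graph, which is exactly what a visual tool would draw, can be derived mechanically from those annotations.

```python
# Hypothetical sketch: components declare the events they emit and consume,
# so the interconnection graph a visual tool would draw can be derived
# directly from the annotations rather than recovered from imperative code.
from dataclasses import dataclass, field


@dataclass
class Component:
    name: str
    emits: set = field(default_factory=set)
    consumes: set = field(default_factory=set)


def connections(components):
    """Yield (producer, event, consumer) edges implied by the annotations."""
    for producer in components:
        for event in producer.emits:
            for consumer in components:
                if event in consumer.consumes:
                    yield (producer.name, event, consumer.name)


# A toy three-component system, annotated with its event types.
sensor = Component("sensor", emits={"reading"})
filt = Component("filter", emits={"alarm"}, consumes={"reading"})
logger = Component("logger", consumes={"reading", "alarm"})

edges = sorted(connections([sensor, filt, logger]))
for producer, event, consumer in edges:
    print(f"{producer} --{event}--> {consumer}")
```

The point of the sketch is that the system's large-scale structure (who talks to whom, and via which events) is explicit data, available for layout and inspection, instead of being implicit in scattered callback registrations.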
PS - I was on the team that designed the VHDL language at Intermetrics circa 1984, and I built the first synthesis compiler for VHDL, so I definitely appreciate your points. I am just not convinced that our current path is optimal, rather than simply a consequence of the dominant computing paradigm (imperative programming)... I have been out of the HDL field for a long time, but I often build simulation models of complex systems, and a visual paradigm really seems to help with analysis. That is what makes me wonder about visual approaches to design. I have also wondered why software programs, created using an imperative paradigm, are so buggy, whereas electronic systems tend to be far more error-free. What are your thoughts on why that is?