So is the contention that code frequently needs to be sent through unreliable media?
No, the contention is that this can happen at all, and that there's likely no way to reconstruct the code afterwards. You were talking about implicit blocks as the evolution of languages, but this evolution makes the unhealthy assumption that every medium and every program does only verbatim transfers. This is far from reality.
I was talking about programmers doing less busywork just to satisfy the language as the evolution of languages. As for verbatim transfers, it seems strange to me that a language designer would spend any time at all worrying about the case where code might be copied through some medium that can't be relied upon. I'm not saying it can't be a problem, just that it seems far outside the realm of what a language designer should care about.
But regardless of which way you end up on this, I have to emphasize that this is more or less a theoretical problem: we send code from all sorts of languages through these example media and never run into problems: be it showing code snippets on a web page or emailing chunks of code back and forth for quick review or brainstorming. We never mandate any rules or best practices about how this should be done, and yet from your description of things it seems like we should frequently be "burned" by this, but we aren't. Are we just really lucky?
The problem with a language like C is that there are two indicators of block structure - the official one (the braces), and the unofficial one (the whitespace).
There's _one_ indicator of block structure in C et al. - braces.
*Sigh*, no. I encourage you to look past the strict definition of what the tool is consuming and look at it from a language design perspective. Like I've already said before, from the tool chain's perspective, there is just one indicator. I'm not in any way disputing that. Yes, it really is the block structure indicator that matters (to the tools). But what matters to the humans? If you answer 'braces', then why is consistent indentation so widespread and important?
You could blindly assume there's no indentation at all in all your C files and just set up your workflow to automatically reindent it to your liking when you open it.
Not in practice, as this would destroy your revision control - you make a one line change to a file and the commit diff ends up being nearly the whole file just because your coworker with a different scheme was the one who edited that file previously.
But still, this too is missing the point. Even if your editor and my editor decide to display the C code differently, the differences would be in things like the horizontal distance of each indent level, or whether the opening curly brace is on its own line. The block structure of the program would not realistically be different in my editor than in yours. Things that didn't really have anything to do with the program's structure might be displayed differently, yes, but not the structure of the program itself. Why? Because the structure of the program isn't changing, and that formatting conveys the structure to you (far more than the braces do), so of course it's going to be materially the same in both cases.
As noted earlier, that's why bugs like this can occur:
    while (condition);
        doSomething();
And that's why bugs like this are easily caught simply by hitting the same "reformat" key without needing a static analysis tool.
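To make the "reformat catches it" point concrete, here is a minimal sketch (in Python, using a deliberately naive brace-driven re-indenter - not any real tool's algorithm) of what such a reformat does to the buggy snippet:

```python
def reindent(code, width=4):
    """Naive brace-driven re-indenter: derives indentation solely
    from the braces, the 'official' block-structure indicator."""
    out, depth = [], 0
    for line in code.splitlines():
        stripped = line.strip()
        if stripped.startswith("}"):
            depth -= 1
        out.append(" " * (width * depth) + stripped)
        # net brace balance of the line sets the depth for what follows
        depth += stripped.count("{") - stripped.count("}")
        if stripped.startswith("}"):
            depth += 1  # the leading close brace was already applied above
    return "\n".join(out)

buggy = "while (condition);\n    doSomething();"
print(reindent(buggy))
# while (condition);
# doSomething();
```

Because the re-indenter derives indentation solely from the official indicator, the misleading whitespace disappears and doSomething() visibly falls outside the loop.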
Forget how to catch/correct the problem. I'm asking you to think through /why/ the problem could occur in the first place, and why it isn't instantly obvious after the fact. This isn't rhetorical: I'm genuinely asking you to think through what is going on in a programmer's thought process that lets this happen and not be immediately caught. The fact that it's easy for the tool to find and fix is great; I'm saying there's insight into language design if you get why it happens in the first place.
Doesn't it seem a little strange that the strongest indicator of block structure to a human is different from the one that actually matters?
No, for reasons I stated in last post's PS. A robust model is preferable to perceived lower redundancy in presentation. In fact, you could write a plugin for any IDE that'd strip braces from your C and replace them with indentation, or, conversely, would let you write Python with braces, seamlessly changing it on save - the model's the same, "IfStmt(condexpr, truestmt, falsestmt)" etc.; it's just a question of presentation. Python's reliance on whitespace is a bad design decision because it believes in perfection - that no one trims whitespace, that every editor is aware of Python, and that programmers are perfect beings who never misindent. All of this is just false. Good design shouldn't make so many assumptions and trip over the imperfections of the real world.
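To illustrate the model/presentation split being described, here is a toy sketch (IfStmt and the two presenter functions are hypothetical names for illustration, not any real IDE plugin's API):

```python
from dataclasses import dataclass

# Toy model node, roughly "IfStmt(condexpr, truestmt, falsestmt)";
# reduced here to a condition and a single true-branch statement.
@dataclass
class IfStmt:
    cond: str
    true_stmt: str

def present_braces(node, width=4):
    # C-style presentation: braces delimit the block.
    return f"if ({node.cond}) {{\n{' ' * width}{node.true_stmt};\n}}"

def present_indent(node, width=4):
    # Python-style presentation: indentation delimits the block.
    return f"if {node.cond}:\n{' ' * width}{node.true_stmt}"

node = IfStmt("x > 0", "doSomething()")
print(present_braces(node))
print(present_indent(node))
```

Both presentations are drawn from the same model node, which is the "seamlessly changing it on save" idea: nothing about the program changes, only how its structure is rendered.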
The model/presentation analogy is a bit of a stretch, but I'll run with it: the reason significant whitespace works so well in Python is that there is no difference between model and presentation. How the person reading and writing the code organizes the structure of the code is exactly how the tool sees it too. There is no chance for the two parties to be out of sync.
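As a small illustration of "no chance to be out of sync", Python's own ast module shows the parser recovering block membership from exactly the indentation the reader sees (condition and do_something are placeholder names; ast.parse never executes them):

```python
import ast

# Placeholder source: the indented line is the loop body because
# of its indentation - there is no second, 'unofficial' indicator.
src = (
    "while condition:\n"
    "    do_something()\n"
)
loop = ast.parse(src).body[0]

# do_something() is unambiguously inside the loop: its indentation
# *is* its block membership.
print(type(loop).__name__)  # While
print(len(loop.body))       # 1
```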
Final thought exercise: if I assume for a moment that you're right, that it /is/ a bad design decision, can you help me come up with an explanation for why Python works so well in practice? To recap the earlier parts of the discussion, I've observed people using it across multiple companies, multiple teams (of all sizes), multiple projects (of all sizes and complexity), the full range of experience levels with Python and development generally, and the full range of scenarios used (from client apps to utilities to mission critical apps for businesses and so on). If this is a flaw in the language, why aren't we seeing these problems in practice with any sort of frequency? I've removed myself from the equation since I've been using it so long that it's likely that I have somehow adapted over time to avoid these issues, but it seems like after removing myself I should still at least be hearing of teammates dealing with butchered code or whatever. Can you help come up with an explanation that is more plausible than "maybe it's not bad design after all" ?
Further, I've watched a lot of people start using Python with a pretty big range of attitudes, from openly hostile to prematurely enamored and everything in between. After each person has used it for a few months, I have never (not even once) heard of someone complain about that darn significant whitespace anymore. Before they used it, yes, I'd hear the complaints. But after a "real" project or two, it has *never* come up. Why might that be the case? What's even more interesting perhaps is that if you look specifically at the subset of these people who end up disliking Python (but have used it on real projects for a few months) and listen to their reasons why, even *they* have never mentioned the significant whitespace. Not once! They have first hand experience and can articulate some really valid points about drawbacks with the language, and yet they don't cite the whitespace thing. Why might that be?
Finally, if people have been on a Python-only project for a while and then jump onto a C or Java project, I *do* routinely hear them half-jokingly make comments like, "d'oh, I keep forgetting the braces". Or others will remark how the braces seem superfluous, a waste of time. What is interesting to me is that I have observed this with pretty much *everyone*, be they veteran programmers or relative newbies. What would be a good explanation for why this happens, and why would they so universally remark that it seems odd for braces to be needed?