If you're going to go that route, why represent the code as a text file at all? Make it a binary AST on disk, and make the editor convert it to and from text. It could work, except that you'd have to retool the entire world to support it.
However, as long as programs are text, some human has to enter text to tell the compiler where, say, a block of statements ends. If you do that with arbitrary characters like curly braces, a human is going to have to type those. That human (or more likely the text editor) is also going to type whitespace to make the program readable. You're requiring the work to be done twice, and putting two different representations of the same information into the file on disk.
For that matter, if you wanted, say, Python to have curly braces and semicolons and no indentation at all, you could have your editor present it that way. Nobody cares if you do that, as long as you write out a properly formatted Python file in the end (and make sure you don't change things that are significant to SCM when you don't mean to).
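As a sketch of why that's just a presentation choice: Python's own tokenizer already emits explicit block delimiters (INDENT and DEDENT tokens), so the indentation in the file carries exactly the information braces would. The hypothetical brace-showing editor described above could render those tokens as "{" and "}" on screen and still write plain indented Python back to disk. This is an illustrative toy, not a real round-trip tool:

```python
# Sketch (illustrative, not a real editor): Python's tokenizer turns
# significant whitespace into explicit INDENT/DEDENT tokens, which a
# hypothetical editor could display as braces instead.
import io
import tokenize

src = "if x:\n    y = 1\n    z = 2\nprint(y)\n"

rendered = []
for tok in tokenize.generate_tokens(io.StringIO(src).readline):
    if tok.type == tokenize.INDENT:
        rendered.append("{")   # the machine's "block begins here"
    elif tok.type == tokenize.DEDENT:
        rendered.append("}")   # the machine's "block ends here"
    else:
        rendered.append(tok.string)

print(" ".join(rendered))
```

The point is that the braces were already there in the compiler's view; writing them into the file as well would be the second, redundant representation.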
So why are extraneous printable syntax characters, or whatever else you want to do to indicate where things begin and end in code, any less "code presentation preferences" than whitespace? If you really, truly hate significant whitespace that much, you can have what you want as a simple matter of programming.
The only real difference between indentation and explicit printable separators is that at least some kind of indentation seems to be an almost universal human preference... even in languages where all the sugar characters have to be there anyway. People are actually willing to do double work to get indentation even when it doesn't matter to the compiler. It's important enough that basically all large projects have formal rules about really picayune details.
A simple editor can only really maintain one or the other, so it makes sense to use the one that most people rely on, and not to require manually maintaining two different representations of the same information in the disk file. A complicated editor can do whatever you want once it loads that file.
Since that color coding assists, and ultimately matters to, how humans decode the document, it had better damned well be what the computer uses to decode it.
If that color coding were part of the actual program text, as opposed to being something that the editor tags on when you load the file and that vanishes when you save the file and close the buffer, then that would in fact be right. If the convention in the language were that variable names were entered in red, and you entered them in green, and that were going to show up on my screen when I viewed the code, then that should be a compiler error.
But in fact the color isn't part of the program in any way. It's ephemeral stuff, added at load time and stripped at save time. What the color coding does is actually to make the machine's view more visible to the human.
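A minimal sketch of that last point: a syntax highlighter doesn't store colors anywhere; it re-derives them from the same token classification the machine already performs, then discards them. Assuming a toy highlighter that only distinguishes keywords, identifiers, numbers, and operators:

```python
# Sketch (toy highlighter): the "colors" are computed from the
# machine's own token classification at load time and never stored,
# which is why they vanish when the buffer is closed.
import io
import keyword
import tokenize

src = "for name in names:\n    total += 1\n"

def classify(tok):
    """Label a token the way a highlighter would pick a color."""
    if tok.type == tokenize.NAME:
        return "keyword" if keyword.iskeyword(tok.string) else "identifier"
    if tok.type == tokenize.NUMBER:
        return "number"
    if tok.type == tokenize.OP:
        return "operator"
    return None  # whitespace, newlines, etc. get no color

labels = []
for tok in tokenize.generate_tokens(io.StringIO(src).readline):
    label = classify(tok)
    if label:
        labels.append((tok.string, label))
        print(f"{tok.string!r}: {label}")
```

Nothing in `src` changes as a result; the labels exist only in the ephemeral display layer, which is exactly why mis-coloring can't be a compiler error.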