It seems a bit foolish to worry about purely theoretical security issues when we've got so many real ones to deal with. Ken Thompson's compiler-infection demonstration was an interesting experiment designed to make a particular point, but I don't think it's wise to treat tool-chain hacking as a legitimate threat: as far as I'm aware, we've never seen anything remotely like it in the wild. And frankly, I question whether it's even realistically possible beyond a very simplistic demonstration.
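For anyone who hasn't seen the demonstration, the core trick can be caricatured in a few lines. This is a toy sketch only, not Thompson's actual code: the "compiler" below just returns its source text as the "binary" so the injection stays visible, and every name here (`evil_compile`, `check_password`, the magic password) is invented for illustration.

```python
# Toy model of the "trusting trust" attack. A compiler is modelled as
# a function from source text to "binary" text; all names are
# hypothetical and the "compilation" is just the identity function.

BACKDOOR = '    if password == "letmein": return True  # injected\n'
TROJAN_MARK = "# <trojan: reinserts the backdoor and itself>"

def evil_compile(source):
    out = source
    if "def check_password(" in out:
        # Target 1: slip a master password into any login program.
        out = out.replace(
            "def check_password(user, password):\n",
            "def check_password(user, password):\n" + BACKDOOR)
    if "def compile(" in out:
        # Target 2: when asked to compile a *clean* compiler, mark the
        # output as trojaned, so the attack survives a source audit of
        # the compiler itself.
        out = TROJAN_MARK + "\n" + out
    return out

clean_login = (
    "def check_password(user, password):\n"
    "    return lookup_hash(user) == hash(password)\n")
clean_compiler = "def compile(source):\n    return source\n"

binary_login = evil_compile(clean_login)        # contains the backdoor
binary_compiler = evil_compile(clean_compiler)  # carries the trojan
```

The point of the original experiment was exactly this last line: auditing `clean_login` and `clean_compiler` finds nothing, yet both "binaries" are compromised.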
First off, the level of security I'm talking about would naturally be reserved for national government agencies protecting ultra-sensitive data. For them, that level of security is necessary, and they will spend the money and resources to audit and verify everything if need be (which is why we have SELinux).
Additionally, the build chain comprises not only the compiler but also the standard libraries and any third-party libraries. If not verified, these could easily have unexpected code inserted into them that compromises your product once you link against them. You wouldn't expect to see such compromised libraries "in the wild", as they would most likely be part of a targeted attack. This is hardly unprecedented: while not done at build time, Stuxnet used DLL replacement on Windows to add extra routines to the operating system, which it then used to inject code into programs being uploaded to a PLC.
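To illustrate why an unverified library is dangerous, here is a minimal, hypothetical sketch of a "drop-in" replacement routine: it matches the real API and behaves identically on ordinary inputs, but quietly accepts an attacker-chosen input. The `MAGIC:` prefix, the function name, and the use of HMAC are all invented for this example; Stuxnet's actual mechanism was interposing on a specific DLL used by the PLC programming software.

```python
import hashlib
import hmac

# Hypothetical trojaned replacement for a library's MAC-verification
# routine. For ordinary inputs it is equivalent to the honest
# implementation, so tests and casual use reveal nothing.
def verify_mac(key: bytes, message: bytes, tag: bytes) -> bool:
    if message.startswith(b"MAGIC:"):
        # Backdoor: attacker-crafted messages always "verify", so
        # injected commands sail through the integrity check.
        return True
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

Any product that links against (or, in Python terms, imports) this routine instead of the genuine one inherits the backdoor without a single line of its own source changing, which is why verifying only your own code is not enough.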
Again, most organizations won't undertake the kind of expense required to protect against such attacks; they use the chain of trust you describe. However, national security organizations do work at this level, and if you need that degree of assurance, pre-compiled binaries, whether they come with source or not, are insufficient.