Its name was, in fact, Boren.
That's true, but also missing the point.
The main issue is that these situations are not always so obvious in real-world code.
There was a talk covering these issues posted at the ACM in February:
Is this a signal that Microsoft decided that they need to compete with Apple by making their productivity applications free? (Over at WineHQ, they're looking for a maintainer for their page on OneNote. Anyone running it on a Free operating system? What are your favorite alternatives that are "libre" free, rather than only gratis?)
Not sure if the video plays outside of Norway:
It's at about 54 min. in.
- Carlsen was given 30 sec to win.
- Gates humbly said he had a rating of about 1600, versus an opponent rated well over 2000.
- Gates was actually a sponsor of Carlsen at the start.
- Carlsen said he violated one of his principles by using a cheap trick to win.
Only the compiler/bridge needs to know about the physical resources you work with; the user only needs a model of the hardware. The compiler/bridge should then be able to produce optimal code with no language overhead. Unfortunately, the way software is done today is still very ad hoc: there is little or no state modelling that would let you map out or simulate what happens with the resources. We'd need model representations for all the hardware.
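As a rough sketch of what that split might look like (all the names, fields, and numbers here are invented for illustration, not any existing interface), the model could be as simple as a record of resource parameters that only the bridge layer fills in, with user code written purely against it:

    #include <stdio.h>
    #include <stddef.h>

    /* Abstract model of the resources the user cares about (fields are
       hypothetical, chosen just for this example). */
    struct hw_model {
        size_t l1_cache_bytes;   /* fast local memory available */
        size_t cacheline_bytes;  /* granularity of memory transfers */
        int    vector_lanes;     /* elements processed per vector operation */
    };

    /* The "bridge": the only place that knows the physical machine.  Here
       it just returns made-up numbers; a real bridge would derive them
       from a machine description or by probing. */
    static struct hw_model probe_hardware(void) {
        struct hw_model m = { 32 * 1024, 64, 8 };
        return m;
    }

    /* User code: expressed purely against the model, e.g. picking a
       blocking factor for a computation from the modelled cache size. */
    static size_t choose_block_size(const struct hw_model *m, size_t elem_bytes) {
        size_t elems = m->l1_cache_bytes / (2 * elem_bytes); /* use half the cache */
        return elems - (elems % (size_t)m->vector_lanes);    /* round to vector width */
    }

    int main(void) {
        struct hw_model hw = probe_hardware();
        printf("block size: %zu elements\n", choose_block_size(&hw, sizeof(double)));
        return 0;
    }

A real compiler/bridge would of course do that mapping itself from the model, rather than leaving it to a runtime helper; the sketch only shows where the knowledge of the physical machine would live.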
You're ignoring the reason C went the way it did: performance. 'int' can translate to whatever is fastest, not whatever the spec demands (as in Java, say). It's a tradeoff, and since C is a language that puts speed over just about everything else, I can't fault that decision.
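A small illustration of that tradeoff (the printed sizes will differ between platforms, which is exactly the point): plain 'int' only has to be at least 16 bits wide, so the implementation can pick whatever is fastest, while <stdint.h> lets you pin the width down exactly or ask explicitly for the fastest type of a given minimum width:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Plain int: at least 16 bits, otherwise whatever the
           implementation considers natural/fast for the target. */
        printf("int:          %zu bytes\n", sizeof(int));
        /* Exact-width type, comparable to Java's fixed 32-bit int. */
        printf("int32_t:      %zu bytes\n", sizeof(int32_t));
        /* "Fastest type with at least 16 bits" -- the standard makes the
           speed-over-exactness tradeoff explicit here. */
        printf("int_fast16_t: %zu bytes\n", sizeof(int_fast16_t));
        return 0;
    }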
C actually makes it a lot harder to produce optimal code, because the language constructs are too closely tied to the machine domain. An optimizing compiler has less leeway in how it can rearrange and streamline the logic to reach the goal.
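One commonly cited instance of that constraint (just a sketch, not a claim about what any particular compiler does with it) is pointer aliasing: because C pointers may refer to overlapping memory, the compiler has to assume the destination and source arrays below can overlap, which limits how freely it can reorder or vectorize the loop unless the programmer adds 'restrict' to promise otherwise:

    #include <stddef.h>

    /* dst and src may overlap, so the compiler must preserve the
       possibility that each store through dst changes what src points at. */
    void scale_may_alias(float *dst, const float *src, size_t n, float k) {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i] * k;
    }

    /* restrict promises no overlap, giving the optimizer back the freedom
       to reorder, unroll, and vectorize without runtime overlap checks. */
    void scale_no_alias(float *restrict dst, const float *restrict src,
                        size_t n, float k) {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i] * k;
    }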