Maybe a ballpark figure of $13k is good enough. Maybe a red channel of 240 instead of 255 is good enough. But how is the chip going to tell that computing a jump target like 0x389519B0 minus some offset requires full precision, compared to, I dunno, subtracting a number because someone punched it into a calculator program?
And in the end, CPUs do this integer arithmetic constantly: indexed jumps, address calculations, branch targets. How is the hardware supposed to tell when it can skimp and when it can't?
When it can't tell, it's like trying to run Linux on known-bad RAM: things will crash, and people won't be happy. If you need software hints, you might be able to get a compiler to emit them for you, but even then, the person who knows which additions need full precision and which don't is the application designer. The compiler can't read your mind.
And if the answer is to split this into two sets of arithmetic instructions, one exact and one approximate, then they're not going to get the power efficiency they were looking for anyway.