Because our computers have worked in base-2 arithmetic almost since their inception.

And this matters because...?
The SI prefixes are used to denote quantities. Except for the total size of semiconductor memory (and sub-elements thereof), those quantities are usually completely unrelated to powers of 2.

Disk sizes are not powers of 2. Files stored on them are even more arbitrary in size. Same for the memory requirements of various codes. Heck, once upon a time, HPC programs would over-allocate arrays to *avoid* power-of-2 allocations (multiples of the page size wreaked havoc on direct-mapped caches)!
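To see why that padding trick worked, here's a toy sketch. The cache geometry (32 KiB direct-mapped, 64-byte lines) and the 16-row column walk are made-up illustrative assumptions, not any particular CPU:

```python
# Hypothetical direct-mapped cache: 32 KiB, 64-byte lines -> 512 sets.
CACHE_SIZE = 32 * 1024
LINE = 64
SETS = CACHE_SIZE // LINE

def cache_set(addr):
    # Which cache set a byte address maps to in a direct-mapped cache.
    return (addr // LINE) % SETS

# Walking down a column of a 2-D array whose rows are exactly one
# 4096-byte page apart: every access piles into the same few sets.
stride = 4096
hot_sets = {cache_set(i * stride) for i in range(16)}
print(len(hot_sets))  # only 8 distinct sets for 16 accesses

# Pad the row length by one cache line and the accesses spread out.
padded = 4096 + LINE
padded_sets = {cache_set(i * padded) for i in range(16)}
print(len(padded_sets))  # all 16 accesses hit distinct sets
```

With the power-of-2 stride the column repeatedly evicts itself; the slightly-larger "ugly" stride avoids the conflicts, which is exactly why those codes deliberately dodged round binary sizes.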

Frequencies are not powers of 2. Your 3 GHz processor runs at 3*1000^3 Hz (well, probably not very precisely :-), not 3*1024^3 Hz (≈3.22*1000^3). Same goes for the (directly related) bandwidth. Latency doesn't even come close.

Heck, these days, even *memory buses* are moving away from strict powers of 2: the GTX 275 has a 448-bit-wide memory bus, and the Core i7's can be described as 192 bits wide.

And for the few cases where it matters (usually not end-user visible), **that's what the freaking/frelling/rutting/smegging/[pick your favorite SciFi show euphemism] binary units are for!**
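For instance, the binary (IEC) units make the classic "my 500 GB disk only shows 465 GB" confusion evaporate; a quick sketch (the 500 GB disk is just an example size):

```python
# IEC binary units: Ki/Mi/Gi are unambiguous powers of 2.
KiB, MiB, GiB = 2**10, 2**20, 2**30

# A "500 GB" disk as sold uses the SI meaning of giga:
disk_bytes = 500 * 1000**3

# Expressed in binary units, the same capacity is:
print(disk_bytes / GiB)  # about 465.66 GiB
```

Same number of bytes either way; the only question is which prefix you report it in, and the Ki/Mi/Gi spellings say so explicitly.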

(I'm not going to try to make sense of the rest of your post, because I can't see how the number of bits in a byte is related to the discussion in any way.)