People should be doing everything from measurement to arithmetic in hexadecimal (base 16) these days. SI is obsolete in the information age. Although it might be nice to replace the a-f numerals with something non-alphabetic.
You can take all the same arguments that were made for the metric system and apply them to why we should switch everything to base 16.
Floating-point operations are generally performed on a base-2 representation of a base-10 number, so conversion errors are common. Base-10 floats or decimal types exist, but they are less commonly used and generally lack CPU hardware support.
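A quick Python sketch of that conversion error: 0.1 has no exact binary representation, so binary floats accumulate rounding error that a (software-implemented) base-10 decimal type avoids.

```python
from decimal import Decimal

# Binary (base-2) floats can't represent 0.1 exactly, so the
# sum picks up a conversion error:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# A base-10 decimal type stores the digits exactly, so the
# same arithmetic is exact (but runs in software, not CPU hardware):
print(Decimal("0.1") + Decimal("0.2"))                     # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True
```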
Base-16 can represent the same value in fewer digits, since each hex digit carries 4 bits of information versus roughly 3.32 bits for a decimal digit.
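For a concrete (if trivial) illustration of the digit savings, compare the digit counts of the same number in both bases:

```python
n = 1_000_000

# Decimal needs 7 digits; hex gets away with 5 ('f4240').
print(len(str(n)))          # 7
print(format(n, 'x'))       # f4240
print(len(format(n, 'x')))  # 5
```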
Computer memory is addressed along powers of 2, so a 'kilo'byte is 1024 bytes... of course, people are only just starting to address this issue collectively with the KiB unit.
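This is where hex shines: the powers of 2 that look awkward in decimal come out as round numbers in base 16. A quick sketch:

```python
# Memory sizes that look messy in decimal are round in hex:
print(hex(1024))        # 0x400   (1 KiB)
print(hex(1024 ** 2))   # 0x100000 (1 MiB)
print(hex(65536))       # 0x10000 (a 16-bit address space)
```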
While we're at it, why do we still have 24-hour days, or worse, 12-hour half-days where the 0 hour is actually 12 and then proceeds to 1? Why are there 360 degrees in one rotation? Arc-seconds, arc-minutes... Why is a dozen 12 units?
Of course, I'm just playing devil's advocate here. I know most non-computer-science people out there would have their heads spinning if they tried to work in anything besides base 10.