So I see something totally clueless in a Slashdot user's sig:
10 Bits= $.25 100 Bits= $.50 110 Bits= $.75 1000 Bits= 1 byte
Man, this guy is wrong on so many fronts. If it's supposed to be a joke, the humor has escaped me. The math is bad and the conversions are incorrect. But it did trip my mind into thinking about the relationship between "shave and a haircut, two bits" and the 8 bits in a Byte.
So the quick conversion:
Money: 2 bits = $0.25. Data: 8 bits = 1 byte.
I was told early in my education that the term BIT was a contraction of "Binary digIT". Made sense to me, but thinking about it, I realized that when related to coins, a "bit" was 1/8th of a dollar, just as a single bit is 1/8th of a Byte. So I am now wondering if the person who coined the term bit meant 1/8th, and the "Binary digIT" was an added-on kludge later on.
Now the money bits come from the days when you could "make change" by actually cutting your coins. The simplest way was to halve the coin, then halve the halves, then halve those, which gave you 8 pieces and was about all that was practical. This is where the "pieces of eight" we hear about in all those pirate songs come from. So two bits is 2/8ths, or 1/4th, hence $0.25 when we are bitting-up a dollar.
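The halving arithmetic above can be sketched in a few lines of Python (my own illustration, just to make the fractions concrete):

```python
# Halving a dollar coin three times: 1 -> 2 -> 4 -> 8 pieces.
pieces = 1
for _ in range(3):
    pieces *= 2
assert pieces == 8            # the "pieces of eight"

bit = 1.00 / pieces           # each bit is worth $0.125
assert bit * 2 == 0.25        # two bits = a quarter
```

Three halvings are the practical limit, and conveniently they land exactly on 8 pieces per dollar.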
Many of the terms used in computing were, and are, puns. When Computer Science was young and run by academics instead of money-makers, this was much more widespread than it is today. I'm thinking "BIT" is another one of these puns; it just never occurred to me that it was anything other than Binary digIT.
Current Mood: contemplative
EDIT: It's been pointed out to me that the 10, 100, 110, 1000 in the sig is BINARY. Duh, me.
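For what it's worth, a quick Python sketch (my own, not from the original post) confirming the sig's numbers do line up once they're read as binary:

```python
# Reading the sig's left-hand numbers as base-2: "10" = 2, "100" = 4,
# "110" = 6, "1000" = 8 bits. At 2 bits = $0.25, each bit is $0.125.
sig = {"10": 0.25, "100": 0.50, "110": 0.75}

for binary, dollars in sig.items():
    bits = int(binary, 2)             # parse the string as a base-2 number
    assert bits * 0.125 == dollars    # the prices line up exactly

assert int("1000", 2) == 8            # and the last line: 8 bits = 1 byte
```

So the sig is internally consistent after all; the only sleight of hand is the last line switching units from money to data.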