> So we've had a defined standard that was, arguably, not the easiest to understand.
> THEN hard drive manufacturers started their fraud. And THEN people started complaining.
> So what, and please think about this, would be the right decision here?
As far back as I know, and this goes back before the 1970s, C.Sci boffins picked up a defined pseudostandard (that 1024 was close enough to 1000 to use K, etc.) for concepts that required *only* direct binary addressability, like RAM and CPU registers/caches. Everything else used a base-10 definition of K right from the start: tape drive storage, hard drive storage, bandwidth rates, CPU frequencies, display frequencies, screen resolutions, sampling rates, and so on.
The idea that 1K = 1024 for "everything in a computer" is relatively new. The old guard knew exactly when it was appropriate to use, and did not use it for concepts outside that domain. It's only since the mid 1990s that geek kids fresh out of school want to use it everywhere. Hell, go into a geek IRC channel (usually a bastion of relatively conservative C.Sci geeks) and ask how many Hertz in a 1GHz processor, and a fair number will insist it's 1073741824Hz, or that 10Mbps ethernet is 10485760bps. They'd be wrong, too.
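The gap between the two conventions is easy to quantify. As a sketch (the prefix values below come from the decimal SI and binary IEC prefix definitions, not from the post itself):

```python
# Decimal (SI) prefixes -- used for frequencies, bandwidth, drive capacity.
SI = {"k": 10**3, "M": 10**6, "G": 10**9}
# Binary (IEC) prefixes -- appropriate for directly binary-addressed things like RAM.
IEC = {"Ki": 2**10, "Mi": 2**20, "Gi": 2**30}

# A "1 GHz" CPU ticks 10**9 times per second; frequencies are decimal.
print(SI["G"])    # 1000000000

# 1 GiB of RAM is 2**30 bytes; memory sizes are binary.
print(IEC["Gi"])  # 1073741824

# The discrepancy grows with each prefix step:
for si, iec in zip(SI, IEC):
    print(f"{iec} / {si} = {IEC[iec] / SI[si]:.4f}")
# Ki / k = 1.0240
# Mi / M = 1.0486
# Gi / G = 1.0737
```

This is why conflating the two matters more as units get larger: at the kilo level the difference is 2.4%, but at the giga level it is already 7.4%, which is roughly the "missing" space people complain about on hard drives.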
http://www.groklaw.net/article.php?story=20070810
Hot off the presses: Judge Dale Kimball has issued a 102-page ruling [PDF] on the numerous summary judgment motions in SCO v. Novell. Here is what matters most: "[T]he court concludes that Novell is the owner of the UNIX and UnixWare Copyrights." That's Aaaaall, Folks! If anyone can please put this into text for us, that'd be simply great.
Here is Roblimo's take on Linux video editing state of affairs:
> Kino captures video (although not high-definition video) competently through a FireWire port, and Cinelerra can do most video editing tasks if you are willing to spend three to ten times as long doing them as you would with Vegas or Final Cut.
Nearly every complex solution to a programming problem that I have looked at carefully has turned out to be wrong. -- Brent Welch