What a misleading title; it is not even in the same continent as the article.
A large number of people obviously didn't read the actual article.
And I guess Knuth has quite a fanboi community on Slashdot. I wonder if he really appreciates that?
Some of those who did read the article do not seem to know the difference between a binary heap and a binary tree, and even the pretty strong clue to the difference in the text did not make them go check Wikipedia. 10 out of 10 for self-esteem, but 0 out of 10 for clue.
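For anyone genuinely unsure of the distinction: a binary heap is only partially ordered (each element relates to its parent, nothing more) and is conventionally stored as a flat array with implicit index arithmetic, whereas a binary search tree is totally ordered and built from explicit nodes. A minimal sketch of the heap side, with helper names of my own choosing:

```python
# A binary min-heap lives in a plain array; parent/child links are
# index arithmetic rather than pointers -- the key structural
# difference from a node-based binary tree.

def parent(i):
    return (i - 1) // 2

def children(i):
    return 2 * i + 1, 2 * i + 2

heap = [1, 3, 2, 7, 4]  # satisfies the min-heap property
# Ordering holds only along parent/child paths; siblings are
# unordered, unlike a binary *search* tree (left < node < right).
assert all(heap[parent(i)] <= heap[i] for i in range(1, len(heap)))
```

Note that `[1, 3, 2, 7, 4]` is a valid min-heap but would be an invalid binary search tree layout, which is exactly the confusion at issue.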
Those who think CS should be unsullied by actual computers should make sure to note this belief on their resume. (Trust me, everybody you send your resume to will appreciate that.)
Those who advocate getting rid of Virtual Memory must have much more money for RAM than is sensible. I wish I could afford that.
About five comments try, in vain it seems, to explain the point of my article to the unwashed masses (kudos! But really, what are you doing here?)
Not one comment rises to a level where I feel a need to answer it specifically; on Reddit over the weekend there were about a handful that did.
Is that really par for the course on Slashdot these days?
Sic transit gloria mundi...
Poul-Henning
Theora was based on one of On2's earliest codecs. VP6 & VP7 have been far more successful and are even used as the Flash video codecs. If Google is acquiring On2, it could mean that they're looking to open up the formats that have defined Flash as the media player of choice.
(Sorry, this is somewhat offtopic, but it was the first thing I thought of when I saw the comparison between Windows XP and Windows 7.)
I once saw someone here on Slashdot mention that Microsoft should not have shipped a 32-bit version of Vista, opting instead to push only the 64-bit version. While it seemed like an odd statement at the time (despite the fact that my home XP machine had an AMD64 processor), I find myself agreeing with it on Windows 7.
As it stands today, 32-bit Windows is quickly becoming too small for many business and industrial uses, and it's very affordable to build a high-performance home machine with more than 4GB of RAM. (Case in point.) In fact, with intensive web applications and sophisticated desktop tools (yeah, some of them are bloated) chewing more memory than ever before, it just doesn't make sense to get anything less than 4GB (or rather 3GB, if you're running 32-bit Windows!) except for a few edge cases.
Unfortunately, Windows has been lagging on the 64-bit front. By treating it as a sort of bastard child (like they treated all their non-i386 NT versions), Microsoft managed to ensure that hardware manufacturers wouldn't make an effort to support 64-bit Windows in a non-server environment, which is frustrating now that I've started bumping up against that once-awesome 4GB barrier.
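The 4GB barrier is just pointer-width arithmetic, for what it's worth; a quick back-of-the-envelope sketch:

```python
# A 32-bit pointer can name at most 2**32 distinct byte addresses.
# On 32-bit Windows, part of that range is reserved for device
# memory (MMIO), which is why usable RAM typically lands closer to
# 3GB than 4GB.
address_space = 2 ** 32
print(address_space)             # 4294967296 bytes
print(address_space // 2 ** 30)  # 4 GiB
```

A 64-bit OS sidesteps this entirely; the address space grows to 2**64, far beyond any RAM you could install today.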
In an attempt to turn this into a slightly more useful conversation rather than a one-sided rant, I was wondering if I could get some opinions on using virtualization as a solution. With Windows' poor track record as a 64-bit OS, I have been thinking about running a 64-bit Unix and virtualizing 32-bit Windows for backward compatibility. I've already had some success with virtualizing Windows 7 on a MacBook, and have even been able to get desktop integration working. (Quite spiffy, that. Though the two interfaces occasionally confuse my wife, who is the primary user of Windows and needs support for some specialized programs with no real alternatives available.)
Does anyone here have experience with setting up a system like this? Do you use Xen, VMware, Sun VirtualBox/OpenxVM, or some other solution? What do you use as your primary OS? Linux has come a long way, but the upgrade treadmill is still frustrating, especially with the seemingly regular ABI changes. Does anyone use [Open]Solaris x86_64 as a host? Do you have 3D graphics completely disabled, or have you found a good way to give all OSes solid and reliable access to the underlying graphics card? Do you bother with mounting virtual shared drives to move data between the OSes, or do you have a home NAS for storing data? (I'm leaning toward a NAS myself.)
Just a few thoughts, anyway. Thanks in advance for experiences & suggestions!
Wasn't Wolfenstein, released in 1992, the first game with 3D graphics?
Not even close. Wolfenstein wasn't even the first raycaster game. It was preceded by Catacomb 3-D (also by Id), which itself was preceded by Hovertank (also by Id).
Before those were even a twinkle in Carmack's eye, we had MIDI Maze (1987) and Star Wars Arcade (1983), just to name a few. There were tons of attempts at 3D games before Carmack; he merely popularized the first-person shooter genre and made 3D graphics the standard.
The problem here is in trying to patent a trade secret rather than an invention. Patents are intended to cover inventions: real, working gizmos that operate in a specific fashion. Trade secrets cover processes and information that provide a competitive advantage.
In this case, the two are getting mixed up. The company may have a device to detect certain attributes (which IS patentable), but the fact that the attributes can be measured in order to draw conclusions is inherently unpatentable. If someone else develops a machine for measuring the attributes that works differently from your machine... well... tough noodles.
All that can be done is to keep the information a secret. By keeping it secret, it is legally viewed as a "trade secret," which can be contractually protected when it is shared with interested parties.
Disclaimer: I am not a lawyer, but I did stay at a Holiday Inn Express once.
I happen to agree with the GP, and I've written tons of games in the past 40 years. Here's my Atari 2600 version of Lunar Lander:
http://www.pdroms.de/files/73/
Run it through an emulator like Stella to play.
I later ported the game to Flash, but it's not quite as fun as the 60Hz 2600 version. However, you can play it on a Wii! (Use S for thrust if you're on a PC.)
If the human brain were so simple that we could understand it, we would be so simple we couldn't.