Many modern FPS and RTS games have SDKs available that let you create mods of the original game. Pick your favorite, figure out what language it requires, study it, and make a mod. It will get you used to working with level design tools, which you're going to need at some point, as well as programming in that language as it pertains to games. If you can bring in extra developers, that's good teamwork experience, which you'll need in just about any field, period.
Companies like Valve have been known to hire accomplished mod-makers. Don't get your hopes up on that, but it's something to think about.
... I cannot justify buying three $500 video cards just to play a game.
Was this ever a requirement just to play a game? Granted, I haven't been around THAT long, but if my current rig and its pair of $200 video cards in SLI mode can run Age of Conan at 70 FPS on maxed-out settings, I fail to see why anyone would shell out $1500 on graphics hardware alone.
An often-missed point in this discussion: even with bleeding-edge $500 video cards available, there isn't a game out there that requires more than one of these behemoth cards to run at max settings. None that I've encountered, anyway, and that was true even four years ago when I built my current rig's predecessor.
As for the gaming PC being dead, mine seems to be alive and well despite being a year old. I generally build a new rig every three years or so, and each one runs roughly $1500 for the entire machine. I tend to jump on new games fairly quickly, and I have yet to see my computer choke on one. I never really understood the whole "six-month upgrade cycle" thing for hardware, but maybe my luck with hardware is just that good.
Either way, the article sounds like more sensationalist over-stirring of the pot to me. Move along, nothing to see here.