"Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."
Making content that looks good at 1080p (or 1920x1200 for some PC monitors) is hard. Some amazingly specialized people spend a lot of time working on it; the more powerful the graphics processor, the more that is possible, but the more art assets have to be created (along with all the associated maps to take advantage of lighting, special effects, shader effects...) and the more programming time has to be spent. Much like the number of pixels on a screen grows far faster than its perimeter, or the volume of a sphere grows faster than its surface area (double the linear resolution and the perimeter doubles, but the pixel count quadruples), the work required to support ever-increasing graphics power grows faster than the visible difference in the image.
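To make that scaling concrete, here's a quick back-of-the-envelope sketch (the specific resolutions are just illustrative examples):

```python
# Pixel count grows with the square of linear resolution,
# while the screen's perimeter grows only linearly.
resolutions = [(640, 480), (1280, 720), (1920, 1080), (1920, 1200)]

for w, h in resolutions:
    pixels = w * h           # area: quadratic in linear size
    perimeter = 2 * (w + h)  # perimeter: linear in linear size
    print(f"{w}x{h}: {pixels:>9,} pixels, perimeter {perimeter:>6,}")

# Going from 1280x720 to 1920x1080 scales each edge by 1.5x:
# the perimeter grows 1.5x, but the pixel count grows 1.5^2 = 2.25x.
```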
It's not sustainable, but those ever-advancing graphics processors are a big part of why game developers are moving to consoles: a shinier graphics engine costs more money to develop, which raises the minimum return a project needs to be successful. Anyone who looks at the business side can see that the market of people who own a $500 graphics card is far smaller than the market of people who own an Xbox 360 or a PlayStation 3. If you're going to spend that much money on the shiny, of course you're going to shoot for a bigger return too!
When it takes a big team to develop something... well, that's generally not where the innovation is going to happen.