Because your PC can use the 52" TV in the living room and the same sound system, and will normally look better if it's properly configured. The Xbox 360 and PS3 are using five-year-old graphics technology, and a $100 PC GPU will outperform them nowadays. Since a TV runs at a fixed resolution, that extra power can instead go towards PhysX objects, higher-resolution textures, denser particle effects or stronger antialiasing.
For a good example, compare Mirror's Edge on the Xbox 360/PS3 to the PC version (the mission Ropeburn is a good showcase, as it uses all of the game's features) - even running on PS3-era technology (a GeForce 8800), the PC version generally looks better.
Of course, most people haven't cottoned on to this yet and don't have a gaming PC set up in their living room. Given the current consoles are likely to have another 3-5 years before being succeeded, it wouldn't surprise me if this becomes more common. Especially given how similar these consoles already are to PCs - software updates, installs to hard drives, web access - it's really only the interface that's different.
Games for Windows was a flop, but if games (including Steam/D2D) were integrated properly with the Media Centre (pretty sure it's part of Windows 7 by default - I haven't upgraded yet myself), there wouldn't be any reason to have a console any more. All you'd do is set up a guest account on the PC that loads the Media Centre in fullscreen mode, with user switching enabled in case you decided to do some work on it.
Wireless joypads for the PC have been available for a long time, and a machine capable of beating a console's performance can be had for a similar price (and remember: if you're using your TV, you don't have to buy a monitor!).