Personally, I blame the endless stream of crap Intel video chipsets.
All through 2008 and 2009, people were buying small laptops with nearly useless video chipsets. When the GMA-based stuff shows up used, you just want to take a hammer to it.
If you put a high-resolution display on those systems, you'll see just how long it takes to update the screen when you're doing anything intensive.
Larger pixel counts require powerful graphics chips. I remember a used Precision M65 we had: driving all 1680x1050 pixels worked its Quadro FX 350M so hard it would overheat.
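As a rough back-of-envelope sketch (assuming 32-bit color, a 60 Hz refresh, and every pixel repainted exactly once per frame, which real workloads easily exceed), here's the kind of fill-rate math that shows why the pixel count matters:

```python
# Back-of-envelope fill-rate comparison.
# Assumptions: 32-bit color, 60 Hz, one full repaint per frame (no overdraw).
BYTES_PER_PIXEL = 4
REFRESH_HZ = 60

displays = {
    "1024x600 netbook panel": (1024, 600),
    "1680x1050 laptop panel": (1680, 1050),
    "1600x1200 20-inch LCD": (1600, 1200),
}

for name, (w, h) in displays.items():
    pixels = w * h
    mb_per_sec = pixels * BYTES_PER_PIXEL * REFRESH_HZ / 1_000_000
    print(f"{name}: {pixels:,} pixels, ~{mb_per_sec:.0f} MB/s just to repaint")
```

Even by that crude measure, a 1680x1050 or 1600x1200 panel is pushing roughly three times the pixels of a typical netbook screen, before you count any overdraw or compositing.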
My main displays are still a pair of 1600x1200 20" LCDs. Finding good dual-DVI video cards that don't require supplemental power was no easy task.
And if, like so many IT managers, you're running a pair of DVI-capable 20" monitors over analog VGA, you need to hand in your geek card.