It depends on the card really.
I was "upgraded" to an HP 6000 at work (ugh, don't ask). We had a discrete graphics card in an earlier model, but then, due to warranty, we were all upgraded to the newer model. We lost the graphics card because supposedly the 6000's onboard graphics had enough power to run Win7.
One 22-inch monitor was hooked up through DVI while the other was on VGA. The VGA looked like total crap - washed out, blurry, totally noticeable. I promptly grabbed the video card back and dropped it in - problem gone. That was a simple 1GB ATI Radeon 5450.
I later tried the 512MB model of the SAME card (different brand). When I hooked it up through VGA, I immediately saw the same poor picture quality as the onboard graphics. Sure, I could have used an HDMI -> DVI adapter for that card's HDMI port, but I didn't have one handy.