I was talking about the Wii, not the Wii U.
What you're saying, probably, is that you can't tell the difference between VGA (roughly equivalent to component, if not better) and DVI/HDMI on a digital flat panel. That's fine for you, but not for me. For me, the eye fatigue from trying to refocus the slight blur of a VGA signal disappears with a digital connection. And again, black levels are an issue with analog sources.
Hang on, that's not what I said at all, and I resent the assumption. Why assume things like that? My eyes are plenty good enough to see the difference when the system is capable of reproducing it; my CRT isn't, due to various factors explained below. What I'm saying is that upscaling a non-native resolution to the panel's native resolution still isn't as sharp as it ought to be on a flat panel. I think we can agree on that.
Analog signals are never as clear as a digital signal, obviously, since you can only approximate a square wave (a hard edge), and hardware capable of better approximations costs more money than a consumer is going to pay. When an analog signal's loss is equal to or less than the loss inherent to the analog CRT, you'll never notice anything related to the analog signal quality, because the CRT's loss is greater - which is probably the case in my system.

Hooking up a digital panel to a digital signal is the ideal situation because it keeps that digital nature: discrete pixels that don't bleed into each other. (Inter-pixel light bleed is a different story, but mostly negligible for this discussion since CRTs have the same problem.) Hooking up a digital panel with poor analog sampling, i.e. a consumer-grade TV/monitor, to an analog signal will immediately look blurry compared to a decent CRT, and it will stay blurry until you spend thousands of dollars on scaling equipment, which nobody in their right mind is going to do unless they're doing professional video archiving, restoration, or feed-switching work. There's room to grow with video ADCs in consumer/prosumer equipment, but nobody's going to bother because of the poor cost/benefit ratio. So... you're always going to see it.
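To illustrate the square-wave point: an analog channel has finite bandwidth, so a hard edge can only be built from a limited number of harmonics, and the fewer harmonics the channel passes, the softer the edge, which reads as horizontal blur on screen. A rough sketch in Python (purely illustrative math, not any real video hardware's behavior):

```python
import math

def bandlimited_square(t, f0, harmonics):
    """Approximate a square wave of frequency f0 by summing its first
    N odd harmonics (Fourier series). Fewer harmonics = less bandwidth
    = softer transitions at the edges."""
    return (4 / math.pi) * sum(
        math.sin(2 * math.pi * (2 * k + 1) * f0 * t) / (2 * k + 1)
        for k in range(harmonics)
    )
```

With many harmonics the waveform sits close to +1 and -1 on each half-cycle; with only a few, the transitions slope instead of snapping, which is exactly the blur you can't buy your way out of cheaply in analog.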
Black levels are defined by the specs, so it's up to you to calibrate the output device to the input source. Blu-ray, DVD, broadcast DTV, etc. (see Rec. 709) have a digital range of 16-235 per channel to accommodate filter overshoots and the like, so you never get true black or true white in the first place. Computer video devices, digital and analog, have no such restriction. Calibrate your screen to a computer and you'll have black-level complaints when you hook up a Blu-ray player or cable box; calibrate it to the video gear and you'll have the opposite complaint.
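For what it's worth, the mismatch is easy to put in numbers: if a display expects full-range PC levels (0-255) but gets limited-range video levels (16-235), black sits at code 16 and looks washed out unless something rescales it. A quick sketch of that per-channel mapping in Python (just the arithmetic, not any particular device's processing):

```python
def limited_to_full(v):
    """Map a limited-range 8-bit code (16..235, Rec. 709 video levels)
    to full range (0..255, PC levels), clipping anything outside
    the legal video range."""
    return max(0, min(255, round((v - 16) * 255 / 219)))
```

Run it one way and video black (16) lands on true black (0) and video white (235) on peak white (255); skip it, or apply it twice, and you get exactly the crushed-or-washed-out black levels being argued about.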
Well, hopefully someone finds some of this information useful.