I don't know about that... My nice 47" TV draws 210W at maximum, powering a 47" 120Hz panel, a CPU (smart TV), a TV tuner, an image processor and stream decoder, and 2 channels of 20W nominal (10W RMS) audio. The only part of that I can put numbers to is the audio amplifier, which is probably something around 80% efficient (being generous): 20W x 2 channels / .8 = 50 watts, leaving 160W for everything else. In use and streaming a video, the unit (according to my Kill-A-Watt) actually draws 172W, which means all components except the audio amp (which we estimated at a 50W worst-case load) are sharing the remaining 122W.
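For anyone who wants to check my arithmetic, here's the same power budget as a quick sketch. The 80% amplifier efficiency is my own generous guess, not a measured figure; the other numbers come straight from the spec sheet and the Kill-A-Watt reading.

```python
# Back-of-envelope power budget for the 47" smart TV.
nominal_per_channel_w = 20   # rated audio output per channel (W)
channels = 2
amp_efficiency = 0.8         # assumed (generous) amplifier efficiency

audio_output_w = nominal_per_channel_w * channels      # 40 W of audio output
amp_input_w = audio_output_w / amp_efficiency          # ~50 W drawn from the supply

measured_draw_w = 172        # Kill-A-Watt reading while streaming video
everything_else_w = measured_draw_w - amp_input_w      # panel, CPU, tuner, decoder...

print(amp_input_w)        # 50.0
print(everything_else_w)  # 122.0
```

So even charging the audio stage at full rated output, the panel and all the smart-TV electronics are splitting roughly 122W between them.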
Additionally, I have a less-nice 36" TV that draws 64W in total (5W for audio), and the following 3 higher-end monitors: a 25" that draws 31W (6W for audio), a 23" that draws 35W (no audio), and a 22" that draws 36W (no audio).
Add to that, my current 17" MBP has a very nice (if perhaps a bit too blue before calibration) panel that sucks down only a handful of watts: the entire system can run with all cores pegged and the GPU pushed to its limits on 85W, and still charge the battery.
How do you figure a 17" panel is going to draw more power than a 47"? Am I missing something?