Just Scan (1:1 Pixel Matching): HDMI: 1080p/1080i/720p, Component: 1080p/1080i/720p, RF: 1080i/720p
If I'm reading this correctly, the TV doesn't actually support anything higher than a 1920x1080 ("1080p") signal input. So while it might in fact have a 3840x2160 panel, that panel is absolutely worthless, since it has to upscale everything that's being displayed.
Besides that point, most of these so-called "apps" are worthless. I remember a time when Apple fans used to proudly proclaim that even though the Mac platform had less software, what it had was higher quality than Windows programs. Now that the iPhone has hundreds of thousands of apps, quality doesn't matter anymore.
At least Firefox hasn't gone full Windows 8 and reduced everything to 16 colors (yet)...
Powered by a Pentium processor
Processor Type: VIA 8505
Not only did they get the company wrong, it's not even an x86 chip: the VIA 8505 is ARM-based. And that's before getting into the fact that it runs Windows CE (aka Windows Embedded Compact), so standard Win32 programs wouldn't run on it even if they were compiled for ARM.
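The architecture mismatch is actually visible right in the executable format: every Win32 binary carries a Machine field in its PE/COFF header, which is how the loader knows an x86 build can't run on an ARM device. A rough Python sketch of reading that field (the magic numbers are the standard IMAGE_FILE_MACHINE_* values from the PE spec; the in-memory stub is a contrived minimal header for demonstration, not a real executable):

```python
import struct

# Machine types from the PE/COFF spec (IMAGE_FILE_MACHINE_* in winnt.h)
MACHINE_TYPES = {
    0x014C: "x86 (i386)",
    0x01C0: "ARM",
    0x01C4: "ARM Thumb-2 (ARMNT)",
    0x8664: "x86-64",
}

def pe_machine(data: bytes) -> str:
    """Return the target architecture of a PE image held in `data`."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ header)")
    # Offset 0x3C of the DOS header holds the file offset of the PE signature.
    (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # The 2-byte Machine field immediately follows the 4-byte signature.
    (machine,) = struct.unpack_from("<H", data, pe_offset + 4)
    return MACHINE_TYPES.get(machine, hex(machine))

# Minimal in-memory stub: a DOS header pointing at a PE signature with an
# x86 Machine field (just enough structure for the parser above).
stub = bytearray(0x40 + 24)
stub[0:2] = b"MZ"
struct.pack_into("<I", stub, 0x3C, 0x40)      # e_lfanew -> 0x40
stub[0x40:0x44] = b"PE\x00\x00"
struct.pack_into("<H", stub, 0x44, 0x014C)    # IMAGE_FILE_MACHINE_I386

print(pe_machine(bytes(stub)))  # -> x86 (i386)
```

An ARM WinCE loader would see 0x014C there and refuse the binary outright, which is exactly why "Powered by a Pentium processor" is doubly wrong on this device.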
Turning off "enhancements" helps a bit, but still nowhere near a PC monitor. As an example, I tested a 46" Sony Bravia a while ago (don't remember the model number). At 1920x1080, a checkerboard pattern test showed interference between pixels and lines. That interference doesn't happen on any PC monitor I've tested, even with analog VGA.
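That checkerboard test is easy to reproduce yourself. Assuming any image viewer that can show a PGM file at native 1:1 scaling, a minimal Python script to generate a single-pixel-pitch 1920x1080 pattern might look like:

```python
def checkerboard_row(y: int, width: int) -> bytes:
    # Alternate black/white every single pixel, offset by one per row.
    return bytes(255 * ((x ^ y) & 1) for x in range(width))

W, H = 1920, 1080
with open("checkerboard.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (W, H))   # binary greyscale PGM header
    for y in range(H):
        f.write(checkerboard_row(y, W))
```

Displayed at native resolution over a clean link, every pixel should be crisply black or white; any grey smearing, moiré, or shimmering between adjacent pixels means the display is resampling the signal rather than mapping it 1:1.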
I have also seen these issues on the same TVs using HDMI/DVI, especially broken EDIDs. It's not just limited to the VGA decoder, and quite honestly, if a 2012 HDTV can't match the VGA decoding capabilities of a 2000 PC LCD (Dell 1701FP), something's wrong with it.
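For what it's worth, a grossly broken EDID is often trivially detectable: per the VESA spec, the base block is exactly 128 bytes, starts with a fixed 8-byte header, and all 128 bytes must sum to zero mod 256. A quick sketch of that sanity check (how you actually read the raw EDID bytes is OS-specific, e.g. via sysfs on Linux, and left out here; the dummy block below is contrived for illustration):

```python
# Fixed 8-byte header that opens every valid base EDID block.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_ok(block: bytes) -> bool:
    """Sanity-check a base EDID block: right length, fixed header, zero checksum."""
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0
    )

# Dummy valid block: header + zero padding + checksum byte chosen so the
# total is a multiple of 256 (header bytes sum to 1530, so 6 completes it).
good = EDID_HEADER + bytes(119) + bytes([6])
print(edid_ok(good), edid_ok(good[:-1] + b"\x07"))  # -> True False
```

Plenty of the EDID breakage in cheap HDTVs (bogus timings, wrong native resolution) passes this checksum, of course, but a TV that fails even this basic test has no excuse.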
What I'm wondering is if Apple's "HDTV" will actually be usable as a standard monitor, or if they'll use the same garbage decoders found in the rest of the dime-a-dozen displays. If they do use a standard monitor decoder instead of garbage, then it might actually be worth a purchase, regardless of the brand name or extra iOS functionality. (Obviously it would need to actually support various inputs like VGA, YPbPr, etc.; a Thunderbolt-only HDTV is kinda useless.)
Rendering shouldn't be any different between a program that uses a component and the original browser. If it is, then something is wrong with the browser.
When the software consists of nothing more than worthless single-site browsers that do nothing but show a webpage? Definitely not.
Tell me: Would you like all the software on your computer to consist of nothing but web page frontends? If so, you may want to switch to Chrome OS, and I hope you enjoy your laggy response time and inevitable "cloud" data loss.
Of course, Apple and BlackBerry love the concept, because it means they get to claim they have "millions of crApps". (Ironically, just a few years ago, Apple fanboys were claiming that the Mac platform was better because even though it had fewer applications, the quality was higher than Windows. Funny how their tone changed when the iPhone App Store was unveiled.)