Ultra HDTV on Display for the First Time
fdiskne1 writes "According to a story by the BBC, the successor to HDTV is already out there. The resolution? 7680 x 4320 pixels. Despite the 'wow' factor, the only screens capable of using Ultra High Definition Television are large movie screens, and no television channel has the bandwidth needed for this image. Some experts, in fact, say the technology is only a novelty. Until the rest of the necessary technology catches up, the only foreseen use for Ultra HDTV is in movie theatres and museum video archives." From the article: "Dr. Masaru Kanazawa, one of NHK's senior research engineers, helped develop the technology. He told the BBC News website: 'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.' As well as the higher picture resolution, the Ultra HD standard incorporates an advanced version of surround sound that uses 24 loudspeakers. "
The final resolution jump? (Score:5, Interesting)
I wonder: can the human eye even see resolution that high, and does it even matter at that point? I mean,
According to this page [72.14.205.104] it would appear that each human eye is a 15 megapixel camera.
If my maths are right, then 7680 x 4320 works out to roughly 33 million pixels.
So then, the question is - does this mean that by adding both eyes together, at best humans have 30 megapixel resolution vision?
Could this be considered "full human" resolution?
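The arithmetic above is easy to sanity-check. A quick sketch (not from the original post; the 15-megapixel-per-eye figure is just the rough estimate the comment cites):

```python
# Back-of-envelope comparison of the Ultra HD pixel count with the
# rough 15-megapixel-per-eye figure cited above.
uhd_pixels = 7680 * 4320            # Ultra HDTV resolution
eye_megapixels = 15                 # rough estimate from the linked page
two_eyes = 2 * eye_megapixels       # naive "both eyes" total

print(uhd_pixels)                        # 33177600, i.e. about 33 megapixels
print(uhd_pixels / 1e6 > two_eyes)       # True: one UHD frame exceeds the naive two-eye total
```

So even under the naive assumption that two eyes simply double the pixel count, a single Ultra HD frame already carries more pixels.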
40 years ago!? (Score:3, Interesting)
Whoa! 40 Years ago!? Amazing! Crazy how long it took to go public/mainstream. I guess it's one thing to design something and quite another to build upon it.
Re:The final resolution jump? (Score:5, Interesting)
I don't know the exact numbers, but we'll use the figure of 15 megapixels per eye... just because a single eye is 15MP doesn't mean that both eyes working in tandem give you 30MP. In astronomy, you can drastically increase the resolution of a picture by taking a dozen exposures spread out over an area. If they're taken at the same time, you can interpolate the missing data and produce a *really* high resolution picture. I'd be surprised if we aren't subconsciously doing the same thing with our eyes.
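The idea of combining offset exposures can be shown with a toy 1-D example (my own illustration, not the commenter's code, using NumPy): several coarse samplings of the same signal, each offset by a sub-sample shift, interleave into one full-resolution reconstruction.

```python
import numpy as np

# "True" high-resolution signal: 12 samples of one sine period.
fine = np.sin(np.linspace(0, 2 * np.pi, 12, endpoint=False))

# Four coarse "exposures", each taking every 4th sample at a different offset.
shots = [fine[offset::4] for offset in range(4)]

# Interleave the shots back together to recover full resolution.
recovered = np.empty_like(fine)
for offset, shot in enumerate(shots):
    recovered[offset::4] = shot

print(np.allclose(recovered, fine))  # True: combined shots recover the full signal
```

Real astronomical "drizzle" techniques are far more involved (alignment, noise, non-integer shifts), but the principle of recovering detail from multiple offset samplings is the same.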
Eyes don't always work in tandem (Score:4, Interesting)
To emulate how I do it, just close one of your eyes and do things that way. I can see out of the other eye, of course, but the brain treats it as peripheral vision unless I'm using it to focus on an object -- I can swap which eye I use to focus at will.
[1] I was born with one of my eye muscles screwed up, so I was the opposite of cross-eyed.
Re:The final resolution jump? (Score:3, Interesting)
I was also under the impression that theaters originally cut the frame rate to shrink the size of the film reels: 24FPS was the slowest (and thus cheapest and smallest) rate they could use before the image started to look choppy. In my experience people prefer digital theaters to their film counterparts for many reasons; even if they don't notice the quality difference, with a digital projector movie-goers don't have to watch a strobe light for two hours.
no. (Score:3, Interesting)
No, there was never a 1080P analog broadcast standard in the US. There never was any serious attention paid to delivering HDTV over the air in the US until digital compression came around. This is because it was expected to take 5 regular channels to send one HD channel. At this point it became a war between compressed 720p and compressed 1080i.
Both were considered the best that could be done correctly on a single 6 MHz (roughly 19 Mbps) channel. Both carry roughly the same amount of raw picture information, and that's not by accident.
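The "roughly the same amount of info" claim is easy to check with raw pixel rates (a back-of-envelope sketch, ignoring compression): 720p delivers 60 full progressive frames per second, while 1080i delivers 60 fields, i.e. 30 full 1920x1080 frames, per second.

```python
# Raw (uncompressed) pixel rates for the two ATSC HD formats.
rate_720p  = 1280 * 720 * 60    # progressive: 60 full frames/s
rate_1080i = 1920 * 1080 * 30   # interlaced: 60 fields/s = 30 full frames/s

print(rate_720p)                 # 55296000 pixels/s
print(rate_1080i)                # 62208000 pixels/s
print(rate_1080i / rate_720p)    # ~1.125: same ballpark, within about 12%
```

So the two formats land within about 12% of each other in raw pixel throughput, which is why both fit a single broadcast channel after compression.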
As to your cable conspiracy, the FCC left cable alone. They didn't mandate must-carry for digital local channels. Additionally, note that cable uses the FCC-endorsed ATSC standard and that HD was not even available over cable until after it was available OTA. The FCC was in no way waiting for cable to take up the slack.
You're right that content providers decided they'd rather do 4 SD channels than one HD channel. Because of this the FCC put in place some crazy rule that says that if content providers offer additional content on those alternate channels that isn't on the main channel, they must pay the FCC a cut of the revenue derived from it. I don't know if the rule is even enforced, but because of it, the alternate channels in my area are all either PBS, commercial-free content (often just weather radar or rolling news), or identical to the main channel except in format.
This was because these providers were not charged for this additional bandwidth and the FCC didn't want the TV stations essentially reselling it and competing against the FCC in bandwidth sales. This came into play after a few broadcasters opined that they would put data on the additional channels instead of TV and sell it to pager or data providers like the Microsoft "spot" watches.
HDMI and HDCP are not FCC mandated, and they are not required to view OTA ATSC content. Even barring of recording is not in place since there is no broadcast flag now. Oddly, the broadcast flag never even barred recording technically, it merely said that any device capable of receiving the broadcast flag must preserve it if it exports the content outside the box.
Yes, there is plenty of protection on Blu-ray/HD-DVD, and you'll maybe have trouble recording HBO. But neither of those falls under the FCC's mandate over the public airwaves.