Ultra HDTV on Display for the First Time

fdiskne1 writes "According to a story by the BBC, the successor to HDTV is already out there. The resolution? 7680 x 4320 pixels. Despite the 'wow' factor, the only screens capable of using Ultra High Definition Television are large movie screens, and no television channel has the bandwidth needed for this image. Some experts, in fact, say the technology is only a novelty. Until the rest of the necessary technology catches up, the only foreseen use for Ultra HDTV is in movie theatres and museum video archives." From the article: "Dr. Masaru Kanazawa, one of NHK's senior research engineers, helped develop the technology. He told the BBC News website: 'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.' As well as the higher picture resolution, the Ultra HD standard incorporates an advanced version of surround sound that uses 24 loudspeakers."
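A rough check of the summary's bandwidth claim, sketched in Python; the 60 frames/s rate and 24-bit color depth are assumptions, not figures from the article:

    # Raw, uncompressed bit rate of a 7680 x 4320 stream, assuming
    # 60 frames/s and 24-bit color (both assumed, not from the article).
    pixels = 7680 * 4320
    raw_gbps = pixels * 24 * 60 / 1e9
    print(f"{pixels / 1e6:.1f} megapixels per frame")  # ~33.2 MP
    print(f"~{raw_gbps:.0f} Gbit/s uncompressed")      # ~48 Gbit/s
    # For comparison, a single ATSC broadcast channel carries ~19.4 Mbit/s.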
  • by w33t ( 978574 ) * on Friday September 15, 2006 @11:42AM (#16113982) Homepage
    That's quite the resolution.

    I wonder: can the human eye even see such a high resolution? Does it even matter at that point?

    According to this page [72.14.205.104] it would appear that each human eye is a 15 megapixel camera.

    If my maths are correct, then 7680 x 4320 works out to about 33 million pixels.

    So then, the question is - does this mean that by adding both eyes together, at best humans have 30 megapixel resolution vision?

    Could this be considered "full human" resolution?
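    (A quick check of the comment's arithmetic, taking its 15-megapixel-per-eye figure at face value:)

        # Ultra HD pixel count vs. the comment's assumed 15 MP per eye.
        uhd_pixels = 7680 * 4320
        print(f"Ultra HD: {uhd_pixels / 1e6:.1f} megapixels")  # ~33.2 MP
        print(f"Two 15 MP eyes: {2 * 15} MP")                  # the comment's upper bound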
  • 40 years ago!? (Score:3, Interesting)

    by Buddy_DoQ ( 922706 ) on Friday September 15, 2006 @11:48AM (#16114043) Homepage
    "When we designed HDTV 40 years ago..."

    Whoa! 40 Years ago!? Amazing! Crazy how long it took to go public/mainstream. I guess it's one thing to design something and quite another to build upon it.
  • Re:Goddamnit... (Score:3, Interesting)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday September 15, 2006 @12:01PM (#16114182) Homepage Journal
    I just want to make a serious comment here: this is precisely what has happened with both video and now with digital video. 8mm film, when transferred to video, lost a lot of information, and that had the effect of smoothing out blemishes. Shooting direct to video meant a lot less was lost, and you saw a lot more pimples. Now digital video has brought us another level of nastiness, because splotchiness in an image is even more pronounced when you've got artifacting going on. We have DCT-compressed DV, which is then decompressed and processed, and recompressed as MPEG2 at a different bitrate (and probably in a substantially different format). So at once you get the clarity of DV and the splotchiness of recompressed MPEG, and every pimple, blackhead, scar, and abscess one's had since birth stands out in living color.
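    (A toy illustration of the generation loss described above, using repeated JPEG re-encoding with Pillow; JPEG stands in for the DCT-based DV/MPEG2 chain here, and "frame.png" is a hypothetical input file:)

        import io
        from PIL import Image

        img = Image.open("frame.png").convert("RGB")   # hypothetical source frame
        for generation in range(10):
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=75)   # lossy encode
            buf.seek(0)
            img = Image.open(buf)                      # decode; artifacts accumulate
        img.save("frame_gen10.jpg")                    # visibly splotchier than the source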
  • by KillerBob ( 217953 ) on Friday September 15, 2006 @12:03PM (#16114195)
    The pixel density is higher than the eye can resolve unless the screen takes up your full field of vision. But the other thing to keep in mind is that your eyes are essentially two cameras working in parallel. We subconsciously interpolate the information they send to create depth, but we also subconsciously interpolate the data to increase resolution (and sharpen the image). Pick something in your room and look at it (take off your glasses if you wear them). It's relatively in focus, depending on how bad your prescription is. Now close one eye, then the other. Notice that with both eyes open the focus is better than with either eye closed, and it doesn't matter which eye is closed for that effect, even if, like me, you have one near-sighted eye and one far-sighted eye. (My right is -0.50, my left is +0.25.)

    I don't know the exact numbers, but we'll use the figure of 15 megapixels per eye... just because a single eye is 15 MP doesn't mean that both eyes working in tandem give you 30 MP. In astronomy, you can drastically increase the resolution of a picture by taking a dozen pictures spread out over a large area. If they're taken at the same time, then you can interpolate the missing data and produce a *really* high-resolution picture. I'd be surprised if we aren't subconsciously doing the same thing with our eyes.
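    (A minimal sketch of that stacking idea with NumPy: several coarse, sub-pixel-shifted samplings of one signal are dropped onto a finer grid. This is a toy analogue of astronomical image stacking, not a real pipeline:)

        import numpy as np

        rng = np.random.default_rng(0)
        fine = np.sin(np.linspace(0, 8 * np.pi, 400))  # the "true" scene
        factor = 4                                     # one coarse pixel = 4 fine pixels

        recon = np.zeros_like(fine)
        for shift in range(factor):                    # one exposure per sub-pixel offset
            view = fine[shift::factor]
            samples = view + rng.normal(0.0, 0.05, view.size)  # noisy coarse exposure
            recon[shift::factor] = samples             # drop onto the fine grid

        # A single exposure covers 100 of the 400 fine positions; four
        # shifted exposures between them cover all of them.
        rms = np.sqrt(np.mean((recon - fine) ** 2))
        print(f"RMS error of the combined frames: {rms:.3f}")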
  • by Nimey ( 114278 ) on Friday September 15, 2006 @01:07PM (#16114726) Homepage Journal
    FWIW my eyes do not cooperate[1], so I do not see depth the way most people do. I can see movement well enough, but my brain has to do wetware emulation to figure out how far away something is, and close up it sucks. As a result I can't catch a ball, though I can estimate how far away a moving car is; it helps if I know roughly what size the object is, and I have to rely on visual context.

        To emulate how I do it, just close one of your eyes and do things that way. I can see out of the other eye, of course, but the brain treats it as peripheral vision unless I'm using it to focus on an object -- I can swap which eye I use to focus at will.

    [1] I was born with one of my eye muscles screwed up, so I was the opposite of cross-eyed.
  • by twistedsymphony ( 956982 ) on Friday September 15, 2006 @01:47PM (#16115059) Homepage
    IIRC 1080p (at least for consumer use) only does up to 30 FPS due to bandwidth limitations of component video and HDMI... not exactly "super fast"

    I was also under the impression that theaters originally settled on a lower frame rate to shrink the size of the film reels. 24 FPS was the slowest (and thus cheapest and smallest) they could go before the reduced quality made the image look choppy. In my experience people prefer digital theaters to their film counterparts for many reasons, and even if they don't notice the quality difference, with a digital projector moviegoers don't have to watch a strobe light for two hours.
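    (A back-of-the-envelope check of the bandwidth point, assuming uncompressed 24-bit video; real links add blanking and encoding overhead on top of these figures:)

        def raw_bitrate(width, height, fps, bits_per_pixel=24):
            """Raw active-video bit rate in bits per second."""
            return width * height * bits_per_pixel * fps

        for fps in (24, 30, 60):
            gbps = raw_bitrate(1920, 1080, fps) / 1e9
            print(f"1080p{fps}: {gbps:.2f} Gbit/s of active video")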
  • no. (Score:3, Interesting)

    by YesIAmAScript ( 886271 ) on Friday September 15, 2006 @03:25PM (#16115888)
    I've been paying attention too. I first saw HDVS in 1988. I never saw Hi-Vision, the first Japanese analog broadcast standard.

    No, there was never a 1080P analog broadcast standard in the US. There never was any serious attention paid to delivering HDTV over the air in the US until digital compression came around. This is because it was expected to take 5 regular channels to send one HD channel. At this point it became a war between compressed 720p and compressed 1080i.

    Both were considered the best that could be done correctly on a single 6 MHz channel (roughly 19.4 Mbit/s with ATSC modulation). Both contain roughly the same amount of information, and that's no accident; a quick pixel-rate check follows this comment.

    As to your cable conspiracy, the FCC left cable alone. They didn't mandate must-carry for digital local channels. Additionally, note that cable uses the FCC-endorsed ATSC standard and that HD was not even available over cable until after it was available OTA. The FCC was in no way waiting for cable to take up the slack.

    You're right that content providers decided they'd rather do 4 SD channels than one HD channel. Because of this, the FCC put in place some crazy rule saying that if broadcasters provide additional content on those alternate channels that is not on the main channel, they must return the revenue derived from it. I don't know if the rule is even enforced, but because of it, the alternate channels in my area are all either PBS, commercial-free content (often just weather radar or rolling news), or identical to the main channel except in format.

    This was because these providers were not charged for this additional bandwidth, and the FCC didn't want the TV stations essentially reselling it and competing against the FCC in bandwidth sales. This came into play after a few broadcasters floated the idea of putting data on the additional channels instead of TV and selling it to pager or data providers, like the Microsoft "spot" watches.

    HDMI and HDCP are not FCC-mandated, and they are not required to view OTA ATSC content. Even a bar on recording is not in place, since there is no broadcast flag now. Oddly, the broadcast flag never technically barred recording; it merely said that any device capable of receiving the flag must preserve it when exporting the content outside the box.

    Yes, there is plenty of protection on Blu-ray/HD-DVD, and you'll maybe have trouble recording HBO. But neither of those falls under the FCC's mandate or the public airwaves.
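    (The quick pixel-rate check referenced above: active pixel throughput of 720p at 60 frames/s versus 1080i at 30 frames/s. The two come out within about 12% of each other:)

        p720 = 1280 * 720 * 60    # progressive: 60 full frames per second
        i1080 = 1920 * 1080 * 30  # interlaced: 60 fields = 30 full frames per second
        print(f"720p60 : {p720 / 1e6:.1f} Mpixels/s")   # ~55.3
        print(f"1080i30: {i1080 / 1e6:.1f} Mpixels/s")  # ~62.2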
