Ultra HDTV on Display for the First Time 314

fdiskne1 writes "According to a story by the BBC, the successor to HDTV is already out there. The resolution? 7680 x 4320 pixels. Despite the 'wow' factor, the only screens capable of using Ultra High Definition Television are large movie screens, and no television channel has the bandwidth needed for this image. Some experts, in fact, say the technology is only a novelty. Until the rest of the necessary technology catches up, the only foreseen use for Ultra HDTV is in movie theatres and museum video archives." From the article: "Dr. Masaru Kanazawa, one of NHK's senior research engineers, helped develop the technology. He told the BBC News website: 'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.' As well as the higher picture resolution, the Ultra HD standard incorporates an advanced version of surround sound that uses 24 loudspeakers. "
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Cyno01 ( 573917 ) <Cyno01@hotmail.com> on Friday September 15, 2006 @11:38AM (#16113939) Homepage
    And I just bought an HDTV last week.
  • An excuse to not upgrade to Blu-Ray or HD-DVD!

    Now I won't have to lose geek credibility when I say SD is "good enough."
  • The device (Score:5, Funny)

    by also-rr ( 980579 ) on Friday September 15, 2006 @11:39AM (#16113958) Homepage
    Also requires blood to be sampled, and only one life form to be detected in the room, before it allows you to play your DNA-protected version of "Star Wars IV - Remix 92 - The Jedi Beat The Terrorists (2020 release)".
    • Re: (Score:2, Insightful)

      by iainl ( 136759 )
      Mind you, in the '77 release, the Jedi _are_ the terrorists. Lucas seems to have got something of a bone to pick with Bush, judging from the heavy-handed subtext of the prequels, too.
      • Mind you, in the '77 release, the Jedi _are_ the terrorists. Lucas seems to have got something of a bone to pick with Bush, judging from the heavy-handed subtext of the prequels, too.

        The outline of the story of nine movies was written before any of them were shot. Lucas picked the middle of the story partly because he felt it was the only portion that could be carried off successfully with the technology of the day, and partly because he felt it would be the most palatable.

        This isn't to say that the

        • by Golias ( 176380 )
          The outline of the story of nine movies was written before any of them were shot. This is a myth based on some comments Lucas made during some interviews regarding Empire

          The truth is, he wrote a long movie which started in the middle to feel like a Saturday serial, and upon realizing it was too long to shoot, he took the first third of his idea and created Star Wars.

          Empire and Jedi did not follow the remaining script ideas which he had written to the letter. For example, the scenes with the Ewoks were orig
          • Re: (Score:3, Informative)

            by Enderandrew ( 866215 )
            Exactly. If you read his original Journey of the Whills scripts, it is readily apparent he had no master plan, or at the very least never stuck to it.

            Han Solo was a late addition who, up until shooting, was supposed to be killed by Jabba. Star Wars was supposed to be much darker. He used WWII films like Dam Busters as inspirations, as well as Kurosawa's Hidden Fortress. Both of those he credits. But he also lifted heavily from Dune, which he doesn't credit.

            The early drafts focus more on the spice trade
            • Re: (Score:2, Informative)

              by operagost ( 62405 )
              George Lucas even admits the sibling relationship between Luke and Leia was a very last minute thing while shooting Jedi, because he didn't know how Vader would anger Luke during their final duel. So at the last second, he invented that relationship.
              That's funny, because this was first hinted at, then later revealed, in ESB.
            • Although I'm not a huge Star Wars fan, I find that a lot of the credit for the "master plan" can be attributed to the authors of the Star Wars novels. They're the ones with the creative drive to continue the story they love. I think that if episodes 7, 8, and 9 ever get made, they should be written by Michael A. Stackpole, who seems to generally know what he's doing.
        • Re: (Score:3, Insightful)

          by jskiff ( 746548 )
          The outline of the story of nine movies was written before any of them were shot

          Does anyone actually believe this anymore? It became patently obvious by Return of the Jedi that Lucas was making up the story as he went along. What other reason would there be to reuse the plot of the first film? Blow up the Death Star? Where have I seen that before?

          Or perhaps he actually meant, all along, for Luke and Leia to be siblings and for Anakin to have built C-3PO. That thing about Leia remembering her moth
    • Re: (Score:2, Funny)

      by operagost ( 62405 )
      Unlimited free viewings will be allowed if it detects a high level of midichlorians in the viewer's blood.
  • by phpWebber ( 693379 ) on Friday September 15, 2006 @11:40AM (#16113967)
    I want to see if it looks better than my computer monitor resolution.
  • 640 (Score:2, Funny)

    640 (by 480) ought to be enough for anyone...
  • hrmph! (Score:3, Funny)

    by B5_geek ( 638928 ) on Friday September 15, 2006 @11:41AM (#16113978)
    The inventors were overheard as saying: "Big deal. IT'S THE CONTENT, STUPID!"

    At least books will always have a higher (mental) resolution; it's too bad nobody reads anymore.

  • However, it is unlikely to be available to the public for at least 25 years.

    Whoah, /. posted a story that is ahead of time!
  • by w33t ( 978574 ) * on Friday September 15, 2006 @11:42AM (#16113982) Homepage
    That's quite the resolution.

    I wonder, can the human eye even see such high resolution; does it even matter at that point? I mean,

    According to this page [72.14.205.104] it would appear that each human eye is a 15 megapixel camera.

    If my maths are correctish then 7680 x 4320 is 33 million pixels.

    So then, the question is - does this mean that by adding both eyes together, at best humans have 30 megapixel resolution vision?

    Could this be considered "full human" resolution?
    • I wonder, can the human eye even see such high resolution

      Not all at once, of course. The point of having higher resolution is so that when you attend to one part of the scene, you won't perceive a reduction in image quality.

    • Re: (Score:3, Insightful)

      by interiot ( 50685 )
      If you had a display that wraps completely around you, eg. "surround vision", then you certainly couldn't look at the entire display at one time, so it would be reasonable to have media that carried more data than the human eye can see.
    • That is true, but by that time scientists will have increased the resolution of the human eye to 66 megapixels.

      Rather like the way scientists increased the speed of light in Futurama.
    • by pz ( 113803 ) on Friday September 15, 2006 @12:37PM (#16114484) Journal
      Could this be considered "full human" resolution?

      IAAVN (I Am A Visual Neuroscientist). The answer to your question is, "no." The article pointed to, which claims 15 million pixels, specifically states that the pixels are variable resolution. The photosensors in the central part of human (and primate) vision are packed at a much, much higher resolution than those at the periphery. The standard resolution in central vision for people with 20/20 vision is about 3 minutes of arc; at 3 degrees away from the fovea, this drops to 1/2 that figure; and at only 20 degrees eccentric (about two fist widths held at arm's length), it's at 1/10. (If you've never heard that vision is variable resolution, try this trick: open a book or newspaper and stare at a single word in the middle of a paragraph; then, without moving your eyes, see how far to the left, right, up and down you can read. You will find that the limits are astonishingly narrow. Evenly sampled high-resolution vision is a powerful illusion based on the extreme resolution we have in the central part of vision, the ability to move our eyes, and some incredible circuitry in our brains.)

      More importantly, saying you have N by M pixels alone doesn't give visual resolution, it gives object resolution: it is not possible to resolve individual pixels in an 8x10 photo printed at VGA resolution held 10 meters away, despite the relatively low resolution of the image. It is necessary to know not only the resolution of the image but the viewing distance as well to be able to say if the combination approaches the limits of human vision.
      • Re: (Score:3, Funny)

        by stonecypher ( 118140 )
        it is not possible to resolve individual pixels in an 8x10 photo printed at VGA resolution held 10 meters away

        This is Slashdot, where a legitimate reply involves the word "binoculars."
    • I don't understand why a digital HD video spec needs to include limits on the H/W of the image in the first place. Shouldn't it just be arbitrary? It doesn't exactly take revolutionary technology to scale digital video to a given screen size & manage the differences in aspect ratio.

      Am I missing something here? I can play 1080i MP4 files on a 720p screen. Ignoring the lack of source material for a minute, I could hypothetically have a file @ 2880 * 1620 and still play it on my current setup. Further, if
    • Re: (Score:3, Informative)

      by 4D6963 ( 933028 )

      It's not that simple, mainly because the human eye's resolution isn't uniform. Basically, because of the fovea [wikipedia.org], in the center of our vision we have an area about 2 large (4 times the apparent diameter of the moon) offering a resolution of about 28" (seconds of arc) within that area, with the resolution outside it being lower. Since it was projected on a 7 x 4 meter screen [cdfreaks.com], each pixel is about 0.9 mm x 0.9 mm.

      Which means that if I got my maths right, you would have to be 6.94 meters (almost 23 feet) away

      • by 4D6963 ( 933028 )

        we have an area about 2 large

        Snap! I meant 2 degrees of arc but the degree symbol didn't come out right...

    • by Kjella ( 173770 )
      If you read slightly more, you'll see that the eye is extremely sharp in the center, then quite fuzzy towards the edges. If you have *any* sort of imperfection in your eye, your optician will concentrate on getting the center right. Scroll down to table 1 for a good overview:

      60" HDTV at 8':
      Resolution: 1920×1080
      Eye limit: 1.17M pixels
      Display limit: 2.07M pixels
      Eye & display limit: 1.06M pixels

      Why the difference? Because you have .11MP dead center your TV can't show, and 0.90MP around that the eye ca
    • It seems useless, unless we radically change the way we use TV. I've seen quite a lot of high-def video where the end product looks worse than regular-definition television. When I'm watching some news reporter talk to me from the White House lawn: regular television shows me a picture of a news reporter in front of an iron fence. HDTV shows me a news reporter, with smudged makeup and lint on his collar, in front of an iron fence that has bird droppings on it. Sometimes you don't want the extra detail. At a m
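
A minimal, illustrative sketch of the pixel-count and viewing-distance arithmetic discussed in the thread above. The 0.9 mm pixel pitch and the ~28 arcsecond foveal acuity figure are assumptions taken from the comments; depending on the exact values chosen, the critical distance lands in roughly the 6.5 to 7 m range the posters mention.

```python
import math

# Total pixel count of the 7680 x 4320 UHDTV format under discussion.
WIDTH, HEIGHT = 7680, 4320
print(f"UHDTV: {WIDTH * HEIGHT / 1e6:.1f} megapixels")  # ~33.2 MP

def distance_for_acuity(pixel_pitch_m: float, acuity_arcsec: float) -> float:
    """Distance at which a pixel of the given pitch subtends the given angle,
    i.e. roughly where individual pixels stop being resolvable."""
    return pixel_pitch_m / math.tan(math.radians(acuity_arcsec / 3600.0))

# Assumed figures from the comments: 0.9 mm pixels on a ~7 m wide screen,
# and ~28" (arcseconds) of foveal acuity.
print(f'0.9 mm pixel @ 28": {distance_for_acuity(0.0009, 28):.1f} m')  # ~6.6 m
```
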
  • by windowpain ( 211052 ) on Friday September 15, 2006 @11:44AM (#16114011) Journal
    The article says we might start to see these UHDTV sets in about 25 years. Although SDTV can be said to have started in the 1920s or '30s, practically speaking it's about 55 or so years old as the transition to high definition picks up steam. (2006 will be the first year more high-definition sets than standard-definition sets are sold in the US.) With the rate of technological change and Moore's law, it seems reasonable to me that the next generation will arrive in about half the time SDTV lasted.
    • Re: (Score:2, Informative)

      --insert obligatory slashdot reply here saying moore's law doesn't have anything to do with tv resolution--
    • by tlhIngan ( 30335 )
      That's about right - they'd been talking about HDTV since the '80s. It's just that only around now has technology reached the point where HDTV is practical. (Wasn't the original HDTV rollout years something like 1997, 2000, 2003, and so on until technology became cheap and available?)

      TVs are getting cheap (what you paid for a "big screen" TV back in the 1980s would get you a nice HDTV these days, IIRC), content is starting to become available (as equipment etc. becomes much cheaper), bandwidth usage remains si
      • by Catbeller ( 118204 ) on Friday September 15, 2006 @12:29PM (#16114417) Homepage
        "It's just that only around now has technology reached the point where HDTV is practical. (Wasn't the original HDTV rollout years something like 1997, 2000, 2003, and so on until technology became cheap and available?)"

        I'm probably the only one here who is 1) old enough to remember, and 2) actually paying attention to the HDTV fiasco from 1985 onwards.

        Analog HDTV was rolled out in Japan in the 1980s. A bit stung, the American television manufacturers and the networks hammered together a proposal to broadcast 1080p in the following way: standard def over the usual VHF channels, while the HD component would be broadcast over unused channels. Thus, Channel 2 CBS would go out as normal, while an HDTV set would take that signal and add information broadcast over channel, say, 3. All analog. All broadcast. The rollout would have been around 1990 or so.

        A funny thing happened. Digital video. The broadcasters saw what digital compression could do for them. Why just one channel, using all that bandwidth, when we can now use the same two channels and broadcast 4 programs simultaneously? We promise that sometimes we'll broadcast in HD; just most of the time, we'd like to make more money with four low-def channels. And they demanded, and got, 1080i, halving the signal and thereby making room for more channels on the side.

        And their wish was granted. These were the years of no-regulation, after all. The issue of public ownership of the airwaves was going bye-bye, and the government would like to auction off those frequencies anyway, which leads us to

        Cable. Since so much programming was going over cable, the Gov decided that public regulation of public airwaves was silly and undermining competition. So long Fairness Doctrine, so long limits on corporate ownership and monopoly control. And so, additionally, why force public airwaves to go digital when cable could deliver it so much better?

        And network TV didn't really want to pay to upgrade, either, so that slowed it down a lot. Delay after delay...

        THEN the kicker. The "content owners" saw that in the digital age they had a chance to lock down signals and force people to pay each time they accessed their "property". They wanted taping to go away as well -- they hated VCRs and almost killed the tech in 1984. They could win this one, and so was born the Broadcast Flag, a digital lock on transmissions that controlled the use of the program. Cue a big delay as HDMI, HDCP and all the other locks were developed and approved by the "content" industry.

        Now... it's the 21st century, almost 20 years late, and we've got crappy 1080i signals going over the air, infomercials clogging all those channels we can access for free, and we can't record the standard 1080i signal.

        Remember, the public airwaves are supposed to belong to we the people, and the broadcasters and producers are supposed to dance to our tune. Somehow they are now the masters, and we are the ones begging for mercy.
        • no. (Score:3, Interesting)

          I've been paying attention too. I first saw HDVS in 1988. I never saw Hi-Vision, the first Japanese analog broadcast standard.

          No, there was never a 1080P analog broadcast standard in the US. There never was any serious attention paid to delivering HDTV over the air in the US until digital compression came around. This is because it was expected to take 5 regular channels to send one HD channel. At this point it became a war between compressed 720p and compressed 1080i.

          Both were considered the best that coul
  • Typical (Score:4, Funny)

    by Yahweh Doesn't Exist ( 906833 ) on Friday September 15, 2006 @11:45AM (#16114021)
    you wait 40 years to upgrade and a week later you're obsolete.

    what I hate about TV is how the specs are so hardware-dependent. all kinds of numbers and letters and if it differs by 1 character your thousands of dollars might have been wasted.

    imo it should be more like computers: you basically have a processor that determines your data processing and a display device that determines your viewable resolution. almost everything else is software and thus improvements are continuous and ongoing. it's a much better model than upgrading every couple of decades, with a half-decade period when your TV is too good for the signal.

    once TV is based on more internet-like digital technologies this will hopefully happen.
    • once TV is based on more internet-like digital technologies this will hopefully happen.

      And then we will have movies starting with the message "This movie will be enjoyed most on a 1024x768 TV." And if your TV has a higher resolution, the movie will be shown in a small rectangle in the middle.

      At least that would be the analog to many of today's web pages.

      • no, that's exactly how TV works.

        with computers you can handle any source your CPU is up to and can always click full screen.

        and you have source options; ever watched a trailer from Apple? you get the choice of small, medium, large and HD res.

        with TVs it's either HD or not. and HD versions are often completely different channels. it's lame, just like those "+1 hour channels" are lame - if you had decent scaling/shifting abilities in the first place you could do 10x as much 10x as easily.
  • Pity (Score:3, Insightful)

    by suv4x4 ( 956391 ) on Friday September 15, 2006 @11:47AM (#16114030)
    Pity money and time are spent on increasing the specs of something that is already in abundance.

    As technology matures there's a race for bigger, faster, and finer. But this race is not eternal: in a few years the sweet spot is hit and people are not interested in higher resolutions.

    With TV resolution this sweet spot is already somewhere between DVD and EDTV, way below 1080p. So yeah, don't expect "technology to catch up" in that respect, as the summary suggests, since no one cares for it to catch up in this way.
    • by Ctrl-Z ( 28806 )
      Correct me if I'm wrong, but isn't DVD the same resolution as EDTV, 480p? The wikipedia article on enhanced-definition television [wikipedia.org] says that DVD is at the lower end because it isn't capable of a 60 Hz frame rate (480p60). So you're basically saying that 480p is the "sweet spot". I beg to differ.
  • 40 years ago!? (Score:3, Interesting)

    by Buddy_DoQ ( 922706 ) on Friday September 15, 2006 @11:48AM (#16114043) Homepage
    "When we designed HDTV 40 years ago..."

    Whoa! 40 Years ago!? Amazing! Crazy how long it took to go public/mainstream. I guess it's one thing to design something and quite another to build upon it.
  • bandwidth (Score:2, Insightful)

    by Raleel ( 30913 )
    So, did I do my math right?
    x * y * bytes per pixel * frames per second gives bytes per second; divide by 1024 for KB, by 1024 again for MB, and by 1024 again for GB.
    7680 * 4320 * 3 * 25 / 1024 / 1024 / 1024 = 2.3174 gigabytes per second

    that's quite a chunk for streaming video. of course, there will be compression techs and other tricks, but that's pretty impressive.
    • Re: (Score:2, Informative)

      by hattig ( 47930 )
      I expect that video will be 5 bytes per pixel by the time this comes out - already the latest version of the HDMI specification allows for 36 bits per pixel, which would require 5 whole bytes.

      So 7680 x 4320 x 5 at 60fps = 9.3GB/s.

      Another comment said that this was 25 years away, although I wouldn't be surprised if it was only 15 years away the way things are progressing. 9.3GB/s is offered on even low-end graphics cards these days, but the bandwidth problem is between the player and the display, i.e., the HDMI eq
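
A minimal sketch of the raw-bandwidth arithmetic in this sub-thread, assuming fully uncompressed frames at the byte depths and frame rates the two posters chose (3 bytes/pixel at 25 fps, 5 bytes/pixel at 60 fps); real links would use compression, so these are illustrative upper bounds only.

```python
# Uncompressed data rate for a 7680 x 4320 stream; "GB" here means GiB
# (1024^3 bytes), matching the divisions by 1024 in the parent comment.
WIDTH, HEIGHT = 7680, 4320

def raw_rate_gib_per_s(bytes_per_pixel: int, fps: int) -> float:
    return WIDTH * HEIGHT * bytes_per_pixel * fps / 1024**3

print(f"3 B/px @ 25 fps: {raw_rate_gib_per_s(3, 25):.2f} GiB/s")  # ~2.32, as computed above
print(f"5 B/px @ 60 fps: {raw_rate_gib_per_s(5, 60):.2f} GiB/s")  # ~9.27, close to the 9.3 figure
```
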
  • This format may also be useful for showing the often missing moon landing movies.
  • Now I just have to wait for it to come down in price and I'll finally be able to have an HDTV LCD that displays ALL 3 common resolutions without doing funky scaling tricks
    1080i/p pixels = 4x4 pixel block of real physical pixels, 720p pixels = 6x6 block of physical pixels, 480i/p pixels = 9x9 block of physical pixels.
    1080i PiP is still 1080i at 1/4 of the screen; no downscaling.

    THIS folks is what I've been waiting for ever since HDTV was announced.

    If only I could afford to be an early adopter on this technol
    • by iamacat ( 583406 )
      This will look pathetic. You want uniform blurring rather than square pixels. That's the reason low-resolution CRTs and LCDs look way better than high-resolution ones in low-res mode. Either HDTVs have killer video processors or the SD picture must look pretty lousy. I wonder if that's most of their sales appeal anyway, as I don't have any quality complaints watching a DVD from the other end of the living room.
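
A quick check of the integer-scaling arithmetic in the comment above this reply: 4320 lines divide evenly by all three common vertical resolutions, which is where the 4x4 / 6x6 / 9x9 block figures come from (the horizontal factors depend on which source widths you assume, so only the line counts are checked here).

```python
# Vertical integer scale factors onto a hypothetical 4320-line panel.
PANEL_LINES = 4320

for src_lines in (1080, 720, 480):
    factor, remainder = divmod(PANEL_LINES, src_lines)
    assert remainder == 0  # each source line maps to a whole number of panel lines
    print(f"{src_lines} lines -> {factor} panel lines per source line")
```
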
  • Say it with me: HOLODECK

    Give me gyroscopes and holograms! It doesn't matter if that Klingon bat'leth is UHD resolution...all you'll see is a low-res blur before your very high res intestines spill out before you!

    (Of course, those aren't really your intestines, but this holodeck goes for intensity in imagery.)
    • (Of course, those aren't really your intestines, but this holodeck goes for intensity in imagery.)

      Since the safety protocols have no doubt broken/been bypassed/been shut off/been overridden by a rogue AI inside the holodeck program itself, those are, in fact, really your intestines.
  • by mcai8rw2 ( 923718 ) on Friday September 15, 2006 @11:54AM (#16114114) Homepage
    Haha! That resolution sounds flippin' great... but I can see it now:

    "Ken Kutaragis' head announces that the "playstation 14" ships WITHOUT the foot wide ultra-ultra-ultra-HDMI cable."


    Meanwhile, CmdrTaco [deceased] writes on how the PlayStation's "neural implant connect-kinetic extremity dongle [N.I.C.K.E.D.]"... was "actually just a rehash of the Wiiiiiiiiiiiiiiiiiii's controller."
  • They finally have some still cameras that can usefully photograph at such huge megapixel counts, but getting a video camera to shoot more than 20 frames per second is a different story. Besides, it's been my experience, and the experience of many, that the real bottleneck in digital cameras is no longer the pixel count but the optics themselves. By which I mean that pictures at 30Mpix will not look better than pictures at 7Mpix, even with upmarket still-camera optics these days.

    What's more, while all the el

    • It's not the precision of the optics that matters*, it's the size. You could support 31 megapixels if you were willing to put a big enough aperture on the front of the camera. Look at the lens of an HD TV camera sometime - it's much wider than the lens on an SD TV camera. The extra information is captured by widening the light intake, rather than perfecting the light intake for a given width.

      *This statement is to be taken strictly in the context of improving on current optics for higher resolutions. Obviously, the
    • How about removing the optics entirely, and post-processing the image data into viewable form?
  • Also... (Score:2, Funny)

    The $3000 version of the PS4 is built specifically for Ultra HDTV! Pre-order now!
  • Ramping up pixel count is like Detroit building bigger and bigger engines: 380 cubic inches, baby, vroom vroom. Hope they improve the dynamic range of these screens, which is a pathetic 1000:1 for most of them; the human eye manages about 1,000,000:1. Also, why not some real depth perception too?
  • I'm serious when I say this, so bear that in mind as you snicker... ...but: what's the challenge, here? What's the innovation? If you're not worried about how you're going to fit it into an existing transmission medium (that is, they obviously aren't worried about sending OTA on a TV channel), then what's the challenge to designing a higher-resolution spec?

    How is this different than me defining a video spec that operates at 1048576 x 589824 pixels x 120 fps, non-interlaced? Is it just that they spent the mo
  • Just wait a few more years for WHUXGA...

    From http://en.wikipedia.org/wiki/HUXGA [wikipedia.org]
    WHUXGA 7680×4800 16:10 37M

    WHUXGA, an abbreviation for Wide Hex[adecatuple] Ultra Extended Graphics Array, is a display standard that can support a resolution up to 7680 x 4800 pixels, assuming a 16:10 aspect ratio. The name comes from the fact that it has sixteen (hexadecatuple) times as many pixels as a WUXGA display. As of 2005, one would need 12 such displays to render certain single-shot digital pictures, for
  • This is where the law of diminishing returns kicks in...

    Many people, when shown a 60" screen at a reasonable viewing distance, can't tell the difference between 720p and 1080p. The added resolution of UHDTV would only be of benefit on a large movie screen from a close viewing distance. But movies implemented the ideal screen resolution decades ago... It's called film.
    • While it is true that resolution alone in this thing is pretty pointless, it does make for interesting possibilities, like being able to zoom in on a part of a scene or doing some neat post-processing magic with some of the data. You could for example make a movie where the entire scene is a whole island or whatever, and the viewers can zoom in on different parts of it, with the audio changing according to location. This UHDTV resolution wouldn't be enough for that though, but something on a smaller scale w
  • I think it is completely reasonable to have this kind of technology in movie theaters. The whole concept of a movie theater is that it is an expensive experience that cannot be replicated at home. If you have HD-DVD at home plus a large-format HD display or a projector, then what is the point of going to the movies? I think it is OK for theaters to invest in a technology that makes the answer to the previous question something other than "none".
  • Film (Score:3, Informative)

    by GWBasic ( 900357 ) <`slashdot' `at' `andrewrondeau.com'> on Friday September 15, 2006 @12:17PM (#16114319) Homepage
    Ultra-HDTV's resolution is comparable to 35mm and 70mm film. This will probably be what's adopted when digital projection becomes mainstream in theaters.
  • I don't know why we don't already just use 3 screens across at just 1600x1200 (UXGA) all the time. We look at the middle screen for detail, and the side screens fill our peripheral vision. It seems like mounting a bezel on the screens that pushes out past their frames to join in a seam up front is a lot easier than making a really big panel. And the lower detail demand for the side screens could mean the driving boards don't need to triple the UXGA performance.

    I'd expect this kind of rig to already be stand
  • ...in the same way that high-resolution still images are -- they might not fit in their entirety on the screen, but it enables you to zoom in on details. If your processor is fast enough to keep up with all the data at all, that is.
  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Friday September 15, 2006 @12:28PM (#16114414)
    Comment removed based on user account deletion
  • First of all, it is a TV, with moving pictures. We need less resolution for flickering frames than we need for static images. Secondly, 5-6MP are enough if you look at the whole picture (literally); the final size does not matter, be it a postcard or a building.
    It's only when you want to examine small details of the picture that you need more pixels, which is why landscape photographers use all the megapixels they can get. But for TV or projection, it is just plain silly.
    The coolness factor is high, though.
