
HDMI Spec Upgraded To Support 'Deep Color' 142

Posted by timothy
from the blush-in-high-fidelity dept.
writertype writes "If you own a digital television, there's a good chance it supports HDMI as an A/V interface. Well, for all you early adopters who bought an HDMI-less TV and regretted it later, the HDMI spec has been upgraded yet again, to version 1.3. Features include "deep color" (color depths beyond what the human eye can perceive) and eight-channel audio support, among others. Interesting note: the PlayStation 3 supports deep color, according to the HDMI chief."
  • by reklusband (862215) on Monday June 12, 2006 @04:32PM (#15519362)
    Does it support Deep Purple? Inna gadda davida baby! 8 CHANNELS AND DEEP PURPLE!!!!
  • by Volante3192 (953645) on Monday June 12, 2006 @04:32PM (#15519364)
    Unfortunately, due to unforeseen copyright issues, all colors between Deep Green and Deep Violet will be subjected to a licensing fee.

    IBM was unreachable for comment.
  • Huh? (Score:5, Insightful)

    by sexyrexy (793497) on Monday June 12, 2006 @04:33PM (#15519367)
    Sorry for stating the obvious, but don't color depths beyond what the human eye can perceive just seem really... pointless? I don't think the human eye is going to evolve to greater color sensitivity during HDMI's lifetime. It's one thing to have a higher quality image to downsample to, but... seriously. Isn't there SOMETHING the bandwidth could be used for besides information we can't use?
    • Sorry for stating the obvious, but don't color depths beyond what the human eye can perceive just seem really... pointless? I don't think the human eye is going to evolve to greater color sensitivity during HDMI's lifetime. It's one thing to have a higher quality image to downsample to, but... seriously. Isn't there SOMETHING the bandwidth could be used for besides information we can't use?

      i'd have to agree with that. allowing audio that is above or below human hearing has a purpose, as you can feel extremely deep tones, but colour is only visual, so there is no other sense to fall back on.
      • I bet ya the companies wish they could shake the hand of the scientist who wrote the paper / proof showing that there are more colors than the human eye can see! Thus, they have another feature to push for their product ... there are so many colors in this TV that you can't even see them all!!!!!
      • by unitron (5733)
        "i'd have to agree with that. allowing audio that is above or below human hearing has a purpose, as you can feel extremely deep tones, but colour is only visual, so there is no other sense to fall back on."

        Well, the infrared will make you feel nice and toasty, and the ultraviolet will let you work on your tan without going outdoors. :-)

    • Re:Huh? (Score:5, Informative)

      by statemachine (840641) on Monday June 12, 2006 @04:36PM (#15519405)
      I figured someone would be confused by this. However, the article explains:
      "The color bit depth [of today's displays] is typically 24-bits RGB - that gets you 16 million colors, and the human eye can distinguish that," Chard said. "That leads to scaling and onscreen effects which you can pick up. Either 36-bit or 48-bit RGB is beyond the ability of the human eye to distinguish."

      Right now your eye can see the color transitions. The point is to make it so you can't see the transitions.
      • It also has to do with ease and accuracy of scaling. Scaling 24-bit color requires more bits than that to accomplish without errors. If you go to higher source bit depths, any degradation will still be beyond what is visible, hence no visible scaling artifacts in the color area.
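The scaling-precision point above comes down to trivial arithmetic; a minimal sketch with illustrative values (not from the article):

```python
# Sketch: why processing 8-bit color needs extra intermediate precision.
# Averaging two adjacent 8-bit samples exactly needs a fractional (9th) bit;
# truncating the result back into 8 bits loses information.
a, b = 100, 101
avg_exact = (a + b) / 2   # 100.5 -- needs a fractional bit to represent
avg_8bit = (a + b) // 2   # 100   -- truncated back into 8-bit range
assert avg_exact != avg_8bit
```

With a higher-precision source (10, 12, or 16 bits per channel), the same rounding still happens, but the error stays below what the eye can pick up.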
    • Re:Huh? (Score:5, Informative)

      by pthisis (27352) on Monday June 12, 2006 @04:39PM (#15519428) Homepage Journal
      With current color depths, you can distinguish the difference between adjacent colors (in some limited portions of the field). By taking it to a depth where differences are imperceptible, you make things look smoother.

      Essentially you want to have your colors go as deep as you need to make differences imperceptible, which this (supposedly) does. After that, going even deeper would be a waste.
      • Re:Huh? (Score:5, Informative)

        by MBCook (132727) <foobarsoft@foobarsoft.com> on Monday June 12, 2006 @04:48PM (#15519492) Homepage
        The best example of this is a gradient. Take your monitor and make a gradient that is full screen from solid red to solid black. As things are now you get 256 bands of color because there are 256 possible values for red. The problem with this is that the transitions are VERY obvious.

        Now if you have 4096 possible values of red, your eye may not be able to perceive the difference between #1024 and #1032 individually. But when you make that large gradient, you will not be able to see the individual bands.

        You've gone from blocky to smooth. Anywhere you want a gradient, this is good. Fading to black, the sky, etc. And let's not forget that this can give us better HDR.

        • Take your monitor and make a gradient that is full screen from solid red to solid black. As things are now you get 256 bands of color because there are 256 possible values for red. The problem with this is that the transitions are VERY obvious.

          Actually, they're not that obvious at all across a 1200 pixel-wide image on my screen. However, knowing that the eye is more sensitive to green shades than red or blue, I tried the same experiment with a gradient from (0,0,0) to (0,255,0) and the steps are clearly visible.

        • "Take your monitor and make a gradient that is full screen from solid red to solid black. As things are now you get 256 bands of color because there are 256 possible values for red. The problem with this is that the transitions are VERY obvious."

          I have a 2.0" monitor, you insensitive clod!

        • Take your monitor and make a gradient that is full screen from solid red to solid black. As things are now you get 256 bands of color because there are 256 possible values for red. The problem with this is that the transitions are VERY obvious.

          Not quite. If you try it you won't see the transitions. But if you apply some kind of histogram-modifying filter to the result, then, yes, you'll see the banding because of truncation of the byte values. In other words, you had 256 values, you now have half that. Pro
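The gradient experiment discussed above reduces to simple division; a sketch assuming a 1920-pixel-wide screen (an assumption; the posters tried various widths):

```python
# How wide each visible "band" of a full-screen red-to-black gradient is
# at a given bit depth. With 8 bits there are only 256 distinct reds, so
# each value repeats across several adjacent pixels.
def band_width(screen_px=1920, bits=8):
    levels = 2 ** bits
    return screen_px / levels

print(band_width(bits=8))   # 7.5 pixels per band: steps can be visible
print(band_width(bits=12))  # ~0.47: more levels than pixels, no banding
```

Once there are more levels than pixels, every pixel column can take a unique value and banding disappears entirely.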

      • is that watermarking becomes easier... so pirate copies of films can be traced, maybe individually.
        • This watermarking would be easily defeated with a posterization filter. Just dither every color to the nearest multiple of 2 or 3, or whatever threshold is necessary. We shouldn't see much perceptible loss in quality, but the watermark would be eliminated.

          Watermarking is actually generally about altering the data stream in a way that is invasive enough that removing it would unacceptably degrade quality. Thus all watermarking will be perceptible, or else it is too easily defeated. I think the rule of th
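A minimal sketch of the posterization idea suggested above; the step size of 4 is a guess at "the nearest multiple of 2 or 3, or whatever threshold is necessary":

```python
# Snap each 8-bit channel value to the nearest multiple of a small step,
# wiping out any watermark hidden in the low-order bits.
def posterize(v, step=4):
    return min(255, round(v / step) * step)

assert posterize(201) == 200  # nudged down to the nearest multiple of 4
assert posterize(203) == 204  # nudged up
```

As the reply notes, a robust watermark has to survive exactly this kind of filtering, which is why real watermarks tend to be more invasive than a couple of low bits.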
      • Essentially you want to have your colors go as deep as you need to to make differences imperceptible, which this (supposedly) does. After that going even deeper would be a waste.

        Of course, reasonable limits aren't. Just because human perception says you don't need to put out gradations twice as precise as human vision to conceal them doesn't mean we won't use other devices to help us perceive more, much like you can use a CCD camera to turn infrared light visible, or that it needs to support more to conce

      • by fossa (212602)

        I don't know if HDMI is using RGB or not, but it's worth noting that RGB wastes bits in places the eye won't notice, leaving fewer bits for places where it will notice. See Greg Ward's page, High Dynamic Range Image Encodings [anyhere.com]

        which discusses perceivable color differences in the context of HDR encodings.
    • Maybe, or maybe not. There are a couple of possibilities here. The article isn't specific enough to say whether either of these is true, so these are just guesses.

      First, consider storing each channel in 8 bits. That gives 256 possible levels. Say our eyes can distinguish 400 levels. 8 bits isn't enough then, so you need 9 bits. But that gives 512 levels, beyond our level of perception. But you can't use 8 1/2 bits for each channel (at least without more complicated encoding).

      Second, say it was easier for some tec
      • BTW, I don't mean to suggest that the 400 number is correct; it's WAY too low. I'm just using it for illustration.

        Also,

        Fifth, you want your cat to enjoy TV more.
        • by hurfy (735314)
          "Fifth, you want your cat to enjoy TV more."

          Awww, see now i can convince the Significant Other to let me get the new TV :)

          Thanks
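The whole-bits constraint from the earlier example can be checked directly; 400 is that poster's deliberately-too-low illustrative figure, not a real measurement:

```python
import math

# If the eye could distinguish 400 levels per channel, 8 bits (256 values)
# would not be enough; you can't use "8 1/2 bits", so you round up to 9,
# which overshoots to 512 values.
levels = 400
bits_needed = math.ceil(math.log2(levels))  # 9
assert 2 ** bits_needed == 512
```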
    • It's one thing to have a higher quality image to downsample to, but... seriously. Isn't there SOMETHING the bandwidth could be used for besides information we can't use?



      It gives extra information for subliminal messaging.

    • In addition to what's already been mentioned by everybody else, this can be used both to increase the granularity of the scale and to increase the dynamic range. If you want to change the contrast/brightness on your display, it's good if your display is not being fed a "cropped" signal as input for that transformation.
      • and increase the dynamic range.

        Dynamic range is the critical part, IMO. Existing display standards are crap in terms of reproducing the full range of colours we can see, because no matter how well you calibrate the monitor, the colour depth itself prevents them from being displayed. IIRC, green suffers the most, but all colours are affected to some degree.
    • So what do you suggest, using 30-bit RGB? This isn't really a bandwidth issue. It is an encoding issue. From TFA:

      The ITU 601 standard, which governs today's displays, allows only 60 to 80 percent of the available colors, even if the display can support more, Chard said. "The color bit depth [of today's displays] is typically 24-bits RGB - that gets you 16 million colors, and the human eye can distinguish that," Chard said. "That leads to scaling and onscreen effects which you can pick up. Either 36-bit or 48-bit RGB is beyond the ability of the human eye to distinguish."
      • It may be that something like 10 bits per channel would more closely match the eye's sensitivity, but it's rather more convenient to encode on byte-boundaries.

        MARGINALLY more convenient, perhaps.

        Before the days of 24-bit color, we encoded RGB colorspace across byte boundaries and we liked it, by gum! 16 = 5+6+5, and you never heard anyone complaining!

        It's really only performing color arithmetic in the ALU that would benefit from having byte-aligned color values -- one less shift instruction to execute per
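The 5+6+5 packing described above can be sketched as follows (hypothetical helper functions, not from any particular API):

```python
# 16-bit "565" color: 5 bits red, 6 bits green, 5 bits blue.
# Nothing is byte-aligned; packing and unpacking is just shifts and masks.
def pack565(r, g, b):  # r, b in 0..31; g in 0..63
    return (r << 11) | (g << 5) | b

def unpack565(v):
    return (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F

assert pack565(31, 63, 31) == 0xFFFF  # full white fills all 16 bits
assert unpack565(pack565(10, 20, 30)) == (10, 20, 30)
```

Green gets the extra bit because the eye is most sensitive to green, which matches the gradient experiment reported earlier in the thread.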
    • Sorry for stating the obvious, but don't color depths beyond what the human eye can perceive just seem really... pointless?

      They're planning for the point where every human on the planet owns an HDMI television and they have to start marketing to insects instead.
      • Re:Huh? (Score:3, Funny)

        by JDevers (83155)
        I can see it now "On Monday, October 13th don't miss the premiere of "Flowers!" filmed in our proprietary ultraviolet format!

        ---while watching "Flowers!" with a UV equipped television, remember to wear sunscreen and sit at least eight feet away from the screen or risk sunburn"
    • It's one thing to have a higher quality image to downsample to, but... seriously.

      That's likely the point. As it is you can cover the whole range, but if you start adjusting the color balance or white balance on the display you're going to throw out some bits and be looking for some more. If you've ever fiddled with the curves in Photoshop in 8 and 16 bit modes and then looked at the histograms you'll know what I mean.
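The curves-and-histogram effect described above can be imitated with a toy contrast stretch (the endpoints 30 and 225 are illustrative; any 8-bit stretch shows the same combing):

```python
# Stretch the range [30, 225] across [0, 255] entirely in 8-bit math.
# Only ~196 distinct inputs feed the full 256-level output range, so some
# output levels are never produced: the histogram develops gaps ("combs").
def stretch(v, lo=30, hi=225):
    return max(0, min(255, round((v - lo) * 255 / (hi - lo))))

used = {stretch(v) for v in range(256)}
print(len(used))  # fewer than 256 distinct output levels survive
```

Doing the same edit in 16-bit mode leaves plenty of spare levels, which is exactly the headroom a deeper transport format preserves end to end.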
    • by c0l0 (826165)
      Isn't there SOMETHING the bandwidth could be used for besides information we can't use?

      Sounds like a great idea! What about meta-data for, say, totally crippling copy-protection schemes?

      Oh, wait...
    • by Goaway (82658)
      Good job going for the obvious bait dangled in front of you by the article writer.
    • You can't see the colors beyond human perception, but the growth of your penis that devices made to this new spec will cause is also beyond human perception, at least if you're wearing pants. What matters is that you can stick your hand in your pocket, and *you* know that it's there. That's all you need.
    • Just you wait mister. My bionic eyeballs are already growing in a petri dish at the back of a noodle house.
    • Where do you think they're going to hide the watermarks for copy protection and/or tracking pirate rips to their source? The best place, considering the common user/casual pirate, is directly in front of them where they can't see it. The pros will find a way around it, they always do, but Jimbob ain't gonna be sharing his new DVD over the internet without someone tracking it back to him somehow.
    • You're shortsightedly ignoring the long term benefits... How are we ever supposed to evolve beyond our current levels of perception unless our primary visual input encourages deeper color depth as a positive survival trait?
    • This is the same as the frame rate of the human eye. We can see around 16 frames per second, and anything below or near this is perceived as flickering. By bumping it up to 24 or 30 (film and video, respectively) we can produce an image that has smooth motion to it. They're doing the same thing with color, now.
    • Classic marketing speak - remember that Video CDs were supposed to be "beyond the quality of VHS" (but weren't). And a 192kbps-compressed MP3 was "CD-quality" (but wasn't).

      If you want to make something that is actually giving you all of the colors the eye can see, you have to promote it as giving you more than that - so "beyond the human eye" fits the bill.

      Also, bear in mind that: 1) these TVs will have some kind of stupid CineUltraVividNightVisionEnhancement chip in them to "enhance" the colors. Thes
    • >I don't think the human eye is going to evolve to greater color sensitivity

      Haven't you seen the ads? Chicks really dig guys who can see deep color and are eager to bear their children.
    • The human eye can typically distinguish colors at somewhere around 11 to 12 bits of depth per channel (for most humans, 11 bits per channel is all they can distinguish).
      Current 24-bit RGB is 8 bits per channel; 36-bit is 12 bits per channel, which should be enough for even the best eyes.
      48-bit RGB as used in graphics applications is overkill for simple image viewing, but primarily exists because image processing generally reduces color resolution in the long run (actually, Photoshop uses just (2^15)+1 levels per channel).
    • Isn't there SOMETHING the bandwidth could be used for besides information we can't use?

      I've never understood why we don't go for higher frame rates. Watching a fast scene turn into a blur is annoying. It can be mitigated with short aperture (so that each frame is captured quickly, i.e. not blurry -- I seem to remember Blackhawk Down was like this, and Band of Brothers), but really, if you're updating video standards, why not up the frame rate?
  • According to Chard, a few early adopters should announce products soon, with "lots of products by the end of the year," in time for Christmas.

    You mean companies will create buzz right before Xmas in order to get consumers to buy a product they "must have" but don't need? Wow!
  • "Color depths beyond what the human eye can perceive." Whoopie! Somebody get my retina upgrades at once!
  • by SeXy_Red (550409)
    What is the advantage of having a standard that supports colors the human eye can not see?
    • What is the advantage of having a standard that supports colors the human eye can not see?
      Maybe the HDMI group is positioning themselves to be the first to welcome our color-perception-advantaged alien overlords?
  • so what are we talking about here? infrared? ultraviolet? microwaves?
  • I can finally own a TV that shows Octarine! [wikipedia.org]
  • Upgraded... (Score:4, Funny)

    by HolyCrapSCOsux (700114) on Monday June 12, 2006 @04:54PM (#15519554)
    So how do I flash the firmware on my TV and DVD player?
  • by eclectro (227083) on Monday June 12, 2006 @04:54PM (#15519558)

    If the media you are playing is not Approved Media (TM), it plays in shallow color, otherwise known as black and white.
  • Their vision extends somewhat into the UV, IIRC.

    Will this be available on the Vrusk homeworld?

    • Unfortunately the Vrusk are region 99 and as such will have to wait a long time for any new releases - the only movie to be released for some time to come in that region is the Vrusk translation of "The Color Purple".
  • by ikejam (821818)
    better get those UV filtered sunglasses out next time you're watching soccer..
  • by mmell (832646) <mike.mell@gmail.com> on Monday June 12, 2006 @05:15PM (#15519713)
    After all, if there's a fire on TV, a lot of the energy involved is in the IR spectrum - that's radiant heat.
  • If you read the article carefully, you'd find that the bandwidth has been pushed up to allow a 1920x1080x24-bit (HDTV 1080) display to be updated at 90 Hz. That would allow a 2048×1536 (think Apple 30" Cinema Display) to be run at 60 Hz with a Type A connector. That's an interesting development.
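A rough sanity check of the bandwidth arithmetic above (raw pixel data only; blanking intervals and TMDS encoding overhead are ignored, so these are lower bounds):

```python
# Raw pixel data rate in Gbit/s for a given mode.
def gbps(w, h, bpp, hz):
    return w * h * bpp * hz / 1e9

print(gbps(1920, 1080, 24, 90))  # ~4.48 Gbit/s for 1080p at 90 Hz
print(gbps(2048, 1536, 24, 60))  # ~4.53 Gbit/s for 2048x1536 at 60 Hz
```

Both figures sit comfortably under the roughly 10.2 Gbit/s aggregate bandwidth quoted for HDMI 1.3, which is what leaves room for deep color modes as well.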
  • This will work nicely for the very few tetrachromats among us, ( http://www.freerepublic.com/forum/a3a24199b1ef8.htm [freerepublic.com]). These are women who through genetic accident have an extra gene for color in the eye: "that woman's retinas would have four different types of photopigments: blue, red, green, and the slightly shifted green." They apparently have a much more finely tuned sense of color. Of course, there's probably only a few of them around, but hey, we're all about accessibility here!
  • I have an HDMI-enabled HDTV and I use it. It's good I guess but the problem I have with HDMI is that it's limited to one stream of information per connection. Look at FireWire, it allows you to daisy chain multimedia and other devices and it works pretty well. I'm sure HDMI has way more bandwidth but most people aren't looking to get 8 streams of digital audio and 1080p. I'd be much happier if I could daisy chain a cable high-def box with a DVD player or game console and send that to my TV. In my setup
    • If you upgraded to an HDMI receiver as "the brain" to your A/V system then you could do this. All of your A/V would plug into the receiver and your TV would be on the monitor output. I do this except just using component video. My HDTV gets 1080i and 5.1 Dolby Digital and/or DTS from the digital cable box using component video and Monster S/PDIF cables. They connect to the receiver and my TV is on the monitor out. If you have a DVD player that will upconvert the signal to HDMI then you should be able t
    • Wouldn't it be great if the article actually talked about that? Oh wait -- it did!
  • by path_man (610677) on Monday June 12, 2006 @05:47PM (#15519932)

    Really, shouldn't the industry concentrate on properly implementing the existing specs before they bother with new & improved features? I currently have an HDTV Panasonic plasma panel, a Denon receiver and a SciAtl set-top box all tied together with HDMI, and I cannot get a signal because HDMI does not properly authenticate for the very reason HDMI was created -- to legally broadcast copy-protected signals.

    I am personally sick of these half-assed industry rollouts where most of the spec is adhered to by vendors, but the rest is blatantly ignored, just so they can be first to market with a shiny new badge on their product. There is so much incompatibility between high-def products and home theatre gear in general that you're really playing Russian roulette by being the first on your block to try an untested combination of components.

    To you vendors out there: GET IT RIGHT first. You know why folks aren't lining up outside their local electronics boutique to get the latest HD gear? They are pretty sure that the stuff isn't going to work and they won't be separated from their hard-earned dough by the latest marketing gimmick.

    PS - in case anyone wants to know my "workaround" I actually had to downgrade to connecting my SciAtl box to the Denon via component RGB cables then run HDMI to my panel. I talked with a Denon tech and this was the only workaround due to the stupidity of the *ahhem* engineering *ahhhem* at SciAtl. Maybe the Cisco acquisition will fix that nonsense.

    • It makes far more sense to buy a $500 PS3 and use component cables to connect it than to use HDMI which may not even work!

      I read similar complaints around the first HD-DVD player, which had trouble connecting via HDMI to a display.

      Supposedly HDMI v1.3 is the "stable" spec; we shall see... I too think it's ridiculous that HDMI was integrated into things in as buggy a state as it is today.
    • Not to sound like a jerk, but until you quit buying this 1st gen crap, they'll keep pushing it out the door. It's really simple...
    • by vanyel (28049) *
      Diverging slightly, this reminds me of the problem I ran into: I just bought a new house and set up a theater in it. As part of that, I bought a new Denon 3805, which has a feature where it converts all video inputs to HDMI, so I only have to run one relatively small cable to the projector, a Sharp XV-Z10000. It has a DVI input, but HDMI-DVI adapters are simple and readily available. Get everything hooked up, and find that my HD TiVo works, my progressive DVD recorder works, my old regular TiVo does not,
    • You're really playing russian roulette by being the first on your block to try [DUH].

      Where DUH is one or more of the following:
      BetaMax
      iPod Nano
      MiniDisc
      DIVX
      Capri Pants
      Any new car model
      The Pentium
      Joint NASA/ESA orbiters
      Vioxx
      HDMI
      Anything new.

      The early adopter is pretty much a beta tester who pays for the privilege. Consequently, he is rarely the f
  • by Animats (122034) on Monday June 12, 2006 @06:24PM (#15520133) Homepage

    Right now, we're mostly at 8 bits of data per color channel. This upgrade adds support for 10, 12, and 16 bits per channel, or 30, 36, and 48 bits per pixel (up from today's 24).

    This will be a big help in reducing banding on smooth gradients and artifacts during fades. Actually, you don't get more colors; you get more luminance range. It would probably work just as well to have 16 bits of luminance and two other color difference channels of 8 bits, but the HDMI people went uncompressed.

    Now the compression people have to go to work and deal with the issues of when it's worthwhile to send that much data and when it isn't.

  • color depths beyond what the human eye can perceive

    Oh, yeah. That'll be useful.

    • If you're the MPAA, the ability to watermark content in non-visible wavelengths may be perceived as *very* useful.
  • As an A/V professional, I'd be happy with a new HDMI spec that actually worked right and reliably. Us folks in the biz are still using analog component video for HD, and will until things like HDCP handshake errors and mysterious port disablings are a thing of the past.
