
Wireless HDMI Prototype Announced 141

legoburner writes "Tzero Technologies and Analog Devices announced that they have created a wireless HDMI interface for HDTVs, next-gen DVD players, and set-top boxes. The backbone for the technology is ultrawideband, which is also positioned as a future replacement for wired USB. The Analog Devices chip compresses the video with the [lossy] JPEG2000 codec; the compressed stream is then packetized, encrypted, and transmitted via the Tzero MAC and PHY chips."
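Neither company has published an API or wire format for the prototype, so the following is only a minimal sketch of the per-frame pipeline the summary describes (JPEG2000 compression, packetization, encryption, then handing packets to a radio), with Pillow and the cryptography package standing in for the actual silicon. Every function name, the MTU, and the AES-GCM choice are assumptions, not anything Tzero or Analog Devices has specified.

```python
# Hypothetical sketch of the per-frame pipeline from the summary: JPEG2000-compress
# a frame, split it into packets, encrypt each packet, and hand it to a stand-in
# "radio" transmit hook. None of these names come from Tzero or Analog Devices.
import io
import os

from PIL import Image                                      # JPEG2000 via OpenJPEG
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MTU = 1024                            # assumed payload size per UWB packet
KEY = AESGCM.generate_key(bit_length=128)
aead = AESGCM(KEY)

def compress_frame(frame: Image.Image, rate: float = 20.0) -> bytes:
    """Lossy JPEG2000-encode one video frame at roughly rate:1 compression."""
    buf = io.BytesIO()
    frame.save(buf, format="JPEG2000",
               quality_mode="rates", quality_layers=[rate], irreversible=True)
    return buf.getvalue()

def packetize(payload: bytes, mtu: int = MTU):
    """Split a compressed frame into MTU-sized chunks."""
    for offset in range(0, len(payload), mtu):
        yield payload[offset:offset + mtu]

def encrypt(packet: bytes) -> bytes:
    """AES-GCM encrypt one packet; the nonce is prepended so the sink can decrypt."""
    nonce = os.urandom(12)
    return nonce + aead.encrypt(nonce, packet, None)

def transmit(packet: bytes) -> None:
    """Stand-in for the Tzero MAC/PHY; a real link would hand this to the radio."""
    pass

def send_frame(frame: Image.Image) -> None:
    for packet in packetize(compress_frame(frame)):
        transmit(encrypt(packet))
```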
  • Women! (Score:4, Funny)

    by Anonymous Coward on Wednesday September 06, 2006 @10:46AM (#16052217)
    Tzero and Analog executives say that wireless HDMI will make for much more aesthetically pleasing HD systems, which, according to them, will make women happier in the selection of home theater systems.

    "One of the things we are hearing more and more now is that the disinterested spouse is taking a more active role in selecting and hanging the television, typically that's the wife," Bucklen said. "That's all well and good until you start dragging cables into the solution. HDI cables are expensive and bulky and we think that a wireless approach can give consumers the flexibility to put televisions where they want them."
    The 1950s called. They want their mentality back.
    • Re: (Score:3, Insightful)

      by InsaneGeek ( 175763 )
      Just because it dates from the 1950s and has a bit of a caveman feel to it doesn't mean there isn't a ring of truth to it. If you go over to http://www.avsforum.com/ [avsforum.com] you'd be surprised at the number of posts about passing the spouse test with regard to aesthetics, the cable hiding required, etc.
    • Re:Women! (Score:5, Funny)

      by legoburner ( 702695 ) on Wednesday September 06, 2006 @10:55AM (#16052309) Homepage Journal
      And with jpeg as the codec, they can keep their 1950s picture quality too!
      • Re:Women! (Score:4, Informative)

        by strstrep ( 879828 ) on Wednesday September 06, 2006 @11:43AM (#16052749)
        JPEG and JPEG2000 are very different lossy image compression algorithms. JPEG uses discrete cosine transforms, whereas JPEG2000 uses wavelet transforms, which are much better at representing non-periodic data, like you'd see in motion video.
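To make the distinction concrete, here is a small, non-authoritative illustration of the two transforms using SciPy's DCT and the PyWavelets package on a synthetic frame. The bior4.4 wavelet is only a stand-in for the CDF 9/7 filter that lossy JPEG2000 actually uses, and real JPEG tiles the image into 8x8 blocks rather than transforming it whole.

```python
# Contrast a JPEG-style 2-D DCT with a JPEG2000-style wavelet decomposition
# of the same synthetic "frame". Lossy compression in both cases comes from
# quantizing the resulting coefficients, not from the transforms themselves.
import numpy as np
import pywt                      # PyWavelets
from scipy.fft import dctn       # multidimensional type-II DCT

frame = np.random.rand(256, 256)           # stand-in for one luma plane

# JPEG-style: block DCT (done here on the whole frame for brevity).
dct_coeffs = dctn(frame, type=2, norm="ortho")

# JPEG2000-style: multi-level discrete wavelet transform.
wavelet_coeffs = pywt.wavedec2(frame, wavelet="bior4.4", level=3)
approx, *details = wavelet_coeffs          # coarse approximation + detail bands

print(dct_coeffs.shape, approx.shape, len(details))
```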
        • http://www.extremeuwb.com/article/TZeros+UWB+Tech n ology+Extends+Through+Walls/180268_1.aspx [extremeuwb.com]

          "TZero, a startup in the Intel-led WiMedia camp, claims its components will be able to produce 100-Mbit* data rates across distances of between 10 and 30 meters,"

          Their goal is to be able to handle 3 HD streams at once.
          Do they even need to compress the streams to do that?

          *supposedly >100 Mb/s at distances of 10 meters
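A rough back-of-envelope (assuming ~19.4 Mb/s per already-compressed ATSC HD stream, and 24-bit 1080i at ~30 frames/s for the raw case) suggests the answer: three compressed streams fit in 100 Mb/s with room to spare, but even one uncompressed stream does not come close.

```python
# Does a ~100 Mb/s link need compression to carry three HD streams?
BROADCAST_HD = 19.4e6                        # one ATSC stream, already MPEG-2 compressed (b/s)
UNCOMPRESSED_1080I = 1920 * 1080 * 24 * 30   # 24 bpp at ~30 frames/s, ~1.49 Gb/s
LINK = 100e6

print(3 * BROADCAST_HD / LINK)       # ~0.58 -> three compressed streams fit
print(UNCOMPRESSED_1080I / LINK)     # ~14.9 -> even one raw stream does not
```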
        • JPEG uses discrete cosine transforms, whereas JPEG2000 uses wavelet transforms, which are much better at representing non-periodic data, like you'd see in motion video.

          Problem is, the video you're sending to your HDTV, from ANY SOURCE, is already DCT-encoded... MPEG-2, VC-1, H.264, etc. All of them use DCT. So, lossy wavelet recompression of the decompressed DCT signal is just introducing another set of artifacts on top. JPEG (at highest quality) would certainly be much closer, and introduce less quali

      • It's not JPEG it's JPEG2000. There's a big difference, try it out here: http://www.kakadusoftware.com/ [kakadusoftware.com]
      While I very much agree with this statement, it is heading in the right direction. If anyone in this discussion thinks it will be JPEG2000 forever, please fill me in as to why. These people will bring it out, people will say, "I'd prefer better picture quality," and someone will come up with some genius way to make it better. There will be pioneers who take all the arrows, and I will be happy with my not-very-lossy wireless HDMI in a few years.

        By the w
    • It's true! (Score:4, Insightful)

      by paranode ( 671698 ) on Wednesday September 06, 2006 @11:00AM (#16052361)
      My wife's only complaint with my home theater set up was all the wires and how best to hide them. She was totally against me using surround sound because of the wires. Finally I ran the wires under the flooring (it's complicated) and then it was no problem. So in reality these guys have a good point.
        You still have to route power, right? Maybe you can find a plug a little more local to the speakers, but you still have wires. This will always be the trade-off, IMHO. More devices and speakers make for a better theatre experience, but the 4-inch-diameter cable bundle under the floor is a bit unwieldy.
        • The speakers are powered by the speaker cable. These are the rear ones I'm referring to though, the rest is tucked away in an entertainment center cabinet.
      • My wife once asked if I could get some smaller speaker to blend in and fit on some shelves (I have some fairly large, if quite old, Tannoy "dual concentrics"). I can't remember my exact reply, but my wife realised she wasn't going to win that battle... we compromised on trying to get the cables better hidden.

        Long ago a girlfriend asked if I couldn't tuck these speakers away into the corners to get them out of sight etc; I replied pretty brusquely I'd rather take them outside and burn them. She said "I'll

    • Re: (Score:2, Insightful)

      The 1950s called. They want their mentality back.

      Which just goes to show how cool the 50's really were.

      Seriously, after decades of political correctness, we see that some stereotypes aren't always that far off. These guys aren't guessing that women want this, it's part of the feedback/research. My own experience (my wife and her friends) supports this. I know, my own experience doesn't offer a sample size large enough to reject the null hypothesis but it makes it a little easier to believe when I hear
    • by dvd_tude ( 69482 )
      From a guy who's not shy about ripping up drywall, don't discount the spousal acceptance factor.

      I remember sitting on my deck one night and overhearing this argument...

      She (in an exasperated tone): "My life is HOLES in WALLS... "
      He (in a matter-of-fact 'WTF' tone): "Well, how else are these speakers going to get installed?"

      Which brings up a point: wireless speakers are a huge win, so it's good they're becoming common now.

      In the same vein, an easy-to-use wireless product that solves the "last 10 feet" proble
  • by saboola ( 655522 ) on Wednesday September 06, 2006 @10:47AM (#16052226)
    In other news, in an attempt to make the PS3 future proof, Sony has once again delayed the PS3 till 2009 so that they may integrate wireless HDMI. Wireless HDMI will not come standard however, but be part of the 1500 dollar "ZOMG" SKU.
  • HD compression? (Score:4, Insightful)

    by MindStalker ( 22827 ) <mindstalker@[ ]il.com ['gma' in gap]> on Wednesday September 06, 2006 @10:50AM (#16052254) Journal
    Ok, why would someone spend large amounts of money on an HD system only to have the signal compressed?

    On another note, what about the signal band already used by HD TV broadcasters? Would a signal that's weak enough to stay inside your house be legal?
    • Re: (Score:2, Insightful)

      by legoburner ( 702695 )
      Because they are posers and just want to look like they have expensive equipment? That would apply to quite a few people with top-of-the-range systems that I know of.
    • by brunes69 ( 86786 ) <[slashdot] [at] [keirstead.org]> on Wednesday September 06, 2006 @10:56AM (#16052321)
      If you get your HD from digi cable or dish (which 90% of HDTV owners do), then the signal has already been compressed to MPEG2 or MPEG4 on its way down the pipe.

      Then again, this thing is just adding in another compress/decompress cycle - not good IMO.
      • by ergo98 ( 9391 )

        If you get your HD from digi cable or dish (which 90% of HDTV owners do)

        Only 90%? Seems more likely to be 100%. About the only uncompressed source of HDMI material will be a PC digital output -- so your Vista Aero Glass experience will be marred by compression artifacts!

        Then again, this thing is just adding in another compress/decompress cycle - not good IMO.

        Exactly. This wouldn't be too terrible if it relayed a source compressed format (e.g. the original MPEG-4 stream straight from the cable provider), but

        • by JonTurner ( 178845 ) on Wednesday September 06, 2006 @12:15PM (#16053063) Journal
          >>If you get your HD from digi cable or dish (which 90% of HDTV owners do)
          >Only 90%? Seems more likely to be 100%.

          The other 10% is Over The Air (i.e., an antenna). If you're after the highest possible quality, this is what you want. OTA HD broadcasts are usually of higher quality than cable or dish. It sounds counterintuitive, but it's true -- the cable/sat company (re)compresses the signal, introducing visual artifacts. In effect, you're getting a second-generation copy.
        Any source of HD is compressed with some algorithm (even terrestrial broadcast is MPEG2).
    • Tell me, where in the world can you buy HD movies or watch HD TV programs that aren't compressed?
      Why another digital format? Wouldn't it be just as easy or easier to put in a small TCP stack and an RJ45 port? Gigabit should be more than enough to move HD TV over Cat6 or fiber. Why do we need yet another digital cable? A built-in hub, IPv6, and a basic standard for identifying components would be all you would need.
  • See! (Score:4, Funny)

    by yakhan451 ( 841816 ) on Wednesday September 06, 2006 @10:51AM (#16052272)
    See! Sony's once again ahead of the curve, not shipping the PS3 with an HDMI cable.
  • Not really HDMI (Score:2, Insightful)

    by Anonymous Coward
    HDMI, in its present incarnation, is just glorified DVI with DRM. But anyways, a wireless version of a video connection which is lossy is not the same as the video connection it purports to replicate. I would propose they call it HDMI Minus (or something like that), but HDMI is already a minus.

    If lossy is allowed, my regular CRT TV from 1998 could be called HDTV. It's just lossy, right?
    • I wouldn't say HDMI is a *complete* piece of garbage. I agree completely that the DRM puts a downer on it, but it is nice simply because of the size of the cable. As for the wireless iteration, yeah the loss on it is going to make it a no go for me, but it is nice to see someone finally step up to the plate on the concept. Like all technologies, this one will either die or improve over time. I personally am hoping it improves because anything that reduces the number of wires needed to set up a quality h
      • Re:Not really HDMI (Score:4, Insightful)

        by Pieroxy ( 222434 ) on Wednesday September 06, 2006 @11:11AM (#16052445) Homepage
        I have learned that wireless equivalents always perform well below their wired counterparts. And I'm tired of seeing my image freeze every time someone walks between the Wi-Fi access point and the HTPC.

        Wireless is a no-go in any of its incarnations today, except for input devices which don't need a high data rate: mice, keyboards, remotes. Everything else is for emergencies only.
    • by Ucklak ( 755284 )
      Doesn't it also include 2 channel audio?
    • by jZnat ( 793348 ) *
      Exactly! There was nothing wrong with DVI (or dual-DVI sometimes...) and S/PDIF (fibre optic audio); they had to go and mix the standards with a bit of DRM koolaid...
        Dammit, the HDMI standard does not add any additional DRM. HDCP has been associated with DVI since before HDMI existed. All HDMI does is combine DVI + S/PDIF + a low-bandwidth control channel into one cable.

        All the Dell xx07WFP LCDs support HDCP over DVI, as do all HDCP-ready InFocus projectors. I'd imagine that every single display device which has both HDMI and DVI connections supports HDCP over DVI.
    • Yes, HDMI is basically DVI-D. But it has many advantages, and few disadvantages.

      • It also carries 8 channels of uncompressed digital audio.
      • It has a smaller, more convenient physical connector.
      • It is specced to drive longer cable distances.

      Also,

      • It supports, but does not require HDCP. Just like DVI.

      Basically, the only thing DVI can do that HDMI can't is carry an analog video signal. But if you really want that, VGA is a lot more widely used.

      For a wireless version, I'd be concerned about a number of t

  • by topham ( 32406 ) on Wednesday September 06, 2006 @10:53AM (#16052286) Homepage
    JPEG2000 has both lossless and lossy modes.

    Did I miss something in the article indicating which they were using?
    • The article didn't indicate which, but I hope lossless is an option. I'm excited about this technology. It could be one more building block for a glasses-and-PDA-based desktop replacement.
    • The article states, "The Analog Device compresses data with JPEG2000 video code [sic?]".

      Though, they could just be stupid and really mean 'encodes'. Maybe they mean lossless..? I'm sort of dumb and always just think of lossless compression as encoding.

      Who knows.. I would guess they might have brought it up if the compression was lossy. Then again, I would guess they might assure readers that it was not lossy.. aahhh!! I don't know.
      • They do discuss how JPEG2000 compresses "without the characteristic blocky and blurry effects of the original JPEG standard."

        Sooo.. I guess it must be lossy if they're discussing that. They're essentially saying that it doesn't look as bad as the previous lossy compression method..
      • by Forseti ( 192792 )
        Though, they could just be stupid and really mean 'encodes'. Maybe they mean lossless..? I'm sort of dumb and always just think of lossless compression as encoding.

        Generally speaking, if you're that self-aware, you can't be that dumb. In this case, though, there's no good reason to equate encoding with lossless compression. Encoding just means you scramble the signal. Lossless compression entails, well, "compression": the decompressed stream contains more data than it occupies in the medium. There IS such a t

    • by cnettel ( 836611 )
      The need to use compression indicates that they at least have some fallback to lossy or lower framerate (it has to do SOMETHING if fed white noise). It doesn't have to be that bad, it's not like the original signal, in the video case, will ever be an uncompressed HD signal in itself. I wonder if they have considered any efforts to match the expected inherent compression artefacts; as we all know, lossy + lossy can sometimes be a very bad thing.
      • JPEG2000 works using a progressive encoding. If I were designing such a device, I would send as many refinements as possible in each frame, giving lossless quality on some scenes, lossy on others. You could also refine it slightly by allowing unchanged segments of the scene to have frame n+1 contain just the refinements.

        Either way, this seems like the wrong way of doing it. Most content that actually needs compressing is already compressed; decompressing it, then recompressing it, transmitting it, and
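As a rough illustration of that "send refinements until the budget runs out" idea, here is a sketch that uses a plain wavelet decomposition from PyWavelets as a stand-in for a real JPEG2000 codestream. The per-frame byte budget and the transmit hook are assumptions for illustration, not anything Tzero has described.

```python
# Progressive, coarse-to-fine transmission of one frame under a byte budget.
import numpy as np
import pywt

def progressive_layers(frame: np.ndarray, levels: int = 4):
    """Yield layers coarse-to-fine: the approximation first, then each detail band."""
    coeffs = pywt.wavedec2(frame, wavelet="bior4.4", level=levels)
    approx, *details = coeffs
    yield ("approximation", approx)
    for i, band in enumerate(details):            # (LH, HL, HH) arrays per level
        yield (f"detail level {levels - i}", band)

def send_until_budget(frame: np.ndarray, budget_bytes: int) -> int:
    """Transmit layers in order until the per-frame byte budget is exhausted."""
    spent = 0
    for name, layer in progressive_layers(frame):
        size = sum(a.nbytes for a in layer) if isinstance(layer, tuple) else layer.nbytes
        if spent + size > budget_bytes:
            break                                 # remaining detail is dropped (lossy)
        spent += size
        # transmit(name, layer)                   # hypothetical radio hook
    return spent
```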

    • by Anonymous Coward
      So sayeth their presentation anyhow.

      http://www.tzerotech.com/site/demo/ [tzerotech.com]
  • Installation? (Score:3, Interesting)

    by onion2k ( 203094 ) on Wednesday September 06, 2006 @10:53AM (#16052288) Homepage
    "If we break this down, it's going to be less than an HDMI cable," Karr said. "Those are about $100 plus installation."

    People pay for someone to come and install a cable?

    "It's that whole 'plugging it in' thing! It's got me completely stumped!" ;)
    • Yes, people actually pay a lot of money to have simple things installed. It's not so much the physical installation though, it's all the settings, since most manufacturers don't ship the products pre-setup. Case in point, there are a number of Up-Converting DVD players that don't come out of the box with the HDMI port enabled. You have to go through menus (sometimes these procedures aren't even in the user manuals) to set it up.

      Now if you are a technical newbie, plopping down thousands of dollars at the
    • $100??? WTF??? (Score:3, Insightful)

      by brunes69 ( 86786 )
      You can get high-quality HDMI cables from monoprice.com [monoprice.com] for $12 or less.

      Only a complete retard would pay $100 for a cable meant to deliver a purely digital signal. Then again, these are the same people Monster-brand products are marketed to, so nothing surprises me.

      • Re: (Score:2, Funny)

        by charstar ( 64963 )
        My problem usually goes like this:

        1) need cable for whatever reason
        2) where can i get one around here? best buy? circuit city? compusa?
        3) drive to one of the above
        4) "where are the cables?"
        5) get pointed to a wall of Monster stuff
        6) "do you carry any thing else?"
        7) sigh
        8) GOTO 3
        9) hurt myself trying to open the packaging
      $100 for an HDMI cable? I know that HDTVs have yet to become common in living rooms, and even so not all of them have HDMI... but really, have you ever seen the cable? The connectors remind me of SATA meets USB. They SLIDE into place; there aren't even thumbscrews to spin, yet they charge INSTALLATION for these things?

      $10 HDMI cables [google.com]

      Once companies find out you dropped a pretty penny on an HDTV set they will be out to screw you any way possible.
  • The Pulse (Score:1, Funny)

    by Arkiel ( 741871 )
    Dead horse, I know.

    Now included free with every PS3! A wireless device you can't live with/out!
    Sony's hotshit ultrawideband technology summons soul-sucking ghosts from another dimension to EAT YOUR SOUL!
    [watch them spin it as a viral marketing scheme]
  • by squoozer ( 730327 ) on Wednesday September 06, 2006 @10:56AM (#16052316)

    You rush out and spend god knows how much on the latest and greatest next-gen DVD player, you throw away your perfectly good TV / projector / box that emits coloured light and buy a new one that supports HDMI (and HD). Finally, you cough up more hard-earned cash to buy a movie you probably already own on regular DVD, for twice the price. You do all this in the hope of getting a fantastic picture with amazing sound.

    Why, oh why, would anyone with two brain cells to rub together then install a wireless connection that uses lossy compression?

    Still, fair play for getting that many bits through the air. Personally, I won't be standing anywhere near the transmitter.

  • IANAE (I am not an engineer)

    Isn't the point of HDMI to have the highest quality possible?

    Send this over a wireless connection (even if you could do it without compressing it), and you are more likely to start seeing a degraded signal. And if you compress it (as I assume you have to), then you end up with loss by default.

    Now we have taken our thousand-dollar TV and our thousand-dollar DVD player and stuck another expensive piece between 'em that lowers the final picture quality.

    If you are going to spend that
    • Do your wireless downloads to your laptop degrade because it is wireless?

      No. It works or it doesn't. It is digital.

      • by Thansal ( 999464 )
        I have had my connection get dropped and then brought back up because of my wireless LAN connection (thus making a download fail). Possibly "degrade" was not the best word, but you get the idea. What happens if my connection gets interrupted? Is there some sort of caching mechanism? Will the picture look exactly like it does when I use my HDMI cable? Will the connection never get interrupted (or at least no more than the cable's would be)?

        Again, I am not an engineer, and I'm simply speaking from personal experience. I have yet
          well, for one thing, you're likely gonna be moving the signal over a much shorter range than you would with a wireless network. all your gear is likely to be in the same piece of furniture, so it's traveling a couple of feet at the most, and the stuff that is in the way (shelves, other components, etc.) is pretty much fixed (not to mention that UWB uses much higher frequencies (~3.1-10.6GHz vs. ~2.4GHz for 802.11b/g) than a wireless network would use, and would generally have better signal penetration), as
      • by famikon ( 994709 )
        You can't look at it like that. If you have the time to spare, even a wireless connection that loses packets all willy-nilly can eventually download the movie, and it will still look as good. But this will be streaming in real time. Do you often stream video over wireless? I don't, for a reason.
        • I'm not saying it's a good idea. I'm just saying that the general population doesn't understand what a digital signal REALLY means.

          If you go to Best Buy, they sell HDMI cables for $60, $100 or $150 for "premium." It is DIGITAL. Getting gold-plated connectors on your cable isn't going to make a bit of difference.

          The same misconception was being applied to the wireless signal by the parent poster. Of course the data still needs to get there - and it needs to get there reliably.

          • by famikon ( 994709 )
            Yeah, I totally get what you are saying. I had a hard time trying to convince my friend that a $100 MIDI cable wouldn't make his softsynths sound any better than his $10 cable. Either it works or it doesn't (to an extent...).
    • Re: (Score:3, Interesting)

      by Doctor Memory ( 6336 )

      you are more likely to start seeing a degraded signal

      True, but only at the RF level. Since it's a digital signal (presumably with ECC; I haven't taken a look at the HDMI spec), you'll easily be able either to reconstruct the stream (using ECC) or to ask that it be re-sent. And if you're getting less than some threshold of signal strength, the devices probably won't sync up, so you'll look at the little blinking "SYNC" light and the manual will tell you to move the transmitter closer to the TV. Eit

    When flat panel LCD monitors came out, the prices were obscene, the sizes were small, and the image quality was, in many cases, below that of CRT tubes.

      Put another way, you could hire a contractor to mount a tube monitor INTO your wall for less than the price of an LCD when they were new. But they sold because they "took up less space".

      In this room, I have two computers. Mine still has a CRT, which is long overdue for replacement, and will probably be replaced by a LCD soon enough. My girlfriend's PC has
    • Isn't the point of HDMI to have the highest quality possible?

      No, I believe the point of HDMI was to add content controls [wikipedia.org] to the digital signal in an attempt to make copying it more difficult.

  • The standard calls for link reliability of at least 95 percent, a packet error rate of less than 1 in one hundred million, interference resistance to microwaves and cordless phones, and the ability to process three or more HD streams at 10 meters.

    From Intel's Website:
    In the United States, the Federal Communications Commission (FCC) has mandated that UWB radio transmissions can legally operate in the range from 3.1 GHz up to 10.6 GHz, at a limited transmit power of -41dBm/MHz.

    Unregulated frequencie
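Integrating that quoted mask over the whole allocation gives a feel for how little radiated power is involved; this assumes a flat -41 dBm/MHz across the full 3.1-10.6 GHz band.

```python
# Total radiated power implied by a flat -41 dBm/MHz mask over 3.1-10.6 GHz.
import math

density_dbm_per_mhz = -41.0
bandwidth_mhz = 10_600 - 3_100                       # 7500 MHz

total_dbm = density_dbm_per_mhz + 10 * math.log10(bandwidth_mhz)
total_mw = 10 ** (total_dbm / 10)
print(f"{total_dbm:.1f} dBm, ~{total_mw:.2f} mW")    # about -2.3 dBm, well under 1 mW
```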

  • by codefrog ( 302314 ) on Wednesday September 06, 2006 @11:03AM (#16052387)
    I can already picture the audiophile products which will at no small cost somehow imbue the air in your living room with better wireless transmission characteristics...
    Maybe even a vacuum chamber so you don't degrade your digital transmission. It sure would suck to have your bits coming through the ether in low fidelity.

    Of course we all know that movies looked better on vinyl anyway.
  • by loose electron ( 699583 ) on Wednesday September 06, 2006 @11:18AM (#16052511) Homepage
    Remember - JPEG is a compression standard. By definition it is a "lossy" compression. Picture quality loss remains TBD. Need to read the details.

    This is a first-generation UWB wireless interconnect. When the concept of UWB was marketed around a few years ago, the claim was that it would be a low-power RF communication method.

    Low power at the antenna, yes, at the power supply, no.

    However, the power consumed for all the signal processing in the receiver and transmitter is pretty huge. The channel bandwidth is 250MHz and uses OFDM modulation. The implication is gobs of juice to run an ADC to deal with that high bandwidth, plus "must have" DSP to do all the signal processing. (OFDM requires rather fancy signal processing, which cannot be implemented using a lower-power analog method.)

    The net result - the "low power of UWB" may be true at the antenna, but the electronics require a huge amount of juice to get the job done. Consequently, battery-powered applications are a no-go. Now you've got this fancy new wireless standard and a limited use for it, with all the applications needing to be plugged into the wall.
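A back-of-envelope consistent with that claim (the resolution and figure-of-merit numbers below are assumptions roughly representative of mid-2000s silicon, not Tzero's specs): the I/Q ADC pair alone for a 250 MHz channel burns tens of milliwatts before any OFDM processing happens, and the FFT/equalizer DSP typically costs far more.

```python
# Rough ADC power for a 250 MHz OFDM channel; ENOB and FOM are assumed values.
SAMPLE_RATE = 250e6     # complex baseband: ~250 Msps on each of I and Q
ENOB = 6                # assumed effective number of bits for an OFDM UWB receiver
FOM_J = 1e-12           # assumed ~1 pJ per conversion step

per_adc_w = FOM_J * (2 ** ENOB) * SAMPLE_RATE
print(f"{2 * per_adc_w * 1e3:.0f} mW for the I/Q ADC pair alone")   # ~32 mW
```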

    IMHO? Poke a hole in the drywall at the floor, run the cables up thru the wall and into the display. You have to do that for the power cord anyhow, so why not? It's not like you are going to be moving the silly thing much after you install it!

    UWB won't see the widespread use of WiFi or Bluetooth.
      Remember - JPEG is a compression standard. By definition it is a "lossy" compression.

      JPEG2000 (which TFA is talking about), on the other hand, defines both lossy and lossless standards.

      • JPEG2000 (which TFA is talking about), on the other hand, defines both lossy and lossless standards.

        Except that if they're using lossless compression, the bandwidth required would be more than just leaving it at MPEG-2/4. In other words, they're using lossy.
          Well, a poster up above stated that they are in fact using lossless JPEG2000. I believe 720p on my digital cable box produces about 80 Mb/s of compressed data, and a channel occupies about 6 MHz of bandwidth. The GP says that the UWB system will provide 250 MHz of bandwidth, which is much greater than the 6 MHz currently being utilized in compressed form. If a 6 MHz channel can carry 80 Mb/s, a 250 MHz channel could carry 3.333 Gb/s, and an uncompressed 1080p program only contains 1920x1080 (Pixels/Frame) x
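Finishing that arithmetic (taking the parent's 80 Mb/s-per-6 MHz figure at face value, and assuming 24 bits per pixel and 60 frames/s for 1080p):

```python
# The parent's scaling vs. an uncompressed 1080p60 stream (24 bpp assumed).
channel_capacity = 250 / 6 * 80e6                  # ~3.33 Gb/s by the parent's scaling
uncompressed_1080p = 1920 * 1080 * 24 * 60         # ~2.99 Gb/s

print(channel_capacity / 1e9, uncompressed_1080p / 1e9)
print(uncompressed_1080p < channel_capacity)       # True -- but with little margin
```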

    • True that JPEG only offered lossy compression.

      JPEG2000 DOES have a lossless compression mode for images. Sweet!

      I have used it to compress terrain data. It gave a nice 5:1 compression where our previous compression was only getting 3.5:1. On the flip side, it uses a surprising amount of CPU.

      I don't know if there is an option for lossless video, but it seems likely.

  • No matter how hard the digital people work towards perfection, some creative engineer is determined to re-create the effect of crappy analog. Why use compression? Why not just keep cans of spray paint next to your plasma screen and use them every time something you care about is on?
  • by Overzeetop ( 214511 ) on Wednesday September 06, 2006 @11:44AM (#16052756) Journal
    How about we concentrate on getting systems which will modulate the original, compressed HD over coax so that 99% of the population who owns a house that is already built around the old way of doing things can still watch TV without fishing cable around?

    C'mon folks, there's a hundred usable channels with 19.x Mb/s effective bandwidth so we could *in theory* just pipe that HD signal from a remote box to the tv with the existing wires, let the ATSC STB (or internal tuner) demodulate and decode the content and display it. Hell, we could all have everything-everywhere in our houses with all the ugly gear stashed in the basement with this standard. *Analog is not the enemy* OTA HD works damned fine. Why fuck it up with expensive, unnecessary cabling?
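Back-of-envelope on that claim, assuming the standard 19.39 Mb/s 8VSB payload per 6 MHz channel:

```python
# Aggregate capacity of re-modulating ATSC channels onto existing coax.
CHANNELS = 100
ATSC_PAYLOAD_BPS = 19.39e6                 # 8VSB payload per 6 MHz channel

total_bps = CHANNELS * ATSC_PAYLOAD_BPS
print(f"{total_bps / 1e9:.2f} Gb/s of already-compressed HD over RG-59")   # ~1.94 Gb/s
```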

    Disclaimer - yes, I have an older home. I also have the DVD jukebox on channel 40, my TiVo on 45, my wife's TiVo on 50, and a media server on 55. They get combined with the off-air antenna and piped through RG-59 coax to every TV in the house, with a Xantech IR sensor (DC coax return) at each TV. It works great, except that there's no HD. My parents just bought a new house, but can't put HD in the rooms because the builder ran (the standard) one coax to each TV location. Surprise... DTV requires 2 to get HD (I haven't verified this; mine are old TiVo units with two tuners, and need two cables).
    • by xiaoren ( 311714 )
      You can do this with satellite already.
      Pipe the TS stream from an HD DVB FTA source through a linux box running streamdev-server.
      From my tests, I could easily transmit a 1080i broadcast (raw TS stream) over ordinary 100 Mbit Cat5.
      So for wireless, a good 802.11g router and receiver could handle the bandwidth of one channel, at least.
      Also, the signal that comes off the LNB through the coax is that raw HD stream. So it's definitely doable. Coax can handle at least 50 Mb/s, likely a lot more.

      HDMI is just about c
      • That's way too complicated, and not easily decodable on the TV end w/o an extra box with the networking and TS decoder (=$$). If I could get the TS already encoded onto 8VSB in baseband and then remodulate it into a standard OTA channel like I do with NTSC baseband->UHF, I could pack as many channels onto the coax as I had sources (I don't have the money to run out of channels >36, where my lowpass cutoff filter is, even with a 2-3 channel separation). Then with a TV with ATSC and an external IR rece
    • by dabadab ( 126782 )
      I guess you just have described DVB-C.
  • Of course audio/videophiles aren't going to want this, but I can think of a few applications where this would be convenient enough to offset the (minor, hopefully) quality loss, e.g. an equipment cabinet or rack outside of the main living room area. Fussing with extra-long HDMI cables or having to add repeaters into the mix can be a hassle for some.

    Also, with JPEG 2000 the artifacts are going to be pretty minor. It compresses each frame independently, so there are none of the weird MPEG-esque artifacts between keyframes.

  • by Ruprecht the Monkeyb ( 680597 ) * on Wednesday September 06, 2006 @12:04PM (#16052955)
    FTA: "The standard calls for link reliability of at least 95 percent...." I think that's shooting kinda low, guys. My current setup has a link reliability of 99.99%. The only time it fails is when I go running across the room to eject the p0rn from the DVD player and trip over a cable. OTOH, if they can guarantee it will always fail during commercials, maybe they're on to something.
    • "The only time it fails is when I go running across the room to eject the p0rn from the DVD player and trip over a cable."

      Avoid running with your pants around your ankles and you won't have that problem.
  • by Animats ( 122034 ) on Wednesday September 06, 2006 @12:10PM (#16053002) Homepage

    If the consumer-electronics people weren't so hung up on proprietary interfaces, consumer electronics could just use 100baseT for everything. More bandwidth than some UWB thing, can be extended to cover just about any house, cables are cheap, and interference isn't a problem. You can get a whole 100baseT/TCP/IP node in the RJ45 connector now, so low data rate sources like audio devices could play cheaply. Power over Ethernet could power some of the lesser boxes, like cable modems.

    That "30 meter UWB" link will turn out to be a huge pain. It probably won't work through walls especially ones with metal studs, so inter-room links in houses will fail. Even across a large classroom (an obvious application), there might be problems. The DRM probably won't allow multipoint distribution, so you can only have one monitor per Blu-Ray player, but that's another issue.

    • Design, patent, and license free to hobbyists and "pro-freedom" commercial vendors a box that takes {your favorite signal} on one end and converts it to/from something that can ride on cat5 or better yet, ethernet or tcp/ip.

      It's been done with KVM-over-cat5, various disk protocols over tcp/ip and I think ethernet, and others already.

      Sounds like a great market opportunity.
    Well put. I've been using this approach for a couple of years now, using a Roku HD1000 (a thing that can turn transport-stream HD data into a component 1080i signal) over ethernet, reading NFS-mounted data that was captured with an HD3000 ATSC tuner card. Standards, wonderful things.
    • More bandwidth than some UWB thing,

      But less than DVI or firewire...

      Power over Ethernet could power some of the lesser boxes, like cable modems.

      PoE is a mess. It's cheaper to give out wall warts than to include the circuitry to allow a device to operate on a wide range of voltages (necessary because ethernet cabling has high resistance, so voltage varies dramatically with distance).

      That "30 meter UWB" link will turn out to be a huge pain.

      Agreed, it's just some marketing BS.

  • Power Wires? (Score:2, Interesting)

    by popeye44 ( 929152 )
    While I understand their desire to have a wireless standard, are we not forgetting there is a whole-home standard being devised around broadband over power lines? Could they not instead use something that would travel the power line digitally and make the connection? Perhaps BPL is a dead horse, but I had not heard that it was. The home standard was to allow devices to travel the wire path to make all sorts of connections. This would be a much better design, IMHO.
  • who cares? (Score:2, Interesting)

    by shummer_mc ( 903125 )
    I don't think the model is to transmit video data to the monitor. I think the idea is to include the groovy computer that wirelessly downloads HD TV content onto a hard drive that's IN the monitor. The DVD drive, as long as the format survives, will also be included in the 'console' which we call the TV. No video needs to be transmitted. Am I missing something?

    Think of it as a giant laptop on the wall (hopefully the non-TV components will be interchangeable). IO should be the only thing that needs to b
  • by Ancil ( 622971 )
    My God. Are we sending video signals straight through thin air these days??? What exciting times we live in!
  • I would not consider myself an expert, but this is my field, so let me give everyone a REALLY quick lesson in 1) JPEG2000 and 2) "lossy" video compression.

    JPEG2000 [jpeg.org] is an advanced set of tools for video compression [wikipedia.org]. It is used at the highest levels of distribution [dcimovies.com], and has been proposed for consumer use as is the case here. For more on JPEG2000 a decent primer is here [purdue.edu].

    If you are watching content at home, it already has gone through a "lossy" compression scheme. Whether it is DTH satellite MPEG2 [wikipedia.org] or

  • The backbone for the technology is ultrawideband, also used as a future replacement for wired USB.

    This sentence makes me tense.

  • Jpeg isn't bad (Score:2, Informative)

    by luketheduke ( 945392 )
    Hey guys, don't you know that every "HD" signal you currently see is compressed? In fact it's even compressed with a lossy codec when it's recorded to tape from the HD camera, unless you're taking an SDI output cable directly into a hard-disk recording system, and hardly anyone does that. Why is all HD compressed? One uncompressed 1080i stream runs 165 MB/sec... do the math ;) And even though it's compressed, it looks pretty good. One of the most widely used formats for cameras and editing/storing is the DVCPRO HD
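That 165 MB/sec figure checks out roughly, assuming 4:2:2 10-bit 1080i at ~30 frames/s (active picture only; an HD-SDI link adds blanking to reach its 1.485 Gb/s line rate).

```python
# Sanity check on "165 MB/sec" for uncompressed 1080i (4:2:2, 10-bit, ~30 fps assumed).
bits_per_second = 1920 * 1080 * 2 * 10 * 30    # luma+chroma samples * bit depth * fps
print(bits_per_second / 8 / 1e6, "MB/s")       # ~155.5 MB/s -- same ballpark
```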
