Intel Launches Wi-Di

Posted by samzenpus
from the clear-streams dept.
Barence writes "Intel has launched a new display technology called Wi-Di at CES. Intel Wireless Display uses Wi-Fi to wirelessly transmit video from PCs running Intel's latest generation of Core processors to HD television sets. Televisions will require a special adapter made by companies such as Netgear — which will cost around $100 — to receive the wireless video signals. Intel also revealed its optical interconnect technology, Light Peak, will be in PCs 'in about a year.'"
  • Great! (Score:4, Funny)

    by Errol backfiring (1280012) on Friday January 08, 2010 @08:04AM (#30693234) Journal
    Why didn't I think of that? First, kill off all TV signals and force people to use cable companies, then invent a system to ...
    transmit TV signals!
    Brilliant!
  • If you can broadcast a signal to every set in your house, or even your entire apartment floor, then there goes a bunch of lucrative descrambler-box fees. Then again, they can all only show one channel at a time. However, media companies all seem to be losing income nowadays and have taken a hostile attitude towards new technology. They seem to need very little reason, however slim and irrational, to pick a fight with it.

    Of course, the future is all streaming media over the internet, mostly on demand and mostly free, so they're all fucked.

    • by Ilgaz (86384)

      Perhaps cable companies have the brains to go with standard technologies like H.264 (it is lossy, remember) and add some kind of _standard_ DRM layer on top of it.

      In fact, the IPTV guys have been doing this for years without tens of thousands of Intel CPUs. All they need to do is put the encoder/encryption chip in the set-top box and "air" it over standard TCP/IP, via gigabit cable or wireless.

      The issue with Intel in that case is that a $10 chip will do a far better job than a Rolls-Royce Core i7 processor, since it is designed for exactly that task :)

      I don't follow

    • That's progress. Some business models stop making money and become obsolete. New business opportunities become available.
  • Wi-Di (Score:5, Funny)

    by hcpxvi (773888) on Friday January 08, 2010 @08:08AM (#30693258)
    ... when you could LIVE?
    bada-bing-TISH! Thank you ladies and gentlemen, I'll be here all week.
    Seriously, though, did their advertising people not spot what a silly name Wi-Di is?
    • by Spad (470073)

      I prefer the pronunciation "widdy"

    • Re: (Score:3, Funny)

      by dmayle (200765)

      What a great joke! You're so Wi-Di...

    • Devices with those silly names will never sell.

      • by Tim C (15259)

        "Kindle" means "to start a fire burning by lighting paper, wood, etc"

        I assume that it was chosen to conjure images of sparking off or kindling an e-book revolution.

        • I assume that it was chosen to conjure images of sparking off or kindling an e-book revolution.

          Or perhaps it was meant to conjure images of Nazis burning books in Germany [historyplace.com]??? Muhahaha... :/

          • by omnichad (1198475)

            Probably more like Fahrenheit 451. Since that would be, you know, a literary reference. Referring to how they delete copies of books from your own device.

    • Re:Wi-Di (Score:5, Funny)

      by JeffSpudrinski (1310127) on Friday January 08, 2010 @08:33AM (#30693442)

      Discussion of how to pronounce it reminds me of the little-known trivia about how the inventor of SCSI wanted it to be pronounced as the "Sexy Interface" rather than the "Scuzzy Interface".

      -JJS

      • Re:Wi-Di (Score:4, Informative)

        by morgan_greywolf (835522) on Friday January 08, 2010 @09:56AM (#30694420) Homepage Journal

        Discussion of how to pronounce it reminds me of the little-known trivia about how the inventor of SCSI wanted it to be pronounced as the "Sexy Interface" rather than the "Scuzzy Interface".

        The inventor of SCSI was Larry Boucher at Shugart Associates (and later Adaptec). They've always pronounced it 'scuzzy'. Apple was the player that wanted it pronounced 'sexy', because they were (at the time) pushing SCSI as a technology that made their machines superior to IBM and the clone makers, who were generally not including SCSI interfaces. Apple used SCSI for HDDs, FDDs, and CD-ROMs, and the inclusion of SCSI on the Mac was the biggest reason why early scanners always used a SCSI interface. Other players in the early days of SCSI (around 1986 or so) included Commodore, who included it in the Amiga, and Sun Microsystems, who included it in their Unix workstations and servers.

        • While SCSI was used for hard drives and CD-ROM drives, it was not used for the floppy drive in Macintoshes as built.
        • I thought Shugart Associates pronounced it "SASI". It only became SCSI when it became a standard, rather than a proprietary interface.

          Before that, when Shugart worked for IBM, it was called the "IBM Data Channel Interface" or something. (I'll Google it later.)

    • Re: (Score:2, Insightful)

      by XavidX (1117783)

      Well, it worked, didn't it? You're gonna remember it for a while.

    • by AP31R0N (723649)

      I've always despised the term Wi-Fi. Fidelity isn't the issue! Stop trying to steal recognition from a totally different type of product!

      And get off my lawn.

      • Things with i also don't localise well. In a lot of European languages, i and e are the opposite way around to English, so when a French person says WiFi it sounds like 'weefee' or, if he says it quickly, 'whiffy'. Wi-Di will sound like 'weedy,' which is probably quite appropriate.
    • Just wait for Microsoft to integrate it into Windows LIVE.
      And then wait for the worst pun-filled advertisement EVER. ;)

  • I really hope they've got encryption on by default in this technology or we'll have this whole security fiasco that we had and still are having with the open WiFi all over again.
    • by delt0r (999393)
      If you care about security, you probably won't be using this in the first place. Since I have over 12 overlapping Wi-Fi networks at my apartment, I can't imagine that these things will handle interference well. And it had better be better than HDTV, which at least here (Vienna, Austria) looks like total crap.
  • Homology with "wi-fi" and "hi-fi" demands that the two parts rhyme. The obvious is "why-die" but the alternatives such as "wee-dee" (weedy) and "whih-dhih" don't exactly jump off the tongue either.

    • by Plunky (929104)

      In French, Wi-Fi is pronounced as a single word, roughly the way English speakers might say "whiffy", and as a native English speaker I generally say it that way too, so "whiddy" is fine by me.

      In yon case, makin just as much sends as like fiddy, nuff respec, ma bro.

  • by mcgrew (92797) *

    You gotta go sometime...

    At least this one makes sense, unlike Wi-Fi. Kind of morbid name, though.

  • Yet another kind of connection from PC to TV?

    Why not just watch on the monitor of the PC, or use a projector?

    • Exactly - a TV is often lower resolution and lower quality than the monitor on your PC anyway (monitors tend to be smaller simply because you tend to sit nearer to them...)

      And the only difference is that your TV has a built-in analog receiver (which will soon be obsolete) or a built-in digital decoder (which you can replace with a box).

    • Why not just watch on the monitor of the PC, or use a projector?

      I would use this. In my current set up, we have TV/Tivo/cable box on one side of the room, and projector/PC on the other side of the room. PC plays movies and Netflix on the projector, and Tivo/cable on either the TV or projector.
      Because there are two viewing devices on opposite sides of the room, somewhere there will be a long-ass cable involved. Currently, that's a 50' S-video cable from the Tivo to the projector.
    • Re: (Score:3, Informative)

      Because people have bought expensive HD sets with VGA/S-Video/HDMI and they want to use them as big, honkin' monitors in their living room without running cable.

  • Sounds like something you scream at the TV when the redneck down the street starts talking on his CB and turning the screen to snow right in the middle of your favorite show.

  • Does this work with real video cards/chips, and not Intel GMA, which is about the same speed as 1-2 year old onboard ATI/Nvidia chips?

  • that Intel came up with Light Peak after getting called out on their attempt to keep USB 3.0 host controller specs proprietary?

  • by mbrod (19122) on Friday January 08, 2010 @09:07AM (#30693794) Homepage Journal
    Apple started the concept but ceded it to Intel to develop it.

    http://www.engadget.com/2009/09/26/exclusive-apple-dictated-light-peak-creation-to-intel-could-be/
  • They already sell the equivalent for iPods to transmit to a radio in your car. It does work, and I use one, but the quality is hit or miss. It's not as good as a straight cable, and it's very prone to interference. I'm planning on upgrading to a new head-unit sometime this year so that I can plug right into it rather than use the radio setup.

    Wireless (anything) is, for me, only a temporary convenience to use until I can properly set up a wired system. It ALWAYS has drawbacks, and I never want to use it.

    • by omnichad (1198475)

      No, the equivalent to that would be a device that transmits VHF or UHF wirelessly. Your problem with audio quality is that it's a lossy, analog solution: you're dealing with heavy loss of the high and low end, and heavy range compression with FM. And the reason for the hit-and-miss quality is other real radio stations with huge antennas overpowering your FCC-regulated "must accept interference" device.

      I prefer wires, but you're building a straw man.

  • The real killer app is of course the millions of projectors hanging from office ceilings worldwide. From now on you can get Death by PowerPoint without those pesky 20-meter VGA cables. When will somebody make a projector you can simply stick a USB key into?
  • Intel CPUs? (Score:4, Insightful)

    by VincenzoRomano (881055) on Friday January 08, 2010 @09:21AM (#30693936) Homepage Journal

    PCs running Intel's latest generation of Core processors

    I don't see the point here. How can the TV tell over Wi-Fi whether I'm using Intel, AMD, ARM or anything else?
    Sounds more like advertising than technology!

    • Wow, you mean a company came out with a product that showcases its own technologies and products?! Jeepers, creepers, there are shenanigans afoot! Quick, call the FTC!
  • by Anita Coney (648748) on Friday January 08, 2010 @09:34AM (#30694080) Homepage

    Yeah, because we all know how incredibly difficult it is to connect a DVI-to-HDMI cable and a 1/8" audio cable from your computer to your TV.

    Of course someone will say, "Most people don't keep their PCs near their TVs."

    If people were willing to spend $600 on a PS3 that sits in their living room, I don't see why they can't spend a few hundred for a PC. Heck, if you subtract the $100 "special adapter" from the price of the PC, you can get one real cheap.

    Of course someone else will say, "Who wants a noisy PC in their living room?" And to that I'll say, "Have you ever been in the same room with an Xbox 360?" Mine is noisier than my PC by a wide margin.

    Compared to the 90s, I think retail desktop PCs are pretty quiet nowadays. (Of course, I built mine myself.)

    • by Fred IV (587429)

      If people were willing to spend $600 on a PS3 that sits in their living room, I don't see why they can't spend a few hundred for a PC. Heck, if you subtract the $100 "special adapter" from the price of the PC, you can get one real cheap.

      Meaning yet another power-hungry disposable consumer good to sit in common space now and in the landfill later. Thanks, but no thanks. Besides, if they already spent for a PS3, they can already stream video over wi-fi using PS3 Media Server [google.com].

  • Should use ATSC (Score:4, Insightful)

    by gr8_phk (621180) on Friday January 08, 2010 @10:12AM (#30694614)
    They should just broadcast it using ATSC. Then we wouldn't need a receiver on the TV, just the antenna.
  • If I understand correctly, digital television signals still use basic MPEG-2 compression, like DVDs. I'm not sure if this is still the case for HD streams (Blu-ray, etc.), but it seems to me that they can't fit all that much data on a disc compared to what you can download in a torrent.

    Meanwhile, I regularly stream xvid and h264 videos from my laptop to my "media" computer (a desktop PC connected to my TV running Ubuntu) using regular old 802.11G over SSH. (The ssh isn't necessary, but sshfs is pre

    • Re: (Score:3, Insightful)

      by delt0r (999393)
      If you want really good quality (as I do), then you're at the high-bandwidth end of the spectrum, and MPEG-2 is no worse than H.264 there (even the experts agree on this point). Basically, you're at the end where you're encoding quite a bit of noise (film grain etc.). H.264 shines at lower bitrates, but with massive increases in complexity and patents. Hell, the spec reads like a bunch of engineers had a stack of patents that they wanted to include in the spec.

      I know a lot of fan boys love h264 and believe that HD can
    • by n7ytd (230708)

      The Digital TV transition has been in the works for what, 10-12 years now? All that fancy-pants h.264 stuff takes processing power to decode, which means extra cost in every TV set.
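
A quick sanity check on the disc-capacity point in the thread above (a minimal sketch; the ~6 Mbit/s average MPEG-2 bitrate is an assumed typical figure for DVD video, not a number from the article):

```python
# Rough check: how much MPEG-2 video fits on a single-layer DVD?
# 4.7 GB capacity (decimal gigabytes), ~6 Mbit/s assumed average bitrate.
disc_bytes = 4.7e9
bitrate_bps = 6e6  # assumed typical average MPEG-2 bitrate for DVD video

seconds = disc_bytes * 8 / bitrate_bps
minutes = seconds / 60
print(f"~{minutes:.0f} minutes of MPEG-2 video per disc")  # ~104 minutes
```

So a single-layer disc holds roughly a feature film's worth of MPEG-2 at that bitrate, which is far less material than a comparable-size torrent of more efficiently compressed video.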

  • So many questions... (Score:3, Interesting)

    by dr_wheel (671305) on Friday January 08, 2010 @10:54AM (#30695326)

    ... and none of the articles I've read about 'Wi-Di' seem to answer them.

    How about sound? Transmitting video directly to my TV sounds nice, but how does this tech handle sending audio to an HT receiver? Is there potential for audio/video de-sync, and how will that be handled?

    Potential for latency issues? This could be a big one, especially for gaming.

  • Bandwidth? (Score:2, Insightful)

    by JustNiz (692889)

    I can't imagine that Wi-Fi has enough bandwidth for full HD, at least not without massive compression that would obviously degrade picture quality.
    Someone wake me up when this technology can transmit pixel-perfect full-screen HD video without the annoying dropouts existing Wi-Fi suffers from.
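
For scale on the bandwidth question above, a rough calculation of what uncompressed ("pixel-perfect") full HD would demand versus a Wi-Fi link (the 60 fps frame rate and 100 Mbit/s real-world throughput are illustrative assumptions, not figures from the article):

```python
# Raw (uncompressed) 1080p video bandwidth vs. a realistic Wi-Fi link.
width, height = 1920, 1080
bits_per_pixel = 24  # 8-bit RGB, no chroma subsampling
fps = 60             # assumed frame rate

raw_bps = width * height * bits_per_pixel * fps
print(f"raw 1080p60: {raw_bps / 1e9:.2f} Gbit/s")  # 2.99 Gbit/s

wifi_throughput_bps = 100e6  # assumed real-world 802.11n throughput
print(f"compression ratio needed: ~{raw_bps / wifi_throughput_bps:.0f}:1")  # ~30:1
```

On those assumptions the raw stream is around 3 Gbit/s, so any Wi-Fi-based display link necessarily compresses by a large factor, which is the commenter's point.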
