Wow, the AC responses from people who don't even RTFC (comments) keep rolling in.
I responded to an OP who wanted an HD optical disc player, but claimed to be waiting for the war between HD-DVD and BluRay to settle down. He wasn't concerned about the next cycle or possible streaming solutions, and he didn't indicate any uneasiness about the costs and benefits of BluRay vs. regular old DVD. He simply wanted an HD disc player, but didn't want to buy the wrong type. He was apparently unaware that the format war has been over for a year and a half.
I didn't address any of the other salient points in the HD Disc debate because I was merely responding to OP's concerns about BluRay vs. HD-DVD.
Read the entire thread. My post was a reply to someone who said he was waiting to buy until he saw whether HD-DVD or Blu-Ray won. Blu-Ray won. That is all I addressed.
Hence my comment regarding whether the quality improvement is worth the money. The grandparent poster's only concern, though, is waiting until a victor in the HD optical disc wars is declared, so he or she would be well advised to know that the battle is over and done.
HD-DVD is dead. There's no need to wait to see who will win, as that question was answered a year and a half ago when Toshiba (the banner carrier for HD-DVD) announced that it would discontinue all HD-DVD production. According to the wiki article, the entire HD-DVD promotion group was dissolved in March of last year. To my knowledge, no one builds new HD-DVD players; there are a small number of PC drives that include HD-DVD compatibility, but I assume that's because of the low cost of inclusion once the blue laser diodes for Blu-ray are already in the drive. You cannot walk into a retail store and find an HD-DVD player unless they found some hidden stock in the back and are clearance-selling it for $20. You can't find HD-DVD discs unless the same thing happens. Any movie that's come out since then will never come out on HD-DVD. HD-DVD is dead, voluntarily buried by its own support and manufacturing group.
In summary, there is no more waiting. The race was over last year. You can debate whether the quality improvement is worth the money, and there are some definite complaints to be made about the cost of the discs. If your only concern, however, is which of the formats will win, then there's no reason to continue waiting. Blu-Ray won last year.
Comparing an 8 port video switcher to an 8 port ethernet switch is an apples and oranges comparison, which is all I was trying to say; using an 8 port ethernet switch does require encoding hardware on all real-time devices, while the HDMI switch is simply something for hooking up raw video run in short runs. For its purpose, the HDMI switch is substantially cheaper. The fact that you were trying to compare the two led me to believe that you think people were using HDMI as structured cabling and trying to install it in a star topology, which couldn't be further from the truth. Different tools for different tasks.
iSCSI works spectacularly well because Ethernet has caught up to and in some cases surpassed local SCSI cabling speeds (or at least functionally matched them, once we factor in the limitations of an array of drives). We didn't invent some new, lossy drive communication spec to make it work; we mostly just spoke the same ol' SCSI block addressing system over a different cable. HDMI vastly surpasses gigabit ethernet in bandwidth (32 bpp x 1920x1080 pixels x 24 fps minimum, plus audio), requires different latency correction, and has no video compression applied. If 10Gbps ethernet sees an enormous drop in price, then HDMIoE could certainly work and would be a fantastic system. What you're describing, however, is not HDMIoE.
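Just to put numbers on that bandwidth claim, here's a quick back-of-envelope sketch (the 32 bpp and 24 fps figures are the rough estimates from above; real HDMI links also carry blanking intervals, audio, and encoding overhead):

```python
# Rough comparison of raw 1080p HDMI video bandwidth vs. gigabit
# ethernet. Figures are approximate, per the estimate in the post.

width, height = 1920, 1080   # 1080p frame
bits_per_pixel = 32          # 24-bit color padded to 32
fps = 24                     # minimum (film) frame rate

video_bps = width * height * bits_per_pixel * fps  # 1,592,524,800 bits/s
gigabit_bps = 1_000_000_000

print(f"Raw 1080p24 video: {video_bps / 1e9:.2f} Gbps")   # ~1.59 Gbps
print(f"Gigabit ethernet:  {gigabit_bps / 1e9:.2f} Gbps")
print(f"Video needs {video_bps / gigabit_bps:.1f}x a gigabit link, before audio")
```

Even at the bare minimum frame rate, raw 1080p doesn't fit down a gigabit pipe; at 60 fps it's roughly 4 Gbps.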
The decode chip is unnecessary in a real-time device such as a game system; those systems already use their graphics processors for decoding. They're necessary on a video box that's transcoding video, sure, but not on a game system. They can add it to the GPU if they see a compelling reason to do so, such as your hypothetical multi-room gaming, but it's not a good idea to make it compulsory. The difference may not be horrific, but even $5 or $10 adds up on low margin items.
If you want 1080p or 3D, yes, you'll have to upgrade your TV, but you don't have to upgrade your TV to display MPEG-4 encoded videos just because they became common in the time since you bought your display. You don't have to upgrade your TV to interface with a different audio standard just because it became available after the fact. You're already limited in upgrading your display technology, but building in the box limits the input technology as well. If your argument is that TVs change quickly enough that it won't be a problem, I'd wonder why you think people are going to want to replace each of their smaller TVs scattered about.
The 19" TV for $150 is not going to magically include the $200 STB parts (minus the minor costs of the enclosure, power supply, and a few redundancies) for $150. If the parts are cheap enough to cram into a $150 TV, they're going to be cheap enough to package in a $30 STB if someone wants them. If someone doesn't, the TV can still tune normal broadcast television, be used with a standalone disc player, etc. I don't know why they wouldn't get the hypothetical $30 STB, but that's beside the point. Any smaller display scattered around that supports your system would cost nearly the same as a plain display plus an STB that supported the same system, and the latter wouldn't be obsoleted by the introduction of a new codec or STB feature that couldn't (or wouldn't) be added by that TV's manufacturer.
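To make that comparison concrete (all of these dollar figures are the hypothetical prices from this discussion, not real retail data):

```python
# Hypothetical price comparison: TV with built-in STB parts vs.
# plain TV plus a cheap external STB. All figures are assumptions
# from the discussion, not actual retail prices.

plain_tv = 150              # 19" TV
external_stb = 30           # hypothetical cheap STB with the same parts
standalone_stb_price = 200  # the STB parts priced as a standalone box

combined = plain_tv + external_stb
print(f"Plain TV + external STB: ${combined}")            # $180
print(f"Standalone STB today:    ${standalone_stb_price}")

# The point: if the parts really only cost ~$30, sell them as a
# separate, replaceable box; if they cost ~$200, a $150 TV can't
# include them for free.
```

Either way, the embedded version saves little and locks the parts to the lifespan of the panel.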
I'm still a bit confused about shunting raw video around, and my confusion goes all the way back to your first post. Your insistence on comparing HDMI and Ethernet is baffling given that they're intended for two completely different purposes. HDMI is not poised to replace Ethernet, and it never will or should; doing so would be asinine. If there's an STB in front of every TV, it will interface with the television using a raw video format and speak to the rest of the network via normal networking equipment, for no reason more complex than compatibility and cost reduction. HDMI includes a simple, low-bandwidth data channel (I have no idea of the speed) suitable for signaling devices to turn on/off, change volume, etc.; there's a new version with 100 Mbit ethernet built in, but the consensus from everyone seems to be that it's a solution in search of a problem. No one is using HDMI for anything other than transmitting raw audio and video a few feet (perhaps 50-75 feet in the case of front projectors) from a source device to a display/audio device. If an STB sits outside of the television, it makes sense to minimize the cost of transmitting the data to the display, and that means raw video and audio.
If you say so; a chip whose application I've never seen in ANY consumer device anywhere (and which has been available for at least two years) isn't going to "be embedded in gizmos anyways." I'm not questioning that real-time encoding hardware exists, just the cost. What's the cost of this device, and what's the cost of placing it in every relevant box you own? What's the cost of replacing every TV with models capable of being properly upgraded to deal with this system, including the new codecs?
Now, what's the cost of doing that instead of handling what you're describing with a single, inexpensive set top box that can be more easily replaced, upgraded, and installed, and that outputs to a television as a monitor? I don't think we're necessarily arguing for different things here; I want ubiquitous network access to a television as much as you do, but I disagree that there's a price savings to be had in embedding compression hardware (not to mention the lock-in and backwards compatibility nightmares it brings) in every device.
Network tuners already exist. Network surveillance cameras already exist. They can both already write to NAS drives without any trouble, and are perfectly capable of being controlled over a network. Set top boxes can already software switch inputs, poll for meta-data for movies being played, access information on other STBs, communicate with other networked devices.
A $200 Popcorn Hour set top box is already hardware capable of doing nine-tenths of what you listed (the multi-display game system excluded, although advances in STB hardware would certainly allow the game system to act as a server and render those images on the STB). The same can be done in any recent model digital cable or satellite box. An STB communicates with network devices over the network and with displays over HDMI, it avoids the need for encoding video when it isn't necessary, and when technology has progressed far enough that things need to change, it can be replaced much more easily. No need to fret about firmware-upgradeable televisions to support a new compression format, or about developing new compression formats to deal with the problems inherent in your system, because you treat the display as a display and leverage economies of scale on a box that's more easily replaceable. Arguing that packaging the STB in the TV is cheaper is missing the forest for the trees. Again, it works great for prerecorded content, but falls apart for real-time video, where upgradeability, compatibility, and cost are all better served by a cheap external box.
The HDMI switch has an "original price" of $250 in the same sense that a car has a sticker price. Monoprice is not a clearance site; if you want to pay $150 to someone else, you're free to do so.
HDMI does indeed support addressing. You may want to read up on the standard before you begin claiming things that aren't true. Any HDMI device can communicate with any other over the HDMI system, although no one has built a switch that handles things in such a manner because there's no demand for such a product (as I'll explain in a bit).
I didn't go looking for an industrial part because there's no pricing available; there never is for products where the purchaser is expected to buy in lots of 1,000. The Conexant chips you linked A) have no price listed and B) don't support HD resolutions, so they're a moot point anyway.
What I did link was a product with the necessary industrial hardware buried inside. If we ignore the A/D converter (those are dirt cheap anyway), the USB chipset, and the fan (yes, fan) necessary to keep the compression hardware cool, then we can make a reasonable approximation of the cost of the HD compression hardware inside. If you can find an HD-capable industrial part, including the price, please feel free to link it.
By the way, compression chips don't care whether they're taking digital input from a source that was originally digital or digital input from an A/D converter. It's already digital by the time it hits the compression hardware. There might be a tiny, minuscule hit in compression efficiency due to slight imperfections in the digitized video from the A/D converter, but it's negligible.
Here we have a $200 box. Let's say that $100 of it is due to the unnecessary A/D equipment, USB hardware, and the enclosure itself. Hell, I'll be super generous and say that only $50 of that device is the actual compression chip. Now you're telling me that we should put a $50 encoder chip in a $200 Xbox 360 or a $400 PS3? Let's say the cost of those chips drops ten-fold due to economy of scale. $5 is still a ton of lost profit on devices that are known for being sold at a loss. And that's all for an encoder chip capable of doing 1080p @ 30 fps. What about 60 fps? What about 120 fps or 240 fps for the newer 120 Hz/240 Hz panels? What about the problems introduced when you run MPEG-2 compression on the text of the website I'm trying to read with my networked video device?
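The margin math works out like this (every dollar figure here is the guess from this post, not a real bill-of-materials number):

```python
# Back-of-envelope encoder-chip cost, using the deliberately
# generous guesses from the post. None of these are real BOM figures.

box_price = 200             # retail price of the USB capture box
generous_other_parts = 150  # A/D converter, USB hardware, enclosure, margin
encoder_chip = box_price - generous_other_parts  # "super generous" $50 estimate

volume_discount = 10        # assume a 10x price drop at console volumes
per_console = encoder_chip / volume_discount

print(f"Encoder chip estimate:  ${encoder_chip}")        # $50
print(f"Cost added per console: ${per_console:.2f}")     # $5.00

# Even $5/unit matters on hardware sold at or below cost:
units_sold = 10_000_000     # hypothetical install base
print(f"Across {units_sold:,} units: ${per_console * units_sold:,.0f}")
```

Ten million units at $5 each is $50 million of margin, gone, for a feature most buyers would never use.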
On top of that, you still haven't addressed the problem of poor video compression, the reasons why we're even bothering to introduce lossy compression to an uncompressed video output signal in the first place, or the issue of codec stagnation when we're dealing with everything being transcoded to MPEG-2 in the end.
Finally, I still can't work out the problem you're trying to solve in the first place. You seem so intent on comparing a video output signal to a network cable that you can't tell me WHY you want to do this. You have a vague requirement of routing inputs to outputs, but we already have a system in place for that. Commercially available set top boxes such as the Apple TV, the Popcorn Hour, and similar devices have been around for several years, and homebrew solutions for longer than that. They all allow us to use existing home networks to send only the compressed video that's necessary, then layer their interface on top before outputting over regular video outputs. Got a video on device A but want to play it on TV B? We can just send the video file over an existing network. You want us to jump through hoops: decode the file at device A (remember, it might not be in MPEG-2 or whatever format your system uses), layer our interface on top, re-encode to an MPEG-2 Transport Stream, then send it to TV B. Or we can have an inexpensive set top box at TV B, capable of handling any input format and without needing the hardware to encode back to Transport Stream format, and just dump the framebuffer out to the TV.
I won't lie, your system sounds incredibly appealing except for the edge cases. That's why it already exists. DTVs have been capable of doing EXACTLY what you're describing for at least 6 or 7 years over firewire. A standard exists to allow MPEG-2 Transport Streams over firewire from any device to any device, and it even supports simplistic menu systems. Hell, there's even a standard that allows a DTV to dump a tuned signal out to a firewire hard drive (provided the drive enclosure can speak the necessary protocol) for use as a tremendously inexpensive DVR. There's even a company that sells software that makes your computer emulate the AVCHD system, and I'm sure someone could implement an OSS solution in a couple of weeks if they were bored and resourceful. Because there's plenty of bandwidth left on the firewire stream, it could even support DRM for the Hollywood crew. Running the firewire protocol over ethernet cabling is trivial to implement, and involves not much more than dropping in a box with a firewire port on one side and an ethernet port on the other.
It never caught on because it fails at the edge cases. Game systems require encoding hardware, computers require encoding hardware. Video transmitted in a format other than the MPEG-2 TS (like, say, everything except OTA DTV) requires decoding and encoding hardware. Face it: you want your television to be a thin client rather than a monitor, except instead of one central server you want every device to be capable of acting as a server. The minor upside (it's awesome and cheap for prerecorded video in MPEG-2 TS format) is completely outweighed by the myriad of downsides (it requires extra, expensive hardware for any device that isn't doing prerecorded video in MPEG-2 TS format, and it has poorer image quality for real-time video).
XP-x64 is really Windows Server 2003 with the XP appearance tacked on top. It's a fine OS, but it's also an orphaned child that's often left aside. It was cooked up as a temporary stopgap until Vista64, and it served its stopgap purpose.
Drivers are non-existent for some pieces of hardware. Pretty much any hardware needs to have XP and Vista drivers, but XP-x64 isn't actually XP (it requires 64-bit drivers), so the drivers aren't necessarily a drop-in replacement. With the release of Vista-64 as Microsoft's 64-bit desktop OS, XP-x64 is also a complete dead-end in the driver department; new hardware comes out, and since Vista64 and Windows Server 2008 already exist, there's not as much reason for companies to bother with driver support for XP-x64. It's not worth the testing or support resources necessary for an OS that only ever commanded a tiny fraction of the market. On top of that, plenty of install applications fail because they check for XP or Vista but not XP-x64; even though the program will run, it can't be installed without some irritating workarounds.
On top of that, his IT department may be unwilling to dedicate the resources necessary to maintaining one or a few workstations with a totally different OS and image than the rest of the systems. You may argue that it's IT's job to do that, but they also need to weigh costs and benefits; perhaps they've already determined that the hardware or critical software isn't supported under XP-x64, or perhaps they're about to migrate to Win7 and it simply isn't worth the extra cost and hassle until they start migrating people in 9 months.
"Just think, with VLSI we can have 100 ENIACS on a chip!" -- Alan Perlis