
Comment The Rosewill RSV-S8 (Score 4, Interesting) 210

The Rosewill RSV-S8 is pretty much exactly what you've described. It's an eSATA enclosure with 8 drive caddies, a power supply, and a fan. It presents the drives to the system as JBOD or one of the various common versions of RAID (implemented in software, I assume). Ignore the comically inflated MSRP; it's $300 on Newegg. It ships with its own eSATA card for compatibility purposes, but I assume it would work with any eSATA adapter that followed the proper specifications. There's also a five drive version available for about $100 less, give or take. I can't speak to the reliability or ease of use, but this sounds like it will fit your requirements.

Comment Re:Forest, meet trees. (Score 1) 685

Wow, the AC responses from people who don't even RTFC (comments) keep rolling in.

I responded to an OP who wanted an HD optical disc player, but claimed to be waiting for the war between HD-DVD and BluRay to settle down. He wasn't concerned about the next cycle or possible streaming solutions, and he didn't indicate any uneasiness about the costs and benefits of BluRay vs. regular old DVD. He simply wanted an HD disc player, but didn't want to buy the wrong type. He was apparently unaware that the format war has been over for a year and a half.

I didn't address any of the other salient points in the HD Disc debate because I was merely responding to OP's concerns about BluRay vs. HD-DVD.

Comment Re:early adopters VSs the luddites (Score 4, Insightful) 685

HD-DVD is dead. There's no need to wait to see who will win; that question was answered a year and a half ago when Toshiba (the banner carrier for HD-DVD) announced that it would discontinue all HD-DVD production. According to the wiki article, the entire HD-DVD promotion group was dissolved in March of last year. To my knowledge, no one builds new HD-DVD players; there are a small number of PC drives that include HD-DVD compatibility, but I assume that's because of the low cost of inclusion once the blue laser diodes for Blu-ray are already in the drive. You cannot walk into a retail store and find an HD-DVD player unless they found some hidden stock in the back and are selling it on clearance for $20. You can't find HD-DVD discs unless the same thing happens. Any movie that's come out since then will never come out on HD-DVD. HD-DVD is dead, voluntarily buried by its own support and manufacturing group.

In summary, there is no more waiting. The race was over last year. You can debate whether the quality improvement is worth the money, and there are definite complaints to be made about the cost of the discs. If your only concern, however, is which of the formats will win, then there's no reason to continue waiting. Blu-ray won last year.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Oh wait, you're going to use video compression? Because clearly, in the dozen or so earlier posts where we debated the relative merits and costs of video compression and the potential downsides of forcing compressed video for final transmission to the display, I didn't realize you were going to use video compression. The link to the wikipedia article really helped cement that idea for me.

It's all so clear now. My concerns regarding codec lock-in, compatibility, the cost of redundant compression chips, etc. are all answered because you've repeated your plan for the twelfth time and now claimed that I don't think video can be compressed.

You think it can be done cheaply enough to avoid all of the downsides. I don't, and I think that the final step from the STB or similar device should use raw video for cost and compatibility's sake. That's apparently where we stand, and only a decade or so of time will tell whether either of us is correct.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Comparing an 8 port video switcher to an 8 port ethernet switch is an apples-and-oranges comparison, which is all I was trying to say; using an 8 port ethernet switch requires encoding hardware on all real-time devices, while the HDMI switch is simply something for hooking up raw video over short runs. For its purpose, the HDMI switch is substantially cheaper. The fact that you were trying to compare the two led me to believe that you think people were using HDMI as structured cabling and trying to install it in a star topology, which couldn't be further from the truth. Different tools for different tasks.

iSCSI works spectacularly well because Ethernet has caught up to and in some cases surpassed local SCSI cabling speeds (or at least functionally matched them, once we factor in the limitations of an array of drives). We didn't invent some new, lossy drive communication spec to make it work; we mostly just spoke the same ol' SCSI block addressing system over a different cable. HDMI vastly surpasses gigabit ethernet in bandwidth (32 bpp x 1920x1080 pixels x 24 fps minimum, plus audio), requires different latency correction, and has no video compression applied. If 10Gbps ethernet makes an enormous drop in price, then HDMIoE could certainly work and would be a fantastic system. What you're describing, however, is not HDMIoE.
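To put numbers on that parenthetical, here's the arithmetic as a quick Python sketch (audio and blanking intervals ignored, which only makes the gap wider):

    # Back-of-the-envelope: raw HDMI video bandwidth vs. gigabit ethernet,
    # using the figures above (32 bpp, 1920x1080, 24 fps minimum).
    bits_per_pixel = 32
    width, height = 1920, 1080
    fps = 24                      # the floor; plenty of content runs at 60

    video_bps = bits_per_pixel * width * height * fps
    gigabit_bps = 1_000_000_000   # nominal, before ethernet framing overhead

    print(f"raw video at 24 fps: {video_bps / 1e9:.2f} Gbps")           # ~1.59
    print(f"raw video at 60 fps: {video_bps * 60 / 24 / 1e9:.2f} Gbps") # ~3.98
    print(f"fits in gigabit ethernet? {video_bps <= gigabit_bps}")      # False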

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

The decode chip is unnecessary in a real-time device such as a game system; those systems already use their graphics processors for decoding. They're necessary on a video box that's transcoding video, sure, but not on a game system. They can add it to the GPU if they see a compelling reason to do so, such as your hypothetical multi-room gaming, but it's not a good idea to make it compulsory. The difference may not be horrific, but even $5 or $10 adds up on low margin items.

If you want 1080p or 3D, yes, you'll have to upgrade your TV, but you don't have to upgrade your TV to display MPEG-4 encoded videos just because they became common in the time since you bought your display. You don't have to upgrade your TV to interface with a different audio standard just because it became available after the fact. You're already limited in upgrading your display technology, but building the box into the TV limits the input technology as well. If your argument is that TVs change quickly enough that it won't be a problem, I'd wonder why you think people are going to want to replace each of the smaller TVs they have scattered about.

The 19" TV for $150 is not going to magically include the $200 STB parts, minus the minor cost of the enclosure, power supply, and a few redundancies, for $150. If the parts are cheap enough to cram into a $150 TV, they're going to be cheap enough to package in a $30 STB if someone wants them. If someone doesn't, the TV can still tune normal broadcast television, be used with a standalone disc player, etc., if that person wants it. I don't know why they wouldn't get the hypothetical $30 STB, but that's beside the point. Any smaller display scattered around that supports your system would cost nearly the same as a plain display plus an STB that supported the same system, and the latter wouldn't be obsoleted by the introduction of a new codec or STB feature that couldn't (or wouldn't) be added by that TV's manufacturer.

I'm still a bit confused about shunting raw video around, and my confusion goes all the way back to your first post. Your insistence on comparing HDMI and Ethernet is baffling given that they're intended for two completely different purposes. HDMI is not poised to replace Ethernet, and it never will or should; doing so would be asinine. If there's an STB in front of every TV, it will interface to the television with a raw video format and speak to the rest of the network via normal networking equipment, for no reason more complex than compatibility and cost reduction.

HDMI includes simple, low-bandwidth data transfer (I have no idea of the speed) suitable for signaling devices to turn on/off, change volume, etc.; there's a new version with 100 Mbit ethernet built in, but the consensus from everyone seems to be that it's a solution in search of a problem. No one is using HDMI for anything other than transmitting raw audio and video a few feet (perhaps 50-75 feet in the case of front projectors) from a source device to a display/audio device. If an STB sits outside of the television, it will make sense to minimize the costs of transmitting the data to the display, and that means raw video and audio.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

"yes, but I'm afraid you are not returning the courtesy. The chipset spec already sent describes reading from HDMI and encoding, for example, to write to a disk on a PVR."

And that chipset, the Broadcom 7043, isn't used in any of the devices you posted. They're all using various decoding-only chipsets. If you Google any of the other chipsets, the first page of results has several mentions of various Blu-ray players and STBs using them. If you Google the Broadcom 7043, BCM 7043, or BCM-7043, the first five pages are nothing but press releases describing the capabilities of the chip, other pages reprinting those press releases, and one post on a TiVo message board where someone dreams about it being in a TiVo. Nothing is using the encoding-capable chip, and that's not terribly surprising. Very, very few people are demanding the ability to record from HDMI, and it's a copy-protected input anyway, so the devices you could record from are few and far between. We have no point of reference for how much a 7043 costs, but based on other encoding devices I've seen, it's not a cheap part. For a real-time device such as a game console, it's unnecessary to encode the output only to decode it again; if manufacturers want to add the ability to stream to an "IPTV compatible" device, they could do so, but it's pointless to force the replication of the same encoding part into each game console or similar real-time video device.

I already have a system that does 9/10s of what you describe. It's a home theater computer, and it can be built for ~$350; it does so with a mixture of software that's about 85% user-friendly and 15% frustrating as hell. It also illustrates the problem both of our systems will run into: neither standard will automatically and intelligently interoperate. Both of us are arguing about the features that could be leveraged by such a system, but we're both being naive if we think either system would interoperate intelligently simply by "being there." Frankly, the only things required for all of this to work are ethernet, NAS, something capable of decoding 1080p video, and software. That last item is the real sticking point, and various fortunes have been destroyed in the last 10 years by people and companies trying to develop and cash in on a proper standard for home device communication. Simply jamming ethernet on the back of the TV isn't going to fix the problem any more than my claiming an inexpensive box will do it.

However, if we do hit the point where the cheap box can do all of that, I'd still argue that it wouldn't be better to include it in the TV, for the reasons I posted earlier. On the real-time device side, redundantly replicating encoding hardware is not as cheap or efficient as simply letting the device output a framebuffer to the display. As for the TV "computer," firmware can be upgraded, but sooner or later you'll want to do something the built-in "box" can't physically do. When you hit that point, it's easier and more cost-efficient to handle that work in an external device. The ~$20-$30 savings from ditching the power supply, enclosure, and some of the PCB components become worthless the moment you have to replace the TV to get a new feature, when you could just replace a $150 or $200 box and be good to go.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Every one of those devices is using Broadcom chips for playback, which is no big deal. Show me the device that is using an encoder. We've been talking about encoders for how many posts now, and you still seem to be under the impression that they already exist in every device, when that's patently untrue. A cheap-as-hell HD-capable decoder is nothing special, and the fact that a company chose one from provider A instead of B is nothing to write home about. Show me the consumer device with the HD-capable encoder. Seriously, are you even reading what I write? An encoder is unnecessary in every one of those devices because the data comes in already encoded, but an encoder WOULD be necessary to render any real-time video usable under your system.

Is there some sort of language barrier here? Your English is flawless, but your comprehension is awful. You don't need a DVR, disc player, sling box, etc. A single box can do it all. I thought I just spent several paragraphs making that clear.

Again, since you seem unable to catch my message: A SINGLE BOX CAN DO IT ALL. The technology already exists, and it's about $200. It's here. Today. You're going to need a NAS somewhere on your network (I'm not sure why that's unnecessary in your vision, but something has to store recorded video), but otherwise what you describe exists right now, today, with a single box source. Any device that can access the internet/local network and play 1080p video is physically capable of handling every single thing you describe.

Slingbox: Take a video and send it out over the internet. The STB already reads digital video in, decodes it, and sends it up to the TV. The only reason today's cable box, satellite box, or network tuner doesn't already offer an option to skip the decoding and just send out the digital stream is that no one has added a software function to do it (there's a sketch of how little that function would involve after this list).

PVR: An STB can save the digital stream over the network to a NAS. Play it back from a NAS. Again, all the hardware for this is here, and a select few devices have the software necessary to do this. Mostly, it's just a question of someone bothering to add the software functionality for it.

NAS: Uh, you're going to need network attached storage somewhere. I'm not sure why you're keeping yours underneath the television, though.

Disc Player: Mostly unnecessary, and probably won't be around in 10 years. The only thing that prevents an optical disc player from sharing the disc over the network is a lack of willingness on Hollywood's part. If the player could share it out over ethernet (nothing fancy there), any 1080p-capable STB has all the hardware necessary to play that stream, and the disc player could be located anywhere on the network. The more tech-oriented in the world can simply rip the disc to their NAS and then play it back from NAS->STB. Again, every single piece of hardware necessary for this exists; it's just a question of software.

A VCR: Seriously? I doubt there will be IP-capable VCRs in your future. If we were using a VCR, there's going to be an encoder involved somewhere in the process; likely, you'd use it long enough to rip the video to the NAS and then toss it. At that point, we're looking at playing back a stored file under either system.

A computer: The STB is capable of browsing the web, downloading meta-data, playing back files, etc. If what you want is a general purpose computer running a desktop OS and used like a general purpose computer, then you should know that the ergonomics of the living room make it a lousy choice anyway. Nevertheless, any form of VNC is already capable of handling this, and an STB is more than capable of running a VNC client. Programs such as FRAPS already exist to record your computer desktop, and the only thing keeping it from using a standard streaming protocol rather than storing it as a file on the NAS is someone taking the short while to implement it in software. Again, the STB is capable of video playback; it has no problem deciding where the video comes from.
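Since the slingbox item above hinges on "no one has added a software function," here's roughly what that function amounts to. This is a minimal sketch, assuming a tuner that exposes demuxed transport stream packets as a readable device; the device path and port are hypothetical stand-ins, and error handling and multiple clients are omitted:

    # Forward an already-encoded MPEG transport stream to one remote client
    # without ever decoding it -- the whole "software slingbox" in miniature.
    import socket

    TUNER_DEVICE = "/dev/dvb/adapter0/dvr0"  # hypothetical demuxed TS source
    CHUNK = 188 * 64   # MPEG-TS packets are 188 bytes; send them in batches

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", 9000))
    server.listen(1)
    conn, _addr = server.accept()

    with open(TUNER_DEVICE, "rb") as tuner:
        while True:
            packets = tuner.read(CHUNK)   # compressed bits straight off the tuner
            if not packets:
                break
            conn.sendall(packets)         # no decode, no re-encode, just copy
    conn.close()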

I've already elucidated the reasons why you'd want the single box. It's easier to upgrade than a whole television, and it doesn't require encoding hardware to transmit raw video to a television. My solution (the one the entire world uses) is backwards compatible with every television made since the 1950s and has an unlimited upgrade path. If codecs or features or requirements ever outstrip what the TV-locked standard you've proposed can do, we simply replace the STB. You mention the CableCARD system that allows you to use your TV as an STB; that's a perfect example. As soon as your cable company decides to migrate its system from MPEG-2 to MPEG-4 for the quality and bandwidth savings, a CableCARD-equipped television is useless.

Your standard requires us to replace every television out there and every video playback device. Granted, we can slowly transition to your standard, but even once we're there we have to add in encoders (again, encoders: the expensive chips with an 'e' at the beginning, not the cheap ones with a 'd') in devices that don't have much use for them. Once we've transitioned, we're locked there. Firmware can be upgraded, sure, but that's not a magic fix-all. As soon as there's a feature or capability that your TV can't handle, your system is obsoleted and we have to ditch the TV or start a separate upgrade path.

I'll say it again: you and I aren't really arguing different things. We both want ubiquitous network access at the television, but you want an STB inside the TV and I don't. Your solution puts the ethernet port on the TV, and mine puts it on an STB. My assertion is that the minor cost savings from moving the STB inside the television are undone by the expense of adding encoders where they aren't necessary, transitioning all of our televisions yet again, and giving up the ease of upgrade that comes from handling things externally in a cheap box. If you respond to this post, please explain to me how your system will get around these problems.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

If you say so; a chip I've never seen applied in ANY consumer device anywhere (and which has been available for at least 2 years) isn't going to "be embedded in gizmos anyways." I'm not questioning that real-time encoding hardware exists, just the cost. What's the cost of this device, and what's the cost of placing it in every relevant box you own? What's the cost of replacing every TV with models capable of being upgraded properly to deal with this system, including the new codecs?

Now what's the cost of doing that instead of handling what you're describing with a single, inexpensive set top box that can be more easily replaced/upgraded/installed and that outputs to the television as a monitor? I don't think we're necessarily arguing for different things here; I want ubiquitous network access at the television as much as you do, but I disagree that there's a price savings to be had in embedding compression hardware (not to mention the lock-in and backwards compatibility nightmares it brings) in every device.

Network tuners already exist. Network surveillance cameras already exist. Both can already write to NAS drives without any trouble, and both are perfectly capable of being controlled over a network. Set top boxes can already software-switch inputs, poll for meta-data on movies being played, access information on other STBs, and communicate with other networked devices.

A $200 Popcorn Hour set top box is already hardware-capable of doing 9/10s of what you listed (the multi-display game system excluded, although advances in STB hardware would certainly allow the game system to act as a server and render those images on the STB). The same can be done in any recent-model digital cable or satellite box. An STB communicates with network devices over the network and communicates with displays over HDMI; it avoids the need to encode video when it isn't necessary, and when technology has progressed far enough that things need to change, it can be replaced much more easily. There's no need to fret about firmware-upgradeable televisions supporting a new compression format, or about developing new compression formats to deal with the problems inherent in your system, because you treat the display as a display and leverage economies of scale on a box that's more easily replaceable. Arguing that packaging the STB in the TV is cheaper is missing the forest for the trees. Again, it works great for prerecorded content, but it falls apart for real-time video, where upgradeability, compatibility, and cost savings are all better served by a cheap external box.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

The HDMI switch has an "original price" of $250 in the same sense that a car has a sticker price. Monoprice is not a clearance site; if you want to pay $150 to someone else, you're free to do so.

HDMI does indeed support addressing. You may want to read up on the standard before you begin claiming things that aren't true. Any HDMI device can communicate with any other over the HDMI system, although no one has built a switch that handles things in such a manner because there's no demand for such a product (as I'll explain in a bit).

I didn't go looking for an industrial part because there's no pricing available. There never is for products where the purchaser is expected to buy in lots of 1,000. The Conexant chips you linked have A) no price listed and B) no support for HD resolutions, so they're a moot point anyway.

What I did link was a product with the necessary industrial hardware buried inside. If we ignore the A/D converter (those are dirt cheap anyway), the USB chipset, and the fan (yes, fan) necessary to keep the compression hardware cool, then we can make a reasonable approximation of the cost of the HD compression hardware inside. If you can find an HD-capable industrial part, including the price, please feel free to link it.

By the way, compression chips don't care whether they're taking digital input from a source that was originally digital or digital input from an A/D converter. It's already digital by the time it hits the compression hardware. There might be a tiny, minuscule hit in speed due to the digitized video from the A/D converter having slight imperfections, but it's negligible.

Here we have a $200 box. Let's say that $100 of it is due to the unnecessary A/D equipment, the USB hardware, and the enclosure itself. Hell, I'll be super generous and say that only $50 of that device is the actual compression chip. Now you're telling me that we should have a $50 encoder chip in a $200 Xbox 360 or a $400 PS3? Let's say that the cost for those chips drops tenfold due to economies of scale. $5 is still a ton of lost profit on devices that are known for being sold at a loss. And that's all for an encoder chip capable of doing 1080p at 30 fps. What about 60 fps? What about 120 fps or 240 fps for the newer 120 Hz/240 Hz panels? What about the problems introduced when you run MPEG-2 compression on the text of the website I'm trying to read with my networked video device?
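Spelling out that arithmetic (the dollar figures are my own guesses from above, not quoted prices, and the pixel rates are the raw encoding workload at each frame rate):

    # Scaling the encoder-cost guesses above; all dollar figures are assumptions.
    chip_cost_now = 50                        # generous low guess from the $200 box
    chip_cost_at_scale = chip_cost_now / 10   # assume a tenfold economy-of-scale drop
    print(f"per-console encoder cost at scale: ${chip_cost_at_scale:.0f}")  # $5

    pixels_per_frame = 1920 * 1080
    for fps in (30, 60, 120, 240):
        mpix = pixels_per_frame * fps / 1e6
        print(f"{fps:>3} fps -> {mpix:,.1f} Mpixels/s to encode in real time")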

On top of that, you still haven't addressed the problem of poor video compression, the reasons why we're even bothering to introduce lossy compression to an uncompressed video output signal in the first place, or the issue of codec stagnation when we're dealing with everything being transcoded to MPEG-2 in the end.

Finally, I still can't work out the problem you're trying to solve in the first place. You seem so intent on comparing a video output signal to a network cable that you can't tell me WHY you want to do this. You have a vague requirement of routing inputs to outputs, but we already have a system in place for that. Commercially available set top boxes such as the Apple TV, Popcorn Hour, and similar have been around for several years, and homebrew solutions have been around longer than that. They all use existing home networks to send only the compressed video that's necessary, then layer their interface on top before outputting over regular video outputs. Got a video on device A but want to play it on TV B? We can just send the video file over the existing network (see the sketch below). You want us jumping through hoops to decode the file at device A (remember, it might not be in MPEG-2 or whatever format your system uses), layer our interface on top, re-encode to an MPEG-2 transport stream, and then send it to TV B. Or we can have an inexpensive set top box at TV B, capable of handling any input format and needing no hardware to encode back to transport stream format, and just dump the framebuffer video out to the TV.
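Here's the sketch mentioned above of how little machinery the "send the file over the existing network" path needs; the directory and port are arbitrary, and it assumes the STB's player can fetch over HTTP (most can):

    # Device A publishes an already-compressed file over the LAN; the STB at
    # TV B plays it with whatever decoder matches the file. Stdlib only.
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
    import os

    os.chdir("/srv/videos")   # hypothetical directory holding the video file
    # Any player on the LAN can now open http://<device-A>:8080/movie.mkv --
    # the file crosses the network in its original codec, untranscoded.
    ThreadingHTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()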

I won't lie: your system sounds incredibly appealing except for the edge cases. That's why it already exists. DTVs have been capable of doing EXACTLY what you're describing for at least 6 or 7 years over firewire. A standard exists to allow MPEG-2 transport streams over firewire from any device to any device, and it even supports simplistic menu systems. Hell, there's even a standard that allows a DTV to dump a tuned signal out to a firewire hard drive (provided the drive enclosure can speak the necessary protocol) for use as a tremendously inexpensive DVR. There's even a company that sells software that causes your computer to emulate the AVCHD system, and I'm sure someone could implement an OSS solution in a couple of weeks if they were bored and resourceful. Because there's plenty of bandwidth left on the firewire stream, it could even support DRM for the Hollywood crew. Firewire protocol over ethernet cabling is trivial to implement, and involves not much more than dropping in a box with a firewire port on one side and an ethernet port on the other.

It never caught on because it fails at the edge cases. Game systems require encoding hardware, computers require encoding hardware. Video transmitted in a format other than the MPEG-2 TS (like, say, everything except OTA DTV) requires decoding and encoding hardware. Face it: you want your television to be a thin client rather than a monitor, except instead of one central server you want every device to be capable of acting as a server. The minor upside (it's awesome and cheap for prerecorded video in MPEG-2 TS format) is completely outweighed by the myriad of downsides (it requires extra, expensive hardware for any device that isn't doing prerecorded video in MPEG-2 TS format, and it has poorer image quality for real-time video).

Comment Re:So? (Score 3, Informative) 182

XP-x64 is really Windows Server 2003 with the XP appearance tacked on top. It's a fine OS, but it's also an orphaned child that's often left aside. It was cooked up as a temporary stopgap until Vista64, and it served that purpose.

Drivers are non-existent for some pieces of hardware. Pretty much all hardware gets XP and Vista drivers, but XP-x64 isn't actually XP (it requires 64-bit drivers), so those drivers aren't a drop-in replacement. With the release of Vista64 as Microsoft's 64-bit desktop OS, XP-x64 is also a complete dead end in the driver department; new hardware comes out, and since Vista64 and Windows Server 2008 already exist, there's little reason for companies to bother with driver support for XP-x64. It's not worth the testing or support resources for an OS that only ever commanded a tiny fraction of the market. On top of that, plenty of installers fail because they check for XP or Vista but not XP-x64; even though the program would run fine, it can't be installed without some irritating workarounds.

On top of that, his IT department may be unwilling to dedicate the resources necessary to maintaining one or a few workstations with a totally different OS and image than the rest of the systems. You may argue that it's IT's job to do that, but they also need to weigh costs and benefits; perhaps they've already determined that the hardware or critical software isn't supported under XP-x64, or perhaps they're about to migrate to Win7 and it simply isn't worth the extra cost and hassle until they start migrating people in 9 months.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Er, I bought three 10-foot HDMI cables, terminated, with ferrite beads, for $10. While I can order three 10-foot ethernet cables, terminated, for a bit less (probably $6 or so), that's about it. Denon sells a $500 ethernet cable, but that doesn't mean ethernet cabling on the whole is that expensive; don't use Monster Cable as your pricing comparison.

Here is an HDMI switch with 8 ports for $78 (provided you buy in bulk; otherwise it's $87). A 4 port switch from the same site is $30. HDMI cable runs are possible out to 50 feet before they start needing signal repeaters, but the people who require more than that are honestly so far out at the edge that it's not worth building a standard around them.

"Gaming consoles already have hardware in there to do all sorts of graphics operations in hardware. MPEG encoding is right up their alley."

I'm not sure you're familiar with the difference between encoding and decoding 1080p MPEG-2 video: encoding requires an order of magnitude more computational power. Gaming consoles do not have that hardware, period. Just because you claim it's right up their alley does not make it so. To put it in perspective, this is an encoder capable of real-time compression of HD signals, and a lousy one at that. Encoders capable of doing proper, broadcast-quality encoding in real time cost thousands of dollars. Anything less than that is going to look like a crappy web video, grainy, pixelated, and washed out, for no other purpose than to solve your imaginary problem. Instead of dumping the framebuffer out to a monitor, you want to dump it to an expensive piece of hardware, encode it, transmit it a few feet, and decode it. Why? Seriously?
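To put rough numbers on that order-of-magnitude claim, here's a back-of-envelope Python sketch of why motion estimation makes encoding so much heavier than decoding. The operation counts are illustrative assumptions (exhaustive search, a fixed ops-per-pixel decode cost), not measurements of any real chip:

    # Decoding touches each pixel a small constant number of times; encoding's
    # motion search compares every 16x16 macroblock against a field of
    # candidate positions (+/-16 pixels here; real encoders search smarter).
    width, height, fps = 1920, 1080, 30
    macroblocks = (width // 16) * (height // 16)   # 8,040 per frame
    candidates = 33 * 33                           # +/-16 offsets in x and y
    compares = 16 * 16                             # pixel compares per candidate

    encode_ops = macroblocks * candidates * compares * fps   # motion search alone
    decode_ops = width * height * fps * 10                   # ~10 ops/pixel, generous

    print(f"encode (motion search only): {encode_ops / 1e9:.1f} Gops/s")  # ~67
    print(f"decode (rough):              {decode_ops / 1e9:.1f} Gops/s")  # ~0.6
    print(f"ratio: ~{encode_ops / decode_ops:.0f}x")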

On top of all of that, we'll have video quality issues to contend with, as brand X will use cheaper chips that result in more macroblocking or washed-out colors on playback. It'll be like the era when we had to care about which RAMDAC various video card manufacturers were using because certain ones resulted in visibly worse picture quality. We already have to deal with this to a lesser degree with decoding chips, but now you want to up the ante.

And what happens when we want to use a better compression format (such as MPEG-4) to improve picture quality at the same bitrate? Too bad! We're going to transcode to MPEG-2 anyway, so don't waste your time.

Let's face it: a television is a monitor. HDMI is DVI video plus audio. You're conjuring up a ridiculous solution to a problem no one has, and your solution costs more and produces lower-quality video as a bonus.

Comment Re:Early adopters (Score 1) 262

Just a heads-up: the PS3 is far from the cheapest Blu-ray player on the market. The base 80GB model costs $400 and the step-up 160GB model goes for $500, while there are multiple players available for $200, with the average cost of Sony and Panasonic players appearing to hover in the $300 range. The PS3 can be had for as low as $300 with the various coupons or promotions that show up on common deal forums, but it's still at least $100 more than a dedicated player.

You can debate the relative merits of some of the "generic" brands in that list, but Sharp and Samsung are certainly reputable electronics manufacturers. I won't dispute your other points; the HD-DVD players were in most respects superior to Blu-ray players at the time. Now that things have caught up and prices have dropped substantially, though, I'm as glad as you are that the format with the superior capacity won in the end.
