
Comment Re:My experiences (Score 1) 277

2) HDMI inputs. Again, my TV has 4 inputs - 3 more than I need. The TV will NOT take the digital audio from an HDMI source - for example, Blu-Ray audio from my PS3 - and pass that audio unmolested through to the optical output connecting the TV to the amplifier. As a result, all I would get from any game or from most Blu-Ray disks was the left and right channels passed on to the stereo - no sub, no surround, no center channel. And the TV does NOT have a six channel audio output - only 2. So I end up having to do all the switching at the stereo, and then pass everything on to the TV - so I really only need one HDMI input.

Just as a heads-up: the reason your TV won't do this is that it couldn't pass all of the audio streams through an optical or coaxial digital connection. The S/PDIF standard used over those connections tops out at a bitrate that's too low for the newer Dolby TrueHD and DTS-HD Master Audio formats used on Blu-ray discs. Hence, you could get regular 5.1 Dolby Digital or DTS, but not the lossless formats. I can imagine it would be a support nightmare for any TV manufacturer to ship a set that could output audio from some discs with some audio selections turned on, but not from others.
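For a rough sense of the gap, here's a back-of-the-envelope comparison in Python using approximate peak bitrates; these are illustrative ballpark figures, not exact spec ceilings.

```python
# Ballpark bitrate comparison (illustrative figures, not exact spec limits).
# S/PDIF's compressed-audio payload is sized around the legacy DD/DTS streams,
# while the lossless Blu-ray codecs can peak an order of magnitude higher.

MBPS = 1_000_000

formats = {
    "Dolby Digital 5.1 (max)":             640_000,       # fits over S/PDIF
    "DTS 5.1 (max, DVD-style)":            1_536_000,     # fits over S/PDIF
    "Dolby TrueHD (peak, approx.)":        18 * MBPS,     # HDMI only
    "DTS-HD Master Audio (peak, approx.)": 24 * MBPS,     # HDMI only
}

SPDIF_COMPRESSED_LIMIT = 1_536_000  # roughly what the S/PDIF framing carries

for name, bps in formats.items():
    verdict = "fits" if bps <= SPDIF_COMPRESSED_LIMIT else "does NOT fit"
    print(f"{name}: {bps / MBPS:5.2f} Mbps -> {verdict} over S/PDIF")
```

Either way, the lossless tracks only fit over HDMI (or as decoded multichannel PCM, which a 2-channel optical output can't carry either).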

Comment Re:Clunkers is a clunker (Score 1) 594

Yeah, I guess I was a little unclear when I dashed off the response earlier. The only requirement is that the salvage yard sell the parts within 6 months; it doesn't have to sell them directly to someone who is going to install them that day. There's nothing in the program that I'm aware of that would prevent the salvage yard from stripping the car of useful parts and selling them to a wholesaler or even to a sister company run on the same lot. The only thing that should be getting crushed is the chassis and some body panels from an old shitbox.

In reality, of course, salvage yards are used to outsourcing their "stripping" to customers via the pick-a-part business model. The six-month time limit might push the salvage yard to do the stripping themselves. High-demand, relatively easily removed items such as alternators, starters, ECUs, doors/trunk, etc. will probably be stripped, but I doubt they're going to go to the trouble of pulling harder-to-reach parts. The rest of the drive train besides the engine can actually be sold, but only as component parts, and it's a little unclear whether they'll let you sell a whole transmission or whether it has to be broken down further; either way, I doubt it's financially worthwhile to spend hours ripping an aging transmission out of a sub-$4,000 car. The drive train is as good as dead, absent maybe an axle or driveshaft.

I'm no expert, obviously, but it looks like the only thing that must actually be destroyed within 6 months is a hunk of scrap iron in the vague shape of a car. The only thing a non-shredded car would be good for is the parts, and you can pull those before you shred it. As far as I'm concerned, the car still exists in a useful form if you can use the parts after the fact.

Comment Re:Clunkers is a clunker (Score 1) 594

From the law (599.400-.403): "the disposal facility may sell any part of the vehicle other than the engine block or the drive train." Their words, not mine. To be fair, they're not killing the rest of the drive train, but it does have to be dismantled into its component parts before it can be sold. A differential and driveshaft might get picked off, but I doubt anyone is going to the trouble of tearing down a transmission and selling off random gears within the six-month window.

Comment Re:Clunkers is a clunker (Score 3, Informative) 594

Then what I don't understand is that all of the cars that are traded in go straight to the car crusher.

The entire car isn't crushed; the only hard requirement is that the engine be destroyed (and the drive train not be sold off whole). The recommended method from the feds is to drain the engine of oil, fill it with a sodium silicate solution, and run it until it seizes. The rest of the car is sent off to the parts yard.

Comment Re:early adopters VSs the luddites (Score 1) 685

My comment was a response to someone who wanted to buy an HD disc player but wanted to wait for the Blu Ray/HD-DVD battle to settle down. I was not commenting on the original article, but rather on another poster's desire not to be stuck with the "wrong" HD disc format. Hence the "Re:" in my post title.

Comment The Rosewill RSV-S8 (Score 4, Interesting) 210

The Rosewill RSV-S8 is pretty much exactly what you've described. It's an eSATA enclosure with 8 drive caddies, a power supply, and a fan. It presents the drives to the system as JBOD or one of the various common versions of RAID (implemented in software, I assume). Ignore the comically inflated MSRP; it's $300 on Newegg. It ships with its own eSATA card for compatibility purposes, but I assume it would work with any eSATA adapter that followed the proper specifications. There's also a five drive version available for about $100 less, give or take. I can't speak to the reliability or ease of use, but this sounds like it will fit your requirements.

Comment Re:Forest, meet trees. (Score 1) 685

Wow, the AC responses from people who don't even RTFC (comments) keep rolling in.

I responded to an OP who wanted an HD optical disc player, but claimed to be waiting for the war between HD-DVD and BluRay to settle down. He wasn't concerned about the next cycle or possible streaming solutions, and he didn't indicate any uneasiness about the costs and benefits of BluRay vs. regular old DVD. He simply wanted an HD disc player, but didn't want to buy the wrong type. He was apparently unaware that the format war has been over for a year and a half.

I didn't address any of the other salient points in the HD Disc debate because I was merely responding to OP's concerns about BluRay vs. HD-DVD.

Comment Re:early adopters VSs the luddites (Score 4, Insightful) 685

HD-DVD is dead. There's no need to wait to see who will win, as that question was answered a year and a half ago when Toshiba (the banner carrier for HD-DVD) announced that it would discontinue all HD-DVD production. According to the Wikipedia article, the entire HD-DVD promotion group was dissolved in March of last year. To my knowledge, no one builds new HD-DVD players; there are a small number of PC drives that include HD-DVD compatibility, but I assume that's because of the low cost of inclusion once the blue laser diodes for Blu-ray are already in the drive. You cannot walk into a retail store and find an HD-DVD player unless they found some hidden stock in the back and are selling it on clearance for $20. You can't find HD-DVD discs unless the same thing happens. Any movie that's come out since then will never come out on HD-DVD. HD-DVD is dead and was voluntarily buried by its own support and manufacturing group.

In summary, there is no more waiting. The race was over last year. You can debate whether the quality improvement is worth the money, and there are some definite complaints to be made about the cost of the discs. If your only concern, however, is which of the formats will win, then there's no reason to continue waiting. Blu-ray won last year.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Oh wait, you're going to use video compression? Because clearly, in the dozen or so earlier posts where we debated the relative merits and costs of video compression and the potential downsides of forcing compressed video for the final transmission to the display, I didn't realize you were going to use video compression. The link to the Wikipedia article really helped cement that idea for me.

It's all so clear now. My concerns regarding codec lock-in, compatibility, the cost of redundant compression chips, etc. are all answered because you've repeated your plan for the twelfth time and are now claiming that I don't think video can be compressed.

You think it can be done cheaply enough to avoid all of the downsides. I don't, and I think the final step from the STB or similar device to the display should use raw video for cost and compatibility's sake. That's apparently where we stand, and only a decade or so will tell whether either of us is correct.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Comparing an 8-port video switcher to an 8-port Ethernet switch is an apples-and-oranges comparison, which is all I was trying to say; using an 8-port Ethernet switch does require encoding hardware on all real-time devices, while the HDMI switch is simply something for hooking up raw video over short runs. For its purpose, the HDMI switch is substantially cheaper. The fact that you were trying to compare the two led me to believe that you think people are using HDMI as structured cabling and trying to install it in a star topology, which couldn't be further from the truth. Different tools for different tasks.

iSCSI works spectacularly well because Ethernet has caught up to and in some cases surpassed local SCSI cabling speeds (or at least functionally matched them, once we factor in the limitations of an array of drives). We didn't invent some new, lossy drive communication spec to make it work; we mostly just speak the same ol' SCSI block addressing system over a different cable. HDMI, by contrast, vastly surpasses gigabit Ethernet in bandwidth (32 bpp x 1920x1080 pixels x 24 fps minimum, plus audio), has different latency requirements, and carries no video compression at all. If 10 Gbps Ethernet ever drops enormously in price, then HDMIoE could certainly work and would be a fantastic system. What you're describing, however, is not HDMIoE.
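To make the parenthetical arithmetic concrete, here's a quick sketch of that minimum case; the audio term is just an assumed 8-channel, 24-bit/96 kHz PCM stream for illustration.

```python
# Back-of-the-envelope check of the bandwidth claim above: 1080p at 24 fps
# with 32 bits per pixel already exceeds gigabit Ethernet, before we account
# for blanking intervals, higher refresh rates, or any protocol overhead.

bits_per_pixel = 32
width, height = 1920, 1080
fps = 24

video_bps = bits_per_pixel * width * height * fps
audio_bps = 8 * 24 * 96_000        # assumed 8 channels of 24-bit/96 kHz PCM

total_bps = video_bps + audio_bps
gigabit = 1_000_000_000

print(f"video: {video_bps / 1e9:.2f} Gbps")
print(f"audio: {audio_bps / 1e6:.1f} Mbps")
print(f"total: {total_bps / 1e9:.2f} Gbps vs 1.00 Gbps for gigabit Ethernet")
# -> roughly 1.6 Gbps, so even the minimum case won't fit uncompressed.
```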

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

The decode chip is unnecessary in a real-time device such as a game system; those systems already use their graphics processors for decoding. They're necessary on a video box that's transcoding video, sure, but not on a game system. They can add it to the GPU if they see a compelling reason to do so, such as your hypothetical multi-room gaming, but it's not a good idea to make it compulsory. The difference may not be horrific, but even $5 or $10 adds up on low margin items.

If you want 1080p or 3D, yes, you'll have to upgrade your TV, but you don't have to upgrade your TV to display MPEG-4 encoded videos just because they became common after you bought your display. You don't have to upgrade your TV to interface with a different audio standard just because it became available after the fact. You're already limited in upgrading your display technology, but building the box into the TV limits the input technology as well. If your argument is that TVs change quickly enough that it won't be a problem, I'd wonder why you think people are going to want to replace every one of the smaller TVs they have scattered about.

The 19" TV for $150 is not going to magically include the $200 STB parts, minus the minor cost for enclosure/Power Supply/few redundancies, for $150. If the parts are cheap enough to cram in a $150 TV, they're going to be cheap enough to package in a $30 STB if someone wants them. If someone doesn't, the TV can still tune into normal broadcast television, be used with a stand alone disc player, etc if that person wants it. I don't know why they wouldn't get the hypothetical $30 STB, but that's beside the point. Any smaller display scattered around that supports your system would have nearly similar costs to a plain display plus an STB that supported the same system, and wouldn't be obsoleted with the introduction of a new codec or STB feature that couldn't (or wouldn't) be added by that TVs manufacturers.

I'm still a bit confused about your approach to shunting raw video around, and my confusion goes all the way back to your first post. Your insistence on comparing HDMI and Ethernet is baffling given that they're intended for two completely different purposes. HDMI is not poised to replace Ethernet, and it never will or should; doing so would be asinine. If there's an STB in front of every TV, it will interface to the television with a raw video format and speak to the rest of the network via normal networking equipment, for no reason more complex than compatibility and cost reduction. HDMI does include a simple, low-bandwidth control channel (CEC; I have no idea of the speed) suitable for signaling devices to turn on, turn off, change volume, etc., and there's a new version with 100 Mbit Ethernet built in, but the consensus from everyone seems to be that it's a solution in search of a problem. No one is using HDMI for anything other than transmitting raw audio and video a few feet (perhaps 50-75 feet in the case of front projectors) from a source device to a display or audio device. If an STB sits outside of the television, it will make sense to minimize the cost of transmitting the data to the display, and that means raw video and audio.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Yes, but I'm afraid you are not returning the courtesy. The chipset spec I already sent describes reading from HDMI and encoding, for example, to write to a disk on a PVR.

And that chipset, the Broadcom 7043, isn't used in any of the devices you posted. They're all using various decode-only chipsets. If you Google any of the other chipsets, the first page of results has several mentions of various Blu-ray players and STBs using them. If you Google the Broadcom 7043, BCM 7043, or BCM-7043, the first five pages are nothing but press releases describing the capabilities of the chip, other pages reprinting those press releases, and one post on a TiVo message board where someone dreams about it being in a TiVo. Nothing is using the encoding-capable chip, and that's not terribly surprising. Very, very few people are demanding the ability to record from HDMI, and it's a copy-protected input anyway, so the devices you could record from are few and far between. We have no point of reference for how much a 7043 costs, but based on other encoding devices I've seen, it's not a cheap part. For a real-time device such as a game console, it's unnecessary to require encoding the output and then decoding it again; if they want to add the ability to stream to an "IPTV compatible" device, they could do so, but it's pointless to force the replication of the same encoding part in each game console or similar real-time video device.

I already have a system that does nine-tenths of what you describe. It's a home theater computer, and it can be built for ~$350; it does so with a mixture of software that's about 85% user-friendly and 15% frustrating as hell. It also illustrates the problem both of our systems will run into: neither standard will automatically and intelligently interoperate. Both of us are arguing about the features that could be leveraged by such a system, but we're both being naive if we think either system would interoperate intelligently simply by "being there." Frankly, the only things required for all of this to work are Ethernet, a NAS, something capable of decoding 1080p video, and software. That last item is the real sticking point, and various fortunes have been destroyed over the last 10 years by people and companies trying to develop and cash in on a proper standard for home device communication. Simply jamming Ethernet on the back of the TV isn't going to fix that problem any more than my claiming an inexpensive box will do it.

However, if we do hit the point where the cheap box can do all of that, I'd still argue that it wouldn't be better to include it in the TV, for the reasons I posted earlier. On the real-time device side, redundantly replicating encoding hardware is not as cheap or efficient as simply letting the device output a framebuffer to the display. As far as the TV "computer" is concerned, firmware can be upgraded, but sooner or later you'll want to do something that the built-in "box" can't physically do. When you hit that point, it's easier and more cost-efficient to handle that work in an external device. The ~$20-$30 savings you get from ditching the power supply, enclosure, and some of the PCB components are worthless the moment you have to replace the TV to gain a new feature when you could just replace a $150 or $200 box and be good to go.

Comment Re:use Ethernet - decoding wrong place (Score 1) 357

Every one of those devices is using Broadcom chips for playback, which is no big deal. Show me the device that is using an encoder. We've been talking about encoders for how many posts now, and you still seem to be under the impression that they already exist in every device, when that's patently untrue. A cheap-as-hell, HD-capable decoder is nothing special, and the fact that a company chose one from provider A instead of provider B is nothing to write home about. Show me the consumer device with the HD-capable encoder. Seriously, are you even reading what I write? An encoder is unnecessary in every one of those devices because the data comes in already encoded, but an encoder WOULD be necessary to make any real-time video usable under your system.

Is there some sort of language barrier here? Your English is flawless, but your comprehension is awful. You don't need a DVR, disc player, Slingbox, etc. A single box can do it all. I thought I just spent several paragraphs making that clear.

Again, since you seem unable to catch my message: A SINGLE BOX CAN DO IT ALL. The technology already exists, and it's about $200. It's here. Today. You're going to need a NAS somewhere on your network (I'm not sure why that's unnecessary in your vision, but something has to store recorded video), but otherwise what you describe exists right now, today, with a single box source. Any device that can access the internet/local network and play 1080p video is physically capable of handling every single thing you describe.

Slingbox: Take a video stream and send it out over the internet. The STB already reads digital video in, decodes it, and sends it on to the TV. The only reason today's cable box, satellite box, or network tuner doesn't already offer an option to skip the decoding and just send out the digital stream is that no one has added a software function to do it.
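To show how little software that would actually take, here's a minimal sketch of the pass-through idea: read the already-compressed transport stream and forward the bytes unchanged to a network client. The tuner path is a hypothetical placeholder, not any real device's API.

```python
# Sketch of "skip the decoding and just send out the digital stream":
# no encoder anywhere, just byte shuffling from a tuner to a socket.

import socket

TUNER_PATH = "/dev/hypothetical_tuner_ts"   # assumed source of an MPEG-TS stream
CHUNK = 188 * 64                            # multiples of the 188-byte TS packet

def serve_stream(port: int = 9000) -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(1)
    conn, addr = srv.accept()
    print(f"forwarding raw transport stream to {addr}")
    with open(TUNER_PATH, "rb") as tuner, conn:
        while True:
            data = tuner.read(CHUNK)
            if not data:                    # stream ended
                break
            conn.sendall(data)              # compressed bits pass through as-is

if __name__ == "__main__":
    serve_stream()
```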

PVR: An STB can save the digital stream over the network to a NAS. Play it back from a NAS. Again, all the hardware for this is here, and a select few devices have the software necessary to do this. Mostly, it's just a question of someone bothering to add the software functionality for it.
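The PVR case is the same trick pointed at storage instead of a socket: copy the compressed stream to a file on a NAS-mounted share and read it back later through the decoder the STB already has. The paths below are hypothetical placeholders.

```python
# Sketch of the PVR case: a straight stream copy to network storage,
# with no re-encoding involved. Runs until the source stream ends.

import shutil

TUNER_PATH = "/dev/hypothetical_tuner_ts"              # same assumed tuner as above
RECORDING  = "/mnt/nas/recordings/show-2009-08-09.ts"  # NAS share mounted locally

def record() -> None:
    with open(TUNER_PATH, "rb") as src, open(RECORDING, "wb") as dst:
        shutil.copyfileobj(src, dst, length=188 * 64)  # copy TS packets verbatim

if __name__ == "__main__":
    record()
```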

NAS: Uh, you're going to need network attached storage somewhere. I'm not sure why you're keeping yours underneath the television, though.

Disc Player: Mostly unnecessary, and probably won't be around in 10 years. The only thing that physically prevents an optical disc player from sharing the disc over the network is a lack of willingness on Hollywood's part. If the player could share it out over Ethernet (nothing fancy there), any 1080p-capable STB has all the hardware necessary to play that stream, and the disc player could be located anywhere on the network. The more tech-oriented in the world can simply rip the disc to their NAS and then play it back from NAS to STB. Again, every single piece of hardware necessary for this exists; it's just a question of software.

A VCR: Seriously? I doubt there will be IP-capable VCRs in your future. If we were using a VCR, there's going to be an encoder involved somewhere in the process; most likely, you'd use it just long enough to rip the video to the NAS and then toss it. At that point, we're looking at playing back a stored file under either system.

A computer: The STB is capable of browsing the web, downloading metadata, playing back files, etc. If what you want is a general-purpose computer running a desktop OS and used like a general-purpose computer, then you should know that the ergonomics of the living room make it a lousy choice anyway. Nevertheless, any form of VNC is already capable of handling this, and an STB is more than capable of running a VNC client. Programs such as FRAPS already exist to record your computer desktop, and the only thing keeping such a program from using a standard streaming protocol rather than storing the result as a file on the NAS is someone taking the short time needed to implement it in software. Again, the STB is capable of video playback; it doesn't care where the video comes from.

I've already elucidated the reasons why you'd want the single box. It's easier to upgrade than a whole television, and it doesn't require encoding hardware to transmit raw video to a television. My solution (the one the entire world uses) is backwards compatible with every television made since the 1950s and has an unlimited upgrade path. If codecs or features or requirements ever outstrip what the TV-locked standard you've proposed can do, we simply replace the STB. You mention the CableCARD system that allows you to use your TV as an STB; that's a perfect example. As soon as your cable company decides to migrate its system from MPEG-2 to MPEG-4 for the quality and bandwidth savings, a CableCARD-equipped television is useless.

Your standard requires us to replace every television out there and every video playback device. Granted, we can slowly transition to your standard, but even once we're there we have to add in encoders (again, encoders: the expensive chips with an 'e' at the beginning, not the cheap ones with a 'd') in devices that don't have much use for them. Once we've transitioned, we're locked there. Firmware can be upgraded, sure, but that's not a magic fix-all. As soon as there's a feature or capability that your TV can't handle, your system is obsoleted and we have to ditch the TV or start a separate upgrade path.

I'll say it again: you and I aren't really arguing different things. We both want ubiquitous network access at the television, but you want an STB inside the TV and I don't. Your solution puts the Ethernet port on the TV, and mine puts it on an STB. My assertion is that the minor cost savings from moving the STB inside the television are undone by the expense of adding encoders where they aren't necessary, transitioning all of our televisions again, and giving up the ease of upgrade that comes from handling things externally in a cheap box. If you respond to this post, please explain how your system will get around these problems.
