Comment Re:Fix HD First (Score 1) 559

Why 32bpp? Monitors and televisions (and our eyes) only have three color channels, so you have 24-bit, or 30-bit, or 36-bit, for 8, 10, or 12 bits per channel, respectively. When you're talking commercially distributed video, it's almost always YUV420 rather than RGB. You have your one intensity channel at full resolution, and two color channels running at quarter resolution, so a cluster of four pixels shares one color. 24-bit color becomes 12bpp, 30-bit becomes 15bpp, and 36-bit becomes 18bpp. It was originally designed to allow B&W and color television to coexist on the same transmission, but was found to be a good psychovisual compression mechanism. The only time you would use 16-bit or 32-bit color is inside your graphics pipeline prior to compositing, where the three-channel image is accompanied by a fourth transparency channel.
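If you want to sanity-check those bits-per-pixel figures, here's a trivial back-of-the-envelope sketch in Python (the function name is just mine):

    def bits_per_pixel_420(bits_per_channel):
        # Luma (Y) is stored for every pixel; the two chroma channels (U, V)
        # are each stored once per 2x2 block, i.e. at 1/4 the pixel count.
        return bits_per_channel + 2 * bits_per_channel / 4

    for depth in (8, 10, 12):
        print(3 * depth, "bit color ->", bits_per_pixel_420(depth), "bits per pixel")
    # 24 bit color -> 12.0 bits per pixel
    # 30 bit color -> 15.0 bits per pixel
    # 36 bit color -> 18.0 bits per pixel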

Comment Re:Simple reason ... (Score 1) 559

No, he's complaining that the HDTV he bought 12 years ago had already stopped being properly supported as long as 8-10 years ago.

Because the HD spec kept changing, the early adopters got screwed. Before HD was available to most people, the first two generations of display devices were already obsolete. By the time you actually got any HD content, the spec had effectively said "oh, you aren't supported at full resolution, or with this connector".

Now back to reality: the ATSC spec was published way back in 1995, and was adopted by the FCC just a year later, long before even the "enhanced definition" sets hit the market. An HDTV purchased 12 years ago will have component inputs, and will be able to receive an analog HD signal from a modern Blu-ray player.

Comment Re:Fix HD First (Score 1) 559

Wut? Raw 1080p30 is ~1.87Gbps so 18.2Mbps MPEG2 is already over 100X compression.

Technically, it's only about half that (and the raw figure is off too: 1920 x 1080 x 30 x 24 bits comes to roughly 1.49Gbps, not 1.87). Nearly all video transmission techniques convert the color space from three color channels to an intensity channel plus two color-difference channels, and then quarter the resolution of those color channels. A 1920x1080 video frame will only store color information at 960x540. Even analog formats do that. With two of the three channels down to a quarter of the data, it averages out to only 12 bits per pixel rather than 24.
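The quick math, for anyone checking at home (Python, assuming 1080p30 and the 18.2Mbps figure from this thread):

    width, height, fps = 1920, 1080, 30
    raw_444 = width * height * fps * 24   # full-resolution RGB / 4:4:4
    raw_420 = width * height * fps * 12   # 4:2:0, chroma at quarter resolution
    print(raw_444 / 1e9)                  # ~1.49 Gbps (not 1.87)
    print(raw_420 / 1e9)                  # ~0.75 Gbps
    print(raw_420 / 18.2e6)               # ~41x compression, not 100x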

That said, the OP is still a complete dunce.

Comment Re:Fix HD First (Score 1) 559

You have a point, but you lost credibility when you included OTA in that list. OTA is uncompressed 18.2mbit MPEG.

Did he now? I surely hope you're not employed by that IPTV/satellite service in a technical capacity. Please tell me you're in sales or something...

Do the math: 1920 x 540 x 60 fields/s x 12 bits/pixel comes to roughly 750Mbps for raw 1080i. That's quite a bit higher than 18.2...

There is no point in compressing an OTA broadcast because the bandwidth is functionally unlimited

Or it's limited to around 18.2Mbps, as you just stated... It's actually typical to see the primary channel only running 12-16Mbps, with the excess used for one or more secondary channels.
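Just to put rough numbers on how a broadcast multiplex gets carved up (hypothetical lineup; real allocations vary station to station and often change dynamically):

    total_mbps = 18.2                 # payload figure quoted upthread
    primary_hd_mbps = 13.0            # hypothetical main HD channel
    sd_subchannels_mbps = [2.5, 1.7]  # hypothetical SD subchannels
    leftover = total_mbps - primary_hd_mbps - sum(sd_subchannels_mbps)
    print(round(leftover, 1))         # ~1.0 Mbps left over for guide data and padding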

When you see artifacts on an OTA broadcast it is most emphatically *not* from compression, it's usually from interference or a badly tuned/aligned antenna.

When you see large chunks of missing or corrupted video, THAT is from interference or an insufficient antenna. Compression artifacts are things like color banding or easily discernible macroblock boundaries, and you absolutely do see those as scene complexity or motion exceeds what the codec can handle at the available bitrate.

Comment Re:There really is no point (Score 1) 559

That's just not true. You're taking one measurement of average human perception, and using it where it does not apply.

The standard "good enough" resolution people quote assume 20/20 vision, or roughly one arc minute of perception, when determining the width of a monochrome line with one eye. If we were viewing monochrome monitors, or ones that at least had their colors in the same physical location, that might make sense, but we don't. A good percentage of the population can see well below 20/20 vision even in this measurement, with the best somewhere around 20/10. That means if 2K is "good enough" for average vision, then 4K would be necessary for those with exceptional vision.

However, that's still wrong. The basic resolving ability of our eyes might be around one arc minute, but vernier acuity has been measured at around eight arc seconds, and stereo acuity all the way down to two arc seconds. So maybe once we get another three orders of magnitude more pixels, we won't be able to improve things further.
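As a rough sanity check on those numbers, here's a sketch (Python; the 55-inch panel and 10-foot viewing distance are just assumed for illustration) of the angle one pixel subtends:

    import math

    def pixel_arcmin(diag_in, horiz_px, distance_in, aspect=(16, 9)):
        # Angle subtended by one pixel at the viewer's eye, in arc minutes.
        w, h = aspect
        width_in = diag_in * w / math.hypot(w, h)
        pitch_in = width_in / horiz_px
        return math.degrees(math.atan(pitch_in / distance_in)) * 60

    # Assumed setup: 55" 16:9 panel viewed from 10 feet (120 inches).
    for px in (1920, 3840):
        print(px, round(pixel_arcmin(55, px, 120), 2))
    # 1920 -> ~0.72 arcmin, right around the one-arc-minute "20/20" figure
    # 3840 -> ~0.36 arcmin, still ~2.7x coarser than 8-arc-second vernier acuity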

Comment Re:I would love 4K!!! (Score 1) 559

If you look at the tvs out there many that are '4k' are actually 2160p. They have conveniently flipped it on end and call it 4k.

Huh? No one has flipped anything. That's just what 4K is: it's a cinema-derived format name describing a long axis of roughly 4,000 pixels. Your current 1080p television could just as well be considered a 2K display.
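For reference, the published pixel counts (quick Python sketch; format names as commonly used):

    formats = {
        "DCI 4K (cinema)": (4096, 2160),
        'UHD "4K" TV':     (3840, 2160),
        "DCI 2K (cinema)": (2048, 1080),
        "1080p (~2K)":     (1920, 1080),
    }
    for name, (w, h) in formats.items():
        print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} Mpixels")
    # The "4K" name refers to the ~4000-pixel long axis (cinema convention),
    # not to UHD being 4x the pixel count of 1080p, even though it also is.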
