Comment Re:Simple reason ... (Score 1) 559

Passive LCD uses a different film, but it's not a significant extra expense. LCD panels have a polarizer on them, all of them. If you doubt it, wear a pair of polarized shades or, for that matter, RealD glasses around, and tilt your head at the many displays in your life. Passive 3D displays require a pixel-aligned circular polarizer that alternates line by line, so yeah, it's a more complex version of an existing step in the process.

My new-last-spring Sony has passive 3D. I find it far superior to active... the lack of any significant screen dimming is a big one.

Comment Re:Simple reason ... (Score 1) 559

3D is like Blu-ray these days... if you buy any old video player over about $50, it does Blu-ray. It's a feature, even if some won't use it. 3D is the same thing... it was cheap to implement (which is why the television makers jumped on it as their Next Big Thing), and now it's just going to be there in any television above a certain price threshold.

It's possible 4K goes the same way. Given that LCD panel makers are pushing 2.5K displays into 8" tablets, pixel resolution is pretty clearly scaling up faster than price. So maybe. If video DSPs are fast enough, they might sell a few on the promise of HD upscaling, much as quite a few folks seem satisfied with DVD upscaling on HD screens. But I don't see it really taking off unless media (whether streaming or disc-based) and television broadcasting embrace it. And even then, many aren't going to see any difference... and some will still have that one yellow RCA-ended CVBS connector driving their 4K display :-)

Comment Re:Simple reason ... (Score 1) 559

The TV makers are the guys pushing 4K... not the TV networks. Just as was the case with 3D TV.

See, they got spoiled on TV upgrades. As you say, NTSC ran for over 40 years... ok, sure, there was that whole nasty color switcheroo in the early 60s, and the vast might of the world's brain power was spent on adaptive comb filters and other things in a hopeless attempt to turn the pig's ear of analog TV into some kind of silk purse. So TV evolved, but slowly. And you probably kept that old TV for 10 or 20 years at least... TVs actually wore out!

But then HDTV came along. And a bunch of us early adopters went out and bought the first generation of analog HDTVs... really just SDTVs with CRT-driven HD displays: no digital tuner, no digital inputs (well, sometimes FireWire, but that only worked with MPEG-2 input). Even though this was a fairly small group... many decided to pass on a 600 lb., $4000 TV with no content (sadly, not I)... it was a huge boon to the TV industry.

And they followed that up in 5 years or so with the first generation of digital HDTVs and the move to digital displays (plasma, LCoS, DLP... even LCD, but back then, only on the low end)... so for me it was another $4000 for a digital TV, this time a 71" DLP (it died last spring). And because of Blu-ray, and because ATSC/cable/satellite had gone HD by then, and because football looks so good in HD that you never want to see it in SD again, they sold crazy numbers of new televisions. They were now hooked on this 5-7 year upgrade cycle... it doesn't take long to love success.

And it kind of looped again: the vast might of the world's brain power was this time set to making LCDs not look terrible, since plasma screens had high cost, crazy power requirements, and burn-in, but pretty much everyone was now demanding a TV that hung on the wall like a gigantic picture frame. But there wasn't a Big New Thing to sell you on, other than that hanging-on-the-wall-and-not-sucking thing. Some of it was just price... by making LCD panels the size of double-garage doors in one shot, they could make big screens much cheaper... my DLP replacement, a 70" Sony, was just over $2,000 this time. But they figured on 3D as the big hook, since Avatar did so well, and... ok, since Avatar did so well. And, sure, because folks were dumping cash at the movies on 3D films.

Only problem... 3D at home kind of sucked. Particularly with the LCD shutter glasses -- they dim the display thanks to their low duty cycle, and even with that, there's crosstalk, etc. My latest TV has 3D -- you basically can't get a premium model without it -- but it's passive 3D (they rig the LCD polarizer to alternate lines, then use RealD-style circular polarizers... sounds like it might be bad, but it's actually an improvement). About as useful as 3D film -- occasionally good, but usually just a distraction. The nice thing about passive, too, is that you can get 2D glasses, which let you view "3D" video without the 3D effect. But I digress.

4K doesn't have any of the problems of 3D... no need for glasses, primarily. The problem is more along the lines of the one we've had establishing a follow-up higher-end digital music format. Ok, today that's Blu-ray, but mostly because Blu-ray exceeded the other attempts as part of a mainstream spec. The format wars between DVD-Audio and SACD -- both of which required a new player -- were not pretty. But the main problem there was just that most people buying CDs didn't have home stereo systems that did justice to CDs... much less to something with twice the resolution. And these formats entered the market just as the digital download revolution was kicking into high gear. The average listener was more concerned with getting all their music into a pocket-sized player, even if that meant heavy compression and relatively lousy sound (still historically great, though, compared to AM radio and the typical turntable most folks owned in decades past).

I have worked in digital video since the early 90s, and in HD about as long as one could have. I know "better" when I see it. The 4K Sonys on display at many big box stores, playing 50Mb/s AVC 4K video from a dedicated PVR... these look damn fine. Of course, they're also playing "demo reel" stuff... that's probably the best possible video for that in-store pitch. You don't see much fast motion there, or anything else that trips up even a good AVC encoder manned by a good compression engineer. But damn does that look good... from 3-4ft away.

But that 65" television is unlikely to offer me much of a practical improvement over my 70" Sony in my media room... I'm sitting 10-12ft from the screen. If I moved the leather motorized reclining theater seats to about 5ft, then sure. Or put in a 100" or some-such TV... I do have the room, though my wife likes the cabinets where they are. I'm a videohead from way back, and if and when 4K really hits, I'll get it. But that's going to take a long time, too... formats need to be established: video disc (if any... HEVC on Blu-ray and/or BD-XL solves the storage problem), cabling (HDMI 2.0 does the job, but everyone's got to agree... right now it's a mess of multiple HDMI cables), etc. Plus, I'll need a 4K camcorder or HDSLR, etc. Lots of cash to spend there, which explains why I got the $2K Sony, not the $7K Sony.

But most people don't have room for a 65" TV, much less a 100" TV. And HDTV didn't start to catch on until broadcasts did... getting 4K over the air, or even over cable/satellite, is going to be a slow and treacherous thing. And unless 4K catches on, that may never happen. There's some statistic out there claiming that something like 30-50% of people (depending on where you steal that stat) have their HDTV hooked up to an SD source only, and don't even know it. I wouldn't have believed it... after all, the SD->HD difference is even more profound than HD->4K (most 4K is quad-HD, an even 4x pixel increase, rather than the 6x of SD->HD, and there are diminishing returns as resolution goes up), but my ex-brother-in-law had just that setup: his HD cable box hooked to a brand new 50" HD LED-backlit LCD TV through that single yellow CVBS cable. But the dude thought he was watching HDTV (not the reason my sister dumped the guy, but it worked for me).
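
For anyone who wants to check that pixel math, here's the back-of-the-envelope version (Python, used purely as a calculator):

    # Pixel-count ratios between the common formats
    sd  = 720 * 480      # DVD/SD
    hd  = 1920 * 1080    # "Full HD"
    uhd = 3840 * 2160    # quad-HD 4K

    print(hd / sd)       # 6.0 -> SD to HD was a 6x jump
    print(uhd / hd)      # 4.0 -> HD to 4K is "only" 4x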

Comment Re:Fix HD First (Score 1) 559

HD broadcasts in the UK use the DVB-T2 system, which has a raw bitrate of up to 35.4Mb/s per channel. That's the effect of using 256-QAM in the 8MHz analog-era channel slots, rather than the 8VSB used for ATSC... and pretty much only ATSC. US cable and satellite systems also use QAM modulation. DVB-T2 also allows AVC encoding, not just MPEG-2.

So the BBC may suck, but it's not for lack of bandwidth; it's something else. Even the old 64-QAM DVB-T system delivered 24.13Mb/s per 8MHz slot, and that was just for SD broadcast.
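
If anyone wants to sanity-check that 24.13Mb/s figure, it falls right out of the DVB-T OFDM parameters. A quick sketch (Python as a calculator; the constants are the 8k-mode, 8MHz-channel values from the DVB-T spec... DVB-T2's 32k/LDPC math is different, so this won't reproduce the 35.4 number):

    # Net bitrate for DVB-T (8k mode, 8 MHz channel)
    DATA_CARRIERS = 6048     # payload-carrying OFDM carriers in 8k mode
    T_U = 896e-6             # useful symbol duration in seconds
    RS_RATIO = 188 / 204     # Reed-Solomon outer code: 188 useful bytes of 204

    def dvbt_net_bitrate(bits_per_carrier, code_rate, guard_interval):
        symbol_time = T_U * (1 + guard_interval)
        payload_bits = DATA_CARRIERS * bits_per_carrier * code_rate * RS_RATIO
        return payload_bits / symbol_time

    # The UK's 64-QAM mode: 6 bits/carrier, rate 2/3 coding, 1/32 guard interval
    print(dvbt_net_bitrate(6, 2/3, 1/32) / 1e6)   # ~24.13 Mb/s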

Comment Re:Fix HD First (Score 1) 559

First of all, all broadcast video is 4:2:0 decimated; that's an ATSC and DVB requirement. So you immediately cut your bitrate in half relative to full 4:4:4... it averages out to 12 bits per pixel.
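
That 12 bits/pixel falls right out of the sampling structure (quick sketch, assuming 8-bit samples):

    # Average bits per pixel for 8-bit video under chroma subsampling.
    # Tuples are (Y, Cb, Cr) samples per 2x2 block of pixels.
    BITS = 8
    formats = {"4:4:4": (4, 4, 4), "4:2:2": (4, 2, 2), "4:2:0": (4, 1, 1)}

    for name, samples in formats.items():
        bpp = BITS * sum(samples) / 4
        print(name, bpp)   # 24.0, 16.0, 12.0 bits/pixel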

And of course, ATSC transmissions are only 1920x1080 (technically 1088, but the bottom eight lines are blanked) or 1280x720. Most broadcasts are either 1080i60 or 720p60, though technically, 30p and 23.976 "NTSC Film" are also supported. Most broadcast stations transmit at least 13Mb/s on their primary channel, leaving the rest for a low-quality SD channel or two. Yeah, there are some that claim to transmit three HD channels, but that's pretty rare.
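
To put that 13Mb/s in perspective against the raw numbers (another quick sketch, using the 12 bits/pixel from above):

    # Raw vs. broadcast bitrate for 1080i60 (same pixel rate as 1080p30)
    pixels_per_second = 1920 * 1080 * 30
    raw_bps = pixels_per_second * 12   # 4:2:0, 8-bit = 12 bits/pixel

    print(raw_bps / 1e6)               # ~746 Mb/s uncompressed
    print(raw_bps / 13e6)              # ~57:1 compression into a 13 Mb/s channel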

Comment Re:Fix HD First (Score 1) 559

Older Blu-ray discs used MPEG-2 compression on a single layer; those aren't significantly better than ATSC. Modern discs, using AVC on a 50GB disc, are another story... if well compressed, and there's an art to that, too. Encoding engineers can apply overall low-pass filtering, manually vary the encoding bitrate, etc. to deliver a consistent visual experience. Auto-compressed discs do the latter algorithmically and the former not at all, so they don't look as good unless they just crank up the bitrate to compensate.

All interframe video compression relies on a small number of independent frames and good redundancy in between them -- P and B frames in AVC and MPEG-2. When the motion search algorithms can't find that redundancy, as in the case of fast motion, the encoder lacks the bit budget to encode that video without motion artifacts. There's more latitude on Blu-ray than in broadcast, of course, both in bitrate and in the option to use VBR, but it's not unlimited. On Blu-ray, there's also the option of going to 720p60, but film is usually 1080p24. Much of the visual artifacting comes from discontinuities between adjacent DCT blocks; thus, applying a low-pass filter in such areas (which you'll see in use during any very fast motion on nearly any commercial DVD or Blu-ray) lessens or eliminates the artifacting, and given the likely motion blur on 24p film transfers anyway, it may just pass without being obvious. Of course, that's not true of video shot on AVC or MPEG-2 camcorders... the camera itself may warn the operator about motion that's crushing the encoder, but it doesn't selectively blur the image.
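
A toy illustration of the bit-budget problem (the I:P:B size ratios here are made-up but plausible numbers, not measured from any real encoder):

    # Toy GOP bit budget: 1 I-frame, 4 P-frames, 10 B-frames per 15 frames.
    I_COST, P_COST, B_COST = 10.0, 3.0, 1.5   # assumed relative frame sizes

    def gop_cost(motion_penalty=1.0):
        # Fast motion defeats the motion search, so P and B frames
        # inflate toward I-frame sizes; model that as a multiplier.
        return I_COST + 4 * P_COST * motion_penalty + 10 * B_COST * motion_penalty

    calm   = gop_cost(1.0)   # 37.0 units
    action = gop_cost(2.5)   # 77.5 units for the same 15 frames and channel,
                             # so the encoder must quantize much harder
    print(action / calm)     # ~2.1x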

Comment Re:This is important! (Score 1) 559

You don't quite need to scale it directly. We were doing SD at 8Mb/s (ish) with DVD, but didn't need to scale that to 48Mb/s for HD. In fact, while the spec allows for 19.4Mb/s, you're correct to note that most broadcasters include a few SD side-channels. So in practice, ATSC HD isn't even twice DVD's bitrate... and that was a factor-of-6 increase in pixels, not a factor of two. Going by that rule of thumb, you'd be happy with 4K in MPEG-2 at 25Mb/s or AVC at 12Mb/s. I don't actually believe that, but that's the math of what came before. And maybe it's not that far off... Netflix already claims they're happy with 4K at 15Mb/s. Then again, they're also happy calling 720p24 HD.
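
Spelling out that rule-of-thumb math (a sketch of the scaling argument, not a claim about what 4K actually needs):

    # Rule of thumb: SD -> HD got 6x the pixels on ~2x the bits.
    sd, hd, uhd = 720 * 480, 1920 * 1080, 3840 * 2160
    bits_per_pixel_growth = 2 / (hd / sd)            # the SD -> HD "discount"

    # HD -> 4K is only a 4x pixel jump, so scale the bitrate the same way:
    multiplier  = bits_per_pixel_growth * (uhd / hd) # ~1.33x
    mpeg2_guess = 19.4e6 * multiplier                # ~26 Mb/s MPEG-2
    avc_guess   = mpeg2_guess / 2                    # AVC at ~half MPEG-2: ~13 Mb/s
    print(mpeg2_guess / 1e6, avc_guess / 1e6)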

Not sure that's sufficient, but it should be pretty obvious that you don't need a linear increase in bitrate. And Red-ray encoding, whatever that is, suggests that the same 19.4Mb/s with advanced encoding ought to deliver 4K at acceptable quality for broadcast, given sufficient encoding cleverness. But I wouldn't hold my breath -- the industry doesn't like to push hard on these things. They'd probably consider AVC, but not HEVC or something even better... they're likely to pick the best mature technology, which is why these standards are always behind the curve. They're also thinking about realtime encoding, transport-stream re-encoders once you get into cable and satellite, etc. Commercial encoding, for disc or download, can budget to do it entirely off-line if necessary, fully tweaked by encoding engineers, etc.

Comment Re:Dev only needs mini to test 64-bit A7/M7 ... (Score 1) 471

Not surprising... this looks to be a perhaps-customized version of an NXP embedded ARM processor -- the M7, that is. If I'm paying as little as $3.50 for these, I can only imagine how cheap they are for Apple. And this should make practical a slew of applications that could be done without it, but that would likely run down your battery in no time flat. The little IOP can sample and cook sensor data for the main SoC all day without using much juice.

Comment Re:Getting to be too many models, again? (Score 1) 471

Google introduced a small, powerful tablet last year: the Nexus 7. Apple introduced a less small, less powerful tablet last year: the iPad Mini. Sure, the new ones are better, both of them. It has yet to be shown how the dual-core Mini compares in performance to the quad-core Nexus 7.

I'm currently using a plenty-fast Samsung Note 8.0... small enough, yet powerful enough for anything I'm currently doing on tablets. That's what really matters. It's markedly better than the recently-dropped Asus Transformer that preceded it. And the real win on a small tablet is the "S Pen", the Wacom-style digitizer. It offers far more on-screen accuracy than a finger on a 10" tablet, so you're not suffering at all with the smaller screen, at least for interactive use. It's a bit small for reading my guitar music, but that may only be temporary, or maybe just an around-the-office thing. Best device ever for note-taking.

Comment Re:Getting to be too many models, again? (Score 1) 471

The 5c is just the 5 in a new case. It's supposed to appeal to a different market than the 5/5s does, expanding the reach of the iPhone a bit, perhaps. We'll see. It's hardly any risk: the design was paid for long ago, and this is just packaging it a little cheaper. If it doesn't work, don't expect a colorful 5s remake next year.

Comment Re: Getting to be too many models, again? (Score 1) 471

Apple's always had a little diversification in their Mac line... and generally much less than any major competitor. Today they're at an all-time low, I think, as far as models go, at least since laptops became viable. They nixed the 17" laptops, all laptops are SSD-only with no optical drive now, etc. They're still really doing just one new iPhone per year... there's no good reason to stop making the old one if it's still selling. Contrast that with Samsung, which seems to have one new smartphone model per month. Same with tablets.

Apple's one-size-fits-all approach was good when they were the only one doing the job, and fine when they were the only ones doing the job well. But with other vendors building better devices and taking more risks -- and thus innovating more -- they do at some point have to compete. Just being a fashion house isn't going to be enough forever. They're good at making each year's device "about twice as fast" as last year's... particularly when it comes to gaming. But that's about the only thing you can count on, and it's not enough for many users, who see larger screens, new I/O devices, built-in Wacom digitizers, etc. elsewhere.

That's not to suggest Apple should make a radical change to the iPhone or iPad... and that's not what anyone who's innovating does today, either. Rather, they try different things. So Samsung made a fairly predictable Galaxy S4, but they also followed it up with a few variations... a serious camera phone (the Galaxy Zoom), the giant-screen S4, etc. They also pretty much created the "phablet" market, and made it work by adding the Wacom digitizer, so a large phone screen could actually deliver tablet capabilities, despite your fingers still being the size they were last year. Apple needs to try some variations, try something new. They will ultimately have to do something, whether that's experimenting on their own or copying the other guys. But I don't think they survive, not at their current "leader of the pack" level anyway, if it's just about taking the best of this year's Samsung or Motorola or HTC into next year's devices.

Comment Re:Getting to be too many models, again? (Score 1) 471

Actually, with phones, Apple went directly from one tier to three tiers with the iPhone 4S... they offered the iPhone 4 at the $99 point and the iPhone 3GS at the $0 point. It was just that, since the iPhone 3GS was GSM and AT&T-only (this was before T-Mo got involved), they didn't have that third tier on Verizon or Sprint.

They are certainly upping the SKUs at a high rate: two iPhone 4S models, five iPhone 5c models, and nine iPhone 5s models. But they have finally acknowledged, and rightly so, that the iPhone is a luxury item, and that pretty much makes it a fashion item. So while one size still apparently almost fits all, one finish clearly does not. Most other phone makers are not doing the "fashion statement" thing... Nokia a bit, I guess, with their similar color selection.

And of course, the 5c isn't new; it's just the iPhone 5, cost-reduced and put in a plastic case. This is what Apple's been doing all along, only now they're making more money at it and making it seem new, targeting a different demographic than the high-end user. My college-age kids are both pushing for iPhones (where did I go wrong?) for Christmas... the boy (first-year pharmacy school) wants the 5s, "because the new OS runs like crap on older models"; the girl (second-year bio/nursing) wants a 5c, "because they have pink".

Comment Re:Mavericks is free? Hmmm... (Score 1) 471

Yup... they're removing any barrier to the new OS, and thus any good reason for older users not to upgrade. That may kill off some old hardware faster, if there are machines that can't be upgraded... maybe also a good thing in the short term. And they're giving Mac users a big warm and fuzzy smooch on the cheek at the same time their main competition has been pushing an OS that practically no one likes, for $100+ a copy. They're looking at PC users and saying "come over here, we have cookies".
