Comment You've gotta be kidding. (Score 2) 91

Isn't the refrain - borne out by numerous financial statements by the sued companies and others besides - that optical drives are pricing themselves into extinction with razor-thin margins due to fierce competition and decreasing demand? It's possible HP has a valid point or has stumbled onto evidence, but this sounds more like flailing before declaring that optical drives will be an optional feature going forward...

Comment Re:Great (Score 1) 559

Exactly - if content providers aren't even willing to send enough bitrate through the pipe to deliver a satisfactory experience by today's HD standards, who on Earth would imagine they'd do justice to 16x the bandwidth requirement just a few years from now? Some broadcasts are still straight MPEG-2; others start as MPEG-2 and only get passed through a last-leg AVC transcoder to save bandwidth; and while AVC is enjoying healthy adoption, there's no reason to expect most companies will pay the hefty fees to adopt HEVC equipment in time for this Great Leap Forward. 4K is swell for a theater or a similarly large exhibition area, but for most consumers the upgrade will be pretty marginal.
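
To put rough numbers on the bandwidth side, here's a back-of-the-envelope sketch - the resolutions and frame rates are assumptions for illustration, not broadcast specs, and compressed bitrates depend heavily on the codec and target quality:

# Back-of-the-envelope sketch; formats below are assumed, not actual broadcast specs.
def pixel_rate(width, height, fps):
    """Raw pixels per second for a given format."""
    return width * height * fps

hd = pixel_rate(1920, 1080, 30)    # typical 1080p broadcast frame rate
uhd = pixel_rate(3840, 2160, 60)   # 4K at a higher frame rate

print(f"Raw pixel-rate ratio, 2160p60 vs 1080p30: {uhd / hd:.0f}x")
# -> 8x before codec efficiency; pixels alone (same fps) give 4x, and higher
#    bit depth or chroma sampling pushes the raw multiplier higher still.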

Comment Re:Truly a great day (Score 1) 35

OpenGL 4.1 already has full API compatibility with OpenGL ES 2.0. Let's not go throwing out decades of hard work for a little bit of convenience with regard to video games, especially when hardware going forward will all be capable of transparently handling the API you wanna switch to. As for throwing out X11 and tossing in the Android graphics stack for everybody, that's madness for a thousand reasons.
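
For what it's worth, here's a minimal probe for GL_ARB_ES2_compatibility, the extension behind that ES 2.0 compatibility claim - just a sketch, assuming the glfw and PyOpenGL packages are installed and the driver can hand back a 4.1 core context:

# Minimal sketch; assumes pyGLFW + PyOpenGL and a driver offering OpenGL 4.1 core.
import glfw
from OpenGL.GL import (glGetIntegerv, glGetStringi,
                       GL_NUM_EXTENSIONS, GL_EXTENSIONS)

glfw.init()
glfw.window_hint(glfw.CONTEXT_VERSION_MAJOR, 4)
glfw.window_hint(glfw.CONTEXT_VERSION_MINOR, 1)
glfw.window_hint(glfw.OPENGL_PROFILE, glfw.OPENGL_CORE_PROFILE)
glfw.window_hint(glfw.OPENGL_FORWARD_COMPAT, glfw.TRUE)  # required on OS X
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
win = glfw.create_window(64, 64, "probe", None, None)
glfw.make_context_current(win)

# Enumerate the driver's extension strings and look for the ES 2.0 bridge.
count = glGetIntegerv(GL_NUM_EXTENSIONS)
exts = {glGetStringi(GL_EXTENSIONS, i).decode() for i in range(count)}
print("GL_ARB_ES2_compatibility present:", "GL_ARB_ES2_compatibility" in exts)
glfw.terminate()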

Comment Re:Those bastards (Score 1) 259

First: yes, it's clearly a joke. How can you copy something done ten years earlier? :P

Second, Apple does make their own ARM CPUs these days. They design their own licensed ARM CPUs for their iOS devices - the Apple TV, iPhones, iPads, and iPods - but their Mac / OS X business is still 100% Intel. Their latest design is starting to turn some heads.

Comment Re:P4 vs Athlon XP (Score 1) 259

The misleading thing about benchmarks is that they're generally prebaked - there's no chance for "surprise" physics interactions or other pipeline-stalling situations that tended to trip up the Pentium 4. From personal experience I'll tell you that my old 2.8 GHz Pentium 4 generally didn't do as well as my Athlon XP 2400+ in Doom 3, Bioshock, or Unreal Tournament 3 - and the latter two should have been poster children for the Netburst chip by comparison. Also, the Pentium D 820 was a 2.8 GHz chip; it was the miserably hot 130W TDP 840 that ran at 3.2 GHz. But you're correct on the other counts - the higher IPC and integrated memory controller were both HUGE advantages over a latency-crippled, deeply pipelined architecture. The Pentium D itself was a flailing, mostly failed response to the mindshare the Athlon 64 X2 was building, meant to hold the line until the Core architecture could be prepared.

Comment Re:Before AMD committed suicide (Score 1) 259

It depends an awful lot on the workload, though. For gaming it's one-sided in Intel's favor to the tune of around 2/3 more work done per clock on average (sometimes more), but for video encoding with x264 the sheer core count makes it better than competitive with Intel unless you're willing to pay noticeably more. It's a behemoth for virtual machines, it plays video games well enough, and for scientific computation I really haven't found myself wanting. Granted, I'm an edge case...
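
If it helps, here's the back-of-the-envelope version of that trade-off - the core counts, clocks, and per-clock factors below are purely illustrative placeholders, not measurements:

# Toy model with made-up numbers; throughput here is just cores x clock x per-clock work,
# which only holds for jobs (like x264 encoding) that scale across every core.
def relative_throughput(cores, clock_ghz, per_clock):
    return cores * clock_ghz * per_clock

intel_like = relative_throughput(cores=4, clock_ghz=3.5, per_clock=1.67)  # ~2/3 more per clock
amd_like = relative_throughput(cores=8, clock_ghz=4.0, per_clock=1.0)

print(f"Intel-like: {intel_like:.1f}, AMD-like: {amd_like:.1f}")
# -> 23.4 vs 32.0 in this toy case: core count wins, but only for workloads that
#    actually parallelize; a lightly threaded game flips the result.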
