
Comment Re:Those bastards (Score 1) 259

First: yes, it's clearly a joke. How can you copy something done ten years earlier? :P

Second, Apple does make its own ARM CPUs these days. They design and build licensed ARM CPUs for their iOS devices (AppleTV, iPhones, iPads, and iPods), but the Mac / OS X side of the business is still 100% Intel. Their latest design is starting to turn some heads.

Comment Re:P4 vs Athlon XP (Score 1) 259

The misleading thing about benchmarks is that they're generally prebaked - there's no chance for "surprise" physics interactions or other pipeline-stalling events that tended to trip up the Pentium 4. From personal experience I'll tell you that my old 2.8 GHz Pentium 4 generally didn't do as well as my Athlon XP 2400+ in Doom 3, Bioshock, or Unreal Tournament 3, even though the latter two should have been poster children for the Netburst chip by comparison. Also, the Pentium D 820 was a 2.8 GHz chip; it was the miserably hot 130W TDP 840 that ran at 3.2 GHz. But you're correct on the other counts - the higher IPC and integrated memory controller were both HUGE advantages over a latency-crippled, deeply pipelined architecture. The Pentium D itself was a flailing, mostly failed response to the surge in mindshare the Athlon 64 X2 had created, a stopgap until the Core architecture could be readied.

Comment Re:Before AMD committed suicide (Score 1) 259

It depends an awful lot on the workload, though. For gaming it's one-sided in Intel's favor, to the tune of around 2/3 more work done per clock on average (sometimes more), but for video encoding with x264 the sheer core count makes AMD more than competitive unless you're willing to pay noticeably more for the Intel part. It's a behemoth for virtual machines, it plays video games well enough, and for scientific computation I really haven't found myself wanting. Granted, I'm an edge case...
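
If it helps to see why both of those claims can hold at once, here's a toy throughput model in Python. The ~2/3-more-per-clock figure is the one above; the core counts, clocks, and thread counts are made-up illustrative values, not real SKUs.

def throughput(threads, clock_ghz, work_per_clock):
    # Idealized throughput: perfect thread scaling, no memory stalls.
    return threads * clock_ghz * work_per_clock

# Hypothetical chips: "intel" does ~2/3 more work per clock,
# "amd" offers twice the cores at a slightly higher clock.
game = {    # lightly threaded workload, ~2 busy threads
    "intel": throughput(2, 3.5, 1.67),   # ~11.7
    "amd":   throughput(2, 4.0, 1.00),   # ~8.0  -> per-clock advantage wins
}
encode = {  # x264-style workload that saturates every core
    "intel": throughput(4, 3.5, 1.67),   # ~23.4
    "amd":   throughput(8, 4.0, 1.00),   # ~32.0 -> core count wins
}
print(game, encode)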

Comment Re:It's pretty simple actually - Do Some Evil. (Score 4, Insightful) 192

The answer's simple: Facebook wants their interface to be gobbledygook because that means you're spending more time on the site, mentally filtering relevant content from the ads they want you to see. By the logic of someone running an attractive nuisance, interfering with this kind of product makes a perverse sense: the product makes the site better for users but worse for Facebook's actual customers - advertisers and marketers.

Comment Re:Great (Score 1) 289

Blu-ray's variable, but video bitrate alone will sometimes jump over 36 Mbit/sec. Audio's also variable - Blu-ray supports everything from 0.1 Mbit Dolby Mono all the way up to 7+ Mbit lossless 24-bit PCM. Worst case, you could expect something pushing 50 Mbit/second, which is definitely not within range of the typical American's broadband connection. 4K will be impressive, but assuming a linear doubling (HEVC should be about twice as efficient, but it has to handle four times the video data)... yep, your math checks out. There's also the problem of product segmentation between different streaming video providers, studio bickering, lapsing of rights, seasonal availability, &c. Until and unless that gets sorted out, I'm buying physical copies of my favorite movies.
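
For the curious, that arithmetic fits in a few lines of Python. The peak figures are the ones quoted above; the mux overhead is my own rough guess, not a number from the Blu-ray spec.

video_peak = 36.0   # Mbit/s, peak video bitrate seen on some discs
audio_peak = 7.0    # Mbit/s, high-end lossless PCM track
overhead   = 5.0    # Mbit/s, subtitles/muxing slack (a guess)

worst_case_1080p = video_peak + audio_peak + overhead
print(f"1080p worst case: ~{worst_case_1080p:.0f} Mbit/s")  # ~48, i.e. pushing 50

# 4K: four times the pixels at roughly twice the codec efficiency -> ~2x bitrate
worst_case_4k = worst_case_1080p * 4 / 2
print(f"4K projection: ~{worst_case_4k:.0f} Mbit/s")        # ~96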
