Intel's Core 2 Desktop Processors Tested
Steve Kerrison writes "It's early morning here in the UK, but that doesn't stop us from being around to see the launch of Conroe and friends, Intel's newest desktop chips. Even a $180 Intel CPU can beat an Athlon FX-62 in a number of tests. Now that's bound to get the fanboy blood pumping, right? We've also taken a look at a pre-built system that's powered by the Extreme X6800 CPU, along with an nForce 4 SLI chipset. As you'd expect, it's quick."
Loss Leader? (Score:5, Insightful)
Re:Loss Leader? (Score:4, Insightful)
Re:Kyle Bennet seems to disagree... (Score:5, Insightful)
If you have a single high-end card (7900), there isn't a whole lot of difference between the FX-62 and the X6800, or even the E6700. Most games are GPU-limited now, and will be until the next generation of cards is released in 3+ months (FEAR is really the only exception to this).
They didn't run any benchmarks at 800x600 or whatever, because those results are more or less useless. Who spends $500+ on a processor and $500 on a video card to play games at that low a resolution?
What matters, if you're going to buy a new rig now, is the price/performance ratio. If you're a midrange gamer, your best bet is probably an E6600 and a $250 video card. Or an AM2 setup; it all depends on how far AMD cuts prices on its X2 line. We'll find out closer to the end of this month what the deal is. Come August 1st we'll have a very good idea of which platform is on top.
Re:Kyle Bennet seems to disagree... (Score:2, Insightful)
However, if you _do_ have tasks that are heavy on CPU and not GPU, Core 2 owns AMD.
So what's hype about a CPU that's 1) cheaper 2) plays games just as well 3) can handle the occasional DivX rip or MP3 conversion much faster?
More good news for the consumer! (Score:3, Insightful)
More good stuff is coming from both camps, I predict.
Will this be The Return Of The King? (Score:2, Insightful)
I think the competition has been good, but if Intel returns wearing the performance crown then I think there's a real potential that the CPU market will be dominated by Intel more thoroughly than ever before, with consoles being the main holdout. If these benchmarks are accurate, then the introduction of the Core 2 Duo will be a real turning point, I think. Keep in mind that these speeds are introductory, and that in the past Intel hasn't had much trouble squeezing higher performance out of the same architecture.
What this all means: (Score:5, Insightful)
Kyle Bennet is an AMD whore... (Score:5, Insightful)
They do the power tests with power saving settings turned off. This gives AMD the edge at idle, mostly due to a lower transistor count. As other sites have shown, turning the power saving settings on (as one would expect) puts Intel far out front at idle.
How do they end that article?
" I would highly suggest keeping your eyes on AMD low wattage / energy efficient processors for those projects that require a noiseless solution."
So they make Intel look worse than it is, and Intel still wins under load. What's their takeaway? Buy AMD.
In the gaming section, after the Intel chips get done smoking the FX-62, what do they say?
"It is very interesting that in all of our testing, both "what is playable" testing and "apples-to-apples" testing, the Intel Core 2 Extreme X6800 and Intel Core 2 Duo E6700 are very close in performance. In fact, in some games they are dead even. The price difference between the two is very extreme with the Core 2 Extreme X6800 costing $999 and the Core 2 Duo E6700 at $530. Does it look like the price is justified between the two for gaming? We can safely say "no" as far as gaming goes with this gameplay testing we have performed."
Then, when speaking of AMD, do they mention that even the E6700 ($530) beats the FX-62, and that the FX-62 costs over $800? Nope.
"As for the AMD Athlon 64 FX-62, all of our testing shows that it does trail the two new Intel CPUs in gameplay performance. So, if you wanted to point one out as being a "winner" then for sure it is the new Intel Core 2 X6800 and E6700. But, if you look at the amount of difference between the AMD and Intel CPUs, you will see that it isn't enough to amount to anything. The only game that we saw any real-world difference in was Oblivion, and even that was tiny. A little overclocking would clear that difference up."
Any mention of overclocking headroom, or of how well the Core 2 Duo overclocks (usually much better than an FX-62)? Nope.
What's their takeaway from the gaming section where a $530 Intel beats out AMD's fastest chip (at $800)?
"We have proven here that the flurry of canned benchmarks based on timedemos showing huge gains with Core 2 processors are virtually worthless in rating the true gaming performance of these processors today. The fact of the matter is that real-world gaming performance today greatly lies at the feet of your video card. Almost none of today's games are performance limited by your CPU. Maybe that will change, but given the trends, it is not likely."
and then
"Lastly, I would advise everyone that is thinking of rushing out and purchasing their latest upgrade that we are sure to see HUGE pricing slashes out of AMD before the end of the month."
Way to go, HardOCP. Rig your tests, ignore Intel's victories, and make your summary "buy AMD".
You have zero credibility, HardOCP.
Also, you used "bullshot" wrong. A bullshot is a faked-up promotional screenshot for a game (the kind EA is known for). It doesn't fit here.
Re:Loss Leader? (Score:4, Insightful)
Thing is, these are miles ahead of AMD's current crop; Intel could double the prices on them and they'd still be good value for money. If they're a good product, market share will come without trying.
Re:Intel's Core 2 need programmer do morething (Score:3, Insightful)
I think what he was trying to say is that Core 2 isn't a magic processor that just makes everything faster; it can also be leveraged by programmers for even greater gains with some optimization. Of course, that's no different from any other processor, and I could be completely wrong about what he was saying.
Energy efficiency (Score:5, Insightful)
Noticable Difference? (Score:2, Insightful)
Sadly.... (Score:1, Insightful)
What about the more reasonable processors? (Score:4, Insightful)
Not yet available (Score:3, Insightful)
Re:Kyle Bennet is an AMD whore... (Score:2, Insightful)
[H]ardOCP, in my mind, is for stupid or lazy gamers: they can their benchmarks and the corresponding reviews for audiences who can't or won't draw good conclusions from traditional, datapoint-intensive head-to-head benchmarks and reviews. Not to mention the great potential for abuse in twisting the results, which is what happened here: when the playing field is made unlevel at [H]ardOCP's discretion, who wins is up to them rather than to the relative merits of the products.
The fact is, Intel has a lineup of real winners here but [H]ardOCP made the playing field unlevel to avoid acknowledging it. Sad fanboyism.
Re:That's almost always the case (Score:3, Insightful)
Games have been predominantly GPU-limited for the past 6 years (or in layman's terms --- as long as GPUs have existed in the form they do today, the nVidia GeForce being the first such chip). It made no sense in 1999 to use Quake 3 running at 640x480 as a benchmark, because the game looked a *lot* better at higher resolutions, and the hardware was able to cope with it.
This isn't even taking into account that virtually nobody has a monitor capable of displaying more than 100 FPS, nor can anyone perceptually distinguish frame rates much above 50fps -- most people still run their monitors at 60Hz.
Don't try to fake a real-world benchmark. If the only way to see any difference in game performance when comparing across CPUs is to lower the resolution to the rock-bottom value, then the conclusion should be that the CPU is not a contributing factor to game performance. ATI got into a lot of trouble a few years back by optimizing their drivers to 'fake' Quake 3 framerates over 120 fps (or some really high number like that). Nobody noticed for *months* that the frame rates were artificially inflated.
Gamers are quickly approaching audiophiles in my book in terms of their level of insanity.
Re:Loss Leader? (Score:3, Insightful)
I think the new Intel CPUs are priced very aggressively, but Intel is still making money on them. And they put a lot of pressure on AMD.
MythTV (Score:3, Insightful)
Re:That's almost always the case (Score:3, Insightful)
I agree with your post, but this part stuck out. Why is it that, on a Slashdot geek site, nobody ever references CPU performance for programming and desktop/workstation use? Photoshop, servers, and games seem to be the main reasons people justify the highest-performance machines. But ever since I was in high school (in the 80s), I've always overtaxed my machine... I've never had a machine and said "for what I do, this is sufficient", and it never mattered what I was doing. Be it running Windows (in the 90s) and experiencing the dreaded context-switch pause, or running Linux with Evolution + spam filtering + "grep -r" on something -- I've always thought "man, it sucks working on this machine at work vs. the machine I have at home collecting dust". And my work machine has never been a slouch (currently an AMD 2800+ Barton with 1.5GB of RAM).
Now if a Linux programmer uses vi all day and ssh's to a beefy build machine, then I can understand them not needing a beefier desktop/workstation. But I used to use XEmacs, and THAT was slow on many machines in the late 90s. Now I use a Java IDE named IDEA (a commercial counterpart to Eclipse), and I long for every ounce of horsepower I can muster. With intelligent real-time code analysis, there is no longer such a thing as idle time on the workstation. Then add the fact that you can put your application server on your desktop, along with a full project-management suite (Evolution, time-tracking tools, dozens of terminal windows, dozens of browser windows...). Waiting 20 seconds for the mouse to regain focus because I'm deploying a new application is just painful. Now do this 5 times in quick succession because you're iteratively debugging something. Multi-CPU is a godsend for this type of environment.
My point is that I almost never hear people citing this kind of multi-application work environment as a justification for cheap-but-beefy machines (except for the ubiquitous reference to Photoshop, which can kiss my *@@).