Intel's Core 2 Desktop Processors Tested

Steve Kerrison writes "It's early morning here in the UK, but that doesn't stop us from being around to see the launch of Conroe and friends, Intel's newest desktop chips. Even a $180 Intel CPU can beat an Athlon FX-62 in a number of tests. Now that's bound to get the fanboy blood pumping, right? We've also taken a look at a pre-built system that's powered by the Extreme X6800 CPU, along with an nForce 4 SLI chipset. As you'd expect, it's quick."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by chinaitnews.cn ( 988810 ) on Friday July 14, 2006 @02:28AM (#15717151) Homepage
    Intel is shifting the difficulty from hardware to software: to get more power out of these chips, programmers will need more advanced techniques.
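
    If the point is that multi-core chips only pay off when the programmer parallelizes, here is a minimal sketch of what that means in practice; the task, sizes, and file name are invented purely for illustration. The second core contributes nothing unless the work is explicitly split. Build with: gcc -O2 sum2.c -lpthread

    /* Sum a series on two cores by giving each POSIX thread half the range. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 100000000L

    static double part[2];

    static void *worker(void *arg)
    {
        long id = (long)arg;          /* 0 or 1: which half to sum */
        double s = 0.0;
        for (long i = id * (N / 2); i < (id + 1) * (N / 2); i++)
            s += 1.0 / (double)(i + 1);
        part[id] = s;
        return NULL;
    }

    int main(void)
    {
        pthread_t t;
        /* The second core chews on the top half while we do the bottom. */
        pthread_create(&t, NULL, worker, (void *)1L);
        worker((void *)0L);
        pthread_join(t, NULL);
        printf("sum = %f\n", part[0] + part[1]);
        return 0;
    }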
  • by androvsky ( 974733 ) on Friday July 14, 2006 @02:33AM (#15717164)
    You have to be careful with the HardOCP benchmarks. I only read the first several pages, but they're doing their usual "real-world" stuff... which means leaning on the video card to do most of the work. Naturally, CPU differences aren't going to show up much there. I appreciate them putting a real-world perspective on things, but what I read doesn't change the fact that the Core2Duos (I hate typing that) are really stinking fast. Showing that games which push most of the work onto the video card don't care which CPU you have is no big surprise... it really looks like a clever yet still desperate attempt to be a raving AMD fanboy and prop them up.

    Uh-oh, rant ahead, I tried to avoid it, I swear... ;)

    I am a raving AMD fanboy, but I'm a raving AMD fanboy because they've made the best CPUs for a long time. They also have a wonderful motherboard architecture that makes very high-bandwidth applications much easier to deal with. I find myself wishing I could plug the Core2Duos into an AMD motherboard... an on-die memory controller would help Intel too. Ah, what do I care; I want to see a real motherboard built around a Cell, where the overall system bandwidth is almost as exciting as the CPU. Too bad that means buying everything from Rambus... :(

  • by riprjak ( 158717 ) on Friday July 14, 2006 @02:39AM (#15717180)
    Responding to anonymous... I must be mad.

    But your point is accurate. Gaming is mostly GPU-limited; my gaming system, an s939 Athlon 64 X2 3800+ with a pair of older GPUs (7800GTX 256MB), achieves equal or better gaming results than all of these.

    I suppose the point is: are such CPU prices currently justified when they won't have much impact on the user experience?

    No doubt the new entry-level Core 2 Duos seem to be the upgrade of choice for staying near the cutting edge, but for gamers a high-end GPU looks like a wiser spend than a new CPU.

    As for video encoding et al., HardOCP got the same results in their "real world" testing as everyone else, but at least they make an effort to simulate the way the "average" person might use these things. Either way, I'll reserve judgement until I see some 64-bit results, since native 64-bit encoding will tell the tale, IMHO.

    In any case, I think we are reaching the point of diminishing returns: a year-old 2GHz processor already rips music as fast as the drive can deliver it, already transcodes video as fast as the drives can burn it, etc... GPUs control gaming... It is nice to see Intel returning to the game in a serious fashion, and no doubt this will have positive results for the consumer if AMD tries to match price/performance. I was mainly trying to point out that the "benchmarks" aren't necessarily useful in describing the performance of these beasts in operation.

    err!
    jak
  • by cnettel ( 836611 ) on Friday July 14, 2006 @02:40AM (#15717181)

    The Oblivion tests show everything wrong with this: the E6700 and X6800 getting (more or less) identical numbers indicates a GPU-bound test, AND they used different settings for the AMD test, as they state the game was not playable there at the higher quality settings.

    This [msdn.com] MSDN blog post was an interesting read to me. As the writer notes, image processing is a somewhat synthetic task. But it shows some pretty interesting stuff, IMHO, like the fact that the gap between AMD and Intel (Intel winning in the end) is much smaller at 64-bit. Maybe that should be no surprise, with AMD having designed the AMD64 instruction set in tandem with the K8. It's also interesting as it might indicate trends in tight-loop performance in JITed environments in general, which, like it or not, are becoming more common.
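
    If you want to poke at the 32- vs 64-bit gap yourself, here is a minimal native sketch of a tight image-processing loop; this is not the code from the MSDN post, and the kernel and sizes are invented for illustration. Compile the same file once with -m32 and once with -m64 (assuming your toolchain has 32-bit support) and compare: x86-64 gives the compiler twice as many general-purpose registers, which is one reason the gap can shift in tight loops.

    /* Build twice and compare:
     *   gcc -O2 -m32 bench.c -o bench32
     *   gcc -O2 -m64 bench.c -o bench64
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define W 4096
    #define H 4096

    int main(void)
    {
        unsigned char *img = malloc((size_t)W * H);
        if (!img) return 1;
        for (size_t i = 0; i < (size_t)W * H; i++)
            img[i] = (unsigned char)i;

        clock_t t0 = clock();
        /* The tight loop: a toy brightness/contrast tweak per pixel. */
        for (int pass = 0; pass < 50; pass++)
            for (size_t i = 0; i < (size_t)W * H; i++) {
                int v = img[i] * 2 - 64;
                img[i] = (unsigned char)(v < 0 ? 0 : v > 255 ? 255 : v);
            }
        clock_t t1 = clock();

        printf("%.2f s (spot check %u)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC, img[12345]);
        free(img);
        return 0;
    }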
  • Re:Loss Leader? (Score:1, Interesting)

    by Anonymous Coward on Friday July 14, 2006 @03:06AM (#15717241)
    Being a desperate AMD fanboy seems to get you modded higher than being realistic and admitting your team can't always lead the division...
  • by Sycraft-fu ( 314770 ) on Friday July 14, 2006 @03:54AM (#15717348)
    The processor is generally the thing I upgrade the least, because demands on it grow the slowest. Video cards you can upgrade once a year and not be doing it too often, given how fast they advance. Throwing lots of RAM at your system is also a good idea. The processor? Well, for gaming and most apps it just really isn't that big a deal. Get a good dual core of pretty much any design you like and call it good. Hell, if all you are worried about is gaming, and you don't do things in the background while you game, get a good single core; games still don't make any real use of a second core.

    I moved from a P4 2.4GHz to a Pentium D 2.8GHz when I did a system overhaul not too long ago. Why such a minor processor upgrade, you might wonder? Because the processor wasn't the issue. That 2.4 was plenty fast, for games at least. The graphics card was the issue, and I wanted PCIe, which my board didn't support. Had the new board used the same socket, I would have just kept the processor. It was fine (though because of the audio work I do, I'm appreciating the dual core). I only got a dual core because they weren't that much more expensive and it has geek appeal to me.

    The really useful thing, in my book, is that the Core 2s run cooler. Recent processors have tended to run too hot. AMD is much better than Intel, but even they put out quite a bit of heat at the high end. It sounds like the Core 2s are quite efficient for the performance they give. That's good, because I value a quiet system, and frankly mine is as quiet as I'm willing to make it at this point, cooling-wise. I'm not going water cooling, and there's no more air cooling I can do short of making the fans spin faster.

    I don't think I'd recommend these as an upgrade to anyone who already has a dual-core AMD or Intel system. Unless you are doing simulations or rendering or something, I just can't see the minor increase as worth it. Certainly not for games. However, if you need to upgrade anyhow, these look like winners.
  • benchmarks (Score:3, Interesting)

    by Exter-C ( 310390 ) on Friday July 14, 2006 @04:26AM (#15717408) Homepage
    It's interesting that all the benchmarks seem to include MP3 compression, MPEG video encoding, and the like. How many Slashdot users actually spend more than 1-5% of their computer time doing that type of stuff? Of course, it's all those DivX groups that need the performance so they can encode and release an extra 20% more videos in a month ;)

    Overall the performance of the latest bunch of Intel processors is great, but when it comes down to it, in a datacentre environment where keeping spare stock is a costly exercise, using Intel products is going to cost you more in the long run, whereas if we go with Opterons we can save on spares and still get great performance and power consumption.
  • 64-bit benchmarks? (Score:1, Interesting)

    by Anonymous Coward on Friday July 14, 2006 @05:32AM (#15717529)
    I've followed all the links discussed here and I can't find any 64-bit benchmarks. Does the Intel Core 2 also deliver superb results with 64-bit code running on a 64-bit operating system?
  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Friday July 14, 2006 @07:19AM (#15717700)
    Comment removed based on user account deletion
  • by Anonymous Coward on Friday July 14, 2006 @08:22AM (#15717885)
    According to this chart:
    http://img236.imageshack.us/img236/5492/amd724pricelistshort5xu.png [imageshack.us]

    ...and they overclock easily from the stock 2.0GHz core to 2.5GHz, making it a 4600+.

  • Re:Loss Leader? (Score:1, Interesting)

    by Anonymous Coward on Friday July 14, 2006 @10:06AM (#15718485)
    You must be young :)

    I'd had enough of "kewl" reviews (competent or not) back in the good ol' days of 3dfxgamers and that ilk. Insert enough "action" and you can make a review of Greek sausages feel immersive -- but I'm just tired of that cheap trick.

    And if you look hard at the details of the GD review... the author appears much less clueful than he makes himself out to be. Look at these little gems from TFLA:

    For the first time we see Intel move to a technology which lets delivers more instructions per clock cycle and lets each core perform up to 4 instructions at the same time.

    If this means "for the first time Intel can execute 4 instructions simultaneously", then I understand it. It's factually wrong, though -- a (single) Core core can, in favorable circumstances, execute or at least dispatch more than that -- and other readings are just funny (like "for the first time Intel has parallel execution" -- or something; I'm not sure how to decipher it).

    By not opting to place the memory controller inside the CPU, Intel has more flexibility as to building a better one on the chipset which doesn't require an entire overhaul as we've seen AMD do going from Socket939 to AM2.

    No mention of latency, the sole motivation for an integrated controller. "Flexibility as to building a better one on chipset" -- pray tell, how does integration on CPU die inherently hamper a memory controller? And about AM2... Socket 939 was with us how many years? What's the big problem with "overhauling" it now? Like, Intel should still be at Socket 7 or what?

    Intel's infamous [sic] Streaming SIMD Extension Instructions (SSE) have been kicked up a notch as well by Conroe's ability to issue the instructions at a rate of one per clock cycle which is double that of previous generation CPUs.

    "Kicked up" -- how? Can you explain in more detail? (Probably not -- need to RTFM first.) And, even previous cores could issue more than one SSE instruction per cycle, but now for the first time the datapaths are full 128-bit to each unit.

    In a word: awkward, as skating on thin ice often is. But yes, it looked very kewl and "adult", with enlightening and illustrative allusions to street fighting and venereal diseases from Vegas whores. Now that's the way to write a review of computer processors!

    (No offense meant, dear parent; maybe I've just really had enough of this sort of review... Back in my cave now. And those benchie results were interesting -- they go to show that Intel indeed has the predicted winner on its hands now.)
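
    To make the SSE point above measurable, here is a minimal throughput microbenchmark sketch; the loop shape, constants, and iteration counts are invented for illustration, and the cycle accounting is approximate. On a core with full 128-bit datapaths, the independent ADDPS chains below can sustain close to one 128-bit add per cycle per unit; on older cores that crack 128-bit ops into two 64-bit halves, the same loop runs at roughly half the rate. Build with: gcc -O2 -msse sse_bench.c

    #include <stdio.h>
    #include <time.h>
    #include <xmmintrin.h>   /* SSE intrinsics */

    int main(void)
    {
        /* Eight independent accumulators, so the loop is limited by
         * ADDPS throughput rather than the latency of one chain. */
        __m128 a0 = _mm_set1_ps(0.5f), a1 = a0, a2 = a0, a3 = a0;
        __m128 a4 = a0, a5 = a0, a6 = a0, a7 = a0;
        __m128 inc = _mm_set1_ps(1e-7f);

        clock_t t0 = clock();
        for (long i = 0; i < 100000000L; i++) {
            a0 = _mm_add_ps(a0, inc); a1 = _mm_add_ps(a1, inc);
            a2 = _mm_add_ps(a2, inc); a3 = _mm_add_ps(a3, inc);
            a4 = _mm_add_ps(a4, inc); a5 = _mm_add_ps(a5, inc);
            a6 = _mm_add_ps(a6, inc); a7 = _mm_add_ps(a7, inc);
        }
        clock_t t1 = clock();

        /* Reduce and print so the compiler can't discard the work. */
        float out[4];
        a0 = _mm_add_ps(_mm_add_ps(a0, a1), _mm_add_ps(a2, a3));
        a4 = _mm_add_ps(_mm_add_ps(a4, a5), _mm_add_ps(a6, a7));
        _mm_storeu_ps(out, _mm_add_ps(a0, a4));

        printf("%.2f s, result %f\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC, out[0]);
        return 0;
    }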
  • by default luser ( 529332 ) on Friday July 14, 2006 @10:47AM (#15718751) Journal
    Yonah is basically a dual-core Pentium M chip: 3 instruction decoders, 3-wide instruction issue/retire. It keeps the Pentium M's execution units, including 2 64-bit SSE units per core.

    Conroe, Merom, and Woodcrest are complete redesigns of the Pentium M architecture, and are 4+1 decode, 4-wide issue and retire. Intel completely revamped the execution units: there are additional execution ports and more floating-point power (including full 128-bit-wide SSE datapaths).

    While they are both of the same pedigree (P6 -> Pentium M), they are NOT AT ALL the same. One is designed for efficiency, and the other tosses some efficiency out the window in favor of increased performance. See the preview article here at Real World Technologies [realworldtech.com].

    You are thinking of the AMD Athlon / Opteron / Turion, which are the exact same chip with different microcode paths enabled. These chips can most certainly be taken from the same wafer.
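
    For a feel of what "3-wide" vs "4-wide" issue means in practice, here is a minimal sketch; the constants and loop shapes are invented purely for illustration. One long dependent chain is bound by instruction latency no matter how wide the core is, while several independent chains let a wide out-of-order core overlap the work. The absolute times mean nothing; the ratio between the two is the interesting part.

    #include <stdio.h>
    #include <time.h>

    #define N 400000000L

    int main(void)
    {
        long a = 1, b = 1, c = 1, d = 1;
        clock_t t0, t1;

        /* One dependent chain: every multiply-add needs the previous
         * result, so throughput is capped by the chain's latency. */
        t0 = clock();
        for (long i = 0; i < N; i++)
            a = a * 3 + i;
        t1 = clock();
        printf("1 chain:  %.2f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        /* Same total work split into four independent chains: a wide
         * core can keep all four in flight at once. */
        t0 = clock();
        for (long i = 0; i < N; i += 4) {
            a = a * 3 + i;
            b = b * 3 + i;
            c = c * 3 + i;
            d = d * 3 + i;
        }
        t1 = clock();
        printf("4 chains: %.2f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        /* Print the sums so the compiler can't discard the loops. */
        printf("(%ld)\n", a + b + c + d);
        return 0;
    }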
  • by barawn ( 25691 ) on Friday July 14, 2006 @11:27AM (#15719120) Homepage
    In any case, I think we are reaching the point of diminishing returns: a year-old 2GHz processor already rips music as fast as the drive can deliver it, already transcodes video as fast as the drives can burn it, etc... GPUs control gaming...

    Why wouldn't you just double things up, then? Drop in another monitor, keyboard, and mouse, and two users should be able to share one of these machines with each still getting the speed of a modern computer.

    I know a lot of households that would benefit from buying only one new computer rather than two.

    Besides, there are plenty of CPU-bound uses for these processors. People who mainly game might not see it, but - my God, those rendering and integer/floating-point results just make me want to go suggest replacing all of our compute servers right now.

    I was mainly trying to point out that the "benchmarks" aren't necessarily useful in describing the performance of these beasts in operation.

    Sure they are. Just not in your operations. In mine, oh my God, that thing's insane. But this has been the story for computers for years: the real-life speed difference in Web browsing between a modern PC and a five-year-old PC is minimal.

    It's not a limitation of these processors. It's a limitation of your usage of them. I'd venture to say that probably 80-90% of all computers out there are far more powerful than they need to be, with the people who use PCs for gaming being the exception in the past. Welcome to the rest of the world.
  • Re:first PC's? (Score:4, Interesting)

    by gfxguy ( 98788 ) on Friday July 14, 2006 @12:00PM (#15719443)
    I agree with you. Way back when, in my first years in college (mid 80s), I had an Atari 130XE. A math teacher asked me if I had a PC, and I said "Yes." I was very annoyed when she berated me in front of the class, explaining that a PC meant "IBM PC".

    So I stood up and asked, "So you don't think a computer for personal use is a personal computer? What should I call it, then? An egg timer?", and thankfully got a lot of chuckles from the other nerds in the class.

    So she explained that PCs were used for things like word processing and spreadsheets. "I have a word processor and a spreadsheet."

    She was like, "Oh, I didn't know you could get those on an Atari."

    I loved my Atari (in a strictly platonic way, of course).
