Intel's Core 2 Desktop Processors Tested 335

Posted by CowboyNeal
from the smokin'-tires dept.
Steve Kerrison writes "It's early morning here in the UK, but that doesn't stop us from being around to see the launch of Conroe and friends, Intel's newest desktop chips. Even a $180 Intel CPU can beat an Athlon FX-62 in a number of tests. Now that's bound to get the fanboy blood pumping, right? We've also taken a look at a pre-built system that's powered by the Extreme X6800 CPU, along with an nForce 4 SLI chipset. As you'd expect, it's quick."
This discussion has been archived. No new comments can be posted.

  • Loss Leader? (Score:5, Insightful)

    by Breakfast Pants (323698) on Friday July 14, 2006 @01:04AM (#15717099) Journal
    Gotta wonder if Intel can legitimately deliver at this price or if they are going with loss-leader tactics to try and regain market share.
    • Re:Loss Leader? (Score:4, Insightful)

      by Anonymous Coward on Friday July 14, 2006 @01:17AM (#15717129)
      Loss or not, they at least gave stockholders a little more confidence after the slaughtering of the last two years. This is good news even if they take a loss.

    • Re:Loss Leader? (Score:5, Informative)

      by cnettel (836611) on Friday July 14, 2006 @01:30AM (#15717154)
      Well, for now, the yields of the fastest Core CPUs are probably low enough that the average manufacturing cost could be higher for the cheapest chips, as they are a necessary part of the process anyway. On the other hand, I think that the pure manufacturing costs for a (desktop) CPU tend to be quite a bit lower than this -- the big costs are the one-time outlays for development and fab infrastructure. When that's already in place (for the current chip generation), it makes sense to use the available resources fully.
    • Re:Loss Leader? (Score:4, Insightful)

      by Anonymous Coward on Friday July 14, 2006 @02:09AM (#15717247)
      Bearing in mind that, for the first time ever, the silicon for the cores of their laptop, desktop and server parts is capable of coming off the same wafer, I'd say they're onto a cost-saving exercise here. Heck, even the Core Solo is a Duo with a core disabled.

      Thing is, these are miles ahead of AMD's current crop; Intel could double the prices on them and they'd still be good value for money. If they're a good product, market share will come without trying.
      • Thing is, these are miles ahead of AMD's current crop; Intel could double the prices on them and they'd still be good value for money.

        Dude, don't give them any ideas! I am rebuilding next month...

      • Merom and Yonah are basically dual-core Pentium M chips - 3 instruction decoders, 3-wide instruction issue/retire. They include the Pentium M's execution units, including two 64-bit SSE units per core.

        Conroe and Woodcrest are complete redesigns of the Pentium M architecture, and are 4+1 decode, 4-wide issue and retire. Intel completely revamped the execution units: they include additional execution ports and more floating-point power (including full 128-bit wide SSE processing paths).

        While they are
        • False (Score:3, Informative)

          by Andy Dodd (701)
          Merom is from the same microarchitecture as Conroe and Woodcrest.

          You are correct about Yonah though.
    • Re:Loss Leader? (Score:3, Informative)

      by TheCp (988820)
      GDHardware's article: http://www.gdhardware.com/hardware/cpus/intel/conroe/X6800_E6700/001.htm [gdhardware.com] That thing SMOKES ol' AMD... for now at least...
      • by rgravina (520410) on Friday July 14, 2006 @02:50AM (#15717336)
        That has to be one of the most entertaining, yet informative, reviews I've read in a long time!

        From TFLA (The Fine Linked Article):
        "[Intel's] P4 chip has largely been having its ass handed to it on a silver platter by the Athlon64 family of CPUs from AMD."

        and then later:

        "But this is where their [(Intel's)] little parade comes to a screeching halt - why? Because in the most simplistic of terms, Conroe (dubbed Core 2 Duo) kicks the Athlon64 right in the balls and doesn't look back."

        Now, my friends, *that* is how you write a review!
        • by evilviper (135110) on Friday July 14, 2006 @06:15AM (#15717689) Journal
          Now, my friends, *that* is how you write a review!

          And on which of the 20 pages that the review is divided into should I insert these witty remarks?
    • by Sycraft-fu (314770) on Friday July 14, 2006 @02:43AM (#15717320)
      Most likely they are just having good yields. They've pretty much got the kinks worked out of their 65nm process with the Pentium Ds they made on it, so it wouldn't surprise me that Core 2s are having high yields. High yields = low cost per unit. This is especially true if the yields are high, but mostly at lower speeds. Say 90% of chips work, but of that 90%, 50% only work at the slowest speed. Well, just knock the price down on that and get it back in volume, and hike it up more on the rarer fast chips.

      If you look at their current pricing, it's not real surprising. You find you can get a Pentium D 65nm for as little as $175. That gets you a 3GHz one on their old 90nm technology. The price creeps up on the first increment; a 3.2 is $217. However, it takes a sizable jump then to $317 for 3.4GHz. The 3.6GHz, if you can find it, is $500 or so. Past that, well, there's only the "extreme" edition, and that's over $1000 for 3.73GHz.

      The jumps like that are normal. They can easily produce low speed chips and there's a large market for them so they are cheap. Maybe a couple incremental upgrades. Then you hit a knee and prices start jumping fast.

      Based on their current pricing for their current high end, I don't see anything out of the ordinary for this new pricing.
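      The binning arithmetic above can be sketched as a quick back-of-the-envelope calculation. The 90% yield and 50% slow-bin split are the parent comment's hypothetical figures; the wafer cost and dies-per-wafer numbers below are made up purely for illustration, not Intel data:

```python
# Toy yield-binning model: a wafer's processing cost is spread across
# the chips that actually work, and working chips are binned by the
# top speed they run at. All inputs are hypothetical.
wafer_cost = 9000.0    # assumed cost to process one wafer
dies_per_wafer = 200   # assumed die candidates per wafer

yield_rate = 0.90      # 90% of dies work at all
slow_fraction = 0.50   # half of the working dies only run at the lowest speed

working = dies_per_wafer * yield_rate     # 180 working chips
slow_bin = working * slow_fraction        # 90 slow-bin chips
fast_bin = working - slow_bin             # 90 faster-bin chips

# Average manufacturing cost is the same regardless of bin, which is
# why the plentiful slow bin can be priced low and sold in volume
# while the rarer fast bins carry the margin.
cost_per_chip = wafer_cost / working
print(f"{working:.0f} working chips at ${cost_per_chip:.2f} each")
```

      The exact numbers don't matter; the point is that slow-bin chips are a byproduct of making the fast ones, so selling them cheap in volume still recovers the wafer cost.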
      • Well that, plus unlike Microsoft with the Xbox 360 and the HD-DVD boys, they don't really have a secondary product to make up the losses (mobos and chipsets I suppose, but I think the nVidia chipsets are quite popular and of the boards with an Intel-brand chipset, not too many of them will actually be an Intel-made board). Yields have to be great - as you said, they got their 65nm process pretty well figured out with the PD line, and additionally the die shrink from 90nm lets them fit that many more chips
    • Re:Loss Leader? (Score:3, Insightful)

      by kill-1 (36256)
      Don't forget that Intel uses a 65nm process and can put almost 2x more dies on a wafer than AMD. AMD's SOI process is more expensive, too.

      I think the new Intel CPUs are priced very aggressively, but Intel is still making money on them. And they put a lot of pressure on AMD.
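      The rough 2x die-count figure follows from area scaling: die area shrinks roughly with the square of the feature size, so a 90nm-to-65nm shrink fits about (90/65)^2 ≈ 1.9x as many dies of the same design on a wafer. A first-order sanity check (this ignores layout, I/O pad limits, and defect density):

```python
# First-order process-shrink estimate: linear dimensions scale with
# the feature size, so die area (and thus dies per wafer) scales with
# its square. Pad-limited designs and defect density are ignored.
old_node_nm = 90
new_node_nm = 65

die_count_ratio = (old_node_nm / new_node_nm) ** 2
print(f"approximate die-count advantage: {die_count_ratio:.2f}x")  # ~1.92x
```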
  • so when will the first PCs come out with these?
    • Re:first PC's? (Score:3, Informative)

      by dhollist (811706)
      WWDC [apple.com] is August 7th.
      • Re:first PC's? (Score:4, Informative)

        by MojoStan (776183) on Friday July 14, 2006 @07:43AM (#15717985)
        so when will the first PCs come out with these?
        WWDC is August 7th.
        The question asked when the first PCs will come out with Core 2 desktop processors. You gave an answer based on an unconfirmed and very uncertain Mac rumor (linking to Apple's conference/lovefest), then got modded up.

        Unbelievable.

        Here's the answer the GP was probably looking for (from Anandtech's conclusion [anandtech.com]):

        According to Intel:

        Intel Core 2 Extreme processor based systems and boxed product are expected to be available on the day of launch, 27 July. Intel Core 2 Duo processor based systems and boxed product [through places such as Newegg] are expected to be available from 7 August. Each OEM has their own product introduction / transition cycles based upon their target market segment and current product offerings. We expect some to offer product in August with more introductions extending through September. Check with the OEMs of your choice to get their specific message on system availability.

        From what Intel is telling us, you shouldn't be able to so much as purchase Core 2 processors until after the first week in August, although you'll be able to get complete systems before then. At the same time, we're hearing that distributors already have some Core 2 parts in stock and will begin shipping very soon. While we tend to believe Intel's assessment of availability, we're hoping it's conservative.

    • On release day, because we'll build them ourselves.

      How many /.ers have an off-the-shelf PC anyway?
  • by Anonymous Coward on Friday July 14, 2006 @01:09AM (#15717107)

    There's a much more detailed review up at HotHardware.com [hothardware.com]

  • So... (Score:5, Funny)

    by the.metric (988575) on Friday July 14, 2006 @01:10AM (#15717110)
    will this be enough to run Vista?
    • Re:So... (Score:3, Funny)

      by moro_666 (414422)
      Actually I can't wait until they ship laptops with this CPU and a nasty integrated GPU that won't be able to pull Vista off with all the bells and whistles attached. :)

        Now let's hope AMD finds something to strike back with; more competition means more cheap'n'fast CPUs for us.
      • Re:So... (Score:3, Informative)

        by paganizer (566360)
        I'm typing this on an HP DV8230US laptop with a Core Duo T2300 CPU; it just purely and simply rocks, and I hate Intel - this is the first Intel chip I've owned in 12 years.
        Together with the gig of RAM and the 256MB Nvidia 7300 GPU, I think this thing would run Vista.
        Not that it ever will, of course. Any Windows OS after Win2k sucks; it took me forever to get Windows XP Media Center off this thing.
        • Honestly, ever since Centrino, I wouldn't skip an Intel chip on a lappy - they are worth the Intel tax. At least now you get your money's worth with Intel.
      • Eh. Just like the old scam of software hard-disk compressors and 'memory doublers', if these CPUs are fast enough it won't take long for an enterprising developer to develop a WHQL-signed intermediate driver to do software rendering, and charge uninformed or desperate users $30-$50. You could probably even have it store its textures on the hard drive so it can run in lower-memory environments (which I'm sure will be common on machines without dedicated GPUs).

      • Re:So... (Score:3, Funny)

        by moosesocks (264553)

        Actually I can't wait until they ship laptops with this CPU and a nasty integrated GPU that won't be able to pull Vista off with all the bells and whistles attached. :)

        Nasty?

        This is slashdot. Not Mean Girls.

         
    • Re:So... (Score:5, Funny)

      by DigiShaman (671371) on Friday July 14, 2006 @01:39AM (#15717177) Homepage
      Now THAT was a stupid question...of course not. But it will be fast enough to run the latest in spyware in the years to follow.
    • Yes, it should be just sufficient to boot the operating system and allow you to play Solitaire at a playable frame rate. They should have Core Trio out by then anyway, so I wouldn't worry.

      I wonder if we will ever see a Core Pentio?

    • I think "walk" would be more appropriate.
    • by joshsnow (551754)
      Vista? Nah, will this be enough to run Duke Nukem Forever?
  • by riprjak (158717) on Friday July 14, 2006 @01:11AM (#15717113)
    "Real World" [hardocp.com] testing of the new Core 2 Duos over at HardOCP seems to suggest that the hype is, well, Bullshot (Penny Arcade). He also savages... no, investigates, the other benchmarks in his normal subtle-but-robust manner :) It seems that the top-of-the-line Core 2 Duo just barely beats an FX-62 numerically in actual game performance; statistically there is no difference whatsoever... As with all things, it comes down to perspective. I have no doubt that Intel are catching up to AMD, and may indeed have caught up. However, I simply do not believe they have gone from lagging significantly to leading significantly at the same clock speed; time, I suppose, will tell.
    • wow... mangled the formatting there, I look like a fucking fanboy.

      Apologies all!
    • by doormat (63648) on Friday July 14, 2006 @01:20AM (#15717138) Homepage Journal
      HardOCP's review shows one extremely interesting thing...

      If you have a single high-end card (7900), there isn't a whole lot of difference between the FX-62 and the X6800, or even the E6700. Most games are GPU-limited now, and will be until the next generation of cards is released in 3+ months (FEAR is really the only exception to this).

      They didn't run any benchmarks at 800x600 or whatever, because those results are more or less useless. Who spends $500+ on a processor and $500 on a video card and plays games at that low a resolution?

      What matters if you're going to buy a new rig now is the price/performance ratio. If you're a midrange gamer, your best bet is probably an E6600 and a $250 video card. Or an AM2 setup; it all depends on the prices AMD cuts their X2 line to. We'll find out closer to the end of this month what the deal is. Come August 1st we'll have a very good idea of which platform is on top.
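      The GPU-limiting effect described above can be modeled crudely: each frame takes the longer of the CPU's and the GPU's per-frame time, so a faster CPU only shows up once the GPU stops being the slower side. A toy sketch, with all millisecond timings hypothetical:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is bounded by the slower of CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# At high resolution the GPU (hypothetically) needs 20 ms per frame,
# so a 30% faster CPU changes nothing:
print(fps(10, 20))  # 50.0 fps
print(fps(7, 20))   # 50.0 fps

# At low resolution (GPU at 5 ms) the CPU difference finally shows,
# which is why low-res benchmarks exaggerate CPU gaps:
print(fps(10, 5))   # 100.0 fps
print(fps(7, 5))    # ~142.9 fps
```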
      • by Sycraft-fu (314770) on Friday July 14, 2006 @02:54AM (#15717348)
        The processor is generally the thing I upgrade the least, because it simply has the smallest increase in demands. Video cards you can upgrade once a year and not be doing it too often, given the advances they have. Throwing lots of RAM at your system is also a good idea. Processor? Well, for gaming and most apps it just really isn't that big a deal. Get a good dual core of pretty much any design you like and call it good. Hell, if all you are worried about is gaming, and not doing things in the background while you game, get a good single core; games still don't make any use of a second core to speak of.

        I moved from a P4 2.4GHz to a Pentium D 2.8GHz when I did a system overhaul not too long ago. Why such a minor processor upgrade, you might wonder? Well, because the processor wasn't the issue. That 2.4 was plenty fast, for games at least. The graphics card was the issue, and I wanted PCIe, which my board didn't support. Had the board had the same socket, I would have just kept the processor. It was fine (though because of the audio work I do, I'm appreciating the dual core). I just got a dual core because they weren't that much more expensive and it has geek appeal to me.

        The real useful thing, in my book, is that the Core 2s run cooler. Current processors have tended towards too hot. AMD is much better than Intel, but even they put out quite a bit of heat at the high end. It sounds like the Core 2s are quite efficient for the performance they give. That's good, because I value a quiet system and frankly, it's as good as I'm willing to make it at this point cooling-wise. I'm not going water cooling, and there's just no more air cooling I can do short of making the fans speed up.

        I don't think I'd recommend these as an upgrade to anyone who already has a dual-core AMD or Intel system. Unless you are doing simulations or rendering or something, I just can't see the minor increase as worth it. Certainly not for games. However, if you need to upgrade anyhow, these look like winners.
        • Given that I stopped reading this sort of nonsense long ago, I was surprised to see that people are *still* using low-resolution games as a benchmark.

          Games have been predominantly GPU-limited for the past 6 years (or in layman's terms --- as long as GPUs have existed in the form they do today, the nVidia GeForce being the first such chip). It made no sense in 1999 to use Quake 3 running at 640x480 as a benchmark, because the game looked a *lot* better at higher resolutions, and the hardware was able to cop
        • Games don't make use of the second core? Who the fuck is writing these things!? Shouldn't Windows be able to make it use the other core for parallel tasks?

          Anyone?

          I guess my definition of an operating system that supports multiple processors is a little different from Microsoft's.
          • Yes, but what other tasks are running (remember: running is actually doing something, not just waiting for input) while you are playing? You might have a firewall and anti-virus installed, but their CPU usage will be 1%, so running them on the other core is not really that much of a boost.
        • Unless you are doing simulations or rendering or something I just can't see the minor increase as worth it.

          I agree with your post, but this part stuck out. Why is it that, on a slashdot geek site, nobody ever relates CPU performance to programming or desktop/workstation use? Photoshop, servers and games seem to be the main reasons people justify the highest-performance machines. But ever since I was in high school (in the 80s), I've always overtaxed my machine... I've never had a machine and said "for w
      • Unfortunately, such benchmarks with games are not nearly as valid as they once were from the CPU perspective. They use timedemos, which most often don't redo all the CPU work, such as AI and physics and so on. The reason is that this lowers the risk that the simulation will diverge due to numerical errors.

        So the CPU has a larger impact than reviews often make it seem. It is true, though, that top-of-the-line CPUs really need top-of-the-line graphics to balance, and who can afford that? On the other hand it

    • by Anonymous Coward
      Playing games at high resolution is limited by graphics card. CPU plays minor role. Film at 11.

      However, if you _do_ have tasks that are heavy on CPU and not GPU, Core 2 owns AMD.

      So what's hype about a CPU that's 1) cheaper 2) plays games just as well 3) can handle the occasional DivX rip or MP3 conversion much faster?
      • by riprjak (158717) on Friday July 14, 2006 @01:39AM (#15717180)
        Responding to anonymous... I must be mad.

        But your point is accurate. Gaming is mostly GPU-limited; my gaming system, an s939 AMD64 X2 3800+ with a pair of old GPUs (7800GTX 256MB), achieves equal or better gaming results than all of these.

        I suppose the point is: are such prices for CPUs currently justified when they won't have much impact on user experience?

        No doubt the new entry-level Core 2 Duos seem to be the upgrade of choice to maintain near cutting edge; but a high-end GPU seems a wiser spend than a new CPU for gamers.

        As for video encoding et al., HardOCP had the same results in their "real world" testing as others, but at least they make an effort to simulate the way the "average" person might use the things; either way, I'll reserve judgement here until I see some 64-bit results, since encoding in native 64-bit will be the telling tale IMHO.

        In any case, I think we are reaching the point of diminishing returns: a year-old 2GHz processor already rips music as fast as the drive can deliver it, already transcodes video as fast as the drives can burn it, etc... GPUs control gaming... It is nice to see Intel returning to the game in a serious fashion, and no doubt this will have positive results for the consumer if AMD try to match price/performance. I was mainly trying to point out that the "benchmarks" aren't necessarily useful in describing the performance of these beasts in operation.

        err!
        jak
        • What codec, or what drives, do you use if the drives are the limiting factor for video transcoding?
        • In any case, I think we are reaching the point of diminishing returns: a year-old 2GHz processor already rips music as fast as the drive can deliver it, already transcodes video as fast as the drives can burn it, etc... GPUs control gaming...

          Why wouldn't you just double things up, then? Drop another monitor, keyboard, and mouse, and two users should be able to use one of these computers just as fast as one can on a modern computer.

          I know a lot of households that would benefit from buying only one new computer
    • by androvsky (974733) on Friday July 14, 2006 @01:33AM (#15717164)
      You have to be careful with the HardOCP benchmarks. I only read the first several pages, but they're doing their usual "real-world" stuff... which means leaning on the video card to do most of the work. Naturally, CPU differences aren't going to show up much here. I appreciate them doing something to put a real-world perspective on things, but what I read doesn't change the fact that the Core 2 Duos (I hate typing that) are really stinking fast. But playing games that do most of the work on the video card won't show that, big surprise... it really looks like a clever yet still desperate attempt to be a raving AMD fanboy and prop them up.

      Uh-oh, rant ahead, I tried to avoid it, I swear... ;)

      I am a raving AMD fanboy, but I'm a raving AMD fanboy because they've made the best CPUs for a long time. They also have a wonderful motherboard architecture that makes very high bandwidth applications much easier to deal with. I find myself wishing I could plug the Core2Duos into an AMD motherboard... on-chip motherboard controllers would help Intel also. Ah, what do I care, I want to see a real motherboard built around a Cell, the overall system bandwidth is almost as exciting as the cpu. Too bad that means buying everything from Rambus... :(

    • The Oblivion tests show everything wrong with this: the E6700 and X6800 getting identical (more or less) numbers indicates a GPU-bound test, AND they use different settings for the AMD test - as they state that the game was not playable there if the higher-quality settings were used.

      This [msdn.com] MSDN blog post was an interesting read to me. As the writer notes, image processing is a kind of virtual task. But it shows some pretty interesting stuff, IMHO, like the fact that the gap between AMD and Intel (Intel winning in

    • by YesIAmAScript (886271) on Friday July 14, 2006 @02:02AM (#15717228)
      (Cribbed from my post in another place.) HardOCP are complete AMD whores here.

      They do the power tests with power saving settings turned off. This gives AMD the edge at idle, mostly due to a lower transistor count. As other sites have shown, turning the power saving settings on (as one would expect) puts Intel far out front at idle.

      How do they end that article?

      " I would highly suggest keeping your eyes on AMD low wattage / energy efficient processors for those projects that require a noiseless solution."

      So they make Intel look worse than they are, and yet Intel still wins under load. What's the takeaway? Buy AMD.

      In the gaming, after the Intel gets done smoking the FX-62, what do they say?

      "It is very interesting that in all of our testing, both "what is playable" testing and "apples-to-apples" testing, the Intel Core 2 Extreme X6800 and Intel Core 2 Duo E6700 are very close in performance. In fact, in some games they are dead even. The price difference between the two is very extreme with the Core 2 Extreme X6800 costing $999 and the Core 2 Duo E6700 at $530. Does it look like the price is justified between the two for gaming? We can safely say "no" as far as gaming goes with this gameplay testing we have performed."

      Then, when speaking of AMD, do they mention even the E6700 ($530) beat the FX-62 and the FX-62 costs over $800? Nope.

      "As for the AMD Athlon 64 FX-62, all of our testing shows that it does trail the two new Intel CPUs in gameplay performance. So, if you wanted to point one out as being a "winner" then for sure it is the new Intel Core 2 X6800 and E6700. But, if you look at the amount of difference between the AMD and Intel CPUs, you will see that it isn't enough to amount to anything. The only game that we saw any real-world difference in was Oblivion, and even that was tiny. A little overclocking would clear that difference up."

      Any mention of overclocking levels and how the Core 2 Duo overclocks well? Much better than an FX-62 usually. Nope.

      What's their takeaway from the gaming section where a $530 Intel beats out AMD's fastest chip (at $800)?

      "We have proven here that the flurry of canned benchmarks based on timedemos showing huge gains with Core 2 processors are virtually worthless in rating the true gaming performance of these processors today. The fact of the matter is that real-world gaming performance today greatly lies at the feet of your video card. Almost none of today's games are performance limited by your CPU. Maybe that will change, but given the trends, it is not likely."

      and then

      "Lastly, I would advise everyone that is thinking of rushing out and purchasing their latest upgrade that we are sure to see HUGE pricing slashes out of AMD before the end of the month."

      Way to go HardOCP. Rig your tests, ignore Intel victories and make your summary "buy AMD".

      You have zero credibility, HardOCP.

      Also, you used "bullshot" wrong. Bullshot is a term for faked-up screenshots made for games (like EA uses). It doesn't fit here.
      • I stopped reading [H]ardOCP soon after they switched from "real" benchmarks (equal settings for both machines) to their oh-so-flawed allegedly-but-not-"real-world" tests (different settings for each platform, to get a similar framerate). It's bullshit and tells me *nothing* except what I'd get if I copied their settings directly--if I prefer to play at different resolutions, and/or different levels of AA/AF/etc., their reviews become worthless compared to traditional ones which give head-to-head benchmarks
      • They do the power tests with power saving settings turned off. This gives AMD the edge at idle, mostly due to a lower transistor count. As other sites have shown, turning the power saving settings on (as one would expect) puts Intel far out front at idle.
        Does this mean that the AMD is faster when it does nothing?

        (puzzled)
    • In my opinion it's a stupid argument to make. As an observation it has some merit, but in no way justifies the conclusions being made.

      Yes, as fast as today's video cards are, they still are the limiting factor when gaming at high resolutions with all the features turned on. CPUs are fast enough and getting a more powerful CPU isn't going to help when it's the video card that is maxing out.

      So of course people shouldn't expect their games to play faster by buying a faster CPU, but I really don't see how that
    • I don't think Tycho would like you redefining the meaning of words he invented. Bullshot is used to describe a 'screenshot', and not, AFAIK, benchmarks, etc.
      • I don't think Tycho would like you redefining the meaning of words he invented.

        I don't think Tycho gets any say in the matter, any more than we have to apply to Shakespeare's descendants for permission to update the meaning of words he invented. So far, thank God, the English language itself is not open to any "IP" claims, which means that the fact that such-and-such a person claims to have been the first to use a word means precisely nothing. The rest of us can use it however the hell we like, and if eve
    • The way they've done things, no processor, however awesome, could make a significant difference, short of putting in an old 486 or something. The CPU is not the bottleneck. They are benchmarking CPUs in a test where the CPU does not even approach 100% utilization.
  • OCAU's view (Score:5, Informative)

    by Agg (246996) on Friday July 14, 2006 @01:18AM (#15717136) Homepage
    We have a comprehensive review on OCAU also: http://www.overclockers.com.au/article.php?id=489587 [overclockers.com.au] We compare the new high-end 2.93GHz X6800 and the 2.67GHz E6700 with the current Pentium D 955XE and AMD's A64 FX-62. Lots of info, loads of benchmarks and of course, some overclocking.
  • ads ads ads (Score:2, Funny)

    by Anonymous Coward
    this might very well be an interesting article, but if they're going to subject me to at least five Flash ads on screen at the same time, this early in the morning, I think I'll pass.
    • by iapetus (24050)
      There's your problem, then. With a Conroe you'd have enough power to display up to ten flash ads at the same time.
  • Intel transfer the difficult from Hadware to software, for get more power, programmer need more technology.
    • by wbren (682133) on Friday July 14, 2006 @01:54AM (#15717209) Homepage
      Intel transfer the difficult from Hadware to software, for get more power, programmer need more technology.
      I completely agree! The Intel transfer the from hardware to software, get more on the power. In conclusion:

      What?
      • Look at his username. I don't think English is his primary language.

        I think what he was trying to say is that Core 2 isn't a magic processor that just makes everything faster, but can also be leveraged by programmers for even greater gains with some optimization. Of course, this isn't different from any other processor, and I could be completely wrong about what he was saying.
  • by twistedcubic (577194) on Friday July 14, 2006 @01:42AM (#15717190)
    ..but it seems I need to upgrade to this new Intel processor so that my computer can handle all the ads on the website.
  • by CCFreak2K (930973) on Friday July 14, 2006 @01:44AM (#15717191) Homepage Journal
    Even if it's just a shot at getting market share back, the fact that great things like this are being sold at lower prices only means good things for the consumer. This, for example, is GREAT for me as a system builder, because everything besides the Pentium D 805 was expensive. Now, with something like this, I can offer a (possibly) better CPU for not that much more.

    More good stuff is coming from both camps, I predict.
  • I've been holding my breath waiting for AMD to respond. Anytime now would be a good time for them to announce how they are going to counter the Core Duo. But the reality might be that they need to recoup their costs from developing the AM2 platform before they can make any changes.

    I think the competition has been good, but if Intel returns wearing the performance crown then I think there is a real potential that the CPU market will be dominated by Intel more so than it has ever been before, with consoles

  • by extra the woos (601736) on Friday July 14, 2006 @01:58AM (#15717221)
    What the benchmarks mean is that if you do a lot of media encoding, compiling, etc., you would probably benefit from Conroe. HOWEVER, if you play games - regardless of whether you are currently on an AMD or Intel system - if your system is pretty new, do not upgrade at this time, as you are GPU-limited, not CPU-limited. Basically, Conroe: large performance gains in CPU-bound applications, little performance gain in GPU-bound applications, obviously.

    This is good for Intel. My systems for the past 7 years or so have been AMD. My next one very well may not be. The good news for everyone is that AMD is now the underdog again. Remember what happened last time they were the underdog? We got the Athlon. The CPU speed wars went into a frenzy.

    For the last several years (5 or so) Intel has been sucking balls. Their chips have not been performance-competitive. Clock speeds in both camps have stagnated. AMD chip prices have gone way up compared to how they used to be. This is good news: AMD will go into overdrive developing their next-gen chips. AMD chips will become dirt cheap again. We'll see a new performance war. This is something I've been waiting for anxiously for a few years. I am very excited.

    Another thing is that the new Intel chips take much less power than the old ones. (Thank god.)
  • Energy efficiency (Score:5, Insightful)

    by kjart (941720) on Friday July 14, 2006 @02:40AM (#15717313)
    Everything else aside, that was the one thing that interested me most about the review: the fact that the new Conroes are allegedly going to consume about half as much power as current desktop chips. Why is this important? Well, if such gains can be made on the desktop, I'm _really_ looking forward to the laptop chips. Maybe the 7-hour battery life claimed by laptop manufacturers will actually be accurate in the near future.
    • Re:Energy efficiency (Score:5, Informative)

      by Afty0r (263037) on Friday July 14, 2006 @03:53AM (#15717451) Homepage
      Maybe the 7hrs claimed battery life by laptop manufacturers will actually be accurate in the near future.
      I don't think it has been inaccurate until now.

      I own a Fujitsu Amilo V2000 laptop (in the UK) which uses the original Intel Centrino chipset. I work mostly at home, but am on the move once or twice a week. Several times early in its life (the first few months, while the battery was fresh), I came home in the evening from an onsite job, then got up in the morning, switched the laptop on and started work, only to have the battery warning (10%) give me a nudge around 4pm (from a 9am start). My work is web development, so while it's not too intensive, I'm running email, web radio, text editors etc. constantly. Admittedly it was running on a wired network, and using the built-in wireless chip results in a loss of an hour or two from that figure...

      I was completely amazed the first time it happened: having forgotten to plug it in, I assumed it would die a couple of hours later, but it lasted almost the entire workday. (Other notes about that model: the battery itself died after 6 months, how annoying... and the screen is a bit glarey, but overall I was very happy with the laptop.)
    • Well, if such gains can be made on the desktop, I'm _really_ looking forward to the laptop chips.

      Do you expect laptop CPUs to somehow consume a fixed percentage of desktop CPU power? Or how else do you get this sort of assumption?

      I don't think the difference is going to be as big as, say, between P4 and P-M. Those two were very different architectures, whereas the same Core 2 basis will be used for both desktops and laptops.

      Besides, if laptop CPUs can be engineered for low power with high performa

      • Do you expect laptop CPUs to somehow consume a fixed percentage of desktop CPU power? Or how else do you get this sort of assumption?

        Oh, I would agree that it's an assumption, but I think a fairly reasonable one. What I'm basing it on, though, is that while energy drain is not a huge concern for desktops, it is for laptops. Therefore, one would expect the extra engineering spent on making a mobile version to be focused on making it even more efficient than the desktop part.

        Besides, if laptop CPU

  • by treak007 (985345)
    Even though the benchmarks show that the Intel Conroe beats the AMD FX, the real question remains: value. Would you honestly notice a difference when both processors are running relatively close in frames per second? Maybe the Conroe can get 20 more fps, but is that worth the extra money? AMD is notorious for being less expensive than Intel. Either way you could run the top-of-the-line games; it's just a question of which allows you to get more bang for your buck. If Amd sets a s
  • by dhollist (811706) on Friday July 14, 2006 @02:57AM (#15717352)
    The headline states that "Even a $180 Intel CPU can beat an Athlon FX-62 in a number of tests" but if you read the article, the $182 Core 2 Duo E6300 (1.83 GHz) chip wasn't tested [hexus.net]. All of the performance data relates to the $224 Core 2 Duo E6400 and pricier chips. The results are impressive, but I think the "$180 chip beats Athlon FX-62" deception should be pointed out to anyone who didn't pick that detail up from RTFA.
    • Steve K (submitter) here. You're right, it was the E6400, not the E6300, that was tested. My apologies... 5am is not a friendly time for the brain. Nevertheless, $40 more still makes for a CPU that's far cheaper than the FX-62.
  • by Anonymous Coward
    Rick Brewster of Paint.NET fame tested [msdn.com] two Core 2 CPUs with his own benchmark.
  • benchmarks (Score:3, Interesting)

    by Exter-C (310390) on Friday July 14, 2006 @03:26AM (#15717408) Homepage
    It's interesting that all benchmarks seem to include MP3 compression or MPEG video creation etc. How many Slashdot users actually use their computers more than 1-5% of the time doing that type of stuff? Of course it's all those DivX groups that need the performance so that they can encode and release an extra 20% more videos in a month ;)

    Overall the performance of the latest bunch of Intel processors is great, but when it comes down to it, in a datacentre environment where spare stock etc. is a costly exercise, using Intel products is going to cost you more in the long run, while if we go with Opteron we can save on spares and still get great performance/power consumption.
    • I'd be more interested in H.264/AVC decoding performance than encoding performance; that said, we should be doing H.264 decoding on the GPU. Unfortunately the bastages at nVidia want you to pay extra for that privilege.
    • MythTV (Score:3, Insightful)

      by leoxx (992)
      My MythTV box spends MOST of its time doing exactly that. If these chips are as fast/cheap/cool (and therefore QUIET) as they appear to be, my MythTV box will be running Conroe by the end of the year.
  • by Anonymous Coward on Friday July 14, 2006 @03:44AM (#15717437)
    A list of Core 2 reviews [madshrimps.be], kept up to date, with 16 articles so far and more to follow.
  • by rikkus-x (526844) <rik@rikkus.info> on Friday July 14, 2006 @05:22AM (#15717607) Homepage
    Personally, I don't care about processors costing USD 400, or about gaming performance, where the CPU doesn't matter too much anyway. Are there any comparisons of the cheapest Core 2 processors with similarly priced AMDs?
  • Not yet available (Score:3, Insightful)

    by Hackeron (704093) on Friday July 14, 2006 @06:15AM (#15717688) Journal
    Has anyone noticed the processor is not yet available for sale, and won't be available for a while? I was very impressed by the benchmarks until I tried to find it for sale and saw that expected street prices will be far higher than those listed on the review sites, and in fact will rival AMD prices.
  • I was only waiting for these for the supposed AMD price cuts. The only reason my A64 3000+ system is getting the boot is that it is socket 754, so I need to be rid of the dead socket and get me some DDR2, SATA-II, and PCIe lovin'. PCIe is the main reason, since my 9600 AIW is showing its age, though it did get me the best free game ever bundled with a video card: HL2. The real performance increase for me will come from the GPU (though the CPU won't hurt).
  • by digitaldc (879047) * on Friday July 14, 2006 @09:19AM (#15718561)
    Even a $180 Intel CPU can beat an Athlon FX-62 in a number of tests

    But, wait! YOU didn't wait for the next Athlon FX-63 processor that totally smashes that $180 Intel one!
    It's coming out in a few minutes...

    But wait again! A NEW $175 Intel next generation processor is on the verge of completion, and will be released soon after the Athlon FX-63 to totally obliterate that one!
    It's coming out at the close of business today.
