Intel vs. AMD - Today's Generation Compared

Bender writes "The Tech Report compares 15 Core 2 and Athlon 64 processors from Intel and AMD — from sub-$200 to a cool grand, from slower dual cores to fast quad cores — in 32 & 64-bit apps in Windows Vista, including the new, multithreaded RTS game Supreme Commander. 'The release of Windows Vista and a round of price cuts by AMD prompted us to hatch a devious plan involving Vista, a new test suite full of multithreaded and 64-bit applications, fifteen different CPU configurations, and countless hours of lab testing. That plan has come to fruition in the form of a broad-based comparison of the latest processors from AMD and Intel... from the lowly Athlon 64 X2 4400+ and Core 2 Duo E6300 to the astounding Athlon 64 FX-74 and Core 2 Extreme QX6700.' Folding@Home in Linux, power use, and energy efficiency are tested, too."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Summary (Score:5, Informative)

    by richdun ( 672214 ) on Friday March 23, 2007 @10:16AM (#18458091)
    14 pages of ads later...

    Intel > AMD at high end, Intel >= AMD at low end, Core 2 > A64, Intel finally has a lead in both architecture design and process (65nm).
    • by Visaris ( 553352 ) on Friday March 23, 2007 @10:27AM (#18458253) Journal
      Intel > AMD at high end, Intel >= AMD at low end, Core 2 > A64, Intel finally has a lead in both architecture design and process (65nm).

      I would agree with that as a generalization, but I still think it is very important for people to consider the applications they use most often. TR's benches clearly show that someone working primarily with POV-Ray would get better performance for $599 with AMD than for $999 with Intel. I agree that Intel takes the overall win, but blanket statements like this really fail to catch the areas where some chips shine and others do not.
      • Re: (Score:2, Interesting)

        by trimbo ( 127919 )
        TR's benches clearly show that someone working primarily with POV-Ray would get better performance for $599 with AMD than for $999

        Ok, let's be realistic here. Does anyone use POV ray for anything other than processor benchmarks? I have yet to see one real production, student or otherwise, rendered with POVRay. Let's see benchmarks with PRMan and Mental Ray. Those are production renderers people are actually using.
        • by ari_j ( 90255 )
I have been involved with the use of povray as part of a DNA visualization suite in a commercial context. That said, I agree that there is a very low probability that anyone would buy a CPU based on its povray performance and nothing else.
        • by insignificant1 ( 872511 ) on Friday March 23, 2007 @01:53PM (#18461339)

          My comment addresses yours, but wanders, so apologies in advance.

          I use POVRay for explaining engineering concepts to co-workers, in papers for external viewing, and for use by the marketing folks. Word on the street is that its renders find their way into publications such as Nature and Science.

          You talk of "production," which sounds like "movie," but it isn't a chump app just because movies don't use it for their render engine. It's a free, not-too-restricted-source-code app that yields stunning results. (Side-note: PRMan is ~$1k, and I couldn't find a price on Mental Ray, but my attitude towards Autodesk is that of the indentured slave to his master. But I digress.)

Doing a render is the single most time-consuming task I do, and about everything else runs "fast enough" for me on my three-year-old mid-grade P4 system. Sometimes I wish for greater performance with an Octave/Matlab script when I am playing around. But that's hard to benchmark & compare on systems, and only rarely does it take the same order of magnitude of time that POVRay can take.

          So in response to another, I am one (and likely not the only) person who would lean towards a system based upon its POVRay performance. I have just been overjoyed that this has started to be used for benchmarks. I personally find the frames-per-second on Doom4 benchmarks useless, but what it comes down to is one thing:

          The more apps people benchmark (accurately), the more people benefit and can make informed decisions that address their specific situations.

So I would like to see the apps you mention benchmarked, too. CompUSA never let me install Photoshop and run my personal tests on it back when Photoshop was important to me. And now the greatest computer selection is from online retailers; how will you compare the value of a computer for you? $3000 for a Core2Duo Extreme Quad SSE2 from Dell, $2800 for an AMD Dual Quad HyperTransport blah, blah, from HP, and a bare-bones system from somebody else? What is each worth to you? And performance, obviously, gets far more complicated when you move beyond the processor isolated in a system. Is it worth the extra $100 for me to get another 1GB of memory? Will I ever really know how that will affect my apps, or does it just come down to whether or not I am willing to hand over another $100 just in case it might help?

I guess that is it: What are people's expectations for spending their money? Those who look at benchmarks might just find the "best" and drop their cash (or credit) on that one. Those who don't look at benchmarks might hit a price point, and just grab the best-looking system or find something from a specific brand. People like myself who want to optimize on a personally-important criterion are mostly left to guess and always be uncomfortable with any choice they make. And this situation might only change when spending the $1000 (or whatever) is worth caring a great deal about.

      • by Talinom ( 243100 ) *
        but blanket statements like this really fail

        Uh, this is Slashdot. We are all about making blanket statements and then ripping them apart.

        Welcome aboard!
    • Re:Summary (Score:5, Interesting)

      by hal2814 ( 725639 ) on Friday March 23, 2007 @10:32AM (#18458341)
I disagree with their definition of "low end." Maybe low end as far as what they tested, but there are a lot of non-X2 Athlon 64s and Pentium/Celeron Ds being sold. At the true low end, AMD is still more than competitive. It's only when you near the most-horsepower-per-dollar peak that Intel really pulls away (and that's where they seem to start measuring here). It's worth noting that I have no undying love of AMD. I have two AMD processors and one Intel processor running in my current personal machines and plan to get a Core 2 as soon as the next significant price drop occurs.
      • Re: (Score:3, Insightful)

        by Barny ( 103770 )
        In particular the x2 3600 and 3800 seem to be 2 of the best bang-for-buck chips out there.

        Yes ANY of the CPUs tested will leave them for dead, but if your user is running WinXP, doing a bit of this (video transcoding) and a bit of that (watching streamed video) and even a bit of the other, they will do it and leave them thinking "damn this thing is fast" all for pocket change compared to these other chips.
        • by morcego ( 260031 )

          In particular the x2 3600 and 3800 seem to be 2 of the best bang-for-buck chips out there.


          Since I'll be buying some X2 3800+ computers soon, I'm really happy to hear that.
      • Re: (Score:3, Informative)

        by Endo13 ( 1000782 )
Couldn't agree more. Intel's cheapest Core 2 Duo CPU is still $169 on Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16819115013 [newegg.com]

        And anything Intel has that's lower than that is still pwned by AMD CPUs that sell for half the price.

        Just last night I finally bought some kit to get my system into the dual-core/DDR2 generation, and with AMD I was able to squeak in at a mere $300 at Newegg, including CPU, motherboard, RAM, and video card. The entrance fee with Intel (albeit with a significantly bett
AMD is for people who prefer cheap over performance. You know, people who will be buying another AMD chip before they would have bought another Intel chip, because the AMD chip will fall behind the performance curve faster.

Or for people whose performance needs are met by any currently marketed processor from either company. For typical desktop applications - internet applications, office suites, finance applications, basic photo-editing - even midrange processors are a waste.

            The only reason I recently retired my primary desktop at home (an Athlon 800 built in 2000) is because a free Athlon 64 fell into my hands.
      • Re: (Score:3, Insightful)

        by geekoid ( 135745 )
        "Maybe low end as far as what they tested, but there are a lot of non-X2 Athlon 64s and Pentium/Celeron Ds being sold. At the true low end, AMD is still more than competitive."

Do you realize that you're saying AMD is on its way to obsolescence?

AMD's new slogan: "We're really fast on old CPUs!"

        • by hal2814 ( 725639 )
"Do you realize that you're saying AMD is on its way to obsolescence?"

          Your point being? I've already stated that I have no great love for AMD. It wasn't karma whoring. It was truth.
but it was a review of multi-core CPUs. As far as the review/test is concerned, an AMD X2 4200+ is the low-end chip.

        at the true low-end, Intel is still the cheapest, as I can get a P3 1ghz off ebay for a fiver...
    • Re: (Score:1, Offtopic)

      Ads? I don't see no steenking ads... Adblock [mozdev.org] rulez!

      Offtopic, yes, but I couldn't resist..

      • Re: (Score:1, Redundant)

        by ArcherB ( 796902 ) *
I don't see any ads either! All I see is:

        Unable to connect

        Firefox can't establish a connection to the server at techreport.com.

        * The site could be temporarily unavailable or too busy. Try again in a few
        moments.

        * If you are unable to load any pages, check your computer's network
        connection.

    • Re: (Score:1, Redundant)

      by LoudMusic ( 199347 )

      14 pages of ads later...

      Intel > AMD at high end, Intel >= AMD at low end, Core 2 > A64, Intel finally has a lead in both architecture design and process (65nm).
      There were ads on those pages? News to me. Firefox + Adblock. It will set you free.

      As far as the data is concerned, it's good information but somehow unsurprising. Maybe Apple partnered with the right people after all.
    • by drsmithy ( 35869 )

      Intel > AMD at high end, Intel >= AMD at low end, Core 2 > A64, Intel finally has a lead in both architecture design and process (65nm).

Finally? I think you mean, after AMD's brief lead, the status quo has been restored.

Thanks! That's all I wanted. Good night. P.S. Could you come work in my office?
  • by Anonymous Coward on Friday March 23, 2007 @10:18AM (#18458121)
Despite the FAH PS3 client having been out for under 24 hours, the PS3 client's performance is overtaking all the CPU/GPU FAH clients combined!
http://fah-web.stanford.edu/cgi-bin/main.py?qtype=osstats [stanford.edu]
    http://folding.stanford.edu/FAQ-PS3.html [stanford.edu]
  • Refreshing (Score:5, Insightful)

    by Visaris ( 553352 ) on Friday March 23, 2007 @10:21AM (#18458165) Journal
While the review is not perfect, it is a breath of fresh air compared to many of the tactics reviewers often use to skew the results in the favor of one company or the other (usually Intel). Tech Report presents benchmarks that each side wins. AMD takes a clear win in Cinebench and POV-Ray and some minor wins in a couple of other areas. It is good to see AMD get some accurate representation in a time when most are happy to claim that Conroe and the Core2 arch cannot be beaten. AMD's new architecture (new core enhancements as well as quad-core) will come out at the end of the second quarter this year, and if their claims of performance improvement on the per-core level are accurate, I think we may see another stage in the never-ending game of leapfrog. Anyways, I'm pleased to see a mostly accurate review, even if I disagree with the commentary at times.
    • Re: (Score:3, Insightful)

      by tomstdenis ( 446163 )
I didn't read the review [bah, ads], but the point isn't to be faster, it's to be better. Better often means something radically different from simply faster.

For example, if AMD made a core with the same clock rate as their current mainstream core [say 2.4GHz], but taking 40% less power at maximum, wouldn't that be better? It wouldn't render POV-Ray any faster, but it would take less juice to do it. Aside from speed, size matters too. Smaller chips are cheaper to produce, which reduces cost. Many design changes are for
      • by Visaris ( 553352 ) on Friday March 23, 2007 @10:37AM (#18458403) Journal
Sure, performance is not the only factor. Intel does get some points for having a great combination of outstanding performance and very good thermal characteristics. Core2 is a great architecture, and I don't think anyone is trying to say otherwise. However, many people take Intel's general win and skew this into the claim that Intel and Core2 are the best for everything, which clearly is not true. The tactics used in the past by many reviewers have been to run overclocked Intel chips against stock AMD chips. This isn't exactly "cheating," but it ensures that Intel will be in the top spots on the charts. Also, many reviewers simply choose to skip the benches AMD is strong at, like Cinebench and POV-Ray. I'm not here to claim any one chip beats the hell out of the other, I just wish a lot of the fanboyism and Intel's reviewer payments would go away so we could get more reviews like TechReport's, which show many of the strong points and weaknesses of both sides.
    • Re:Refreshing (Score:5, Interesting)

      by flaming-opus ( 8186 ) on Friday March 23, 2007 @11:21AM (#18459095)
What gets to me is the way that most reviewers compare power usage by simply comparing the listed thermal envelopes. AMD lists the maximum power used, whereas Intel lists the typical power used. Furthermore, for laptops and other machines where heat is your big concern, you do care a lot about the loaded maximum power used. However, for most desktops in which heat is not really an issue, you're more concerned about the cost of the electricity you burn. In that case it's almost more relevant to measure the idle power usage, since most desktops sit around doing nothing most of the time. Any good review should actually measure the system power between wall socket and PSU, otherwise it's not really informative to the actual concerns of the user.
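The idle-versus-load point above is easy to put in money terms with a back-of-envelope calculation. A minimal sketch, where the wattages, duty cycle, and the $0.10/kWh electricity rate are illustrative assumptions rather than anything measured in the review:

```python
# Rough annual electricity cost for a desktop, given measured wall-socket
# draw at idle and under load. All inputs here are made-up example numbers.
def annual_cost(idle_w, load_w, hours_loaded_per_day, usd_per_kwh=0.10):
    idle_h = 24 - hours_loaded_per_day
    kwh_per_day = (idle_w * idle_h + load_w * hours_loaded_per_day) / 1000
    return kwh_per_day * 365 * usd_per_kwh

# A desktop loaded 2 h/day and idling the other 22 h: idle draw dominates.
print(round(annual_cost(idle_w=60, load_w=150, hours_loaded_per_day=2), 2))
```

With these example numbers the idle hours account for roughly 80% of the yearly bill, which is the commenter's point about why idle draw matters more than peak for typical desktops.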

      • Re: (Score:3, Informative)

        by 0xABADC0DA ( 867955 )

        Any good review should actually measure the system power between wall socket and PSU, otherwise it's not really informative to the actual concerns of the user.

Well, if you RTFA, that's exactly what they did, but it's wrong. You don't measure the power consumption of a processor by comparing the total system draw unless the systems are otherwise the same (you can compare Intel to itself this way, not to AMD). The Core motherboard they tested is from Intel and has aluminum heatsinks, whereas the motherboard for AMD was by Asus and had heat pipes. Maybe they just put heat pipes on for looks, but my bet is this MB adds quite a bit to the system power use.

        The differe

        • They did compare system power, thus the title "refreshing".
Correct, they did it wrong by not comparing identical systems. However, there is some merit in this. I don't care if the power usage is from the processor, or from the northbridge needed to support said processor. Saying that a processor uses X amount of power makes for a good bullet point on marketing slides, but the actual user experience is with the whole platform, so I think it's appropriate to benchmark the power draw of the platform. Otherwis
          • Re: (Score:3, Interesting)

            by 0xABADC0DA ( 867955 )
For instance, I actually have a desktop system that draws 50 watts at idle, with two drives and a case fan (according to kill-a-watt...). If a different processor family takes 10 watts more you may call that academic, but I call it 20% more.

            the actual user experience is with the whole platform, so I think it's appropriate to benchmark the power draw of the platform.

            That's exactly the point... virtually nobody is going to use their same configuration of 700W power supply, drive, memory, MB, CPU, video. So there is no user experience with these systems. It's almost entirely meaningless to give a power usage for them. And s

            • Everything that you say is true, but remember one important thing: Comparing Athlon64 power consumption to Core 2 power consumption isn't an apples-to-apples comparison because of the memory controller. In order to get a useful comparison you need to compensate for that fact (the easiest way is to compare motherboard + processor consumption for both platforms).

      • In a desktop, I'm not that bothered about power consumption. In a laptop, I'm interested in battery life, so overall power usage is important.
    • Re: (Score:3, Insightful)

      by geekoid ( 135745 )
      "(usually Intel)"
      gah, talk about biased.

They're skewed pretty evenly overall. Game and 'geek' sites have a strong tendency to favor AMD; industry reviews have a tendency to favor Intel.

  • by Anonymous Coward
This [techreport.com], not this [slashdot.org].

    HTML is fine, but double check those URLs and HTML tags!
  • So... (Score:2, Offtopic)

    by Southpaw018 ( 793465 ) *
    So I load this article up and a grand total of two comments have been posted. One says this test shows Intel beat AMD. The other says AMD beat Intel.

Beautiful.
    • Re:So... (Score:5, Informative)

      by LBArrettAnderson ( 655246 ) * on Friday March 23, 2007 @10:44AM (#18458503)
      The "Conclusion" page... (for those of you who don't want to go through the 10 pages of pretty graphs and charts).

      The fact that Intel retains the overall performance crown comes as no surprise. As we said at the outset, AMD has no real answer to the Core 2 Extreme X6800 among its dual-core processors. Also, Intel's quad-core CPUs tend to scale better than AMD's Quad FX platform, especially for typical desktop-class applications. Our move to Windows Vista x64 has done little to alter this dynamic. At the same time, Core 2 processors tend to draw less power and to be more energy efficient--sometimes markedly so--than Athlon 64s. Right now, Intel has the magic combination of a superior processor microarchitecture and a more mature, fully realized 65nm manufacturing capability working together on its side.
This one-two punch has allowed Intel to maintain a performance edge at most price points, despite standing pat through AMD's aggressive pricing moves and new model introductions. AMD's current weaknesses manifest themselves most fully in its high-end models, like the Athlon 64 X2 6000+, which draws more power at peak than the Core 2 Extreme QX6700 yet is often outperformed by the less expensive Core 2 Duo E6600. The Athlon 64 looks more competitive in its lower-end incarnations like the X2 5000+ and 4400+, which match up better on both performance and power characteristics against the Core 2 Duo E6300 and E6400. These processors have the benefit of being available in 65nm form, and I'd say the minor penalty one pays in performance at 65nm (due to the slower L2 cache) is worth it for the reduced power draw.

      This low-to-mid-range territory, incidentally, is where I'd be looking to buy. Many of our tests have shown the benefits of quad-core processors, but honestly, finding applications that will make good use of four cores is not easy--and the list of games that really use four cores is approximately zero. I'd probably grab a Core 2 Duo E6400 and overclock it until it started to glow, if I were putting together a system right now. I must admit, though, that I have an almost irrational fondness for the Core 2 Quad Q6600, probably because it's the most energy efficient processor in our Cinebench power test. The thing is by no means a great deal--two E6600s will set you back over $200 less than a single Q6600--but it's easy to imagine a near-silent multitasking monster built around one.

      AMD would do well to expand its 65nm offerings into higher clock frequencies as soon as it can reasonably do so. That may take a while yet, given the limited overclocking headroom we've seen from early 65nm Athlon 64 X2s. Meanwhile, Intel isn't likely to sit still for much longer. Rumors of an April price cut abound, and in light of the Core 2's ample frequency headroom, higher speed grades are a definite possibility, as well. For AMD, its next-generation microarchitecture can't come a moment too soon.
  • Nice summary (Score:1, Informative)

    by Anonymous Coward
    What kind of 'editing' is this?

    "The Tech Report compares 15 Core 2 and Athlon 64 processors from Intel and AMD from sub-$200 to a cool grand, from slower dual cores to fast quad cores in 32 & 64-bit apps in Windows Vista, including the new, multithreaded RTS game Supreme Commander. 'The release of Windows Vista and a round of price cuts by AMD prompted us to hatch a devious plan involving Vista, a new test suite full of multithreaded and 64-bit applications, fifteen different CPU configurations, and c

  • Comment removed based on user account deletion
    • by Zo0ok ( 209803 )
      Sleeping with Apple?
    • by Targon ( 17348 )
      Intel tends to push the full "Intel processor+Intel chipset" which also favors using Intel graphics. Pretty much every AMD based machine uses either an AMD/ATI chipset, or a NVIDIA chipset with the appropriate graphics. For Vista, the "experience" favors having a decent GPU, so AMD is the better platform there on the low end.

      Every advertisement out there is about the low end of the dual-core, with a few mentions of the higher end products. So, you are looking at Athlon 64 X2 machines, or Core 2 Duo ma
  • David v. Goliath? (Score:5, Interesting)

    by ThePhilips ( 752041 ) on Friday March 23, 2007 @10:47AM (#18458553) Homepage Journal

    For AMD, its next-generation microarchitecture can't come a moment too soon.

    Nice reading.

But of course the conclusions are not that surprising. AMD is 10+ times smaller than Intel (judging by market capitalization). Intel has many fabs, while AMD is constantly struggling to expand its production capacity.

Yet AMD (with the Athlon 64) managed to put up quite a fight against Intel. Kudos to AMD: without you, Intel's CPUs would surely have cost $2500 apiece.

  • by Andy_R ( 114137 ) on Friday March 23, 2007 @10:49AM (#18458589) Homepage Journal
    "...the list of games that really use four cores is approximately zero."

    That's the most interesting part of the article for me. Apart from 3-D rendering and folding@home, they are really pushed to find any real-world reason for having 4 cores.

Maybe they should have waited for Adobe's CS3, when heavy Photoshop tasks should provide a nice real-world benchmark, and perhaps Apple will finally give us that long-awaited 8-core Macintosh to put up against high-end Vista machines.
    • It's so you get smooth performance when you're running a video game, a music player, a web browser, and that coding assignment you left up to fool your boss when you have to alt tab, all at once.

      And games should become more multithreaded with the way consoles are shaping up. Just because they don't take advantage of it now (mostly because 90% of people still have single core systems, and programming in multiple threads is hard) doesn't mean that newer games won't.
    • Re: (Score:3, Informative)

      by ryanvm ( 247662 )
      they are really pushed to find any real-world reason for having 4 cores.

      On the desktop sure, but there is no shortage of even mid-size server scenarios where 4 or even 8 cores come in handy.
      • I wonder if this is cheaper for licensing too. If Oracle (for example) charged by the CPU then having multicore CPUs would be a great way to save money, until Oracle changed licensing of course.
Oracle does charge by the socket; it's one of their competitive advantages. And paying a single CPU license fee for a nice quad Xeon is pretty damn sweet. It also means that you can put together a "2 CPU" cluster using Oracle Standard Edition (max 4 CPUs) with 8 cores of powerful goodness and the ability to scale up 100% with another two boxes without hitting licensing issues.

          And yes, a cluster of 2 boxes is often the smallest deployment you can consider from a failover standpoint.
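The per-socket licensing point above comes down to simple division. A sketch with a made-up round-number license price (the thread doesn't state Oracle's actual pricing, so treat these figures as purely illustrative):

```python
# Per-core cost under per-socket licensing: more cores per socket spreads
# the same fee thinner. The $17,500 figure is a hypothetical example price.
def cost_per_core(license_per_socket, sockets, cores_per_socket):
    return license_per_socket * sockets / (sockets * cores_per_socket)

print(cost_per_core(17_500, 2, 4))  # quad-core Xeons: fee spread over 8 cores
print(cost_per_core(17_500, 2, 1))  # single-core chips: 4x the per-core cost
```

The socket count cancels out of the arithmetic, which is exactly why multi-core chips were a licensing win until vendors moved to core-based pricing.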
          • by suggsjc ( 726146 )

            And yes, a cluster of 2 boxes is often the smallest deployment you can consider from a failover standpoint.
Ok, you've really got me stumped here. I've been thinking about this for a while and I can't imagine a deployment smaller than 2 that would be able to survive a failover.
            • Badly stated, perhaps - you can get away with one box if your failover requirements don't exist, which is more common than you might think.
    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday March 23, 2007 @11:26AM (#18459169) Homepage Journal

      That's the most interesting part of the article for me. Apart from 3-D rendering and folding@home, they are really pushed to find any real-world reason for having 4 cores.

      This argument only holds up if you only do one thing at a time. Even on my Athlon XP 2500+ (obviously, a single-core system) I would regularly burn a CD (or a DVD, but only at 2x max) while playing a game. It would work out but the game would sometimes stutter and the burn would sometimes underrun; the underrun protection would work, but it does slow down the burn.

      With four cores, you can play your game AND burn a disc AND have some crap going in the background and not have to care unless you become I/O-bound.

      Benchmarks do one thing at a time, so they're a shitty measurement of real-world performance for power users whose brains can cope with multitasking.

      • by hal2814 ( 725639 )
        "With four cores, you can play your game AND burn a disc AND have some crap going in the background and not have to care unless you become I/O-bound."

That's a pretty big "unless" considering becoming I/O bound is such a common occurrence. "Until" is probably a much better word to use.
That's a pretty big "unless" considering becoming I/O bound is such a common occurrence. "Until" is probably a much better word to use.

          It's a problem that is easily remedied, however, by the use of RAID controllers, modern buses, and so on.

          The computer I last got rid of has an onboard RAID controller on a 64 bit internal PCI bus...

      • Re: (Score:3, Insightful)

        by geekoid ( 135745 )
You can do that with SCSI and a Celeron.
For example, my SCSI drive was too small, so I decided to put in a SATA drive. That was the only difference. I can no longer play WoW, run iTunes, and download images from my camera at the same time without serious stuttering.

My point is, even 4 cores writing to a crappy hard drive architecture won't make much of a difference for people using more than one program at a time.

Way back in the day when CDRs were expensive, way before burnproof, I used my 166MHz Pentium to burn a CD while playing Quake at high enough resolution to make the audio stutter.

Without getting a coaster.

I'll let you guess if I was running Linux or Windows :)
        • Way back in the day (1996-7) when CDRs were still expensive, before burnproof I used my 200MHz Pentium to burn a CD while playing Quake at a sensible resolution, but still to the point of using up all the CPU (well, you are doing that essentially unless it hit the FPS cap easily anyway).

          Without getting a coaster.

          I used Windows (NT).
      • With four cores, you can play your game AND burn a disc AND have some crap going in the background and not have to care unless you become I/O-bound.

You mean with two cores. I play games and back up DVDs in the background all the time while playing MP3s at the same time on my lowly socket 939 Athlon64 X2 3800+. I can play ANY game at 1680x1050 with 4xAA and 8xAF, max quality settings, and still burn a DVD in the background with no noticeable drop in frame rate. Dual cores have been able to do this for a whil

        • Dual cores have been able to do this for a while now, not just with quad cores like you mentioned.

          you missed the "AND have some crap going in the background" part. I am sitting at a dual core system right now (Core Duo T2600) and I am well aware of what it is and is not capable of doing.

          Your antialiasing and filtering settings are completely and totally irrelevant as they are handled solely by the GPU.

          Regardless, games ARE heading towards being more multithreaded. The Xbox 360 is going to provide a gigan

      • by drsmithy ( 35869 )

        This argument only holds up if you only do one thing at a time. Even on my Athlon XP 2500+ (obviously, a single-core system) I would regularly burn a CD (or a DVD, but only at 2x max) while playing a game. It would work out but the game would sometimes stutter and the burn would sometimes underrun; the underrun protection would work, but it does slow down the burn.

        Something is wrong with your PC. Ca. mid-1997 I was burning CDs (at a blistering 4x) and playing Quake at the same time, on a 133Mhz Pentium w

      • With four cores, you can play your game AND burn a disc AND have some crap going in the background and not have to care unless you become I/O-bound.

        You mean like RAM management?
    • by mgblst ( 80109 )
It is so that you can read email and listen to music at the same time (at least, according to the ads from PC World, the leading UK PC seller). I wonder how people used their computers before Core Duos?
    • by AmigaBen ( 629594 ) on Friday March 23, 2007 @12:25PM (#18460017)
      "...the list of games that really use four cores is approximately zero."


      "That's the most interesting part of the article for me... they are really pushed to find any real-world reason for having 4 cores."


      Excuse me? Games=real world?


      Sorry, must have missed the memo.

      • by Minwee ( 522556 )
        Absolutely. If they were looking for real world applications, they should have started with porn.
    • by jim3e8 ( 458859 )
      `make -j4` is a compelling real-world reason to have 4 cores. SMP is not limited to multithreading.
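The parent's point, that SMP benefits independent processes just as much as threads, can be sketched with a quick Python stand-in for `make -j4`'s parallel compiler jobs. The workload function below is an illustrative placeholder for a compile, not a real one:

```python
# Four independent worker *processes* (analogous to make -j4's compiler jobs)
# use four cores without any shared-memory multithreading in the program itself.
from multiprocessing import Pool

def busy_sum(n):
    # Stand-in for one compile job: pure CPU work, no shared state.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:            # like make -j4: four jobs at once
        results = pool.map(busy_sum, [10**6] * 4)
    print(results == [busy_sum(10**6)] * 4)    # each job computed independently
```

Because the jobs are separate OS processes, the kernel's scheduler spreads them across cores; this is the sense in which SMP is not limited to multithreaded applications.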
    • by mandolin ( 7248 )
      "...the list of games that really use four cores is approximately zero."

      Do you play chess? Many chess engines will use as many cores as they can get (leaving out the fact that it could probably still beat you with just one).

    • That's the most interesting part of the article for me. Apart from 3-D rendering and folding@home, they are really pushed to find any real-world reason for having 4 cores.

      Just wait for real time raytracing. From what I've seen of OpenRT [openrt.de], it only takes about 8 cores to get photorealistic real time raytracing for complex models entirely with software. Combining that with the advanced shaders on the GPU should make for some very visually stunning games with perfect reflections, refractions, shadows, and sur
    • The current Windows Media Video 9/ 9 Advanced Profile codec is 4-way threaded.

      To get the current version, you need to install any of:

      Windows Media Player 11
      Windows Media Format SDK 11
      Windows Vista

  • by compwizrd ( 166184 ) on Friday March 23, 2007 @10:50AM (#18458601)
    I clearly need a dual or quad core system to handle the flash skyscraper banners on the sides.
  • Whoops, wrong site.
  • by FirstOne ( 193462 ) on Friday March 23, 2007 @11:05AM (#18458831) Homepage
    Most of these benchmarks are targeted towards unified caches (Intel).

    Meanwhile, real-world apps favor separate caches per core.
    (Where one user app isn't flushing cache entries of another app executing on a different core.)

    If they wanted to make it fair, they should execute n copies of each benchmark, compiled separately under different module names (no unified cache sharing).

    Next item: graphics & games. What are they really measuring?
        The ability of some device driver writer to take advantage of some esoteric CPU optimization?

    Last item: they disabled Cool'n'Quiet on the overclocked AMD configurations, so those results should never have been published. I.e., they're simulating certain AMD configurations and aren't testing the real thing.
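    The "n copies" methodology above can be sketched in a few lines of shell. This is a hypothetical illustration, not the article's setup: `sleep` stands in for a separately compiled benchmark binary, one copy per core.

```shell
#!/bin/sh
# Sketch: launch one benchmark copy per core in the background, then wait for
# all of them. With true per-core caches the copies shouldn't evict each other.
N=4
start=$(date +%s)
i=1
while [ "$i" -le "$N" ]; do
    sleep 1 &                 # stand-in for ./bench_copy_$i (hypothetical name)
    i=$((i + 1))
done
wait                          # copies ran concurrently: ~1s total, not N seconds
elapsed=$(( $(date +%s) - start ))
echo "ran $N copies in ${elapsed}s"
```

    Comparing that wall-clock time against running the copies sequentially would expose how much the shared cache (or lack of it) costs under contention.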
    • Huh? What kind of workload do you have that has multiple tasks consuming much CPU time?

    • Meanwhile real world apps favor separate caches per core. (Where one user app isn't flushing cache entries of another app executing on different core.)

      Real-world apps may not favor an integrated cache, but real-world workloads do. Why would I want to give my simple little 500K application a 2MB cache to run in? Why not give it 500K and let the other 3.5MB go to the larger application that is also running and needing cache space? That said, yes, I agree that in a case of an application needing 4MB cache,

    • Last item: they disabled Cool and Quiet on the overclocked AMD configurations, so it should never have been published.

      Pretty much any overclocker will tell you to disable Cool'n'Quiet if you plan on overclocking, mainly because C'n'Q can cause weird behaviours while attempting to overclock.

      Cool'n'Quiet downclocks your processor, the opposite of what you're trying to accomplish when you overclock. They don't really play very well together.
  • by MobyDisk ( 75490 ) on Friday March 23, 2007 @11:10AM (#18458895) Homepage
    AMD is producing chips using a 90nm process and moving to 65nm, while Intel is moving from 65nm to 45nm. It is very difficult to compete in design when you are working with something 4 times less dense. AMD has always been behind in this area (except when they were using IBM fabs, and they had copper interconnects before Intel).

    Simultaneously with this story, I see an announcement that Intel has announced another 45nm processor [engadget.com] for ultra low power consumption.
    • It is very difficult to compete in design when you are working with something 4 times less dense.

      And yet, AMD is just barely behind Intel in both performance and power consumption. The only exception is the very high end, where Intel has a quad core and AMD needs two dual-core SMP CPUs.

      In the article it's specifically mentioned that you can find certain AMD CPUs priced much cheaper than Intel counterparts. But the part that interests me much more is that AMD's low-end CPUs aren't crippled, where Intel

  • I went to read the article and after page 2 I am getting a 404 error.
  • by digitaldc ( 879047 ) * on Friday March 23, 2007 @11:20AM (#18459071)
    Summary: "I'd probably grab a Core 2 Duo E6400 and overclock it"

    Save your money and buy the cheaper Core 2 Duo. Then you can find out the Core 8 Octo will be released in a few weeks for about the same price.
    Oblivion looks amazing IMHO.
  • x2 4400 low end??? (Score:4, Insightful)

    by brennanw ( 5761 ) * on Friday March 23, 2007 @12:04PM (#18459735) Homepage Journal
    I don't even have a dual core chip. I guess that makes my computer non-existent... ...

    Egads. I've been looking forward to getting a single-core 3800 -- that would be an upgrade for me.
    • I don't even have a dual core chip. I guess that makes my computer non-existent... ...

      Egads. I've been looking forward to getting a single-core 3800 -- that would be an upgrade for me.

      I hear ya. I'm still running an AMD Athlon XP 2500+ (Barton core). No overclocking, 1GB of cheap generic DDR memory.
      I can't play any games [ASUS TNT GeForce 128MB video card (I think?)], but I can still run WinXP with Photoshop CS2 (that's all I use it for) in VMware on top of my Gentoo install like a dream.
      To anyon

    • by Deagol ( 323173 )
      No kidding!

      Last spring I fried my MB, so I grabbed the Asus A8V w/ an Athlon 64 3200+ (2GHz, though I have it clocked to 2.2GHz). I was thinking about putting in a 2nd GB of memory and topping out the processor this spring (tax return time!), which for the A8V is (I think) the X2 3800. Sheesh. Looks like I can't get out of "low end" now w/o upgrading the MB, too.

      What happened to the days of 3-year-old hardware being "low end"? :)

    • I also found it kind of funny that they consider the X2 4400 low-end, considering I have an X2 3800... Also, you can now get X2 3600s, which are even lower-end dual cores...

      PS. I love my X2 3800+, it doesn't feel very "low end" to me.
  • by rmdyer ( 267137 ) on Friday March 23, 2007 @12:09PM (#18459813)
    We've found after extensive testing that the Xeon line, the 5000 series, and specifically the 5150@2.66, are several percentage points slower than the Core 2 line of processors. The Core 2 Duo 2.66 is faster than the 5150 2.66 processor. So buying the Xeon processors apparently only gets you SMP capability for the higher price(?)

    • by rmdyer ( 267137 )
      Note(s) to my previous post...

      1. Most of the tests were conducted using 32-bit Windows XP Pro SP2; some were using Red Hat Linux (also 32-bit).

      2. The machines tested were the Dell Precision Workstation line vs. the business Optiplex series.

      3. Some of the software...

      ProEngineer Wildfire 2 and 3.
      Matlab
      Quake 4 Benchmarks.
      Spec tests for ProEngineer.

      The Xeons may in f
  • by hrieke ( 126185 ) on Friday March 23, 2007 @12:22PM (#18459961) Homepage
    What effect does the L2 cache on the Intel chips have on the numbers?

  • Not a good idea to do a price/performance comparison when prices and lineup are about to change.

    Intel will be releasing a few new CPUs and cutting prices on April 22. The E6320 and E6420, for example, are identical to their 6x00 counterparts except with 4MB of L2 cache. They'll go for $163 and $183, respectively.

    Benchmarks for next month's processors with price list:
    http://www.xbitlabs.com/articles/cpu/display/core2duo-e6420.html [xbitlabs.com]

    A 20-30% price cut [digitimes.com] is expected from AMD on April 9 [theinquirer.net].

    Even now the prices Tec
  • I have read somewhere that when moving from a 32-bit to a 64-bit arch, AMD CPUs gained about 15-20% whereas Core 2s gained 0-5% (apparently Intel skips some hardware "optimizations" in x64 code). I wonder, if you use a 64-bit machine for a (Unix, of course) server, is Opteron still a better deal than Core2/Xeon? I don't trust Microsoft's 64-bit implementation to be seriously good, of course. Any x64 server app benchmarks?

Two can Live as Cheaply as One for Half as Long. -- Howard Kandel

Working...