AMD

Athlon 64 3400+ Reviewed 245

SpinnerBait writes "Unlike the Athlon 64 FX-51, this new 3400+-rated processor has a 64-bit memory interface on its integrated memory controller, drops in at several hundred dollars less than an FX-51, and is also clocked at 2.2GHz. It gives a P4 3.2GHz Canterwood-based machine a run for its money too, as this review with benchmarks at HotHardware reports. And where is Prescott? Fortunately for AMD, it's a bit tardy to market, and this will give this new Athlon 64 speed bin time to take a firm hold."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Anonymous Coward
    "this will give this new Athlon 64 speed bin time to take a firm hold"

    What's a speed bin?
    • by Anonymous Coward on Tuesday January 06, 2004 @12:35PM (#7892134)

      Why is this moderated as 'funny' and not 'ignorant' :-P ?

      Processor makers 'bin' processors. That is, they try for the fastest speed, but if a chip doesn't make it, it gets 'binned' down the line and tried as a lower-speed chip. They can also 'bin' due to market reasons (putting hi-grade chips in the low-speed bin because of demand, etc.)

      • moderating "ignorant"?
        Maybe your ignorance could be moderated accordingly.

        Could be it was an honest question. Not everyone in the world uses English as their primary language. (including myself)

        And even then there could be problems. (see dunno-what-speakin'-Bush: "mis-underestimating")

        • Perhaps, if you knew what ignorance meant, you wouldn't use it in an incorrect way. The parent post of your comment knows what ignorance means, but it seems that you do not.

          Being ignorant isn't a bad thing. There are a lot of things that I am ignorant about. I would probably prefer to be ignorant of them, if I ever found them out.

          The word "ignorant" has been misused, typically within the race card. It probably started as "you are ignorant of what my people went through," but it somehow morphed into a

      • by Anonymous Coward
        Why is this moderated as 'funny' and not 'ignorant' :-P ?

        I'd hardly call not knowing what a speed bin is "ignorant". The poster didn't know, wasn't trying to be funny about it (AFAICT) and isn't responsible for the moderation it got.

        • I'd hardly call not knowing what a speed bin is "ignorant". The poster didn't know.

          That's what "ignorant" means. From Latin ignorare, "to be unaware [of sth], not to know". Antonym of scire, "to know, to be aware [of sth]". (cf. "science").

          • Then everyone is ignorant, since none of us knows everything. Any term applicable to all loses its differentiating power and becomes useless, so I'm guessing the grandparent post meant "ignorant" in a more limited, less pleasant way.
    • by hab136 ( 30884 ) on Tuesday January 06, 2004 @12:39PM (#7892180) Journal
      "this will give this new Athlon 64 speed bin time to take a firm hold"
      What's a speed bin?

      In case you're not trolling, chip manufacturers crank out one design of chip, test it, then put them into bins based on how fast they can run reliably. They probably don't actually use plastic bins, but you get the idea.

      Thus, a "speed bin" - a lot of chips designated to run at a certain speed, despite the fact that it's the same design and metal as a chip designated to run at a slower speed.

    • by Anonymous Coward on Tuesday January 06, 2004 @12:43PM (#7892214)
      Due to minor process variations (random faults in materials, equipment, background radiation etc.) every manufactured chip is different. Some chips work fine at higher speeds, some chips only work properly at lower speeds, some chips fail to work at all. Since these microprocessors are at the cutting edge of silicon process technology, the variation matters. Now, to sort out which chip works at which clock speed, the manufacturer has to test every chip and classify them accordingly. Some are sold as 2 GHz chips, some are sold as 1.8 GHz chips, and so on. These different grades are called "speed bins".
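The binning described in the posts above can be sketched as a toy model: test each die's maximum stable clock, then sell it under the highest grade it qualifies for. A hedged illustration in Python (the thresholds and grade names here are made up for illustration, not AMD's actual test criteria):

```python
# Toy model of speed binning: each die from the same wafer and design
# is tested for its maximum stable clock, then labeled with the
# fastest grade it qualifies for. Thresholds are illustrative only.

BINS = [(2.2, "Athlon 64 3400+"), (2.0, "Athlon 64 3200+"), (1.8, "Athlon 64 3000+")]

def assign_bin(max_stable_ghz):
    """Return the fastest grade this die qualifies for, or None if it fails all."""
    for threshold, grade in BINS:
        if max_stable_ghz >= threshold:
            return grade
    return None  # failed even the slowest bin: scrapped

# Same design, different luck with process variation:
for clock in (2.31, 2.05, 1.62):
    print(clock, "->", assign_bin(clock))
```

Market-driven binning (down-grading good chips to meet demand) would just mean labeling a die with a slower grade than `assign_bin` returns.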
    • OK, you lot seem to be way over-analysing this.
      It looks like a simple bloody typo:

      "this will give this new Athlon 64 speed bin time to take a firm hold" ..becomes..

      "this will give this new Athlon 64 speed in time to take a firm hold"

      I doubt it has anything to do with grading the processors or anything......
  • by thammoud ( 193905 ) on Tuesday January 06, 2004 @12:20PM (#7891935)
    The Itanium is too expensive and slow. Ditto Sparc. AMD 64 bit servers running 64bit Java VMs will make for a killer combination.
  • by McVeigh ( 145742 ) <seth&hollen,org> on Tuesday January 06, 2004 @12:20PM (#7891941) Homepage
    http://anandtech.com/cpu/showdoc.html?i=1941
  • by GeckoFood ( 585211 ) <geckofood AT gmail DOT com> on Tuesday January 06, 2004 @12:20PM (#7891948) Journal

    Found this little gem in the article:

    It kept our CPU running in the mid -40C range while gaming at default clock speeds.

    Last AMD I had ran hot enough to roast a turkey from 10 feet away. -40C would freeze it solid.

  • by Inoshiro ( 71693 ) on Tuesday January 06, 2004 @12:21PM (#7891953) Homepage
    "We found the heatsink to work quite well. It kept our CPU running in the mid -40C range while gaming at default clock speeds."

    If your CPU runs at -40C, you have something very special. I, for one, would be worried about condensation from water becoming ice on contact with the CPU at that temperature!
    • Tilda vs. minus (Score:4, Interesting)

      by crow ( 16139 ) on Tuesday January 06, 2004 @12:46PM (#7892247) Homepage Journal
      I expect they meant to use a tilda ('~') instead of a minus ('-'), so as to indicate "about" instead of "negative."

      The best a heatsink can ever hope for is to cool to the ambient air temperature, and we won't see anything approach that until we have superconducting heatsinks. (Imagine a large superconducting mass in the ground with a superconducting cable connecting it to the CPU to draw off heat: power outlets with a pin for cooling, superconducting traces on circuit boards for cooling, and no need for fans.)
      • I'm not trolling, just trying to educate.

        It's called a tilde:

        nick@marvin:~$ dict tilde
        3 definitions found

        From Webster's Revised Unabridged Dictionary (1913) [web1913]:

        Tilde \Til"de\, n. [Sp., fr. L. titulus a superscription, title,
        token, sign. See {Title}, n.]
        The accentual mark placed over n, and sometimes over l, in
        Spanish words [thus, [~n], [~l]], indicating that, in
        pronunciation, the sound of the following vowel is to be
        preceded by that of the initial, or consonantal,
      • Re:Tilda vs. minus (Score:5, Informative)

        by tiger99 ( 725715 ) on Tuesday January 06, 2004 @02:00PM (#7892973)
        It is not uncommon to use a Peltier Effect cooler. This is basically a huge stack of thermocouples run in reverse. You put in a lot of current at low voltage, and it produces a temperature differential. If you heatsink the "hot" side to ambient air, the "cold" side may be well below zero. But, it is not very efficient (like all cooling systems), so you need to put in several times as many watts as it extracts from its "cold" side, and not surprisingly the total of both appears at the "hot" side, so you may need a very big heatsink with powerful fans.

        I don't have the numbers in front of me right now, but at a guess you would need 300 watts to cool a 100 watt CPU, so would need to dissipate 400 watts to air.

        It is inadvisable to make any attempt to get the chip below zero; obviously ice formation will happen, and when you switch off, it will melt. Should you switch on again, disaster is quite probable, unless the PCB has a good conformal coating and the socket has an interfacial seal. The conformal coating can be dealt with quite easily, but I have never seen a sealed CPU socket. BTW I usually work as an avionics designer, where we have to make things that will run from well below zero to well above, so I do know the problems.

        Another issue is thermal fatigue. The temperature coefficient of expansion of silicon does not exactly match that of the (probably epoxy) package, so every temperature cycle causes a stress cycle, which causes a strain cycle, until something breaks. Same for the motherboard itself of course, if you should cool the whole thing. That is also a good reason never to overclock anything: apart from the possibility of getting subtle data errors and an increasingly buggy OS as a result of inadequate timing margins, you will definitely wear the thing out a lot quicker. Every 10 deg C roughly halves the life, or the number of on/off cycles it will survive, and if you do the calculations, the numbers are quite depressing for a modern PC.

        If you really want a thumping great 64 bit processor (I certainly do, when the price comes down!), it would be best to calculate the cooling system, and maybe do some tests with thermocouples etc, to try to get the CPU chip to settle down at a relatively safe temperature, say 40 deg C, without getting ice formation on the coldest parts. The clever bit would be to get it to power on and off without any excursions below room temperature (often 20 deg C) or above 40 deg C. Heat soak when you switch off the CPU would be minimal, the mass of the chip itself is very small, but cold soak from a huge peltier block could be a problem, the CPU could be dragged down to -40 deg when you switch off, which is exactly what you don't need, for a long and reliable life.

        The other thing to watch out for is that at low temperature the CPU internals will be out of spec. It is actually possible to get excessive current flow in some transistors, and local hot spots, because it is too cold. There may also be timing problems, data corruption, ...... It is not possible to test properly that these things are not happening. It takes a smallish time to fully exercise an 8-bit processor to verify all possible data and instruction operations, but somewhere near the lifetime of the universe to do the same for 16 bits. Throw in onboard cache, 64 bits, etc, and it is just impossible. These things work statistically; AMD know how timing, for example, might vary across the chip, and allow sufficient margin within the published clock frequency and temperature range, but deviate from these ratings and this is no longer true. If running at 1GHz, a 1 in 10^12 error rate would corrupt your data or OS code within 1000 seconds, just over 1/4 hour. The error rate required to run an OS for weeks at a time at several GHz defies all attempts at testing. Again, a very good deterrent to overclocking (BTW my non-overclocked ancient K6-350 had a meltdown due to fan failure, and as it died the corrupt Win XP blew away all the passwords so when I got a new CPU and fan, any of th
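For what it's worth, the round numbers in the post above check out. A quick sanity-check in Python (the 3-watts-in-per-watt-pumped figure is the poster's guess, not a Peltier datasheet value):

```python
# Peltier bookkeeping from the post: an inefficient cooler needs
# roughly 3 W of electrical input per watt of heat pumped, and the
# hot side must shed both the CPU's heat and the input power.
cpu_watts = 100
input_watts = cpu_watts * 3          # assumed ~3x input per watt pumped
hot_side_watts = cpu_watts + input_watts
print("hot side must dissipate:", hot_side_watts, "W")    # 400 W

# Error-rate arithmetic: at 1 GHz, one error per 10^12 cycles means
# the first corruption lands after 10^12 / 10^9 = 1000 seconds.
clock_hz = 1e9
cycles_per_error = 1e12
seconds_to_first_error = cycles_per_error / clock_hz
print("seconds to first error:", seconds_to_first_error)  # 1000.0
```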

  • by cK-Gunslinger ( 443452 ) on Tuesday January 06, 2004 @12:21PM (#7891961) Journal

    Anandtech [anandtech.com]

    Looks like a winner to me!

  • by Shisha ( 145964 ) on Tuesday January 06, 2004 @12:24PM (#7892003) Homepage
    Well, not that I'm buying one anytime soon, but it's nice to know that once I buy one, I'll get a Linux distro that is compiled & optimized for a 64-bit CPU. So for me only Mathematica will run in 32-bit (slower) mode. But Gimp, mplayer, video editing apps, hell, even twm and xclock, will be compiled for 64-bit CPUs.

    I was wondering how this is going to be sorted out by application vendors on PCs. Are they going to release 64-bit and 32-bit versions? Is every CD going to contain both? What about 3rd-party plugins? I've been asking the same question actually about Apple's G5, but www.apple.com (and I didn't search too carefully) is a bit short on nasty details like this. Is it really worth getting a 64-bit machine without planning to use Linux?
    • Is it worth getting _any_ machine without planning to use Linux?
    • Is it really worth getting a 64bit machine without planning to use Linux?

      Well, if you want the current top-of-the-line 32-bit performance, why not? That's a bit like asking "Should I get this Super Duty Dodge Ram with the best towing capacity available today, even though it includes an extra cup holder I might never use?" If it has what you need for a reasonable price, why question the extras you might never use. It's not like the 64-bit-ness is truly "wasted" just because you might not use it. Those extra

      • "Well, if you want the current top of the line 32-bit performance, why not?"

        I haven't checked, but won't a P4 system give me better "speed per dollar"?

        For me personally, I couldn't care less about speed. With me the "weakest link" is usually my brain, or occasionally the internet connection.

        What I would care about more is a silent and small (think book sized) system. When I say _silent_ (not just almost silent), I mean that it won't need a CPU fan, no power source fan and that it would be based around a

        • Well, the Anandtech review has several charts showing price/performance ratios for different scenarios. In every one of them, the Athlon 64 3400+ and P4 2.8C take first and second places. So I would guess that, depending on your budget, either of these will give you the best "speed per dollar." Although, I would like to see some of the Athlon XPs compared for reference. I would be willing to bet that the Athlon XP 2600+ 333MHz/512K Barton CPU for ~$95 would rank pretty high on that chart, simply bec
        • When I say _silent_ (not just almost silent), I mean that it won't need a CPU fan, no power source fan and that it would be based around a 1GB compact flash card. I would quite like a decent (not great) graphics card. And a 1 gig ethernet port.

          You can get by without a CPU fan (see Via as others state). Good luck on the no PSU fan. And basing it on a CF card? With a 1 Gb ethernet port? Why? The card can't possibly keep up to the port (particularly for writes, which is another issue -- if you put any kind o
        • You are talking about price/performance -- or getting the most bang for the buck. Since the release of the original AMD 386 clone chip, AMD has had the best price/performance ratio. IOW, the AMD chips will always give you the most speed per dollar spent.

          As for wanting a super-cool, super-quiet CPU, that has decent performance, you'll want to check into the Transmeta TM5800 and TM5900 series chips, or the VIA C3 chips. Both of these can run without fans, and are clocked up around the 1GHz mark
        • won't a P4 system give me better "speed per dollar"?

          Actually, an AthlonXP will give the best speed per dollar, since it gets more done in a clock cycle. It's actually pretty close between Athlon and Pentium, but if you add in the cost of the electricity over the life of the computer, the AthlonXP will win.

          What I would care about more is a silent and small (think book sized) system. When I say _silent_ (not just almost silent), I mean that it won't need a CPU fan, no power source fan and that it would b
    • by discstickers ( 547062 ) <chris AT discstickers DOT com> on Tuesday January 06, 2004 @12:41PM (#7892198) Homepage
      Actually, Apple's in a good position here. Mach-O allows "fat" binaries (i.e., more than one architecture's binary in a single application). So both versions can be distributed together.

      They did a similar thing during the transition to PowerPC.
    • by Espen ( 96293 ) on Tuesday January 06, 2004 @12:46PM (#7892239)
      Apple proposes: "Packaging your Optimizations:

      Code that has been optimized for the G5 by simple re-compilation will run without penalty on a G4. If you have done more in-depth, G5-specific tuning (levels 1, 2 and 3) then you will in all likelihood want to provide a separate binary. In extreme cases, you may decide that you need only offer one version of your software that runs on Power Mac G5 computers only. However, you'll probably want to support most or all of the Macintosh product line, which means that you need to decide how best to deliver the right code to each of your customers. There are several ways to achieve this; the first is:

      Create different versions of your software for each processor that you support. This requires that you maintain three parallel code bases, something you may not want to do.

      It is possible for your software to query the computer on which it is running to see which processor-related features are available. You can design your software to isolate processor-dependent code and call the appropriate version as needed. This leads to two additional strategies for packaging your application:

      For every function that calls processor-dependent binary code, have your code call the appropriate version. If such functions are needed frequently, using this approach may decrease execution speed and make your source code (cluttered with if...then constructs) less readable.

      Isolate processor-specific functions into frameworks or shared libraries, then have your software load the appropriate version when it starts up. This enables you to write your main code without wrapping function calls in if...then constructs."

      (from G5 Optimization [apple.com])
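Apple's last two strategies boil down to probing the processor once and binding the right implementation up front, so hot code paths carry no per-call if...then checks. A minimal sketch of the idea in Python (the feature probe is faked here; a real application would ask the OS which processor it is running on):

```python
# Strategy: isolate processor-specific variants and bind the right
# one once at startup, so callers never branch on CPU features.

def dot_scalar(a, b):
    """Portable fallback implementation."""
    return sum(x * y for x, y in zip(a, b))

def dot_vectorized(a, b):
    """Stand-in for a G5/AltiVec-tuned version (same result here)."""
    return sum(x * y for x, y in zip(a, b))

def detect_vector_unit():
    """Pretend feature probe; a real app would query the OS/CPU."""
    return True

# Bound once at startup; hot paths just call `dot`.
dot = dot_vectorized if detect_vector_unit() else dot_scalar

print(dot([1, 2, 3], [4, 5, 6]))  # 32
```

The shared-library approach Apple describes is the same idea one level up: load the whole tuned library at startup instead of rebinding individual functions.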

    • Likely, vendors will only ship one version of their app, the 32-bit version, since AMD64 CPUs can run 32-bit and 64-bit. If the app doesn't need or greatly benefit from 64-bit, why bother with it? (The code will likely need porting to 64-bit anyway, especially by average Windows programmers who've never had to worry about different CPU architectures before.)

      Apps that greatly benefit from 64-bit support may either be 64-bit only, or provide both versions. I recall reading that the UT developers plan on r
      • Actually, a simple recompile could provide additional performance, as the AMD-64 has a host of additional registers over and above the standard x86 set.
          • Unfortunately, it doesn't work that way. You can't recompile and get just the extra registers. You have to take it all, which includes changes such as the sizes of certain data types. The software will likely need modifications for this if the developers never intended it to run on non-32-bit systems.
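The "sizes of certain data types" point is the usual 32-to-64-bit porting story: under the LP64 model used by 64-bit Unix systems (64-bit Windows uses LLP64, where long stays 32-bit), int keeps its width while long and pointers grow. A sketch of the commonly cited widths, in bytes:

```python
# Widths in bytes under the two models most relevant to an AMD64 port.
# ILP32 = typical 32-bit x86 ABI; LP64 = typical 64-bit Unix ABI.
WIDTHS = {
    "int":     {"ILP32": 4, "LP64": 4},
    "long":    {"ILP32": 4, "LP64": 8},
    "pointer": {"ILP32": 4, "LP64": 8},
}

# Code that assumed sizeof(long) == sizeof(int), or stored a pointer
# in an int, "worked" on ILP32 only by coincidence of equal sizes;
# on LP64 those assumptions break.
for name, w in WIDTHS.items():
    grew = w["LP64"] > w["ILP32"]
    print(f"{name}: {w['ILP32']} -> {w['LP64']}" + ("  (grows)" if grew else ""))
```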
    • Well, the nice thing about AMD's 64-bit thing is that, unlike ia64, it's even quite capable as a 32-bit processor.

      So maybe think of it as getting a very good 32-bit processor, with room to grow as more 64-bit binaries become available?
    • I was wondering how is this going to be sorted out by application vendors on PCs?
      They'll recompile and then sell it to you as an upgrade, as always.
    • Simple: applications that benefit from being compiled for AMD64 will be. Applications that don't, won't. That's the beauty of backward compatibility.
  • by ViolentGreen ( 704134 ) on Tuesday January 06, 2004 @12:25PM (#7892018)
    Looks pretty good. I still don't think there is a huge demand to have these in desktops as of yet. P4s are still very powerful and still compete with AMD's 64-bit chips. Even the Athlons are enough for most people to play the newest games and all.

    I don't think that most people do the really compute-intensive tasks that would benefit from 64-bit chips, plus the lack of truly 64-bit software that would give them this advantage is a hindrance as well.

    I think it will be 2005 or maybe even 2006 before 64 bit chips become the standard.
    • "I don't think that most people do the really computer intensive tasks that would benefit from 64bit chips"

      Everyone would benefit from switching because of the extra registers in 64-bit mode and the low-latency memory controller. Some people have said they got a 10-20% speedup just from recompiling in 64-bit mode without making any changes to their code.

      Of course if all you do is run Word all day that will make little difference... but if all you do is run Word all day you'd probably be happy with a Penti
      • Everyone would benefit from switching because of the extra registers in 64-bit mode and the low-latency memory controller. Some people have said they got a 10-20% speedup just from recompiling in 64-bit mode without making any changes to their code.

        Seriously though, how many "regular" computer users have access to the source code for their applications or would even know what to do with it if they did.

        This is great for universities and research facilities that use either their own software or open so
        • "Seriously though, how many "regular" computer users have access to the source code for their applications or would even know what to do with it if they did."

          Uh, that's irrelevant, as people will buy 64-bit versions of software for their 64-bit PC... and that software will run faster just because it's been recompiled with a 64-bit compiler that doesn't waste half the time copying data between registers and memory.

          "But before it becomes really mainstream, you are going to have to have the 64-bit windows (n
        • Seriously though, how many "regular" computer users have access to the source code for their applications or would even know what to do with it if they did.

          There is a vast horde of us Gentoo users who are laughing like hell at you right now.

          This is the nicest thing about distributing applications as source. Who cares what architecture I run on, if it'll compile it'll be optimized and work on my local hardware! Itanium, x86-64, x86, any will do. I'll just 'emerge -ev world' and everything will rec
          • There is a vast horde of us Gentoo users who are laughing like hell at you right now.

            I use Gentoo myself. However, the vast majority of computer users in the world use Windows. That's what I was referring to by saying "regular" computer users.
        • The reason AMD built a 64-bit chip: you have to understand the changes going on in the server markets. In short, x86 caught up with proprietary architectures. Realize that it probably takes about $2 billion to develop a processor architecture (I'm probably low here, but you get the idea). Let's assume that 15 million server processors are sold annually. Sun ships about 300k servers a quarter and has 1/3 of the market, most of them at the lower end (ie lots more 1-4 processor systems than 72 processo
    • Yeah but, while all of that may be true, I still want one NOW!
    • On most of the roads in the nation, the speed limit is either 55MPH or 65MPH. Some places out West on the Interstates, it's 75MPH. Even a 100MPH speedometer is WAY overdesigned, well past short-term bursts for passing, accident avoidance, and the like.

      So why do we have speedometers that go up so high, and why can many cars actually go that fast? After all, it's illegal, and we don't NEED that speed, or speedometer.

      Perhaps we really do - about as much as a 64-bit processor.
      • I take it you're not old enough to remember the stupid 85mph speedometers which used to be fitted to most (all?) American cars?
      • That's the reason I drive a Nissan and not a Porsche. Well, one of them at least...

        I can see the relation but I'm not sure of your point.
        • I can see I need a bulk response here. You get it, for everyone.

          My point wasn't to be taken too literally. I was trying to say that most (but not all) of us will never drive over 100MPH. Most (but not all) of us have no need for a speedometer that goes to 120 or 160 MPH, but we all have them. That's not to say that our speedometers shouldn't have some margin above the top US speed limit of 75MPH, to debunk the Apollo analogy, a little. We need some margin, just not 100% margin. (It may be that Montana is b
      • Ugh. There used to be a law in the US that speedometers had to top off at 85 mph. My car ('93 Explorer) has that, and it's damnably unsafe. Most of the time, I have no fucking clue how fast I'm going, other than "as fast as the cars around me." I've taken to using a GPS on I-280, just to know if I'm going under 100 (at which point the cops might care, since I'm obstructing traffic going so slow) or not.
    • I still don't think there is a huge demand to have these in desktops as of yet.

      There is a HUGE demand for these desktop chips. AMD has pretty much sold out [yahoo.com] of them.

      Your point about people not really needing these processors is valid, but the demand is there.

    • by Lumpy ( 12016 ) on Tuesday January 06, 2004 @02:44PM (#7893430) Homepage
      I still don't think there is a huge demand to have these in desktops as of yet.

      for the casual home ding-dong user? You are right.

      for business and companies that depend on processing power? Ha! The AMD 64s rock massively.

      I replaced a dual Xeon box here at work (the CGI station running Blender, POV-Ray and YafRay, producing better graphics than a Maya station next to it, and faster) with a dual Opteron using a 64-bit compiled Gentoo install on it...

      Then we recompiled the apps for 64 bit.

      I am getting a 70% increase in rendering speed. I'm betting that with some optimization this could work even better... the Blender guys are working on that right now BTW...

      a 64bit linux version of Maya? the company said "maybe 4Q 2004 for beta testing"

      which is a shining example of why open source is the way to go.

      Businesses that use the number crunching and processing power, and are smart enough to have embraced Linux for the needs it can fill, are all over AMD64 right now.
      • I used to work at an architecture/graphics firm that's been running 64-bit chips since at least 1998: DEC (later Compaq, then HP) Alpha chips running at 500MHz when the PIIs were brand new at 350MHz.

        In fact they had 27 quad 500MHz Alpha machines with 4GB of RAM each configured into a renderfarm, and one-on-one they still beat the dual 2GHz Xeon boxes which replaced them last year.

        However, they condensed a 1500 sqr foot room full of servers and wires into two, and now getting a third, IBM blade uni

  • 9 more reviews here (Score:3, Informative)

    by whovian ( 107062 ) on Tuesday January 06, 2004 @12:26PM (#7892028)
    http://amdmb.com/#News-7458 or linkified [amdmb.com].

    Reviewed by amdmb, HotHardware, Neoseeker, CPU Performance, Tech Report, Hardcoreware, Hardocp, Hexus, X-Bit Labs.
  • Video encoding? (Score:5, Interesting)

    by Lussarn ( 105276 ) on Tuesday January 06, 2004 @12:28PM (#7892060)
    Has anyone tried to encode XviD with one of these in 32-bit and 64-bit modes, preferably using Linux? Is there much difference in speed? I'm looking at the 3000+ part as it is cheap, but there are zero benchmarks to back it up in 64-bit mode.
    • Could one assume that XviD benchmarks would scale alongside the DivX benchmarks that are included with the article? I know it's not the same, but it would be a good place to start.
  • They mainly focused on the price/performance ratio, as it truly is a killer to everything out there.

    Link to Anandtech Article [anandtech.com]

    Basically, they are predicting the death of the AMD FX-51, as the 3400+ has equal or better performance and is MUCH cheaper.
  • by eddy ( 18759 ) on Tuesday January 06, 2004 @12:32PM (#7892099) Homepage Journal

    Please add links to any reviews that run 64-bit linux (or other 64-bit OS of choice) with 64-bit benchmarks on said processor:

    fineprint: I don't need a lecture on the nature of 64-bitness.

  • by ruiner5000 ( 241452 ) on Tuesday January 06, 2004 @12:36PM (#7892147) Homepage
    Guys we have all the reviews listed on our main page [amdzone.com], and I'm adding more as they come in. It currently totals at 19. Does Hothardware pay Slashdot for these links? ;)

    • Damn, you posted something that doesn't bash intel and/or apple.

      I am in complete and total shock.

      For those that do not know, AMDZONE used to be a great site for info on AMD. In the past 3-6 months it has become a bash Intel/Apple site.

      Note: I am typing this from a dual Athlon, and most of the other systems in my lab are AMD based. So do not call me an Intel/Apple fanboy.
  • Fair Comparisons? (Score:3, Informative)

    by Aaron England ( 681534 ) on Tuesday January 06, 2004 @12:38PM (#7892166)
    If they are comparing the $700 [newegg.com] AMD 64-FX chip, they should be comparing it to Intel's $1000 [newegg.com] P4 3.2 EE chip, not their sub-$400 P4 3.2.

    Also, does anyone have an idea how expensive the AMD 3400+ chips are? Because the AMD 3200+ chips are $400 [newegg.com] retail. The article quoted a price in thousand-unit quantities, but I was wondering how much it would cost for just one. Because if it's pricey enough, the P4 3.2 may beat out the 3400+ dollar for dollar.

    Though Intel doesn't have to really worry about that title. At $164 [newegg.com] the Pentium P4C smokes the pants off any AMD processor in its price range. At least, after overclocking it to 3 GHz, which is very doable even with standard cooling.

    • Check out the anandtech article [anandtech.com] from several posts above.

      It's compared there to the EE chip. I believe I remember hearing that the price for the 3400+ is less than the FX chip's, and it outperforms it as well.
    • Re:Fair Comparisons? (Score:2, Interesting)

      by EmagGeek ( 574360 )
      "Though Intel doesn't have to really worry about that title. At $164 the Pentium P4C smokes the pants off any AMD processor in its price range. At least, after overclocking it to 3 GHz, which is very doable even with standard cooling."

      Will it really be cheaper and faster when you have to buy a new one every 6-12 months because you destroy it?
      • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday January 06, 2004 @02:13PM (#7893089) Homepage Journal
        Will it really be cheaper and faster when you have to buy a new one every 6-12 months because you destroy it?

        Indeed the word "overclock" has become my hardware review spam filter; it has a strong "cold fusion" connotation. If I drop $700 on a CPU, I will not be running it out of spec in any way. If I'm that hungry for speed, I'll build a cluster.

        I can understand people wanting to overclock, say, a P-III 933 to see how far they can push it, but I just don't get the fanboy fascination with extreme cooling, adding a few megahertz, etc. Reading this stuff in a tech article is like finding an article on adding a whaletail to a ricer in Car&Driver - it just doesn't belong in a serious text.

        • You are obviously not a hardware enthusiast (this may sound like a flame but I just feel too strongly about this)... You can easily overclock for economical reasons without sacrificing CPU life. I've done this for years and continue to do so. As long as you know what you're doing and have proper data from research (from reviews like this), you can take a sub-$100 processor and have it run like one over $400, for years. The best part is that if you have the right components and you know how to tune
        • BTW, my cooling wasn't extreme... just a very large heatsink with a good large surface area and a 90mm fan (the larger the fan, the lower the RPMs needed to move air and the lower the noise). Simple air cooling, not water, Peltier, or phase change... 50% faster and about 15 degrees cooler than most processors running on spec (the heatsinks AMD ships are extremely weak... this is one thing Intel has them totally beat at), as far as AMD goes.
    • In lots of 1000 [amd.com]:
      Athlon 64 FX-51: $733
      Athlon 64 3400+: $417
      Athlon 64 3200+: $278

      So at retail, using your $400/3200+ as the mark-up ratio, that should be something like $1050, $600 and $400, respectively.

      Also, on the P4/Athlon war I haven't checked lately since I'm happy with the XP2000+ I have, but at the price range I've been at AMD has come out on top for my last three processors (Duron 700, Athlon 1200 and the above mentioned XP2000+). Maybe the P4C is different, right now I really don't care though :)

      Kjella
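Kjella's extrapolation holds up: applying the same retail-to-1K-lot ratio across the line gives roughly the figures quoted. A back-of-the-envelope sketch, using the prices from the posts above:

```python
# 1K-lot prices from AMD, plus the observed ~$400 retail price of the 3200+.
lot_prices = {"FX-51": 733, "3400+": 417, "3200+": 278}
retail_3200 = 400

markup = retail_3200 / lot_prices["3200+"]   # about 1.44x

estimated_retail = {chip: round(lot * markup) for chip, lot in lot_prices.items()}
print(estimated_retail)   # FX-51 ~ $1055, 3400+ ~ $600, 3200+ ~ $400
```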
  • Any piece of hardware that can spank the competition EVEN while its potential isn't fully being realized by the software testing it deserves my dollar.
    And yes, I'm talking about how well it games; I really couldn't give a flying fsck about how quickly it runs Office...
  • by Masarand ( 598211 ) on Tuesday January 06, 2004 @12:53PM (#7892321)
    The main point of the Athlon 64 and Opteron is that they are 64-bit CPUs that can run 64-bit applications.

    The fact that they can run 32-bit apps under a 32-bit OS at pretty much the same speed as a 32-bit CPU is surely a huge yawn (but great for backward compatibility).

    Has anyone seen any comparative benchmarks under a 64-bit Linux system?

  • You may also want to take a look at this review [techreport.com] at our good old Tech Report [techreport.com].
  • AMD's chance (Score:4, Insightful)

    by G3ckoG33k ( 647276 ) on Tuesday January 06, 2004 @01:11PM (#7892505)
    I wouldn't be too surprised if AMD chose to withhold faster versions of the Athlon64 FX-series until any Prescott is just about to be released. A day before or so. Leap-frogging at its finest.
  • Pentium Vs. P4 (Score:5, Insightful)

    by wowbagger ( 69688 ) on Tuesday January 06, 2004 @01:22PM (#7892623) Homepage Journal
    <sarcasm>
    Well, in all my testing of running 16 bit apps, a Pentium I outran a similarly clocked P4 by a healthy margin - so obviously the Pentium is a better chip, right?
    </sarcasm>

    Seriously - For a period of time the A64 will be running mostly 32 bit apps (at least in the Windows world), and so it is fair to benchmark its performance against 32 bit apps. But I cannot help but wonder how much P4 tweaking all those apps had, and how much A64 tweaking they did not have.

    Also, the memory performance tests are, to my mind, somewhat questionable as well, as different CPUs even within the Pentium line have different memory access behavior - code that will be bus limited on a P4 might not be bus limited on a P3.

    I am not saying the comparisons are not useful, but I am saying that they don't tell the whole story. Let us see some benchmarks wherein the A64 is running code that is written for the A64 - using the extra registers and so on.
  • by Jeremy Erwin ( 2054 ) on Tuesday January 06, 2004 @02:04PM (#7893018) Journal
    The reviews are all the same--run various permutations of the PC through benchmarks, and display the results using bar charts. And not just any bar charts. Use a gradient to color the bar, so that the color legend is rendered useless.

    The reviewers should read Tufte, and figure out a more effective way of illustrating their analyses than endless pages of bar charts. Oh wait, that's how they get their ad revenue. Never mind.
