First Benchmarks of AMD Hammer Prototype
porciletto writes "As seen on Ace's Hardware, this article features Quake 3 benchmarks comparing an 800 MHz ClawHammer sample to Athlon MPs at 800 MHz and 1667 MHz, as well as a Willamette Pentium 4 (256 KB L2, 400 MHz FSB) at 800 MHz and 1600 MHz. The benchmark results indicate a 40% performance increase over an Athlon MP for the ClawHammer. Additionally, the 800 MHz ClawHammer manages to tie (actually outperform by 1 FPS) the 1600 MHz Willamette Pentium 4."
It would be more interesting if... (Score:5, Interesting)
Re:It would be more interesting if... (Score:3, Insightful)
How many registers are there?
I've only programmed at the assembly level on MIPS (32 regs) and x86 (whoa - registers gone awry).
In short, the MIPS was fun and an excellent "beginner's processor" to try out your noob assembly skills on. The x86 was a nightmare.
Re:It would be more interesting if... (Score:2, Informative)
Re:It would be more interesting if... (Score:5, Informative)
Found this PDF document [x86-64.org] to be very interesting, with tons of info about the Hammer. So interesting that I felt the need to post it here.
Regarding registers, it shows that it not only has twice as many "standard"/GP registers, each twice as wide, but also twice as many SSE/SSE2 128-bit registers.
So it seems to total 16 x 128-bit registers and 16 x 64-bit registers (plus the 8 x 80-bit regs for x87 floating point ops).
Yeah, and a widened program counter register too.
Re:It would be more interesting if... (Score:2)
Re:It would be more interesting if... (Score:2)
Intel has a Big Problem (Score:5, Insightful)
Expect a massive FUD attack from Intel in the coming months as they try to convince the world that their chips aren't really inferior to those from AMD.
Re:Intel has a Big Problem (Score:5, Informative)
It makes more sense for AMD to spend their time building a 64-bit x86 processor than a completely new architecture at the moment. But that doesn't mean we wouldn't all benefit greatly from dropping x86. Of course this can't be an overnight change, but it does need to happen.
Eventually you have to break backwards compatibility to move forward without making things ugly. x86 is old, it is overly complex, it is inefficient in many respects, and it is time to say goodbye. There is a reason the original designers only expected it to be a 3-5 year temporary solution.
Not the best architecture (Score:5, Informative)
But if Intel was going to supersede a messy architecture like x86, I wish they'd done something better than IA64. While the jury is still out on the merits of IA64, it has some of the marks of Internal Politics on it. It sounds like a VLIW camp inside Intel sold some management on a renamed version of the basic approach, and the project gathered Corporate Inertia.
At the same time, it doesn't sound as if all of the VLIW problems have been solved on the compiler side, so it's not clear that IA64 is doing any more than a clean, modern architecture capable of OOO execution could have done.
With the Hammer series, I'm reminded of (and hoping for) the phenomenon described in "The Soul of a New Machine", where they managed to clean up and extend the old architecture at the same time. By the time they were done, the old architecture was an ugly wart on the side of a new, clean one. The fear was that the new would be an uglier wart on the side of an already ugly one, and they avoided that.
I don't know enough about Hammer to know which case this is. I have the documents, but haven't made time to read them. I've also heard some rumblings that some of the performance improvements to IA64 involve de-purifying its VLIW to pick up OOO techniques. I've heard that VLIW was an attempt to sidestep OOO because those problems were feared, but in the meantime the industry has learned how to do OOO pretty well.
Re:Not the best architecture (Score:3, Funny)
*sigh*
I wish we'd just start calling these data types what they are - int16, int32, int64, float64, etc. It could save us all so much confusion. I mean, what are they going to call it when chips move to 512-bit? Uber Turbo Fantastically-Amazing Super Very Long Instruction Word?
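For what it's worth, C99 already standardized exactly these names via <stdint.h> (int16_t, int32_t, int64_t and friends). A minimal sketch, just to show the idea - none of this is Hammer-specific, and the values are made up:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int16_t a = 1000;           /* exactly 16 bits, everywhere */
        int32_t b = 100000;         /* exactly 32 bits */
        int64_t c = 10000000000LL;  /* exactly 64 bits - no "is long 32 or 64 bits here?" guessing */
        double  d = 1.0 / 3.0;      /* 64-bit IEEE double */

        printf("%d %ld %lld %.17g\n", (int)a, (long)b, (long long)c, d);
        return 0;
    }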
Re:Not the best architecture (Score:4, Funny)
Re:Not the best architecture (Score:3, Informative)
-- No segmentation
-- Can address bottom 8 bits of every GP register (in other words, GP registers are truly general purpose now)
-- Some stupid instructions removed (e.g., BCD ops)
-- Recommend using SSE2 instead of the x87 horror
In addition you get the nice extensions of long mode:
-- 16 GP registers
-- 16 SSE2 registers
-- 64-bit ALU and memory ops
-- IP-relative addressing mode
If you look at long mode and ask, "what's really horrible about this?", I would only point to the instruction encoding and a large number of remaining wacko instructions. But together these give the x86 a performance advantage it has always had over other designs: small code size, and therefore better memory system performance for the instruction stream.
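To make the "use SSE2 instead of x87" recommendation concrete, here's a minimal sketch using the SSE2 scalar intrinsics from <emmintrin.h>. The intrinsic names are Intel's documented ones; the little add_sse2() helper and the test values are mine, purely for illustration, not anything from the Hammer docs. (As far as I know, compilers targeting long mode emit SSE2 for plain double math by default anyway; the intrinsics just make it explicit.)

    #include <emmintrin.h>   /* SSE2 intrinsics */
    #include <stdio.h>

    /* Scalar double add done in an XMM register (the ADDSD instruction)
       rather than on the 80-bit x87 stack. */
    static double add_sse2(double a, double b)
    {
        __m128d va = _mm_set_sd(a);        /* load a into the low half of an XMM reg */
        __m128d vb = _mm_set_sd(b);
        __m128d vr = _mm_add_sd(va, vb);   /* adds the low doubles; upper half comes from va */
        double r;
        _mm_store_sd(&r, vr);              /* store the low double back out */
        return r;
    }

    int main(void)
    {
        printf("%f\n", add_sse2(1.5, 2.25));
        return 0;
    }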
Intel should have bought Alpha years earlier ... (Score:5, Informative)
As a fellow ECE, I'll give Intel a mark in the "innovative" column on IA-64. But the concepts of predication, EPIC and compile-time optimization were NOT good enough to make the new architecture competitive, even before considering x86 compatibility. And Intel needs to be smacked for all those stupid extensions -- it's funny to see AMD accommodating them with less effort than Intel.
Alpha has always been the "64-bit RISC of RISCs", and they had binary translation technology c/o FX!32, so Linux/x86, NT/x86 and VMS/VAX apps could run on Linux/Alpha, NT/Alpha and VMS/Alpha, respectively. It was not only original; binary translation on the same OS but a different architecture works far better for software compatibility than general (any-OS) architectural compatibility in hardware/microcode! An Alpha 364 at 0.13um would be kicking IA-64 butt. I mean, 3-year-old Alpha 264 0.25um processors beat IA-64 at the same clock speeds!
Anyhoo, as a fellow EE/ECE, please read this post I made a few weeks ago and let me know what you think. It is entitled "How AMD and its partners are putting x86 back on the right track ... " [matrixlist.com]. IA-64 was an idealistic and novel concept, but one that doesn't hold up so well in reality, where good branch prediction beats predication and run-time optimization is just as important as compile-time optimization. The Alpha 364 team predicted the "problems" with IA-64, and they came true.
Re:Intel has a Big Problem (Score:3, Insightful)
Everybody has been saying that for twenty years.
Twenty years. It is far too late for x86ers to worry about "making things ugly". That sacrifice was made in the early 80s. And it paid off.
The reason Intel is still in business is that they knew what drove the market. Superior (in performance, power use, and just plain elegance) alternatives were around all along, but x86 still got all the sales. The reason for this is that the strongest market force is the need for good compatibility with The Legacy. Against this force, all other considerations are irrelevant.
That's why Intel survived (flourished) in the 90s, and why AMD is about to kick their ass. AMD's embarrassing toadying to this principle in the Hammer design shows that they understand. Intel's attempt to raise the bar shows that they have forgotten. Intel's chip is going to be the next 68k or PPC or SPARC. It'll be a niche, where everyone says how neato it is, and yet few actually use it. And in the meantime, AMD will be selling gazillions of Hammers.
Re:Intel has a Big Problem (Score:4, Insightful)
Think back to Rambus. (Back?) Intel got a lot of options on Rambus stock, provided that Intel could ship n percent of systems using Rambus memory. If Intel had no significant chipset competition, this would be easy. But it turned out there was enough competition to give people a choice of chipsets, and hence memory technologies.
Still, the P4 seems consciously designed to play to Rambus strengths. It chews memory bandwidth like candy through prefetching, which helps cover the higher Rambus latencies. I think Intel took a performance hit relative to AMD when the market preferred DDR SDRAM.
Anyway, it's a great story for Intel if they could control the future of PC technology. Rambus gets rich, Intel gets rich, you pay more. Three cheers for AMD for breaking this.
IA64 now looks similar. If it wasn't for the aura of inevitability associated with the Itanic, nobody would be particularly thrilled with it. The initial SPECint numbers, where it barely kept up with a SPARC, were the first practical warning -- if you don't count the schedule slips.
If IA64 was inevitable, everybody would have to pay up to transition to it. If it was the banner Win64 platform, a lot of places would be buying them regardless of relative price/performance. But because it looks like AMD will eat IA64 from the low end, and with POWER4 staring down from the high end, there's no longer an obvious niche where IA64 dominance is inevitable.
Four cheers for AMD.
Re:Intel has a Big Problem (Score:5, Funny)
AMD design engineers run into an Intel strategy exec at a conference. Intel guy says:
Re:Intel has a Big Problem (Score:3, Insightful)
Right, and what's interesting is that from a pure geek perspective, Intel did the right thing - AMD did not.
People have been griping about CISC and Intel's grotesque manifestations of x86 for years now. So they finally do the right thing and sit down with HP to spend a couple of years hammering out a brand new design. And what do they get from the geeks? Nothing but boos and hisses. You guys should be ashamed of yourselves. Did you really want a Pentium V, VI, etc.?
I'm glad Intel finally quit x86 cold turkey. AMD may have bought themselves a little time with the Opteron, but the sooner we're all off x86 the better.
Oh, and don't think that IA64 won't be looking MUCH better once we start seeing properly optimized software and later iterations of it. Intel is just like Microsoft: the first implementations invariably suck, but they always get better from there.
Re:Intel has a Big Problem (Score:3, Informative)
Re:Intel has a Big Problem (Score:2)
Re:Intel has a Big Problem (Score:3, Insightful)
Exactly what Intel-sponsored DRM are you referring to? The only technology that Intel has introduced that bothered me was the short-lived serial number fiasco, and once the press put some heat on them they dropped that quicker than a pair of wet undies.
Most of the hardware companies out there are not too keen on DRM. How does it benefit them? It's the legislation sponsored by content providers that you have to worry about.
How meaningful is this test? (Score:4, Insightful)
Are you testing to see whether I am an android or a lesbian, Mr Deckard?
Re:How meaningful is this test? (Score:4, Funny)
However.
I propose a new benchmark. A benchmark of PacMen. How many simultaneous instances of a PacMan emu can this thing run, with all of them at 100% with full sound and video...
Re:How meaningful is this test? (Score:3, Funny)
Please Help!!! (Score:2, Funny)
Re:Please Help!!! (Score:2)
Re:Please Help!!! (Score:2, Funny)
Re: Reasons for 64 bits (Score:2)
You'd better have a *really* big memory, or you'll suffer terrible performance due to paging.
Hammer's final name (Score:3, Informative)
Re:Hammer's final name (Score:5, Funny)
Re:Hammer's final name (Score:2, Funny)
The problem was not with the name Hammer itself, per se, but with the cost of manufacturing the little anti-static balloon pants they were going to ship the CPUs in.
Re:Hammer's final name (Score:2)
"Opteron" is the name of the SMP-capable Hammer -- the non-SMP version will carry the old name, "Athlon". Look here [theregister.co.uk] for more.
Personally, I very much prefer calling it "Hammer". It's much easier to pronounce, for one.
"Please Opteron, Don't Hurt 'Em" (Score:3, Funny)
...doesn't have the same ring.
~jeff
Re:Hammer's final name (Score:5, Funny)
Duron becomes Athlon Jr.
Athlon XP becomes Athlon
Clawhammer becomes Double Athlon
Sledgehammer becomes Bacon Double Athlon with Cheese
Re:Hammer's final name (Score:5, Funny)
"What, they don't call it the 'Bacon Double Athlon with Cheese' ?"
"Nah, they got some sense, they call it "Le Hammer"
"Le Hammer, sh*t"
(well, actually we'd pronounce it "le ameure" (no 'H' at the beginning and the usual adding of an 'e' at the end))
"We're French types-ah, why else do you think we have this outrrrrrrageous accent for?"
Yeah, but (Score:2, Funny)
Non-Intel all the way! (Score:3, Interesting)
ps -- where is the obligatory Beowulf cluster commentary on this??? I am shocked and appalled at this apparent oversight by my fellow /.'ers...
Imagine a Beowulf cluster of /. humourists (Score:3, Funny)
Re:Imagine a Beowulf cluster of /. humourists (Score:2)
Re:Imagine a Beowulf cluster of /. humourists (Score:2)
Are we talking about cramming a bunch of bad slashdot humorists into 1U racks? Because you may be on to something there ...
-jdm
Re:Non-Intel all the way! (Score:3, Interesting)
If you don't have a basis for comparison, how would you know?
You certainly missed out on a whole slew of pentium FP bug panics.
Re:Non-Intel all the way! (Score:2)
Fab on an ethnically cleansed village (Score:3, Informative)
& have made no offer to compensate those villagers, even though as far as the Geneva Convention, the Hague Convention, the IDHR & the UN are concerned, they (the former villagers) still own that land.
arrg stop with the quake already (Score:4, Insightful)
Gah.
Tom
Re:arrg stop with the quake already (Score:3, Insightful)
Because Quake is what will be used by people who believe 'reviews' and 'benchmarks' from sites like aceshardware.
Re:arrg stop with the quake already (Score:2)
Ace's is pretty good, actually. (Score:3, Insightful)
To judge real-world performance, Quake is at least as good as any synthetic benchmark. Personally, I'd like to see benchmarks for 3DS MAX, TMPGEnc or Photoshop (because those are some of the programs I use daily). But between Quake and WhateverMark2002, I prefer Quake (and I don't even play Quake).
RMN
~~~
Re:arrg stop with the quake already (Score:3, Insightful)
You are right, and in fact Quake is not even a good benchmark for gaming in general. However, it is very memory intensive and was generally the P4's strong point.
Saying that the Opteron will smoke a P4 at Quake is saying that it smokes the P4 at its own game.
The test is a good indicator that if
Re:arrg stop with the quake already (Score:2, Insightful)
One way to make intel nervous... (Score:2)
", however, it's important to keep in mind that these are unauthorized tests of an early revision CPU and platform, and that there could be significant differences in performance from final shipping versions depending upon the state of the test hardware used here."
Get false results by having a chip that isn't the same one that is going to be released and Intel will overcompensate by dropping prices (again) and coming out with a "better" chip.
Re:One way to make intel nervous... (Score:2)
Speaking of Itanium 2, my initial question still remains: "What the hell happened to Itanium?" I still haven't seen this chip anywhere yet....
Re:One way to make intel nervous... (Score:2)
Or, it would be, if it had sold as much as the Alpha did, in any year.
Seriously, everybody (except the most naive among us) knew that Itanic was going to be a flop. From the start, Intel has said that McKinley (oops, sorry, Intel, Itanic 2) would be where things started getting warm.
And with McKinley, there should be widespread industry support for IA64, from SGI, IBM and HP, with the competition being IBM (yes, funny, isn't it), Sun and, erh, not many more players.
Amazing performance from a clock-limited proto (Score:4, Interesting)
I can't wait for these chips to get out there.
thad
Re:Amazing performance from a clock-limited proto (Score:3, Interesting)
AMD does not live in that world ... (Score:2, Interesting)
As with everything in life you can go overboard, of course. Intel might have gone a little overboard to one side (too many pipeline stages), but Motorola is too far overboard on the other side to play a part in the high performance processor business (which is why Apple will probably have to switch to IBM wholesale in the near future, especially with Motorola giving up trying to keep their semiconductor processes competitive for high speed logic).
AMD probably won't close the MHz gap, but they do not intend to lag by the same extent as before.
Yes, but (Score:5, Funny)
Re:Yes, but (Score:4, Funny)
Sledgehammer can. Clawhammer is better used for ripping out chunks of embedded Celeron.
Not an optimal test of a 64-bit platform... (Score:4, Informative)
As has been said, Quake is only relevant to the chips concerned in that it only tests the 32-bit compatibility of the Opteron. I would have liked to see some tests that demonstrated the advantage of 64-bit processors over 32-bit processors. Granted, the reviewers only wanted to show benchmarks that the populace was familiar with, and they were pressed for time. Let's give them a break for that.
Nahtanoj
1667? (Score:2, Interesting)
It's nice to see that the industry isn't playing the "more is faster" game as much as they used to. When an 800 MHz part is comparable to a 1600 MHz one, you've got to wonder what Intel isn't doing to optimize.
Re:1667? (Score:4, Interesting)
FYI, we had a teacher in a processor architecture course who worked on optimizing algorithms and had worked for Intel. He left and started working for AMD instead. He openly said that Intel sucked. Guess what PR that gives when it comes from the mouth of an insightful teacher.
So they must do something wrong over there.
Re:1667? (Score:3, Insightful)
I guess we have an explanation of the diff in AMD/Intel clock frequencies right there...
Re:1667? (Score:2)
But is it, really? We don't know yet. These were, as others have explained, unauthorized tests made on a pre-pre-release system.
That does not mean that the release system, running at 1600 MHz, will be faster than a 3200 MHz Pentium 4.
It might just as well mean that it is slower, and 3 GHz-ish is what the Pentium 4 will be at around release time for Opteron.
It's still going to be a race.
On a more historical (heh) note, there have been processors before running at lower clock frequencies outperforming others at double or higher frequency. MIPS R10K and descendants, for instance.
Also, it is completely up in the air whether Opteron will be any good in multi-CPU configurations, compared to offerings from Intel and other chip makers.
Re:1667? (Score:2)
AFAIK, the FPU of the P4 is crap (not that the IA32 FPU stack isn't crap in general). Intel's strategy is to get software designers to switch calculations that would normally involve the FPU over to their SSE2 instruction set. This really does improve P4 performance considerably.
This is happening, but adoption is slow. Also, the Opteron core has SSE2 support.
I would have liked to see an SSE2 heavy benchmark run on both machines.
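Agreed. It wouldn't even need to be exotic - a toy loop like the sketch below (the array size and repeat count are made up, and it's obviously not a real benchmark suite) spends nearly all of its time in double-precision math and streaming memory, which is exactly the kind of code that changes character when it's compiled for SSE2 instead of x87:

    #include <stdio.h>

    #define N (1 << 20)    /* one million doubles per array, ~8 MB each */

    static double a[N], b[N], c[N];

    int main(void)
    {
        for (int i = 0; i < N; i++) {
            a[i] = i * 0.5;
            b[i] = i * 0.25;
        }

        /* Hot loop: straight multiply-accumulate over big arrays.
           Built with SSE2 enabled this runs on the XMM units;
           otherwise it falls back to the x87 stack. */
        double sum = 0.0;
        for (int rep = 0; rep < 50; rep++)
            for (int i = 0; i < N; i++) {
                c[i] += a[i] * b[i];
                sum  += c[i];
            }

        printf("checksum: %g\n", sum);
        return 0;
    }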
Re:1667? (Score:2)
If Hammer = 1.4 x Athlon
& Athlon = 1.2 x P4 (non-Northwood)
then if the word of Hammer starting at 3400+ and 4000+ is true, it would put Hammer at roughly a 2.0 GHz to 2.4 GHz clock (assuming AMD does nothing about 64-bit being faster); the sketch below spells out that arithmetic.
Given the small size of the die, this sounds quite feasible; indeed 5000+ sounds possible with 0.13 micron and SOI.
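Spelling out that arithmetic (all the numbers below are the assumptions from this post - the 1.4x and 1.2x per-clock factors, and reading the 3400+/4000+ ratings as "equivalent P4 MHz", are guesses, not anything AMD has said):

    #include <stdio.h>

    int main(void)
    {
        double hammer_vs_athlon = 1.4;   /* assumed per-clock gain over the Athlon  */
        double athlon_vs_p4     = 1.2;   /* assumed Athlon per-clock gain over a P4 */
        double hammer_vs_p4     = hammer_vs_athlon * athlon_vs_p4;   /* ~1.68 */

        /* Read the model numbers as "equivalent P4 MHz" and back out the clock: */
        printf("3400+ -> ~%.0f MHz actual clock\n", 3400.0 / hammer_vs_p4);  /* ~2024 */
        printf("4000+ -> ~%.0f MHz actual clock\n", 4000.0 / hammer_vs_p4);  /* ~2381 */
        return 0;
    }

Which lands right in the 2.0 GHz to 2.4 GHz range mentioned above.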
OK, ok, so it's very iffy to extrapolate like that, but Intel may well have some significant problems if AMD can roll out Hammer fast enough to beat the P4's 0.09 micron die shrink.
The reason it's only 800MHz. (Score:3, Insightful)
Obviously, these guys did. AMD will NOT be happy about this.
Also remember that the Opteron will be running at MUCH higher clock speeds upon release. I'd guess above the 2 GHz range for sure, but AMD doesn't want anybody to know that. This also suggests that this lil' 800 MHz sample could be very overclockable.
This is AMD's weapon that can really take a LOT of market share. Microsoft already has a Windows XP build ported to the Opteron/x86-64 platform. The Opteron runs cooler, as well.
One thing that disappoints me - I have not seen ONE PCI64 slot on any of these test boards!! I hope that this'll be worked out before release.
Sigh...more of the same (Score:5, Insightful)
But it's SOOOOO disheartening to see my fellow nerds and
I can understand the love for Linux. A group of people programming for free, fighting a giant like Microsoft. But why should AMD garner the same sort of love and respect? AMD is a giant corporation itself, willing to screw you over. They'd charge you $2000 per processor if Intel wasn't around (and yes Intel would do the same).
Last week Intel dropped the prices of its processors. AMD was forced to follow suit, dropping their prices about 2 days later. Did the Slashdot community cheer Intel?
So along comes this news... AMD Opteron 800 MHz beats a Pentium 4 1.6 GHz by one frame per second. I guess I fail to see why everyone is so excited.
I'll wager ANYTHING that when it ships, an 800 MHz Opteron will sell for at LEAST twice the price of a 1.6 GHz Pentium 4.
Why do I even bother.
Re:Sigh...more of the same (Score:2, Interesting)
Re:Sigh...more of the same (Score:2, Informative)
Let's see..... why do we like AMD?
1. Dollar for dollar they kick Intel's hiney on performance, i.e. you get more bang for your buck.
2. An 800 MHz AMD is as fast as a 1600 MHz Intel. That is just plain cool.... It has geek cool factor all over it.
3. As to the last statement about an Opteron 800 being twice what a P4 1600
4. As geeks we get tired of a market dominated by inferior products, i.e. Windows is the dominant operating system, and Intel is the dominant chip. Sometimes we just like to root for the underdog. If AMD can beat Intel at their own game, more power to them.
why i cheer amd (Score:5, Interesting)
I am a long time system designer
Another reason why I tend to prefer AMD is the cynical marketing processor known as the P4. The vast majority of benchmarks show that unless you're running software that's heavily SSE2-optimized, the Athlons spank the P4. Yet the P4s are much more $$$$ due to all those wonderful Intel commercials with dancing morons in bunny suits, or some schmucks painted up like a Martian with a bad head cold. Instead of wasting all that money on marketing, use it to improve your designs! AMD spends virtually nothing on marketing, and yet whenever they have a good design, their products sell extremely well. And don't get me started on Intel's late DDR support, or the early 845 chipsets that were SDRAM-only, which had PATHETIC performance.
I guess the point of my whole rant is...... I use Intel or AMD, or whoever, as long as they give me good value for my (or my customer's) dollar. Give me a nice industry-standard design. Don't foist some new marketing-driven proprietary design on me. If it's gotta be proprietary, it better be for one of two reasons: considerably cheaper, or considerably faster. Intel in the past few years has NOT focused on giving the customer value. AMD has. Give me a thousand dollars, and I can build either an Intel box, or an AMD box that's 20% faster than the Intel box and just as stable. (I don't buy the "AMD isn't stable" argument; it all comes down to knowing your hardware and how to configure it properly for stable operation.)
When Intel returns to delivering a product that is worth the price Intel charges for it, I'll use Intel again. Until then, I'll continue to laugh at ridiculous marketing schemes and do my research on which product is the fastest for the least money.
Re:Sigh...more of the same (Score:3)
Seems to me that Intel used to spend months, even YEARS, between significant speed increases of their processors. How long to go from a 486/33 MHz to a DX2/50? How long from the 486 to the Pentium? The Pentium Pro? Before AMD was on the scene Intel would milk every processor for a long, long time. People would pay through the nose for Intel chips. Intel's profit margins were grossly higher than anyone else's in the industry.
Now comes AMD, bringing similar (sometimes GREATER) performance than Intel chips at a FRACTION of Intel's price. A quick check of Pricewatch shows an Athlon 2100+ going for $177, while Intel's 2.2 GHz P4 (the likeliest competitor) is going for $238. The situation was even more out of whack last week until Intel lowered pricing. Do you think for one minute Intel lowered prices out of the goodness of their hearts? Of course they didn't. They did it because Athlons had been grossly undercutting them in price and performing every bit as well as Intel's finest.
Your predictions on the pricing of the Opteron are not valid, as there will BE no 800 MHz Opteron. The Opteron is most likely going to debut around 1.5 GHz, give or take a couple of hundred MHz. It will most likely cost twice what a 1.6 GHz P4 costs right now, but that'll be just fine as it will most likely OUTPERFORM that 1.6 GHz P4 by about two to one. Things will be much closer with the Northwood B chips, but no matter what, AMD will almost certainly undercut Intel in pricing while delivering the same (within 10%) performance.
Face it: Intel is used to high margins and is unwilling to cut their pricing far enough to put AMD in the coffin. They are running on brand name and little else right now. If the situations were reversed and AMD had the household name and Intel was the relative unknown, does anyone for one moment think that anyone in their right mind would pay the lofty prices Intel is commanding right now? Of course not.
more regulation!!! (Score:2, Informative)
Senator F. Bar (R-51st state) announced the drafting of a new technology bill. It requires that all CPU chips conform to a regulated speed quantifier. This will allow all chips to be easily compared with one another, to end industry confusion. The unit, abbreviated IHz (Intel Hertz), was developed by the Intel Corporation. They have lobbied to get this standard, which will be controlled and policed by a board of independent persons funded by Intel, adopted into Federal law....
ughh..
call me with the real benchmarks (Score:3, Interesting)
Still, I'm eagerly awaiting the ClawHammer release. Every x86 box I've built for the last 5 years has been pure AMD, and I've been quite happy with them.
some things to ponder about. (Score:2, Interesting)
Why use some low-end benchmark? Although I understand that the current systems are prototypes, the benchmark should reflect something of the server world, including but not limited to TPC, SPEC, etc. I would really love to see the performance of Hammer in an Oracle/SQL/DB2 or other database benchmark. I would love to see the Hammer handling SSL transactions and the like.
With regard to AMD using x86 with 32-bit compatibility: would it be dumb to run some non-native applications? This suggests that AMD anticipates that companies will not optimize their software to run on a pure 64-bit platform. This may be an indication that the initial design is not intended for the server product line. Running 64-bit does not by itself make you competitive in the server arena!!!!! The server market is a very different ball game compared to the consumer market - the CPU is not the prime consideration.
And x86 is obsolete. It is not the most efficient architecture out there, so it is time for a major change in the hardware world.
Re:I want one already (Score:2, Informative)
Re:I want one already (Score:5, Funny)
Don't worry, though. You can still refer to indefinitely long time periods as 'about when the Hurd is released'.
Re:why would anyone buy intel? (Score:4, Informative)
Having two AMD machines at home, a Tbird 1400 and an Athlon XP 1700+, I'm seriously underimpressed by the P4 performance. As far as I can tell, the only reason to buy Intel anymore is out of pure inertia; they bring nothing to the table.
Re:why would anyone buy intel? (Score:3, Informative)
Don't ask me about game performance; hell, I don't even have Solitaire loaded.
There is actually (Score:2)
Most people are sorely disappointed with their 1.6a if it only overclocks to 2.2 GHz (stable with standard cooling). Most people can ramp it up to between 2.4 and 2.6 GHz.
At that speed, it can smoke everything AMD is offering, and at a low cost. Lots of the OCers are buying the P4 1.6a with an Asus P4S533 motherboard.
Re:why would anyone buy intel? (Score:2)
When switching to an unknown mainboard, I go with a manufacturer I know to build good stuff, and make sure I beat the hell out of the mainboard before my return policy time runs out.
Not Entirely Sure (Score:2)
I know far too many people who have built AMD boxes - people who certainly know what they are doing - to have them crash and burn on a number of applications and games. The problem is so widespread across the AMD platform that the processor is the only logical point of failure.
I am one of those people, and I tend to disagree with you. I had a machine subject to random lockups and general disintegration of system integrity (on Linux!). It was a K6-2 on a FIC VA-503+. The motherboard had issues. My friend who went with a nearly identical system but used an older FIC motherboard with a higher stepping never had any problems, and is still using the machine today, five years later (six?).
I am not saying that the K7 is definitely not to blame, but it is also not the only logical point of failure. The amount of poorly tested crap that comes out for the K7 is a suspect for higher failure rates.
OTOH, the last two AMD machines I have built, a TBird 1.0 GHz on an Asus A7V133A, and an XP 1800+ on an A7V333 have worked flawlessly.
Plus, I don't think I need to bring up the issue of the flaming AMD Athlon in too much detail to get everybody's minds on that Tom's Hardware video. There have been rebuttals and claims of inaccuracy from the AMD camp, but for the record:
Removing the heatsink/fan from a P4 chip caused the machine to BSOD.
Removing the heatsink/fan from an Athlon caused it to BURST INTO FLAMES AND MELT
I don't care what the details of the situation were, I have absolutely zero desire to run a chip that has the possibility of catching fire. There's an old saying that I'm rather fond of, it goes "The bitterness of poor quality lasts much longer than the sweetness of low cost." If you buy AMD simply because it's cheaper... eh. Your machine, your loss.
This is no longer a valid argument. If you are willing to shell out the $40 for a quality Mobo, then you are now likely to get thermal protection. If my heatsink falls off with my Asus A7V333, the chip does not fry. However, this fall would crush my All-In-Wonder 7500, which I am more worried about.
Re:Er, yes? (Score:2)
I saw the video too, and while it's amusing, I fail to see how this could even happen. The heatsink on an AMD CPU is clipped on so hard you need to work at it to get it off. Anybody who has one "accidentally" fall off didn't put it on right in the first place.
Your car (ANY car) will burst into flames if you remove the gas line from the engine and point it at the exhaust manifold while it's running with the engine hot. Same situation; something that's NEVER going to happen in real life, unless the thing was put together wrong in the first place.
Re:Er, yes? (Score:3, Informative)
I saw the video too, and while it's amusing, I fail to see how this could even happen. The heatsink on an AMD CPU is clipped on so hard you need to work at it to get it off. Anybody who has one "accidentally" fall off didn't put it on right in the first place.
Haha... what I worry about is catastrophic "smash the flathead screwdriver through the motherboard while trying to loosen the clip" failure.
or also, catastrophic "heatsink clip breaks off the cheap plastic socket notch upon removal" failure.
Much more likely...
If anything, I wish AMD would do more in the way of promoting bolted heatsinks rather than the cheesy clips.
Re:why would anyone buy intel? (Score:2)
WTF have you been smoking? I can get dual-processor 1U Athlon MP designs from literally dozens of vendors! Many Beowulf clusters of late have been based on this exact configuration. A couple gigs of RAM and 2 high-speed Athlons in 1U is a sick computation density that can only be rivaled by the new 1U dual Alpha boxes from hpaq, but those cost about $25-40,000, so the Athlons still easily win on performance/cost/space considerations.
Re:Stop! (Score:5, Funny)
Can't touch this ('cause it is frigging hot!)
Actually...not true (Score:3, Informative)
In some things, it was keeping pace with a chip at twice its megahertz. It'll be interesting to see what this baby can do once they take the training wheels off.
-Pete
Re:Stop! (Score:2)
We'll see what happens when they get to the 1.6GHz they hope to launch at, but still...
Re:Stop! (Score:2)
Will they be frigging hot, or will they be passively cooled? [theinquirer.net]
Re:Microwave my brains! (Score:2)
Re:would be faster (Score:5, Informative)
ya missed the point (Score:2)
Re:ya missed the point (Score:2)
Re:ya missed the point (Score:3, Interesting)
Re:ya missed the point (Score:2)
These numbers are really extremely impressive - if the performance scales with the clock speed, an Opteron will outperform a P4 running at twice the clock speed. Given that it's also been designed to scale, it's going to be hard for Intel to keep up . . .
himi
Re:I'm confused... (Score:2, Funny)
actually - dunno...
Re:Benchmarks (Score:5, Insightful)
In case you don't know, Clawhammer is meant for desktops/workstations. Hell, there's even a mobile version of it in the pipeline! Then we have Clawhammer DP (dual-processor) and Sledgehammer that are meant for servers.
Please, get your facts straight before opening your mouth!
Re:Benchmarks (Score:3, Funny)
Re:Benchmarks (Score:3, Funny)
Re:What's next? (Score:2, Funny)
Cheers,
Jonathan
Re:They should use POVRay or GCC (Score:2)
Re:Binary compatible (Score:2)
/Brian
Re:Kernel Hackers (Score:3, Informative)
You've been living in a cave, right?
Yes, _some_ Athlon chipsets did have serious problems with early versions of the 2.4 kernel. But this was fixed a very long time ago. I know this because I'm using one of those chipsets. Be sure that you're actually using a recent kernel, and that you've gotten the latest BIOS updates for your motherboard.
Also, there were never any problems with running Linux 2.2 on an Athlon.