
AMD Athlon MP 1800+ Processor Review
Lars Olsen writes: "Amdmb.com has posted a review of the new AMD Athlon MP 1800+ processor -- a big speed jump for the dual Athlon processor family, with the new processor running at 1.53GHz. 1600+ and 1500+ Athlon MPs are also available right away at stores around the world.
Dual AMD goodness now runs just as fast as its desktop counterpart! Here's a quote: 'Those of you who want to jump into the dual processing Athlon world will finally be able to do so with the knowledge that your processors are the top speed that the Athlon family has to offer. And for anyone who already has a Tyan Thunder or Tiger MP board and a pair of Athlon MP processors, you may just want to pop a couple of these new Athlon MP 1800+ CPUs in your system to boost performance.'" Some of the comments following yesterday's "dream system" article addressed dual-Athlon complications, so make sure you read before you buy. Update: 10/15 15:14 GMT by T: Check below for LinuxHardware.org's take on this chip, and on Athlon MP systems in general.
Augustus writes "LinuxHardware.org takes a look at the Athlon MP platform under Linux, and the newly released Athlon MP 1800+ is included. The article covers not only the technology and performance of the AMD-760 MP chipset and the Tyan Thunder K7 motherboard, but also why anyone would consider a multi-processor system."
Re: (Score:2, Funny)
Re:fingers... (Score:2, Funny)
Please stop chopping off your fingers for the decimal points.
Thanks.
Re:fingers... (Score:1, Informative)
Why bother ? its an excuse to write bad code (Score:2, Interesting)
All too often developers use the increased memory and processor speed to write worse implementations, or to create pointless bloatware. I know this will continue no matter what I say, but at the end of the day, who really needs this much power? QuakeIV players? QuakeV? QuakeIII runs fine with my upgraded graphics card and top-of-the-line sound card; the processor does bugger all.
Moore's law is great; it means computers can do more and more. But for the home market it's just silly: 90% of people would be fine not changing their machine for 4 years, but they are forced to upgrade by market perception.
Faster this, faster that... but never ever actually "better", "more reliable" or "stable".
Hardware is the excuse for bloatware. It's not the H/W engineers' fault, but it isn't an excuse to use...
(and yes, this is partly a dig at the huge swap requirements of the 2.4 kernel)
Re:Why bother ? its an excuse to write bad code (Score:5, Informative)
Have you ever
a) done audio editing
b) done video editing
c) applied a filter to a 50MB+ image
d) compiled X
e) done any ray-tracing
etc, etc.
Any of these things can suck up vast amounts of horsepower and beg for more.
Also, 2.4 is getting somewhat more sane in recent releases.
Chris
Re:Why bother ? its an excuse to write bad code (Score:1)
Re:Why bother ? its an excuse to write bad code (Score:1)
Shower Curtains (Score:1)
Re:Why bother ? its an excuse to write bad code (Score:1)
You forgot "solving systems of 50,000 equations." People always bring that one up, as unrealistic as it is.
What is so unrealistic about it? Have you ever tried solving an eigenvalue problem for a 10000x10000 matrix in MATLAB? It takes about an hour on a dual PIII 800MHz with 1GB of RAM.
Things like that come up a lot more often than one would expect.
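The parent's point is easy to reproduce. Here's a quick sketch in Python/NumPy rather than MATLAB (the matrix size and random data are my own choices for a fast demo; dense eigensolvers scale roughly as O(n^3), which is why the 10000x10000 case takes so long):

```python
import numpy as np

# A small random symmetric matrix; the 10000x10000 case is the same
# call, just ~8000x more work at O(n^3).
n = 500
rng = np.random.default_rng(0)
a = rng.standard_normal((n, n))
a = (a + a.T) / 2          # symmetrize so the spectrum is real

# eigh exploits symmetry; eig is the general (slower) routine
w, v = np.linalg.eigh(a)

# sanity check: A v = lambda v for the first eigenpair
residual = np.linalg.norm(a @ v[:, 0] - w[0] * v[:, 0])
```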
Re:Why bother ? its an excuse to write bad code (Score:3, Interesting)
In other words, no-one needs this unless they (a) need to compile mega-programs or (b) do heavy maths work. So no home user and most business users have no need.
I speak as someone who switched from a P233 to a Duron 800 only because the mobo broke: I refused to spend £80 on a new Pentium mobo when £200 would get a complete new system!
Grab.
Re:Why bother ? its an excuse to write bad code (Score:1)
my bias:
I keep 16 1 GHz PIII's going all the time at 100% running sims.
Re:Why bother ? its an excuse to write bad code (Score:2)
Re:Why bother ? its an excuse to write bad code (Score:3, Informative)
(Note: I'm not talking about home use here.) Actually, 50,000 equations is a rather small system. Any idea what weather prediction looks like? Something like 10 equations per grid point, with a grid that's something like 200x200x50 = 2,000,000 points. So you end up with a 20-million-equation system. Much CAD software (e.g. finite element simulation) also needs to solve *huge* systems. The faster the computer, the more precise the simulation (because you can afford more grid points).
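The grid arithmetic in the parent checks out, and it also shows why dense solvers are hopeless at this scale (the byte count below assumes 8-byte doubles, my assumption):

```python
# ~10 unknowns per grid point on a 200 x 200 x 50 grid
grid_points = 200 * 200 * 50          # 2,000,000 points
system_size = 10 * grid_points        # 20,000,000 equations

# Storing that as a dense matrix of 8-byte doubles takes 8 * N^2 bytes,
# far beyond any machine -- hence sparse/iterative solvers.
dense_bytes = 8 * system_size ** 2
```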
Re:Why bother ? its an excuse to write bad code (Score:2, Informative)
Aerodynamics
Fluid Mechanics
Oceanography
Meteorology
Stress Analysis
Well, those are the ones I know about, but there must be a lot more. And it's not just scientists who use these; there must be thousands of engineers working in these fields daily.
solving systems of 50,000 equations... (Score:2, Informative)
This is very common and very useful.
Also, if I had a PC with 100 times the memory and speed, I could still bring it to its knees. As it is, I have to simplify and granulate my models to make them fit the computing power I have.
How do you think they predict the weather? Design cars and planes? Do thermal analysis? Do vibration analysis? Do electromagnetic analysis? Do displacement/stress analysis? Do computational fluid dynamics? Do transient analysis of all the above?
Re:Why bother ? its an excuse to write bad code (Score:2)
b) See a); lots of this after snowboarding holidays. Mostly done directly from the video camera over the FireWire connection.
c) Yup... now that is slow, but I've got lots of memory, so it's not that bad... in fact, given that I've got 3/4GB of RAM, it's probably as fast as memory-limited machines with a fast CPU...
d) Yup... hell, done that on _much_ slower machines.
e) Yup...
The basic point here is that I don't work 100% of the time on a single task. Waiting for renders is fine (I've always tended to run them as overnight jobs anyway).
All of the above are very possible on a PII 400MHz; just ensure it's got a good sound card, a good graphics card, a fast disk, and lots of memory.
Most of those things suck up memory rather than CPU, and it's the huge amounts of swapping that cause them to slow down.
Re:Why bother ? its an excuse to write bad code (Score:1, Interesting)
I don't see why people disparage using faster processors for legitimate applications. I've done video editing, and no matter what CPU I do it on, I wish I had more. And no, it wasn't disk I/O bound, because I had no trouble playing the input and output videos at full-speed.
Re:Why bother ? its an excuse to write bad code (Score:2)
The fellow's point was that 99.98% of Joe Sixpacks out there *don't* need all the power that's being hyped. Just because you like running Emboss on your hi-res porn images doesn't mean that some college student in Albuquerque, a secretary in Toledo, or your Grandma needs to do it.
If you need 2GHz, get 2GHz. If not, don't do it just because the salesman told you you need it.
Re:Why bother ? its an excuse to write bad code (Score:2, Funny)
My grandmother was compiling X the other day on her P166 and she's like, "Goddammit! Git me one of those Amdy Altheron processors!"
-J
Re:Why bother ? its an excuse to write bad code (Score:3, Insightful)
High-speed CPUs are very useful to our clients who run large database implementations with voice-recognition data-entry systems, FYI.
Re:Why bother ? FOR THE GAMES, SILLY (Score:2, Insightful)
-Berj
Re:Why bother ? FOR THE GAMES, SILLY (Score:3, Insightful)
But the kicker is that these games really don't need such horsepower. I'm willing to bet that if there were any pressure to get any of these games running on a more resource constrained system, like a game console, then lots of unnecessary internal fat would be trimmed right away. But there's no pressure to do so otherwise. And even if a game that could run just fine on a PII 400 requires a 1GHz processor, certain people seem to _like_ the justification for upgrading.
Re:Why bother ? FOR THE GAMES, SILLY (Score:5, Insightful)
The two environments are very different, and most of that fat can't be trimmed by wishing it away or blaming it on programmers.
As for bloatware, start modelling cloth, hair, IK, bump maps, and the hardware gets used again. The reason the games aren't doing it now is because they want the comfortable sales window.
Honestly, pushing ultra-high-end features that cut your market to 4% of what it could be isn't a big selling point (good luck convincing your publisher to bring the game to market), and trying to build an engine that scales between low-end and high-end aggravates the PC-vs.-console bloat problem even more.
Re:Why bother ? FOR THE GAMES, SILLY (Score:3, Funny)
You're right, they could cut their polygon count down to a quarter of what it is now, precache almost everything (quadrupling the amount of hard disk space used), and probably use 50% of the CPU they use now. Game developers really are into severely optimizing their code, especially the programmers dealing with graphics; they're usually trying to find ways to optimize every single action.
On the other hand, as others have pointed out, the only way to really optimize the hell out of something is to write it in assembler. That makes any large codebase pretty much unusable.
The biggest thing game developers could do right now to improve game performance is to use really excellent multi-res in a game. Multi-res is a process that, when used to its fullest, lets you start with very high-polygon models for everything; the game engine then reduces the polygon count one vertex at a time, in some cases all the way down to a single polygon. When done right this lets you draw amazingly complex scenes without slowdown; the computer can tell more or less what you're looking at and decide what needs lots of polys.
Unfortunately, even those games that are using multi-res are using a low-rent version where they pre-reduce the vertex count, so you still "pop" from model to model. It's getting better, though.
The best thing about multi-res, of course, is that you don't have to precompute things the way BSP-based schemes do, and it makes the best use of your graphics hardware while still running well and looking good on lower-end hardware. On the other hand, your graphics card had better handle lighting pretty damned well. Since you can get a GeForce MX400 card for less than $100 (or a GF2 for about $150), that's really not much of an issue these days.
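The scheme described above can be sketched in a few lines. This is only an illustration under my own assumptions (invented error scores, a distance-squared falloff for the budget); real progressive-mesh code also re-links triangles and re-scores errors after every collapse:

```python
import heapq

def reduce_mesh(vertex_errors, target_count):
    """Greedily drop the vertex whose removal changes the model least
    (smallest error) until only target_count vertices remain.
    Returns the set of surviving vertex ids."""
    heap = [(err, vid) for vid, err in vertex_errors.items()]
    heapq.heapify(heap)
    kept = set(vertex_errors)
    while len(kept) > target_count and heap:
        _err, vid = heapq.heappop(heap)
        kept.discard(vid)
    return kept

def lod_budget(full_count, distance, near=10.0):
    """Polygon budget that falls off with distance from the camera."""
    scale = min(1.0, (near / max(distance, near)) ** 2)
    return max(1, int(full_count * scale))
```

For example, `reduce_mesh({"a": 0.1, "b": 0.5, "c": 0.01, "d": 2.0}, 2)` keeps the two highest-error (most visually important) vertices, "b" and "d".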
Re:Why bother ? FOR THE GAMES, SILLY (Score:2)
I agree in principle, but that's not it. It isn't polycount either, as someone else said. At the moment, the average PS2 game has more polygons than the average PC game (that's because if you assume hardware T&L on the PC then you have a severely limited market; lots and lots of mass market PCs still ship with the equivalent of a Voodoo 1 or worse, go to Dell's site if you don't believe me).
I'm talking about much larger issues. For example, on the PC you come up with a file format for something, then just keep using it because it works. With a little work, it often turns out that a 20MB file of world geometry can be knocked down to 5MB, just because there's so much garbage in there and no one ever thought about removing it. Or maybe there are thousands of keyframes of animation that make no visual difference and can be removed. Or some trifling module allocates 8MB at load time and keeps it around, even though it isn't actually used. Or maybe there's poor collision-detection code that does way too much work and could be made to run 4x faster. These kinds of things are _common_. I'm a game developer; I've been there.
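The keyframe case in particular is mechanical to fix. A hedged sketch (single pass, checking each keyframe against its original neighbours; a production tool would iterate until nothing more can be removed):

```python
def prune_keyframes(frames, tol=1e-6):
    """Drop keyframes that a player would reproduce anyway by linear
    interpolation between neighbours. frames: (time, value) pairs,
    sorted by time."""
    if len(frames) <= 2:
        return list(frames)
    kept = [frames[0]]
    for (t0, v0), (t1, v1), (t2, v2) in zip(frames, frames[1:], frames[2:]):
        # value the player would interpolate at t1 from the neighbours
        lerp = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(lerp - v1) > tol:       # keep only visually meaningful keys
            kept.append((t1, v1))
    kept.append(frames[-1])
    return kept
```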
Re:Why bother ? FOR THE GAMES, SILLY (Score:2)
Re:Why bother ? FOR THE GAMES, SILLY (Score:1)
Wrong. Video editing. Converting 20 minutes of video to MPEG-2 takes 82 minutes with a 450MHz celery, and 49 minutes with an 850 celery. I still have to convert some 50 8mm video tapes to MPEG-2.
-asp
Word is an excellent example... (Score:1)
Re:Why bother ? its an excuse to write bad code (Score:5, Insightful)
New, faster technology is being brought out just to make programmers dumber. It's an evil conspiracy against us all!
Seriously, though, what is your definition of "bloatware"? Let's say I'm writing Quake4. I want to use C++ and lotsa nice OOD that's easier to write, easier to read, easier to expand, easier to debug, and easier to maintain.
Is that "bloatware"?
Sure, I coulda used assembly for the whole thing and it woulda been efficient and fast! You wouldn't need the super hardware!
Hope you don't want to mod it, or want me to fix any bugs, though.
Maybe us developers like faster systems so we can implement software with better techniques to make technology grow? Sure, it requires a little more hardware, but I wouldn't call it some evil conspiracy.
It doesn't matter what technology is out there, there will always be crap (bloatware).
BTW - You might want to buy this shirt [thinkgeek.com].
Re:Why bother ? its an excuse to write bad code (Score:3, Insightful)
Things like SOAP are a classic example. CORBA is a perfect way to get computers communicating: it uses IDL to describe the services, it works on any platform, and it uses a binary protocol which can be tunneled via HTTP if required.
SOAP is an ASCII-based RPC mechanism; when was that a good idea? So you can _read_ computer-to-computer transactions? This is possible because we have cycles to burn, so doing two (or more) rounds of textual conversion isn't seen as a bad thing(tm).
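The size cost is easy to see in miniature. A sketch in Python (the XML shape below is merely SOAP-flavoured, not a real envelope; the CDR comparison is approximate):

```python
import struct

x, y = 1234567, 2.5

# Binary encoding (CDR-style): fixed-width fields, no parsing on receipt
binary = struct.pack("<id", x, y)    # 4-byte int + 8-byte double = 12 bytes

# Text encoding (SOAP-style): human-readable, several times larger,
# and both ends pay for int/float <-> string conversion
text = (f'<add><x xsi:type="xsd:int">{x}</x>'
        f'<y xsi:type="xsd:double">{y}</y></add>').encode()
```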
Outlook, Netscape 6...
XEmacs used to be considered the world's largest piece of bloatware... it's 4.2MB, and it's got email, news, a web browser, an editor, a Mayan calendar and the kitchen sink in there...
Mozilla appears to be at least 16MB (IE was 100MB when I installed everything!). Is it 4 times as functional, 4 times as reliable? Nope.
Re:Why bother ? its an excuse to write bad code (Score:1)
C was fast and efficient, why did anyone need C++?
They just burn extra clock cycles!
Ugh, under your rules, innovation would be at a standstill.
Re:Why bother ? its an excuse to write bad code (Score:2)
These steps have nothing to do with _now_. Yes, we needed machines to go from 1Hz to 400MHz or so; otherwise it was a pain in the arse. But the last 3 years have seen insanely powerful machines, and not the sort of increases in quality that could be expected.
And no-one EVER needed C++; it's a HORRIBLE language.
LISP, Smalltalk: now you're talking.
Re:Why bother ? its an excuse to write bad code (Score:1)
I, personally, have an Athlon 800, I'm a big gamer, and I'm perfectly happy with the machine, not upgrading it for at least a year...
But I'm also a developer that believes in good design and good design and good desi.... etc... and good coding techniques. Even if it sacrifices memory and horsepower.
And C++ has its ups and downs, as does any other language.
Re:Why bother ? its an excuse to write bad code (Score:2)
If Quake4 is released w/any bugs, runs slow on decent hardware (I consider a 400MHz computer decent), and is fucking HUGE (minimum req ridiculously high), then I will be sorely disappointed.
If you need to write the god damn thing in assembly to make it run fast, do it. I am sick and tired of "great" games being released that are frickin' huge and slow and require a dual Athlon to run.
I don't care if I can mod it, I don't care if you can debug it (there shouldn't be that many bugs in the first place for how much it costs), and I certainly don't care if you think it should be easy for you to program.
Freeware is one thing. A seriously high-end game should run fast and not need a dual Athlon.
If Quake4 is released it better play like Q1, or there will be yet another version that I won't play
Just my worthless 2¢.
Re:Why bother ? its an excuse to write bad code (Score:1)
Re:Why bother ? its an excuse to write bad code (Score:1)
"But Quake ran fine on my PII!" Then run Quake.
"They should make this new game run fast on my 4-year-old computer." No, you should buy (or write) games that run fast on your 4-year-old computer (try 4-year-old games). I want games that are released in my lifetime with lots of features and visual effects, so I get hardware that can run them.
And if Quake4 played like Quake1, why would they make Quake4? Especially if it ran the same on the same hardware? I think you're a sales demographic id can afford to lose.
Re:Why bother ? its an excuse to write bad code (Score:2)
Currently video cards only draw the scene you describe; until recently they couldn't even transform and light (T&L) the scene, so the CPU had to do it all. (Transform means taking the level, clipping out bits you can't see and bits that are occluded, and then transforming what's left to fit the screen in the proper perspective. Lighting then takes that plus a list of all the lights and calculates which walls are being lit.)
If you need a few hundred thousand floating-point calculations to draw the scene, you're going to need a very fast CPU to do it many times a second.
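The transform step described above is just a 4x4 matrix multiply per vertex plus a divide. A minimal NumPy sketch (the matrix and vertices are invented for illustration):

```python
import numpy as np

def transform_vertices(verts, mvp):
    """The 'T' in T&L: take model-space vertices through a combined
    model-view-projection matrix, then perspective-divide by w."""
    homo = np.hstack([verts, np.ones((len(verts), 1))])   # homogeneous coords
    clip = homo @ mvp.T                                   # ~16 mul-adds/vertex
    return clip[:, :3] / clip[:, 3:4]                     # perspective divide

verts = np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 1.0]])
out = transform_vertices(verts, np.eye(4))   # identity MVP: unchanged
```

At 100,000 vertices and 60 frames per second, that is on the order of a hundred million multiply-adds every second before lighting even starts, which is why moving this onto the card mattered.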
Quake is a graphical game, it's doing exactly what it says on the box. Now, MS Office, that's bloat.
Re:Why bother ? its an excuse to write bad code (Score:2, Interesting)
I want to use C++ and lotsa nice OOD that's easier to write, easier to read, easier to expand, easier to debug, and easier to maintain.
In theory, you should be able to write such classes so you can define one flag and the debug stuff will compile away to nothing. (or just a few extra pointers)
So the developers need good machines, everyone else doesn't.
Except some companies are shipping debug builds as their final product. I'm not sure why. (Black & White, for example, includes the debug mfc & msvcrt dlls.)
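The "define one flag" idea above is standard in C/C++ via NDEBUG. Python has a loose analogue in `__debug__`, which the compiler folds to a constant, so `assert` statements and `if __debug__:` blocks vanish entirely under `python -O`. A sketch:

```python
def checked_get(buf, i):
    # Under "python -O" this assert is compiled away, the same way an
    # NDEBUG build drops C's assert(): the checks cost something in
    # development and nothing in the shipped build.
    assert 0 <= i < len(buf), f"index {i} out of range"
    return buf[i]

value = checked_get([10, 20, 30], 1)
```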
Re:Why bother ? its an excuse to write bad code (Score:1)
OTOH, if the traders bothered to get their option pricing models written in a decent computer language rather than VBA, then yes, maybe they could run on a 256MHz P2.
Unfortunately, the banks are firing a lot of their IT staff [bbc.co.uk] because, frankly, throwing hardware at the problem is cheaper than writing the stuff properly.
Re:Why bother ? its an excuse to write bad code (Score:1)
shut up man
Re:Why bother ? its an excuse to write bad code (Score:1)
No. Good hardware is never an "excuse" for writing bloatware. Most of the time, when you refer to programs as bloatware, it's not the programmers intentionally writing "bad code" but the development environment and its associated overheads that cause the bloat. Besides, most bloatware has lots of features that you may not need, but others do.
All said and done, I don't think we are doing too badly as far as bloat is concerned... and any bloat that exists is more a reflection of the programming methodologies being used and their limitations as we scale.
Bad Post! but a good link... (Score:1)
Anyway, here [editthispage.com] is a rather illuminating article on "bloatware". Cheers.
So what is good code? (Score:4, Insightful)
With current hardware, people are still writing a lot of code in C and C++ for performance reasons, which has led to buffer overflows, segfaults, core dumps, general protection faults, and blue screens becoming generally accepted aspects of computer programming. Now that hardware is finally becoming fast enough, maybe we can wean ourselves off C and C++ and move over to writing apps in Java or even C#, instead of still dealing with issues that were solvable problems 20 years ago. Programmers have shown that it is practically impossible to deliver significantly problem-free C/C++ code in a decent timeframe, while programming environments like Java have shown the opposite. Once hardware creeps up enough, the performance gains will not be worth the number of bugs, and we can rid ourselves of the problems of C and C++; this is already happening in lots of server applications.
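The difference being pointed at here can be shown concretely. In C, writing one element past an 8-slot buffer silently corrupts whatever lives next in memory; in a bounds-checked runtime the same mistake is an ordinary, catchable error (Python is used below purely as the illustration):

```python
buf = [0] * 8

def safe_write(buf, i, value):
    """Attempt a write; report failure instead of corrupting memory."""
    try:
        buf[i] = value
        return True
    except IndexError:       # out-of-range access is caught, not undefined
        return False

ok = safe_write(buf, 3, 42)    # in range: succeeds
bad = safe_write(buf, 8, 99)   # one past the end: refused, buffer intact
```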
Also, once hardware creeps up enough, maybe some of the stuff that has been in research labs for the past 20 years can finally see some use. For instance, microkernels are generally seen as a superior way to design an OS but have had difficulty taking hold for performance reasons (although Windows NT is based on a microkernel architecture and MacOS X is also built on the Mach microkernel). This will change once hardware advances make the performance difference acceptable.
A.I. being built into applications as well as the OS is another place where hardware performance and memory availability would play a big part in helping come to fruition.
How about voice recognition and face recognition being built into the applications you use?
How about bringing virtual reality to masses?
Or do you think that a 1 GHz CPU and 128 MBs of RAM is all the power a computer user will ever need?
Re:So what is good code? (Score:3, Interesting)
Read any software QA textbook and you'll find they all agree (and experience tells you the same). How do you learn to code? It's not by being taught; it's by hacking away in a dark room somewhere. Individual coders/engineers may be incredibly skilled, but the experience doesn't get passed on, so the next generation of engineers makes the same mistakes as the last one! Personally, I split software developers into "hackers" and "engineers".
The "hacker", when given a vague problem to solve, sits down on his own and bashes out a piece of code without reference to requirements clarification, design documents, etc. It may even work - but it will be an unmaintainable nightmare, and if it doesn't work first time (or if it works sporadically) then it's over to printf and the debugger for months. Documentation, where it exists, will be written post-facto, and you'll be lucky if it explains the code properly. No-one else will be able to rework the code, and the hacker himself may not remember how it worked 6 months later!
An "engineer", OTOH, spends most of their time working in Word and a CASE package working out what they want to do and how they're going to achieve it, and runs his ideas past someone else to see whether a fresh pair of eyes can spot anything wrong. By the time the engineer goes for his favourite text editor, the problem's most of the way solved, and any bugs can be found by comparing design against code (ie. peer review). Any future changes are simple to include, as the design explains how everything works in sufficient clarity that anyone can pick it up and rework it.
A really good engineer (and I'm not one, yet
I've not run Netscape 6 for more than a few hours total, and it's already crashed on me more than once. Java is no magic bullet. Sure, there are some ways C will let you kill things that Java doesn't let you do. But coding standards such as MISRA define "safe" subsets of C, and by following them you will minimise the risks. Is it better to be coding in C, knowing how to avoid the problems, or coding in Java without knowing about any pitfalls? And as for timescales, Netscape are hardly a shining example, are they?
For a typical user running typical productivity software, a 300MHz CPU and 128MB of RAM is all they'll ever need. More power will only be required for a new "breed" of programmes - maybe the Metaverse, maybe not. But your typical home computer user will not require any more processing power until a new killer app comes along. OfficeXP is not that killer app.
Grab.
Re:Why bother ? its an excuse to write bad code (Score:1)
Linux 2.4.10 doesn't have the huge swap requirements of the older kernels. I went from using 500MB of swap per node in my cluster to using 50MB of swap running a CFD code, just by upgrading the kernels to 2.4.10 (512MB of memory per node).
-asb
Re:Why bother ? its an excuse to write bad code (Score:2)
But your point is valid, if the current software weren't using all the new speed, we might already be there. (Well, for most things. Crypto cracking will still use an infinite amount of CPU...)
Re:Why bother ? its an excuse to write bad code (Score:2)
Firingsquad reviews dual durons vs thunderbird (Score:2)
Damn! Slashdotted! (Score:4, Interesting)
Twice the burned-out CPUs? (Score:1, Troll)
Re:Twice the burned-out CPUs? (Score:2)
If you're really really scared, get one of the heatsinks that bolts onto the motherboard instead of clipping onto the socket.
Re:Twice the burned-out CPUs? (Score:3, Informative)
Other CPUs are also very sensitive. What's rather surprising is how well Intel's P4 thermal shutdown works. I suspect AMD will get around to doing something similar. But in the meantime, I've attached a nice quiet (3800 RPM, not the 7200 RPM version) ThermoEngine to my Thunderbird, and it cruises at around 100 degrees F. Some newer/bigger heatsinks bolt to the motherboard, rather than clip on to the socket, which I suppose helps if you're really paranoid about its falling off. I use Motherboard Monitor to keep track of the temp via the Win98 system tray, and wish Linux distros would include similar capability out of the box (yeah, I know there's a way to build it in yourself...).
But then I do admit to using a 1 GHz Tbird rather than a faster one because I don't want that excess heat or power consumption.
Re:Twice the burned-out CPUs? (Score:2)
The new Athlons (XP and MP) have thermal sensors on board according to AMD's site. I still can't find any information indicating whether/how they actually use these though.
Re:Twice the burned-out CPUs? (Score:2)
Yes, AMD chips will incinerate themselves if the heatsink falls off. But funny, you don't see many people saying this has happened. Yes, it has happened to a few, but honestly, I'd rather get the higher performance for my dollar and risk having to replace the CPU if the heatsink fell off, something very unlikely. And if it did, the replacement CPU would be pretty cheap given how prices on processors fall over just a few months! The total cost would STILL probably be cheaper than an equivalent Pentium 4 system (not CPU, system). Hell, my 1GHz Athlon has been chugging along for months and the heatsink is still on solid!
Re:Twice the burned-out CPUs? (Score:4, Insightful)
Seriously here, you are missing out if this kind of thing actually sways you away. The biggest flaw, IMHO, is that the AMD cores chip way too easily. I would really like a coating of nickel or copper like the Intel chips have. As an early adopter of the Chrome Orb (rev 1), the hard part was safely getting the heatsink on.
I've found that an AMD CPU will give you warning signs like lockups, kernel panics, and other goofy things when you lose a fan. My mainboard will shut down 5 seconds after POST if the CPU fan is not spinning fast enough! Since they are good up to ~100C, using a motherboard-monitor program will go a long way toward making sure it runs safely and shuts down before it gets into deep weeds. A copper heatsink also goes a long way toward passive heat removal in an emergency situation.
This is like buying a car based on how well it runs without oil in the engine. I suspect my BMW would make for a fantastic video if I tried that too. DON'T DO THAT! I would not pay extra for an engine that would survive it, like using synthetic oil to get an extra two minutes of use.
Buying a CPU that throttles back and paying extra for it -- that might be insurance, but I stopped buying retail boxed CPUs with the three-year warranty... It would cost me more to ship an old 400MHz CPU back to Intel than to just replace it these days. I paid $99 USD for a 1.4GHz CPU a couple weeks ago. At that price, these things are practically disposable.
Re:Twice the burned-out CPUs? (Score:1)
IIRC, the main problem with the AMD processors was that they would burn out in around 3 seconds in the (unlikely, I know) event that the heatsink fell off. Another point was that the plastic tabs the heatsink was clipped to weren't particularly strong, so it perhaps wasn't as unlikely as one might think.
john
Re:Twice the burned-out CPUs? (Score:1)
So, where's the problem?
Re:Twice the burned-out CPUs? (Score:2, Funny)
Re:Twice the burned-out CPUs? (Score:2)
The case fan died on my box, and because I run dual SETI at night, the machine heated up and started beeping... it woke me up, but I found that the box had shut itself off.
Interesting, this. (Score:2, Interesting)
Too bad that IT managers go with what they know (everyone else is using) and what's worked for them in the past.
It may be confusing for Jane Consumer, but it's nice to see that AMD's finally gotten a marketroid with a clue as to what works. Now if only their stock would start working, too...
Man, and I just built a dual 1.2ghz.... (Score:3, Informative)
A few notes on the Tiger MP though: VERY picky about RAM, and very picky about how it's seated (read: install memory before the board is in your case, so you can wedge it in on a flat surface!), but since getting past that, it's been ROCK solid! Beautiful system, I must say!
MadCow... always 500mhz behind the curve.
Re:Man, and I just built a dual 1.2ghz.... (Score:1)
Thanks for the tip on the Tiger MP. I bought high-quality memory, so it should work (when it arrives).
Re:Man, and I just built a dual 1.2ghz.... (Score:2)
Compare that to Intel. Over the past 3 years we've had Socket 7 (pentium), Socket 8 (pentium pro), Slot 1 (p2/p3/celeron), Slot 2 (xeon), Socket 370 (p3/celeron), Socket 423 (p4), Socket 478 (p4)...
Re:Man, and I just built a dual 1.2ghz.... (Score:2)
All Tyan lists is "supports two Athlon MP processors"... no frequency range, like most other motherboards out there. It'd be great if I could drop in two 4GHz processors next year when the next bloatware OS slows my system to a crawl!
However, back in the real world, I'm now ripping MP3s (at 12x speed+), running Seti@home at full speed (realtime priority, just for fun), surfing the web, and running Komodo/Mozilla, and still only at 70% CPU usage... it's not like I need more power right now! q:]
MadCow.
Re:Man, and I just built a dual 1.2ghz.... (Score:2)
It is using basically all of one processor's time, but none of the other's. When my computer is "idling" with just Seti running, it's at exactly 50% CPU usage.
MadCow.
Roadmap (Score:3, Informative)
About the naming (Score:2, Interesting)
I really think AMD will have to expect some problems with this. Back in the good old days (r) of the Pentium and the Cyrix 6x86, I worked in a computer store and we also sold Cyrix computers to customers that didn't want to spend too much money (so sue me).
Very often people came back because they saw that their Cyrix PR200+ wasn't actually running at 200MHz and demanded a refund (which they didn't get, of course). We had to explain the whole thing and it cost us a lot of time.
That's why we stopped selling them back then.
Another thing is that the semi-geeks (the dudes that THINK they are geeks but basically know nothing) won't buy them because "they are already overclocked".
Expensive heat death? (Score:1, Troll)
Tom's Hardware [tomshardware.com] notes that the AMDs can cook really fast, beyond the ability of the motherboard sensor to flag. I guess these have on-die sensors, but those were noted as being fairly ropey as well.
Intel's P4 seemed to do quite well out of the test, as the clock slows automatically as the die temperature increases (in effect the processor ignores clocks until the temperature goes back to reasonable). This means it will even run without a heatsink (but very slowly).
I just get very nervous about having high-end silicon that is vulnerable to a SPOF. If a heatsink detaches or the processor fan fails: blam. If the chassis fan fails, at least there is some chance of a shutdown, but those processor heatsinks make me uncomfortable. Yes, I know I can buy quality, but MTBF is just that; a fan can still fail early.
So I wait for AMD to get a bit more serious about thermal protection and stick with using cheaper processors as thermal fuses.
Re:Expensive heat death? (Score:1)
Frankly I think people are being just a little too paranoid about this whole issue. It's like monitor implosion. Possible != likely.
Re:Expensive heat death? (Score:5, Funny)
For your convenience, here is a list of other things you should avoid buying because they have "fatal flaws":
Re:Expensive heat death? (Score:1)
At nearly twice the clock speed, those Athlons could still run quite a bit hotter than my lowly Duron, I suppose. I would still expect that a hardware monitor set to watch fan RPMs or processor temperature would catch a failure in time. Just don't set the threshold at 149 deg. F; if it's above 125, something is already wrong.
BTW, exactly what do you do to your computer that could detach the heatsink? Most heatsinks (unless you buy quality) can be a pain in the butt to detach even when you want to detach them.
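The threshold logic above is simple enough to sketch. This is a hypothetical watchdog outline: `check()` and the two cutoffs are my own illustration, not any real monitoring utility's API; a real setup would get the reading from the board's sensor tools (lm_sensors on Linux, for instance).

```python
# Hypothetical CPU-temperature watchdog sketch (illustrative, not a real API).
WARN_F = 125.0  # "if it's above 125, something is wrong"
HALT_F = 149.0  # shut down well before the die cooks

def check(temp_f):
    """Classify a CPU temperature reading in degrees Fahrenheit."""
    if temp_f >= HALT_F:
        return "halt"   # power off now, before the silicon is damaged
    if temp_f >= WARN_F:
        return "warn"   # alert the user; throttle if the board supports it
    return "ok"

print(check(110.0), check(130.0), check(160.0))  # -> ok warn halt
```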
Athlon XPs HAVE THERMAL DIODES (Score:2)
Processor Idea (Score:4, Funny)
Imagine an x86-compatible processor that runs at a clock speed of 50GHz. That's right, fifty BILLION hertz! Now, that clock only ever hits a counter that lets the 8086-compatible processor cycle once every half to full second. You could get a whopping 1-2 IPS.
You'd be able to make millions selling 8086s that use the first 640K of a bunch of 128MB chips, and the first 40MB of a 400GB hard drive. Think of the possibilities!
to MP or not to MP? (Score:1)
Can anyone confirm this? Is this new, higher-priced series of Athlon MPs simply a marketing gimmick, a la NVIDIA's Quadro cards? (Those are the same as a GeForce hardware-wise -- save one tiny resistor that tells the driver to un-cripple certain optimizations -- but cost 2-3 times as much as a GeForce.)
Re:to MP or not to MP? (Score:1, Informative)
The chance of a dual XP or dual Duron setup not working is infinitesimally small.
Re:to MP or not to MP? (Score:2, Informative)
SMP-enabled and mistakenly shipped. AMD supposedly will be disabling SMP in the XPs very soon.
Re:to MP or not to MP? (Score:2)
Another preview on Tech Report (Score:2, Informative)
They only compare against the 1.2GHz Athlon MP, though, although they intend to do an expanded article soon.
Athlon MP restricted by AMD760 mobo (Score:3, Insightful)
I think the Athlon MPs are awesome, but having a much cheaper, single-processor setup beat out a dually in some tests throws a bit of cold water on my upgrade lust.
Re:Athlon MP restricted by AMD760 mobo (Score:3, Informative)
If you're running tasks/benchmarks that aren't CPU bound, multiple CPUs won't do you any good. If you're running multithreaded apps or multiple single-thread apps, multiple CPUs are a Good Thing, and two AthlonMP 1800+ CPUs will outrun a single AthlonXP 1800+ on a KT266A motherboard. Linux kernel compiles, fr'instance.
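To put rough numbers on that: a second CPU only pays off to the extent the workload parallelizes, which Amdahl's law captures. A sketch (the 0.95 parallel fraction for a `make -j2`-style kernel build is an illustrative assumption, not a measurement):

```python
# Amdahl's law: speedup on n CPUs when fraction p of the work parallelizes.
def speedup(p, n_cpus):
    """p: parallelizable fraction of the workload (0.0..1.0)."""
    return 1.0 / ((1.0 - p) + p / n_cpus)

print(round(speedup(0.95, 2), 2))  # -> 1.9  (a mostly parallel build)
print(speedup(0.0, 2))             # -> 1.0  (single-threaded app gains nothing)
```

This is why the dual box wins big on kernel compiles and multi-app loads but shows nothing on a single-threaded benchmark.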
Re:Athlon MP restricted by AMD760 mobo (Score:3, Interesting)
The problem is, there will ALWAYS be a bottleneck, no matter what you are dealing with -- whether it's the Internet, the computer's memory subsystem, or the traffic on the way to work. Once we make one thing faster, it shows that another isn't quite up to par, so that becomes the next thing that needs to be worked on. Whether it be the mobo manufacturers, the processor manufacturers, the wonderful people who lay that precious fiber optic cable, or the road crews that interrupt my morning commute to work.
Things like the nVidia nForce chipset are (at least IMHO) going to advance computer technology even more than a newer, slightly faster processor. Why? Because of bottlenecks such as this memory issue we are seeing with the Athlon MP SMP systems.
Granted, we have to give them some credit. When the Via chipsets were first released, their memory bandwidth was HORRIBLE -- even to the point that it was better to stay with the BX chipset over upgrading to the newer Via 133 chipset. But that has been fixed for the most part through things as simple as BIOS updates.
There is a lot to a computer system, and there is a lot that makes it function properly. If I had time, I would get into the bandwidth limitations between the northbridge and the southbridge, the interactions between an SMP system and the different caches available to each processor, and their bandwidths/latencies, etc.
/pointless blabbering
- Ice_Hole
AMD (Score:1, Redundant)
I do think they should provide a more accurate "instructions per second" rating rather than relying on Intel as the benchmark for their rating.
P4 2GHz runs at 4GHz (Score:1)
What about Dual Durons? (Score:4, Interesting)
Get it? Get it! (Score:4, Insightful)
Is there any software currently available that requires this kind of speed? Nope.
Is there any sensible reason to upgrade your CPU? Nope.
Is my rational, analytical mind paying the slightest bit of attention to this argument? Nope.
It's all about the megahertz, baby! In an earlier generation, we were the people tinkering under the hoods of our Fords, trying to get a little more oomph out of a carburetor. Most of us don't need it, most of us have no idea what to do with it, but since when has that ever stopped us? More speed! More storage! More bandwidth! I want more!!!
Good job, AMD. Keep 'em coming.
My id is sneaking up behind my superego with a rock...
Things You Need To Know About Dual Athlons (Score:5, Informative)
Big speed boost? (Score:2, Interesting)
We're talking about a clock speed increase of less than 10% -- in how many months? It's been over a year since the T-birds were introduced.
Yeah, they have a new core. Whoopee. It's not a dramatically new improvement, and apparently AMD has decided that if its chips, in name, are as fast as P4s, they should cost as much too.
I like AMD stuff, but the MHz Myth shit hasn't worked for Apple, ever, and it won't work for AMD. Apple tried the MHz Myth stuff back when the PPC 601 came out, and despite 6 or 7 years of PR bunko, it's never caught on.
XP won't remain MP (Score:1)
"The initial batch of Athlon XP chips shipped out to distribution were unlocked, and this was not supposed to happen. Within a week or two, these unlocked CPUs will be phased out, or recalled. I'm not sure what will happen, but AMD has confirmed that the Athlon XPs will be locked very, very soon.
Some of you are lucky to have snagged a few Athlon XPs that were unlocked."
-Rothfuss
Anandtech's review (Score:2, Interesting)
Hammer time! (Score:2)
Chip Speed and Bus Speed (Score:3, Informative)
1. Bandwidth - face it, email and the web are king. Unless you're a gamer.
2. Video Card - if you're a gamer, you're better off spending your money on this and making sure it has tons of on-board memory.
3. Sound Card - if you're a gamer, you're better off spending the rest of your money on this. The rest of us don't care, so skip this.
4. Memory - more, more, more. Yes, even more.
5. Bus speed - more bandwidth, so those CPUs can actually move more data.
6. Hard disk - you really should have more RAM, but once that's crammed, get better seek and access times here.
7. Chip speed - WAY DOWN HERE! - yes, if you maxed out all the above, then you MIGHT notice the difference between a 1GHz and a 1.8GHz system. Otherwise, unless you're a graphics artist, YOU SHOULDN'T WASTE YOUR MONEY!
Naturally, when people review systems, they compare older systems with slower bus speed, less RAM, slower HD, and cheaper cards to new systems with faster H/W. Buy the motherboard and cards yourself and pop in a slower chip and spend the extra money on RAM - you will get way more bang for your buck that way.
Aside - I own AMD shares, so sure, go buy these speed demons! But don't do it because you have to, do it because you know you just like BIG NUMBERS.
Is it still about the MHz? (Score:2, Insightful)
Seriously, I think we all agree here that AMD is making a bold and necessary move to diminish the importance of MHz. Unless we follow suit and stop using MHz as our measure of performance, the public will never catch on. I think the importance of attaching a "model number" to a chip name is that we will eventually forget about MHz altogether and focus on pure chip performance. Let's start that now.
The MHz equivalents for each of these new processors had no place in this article.
Otto-matic
Re:Truth in labeling (Score:1)
Intel are making faster (Mhz) chips rather than faster (actual processing) chips.
The simple fact is that the 1800+ is FASTER than a P4 at 1800MHz would be; rather than being a gimmick, they are erring on the side of caution.
You are right, most customers would mistake it for a MHz rating -- just as, if they had called it an Athlon 1533, customers would pass it over for having less "MHz." It works both ways, and AMD has to deal with the customers.
Re:Truth in labeling (Score:5, Informative)
From your tone I'd expect you wouldn't buy AMD anyway. However, if you did any research, you'd find that AMD's new numbering plan is actually conservative. Independent benchmark reviews have shown that the AMD 1800+ is actually more of an equivalent to the Pentium 4 2GHz chip, but AMD chose a conservative threshold. Granted, the new Intel cores will boost performance a bit, but even then the AMD numbering plan is expected to be on target. Honestly, who cares what they call the chip? Anyone with half a brain can find out the MHz value. But to what end? Me? I want to buy the system which gives me the most performance for the least $$$, and right now that is an AMD chip, hands down, when you account for other CPU-specific system costs and impacts (chipset, memory type needed, etc.).
I honestly think AMD did what it HAD to do: their chips are faster at slower clock speeds, and Intel managed to get folks thinking MHz was king. Now AMD has to try and change that thinking.
Re:Truth in labeling (Score:2, Informative)
To me, this is an apples vs. oranges analogy. On one hand, we have the apples, who are the auto and motorcycle enthusiasts. On the other hand, we have the oranges, or the vast uninformed PC-buying public walking into CompUSA and Circuit City stores. Two completely different species.
Intel's ability to con the public into buying into the MHz game is abhorrent, at best. They manufactured an inferior processor, the P4, basically to outmatch AMD in numbers. Intel knew that AMD wouldn't be able to ramp up their Athlons to the same level within a reasonable amount of time. The P4's inferiority is backed up by the fact that P3s outperform P4s MHz-for-MHz.
I feel that AMD's new effective/relative performance ratings are justified in this case, especially since the numbers are realistic (as opposed to their 486/K5 series or Cyrix's CPUs). If Intel wants to bloat numbers, AMD has to catch up in the marketing game in order to survive in this industry. People are walking into the major retail stores and being convinced by salespeople that the P4 systems are better and just as cheap (only because they bundle inferior components such as nVidia TNT2 graphics cards and generic sound cards to reduce the price) as an Athlon-based system. The regular Joe Blow will see a bigger MHz number and an affordable price, which is the killer combination.
The Linux/hardware enthusiasts are by far a minority in the PC market. Thus, the battlegrounds look ugly to those who are more informed, but I'm sure they look even worse within the buildings of Intel and AMD. It's a dog-eat-dog world.
Re:Truth in labeling (Score:2, Funny)
- Freed
Re:Ouch, Looks Like We Broke Their Website (Score:2)
Re:Why NOT to use amd.. (Score:1)
Since Tom (of Tom's Hardware) started accepting bucks from Intel, his reviews have certainly taken on a pro-Intel flavor!
Re:Why NOT to use amd.. (Score:1)
Re:Why NOT to use amd.. (Score:1)
Duuuude, it's got an Intel Pentium IV processor (which is soooo nice!).