AMD Athlon MP 1800+ Processor Review
Lars Olsen writes: "Amdmb.com has posted a review of the new AMD Athlon MP 1800+ processor -- a big speed jump for the dual Athlon processor family, with the new processor running at 1.53GHz. 1600+ and 1500+ Athlon MPs are also available right away at stores around the world.
Dual AMD Goodness is now running just as fast as its desktop counterpart! Here's a quote: 'Those of you who want to jump into the dual processing Athlon world will finally be able to do so with the knowledge that your processors are the top speed that the Athlon family has to offer. And for anyone who already has a Tyan Thunder or Tiger MP board and a pair of Athlon MP processors, you may just want to pop a couple of these new Athlon MP 1800+ CPUs in your system to boost performance.'" Some of the comments following yesterday's "dream system" article addressed dual-Athlon complications, so make sure you read before you buy. Update: 10/15 15:14 GMT by T: Check below for LinuxHardware.org's take on this chip, and on Athlon MP systems in general.
Augustus writes "LinuxHardware.org takes a look at the Athlon MP platform under Linux, including the newly released Athlon MP 1800+. The article covers not only the technology and performance of the AMD-760 MP chipset and the Tyan Thunder K7 motherboard, but also why anyone would consider a multi-processor system."
Re:Why bother? It's an excuse to write bad code (Score:3, Insightful)
High-speed CPUs are very useful to our clients who run large database implementations with voice-recognition data-entry systems, FYI.
Re:Why bother? FOR THE GAMES, SILLY (Score:2, Insightful)
-Berj
Re:Why bother? It's an excuse to write bad code (Score:5, Insightful)
New, faster technology is being brought out just to make programmers dumber. It's an evil conspiracy against us all!
Seriously, though, what is your definition of "bloatware"? Let's say I'm writing Quake4. I want to use C++ and lotsa nice OOD that's easier to write, easier to read, easier to expand, easier to debug, and easier to maintain.
Is that "bloatware"?
Sure, I coulda used assembly on the whole thing and it woulda been efficient and fast! You wouldn't need the super hardware!
Hope you don't want to mod it, or me to fix any bugs, though.
Maybe we developers like faster systems so we can implement software with better techniques and make technology grow? Sure, it requires a little more hardware, but I wouldn't call it some evil conspiracy.
It doesn't matter what technology is out there, there will always be crap (bloatware).
BTW - You might want to buy this shirt [thinkgeek.com].
Re:Why bother? FOR THE GAMES, SILLY (Score:3, Insightful)
But the kicker is that these games really don't need such horsepower. I'm willing to bet that if there were any pressure to get any of these games running on a more resource-constrained system, like a game console, then lots of unnecessary internal fat would be trimmed right away. But there's no pressure to do so otherwise. And even if a game that could run just fine on a PII 400 requires a 1GHz processor, certain people seem to _like_ the justification for upgrading.
Re:Why bother? It's an excuse to write bad code (Score:3, Insightful)
Things like SOAP are a classic example. CORBA is a perfect way to get computers communicating: it uses IDL to describe the services, works on any platform, and uses a binary protocol which can be tunneled via HTTP if required.
SOAP is an ASCII-based RPC mechanism -- when was that a good idea? So you can _read_ computer-to-computer transactions? This is possible because we have cycles to burn, and so doing two (or more) sets of textual conversion isn't seen as a bad thing(tm).
Outlook, Netscape 6,
XEmacs used to be considered the world's largest piece of bloatware... it's 4.2MB, and it's got email, news, a web browser, an editor, a Mayan calendar, and the kitchen sink in there....
Mozilla appears to be at least 16MB (IE was 100MB when I installed everything!). Is it 4 times as functional, 4 times as reliable... nope.
Re:Why bother? FOR THE GAMES, SILLY (Score:5, Insightful)
The two environments are very different, and most of that fat can't be trimmed by wishing it away or blaming it on programmers.
As for bloatware, start modelling cloth, hair, IK, bump maps, and the hardware gets used again. The reason the games aren't doing it now is because they want the comfortable sales window.
Honestly, pushing ultra-high-end features that cut your market to 4% of what it could be isn't a big selling point -- good luck convincing your publisher to bring the game to market -- and trying to build an engine that scales between low end and high end makes the PC-vs-console bloat problem even worse.
So what is good code? (Score:4, Insightful)
With current hardware, people are still writing a lot of code in C and C++ for performance reasons, which has led to buffer overflows, segfaults, core dumps, general protection faults, and blue screens becoming generally accepted aspects of computer programming. Now that the hardware is finally becoming fast enough, maybe we can wean ourselves off C and C++ and move over to writing apps in Java or even C#, instead of still dealing with the same issues that were solvable problems 20 years ago. Programmers have shown that it is practically impossible to deliver significantly problem-free C/C++ code in a decent timeframe, while programming environments like Java have shown the opposite. Once hardware creeps up enough that the performance gains are no longer worth the number of bugs one has to deal with, we can rid ourselves of the problems of C and C++ -- which is already happening in lots of server applications.
Also, once hardware creeps up enough, maybe some of the stuff that has been in research labs for the past 20 years can finally see some use. For instance, microkernels are generally seen as a superior way to design an OS but have had difficulty taking hold for performance reasons (although Windows NT is based on a microkernel architecture and MacOS X is also built on the Mach microkernel), which will change once hardware advances make the performance difference acceptable.
A.I. being built into applications as well as the OS is another place where hardware performance and memory availability would play a big part in helping come to fruition.
How about voice recognition and face recognition being built into the applications you use?
How about bringing virtual reality to masses?
Or do you think that a 1 GHz CPU and 128 MB of RAM is all the power a computer user will ever need?
Athlon MP restricted by AMD760 mobo (Score:3, Insightful)
I think the Athlon MPs are awesome, but having a much cheaper, single-processor setup beat out a dually in some tests throws a bit of cold water on my upgrade lust.
Re:Twice the burned-out CPUs? (Score:4, Insightful)
Seriously here, you are missing out if this kind of thing actually sways you away. The biggest flaw, IMHO, is that the AMD core chips way too easily. I would really like a coating of nickel or copper like the Intel chips have. As an early adopter of the Chrome Orb (rev 1), I found the hard part was safely getting the heat sink on.
I've found that an AMD CPU will give you warning signs like lockups, kernel panics, and other goofy things when you lose a fan. My mainboard will shut down five seconds after POST if the CPU fan is not spinning fast enough! Since they are good up to ~100C, using a motherboard-monitor program will go a long way toward making sure it runs safely and shuts down before it gets into deep weeds. A copper heat sink goes a long way toward passive heat removal as well in an emergency situation.
This is like buying a car based on how well it runs without oil in the engine. I suspect my BMW would make for a fantastic video if I tried that too. DON'T DO THAT! I would not pay extra for an engine built to survive it -- like using synthetic oil to get an extra two minutes of use.
Buying a CPU that throttles back, and paying extra for it -- that might be insurance, but I stopped buying retail boxed CPUs with the three-year warr.... It would cost me more to ship an old 400MHz CPU back to Intel than to just replace it these days. I paid $99 USD for a 1.4GHz CPU a couple weeks ago. At that price, these things are practically disposable.
Get it? Get it! (Score:4, Insightful)
Is there any software currently available that requires this kind of speed? Nope.
Is there any sensible reason to upgrade your CPU? Nope.
Is my rational, analytical mind paying the slightest bit of attention to this argument? Nope.
It's all about the megahertz, baby! In an earlier generation, we were the people tinkering under the hoods of our Fords, trying to get a little more oomph out of a carburetor. Most of us don't need it, most of us have no idea what to do with it, but since when has that ever stopped us? More speed! More storage! More bandwidth! I want more!!!
Good job, AMD. Keep 'em coming.
My id is sneaking up behind my superego with a rock...
Is it still about the MHz? (Score:2, Insightful)
Seriously, I think we all agree here that AMD is making a bold and necessary move to diminish the importance of MHz. Unless we follow suit and stop using MHz as our measure of performance, the public will never catch on. I think the importance of attaching a "model number" to a chip name is that we will eventually forget about MHz altogether and focus on pure chip performance. Let's start that now.
The MHz equivalents for each of these new processors had no place in this article.
Otto-matic