AMD

AMD Opteron "Hammer" Preview

Melvin Tong writes "Hardware Extreme has posted a preview of AMD's 8th-generation processor that AMD is currently developing with a few exclusive pics of the mechanical sample. AMD Athlon processors based on Hammer technology are expected to ship in the forth quater [sic] of 2002. The preview is located over at HW Extreme."
  • It's a pretty safe bet such systems are already in use in some TLAs; we just don't know about it. Opteron/Hammer will be a nice step forward, but obviously it's not meant as competition to big iron from Sun and others for quite a while, if ever. AMD has years and years to go before it can enter such markets successfully. They are relatively small and have to concentrate on the desktop and small to mid-sized servers for the
    foreseeable future; they have yet to prove themselves in the higher end, attract the appropriate support, and build an image they don't yet have compared to Intel, Sun or IBM.

    It also looks like they have their work cut out for them already, with the clock ramp-up on 0.13 micron not as fast as expected and a tight schedule to get the Hammer line done properly, as they are pretty strapped for cash compared to Intel, while the latter seems to have no trouble increasing clock speed all the time (by throwing huge gobs of money at it, of course).

    • There are benchmarks out for this same engineering sample from several months ago (actual ones, not "well, here's that same test on a 6-year-old Celeron" benchmarks), and it is quite fast. If I recall correctly, the 800 (which is now several months old) beat the P4 1600. It doesn't seem too impressive until you also realize that the 1 GHz Athlon chips were *just* edging out the 1.4 GHz P4 chips in some applications
  • dup dup * . (Score:4, Funny)

    by Pig Hogger ( 10379 ) <pig.hogger@g[ ]l.com ['mai' in gap]> on Tuesday August 20, 2002 @10:32PM (#4109139) Journal
    expected to ship in the forth quater of 2002.
    Is the Opteron a stack machine????
    • Well, it's faster than a stack of Itanics, anyway.

      I was going to make a crack about it running Postscript real well, but I thought it might be too obscure...

      • I was going to make a crack about it running Postscript real well, but I thought it might be too obscure...

        I suspect more people are familiar with PostScript than Forth...

        /Times-Roman findfont 36 scalefont setfont  % select Times Roman at 36 points
        72 720 moveto                               % 1" from the left, 10" up from the bottom
        (Hammer Time) show                          % paint the string at the current point
        showpage                                    % print and eject the page

        (I usually use "This is a test," but seeing as this article is about certain microprocessors with a tool as their codename...)

        A basic knowledge of PostScript is useful to tell if a printer or a print server is running properly. By comparison, does anybody use Forth for anything? (I downloaded a couple of Forth interpreters for the Apple II years ago, but never got around to doing anything with them.)
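
        For what it's worth, the quick way to run that check is to throw a tiny PostScript job straight at the printer's raw port. A minimal sketch in Python; the hostname is a made-up placeholder, and port 9100 (the common raw/JetDirect convention) is an assumption about your print setup:

        import socket

        PRINTER_HOST = "printer.example.com"  # hypothetical hostname, substitute your own

        # The same test page as above, sent as a raw PostScript job.
        PS_JOB = b"""%!PS
        /Times-Roman findfont 36 scalefont setfont
        72 720 moveto
        (Hammer Time) show
        showpage
        """

        # Port 9100 is the usual raw/JetDirect printing port; if the page
        # comes out, the printer and the path to it are both working.
        with socket.create_connection((PRINTER_HOST, 9100), timeout=10) as s:
            s.sendall(PS_JOB)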

  • I particularly liked this point:

    "The AMD Opteron is designed to be scalable, reliable and compatible, which can result in lower total cost of ownership."

    Gee, the whole article sounds like a lame press release. I want the real low down, not a marketing piece!
    • Too... many... buzzwords... head... hurts...
    • by Anonymous Coward
      And surprise, surprise: hit the next button on page 4 and you go to pricegrabber.com searching for the Athlon XP 2200+. Real professional.
      • Yeah, that was just damn sad. I mean, on the first page it had "CLICK HERE TO SEE PRICES ON AMD PROCESSORS!!!!" and I thought that was shameless advertising exposing an obvious bias... But then they made the last page actually BE an advertisement... That's god damn pathetic. Both Melvin Tong and Hemos should be ashamed of themselves for promoting such tripe.
    • Gee, the whole article sounds like a lame press release. I want the real low down, not a marketing piece!

      You may want to avoid Hardware Extreme, then. I knew I shouldn't even have bothered reading this one. I've never read a thing there that wasn't "written" in the same style (i.e., copy a bunch of press releases and call yourself a hardware site).
    • Let's just hope these AMD marketeers do a better job advertising the Opteron than they did with the Athlon and K6-2. If I were one of the mindless (sorry) mainstream users who don't read Slashdot and buy factory-built computers, I wouldn't know about AMD. I would only know about Intel processors, and I'd be brainwashed by those guys dancing in white space suits and head-banging aliens into believing P4s are the best chips out there. AMD really needs to crank out television advertising and get endorsements from computer manufacturers (displace the Intel jingle at the end of every Dell and Gateway commercial), especially if they beat Intel in releasing a 64-bit processor.
      • Duh, why do you think AMD's processors are better (cheaper)? Maybe because they don't waste money on advertising. I frankly don't care what the "mindless mainstream users" use, or pay. I don't want to have to pay extra just to get mainstream users to convert. I don't care if AMD is successful as long as I'm getting a deal.
        • Hopefully you do realize that once AMD goes down, Intel will most likely jack prices way, way up.

          There are way more "mindless mainstream users" than techies, and without them, there's no AMD. I'd rather have them advertise more, charge me an extra $5 per chip, and develop a better, faster chip through competition that would've cost me an extra $50 if AMD weren't around.
      • Do you know that Intel spent roughly 700 million on advertising last year...? Do you realize that AMD made 2.7 billion (that is, before expenses)? Add to that how Intel is adding more than 200 million more to their ad budget this year to fight against Hammer (by 'helping' white-box vendors & some other new monopolistic ways of advertising)...

        I'd love to see AMD ads more often, but let's be a bit more realistic... Intel spends almost a third of AMD's total yearly earnings on advertising... AMD couldn't keep up with that unless they were much bigger...
      • AMD was only cheaper for non-mainstream users who purchased their own components from the grey market and assembled them. If you look at AMD's 1,000-unit pricing, they are lower than Intel's but still quite comparable. Last time I checked, in June, the discount of street prices to OEM prices was 40% for AMD vs. 5% for Intel. If AMD had better control over their channel, you would be singing a very different tune. Also, if AMD advertised more, pricing would be quite similar, and they would probably take more control over their top-tier buyers, similar to Intel.
        Yes, Opteron will be cheaper than Itanium, but it might only be significantly cheaper for those who assemble their own systems.
  • This article just reiterates stuff that we've all heard before about the Hammers. However, there is one new piece of interesting information: if the pictures are to be taken at face value, the Hammers will finally get the heat spreaders that the P4s have had for a long time. Don't get me wrong, I'm an AMD fan all the way, but Athlons (aka fires waiting to happen) have needed these for a long time.
  • Tough crowd.
  • not much there.. (Score:3, Insightful)

    by kesuki ( 321456 ) on Tuesday August 20, 2002 @10:38PM (#4109167) Journal
    A picture of a pure copper CPU mock-up, and then a picture of an evaluation Opteron. And about four pages of months-old regurgitated AMD press releases. I wouldn't really consider this news, since AMD has been showing off the evaluation chip for a few months now.
  • by glassware ( 195317 ) on Tuesday August 20, 2002 @10:38PM (#4109169) Homepage Journal
    This article has nearly all the technical specs, except benchmarks [theinquirer.net]. Sightings of Opteron/Hammer chips have been sparse for a while. When actual results show up in the SPEC CPU2000 listings [spec.org], that's when the chip will finally be ready for market.

    As a side bonus, you can find SPEC benchmarks for Itanium and Itanium IIs on that chart (search for the word Itanium - Dell and HP have both submitted results).

  • Intel (Score:2, Interesting)

    by T-Kir ( 597145 )

    Either way, it would be funny if Intel ended up having to license AMD's x86-64 technology. Even though I don't think that will happen, I suspect Intel would rather fork the 64-bit platform with their Itanic (part 2) than license from AMD... but you never know!

    • Re:Intel (Score:3, Interesting)

      by Shadow99_1 ( 86250 )
      Actually, if you remember back a year or two ago, you'd realize they already have a cross-licensing deal with AMD, which would entitle them to use x86-64 if desired without further hassle... Of course, Intel may prefer you forget that till they need to use it...
    • Actually,

      Intel ALREADY licensed x86-64. This is one of the reasons that Yamhill (Intel's x86-64) might be happening.

      Read any (good - unlike this one) preview of the Opteron and they almost always mention this fact.

      Derek
  • This is not meant to be a troll. However, I sure as hell don't trust "sneak preview" tech specs full of typos in an article written by rumor-mongering hardware freaks half a page down from a picture of someone hitting a CPU with a giant green inflatable hammer.
    • This is not meant to be a troll. However, I sure as hell don't trust "sneak preview" tech specs full of typos in an article written by rumor-mongering hardware freaks half a page down from a picture of someone hitting a CPU with a giant green inflatable hammer.

      Why not? The specs are cribbed from the PR kit, so they're about as trustworthy as what the company itself says.

      Remember, "sneak previews" on hardware sites are like trailers in movie theaters. They're there to get you interested in the product, not to critique it.
  • Nice cap! (Score:3, Interesting)

    by (H)elix1 ( 231155 ) <slashdot.helix@nOSPaM.gmail.com> on Tuesday August 20, 2002 @10:40PM (#4109176) Homepage Journal
    I was very happy to see the nickel cap on their new CPU. After crushing a couple of AMD chips, I became very wary of removing the heat sink after a successful mounting. More so than I probably should be, but after chipping the edge off of some $100+ CPUs, I was very nervous about picking up any of the cutting-edge processors.

    I look forward to lapping the cap to a shiny mirror finish!
    • Re:Nice cap! (Score:5, Interesting)

      by Artifex ( 18308 ) on Tuesday August 20, 2002 @10:51PM (#4109220) Journal
      After crushing a couple of AMD chips, I became very wary of removing the heat sink after a successful mounting.

      No doubt. Actually, that worked to my advantage when I was trying to get Fry's to take back an Athlon XP that had gone bad... when they told me they had to test it, I was worried, because their idea of a testbed is another customer's board hooked up to crappy "PC Doctor" software, which has rarely caught transient errors in the past.

      Wouldn't you know it, though, they cracked it during mounting, so of course it became "oops, let's get you credit for that chip" instead of "we can't find a problem in 30 minutes of running crappy test software so it must not be bad."
    • Perhaps a tad off-topic, but did anybody notice on page 3 [hwextreme.com] that 3 or 4 pins are bent on the bottom left corner of the CPU? Perhaps it is a bear to pry out of the socket these days?

      • Those pins are slightly bent, but nothing extreme... I was thinking that someone just held the pins firmly and bent them, but they bend out, not in. Anyway, as long as it isn't *bent* bent, you can push them back into place.
    >I look forward to lapping the cap to a shiny mirror finish!

      Just don't sand off any DRM bits, or it's your ass in the slammer! The DMCA is watching you, punk.

    • Yeah, I accidentally broke a fan blade off of my current CPU fan. I didn't want to screw around with it, so I just broke off the blade directly opposite. That was four months ago; still running with the same fucking fan.
    • Anyone know why they would use nickel and not copper? Or even aluminium? AMD's chips have been known to heat up quickly, and I would think a nice copper heat spreader (while a bit pricey) would help a lot.
    • Much easier (and less dangerous) to install and remove than the old clip-on type. Nearly all motherboards accept them.
  • AMD (Score:5, Funny)

    by kwishot ( 453761 ) on Tuesday August 20, 2002 @10:46PM (#4109199)
    Hardware Extreme has posted a preview of AMD's 8th-generation processor that AMD is currently developing...

    As opposed to the 8th-generation AMD processor that Intel is developing....

    (/sarcasm)
    • As opposed to the 8th-generation AMD processor that Intel is developing....

      It'll be Intel's 9th generation processor that is AMD's 8th generation--after the Itanic sinks.
  • Preview??? (Score:3, Funny)

    by Spackler ( 223562 ) on Tuesday August 20, 2002 @10:48PM (#4109205) Journal
    How is this a preview? This is just a preview of the marketing docs! A poorly spelt one at that.

    -Spackler

    PS: spelt was a joke
  • pics (Score:1, Informative)

    by Anonymous Coward
    whoa.. seems that Hardware Extreme was careless with the chip.. look at the first pic.. the top side is all scratched up.. also on the third page:
    http://www.hwextreme.com/reviews/processor/opteron/page3.shtml
    the pins on the left side are bent!!
    these have got to be worth about $1000-$2000 right now (actually, $10, if it's beat up that badly)
  • by sconeu ( 64226 ) on Tuesday August 20, 2002 @10:52PM (#4109223) Homepage Journal
    From the "summary" page:
    AMD's 64-bit processors extend our long, rich history of semiconductor solutions based on customer-centric innovations.
    (Emphasis mine)

    Clearly a blatant rip-off.
  • That site might as well just point to the AMD webpage itself. It sounds as though it was written by a bunch of fluff-talking AMD corporate employees.
  • by jjn1056 ( 85209 ) <jjn1056&yahoo,com> on Tuesday August 20, 2002 @10:54PM (#4109230) Homepage Journal
    I won't bother to elaborate on what several others have already mentioned: that this is a poorly edited story pasted together from AMD press releases. The total kicker is that the very last 'next' link takes you to a page to buy some AMD Athlon chips!

    The boundary between news and advertisement gets more porous each year...
  • this picture here [hwextreme.com]... on the left hand side... eh wait a minute. Sorry, just a bent pin.
  • Look at the picture at the top of page 3 for the review, along the left edge of the chip. Some idiot bent a pin! I'd think they'd be a little less "extreme" with a top-of-the-line, unreleased processor...
  • Usually these previews are riddled with canned PR hype that may or may not be true. The fun part is seeing which promises come true when the product eventually hits the market, and which were totally off base.

    Mmmm... nostalgia.
  • So where's Intel's response to all this? Will the Hammer be of much concern to players like Sun, who also offer cheap SPARCs nowadays? How does Hammer stack up against Motorola's G4 (and what's taking the G5 so long, anyway)? I had expected a LITTLE more depth to a story like this.
    This is just a poorly cut-and-pasted bunch of marketing speak. So OK, the new Athlons will have heat spreaders... no need to waste so much space on that.
  • by wowbagger ( 69688 ) on Tuesday August 20, 2002 @11:10PM (#4109278) Homepage Journal
    I have to wonder about the lifespan of a CPU that has an integrated memory controller of any type - not just DDR, but RDRAM, or FOORAM, or NARFRAM. What happens to the family when a new RAM interface comes along?

    Now, for high-integration CPUs designed for embedded style apps I can see it, but for a main-line CPU it seems to me that tying the memory controller to the CPU limits the lifespan of the design.

    I realize that should POITRAM become the new speed king that the RAM controller block of the CPU can be redesigned, and I understand that putting the RAM controller in the chip can increase the memory bandwidth to the CPU.

    But it does cause me to think....
    • That may be, but if you take a look at some of the serious articles on memory & clock latency (from the CPU's perspective), you'd realize why they are adding the memory controller where they are. A 'normal' SDRAM memory controller on a VIA or AMD motherboard, for instance, can easily take 70+ CPU cycles before returning the required data... So unless the CPU has other data to process (which fits into the cache), it just sits there till it has the data requested... A CPU with a built-in memory controller of this sort (especially if they allow tolerances for faster-rated memory within the existing class) could lower the latency down to, say, 6 cycles...

      This is great for memory-intensive & system-intensive tasks (from gaming to high-demand servers)...
      • A CPU with a built-in memory controller of this sort (especially if they allow tolerances for faster-rated memory within the existing class) could lower the latency down to, say, 6 cycles.

        Not to be argumentative, but this IS slashdot.

        I'd like to see these "serious articles" about memory and clock latency that say that moving a memory controller from off chip to on chip will reduce latency from 70 cycles to 6.

        The latency for retrieving data from main memory is an effect of current memory technology; data can only be fetched so fast from DRAM-based memory. DRAM uses a capacitive effect to store data, and it is relatively slow, especially compared to the ever-faster modern processors. This is the reason for using physically more complex SRAM, which stores data in much faster transistor-based latches. SRAM is used for cache in modern computers.

        The memory controller, which is primarily comprised of some addressing logic as well as analog stages to interface with the memory bus, must be physically positioned in between the processor and the DRAM-based memory banks. Whether it is on the same piece of silicon as the CPU or on a separate chip has only a very small effect on the latency of the CPU making requests to main memory. The reasons for positioning the controller on the die are mostly economic, and it may buy a very small speed advantage. The IBM POWER4 [ibm.com] processor integrates the main memory and L3 cache controller on the processor.

        I realize I called you on your lack of references, so I should probably provide some. Unfortunately I don't know of any good web links, but I recommend reading some books on Computer Architecture and/or Computer Organization:
        The Modern Computer Architecture: A common textbook in Computer Architecture/Organization classes [amazon.com]

        • I wasn't sure which site the article was on... It was either Ace's Hardware or Ars Technica... Unfortunately I still don't know, because I haven't had the time to go check around in back articles on each site...

          I do remember they were talking about the way current memory controllers worked (then on some VIA & AMD boards; the article itself was talking about asynchronous vs. synchronous memory) and the path used to reach the memory, which multiplied the base latency (those CAS/RAS latencies, really) of the whole system...

          Having the controller on the chip & optimizing the path to the memory, you drop the extra cycles it would take as the data moves through the system...
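
          Since the disagreement above is mostly arithmetic, here is a back-of-envelope sketch in Python. Every nanosecond figure in it is a hypothetical illustration, not a measurement from either poster or from AMD:

          # Convert wall-clock latencies into CPU cycles at an assumed clock.
          CPU_GHZ = 2.0  # assumed clock speed: 1 ns = 2 cycles

          def cycles(latency_ns, clock_ghz=CPU_GHZ):
              return latency_ns * clock_ghz

          # External controller: FSB hop out + DRAM access + FSB hop back.
          external = cycles(10) + cycles(45) + cycles(10)
          # Integrated controller: same DRAM access + small on-die overhead.
          integrated = cycles(45) + cycles(5)

          print("external:   ~%.0f cycles" % external)    # ~130
          print("integrated: ~%.0f cycles" % integrated)  # ~100

          Whatever numbers you plug in, integration removes the chipset hops but not the DRAM array access itself, which is why a drop from 70 cycles to 6 is hard to credit while a smaller but real win is easy to.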
    • by karlm ( 158591 ) on Tuesday August 20, 2002 @11:41PM (#4109389) Homepage
      They can swap in another memory controller when DDR gets old, or they can add an interface for an external memory controller. The benefits of an integrated memory controller are just huge.

      CPU designs are pretty modular. It shouldn't be hard at all to swap in a new controller when the time comes. If the internal hardware interfaces weren't very clean, design would take a lot longer.

    • How often do you upgrade the motherboard and the RAM but keep the CPU...?

      As new (faster) memory becomes available, they'll simply update the memory controller on the (new) CPUs (just as they updated the FSB from 100 to 133 to 166 to 200 to 266 and soon to 333 or 400).

      RMN
      ~~~
    • What happens to the family when a new RAM interface comes along?

      You'd just drop in a new memory controller. Keep in mind that new memory interfaces don't come around all that often. You might get speed bumps like PC100/PC133 and the various flavors of DDR. But a single model of controller can often handle multiple speeds. Think about how many flavors of PIII/Celeron came out that used the PC66/PC100/PC133 SDR memory interface.

      If this gives AMD a big performance boost, which it should, it's a good move.
    • Generally, if you want to upgrade to a different type of memory, you want to upgrade to a different type of motherboard, and probably a different type of CPU. I don't think it's such a bad thing. "This motherboard is designed for DDR memory and a DDR compatible CPU!"
    • "but for a main-line CPU it seems to me that tying the memory controller to the CPU limits the lifespan of the design."
      and BINGO was his name-o

      Gee, then you would have to buy something even more often. Boy, I bet they will cry all the way to the bank.
    • The integrated memory controller of the Hammer-series chips can be disabled and replaced with a motherboard-based memory controller. Additionally, the core of the processors was designed to make it fairly easy to swap integrated memory controllers, though "easy" is a much looser term when describing modifying a multimillion-transistor, multilayer CPU core.
      Either way, it will not be a problem.
    • As I said in my previous post, I realize that the memory controller section of the chip can be redesigned. I realize that AMD would rather have a good way to sunset the current chips when a new memory design comes out. I largely wanted to make sure that I wasn't the ONLY one to realize this.

      However, look at what happened during the transitions from SDR to DDR and to RDRAM - all that had to be redesigned was the external memory controller chip, which allowed the release of mobos that supported the new RAM standards fairly quickly. How quickly would they have been supported if the Celeron/Duron chips had not had external memory controllers?

      Also, something that occurred to me as I slept - how do they handle memory coherency in a multiprocessor system? Does each CPU have its own memory, and they coordinate cachelines? (sort of a ccNUMA type arrangement) Or do they have a single external memory controller that all the CPUs talk to (and take the speed hit)?

      If the former, that would have a pretty large impact on Linux. If the latter, then a SMP machine would take a large speed hit relative to a UMP machine due to the slower memory access.
    • Given recent trends, the new POITRAM would require a new chipset which would require a new processor anyway. Might as well make it a single chip.

  • These guys [newisys.com] are designing Opteron servers, including dual Opteron 1U servers (web and render farm goodness) and quad Opteron 3U servers. Very impressive specs. The management is dominated by senior IBMers, plus a senior marketing weasel from Dell. Hmm, Dell skipped the Itanic2...

    Somehow, I suspect their designs are going to get licensed by some very big vendors. Call it a hunch.
    • Wow. Did you read the part about the integrated system management software, complete with SSL webserver and dedicated ethernet ports? Am I correct in thinking that they've replaced the BIOS with an entire OS? I don't know if I should be awed or terrified.
  • I would like to get more pics of the AMD Hammer processors. Any help would be greatly appreciated. Thanks
  • Wow. (Score:2, Insightful)

    by hatter3bdev ( 533135 )
    That was the longest advertisement I've ever read.
  • AMD Hammer FAQ (Score:3, Interesting)

    by antdude ( 79039 ) on Tuesday August 20, 2002 @11:13PM (#4109286) Homepage Journal
    AMDZone wrote a FAQ [amdzone.com] which was a good read.
  • by spiro_killglance ( 121572 ) on Tuesday August 20, 2002 @11:18PM (#4109313) Homepage
    The NDA isn't quite up until 2400 USA time (Eastern? Pacific? don't ask me, I don't know), but look here [216.239.39.120].

    Expect reviews from the usual suspects.

    AMD has modified their ratings a little, so as to keep the model numbers fair compared with the newer, faster Northwood Pentium 4s. So while the old rating system would have had the 2400+ as a 1933MHz Athlon and the 2600+ as a 2066MHz Athlon, in fact the 2400+ is the first 2GHz Athlon, while the 2600+ clocks in at 2133MHz.

    We can expect newer Athlons to be released later with 333MHz front-side buses, and later with 512KB of cache. Even when Hammer comes out, AMD will still be selling Athlons for around a year afterwards; the Athlon will move down to the low end to replace the Duron, and that's going to give the Celeron a real kicking. In fact, Intel seems to have blown their wad completely, with nothing to compete with the Hammer until their Prescott shrink of the P4 in Q4 2003.
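
    For anyone who wants to check those numbers: the commonly cited approximation for the old Athlon XP ratings is model = 1.5 * MHz - 500 (hardware-site folklore, not an official AMD formula). A quick Python sketch:

        # Old-scheme Athlon XP model rating, per the commonly cited
        # 1.5 * MHz - 500 approximation (folklore, not an AMD spec).
        def old_rating(mhz):
            return 1.5 * mhz - 500

        for mhz in (1933.33, 2000.0, 2066.67, 2133.33):
            print("%.0f MHz -> %.0f+ under the old scheme" % (mhz, old_rating(mhz)))

    That gives 2400+ at 1933MHz and 2600+ at 2067MHz, matching the old scheme described above, and it would have called a 2GHz part a 2500+. Labeling the 2GHz part a 2400+ instead is exactly the more conservative rating being described.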
  • I want to see what features of Palladium have been implemented, since AMD have declared their support for it.
    Will the first series of Opteron prevent me from downloading mp3s, or will that be an optional extra / firmware upgrade?
    Of course I expect users will be able to 'opt out' of these new features for the next year or so, until the US government, in their infinite wisdom, decides that opting out is no longer an option, and that there will be only one licensed implementation ... the one that comes with a licensed copy of M$'s latest and greatest.
  • On February 28, 2002, AMD announced the support of SuSE Linux for the Opteron processors. Good news to home computer users, on April 24, 2002, Microsoft has also collaborated to further 64-bit computing. (Emphasis added.)

    Yet another article implying Linux is not for the home. People read enough of these articles and they will conclude a priori Linux is not to be used in the home and never try it for themselves.

    Note I'm not saying it's completely ready for home use, especially by people with extremely limited computer knowledge, but people should decide for themselves. If everything they read says or implies Linux isn't for the home, they won't even consider it an option.
  • I hope... (Score:4, Interesting)

    by ParisTG ( 106686 ) <tgwozdz AT gmail DOT com> on Tuesday August 20, 2002 @11:41PM (#4109388)
    I hope they don't ship them like this! [hwextreme.com] (Note the bent pins on the left corner :))
  • This is only the hood, there's nothing to look at under it!

  • That has to be the strangest looking Hammer I've ever seen. Doesn't even have a handle. On the plus side, it does seem to come with a lot of built-in nails.

    RMN
    ~~~
  • I was expecting something of substance; for Christ's sake, we had preliminary benchmarks of the same processor months ago! All I got was a press release and a really badly done benchmark comparison ("well, here's how the 800 did. For comparison, let's see what a 400 Celeron did!")

    Reading this truly was a waste of my time. The ad when I clicked on the final "next" link added to my frustration.
    • What's even funnier is that the 'benchmarks' were ripped off from The Inquirer, whose source was some guy who apparently sneaked onto an AMD machine at LinuxWorld and ran the benchmarks.

      It's sad that slashdot actually linked to such utter shit. I'd accuse them of being paid for links, but it's not like it's the first time...

  • Benchmarks (Score:3, Interesting)

    by decefett ( 127257 ) <(moc.ellevaf) (ta) (ttocs)> on Wednesday August 21, 2002 @01:41AM (#4109700) Homepage
    The machine was running Mandrake Linux, kernel 2.4.18-24mdk, and identified itself as running at 797.7 MHz with 256k of cache.
    ...
    And here's a comparison, openssl 0.9.6b (as shipped with Red Hat 7.3) running on a 400 MHz Celeron...


    What was that about lies, damned lies and...

  • A factor of four over the Celeron is really disappointing. It would be interesting to know whether they ran the hand-coded x86 routines against GCC-compiled x86-64 code. It wouldn't be too bad, then.
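
    One way to make a cross-CPU comparison like that less misleading is to normalize throughput by clock speed. A minimal Python sketch; the throughput figures are hypothetical placeholders, not the article's results:

        # Normalize a throughput benchmark by clock speed. The numbers are
        # hypothetical illustrations, not measurements from the preview.
        results = {
            # name: (clock_mhz, ops_per_second)
            "Celeron 400":   (400.0, 100.0),
            "Hammer ES 800": (800.0, 400.0),
        }

        for name, (mhz, ops) in results.items():
            print("%-14s %6.0f ops/s raw, %5.2f ops/s per MHz" % (name, ops, ops / mhz))

        # A 4x raw speedup at 2x the clock is only a 2x per-clock gain, and
        # different compilers and OS versions can account for much of that.

    Even per-clock numbers don't control for the compiler and OS differences, but they at least remove the most obvious variable.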
  • I was mostly interested in the pictures, 'cause the article was terrible -- I think it was bashed together from press releases. I won't even get into the benchmarking, except to say that benchmarks should *not* compare two completely different architectures running at significantly different clock speeds with different software and OS versions. What were they trying to demonstrate?

    The big image on the third page was a shocker. Ack. Sure, it's just a mechanical sample, but adding a big page showing that you bent the pins on the processor doesn't particularly add to your breathless and misleading review. Wow. That was a terrible article. If AMD wants positive press in the technical crowd, they should be giving the samples to folks who know what they're doing.

  • Is it just me, or does anyone else REALLY enjoy frying eggs on their computer? My current CPU isn't quite hot enough, but as soon as I get AMD's new processor, I think I'll give bacon a try... I always hear people calling computers appliances. Considering that mine runs hotter than my toaster and eats more power than every other appliance in my house, I can see how they might get confused.

    Seriously though... Is AMD or Intel showing any signs of reducing the power consumption and heat output of their chips? Or are they just going to gradually reduce the maximum operating temperature until you need to get a dedicated freezer just to cool your computer?

    Until they get on the ball, any alternative processor suggestions? I'm willing to pay more for decent equipment, and because everything I use is in source-form, any processor will be fine. The problem is that I've never seen anything but Intel and PPC notebooks... But, even if I've got to use a different processor on a laptop than on my desktop machines, I'd be willing to. It's really time for me to change.

    One hot day, I went into my BIOS and checked out the hardware section, only to find that my CPU and case were 256 degrees F, and my CPU fan was spinning at several hundred-thousand RPMs. You might instantly disregard that, but here, where room temp is often 130F, and I'd had several fans croak already, it was a coffee-spitting moment (on a related note, I need a new keyboard too ;-) ).
