AMD

Behind the Closed Doors of AMD's Chip Production

rokali writes "Tom's Hardware is running an article on AMD's chipmaking procedure, plants, and future. Check out the pictures of Fab 36, their new plant slated to open in 2006, which will put out the next generation of 65nm chips. From the article: 'Currently, AMD's devices in Dresden are still produced on 200 mm wafers; the new APM 3.0 using 300 mm wafers won't be ramped up until Fab 36 opens. Production startup at the new facility is slated for the beginning of 2006, at which point the company will have invested an additional $2.5 billion.'"
  • It's all so shiny. In true geek fashion I got to the first glittery photo and can no longer scroll down.
  • Question (Score:4, Insightful)

    by elid ( 672471 ) <eli DOT ipod AT gmail DOT com> on Tuesday April 19, 2005 @05:38PM (#12286910)
    Incentives from the German government and the EU have lured a number of high-tech firms to the Saxony region of Germany, many of which have formed alliances. AMD, Infineon and ZMD work particularly closely together.

    Anyone know anything about this? What makes Dresden so interesting to AMD?

    • Re:Question (Score:3, Informative)

      by Blapto ( 839626 )
      The EU will give funding and tax breaks to large inward investments. Chipmaking adds a huge amount of value (cheapish raw materials per chip), so it produces a large benefit for the EU. You'll find the same going on in most countries.
    • Re:Question (Score:3, Interesting)

      by BlacBaron ( 875559 )
      Not sure how related one could consider this, but it might explain why they chose Dresden and not some other German city.

      History [bbc.co.uk]

      As a result I believe it was rebuilt to be a rather industrial place.
    • Re:Question (Score:3, Insightful)

      by homerj79 ( 58075 )
      If I recall correctly from the PR fluff AMD put out a few years ago when they announced Fab 30, it's due to the highly skilled workforce produced by the Technische Universität Dresden [tu-dresden.de] (Dresden University of Technology [tu-dresden.de]).
    • Re:Question (Score:2, Funny)

      by Anonymous Coward

      What makes Dresden so interesting to AMD?

      They both have problems coping with heat.

    • Re:Question (Score:5, Informative)

      by Zocalo ( 252965 ) on Tuesday April 19, 2005 @05:58PM (#12287113) Homepage
      Simple logistics; just as many tech companies congregated in Silicon Valley, a similar situation exists in Dresden. Going from chip design to the actual fabrication requires a considerable amount of support infrastructure much of which is done by external companies. For more complex devices it will typically take a few months at least from finalising the design to the first chips actually rolling out of the fab.

      I know for a fact that not even Intel does everything in house, so it's highly unlikely that AMD does. Essentially there are just far too many different types of highly complex technologies and processes involved for one company to do it all. Having as much of that infrastructure located in the same general vicinity can save a lot of time, money and aggravation. Which is why we have manufacturing sites in both Silicon Valley and Dresden, amongst others...

      • When I worked for AMD (1993-94), masks were made by DuPont, not in-house. I never fully understood the process, but there was a $100K-per-license application called CATS that did "fracturing" on final chip designs to represent them in a way that the mask-making machines could grok.
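
        For a rough picture of what fracturing means, here's a toy sketch in Python (CATS itself is proprietary; its actual formats and algorithms aren't public, and the bitmap representation below is just to keep the example short): the design's polygons get decomposed into simple axis-aligned rectangles that a mask-writing machine can expose one flash at a time.

            def fracture(bitmap):
                # Greedy row-by-row decomposition into rectangles (x, y, w, h).
                rects = []
                for y, row in enumerate(bitmap):
                    x = 0
                    while x < len(row):
                        if row[x]:
                            x0 = x
                            while x < len(row) and row[x]:
                                x += 1
                            rects.append((x0, y, x - x0, 1))  # one-row-high rectangle
                        else:
                            x += 1
                return rects

            shape = [
                [1, 1, 1, 0],
                [1, 1, 1, 1],
                [0, 0, 1, 1],
            ]
            print(fracture(shape))
            # [(0, 0, 3, 1), (0, 1, 4, 1), (2, 2, 2, 1)]

        Real fracturing tools work on polygon geometry, not bitmaps, and spend most of their effort minimizing the shot count; this only shows the flavor of the problem.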
    • Dresden is in the former East Germany, an economically backward part of reunited Germany. Companies that locate in this area have been given incentives by the German Government to expedite development and job creation.
    • Re:Question (Score:1, Informative)

      by Anonymous Coward
      The German government gave AMD large grants.

      "AMD said it has secured $700 million from a consortium of banks, and a series of lucrative guarantees and grants from the governments of Germany and Saxony."

      http://news.earthweb.com/bus-news/article.php/3111941

      I think that number has grown to over $1 billion now.
    • What makes Dresden so interesting to AMD?

      Well, how about the Dresden Nuclear Power Station?

      Wait! No! That's in Illinois! Sorry. Never mind.

    • Re:Question (Score:5, Informative)

      by Bender_ ( 179208 ) on Wednesday April 20, 2005 @01:03AM (#12290023) Journal
      What makes Dresden so interesting to AMD?

      Dresden was one of the centers of GDR microelectronics. The GDR was the technological leader in microelectronics of the entire Eastern Bloc, and the government poured billions into it. However, COCOM [wikipedia.org] succeeded in keeping them technologically way behind the Western countries. Nevertheless, Dresden was the birthplace of Honecker's infamous 1 Mbit (scroll down) [cpu-museum.com] chip.

      After the reunification there was a huge skilled workforce in microelectronics readily available in Dresden. This was, and is, aside from government incentives, a major reason to build fabs there. Siemens (and now Infineon) were the first to take advantage of this. AMD came later.

      The fabs have been extremely successful so far. Infineon's fab was the first worldwide to have mass production on 300mm wafers. AMD's fab managed to ramp the copper/low-k metallization process in record time.

      Btw., some of the GDR semiconductor companies still live on in the form of ZMD (Dresden), X-FAB (in Erfurt) and the IHP (Frankfurt/Oder). However, they mostly specialize in niche products now.

      From the Article:
      Check out the pictures of Fab 36, their new plant slated to open.

      You wish. There is no photo showing the actual production at an AMD site. One photo shows a support level; another actually shows the production of an entirely different company.
      • by x4u ( 877483 )
        Dresden was called the Saxon Silicon Valley already during the 80s. Famous products were chips for a PDP 11 clone [cpu-collection.de] and the very popular Z80 clone U880 [cpu-collection.de], which was the CPU of all Eastern Bloc home computers.
    • > What makes Dresden so interesting to AMD?

      Dresden used to be the area where East German computers and chips (more or less illegitimate clones of the IBM PC and the Intel 8080/8086 running a Russian clone of DOS) were produced before 1989. Afterwards, the state government invested in maintaining computer and chip production there and bringing it up to Western levels, and AMD was also attracted by the fact that there was a skilled workforce available in the Dresden area which needed no fundamental retraining fo

  • by WillAffleckUW ( 858324 ) on Tuesday April 19, 2005 @05:38PM (#12286915) Homepage Journal
    Oops. Sorry.

    I was reading from the FUD PR put out by Intel about AMD.

    A chip is a chip, except when you put salsa on it.

    Or have it with some Java.

  • Motherboards (Score:5, Interesting)

    by superpulpsicle ( 533373 ) on Tuesday April 19, 2005 @05:38PM (#12286920)
    How about manufacturing AMD motherboards? The Intel chip + Intel board is a ridiculously stable combination. AMD should have a combo of their own to counter.

    • Re:Motherboards (Score:5, Insightful)

      by ad0gg ( 594412 ) on Tuesday April 19, 2005 @05:47PM (#12287019)
      They used to have their own chipset and it sucked (speed-wise and feature-wise) compared to the VIA chipset that was out at the same time. AMD doesn't need its own chipset now since Nvidia makes a really great chipset.
      • Comment removed based on user account deletion
      • Re:Motherboards (Score:1, Informative)

        by Anonymous Coward
        Maybe VIA chipsets are only so fast because they take shortcuts implementing the spec. Besides, a "fast" motherboard is really only 1-3% faster than a "slow" one. Who would care except overclocking shitheads and cheap walmart bastards? (Except, come to think of it, that's most of AMD's customer base.)

        Although AMD never got the USB working right on their 751(?) chipset, which was a major reason nobody used it.
      • They used to have their own chipset and it sucked(speedwise and feature wise) compared to the VIA chipset that was out at the same time.

        That's because the first VIA chipset for the AMD Athlon (Apollo KX133) didn't ship until about 6 months after [anandtech.com] the Athlon (and AMD 750 chipset) launch [anandtech.com]. I'd expect a brand new chipset (with PC133 and AGP 4x) to outperform and have more features than a six-month-old chipset (with PC100 and AGP 2x).

        If I remember correctly, AMD has said they are not in the chipset business (

      • Agreed. AMD has 4 chipset makers: SiS, VIA, Nvidia and, um, that other one... you know.

        And they will probably get some ATi Mobo's assuming ATi doesn't fold after the x700 fiasco.
    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Tuesday April 19, 2005 @05:49PM (#12287034)
      Comment removed based on user account deletion
      • It seems the only boards that bothered with AMD chipsets were those intended for servers and workstations.

        I think the 8000 series chipsets are still made, but generally are only put in Opteron systems. They had not yet made a PCIe replacement for the 813x chips. I think that update will become necessary in the next year to keep pace in the server market, though PCI-X seems to still be going pretty strong.

      • AMD still makes chipsets, but they're pretty much never found in desktop configurations. AMD produces chipsets mainly as a platform to help get the chips on the market before third party chipset manufacturers get a design out. Seems that third parties aren't keen on investing in a chipset design without seeing what the part looks like in real life.
    • Re:Motherboards (Score:2, Informative)

      by evilviper ( 135110 )
      The Intel chip + Intel board is a ridiculously stable combination.

      Intel's motherboards are just re-branded Asus motherboards.

      So buy an AMD chip and get an Asus motherboard for it. Doesn't take a rocket scientist...
    • I've found the nVidia nForce 3/4 to be better than Intel's latest; AMD's HTT makes a huge difference for memory-intensive computation :)
      With Intel having HT it's probably six of one, half a dozen of the other, but I like my nForce.
    • Nvidia has been making good chipsets for AMD processors since the Nforce2 days. Yeah, sure, the entire Super7 platform had bad chipsets, and VIA's lovely offerings in slot/socketA had crappy southbridges until the KT266 and KT266A. That's all history.

      In fact, SiS makes good chipsets too, at least in terms of stability. The boards that use them aren't always built to the best standards, but the chipsets themselves are fine.
  • Moore's Law (Score:1, Redundant)

    I bought a 3.0GHz (800MHz FSB) PC almost two years ago and today, there still isn't anything out there that seems noticeably faster... I can't wait for Longhorn to come out; maybe then the processor industry will catch up :). Or perhaps I should just buy a dual processor PowerMac? http://www.basicreations.com/
    • This is one reason I'm seriously considering buying a PowerMac to go with my PowerBook... processors are practically at a standstill, so I figure this is one of the best times to buy a new machine to stay competitive. Not to mention a Dual 2 something after the next update will certainly be more powerful than my 1GHz 12 inch PowerBook.
    • So you believe speed = frequency?
      • Actually, what I meant was that I can't notice a great jump in performance, regardless of AMD64 or Intel; and I agree, frequency does not necessarily equate to speed.

        However, I have toyed around with some of the newer machines, and in comparison to my two year old machine, the difference is negligible (maybe a few seconds off of an Excel spreadsheet, for example).

        It seemed like not too long ago, just a jump of two or three hundred megahertz made quite a bit of difference, especially for games... But as processors are getting to their peak, I have found that it is going to take a much larger jump to get anywhere near that difference today.

          • It seemed like not too long ago, just a jump of two or three hundred megahertz made quite a bit of difference, especially for games... But as processors are getting to their peak, I have found that it is going to take a much larger jump to get anywhere near that difference today.

            That was probably because games were CPU limited, as opposed to GPU limited as they are now. Remember, games used to have all the T&L processed on the CPU as opposed to on the graphics card.
    • I'm not sure if you're joking or not, but the processor industry hasn't been sitting on their asses for the past two years. They're reaching the limits of how much they can push silicon technology. Frequency isn't increasing at the same rate as it did in the 90s because issues like leakage loom larger as the transistors get smaller.
  • by SpookyFish ( 195418 ) on Tuesday April 19, 2005 @05:43PM (#12286966)
    Early '06:
    "Dell considering building machines with AMD thanks to new fab capacity"

    Early '06 + 1 week:
    "Dell sticking with Intel"

    Well, at least it will help remove one of the theories (AMD supposedly not having the capacity).
  • by tofucubes ( 869110 ) on Tuesday April 19, 2005 @05:44PM (#12286985)
    better have a big wallet... looks like a lot of geeks will be window shopping. The low-end Opteron 865 chip will cost $1,514 USD, the dual-core Opteron 870 will run $2,149, and the Opteron 875 is priced at $2,649. http://www.betanews.com/article/DualCore_AMD_Opteron_Prices_Leak/1113922595
    • New toys have cost this much in the past, and then become cheaper over time. Just like anything else in this industry, really.

      AMD seems to be aiming at a different market - 2k is not that much for a server that can handle the web hits their new chips should be able to... if they can get their reliability up to Intel's...
      • by ciroknight ( 601098 ) on Tuesday April 19, 2005 @06:03PM (#12287149)
        The real problem is, AMD's Opteron will probably be done and shipping by the time Intel gets 64-bit dual core Xeons out the door. Not that they couldn't go ahead and shift all of their production capacity to dual core now, and have early chips ready by the end of this year, it's more like they won't.

        More and more I get my hopes up that Intel is doing research into a 64-bit enhancement for the Pentium-M, and I believe this to be the only reason we haven't seen Dual Core Pentium-M's yet. We're just now starting to see a move for the Pentium-M to the desktop, which is a good start, but without the cutting edge memory controllers present on new chipsets, it doesn't stand a chance.

        I believe Intel is also probably investigating adding memory controllers to their next Xeon line, which will definitely push back when we can expect to see it. Intel really would see this as defeat, but as DDR2 becomes mainstream, Opterons with DDR2 controllers will be able to completely smash any Intel offering, simply because they can get the data faster, get it processed, and pumped back out, while the Intel chips still wait for the laggy north bridge memory host to allocate the resources.

        Reliability will always be in Intel's court, simply because they control all factors of production, beginning to end. AMD's trying to take this approach, and by opening new fab facilities, maybe they can get into competition in other chip segments (like the Turion vs the Pentium-M). It also doesn't help that AMD is no longer making chipsets, but I believe a new fab facility will open this up as a possibility once again.

        Oh I love competition.
    • by CajunArson ( 465943 ) on Tuesday April 19, 2005 @06:03PM (#12287148) Journal
      Opteron 865 chip...
      If you want to build a 4/8 way machine (which is the only reason to buy from the 8x series), $1500 is not a bad price for a chip at all, and $2149 for the dual-core is only ~40% markup! If you want cheap, buy a normal PC; after all, the extra CPUs won't make your games faster, and many of the server boards that take these chips don't even bother with high-speed graphics ports since they're designed to be servers. Opterons are cheap (err.. inexpensive) compared to Itaniums or other 64 bit architectures out there.
    • by greg_barton ( 5551 ) * <greg_barton @ y a h o o.com> on Tuesday April 19, 2005 @06:32PM (#12287377) Homepage Journal
      This is FUD. The 865 is not "low end" no matter what the article says. It's the chip that's capable of 8 way SMP, as opposed to the 2 and 1 way. Those are cheaper.

      Here [digitimes.com] is the source article for the price leak from DigiTimes. The prices for the 1-way and 2-way chips are much lower:

      165 chip: $637
      265 chip: $851

      Don't believe the FUD.
  • Here's a question... (Score:5, Interesting)

    by DrKyle ( 818035 ) on Tuesday April 19, 2005 @05:45PM (#12286993)
    Why is the building so darn yellow inside? Is it important for the process, the workers, the ability to keep the environment clean? It's just so yellow, I think I'd get a huge headache working there.
    • by Blapto ( 839626 ) on Tuesday April 19, 2005 @05:53PM (#12287071)
      True fact:
      When I was at school, the walls were painted "bright spark yellow". According to our teacher, studies had been done and it was found that this particular colour made people think more productively. He had entire studies to give us and everything; being 11, I'm not quite sure what we did with them.
      • Hmmm... My final year in grade-school was in a new school (constructed in 1976), and the walls were all sorts of cheerful shades of yellow and orange, with just a bit of white to rest your eyes. The school layout was sort of cool too, all the classrooms surrounded a large open-plan library (none of the "endless corridor" feeling most schools have). I remember really liking that school, so I suppose there was something to it.

        When was your school built? It seems like a very mid-70s sort of thing to do...
        • It was a tiny place, about 120 pupils. The building was probably built in the 1930s, but I imagine they repainted it every couple of years while it was a school. Kids can be pretty messy.
    • I have a theory: the workers would feel they're missing out when they wear yellow goggles, so they painted the place all yellow...

      personally I thought it looked kind of golden, but that's just me

    • They took the photo through a yellow-tinted clean-room window. Taking a camera into a clean room is a lot of work and bother (it has to be, well, cleaned). I think the windows are tinted to prevent UV transmission or something.
    • by Anonymous Coward
      The reason it's yellow is because of the photolithography operations going on there. The yellow light doesn't expose the wafers like the white light does.
    • by Zocalo ( 252965 )
      If you are talking about a clean room, then it's part of the environmental control. In addition to the usual temperature, humidity and particulate matter controls, you also need to regulate static, ionisation and the lighting. The silicon wafers, the photomasks and other manufacturing devices are incredibly sensitive to all those things at varying stages of production. Basically the design of a chip is projected onto the silicon wafer in a manner kind of like projecting a photographic transparency onto a
      • Besides, once you are cooped up inside one of those natty suits that you have to wear in modern chip fabrication environments, believe me when I say that the lighting is *not* a major concern...

        I was under the impression that modern chip fabrication environments [ibm.com] were all automated and didn't require the full garb since all the wafers are enclosed and pushed around on air. As is explained in the 3rd paragraph of that article. Of course, I'll let you know in a few weeks, since we're taking a field trip
    • by DrLex ( 811382 )
      I visited an experimental fab here (at IMEC [www.imec.be]) a while ago and there also was a great deal of yellow light in some places. As far as I can remember, it has something to do with the processing. Some methods involve 'developing' photoresist layers on wafers, like developing a photographic film, and this process is insensitive to yellow light -- just like good ol' black & white photographs were insensitive to the typical red light in darkrooms.
      However, I recently visited a new cleanroom in the same fab, ma
    • by El ( 94934 ) on Tuesday April 19, 2005 @06:19PM (#12287263)
      It's not yellow. The picture is shot through a window, which has a UV-blocking coating on it. This makes everything appear yellow. Apparently certain frequencies of light are bad for the wafers.
      • They aren't bad for the wafers, but rather the photoresist applied to define/block metal and other features of the device.
      • by kesuki ( 321456 ) on Tuesday April 19, 2005 @10:26PM (#12289226) Journal
        Doh you opened the door, now this batch is all shot!

        As many have stated here, if the window is tinted yellow, the room inside in fact has all yellow (amber) lighting. This is because, much as a photo negative will be exposed by more than the slightest stray light, CPUs will not be etched correctly if exposed to UV rays in the wrong areas.

        The entire building is not yellow, as only certain processes are UV sensitive, and once the parts have been given the needed chemical baths they are no longer light sensitive.

        White light would expose the chips about to be etched as surely as opening the door to a darkroom before the film/photo paper has been given its chemical bath to fix its light sensitivity.
      • It is yellow. The reason for this is indeed the frequency of the light. Although the photoresist used on wafers during the litho steps is most sensitive to a certain wavelength (193 nm for 90 nm feature size, see wikipedia on litho [wikipedia.org]), unintended exposure must be avoided at all costs.

        Working under normal light would ruin any wafer with photoresist. Working in total darkness would be ideal, but yellow is supposed to be a good compromise between working conditions and process issues.

        Although th


    • Photo areas in wafer fabs use chemicals (photoresist) that harden when exposed to UV light, which ordinary fluorescent lights emit in very small amounts (and mostly the harmless UVB type). Fluorescent lights around photoresist have "yellow" (more amber, actually) coatings to absorb the UV.

    • In fact, you will see yellow light in many clean rooms. Blue light can interfere with the chipmaking process, so you can only use the green and red spectrum of the light. And green and red together appear yellow to the eye.
      The yellow light is not a problem; you get used to it within minutes. Then yellow will appear as white to you. But beware when you leave the clean room - everything will look quite funny until your eyes recalibrate.
    • Background: I worked as an intern in photolithography at AMD in Austin, TX for most of 2001, and was in their FAB25 almost every day during that period. I was also one of the persons tapped to give FAB

      The yellow light comes from the aggressive filtering of blue (and UV) light in the photolithography processing area. This is to prevent premature development of the wafer's photoresist during transport of wafers between processing tools.
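
      To make that concrete, here's a toy Python sketch of the idea (the cutoff wavelength and lamp spectra below are illustrative numbers I made up, not actual process specs): resist only reacts below some cutoff wavelength, so safe lighting must emit nothing shorter than that.

          # Toy model of fab lighting: photoresist is sensitive only below a
          # cutoff wavelength, so safelights must emit nothing shorter.
          RESIST_CUTOFF_NM = 450  # hypothetical: resist reacts below ~450 nm

          lamps = {
              "white fluorescent": (350, 750),  # emits near-UV/blue
              "yellow-filtered":   (500, 700),  # blue/UV removed
          }

          for name, (shortest_nm, longest_nm) in lamps.items():
              safe = shortest_nm >= RESIST_CUTOFF_NM
              print(f"{name}: {'safe' if safe else 'would expose resist'}")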

      The photo in the article is actually of a subfloor region, the equipm
  • by tofucubes ( 869110 ) on Tuesday April 19, 2005 @05:52PM (#12287065)
    dual CPU and GPU...
    I'm betting the same people who bought SLI configs are going to buy dual core...
    the problem with dual core vs. SLI is that people can buy one video card now and one later...
    which is not the case with dual core

    anyway, I wonder if this all started with people buying two of the same RAM modules for more bandwidth performance

    and I wonder if this trend will continue?

  • by G4from128k ( 686170 ) on Tuesday April 19, 2005 @05:58PM (#12287112)
    Looking at the die layout, it's easy to imagine that AMD (and Intel) will produce a good many dual-core chips with one defective core (maybe 10-25% of production). I'd bet that somebody finds a market for those partially-functional chips. I also wonder what will happen when people discover that one core can be overclocked more than another core. For applications/loads that only use a single core, the system could disable the slow core and run the fast core at full speed.
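
    A quick sanity check of that 10-25% guess in Python, assuming (purely hypothetically; real per-core yields are closely guarded) that each core fails independently with probability p:

        # Fraction of dual-core dies with exactly one dead core, if each
        # core independently fails with probability p (illustrative only).
        for p in (0.05, 0.10, 0.15):
            exactly_one_bad = 2 * p * (1 - p)
            print(f"p = {p:.0%}: {exactly_one_bad:.1%} of dies have one bad core")
        # p = 5%: 9.5%   p = 10%: 18.0%   p = 15%: 25.5%

    So the 10-25% range corresponds to per-core defect rates of very roughly 5-15%.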
    • It's been done before, so why not? Intel used to sell "defective" 80486DX chips where the fault lay in the numeric processor as perfectly functional 80486SX chips. It only makes sense that the design be engineered so that defective cores can be disabled after fabrication and leave a functional chip.
    • I also wonder what will happen when people discover that one core can be overclocked more than another core.

      At $1600+ for a 4way config, I don't think anybody is going to be overclocking these bad boys.

      • That is like saying people who plunk down a few hundred thousand dollars for a really nice (or at least expensive) car or boat will not tune it. $1600 is dirt cheap for a 4-way configuration; I expect that the overclockers will give it a lot more attention than they give current quad-cpu offerings.
        • That is like saying people who plunk down a few hundred thousand dollars for a really nice (or at least expensive) car or boat will not tune it. $1600 is dirt cheap for a 4-way configuration; I expect that the overclockers will give it a lot more attention than they give current quad-cpu offerings.

          Exactly. At $300k, you're buying a pretuned car that is damn fast and difficult to improve on. A $1600/cpu 4way box starts at around $10k or more and is useless for games. I can't see overclockers spending the

      • Don't be so sure ;)

        "If you have a lot of money. this is a great board for gameing if you set it up right using to dual operations and full all the banks with 8 Gbs of pc 3200. Best board ever."

        Rich gamer,12/15/2004 2:07:24 PM

        (from a review of a TYAN Thunder K8SR [newegg.com] Dual Opteron board on Newegg)
    • The dual core chips with one defective core will simply be sold as single core chips. The single and dual core chips both use the same motherboards, so really the only way someone could tell them apart would be by the writing on it anyway.

      As to overclocking, there's only one bus and one clock driving both chips, so you can't clock them differently.
  • by ndykman ( 659315 ) on Tuesday April 19, 2005 @06:04PM (#12287154)
    Firstly, after reading the article, I was shocked to note that AMD's processors come out of one fab line, and the American fab line is flash only. If this is the case, well, wow. That seems a bit risky. If you get a tricky or persistent process issue (and it happens, no matter how cool you are), it could really impact AMD's output and yield a good deal.

    Of course, that's the main question here, and there's no way you are going to find out the answer. Yield. How many chips are good on a wafer?

    You can guess, but the answer may say a lot about AMD and Intel. It could very well be (here come the flames) that Intel has an advantage in being consistent in volume and yield that allows them to keep large-scale contracts.
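
    For a feel for why yield dominates this argument, here's a minimal Python sketch of the classic Poisson yield model (the defect densities below are made-up, illustrative numbers; the real figures are exactly the secret you'll never find out):

        import math

        def poisson_yield(die_area_cm2, defects_per_cm2):
            # Classic first-order model: Y = exp(-A * D). Real fabs use
            # fancier models (Murphy, negative binomial), but the trend is
            # the same: bigger dies and dirtier processes hurt yield
            # exponentially.
            return math.exp(-die_area_cm2 * defects_per_cm2)

        print(f"{poisson_yield(1.0, 0.5):.0%}")  # 1 cm^2 die, 0.5 defects/cm^2 -> ~61%
        print(f"{poisson_yield(2.0, 0.5):.0%}")  # double the die area -> ~37%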

    It is a big question in my mind if AMD can currently provide the large-scale on demand volume that the big companies require in some product lines. Could an HP, a Gateway rely exclusively on AMD for chips? (I don't know)

    Certainly, it seems that having only one fab could be a big enough bottleneck to make major vendors concerned, and it places a cloud over that question.

    Toss in the fact that you can get chipsets (heck, network chips if you'd like) from Intel as well, and you have a real competitive advantage that is tough to beat. All your motherboard bits, one vendor.

    And, sure, Intel chips have disadvantages, but in real-world experience, the performance of similarly priced AMD and Intel desktop solutions isn't so obviously different that most people will notice enough to overcome those other issues at play.

    Just a thought.
    • by Brain_Recall ( 868040 ) <brain_recall@@@yahoo...com> on Tuesday April 19, 2005 @08:13PM (#12288180)
      Quite right. The Austin plant hasn't done CPUs since the aluminium Athlon days (think Thunderbird core).

      But we are talking about the Dresden Fab 30, which was for a long time considered the most advanced fab in the world.

      "In May 2001, Fab 30 was awarded the coveted "Fab of the Year" title by Semiconductor International. The magazine recognized Fab 30 as the first facility in the world specifically designed to produce microprocessors with copper interconnects." http://www.amdboard.com/amdfab30.html [amdboard.com]

      With over 150,000 square feet of clean-room, it could, and does, handle the load.

      As a side note, here's AnandTech's tour of Fab 30: http://www.anandtech.com/cpuchipsets/showdoc.html?i=1773 [anandtech.com]

      • Thanks for the links. This sheds some light on things. Now, off to see if I can find any info on Intel fabs. (See if I can find wafer start info, for example.)

        And just for the record, I believe that both AMD and Intel have top-notch fabrication capabilities, which, for all the arguments about one company vs. the next, we do reap the benefit of as consumers. You know, you get a pretty amazing chip for less than $200 these days.

        Frankly, I want both companies to stick around for a while.
  • Fab 36? Wow, the Thunderbirds have come a long way since the initial 1-5.....
    • Re:Thunderbirds? (Score:2, Informative)

      by Anonymous Coward
      IIRC, AMD's FAB naming convention refers to the year (relative to AMD's founding) that the FAB opened. So FAB 25 in Austin was built when AMD was 25 years old, FAB 30 in Dresden when AMD was 30 and now FAB 36 when AMD turns 36.
  • ...produced on 200 mm wafers; the new APM 3.0 using 300 mm wafers...
    When I was in college mm stood for millimeter, which would mean we're talking about 8-inch wafers now, and 12-inch wafers with the new APM (rounded to nearest inch). Am I misunderstanding the mm abbreviation, or was this mis-typed in the article and supposed to be something else?
    • Re:200 & 300 mm??? (Score:2, Informative)

      by Anonymous Coward
      Yes, the wafers really are that big.

      Remember, a great many chips are made from each wafer (the same pattern, or die, is repeated many times across the wafer surface, which allows many chips to be made in parallel). They are cut into individual chips with a saw at the last stage.
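
      For a ballpark of "how many", here's a minimal Python sketch using the standard dies-per-wafer approximation (the 100 mm² die size is a made-up illustrative figure, not from the article):

          import math

          def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
              # Common approximation: gross wafer area divided by die area,
              # minus a correction for partial dies lost at the round edge.
              d, a = wafer_diameter_mm, die_area_mm2
              return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

          print(dies_per_wafer(200, 100))  # ~269 dies on a 200 mm wafer
          print(dies_per_wafer(300, 100))  # ~640 dies on a 300 mm wafer

      Which is a big part of why fabs move to larger wafers: well over twice the candidate dies per wafer, for roughly the same number of processing steps.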
    • The wafers are indeed 300 millimeters in diameter. They are sometimes also called 12 inch, but this is not quite correct since they are only about 11.8 inches...

      This is about the same size as an LP (you know, the black discs we used to have music on before the CD?)
  • From the article:

    > In the new 90nm model of the Athlon 64 with Winchester core, half of the L2 cache is deactivated; the production process for the chips is identical to that of the larger variants.

    Anyone know whether this deactivation is reversible?

    I know if I was building such a chip I would make it so either half of the cache could be activated. That way, in case of a production fault, either half of the cache could be used. Consequently, yields would go up and AMD gets more dollars in the bank.
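
    Back-of-the-envelope support for that idea in Python (the defect probability is a made-up, illustrative number, not AMD's actual rate): if each cache half independently has a defect with probability p, being able to sell a chip with either working half salvages most otherwise-lost dies.

        p = 0.10  # hypothetical chance that a given cache half is defective

        both_good = (1 - p) ** 2     # sellable as a full-cache part
        one_good  = 2 * p * (1 - p)  # salvageable as a half-cache part

        print(f"full-cache yield:    {both_good:.0%}")             # 81%
        print(f"salvaged half-cache: {one_good:.0%}")              # 18%
        print(f"total usable dies:   {both_good + one_good:.0%}")  # 99%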

  • The reason their new fab is their first 300mm wafer facility is because Intel paved the way last year.
    • ha ha ha. There are about 5 companies that have been running a 300mm fab for more than a year now.

      Intel paving the way? They are definitely a driver, but it was not last year. I think Intel has been running 300mm fabs for over 3 years already.
