AMD Announces Quad Core Tape-Out

Gr8Apes writes "The DailyTech has a snippet wherein AMD announced that quad-core Opterons are taped out and will be socket compatible with the current DDR2 Opterons. In fact, all AM3 chips will be socket compatible with AM2 motherboards. For a little historical perspective, AMD's dual-core Opteron was taped out in June 2004 and then officially introduced in late April 2005. AMD also claims that the new quad-core processors will be demo'd this year. Perhaps Core 2 will have a very short reign at the top?" From the article: "The company's press release claims 'AMD plans to deliver to customers in mid-2007 native Quad-Core AMD Opteron processors that incorporate four processor cores on a single die of silicon.'"
This discussion has been archived. No new comments can be posted.

  • Software Licensing (Score:5, Insightful)

    by graphicartist82 ( 462767 ) on Tuesday August 15, 2006 @02:42PM (#15911999)
    I'm interested to see if software companies who license their software by CPU will continue to define a "CPU" as a physical socket, or a core. Right now Microsoft and VMWare (and lots of others) define a CPU as a physical socket, not a core. So a dual core processor only counts as one CPU for licensing purposes.

    It will suck if they start realizing how much more money they could be making by defining a core as a CPU for licensing...
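    For what it's worth, the distinction is already blurry from inside the software: a JVM, for instance, only reports logical processors, not sockets. A minimal sketch (this is just the generic Runtime call, not anyone's actual licensing check):

        // Illustration only: the JVM reports logical processors (cores, and
        // hardware threads where present), not physical sockets.
        public class CpuCount {
            public static void main(String[] args) {
                int logical = Runtime.getRuntime().availableProcessors();
                System.out.println("Logical processors visible to the JVM: " + logical);
                // Mapping this back to physical sockets needs OS-specific queries
                // (e.g. /proc/cpuinfo on Linux), which is exactly where the
                // socket-vs-core licensing definitions start to matter.
            }
        }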
  • by Short Circuit ( 52384 ) * <mikemol@gmail.com> on Tuesday August 15, 2006 @02:43PM (#15912002) Homepage Journal
    Isn't AMD depending on additional cores to beat Intel's performance similar to how Intel's Prescott depended on additional MHz to beat AMD's performance?

    Sounds like the shoe's on the other foot. I hope AMD brings back the kind of engineering innovations that brought it support among those in the know back in 1999 and 2000. (Like focusing on a superscalar architecture with the K7.)

    Four cores is a fine concept, but they mustn't forget to increase the capabilities of the individual cores.
  • by martinbogo ( 468553 ) on Tuesday August 15, 2006 @02:47PM (#15912054) Homepage Journal
    It took AMD a very long time to create a low-wattage version of the dual core 280. With four cores burning away on the new chip, I wonder how efficient putting a quad-core chip on a server board will be. Right now, most servers are running more than 80W per chip, making for a massive thermal dissipation problem. There's a lot of heat to shunt away from the chip, after all.

    I'd rather have an ultra-efficient dual core chip, sayyyy .. 25W .. over having a quad core monster at over 140W!

  • by eebra82 ( 907996 ) on Tuesday August 15, 2006 @02:51PM (#15912091) Homepage
    I doubt that the release of a quad-core CPU has anything to do with Intel getting desperate. AMD has stolen a lot of market share from Intel in the server area, so it is only natural for them to extend the current line-up with even more, faster CPUs. You know, there is actually a market for quad-core CPUs, as many server applications will benefit strongly from such an architecture.

    Additionally, AMD gets to claim the quad core market before Intel, just like it got to 1000 MHz before Intel did. It's not only positioning, but also marketing.

    Last but not least, you can bet on an entirely new architecture from AMD coming next year. As with all new CPU designs, this is a difficult, expensive and time-consuming project, so it's not like Intel and AMD are rolling out new CPUs all that often. Instead, they try to improve current technology and make the most out of it.
  • by nine-times ( 778537 ) <nine.times@gmail.com> on Tuesday August 15, 2006 @02:59PM (#15912159) Homepage
    I've never really understood the idea of licensing software per CPU. It seems a bit crazy/arbitrary to me. Why not charge per DIMM of RAM, or per byte of L2 cache?
  • by jeswin ( 981808 ) on Tuesday August 15, 2006 @03:02PM (#15912179) Homepage
    Wonder how long it will take for compilers and languages to catch up with the concurrency challenges [www.gotw.ca]. Till then, applications will run slower than ever.

    [On the desktop, multimedia players, browsers, compilers, IDEs, how many of them will use those cores? Servers seem to be ready though.]
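    To make the scale of the change concrete, here's a minimal Java 5 sketch of the kind of restructuring each of those applications would need; the work() method and the task count are made-up placeholders, not anything from a real app:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.*;

        // Sketch: fan an independent-iteration loop out over a thread pool
        // sized to however many cores the JVM reports.
        public class ParallelSum {
            static long work(int i) {
                long x = i;
                for (int k = 0; k < 1000; k++) x = x * 31 + k;   // dummy computation
                return x;
            }

            public static void main(String[] args) throws Exception {
                int cores = Runtime.getRuntime().availableProcessors();
                ExecutorService pool = Executors.newFixedThreadPool(cores);
                List<Future<Long>> results = new ArrayList<Future<Long>>();

                for (int i = 0; i < 1000; i++) {
                    final int n = i;
                    results.add(pool.submit(new Callable<Long>() {
                        public Long call() { return work(n); }
                    }));
                }

                long total = 0;
                for (Future<Long> f : results) total += f.get();   // blocks until each task is done
                pool.shutdown();
                System.out.println("cores=" + cores + " total=" + total);
            }
        }

    That only helps when the iterations really are independent; the hard part is all the code where they aren't.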
  • Buy for tomorrow (Score:5, Insightful)

    by spyrochaete ( 707033 ) on Tuesday August 15, 2006 @03:07PM (#15912230) Homepage Journal
    In fact, all AM3 chips will be socket compatible with AM2 motherboards

    This is precisely why I recently purchased an Athlon 64 X2 instead of a Core Duo despite glowing reviews of the latter. The Duo is on Intel's ancient 478/775 sockets whereas X2 is on AMD's new AM2 socket. How many more processors can Intel jimmy into those tight little PGAs? AM2 will have legs for years to come while early adopters of Duo will be buying new motherboards with their next CPU upgrades.
  • by doh123 ( 951318 ) on Tuesday August 15, 2006 @03:07PM (#15912236)
    Back in the day, almost all apps could not use multiple processors, and if you wanted the specialized version that could, you had to pay more because of the extra development costs and low sales.

    Now they do it because people are used to it; it's accepted as the norm... It doesn't cost them more, but they can charge more just because people expect it. It's simple corporate greed.
  • Re:Taped out? (Score:3, Insightful)

    by Chris Burke ( 6130 ) on Tuesday August 15, 2006 @03:11PM (#15912278) Homepage
    I've gotten conflicting answers from people in industry, often seemingly related to how old they are (and thus whether they'd have been around for the actual-tape-mask phase), which is why I said I wasn't sure. Since "tape out" with magnetic tape would still be somewhat of a euphemism whereas "tape out" with real tape is literal, I'm still not convinced it refers to magnetic tape.
  • by drix ( 4602 ) on Tuesday August 15, 2006 @03:15PM (#15912319) Homepage
    Once upon a time, the only people who had SMP machines had spent a huge amount of money on them. Licensing per CPU was simply a smart way to price-discriminate among your customer base and figure out who had a high willingness to pay. Maximize producer surplus and all that. SMP became more and more commonplace in the 90s and now, with the advent of dual core, every grandma on AOL will be running on two or more CPUs in a matter of years. Since performance gains seem to be oriented towards more parallelism and not more MHz nowadays, this effectively means that software that runs on only one CPU has reached a performance plateau compared with everything else. My guess is the software industry will wake up to this fact and stop licensing by CPU, unless they want to field all sorts of questions about why theirs runs half as fast as the next guy's.
  • Next node (Score:4, Insightful)

    by TopSpin ( 753 ) * on Tuesday August 15, 2006 @03:17PM (#15912346) Journal
    Wake me up when AMD has 65 nm scale cores. The vast majority of the Core 2 Duo/Conroe/Core-whatever performance and efficiency gains are due to the differences between 90 and 65 nm features. Smaller scale means more execution units and more sophisticated cache logic on the same die. Until AMD does 65 nm, their products will be either too hot or too slow.

    We've been at 90 nm for so long that people almost forgot what a massive improvement a smaller node size can make. Various AMD 65 nm engineering samples are floating around Asia, and AMD has announced 65 nm models appearing in Q4 '06 to early 2007. This is the real battle. However, there's no mention of what process these quad-core parts are supposed to be using...

  • by EricBoyd ( 532608 ) <mrericboyd.yahoo@com> on Tuesday August 15, 2006 @03:21PM (#15912390) Homepage
    Honestly, can you use 4 cores in any of your current applications? I think the time is coming when the 30-year trend in faster CPUs will end. If you can't increase the megahertz, and extra cores don't actually improve application performance, what will Intel and AMD do to keep improving their products? I wrote an essay with some possible ideas: Computers in 2020 [digitalcrusader.ca]
  • by Aadain2001 ( 684036 ) on Tuesday August 15, 2006 @03:29PM (#15912472) Journal
    As a poster above me stated, there is every chance that Intel could win this little battle. Sure, putting four cores on a single die allows for better communication between the cores and better use of the caches. But because a four-core processor takes up more space on the wafer, the yield rates may drop dramatically. Intel's solution side-steps the yield problem by simply joining two dual-core processors into a single package. So, while AMD's processors may have a slight performance edge over Intel's quad chips, they may also be 2x-4x more expensive and harder to come by.
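    To put rough numbers on that (everything here is invented for illustration: the defect density and die areas are not real process data), a simple Poisson yield model, Y = exp(-D * A), shows how the economics shift:

        // Illustrative only: Poisson yield model Y = exp(-D * A).
        // The defect density and die areas below are invented numbers,
        // not real AMD or Intel process data.
        public class YieldSketch {
            public static void main(String[] args) {
                double d = 0.5;            // assumed defects per cm^2
                double dualArea = 2.0;     // hypothetical dual-core die, cm^2
                double quadArea = 4.0;     // hypothetical monolithic quad die, cm^2

                double yDual = Math.exp(-d * dualArea);   // fraction of dual dice that work
                double yQuad = Math.exp(-d * quadArea);   // fraction of quad dice that work

                // Good quad-core "packages" per cm^2 of wafer:
                double monolithic = yQuad / quadArea;
                double mcm = (yDual / dualArea) / 2.0;    // needs two good dual dice per package

                System.out.printf("monolithic quad:  %.4f good parts per cm^2%n", monolithic);
                System.out.printf("two-die MCM quad: %.4f good parts per cm^2%n", mcm);
                System.out.printf("MCM advantage:    %.1fx%n", mcm / monolithic);
            }
        }

    With those made-up numbers the two-die package comes out to roughly 2.7x more good parts per unit of wafer area, which is the same ballpark as the 2x-4x cost gap guessed at above.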
  • by Gr8Apes ( 679165 ) on Tuesday August 15, 2006 @03:45PM (#15912644)
    This is the same argument proffered in the first round a couple of years ago, when AMD went with 2 cores on a single die vs Intel's 2 separate cores slapped together. Did we forget the outcome of that battle so quickly? (Refresher: Intel got their ass handed to them.)

    While I don't disagree with your point about the potential for increased failure rates of 4 cores on a die vs 2 cores, also note that we're at least one more generation advanced in fab facilities, which one hopes will help ameliorate the failure rates.

    Also, think about this: there's more to the new AMD chips than merely 4 cores on a single die. So I don't doubt they'll be slightly more expensive than Intel's offering while trouncing them in every way. Sort of like the Core 2 today. The difference between them? Core 2 took 3 years after AMD's first Opteron release; AMD's response to Core 2 will take less than 12 months.
  • by Raging Bool ( 782050 ) on Tuesday August 15, 2006 @04:12PM (#15913079)
    Actually, yes. Four cores would do very nicely for several of the applications developed by the company I work for.

    We produce real-time data acquisition and analysis systems for multi-channel data in the audio bandwidth and above. Some of our programs have several threads per channel, and on a 128-channel system I believe we have seen over 500 threads running...

    Anything that can allow our software to do more real-time analysis on the captured data without compromising the low-latency display update rates demanded by our customers is great news. Admittedly our application area is not a typical case, but I'm sure we're not alone.
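    As a very loose sketch of that sort of layout (the channel count, the queue hand-off and the RMS step below are placeholders, not our actual product code), each channel gets its own analysis worker draining a queue of sample blocks:

        import java.util.concurrent.*;

        // Loose sketch: one analysis worker per channel, each draining its own
        // queue of sample blocks. Channel count and the RMS computation are
        // placeholders for illustration.
        public class ChannelWorkers {
            static final int CHANNELS = 8;   // a real system might have 128+

            public static void main(String[] args) throws InterruptedException {
                for (int ch = 0; ch < CHANNELS; ch++) {
                    final int channel = ch;
                    final BlockingQueue<float[]> queue = new LinkedBlockingQueue<float[]>();

                    Thread worker = new Thread(new Runnable() {
                        public void run() {
                            try {
                                while (true) {
                                    float[] block = queue.take();   // wait for the next sample block
                                    double rms = 0;
                                    for (float s : block) rms += s * s;
                                    rms = Math.sqrt(rms / block.length);
                                    System.out.println("ch " + channel + " rms=" + rms);
                                }
                            } catch (InterruptedException e) {
                                // interrupted: stop this channel's worker
                            }
                        }
                    });
                    worker.setDaemon(true);
                    worker.start();

                    // Stand-in for the acquisition side handing off a block of samples.
                    queue.offer(new float[] { 0.1f, -0.2f, 0.3f });
                }
                Thread.sleep(500);   // let the daemon workers drain before the demo exits
            }
        }

    With threads per channel like this, more cores translate directly into more channels analysed without the display updates falling behind.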
  • by kabocox ( 199019 ) on Tuesday August 15, 2006 @04:15PM (#15913126)
    It will suck if they start realizing how much more money they could be making by defining a core as a CPU for licensing...

    They'll wait until we all have 4 to 8 cores. Then they'll hit us with a 4-8x increase in licensing costs. They don't want to kill off multi-core processing for mainstream use before it really begins.
  • Bzzzt. Nope. (Score:3, Insightful)

    by JayBat ( 617968 ) on Tuesday August 15, 2006 @04:32PM (#15913383)
    Back in the dawn of time, when dirt was new, we "taped-out" by writing GDSII to a 1/2" 9-track 1600bpi magtape.

    Back before the dawn of time, when we didn't have dirt yet, we "cut rubies" (used X-Acto knives and straightedges to cut Rubylith). People still use Rubylith [ehow.com] to do fabric silkscreening and such. Colored tape on paper was no good: not dimensionally stable and not enough contrast for camera reduction.

    -Jay-

  • by mrchaotica ( 681592 ) * on Tuesday August 15, 2006 @05:00PM (#15913755)

    If you don't fundamentally understand parallelism, Java isn't going to help you. I mean, so it's got a "synchronized" keyword. So what? You've still got to know at what granularity you want to synchronize stuff, you've still got to avoid deadlocks and race conditions, etc.

    The only thing hyping Java as a magic silver bullet will do is encourage the creation of a lot of buggy threaded code.
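    A tiny made-up example of what picking the granularity means: adding synchronized to the method fixes the lost-update race below, but it also serializes every caller on one lock, and that trade-off is exactly the part the language can't decide for you.

        // Made-up example: a lost-update race and the blunt synchronized fix.
        public class Counter {
            private long count = 0;

            // Broken under concurrency: count++ is read-modify-write, so two
            // threads can read the same value and one increment gets lost.
            public void incrementRacy() {
                count++;
            }

            // Correct, but coarse-grained: every caller now serializes on
            // this object's lock.
            public synchronized void incrementSafe() {
                count++;
            }

            public synchronized long get() {
                return count;
            }

            public static void main(String[] args) throws InterruptedException {
                final Counter c = new Counter();
                Thread[] threads = new Thread[4];
                for (int t = 0; t < threads.length; t++) {
                    threads[t] = new Thread(new Runnable() {
                        public void run() {
                            for (int i = 0; i < 100000; i++) c.incrementSafe();
                        }
                    });
                    threads[t].start();
                }
                for (Thread t : threads) t.join();
                System.out.println(c.get());   // 400000 with incrementSafe; usually less with incrementRacy
            }
        }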

  • by IronChef ( 164482 ) on Tuesday August 15, 2006 @05:10PM (#15913864)
    All this multi-core stuff is great, but is software keeping pace? It's nice to multitask more quickly, but unless I am mistaken, that extra core doesn't help when you are playing a 3D game.

    (I read that Unreal's upcoming "Gemini" rendering engine will be multi-threaded on the PS3. Hopefully that'll mean it supports multiple procs on the PC too.)
  • by NerveGas ( 168686 ) on Tuesday August 15, 2006 @06:10PM (#15914502)
    Photoshop can max out the four cores in my dual dual-core Opteron setup. Admittedly, I don't do that often, but that's still one app which *can*, and that's just a desktop app. Most server-oriented applications, however, are designed to take advantage of multiple CPUs.

    steve
  • Re:Taped out? (Score:3, Insightful)

    by owlstead ( 636356 ) on Tuesday August 15, 2006 @06:43PM (#15914816)
    Would not want to literally tape out one of these beasts :)
