AMD Demos Dual-Core Athlon 64

DigitumDei writes "Dual core chips came closer to reality as AMD demonstrated their Athlon64 dual-core offering. The 90nm technology chip will use the same 939-pin infrastructure and cooling solutions as the current Athlon 64 chips, meaning that upgrading to a dual-core chip from your current AMD64 will require little more than a BIOS update. Available in the second half of this year, the chip will be added to AMD's current line (Athlon64, Athlon FX, Sempron)."
This discussion has been archived. No new comments can be posted.

  • by Dragoon412 ( 648209 ) on Thursday February 24, 2005 @10:16AM (#11766398)
    I don't understand the hype about dual core CPUs.

    As I understand it, they work almost identically to an SMP setup, meaning they don't offer much of any performance benefit in most apps (particularly games). They draw more power, they run at higher temperatures, etc.

    Is there something I'm missing? Or is this whole dual-core mess really just SMP on one CPU? Because from what I've read on the likes of Extremetech, Anandtech, and so on, I'm not finding any reason to be impressed.
    • they don't offer much of any performance benefit in most apps

      And how many apps & other processes is your system running at the moment? Mine's running 58 with 518 threads.

      • by dsginter ( 104154 ) on Thursday February 24, 2005 @10:31AM (#11766593)
        And how many apps & other processes is your system running at the moment? Mine's running 58 with 518 threads.

        But what's the processor utilization? On most systems, it's usually less than 10 percent. So when a user does something, the bottleneck is usually not the processor. It's usually the hard drive.

        Money would be better spent on RAID, rather than dual core or dual processor.
        • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday February 24, 2005 @10:37AM (#11766665) Homepage Journal
          You can use an affinity tool to put your process on a single processor alone, and everything else on the other. This all but eliminates context switches in your program's context, while all other processes can continue to run on the other core(s). The only thing that consumes a lot of CPU that is not typically multithreaded is game software. Eventually, this too will be multithreaded, as we see more multi-core systems out there.
        • by Donny Smith ( 567043 ) on Thursday February 24, 2005 @11:21AM (#11767125)
          >Money would be better spent on RAID, rather than dual core or dual processor.

          You're right about that.

          Unlike CPUs, which become worthless in less than 2 years, RAID h/w lasts a bit longer.

          Some five years ago I bought an Ultra2Wide SCSI 320 card and an (at the time big) 8GB HDD - I paid $400 for the card and $250 for the HDD.

          I still use the card - I haven't checked, but I guess it should be about as fast as SATA II - and the SCSI disk works too (although it's quite useless - I use it as a dedicated swap disk).

          In the meantime I went through 3-4 generations of motherboards and CPUs (consecutive 100% wipeouts) and my RAID stuff still rocks...

          By year's end I'll go not for a dual core CPU system but for what's today top of the line nForce4 system. Screw the hype.
        • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Thursday February 24, 2005 @11:30AM (#11767239) Homepage Journal
          But what's the processor utilization? On most systems, it's usually less than 10 percent. So when a user does something, the bottleneck is usually not the processor. It's usually the hard drive.

          That's most systems, but certainly not all. I wrote a web application in Zope that acts as a portal to our scanned document warehouse. Whenever a customer wants to access some of the data we're storing for them, we fetch a few TIFFs from a Samba filesystem, convert them into a PDF with ImageMagick, and send them out. A RAID wouldn't make a bit of difference to our setup, since even the comparatively slow network file retrieval is much faster than the image processing which is the real bottleneck.

          Our system is idle probably 95% of the time. In fact, it currently has a 5-minute load average of 0.04. But in that other 5% of the time, we want it to respond NOW and not 30 seconds from now. This is a pretty common situation for server machines - relatively long periods of inactivity punctuated by short periods of frantic scurrying - and it seems reasonable that AMD is offering their 64-bit server chip with this feature.

        • But what's the processor utilization? On most systems, it's usually less than 10 percent.

          Regardless of the fact that you pulled that number out of your ass ;-) there are many applications that are processor, not data, intensive. Also, in the case of servers that run multiple services, if one of your services has a problem and pegs out a CPU, your site is not completely crippled, and it is also much easier to remotely connect to the server to fix it. Ever try fixing a machine with one CPU that is completely
      • by Waffle Iron ( 339739 ) on Thursday February 24, 2005 @10:36AM (#11766653)
        And how many apps & other processes is your system running at the moment? Mine's running 58 with 518 threads.

        Typically 517 of those threads are asleep waiting for IO or a signal, and the one piece of information that you are currently waiting for is being processed in the single remaining active thread.

      • by TheRaven64 ( 641858 ) on Thursday February 24, 2005 @10:38AM (#11766682) Journal
        I only do two things that really tax my CPU. Compiling and video editing. Compiling is embarrassingly parallel, and make programs (including GNU make) have been able to take advantage of this for ages. Generally, the best performance can be achieved by running make with number of CPUs + 1 way parallelism. Video editing is similarly parallel, since most CPU intensive things are effects that need to be applied to a large number of frames, making it trivial to split the workload. I would certainly see a large performance benefit from SMP.

        Before I abandoned the desktop in favour of the laptop, I had an SMP system, and it was nicer to use than my faster UP system, since single-threaded computationally expensive things could be run on one CPU leaving the other one free for UI-related tasks.

    • by teg ( 97890 ) on Thursday February 24, 2005 @10:23AM (#11766484)

      I don't understand the hype about dual core CPUs.

      As I understand it, they work almost identically to an SMP setup, meaning they don't offer much of any performance benefit in most apps (particularly games). They draw more power, they run at higher temperatures, etc.

      SMP without the mess (extra CPUs, cooling, expensive/complicated motherboards) and cost is definitely something to be impressed about.

      It should give a big performance boost in a multi-app, multi-threaded environment.

      • by c.r.o.c.o ( 123083 ) on Thursday February 24, 2005 @10:39AM (#11766687)
        SMP without the mess (extra CPUs, cooling, expensive/complicated motherboards) and cost is definitely something to be impressed about.


        The motherboards supporting dual-core CPUs should be identical to those running single-core CPUs. I guess this is where having the memory controller integrated into the CPU really pays off for AMD, since it further simplifies motherboard design. But in the past, SMP motherboards weren't THAT much more expensive (at most $100 extra) than similar single-CPU motherboards. The main cost associated with SMP setups was the very expensive SMP CPUs, which were at least 1.5 times as expensive as regular CPUs. The pricing of dual-core CPUs remains to be seen, but I think it'll still be cheaper than 2 separate SMP-enabled CPUs.

        However I completely agree with the rest of your post. Not having separate heatsinks, large motherboards, etc is a definite advantage. Just because of that the market acceptance will increase very rapidly. I wouldn't be surprised if games and other CPU intensive apps started supporting dual core CPUs soon.
        • Also, with dual cores the on-chip memory controller has the distinct advantage of being shared between the cores, meaning core-to-core communication is significantly faster than chip-to-chip communication.
    • by forkazoo ( 138186 ) <wrosecrans AT gmail DOT com> on Thursday February 24, 2005 @10:23AM (#11766489) Homepage
      Yup, pretty much. But, some of us do more than play games. Also, as multiprocessing hardware becomes more common, game makers will begin to take advantage of the benefits. For me personally, when I want to use my box for general-purpose stuff, and it is running the MythTV backend and transcoding some files into MPEG4, and I am rendering a 3D animation, and so on... Well, having SMP sure isn't a bad thing!
    • by drwtsn32 ( 674346 ) on Thursday February 24, 2005 @10:24AM (#11766505)
      Sure, benchmarking a single app on an SMP system often makes little to no performance difference, but SMP is fantastic if you are a heavy multitasker and work with several apps at once.

      My first SMP system was a dual Pentium 133 MHz box. After that I never went back to a single proc... until the Pentium 4 came out. It's disappointing that this chip does not support SMP (except for the Xeon line). P4 hyperthreading helped bring back some SMP goodness, but it's still not as good as two real chips.

      Personally I can't wait for dual core CPUs!
      • It depends on the app - specifically, whether it is multithreaded or not. If it is, different threads can run on different cores.

        Most current desktop apps don't use many (or any) threads, and therefore aren't able to really capitalise on SMP architectures.

        As dual-core gets more generally widespread, there will be more pressure/benefit for developers to write multi-threaded apps.
        • by plague3106 ( 71849 ) on Thursday February 24, 2005 @10:58AM (#11766894)
          You know, I've been hearing this alot on this thread, and I don't understand the thinking. You do NOT need threaded apps to take advantage of SMP. A SINGLE app will run faster (potentially) on SMP if it is threaded, but running a SINGLE application isn't the big benefit SMP gets you.

          It's that Firefox can be run on one processor, while your MP3 player is running at the same time on the other. This in turn will speed up BOTH applications, since Firefox does not ever have to yield to the player, and vice versa.

          Since there's more going on than just those two apps (various system processes, etc.), your machine should be faster, as each processor now only has to worry about HALF the number of processes it did before.

          I don't understand why a largely tech audience misses that point. We're not in DOS anymore; the OSes we are using all run more than one process at a time.
          • But all the processes you're talking about really don't consume much processor time.

            Even on a single core system, the CPU load of running an OS, Firefox and an MP3 player would be pretty minimal, especially since Firefox isn't usually doing anything unless it's actually loading a page.

            It seems to me that the real benefit of dual core is for continuous high-load applications, such as hardcore gaming or bulk-serving database queries or web pages. The latter is probably more likely to be running on an SM
          • by pclminion ( 145572 ) on Thursday February 24, 2005 @12:31PM (#11767952)
            It's that Firefox can be run on one processor, while your MP3 player is running at the same time on the other. This in turn will speed up BOTH applications, since Firefox does not ever have to yield to the player, and vice versa.

            Only if those applications were maxing the CPU to begin with. An MP3 player on a modern processor only utilizes around 1% of its capacity. Firefox a similar amount. They can easily share a CPU with 98% of its capacity to spare. They might run imperceptibly faster due to better cache utilization, but the reality is that almost every application spends 99% of its time waiting for something slower, like disk or network.

            The only sort of application that a typical user (i.e. a non-developer) uses that's actually capable of maxing the CPU is, say, video editing, or a high-performance game.

        • "Most current desktop apps don't use many/any threads therefore aren't as able to really capitalize on SMP architectures."
          They don't? The desktop app my company sells is multi-threaded. The app we wrote to manage our support calls is multi-threaded. Frankly, any program that has the ability to have more than a single view should be multi-threaded. The phone call manager uses a thread to check the status of all the support techs. A thread to refresh the display of the waiting calls. And a thread for
      • I trust you are aware that you could have gone dual AMD? And that in many cases HT actually results in poorer performance?
    • by i41Overlord ( 829913 ) on Thursday February 24, 2005 @10:28AM (#11766554)
      As I understand it, they work almost identically to an SMP setup, meaning they don't offer much of any performance benefit in most apps (particularly games). They draw more power, they run at higher temperatures, etc.

      The reason most games don't get a performance boost from dual CPUs is that they aren't programmed to take advantage of the other CPU. How many end-users' home systems have dual CPUs? Hardly any of them. There was no reason for game makers to go through the effort of programming for something that 99.99% of their customers can't use.

      With the new dual core chips, technically it isn't anything groundbreaking, but it will ensure that there's much more widespread adoption of multiprocessor systems. With more of the userbase using dual core CPUs, game makers will have a reason to program to take advantage of it, and you'll begin to see games that do see a performance increase when using dual CPUs (or dual CPU cores).
    • by wowbagger ( 69688 ) on Thursday February 24, 2005 @10:28AM (#11766556) Homepage Journal
      Most games have several potential threads running at once:
      • Graphics rendering
      • Sound rendering (compositing the various sounds together, and playing music)
      • Game logic (monster AI, object movement, physics model)
      • User input monitoring
      • Network processing


      An SMP system can greatly benefit a game designed to be truly multithreaded.

      Even if the game is NOT designed to be multithreaded, there is the fact that one core can be running the game, while the other core handles interrupts, operating system processing, and other tasks.

      The days of your computer doing only one thing at a time are long gone.
      • The problem here is that the multi-core will inevitably mean slower cores, and a lower speed at which any one thread can possibly be processed. In gaming, for example, it may be multithreaded, but most of the threads are pathetic (user input monitoring is actually really easy and insignificant for example), and one thread (maybe two very rarely) is a resource hog. Therefore even if you have overall 150% more clocks, but each core constitutes 75% of the clock speed of the single core alternative, then the r
    • by Junta ( 36770 ) on Thursday February 24, 2005 @10:28AM (#11766557)
      The big deal is that it is a way to get *cheap* SMP without two motherboard sockets; the downsides are cooling such densities (read: reduced clock per core) and sharing one memory controller between two cores (which is what Intel SMP has been doing forever; AMD used a memory controller per processor in a NUMA fashion full time, with HyperTransport to access memory not associated with the current processor).

      Theoretically, the dual core clocks will add up to more cycles overall than a single core, but the single core will have more clocks per individual thread. So unless a game leverages threading very nicely in the processor-intensive segments, a multi-core may be slower than a single-core for the high-end gaming scene; for workstation/server/HPC fields, however, it is very exciting.
      • These could really be helpful in video editing and encoding. If I could pull my 2 Xeons and replace them with 2 dual core Xeons, I suspect I would see a big decrease in compile times....
    • by Shalda ( 560388 ) on Thursday February 24, 2005 @10:29AM (#11766564) Homepage Journal
      Effectively it's SMP on one CPU. Something I very much look forward to, as I always build my desktops SMP. It also incorporates some of the overhead of SMP into the CPU, driving down the system price a little bit.

      While you're right that SMP offers little performance to most apps, I tend to run a lot of CPU hogs at the same time. Watch a DVD while waiting for a project to finish compiling or whatnot. It can also help keep runaway processes from sabotaging your system. I used to have a program that set its priority to 'AboveNormal' and from time to time it would hang up in a loop. Since it was running at a higher priority, you sometimes couldn't bring up Task Manager to kill it off, as the higher-priority thread always took precedence. And if all else fails, you can set up 2 SETI@Home clients and process twice as many packets. But do yourself a favor and set their priority to 'Low'.

      Also, you've got a chicken and egg problem. The reason there are so few programs that benefit from SMP is that there are so few computers that are SMP. When I was running some computer labs at a Big 10 university a few years back, I insisted on SMP workstations so that the CS students could learn to program multithreaded apps and see the benefits when they ran on 1 vs. 2 processors.

      Lastly, my basement is very poorly insulated and gets a bit chilly in the winter. Anything to help warm it up and keep my fingers working properly is a good thing(tm)!
    • by magarity ( 164372 ) on Thursday February 24, 2005 @10:29AM (#11766566)
      they don't offer much of any performance benefit in most apps (particularly games)

      Ah, rejoice irresponsible youth! Right now a $25,000 8 CPU machine that can no longer keep up with a decent sized corporate database needs to be replaced with a $60,000 16-CPU machine. After dual core hits the market, it can be upgraded for the price of 8 new dual-core CPUs and a BIOS flash. Less money for hardware == more money for bonuses... W00T! W00T!
      • What makes you think the database is CPU-bound instead of I/O-bound? And what makes you think the manufacturer of the two machines will allow this software patch instead of selling lots of expensive new systems?
        • by magarity ( 164372 ) on Thursday February 24, 2005 @11:50AM (#11767470)
          What makes you think the database is CPU-bound instead of I/O-bound?

          CPU load versus disk queue length versus memory page accesses. But no worries about my off the cuff example being the only case; a very popular move these days is to run VMware (or similar) on an 8 or 16 way machine in order to provide 20, 30, or more "servers" doing light to modest loads so introducing dual core CPUs will allow even more virtual servers hosted on a single physical box. The cost saving potential is TREMENDOUS.

          And what makes you think the manufacturer of the two machines will allow this software patch

          It only takes one manufacturer to advertise "Buy our 8 way single core now and upgrade to dual core on the cheap later!". You can buy an 8-way now with only 2 CPUs and add more later; there's no reason why you won't be able to buy an 8-way dual-core-capable box with only a few single cores and upgrade/add CPUs for years. Big corporate servers costing serious money are upgraded and/or assigned to different roles for a long time compared to desktop PCs. It's a whole different world. One company *finally* decommissioned a quad PPro server and donated it to a nonprofit I know. That's been in service 24/7 for what, 14 years? It was probably over $50K when new and they have to get their money's worth out of it. Anyway, the manufacturers know that most IT budgets are not unlimited and customers always like less expensive alternatives.
    • by SpongeBobLinuxPants ( 840979 ) on Thursday February 24, 2005 @10:30AM (#11766579) Homepage
      meaning they don't offer much of any performance benefit in most apps (particularly games). They draw more power, they run at higher temperatures, etc.

      from TFA:

      For example, a processor with dual 2.0-GHz cores can deliver performance not all that different from a single-core 3.5-GHz part. More important, such a dual-core part will hold down power dissipation to a figure closer to that of a standalone 2.0-GHz CPU, allowing processing throughput to effectively double for not much more power.

      and

      At such speeds, single-CPU processors can often dissipate more than 150 W.
      The dual-core Athlon 64 runs at a clock-speed of 2.4 GHz and has a maximum power dissipation of 100 W.
      • You can't really take TFA seriously, since it's InformationWeek, and not a reputable source. Just look at your first quote. There are plenty of 2.0 GHz single core chips out there that can deliver better performance than a 3.5 GHz P4...

        When they talk about single core chips using 150 watts, they're talking about Intel chips (almost certainly the Itanium). 100 watts for a dual core 2.4 GHz Opteron is a 60% increase in power usage over the single core version.

        Of course the parent to your post is just plain wro
      • deliver performance not all that different from a single-core 3.5-GHz part.

        Agreed... but that is generally. When it comes to games and other programs which rely on higher clock speeds, it doesn't matter that the overall, aggregate performance is similar to a 3.5 GHz machine. A single threaded app will run at the clock speed of the chip. So if the chip is clocked at 2.0, then the single threaded app will only run at 2.0, and not the aggregate 3.5. Then there is the issue of an operating system that can

    • Dual core is basically SMP as far as I know, so you're not missing anything, except maybe some of the benefits of running SMP. From my experience, SMP can go a long way to making a system feel less "sluggish". Tasks that will essentially lock up a single processor system can run in the background without making a system useless on SMP.

      Now this doesn't usually add up enough to warrant a normal user to spend the extra money on SMP, but if dual cores become the replacement for the desktop line, they should

    • Some instructions on the x86 hardware have a delay; let's say I'm asking for something from ram or cache. This takes more than one cycle (3 from cache if I remember correctly). On a normal, non hyperthreading processor, the processor sits idle until that memory value comes back.

      On a hyperthreading processor, the processor can do an instruction on another process in that time. So a second process can come in and do a couple ADDLs.

      It may not make your game run any faster, but if you have something in th

    • There must've been some similar reasoning when ICs first came out. "What do they offer compared to the soldered circuits we know and love?" Well, for one: integration.
    • Don't offer much benefit... for games that aren't written to use them.

      However, lots of games could very easily be written to use them. Not to mention they offer an immediate (small) performance benefit to gamers of being able to offload everything but the game onto the second processor. Windows, the MP3 player, the virus scanner, the web browser showing a stupid flash ad, the (speedhack... er I mean) "stats" addon you have running for your favorite MMO/FPS...

      "But writing parallel software is hard!" is the usual
    • by oldmanmtn ( 33675 ) on Thursday February 24, 2005 @10:34AM (#11766634)
      As I understand it, they work almost identically to an SMP setup, meaning they don't offer much of any performance benefit in most apps (particularly games).

      I suppose if your idea of "most apps" is games, then this probably isn't an area that would be of interest to you.

      If you have any multithreaded app that is even remotely competently written, then it will benefit from dual cores or (possibly) hyperthreading. If your multithreaded app is full of "big locks", then dual cores won't help, and the application designer is a failure.

      If you have a workload that has multiple processes running simultaneously, then it is also likely to benefit from dual core. It gets more interesting with business/server workloads, but "home users" can benefit too. Even something as simple as running xmms and gcc at the same time should go faster. Or running two instances of lame.

      The real win with dual core comes from increased throughput. A single job/application/process isn't likely to go any faster, but a full workload of multiple, reasonably parallelizable tasks will be faster.
      • If you have any multithreaded app that is even remotely competently written, then it will benefit from dual cores or (possibly) hyperthreading. If your multithreaded app is full of "big locks", then dual cores won't help, and the application designer is a failure.

        You assume everything can be trivially multithreaded. That's utterly wrong.

        First of all, multithreading is not always done for performance. Sometimes it makes the code easier to write. It's nice to be able to compartmentalize tasks into thread

    • It's simple. Everyone knows that you can scale better in a heavily multithreaded environment by adding more CPUs instead of increasing the speed of a single CPU - each CPU spends less time managing tasks and context switches than a single one would, since each CPU has to work with a smaller number of threads.

      And since even a desktop computer currently starts to approach the point where there are hundreds of threads running (check in Task Manager or top) - this makes quite a lot of sense.

      Also, a lot of people mi
    • by Anonymous Coward
      Yes, you are missing something. Many of us use computers for more than games. Java applications, for instance, are nearly always threaded and benefit greatly from multiple CPUs (at least on Linux, I can't speak for Windows). Or how about working on a PowerPoint presentation while your spreadsheet recalcs. Most any app that has a GUI will be more responsive if other activities are going on as well.

      Whether the price difference is worth it is another question. Also you can expect many threaded apps to b
    • it does help:

      one core for playing a game, the other for the OS

      or one core for ripping a dvd, the other for everything else

      as long as you don't only run one app at once you will greatly benefit, and even when you do (when you play a game for example), you still benefit.
    • Could we just kill the major security issues once and for all by dedicating one core 100% to security and encryption chores? I'd love to see hardware implementations for virus scanning, spyware, firewalling, encryption, VPN tunneling, authentication and smart patch management that have free use of their own high end CPU
    • by Vaystrem ( 761 ) on Thursday February 24, 2005 @10:57AM (#11766885)
      I've had the pleasure of using a number of friends' dual CPU systems over the years.

      The most relevant one to this discussion is my friend's Abit BP6 dual Celeron setup. At the time he was running dual Celeron 300s. Not that impressive, right? Except that he was able to host our Unreal Tournament server - and then join it with no lag for any of the players. Running Unreal Tournament by himself showed a 50% load on each processor. He was able to encode MP3s, burn CDs, and play games simultaneously. Something I was not able to do, and wanted to for the sake of time saving. I did not want to choose one activity over another - a 'leisure' productivity issue if you will.

      Fast forward to now. A nice dual CPU system would allow me to play games and encode movies / my audio files simultaneously while running Distributed.net. Are dual cores absolutely necessary? No, but when you are doing some very intensive applications you can't do anything else. As well, even though not all games are multi-threaded, various aspects of the game may be - the networking code, the sound code, etc. - meaning that the game may run somewhat better on a dual CPU setup.

    • by Thagg ( 9904 ) <thadbeier@gmail.com> on Thursday February 24, 2005 @11:25AM (#11767181) Journal
      Multiprocessing, both discrete and multicore, will accelerate all compute-bound applications in the future. Right now we haven't reached a critical mass yet - where programmers feel it's worth the effort to multi-thread all of their applications - but we will get there soon. It's not an easy change, and there is a whole world of problems that programmers haven't had experience with yet, but either these programmers will learn or they will not be competitive any longer.

      Face it, the days of increasing clock speeds are over. It's done. Finished. Kaput. The low hanging fruit has all been eaten.

      On the other hand, the multiprocessing benefits are huge and practically untapped. There is every reason to expect that in ten years we'll have 64 or 256 processors on a chip. People who hope to be working in ten years had better learn how to write for these systems.

      Compare the P4 to the Cell. The P4 goes to unbelievable lengths (even literally, in pipeline lengths!) to run at a high clock speed. Its contribution to global warming is substantial. It's expensive. And, it's an absolute dead end. Intel has already abandoned it.

      The Cell has eight much-simpler processors along with its Power core. It can, and will, compute 10 times as fast as a P4, if programmed correctly. The game programmers are going to be pulling their hair out for the next couple of years, but they are going to be the high-demand programmers of the next decade as they are the first over the wall of significant multiprocessing.

      Thad Beier
  • by garcia ( 6573 ) * on Thursday February 24, 2005 @10:16AM (#11766403)
    They talk a lot about this being the savior of power-consumption but:

    They are seen as the solution to power-consumption problems that have come to the fore as clock-speeds have increased beyond 3.0 GHz. At such speeds, single-CPU processors can often dissipate more than 150 W. In contrast, dual-core parts can reduce power consumption to more reasonable levels. For example, a processor with dual 2.0-GHz cores can deliver performance not all that different from a single-core 3.5-GHz part. More important, such a dual-core part will hold down power dissipation to a figure closer to that of a standalone 2.0-GHz CPU, allowing processing throughput to effectively double for not much more power.

    Yeah, great, so it reduces power-consumption to "more reasonable levels" yet in every article I have read on this no one really mentions much more than that. What's reasonable? Telling me twice the speed for not much more power doesn't mean anything to me (other than marketing doublespeak).

    What I want to know is how much money these processors will save in power consumption compared to how much more they will cost over their single core cousins... No one has said anything about that yet.

    Now, also, how many OSs (and applications) are prepared for dual-core support? Are there any available systems that are stable and do that?
    • Now, also, how many OSs (and applications) are prepared for dual-core support? Are there any available systems that are stable and do that?

      Solaris supports dual cores on both SPARC and x86. The UltraSPARC IV processors are dual core.

      Any application should be "prepared for dual-core support". If the application even has to be aware that it's running on a dual core or hyperthreaded CPU, then the OS is broken.
      The way I understood his question about app support was how well they're written with parallel threads etc. I.e. not whether they'll take advantage of the dual cores (yes, that's up to the OS to take care of), but how well they'll do it by virtue of their design. An OS can't just take a single thread and split its work 50/50 between the two cores, after all.
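      Exactly: the OS schedules threads, but only threads the application actually creates. A minimal sketch (using Python's multiprocessing module, purely for illustration) of a program dividing its own work so a second core can help:

```python
# The OS can't split one thread 50/50 across two cores; the program
# has to divide the work itself. Here a sum is split into two halves
# that can run on separate cores.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    mid = n // 2
    with Pool(processes=2) as pool:      # one worker per core
        parts = pool.map(partial_sum, [(0, mid), (mid, n)])
    print(sum(parts) == sum(range(n)))   # same answer, work split in two
```

A program written as one monolithic loop gets no benefit from the second core no matter how good the scheduler is.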
    • by Jugalator ( 259273 ) on Thursday February 24, 2005 @10:37AM (#11766667) Journal
      Now, also, how many OSs (and applications) are prepared for dual-core support?

      I won't go into detail of applications since I have no idea which apps you're interested in, but Windows XP Pro supports dual cores (it runs its multi-core kernel even if you just have a Pentium 4 with hyperthreading).

      Windows XP Home will not suffice though, which is a bit amusing since it's probably the most common OS in the homes of gamers, who are often the early adopters of this kind of tech nowadays. Unless they just pirated Windows XP Pro with a volume license key, of course. :-)

      Windows XP Media Center Edition and Windows XP Tablet PC Edition will support multiple cores though, probably in the same fashion as Pro.

      Someone else can fill in the details on the common Linux distros, but I'd be very surprised if this support isn't in by far most modern distros already, or can't be enabled fairly easily.
      • by Malor ( 3658 ) on Thursday February 24, 2005 @11:37AM (#11767321) Journal
        If XP works the same way 2000 did, when you upgrade to a dual-core from a single-core, you will have to reinstall the OS for the second core to be activated.

        2000 had two entirely separate sets of system files, one each for uni- and multi-processor. Even if you added a second CPU, if you didn't have the multiproc HAL to begin with, it simply wouldn't work.

        Because XP is just 2000 with a facelift, I suspect this won't have changed. You are correct that if your initial install was on a hyperthreading P4, which 'looks like' two logical processors, XP would have installed its multi-CPU core.

        If, however, you are installing a dual-core Athlon, chances are quite high that you didn't do your initial install on a P4. So you won't have the multiproc system files, and you'll probably have to reinstall to get the second core going. (A 'repair' installation may be adequate, and would be much less painful.)

        Linux works somewhat similarly, but fortunately you can replace just the kernel, rather than the entire OS.
        • by un4given ( 114183 ) <`bvoltz' `at' `gmail.com'> on Thursday February 24, 2005 @03:08PM (#11769714)
          2000 had two entirely separate sets of system files, one each for uni- and multi-processor. Even if you added a second CPU, if you didn't have the multiproc HAL to begin with, it simply wouldn't work.

          Fortunately, this is incorrect. As described in Microsoft's knowledge base [microsoft.com], a HAL change is all that is required to take advantage of the second processor. Windows NT 4.0, 2000, XP, and 2003 all are capable of this, although NT needed a utility uptomp.exe to accomplish this feat.

          It is very common to have to do this on dual-processor capable servers when installing a second processor.

    • The power increase between a single-core and a dual-core processor is probably less than you think: I would not be surprised if it is in the 20% range on average.

      Simply running the clock (and not performing any operations) on most processors draws ~60-70% of the part's maximum power, which suggests that the load on the part determines how much of that remaining 30-40% is being dissipated.

      Working under this assumption, worst case, a dual-core processor would draw 40% more power than a single-core part.
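      Working those assumptions through in numbers (the 60/40 split and the shared clock tree are the assumptions above, not measurements):

```python
# If ~60% of max power goes to just running the (shared) clock tree,
# only the load-dependent ~40% is duplicated by a second core.
base_fraction = 0.60  # clock distribution, assumed shared between cores
load_fraction = 0.40  # extra power when one core is fully loaded

single_full = base_fraction + load_fraction  # 1.0 = 100% of max power
dual_full = single_full + load_fraction      # second core adds only its load share
increase = dual_full / single_full - 1
print(f"worst-case increase: {increase:.0%}")  # 40%
```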
    • by SirCyn ( 694031 ) on Thursday February 24, 2005 @10:41AM (#11766719) Journal
      Now, also, how many OSs (and applications) are prepared for dual-core support? Are there any available systems that are stable and do that?

      Microsoft Windows 2000 and XP support 2.
      Apple OSX supports 2.
      FreeBSD supports 4 (or more?). NetBSD supports 2 (or more?). OpenBSD is working on it (last I knew).
      Linux 2.4.x and 2.6.x support 2+.
      Sun Solaris has supported 2+ for as long as I can remember.
      AIX, HP-UX, SCO Unix and all those support 2+.

      Did I miss any?
      Almost all OSes for the last several years have supported multiple processors natively. At worst these OSes would need a patch to update their SMP awareness.

      Applications, on the other hand, have been slower to change to a multithreaded model. Many server-grade programs are ready. Most common desktop programs are not.

      I have used a dual Athlon MP system for a long time now. The biggest difference I can tell you between dual 1.6GHz and single 3.2GHz is that one process cannot take over the machine. Even with modern preemption I can tell the difference when I have a second CPU processing my clicks and keystrokes. All I can say is "try one for a while, you'll get hooked".
    • Yeah, great, so it reduces power-consumption to "more reasonable levels" yet in every article I have read on this no one really mentions much more than that.

      Power consumption grows roughly with the square of the clock speed (everything else being equal), because higher clocks generally require higher core voltage. Doubling the clock speed of a CPU will therefore roughly quadruple its power usage. On the other hand, simply adding an extra core at the same clock will only double the power requirements of the chip.
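      As a sanity check of that rule of thumb (the quadratic scaling is an approximation; real scaling depends on how voltage tracks frequency):

```python
# Rule of thumb from above: power ~ f^2 (since voltage rises with
# clock), while a second identical core at the same clock only
# doubles the power.
def power_after_overclock(base_watts, speedup):
    """Power after raising the clock by `speedup`, quadratic rule of thumb."""
    return base_watts * speedup ** 2

def power_with_extra_core(base_watts):
    """Power after adding a second identical core at the same clock."""
    return base_watts * 2

p = 50.0  # illustrative watts for one core at base clock
print(power_after_overclock(p, 2.0))  # 200.0 W for double the clock
print(power_with_extra_core(p))       # 100.0 W for a second core
```

Same nominal throughput gain, half the power: that's the whole argument for dual core in a nutshell.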

    • Good question. If you look at the recent history of CPU design (say, for the past 20 years), you see that the primary concern of architects have changed with the technology over time. In the 80's any design you came up with was limited by how many transistors you could squeeze onto a chip, and so everyone was worried about transistor count. By the 90's the transistors became abundant and small enough that a lack of transistors was no longer a primary concern of designers --- instead, they were much more

  • by Gates82 ( 706573 ) on Thursday February 24, 2005 @10:19AM (#11766432)
    Rather than using two identical cores, is there any difficulty in pairing, say, a 2.4 GHz core with a 500 MHz one? I would love to have the latest and greatest chip for gaming and crunching through video, and then a low-powered second core to play my MP3s and surf the web while the high-end core is crunching through numbers in the background. Guess that's why I have a laptop: play on that while the desktop is doing its thing.

    --
    So who is hotter? Ali or Ali's sister?

    • That is what a good realtime OS scheduler is supposed to do: time-multiplex your processor appropriately. It's not perfect, but for the kind of workload you describe, far from perfect is good enough that you won't perceive a difference.
    • Most chips nowadays throttle down when they're not doing intensive calculations. They aren't running full-bore all the time.

      Even my video card throttles down: my GeForce 6600GT runs at 350 MHz when doing 2D work, and speeds up to 500 MHz when playing a 3D game.
    • by bbrack ( 842686 ) on Thursday February 24, 2005 @10:46AM (#11766773)
      The main problem with this is that the processors share a clock tree and arbitration logic - if the clock multiplier is contained in the arbitration logic, then having one core at one speed and another at a different speed would be impossible.

      If the clock multiplier is contained separately in each core, it would be possible - however, having different clock ratios on each core would considerably complicate the arbitration logic, since it would have to deal with different setup and hold timings when sending data to one core vs. the other - this would probably greatly increase your chances of inducing a processor error.

      Trying to do this could also require a great deal more design difference between the two cores, which might cause many problems. It would also make it much more difficult to sell single-core versions of dual-core chips (i.e. one core fails, the other is good: blow a few fuses to make the chip look like a single-core chip, and sell it as one).
  • I'm poor! (Score:5, Interesting)

    by shamowfski ( 808477 ) on Thursday February 24, 2005 @10:20AM (#11766446)
    Does dual core mean dual price? With current FX-55s costing around a grand, what can we expect these to cost? $1,500-2,000? If AMD wants to remain competitive with Intel, they're going to have to work on that. Whoever guessed AMD would be the one having to lower prices to compete?
    • Prices haven't changed in a year on CPUs and RAM. Maybe this will at least force the low to mid level stuff to drop.
    • They will probably do the same as Intel: dual-core chips won't use the leading-edge cores, probably for power and heat-dissipation reasons as well as marketing ones.

      Therefore if you're buying a system for raw performance, such as hardcore gaming, you'll still be better off for now buying a single-core FX-55. Most current game engines are optimised for a single core anyway. It'll be interesting to see how quickly that changes, if at all.
    • Re:I'm poor! (Score:2, Informative)

      by ThaReetLad ( 538112 )
      According to AMD, no. IIRC they said recently that dual core Opterons will be inline, price wise, with the top end Opterons currently available, while offering much improved performance.
    • by mapmaker ( 140036 ) on Thursday February 24, 2005 @11:31AM (#11767250)
      With current fx-55's costing around a grand, what can we expect these to cost? If AMD wants to remain competitive with Intel they are going to have to work on that.

      If you need a car but you're poor, you buy the Chevy Cavalier, not the Chevy Corvette.

      If you need a processor but you're poor, you buy an AMD Sempron, not the AMD FX-55.

      Complaining that AMD needs to lower the price on their top processors is like complaining that Chevrolet needs to lower the price of Corvettes.

  • by bigtallmofo ( 695287 ) on Thursday February 24, 2005 @10:22AM (#11766470)
    Before you buy one of these dual-core processors for your server, make sure that your software vendor isn't going to double your price on you.

    Oracle and others [com.com] have announced plans to increase their revenue by charging people for multiple cores in their single processor.
  • Awesome! 939 Huzzah! (Score:4, Interesting)

    by Anonymous Coward on Thursday February 24, 2005 @10:22AM (#11766474)

    I actually just purchased a socket 939 board for this exact reason. I'm extremely pleased with AMD for not forcing yet another motherboard upgrade on us based on chip advancement. I got a cheap Athlon 64 3000+, but two or three years from now I can go dual-core without getting a new motherboard, memory, etc. and I like that.

    I understand that sometimes it's necessary to upgrade motherboards instead of just chips (FSB changes and so forth), but for those of us who can't afford top-of-the-line, bleeding-edge stuff, it's nice to see upgradability for more than just a few months into the future.


  • Dual core chips came closer to reality as AMD demonstrated their Athlon64 dual core offering. ...As I understand it, that's about as close to reality as you can get...
  • Hrmmmmm (Score:2, Insightful)

    by REDSECTOR1 ( 695888 )

    MAKEOPTS="-j3"

    Horray
  • Complexity (Score:2, Interesting)

    by Efialtis ( 777851 )
    They have been experimenting with multi-layered parallel processing for a long time, and I think this is the "realized result" of those experiments.
    We will see newer dual- and multi-core processors come out in the future, and the ability to parallel process with multiple chips on one board...
    Should be exciting...
  • Sweet... (Score:4, Interesting)

    by sapgau ( 413511 ) on Thursday February 24, 2005 @11:08AM (#11766991) Journal
    Intel wake up!! See how easy it is to upgrade, no new socket layouts, no new motherboards.

    Besides, trying to determine which model the dual-core Pentium is gives me headaches.

    /owns AMD, trying very hard to repress fanboy attitudes.
  • Closer to reality? (Score:4, Interesting)

    by cmclean ( 230069 ) on Thursday February 24, 2005 @01:08PM (#11768347) Homepage Journal
    Dual-core chips are already a reality [sun.com], Sun's UltraSPARC IV uses 2 UltraSPARC-III pipelines.

    Perhaps the author means "x86 dual-core chips"?
