
Intel and AMD's 2005 Plans Revealed

Takemedown writes "There's a good article on CTZ that talks about Intel and AMD's plans. Intel, continuing on their 18-month chipset refresh rate, will introduce their Glenwood and Lakeport chipsets for the Smithfield dual-core desktop microprocessor in 2005. The chipsets will support SATA II, Matrix RAID and a higher system bus speed for the new Pentium 4 name holder. As far as Intel's dual-core strategy is concerned, they will most likely introduce their dual-core parts by the very end of Q2 or in Q3 this year, so those waiting for these next-generation chips are better off holding off on an upgrade. Secondly, those hoping for a noticeable performance gain in regular computing tasks are in for a disappointment. Dual-core microprocessors are for those who like to do multitasking or work on multithreaded applications. For example, if you are gaming and burning a DVD at the same time, dual-core chips will come in handy and will definitely give a smooth computing experience."
This discussion has been archived. No new comments can be posted.


  • by SnprBoB86 ( 576143 ) on Monday January 10, 2005 @04:05PM (#11313170) Homepage
    ...chat on AIM and refresh Slashdot fast enough to get a first post!
  • Please... (Score:5, Interesting)

    by GreyWolf3000 ( 468618 ) on Monday January 10, 2005 @04:05PM (#11313173) Journal
    Can someone sum up the benefits of multi-core processors over SMP for me?

    Is it more efficient memory sharing amongst the different cores?

    • Re:Please... (Score:3, Insightful)

      by dan g ( 30777 )
      Please, can someone tell me how to use this new-fangled google [google.com] thing I keep hearing about?
    • More about cache sharing than RAM sharing, but yes.
      Still wondering if these cores will support something that many supercomputing chips have had for a long time: the ability for both cores to run the exact same instructions, thus eliminating the overhead of error checks, since the error check is simply a comparison between the two cores.
      • Still wondering if these cores will support something that many supercomputing chips have had for a long time: the ability for both cores to run the exact same instructions, thus eliminating the overhead of error checks, since the error check is simply a comparison between the two cores.

        Do you know what kind of error checking a typical CPU does and how much the overhead is?
        • Na, not really. I just know that a lot of the high-end non-x86 dual-core designs do this. I don't really know if it's worth it, but I guess if your application doesn't thread across two CPUs well, this is a great way to take advantage of a second core.
      • Re:Please... (Score:5, Informative)

        by GileadGreene ( 539584 ) on Monday January 10, 2005 @04:44PM (#11313751) Homepage
        Still wondering if these cores will support something that many supercomputing chips have had for a long time: the ability for both cores to run the exact same instructions, thus eliminating the overhead of error checks, since the error check is simply a comparison between the two cores.

        This can be achieved on a commodity single-core processor using pure software techniques. The technique is known as Error Detection by Duplicated Instructions (EDDI), and is implemented as a compilation step between assembly code generation and object file generation. Stanford has done a bunch of work on this at their Center for Reliable Computing. I don't have any links readily available, but I'm sure that if you Google EDDI and the ARGOS project you'll find some good info.

        Note that IIRC experiments at Stanford showed that when using EDDI on a modern super-scalar processor the EDDI instructions can take advantage of unused portions of the pipeline, resulting in a significant reduction in overhead. You might still experience a slight performance hit, but on the other hand you don't need to add a whole new processor or core.
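The duplicate-and-compare idea described above is easy to sketch in software. Here is a toy Python illustration of the EDDI concept (the real Stanford pass duplicates at the assembly level; this only shows the shape of the technique): the operation runs twice, and the comparison of the two results is itself the error check.

```python
def eddi_add(a, b):
    """Toy EDDI-style duplication: execute the operation twice and
    compare the results; a mismatch signals a transient fault."""
    r1 = a + b    # primary instruction
    r2 = a + b    # duplicated (shadow) instruction
    if r1 != r2:  # the comparison replaces a separate error check
        raise RuntimeError("transient fault detected")
    return r1
```

On a superscalar core the shadow operations can often issue into otherwise idle execution slots, which is where the low measured overhead comes from.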

    • Can someone sum up the benefits of multi-core processors over SMP for me?

      It's cheaper.
    • Re:Please... (Score:5, Informative)

      by Thagg ( 9904 ) <thadbeier@gmail.com> on Monday January 10, 2005 @04:17PM (#11313324) Journal
      It's exactly the same as SMP, except for two things:

      1) Far less 'glue' circuitry is required on the motherboard. This allows cheaper multi-processor systems.

      2) Potentially, communication between the processors could be faster.

      Mostly, though, the advantage will be social -- if a large fraction of systems have multiple processors, as they will soon, then more and more applications will be written to take advantage of them.

      Thad Beier
    • by AusG4 ( 651867 )
      • If you're not his butler, then why bother replying in the first place?

        I'm sorry, but this really cheeses me off. It doesn't matter if it's an Ask Slashdot seeking advice from a forum of peers, or a question in another post's comments. There's always one or two schmucks who are so mortally offended by the question that they must waste their time by providing a Google search URL and a snarky remark. So what do you get out of it?

    • Re:Please... (Score:3, Informative)

      by webmosher ( 322834 )
      Memory as in on-die cache... yes, RAM... no:

      1) Core based processors have more internal/embedded synchronization built in, especially related to on chip caching. SMP relies more heavily on the O/S for maintaining concurrency.
      2) Connection between processors is shorter and theoretically faster. The big gain here is that the MB components for SMP are all integrated on the CPU, so everything is simplified and compressed.
      3) Cache in SMP is separate to each processor, core-processors share the cache between the
    • How about price?
    • IIRC, dual-core CPUs only give 50% of the available clock cycles to each thread.

      It's like having an SMP box with each CPU being half the speed, so to speak.

      If this is true then SMP is clearly better, as each thread gets 100% of the available CPU.

      However, it's hotter and draws more current (I'd think).
      • Re:Please... (Score:3, Informative)

        by ArsonSmith ( 13997 )
        Not right; this could almost be said about hyperthreading, but even then it isn't really close. Dual core is two complete CPUs put on the same chip. This should also allow them to share cache.

        There has recently been a patch to the Linux kernel about zoning CPUs. This helps process migration across CPUs. Basically, if a CPU is overloaded you have to move some processes off of it to another CPU. Using CPU zones, the migration code can try to pick hyperthreaded or multi-core CPUs to migrate to first because of
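The zoning patch above is about where the kernel's scheduler places tasks automatically. For illustration only, user space can also steer placement explicitly via CPU affinity; this small Python sketch (Linux-only, using the standard `os.sched_setaffinity` call) pins the current process to CPU 0, which is a manual stand-in for what the migration code decides on its own:

```python
import os

def pin_to_first_cpu():
    """Pin the calling process to CPU 0 where the platform supports it
    (Linux); return the resulting CPU set, or None if unsupported."""
    if hasattr(os, "sched_setaffinity"):
        os.sched_setaffinity(0, {0})  # pid 0 means the calling process
        return os.sched_getaffinity(0)
    return None
```

On a dual-core or hyperthreaded box, pinning related tasks to sibling cores is exactly the kind of locality the zoned migration code tries to preserve.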
  • Hmmm (Score:5, Funny)

    by Anonymous Coward on Monday January 10, 2005 @04:07PM (#11313194)
    Only thieves burn DVDs, so I see the MPAA stepping in to stop this.
  • A Plea (Score:5, Insightful)

    by teiresias ( 101481 ) on Monday January 10, 2005 @04:08PM (#11313205)
    A plea to both Intel and AMD: please make an inexpensive SILENT cooling system.

    Yeah, fans are pretty good, but let's be honest: they wear down and become noisy.

    Water cooling is great, but I've already got one aquarium in my room.

    One core was bad. Two? Three? Twenty! Passive heat sinks, huge slabs of copper, whatever, just please, I can't hear myself think.
    • I think the term you're looking for is "passive cooling".
    • Re:A Plea (Score:5, Informative)

      by sallgeud ( 12337 ) on Monday January 10, 2005 @04:28PM (#11313481)

      AMD is using a technology patented by IBM called SOI (Silicon on Insulator)... IBM is very unwilling to allow Intel to use this technology to solve their heat problems....

      Tom's Hardware has some good information about thermal loss [tomshardware.com]. Notice that an idle AMD Winchester (SOI Athlon 64) loses only 3.2 watts, while the more recent P4 chips are losing > 34 at idle.

      This number changes at load to 30 watts for the Winchester and 100+ watts for the P4.

      Looking back and comparing it to a P2-450 I once owned... the Winchester numbers are close.... and that machine had no fan (just a very large heatsink).

      I'm not sure you could have a fully-loaded Winchester without at least some type of active cooling... but certainly the CFM required across a good heatsink would allow for an almost silent fan.

    • Re:A Plea (Score:3, Informative)

      by eander315 ( 448340 ) *
      Silent PC Review [silentpcreview.com] has a good review of the Zalman Reserator 1 [silentpcreview.com], the only product I know of that even comes close to qualifying. It's nearly silent, more or less easy to install (if you built your computer, you can put this together, but I wouldn't recommend it for my mom), and can cool even the hottest processors and videocards simultaneously. It is not cheap, however, at around $240 shipped. I just installed mine, and it's the quietest, coolest-looking computer cooling part I've bought in years. Unfortunatel
    • Power consumption = Performance

      How much performance are you willing to give up in order to reduce cooling requirements?

      I would suggest getting a notebook. They're reasonably quiet.

      Small form factor solutions are in their infancy. They are generally more expensive than standard desktop solutions, but as demand picks up, economies of scale will take over and reduce costs as well as decibels of noise.
      • Re:A Plea (Score:3, Interesting)

        by cyngus ( 753668 )
        On the contrary, I have found many x86 laptops to be as loud as or louder than their desktop counterparts, for two reasons:
        1) The smaller package means that it's harder to dissipate heat into the surrounding air.
        2) The processor is physically closer to you, because it's on your lap rather than across the desk or on your floor.

        However, every Mac laptop I've had (iBook 12" G3 700MHz, Lombard 400MHz, TiBook 15" 1GHz, Al book 15" 1.25GHz) rarely turns on its fan. Sometimes after playing a DVD, although the Aluminum
    • You have a few options here, all of which involve stopping the belief that this is something you didn't bring upon yourself. If you buy a high-performance machine, you're going to get a lot of heat; there's not much of a way around this.

      1) Buy a cooler chip, either slower or with better processing power per watt.
      2) Buy quality fans. Dell doesn't spend a lot on fans because most users don't know they don't have to have a dustbuster fan in there.
      3) Buy a computer that was designed to be nearly silent. I have one dual p
  • by nizo ( 81281 ) * on Monday January 10, 2005 @04:08PM (#11313207) Homepage Journal
    I liked the photo of the Intel booth, which had the tagline "Upgrade your senses". For the life of me I can't figure out what that means. Are they planning on offering upgrades to give me better vision and sight? Perhaps a socket for the back of my head too? Or maybe their new cpu can now smell me and indicate when I need to shower?
  • Overrated (Score:4, Insightful)

    by Manan Shah ( 808049 ) on Monday January 10, 2005 @04:09PM (#11313217)
    Processor power has been overrated for quite some time. A lot of people don't realize that other components, especially RAM, matter just as much as pure CPU power, if not more, in most everyday tasks. If I had a choice between 2GHz with 1GB of RAM and 4GHz with 512MB of RAM, I would definitely recommend the former. So although this technology will help some people, most people doing day-to-day work should stay away from it. Does grandma really need dual-core processors for sending email and browsing the web anytime soon? Hell, a 500MHz machine with 256MB is more than enough for a lot of people to write documents and browse the web, but you will never see such a system on Dell's website for $120 :).
    • I thought most benchmarks say that the difference between 512 MB and 1 GB is not nearly as big as the difference between 2 GHz and 4 GHz. If your example was 256 MB and 512 MB, I would agree.
      • I'm not sure that benchmarks are really an accurate simulation of how everyday people use their machines to surf the web and work with email.
      • Benchmarks? It's as simple as this: If you run out of ram, you'll start hard drive caching which is slow as hell. You'll never notice the difference between 512 and 1gb if you're just booting into windows and surfing the web. But if you're doing some heavy graphics work, 2D or 3D, you'll definitely see the speed increase.

        Speed increases from memory and cpu are different. A faster cpu (not measured simply by the clock speed) will just about always make things faster, but more ram will only speed things
    • Re:Overrated (Score:2, Insightful)

      by Xoro ( 201854 )

      If you think it's overrated, you've never had SMP on your desktop.

      If I had to choose, I would pick a dual 1GHz system over a single 2GHz system. Everything works together so much more smoothly. Grandma would be able to browse the web, play mp3s and record Matlock all on the same machine at the same time without missing a beat.

      SMP belongs on the desktop -- I think when people try out the new dual cores, they're going to wonder how they ever got along without them.

      • I agree completely. I've got a P4 2.4GHz machine sitting in the cube next to
        me. I almost never touch it because the dual P3 700MHz machine I use as my
        desktop is so comfortable to use, that it's not worth the effort right now
        to migrate to the new machine.
    • I'm tired of this. I process video. Lots of it. I need as fast a processor as I can get. If possible, it should have lots of RAM, but I still need fast processor(s).

      No, for me processor speed (as opposed to GHz) is not overrated.

      Robert
    • Yes, and 640K should be enough for everybody.

      You never know what the future will bring. Perhaps this additional computing power will be put to good use someday.

      $1000 worth of computers today is way more powerful than $1000 worth of computers 5 years ago -- significantly outpacing inflation. Performance is given to you for basically free!

      If people really wanted a $120 computer to browse the web and write email, WEBTV would have been much more successful of a product. If there is a demand for such a pro
    • Grandma will need dual-core processors [msversus.org] if she wants to check her e-mail and browse with the next version of Windows. And most likely grandma will only ever use Windows because that's what she got from the store.

      This is different performance here, though. Apples and apple trees. With monolithic kernels like Linux there's a modest gain with multiple processors. There's significant overhead from switching tasks among them. With microkernels, each component of the kernel can run more independently in each
      • With monolithic kernels like Linux there's a modest gain with multiple processors. There's significant overhead from switching tasks among them. With microkernels, each component of the kernel can run more independently in each processor, providing better gains (at least potentially).

        Even though I'd like to agree with you, because I think microkernels are a better design, your posting shows a certain lack of understanding of how modern monolithic SMP kernels such as Linux and FreeBSD work. All CPU's

  • blech (Score:4, Insightful)

    by grub ( 11606 ) <slashdot@grub.net> on Monday January 10, 2005 @04:09PM (#11313221) Homepage Journal

    ...computing experience

    When did using computers or the internet become an "experience"? They're tools, nothing more.
    • When the marketing people discovered this internet thing and learned that a computer exists...
    • Re:blech (Score:3, Funny)

      >>When did using computers or the internet become an "experience"? They're tools, nothing more.

      When people discovered that they could download vast amounts of porn for free.
  • Wrong. (Score:3, Informative)

    by Anonymous Coward on Monday January 10, 2005 @04:10PM (#11313240)
    The real bottleneck for gaming these days is hard drive access. If you are burning a CD while you are playing a game, there is a good chance that the game will need to load something like textures while you are burning the CD (presumably from an ISO on your hard drive). On the other hand, with a 52X CD-R burning a full CD takes less than 3 minutes, so it won't kill your game. Unless you have two hard drives, in which case the above is irrelevant.
    • Re:Wrong. (Score:2, Informative)

      by stratjakt ( 596332 )
      Well that's why your CD-R/DVD-R has BurnProof, or similar technology built into it.

      If you notice TFA talks about SATA-II and Matrix-RAID.

      They're doing all they can about the storage bottleneck, although frankly we need something better to replace the spinning magnetic disc. Holographic storage? Who knows.
    • There is an absolute limit to how fast the HDD can read data, true, but by compressing the textures and then using one core to decompress them (and transfer them) while the other core runs the game, wouldn't this alleviate the problem somewhat? You'd get more overall texture data than the HDD speed permits (because the decompressed textures will be larger than the compressed size, thus you're getting more out than was fed in), without affecting game speed. If I'm wrong on this, then someone please correct m
  • Bad example? (Score:5, Informative)

    by RovingSlug ( 26517 ) on Monday January 10, 2005 @04:11PM (#11313253)
    For example, if you are gaming and burning a DVD at the same time, dual core chips will come in handy and will definitely give a smooth computing experience.

    Burning a DVD is IO-bound given all the traffic on the PCI bus from the harddrive and to the DVD. Burning a DVD is not CPU-bound, so it doesn't seem like a dual core CPU would actually help that situation.

    • Re:Bad example? (Score:3, Insightful)

      Burning a DVD is not CPU-bound... unless you are also encoding the DVD.
    • Better example may have been creating two mpeg video streams at once.
    • Well, it isn't the best example, that's true, but consider:
      If your game is in RAM and not accessing disk, even the low CPU usage of the DVD-burning process might cause stutters in gameplay. If the DVD-burning process were running on one core and the game on the other, then theoretically there shouldn't be any stuttering added to the game. That being said, modern CPUs are fast enough to handle playing a game and burning a DVD without much problem. So long as something doesn't flood the PCI bus and cause an un
    • Re:Bad example? (Score:3, Interesting)

      by eander315 ( 448340 ) *
      Burning a DVD is not CPU-bound, so it doesn't seem like a dual core CPU would actually help that situation.

      It is if you're making a "backup" which requires you to compress a dual-layer DVD onto a single-layer DVD-R. Otherwise, you're correct, the actual act of burning a DVD-R is not CPU-limited.

    • Re:Bad example? (Score:2, Informative)

      by Apreche ( 239272 )
      It would if your drive were IDE or USB instead of SCSI or SATA. Apparently, with Windows at least, the CPU processes all the I/O for IDE, USB, and I think also FireWire. So burning a disc via any of those buses will jack your CPU usage up to 100%. With SCSI and SATA it seems the controller chip must do most of the work, because the usage meter stays low.

      Anyhow, these are mostly I/O-heavy apps and not CPU-heavy. What it will be really useful for is gentooing, i.e., building software while still using your computer. A
      • Apparently, with windows at least, the cpu processes all the io for IDE, USB, and I think also firewire.

        Hmm. Even if I enable DMA for my IDE channels? Isn't that the whole point of DMA?
        • Most IDE controllers are brain-dead. Even with DMA enabled, the CPU spends a lot of time servicing interrupts. With an intelligent I/O controller, you can give it a long linked list of I/O operations and it will only generate one interrupt when it has finished.
    • My guess is that when they say "burning a DVD", what they really wanted to say is "encoding an MPEG-2 stream and burning it on-the-fly to a DVD-video".

      (OT) Funny how I own a DVD-RW drive for over 6 months now, yet all I've burned was data. Reading stuff like that makes me wonder if I'm the only guy on Earth who uses DVD-Rs purely as "bigger CD-Rs".
  • by twfry ( 266215 ) on Monday January 10, 2005 @04:15PM (#11313299)
    Intel likes to say they are going to have dual-core processors for both the desktop and server segments in 2005, but this is very misleading. They are only planning dual-core Itaniums for 2005 and use this to claim they have the server segment covered.

    The reality is most of the server market is their Xeon line and the dual-core Xeons are currently planned for 2006 and maybe even later.

    • by mapmaker ( 140036 ) on Monday January 10, 2005 @04:32PM (#11313538)
      Absolutely true. And the lie goes even farther than that:

      Their initial desktop "dual core" processor is really a dual processor kludge [theinquirer.net]. It's just two Prescott P4s side by side with a bit of extra wiring between them. They are essentially going to make half as many wafer cuts and call the resulting double-wide processors "dual core".

      AMD really has got Intel by the short hairs lately. First AMD released x86-64 and Intel had to clumsily play catch-up; now AMD will be releasing dual-core processors and Intel is again clumsily trying not to be left in the dust.

      • AMD has certainly been the better player of late. However, without the marketing to back up their superior products it doesn't mean much. With product cycles as low as they are in the processor industry, AMD might be caught by Intel before the general public even realizes that AMD was ahead for a while.
    • When will apples have dual cores?
  • by Anonymous Coward on Monday January 10, 2005 @04:20PM (#11313371)

    For example, if you are gaming and burning a DVD at the same time, dual core chips will come in handy and will definitely give a smooth computing experience.

    Why, of course, doesn't everyone burn DVDs and play games at the same time? I usually burn DVDs when I'm playing GTA: San Andreas, so by the time the DVD is done I've forgotten all about it and the tray opening scares the living shit out of me, so I pull out my penknife and stab the DVD to its rightful death! So with this new dual system you're telling me the DVD will be done so quickly that I won't forget about it? Or will the tray slide out more slowly in a smooth and controlled manner as not to provoke me?

  • Buffbots.. (Score:3, Insightful)

    by Renraku ( 518261 ) on Monday January 10, 2005 @04:23PM (#11313417) Homepage
    People that play DAoC but don't want to buy two computers will love this. It'll let them run their buffbots and mains without too much of a hassle. /scandalous
  • by aardwolf204 ( 630780 ) on Monday January 10, 2005 @04:24PM (#11313427)
    I can already game and burn a DVD on a single-core system. I'm using an Athlon XP 2500+ with a Plextor 8x DVD-RW drive, and I've never seen a drop in FPS while playing SOL.EXE. Ever.

    On a more serious note, my old roommate, the SCSI lover, could play Quake 3 while burning a CD because he was burning from a SCSI HDD to a SCSI CD-R while playing the game off a separate SCSI HDD. He claimed that the only thing making my machine slow while burning a CD was the CPU overhead involved in IDE.
  • I hope Intel pulls off multi-core better than they did hyperthreading. The P4 NetBurst architecture is simply weak, and hyperthreading is really just a patch to make it not suck quite so badly. I "upgraded" from a 1.6GHz AthlonXP to a 2.8GHz P4 Dell and was horribly disappointed with task-switching performance. I tried throwing more RAM at it. The P4 with 1GB was still slower than the AthlonXP with 768MB. OTOH, I set up a friend's new Dell with the latest PCI Express chipset and was really impressed with the
  • "For example, if you are gaming and burning a DVD at the same time, dual core chips will come in handy and will definitely give a smooth computing experience"

    Oh, will they? Consider what frequency these chips will be running at... You won't be getting dual cores featuring core frequencies along the lines of current top-end CPUs anytime soon. This should tell people that gamers would be much better off sticking to their single-core guns... If they want to encode and game at the same time, their computing experience is most definitely going to have to be compromised.

    There is no other way about this considering current limitations... As the fab processes are refined and the application of the technology is perfected, we will see dual cores running at higher frequencies, but considerable improvements will need to be made before dual core can be called a formidable gaming option for new releases at the top end of the system spectrum. (They might not even be formidable until the unlikely circumstance when game authors start coding for multicore platforms on a large scale.)

    For MANY people with top-end single-core systems currently, the move to dual cores will not immediately present what would be considered a smooth computing experience; there will be noticeable deficiencies in various areas, the severity of which will be determined by the specific way their system is utilized.

    • For MANY people with top end single core systems currently, the move to dual cores will not immediately present what would be considered a smooth computing experience.

      Agreed, the example given in the post was pretty inappropriate.

      However, that's not to say that the advent of dual-core CPUs isn't exciting. In my case, I've just got a $5k grant to buy a computer, with which I intend to perform magnetohydrodynamical simulations of magnetic stars. I've already decided on a multi-CPU Opteron system, but

  • Modern games are multithreaded. Even if the game is not written to be, even if it does not use separate threads for sound, rendering, NPC AI, and I/O, the rest of the system is running other threads (networking, interrupts, disk I/O) that are competing with the game for the CPU. A multi-CPU system can reduce that competition, allowing the system to service the NIC and hard disk IRQs while truly simultaneously servicing the game engine.
    • A multi-CPU system, maybe, but a dual-core chip has two cores sharing the same memory and I/O controllers, so it would be either servicing the NIC or the game engine, but not both.

      Code will have to be rewritten to take advantage of it. The game engines themselves will have to be multithreaded, and in such a way that the threads aren't constantly fighting over the same chunk of memory.

      There's not a lot of code out there (yet) that would make any real use of a dual core CPU. I've had SMP systems, and aside
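The "background work alongside the game" scenario can be sketched in a few lines. This Python toy (an illustration only; real games use native threads and a render loop) runs an I/O-bound "burn" task on a worker thread while the main thread keeps iterating. Because the worker mostly waits on I/O, the two overlap even on a single core, and a second core removes what contention remains:

```python
import threading
import time

def burn_disc():
    """Stand-in for a background burn: mostly waiting on the drive."""
    for _ in range(5):
        time.sleep(0.01)  # simulate waiting for I/O to complete

def game_loop(frames):
    """Stand-in for the game's main loop."""
    for _ in range(50):
        frames.append(1)  # "render" one frame

frames = []
worker = threading.Thread(target=burn_disc)
worker.start()      # the burn proceeds in the background...
game_loop(frames)   # ...while the game keeps running
worker.join()
```

CPU-bound work, by contrast, only overlaps when the code is written as separate threads or processes, which is the rewriting the comment above is talking about.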
  • Smoother? (Score:3, Insightful)

    by Waffle Iron ( 339739 ) on Monday January 10, 2005 @04:35PM (#11313592)
    For example, if you are gaming and burning a DVD at the same time, dual core chips will come in handy and will definitely give a smooth computing experience.

    I doubt it. Today's personal computers are already like those 60s muscle cars from Detroit (a 400 horsepower engine bolted into a car with narrow bias-ply tires, drum brakes and a solid axle).

    I was burning DVDs a couple of days ago. The system was mildly sluggish. The CPU meter was pegged at about 2% usage. Then I ran an md5sum to verify the whole disk, and the system ground to a crawl. The CPU meter indicated about 10% load. In both cases the sluggishness was caused entirely by I/O latency and/or all of the working set being flushed out of memory to make room for disk buffering. Dual-cores aren't going to do anything for that.
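For anyone curious why the md5sum pass hurt: the hash itself is cheap per byte; the cost is reading gigabytes back off the disk and evicting everything else from the page cache. A minimal Python sketch of chunked hashing (a hypothetical helper, not the md5sum tool itself) makes the split visible:

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Stream a file through MD5 in 1 MiB chunks. The CPU work per
    chunk is small compared to the disk read, so wall-clock time is
    dominated by I/O, matching the low CPU-meter reading above."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Streaming in fixed-size chunks also keeps memory use flat instead of pulling the whole disc image into RAM.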

  • I have a dual p3 (Score:3, Interesting)

    by MarcoAtWork ( 28889 ) on Monday January 10, 2005 @04:38PM (#11313652)
    and I would really like to finally upgrade: I'd love to have an SMP box again but I'm really not sure if it's possible at all nowadays, what are my options (at a reasonable price point) for something in the 2xAthlon64/3500+ performance range (I know Athlon64s are not SMP-capable).

    The option(s) seem to be Xeon and Opteron, but I'm not quite sure which mobos are best and most supported and/or which one of them is the most cost effective (also including RAM costs). My typical usage is linux (would vmware it in this case), win32 games (would prefer AGP to PCIe) and music (hauptwerk -> I need lots of RAM (2-3gigs) and CPU power).

    I don't think I can wait another year for multicore CPUs to come out (already been waiting forever).
    • Re:I have a dual p3 (Score:3, Informative)

      by Tumbleweed ( 3706 ) *
      Here's the thing with dual AMDs right now: wait for nVidia nForce4-based systems to come out. The current ones use the AMD chipset, which forces you to use expensive registered memory. Once the nForce4-based dual proc boards are available, you'll be able to use regular memory, yay!

      The big thing I've been waiting for was PCIe and nForce4 - PCIe is here, nForce4 is here (though very limited), and nForce4 dual proc should be along in just a few months, at most (I hope).

      Stay the course!

      I'd wait for the nForc
      • thanks for the tips, I wanted AGP because I would like to have a decent performing graphics card (say, a 6800GT) while the only thing I can find in PCIe are 6600s: oh well, it looks like I'll have to wait a while longer for my upgrade :/
        • Well, I'd wait for the next nVidia rev before buying anything beyond the 6600GT; the 6800 series still has broken hardware video acceleration. :(

          My 5900XTV is so overkill for what I need, I'll not bother upgrading until I go PCIe, and only then out of necessity.
      • "AMD chipset, which forces you to use expensive registered memory. Once the nForce4-based dual proc boards are available, you'll be able to use regular memory, yay!"

        Not true. The memory controller in Athlon 64 / Operon is on the die of the processor, not a part of the chipset.

        Opteron systems will still require registered memory. If a dual-core Athlon 64 is released, it will probably be compatible with NForce3 as well as NForce4.

        NForce4 is just NForce3 250GB with a new firewall, SATA with TCQ, and PCI Exp
  • How did this become a top-level post? I mean, the news part is interesting, but the editorializing is... not.

    a.) Talking about dual core as if it had not already been introduced, and as if people don't already know what it is.
    b.) I'm pretty sure most of the /. crowd are aware that two processor cores only help if you're running two threads/processes.
    c.) Who isn't running at least two processes these days? Are there really people still running DOS today?
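Point (b) is the crux: two cores only pay off when there are at least two runnable threads or processes. As a small sketch (Python's `multiprocessing` chosen purely for brevity; any language's process or thread API shows the same thing), fanning work out across worker processes lets each land on its own core:

```python
from multiprocessing import Pool

def square(n):
    return n * n

def parallel_squares(values, workers=2):
    """Map `square` over `values` using `workers` processes; on a
    dual-core machine each worker can run on its own core."""
    with Pool(processes=workers) as pool:
        return pool.map(square, values)

if __name__ == "__main__":
    print(parallel_squares(range(8)))
```

A single-threaded program gains nothing from the second core; it is the presence of independent runnable work, as in (c), that the extra core exploits.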
  • ...but does anybody know where I can find PC's with AMD processors in Canada? I read of all of the benefits of the different processors by both AMD and Intel, but all I can find online is Intel based systems. Any help would be appreciated.

    Thanks.

  • by Gr8Apes ( 679165 ) on Monday January 10, 2005 @04:50PM (#11313834)

    InfoWorld had a nice story about the Power5 multi-core CPU [infoworld.com] (You'll have to download the report) coming out this year. It may outperform the coming dual core AMD chip, both in raw performance and in lower power consumption.

    AMD has a write-up on their upcoming dual core processor and what it means for performance. Somewhere I believe there are some published numbers for how an AMD dual core CPU running 5 steps below its single core counterpart can still outperform dual single core processors. (i.e., a 1.4 GHz dual core CPU will outperform a 2.4 GHz dual processor machine)

    Meanwhile, Intel's dual core demo was doubted [xbitlabs.com] when presented at the same time as the above referenced AMD demo. Also, Intel's dual core will not perform significantly better than a dual processor system, or so the analysis of the two processors stated. (I really need to bookmark these things when I read them! Hopefully someone else will provide that reference.)

  • Wow it's like CoolTechZone paid to get these posts. Here is an article written by someone who has a clue:

    Intel: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2317&p=2 [anandtech.com]
    AMD: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2317&p=12 [anandtech.com]
    Transmeta: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2317&p=13 [anandtech.com]

    Notice the $700 price point on dual core Athlon 64s (socket 939). Start saving up now.
  • As the story on /. discussed, Intel and AMD seem to be going very quietly ahead with their Palladium-class chips - what about the presence of installed DRM on the new class of dual-core chips? If it's there, then you probably won't be accelerating your burning of anything to DVD by using the new chips.
  • Secondly, if you are hoping for a noticeable performance gain in regular computing tasks are in for a disappointment. Dual core microprocessors are for those who like to do multitasking or work on multithreaded applications.

    I hear this all the time. Dual processors won't help most people because they only do one thing at once.

    But your system is doing all kinds of things now. Look at the services Windows is running even when you don't want it to. What about screen refreshes? Those are done outside o

  • ``if you are gaming and burning a DVD at the same time, dual core chips will come in handy and will definitely give a smooth computing experience.''

    Or simply gaming and having a few daemons running at the same time. Remember that task switches are major performance killers on x86. Your game will run smoother if all the hits are taken by the other core.
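    The parent's point can be sketched concretely: on Linux you can pin a process (your game, say) to one core with sched_setaffinity(2), leaving the other core free for daemons and background tasks. A minimal, Linux-specific sketch; the choice of core 0 is arbitrary:

    ```c
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void) {
        cpu_set_t set;

        /* Build an affinity mask containing only core 0. */
        CPU_ZERO(&set);
        CPU_SET(0, &set);

        /* Pin the calling process (pid 0 = self) to that core. */
        if (sched_setaffinity(0, sizeof(set), &set) != 0) {
            perror("sched_setaffinity");
            return 1;
        }
        printf("pinned to core 0\n");
        return 0;
    }
    ```

    On a dual-core chip, everything else on the system is then free to be scheduled on the other core.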
  • Performance scaling will be measured by multitasking and multithreaded application usage rather than by single-threaded applications. You shouldn't expect a huge performance gain, if any, in regular computing tasks.

    Who really has issues with desktop processing, when only doing *one* thing?
  • by AusG4 ( 651867 ) on Monday January 10, 2005 @06:13PM (#11314900) Homepage Journal

    Although most /. readers probably won't care, dual core CPUs are already on the market in the form of the UltraSPARC IV [sun.com] CPU from Sun Microsystems. Sun also happens to be sporting the most ambitious multi-core project going in the form of Niagara [varbusiness.com], which, although initially an 8-core system, has apparently been seen running Solaris 9 with 32 independent CPU cores.

    In addition to this, the POWER 5 [ibm.com] CPU is also available with multiple cores, fully supporting Linux.

    Also of note is that the dual-core Opteron CPUs from AMD are apparently going to be pin-compatible with the current Opteron processors [xbitlabs.com] (by current, I mean the latest socket 939 (I think) systems, not the original Opteron 2xx or whatever).

    This is really of most use for the data center right now, but as more developers wrap their heads around parallelizing their applications, multi-core CPUs on the desktop will become more popular.

    That said, developers really have no excuse for not having blazing fast "dual-core aware" apps... a multi-processor system purchased today provides about as much performance as a dual core system... so it's not like a wild new technology where application developers have to wait for SDKs or test hardware. Multiple cores, HyperThreading CPUs or multiple physical processors are all just additional CPUs from the operating system's perspective, and are programmed using the same tried and true thread libraries (pthreads, etc).

    Multi-thread those apps people! There are so many instances, especially when writing GUI apps, where an extra thread or two thrown in the right direction can really improve the user experience.

    Of course, a big problem is just how developers learn to program. Everyone learns their "Hello World!", then goes from there... but this is all very linear in approach. Finding good programmers who can think of an application in terms of what many parallel threads should (or shouldn't) be doing isn't easy... but I digress.
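    The parent's advice boils down to something like this pthreads sketch: hand the slow work (the burn_dvd function here is a hypothetical stand-in) to a second thread so the main/UI thread stays responsive. On a dual-core chip the kernel can run each thread on its own core:

    ```c
    #include <pthread.h>
    #include <stdio.h>

    /* Hypothetical long-running task we don't want blocking the UI thread. */
    static void *burn_dvd(void *arg) {
        (void)arg;
        /* ... the actual slow work would go here ... */
        printf("background task done\n");
        return NULL;
    }

    int main(void) {
        pthread_t worker;

        /* Hand the slow work to a second thread. Note pthread_create
           returns an error code directly rather than setting errno. */
        int rc = pthread_create(&worker, NULL, burn_dvd, NULL);
        if (rc != 0) {
            fprintf(stderr, "pthread_create failed: %d\n", rc);
            return 1;
        }

        /* Main thread stays free for interactive work (event loop, etc.). */
        printf("main thread still responsive\n");

        pthread_join(worker, NULL);
        return 0;
    }
    ```

    The two printf lines can appear in either order, which is exactly the point: neither thread is waiting on the other until the final join.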
