Intel's Core i7-980X Six-Core Benchmarked 179

Posted by CmdrTaco
from the faster-harder-better dept.
Ninjakicks writes "Although they won't hit store shelves for a few more weeks, Intel has today officially unveiled the new Core i7-980X Extreme processor. The Core i7-980X Extreme is based on Intel's 32nm Gulftown core, derived from the Nehalem architecture, and sports six execution cores. The chip runs at a 3.33GHz clock frequency that can jump up to 3.6GHz in Intel's Turbo Boost mode. The processor has a max TDP of 130W, which, amazingly, is the same as previous-generation quad-core Core i7 CPUs. Of course, it's crazy fast too. Some may say that the majority of applications can't truly take advantage of the resources afforded by a six-core chip capable of processing up to 12 threads. The fact remains, however, that there are plenty of multi-threaded usage models and applications where the power of a CPU like this can be put to very good use."
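A minimal sketch of the kind of "multi-threaded usage model" the summary means, where 12 hardware threads can genuinely help: hashing independent chunks of data in parallel. (Pool size and chunk sizes here are arbitrary, illustrative choices.)

```python
# Hash 12 independent chunks in parallel with a thread pool.
import hashlib
from concurrent.futures import ThreadPoolExecutor

def digest(chunk: bytes) -> str:
    # hashlib releases the GIL for large buffers, so threads run in parallel
    return hashlib.sha256(chunk).hexdigest()

chunks = [bytes([i]) * 1_000_000 for i in range(12)]  # 12 independent jobs
with ThreadPoolExecutor(max_workers=12) as pool:
    digests = list(pool.map(digest, chunks))

print(len(digests))  # one digest per chunk
```

Each chunk is independent, so the work scales with core count, which is exactly the kind of load a 6-core/12-thread chip is built for.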
This discussion has been archived. No new comments can be posted.

  • by Targon (17348) on Thursday March 11, 2010 @08:41AM (#31436768)

    I know there are SOME people out there who have $1000 to spend on just a CPU, but until these come down a long way in terms of price, it is WAY out of my price range.

    • by Vectormatic (1759674) on Thursday March 11, 2010 @08:49AM (#31436832)

Intel always prices their high end around $1000, never mind the fact that price/performance on those chips is horrible.

It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.

As for this coming down: AMD is slated to release six-core Phenoms to the desktop before summer, IIRC. It won't have the raw performance of this thing, but 6 cores for under 200 bucks sounds nice, doesn't it?

      • Re: (Score:3, Interesting)

        by Pojut (1027544)

It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.

AMD's current flagship costs $195 [newegg.com] and is still a heck of a performer. I'll stick with AMD for now.

lol, anyone remember the horribly overpriced Athlon 64 FX-55?

I just took a look at a Tom's Hardware CPU chart ( http://www.tomshardware.com/charts/2009-desktop-cpu-charts-update-1/Performance-Index,1407.html [tomshardware.com] ), picked out the Intel CPU that came immediately above the AMD CPU you mentioned, and looked up the price on Newegg ( http://www.newegg.com/Product/Product.aspx?Item=N82E16819115215&cm_re=i5-750-_-19-115-215-_-Product [newegg.com] ); it was $5 more.

          • by toastar (573882)

Given those two processors I'd take the AMD, and I'm a huge fan of the i5 architecture.

It comes down to 4x 2.66GHz or 4x 3.4GHz.

I do wish AMD did some jiggling with the on-die cache. I think having a small L2 with a big L3 really isn't that smart, but I can't say that as fact :(

          • by PitaBred (632671)

            AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.

            • AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.

I remember something similar being said about AMD switching to AM2 and then AM3 sockets. Yes, you can plug an AM3 CPU into an AM2 socket, but there was (is?) a performance hit. Also, how long did Intel keep the 775 socket? Intel should have kept the 1156 socket around longer, though. They jumped to socket 1366 really fast compared to how long the 775 socket was around.

They have both 1156 and 1366; the 1366 is meant for higher-end systems, and the 1156 is meant for retail channels and OEMs/ISVs. The biggest difference is the reduced platform cost of 1156 vs. 1366. Also, I tend to replace both my motherboard and CPU on a 2-3 year upgrade cycle. My most recent upgrade, to an 1156-based Core i7-860, is probably more than I really need. I didn't feel much pain on my old system (a Core 2 Duo E6600), but felt it was time. Other than maybe a GPU refresh in 18
              • But intel should have kept the 1156 socket around longer. They did jump to socket 1366 really fast when compared to how long the 775 socket was around.

                1366 came out first for the i7. 1156 came out recently for the i5 and i7. Both are still around, and will be for at least a while. 1366 is aimed at enthusiasts and workstations, while 1156 is a "mainstream" part with some limitations compared to 1366.

              • But intel should have kept the 1156 socket around longer. They did jump to socket 1366 really fast
They didn't jump from 1156 to 1366 at all (1366 is actually older than 1156). They created two different sockets for different markets (and I'm pretty sure there will be a third soon for the new processors with 8 cores, 4 QPI links and separate memory buffer chips).

                1366 is a socket really designed for dual-socket workstation and server stuff but also used for some high end single processor stuff. 1156 is the mainstream socket.

            • AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.

              I actually built my system with an i7 because my AM2 board (about 16 months old and fairly high-end) apparently came out just before AM2+ and therefore couldn't be reused. If you happen to have a good AM2+ system, you may be able to drop in an AM3 CPU for a boost, but simply because you have AMD doesn't mean you suddenly have a free upgrade path. In my case, I would've had to buy the same components to go with Intel or AMD, so I ended up spending a little more on Intel to get a lot more performance.

              As for

      • by ircmaxell (1117387) on Thursday March 11, 2010 @09:12AM (#31436996) Homepage

AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.

FUD, pure FUD. AMD has always been cheaper than Intel. That was true even before Intel introduced the Core 2 series, when the AMD K2 and Athlon series spanked everything Intel had to offer. Heck, even back in the days when AMD first entered the mass market (80386 days, IIRC), they were the less expensive product. And to date, AMD has arguably always held the performance-per-dollar award. Sure, Intel has taken a lead in recent times (marginal with the Core 2 series, but significant with the i7 series), but AMD isn't THAT far behind.

And if you consider that most of the true innovations in CPU design have come from AMD (true multi-core, meaning four physical cores on one die rather than two dual-core dies in a package; 64-bit; shared L3 cache; the on-die memory controller; elimination of the north bridge and hence the system bus; etc.), I find it VERY funny that "it is the price you pay for getting the bleeding edge" is applied to the more expensive Intel rather than the innovator AMD. Now, I'm not saying Intel hasn't innovated at all. I'm just saying that the major innovations the i7 used to surpass the Core 2 series (namely the elimination of the system bus, the on-die memory controller, and the tiered cache architecture) were done first by AMD...

        • Re: (Score:2, Insightful)

          by Vectormatic (1759674)

Hey, I never said AMD was more expensive than Intel, and I bet you that if they could charge $1000 for their top end, they would (and they should; milking the high end is the easiest way to recoup dev costs).

Personally I prefer AMD because of their price/performance ratio too, and they have consistently kicked Intel's butt there.

          • by Pojut (1027544)

Hey, I never said AMD was more expensive than Intel, and I bet you that if they could charge $1000 for their top end, they would

            They never quite hit $1000, but their Athlon 64 FX-55 went for something like $700 or $800 when it was brand new.

I bet you that if they could charge $1000 for their top end, they would

Well, if by "could" you mean with a better product, then no. That was proven in the days of the Athlon (when AMD owned almost every benchmark). They were number 1, but still the less expensive of the two by a fair margin.

            If by "could" you mean with market position, then yes. Intel can charge $1k, because they have two things that AMD doesn't. First, brand recognition (I'd be willing to bet the "common" person knows Intel a lot mor

Hyperthreading and MMX were arguably new additions at the time they came out. They might be shitty, but MMX was one thing that helped dedicated sound cards become obsolete. Intel is also a leader in semiconductor manufacturing processes, which is part of the reason for their insane profits.
I'm not saying Intel doesn't innovate. But look at the two big ones you mentioned: Hyperthreading was introduced on the Xeon MP line in 2002, and MMX was introduced in 1996 on the original Pentium. Not to mention that MMX had nothing to do with sound cards (other than the fact that it enabled the CPU to natively do the vector math that DSP chips were doing at the time). If anything, AMD's improvement on MMX (named 3DNow!, which added floating-point support to MMX-style instructions) helped dedicated
            • by 0123456 (636235)

              My only point, is that people love to bash AMD, when you could argue that a significant portion of Intel's key features were either developed in parallel with AMD (Virtualization technologies for example) or were developed by AMD first (x64, on-die memory controller, elimination of north bridge, etc, etc)...

              On-die memory controller like the Intel 4004, you mean? And I don't believe the 4004 had a north bridge equivalent, since it could talk to memory directly.

              AMD certainly deserve kudos for developing x86-64, but claiming that an on-die memory controller was some huge innovation when microprocessors have had on-die memory controllers since the stone age of computing is just silly. If there was a huge advance it was separating the memory from the CPU by attaching it to the north bridge so you could use any comp

              • when microprocessors have had on-die memory controllers since the stone age of computing is just silly
                IMO the innovation is multiple links out of the processor.

If you look at older computers (certainly stuff like BBC Micros, and I'm pretty sure early PCs were the same), nearly everything was on one bus (with maybe the odd bus buffer chip somewhere, or a DRAM refresh chip if the CPU didn't have that capability built in). This worked with the tech of the time, but as things started to speed up it became a

            • Not to mention that MMX had nothing to do with sound cards (Other than the fact that it enabled the CPU to natively do the vector math that DSP chips were doing at the time).
Another thing MMX brought that previous CPUs' cores didn't have, AFAICT, was saturating arithmetic. Traditionally CPUs do modular arithmetic.

Saturating arithmetic is useful if you're trying to do things like audio mixing in software.

              What really made dedicated sound cards obsolete though IMO was a combination of advancing general
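The difference the comment describes can be sketched in a few lines (the function names here are made up for illustration; MMX does this on packed registers in hardware):

```python
# Modular vs. saturating addition on 16-bit signed samples.

def wrapping_add16(a: int, b: int) -> int:
    # modular (wraparound) arithmetic, as plain integer registers do it
    return ((a + b + 0x8000) % 0x10000) - 0x8000

def saturating_add16(a: int, b: int) -> int:
    # saturating arithmetic: clamp to the representable range instead
    return max(-0x8000, min(0x7FFF, a + b))

# Mixing two loud audio samples:
print(wrapping_add16(30000, 10000))    # -25536: an audible wraparound glitch
print(saturating_add16(30000, 10000))  # 32767: clipped, far less ugly
```

This is why saturation matters for software mixing: overflow becomes mild clipping rather than a sign-flipping pop.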

First, your comment is non-responsive. He never said Intel was cheaper; he said AMD doesn't have a model with performance levels high enough to merit a $1000 price tag compared to what you can get for $1000 from Intel.

Second, you cherry-pick innovations for AMD.

What do you care if your four cores are on one die? A larger die means lower yield. Two dies in a package can increase yield and thus reduce price, or allow a lot more cache, which helps performance.

          Did not Intel do L3 cache first with the

          • Did not Intel do L3 cache first with the Pentium 4 Extreme Edition on Gallatin in September 2003?

Yes and no. L3 just means level 3, and yes, Intel did introduce that in '03. But when I said L3, I meant an L3 shared across cores. All of AMD's multi-core chips have a cache level accessible by all cores. Intel didn't for the first few rounds of the Core 2 Duo series (including the original Core 2 Quad chips).

            AMD didn't eliminate the Northbridge, they just took one part of it out, the memory controller.

            Well, eve

6 cores for under 200 bucks sounds nice, doesn't it?
That all depends on how those cores perform.

Personally, given the choice, I'd rather have higher per-core performance than more cores. There is still a lot of single-threaded stuff out there, and even some multithreaded stuff has single-threaded stages and/or a lot of locking between threads.

The information I've seen indicates that there will also be a slightly slower non-Extreme version of the six-core i7; Wikipedia claims a release price of $562, though it d

Personally, given the choice, I'd rather have higher per-core performance than more cores.

          I wouldn't. Most of what I do is very multi-tasking heavy. The fact that one program can't use more than one core doesn't bother me nearly as much as that 3 or 4 programs must share the same core. Especially when you consider that I typically run more than 1 VM at a time along side my regular programs, I think (for my use case at least) the more cores, the better the computer will perform. I very rarely use a sing

      • by hairyfeet (841228)

But how many people are actually gonna have enough work to feed 6 cores for any length of time at all? I'm thinking that niche has to be pretty teeny tiny, considering that when I do follow-ups on my customers' new builds and check the logs, a good 80% of the time the duals are twiddling their thumbs. I know that even with me liking to do the occasional video editing and transcoding, my AMD quad sits idle probably 70%+ of the time.

        So I've started to go the other way for myself and my customers, as it really is sta

      • by m.dillon (147925)

People should also remember that Intel's graphics strategy is to try to do graphics with general-purpose CPU cores rather than a dedicated graphics engine. This could be part of that strategy. With so many cores available, a system can simply dedicate a few to particularly important jobs such as rendering.

        -Matt

It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.

        That's not the reason.

        Even when AMD had the fastest chips, they were cheaper. AMD seems to charge less because Intel is a marketing behemoth.

        Back around the time Intel released their P4 Extreme Edition chips, they were charging over $1000/cpu. AMD was charging a meagre $600-700 for their FX-55 which slaughtered it. And I picked up an Athlon XP for $80 around that time, which let me play all the new games. :P

    • by rotide (1015173) on Thursday March 11, 2010 @08:50AM (#31436842)

      All new bleeding edge CPUs are expensive. That's not the point of the article/submission. The point here is that a very fast 6 core, 12 thread consumer level processor is now on the market.

      Price will come down in due time.

    • by eldavojohn (898314) * <eldavojohnNO@SPAMgmail.com> on Thursday March 11, 2010 @08:56AM (#31436886) Journal

      I know there are SOME people out there who have $1000 to spend on just a CPU, but until these come down a long way in terms of price, it is WAY out of my price range.

      Companies? Rendering farms? At this price, I'd imagine they're not really for the average consumer but more so for companies that can consider such a purchase an asset.

That said, you do realize that the i7-975 quad-core they compared it to is also nigh $1000 [newegg.com], right? I think showing that the same price buys you an entirely different beast signals that the quad-core era is winding down at the top end. Current quad-core prices will come down, but why would Intel make a more expensive quad core? The specs here show it cannot stand up to the new six-core platform.

      All these prices will come down, of course. So it's fun to look forward to what I'll be using in two years (I just bought a low range quad core for $140 a week ago, almost right in time for this).

      And also, who strayed from the duo- quad- naming methodology?! Are you insane!? Do you have any idea the marketing power that a sexa core chip could have?

      • I wonder how many of the people who can afford a $1000 CPU want an Intel chip. At that price, something like a Power6 or T2+ looks more attractive, depending on the workload.
        • Two questions

1: Do you know of any solid comparisons between those chips and current x86-64 chips using at least the same application software (the same OS would be nice too, but it's difficult to choose one that is fair to all the candidates)?

2: Do you realise just how much of the computing world is tied into either Wintel or Lintel?

          Note that the particular chip mentioned in the current article is the desktop version, apparently there will be a dual-socket version but I haven't seen any recent information on wh

          • by Khyber (864651)

1. Web database servers.
2. Not very much, considering most actual production machines run a Unix variant (I mean PHYSICAL production, not software production); I've had the displeasure of having to repair major industry machines, which forced me to learn EIGHT different Unix variants.

No *REAL* production house (except maybe digital art studios) uses Wintel, and in fact Pixar is likely moving to solid-Tesla computing in the near future.

            • Can you provide links? A quick search for xeon vs powerpc benchmarks didn't turn up much other than articles about how the new intel macs were better than the powerpc ones.

      • by LWATCDR (28044)

Rendering farms, maybe. It would be an interesting trade-off:
does this chip offer more processing power per dollar than using more but cheaper CPUs? You would have to look at power, cooling, space, system, and admin costs. I would give it a big maybe.

Companies? Most corporate PCs could run on Atoms these days, but in some areas I agree I see this being great.
Simulation/CAD/CAM and video editing are the two that jump to my mind. Throw in any number of science applications as well.

        Honestly I see them going in

      • by bfree (113420)

        And also, who strayed from the duo- quad- naming methodology?! Are you insane!? Do you have any idea the marketing power that a sexa core chip could have?

        The same people who decided to never release a Sexium after the Pentium.

The price is going down because the 32nm manufacturing process is coming online at Intel. Smaller transistors mean you can fit more into the same die area at the same manufacturing cost. Moore's law et al.
      • by drooling-dog (189103) on Thursday March 11, 2010 @11:01AM (#31438750)

        Right now I'm using what must be one of the humblest CPUs on Slashdot, an Athlon XP 2500+. That's 1600 MHz of single-core 32-bit goodness. It's served me loyally for years with nary a complaint, and never missed a single day of work.

        It still does almost everything I ask of it, but sometimes does struggle to keep up with HD video. I could help it out by getting a video card that supports VDPAU, but my equally faithful motherboard only has PCI and AGP, so there's not much room for upgrade there.

        So finally it's time to retire them, and their replacements are on the way. The new kids are still pretty humble themselves, just an Athlon II X2 and a cheap AM3 motherboard. With 2GB memory, a grand total of $180. No bragging rights around here, of course, but there's nothing I'm likely to be doing for the next few years that they won't handle easily.

        But here's the thing. I should be excited about bringing in the new regime, but I really feel like I'm spending my last few days with some good old friends. Should there be some kind of ceremony? Is there a computer heaven where they'll be waiting happily for me when I reach the end of my own days, along with my old 286DX25 and AMD K2? What a joyous reunion that will be...

        • Re: (Score:3, Insightful)

          by hairyfeet (841228)

Dude, make it into a file server or netbox. No reason to toss it when KVMs are dirt cheap; I think I paid $24 for my 4-port at Newegg, with cables. I am typing this on a 1.8GHz Sempron with 1.5GB of RAM, which makes for a whisper-quiet netbox/downloader without needing to fire up my quad.

So don't toss, dude; re-purpose. As long as it still runs well and doesn't throw errors, there's no reason you can't still get plenty of use out of it as a file server, netbox, or a dedicated box for downloading large files. Ju

      • Re: (Score:3, Insightful)

        by Khyber (864651)

        "Rendering farms?"

Those would be handled by massively parallel GPU clusters, not slower-than-crap CPUs.

    • People should have modded you differently, not "Funny". I don't believe I've ever spent more than $150 on a processor for my personal use, going all the way back to the 386sx I bought. Now, work is a different story...
    • by Kjella (173770) on Thursday March 11, 2010 @09:29AM (#31437142) Homepage

      Nice, but who has $1000 to pay on a CPU?

Everybody that makes money off the processing power of their computers? Not many hobbyists would spend $1000 on a camera, but photographers spend thousands. Granted, that's really a workstation market more than a consumer market, but it's not specialized like ECC RAM, Quadro graphics cards, SAS hard drives, or similar server/niche products. If you use the right apps and get a 50% speedup, it'll pay for itself in many places. Overall, I don't think it's a really expensive hobby if you're willing to drive around in a car costing $2000 less and blow the difference on computers. I could afford this one if I wanted to; I just don't see the point. There's so much else I could spend it on, and so little extra gain.

    • by johnw (3725) on Thursday March 11, 2010 @01:19PM (#31441082)

      but until these come down a long way in terms of price, it is WAY out of my price range

      This is your lucky day. I happen to know where I can get you a pallet of really cheap Intel Core i7 processors, retail boxed, complete with heatsink, fan and a booklet.

    • Re: (Score:3, Insightful)

      by Zeio (325157)

I'd buy it on sight if it supported ECC. No ECC support = unstable system. I always run an ECC system, and I always get high "3DMarks" and frame rates, and I never get a BSOD or other system errors.

Without ECC it's impossible to know whether memory errors are occurring, and 12GB of memory at 1333/1600MHz probably has a single-bit event quite often.

  • by jo_ham (604554) <joham999@noSPaM.gmail.com> on Thursday March 11, 2010 @08:45AM (#31436800)

I believe this is what's been holding up the Mac Pro refresh, with the top or middle Mac Pro slated to get these as an upgrade from the 4-core ones.

I think core count is the new MHz. We're not going any faster, but we can give you more of them, which makes quite a lot of sense. All those FCP render pipelines and encodes just got a lot shorter with the 12-core Mac Pro.

    • The Mac Pro will use Xeon 56xx, not Core i7 (although they're basically the same chip, the 56xx hasn't been announced).

  • Cool (Score:4, Insightful)

    by Pojut (1027544) on Thursday March 11, 2010 @08:45AM (#31436802) Homepage

Now to see what AMD's 6-core offering is like. I know that Intel destroys AMD in performance benchmarks and real-world performance, but AMD is FAR less expensive. If I were pushing an Eyefinity setup or something, then sure, I would go all out and drop a few hundred dollars or more on an Intel CPU. Considering that AMD's current flagship costs $195 [newegg.com] and is still a heck of a performer... yeah, I'll stick with AMD for now.

    • Re: (Score:3, Funny)

      by biryokumaru (822262) *
If the AMD chip has a higher bang-for-buck ratio, why not just do the sensible thing and make a Beowulf cluster? It's just like having more cores, because you do.
      • One thing to consider is that the cost of using a CPU is not the same thing as the cost of the CPU.

Every CPU needs to be put into a socket. That socket has to be on a motherboard*. That motherboard needs a PSU, RAM, a switch port, and something to boot off (admittedly the onboard NIC may allow this). It will also need to be put in a case, and those cases stored somewhere (preferably a proper rack).

        When calculating the bang per buck of a given CPU choice you have to include these support components as wel

        • by T-Bone-T (1048702)

          Of course. You have to do that calculation for yourself because it only applies to you. The cost of the CPU is the only constant and possibly the only expense. What if you are just replacing the CPU? Those other costs don't matter.

    • Re:Cool (Score:5, Interesting)

      by beelsebob (529313) on Thursday March 11, 2010 @09:02AM (#31436950)

AMD's flagship chip does indeed cost $195, but then, it's about the same speed (as the benchmarks showed) as the Core i5 750, which costs $199. AMD isn't offering better bang for your buck; they're offering higher-energy-use CPUs with performance comparable to Intel's similarly priced CPUs.

That Phenom II uses 30W more than the Core i5, so it'll cost you about $30 a year more to run, and be less upgradable.
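The $30-a-year figure roughly checks out as back-of-envelope arithmetic, assuming 24/7 uptime and about $0.11/kWh (both assumptions for illustration, not from the benchmarks):

```python
# Annual cost of an extra 30W of power draw.
extra_watts = 30
hours_per_year = 24 * 365
kwh_per_year = extra_watts * hours_per_year / 1000  # watt-hours -> kWh
annual_cost = kwh_per_year * 0.11                   # assumed $/kWh
print(round(kwh_per_year, 1), round(annual_cost, 2))  # 262.8 28.91
```

A machine that idles or sleeps most of the day would cost proportionally less, so the real gap depends on usage.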

      • "and be less upgradable."

Not true. AMD's platform is much more forward-compatible. AMD chips can now run DDR2 or DDR3 depending on what board they're in (Socket AM2/AM2+/AM3). That means new AMD chips are compatible with three socket generations. Intel boards have nowhere near this broad socket and memory compatibility. Even within the same socket, Intel typically requires a new chipset for new CPUs. This allows Intel to fake that their socket+platform had a compatibility life of 6+ years, when really

        • by Pojut (1027544)

          And it's cheaper. I don't get all the AMD hate going around.

It isn't 1337 enough. Screw that; it's 1337 enough to run the games I want to play with my current configuration, and that's all that matters.

        • by beelsebob (529313)

          Not true.
          True

AM2/2+/3 may have been compatible with each other, but then, so were Pentium 4s, Core Duos and Core 2 Duos, all living on socket 775. At some point a socket gets too old to support new CPUs. 1156, being a new socket, still has some legs in it; it'll support *at least* the next generation of Core i chips, and probably the CPU design following Core i. AM2/2+/3, by contrast, are coming to the end of their run. It's unlikely that AM3 will support more than the next one upgrade of the Phenom.

          For reference

        • by 0123456 (636235)

          People buying those boards and CPUs might not even notice and will be s.o.l. after the very next generation.

          How many computer buyers ever actually upgrade their CPU? 1%?

          AMD's platform is the one with the sane upgrade path. And it's cheaper.

          And it would cripple a 6-core or 8-core CPU with limited memory bandwidth on a motherboard originally designed for older CPUs with 2 or 4 cores. It also means that new AMD CPUs have to support both DDR2 and DDR3, which apparently limits their memory bandwidth even further (from what I've read, the DDR2 support in the memory controller prevents it from running at optimum performance with DDR3).

          Seriously, I've never understood this 'but I can run AM

      • That Phenom II uses 30W more than the Core i5, so it'll cost you about $30 a year more to run, and be less upgradable.

        Haha! Interesting way of looking at things.

I bought an AM2+ board and an Athlon X2 way, way back. 2008, I think it was. I upgraded to a Phenom II X4 925 about a month ago when I saw the ridiculous $109.99 launch price on NCIX. (Yes, I'm Canadian.) It's now overclocked to 3.5GHz. What do I use it for? H.264 encoding (x264) and games (TF2, L4D2, anything I can buy on Steam).

        Looking at these benchmarks... I'm getting just under half the performance of this flagship CPU, for about 11% the price. It's an impressive

    • Re: (Score:3, Informative)

      by petermgreen (876956)

      I know that Intel destroys AMD in performance benchmarks and real-world performance, but AMD is FAR less expensive.
Hmm, are you aware of any good comparisons between the best AMD chips and the best Intel chips available at a given price point?

I tried to do one by taking a look at http://www.tomshardware.com/charts/2009-desktop-cpu-charts-update-1/Performance-Index,1407.html [tomshardware.com], looking up prices on Newegg and ignoring processors that are either unavailable at Newegg or more expensive than a faster chip o

    • by zdzichu (100333)

AMD is just about to ship the 12-core Magny-Cours to customers. That's a beast!

Just imagine how fast you could play Game! [wittyrpg.com] with that beast!

  • No thanks (Score:5, Funny)

    by oodaloop (1229816) on Thursday March 11, 2010 @08:48AM (#31436822)
    Unless it's lead with a solid plastic fan, I'm not interested.
The new 12-core Mac Pro starts at $4500 with 6GB RAM and ATI 5350 512MB video. Price too high? You can get the $800 Mini with an i5 430 and Intel video with 4GB RAM.

  • Turbo mode? (Score:5, Funny)

    by Stenchwarrior (1335051) on Thursday March 11, 2010 @08:58AM (#31436896)
    After 12 years, I finally have a use for that TURBO button on the front of my case again.
    • Re:Turbo mode? (Score:4, Informative)

      by TheRaven64 (641858) on Thursday March 11, 2010 @10:35AM (#31438120) Journal
      The Turbo Boost mode is present on most of the newer Intel chips. It overclocks one of the cores, while underclocking the others, to give single-threaded apps a boost without exceeding the thermal envelope. It needs some extra support from the OS scheduler, because suddenly you have different cores running at different speeds, which messes up process accounting. As I recall the OS needs to specifically request turbo boost mode, which it does when one process is using all of the CPU time that it is given but other cores are idle.
      • Re:Turbo mode? (Score:5, Informative)

        by Cowclops (630818) on Thursday March 11, 2010 @11:51AM (#31439720)
That's not how it works. What it actually does is shut power off to cores that aren't in use, and then overclock the remaining ones. It won't/can't run different cores at different clock speeds. So if you have a 4-core processor, it might shut off two of the cores and then boost the clock speed by up to around 50%, depending on the CPU, up to the thermal limits of the processor. The "breakthrough" in engineering is the part of the circuit that shuts off power to the unused cores better than anything has in the past. This essentially gets you the best of both worlds in a single CPU: a lower-clocked quad and a higher-clocked single/dual core.
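The trade-off described above can be sketched as a toy model: power-gating idle cores frees thermal budget, which can be spent overclocking the active ones up to the chip's turbo cap. The clock numbers come from the article; the even per-core power split and linear power-vs-frequency scaling are simplifying assumptions, not Intel's actual algorithm.

```python
# Toy model of Turbo Boost on a 6-core, 130W, 3.33GHz part.
TDP_W = 130.0        # package thermal budget
CORES = 6
BASE_GHZ = 3.33
MAX_TURBO_GHZ = 3.60  # the chip's highest turbo bin
PER_CORE_W = TDP_W / CORES  # assume power splits evenly across cores

def boosted_clock(active_cores: int) -> float:
    # budget freed by gated cores goes to the active ones, capped at the
    # maximum turbo bin (frequency assumed roughly linear in power)
    budget_per_active = TDP_W / active_cores
    return round(min(MAX_TURBO_GHZ, BASE_GHZ * budget_per_active / PER_CORE_W), 2)

print(boosted_clock(6))  # 3.33: all cores busy, no headroom
print(boosted_clock(1))  # 3.6: idle cores gated, budget spent on one core
```

Even this crude model shows why the cap matters: with five cores gated there is far more thermal headroom than the silicon can actually use, so the turbo bin, not the TDP, is the binding limit.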
  • Reminds me (Score:5, Insightful)

    by gaelfx (1111115) on Thursday March 11, 2010 @09:18AM (#31437038)
    This really reminds me of the recent Ask Slashdot article lamenting the naming schemes being implemented for most pieces of hardware. i7= 4 or 6 cores. Makes sense since the first thing I think when I hear 7 is "must be 4 or 6!" And the '980' really goes a long way towards confirming that initial suspicion. I'm really glad they put the 'extreme' in there, cause I was worried about the numbers being too low. Seriously though, can't they come up with a name that is actually descriptive of the product rather than a bunch of reassurances about the awesome-o amazingness of their processor? It seems to me that most people ask someone who knows something about computers when they need to buy a new one or replacement parts for their old one, and I don't know about the rest of you, but I really hate names that give me no real information about what the heck I'm buying. Yes, I can google the information, but the whole practice seems immature (and sometimes a little insulting).
    • Of course!

      "Introducing the new Intel Core 1336-32nm-3.33-6x+HT-VT-12MBSC!"
    • by drinkypoo (153816)

      I'm right there with you. How the hell would anyone know the Phenom II 720 is a triple-core, 2.8 GHz processor with a K10 core? Assuming I even remember correctly. When I hear "phenom" I think of Dre, not a K10.

      • First, you seem to have forgotten that the processor is called the Phenom II X3 720. That right there should tell you that it's a triple core. Secondly, how do you know the Athlon XP is the K7 core? Or the Pentium 4 is the Netburst core? Your average joe doesn't buy a processor. They buy a cheap Dell with a Celeron in it. They turn it on and it works. They don't care what happens, as long as it works. The people that buy processors usually look into them.
    • Re: (Score:3, Funny)

      by drooling-dog (189103)

      Have to say I'm disappointed too. I wanted to know whether the i{N} naming is N=3,5,7 as in odd numbers or primes. This was going to be the chip that settled that once and for all, because it would be either the i9 or the i11. The mystery lives on.

    • by Rockoon (1252108)
      I would love a return to uniform class-based suffixes. For example, the 80386SX vs. 80386DX. Although "SX" and "DX" aren't descriptive, they're UNIFORM. The SX chips ran a 16-bit bus and the DX chips ran a 32-bit bus.

      Later Intel moved to Pentium vs. Celeron, but Celeron itself wasn't uniformly descriptive (beyond meaning "shit") of the differences between them. Some Celerons had their caches cut in half, others were simply a lower clock rate, still others were a combination of the two.

      I stron
      • by jadin (65295)

        No clue if it's true or not, but I was told back then that the move from x86 numbers to Pentium was because they could not trademark a sequence of numbers. Their competitors could sell an 80486 no problemo, but could not sell a Pentium.

    • Re: (Score:3, Insightful)

      by bdenton42 (1313735)

      i7= 4 or 6 cores. Makes sense since the first thing I think when I hear 7 is "must be 4 or 6!"

      Some of the i7 models for mobile use only have 2 cores, just to confuse things even further.

    • Really, it's not as bad as you make out. Firstly and most importantly, as you've already pointed out you *can* Google for it. 20 seconds on the wikipedia page for Intel's processors can tell you what you need to know about the model number in front of you.

      You really think they haven't thought about how to differentiate their products and make things clear? In reality they have a lot of different product ranges to cover, from multi-socket servers down to netbooks and PDAs.

      Then they have variables like number

  • I know years ago Intel did not, for example, make a 3 GHz P4; they made shedloads of P4s and then gave each one a clock speed that it would handle, so you had a distribution curve from each batch that ran from maybe 1.6 to 3.2 GHz, priced accordingly.

    I can't imagine any recent changes in chip production per se that would mean an end to this distribution curve out of each batch.

    Rather this is a case of a new process finally coming on line with the production bugs mainly worked out, which shifts the dis

  • New 6-core processor is super-fast in synthetic benchmarks and when coupled with applications which are specifically coded for multi-threaded execution!

    I SO DID NOT EXPECT THAT!

    My Q6600 from 2007 runs every game I have on top settings (last game I bought was Prototype). I just don't see any benefit to the consumer.
    • by dfghjk (711126)

      ...because consumers only play games.

      • ... and they totally run POV-Ray.

        This is obviously not aimed at mom-and-pop checking their Bookface and watching a little iPlayer. Don't troll so obviously.
    • by kimvette (919543)

      Editing images is faster, especially with the megapixel race in the consumer, prosumer, and professional camera markets.

      Editing video is not only becoming easy for home users, but is becoming almost realtime in terms of seeing results.

      Embroidery and illustration packages improve a LOT.

      oblig:
      Also, on the typical home machine, this will reap a huge net gain in everyday computing. Now four cores can be dedicated to botnet daemons/services and other malware, one can run Internet Explorer, and one core could

  • by managementboy (223451) on Thursday March 11, 2010 @09:49AM (#31437310) Homepage
    With this one we could get 30 frames per second out of the HTML5/JavaScript version of Freeciv!
  • There was an article on connecting circuits with a solder that could be de-soldered with magnetic fields and it seems that the obvious future gain is reuse. If all they are doing is packing more cores in a package, the CPU should retain its value and if an effective method could be created to allow me to add new cores or delete cores that fail, then it would be just like memory. If somebody came up with a machine that could plug cores up to even 64x, it would seem that it would allow stability for twelve ye
    • by kimvette (919543)

      Then, processors had 250K or fewer transistors (not to mention other components) and 8 to 24 pins to communicate through, and ran somewhere between 700kHz and 8MHz (or 33MHz at the end of the 80s, which is later than the time you indicate), chip tracings were what, several microns, and were high voltage and low current compared to today's components. Also, heat sinks and fans were not required by most processors at the time. Even the mighty Motorola 68K and the i386 usually didn't sport heat sinks.

      Now, proc

      • Obviously you have some knowledge of the business. "Isn't that big a trick" is relative to the skill set. I designed with later technology, and Gatorade (gate arrays) was just the start. I am quite aware of what goes into the designs, as I worked in semiconductor wafer fabrication and board stuffing, and somebody has to design the equipment and process that instantiates that hardware.
        I am just speculating on what might be useful in the future. As a programmer also, it is now necessary to think about threads and inter
  • Some may say that the majority of applications can't truly take advantage of the resources afforded by a six-core chip capable of processing up to 12 threads.

    Well, those "some" don't code complex stuff. Give it to me, I can put it to good use. I'd take a motherboard with 4 of these popped in any day as my work desktop (I'm dealing with massively parallel and highly computationally intensive stuff every day).
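How far 12 hardware threads actually get you depends on the serial fraction of the workload; Amdahl's law puts a ceiling on the speedup. A back-of-the-envelope sketch (the 90%-parallel figure is an assumption for illustration, not from the article):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of a program parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# A workload that is 90% parallel tops out well short of 12x on 12 threads.
print(round(amdahl_speedup(0.90, 12), 2))  # 5.71
```

Which is why the "massively parallel" crowd sees huge wins from chips like this while typical desktop apps barely notice.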
  • by Angst Badger (8636) on Thursday March 11, 2010 @10:20AM (#31437772)

    Every time this comes up, someone observes that most apps aren't written to take advantage of multiple cores. That is indeed true, but unless you're running MS-DOS, there's more to it. On an average Windows or Linux desktop installation, there can easily be twenty or so processes running before you start your first end-user application, and most users tend to have more than one app running at a time. While there is no substitute for purpose-built multi-threaded programs, it's not like those six cores will be sitting idle, especially under Windows, where you could throw an entire core or two at the OS and another couple at the two or three resident antivirus/malware programs you need running to compensate for Windows' broken security model.

    Granted, a lot of end-user apps spend most of their time sleeping, waiting for user input, but a sleeping process runs just as well on one core as on six. For users whose programs are actually doing something most of the time, multiple applications can take advantage of the additional cores even if they are themselves not multithreaded.
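That point about multiple single-threaded programs is easy to sketch with Python's standard library: several independent CPU-bound workers, none of them threaded, and the OS scheduler is free to place each one on its own core (the worker count and workload size here are arbitrary):

```python
import multiprocessing as mp

def busy_sum(n):
    """A CPU-bound task standing in for one single-threaded application."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Six independent "applications"; no threading code in any of them,
    # yet on a six-core chip they can all run at once.
    with mp.Pool(processes=6) as pool:
        results = pool.map(busy_sum, [100_000] * 6)
    print(len(results))  # 6
```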

    • by Malc (1751)

      CPU usage on my two year old dual core laptop:

      Processes: 118
      System Idle Process: 81%
      firefox.exe: 12%
      X1.exe: 4%
      OUTLOOK.EXE: 1%
      System: 1%
      vmware-authd.exe: 1%

      Most things don't do much, including services. They just sit there. If I close Firefox, my system barely uses any CPU (and gains about 750MB of memory back).

      What I would like this CPU for is AVC encoding...

      • by T-Bone-T (1048702)

        I just upgraded from a 2.4GHz Pentium 4 to a Core i3-530. The difference is astounding. A typical DVD movie image would take my P4 about 8 hours to convert to an iPhone-compatible format, whereas the i3 only takes 20 minutes. Sure, I got the cheapest Core-series processor, but I don't even know what to do with all that power. I could convert all my movies, but that would only take the afternoon.
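For what it's worth, the speedup in that anecdote is easy to put a number on (times are the rough figures quoted in the comment above):

```python
# Rough figures from the anecdote: ~8 hours on the Pentium 4, ~20 minutes on the i3.
p4_seconds = 8 * 60 * 60
i3_seconds = 20 * 60
speedup = p4_seconds / i3_seconds
print(speedup)  # 24.0
```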

    • Re: (Score:2, Insightful)

      by 91degrees (207121)
      Really the point is that this isn't aimed at a typical desktop user. A lot of the applications that this will be used for will easily use 12 threads. I know our 4 core i7 is great for compiling and our project is relatively small. Probably pretty good for rendering as well.
  • I think these kinds of tests should start to include virtualization benchmarks. I'd really like to know, for example, how VMware, VirtualBox, Parallels, etc. benefit from these new processors.

  • "Some may say that the majority of applications can't truly take advantage of the resources afforded by a six-core chip capable of processing up to 12 threads."

    Well, before switching to Click-to-Flash mode I would quite happily have used 11 of those threads for Flash banner adverts spinning in a CPU-hogging mode and 1 thread for my useful applications. So I expect it will make things go faster even on the desktop! But that's not a good reason for wanting / needing more hardware threads!

  • All the computation I do is orders of magnitude faster on GPUs than CPUs. Furthermore, my graphics card also handles a lot of non-parallel tasks better than a CPU. I think we're seeing the waning days of Intel's processor dominance, unless they evolve their processor business into something else.
  • Has anyone here worked with KVM using libvirt to associate virtual cpus with physical cpus? I've always wanted to try this to see what the performance would be like.
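libvirt expresses that vCPU-to-physical-CPU association declaratively in the domain XML via `<cputune>`. A sketch of the relevant fragment (element names follow libvirt's documented schema; the core numbers are arbitrary examples):

```xml
<vcpu placement='static'>4</vcpu>
<cputune>
  <!-- pin each virtual CPU to one physical core -->
  <vcpupin vcpu='0' cpuset='0'/>
  <vcpupin vcpu='1' cpuset='1'/>
  <vcpupin vcpu='2' cpuset='2'/>
  <vcpupin vcpu='3' cpuset='3'/>
</cputune>
```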
