Intel's Core i7-980X Six-Core Benchmarked 179
Ninjakicks writes "Although they won't hit store shelves for a few more weeks, today Intel officially unveiled the new Core i7-980X Extreme processor. The Core i7-980X Extreme is based on Intel's 32nm Gulftown core, derived from their Nehalem architecture, and sports six execution cores. The chip runs at a 3.33GHz clock frequency that can jump up to 3.6GHz in Intel's Turbo Boost mode. This processor has a max TDP of 130W, which amazingly is the same as previous-generation Core i7 quad-core CPUs. Of course, it's crazy fast too. Some may say that the majority of applications can't truly take advantage of the resources afforded by a six-core chip capable of processing up to 12 threads. However, the fact remains there are plenty of multi-threaded usage models and applications where the power of a CPU like this can be put to very good use."
Nice, but who has $1000 to pay on a CPU? (Score:4, Insightful)
I know there are SOME people out there who have $1000 to spend on just a CPU, but until these come down a long way in terms of price, it is WAY out of my price range.
Re:Nice, but who has $1000 to pay on a CPU? (Score:5, Interesting)
Intel always prices their high end around $1000, never mind the fact that price/performance on those chips is horrible.
It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.
As for this coming down: AMD is slated to release six-core Phenoms to the desktop before summer, IIRC. It won't have the raw performance of this thing, but 6 cores for under 200 bucks sounds nice, doesn't it?
Re: (Score:3, Interesting)
It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.
AMD's current flagship costs $195 [newegg.com] and is still a heck of a performer. I'll stick with AMD for now.
lol, anyone remember the horribly overpriced Athlon 64 FX-55?
Re:Nice, but who has $1000 to pay on a CPU? (Score:4, Interesting)
I just took a look at a Tom's Hardware CPU chart ( http://www.tomshardware.com/charts/2009-desktop-cpu-charts-update-1/Performance-Index,1407.html [tomshardware.com] ), picked out the Intel CPU that came immediately above the AMD CPU you mentioned, and looked up the price on Newegg ( http://www.newegg.com/Product/Product.aspx?Item=N82E16819115215&cm_re=i5-750-_-19-115-215-_-Product [newegg.com] ); it was $5 more.
Re: (Score:2)
Given those two processors I'd take the AMD, and I'm a huge fan of the i5 architecture.
It comes down to 4x 2.66GHz or 4x 3.4GHz.
I do wish AMD did some jiggling with the on-die cache. I think having a small L2 with a big L3 really isn't that smart, but I can't say that as fact :(
Re: (Score:2)
AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.
Re: (Score:2)
AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.
I thought I remembered something similar being said about AMD switching to AM2 then AM3 sockets. Yes, you can plug an AM3 CPU into an AM2 socket, but there was (is?) a performance hit. Also, how long did Intel keep the 775 socket? But Intel should have kept the 1156 socket around longer. They did jump to socket 1366 really fast when compared to how long the 775 socket was around.
Re: (Score:2)
But Intel should have kept the 1156 socket around longer. They did jump to socket 1366 really fast when compared to how long the 775 socket was around.
1366 came out first for the i7. 1156 came out recently for the i5 and i7. Both are still around, and will be for at least a while. 1366 is aimed at enthusiasts and workstations, while 1156 is a "mainstream" part with some limitations compared to 1366.
Re:Nice, but who has $1000 to pay on a CPU? (Score:4, Informative)
But Intel should have kept the 1156 socket around longer. They did jump to socket 1366 really fast
They didn't jump from 1156 to 1366 at all (1366 is actually older than 1156). They created two different sockets for different markets (and I'm pretty sure there will be a third soon for the new processors with 8 cores, 4 QPI links and separate memory buffer chips).
1366 is a socket really designed for dual-socket workstation and server stuff but also used for some high end single processor stuff. 1156 is the mainstream socket.
Re: (Score:2)
AMD's chips don't change sockets every 2 months. I can upgrade my AMD CPU without having to upgrade my entire machine. You can't compare the cost of the Intel chip directly to the AMD chip without taking the other costs into account as well.
I actually built my system with an i7 because my AM2 board (about 16 months old and fairly high-end) apparently came out just before AM2+ and therefore couldn't be reused. If you happen to have a good AM2+ system, you may be able to drop in an AM3 CPU for a boost, but simply because you have AMD doesn't mean you suddenly have a free upgrade path. In my case, I would've had to buy the same components to go with Intel or AMD, so I ended up spending a little more on Intel to get a lot more performance.
As for
Re:Nice, but who has $1000 to pay on a CPU? (Score:5, Interesting)
FUD, pure FUD. AMD has always been cheaper than Intel. Even back before Intel introduced the Core 2 series, when the AMD K6 and Athlon series spanked everything that Intel had to offer. Heck, even back to the days when AMD first entered the mass market (80386 days IIRC), they were the less expensive product. And to date, AMD has arguably always held the performance-per-dollar award. Sure, Intel has started gaining a lead in recent times (marginal with the Core 2 series, but significant with the i7 series), but AMD isn't THAT far behind.

And if you consider that most of the true innovations in CPU design have come from AMD (true multi-core, meaning 4 physical cores on one die rather than 2 dual-core dies in a package; 64-bit; shared L3 cache; the on-die memory controller; elimination of the north bridge and hence the system bus; etc.), I find it VERY funny that "it is the price you pay for getting the bleeding edge" is applied to the more expensive Intel as opposed to the innovator AMD. Now, I'm not saying that Intel hasn't innovated at all. I'm just saying that the major innovations the i7 used to surpass the Core 2 series (namely the elimination of the system bus, the on-die memory controller and the tiered cache architecture) were done first by AMD...
Re: (Score:2, Insightful)
Hey, I never said AMD was more expensive than Intel, and I bet you that if they could charge $1000 for their top end, they would (and they should; milking the high end is the easiest way to recoup dev costs).
Personally I prefer AMD because of their price/performance ratio too, and they have consistently kicked Intel's butt there.
Re: (Score:2)
Hey, I never said AMD was more expensive than Intel, and I bet you that if they could charge $1000 for their top end, they would
They never quite hit $1000, but their Athlon 64 FX-55 went for something like $700 or $800 when it was brand new.
Re: (Score:2)
Well, if by "could" you mean with a better product, then no. That was proven in the days of the Athlon (When AMD owned almost every benchmark). They were number 1, but still the least expensive of the two by a fair margin.
If by "could" you mean with market position, then yes. Intel can charge $1k, because they have two things that AMD doesn't. First, brand recognition (I'd be willing to bet the "common" person knows Intel a lot mor
Re: (Score:2)
My only point, is that people love to bash AMD, when you could argue that a significant portion of Intel's key features were either developed in parallel with AMD (Virtualization technologies for example) or were developed by AMD first (x64, on-die memory controller, elimination of north bridge, etc, etc)...
On-die memory controller like the Intel 4004, you mean? And I don't believe the 4004 had a north bridge equivalent, since it could talk to memory directly.
AMD certainly deserve kudos for developing x86-64, but claiming that an on-die memory controller was some huge innovation when microprocessors have had on-die memory controllers since the stone age of computing is just silly. If there was a huge advance it was separating the memory from the CPU by attaching it to the north bridge so you could use any comp
Re: (Score:2)
when microprocessors have had on-die memory controllers since the stone age of computing is just silly
IMO the innovation is multiple links out of the processor.
If you look at older computers (certainly stuff like BBC Micros, and I'm pretty sure early PCs were the same) nearly everything was on one bus (with maybe the odd bus buffer chip somewhere, or maybe a DRAM refresh chip if the CPU didn't have that capability built in). This worked with the tech of the time but as things started to speed up it became a
Re: (Score:2)
Not to mention that MMX had nothing to do with sound cards (Other than the fact that it enabled the CPU to natively do the vector math that DSP chips were doing at the time).
Another thing that MMX brought that previous CPUs core stuff didn't have afaict was saturating arithmetic. Traditionally CPUs do modular arithmetic.
Saturating arithmetic is useful if you are trying to do stuff like audio mixing in software.
What really made dedicated sound cards obsolete though IMO was a combination of advancing general
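The saturating-vs-modular distinction is easy to demonstrate in a few lines. This is a hypothetical sketch, not how any particular mixer is written; MMX's PADDSW instruction performs the saturating case in hardware on four 16-bit samples at once:

```python
# Mixing two 16-bit audio samples: modular (wraparound) vs. saturating addition.
INT16_MIN, INT16_MAX = -32768, 32767

def modular_add(a, b):
    """Plain two's-complement add: overflow wraps around (an audible click)."""
    s = (a + b) & 0xFFFF                       # keep only the low 16 bits
    return s - 0x10000 if s >= 0x8000 else s   # reinterpret as signed

def saturating_add(a, b):
    """Saturating add: overflow clamps to the representable range instead."""
    return max(INT16_MIN, min(INT16_MAX, a + b))

# Two loud samples near full scale:
print(modular_add(30000, 10000))     # wraps to -25536, a loud glitch
print(saturating_add(30000, 10000))  # clamps to 32767, mild clipping
```

Clipping is still distortion, but it is far less jarring than a wraparound from full positive to full negative.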
Re: (Score:2)
First, your comment is non-responsive. He never said Intel was cheaper, he said AMD doesn't have a model with performance levels high enough to merit a $1000 price tag when compared to what you can get for $1000 from Intel.
Second, you cherry pick innovations for AMD.
What do you care if your four cores are on-die? A larger die means lower yield. Two dice in a package can increase yield, and thus reduce price or allow you to have a lot more cache, which helps performance.
Did not Intel do L3 cache first with the
Re: (Score:2)
Yes and no. L3 just means level 3, and yes, Intel did introduce that in 2003. But when I said L3, I meant a shared L3 across cores. All of AMD's multi-core chips have a shared cache level accessible by all cores. Intel didn't with the first few rounds of the Core 2 Duo series (including the original Core 2 Quad chips).
Well, eve
Re: (Score:2)
6 cores for under 200 bucks sounds nice, doesn't it?
That all depends on how those cores perform.
Personally, given the choice, I'd rather have higher per-core performance than more cores. There is still a lot of single-threaded stuff out there, and even some multithreaded stuff has single-threaded stages and/or a lot of locking between threads.
The information i've seen indicates that there will also be a slightly slower non-extreme version of the i7 hex core, wikipedia claims a release price of $562 though it d
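The ceiling the parent is describing is Amdahl's law: if part of the job is serial, extra cores stop helping quickly. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a program that is 80% parallel gets nowhere near 6x from six cores:
print(round(amdahl_speedup(0.8, 6), 2))   # 3.0
print(round(amdahl_speedup(0.5, 6), 2))   # 1.71
```

Lock contention makes this worse in practice, so the formula is an optimistic bound, not an estimate.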
Re: (Score:2)
I wouldn't. Most of what I do is very multitasking-heavy. The fact that one program can't use more than one core doesn't bother me nearly as much as 3 or 4 programs having to share the same core. Especially when you consider that I typically run more than 1 VM at a time alongside my regular programs, I think (for my use case at least) the more cores, the better the computer will perform. I very rarely use a sing
Re: (Score:2)
People should also remember that Intel's graphics strategy is to try to do it with a general-purpose CPU rather than a dedicated graphics engine. This could be part of that strategy. With so many cores available, a system can simply dedicate a few to particularly important jobs such as rendering.
-Matt
Re: (Score:2)
It is the price you pay for getting the bleeding edge. AMD also has some halo models, but because they can't beat Intel in performance, they can't afford to charge $1000 for their high-end chips.
That's not the reason.
Even when AMD had the fastest chips, they were cheaper. AMD seems to charge less because Intel is a marketing behemoth.
Back around the time Intel released their P4 Extreme Edition chips, they were charging over $1000/cpu. AMD was charging a meagre $600-700 for their FX-55 which slaughtered it. And I picked up an Athlon XP for $80 around that time, which let me play all the new games. :P
Re:Nice, but who has $1000 to pay on a CPU? (Score:5, Insightful)
All new bleeding edge CPUs are expensive. That's not the point of the article/submission. The point here is that a very fast 6 core, 12 thread consumer level processor is now on the market.
Price will come down in due time.
Re:Nice, but who has $1000 to pay on a CPU? (Score:5, Informative)
I know there are SOME people out there who have $1000 to spend on just a CPU, but until these come down a long way in terms of price, it is WAY out of my price range.
Companies? Rendering farms? At this price, I'd imagine they're not really for the average consumer but more so for companies that can consider such a purchase an asset.
That said, you do realize that the i7-975 quad core they compared it to is also nigh $1000 [newegg.com], right? I think showing that the same price will buy you an entirely different beast signals that the quad cores have had their day. Current quad-core prices will come down, but why would Intel make a more expensive quad core? The specs here show it cannot stand up to the new six-core platform.
All these prices will come down, of course. So it's fun to look forward to what I'll be using in two years (I just bought a low range quad core for $140 a week ago, almost right in time for this).
And also, who strayed from the duo- quad- naming methodology?! Are you insane!? Do you have any idea the marketing power that a sexa core chip could have?
Re: (Score:2)
Two questions
1: Do you know of any solid comparisons between those chips and current x86-64 chips using at least the same application software? (The same OS would be nice too, but it's difficult to choose one that is fair to all the candidates.)
2: Do you realise just how much of the computing world is tied into either Wintel or Lintel?
Note that the particular chip mentioned in the current article is the desktop version, apparently there will be a dual-socket version but I haven't seen any recent information on wh
Re: (Score:2)
1. Web database servers
2. Not very much, considering most actual production machines run a Unix variant (I mean PHYSICAL production, not software production), and I've had the displeasure of having to repair major industry machines, which forced me to learn EIGHT different Unix subsets.
No *REAL* production house (except maybe digital art studios) uses Wintel, and in fact Pixar is likely moving to solid-Tesla computing in the near future.
Re: (Score:2)
Can you provide links? A quick search for xeon vs powerpc benchmarks didn't turn up much other than articles about how the new intel macs were better than the powerpc ones.
Re: (Score:2)
Rendering farms maybe. It would be an interesting trade off.
Does this chip offer more processing power for dollar than using more but cheaper CPUs? You would have to look at power, cooling, space, system, and admin costs. I would give it a big maybe.
Companies? Most corporate PCs could run on Atoms these days, but in some areas I agree, I see this being great.
Simulation/CAD/CAM and video editing are the two that jump to my mind. Throw in any number of science applications as well.
Honestly I see them going in
Re: (Score:2)
And also, who strayed from the duo- quad- naming methodology?! Are you insane!? Do you have any idea the marketing power that a sexa core chip could have?
The same people who decided to never release a Sexium after the Pentium.
Re:Nice, but who has $1000 to pay on a CPU? (Score:4, Interesting)
Right now I'm using what must be one of the humblest CPUs on Slashdot, an Athlon XP 2500+. That's 1600 MHz of single-core 32-bit goodness. It's served me loyally for years with nary a complaint, and never missed a single day of work.
It still does almost everything I ask of it, but sometimes does struggle to keep up with HD video. I could help it out by getting a video card that supports VDPAU, but my equally faithful motherboard only has PCI and AGP, so there's not much room for upgrade there.
So finally it's time to retire them, and their replacements are on the way. The new kids are still pretty humble themselves, just an Athlon II X2 and a cheap AM3 motherboard. With 2GB memory, a grand total of $180. No bragging rights around here, of course, but there's nothing I'm likely to be doing for the next few years that they won't handle easily.
But here's the thing. I should be excited about bringing in the new regime, but I really feel like I'm spending my last few days with some good old friends. Should there be some kind of ceremony? Is there a computer heaven where they'll be waiting happily for me when I reach the end of my own days, along with my old 286DX25 and AMD K2? What a joyous reunion that will be...
Re: (Score:3, Insightful)
"Rendering farms?"
Those would be handled by massively parallel GPU clusters, not slower than crap CPUs.
Re:Nice, but who has $1000 to pay on a CPU? (Score:4, Interesting)
Nice, but who has $1000 to pay on a CPU?
Everybody that makes money off the processing power of their computers? Not many hobbyists would spend $1000 on a camera, but photographers spend thousands. Granted, that's really a workstation market more than a consumer market, but it's not special like ECC RAM, Quadro graphics cards, SAS hard drives or similar server/niche products. If you use the right apps and get a 50% speedup, it'll pay for itself in many places. Overall, I don't think it's a really expensive hobby if you're willing to drive around in a car costing $2000 less and blow the difference on computers. I could afford this one if I wanted to; I just don't see the point. There's so much else I could spend it on, and so little extra gain.
Re:Nice, but who has $1000 to pay on a CPU? (Score:5, Funny)
but until these come down a long way in terms of price, it is WAY out of my price range
This is your lucky day. I happen to know where I can get you a pallet of really cheap Intel Core i7 processors, retail boxed, complete with heatsink, fan and a booklet.
Re: (Score:3, Insightful)
I'd buy it on sight if it supported ECC. No ECC support = unstable system. I always run an ECC system, and I always get high "3DMarks" and frame rates, and I never get a BSOD or other system errors.
Without ECC it's impossible to know whether memory errors are occurring, and 12GB of memory at 1333/1600MHz probably sees a single-bit event quite often.
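How often is "quite often"? A back-of-envelope sketch; the FIT rate below is an invented placeholder (published DRAM soft-error studies vary by orders of magnitude), so the result only illustrates the shape of the calculation:

```python
# Expected single-bit upsets in 12GB of non-ECC DRAM, under an ASSUMED error rate.
ASSUMED_FIT_PER_MBIT = 25.0    # failures per 10^9 device-hours per Mbit; illustrative only

megabits = 12 * 1024 * 8                       # 12GB expressed in megabits
fit_total = ASSUMED_FIT_PER_MBIT * megabits    # failures per 10^9 hours for the whole array
errors_per_hour = fit_total / 1e9
hours_per_error = 1.0 / errors_per_hour
print(f"~one bit flip every {hours_per_error:.0f} hours (under the assumed rate)")
```

Under that assumed rate, a bit flips every couple of weeks of uptime; without ECC there is no way to tell it happened.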
Re: (Score:2)
My first thoughts! With SQL server licenses as expensive as they are you can throw a lot of very expensive hardware at them (an array of 10 SSDs anyone) to get the most out of a single or dual CPU license.
Chips for the Mac Pro refresh I believe. (Score:5, Interesting)
I believe this is what's been holding up the Mac Pro refresh, with the top or middle Mac Pro slated to get these as an upgrade from the 4 core ones.
I think core count is the new MHz. We're not going any faster, but we can just give you more of them, which makes quite a lot of sense. All those FCP render pipelines and encodes just got a lot shorter with the 12-core Mac Pro.
Re: (Score:2)
The Mac Pro will use Xeon 56xx, not Core i7 (although they're basically the same chip, the 56xx hasn't been announced).
Cool (Score:4, Insightful)
Now to see what AMD's 6-core offering is like. I know that Intel destroys AMD in performance benchmarks and real-world performance, but AMD is FAR less expensive. If I were pushing an Eyefinity setup or something, then sure, I would go all out and drop a few hundred dollars or more on an Intel CPU. Considering that AMD's current flagship costs $195 [newegg.com] and is still a heck of a performer...yeah, I'll stick with AMD for now.
Re: (Score:2)
One thing to consider is that the cost of using a CPU is not the same thing as the cost of the CPU.
Every CPU needs to be put into a socket. That socket has to be on a motherboard*. That motherboard needs a case, a PSU, RAM, a switch port, and something to boot off (admittedly the onboard NIC may allow this). It will also need to be put in a case, and those cases stored somewhere (preferably a proper rack).
When calculating the bang per buck of a given CPU choice you have to include these support components as well.
Re: (Score:2)
Of course. You have to do that calculation for yourself because it only applies to you. The cost of the CPU is the only constant and possibly the only expense. What if you are just replacing the CPU? Those other costs don't matter.
Re: (Score:2)
Sorry...my experience with late-90's Warez websites has trained me never to look at a site with "top" followed by a number in its URL -_-;;
Re: (Score:2)
Of course, most top supercomputers take years to plan and build
Indeed, and that means most of them were probably designed before the Xeon 5500 series (when Intel finally got around to replacing the FSB with QPI) was available.
And if you want more than 2 sockets or more than 6 RAM slots (not that most of these big clusters do; they prefer to just use more nodes) on a system with a point-to-point architecture, AMD is still the only option.
Re:Cool (Score:5, Interesting)
AMD's flagship chip does indeed cost $195, but then, it's about the same speed (as the benchmarks showed) as the Core i5 750, which costs $199. AMD isn't offering better bang for your buck; they're offering higher-energy-use CPUs with comparable performance to Intel's similarly priced CPUs.
That Phenom II uses 30W more than the Core i5, so it'll cost you about $30 a year more to run, and be less upgradable.
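The $30 figure is roughly right for a machine left on 24/7, under an assumed electricity price; a quick check:

```python
# Yearly cost of 30W of extra CPU draw, running around the clock.
WATTS = 30
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.11   # assumed $/kWh; your utility's rate will differ

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000
cost = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year -> ${cost:.0f}/year")   # 263 kWh/year -> $29/year
```

A machine that idles most of the day (and whose power management clocks the chip down) will cost proportionally less, so treat this as the worst case.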
"and be less upgradable." is not true (Score:3, Interesting)
"and be less upgradable."
Not true. AMD's platform is much more forward-compatible. AMD chips can now run DDR2 or DDR3 depending on what board they're in (Socket AM2/AM2+/AM3). That means that new AMD chips are compatible with 3 socket generations. Intel boards have nowhere near this broad socket and memory compatibility. Even in the same socket, a new chipset is typically required by Intel for new CPUs. This allows Intel to fake that their socket+platform had a compatibility life of 6+ years, when really
Re: (Score:2)
And it's cheaper. I don't get all the AMD hate going around.
It isn't 1337 enough. Screw that, it's 1337 enough to run the games I want to play with my current configuration, that's all that matters.
Re: (Score:2)
Not true.
True
AM2/2+/3 may have been compatible with each other, but then, so were Pentium 4s, Core Duos and Core 2 Duos, all living on socket 775. At some point, a socket gets too old to support new CPUs. 1156, being a new socket, still has some legs in it; it'll support *at least* the next generation of Core i CPUs, and probably the CPU design following Core i. AM2/2+/3, by contrast, are coming to the end of their run. It's unlikely that AM3 will support more than the next one upgrade of the Phenom.
For reference
Re: (Score:2)
People buying those boards and CPUs might not even notice and will be s.o.l. after the very next generation.
How many computer buyers ever actually upgrade their CPU? 1%?
AMD's platform is the one with the sane upgrade path. And it's cheaper.
And it would cripple a 6-core or 8-core CPU with limited memory bandwidth on a motherboard originally designed for older CPUs with 2 or 4 cores. It also means that new AMD CPUs have to support both DDR2 and DDR3, which apparently limits their memory bandwidth even further (from what I've read, the DDR2 support in the memory controller prevents it from running at optimum performance with DDR3).
Seriously, I've never understood this 'but I can run AM
Re: (Score:2)
That Phenom II uses 30W more than the Core i5, so it'll cost you about $30 a year more to run, and be less upgradable.
Haha! Interesting way of looking at things.
I bought an AM2+ board and Athlon X2 way, way back. 2008, I think it was. I upgraded to a Phenom II X4 925 about a month ago when I saw the ridiculous $109.99 launch price on NCIX. (Yes, I'm Canadian.) It's now overclocked to 3.5GHz. What do I use it for? H.264 encoding (x264) and games. (TF2, L4D2, anything I can buy on Steam)
Looking at these benchmarks... I'm getting just under half the performance of this flagship CPU, for about 11% the price. It's an impressive
Re:Cool (Score:5, Interesting)
True, except for when you already have an AM2/AM2+/AM3 board, or a good supply of DDR2 RAM. In that case the Phenom is a drop-in upgrade, versus a platform upgrade for the i5. Also keep in mind that AMD will be releasing 6-core CPUs this year too, and they will fit in any recent AM2+/AM3 board, while for the Intel high-end stuff, you are locked into their 'premium' 1366 socket.
This applies to me. I just ordered AMD's 965 Phenom II to replace the Athlon 64 X2 5400+ currently in my AM2+/AM3 Gigabyte board...when the new AMD chipset is widely released with SATA 6Gb/s and USB 3.0 and the price becomes reasonable, I'll order one of those motherboards. Until then, my AMD 785 chipset board will suffice. AMD has always been pretty good about making sure their sockets are versatile, and the AM2+/AM3 boards are the most versatile yet.
Plus I like rooting for the underdog
This is also a reason why I stick with AMD. They're the only ones producing CPUs that can remotely compete with Intel in the consumer space, yet they are a MUCH smaller company. I like that.
Re: (Score:2)
True; if you already have a platform that can support a chip with the performance you want, it's often a better buy.
while for the Intel high-end stuff, you are locked into their 'premium' 1366 socket.
True, BUT AMD simply doesn't have anything comparable to the Intel high-end stuff anyway. AMD's high-end stuff is comparable to Intel's midrange stuff.
Will the AMD 6-core be better than a comparably priced LGA1156 Intel quad-core available at the time? The answer probably depends on the workload (e.g. can the workloa
Re: (Score:3, Informative)
I know that Intel destroys AMD in performance benchmarks and real-world performance, but AMD is FAR less expensive.
Hmm, are you aware of any good comparisons between the best AMD chips and the best Intel chips available at a given price point?
I tried to do one by taking a look at http://www.tomshardware.com/charts/2009-desktop-cpu-charts-update-1/Performance-Index,1407.html [tomshardware.com], looking up prices on Newegg and ignoring processors that are either unavailable at Newegg or are more expensive than a faster chip o
Re: (Score:2)
I maintain that bang per buck of processors alone isn't what matters.
Either you have a job that has to run on one machine, in which case it's a matter of whether the time saving/better experience is sufficient to make up for the cost difference, or you are looking at a number of machines working together, in which case you want to look at bang per buck of the entire node (including the costs of power and rackspace).
We all know top-end Intel processors have a horribly low bang per buck (even when you take whole sys
Re: (Score:2)
AMD is just about to ship 12-core Magny Cours to customers. That's a beast!
Wow! (Score:2)
Just image how fast you could play Game! [wittyrpg.com] with that beast!
No thanks (Score:5, Funny)
the new 12 core mac pro starting at $4500 with (Score:2)
The new 12-core Mac Pro starting at $4500 with 6GB RAM and ATI 5350 512MB video. Price too high? You can get the $800 Mini with an i5-430 and Intel video with 4GB RAM.
Turbo mode? (Score:5, Funny)
Re: (Score:2)
So Turbo Boost is Phenom power management + automated overclocking?
Reminds me (Score:5, Insightful)
Re: (Score:2)
"Introducing the new Intel Core 1336-32nm-3.33-6x+HT-VT-12MBSC!"
Re: (Score:2)
I'm right there with you. How the hell would anyone know the Phenom II 720 is a triple-core, 2.8 GHz processor with a K10 core? Assuming I even remember correctly. When I hear "phenom" I think of Dre, not a K10.
Re: (Score:3, Funny)
Have to say I'm disappointed too. I wanted to know whether the i{N} naming is N=3,5,7 as in odd numbers or primes. This was going to be the chip that settled that once and for all, because it would be either the i9 or the i11. The mystery lives on.
Re: (Score:2)
Later Intel moved to Pentium vs. Celeron, but Celeron itself wasn't uniformly descriptive (beyond meaning "shit") of the differences between them. Some Celerons had their caches cut in half, others were simply a lower clock rate, still others were a combination of the two.
I stron
Re: (Score:2)
No clue if it's true or not, but I was told back then that the move from x86 numbers to Pentium was because they could not trademark a sequence of numbers. Their competitors could sell an 80486 no problemo, but could not sell a Pentium.
Re: (Score:2)
The 80486 was the processor where DX meant integrated math coprocessor. Not well known is that the 80487 was a full-blown 80486DX chip, and when placed in the mathco socket, it completely disabled the "main" socket.
Re: (Score:3, Insightful)
i7= 4 or 6 cores. Makes sense since the first thing I think when I hear 7 is "must be 4 or 6!"
Some of the i7 models for mobile use only have 2 cores, just to confuse things even further.
Re: (Score:2)
Really, it's not as bad as you make out. Firstly and most importantly, as you've already pointed out you *can* Google for it. 20 seconds on the wikipedia page for Intel's processors can tell you what you need to know about the model number in front of you.
You really think they haven't thought about how to differentiate their products and make things clear? In reality they have a lot of different product ranges to cover, from multi-socket servers down to netbooks and PDAs.
Then they have variables like number
Does it still work on yield? (Score:2)
I know years ago Intel did not, for example, set out to make a 3GHz P4; they made shedloads of P4s and then gave each one the clock speed it could handle, so you had a distribution curve from each batch that ran from maybe 1.6 to 3.2GHz, priced accordingly.
I can't imagine any recent changes in chip production per se that would mean an end to this distribution curve out of each batch.
Rather, this is a case of a new process finally coming on line with the production bugs mainly worked out, which shifts the dis
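The binning process described above can be sketched as a toy simulation. The bin grades and the process-variation curve here are invented for illustration, not Intel's actual numbers:

```python
import random

random.seed(42)   # reproducible toy batch

BINS_GHZ = [1.6, 2.0, 2.4, 2.8, 3.2]   # hypothetical speed grades

def bin_die(max_stable_ghz):
    """Sell each die as the fastest grade it clears; None = fails even the slowest bin."""
    cleared = [b for b in BINS_GHZ if b <= max_stable_ghz]
    return max(cleared) if cleared else None

# Each die's maximum stable clock comes from a process-variation curve:
batch = [random.gauss(2.6, 0.4) for _ in range(1000)]
counts = {}
for die in batch:
    grade = bin_die(die)
    counts[grade] = counts.get(grade, 0) + 1
print(counts)   # most dice land in the middle bins; few clear the top grade
```

A process improvement shifts the whole curve right, which is exactly why a mature process can suddenly make a top bin (like a 3.33GHz six-core) economical to sell.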
Holy cow! (Score:2)
I SO DID NOT EXPECT THAT!
My Q6600 from 2007 runs every game I have on top settings (last game I bought was Prototype). I just don't see any benefit to the consumer.
Re: (Score:2)
...because consumers only play games.
Re: (Score:2)
This is obviously not aimed at mom-and-pop checking their Bookface and watching a little iPlayer. Don't troll so obviously.
Re: (Score:2)
Editing images is faster, especially with the megapixel race in the consumer, prosumer, and professional camera markets.
Editing video is not only becoming easy for home users, but is becoming almost realtime in terms of seeing results.
Embroidery and illustration packages improve a LOT.
oblig:
Also, on the typical home machine, this will reap a huge net gain in every day computing, Now four cores can be dedicated to botnet daemons/services and other malware, and one can run Internet Explorer and one core could
30 frames in FreeCiv (Score:3, Funny)
Re: (Score:2)
Really? Really? You made an HTML5/JavaScript version of Freeciv?
It exists. [slashdot.org]
Future CPU gain (Score:2)
Re: (Score:2)
Then, processors had 250K or fewer transistors (not to mention other components) and 8 to 24 pins to communicate through, and ran somewhere between 700kHz and 8MHz (or 33MHz at the end of the 80s, which is later than the time you indicate), chip tracings were what, several microns, and were high voltage and low current compared to today's components. Also, heat sinks and fans were not required by most processors at the time. Even the mighty Motorola 68K and the i386 usually didn't sport heat sinks.
Now, proc
Re: (Score:2)
I am just speculating on what might be useful in the future. As a programmer also, it is now necessary to think about threads and inter
can't take advantage? (Score:2)
Well, those "some" don't code complex stuff. Give it to me, I can put it to good use. I'd take a motherboard with 4 of these popped in any day as my work desktop (I'm dealing with massively parallel and highly computationally intensive stuff every day).
Parallel apps aren't everything... (Score:5, Interesting)
Every time this comes up, someone makes the observation that most apps aren't written to take advantage of multiple cores. That is, indeed true, but unless you're running MS-DOS, there's more to it. On the average Windows and Linux desktop installations, there can easily be twenty or so processes running before you start your first end-user application, and most users tend to have more than one app running at a time. While there is no substitute for purpose-built multi-threaded programs, it's not like those six cores will be sitting idle, especially under Windows, where you could throw an entire core or two at the OS and another couple at the two or three resident antivirus/malware programs that you need to have running to compensate for Windows' broken security model.
Granted, a lot of end-user apps spend most of their time sleeping, waiting for user input, but a sleeping process runs just as well on one core as on six. For users whose programs are actually doing something most of the time, multiple applications can take advantage of the additional cores even if they are themselves not multithreaded.
Re: (Score:2)
CPU usage on my two year old dual core laptop:
Processes: 118
System Idle Process: 81%
firefox.exe: 12%
X1.exe: 4%
OUTLOOK.EXE: 1%
System: 1%
vmware-authd.exe: 1%
Most things don't do much, including services. They just sit there. If I close Firefox, my system barely uses any CPU (and gains about 750MB of memory back).
What I would like this CPU for is AVC encoding...
Re: (Score:2)
I just upgraded from a 2.4GHz Pentium 4 to a Core i3-530. The difference is astounding. A typical DVD movie image would take my P4 about 8 hours to convert to an iPhone-compatible format, whereas the i3 only takes 20 minutes. Sure, I got the cheapest Core series processor, but I don't even know what to do with all that power. I could convert all my movies, but that would only take the afternoon.
Additional Benchmarks (Score:2)
I think these kinds of tests should start to include virtualization benchmarks. I'd really like to know, for example, how VMware, VirtualBox, Parallels, etc. benefit from these new processors.
Taking advantage of the resources ... (Score:2)
"Some may say that the majority of applications can't truly take advantage of the resources afforded by a six-core chip capable of processing up to 12 threads."
Well, before switching to Click-to-Flash mode I would quite happily have used 11 of those threads for Flash banner adverts spinning in a CPU-hogging mode and 1 thread for my useful applications. So I expect it will make things go faster even on the desktop! But that's not a good reason for wanting / needing more hardware threads!
Intel holds all the cards, but we're playing chess (Score:2)
Virtualization Performance Questions (Score:2)
Has anyone here worked with KVM using libvirt to associate virtual cpus with physical cpus? I've always wanted to try this to see what the performance would be like.
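For what it's worth, libvirt does support this via `<cputune>` in the domain XML; a minimal fragment (the domain name and cpuset values are arbitrary examples) looks like:

```xml
<domain type='kvm'>
  <name>testguest</name>
  <vcpu placement='static'>2</vcpu>
  <cputune>
    <!-- pin virtual CPU 0 to physical CPU 2, vCPU 1 to physical CPU 3 -->
    <vcpupin vcpu='0' cpuset='2'/>
    <vcpupin vcpu='1' cpuset='3'/>
  </cputune>
  <!-- ...remaining os/devices elements omitted... -->
</domain>
```

The same pinning can be applied to a running guest with `virsh vcpupin testguest 0 2`. Whether it helps depends on the workload; pinning mostly pays off when it keeps a vCPU close to its cache and NUMA node instead of migrating between cores.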