Intel Pushes Back with Xeon 5100
conq writes "BusinessWeek has a piece on Intel's newest chip, the Xeon 5100, which many believe could be the chip that allows it to stop losing ground to AMD. From the article: 'During the presentation, Intel ran the now-standard comparison test against AMD's highest performing chip, handily beating the system in a speed test. And in a jab at AMD execs, who handed kill-o-watt meters to analysts at the outfit's recent technology day, Intel execs used the same device to measure the new Xeon 5100 system's performance — gauged to be 7 watts better than that of the AMD-based system.'"
Keyword... (Score:2, Insightful)
Re:They only have 2 of the 3 key components to win (Score:3, Insightful)
The cost of procurement of a server is a tiny percentage of its TCO.
Fantastic (Score:5, Insightful)
Now, I have no preference in the whole AMD vs. Intel debate; I just use whatever seems to give me the best value for money at the required performance. (I am currently using AMD chips in kit 4 years old or younger and Intel chips in some of my older hardware, and haven't yet even looked at AMD64 or IA64 chips.) But it is really good to see some serious competition between two industry giants. Long live the competition; it's better for the consumer.
Shilling (Score:1, Insightful)
Competition (Score:2, Insightful)
That's how you get good products at low prices - competition, plain and simple. The unfortunate thing about markets like PC and server processors (or even operating systems) is that there are only two major market-share holders, and one is much larger than the other, making it tough for the smaller one to compete due to lack of volume.
But as Apple and AMD have proven, you don't have to have the largest market share to innovate, and you can make a serious dent in the Microsofts and Intels of the world. Even if all it accomplishes is forcing them to put more effort into their products, both companies' customers win.
Ah.... (Score:5, Insightful)
Does this include... (Score:4, Insightful)
Does this include the required Intel Northbridge chip (22W), or are we only looking at the CPU itself? And does the NB need a fan?
Or is this the entire system motherboard, in which case this is hardly an apples-to-apples comparison.
Re:Numbers skewed? (Score:3, Insightful)
Re:Duh (Score:2, Insightful)
I think what he means is that we should compare Intel's not-buyable new chips with AMD's not-buyable new chips. When end-users start taking delivery of Woodcrest servers in - what, August, maybe? - then Intel can boast for perhaps even several weeks until AMD's new server chips are out.
Attempting to tank the entire market until Intel's next-gen chips are out just because everyone knows Intel's current "Netburst" chips are overheating crap is lame.
BTW, which Opteron CPU was Intel using in their comparison? Power consumption varies quite a bit even before you consider there are regular Opterons, Opteron HE, and Opteron EE series. A mere 7 watt advantage at the wall despite having started their 65nm transition earlier (AMD waits until they've figured out how to get mature yields before making a rapid switch to the next process node, very unlike Intel) tells me that Intel is going to get leapfrogged big-time in short order.
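To put that "mere 7 watt" figure in perspective, here's a quick back-of-the-envelope calculation of what 7 W at the wall is worth per server per year. The electricity rate is a hypothetical $0.10/kWh (roughly double it if you want to account for data-center cooling overhead):

```python
# What a 7 W wall-power advantage is worth per server per year.
# The $0.10/kWh rate is a hypothetical illustration, not from the article.
watts = 7
hours_per_year = 24 * 365                    # 8760 hours
kwh_per_year = watts * hours_per_year / 1000 # watt-hours -> kilowatt-hours
cost_per_year = kwh_per_year * 0.10
print(round(kwh_per_year, 2), round(cost_per_year, 2))  # 61.32 6.13
```

A few dollars a year per box - real money at data-center scale, but hardly the knockout blow the demo implied.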
Re:No. (Score:4, Insightful)
What a thoughtful and insightful post. Clearly IBM does not have production and yield problems, because they are courting three major game console manufacturers with their wonderful, efficient chips.
Oh wait. Of these three, only two of them are actually available. Hrm.
Oh yeah, and I seem to recall something about a shortage of XBox360s. Something about a chip company not making as many chips as they promised. Must've been the wifi card or something.
WAIT, I DO recall a time when a company - think it was IBM - didn't produce enough G5 chips and people were backordering their Power Macs for months! Perhaps there is something to this after all.
What's that? Your XBox360 consumes so much power that the PSU caught fire and burned a hole in your carpet? Guess there is a performance-per-watt issue after all. You know, that really does matter to a lot of people. There are data centers, especially in downtown locations, that can't grow their business any more because the power company won't sell them any more wattage. And if you remove the excess thermal paste, MBPs aren't all that hot.
So yeah. Troll somewhere else.
True (Score:2, Insightful)
Details? (Score:2, Insightful)
But the peripheral requirements -- particularly FB-DIMM -- are interesting, too. And maybe a little scary. Anybody got a clue how these FB-DIMM units are gonna be priced per GB? We haven't seen any details on mobo pricing, either.
I like the idea of lower power consumption and greater throughput. But if I can't afford to build the system, it doesn't do me much good.
This announcement does sorta smell like marketing hype; I guess the implementations will tell the tale. Intel finally recognizes in public that they're getting their asses kicked by AMD, though, which is a good thing, IMO. Now if they'd just focus on price/performance competitiveness, they might even get me back as a customer.
Re:Similar processes? (Score:5, Insightful)
Re:Woodcrest: good processor but not sufficient ? (Score:1, Insightful)
Actually, single-core Opteron processors running at 3 GHz do exist (Opteron 256 and 856). I think what you meant is that there is no dual-core Opteron running at 3 GHz, and this is true, since the fastest dual-core Opteron processors stop at 2.6 GHz (Opteron 185, 285 and 885). This is why I said that only the high-end Xeon processors make sense (Xeon 5140, 5150 and 5160). Such processors are at least as fast as the fastest dual-core Opteron in any non-memory-intensive workload. However, as soon as memory latency becomes a critical factor, a 2.6 GHz dual-core Opteron becomes at least as fast as a 3.0 GHz dual-core Xeon, as shown in this Apache benchmark [gamepc.com] where Opteron outperforms Xeon by 15-25% at the same clock frequency (and 3.0 GHz is only 15% faster than 2.6 GHz).
So in the end, Woodcrest or Opteron, which one is better? It all depends on whether the workload is memory intensive or not. No matter how fast Woodcrest runs, its memory latency will always be around 100ns, while it is about 50ns for Opteron. Of course Intel tries to hide this latency by using huge 4 MB caches, but as seen in the Apache benchmark, sometimes even 4 MB is not enough. It also means that 4 MB Xeons are more expensive to produce than 1 MB Opterons, because the L2 cache takes more than half of the die space. This is why Intel is forced to use a 65nm manufacturing process, to try to gain back this "lost" silicon wafer space.
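The cache-vs-latency trade-off can be sketched with the standard average-memory-access-time formula, AMAT = hit_time + miss_rate * miss_penalty. The 100ns/50ns penalties come from the post above; the L2 hit time and the miss rates are made-up illustrative values:

```python
# AMAT = hit_time + miss_rate * miss_penalty.
# Miss penalties from the post (~100 ns Xeon, ~50 ns Opteron);
# the 5 ns hit time and the miss rates are hypothetical.
def amat(hit_ns, miss_rate, penalty_ns):
    return hit_ns + miss_rate * penalty_ns

xeon_cached    = amat(5, 0.02, 100)  # working set fits the big 4 MB cache
xeon_thrash    = amat(5, 0.20, 100)  # working set blows past 4 MB (e.g. Apache)
opteron_thrash = amat(5, 0.20, 50)   # same miss rate, half the penalty
print(xeon_cached, xeon_thrash, opteron_thrash)  # 7.0 25.0 15.0
```

As long as the cache absorbs nearly everything, the Xeon wins; once the working set overflows the cache, the Opteron's lower miss penalty dominates.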
Regarding your second question, a dual dual-core Opteron 285 (4 cores) with 8 GB of RAM and a high-end mobo would cost about $3200 (1062*2+8*70+500). Add a $400 chassis and 2 x $150 hard disks and you end up with a total of $3900 (compared to your $5K). So yes, Opteron is definitely cheaper. What are the exact tech specs of your system?
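For anyone checking the math, the build cost above works out like so (all prices are my rough estimates, not vendor quotes):

```python
# Back-of-the-envelope cost for the dual Opteron 285 box described above.
# All prices are rough estimates, not vendor quotes.
cpus    = 1062 * 2   # two dual-core Opteron 285s (4 cores total)
ram     = 8 * 70     # 8 GB of RAM
mobo    = 500        # high-end dual-socket motherboard
base    = cpus + ram + mobo
chassis = 400
disks   = 2 * 150    # two hard disks
total   = base + chassis + disks
print(base, total)   # 3184 3884 -- roughly the $3200 / $3900 quoted
```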
- great dude