Intel's Quad Core CPU Reviewed
Gr8Apes is one of many to let us know that Tom's Hardware Guide has posted a review of Intel's new Kentsfield quad core processor. From the article: "Even expert opinions are deeply divided, ranging from 'more cores are absolutely necessary' to 'why do I need something more than my five-year-old PC system?' Although the Core 2 quad-core processors are not expected to hit retail channels before October, Tom's Hardware Guide had the opportunity to examine several Core 2 Quadro models in the test labs. We would like to make it clear that these samples were not provided by Intel."
It's the bandwidth stupid! (Score:2, Informative)
Games are going parallel (Score:4, Informative)
"Multithreaded Game Engine Architectures"
http://gamasutra.com/features/20060906/monkkonen_
"Multi-Threaded Terrain Smoothing"
http://gamasutra.com/features/20060531/gruen_02.s
Re:It's the bandwidth stupid! (Score:3, Informative)
is the cause of the day and given coherency, it's not trivial to
architect around. the parent may have been a little terse, but
as you point out, overall throughput doesn't go up if all the
cores are too starved to issue.
however the memory latency picture isn't changing very much, and the
most compelling method to hide it for general purpose machines is through
thread parallelism (ignore vectors for a moment, it's kind of a special
case of the same thing).
this is what makes the multicore picture interesting. assuming a workload
that can exploit it, you can really turn the scale knob pretty far up.
unfortunately the whole affair pivots on being able to get past the
crappy heavyweight thread-and-lock model the software people have grown
so fond of. and the software community isn't particularly light on its
feet.
Re:Experts? (Score:5, Informative)
Intel's latest chips are fabbed at 65nm, while AMD is still only shipping chips fabbed at 90nm. This should give Intel a serious edge in the performance/heat ratio, but AMD's chips are so much more energy efficient that they are still competitive. (The current best performance/heat is the AMD Athlon64 X2 3800+ ADD [lostcircuits.com] chip.) When AMD finally ships 65nm Opterons, those ought to be really great for dense server installations.
It's telling that even Dell is planning [com.com] to ship servers with AMD chips. They announced a 4-core server; two dual-core Opterons. It wouldn't surprise me if they will be 65nm Opterons when they finally are released.
The article says that Intel is going to transition from 65nm to 45nm sometime in 2007, and to 32nm sometime in 2009. They beat AMD to 65nm big-time. They may well be at 32nm before AMD can make it to 45nm! Just imagine some sort of server chip with 16 cores... or more likely, 8 cores and a whole bunch of cache.
But we shouldn't count those chickens before they hatch. Right now Intel is at 65nm and AMD will be there soon.
steveha
Re:Experts? (Score:3, Informative)
I'm not convinced, but that's one point of view that's often expressed.
Re:It's the bandwidth stupid! (Score:3, Informative)
Ah, but the future is now. Cell has already addressed these issues: 25.6 GB/s main memory bandwidth, 256 KB of local store per core, OoO sacrificed to minimize heat, maximal raw performance in FP, integer, load/store, and main memory transfer (DMA engine), without any silliness like 8 GP registers (it has 128). Even when multiple Cells are hooked together, it's over a 35 GB/s IOIF port.
Also, for onboard multitasking, you forgot about being latency-bound by atomic operations, which is something that would be really bad with separate L2 caches. This issue is also elegantly handled by Cell by having only a single bus-snooped L2.
It must be frustrating for the hardware guys. You address all the bottlenecks in a pretty uniform way, and they still criticize: "But... uh, the software guys need a refresher course in hypertasking..."
Re:FSB (Score:2, Informative)
1333 + 4 = 1337
Re:from intel's point of view (Score:3, Informative)
Processor makers really need to work on the energy efficiency of all their desktop chips; these speeds were achieved through sheer increases in heat and power consumption, and that's flatly unacceptable
Didn't you pay attention in class? All the processor makers have started doing this. For the last year or two, the mantra has been "computing per watt", pretty universally, no matter which company you ask. And by the way, if you compare a 386 with a modern computer, I'll bet the modern computer delivers more "computing per watt" no matter how you calculate it, so the improvements can't all have been "achieved through sheer increases in heat and power consumption".
Maybe if every 3-5 years there was a responsible and substantial leap in computing power people would upgrade in regular phases [SNIP] Of course gaming is to blame for this constant demand
Ha ha ha. Hi hi hi. Ho ho ho. Hilarious! So you think that if it weren't for the gaming industry, we would somehow magically have every manufacturer on the planet agreeing to wait to release their improved products until someone (who?) said it was time for a new "generation"?
Apart from the fact that such a thing has never happened in any other industry (or would you care to give a counterexample?), that it is incompatible with the idea of a free market, and that it is bad for consumers as well as producers, please explain why you think that would happen AND why you think it would be a good thing.
So I figure nothing will ever change until we hit that mythical peak in which everyone says "Good enough" and the race to real time photorealism is over.
Yeah, because when graphics are suddenly "good enough", nobody will need computers for other purposes. People won't need plain old office computers, workstations, databases, web servers, B2B applications, industrial control software, or anything else. Because the only thing driving the computing industry forward is gaming. And when photorealism in gaming has been achieved, all computing innovation will stop. Right!
And besides, there will never be a "good enough" for graphics. In my view, audio has been "good enough" since long before the invention of the CD. That doesn't mean that there aren't people still working on creating better and/or cheaper audio components.
Re:from intel's point of view (Score:3, Informative)
If your build system doesn't support parallelisation, you should look into switching build systems. Developer time is not cheap.
Never ask that question again. (Score:1, Informative)
OK, a lot of people have asked the same thing and I'm not picking on you. Hell, we all think it and wonder it but the answer is so crystal clear no one should ever ask it again.
I wish my 1 MHz Commodore 64 was a little bit faster. I would have paid good money for an extra MHz or two. If you had told me then that one day computers would be 100 times faster, I would have been shaken but not stirred. It'd be hard to imagine computers being that much faster than my 64, and it would sort of seem wasteful, but we're talking future here, so sure, why not.
If you'd told me that computers in 20 years would be 3000 times faster than my good old 64, well, I imagine I would have had a hard time wrapping my brain around that one. I'm sure I'd be asking: what would be the point? How could you possibly use 3000 MHz?
Of course we all know how now. So, if you were to tell me today that next year computers are going to be 10 million times faster than today, I'd say cool.
Re:from intel's point of view (Score:2, Informative)
If all you do is surf the web and watch movies, then you'll be fine for some time. I edit movies quite often at work, and at 720x480x2000kb/s, I'm screaming for as much horsepower as I can get my hands on. I'm currently on a P2.4 with 1 GB of memory, and I'd like at least four times as much power as that. Cost be damned. It'll pay for itself in saved time even before it becomes obsolete.