Where Oh Where Is The Pentium 4?
Othello writes: "Sharky managed to dig up some insider info on why we aren't going to see the Pentium 4 this month. There are chipset problems with ICH2 that are causing the delay. The processor should be out around week 48, they say, which is late November or early December."
Actually that is more of a guess (Score:1)
Re:This baby is actually several years late (Score:1)
Any conforming C library (including glibc or MS) has this limitation.
C programmers will one day be as valuable as Fortran programmers in late 1999...
Time to update your resumes and claim some "Epoch Problem Consultant" title.
____________________
A great editorial about Intel's problems (Score:5)
http://www.turbotech.ch/articles2000/001001-intels_darkening-01
HardOCP [hardocp.com] quoted this part, which is a pretty good one:
September 2000 - Intel's blackest month in countless years. Following the official withdrawal of the 1.13 GHz PIII in the last days of August, in the wake of this loss of face it also became pretty obvious that Intel will have to ditch the grandiose plans of breathing new life into the dying P6 core with a 200 MHz FSB, a 0.13 micron process, and larger on-die L2 caches. It seems the Coppermine core (the last and most advanced modification of the half-a-decade-old P6 core, introduced in the Pentium Pro 150 MHz in the mid-90s) simply won't be able to go much further.
What would... (Score:1)
Re:AMD (Score:1)
Re:Do we really need a faster processor? (Score:1)
You're kidding, right? What about the P6 core? Out-of-order execution is not a trivial change. The P4 is a complete redesign of the core that implements concepts like the trace cache that are relatively recent in the computer architecture research community.
Yes, of course Intel has released incremental changes. If they didn't we'd all be complaining about Intel not advancing their technology. Honestly, I'm getting tired of seeing the same old Intel and Microsoft bashing with little or no useful argument to back it up.
--
Week 48? What kind of calendar do they use? (Score:2)
Re:Intel is really showing its stripes lately (Score:1)
---
Re:Then it should be possible to get accurate time (Score:1)
It has nothing to do with accurate time, hardware, or resolution. It was simply a decision to store time as a number of clock ticks in a 32-bit location and to have all APIs honor that.
Once a machine boots and reads the time from the CMOS clock, it counts clock ticks, which are usually generated by a programmable timer/counter chip on the motherboard. Every time the timer/counter generates an interrupt, the time-counter location in memory is incremented. The timer/counter chip holds more information that can be read programmatically (such as the microseconds returned by the gettimeofday() call), and some will return sub-microsecond timing. However, this is irrelevant to the seconds tick agreed upon by all Unix APIs, which is a 32-bit location. It is very easy to implement a 64-bit counter, or 128-bit, or anything else, but it is extremely hard to get everybody else to agree to it, short of universal acceptance of a new spec.
One fly in the ointment is that it is very hard to read two consecutive 32-bit locations to make one 64-bit word (or two 16-bit for one 32-bit, or two 8-bit for one 16-bit, etc.) when an interrupt might occur between the two reads. One could try a mutex, but the cost is too high for such a small operation that could happen at high frequency. How to do it is left as an exercise for the reader.
Now, in the two-32-bits-for-64 case this is not so bad: a broken algorithm would only put you off by about 68 years every 68 years... but still. Also, mutexes can cause priority inversion, and your Mars exploration vehicle could end up in a brain lock.
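For what it's worth, the classic lock-free answer to that exercise is to sample the high word, then the low word, then the high word again, and retry if the high word changed in between. A minimal sketch in C -- the `ticks_hi`/`ticks_lo` names are made up for illustration, not any real kernel's variables:

```c
#include <stdint.h>

/* Two 32-bit words updated by the timer interrupt handler.
 * "volatile" because they change behind the compiler's back.
 * (ticks_hi/ticks_lo are illustrative names, not a real API.) */
static volatile uint32_t ticks_hi = 0;
static volatile uint32_t ticks_lo = 0;

/* Lock-free read of the combined 64-bit tick count: sample high,
 * then low, then high again.  If an interrupt carried the counter
 * across a 32-bit boundary between the samples, the two high-word
 * reads will differ and we simply try again. */
uint64_t read_ticks64(void)
{
    uint32_t hi, lo;
    do {
        hi = ticks_hi;
        lo = ticks_lo;
    } while (hi != ticks_hi);
    return ((uint64_t)hi << 32) | lo;
}
```

The same trick works for any two-word counter, as long as the interrupt handler updates the words in a consistent order; the cost is an occasional retry instead of a mutex.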
Re:no, no, wrong accent! (Score:1)
Re:Do we really need a faster processor? (Score:1)
Sorry, but that's just silly. What people do and don't "need" is beside the point. I seriously doubt that most people "need" a processor much faster than 250-300 MHz for their serious computing needs; that's plenty fast enough to handle the web-surfing, word processing, spreadsheets, etc. that make up most people's daily computing use. But those people are going ahead and buying much faster processors because they're available at a reasonable price point. If Intel had put some serious effort into developing a 64 bit processor instead of continually extending the life of the PPro core, it too would be available at a reasonable price point and people would buy it. Would they really need all of the big advantages that you can get from a 64 bit processor? Of course not, but they could very well be getting more processing muscle for their dollar than they are today, and that would be a good thing.
Re:Thoughts on processor wars of late (Score:1)
I prefer to run the low end of the AMD chips, which is now near the 650-700 point. This cheap alternative would not be available at prices below $100 per chip if the high end didn't drive the market.
Do I really NEED a 650 Duron to run my day-to-day stuff? No, but at $65, I can afford one that will still serve my needs for years to come. While your needs may be met by your current systems, please do not infer from a single data point that everyone's needs fit the same pattern. As for x86 "losing the battle": why did Apple feel the need to offer TWO CPUs for the price of one in their high-end models? Are Mac users that influenced by a simple MHz number? Perhaps "real benchmarks" are not equivalent to "real world" performance, as any CS undergrad will tell you.
What I do know is this: for my computer business, I can offer my clients the luxury of purchasing a system now with a 650 processor, and the guarantee of being able to swap in a 1200 processor if their future needs demand it. Apple isn't at that point, and the PC world is only due to the high-end CPU wars.
Re:AMD (Score:1)
I held my breath and went to www.amdzone.com, et voilà:
AMD Demonstrates Dual Athlon System
Date: Tuesday October 10, 2000
SAN JOSE, CA --OCTOBER 10, 2000--AMD today reached a new milestone with the first public demonstration of a multiprocessor computer designed specifically to work with AMD processors. The demonstration, at the 2000 Microprocessor Forum, consisted of a computer powered by dual AMD Athlon™ processors, the AMD-760™ MP chipset, and next-generation Double Data Rate (DDR) memory. The multiprocessing computer demonstration featured 3D Studio Max, a professional 3D design and modeling application capable of increasing system performance by using both processors simultaneously.
Re:Couldn't this all be solved by ditching RDRAM? (Score:2)
Seriously, all of the major recent Intel chipset problems were with RDRAM. There's the infamous i820E problem with the third RDRAM chip not getting registered (which was after the initial, pre-RDRAM i820 went bust). And now this, when the ICH2 is coupled with the i850 and i860 MTHs, which use... starts with R, you know this...... RDRAM! Right!
I think the time for the NVidia DDR chipset is NOW. Let's stop this half-assed hardware engineering and pre-alpha lithography which the Intel staff is undertaking.
It's not that simple. Both RDRAM and DDR require tighter tolerances on the chipset and motherboard levels for the same obvious reason--the more bandwidth you want to transfer, the less your tolerance for noise and defects, and the greater the danger of crosstalk. Period. This is simply a fact of life if you want the benefits of high-speed DRAM without taking the (in my opinion inevitable) step of ditching the current system of expandable commodity RAM on the motherboard in favor of a system which ties hardwired or embedded DRAM to the MPU.
Now, you can argue that RDRAM makes the problem worse by trying to cram the same amount of bandwidth as PC1600 DDR into a thinner bus. And you can argue that dealing with a new memory communications protocol has led to more bugs. I'm not going to argue with you there. On the other hand, RDRAM proponents would counter that RDRAM lessens these problems by switching to a packet-based protocol to cut down on interference.
And you can argue--as you did--that mighty Intel has run into myriad problems trying to implement RDRAM for its PC chipsets, and that its results to date--the i820 and i840--have been lackluster at best. But you can also point out that Intel has (for marketechture reasons) made this switch on a processor designed for use with SDRAM only. You can point out that they've (also foolishly) decided to have the switch coincide with a switch to hub-based memory management--which, incidentally, appears to be the source of the erratum in the i850, not the use of RDRAM. You might also want to note that Intel's first efforts with SDRAM chipsets, while not as star-crossed as the i820/i840, were nowhere near as efficient, stable or refined as their 4th-generation BX chipset or the 5th-gen i815.
Finally, it's worth noting that there are exactly zero currently available DDR chipsets, bug-free or otherwise, with which to compare Intel's RDRAM record. Of course the main reason for this is politics--only after a year of Intel floundering with the RDRAM protocol did the industry finally coalesce behind DDR. And of course this fact will be changing quite soon, within the month in all likelihood. The fact that several working and apparently stable DDR chipsets are on the verge of being released, from chipset designers less accomplished than Intel, means that DDR cannot be as difficult to implement as RDRAM-backers have long argued it would be. Still--and this is very important--no one has claimed it was easy. Indeed, DDR chipsets were by all accounts much more difficult to get working than SDRAM chipsets, and even now many are reportedly quite finicky when working with DDR made by different manufacturers. Furthermore, all DDR motherboards due for release in the near future are, like all of Intel's RDRAM boards, six-layer. This improves stability and reduces crosstalk at the expense of extra engineering effort and manufacturing cost; SDRAM chipsets tend to be quite stable with just 4 layers.
I dunno where that leaves the ease-of-implementation balance. It appears that it's on the side of DDR, but it's still a bit premature to say so conclusively. In any case, the idea that whipping up a DDR chipset to replace RDRAM is child's play is absolutely false.
On the other hand, Intel already does have a DDR chipset for the P4--or rather, for Foster, the "P4 Xeon" due out in the beginning of the year. Now, this chipset could be modified for use with the normal P4 with little problem, but there are two big reasons it won't be:
1) Cost: In order to fill Foster's massive FSB, Intel's new chipset uses, IIRC, dual-channel double-wide DDR. Contrary to what you may have heard, this means a much, much higher cost than the dual-channel RDRAM bus on the i850. The reason is that RDRAM's one unambiguous advantage over DDR is that it uses fewer pins; having two channels and doubling the bus width means multiplying the already quite large number of DDR pins by 4, which in turn means motherboards which are mucho expensive, even if two sticks of DDR are cheaper than two sticks of RDRAM.
2) Legal obligations. Intel is under contract with Rambus not to promote any other next-gen DRAM standard for its mainstream desktop line until 2003. That means that, unless they want their asses sued off (and we all know if there's one thing Rambus is good at, it's ass-sue-offing), the best Intel can do for the next couple years is license the P4 bus and allow 3rd-party chipset makers like VIA, ALi and, as you mentioned, perhaps even nvidia to make DDR chipsets for the P4. Actually the contract specifically states only that Rambus has the option to revoke Intel's RDRAM license if they promote a DDR chipset for their desktop chips, so the big question is, would Intel risk losing their Rambus license, especially when there is no cheap DDR solution which can take full advantage of the P4's 3.2GB/s FSB? On the other hand, would Rambus risk revoking Intel's RDRAM license and thus taking themselves out of the PC DRAM industry possibly for good??
I dunno. Frankly I'm just hoping Infineon succeeds in overturning some of Rambus' RDRAM patents with prior art as they're seeking to do. (I'd be shocked if they didn't succeed in showing prior art for Rambus' "patents" on SDRAM and DDR.)
Re:Intel is really showing its stripes lately (Score:2)
AMD's not going to get a lot of Intel's business until they get some more workstation-class OEMs producing machines with the Athlons in them. I'm picking out 30 new machines today at work, and I'm not looking at any Athlons. Why? Dell doesn't make any Athlon machines. Gateway doesn't have any business machines, only their home-market Select series. Micron? Nope...
I'll buy them, I'm happy with my Duron 600 at home, and if they're cheaper, it's a no brainer. But AMD is going to have to break Intel's stranglehold on the business OEM before I can buy them!
---
Re:This baby is actually several years late (Score:1)
A lot of software doesn't handle that change terribly well, though some can. The EOF character is ^D, which is 0x04 (EOT) in the ASCII table.
Re:This baby is actually several years late (Score:1)
i815? (Score:1)
Re:This baby is actually several years late (Score:1)
Re:This baby is actually several years late (Score:1)
Re:Intel Has Also Shuffled P4 Execs (Score:1)
Why do you people call it the P4? So when the next version (Pentium V) gets out, will you call it the P5?
Oh, wait, P5 is already reserved for the original Pentium!
Could we at least use the Roman numeral, like P-IV or something?
Re:This baby is actually several years late (Score:1)
Re:This baby is actually several years late (Score:1)
Re:This baby is actually several years late (Score:1)
Curiosity? (Score:1)
Re:This baby is actually several years late (Score:1)
Nowadays, memory is cheap and we align everything on 8 bytes to gain some microseconds...
____________________
Same old song (Score:1)
Re:What of price (Score:1)
Re:AMD (Score:2)
Finer than 1s accuracy? (Score:1)
Re:News? (Score:2)
The 80286 was on time.
Re:And then? (Score:1)
Hyep. Remember, kids...at Intel, quality is job 0.99999999999998.
Re:This baby is actually several years late (Score:1)
And not to sound too much like whoever hyped up the Y2K "problem," but isn't the year 2000 a bit early to worry about 2038, anyhow?
Just in time... (Score:1)
for the cold winter months up north. Intel's just doing their part to help out with your heating bill :)
c.
Re:Too late to be relevant? (Score:1)
The Athlon's not going near the high end until VIA or AMD can release an SMP chipset. Until then it'll just be a gamer, home user, desktop user, and possibly economy server CPU. Which isn't bad. I just bought an Athlon system last week. Which, incidentally, won't recognize its Sound Blaster Live! card while running Red Hat 7 (so if anyone here wants to give me some advice, I'd much appreciate it, hint hint).
Re:Too late to be relevant? (Score:1)
That must be why Dell (read: a major portion of the PC market) is still marketing Intel processors in its systems.
Just yesterday I got some more junk mail from Dell containing items like the 1 GHz Pentium, etc.
I scoured the article and could not find any mention of AMD processors. Very strange.
My conclusion, based on this and several other factors, is that saying Intel is too late to be relevant is like saying that since Microsoft is late to market, it's too late for them to be relevant *cough*.
It underestimates what years of monopoly, patents, deeeep pockets, and being in bed with Microsoft can do for your longevity at this point.
Jeremy
Athlons are SMP as of *today*! (Score:2)
Of course, it doesn't matter much until you can buy one in stores. Volume SMP-capable motherboards should be available Q1 2001 -- not that far away.
And the Athlon's chipsets have been SMP-capable for a while (they *are* based on the DEC Alpha's interconnect); it's just that no one has put the energy behind building SMP motherboards from them until now.
Happy day!
--Lenny
Re:More reasons for the delay... (Score:1)
Re:Do we really need a faster processor? (Score:1)
Re:Hey, a song could be written from this! (Score:1)
/me rips out AFCArchVile's spleen.
George Foreman (Score:1)
Market positions of Intel and AMD in a post-PC era (Score:4)
But will it matter?
We speak of the post-PC era. I don't expect to see the PC go away. I'd rather expect it to look more like the end of the mainframe era of a decade or two ago. The mainframes didn't go away, they even kept growing their market. But the wild growth was in the PCs.
Now in the post-PC era, expect to see the PC market growing, just not wildly. Knowing exactly what will be the wild growth area is what will make some people VERY rich.
But Intel's product breadth, particularly ownership of the StrongARM, is going to help them more than AMD's CPU leadership will. IMHO, AMD may well have won a Pyrrhic victory. The big question will be how they are poised to play in the post-PC era.
Re:This baby is actually several years late (Score:2)
Re:i815? (Score:1)
don't use intel (Score:1)
Re:Couldn't this all be solved by ditching RDRAM? (Score:1)
Emphasis mine. Actually, it's MCH. Memory Controller Hub. An MTH was used when you wanted to use PC100 sdram in a RIMM slot.
Re:Thoughts on processor wars of late (Score:1)
Intel Has Also Shuffled P4 Execs (Score:4)
This baby is actually several years late (Score:1)
Now where's that damn link?
Don't worry kids... (Score:1)
What of price (Score:1)
Hey, a song could be written from this! (Score:3)
Oh where, oh where is the Pentium 4?
Too late to be relevant? (Score:3)
The only thing that will save the P4 at this point is a huge marketing blitz by Intel. And even that sometimes fizzles (remember the "Enhanced for Pentium 3" web sites? The only one around I've seen is intel.com).
Now, if only Motorola/Apple would finally get off their asses and make the high-clock-speed G4s the architecture is capable of. Dual CPU is très cool, but only if the OS fully supports it, and OS X won't be out in full release for a while.
Re:And then? (Score:1)
And then? (Score:1)
Anyone remember the Pentium III, the Pentium II, and the Pentium without a number? They all were discovered to have some bug after they were launched.
Is it just me, or could we jolly well wait a bit longer (I mean EVEN longer) for yet another faster processor to spend money on, and then be sure we get one that hasn't been thrown onto the market in a hurry?
I guess they're having problems... (Score:3)
What would Gordon Moore, Robert Noyce, Ted Hoff and Federico Faggin do?
Re:This baby is actually several years late (Score:1)
How exactly is IA64 worthless? A 64-bit architecture is necessary eventually, unless you don't want a working Unix machine after 2038.
Many more people than you think (Score:1)
Sharky's Extreme is being far too kind (Score:1)
While I don't doubt the missing chipsets...I think Intel has far worse problems on the horizon. (I haven't seen anyone come up with the new compilers that are going to be required for the P4...is there news there?)
Re:Hey, a song could be written from this! (Score:1)
Anyway, I thought it was a marvelous job! Brava!
More reasons for the delay... (Score:2)
Also, not many people are REALLY happy about the fact that these new processors are going to have to be cooled by fans that are hard to implement -- 25 dollars for said fan.
Re:Finer than 1s accuracy? (Score:1)
Re:Same old song (Score:2)
Then it should be possible to get accurate time (Score:1)
Re:Pentium 4 Delayed in laptops because ... (Score:1)
Re:Too late to be relevant? (Score:2)
It doesn't make any sense to extrapolate, except to FUD Intel, and for the fact that being an AMD fanboy earns points on Slashdot. Sure, Intel is currently having problems with the P6 core at the end of its lifecycle, but that's what the P4 is supposed to fix, and the chip is designed to get the clock speed up, up, up (because that's what sells chips).
The "high end" market goes where the performance is. Intel could fall on its ass, and so could AMD. What's more likely is that they will both stay within the same price/performance band in the near future, Intel will continue to keep the big OEM contracts that have made them rich, and AMD will continue to keep the loyalty of its fans.
Re:Too late to be relevant? (Score:1)
Re:Curiosity? (Score:1)
Well... to paraphrase the old Greek saw, "Beware how picky you act, for the Gods may be pickier!"
You cannot logically deduce that AMD will be -- the original poster was predicting, after all -- but given recent history you can logically induce the relative pricings. Logicians do study inductive logic as well these days, you know.
Not everything is Aristotelian (or Platonic).
Please infer plentiful smiley faces; this post is tongue-in-cheek.
And in other news... (Score:1)
Re:AMD (Score:1)
Look ma, no hands! (Score:2)
Chaos Theory, complexity and the genetic algorithm (Score:1)
Um. (Score:2)
What if I want my uberG4 for doing _work_ rather than spinning tiny wheels really fast? It's as if you don't understand how different PPC chips are and have been, from x86. They have always been register-rich and loaded with cache compared with x86- let's have more of that. Let's have an uberG4 in which the cache will fit, say, Quake III :) then the pentium people can boast of their higher clocks all they want while their computers spin their tiny wheels constantly loading stuff into their teeny 'high speed zones', and the uberG4 will be about TORQUE and will slowly overtake the pentiums for good.
(Yes, I know that Quake framerates are hard to come by on MacOS -- one word, ATI -- no, two words, ATI and Doom. Quake III is not an inherently super-complicated program; the reissue of Doom looks to be a _lot_ more demanding. The difference could well favor the G4: having a compiler use loads of huge registers to speed things up is dead simple compared to the twisted arcana that will be necessary for P4 and beyond.)
Re:Look ma, no hands! (Score:1)
When I bought my 486, slowest was a 25sx - the fastest at the time was a DX2 66 - nearly three times the speed.
Now I appear to have a choice between a celery 500ish and a 1 GHz P3 - the P3 isn't even twice as fast and only maintains the lead it does because the celery is artificially crippled by Intel. Even then you've still got to try hard to find the 1 GHz P3 because they are rare outside the major manufacturers. For the typical user the graphics card is far more important (e.g. a 500 celery with a GeForce will kill a P3 with an ATI).
I used to pay 20% more on the system price for a 50% increase in clock speed -- about 33 MHz. Now that 20% will get me about 50-100 MHz, or about a 10% increase in clock speed -- barely noticeable after all.
My advice: get the best price/MHz chip you can and a damn fast graphics card. Change the chip next year; you'll still pay less overall and average out at about the same speed.
Thoughts on processor wars of late (Score:2)
The first is that no one cares, except for the same kind of person who insists on owning cars with 500+ horsepower. I currently do high-end software development, including lots of 3D graphics work, on two machines. One of them is a 333 MHz Pentium II, the other is a 400 MHz Pentium II. This is hardcore stuff, involving several compilers and some high-end languages that don't normally get used. According to benchmarks, my machines are about 29-35% of the speed of the top-of-the-line machines. And in all honesty I have zero complaints about speed. Both of my computers are zippy. I suppose I could try to slow them down by pointlessly including extra headers everywhere, but why? "My computer is twice as fast at handling unnecessary crap as your computer" is not impressive.
The second thing is that it's obvious that the x86 architecture is fighting a losing battle. When Apple claimed that the PowerPC processors were equivalent to Pentiums of double the clock speed, everyone pooh-poohed them. Then, according to real benchmarks like the one mentioned on Slashdot yesterday (in a story about the Pentium III), it turns out that a 500 MHz G4 really is equivalent to a 1 GHz Athlon, and only 12% slower than a 1.1 GHz Pentium III. And the G4 uses much less power, making it a realistic choice for notebooks. I am not saying that Macs are better than PCs. I am saying that the current high-end PowerPC chips are making the Intel vs. AMD battle look pretty ridiculous. Who cares if Chevy puts an 800 HP engine in one car and then Pontiac outdoes them with 810 HP? Everyone is happy with inexpensive and reliable 150 HP cars that don't need to be handled with kid gloves.
Maybe the intel C compiler ? (Score:1)
Why bother asking? (Score:1)
People still can't get 1 GHz PIIIs.
Couldn't this all be solved by ditching RDRAM? (Score:4)
I think the time for the NVidia DDR chipset is NOW. Let's stop this half-assed hardware engineering and pre-alpha lithography which the Intel staff is undertaking.
News? (Score:2)
Now's the time to switch... (Score:1)
We've got some at my office... (Score:1)
Of course, the sorts of demos Intel's requested are pretty bogus... Full screen video, voice recognition, some networking stuff, none of which has much to do with the power of the processor...
Re:They can rhyme. (Score:2)
Re:Don't worry kids... (Score:1)
You don't shuffle execs in a panic... (Score:2)
Santa may have a toy workshop (Score:1)
Oh Why Oh Why Do We Need Pentium 4? (Score:2)
But really, at this point the tech market is lagging, software has fallen far, far behind top-of-the-line processors, and we (as consumers, mind you) don't need more power except to fulfill some non-productive urge. And there is no doubt that the P-4 is targeted at base-level consumers and not folks rendering high-end graphics and animations.
Oh, and I'll put money on the P-4 being less stable than previous generation processors.
Re:I guess they're having problems... (Score:1)
What you really need to ask is:
What would Brian Boitano do?
Intel is really showing its stripes lately (Score:2)
Enigma
AMD (Score:1)
the P4. My next box is going to be an SMP Athlon.
Now, if some manufacturer would actually
*release* an SMP Athlon board before 2003...
Good move on their part (Score:1)
Re:Actually that is more of a guess (Score:1)
Duh (Score:1)
Think about this: 3k for a wickedly mean machine, 2k for a good computer, 500 bucks for "all that someone really needs." A year from now that 500-buck machine will become some sysadmin's mail hub. The 2k machine will be his desktop for the next two years, then serve as his file server for three, then a router for three more years. The hot machine will be the guy's development box for a year, his desktop for two, his file server for four, and a router for another four years.
What has more value? I think the 3k machine. That's why when I do buy a system, I buy two steps down from the best thing I can get, keeping it easy to upgrade (two 256 MB DIMMs instead of four 128s), and understanding that next year it will be worth less.
Re:Thoughts on processor wars of late (Score:2)
I'm not talking about Apple's marketing nonsense, but what various hardware web sites are reporting, including one linked to on Slashdot this past Monday. Also note that they're talking about the CPU in the G4. The snail ads from 1998 were about the CPU in the G3.
"with big iron on his chip..." (sorry, Marty R...) (Score:1)
Start A Pool!!! (Score:2)
This thing has already been delayed, I personally have the weeks of January as my main choices.
Re:Whoa. Very good points. (Score:1)
(still trying to adjust eyes from reading 12-point Times New Roman for five minutes straight!)
Re:This baby is actually several years late (Score:1)
It has nothing to do with EOF.
(sigh)
Re:And then? (Score:1)
-
Do we really need a faster processor? (Score:2)
Now that I have your attention, and before you call me flamebait, hear me out.
The only truly revolutionary changes Intel has ever made are the 386 and the Pentium. Every other design was a simple upgrade to already-existing chips.
Call me crazy, but I think that Intel should have released a 64-bit chip about 5 years ago. Now that would have been as revolutionary as going from the 286 to the 386, or from the 486 to the Pentium.
Why didn't they? Because they had already paid a fortune developing the existing technology and they wanted to milk it for another few years. Intel is a big believer in the incremental upgrade. They want users to pay a premium for basically the same chip three or four times, while the money that should be going to research goes to the "Intel Inside" marketing campaign.
By this time Intel should have been releasing chips with 128-bit buses. Instead we are stuck with a chip whose basic design hasn't changed in many years. And the 64-bit chip is delayed yet again. Is it really going to be out at the end of next year? Sadly not.
*sighs*
I guess I just don't understand how to make money.
Re:Too late to be relevant? (Score:2)
--