
Merced Design Completed
NoWhere Man writes "Merced's design is complete and the chip is due to go into production in mid-2000, but McKinley, Merced's successor (due late 2000), is expected to be the more popular of Intel's 64-bit chips."
Re:The big Question: Will Win2K be 64-bit ready? (Score:1)
Re: 680x0 -> PowerPC (Score:1)
They moved an entire OS, an entire platform and all of the applications that ran on it from one chip to another almost completely seamlessly.
It went so well that no one talks about it. I think that that is Apple's best technical achievement ever.
- AC
"tape-out" != "done" (Score:2)
As anyone in the hardware business knows, there is a huge gulf between what Intel has just announced (preparing the design to be fabricated into physical chips for the first time, traditionally called "tape-out" because the design data used to be shipped to the people making the masks on spools of tape) and actually being ready to ship product to customers (i.e. OEMs).
Keep in mind that this means they don't have a single physical Merced chip yet, which means the design has only been simulated, never really tested. Although I don't know anything about Intel's internal simulation tools and methodologies, I'd be surprised if they've even been able to boot an operating system on their simulated design. I'd be amazed if they've simulated significant real-world applications. In other words, at this point Merced has had very little real-world testing. That's what the next step ("first silicon") is all about: testing whether the design actually works in practice (and in systems) rather than just in theory (under pre-silicon simulation).
That's not to say that Intel hasn't expended a huge effort already in working to eliminate as many bugs as possible before taping out (in fact, I'm sure they have), but there are always more bugs when the silicon arrives. In fact, many of the problems encountered in first silicon are effects which simulation can't (or at least doesn't) effectively capture (race conditions that earlier analysis missed, unanticipated electrical and electromagnetic effects, yield problems, etc.). These problems are worse when you're talking about a new from-scratch design. There are also bound to be other problems when you're talking about a totally new processor architecture (such as IA-64).
Usually the process between tape-out and shipping to clients goes like this:
To me, the estimate of one quarter between tape-out and shipping samples to OEMs seems extremely optimistic. Of course, I suppose Intel could be planning on using the OEMs for debugging their first- or second-pass silicon (before they've made it really robust), but somehow I doubt that. I would expect that it might be more like two quarters before anybody outside Intel actually sees a physical Merced chip.
But then again, hey, what do I know.
Kenneth C. Schalk <kenneth.schalk@compaq.com [mailto]>
Software Engineer
CAD & Test Group
Alpha Development Group
Compaq [compaq.com]
Intel from the Inside (Score:2)
Word is (Score:1)
Re:Micro$oft Pressure (Score:2)
Re:what about Alpha? (Score:1)
Because these registers are general purpose, there's every reason to expect that gcc/egcs can be told about them, and hackers can find ways to make use of them- ideally by telling compilers how to optimize so the vector processors/huge registers get used. At that point, any PPC Linux users can simply recompile favorite applications to get them running markedly faster- and recompile the kernel to get that running markedly faster- something that MacOS users will only see indirectly, with system updates and with Quicktime.
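To make that concrete, here's roughly what "recompile to use the vector registers" could look like from C once a compiler exposes the hardware. This is only a sketch against Motorola's AltiVec C interface (altivec.h, vec_ld/vec_add/vec_st); the exact gcc/egcs flags and level of support are assumptions on my part, not something you can count on today.

    /* Adds four floats at a time in one 128-bit vector register.
       Compile with something like: gcc -O2 -maltivec vecadd.c
       (flag name is an assumption about the eventual gcc/egcs port) */
    #include <altivec.h>
    #include <stdio.h>

    int main(void)
    {
        float in1[4] __attribute__((aligned(16))) = { 1.0f, 2.0f, 3.0f, 4.0f };
        float in2[4] __attribute__((aligned(16))) = { 10.0f, 20.0f, 30.0f, 40.0f };
        float out[4] __attribute__((aligned(16)));

        vector float a = vec_ld(0, in1);   /* load 4 floats into a vector register */
        vector float b = vec_ld(0, in2);
        vector float c = vec_add(a, b);    /* one instruction adds all four lanes */
        vec_st(c, 0, out);                 /* store the result back to memory */

        printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }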
Personally, I'm looking forward to hearing about this...
Oh? (Score:1)
I do have one of their CPUs in the house.
I've never been so pleased to be running a 604e as now, when the whole x86 market is being systematically exterminated. I'll just keep on supporting PPCs, seeing as I already have the Mac and the software to run on them. Things are already nasty on x86, good luck being able to afford Alpha machines (and people say Macs are expensive!) and if you're still supporting Intel, well, just think about what your money is doing, won't you? One hand supports Linux, for now- the other's killing all your hardware choices as fast as it can.
Intel's first offerings always suck (Score:1)
The first few instances of each new product line from Intel have always sucked:
Transmeta, anyone? (Score:1)
I haven't heard a good Transmeta-gonna-whoop-Intel rumor in months.
The doco issue is moot right now. (Score:1)
-lee
Re:Does it mean "bye, bye x86 machine code" ? (Score:1)
400MHz Sparc (Score:1)
The newer, better, faster Sparcs on the horizon should be even sweeter.
Re:And what about the compilers? (Score:2)
>Produces much faster code than \1. But does
>that matter one bit for the \1 community? No.
>Didn't think so.
Speak for yourself. I work daily from my FreeBSD box to my boss's Linux box running a commercial Fortran compiler (g77, etc., don't even play in the same league). We have Absoft Fortran, but it would have been Digital's if it were available.
We were even willing to pay the extra cost for the Alpha box, but the costs of DU itself, both for purchase and the risk of getting sucked into the university system and fee'd to death there, meant the x86/Linux/Absoft solution.
A year and a half later, I've paid the price. I never thought I'd see the day I *needed* a 64-bit operating system, but now I do: I need an array with more than 2^32 bits, and more than 2^32 bytes would be nice, too. Absoft uses Cray code that bit-addresses, leaving the size limit on an array of derived type at about
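For the curious, here's my own back-of-the-envelope on where that wall comes from (not Absoft's exact limit): a 32-bit bit offset can only distinguish 2^32 bits, which is a mere 512 MB of data, while 64-bit offsets make terabyte-sized arrays addressable. A throwaway C illustration:

    /* Back-of-the-envelope: how much data 32-bit vs 64-bit offsets can index.
       Purely illustrative; assumes unsigned long long is 64 bits wide. */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long bits32  = 1ULL << 32;   /* distinct 32-bit bit offsets */
        unsigned long long bytes32 = bits32 / 8;   /* ... is only 512 MB of bytes */
        unsigned long long bytes40 = 1ULL << 40;   /* ~1 TB, easy with 64-bit offsets */

        printf("2^32 bit offsets cover %llu bytes (%llu MB)\n",
               bytes32, bytes32 >> 20);
        printf("2^40 byte offsets cover %llu bytes (%llu GB)\n",
               bytes40, bytes40 >> 30);
        return 0;
    }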
Re:And what about the compilers? (Score:1)
Absolutely nothing. People will still keep on using GCC as they have always done. Why? Because GCC is multi-platform. Compilers like Sun's aren't. End of story.
Re:what about Alpha? (Score:1)
Actually, you have it backwards; Motorola will be supplying the Altivec-enabled chips, while IBM is concentrating on increasing the clock speed rather than the number of instructions.
Phil Fraering "Humans. Go Fig." - Rita
Re:1 GHz in 2001 means....FAST GAMES! (Score:1)
Added to that, processor design has become more bloated, moving deeper into a large, complex instruction set. Simpler processors, such as the ARM, outpaced the Intel chips even at a fraction of the clockspeed, because they were better designed.
Finally, throw in that most modern OS' are bloated and top-heavy, Linux being one notable exception, and you've a recipe for a horrible quagmire from which REAL games and gamers may never escape.
Re:Bugs (Score:1)
Re:Does it mean "bye, bye x86 machine code" ? (Score:2)
Bugs (Score:1)
Another link in the chain (Score:1)
ones made so far (?) (Score:1)
I just like to see new announcements
Re:The big Question: Will Win2K be 64-bit ready? (Score:1)
and if you merge those two (while they are still trying to make a duo/coalition), their future is even more uncertain.
but anyway, whether they both succeed or not (or both fail), it can be good: it'll show people that the past years of "innovation" as performed by the Wintel coalition have been mostly the result of marketing (because real innovation is done in laboratories, not on papers containing press releases).
it can also cost us (or them? or some other users? or some other developers? ...) a lot.
Re:what about Alpha? (Score:1)
just encourage competition so we can get better products at better prices and with a lot of choices.
Re:64bit 128bit its NOT a problem if they SORT IT (Score:1)
To get their NUMA work?
Re:"tape-out" != "done" (Score:2)
I infer from "simulated design" that you mean a simulation of the Merced implementation of the IA-64 architecture, not just a simulation of the IA-64 architecture itself. Other followups have said "they booted {NT, HP-UX} on a simulator", but that might have been a simulation of the IA-64 instruction architecture rather than a simulation of Merced at, say, the gate level. Such a simulation wouldn't have tested the Merced design; it would just have tested the software changes needed to make the OS run on an IA-64 processor.
(I.e., I'm actually replying, in bulk, to the folks who said "but they booted XXX on a simulator" in response to you; booting some OS on an IA-64 simulator doesn't necessarily test the design of a particular implementation of IA-64, and thus doesn't necessarily ferret out bugs in that implementation.)
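To spell out the distinction: an instruction-set simulator only models architectural state (registers, memory, a program counter) in a fetch/decode/execute loop, and says nothing about the gates, pipelines, or timing of a particular chip. A toy sketch of such a simulator for a made-up two-instruction machine (nothing to do with Intel's actual tools or with IA-64):

    /* Toy instruction-set simulator: models architectural state only,
       not the circuit-level behavior of any real implementation. */
    #include <stdio.h>

    enum { OP_ADDI = 0, OP_HALT = 1 };                 /* made-up opcodes */

    typedef struct { int op, rd, rs, imm; } insn_t;

    int main(void)
    {
        long reg[8] = { 0 };                           /* architectural registers */
        const insn_t program[] = {                     /* r1 = r0 + 5; r2 = r1 + 7; halt */
            { OP_ADDI, 1, 0, 5 },
            { OP_ADDI, 2, 1, 7 },
            { OP_HALT, 0, 0, 0 },
        };
        unsigned pc;

        for (pc = 0; ; pc++) {                         /* fetch/decode/execute loop */
            insn_t i = program[pc];
            if (i.op == OP_HALT)
                break;
            if (i.op == OP_ADDI)
                reg[i.rd] = reg[i.rs] + i.imm;
        }

        printf("r1=%ld r2=%ld\n", reg[1], reg[2]);
        return 0;
    }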
what about Alpha? (Score:1)
there are 3 kinds of people:
* those who can count
Re:Fast Games my @**... (Score:1)
Most good programmers go by "get it right, and then get it fast."
There are more chips than Alpha (Score:1)
http://www.hp.com/visualize/products/cclass/c30
I'm not sure where the Alpha is now, but this thing is turning out some impressive numbers.
The big Question: Will Win2K be 64-bit ready? (Score:1)
Re:400MHz Sparc (Score:1)
Linux64 (Score:1)
Re:The big Question: Will Win2K be 64-bit ready? (Score:1)
Re:Does it mean "bye, bye x86 machine code" ? (Score:2)
> Does it also mean that it is a bad idea to buy an "old" PentiumIII processor ?
Software written for the Merced may not run on a PIII, but software written for older x86s will still run on the Merced. Just as current x86 chips can emulate "real mode" for older apps, so the Merced will be able to use and/or emulate the processing modes used in current x86s.
A dedicated x86 clone - like the K7 - will be able to run these applications faster. However, _if_ they did a good job on the Merced's core, applications written natively for the Merced will run faster than applications written natively for the K7, as the K7 will still be hampered by the x86 instruction set and register structure.
_If_ Intel did a good job on the Merced core, it will be fast but still 1.5x as expensive as other RISC solutions due to the extra silicon needed to support x86 legacy features.
However, I gather that they may not have done such a good job on the core. We'll see when prototypes are benchmarked.
Re:Fast Games my @**... (Score:2)
Please click on "user info" above and see my previous response in this thread. It is a reply to another poster who presented almost identical arguments.
Re:I am aware... (Score:2)
> Again Linux is the exception....
Correction: Windows is the exception. See the post that I referred to.
Re. games, I realize that most game software isn't perfectly tuned, but all of the "boost FPS by 50%" optimizations will already have been made, because it is in the game company's financial interest to do so - as a result of this optimization, they can either lower the system requirements or keep the frame rate and jack up the graphics detail. Both correlate directly to better sales.
From where do you get the impression that games are horribly written?
A few concerns with your arguments. (Score:3)
Um, no. New games require more hardware because they have fancier special effects and more detailed models. This is not really related to code complexity. It makes the _data_files_ larger, naturally, but that's about it.
Granted, there are some game writers who consider special effects a reasonable substitute for gameplay and plotting. These writers' games will sink, however, because consumers do want games that are actually fun to play.
Re. hardware vs. games, game hardware requirements will plateau when cheap hardware exists that can handle just about all of the special effects in the OpenGL feature set for photorealistic models at high resolution in real-time. Beyond that, there isn't anything left to add hardware load on the graphics side of things.
Things like AI and physics may continue to develop after that, but physics at least won't add much more load if you have hardware that powerful.
> Added to that, processor design has become more bloated, moving deeper into a large, complex instruction set. Simpler processors, such as the ARM, outpaced the Intel chips even at a fraction of the clockspeed, because they were better designed.
Um, no. Look at just about any non-Intel processor. Intel chips are bloated because Intel continues to support and extend an instruction set that wasn't designed to be extensible. They're about the only major microprocessor manufacturer that made this mistake.
Also, didn't the ARM lack a floating-point unit? With that much more silicon to devote to integer work, of course it'll be faster at integer operations.
> Finally, throw in that most modern OS' are bloated and top-heavy, Linux being one notable exception
And *BSD and BeOS and...
Microsoft is the primary culprit for slow OSs. This is because Microsoft is purely market-driven, and the market that they cater to would rather buy a new version of the OS with more features than a new version of the OS that works more efficiently.
OSs and chips can be designed cleanly - and _are_, with only a few exceptions. Take a look around at what's available, and you may be pleasantly surprised.
Simulated boot (Score:1)
Which must give them some hope.
Nope. (Score:1)
> 64-bit version of Windows Server
> ready by the end of next year, the path will be
> clear for the *nixes to severely
> dent Microsoft's marketshare.
Nope. W2k is not 64-bit capable. In fact, Windows will probably run in a 32-bit emulation mode, much like what NT4 does on the Alpha. From what I understand, NT4 is _not_ easily ported to 64-bit, for some reason or another (otherwise, why isn't the Alpha version of NT running in 64-bit?). This means it will require significant time on the part of M$ to either port their OS to 64-bit, or write a totally new version of Windows...
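To give a feel for why such a port is hard: a lot of 32-bit code quietly assumes a pointer fits in a 32-bit integer, and every one of those assumptions breaks on a 64-bit target. A made-up C fragment (illustrative only, not actual NT code) showing the kind of bug involved:

    /* Classic 32->64 portability bug: stashing a pointer in a 32-bit integer.
       Works by accident on a 32-bit build; truncates the pointer on a 64-bit one. */
    #include <stdio.h>

    static unsigned int stored_handle;                        /* only 32 bits wide */

    static void remember(void *p)
    {
        stored_handle = (unsigned int)(unsigned long long)p;  /* high bits lost on 64-bit */
    }

    static void *recall(void)
    {
        return (void *)(unsigned long long)stored_handle;     /* may not be the same address */
    }

    int main(void)
    {
        int x = 42;
        remember(&x);
        printf("original %p, recovered %p\n", (void *)&x, recall());
        return 0;
    }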
As soon as Linux gets a compiler, we will take full advantage of the architecture. You know how quickly we work... Another factor is that Intel has been making friendly gestures at the Linux community. They were even so audacious as to donate some compiler optimizations. Personally, I think Mickeysoft and Windows are heading into some dark days, which I don't think they will survive.
Jeff
Nice Press Release (Score:2)
This article from Byte [byte.com] goes into some of the problems Intel has from this stage forward. A little low-tech for
Re:what about Alpha? (Score:1)
64bit 128bit its NOT a problem if they SORT IT (Score:1)
what this says to me is: this is to get press on the server stuff so the K7 doesn't!?
they are still significantly tweaking the compiler!!
the actual silicon is not the hard part; it's the software, i.e. the compiler.
IA-64 needs a *GOOD* compiler to even compete with Alphas, but a really good compiler (years off) would kill an Alpha at the same clock rate. HP is doing Trimaran, and many supercomputer centers have done research on how VLIW (sorry, EPIC) works and how it could be better.
>>>wait and see boys and girls, IBM bought Sequent for a reason
john jones
a poor student @ bournemouth uni in the UK (dyslexic, so please don't moan about the spelling, only the content)
Micro$oft Pressure (Score:1)
The Merced is supposed to run IA-32 code, but will it run a 32-bit OS like Linux or NT out of the box (or straight off the net, in the case of Linux)?
Re:ones made so far (?) (Score:1)
PIII != Merced, and I said Merced.
-Z
Re:what about Alpha? (Score:2)
Basically: Merced has been on the brink of failure for quite a while now. The performance of the ones made so far is considerably lower than that of a PII at a lower clock speed.
The development of the Athlon (aka K7, by AMD) has been quite secret. It is actually a super powerful chip and is using something like 256k of cache to bring down the price, and it will still whoop the PIII at equal MHz (and with 512k cache, in FPU benchmarks too!). Rumor has it that they will be releasing a 512k and 1MB cache Intel-killer Athlon shortly after debuting it at 256k. The Athlon will be using Slot A, which makes sense as they have been in bed with Alpha Processor Inc., Samsung's processor company. (Those of you who still think that Digital owns API, you're mistaken, as they made a deal to sell off their majority in the company to Samsung.) So as we see 1GHz Alphas debut without cooling, you know that they are sharing that technology with AMD, so the future is really bright for AMD. One nifty thing: on The Register, I saw an 8-processor motherboard being made for the Athlon. Hello, low-cost supercomputing.
Motorola, makers of the PowerPC processor line, will be introducing the G4. Rumor from the Mac side is that, due to a dispute between Motorola and IBM, who share in the production and design of the PowerPC processor line, there will be two different versions of the G4 coming out. One, which will be made by IBM, will include a special instruction set that Mac OS X can/will be optimized for that will speed up 3D rendering (sorta like 3dfx, from what I understand of it), whereas Motorola will make non-optimized G4s that cost much less than the IBM-manufactured ones. This probably means that lower-cost Macs, such as the iMac, will use the Motorola G4, and upper-end Macs like the PowerMac will use the IBM one.
-Z
Re:why 64? I don't know... why 32? (Score:2)
Re:1 GHz in 2001 means....FAST GAMES! (Score:1)
Re:Does it mean "bye, bye x86 machine code" ? (Score:1)
--
Matthew
Re:what about Alpha? (Score:1)
Re:McKinley will kick Merced's butt! (Score:1)
Because a lot of code re-writing/re-compiling is going to be going on, and because everything will have to be moved to 64-bit, hopefully we might even see more apps supporting the Alpha: not only open source, but closed-source commercial stuff, which, as much as we all hate it, is necessary for some things.
Long Live the Alpha.
as I think I saw in someone's sig,
Intel is the question, Alpha is the answer.
Re:McKinley is really HP's chip... (Score:1)
Re:Intel is no evil anymore (Score:1)
http://www.techweb.com/news/story/TWB19990706S0
Lazy Programmers=Urban Legend (Score:1)
Re:1 GHz in 2001 means....FAST GAMES! (Score:1)
As long as companies like ID are around I don't think this will completely happen. :) In fact John Carmack has been actively working on the Matrox G200 GLX module... it's cool to have accelerated 3D in multiple windows using a totally free, sourced solution.
And what about the compilers? (Score:4)
It seems that we are entering an era when the performance of your application is going to depend on the quality of your compiler/interpreter as much as on the actual hardware inside the machine. This is both good and scary. Good if the free compilers (like egcs) are able to compete with and outperform the commercial compilers -- that will be a great boost for free software. But there is also the scary part: if the free compilers fail to keep pace with commercial offerings, they will die. Think about it: if a kernel compiled under, say, Sun's compiler runs twice as fast as one compiled under gcc, what will happen to gcc?
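If you want a feel for how much the compiler matters on your own machine, the crudest possible experiment is to time the same little kernel built with different compilers or flags. A toy I just made up, not a real benchmark suite:

    /* Toy kernel for comparing compilers/flags, e.g.:
       gcc -O0 dot.c -o dot_O0 && time ./dot_O0
       gcc -O2 dot.c -o dot_O2 && time ./dot_O2
       (or substitute another vendor's compiler) */
    #include <stdio.h>
    #include <stdlib.h>

    #define N 4000000

    int main(void)
    {
        double *a = malloc(N * sizeof *a);
        double *b = malloc(N * sizeof *b);
        double sum = 0.0;
        long i;
        int rep;

        if (!a || !b)
            return 1;

        for (i = 0; i < N; i++) {
            a[i] = (double)i * 0.5;
            b[i] = (double)(N - i) * 0.25;
        }

        /* repeat the dot product so the run is long enough to time */
        for (rep = 0; rep < 50; rep++)
            for (i = 0; i < N; i++)
                sum += a[i] * b[i];

        printf("sum = %g\n", sum);   /* print so the loop isn't optimized away */
        free(a);
        free(b);
        return 0;
    }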
Kaa
Re:Cost? (Score:1)
Understand that this was never the original plan. Intel basically realized that Merced will not only cost a fortune and have no application support, but its performance will also suck so badly that they're not even going to try to sell it. Merced might have been good if it had been released two years ago like it was originally supposed to be, but it simply can't compare to other high-end processors. Plus, its die is so huge that it would most likely have been more expensive. And it currently doesn't have a single OS that will run on it! Not exactly a good buy.
So Intel decided that Merced will be nothing more than a proof of concept, as well as something they can give to developers and tell them to write programs for the next version (after all, the instruction set is the same). You won't be able to buy an IA-64 processor until McKinley (which, ironically, is being developed almost exclusively by HP -- so much for Intel leading a 64-bit revolution), and it will not be cheap. Even McKinley might not be that good, because it will have some tough competition from proven platforms by the time it comes out.
Basically, don't think x86 is finished. Intel's IA-64 line is not very impressive at the moment. Remember that they are no longer competing in the x86 market with these; they are competing with well-respected and proven designs in a market that has a lot more competition than Intel is used to. I do not think that the entire IA-64 line flopping is out of the question.
Re:Does it mean "bye, bye x86 machine code" ? (Score:1)
a fast way to do x86... (Score:1)
_
"Subtle mind control? Why do all these HTML buttons say 'Submit' ?"
Merced simulation using Linux Beowulf cluster? (Score:1)
Re:There are more chips than Alpha (Score:1)
Re:Cost? (Score:1)
personal computers or workstations. The first reason being cost. I suspect they will be shipping at around US$5000. The second is the fact that the IA32 emulator will be slower than a genuine P3, at least initially.
Re:Bugs (Score:1)
Re:what about Alpha? (Score:1)
'Course, that would pull enough watts to bake a pizza inna PC, but the idea of an 8-way array of 8MB K7s...
Re:why 64? I don't know... why 32? (Score:1)
Re:Does it mean "bye, bye x86 machine code" ? (Score:1)
DEC went from some old MIPS (I think) to Alpha.
IBM went from who knows what to PowerPC.
Apple went from 68000 to PowerPC.
All of these companies got to a point where they dropped everything and moved on. UltraSPARC is still very similar to the old SPARC V8, but all of these companies got to a point where they said these old '70s and '80s CPU design methods just don't apply anymore. Will Intel ever do that and get rid of the legacy patchwork?
Re:"tape-out" != "done" (Score:1)
It is ... (Score:1)
it just isn't going to do a great job at it,
so just recompile everything for it.
Wrong about the Alpha (Score:1)
Get with it!
Re:1 GHz in 2001 means....FAST GAMES! (Score:1)
Do we really want photorealistic Quake? I mean, if I saw a photorealistic head exploding, I think I'd be sick. Ick.
Intel's Marketing (Score:2)
Re:PPC all the way baby! Umm?? yes (Score:1)
UltraSPARC III (Score:1)
Re:what about Alpha? (Score:1)
Does it mean "bye, bye x86 machine code" ? (Score:1)
Does it also mean that it is a bad idea to buy an "old" PentiumIII processor ?
Oh really? (Score:1)
Re:Bugs (Score:1)
Only errata.
Don't hold your breath.... (Score:1)
You want 1 GHz? Look out for AMD, people... the K7/Athlon will be there by Y2K (OK, that's just my estimate). They are going to go hand-in-hand with Alpha. Intel's a great company, but they just got outclassed by AMD for the first time, and they won't lead the race again for about another two years at least. Alpha? Here's their chance to make a break for it, too.
Re:And what about the compilers? (Score:1)
I remember having read a while back that the compiler development was going very poorly.
Re:Micro$oft Pressure (Score:1)
repeat of the PPro? I.e., it runs 32-bit code, but not much faster than a true 32-bit processor?
Re:Intel's Marketing (Score:1)
alpha for the great chip it is? Wouldn't it be nice if the masses saw Apple machines for what they are (a lot easier to use)? Alpha and Apple seem to be very similar: total failure to market the products properly. As a CPQ shareholder I would be more than happy to see them finally take some initiative and really push for Alpha sales. Maybe this is finally the opportunity? While you may be right that the current users of 64-bit may have the applications they need now, wouldn't it be nice to also have the ability to run all the current 32-bit software (which will likely be redone for native 64-bit at some point)?
Re:Micro$oft Pressure (Score:1)
Fast Games my @**... (Score:1)
Like in the case of M$...give them a faster chip and they can add a few more million lines of code to their already bloated OS's...
(Of course Linux is the one exception to this statement..)
I am aware... (Score:1)
Again Linux is the exception....
Re:Does it mean "bye, bye x86 machine code" ? (Score:1)
No. No! NOOOOOOO!!!!!!
I realize there's a huge existing application (and talent) base in supporting legacy CPUs and OSs. But this has really got to stop, folks!
Can't Intel start with a clean slate for the CPU, sans all x86 baggage, and then provide a software solution for legacy apps? I never had the pleasure of using an Alpha box, but didn't they have something called FX!32 (or similar) for NT on Alpha which ran Win32 binaries? Did it work well?
Same damned thing with Microsoft. Each new OS carries tons of crap from the previous one.
Even MS's "32-bit" apps carry old "16-bit" junk around.
Once I made the following leap of logic, running NT Workstation 4 at home: Since I'll only be running Win32 apps, NT shouldn't need to create/support short filenames. I turned off the registry key for 8.3 filenames. I installed Office 97. And then... it just didn't work right. Various errors, complaints about not finding files, etc. convinced me that it just wasn't rid of Win16 baggage.
What a joke.
Re:400MHz Sparc (Score:1)
This is a very nice feature if you do support from home.