AMD's 64-bit Plot

ceebABC writes "In a long interview with eWEEK, AMD's CEO Hector Ruiz talks about struggling to compete with Intel, but more importantly about their upcoming 64-bit processors. He says that AMD's 64-bit chips will be priced comparably to the 32-bit ones, and backwards compatible. He also thinks there will be a market for desktop 64-bit systems. Skip to the last page for the most interesting stuff."
This discussion has been archived. No new comments can be posted.

  • Hmm (Score:3, Interesting)

    by zapfie ( 560589 ) on Monday December 02, 2002 @06:22PM (#4796889)
    Is there really much consumer (not business) application for 64-bit processors? If so, where would desktop computing benefit?
    • Re:Hmm (Score:4, Funny)

      by nmaeone ( 596348 ) on Monday December 02, 2002 @06:25PM (#4796909)
      Well, 64 bits made Mario come to life. Maybe Micro$oft will make a fully 3-D rendered MSN Butterfly to help you with your daily tasks?
    • Re:Hmm (Score:3, Interesting)

      by rodgerd ( 402 )
      Desktop video is one that springs to mind - lotsa memory.

      I can buy PC133 @ US$60 per half gig. For US$500, I can fill the address space of the 32 bit processor, yet a non-trivial home movie could occupy more than 4 GB in uncompressed form.
      • Re:Hmm (Score:3, Insightful)

        by timeOday ( 582209 )
        Besides, remember those pointers address *virtual* memory. I can easily imagine wanting to mmap a DVD or big database even if I didn't have that much RAM.
    • I think you'll be surprised at where 64-bit CPUs may become useful to consumers.

      The first place where this will be useful is video editing. With the proliferation of MiniDV camcorders that have IEEE-1394 connections to desktop computers, many camcorder users are downloading video onto their computers for editing and creating home-made VideoCD or DVD-R discs. With 64-bit CPU processing we can now see the development of much more sophisticated (yet easier to use) programs that make video editing and VideoCD or DVD-R disc creation almost a snap.

      The second place this is useful is still image editing. With the proliferation of digital still cameras with USB ports people are doing more and more image processing of still images before printing out the pictures. With 64-bit CPU processing we can see image-editing tools that can do image processing that is far more sophisticated than what even Photoshop 7.0 can do today, yet would be easier to use than ever.

      The final place is games. 64-bit processing makes it possible to do extremely sophisticated graphics effects in real time without over-reliance on an expensive high-end graphics card; a lot of games that need fast motion with complex backgrounds could benefit from going to 64-bit CPU processing.
  • by m0i ( 192134 ) on Monday December 02, 2002 @06:23PM (#4796894) Homepage
    Will it be faster than the 32-bit offerings? For almost anyone out there, it's the only factor when buying a CPU: speed! Addressing >4GB of memory is not what worries me first :)
    • by FearUncertaintyDoubt ( 578295 ) on Monday December 02, 2002 @06:29PM (#4796942)
      For almost anyone out there, it's the only factor when buying a CPU: speed! Addressing >4GB of memory is not what worries me first :)

      For desktops, you are right. However, a huge part of the 64-bit market is in servers, and the possibility of >4GB memory is a Big Thing. My SQL Servers will eat that much for breakfast.

    • by Junks Jerzey ( 54586 ) on Monday December 02, 2002 @06:43PM (#4797054)
      For almost anyone out there, it's the only factor when buying a CPU: speed!

      Nope. These days it's price. You can barely, oh so barely, tell the difference between 866MHz and 2.4GHz, and only then when running certain high-end games or 3D modelling packages. Now go over to Dell's site and price a 2.4GHz system. You can easily get something with 256MB and no monitor for US$800. Now upgrade to a 3.06GHz P4. How much does that 27% increase in clock speed cost you? Just over US$1000. And what does it get you? Remembering that clock speed does not translate directly into CPU performance, maybe you're getting a 20% across-the-board improvement, but _man_ are you paying for it, both in cost and power consumption. And was it worth it, for 27% faster than "more speed than I know what to do with"? Probably not (though I realize that all hardware site weenies will absolutely insist that they can feel the difference when browsing the web on such a machine).
      • And what does it get you? [...] Maybe you're getting a 20% across the board improvement, but _man_ are you paying for it, both in cost and power consumption. And was it worth it, for 27% faster than "more speed than I know what to do with?" Probably not (though I realize that all hardware site weenies will absolutely insist that they can feel the difference when browsing the web on such a machine).

        Tubes versus solid-state...
        Beta versus VHS...
        Vinyl records versus CDs...
        Air-cooled versus water-cooled...
      • You can easily get a 2 GHz system with 256 MB RAM *and* a monitor for less than $600.
      • Yes, the main factor when buying a new CPU at the moment is price.

        But you DO notice the difference between an 866 MHz processor and a 2.4 GHz one in many ways. One of them is the time it takes for the computer to boot. But there are several other tasks that become much faster as you go up in frequency... also remember that a 2.4 GHz processor has DDR whereas an 866 MHz one probably won't (I haven't heard of 866s with DDR, although I may be wrong). Hopefully another factor that will show you a nice speed increase in the future is the new Hyper-Threading tech in the 3.06 GHz Intel CPU.

        The computer's overall speed is increased, and yes, you will notice the big difference when it comes to playing games, using programs like Pro Tools or doing Graphics, but that doesn't mean the rest isn't changed at all.

        I have a K7 850 and an Athlon 1400 DDR and hell, do I notice the difference? Of course I do...

        Decameron
      • by tswinzig ( 210999 ) on Monday December 02, 2002 @07:42PM (#4797489) Journal
        You can barely, oh so barely, tell the difference between 866MHz and 2.4GHZ, and only then when running certain high-end games or 3D modelling packages.

        Sorry. Wrong. I went from a 1GHz Athlon to an 1850MHz Athlon XP. I use Windows XP. Programs opened faster. And when you're talking about Mozilla, or Office, or Photoshop, or Dreamweaver, or anything more complicated than Notepad, really, you DO notice this. Especially when you're opening and closing programs all day long.

        When I come across a webpage designed with complex tables and CSS elements, the speed improvement is noticeable (e.g. my banking website, which I frequent, is complex and now renders much faster).

        You can never have enough speed. You will always notice a difference, eventually, because the more power that becomes available, the more complex things become that we use frequently.

        And believe it or not, but many people like to play new games. Not just "gamers." Regular people, too. My dad can barely turn around in Quake, but he loves wandering around in god mode and shooting things. He wants to play Doom3 when it comes out. He will need new hardware.

        I'm just sick of this lame argument that people aren't interested in new processors because they can't tell the difference between 800Mhz and 2Ghz. Bullshit. They might be able to LIVE with the difference in speed, especially if money is tight, but you can never have "too much" speed.
        • by Dave_bsr ( 520621 ) <slaphappysal@hotmail.com> on Monday December 02, 2002 @08:31PM (#4797831) Homepage Journal
          Dave_bsr's Law: Governing "when computers will be fast enough":

          Computers will be fast enough when, for every conceivable operation, the delay between a user request and the proper system response is less than the human ability to resolve it, i.e., instantaneous.

          Not instantaneous as in .05 seconds (button-click speed now), no, I mean _instantaneous_, like when I push on my door and I see movement. When I write on paper and _instantly_ there is writing. Then computers will be fast enough. And I don't just mean speed for Mozilla, I mean processing real-time 3D bump-mapped, whooseyourdaddy environments. Yeah.
          • I think the sort of latency you see from current "realtime" audio systems is about right to meet my demands, <2ms! When we can have our computer provide a simulated environment at the complexity of modern physics, in which we can shoot at our friends driving F1 cars while impregnating our girlfriend, all modeled indistinguishably from real life (could we ever model the creation of a human life meaningfully?). That should keep us going for a few years, but it will give scientists an incredible toy!
            With up to 10^81 (2^273) atoms in the universe, and then what level of subatomic detail? 512-bit seems about right to me. Pity Moore's Law suggests we might have to wait till around 2674 till /. is covered with questions over whether 512-bit really is enough, with processor speeds of 3 PetaHz (that's 3 × 10^15 Hz). Who would want a Beowulf cluster of those? Who's going to supply the power!
            Unfortunately, anyone who reads this today will be lucky to see a nice 128-bit computer in 96 years' time at 1.5 MegaPetaHz :-(
    • by Anonymous Coward
      will it be faster than 32 bit offerings?

      Yes it will, due to the larger register file of x86-64. Epic ported UT2K3 to x86-64 and said they saw a 15% performance increase vs. the IA-32 version running on the same CPU.

  • Wow (Score:3, Insightful)

    by r0xah ( 625882 ) on Monday December 02, 2002 @06:25PM (#4796906)
    Yes, 64-bit CPUs for desktops will soon be the next new thing, but who really needs them? Grandma and grampa checking their email won't need something that fast, and even the normal computer user will never experience such CPU-intensive work as to need a larger word size. Trust me, I'm not saying I won't be one of the first people to run out and get one, but there really is no need for the general public to have 64-bit processors.
    • Re:Wow (Score:4, Funny)

      by scotch ( 102596 ) on Monday December 02, 2002 @06:30PM (#4796954) Homepage
      Grandma and grampa checking their email won't need something that fast ....

      Grandma and grandpa could check their email on a 16-bit computer. Don't forget grandpa's geri-porn, you need some horsepower for that.

    • by nomadicGeek ( 453231 ) on Monday December 02, 2002 @06:43PM (#4797052)
      At first they will be expensive, then they will be in the $599 desktops. Why wouldn't you use them?
    • by jki ( 624756 )
      . Trust me I am not saying I won't be one of the first people to run out and get one, but there really is no need for the general public to have 64 bit processors

      Now, when you are a rich and famous IT star, you will regret saying that like this guy does :)

      Microsoft has not changed any of its plans for Windows. It is obvious that we will not include things like threads and preemptive multitasking in Windows. By the time we added that, you would have OS/2.
      -- Bill Gates, from "OS/2 Notebook", Microsoft Press, (c) 1990

    • Re:Wow (Score:5, Insightful)

      by dingleberrie ( 545813 ) on Monday December 02, 2002 @06:46PM (#4797080)
      It's not about what consumer needs 64 bit for today's applications... it's tomorrow's applications. First there must be a base of users out there.

      Do you remember the opportunity brought about by the 386? Who needed that when all the modern applications ran fine on the 286? The 386 even broke some of the old 286 code. But it was still very useful to programmers, who could focus on quality (and bloat?) rather than worrying about how to confine data to 64K blocks. Almost 20 years later we are still benefiting from the flat memory model that finally came to x86 (flat up to 4 GB, that is).

      If you have to ask the question of who needs it, then it's not you... yet. Sure, the first adopters are the corporate people who know they need it, as well as the "look what I have" crowd. But I'm pretty sure there will be consumer applications that make 64 bits necessary once there are enough consumers who have them.

      640 TB should be enough for anybody.
      • IBM didn't think anyone needed 386 systems. Compaq decided that they'd sell them instead. IBM's control of the PC died that day.
        • Re:Wow (Score:4, Informative)

          by TheAncientHacker ( 222131 ) <TheAncientHacker&hotmail,com> on Monday December 02, 2002 @07:28PM (#4797389)
          Actually, IBM was pretty damn sure that people needed 80386 systems. What they were also just as sure about was that an 80386-based PC would cannibalize sales of their System/36 systems. The folks up in Rochester, Minnesota (where the System/36 and later AS/400 come from) went to Armonk (IBM headquarters) and had the IBM Executive Committee block the 80386-based PC.

          The industry stalled for a while because NOBODY had introduced anything for the PC compatible industry that wasn't a clone of IBM's systems or peripherals until then. Finally, Compaq risked the company with the DeskPro 386 and IBM was in serious trouble.
    • Re:Wow (Score:5, Insightful)

      by Anonvmous Coward ( 589068 ) on Monday December 02, 2002 @07:06PM (#4797229)
      "Grandma and grampa checking their email won't need something that fast and even the normal computer user will never experience such CPU intensive work to need a larger word size."

      That's a bit of a narrow minded view, don'tcha think? Consider this: We don't know what we'll be doing with computers 2-3 years from now. If it turns out that PVRs are a killer App, for example, then suddenly 64-bit processors are interesting.

      The "who really needs it for the most basic stuff?" argument is extremely tired. Lots of people buy their machines based on their potential, not what they can do with them today. Don't believe me? Then look at all the people who bought an Xbox solely because of its chipset and hard drive. They were (and are today) expecting to eventually buy games that blow them away.

      If computers were strictly used for their most basic features (internet browsing, email, etc...) then 'internet appliances' would have been some sort of hit as opposed to the flop that they are. So please, put this 'how do I get my grandma to buy one?' argument to rest. The answer is she won't. But there is still a large market of people who do want/need 64-bit processing. You don't need for grandma to want one in order for the product to be a success.
    • Re:Wow (Score:3, Insightful)

      by Arthur Dent ( 76567 )
      Yes 64 bit CPU's for desktops will soon be the next new thing, but who really needs them? Grandma and grampa checking their email won't need something that fast and even the normal computer user will never experience such CPU intensive work to need a larger word size.

      You're forgetting something: What if Grandpa and Grandma want to view that shiny video email of their grandkids? And what if they want to play movie director in their copious free time and compose a video email themselves?

      After all, today's crop of digital cameras already record mpg clips (about six seconds worth before the CF card fills up), but it won't be long before flash ram gets even cheaper and we start seeing 4/8 GB cards.

      Once the processors are available, applications will be written to take advantage of the larger word sizes. There's no way to tell what will happen.

  • Heat. (Score:3, Interesting)

    by Anonymous Coward on Monday December 02, 2002 @06:26PM (#4796918)
    Will the processors run cooler than the current 32 bit offerings from AMD?

    As much as I love AMD, my box is far too loud, and I'm too damned cheap to shell out another $100 for decently quiet fans.
    • Will the processors run cooler than the current 32 bit offerings from AMD?

      I don't think anyone has a definitive answer for that question. However, you have to remember that the Athlon is an older part which is nearing the end of its life... Intel faced the same situation with the Pentium III beyond 1 GHz.

      Silicon-On-Insulator (SOI) technology, which will debut with Opteron/Clawhammer, is supposed to reduce heat by around 15%.

    • The new Thoroughbred Revision B Athlons (XP 2400+ and higher) made a significant drop in power consumption (1.65V core), while the 3GHz P4 guzzles more electrons than any Athlon (have you seen the heatsink Intel bundles with that thing?!). The Hammer series uses Silicon-On-Insulator technology to keep power consumption (heat) down, to the point that the larger Hammer core consumes about the same amount of power as the TBred RevB. AMD is gunning for the high-density rackmount market with the Opteron where efficient power use is critical. They'll get it too.

      I have a dual CPU Athlon 2400+ box, 2GHz each, using Thermalright SLK800 heatsinks and 80mm adjustable fans set to 2500RPM. My temps are 41C/43C/42C (case/CPU1/CPU2) at the moment with about 25% CPU utilization. Power consumption (as measured by my UPS load monitor) is the same as the dual Athlon 1800+ chips (1.53GHz) the new CPUs replaced.
  • Big Bets on Table (Score:5, Insightful)

    by 4of12 ( 97621 ) on Monday December 02, 2002 @06:27PM (#4796932) Homepage Journal

    Both Intel and AMD have been betting big on 64 bit computing and it will be interesting to see how this plays out.

    Itanium 1 was a flop. Itanium 2 has respectable performance, but is not IA-32 backward compatible, where AMD x86-64 is backward compatible.

    I will bet that backward compatibility will tilt the balance to Opteron and that Intel will scramble to introduce a new chip Yamhill(?) designed to provide the backward compatibility that IA64 lacks.

    • The Itanium 2 has silicon dedicated to x86 emulation, which has been redesigned to be faster than the original Itanium. If this is fast enough, then it shouldn't be an advantage for AMD right?

      Once AMD and Intel have 64-bit processors that are affordable and faster than their 32-bit products, I imagine apps will be optimized for both x86 and IA64 architectures. This could be by using separate binaries compiled for each, or just by writing for Java or .NET and JIT'ing to the host architecture. At this point, x86 emulation will only be used for legacy apps, so it doesn't have to be as fast as IA64 code.

      I'm not sure how the emulation works though. Does the CPU have to switch modes using a lengthy switching time, or does the emulator just pick up x86 instructions and translate them to IA64 instructions?
      • or does the emulator just pick up x86 instructions and translate them to IA64 instructions?

        As I understand it, AMD's 64-Bit processors actually have hardware for supporting the previous 32-Bit instructions. I could be misunderstanding, but if I'm not this will naturally mean that with 32-Bit instructions the AMD chip will outperform Intel's emulation.

        Intel is banking heavily on people finally ditching x86 for good. There are good reasons for people to ditch x86, but there is one good reason to keep it: Legacy Support. How important that is will depend on the person and their needs.
  • Didn't AMD announce that they were no longer going to compete for the desktop CPU? And now they say that there *IS* a market for 64-bit CPUs on the desktop!!! Well, are they, or are they not competing then?

    I'm confused!
    • by DjMd ( 541962 ) on Monday December 02, 2002 @06:37PM (#4797007) Journal
      I love that everyone read that story [slashdot.org] and thought it meant that they were leaving the desktop market, when it really said that they were going to diversify outside of the desktop market, as in do more in addition to their desktop market...

      (a quote from first paragraph of the Forbes article [forbes.com] "[a] strategy of developing processors for a wider range of products outside computers ...")

      • I think the problem was that no one actually read the story... just the sensationalist Slashdot article.

        All the article said was that AMD saw the ridiculous waste of time in simply jacking up the speed of processors continually... We're up to 3GHz now... and what actually requires that? Not much... so why not spend the time building COOLER chips that can be cooled in a QUIETER way... in fact, why not ship your chips with a QUIET fan, like really QUIET (why am I shouting the word QUIET? Oh yeah, so I can be heard over my AMD with its noisy FAN!)...

        Cooler... damn that would be nice... my media server, sitting in my entertainment cabinet... pumps out a lot of heat... it's ridiculous really... I got a relatively lowly Duron 1GHz and it's pouring the heat out.

        Surely, now that they're up at 3GHz... rather than screaming towards 4GHz like mad things, why don't they work on making the 2GHz and lower cooler?
  • Benchmark's (Score:4, Informative)

    by Anonymous Coward on Monday December 02, 2002 @06:31PM (#4796962)
    Here are some benchmarks for the Opteron.

    http://www.aceshardware.com/
  • OK people, I know some of you are trying to be humorous, but really, the 64 bits is the size of the registers and how much data the processor handles at once. Which means that at 64 bits the processor can (theoretically) process twice as much data per second as a 32-bit processor. Which also means it can handle any unsigned number up to 2^64 - 1.
    • This isn't totally correct.

      "64bit" refers to the size of the instruction word, not "how much data the processor handles at once". That is a function of pipelining, ALUs, branch prediction, etc. This can be proved by a recompile of a 32bit application with 64bit flags. The application won't be "magically" twice as fast.

      There is something else... a 64bit app may even be *slower* as the cache can only hold half the number of words, given an equal cache size. Cache misses are a huge performance hit these days, as RAM is much slower than Cache RAM.

      Of course the big difference between AMD and IBM is that the new 64bit PPC970 doesn't take a performance hit switching between 32 and 64bit applications. This has more to do with the PPC ISA than anything in the processor.

      The only thing that 64 bits will give "normal" users is the ability to address a *huge* amount of LOGICAL memory. In most cases it doesn't make sense to make 64-bit versions of applications, due to the above cache issue. Also note that users will need more RAM for 64-bit applications, as it will be needed to store the larger word size.

  • by Znonymous Coward ( 615009 ) on Monday December 02, 2002 @06:31PM (#4796965) Journal
    "First of all, I have no indication that Apple is even considering what we (AMD) make."

    What a shame for both Apple and AMD.

    Especially since Apple has AMD support built in [apple.com]

  • by Anonymous Coward
    64 bits is useful for databases accessing gigantic datafiles and other I/O intensive operations. High performance computing loves moving 64 bit values around. Ever do a "file" on /sbin, /usr/bin, /usr/sbin on Solaris? 32 bits all the way. The only reason I downloaded a trial copy of Sun's C compiler was to compile lsof for Solaris 9, which, since it talks to file structures, needs to be 64 bit.

    I think having 64-bit Linux without buying a SPARC, RS6000 or PA-RISC box will be huge for the enterprise. The rest of us will wonder why our apps still suck.
  • by peripatetic_bum ( 211859 ) on Monday December 02, 2002 @06:45PM (#4797068) Homepage Journal
    Just wondering: if Linux already runs on 64-bit, which I think it does, and I have not heard of Microsoft having anything ready for this market, does this mean that just as gamers buying games pushed the video card (and, in my opinion, the OS) market, we will see Linux increasingly adopted since it runs 64-bit and MS does not?
    Just a question.
    Thanks for the replies
    • by JKR ( 198165 ) on Monday December 02, 2002 @06:50PM (#4797101)
      ...and I have not heard of microsoft having anything ready for this market

      MS have been quietly getting ready for 64 bit for at least 2 years; they've been shipping a 64 bit SDK on my MSDN disks for over a year. There are 64 bit NVidia drivers for WinXP-64. What makes you think MS isn't already there?

      • MS have been quietly getting ready for 64 bit for at least 2 years; they've been shipping a 64 bit SDK on my MSDN disks for over a year. There are 64 bit NVidia drivers for WinXP-64. What makes you think MS isn't already there?

        Spare me the smoke and vapor. Don't you remember the sad story of Mica, errr, NT on Alpha [winntmag.com]? Loudly proclaimed, quietly killed; that's why I think they are not there. If you consider the number of bugs and holes in 32-bit M$ work, you might conclude they never arrived anywhere.

        In the meantime, you can get Linux and BSD on Alpha and other 64-bit platforms:

        • Debian [debian.org]
        • Red Hat [compaq.com]
        • BSDs, free [freebsd.org], net [netbsd.org], open [openbsd.org]
        • and plenty of others less well known.

        Oh, it hurts so much to remember and think!

    • by KPU ( 118762 )
      Check the Windows XP 64 bit edition website [microsoft.com]. I hate to burst your bubble, but microsoft knows what it's doing.
    • Windows XP and Office XP run [amdzone.com] on the opteron. There was also an article on slashdot a while back about it. Linux ports are definitely in progress [x86-64.org], but I don't know if there's anything solid running on 64 bit x86 machines right now.
    • What does it get the consumer:
      New apps (as in killer apps)? No.
      New OS features (by going 64bit)? None.
      Speed? Somewhat.

      Since when did a little bit more speed make Linux a killer app? Also, consider that if there is a market, Windows will most certainly put out a 64-bit version.

      Kjella
  • These plots are 32 times as good as the 2-bit plots in Hollywood movies.

    Good thing it's backwards compatible or all the studios would have to upgrade their writers too.
  • The article says virtually nothing about similarities to IA-64. Perhaps what I'm asking is: can anybody compare and contrast the two architectures? Is there a certain advantage to one or the other?
    • Re:The article (Score:5, Insightful)

      by puppetman ( 131489 ) on Monday December 02, 2002 @07:27PM (#4797378) Homepage
      Perhaps, what I'm asking is, can anybody compare and contrast the two architectures; is there a certain advantage to one or the other?

      Yah - AMD will offer it to the consumer combined with motherboards from tier-1 manufacturers like Asus, Abit, IWill, Tyan, and so forth, all at an attractive price (read: the same price as the Athlon XP CPUs).

      Intel, on the other hand, will keep their 64 bit CPUs out of the consumer hands by pricing them above what most consumers are willing to pay, thus reaping a premium on them by selling them in servers through Dell and IBM (making even more money on cases and motherboards). There will be limited support for the CPU outside Intel's own motherboard offerings, and if you run with a hard-drive, video card, CD-Rom that has not been explicitly approved by Intel, then forget support (we've had this problem with Intel on some of their server motherboards).

      Intel is taking the Cathedral approach, and AMD a Bazaar approach [tuxedo.org].
  • by Bio ( 18940 ) on Monday December 02, 2002 @06:54PM (#4797136) Homepage Journal
    I'm amazed to read the discussion of whether or not 64-bit will succeed over 32-bit processors.

    This is 10 years after DEC introduced the Alpha architecture (in spring 1992).

    The Alpha was fun to work with, not only because of its 64-bit architecture, but because of the clean orthogonal instruction set and its outstanding performance.

    Rest in peace ...
    • by gordon_schumway ( 154192 ) on Monday December 02, 2002 @07:44PM (#4797503)
      DEAD PERSON: I'm not dead! [hp.com]
      CART MASTER: What?
      CUSTOMER: Nothing. Here's your ninepence.
      DEAD PERSON: I'm not dead!
      CART MASTER: 'Ere. He says he's not dead!
      CUSTOMER: Yes, he is.
      DEAD PERSON: I'm not!
      CART MASTER: He isn't?
      CUSTOMER: Well, he will be soon. He's very ill.
      DEAD PERSON: I'm getting better!
      CUSTOMER: No, you're not. You'll be stone dead in a moment.
      CART MASTER: Oh, I can't take him like that. It's against regulations.
      DEAD PERSON: I don't want to go on the cart!
      CUSTOMER: Oh, don't be such a baby.
      CART MASTER: I can't take him.
      DEAD PERSON: I feel fine!
      CUSTOMER: Well, do us a favour.
      CART MASTER: I can't.
      CUSTOMER: Well, can you hang around a couple of minutes? He won't be long.
      CART MASTER: No, I've got to go to the Robinsons'. They've lost nine today.
      CUSTOMER: Well, when's your next round?
      CART MASTER: Thursday.
      DEAD PERSON: I think I'll go for a walk.
      CUSTOMER: You're not fooling anyone, you know. Look. Isn't there something you can do?
      DEAD PERSON: [singing] I feel happy. I feel happy. [whop]
      CUSTOMER: Ah, thanks very much.
      CART MASTER: Not at all. See you on Thursday.
      CUSTOMER: Right. All right.
    • One thing to remember is that the Alpha was not the first 64-bit processor. Before it were HP's PA-RISC in 1986, U. of Tokyo's TRON design in 1987, and DEC's MIPS R4000 in 1991. Sun/Fujitsu moved the SPARC to 64 bits in late 1992, and IBM was late when it moved the POWER in 1995. So 64-bit processors were neither unheard-of nor new in 1992.
  • by justanumber ( 621952 ) on Monday December 02, 2002 @06:55PM (#4797150)
    No real benefit will come until genuine 64-bit apps hit the consumer market. This will be a steep learning curve for most developers, who have only ever known 16- or 32-bit programming.

    The problems to be hurdled are:

    1) Reliance on the fact that size of pointer is equal to size of int.

    2) Reliance on a particular byte order in the machine word.

    3) Using type long and presuming that it always has the same size as int.

    4) Alignment of stack variables.

    5) Different alignment rules in structures and classes.

    6) Pointer arithmetic.

    A lot of engineering (and developer re-education) work also needs to be put into not only these issues, but also designing the application so that it is actually getting the most out of each clock cycle.

    • No real benefit will come until geniune 64-bit apps hit the consumer market.
      Given that the Hammer will offer more 32-bit performance per dollar than the current Athlon parts (and, for that matter, Intel parts), I would say that there's at least some benefit even if there are NO 64-bit apps.

      But the advantage of Hammer is that you don't need to migrate ALL of your apps to 64-bit to get a serious performance benefit. With the IA-64, the performance of 32-bit applications is terrible, so it's a poor choice unless most of your software is 64-bit.

    • Yes, but several development houses (including, primarily Microsoft) have had experience porting to other 64 bit architectures. So the problems are known and addressable.
      • The problems to be hurdled are:
        1) Reliance on the fact that size of pointer is equal to size of int.
      True. But this is really a bad coding practice that the C/C++ standard doesn't endorse anyhow. Java developers need not be concerned.
      • 2) Reliance on a particular byte order in the machine word.
      It's little endian, like it has always been on Intel. What's the problem? Little endian has the natural LSB/W/D property that makes increasing the natural word size typically a non-issue (unlike big endian, where it is a serious PITA.)
      • 3) Using type long and presuming that it always has the same size as int.
      It is the same. You need to use "long long", or "__int64", or something like that to move to 64 bits in your C/C++ compiler. AMD's new "long mode" actually defaults to 32-bit data sizes, and only uses 64 bits when specifically overridden to do so. That's the whole point of the x86 architecture -- it supports a variety of data sizes as a consequence of its long history of backward compatibility, not just one (like a typical RISC.)
      • 4) Alignment of stack variables.
        5) Different alignment rules in structures and classes.
      Same as it has always been. (64 bit integers already exist today in known common ABIs/compilers, in case you were unaware.)
      • 6) Pointer arithmetic.
      Eh? You can add/subtract integers to pointers, and subtract two pointers from each other. How does moving to 64 bits change anything?
  • AMD is puny (Score:5, Funny)

    by Ed Avis ( 5917 ) <ed@membled.com> on Monday December 02, 2002 @06:58PM (#4797177) Homepage
    From the interview:
    We really can't control whether we'll go to war with Iraq, and all that sort of thing.
    And that, my friends, is the difference between AMD and Intel.
  • by SkulkCU ( 137480 ) on Monday December 02, 2002 @07:01PM (#4797199) Homepage Journal

    AMD's 64-bit chips will be comparatively priced to the 32-bit ones

    So, they're going to be twice as much?

    heh.
  • by bogie ( 31020 ) on Monday December 02, 2002 @07:03PM (#4797208) Journal
    No this is the interesting stuff

    "eWEEK: What does it mean to you personally, though, when a Gateway or an IBM not just stop, but announce that they'll no longer be offering AMD as an option?

    Ruiz: I think it's terrible, obviously. It's terrible. I think if you were to talk with Ted Waitt at Gateway, and ask him, "Why'd you do that?" and if he would really tell you why, it's a question of he's being bribed to do it. Now, he's got to look out for his own hide and the company that's probably in great difficulty has got to listen to the huge amounts of money that can help him do that.

    But you know what I find amazing, think about the power, is that despite all that, which obviously we really get emotional about the fact that somebody like Gateway gets bribed into doing that, is that despite that, according to Dataquest last week, we're still holding a 19 percent share of the market. That to me tells me we're in the throes of breaking this open"

    Hey Intel, see you in court! Of course, now that Intel along with Microsoft is backing a group to outlaw opensource in the government, I think it's time for the opensource community to boycott Intel. Why should our money go to a company which is now attempting to hurt Linux and opensource? Because of these recent actions, I will NEVER buy Intel ever again!
  • Remember the Alpha (Score:3, Interesting)

    by chunkwhite86 ( 593696 ) on Monday December 02, 2002 @07:06PM (#4797228)
    I see many posts here wondering about porting Linux to 64 bit...

    Remember the Alpha? 64 bit goodness all the way. Has been running Linux for years.

    And for those old enough to remember... Microsoft did support Win NT on the Alpha just a few years ago.

    As far as the software goes, both Linux and Microsoft are ready for 64 bit computing.
    • NT ran in a crippled 32-bit mode on the Alpha. It did not support 64-bit applications.
    • They still do (Score:3, Interesting)

      Talk to the right people and you'll find a beta of Win2k Server for the Alpha CPU. At the time neither Intel nor AMD had a ready-for-prime-time 64-bit CPU, and MS needed to keep a working 64-bit codebase.

  • by Inoshiro ( 71693 ) on Monday December 02, 2002 @07:06PM (#4797230) Homepage
    2^32 addressing is obsolete already -- it cannot keep up. Most enthusiasts have a gig of RAM (or more) in their DESKTOP PCs. In 2005, most of them will have hit the 4gb limit. In 2009, most consumer PCs will have hit the same limit. Servers have already hit this limit. That's why there are special instructions (a return to segmented memory access) on P3 and P4 processors, allowing up to 64gb of RAM in 4gb segments to be addressed. If you remember doing DOS programming (I do), you know why 64 bits is good, while 32-bit segmented access isn't.

    2^32 addressing limits addressable HD space to 2 terabytes. "2 terabytes? But that's way larger than even enthusiasts use in their PCs, despite their larger than average needs." This ignores the fact that many companies have storage arrays that are already at 2 terabytes. Some work went into the 2.5 Linux kernel to increase the number of blocks that could be addressed by moving internally to 64 bits. Storage needs are always increasing. If we're hitting 2tb today, isn't it a good thing that we're moving to a larger number of bits?

    2^64 addressing is not the only benefit of the change. FPUs see additional benefit when they have more bits. More bits means more precision; this is very important and desirable, especially when working with numbers that have fractional components. For proper 3D rendering, physics models, and anything else that involves computing numbers that have fractional parts, more is better. When the FPU can handle a double in one clock cycle because it works natively on 64-bit IEEE floating point numbers, you will notice a performance boost in addition to the increased accuracy.

    64-bit word operations means that databuses can be slower, since each clock-tick sends more data. 64-bits means you can do more, more flexibly, with your computer.

    There will always be people who resist change, even when there is no reason to resist it. The same people are posting comments on Slashdot about how 32 bits is enough, and how happy they are with 32-bit applications. These are the same people who had to be carried, kicking and screaming, from their 286s to the new 386 and 486 machines which had 32-bit addressing and data operations. Don't let these people hold back your exploration of new technology!

    For those of you who are saying, "what about 64 bits? Will 64 bits be enough?" 2^64 is 32 orders of magnitude bigger than 2^32. 2^32 is roughly 4.5 billion (unsigned). 2^64 unsigned is 18,446,744,073,709,551,616, or roughly 2220 * 8309 trillion. 4.5 billion goes into that number 4.5 billion times. 2^64 is certainly enough for at least a hundred years :)
    • by Rob.Mathers ( 527086 ) on Monday December 02, 2002 @07:21PM (#4797339) Homepage
      "2^64 is certainly enough for at least a hundred years"

      Famous last words?
    • by Christopher Thomas ( 11717 ) on Monday December 02, 2002 @07:24PM (#4797358)
      2^64 addressing is not the only benefit of the change. FPUs see additional benefit when they have more bits. More bits means more precission; this is very important and desirable, especially when working with numbers that have fractional components. For proper 3D rendering, physics models, and anything else that involves computing numbers that have fractional parts, more is better. When the FPU can handle a double in one clock cycle because it works natively on 64-bit IEEE floating point numbers, you will notice a performance boost in addition to the increased accuracy.

      Um, all current x86s already handle 64-bit IEEE double-precision floats natively (actually more like 80 bits, for "extended double-precision"). The FP register file has been this wide for quite a while.

      There will be no performance or precision boost for floating-point math from moving the rest of the chip to 64-bit registers/datapaths.
    • A nit. Orders of magnitude is generally thought of in the decimal realm. Thus 2^64 which is a 20 digit number is only 10 orders of magnitude greater than 2^32 (a 10 digit number).

      I wouldn't be too sure about the 100 years part either. But it ought to be good for at least 10.
    • Servers have already hit this limit. That's why there are special instructions (a return to segmented memory access) on P3 and P4 processors, allowing up to 64gb of RAM in 4gb segments to be addressed.
      Bzzt. The feature you're describing is known as PAE, for physical address extension. It doesn't work via "real mode" style DOS segmentation. Each program's virtual address space is still 4GB, and pointers are still a flat 32 bits. PAE simply changes the hardware page table structure so the 4GB "window" of your virtual address space can look out onto more than 4GBs of physical memory. Even though no one process can access more memory than before, you can run multiple, 4GB processes on a single machine.

      Miraculously, someone at Intel stowed the x86 crackpipe, preventing some sort of segmented/overlay nightmare like the one you describe.
  • A major advantage, especially to the Open Source community of 64 bits on the desktop is software development.

    Remember, many (most?) open source developers are private individuals and not huge corporations. Allowing individual open source developers to own an affordable 64 bit desktop machine will allow them to more effectively develop and debug the code that runs on the 64 bit servers.

    It only seems natural that a developer, given a 64 bit system to develop and debug code on, is going to produce better 64 bit code. And we all want Linux (and the BSD's!) to be the best 64 bit platform it can be, right??
  • Alas, the memory... (Score:2, Interesting)

    by NerveGas ( 168686 )
    I've been eagerly waiting for Hammer ever since its announcement. High-bandwidth interconnects, 8-way SMP support, and AMD's incredibly high IPC all team up for a chip that sounds like a winner.

    However, each chip is only going to get a single DDR333 memory path. With all of this time and effort, and so much at stake for AMD, you'd think that they'd make sure that they did it right, and move to a dual-channel solution, or at the very least, a DDR400 solution - which will be a pretty standard offering when the Opteron/Hammer/Athlon64/Whatever is released.

    Sure, it'll perform pretty well with a single channel of DDR333. But I'll bet it would perform MUCH better with more bandwidth. And compared to all of the design and development that they've already done, implementing a dual-channel memory controller really wouldn't have been any significant challenge.

    So, I'm not nearly as optimistic. On the other hand, I'm not a skeptic yet. When they come out, I'll see how they perform. But I'm certainly not as excited as I used to be.

    steve
  • Are there any sites that talk about how to get a 64 bit AMD system going? How expensive are they?
  • 32 bits != 4 gig max (Score:5, Informative)

    by cartman ( 18204 ) on Monday December 02, 2002 @08:51PM (#4797945)
    32 bit architectures are not limited to 4 gigabytes of memory. "32 bit processor" refers to the width of the DATA bus (and registers). It does not refer to the width of the address bus.

    For example, the z80 and 6502 were 8-bit processors, but they supported more than 256 bytes of RAM (2^8 bytes). The 68000 and 80286 were 16-bit processors, but they supported more than 64k of RAM (2^16 bytes). That's because the 8-bit processors had 16-bit address busses, and the 16-bit processors often had 24-bit address busses.

    The current pentium-4 Xeon chip supports 64 gig of RAM, despite being a 32-bit processor.

    64-bit computing means that you can hold a 64-bit quantity (long int or double) in a register. Also, you can load, store, or perform arithmetic on such quantities using one instruction and often in one clock cycle.

    This offers very few benefits for the end consumer. Mostly it's about perception: consumers will perceive that a 64-bit chip is twice as good as a 32-bit one.
    • Mostly, I agree. In fact, I spend lots of time writing software for 8 and 16 bit machines, and I spend half that time turning single bits on and off.

      One thing I'd like to point out, though: I've noticed that an awful lot of mathematics is being done using doubles (i.e., 64-bit floats) these days. It's partially laziness, but it's also really the case that 32-bit IEEE floats only give you 24 bits of accuracy. Doing math with doubles really cuts down on roundoff errors, so a lot of people switch to doubles and forget about it.

    • by Bert64 ( 520050 ) <bert@[ ]shdot.fi ... m ['sla' in gap]> on Monday December 02, 2002 @11:55PM (#4798891) Homepage
      AFAIK the 68000 was a 32bit processor, with 24bit address bus and 16bit external bus. The later 68020 increased everything to 32bit.
      However, the p4 actually has a 32-bit address bus, with hacks to address a 36-bit space, but that's what it is... a hack; the extra address space is not directly available to apps. There is also likely to be a performance hit when using these hacks.
    • by rweir ( 96112 )
      The current pentium-4 Xeon chip supports 64 gig of RAM, despite being a 32-bit processor.

      Yes, that's true, but it's horribly hacky. Addressing your RAM in 4gb segments? It's enough to make any old-skool DOS coder cry.
  • by ppetrakis ( 51087 ) <peter.petrakis@gmail.com> on Monday December 02, 2002 @10:04PM (#4798362) Homepage
    I don't have to read the article. I've been working with Alphas all my life. There is nothing for 99.9% of the applications you use every day that could benefit from running in a 64-bit address space. Unless you get a significant performance boost from the move (like Alpha in its heyday), it isn't worth the effort.

    If you find you need that sort of mega addressing, the chances are the app you need already runs on 64-bit Solaris. After that point it's up to the vendor (think Avanti Corp/Apollo) whether it's worth their while. Remember, you need their application. Unless your app is home grown or you have some significant pull with a vendor, the port isn't going to happen.

    The desktop is an afterthought. This chip was designed to be sold in quantities of 8 and higher in single large servers. Once they cut into that market, the economies of scale just happen to make it cheap enough for the desktop market to pick it up. They have a much better chance at getting it down with their built-in backwards compatibility and keeping costs down. Alpha never hit that "sweet spot" for the volume to really bring down the price.

    Now, don't think Intel is going to sit on its hands while AMD eats their lunch. They're more likely to drop an Itanium instruction decoder into an Alpha EV7 core and push that than follow with an x86-64 processor line. Itanium is just too big and costs too much at this stage of development to make inroads fast enough to stop AMD from gaining marketshare but, more importantly, mindshare. Intel would never take up x86-64; doing so admits defeat to the industry, i.e. you're not the leader anymore.

    So to sum it up, Intel will either:

    1. release Itanium and we all find out it isn't as slow as everyone claims it is or as expensive
    2. See Alpha/Itanium hybrid core above
    3. They bring back Alpha (maybe not by name) and put it under a modern process. Expect at least 1.5x current clock speeds, and Alphas can milk Rambus for all it's worth

    2 and 3 are much more likely than 1. You know which one I'd rather see happen :).

    Either way it'll be a boon for the OS community and certainly make our (the Alpha community's) lives easier. The way I see it, even if Hammer is only moderately successful, you guys will 'clean' most of the popular source code out there to be 64-bit clean, reducing our maintenance work by something like 80%. The only things we'll have to worry about are firmware, toolchain, libc, X Windows, and the kernel. So please buy a *hammer and learn the joys of porting to 64 bits. If it proves too painful, please see the ld manpage for the "-taso" flag :).

    Peter

    • I don't have to read the article. I've been working with Alphas all my life. There is nothing for 99.9% of the applications you use everyday that could benefit from running itself in a 64 bit address space.

      After looking at your title and seeing no relation to your first paragraph, I knew I wouldn't have to read the rest of your post and still know exactly where you went wrong. The "market" for a computer is not necessarily defined by what new applications it can offer. It can be defined by Joe Average coming home to his house carrying a huge box, telling his wife, "You've gotta check this out. It's got sixty-four bits!"

      (Chances are he's never worked with a "computer" in his life, and thinks he'll have to assemble all 64 pieces manually.)
