Microsoft

Bill Gates Talk From 1989 Surfaces 317

70sstar writes "A 1-1/2 hour recording of Bill Gates addressing a crowd of university students in 1989 was recently found and digitized, and has been circulating in some IRC channels for the past few weeks. The speech has found a permanent home on the web page of the University of Waterloo CS Club, where the talk is reported to have taken place. Gates covers the past, present, and future of computing as of 1989. While the former two might be of interest to tech historians, the real fascination is Gates's prediction of computing yet to come. Like the now-legendary '640k' remark, some of his comments are almost laughably off-target ('OS/2 is the way of the future!'). And yet, by and large, he had accurately, chillingly, prophesied an entire decade or two of software and hardware development. All in all, a fascinating talk from one of the most powerful speakers in CS and IT."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • OS/2... (Score:3, Interesting)

    by TheSHAD0W ( 258774 ) on Saturday March 24, 2007 @10:06PM (#18475257) Homepage
    You do know that the NT4 core is extremely similar to OS/2, and the only reason they diverged is because of a fight between IBM and MS?
    • by HarryCaul ( 25943 ) on Saturday March 24, 2007 @10:12PM (#18475305)

      Don't interfere with Bill-Bashing!
      • by timeOday ( 582209 ) on Saturday March 24, 2007 @10:33PM (#18475457)
        It is funny to hear it straight from Gates though. He owes almost his entire fortune to IBM's failure to deliver on OS/2, and (to be fair) Microsoft's successful delivery of DOS+Windows (crap that it was).
        • Re: (Score:3, Interesting)

          by ePhil_One ( 634771 )
          Actually, he owes it to Gary Kildall refusing to talk to IBM when they asked him to port his dominant OS to their new computer. Bill got into the OS market to save his contract with IBM for Basic on the new PC.
          • by timeOday ( 582209 ) on Sunday March 25, 2007 @12:50AM (#18476117)
            I agree, DOS (like Windows) could so easily have gone to a competitor instead. I guess it just shows how pivotal certain moments can be. IBM in particular made blunder after blunder, refusing to take the PC seriously. I guess their mainframes were doing just fine and they didn't want to open their eyes to the implications of Moore's Law - that $500 PCs would ultimately take most of the market for computing hardware. Just like all the others - Sun, Silicon Graphics, Cray, DEC...
            • Imagine... (Score:5, Insightful)

              by ReidMaynard ( 161608 ) on Sunday March 25, 2007 @07:01AM (#18477395) Homepage
              Imagine you have the only Mercedes-Benz dealership, every morning customers are lined up, check-books ready. Year after year. You are rich beyond imagination.

              Then one day this fellow shows up with a Vespa and says, "You should sell these Vespa scooters too.."

              What do you do..?
              • by kv9 ( 697238 ) on Sunday March 25, 2007 @10:16AM (#18478385) Homepage

                Then one day this fellow shows up with a Vespa and says, "You should sell these Vespa scooters too.." What do you do..?

                I repeatedly slam a car door against your head for using yet another computer/car analogy on Slashdot

            • by hey! ( 33014 ) on Sunday March 25, 2007 @11:11AM (#18478779) Homepage Journal
              I don't think you have the story quite right.

              IBM was a victim of its unintended success. The first generation of IBM PCs were crippled compared to what they could have been, in almost every way. They could have had a much better processor. They could even have run a real operating system. Instead it was low-rent all the way, outsourcing as much as they could, because they were making a cheapo product they expected to sell only moderately well. They built a computer that was inferior to the Apple II, which had been available for several years. Radio Shack had a 68000-based computer running Unix that was introduced around the same time. These could have been a serious threat, but IBM produced a toy computer, put it in a business-like case, and slapped the IBM logo on it.

              If you were working in those days (1981-1982), things started out as planned. IBM PCs were appearing on desks as a status symbol. There wasn't much useful you could do on them. Then in 1983 came Lotus 1-2-3, and suddenly all those PCs became very useful. In the same year, came the Compaq portable, the first 100% "IBM Compatible".

              The disruption of IBM's business came not from their misunderstanding the rate of technological change. They were attempting to slow the impact of change on their existing product lines by introducing a low end product of their own that was positioned low enough that it wouldn't hurt their existing product lines.

              This would have been a good strategy if they hadn't failed to anticipate the success of the product. They didn't even bother to get exclusive rights to DOS. By making a proprietary PC, they actually accelerated the penetration of microcomputer vendors into their customer base.

        • Well... (Score:5, Interesting)

          by Xenographic ( 557057 ) on Saturday March 24, 2007 @10:45PM (#18475529) Journal
          You have to admit that it's easier to predict the future when you're the one making it... :]

          That said, the places where he was wrong are more interesting to me. I wonder what Microsoft's business plan was had IBM taken over with OS/2 instead of them?
          • Re:Well... (Score:4, Insightful)

            by Bastian ( 66383 ) on Saturday March 24, 2007 @11:33PM (#18475751)
            That said, the places where he was wrong are more interesting to me. I wonder what Microsoft's business plan was had IBM taken over with OS/2 instead of them?

            It was to rake in (slightly less) dough selling OS/2.

            OS/2 was originally a joint Microsoft/IBM effort. What became Windows NT was originally going to be the next version of OS/2, but tensions between MS and IBM increased until Microsoft decided to take its ball and go home.

            So really, Bill Gates was 100% correct in saying that OS/2 is the wave of the future. It's just that in 1989 he didn't realize that it was going to be renamed "Windows NT" 3 or 4 years later. Had Microsoft instead decided to continue working with IBM, they would probably still have ended up being stinking rich, just a bit less so.
            • Re:Well... (Score:4, Informative)

              by Alien Being ( 18488 ) on Sunday March 25, 2007 @01:11AM (#18476219)
              OS/2 and NT are different animals.

              OS/2 was originally a joint Microsoft/IBM venture and was to replace Windows, but there were squabbles over the API definition which caused Microsoft to rethink the whole plan. By that time, the Windows(3.0) API had become a defacto standard and the world's most valuable computer technology.

              MS realized that abandoning Windows (and control of the API) was a huge mistake, so they didn't. They went ahead with OS/2, but kept Windows as their primary platform. They knew that they still needed a "real" OS to replace Windows' DOS underpinnings, so they started the NT project.

              Windows remained as the market standard and MS remained as the gatekeeper to the API. OS/2 customers who wanted to run/develop apps for the "standard" system would also need a Windows license. And perhaps even more important than their ability to sell licenses, is the fact that by controlling the API, they get a huge head start over the competition when it comes to designing developer tools and applications around that API.
            • Re: (Score:3, Funny)

              by xigxag ( 167441 )
              "I'm OS/2, and I used to be the next operating system of your PC."
        • by westlake ( 615356 ) on Saturday March 24, 2007 @11:40PM (#18475783)
          He owes almost his entire fortune to IBM's failure to deliver on OS/2, and (to be fair) Microsoft's successful delivery of DOS+Windows (crap that it was).

          Gates began programming at age thirteen; at age fourteen he is clearing $20,000 in his first partnership with Allen. Microsoft is founded in 1975. Microsoft is in Japan in 1978. In Europe in 1979. In 1980 Microsoft is young, hungry, and moving a hell of a lot faster than Kildall.

        • Re: (Score:3, Interesting)

          by Traa ( 158207 )
          Look, I love to hate Windows as much as anyone else (here on slashdot), but I happened to have worked on OS/2 drivers in the mid 90's and just thinking back on those makes me cringe. OS/2 was a pile of crap when it died. Anyone thinking that IBM was on the verge of launching a flawless operating system is smoking something significantly stronger than I ever have (and I'm from The Netherlands)
          • Re: (Score:2, Troll)

            by rtb61 ( 674572 )
            The only reason OS/2 died was because IBM was greedy and charged too much for it at the beginning of its life, hence the beginning became the end. Why did Lotus die? Because the Lotus eaters were living in their own little world and were charging more for it than M$ was charging for its whole office suite, and the same applies to WordPerfect.

            Bearing in mind that M$ software, services and support were far cheaper and of a much higher quality in those days. The manuals were excellent, tutorial disks were p

            • Re: (Score:3, Informative)

              The only reason OS/2 died was because IBM was greedy and charged too much for it at the beginning of its life, hence the beginning became the end.

              Wasn't it Microsoft that set the pricing of the SDK for OS/2 1.x, and wasn't OS/2 1.x mainly sold as a Microsoft product? Who set the high prices for OS/2 again?

              Remember that IBM, once it got hold of OS/2 and was able to release the 32-bit version as a product independently of Microsoft, was willing to sell OS/2 to Windows users for US$49 and to DOS users f [guidebookgallery.org]

              • by dryeo ( 100693 )
                Warp v3 for Windows (supply your own Win3.1) sold for $50 CDN around here for a while, probably '94 or so. Unfortunately they couldn't sell the blue box (which included Win 3.1) anywhere near as cheap, as MS wanted way too much for the Win 3.1 license.
              • by drsmithy ( 35869 )

                Windows NT 3.1 (Microsoft's first 32-bit offering) wasn't released until July 1993, over a year after OS/2 2.0.

                On the other hand, NT 3.1 was a vastly more complex and capable system than OS/2 2.0 (or any version since, for that matter).

    • by Anonymous Coward on Saturday March 24, 2007 @11:41PM (#18475791)
      It says a lot about /. these days. During the days of Olsen, he started a re-write of VMS. It had such luminaries as Cutler and Bell on the team. When the company was bleeding, Olsen killed off this project and others. When Gates got wind of this, he approached Cutler (and others such as Grey and Bell), and convinced him to join him. One of the bigger issues was that he promised the core to the VMS folks. He would control the API and above. They would control the core.
      And if that was not enough, back in '94 I even saw the code for NT (I worked at HP, and a neighboring group was asked to port it to the PA-RISC). I can tell you firsthand that it had NOTHING to do with OS/2. If you looked at it, you knew it was a DEC derivative. Even the comments said it all.
       
      So how did you get modded up?
      • Well, that explains why Windows' filename handling is so screwy. ;-)

        Yes, I'm aware that the actual codebases are different. A lot of the structure is identical, though. Another message pointed out that the APIs for OS/2 and NT4 are close to identical.
        • by Richard Steiner ( 1585 ) <rsteiner@visi.com> on Sunday March 25, 2007 @12:44AM (#18476095) Homepage Journal
          APIs are surface features which are (usually) made visible for applications to use, and they give very little indication of the nature or structure of the actual kernel code running underneath.

          OS/2 supports the POSIX API via EMXRT.DLL, for example, and yet OS/2's kernel has very little in common with, say, Linux or Solaris (which both also support POSIX programs).

          The 32-bit OS/2 kernel written by IBM for OS/2 2.0 and later and the Windows NT 4 kernel are quite different. Both Microsoft and IBM completely re-implemented their respective OS's kernels after the 16-bit OS/2 days, and the resulting software has very little relationship to the old 16-bit kernels except for support for the older 16-bit APIs. But as I said, that is simply a surface similarity.
          • by dryeo ( 100693 )

            OS/2 supports the POSIX API via EMXRT.DLL, for example, and yet OS/2's kernel has very little in common with, say, Linux or Solaris (which both also support POSIX programs).

            No, the OS/2 Toolkit supports quite a bit of POSIX functionality.
            From the OS/2 ver 4.5 toolkit C library reference (xpg4ref.inf)
            Many of the functions are defined by the following language standards:

            The American National Standards Institute C Standard and International Standards Organization, ANSI/ISO 9899-1990[1992], and the amendment ISO/IEC 9899:1990/Amendment 1:1993(E)
            The ISO/IEC 9945-1:1990/IEEE POSIX 1003.1-1990 standard
            The X/Open Common Applications Environment Specification, System Interf

      Your explanation makes more sense. I think NT's similarity to VMS was a point of contention in a lawsuit too; it was obvious enough that there were tech articles comparing the name and function of the standard system services.
      • Your grandparent didn't really say anything that controversial. Viewed from an early 1990s point of view, Windows NT and OS/2 did have a lot in common in that they were 32-bit, protected memory, pre-emptive multithreaded operating systems with a Windows API. Windows NT was supposed to be a continuation of the OS/2 line of operating systems. Yes, you are right that it was VMS-inspired and the team had influential VMS refugees. But anyhow, Bill Gates didn't predict the split with IBM and therefore didn't know
    • Cutler hated OS/2 with white hot, foaming at the mouth hatred that only Cutler is capable of. He even tried pretty hard to fight Gates' requirement that NT runs OS/2 as a subsystem (alongside POSIX and Windows).
    • Re:OS/2... (Score:5, Informative)

      by TheNetAvenger ( 624455 ) on Sunday March 25, 2007 @01:02AM (#18476189)
      You do know that the NT4 core is extremely similar to OS/2

      Actually as an OS Engineer that has spent time working with and tearing both apart, they are very much night and day.

      You would have more success selling the idea that OS/2 is the same as BSD.

      Here are a couple of things to get you started, and I could point out a few inaccuracies in each of these, but for the most part they will send you down the right path:

      http://en.wikipedia.org/wiki/OS/2 [wikipedia.org]
      http://en.wikipedia.org/wiki/Architecture_of_Windows_NT [wikipedia.org]

      Now where you are partially correct. NT started out in the OS/2 3.0 development stages, but by the time MS and IBM split, NT was a start from scratch OS as Dave Cutler thought the OS/2 codebase was horrible.

      MS even looked at using *nix concepts in the early days of NT, since it was being written from the ground up, which is why MS held on to Xenix at the time, in case that was the direction the NT team wanted to take NT or base it on.

      However the NT team felt the *nix architecture concepts were too limited and instead decided to take the best OS theories at the time and see if they could truly make a new OS technology.

      I get so tired of kids today confusing simple things and I see this crap on here all the time. NT is not VMS, NT was not OS/2, NT and Win95 are not related other than the Win32 subsystem, WinXP does not contain Win9x code, etc etc...

      No wonder people think Windows is more of a joke than it already is, if I saw it as a hybrid and hodgepodge of Win9x and OS/2 and NT I would think it was an insane code base too; however, it is not.

      It is easy to poke fun at Windows, but when you find real OS engineers, the NT architecture/kernel isn't quite so funny and gets quite a bit of respect even if they hate the Win32 subsystem.
      • by dryeo ( 100693 )
        You are right as far as the kernel is concerned. But remember the first version of NT was OS/2 NT ver 3 and the next was WIN NT ver 3.1. Quite confusing.
        Also the DOSCALL1.DLL in NT was quite similar to the DOSCALL1.DLL in OS/2 ver 1.3 and I'd imagine that it is similar when comparing the WIN32 DLLs between WIN9x and NT.
        Naturally this does cause some confusion.
        • Re: (Score:3, Informative)

          Also the DOSCALL1.DLL in NT was quite similar to the DOSCALL1.DLL in OS/2 ver 1.3 and I'd imagine that it is similar when comparing the WIN32 DLLs between WIN9x and NT.
          Naturally this does cause some confusion.


          Ya, it does cause confusion, but this was like 15 years ago and people still don't seem to understand, even though people from the timeframe when all this was happening were 'quite' aware.

          As for the DLLs these are subsystem DLLs, not NT DLLs. I understand that in the SlashDot world, NT is a bit weird and t
    • by drsmithy ( 35869 )

      You do know that the NT4 core is extremely similar to OS/2, [...]

      You do know that you're completely wrong ?

  • by Salvance ( 1014001 ) * on Saturday March 24, 2007 @10:10PM (#18475291) Homepage Journal
    To the computer enthusiasts of the time, it would have been even more laughable had Bill Gates said "in the next two decades, Microsoft software will completely destroy OS/2, will render Apple a shell of its former self by stealing all its innovations, and will demand 1 GB of RAM." So even if he had his world domination plans set in 1989, he couldn't exactly let the world know without being laughed at.
    • Usually that phrase would apply to a company that once had a major percentage of a market and holds it no longer. The Mac never had a big piece of the market and I'll bet that the Apple II had a much larger market share than the Mac has ever enjoyed.
    • by dryeo ( 100693 )
      Sad thing is that if OS/2 had won the OS wars we'd probably just be running MS OS/2 XP, and upgrading to OS/2 Vista and needing 2 GBs of RAM.
  • Transcript? (Score:4, Interesting)

    by rgo ( 986711 ) on Saturday March 24, 2007 @10:13PM (#18475311)
    Is there a transcript anywhere? Or at least a summary? I don't have the time to listen to an hour and a half mp3.
    • Re:Transcript? (Score:5, Informative)

      by Strudelkugel ( 594414 ) * on Sunday March 25, 2007 @02:15AM (#18476515)

      I don't have the time to listen to an hour and a half mp3

      Crude index:

      • 28:00 Developer teams
      • 36:00 Mouse
      • 50:00 Unix
      • 52:40 Mac
      • 56:00 PARC people
      • 57:00 Mac GUI/Microsoft developers
      • 63:00 Third standard
      • 66:30 Networks
      • 71:50 Lotus/Excel competition
      • 75:00 "World Net"
      • 76:50 Multimedia
      • 79:40 Utility of the CD (Thanks music industry!)
      • 87:00 Learn from competitors
      • 87:50 Hypertext

      Actually Gates was quite insightful. He clearly understood what was important for the evolution of the personal computer, but didn't quite manage to have Microsoft dominate all of it, fortunately. When he discussed Unix in one section, and the importance of networks in another, he never mentioned anything about security, which is an important element of Unix design. Later he mentions the "World Net", but of course did not anticipate HTTP and browsers. This makes his comments about hypertext all the more interesting; he correctly states that massive amounts of typeless links would overwhelm the user. The significance of search, among other things, eluded his thinking at the time. Gates' discussion of a third standard is interesting to ponder in view of OSS, which could be considered the answer to his question about what other approach might gain traction. Overall his prognostications were quite correct. If he is as astute today as he was then with regard to humanitarian issues, his health initiatives should do a lot of good.

    People put like 5 dollar requests on mturk.com to transcribe podcasts, should you really need it.
  • But (Score:2, Interesting)

    by Centurix ( 249778 )
    I really do only need 640k. As long as I can play Scramble on my Vic 20 I'll be happy for life.
  • 640k remark (Score:4, Informative)

    by badasscat ( 563442 ) <basscadet75@NOspAm.yahoo.com> on Saturday March 24, 2007 @10:16PM (#18475327)
    Like the now-legendary '640k' remark

    A better description would have been the "mythical '640k' remark", because he never said it [tafkac.org].

    Nobody can ever cite a source for this alleged quote, and in the absence of such a source, you have to take his word for it. It's impossible to prove a negative; that's how urban legends start in the first place.

    (If he did say it, don't you think someone would have figured out the where and when?)

    • Re:640k remark (Score:5, Informative)

      by Andareed ( 990785 ) * on Saturday March 24, 2007 @10:18PM (#18475347)
      The exact 640k quote from the talk: "So that's a 1 MB address space. And in that original design I took the upper 384k and decided that a certain amount should be for video memory, a certain amount for the ROM and I/O, and that left 640k for general purpose memory. And that leads to today's situation where people talk about the 640k memory barrier; the limit of how much memory you can put in these machines. I have to say that in 1981, making those decisions, I felt like I was providing enough freedom for 10 years. That is, a move from 64k to 640k felt like something that would last a great deal of time. Well, it didn't - it took only about 6 years before people started to see that as a real problem."
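      The arithmetic behind the barrier is easy to check. A quick sketch (the 384k/640k split is the conventional real-mode memory map, with the region contents as described in the quote):

      ```python
      # The 8088's 20-bit addresses give a 1 MB (1024 KiB) address space.
      address_space_kib = 1024

      # The upper region was reserved for video memory, ROM, and I/O;
      # conventionally 384 KiB, leaving the rest for programs.
      reserved_kib = 384
      conventional_kib = address_space_kib - reserved_kib

      print(conventional_kib)        # 640 -- the "640k barrier"
      print(conventional_kib // 64)  # 10x the 64k of the previous generation
      ```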
      • Re: (Score:3, Insightful)

        by Ford Prefect ( 8777 )

        "I have to say that in 1981, making those decisions, I felt like I was providing enough freedom for 10 years. . That is, a move from 64k to 640k felt like something that would last a great deal of time. Well, it didn't - it took about only 6 years before people started to see that as a real problem."

        ... Then you have the Motorola 68000 [wikipedia.org], designed in the late 1970s and used in home computers in the mid 1980s - capable of addressing a whopping 16MB of memory, and using a flat 32-bit address space in case of f

        • Re: (Score:3, Insightful)

          by westlake ( 615356 )
          Then you have the Motorola 68000, designed in the late 1970s and used in home computers in the mid 1980s - capable of addressing a whopping 16MB of memory

          and the street price for 16 MB of RAM in 1980 would have been...what, exactly?


          • and the street price for 16 MB of RAM in 1980 would have been...what, exactly?

            Probably about the same cost as filling up the address space on a modern 64-bit AMD...

            Part of the reason the 68000 remains so popular (embedded controllers, etc) is because it was designed intelligently: flat address space, big endian, useful instruction set. A lot like the TMS9900, but Motorola marketed it better.

          • by DrYak ( 748999 ) on Sunday March 25, 2007 @06:50AM (#18477351) Homepage

            Then you have the Motorola 68000, designed in the late 1970s and used in home computers in the mid 1980s - capable of addressing a whopping 16MB of memory

            and the street price for 16 MB of RAM in 1980 would have been...what, exactly?


            It's this kind of lack of foresight that made the whole x86 architecture crappy.
            The question is not only what is realistic to do now and what would not be possible to buy/build.

            The question is: if this architecture hangs around for the next couple of decades, what will you be happy to have taken into account? What could be useful for future generations of machines?

            The 68k was designed on purpose to have a clean architecture that could easily evolve in future machines without needing hacks (32 bits internal, even if the first versions had a 16-bit bus; flat memory addressing, etc.).

            The x86 has been a long series of very short-sighted choices (because nobody thought it could last) - like the "640k ought to be enough for everyone" (it was back then, it wasn't any more a couple of years later) or the awkward instruction set - and subsequent hacks to circumvent the limitations (the whole segmentation logic is a pain in the ass). Not to mention all the legacy modes that current chips still drag around (your Core 2 is still binary compatible with 8088 code and assembly compatible with 8080 code). Intel tried to restart with something completely new and supposedly better with the Itanium, but it failed, mainly because of all this legacy. AMD was somewhat more successful with AMD64 (because it both has a nice new clean x86-64 extension and support for all the awkward legacy).

            It's only sad that the x86 was chosen for the IBM PC, a computer whose architecture was subsequently opened and copied by numerous clones that IBM chose to tolerate, which made this architecture popular and made it evolve very quickly.
            Whereas the 68k regularly ended up in very nice machines (Amiga, Macintosh, etc.) whose parent companies never agreed to open them up. And thus it remained less popular (because of higher prices and less development by 3rd parties).

            At least the 68k had much more success in video games (consoles and arcades; the MegaDrive and NeoGeo, if I have to cite only two).
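            The segmentation logic called a pain above can be illustrated in a few lines: real-mode x86 builds a 20-bit physical address from a 16-bit segment and a 16-bit offset (the constants below are the textbook video-memory example, not from the talk):

            ```python
            def physical(segment: int, offset: int) -> int:
                """Real-mode address formation: segment * 16 + offset, wrapping at 1 MiB."""
                return ((segment << 4) + offset) & 0xFFFFF

            # Many segment:offset pairs alias the same byte, one source of the hacks:
            assert physical(0xB800, 0x0000) == physical(0xB000, 0x8000) == 0xB8000

            # The highest address reachable without tricks is the top of the 1 MiB space:
            assert physical(0xFFFF, 0x000F) == 0xFFFFF
            ```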
    • Re:640k remark (Score:4, Informative)

      by edwardpickman ( 965122 ) on Saturday March 24, 2007 @10:37PM (#18475485)
      Probably adding fuel to the fire was the fact that the memory limitations held for so long. I've always been into graphics and animation, and the early memory issues were a major hassle. Even today shortsightedness about memory has been a major hassle for Windows. Win 2000 had a 2 gig cap and XP had a 4 gig. With the average person being able to afford 4 gigs of RAM, and graphics people needing all the RAM they can get, it's bizarre with cheap RAM to have such limitations. Vista is an improvement, but there is a major system RAM charge to get it, and there's still a cap that will soon be reached. He may not say Win 2000 users will never need more than 2 gigs of RAM, but it's the way the company approaches it. Back when the Amiga was around it always ran circles around Windows machines for memory. I always loved the fact that a lot of components came with extra RAM slots. The Amiga 3000 had a RAM limit in the range of modern machines, and that was 17 years ago. He may not have declared 640k was enough, but he's hardly a visionary where memory is concerned.
      • Re: (Score:3, Informative)

        by Psychotria ( 953670 )
        I'm not sure if it was so much a software (OS) limit rather than a hardware one. I.e. 2^32 is where the 4GB address space (limit) came from, not because MS decided to be mean (for once). Sure, there are ways to get around the hardware "limitation" in software but there was probably not much incentive at the time to do so.
          • I remember the early PCs were configured for up to 512K of RAM. There were add-on boards to map in the extra RAM up to 640K (plus a real-time clock and ports). So the 640K limit, however natural it was, didn't match up with the way RAM actually developed.

            rd
        • by Nimey ( 114278 )
          There's a hardware way that's been around for ages as well. x86 chips since at least the P3 have been able to address 64 GB of RAM (36-bit physical addressing), essentially by paging. If XP can't see 8 GB, that's a software limitation.
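          The ceiling that 36-bit physical addressing (PAE) gives is simple arithmetic, a quick check rather than a statement about any particular Windows edition's policy:

          ```python
          # 36-bit physical addresses -> 2^36 bytes of addressable RAM.
          pae_limit_bytes = 2 ** 36
          print(pae_limit_bytes // 2 ** 30)  # 64 GiB, versus 4 GiB for plain 32-bit
          ```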
      • More fuel to the fire is the endless number of arbitrary limitations MS uses all over the place. 32(?) GB limit for a FAT32 partition? 64k row/256 column limit in Excel? Not technical limitations. Just reasons to have to upgrade to the newest versions.

        I've always found that quote particularly funny as no matter what, I've run into one or more of these issues with every version of Windows yet. Not a single one of them actually a real technological limitation, just some arbitrary number chosen for some arcane
        • by drix ( 4602 )
          Most of the limitations you and parent poster refer to are not arbitrary but rather have a technical explanation. 4gb RAM is due to 32-bit hardware, FAT32 has an 8-tebibyte volume limit using 28 bits of a 32 bit pointer (the 32gb "limitation" is just a bug [microsoft.com] in Win XP setup), and I'm fairly confident the 65536 rows in Excel is due to the rows being addressed using a 16-bit unsigned integer. Sure they could bump it up but if you're using Excel to hold a spreadsheet and a single sheet needs more than 2^16 rows,
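          Those widths can be checked in a couple of lines; a sketch of the arithmetic behind the limits the comment above describes (28 usable bits of the FAT32 cluster number with 32 KiB clusters, and 16-bit Excel row numbers):

          ```python
          # FAT32: 2^28 addressable clusters, each at most 32 KiB.
          fat32_max_bytes = 2 ** 28 * 32 * 1024
          print(fat32_max_bytes == 8 * 2 ** 40)  # True -- the 8 TiB volume ceiling

          # Pre-2007 Excel: rows indexed by a 16-bit unsigned integer.
          print(2 ** 16)  # 65536 rows per sheet
          ```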
      • Re: (Score:3, Informative)

        by swillden ( 191260 ) *

        Win 2000 had a 2 gig cap and XP had a 4 gig.

        Incorrect, XP can only manage 3 GiB of RAM. You can install 4 GiB, but you'll have one unused. Supposedly Vista supports 4 GiB. 32-bit Linux also appears to be limited to a little less than 4 GiB, unless you build with the HIGHMEM64 option, in which case it will support up to 64 GiB.[*]

        Contrast this with the foresight shown for IBM's System/38, which featured 128-bit pointers at its introduction in 1978. With such a huge address space, the entire system storage was mapped into virtual memory, effect

        • by drsmithy ( 35869 )

          Incorrect, XP can only manage 3 GiB of RAM. You can install 4 GiB, but you'll have one unused. Supposedly Vista supports 4 GiB. 32-bit Linux also appears to be limited to a little less than 4 GiB, unless you build with the HIGHMEM64 option, in which case it will support up to 64 GiB.[*]

          This isn't the OS's fault, it's the hardware, which is "stealing" address space for devices. You get the same problem in Linux for the same reason.

          I believe you can even load an XP machine up with more than 4GB if you wan
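          The "stolen" address space works out as plain subtraction; the reservation size below is hypothetical, purely to illustrate why a 32-bit OS ends up reporting roughly 3 GiB usable:

          ```python
          GIB = 2 ** 30

          # 32-bit physical addressing: RAM and devices alike must share
          # the same 4 GiB address space.
          address_space = 4 * GIB

          # Assumed total of device reservations mapped below 4 GiB
          # (video aperture, PCI windows, ROMs) -- illustrative only.
          mmio_reserved = 1 * GIB

          usable_ram = address_space - mmio_reserved
          print(usable_ram // GIB)  # 3 -- close to what 32-bit XP typically shows
          ```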

          • Of course, if you've got 64 bit hardware and a 64 bit OS, all these weird problems disappear - so if you have that option, take it.

            I think that's what I'll do. Unless I want to reinstall everything to switch to a 64-bit userspace, that means that I can't use nVidia's driver, but the Free nv driver will probably work just fine for my needs. I'm typing this on the system in question, with a 32-bit userspace and 64-bit OS. As I mentioned in another post, I'm still "missing" about 140 MiB of RAM, but I suspect that's also because of some strangeness the hardware is doing. Perhaps there's a BIOS update.

        • that should still leave me with 4,063,232 KiB, so I'm "missing" 824,392 KiB, or about 805 MiB.

          A followup to this: I have installed a 64-bit kernel and reduced the memory allocated to video to a paltry 16MB (which is probably just fine for 2D-only stuff), but I'm still not getting use of all of my RAM. It's a lot closer, but some is missing. I have 4*1024*1024 = 4,194,304 KiB, of which 16*1024 = 16,384 KiB is video RAM, which should leave 4,177,920 KiB usable. However, Linux reports MemTotal=4,033,040 KiB, which is 144,880 KiB less, meaning I have about 141 MiB of RAM that isn't being used.

          I'

      • by drsmithy ( 35869 )

        Win 2000 had a 2 gig cap and XP had a 4 gig.

        Windows 2000 and XP (and even NT 4.0, for that matter) all support 4GB of physical RAM. 64 bit versions of XP support 128GB.

        With the average person being able to afford 4 gig of ram, and graphics people needing all the ram they can get, it's bizarre with cheap ram to have such limitations.

        Hardly. 4GB is a lot of RAM, even today, and most consumer-level hardware currently out there is physically incapable of having any more than that installed (heck, most machi

  • by Aokubidaikon ( 942336 ) on Saturday March 24, 2007 @10:16PM (#18475331) Homepage
    Most geeks' dress sense hasn't changed much since 1989 ;)
  • Predict the future (Score:5, Insightful)

    by imunfair ( 877689 ) on Saturday March 24, 2007 @10:19PM (#18475357) Homepage
    What kind of business are you in?

    We predict the future. The best way to predict the future... is to invent it.

    -X-Files
    • by samkass ( 174571 )
      Or to buy out/steal from those that do, or even suppress those that invent something contrary to the "predictions" you've bet on.

    • Re: (Score:3, Funny)

      by The Zon ( 969911 )

      The best way to predict the future... is to invent it.
      I have prior art on the future. Also, a time machine.
  • by Anonymous Coward on Saturday March 24, 2007 @10:20PM (#18475365)
    ...for Duke Nukem Forever.
  • 30 minutes (Score:4, Funny)

    by Anonymous Coward on Saturday March 24, 2007 @10:20PM (#18475367)
    1-1/2 hour = 30 minutes

    Oh wait...
  • by Ninja Programmer ( 145252 ) on Saturday March 24, 2007 @10:48PM (#18475549) Homepage
    I was a fledgling member of the CSC at Waterloo, and I recognize the members in the photos they showed. I also remember attending this talk with a front row seat. I was sort of unimpressed because he didn't discuss anything that was new or that I didn't already know about.
    • Re: (Score:2, Offtopic)

      by Moridineas ( 213502 )
      I was going to reply, but then I saw who you are; I got some great usage out of your website a number of years ago. As a fledgling programmer way back, it was very helpful and a great resource--just wanted to say thanks!!
  • Predictions (Score:4, Interesting)

    by yuriyg ( 926419 ) on Saturday March 24, 2007 @11:00PM (#18475599)

    And yet, by and large, he had accurately, chillingly, prophesied an entire decade or two of software and hardware development.
    Shouldn't be all that surprising, since he more or less controlled the direction of desktop software development in the 90's. I would assume he just stated his vision of the future of software, and that vision was implemented.
  • Nice to know that CS geeks can't spell 'seamless.'

    In all seriousness, it sounds interesting, but I don't have 90 minutes to listen to someone talk. Anyone know if transcriptions are being worked on?

    And why would they even bother to make a .WAV available? This is a 20-something geek talking, not the London Symphony Orchestra.
  • by Burlador ( 1048862 ) on Saturday March 24, 2007 @11:54PM (#18475843)
    From Chip Magazin 1/1990 (my re-translation from German):

    "I'm thinking about handwriting recognition. In two or three years, we may have computers without keyboards. In five or six years this will change, and voice recognition will reduce the importance of graphics."

    "In five or six years, DOS [sales] will be overtaken by OS/2."

    Then he said he personally uses "a Mac II, a Compaq and an IBM" computer, as well as an "NEC UltraLite".
  • by Anonymous Coward on Sunday March 25, 2007 @12:05AM (#18475895)
    " And yet, by and large, he had accurately, chillingly, prophesied an entire decade or two of software and hardware development."

    Yeah? Gee, if he was once such a savant, what happened between then and his 1995 book "The Road Ahead" where he totally fails to "predict" the Internet and World Wide Web when it had already happened?

    Sorry, but reciting some corollary to Moore's Law does not count as accurate prophecy, 'chilling' or otherwise. It's just conventional wisdom.
    • Re: (Score:3, Informative)

      by laffer1 ( 701823 )
      Funny you should bring up The Road Ahead. It's interesting to compare the differences between the first and second editions of that book. The Internet "exists" in the second version.
    • Re: (Score:3, Insightful)

      by kjs3 ( 601225 )
      Spot on. In addition, if Linus or some such had made a couple of conventional-wisdom statements that stood up a decade down the line, they would be called "brilliant" or "insightful"; since it's BG, it's "chilling".
  • by suv4x4 ( 956391 ) on Sunday March 25, 2007 @01:33AM (#18476315)
    He's the richest dude in the world and his OS is on almost every PC in the world, but let's laugh that he predicted something wrong in 1989! Hahaha, that totally evens things out.

    Gee, I feel better now.
  • by steveha ( 103154 ) on Sunday March 25, 2007 @02:04AM (#18476475) Homepage
    I'm top-posting this instead of replying to individual posts because there are just so many posts with one conspiracy theory or another. Microsoft tricked IBM into taking OS/2, Microsoft made Office 95 break OS/2, etc. etc.

    I worked at Microsoft from 1990 to 1996, and during part of that time I worked on Microsoft Word. And I'm here to tell you: Microsoft really believed in OS/2, back in the day. They really thought it would be the future.

    In 1990, I got an OS/2 machine on my desk, as did the other folks around me, because we all knew OS/2 was the future. The MS library had OS/2 machines for looking up books (and as far as I remember, the MS library had only OS/2 machines). And all the major MS apps were shipped for OS/2: Word, Excel, etc. (But they were also shipped for Windows. MS covered all the bets.)

    Now, I was only a lowly developer, not a strategy architect, and I never ate lunch with Bill Gates, so it's possible there was some amazing subterfuge going on without me knowing. But I don't believe it.

    Here is my summary of what happened, based on what I saw then, and on various articles I read in PC Week, Infoworld, etc.

    Microsoft started developing Windows back in the 80's. The early Windows was a laughingstock in the industry: it was a primitive toy. Apple seriously jump-started their GUI efforts by building a closed platform and tailoring their GUI specifically for that platform; Microsoft was hobbled by the suckiness of the 8088 and awful graphics adapters like the CGA card. MS actually tried to get Windows to run on that sort of pathetic hardware. Windows 1.0 did run but no one wanted it.

    MS doesn't give up easily. They kept plugging away at Windows, and it started to suck less, as the machines got more powerful. Also, IBM and Microsoft decided to cooperate on a new OS: OS/2.

    Microsoft wanted to make OS/2 as compatible as possible with Windows, to make it easy to port applications. IBM wanted to make OS/2 "better" than Windows. (My memory is dim here, I don't remember specifically why it was better to be incompatible with Windows. Compatible with some graphics API that IBM already had?) So now, the plan was to sell Windows only until OS/2 conquered the world. But the Windows guys kept plugging away on Windows, even as the OS/2 guys did their thing.

    Around the time I was hired, Microsoft and IBM were telling customers that basically if you have lame hardware, go ahead and run Windows on it, but if you have good hardware, you want OS/2 because that is the future. (IIRC the decision point was: if you have less than two megabytes of RAM, run Windows.)

    Then, in 1990, Microsoft shipped Windows 3.0... and everyone, including Microsoft, was stunned by how well it sold. It flew off the shelves. Egghead (at the time, a successful brick-and-mortar chain of computer stores) sent trucks with ice cream over to Microsoft; along with everyone else, I had a free ice cream bar to celebrate the success of Windows 3.0.

    The key feature was actually that it ran DOS apps very well. You could have multiple DOS shells open at the same time, and it would multitask them well (the DOS sessions were even pre-emptively multitasked, though Windows apps themselves were still cooperatively multitasked at the time!). You could even have a DOS app crash, and your other DOS apps would keep running just fine. Compare with the "compatibility box" in OS/2, which geeks usually called the "Chernobyl Box" because a misbehaving DOS app could take down your whole machine. The Chernobyl Box could only run a single DOS app at a time.

    Why? Why was Windows 3.0 better than OS/2? Because at the time OS/2 was written only to support the 286, and even if you ran it on a 386 it would just run in 286 mode. Windows 3.0 would only do the cool DOS app multitasking if you ran it on a 386. My understanding is that IBM promised, early on, that OS/2 would run great on a 286; and IBM felt it was seriously important to keep that promise. With hindsight, I
    • by mh101 ( 620659 ) on Sunday March 25, 2007 @02:44AM (#18476593)

      Steve Ballmer made a great speech at a company meeting. He said that Microsoft had been sending a mixed message to customers: if you have this kind of hardware buy Windows, if you have that kind of hardware buy OS/2. He said that from now on, there would be a new message: "Windows! Windows! Windows!!!" He shouted himself hoarse saying it.

      I guess that was practice for his "Developers developers developers developers" speech.

    • by drsmithy ( 35869 )

      The key feature was actually that it ran DOS apps very well. You could have multiple DOS shells open at the same time, and it would multitask them well (pre-emptive multitasking, even though Windows itself used round-robin multitasking for Windows apps at the time!). You could even have a DOS app crash, and your other DOS apps would keep running just fine.

      You could actually do this with Windows 2.1/386, ca. 1988 (although, unsurprisingly, not many people are probably aware of that).

      My understanding is t

    • Windows 3.0 was, at the time, prettier than OS/2, friendlier than OS/2, nimbler than OS/2, ran on smaller configurations than OS/2, was more compatible than OS/2... and shipped with about a dozen nice little applets like Windows Write that OS/2 didn't ship with. ToolBook, too, if I remember correctly.

      The applets are, for me, the proof. If Microsoft believed OS/2 was the future, why couldn't it spare a few developers to put some of the trimmings on it that would make it appeal to non-corporate users?

      Microsoft
