Microsoft

Bill Gates Talk From 1989 Surfaces

70sstar writes "A 1-1/2 hour recording of Bill Gates addressing a crowd of university students in 1989 was recently found and digitized, and has been circulating in some IRC channels for the past few weeks. The speech has found a permanent home on the web page of the University of Waterloo CS Club, where the talk is reported to have taken place. Gates covers the past, present, and future of computing as of 1989. While the former two might be of interest to tech historians, the real fascination is Gates's prediction of computing yet to come. Like the now-legendary '640k' remark, some of his comments are almost laughably off-target ('OS/2 is the way of the future!'). And yet, by and large, he had accurately, chillingly, prophesied an entire decade or two of software and hardware development. All in all, a fascinating talk from one of the most powerful speakers in CS and IT."
This discussion has been archived. No new comments can be posted.

  • 640k remark (Score:4, Informative)

    by badasscat ( 563442 ) <basscadet75@@@yahoo...com> on Saturday March 24, 2007 @10:16PM (#18475327)
    Like the now-legendary '640k' remark

    A better description would have been the "mythical '640k' remark", because he never said it [tafkac.org].

    Nobody can ever cite a source for this alleged quote, and in the absence of such a source, you have to take his word for it. It's impossible to prove a negative; that's how urban legends start in the first place.

    (If he did say it, don't you think someone would have figured out the where and when?)

  • Re:640k remark (Score:5, Informative)

    by Andareed ( 990785 ) * on Saturday March 24, 2007 @10:18PM (#18475347)
The exact 640k quote from the talk: "So that's a 1 MB address space. And in that original design I took the upper 384k and decided that a certain amount should be for video memory, a certain amount for the ROM and I/O, and that left 640k for general purpose memory. And that leads to today's situation where people talk about the 640k memory barrier; the limit of how much memory you can put into these machines. I have to say that in 1981, making those decisions, I felt like I was providing enough freedom for 10 years. That is, a move from 64k to 640k felt like something that would last a great deal of time. Well, it didn't - it took only about 6 years before people started to see that as a real problem."
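    For reference, the arithmetic behind that split, as a minimal sketch (standard 8086 real-mode figures; the code is illustrative, not from the talk):

        # The original IBM PC's 8086 exposed a 20-bit physical address bus.
        ADDRESS_BITS = 20
        total_kib = (2 ** ADDRESS_BITS) // 1024    # 2^20 bytes = 1024 KiB = 1 MB

        conventional_kib = 640                     # "general purpose" memory for programs
        upper_kib = total_kib - conventional_kib   # video RAM, ROM, I/O: 384 KiB

        print(total_kib, conventional_kib, upper_kib)  # -> 1024 640 384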
  • Re:640k remark (Score:4, Informative)

    by edwardpickman ( 965122 ) on Saturday March 24, 2007 @10:37PM (#18475485)
    Probably adding fuel to the fire was the fact that the memory limitations held for so long. I've always been into graphics and animation, and the early memory issues were a major hassle. Even today, shortsightedness about memory has been a major hassle for Windows. Win 2000 had a 2 gig cap and XP had a 4 gig one. With the average person able to afford 4 gigs of RAM, and graphics people needing all the RAM they can get, it's bizarre to have such limitations when RAM is cheap. Vista is an improvement, but there is a major system RAM charge to get it, and there's still a cap that will soon be reached. He may never have said that Win 2000 users would never need more than 2 gigs of RAM, but it's the way the company approaches things. Back when the Amiga was around, it always ran circles around Windows machines for memory. I always loved the fact that a lot of components came with extra RAM slots. The Amiga 3000 had a RAM limit in the range of modern machines, and that was 17 years ago. He may not have declared 640k was enough, but he's hardly a visionary where memory is concerned.
  • Re:640k remark (Score:3, Informative)

    by Psychotria ( 953670 ) on Saturday March 24, 2007 @10:47PM (#18475547)
    I'm not sure it was so much a software (OS) limit as a hardware one. I.e., 2^32 is where the 4GB address space (limit) came from, not because MS decided to be mean (for once). Sure, there are ways to get around the hardware "limitation" in software, but there was probably not much incentive at the time to do so.
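    A quick sketch of that pointer-width arithmetic (nothing Windows-specific here; the 36-bit row reflects Intel's PAE, the hardware hook the software workarounds rely on):

        # A flat address space of n-bit addresses spans 2^n bytes.
        GiB = 2 ** 30
        for bits in (16, 32, 36, 64):
            print(bits, (2 ** bits) / GiB, "GiB")
        # 16 -> ~0.00006 GiB (64 KiB)
        # 32 -> 4.0 GiB      (the limit in question)
        # 36 -> 64.0 GiB     (PAE's physical address width)
        # 64 -> 17,179,869,184 GiB (16 EiB)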
  • by Ninja Programmer ( 145252 ) on Saturday March 24, 2007 @10:48PM (#18475549) Homepage
    I was a fledgling member of the CSC at Waterloo, and I recognize the members in the photos they showed. I also remember attending this talk with a front row seat. I was sort of unimpressed because he didn't discuss anything that was new or that I didn't already know about.
  • Re:Long Road Home (Score:2, Informative)

    by Supercrunch ( 797557 ) on Saturday March 24, 2007 @11:31PM (#18475737)
    It was called "The Road Ahead", originally published in 1995. My recollection was that he got some things right, but he speculated that a new information superhighway would come along to replace the Internet. He also predicted that many/most of us would be interacting with their computers via handwriting and voice recognition.
  • by Anonymous Coward on Saturday March 24, 2007 @11:41PM (#18475791)
    It says a lot about /. these days. Back in the Olsen days, DEC started a re-write of VMS. It had such luminaries as Cutler and Bell on the team. When the company was bleeding, Olsen killed off this project and others. When Gates got wind of this, he approached Cutler (and others such as Grey and Bell) and convinced him to join him. One of the bigger issues was that he promised the core to the VMS folks: he would control the API and above; they would control the core.
    And if that was not enough, back in '94 I even saw the code for NT (I worked at HP, and a neighboring group was asked to port it to PA-RISC). I can tell you firsthand that it had NOTHING to do with OS/2. If you looked at it, you knew it was a DEC derivative. Even the comments said it all.
     
    So how did you get modded up?
  • by Richard Steiner ( 1585 ) <rsteiner@visi.com> on Sunday March 25, 2007 @12:44AM (#18476095) Homepage Journal
    APIs are surface features which are (usually) made visible for applications to use, and they give very little indication of the nature or structure of the actual kernel code running underneath.

    OS/2 supports the POSIX API via EMXRT.DLL, for example, and yet OS/2's kernel has very little in common with, say, Linux or Solaris (which both also support POSIX programs).

    The 32-bit OS/2 kernel written by IBM for OS/2 2.0 and later and the Windows NT 4 kernel are quite different. Both Microsoft and IBM completely re-implemented their respective OS's kernels after the 16-bit OS/2 days, and the resulting software has very little relationship to the old 16-bit kernels except for support for the older 16-bit APIs. But as I said, that is simply a surface similarity.
  • by Richard Steiner ( 1585 ) <rsteiner@visi.com> on Sunday March 25, 2007 @12:51AM (#18476125) Homepage Journal

    The only reason OS/2 died was because IBM was greedy and charged too much for it at the beginning of its life, hence the beginning became the end.

    Wasn't it Microsoft that set the pricing of the SDK for OS/2 1.x, and wasn't OS/2 1.x mainly sold as a Microsoft product? Who set the high prices for OS/2 again?

    Remember that IBM, once it got hold of OS/2 and was able to release the 32-bit version as a product independently of Microsoft, was willing to sell OS/2 to Windows users for US$49 and to DOS users for US$99 [guidebookgallery.org], thus making OS/2 an extremely affordable product at one of the key times in its evolution -- the time when it alone was a Windows-compatible 32-bit operating system that was completely independent from DOS.

    Windows NT 3.1 (Microsoft's first 32-bit offering) wasn't released until July 1993, over a year after OS/2 2.0.

  • Re:OS/2... (Score:5, Informative)

    by TheNetAvenger ( 624455 ) on Sunday March 25, 2007 @01:02AM (#18476189)
    You do know that the NT4 core is extremely similar to OS/2

    Actually, as an OS engineer who has spent time working with and tearing both apart: they are very much night and day.

    You would have more success selling the claim that OS/2 is the same as BSD.

    Here are a couple of things to get you started, and I could point out a few inaccuracies in each of these, but for the most part they will send you down the right path:

    http://en.wikipedia.org/wiki/OS/2 [wikipedia.org]
    http://en.wikipedia.org/wiki/Architecture_of_Windows_NT [wikipedia.org]

    Now, where you are partially correct: NT started out in the OS/2 3.0 development stages, but by the time MS and IBM split, NT was a start-from-scratch OS, as Dave Cutler thought the OS/2 codebase was horrible.

    MS even looked at using *nix concepts in the early days of NT, since it was being written from the ground up; that is why MS held on to Xenix at the time, in case that was the direction the NT team wanted to take NT or base it on.

    However the NT team felt the *nix architecture concepts were too limited and instead decided to take the best OS theories at the time and see if they could truly make a new OS technology.

    I get so tired of kids today confusing simple things and I see this crap on here all the time. NT is not VMS, NT was not OS/2, NT and Win95 are not related other than the Win32 subsystem, WinXP does not contain Win9x code, etc etc...

    No wonder people think Windows is even more of a joke than it already is; if I saw it as a hybrid hodgepodge of Win9x, OS/2, and NT, I would think it was an insane code base too. However, it is not.

    It is easy to poke fun at Windows, but among real OS engineers the NT architecture/kernel isn't quite so funny; it gets quite a bit of respect even from those who hate the Win32 subsystem.
  • Re:Well... (Score:4, Informative)

    by Alien Being ( 18488 ) on Sunday March 25, 2007 @01:11AM (#18476219)
    OS/2 and NT are different animals.

    OS/2 was originally a joint Microsoft/IBM venture and was to replace Windows, but there were squabbles over the API definition, which caused Microsoft to rethink the whole plan. By that time, the Windows (3.0) API had become a de facto standard and the world's most valuable computer technology.

    MS realized that abandoning Windows (and control of the API) was a huge mistake, so they didn't. They went ahead with OS/2, but kept Windows as their primary platform. They knew that they still needed a "real" OS to replace Windows' DOS underpinnings, so they started the NT project.

    Windows remained the market standard and MS remained the gatekeeper to the API. OS/2 customers who wanted to run or develop apps for the "standard" system would also need a Windows license. And perhaps even more important than the ability to sell licenses is the fact that by controlling the API, they got a huge head start over the competition when it came to designing developer tools and applications around that API.
  • by laffer1 ( 701823 ) <luke&foolishgames,com> on Sunday March 25, 2007 @01:15AM (#18476237) Homepage Journal
    Funny you should bring up The Road Ahead. It's interesting to compare the differences between the first and second editions of that book. The Internet "exists" in the second version.
  • Re:640k remark (Score:3, Informative)

    by akh ( 240886 ) <slashdot@alephnu l l . net> on Sunday March 25, 2007 @01:42AM (#18476359)
    According to this page [bbc.org], 16MB of RAM in 1981 would run you about $150,000.
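    A back-of-the-envelope extension of that figure (the $150,000 is the parent's number; everything else is simple scaling):

        price_16mb_1981 = 150_000                # USD, per the linked page
        per_mb = price_16mb_1981 / 16            # ~$9,375 per MB
        full_640k = per_mb * 640 / 1024          # ~$5,859 to populate all 640 KiB
        print(round(per_mb), round(full_640k))   # -> 9375 5859

    At those prices, even the 640k ceiling looked like years of headroom.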
  • Re:640k remark (Score:3, Informative)

    by swillden ( 191260 ) * <shawn-ds@willden.org> on Sunday March 25, 2007 @01:58AM (#18476439) Journal

    Win 2000 had a 2 gig cap and XP had a 4 gig.

    Incorrect; XP can only manage 3 GiB of RAM. You can install 4 GiB, but one GiB will go unused. Supposedly Vista supports 4 GiB. 32-bit Linux also appears to be limited to a little less than 4 GiB, unless you build with the HIGHMEM64 option, in which case it will support up to 64 GiB.[*]

    Contrast this with the foresight shown for IBM's System/38, which featured 128-bit pointers at its introduction in 1978. With such a huge address space, the entire system storage was mapped into virtual memory, effectively turning the RAM into nothing more than a cache to accelerate operations. Of course, it was a bit too ambitious for the hardware of its day; it wasn't until the introduction of the AS/400 in 1988 that such a system could really run well.

    Today, a 64-bit address space seems plenty large for whatever we might want to do with it, including virtual addressing for all system storage as well as RAM. Given how inconceivably large a number 2^64 is (16,777,216 TiB anyone?), it seems ludicrous that more could ever be required. But I'm sure Gates would have thought the same of 2^32 in 1981. 128-bit pointers still make me shake my head in disbelief, but who knows, maybe I'm the one lacking foresight now?

    Those System/38 and AS/400 designers didn't take any chances, and their software is unfazed by the growth in RAM and disk sizes.

    [*] Maybe someone who knows could amplify? I thought that Linux with HIGHMEM4G could use up to 4 GiB RAM, but I have a system with 4 GiB, and a kernel compiled with HIGHMEM4G, and /proc/meminfo shows MemTotal 3,238,840 KiB. I'm using the onboard nVidia 6150 GPU, and have configured it to use 128 MiB of system RAM for video, but that should still leave me with 4,063,232 KiB, so I'm "missing" 824,392 KiB, or about 805 MiB.

    I'm rebuilding with HIGHMEM64G right now, and eventually I'll get around to installing a 64-bit Linux kernel (this is an Athlon 64 X2 machine), but I'm puzzled as to why I can't use my 4GB with 32-bit Linux.
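    The numbers in that comment do check out; here is the arithmetic as a sketch (the /proc/meminfo figure is the poster's, the rest is unit math):

        KiB, MiB, GiB, TiB = 2**10, 2**20, 2**30, 2**40

        # 2^64 bytes expressed in TiB: 2^64 / 2^40 = 2^24
        print(2 ** 64 // TiB)                    # -> 16,777,216 TiB, as claimed

        # The footnote's "missing memory" (all values in KiB):
        installed = 4 * GiB // KiB               # 4 GiB installed: 4,194,304
        video     = 128 * MiB // KiB             # onboard-GPU carve-out: 131,072
        reported  = 3_238_840                    # MemTotal from /proc/meminfo
        missing   = installed - video - reported
        print(missing, round(missing / 1024))    # -> 824,392 KiB, ~805 MiB

    (For what it's worth, the usual explanation for RAM going missing near the 4 GiB mark is that the chipset reserves address space just below 4 GiB for PCI/MMIO apertures, and the RAM shadowed by those apertures is unreachable on a 32-bit kernel unless PAE and BIOS memory remapping are both in play.)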

  • Re:OS/2... (Score:4, Informative)

    by dryeo ( 100693 ) on Sunday March 25, 2007 @02:02AM (#18476465)
    I think you mean OS/2 ran DOS (including Win 3.x) in VDMs. But you are right: in Win32s ver 1.30, MS moved some DLLs above the 2GB mark just to break OS/2, which at the time had a 512 MB per-process limitation and still does for most apps. It wasn't until ver 4.5 that the client could access 3 GBs, and some APIs are still tied to the 512 MB barrier.
    And yes, I don't know about Mac OS X, but the newer Windows and Linuxes still seem broken compared to what the WPS could do on a 486 (with lots of memory, hopefully at least 32 MB).
  • by steveha ( 103154 ) on Sunday March 25, 2007 @02:04AM (#18476475) Homepage
    I'm top-posting this instead of replying to individual posts because there are just so many posts with one conspiracy theory or another. Microsoft tricked IBM into taking OS/2, Microsoft made Office 95 break OS/2, etc. etc.

    I worked at Microsoft from 1990 to 1996, and during part of that time I worked on Microsoft Word. And I'm here to tell you: Microsoft really believed in OS/2, back in the day. They really thought it would be the future.

    In 1990, I got an OS/2 machine on my desk, as did the other folks around me, because we all knew OS/2 was the future. The MS library had OS/2 machines for looking up books (and as far as I remember, the MS library had only OS/2 machines). And all the major MS apps were shipped for OS/2: Word, Excel, etc. (But they were also shipped for Windows. MS covered all the bets.)

    Now, I was only a lowly developer, not a strategy architect, and I never ate lunch with Bill Gates, so it's possible there was some amazing subterfuge going on without me knowing. But I don't believe it.

    Here is my summary of what happened, based on what I saw then, and on various articles I read in PC Week, Infoworld, etc.

    Microsoft started developing Windows back in the 80's. The early Windows was a laughingstock in the industry: it was a primitive toy. Apple seriously jump-started their GUI efforts by building a closed platform and tailoring their GUI specifically for that platform; Microsoft was hobbled by the suckiness of the 8088 and awful graphics adapters like the CGA card. MS actually tried to get Windows to run on that sort of pathetic hardware. Windows 1.0 did run but no one wanted it.

    MS doesn't give up easily. They kept plugging away at Windows, and it started to suck less, as the machines got more powerful. Also, IBM and Microsoft decided to cooperate on a new OS: OS/2.

    Microsoft wanted to make OS/2 as compatible as possible with Windows, to make it easy to port applications. IBM wanted to make OS/2 "better" than Windows. (My memory is dim here; I don't remember specifically why it was better to be incompatible with Windows. Compatible with some graphics API that IBM already had?) So now, the plan was to sell Windows only until OS/2 conquered the world. But the Windows guys kept plugging away on Windows, even as the OS/2 guys did their thing.

    Around the time I was hired, Microsoft and IBM were telling customers that basically if you have lame hardware, go ahead and run Windows on it, but if you have good hardware, you want OS/2 because that is the future. (IIRC the decision point was: if you have less than two megabytes of RAM, run Windows.)

    Then, in 1990, Microsoft shipped Windows 3.0... and everyone, including Microsoft, was stunned by how well it sold. It flew off the shelves. Egghead (at the time, a successful brick-and-mortar chain of computer stores) sent trucks with ice cream over to Microsoft; along with everyone else, I had a free ice cream bar to celebrate the success of Windows 3.0.

    The key feature was actually that it ran DOS apps very well. You could have multiple DOS shells open at the same time, and it would multitask them well (pre-emptive multitasking, even though Windows itself used cooperative multitasking for Windows apps at the time!). You could even have a DOS app crash, and your other DOS apps would keep running just fine. Compare with the "compatibility box" in OS/2, which was usually called the "Chernobyl Box" by geeks because a misbehaving DOS app could take down your whole machine. The Chernobyl Box could only run a single DOS app at a time.

    Why? Why was Windows 3.0 better than OS/2? Because at the time OS/2 was written only to support the 286, and even if you ran it on a 386 it would just run in 286 mode. Windows 3.0 would only do the cool DOS app multitasking if you ran it on a 386. My understanding is that IBM promised, early on, that OS/2 would run great on a 286; and IBM felt it was seriously important to keep that promise. With hindsight, I
  • Re:Transcript? (Score:5, Informative)

    by Strudelkugel ( 594414 ) * on Sunday March 25, 2007 @02:15AM (#18476515)

    I don't have the time to listen to an hour and a half mp3

    Crude index:

    • 28:00 Developer teams
    • 36:00 Mouse
    • 50:00 Unix
    • 52:40 Mac
    • 56:00 PARC people
    • 57:00 Mac GUI/Microsoft developers
    • 63:00 Third standard
    • 66:30 Networks
    • 71:50 Lotus/Excel competition
    • 75:00 "World Net"
    • 76:50 Multimedia
    • 79:40 Utility of the CD (Thanks music industry!)
    • 87:00 Learn from competitors
    • 87:50 Hypertext

    Actually, Gates was quite insightful. He clearly understood what was important for the evolution of the personal computer, but didn't quite manage to have Microsoft dominate all of it, fortunately. When he discussed Unix in one section, and the importance of networks in another, he never mentioned anything about security, which is an important element of Unix design. Later he mentions the "World Net", but of course did not anticipate HTTP and browsers. This makes his comments about hypertext all the more interesting; he correctly states that massive numbers of typeless links would overwhelm the user. The significance of search, among other things, eluded his thinking at the time. Gates' discussion of a third standard is interesting to ponder in view of OSS, which could be considered the answer to his question about what other approach might gain traction. Overall his prognostications were quite correct. If he is as astute today as he was then with regard to humanitarian issues, his health initiatives should do a lot of good.

  • Re:OS/2... (Score:3, Informative)

    by TheNetAvenger ( 624455 ) on Sunday March 25, 2007 @08:41AM (#18477831)
    Also the DOSCALL1.DLL in NT was quite similar to the DOSCALL1.DLL in OS/2 ver 1.3 and I'd imagine that it is similar when comparing the WIN32 DLLs between WIN9x and NT.
    Naturally this does cause some confusion.


    Ya, it does cause confusion, but this was like 15 years ago, and people still don't seem to understand, even though people from the timeframe when all this was happening were 'quite' aware.

    As for the DLLs, these are subsystem DLLs, not NT DLLs. I understand that in the Slashdot world NT is a bit weird, and the concept of subsystems seems to escape a lot of knowledgeable people.

    But people have to understand that Win32 is basically an OS that sits in a subsystem on NT, just as OS/2 1.3 was, and just as the UNIX/BSD subsystem MS also ships for Windows is.

    To say that the DLLs in these subsystems are a part of NT would be like looking at the BSD libraries in the BSD subsystem that runs on NT and saying that NT and BSD are the same OS, which is very far from the truth and even sounds crazy to anyone who understands BSD.

    I try to suggest to people in my professional realm all the time to take a few minutes and just read the Wiki pages on NT; they do have some inaccuracies, but for the most part they are pretty good at defining what NT is and how the client/server nature of the kernel design inherently allows for the subsystem architecture it supports.

    NT cannot be defined by Win32, nor by any other subsystem that runs on NT, as ANY of them could be replaced. So it doesn't matter what DLLs or applications are running in the subsystems; they have nothing to do with the NT architecture.
  • by hey! ( 33014 ) on Sunday March 25, 2007 @11:11AM (#18478779) Homepage Journal
    I don't think you have the story quite right.

    IBM was a victim of its unintended success. The first-generation IBM PCs were crippled compared to what they could have been, in almost every way. They could have had a much better processor. They could even have run a real operating system. Instead it was low-rent all the way, outsourcing as much as they could, because they were making a cheapo product they expected to sell only moderately well. They built a computer that was inferior to the Apple II, which had been available for several years. Radio Shack had a 68000-based computer running Unix that was introduced around the same time. These could have been a serious threat, but IBM produced a toy computer, put it in a business-like case, and slapped the IBM logo on it.

    If you were working in those days (1981-1982), things started out as planned. IBM PCs were appearing on desks as a status symbol. There wasn't much useful you could do on them. Then in 1983 came Lotus 1-2-3, and suddenly all those PCs became very useful. In the same year, came the Compaq portable, the first 100% "IBM Compatible".

    The disruption of IBM's business came not from misunderstanding the rate of technological change, but from their own strategy: they were attempting to slow the impact of change on their existing product lines by introducing a low-end product positioned low enough that it wouldn't hurt those lines.

    This would have been a good strategy if they hadn't failed to anticipate the success of the product. They didn't even bother to get exclusive rights to DOS. By failing to make the PC proprietary, they actually accelerated the penetration of other microcomputer vendors into their customer base.

  • by Anonymous Coward on Sunday March 25, 2007 @12:53PM (#18479617)
    Compare with the "compatibility box" in OS/2, which was usually called the "Chernobyl Box" by geeks because a misbehaving DOS app could take down your whole machine. The Chernobyl Box could only run a single DOS app at a time.

    But OS/2 would run a virtual machine for every OS you wanted to boot in a window... you could even boot off floppies and load another OS in a window. Also, MS worked on a newer version of the Windows API, which they kept from IBM and which wouldn't run on OS/2; it became Windows 95...
