It's funny.  Laugh.

The Real History of the GUI 265

Posted by Hemos
from the pretty-amusing-take-on-things dept.
Big Nothing writes "Mike Tuck @ webmasterbase.com has written a piece on the development of GUIs. Like most other articles on webmasterbase.com it is fairly non-technical, but entertaining nonetheless." Update: 08/21 02:45 AM GMT by T : Note that the link above takes you to the print-friendly version of the story; for online reading, you might prefer this version instead.
  • Good write up.

    /me: bookmarks for PHBs of the world in need of history lesson.

    Am I the only one reminded of the PBS 'Revenge of the Nerds' history of silly valley? I think it was PBS that did it; came out a couple years ago.

    Regardless, I think I'll go whack myself on the head with a big rock.
    -- RLJ
  • Bring on the telepathic interface, baby!
  • by friedo (112163)
    For someone who knows so much about GUIs, you'd wonder why the tiny font on the page is completely illegible.

    • Re:GUI (Score:3, Informative)

      by SquadBoy (167263)
      The answer is that you are using Netscape 4.7 under *nix and have not done anything to fix your fonts. Upgrade to Mozilla, tweak your fonts, and it should look much better.
      • Well, it also has a lot to do with the fact that the publisher specified 8 points for the font size. I mean, who really wants to read a several-page article in 8-point text? (Though I do have to give them props for at least not specifying the size in pixels, as so many others do.)

        Turning off cascading style sheets in Netscape should prove to be an easy way to fix it.

    • by notext (461158)
      Amen.

      I couldn't make it past the first sentence.

      I always thought you were supposed to use big font so the teacher thought you wrote more.
  • How can any history of the GUI be complete without Bill Budge's Pinball Construction Set?
    • Pinball is OK, but my favorite GUI still has to be PONG 1.0. Easy to learn, and those paddles are much simpler than this 101-key keyboard.
      • You don't get it. PCS was the first GUI (you moved a pointer and clicked and dragged objects) that was available to home users. It proved the popularity and usability of the GUI before the Lisa or Mac did.
  • by grammar fascist (239789) on Monday August 20, 2001 @02:32PM (#2198883) Homepage
    Once upon a time, way back in the Stone Age, lived two cavemen, Ugh and Glug.

    Actually, the two cavemen were named "Ugh" and "Slug."

    I hope this clears things up a bit for everyone.
    • Wrong again. "Slug" was the IP lawyer who represented "Glug" in the patent infringement case, claiming "Mammoth hunting begets Mammoth counting, which is an integral process in the hunt. Mr. Ugh's machine is a direct infringement of his prior art. We also believe that Mr. Slug's process could enable the copying of Mr. Ugh's work, depriving him of hundreds of kills he could have legitimately claimed were it not for the illegal copies."

      Although Ugh was a pioneer in his field, the unfortunate facts are: he still couldn't get laid, and Glug wound up working for Slug, who now owns the rights to the process and all slate products. Ugh was last seen promoting the value of "OpenSlate".

  • Yay GEM! (Score:3, Informative)

    by Aerog (324274) on Monday August 20, 2001 @02:34PM (#2198894) Homepage
    Nice to see they mentioned the good ol' Atari ST! I managed to get one a couple of years back and it's quite possibly the coolest antiquated piece of technology I own today, and the GUI is very, very impressive for the time. It works (although finding discs for the 720k floppy drive is a pain in the ass) and has one of the best versions of Monopoly available today, not to mention that the OS is just as stable as Win98, and about 1/500th the size. Yeah, yeah, it doesn't support everything, but then again, who wants a USB printer on a 16-year-old machine?

    Just need to figure out a hack to hook it into my network now /* extreme sarcasm */...

    • although finding discs for the 720k floppy drive is a pain in the ass

      Can't you just do the old Amiga trick, put tape over the high-density hole and format it to 720K on a PC?
      • Can't you just do the old Amiga trick, put tape over the high-density hole and format it to 720K on a PC?

        Actually, you don't even need to cover the high-density hole if you're only going to use the disk on the Atari ST. The disk drive doesn't even have a sensor for it. For transferring data to or from a PC, I'd recommend your method. A floppy disk formatted in DOS format on the ST will often not be accepted by a PC.

        • Well, the tape trick came about because some later-model Amiga drives were ordinary PC high-density floppy drives that DID have the sensor. The Amiga wouldn't use the drive in high-density mode, but the drive could get confused by the sensor state, and would more or less attempt to read/write a 720K track/sector pattern with a high-density bias.

          I don't know the ST well enough to know if this was ever a problem on it.
      • I used to own an Atari ST in the late 80's / early 90's (well, 3 actually, an ST, STE, and Mega). Those machines were sweet! The floppy is a standard 720K drive, but what we used to do is install a switch in the side of the case, which would allow us to switch sides on the disk, effectively using high-density disks as double 720k disks. I believe only the later models could read high-density disks, but it's worth a try. I'm sure you could find documentation on the 'net about this, it was a fairly standard practice back then (out of the 20 or so users in our Atari Users Group, 15 used this trick with no problems). In the worst case, you've just scrapped a standard floppy drive. Big deal. :)
      • Amigas preferred 880K formatting. They didn't have to put a gap between each sector, woo hoo!
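
        As an aside, the capacities thrown around in this sub-thread check out arithmetically. A quick sanity check of the standard double-density geometries (the track/sector counts below are general floppy-format facts, not taken from the thread itself):

        ```python
        # Double-density 3.5" floppy: 80 tracks per side, 2 sides, 512-byte sectors.
        TRACKS, SIDES, BYTES_PER_SECTOR = 80, 2, 512

        # PC / Atari ST DOS format uses 9 sectors per track.
        pc_kb = TRACKS * SIDES * 9 * BYTES_PER_SECTOR // 1024
        print(pc_kb)     # 720

        # The Amiga squeezed in 11 sectors per track by skipping inter-sector gaps.
        amiga_kb = TRACKS * SIDES * 11 * BYTES_PER_SECTOR // 1024
        print(amiga_kb)  # 880
        ```

        The extra two sectors per track are exactly where the Amiga's 880K-vs-720K advantage comes from.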
    • I have a lot of nostalgia for the old GEM desktop. I ridiculed my DOS friends at the time for their unstable, text-only systems, and for how much better games looked on my computer. But the Microsoft marketing machine killed the ST and the Amiga and others, so now it just sits in my attic.


      PS: Anyone know where I can find a copy of Sam & Ed basketball?

      • Just curious, would it be anywhere other than your attic had Atari become the system of choice? I will readily admit, though, that they did one heck of a job out-Mac-ing the Mac. From the built-in MIDI port to an outstanding, for the time, color display, they had a really decent machine.
    • "not to mention that the OS is just as stable as Win98"

      Cut the ST some slack, man! It can't be that bad.

      Oh, wait...you meant that as a compliment : )

    • "not to mention that the OS is just as stable as Win98"

      Sure, one hell of an OS where ANY game can bring the whole thing down.
  • by Flabdabb Hubbard (264583) on Monday August 20, 2001 @02:34PM (#2198903) Homepage Journal
    Life was so much easier for us techies when computers were "difficult" to use. The advent of point-and-drool gui interfaces has made our life hell. Typical example of this being the brain-dead point-and-click admin of complex systems such as DNS.


    The world would be a better place if GUIs had never been invented.


    Give me an Xterm, emacs and lynx over a point-and-slobber interface anyday.

    • The world would be a better place if GUIs had never been invented.

      Absolutely. And how about that Gutenberg guy, fixing it so that any random schmuck can read the Bible for himself?

      • You are missing my point. Simple interfaces to complex systems are a BAD thing.

        Making it easy to do complicated things means that more people will do them. And those people will by definition be stupider.

        I'd rather have my DNS zone file edited in vi by a bearded unix/networking guru who knew what was going on under the covers, than have a 21-year-old fresh out of college be let loose with a Microsoft gui which allows him to "update YOUR zone files Quickly and Easily in Minutes with DNS Wizard"
    • Of course, the best would be both worlds.

      AND the GUI version should have a good keyboard interface! Why oh why do people so often forget about, or not bother with, this? Especially in the Unix-inherited world. You should be able to do just about everything without the mouse, except for the few mouse-only tasks, such as painting. THIS is what makes a lot of the current KDE/GNOME apps feel lame. Ironically, in Windows most commercial apps have at least tried to make it possible to live without the mouse.
      • MS's design guidelines even recommend that your entire app should be usable with keyboard-only.
      • The mouse has been proven in usability labs to be faster than keyboard shortcuts. A true GUI power user has no problem with this scientifically proven fact and uses the mouse and a GUI designed to take full advantage of it. A clueless newbie (which a lot of *nix people tend to be the second things get graphical) would engage in the far more cognitively expensive and slower task of mentally sorting through which arbitrary sequences of keys do what. If the *nix people would, in the words of Yoda, unlearn what they have learned for the past 30 years, Linux would make far more progress on the desktop.
        • The mouse has been proven in usability labs to be faster than keyboard shortcuts.

          Proven by whom? The Short Bus Study? If you type more than about 30wpm, there is no way that removing your hand from the keyboard to click once and then return it is faster than using a keyboard shortcut, unless you are referring to a poorly-designed interface which has (perhaps intentionally) set up bad tab stops.

          As long as the interface is designed to allow full control without a mouse, about the only thing I can't do faster with a keyboard is rating the AmIHotOrNot sites and their variants.

          Poor design by the programmers and project managers does not change the physical reality that I can type faster than I can type a word, scroll, click, press a key, scroll, click, scroll, click, right-click, click, type one key, etc.


          woof.

          Where's the guy with the sig about the nipple being the only intuitive interface when you need him?

    • I remember being asked to set up a router for a dot-com a few years back. They were dead set on using Windows NT and two NICs. The MS front end to the router had a very complex GUI with stupidly cute icons like:

      "My Network Router"

      Which of course could be opened to see:

      "My Network Routing Table."

      God, what I wouldn't have given for vi and /etc/gateways.

      this space intentionally left unwitty
    • You're assuming that EVERYONE enjoys using the CLI. While I myself prefer it, would you really feel comfortable letting Grandma use it?

      Both CLI and GUI have their places. If a GUI allows people to do things that they couldn't do before, more power to them.
      • would you really feel comfortable letting Grandma use it?


        The problem is that Grandma herself is comfortable attempting to configure DNS because she can point and click. She has no idea of the implications of clicking 'OK' in contexts other than home use.


        If she had to put a bit of effort in (say firing up 'vim') she might not bother, and hey-presto one less f**ed-up zonefile.

      • I don't know... Me, I prefer Grandma that can type to the one that just points and drools.

        That said, most of your average old people are very good at using language and have adapted to more technological whatsits, widgets, and gimcracks than you can possibly imagine. And considering that typing is a very standard skill for the average person likely to become a computer user (they'll need it even if they have a GUI), I can't imagine that Grandma can't figure out "ls" means "list files in this directory" and that "cat" means "print the file out to the screen". I mean, if she learned to speak English after she moved here from wherever the hell she came from, she can probably learn the 50-100 commands on the average CLI system.

        And have you ever watched someone learn to double-click? Or mouse? Now let's explain right-clicking... give up. It's just not that simple. GUI isn't some dumbed-down interface, it's a useful tool. A CLI isn't some highly technical, wizard tool, it's just typing!

        Now configuring DNS with a GUI may look easy, and that's deceptive, especially since the interface may be poorly designed, but it doesn't say anything about the difference between GUI and CLI. I myself am a happy emacs user, love my CLI, and can't imagine a future without language-based interaction with my computer (see the /. recent HAL article-- drool drool), but was overjoyed to find that http://localhost:631 will help me set up and administer my printers via a web page. This is a task that is made way more complex than it needed to be, and the GUI brought it into line. Would I want to administer everything that way? Probably not.
    • Life was so much easier for us techies when computers were "difficult" to use. The advent of full-screen editors and shell history has made our life hell. Typical example of this being the brain-dead edit-and-save configuration of complex systems such as an operating system.

      The world would be a better place if these luxuries had never been invented.

      Give me a box of punch cards, or even ED, over a type-and-slobber interface anyday.

      *** Clue to original poster: if it weren't for the GUI, the PC market as it is today wouldn't exist, and you'd still be paying $5000 for a 486. Also, if the GUI didn't exist, GNU/Linux users wouldn't get to gloat about their imagined skills, because they would have to compete with really complex operating systems.

    • There are a lot of very intelligent people in the world who are entirely clueless when it comes to understanding computers. Do you want to keep the power of the computer out of the hands of these people? Computers can do much more than run DNS servers. Maybe a guified DNS server is not so great, but what about graphics editing? Or CAD? Or word processing? What about the WWW? Lynx might be great for some purposes, but I personally like to look at pictures and I think the web is very dull without them.
    • Well, for programming nerds, the days of DOS were much easier, since they were allowed to program in modes: "now we're in editing mode, so the keys CTRL-I activate italics, now we're in layout mode and CTRL-I now adjusts line spacing..."

      ... and the users rebelled. "It's too hard! I keep erasing my document because I can't remember what mode I'm in!"

      Then GUIs gave us modelessness, and all was better in the world for users, but programmers now had three times as much work to do.

      Of course, now users still lose their documents, but because of the brain dead way OSes work, the user ends up feeling like it's his fault, even if it's not.

    • The world would be a better place if GUIs had never been invented.

      Give me an Xterm...

      Okay....... And this Xterm is running where, exactly?

      Chris Beckenbach

    • Actually the problem is the "all or nothing" mentality.

      The best systems I've used combine text and GUI in a very elegant manner, along with the file system. My love of simple things that work, and that stay flexible through text files defining the GUI, shows in my choice of window manager, WindowMaker, which allows me to do all kinds of neat things (dynamic menus, for example), and my file manager of choice, ROX-Filer.

      It doesn't have to be this or that. You can combine both to have a truly elegant, simple, and powerful system.

      To do anything unique or interesting with your interface in windoze, you have to be a developer. To do it in my environment of choice, you edit a text file or write a quick little perl script.

      In making things "simple" the gui folks are actually making doing things 'your' way more complex.
  • by Anonymous Coward
    * The first GUIs were written in Smalltalk.
    * Smalltalk was the first cross-platform portable bytecode language.
    * Smalltalk is good, pure OO, unlike C++ (not pure OO) and Java (not good OO - the class hierarchy is a monster; ST's object hierarchy is much cleaner).
    • by Anonymous Coward
      The article got one thing wrong though. The Lisa's software wasn't written in Smalltalk. Like most of the industry, Apple totally failed to recognize the promise of object oriented programming back then.

      Also, I don't agree with your assessment that only languages with built-in single-rooted class hierarchies count as "pure OO". The single-rooted class hierarchy is necessary to support a dynamic type system, but isn't required in a statically typed OO language.
  • I can't stand articles on technical subjects that include gibberish about "cave men" and the like. Who needs it? Obviously, the author is padding for his lack of knowledge about the subject.
  • come on everybody knows al gore invented the first gui!
  • by Bowie J. Poag (16898) on Monday August 20, 2001 @02:43PM (#2198959) Homepage


    I was going to rip this article a new one, but I'm glad they got it right. What I would consider to be the first GUI was Sutherland's "Sketchpad" system from the early 60's. The military had similar sorts of things predating Sutherland, but nothing quite flexible enough to really be called a full-blown GUI.

    Anybody with their brains in the right place can tell you that the GUI was not invented by Xerox PARC. They may have done a great deal to push the idea, or perhaps simply been at the right place at the right time, but the basic idea of using graphics as a means to interact with a machine predates PARC by about 20 years.

    If you really wanna have some fun, check out Doug Engelbart's 1968 presentation that introduced the world to the mouse, chordboard and other interesting stuff. There are plenty of links to it, but here's a good one [rwth-aachen.de] in case you can't find any. A while back, there was a site that offered his entire presentation in RealVideo format, IIRC... I wish someone would post a link to it, or perhaps a better (read: DivX, or straight MPEG) link... It almost brings tears to my eyes when I watch it. :)

    • I'm glad someone mentioned the military. I was in the Army in the 80's. I remember testing the Apple II and Windows 286, and GEM. I also remember using a GUI ordering system. And let's not forget about Wang. The Wang VS system had a simple GUI - well, it was half text-based and half GUI - sort of. lol
  • GUI Design (Score:4, Interesting)

    by eric2hill (33085) <eric@UUUijack.net minus threevowels> on Monday August 20, 2001 @02:44PM (#2198966) Homepage
    [random thoughts]

    The GUI became popular because it really made most things that users need to do on a computer far easier than cryptic command lines. For years GUIs have been refined for ease of use. We're now coming to the limit of current icon-oriented design. There are only so many ways that an icon-based system can be presented to the user before usability starts going into the toilet.

    We moved to GUI's because command line interfaces only got us so far, and some day someone will come up with a better-than-icon based system that is more logical. We'll all say "gosh, why didn't I think of that?" and everyone will jump into the "new way" of thinking.

    I have visions of time-oriented interfaces that respond to "get me the spec sheet for the network I did last week sometime" and "set a new meeting for next Tuesday with Jim and Bob in the conference room". These new interfaces will be able to store and retrieve information based upon how we think, not in the traditional tree-like-structures we're currently used to. The concepts behind OO/RDBMS systems have some potential, such as nested tables and object oriented models, but don't present their information to a user very easily.

    I don't see new interfaces becoming popular until they target the non-computer-user market. I envision voice-activated systems, but they tend to be annoying to other people around. Mouse navigation doesn't seem to be viable because of its limited 2D space, and thus the 2D GUI. The 3D systems (see spaceball on google) look neat, but aren't very intuitive to users. We may wind up with virtual filing cabinets, but hopefully we'll stay away from the Packard Bell Navigator!

    Is there anyone (university or other) that is working on a new interface concept? I'd be interested in hearing what works and what doesn't. I know M$ and (Cr)Apple invest millions into GUI research, so I wouldn't be surprised if we saw something new out of those camps in the next few years.

    And no, I don't count XP's "new and improved" GUI anything more than an over-hyped icon-based system.

    [/random thoughts]
    • "I have visions of time-oriented interfaces that respond to "get me the spec sheet for the network I did last week sometime" and "set a new meeting for next Tuesday with Jim and Bob in the conference room"."

      How about "You know, that presentation I did two years ago with the red circles? You know? That one." That's how most end users seem to be. :)

      But seriously though, often the predictions lay groundwork for more conventional ideas. Remember all the talk of "agents" a few years ago? I remember even seeing an AT&T ad where a woman is talking to a video image of a secretarial-like person, who's telling her that her mother called. "Did you get those tickets?" the woman asks the agent. "Front row," he replies.

      But we do have agents today, just not in that form. They've been "mainstreamed" by currently existing standards and technology. Amazon, for example, is one huge agent database. I can literally go clicking around their recommendation browser finding items that interest me, and oftentimes the suggestions are pretty on target. I've even purchased a few things through the recommendation browser, simply because I want to endorse that technology and figure that it's the best way to make my argument (with my wallet).

      Will there be natural-language-recognition GUIs? Sure. But will we call it that, or will it be something completely different?

    • The GUI is a tool, just like the CLI is. It's a tool that works if it is done right, and once a tool is done right, there's little to no reason to reinvent it.

      The "Classic" MacOS GUI works very well. Yeah, maybe it's ripped off from Xerox, but whatever - it works. It's not themeable (well, kinda...), but in the System 7 and OS 8 versions it was simple and it worked very well. So well that 3-year-olds and 80-year-olds could master the interface quickly and without help.

      Aqua is just a step beyond the "Classic" GUI and with some refinement it could last another 20 years.

      However... I do not agree that a new interface concept is needed. A screwdriver's interface has remained unchanged for centuries and it doesn't need a new concept. Same with the firearm: a Beretta flintlock from 1300 had the same interface characteristics as a Beretta Gold Sable rifle made in 2001.

      When a concept works...don't spend the effort on changing it. Spend the effort on making the OS run better behind the interface.
      • I almost completely agree with you. A new GUI tool is needed, just not for you and me. I want a box with an interface that my grandmother could use with her current training. Said training is the concept of a piece of paper and a pencil, but it's training. An example product would be a thin tablet (the size of a small stack of printer paper) that was one big LCD screen. Not color, just black on bright white. A pen that you could use to write on it and, at the top, a send-to section where you could write an eMail address. It would either OCR and send, or just send a picture of the writing. My Grandma could just pick this up, start writing, write an eMail address, and push the send button (which would look like a stamp, since that's what she'd do with a piece of paper). I know this isn't a novel concept, but something this easy would be perfect for her. No frills, just the essence of an electronic letter without all the dial-up username/password stuff that confuses people who simply don't care to know it. My Grandma uses checks and goes to the bank to get money because she doesn't want to bother remembering her PIN for the ATM. If you force her to use a password, you're just ensuring she won't use what you get her. Keeping it simple also forces programmers (me) to take care of the mundane stuff so the user can get the most use out of the least amount of work.

        You're right, a new screwdriver isn't needed, but the screwdriver with the built-in flashlight or flexible shaft to get to hard-to-reach places were needed twists (I made a funny...) on an old concept.

        I'm not saying replace the GUI, I'm saying simplify it. Although not quite to the level of MS Bob.
    • There actually is a guy named David Gelernter who came up with something like this. Because of his computer-science background, he was a Unabomber victim. While he was recovering, he came up with a system called "Lifestreams" that would record data throughout a person's life, as if their entire life were some sort of filestream that is constantly added to.

      http://www.wired.com/wired/archive/5.02/fflifestreams_pr.html

      While the icon metaphor is limited, part of the problem people are having with it is not so much the design itself as the fact that so many programmers do so many cognitively unsound things that shouldn't be done in any interface design on any platform. And this is what is really causing many users to suffer through today's desktop interfaces. For example, some programmer might implement a button layout where it is not clear how one widget relates to another. One button on one side of the screen may have some relationship to a list that is in some obscure location somewhere else on the screen (as opposed to placing the button right next to the list it acts on). Or one program might have both the menu selections "Customize" and "Options", which is ridiculously confusing for the user because both words refer to the exact same type of thing (configuring something in a program) but perform different actions. I'm not pulling that particular example out of my butt - I'm taking it directly from Microsoft. Before we eliminate the icons, we need to eliminate many programmers' lack of understanding about how to create usable interfaces. If we don't do this and simply go from icons to something else, they'll just end up making the next great interface equally as miserable as the current one.
  • I prefer a decent command-line interface within an ergonomic GUI, i.e. best of both worlds. Windows definitely benefits from the addition of this [cygwin.com]. The shortcomings of the Windows CLI never cease to astound me. For instance, a command-line is not very functional without a decent egrep-like tool, IMHO.
    • Try this [apple.com]. I'd really like to see more integration between the GUI and the CLI though. OS X does a few nice things, like letting you drag a file into a terminal to insert a path, and letting you pipe to the clipboard so you can just paste into a GUI program. But that doesn't really go far enough. I'd like a field in every file manager window where I can type CLI commands and have them executed in that directory. There should be a few extra commands too, so I could do things like 'select *.zip' to select all the .zip files in the window.
    • This is exactly why I use Linux (well, one reason anyway). I like having a GUI and a CLI; they just work very well together. I don't think the GUI should ever have been envisioned as a CLI-killer, just an adjunct to the system. There are things I can do faster with a CLI and things I would rather do with a GUI. Someday someone will show me a way to do some of these things with my voice or with virtual gloves or some such thing, and they too will just be adjuncts to the system. I will still use the CLI, the GUI, and the gloves, voice, thought waves.
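
    The "select *.zip" idea mentioned above is easy to prototype. Here's a minimal sketch in Python; the `select` function name and its behavior are my own invention, just to illustrate what such a file-manager command might look like:

    ```python
    import fnmatch
    import os

    def select(pattern, directory="."):
        """Hypothetical file-manager command: return the names in
        `directory` that match a shell-style glob pattern."""
        return sorted(
            name for name in os.listdir(directory)
            if fnmatch.fnmatch(name, pattern)
        )
    ```

    A file manager exposing a command field could run something like this against the directory its window is showing, then highlight the returned names.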
  • by SpookComix (113948) <<moc.liamg> <ta> <ximockoops>> on Monday August 20, 2001 @02:45PM (#2198972) Homepage Journal
    Starting out as a two-man operation out of the backseat of Bill Gates's car, Gates and cohort Paul Allen...

    It's amazing to see how so many beautiful and wonderful things happened as a result of two guys in the backseat of a car.

    Wait a minute...

    --SC

  • The statement that X is an OS underscores the author's feeble grasp on the subject matter.
    • must be a Mac user.
    • I rather thought it was his statement that the Altair ran Microsoft's BASIC OS.

      Also, Jef Raskin doesn't spell his name "Jeff," and the Mac OS was referred to as "colorful" in its infancy - though it was only black and white until ~1989.

      ~Philly
      • Umm, I don't understand.

        The fact that the Altair ran BASIC (which at that time was considered an OS, as there was no such thing as a Disk Operating System for the Altair yet) is proven to be factually correct. And regardless, Gates wrote the first DOS for the Altair himself (on legal pads, in fact, and then coded it into a machine in machine language), so the statement is factually correct even if you read it as BASIC/OS.

        What are you trying to say, then? That history is lying (including numerous non-MS sources)?
  • by Anonymous Coward
    I'd like to point out Netscape's [netscape.com] rather interesting history of GUI browsers [netscape.com]. It starts off showing how some of the founders of Mosaic [uiuc.edu] went on to found Mosaic Communications Corporation [mcom.com], which was later renamed Netscape. It then covers Microsoft IE [microsoft.com] and the decision to start the Mozilla [mozilla.org] project, which is producing the next generation of Netscape [netscape.com] browsers as well as others [beonex.com].
  • OK, I've heard this quote a lot, and while it's hilarious, I have a sixth sense that says Bill Gates never actually said it.

    Now, I've spoken to people who were there to hear, firsthand, Bill at a big computer show (SIGGRAPH?) in the early 90s, remark that "it's impossible to write a preemptive multitasking OS that runs in less than 4MB RAM" - while there were computers on the show floor doing precisely that. But that's not quite the same quote, and though I can imagine it evolving over the years to "640K oughta be enough" I just don't think that's what happened.

    Any ideas? Anyone know where the quote supposedly appeared?
  • Out of curiosity, what is the site itself generally about? I thought it was a mainstream media site, but now I see there is a bit of complex coding discussed.
  • but at least they hacked up a link to the unofficial apple museum... their link was Like this [webmasterbase.com]

    Here's the 'cleaned-up' link from me: Enjoy [hughes.net]

  • The way they tell it in Pirates of Silicon Valley is much more exciting. *cough*

  • GUI 'simplicity'? (Score:5, Insightful)

    by Balinares (316703) on Monday August 20, 2001 @02:55PM (#2199032)
    We remember the halcyon days of DOS prompts and command line interactions; some of us take an aspirin and lie down.

    Well, I beg to differ. You could say I've kind of been enlightened after listening to the epitome of computer cluelessness: my mother.

    She was struggling with the Windows explorer GUI, trying to move a file. And then, she said, and I'm not kidding: "Oh, I preferred DOS, you know, you typed a command, and it worked!"

    Maybe what simplicity is really about, is determinism in the way the computer behaves?
    • My mother made the same complaint. She was a touch-typist, after all, and hated having to move her hands away from the keyboard to use the fscking mouse.

      • When mice became common, didn't someone say - "They invented the mouse to slow down the programmer?"

        Or maybe it is just the crack.
      • Which simply proves the point that secretaries can and should use linux. :oP

        My mother made me learn how to type one summer on the IBM PC-AT. I will never forget that fantastic transition from typewriter to PC... backspace is for wussies.
      • My mother made the same complaint. She was a touch-typist, after all, and hated having to move her hands away from the keyboard to use the fscking mouse.

        You'd think foot-operated pointing devices would be common, but they are virtually nonexistent...
    • She was struggling with the Windows explorer GUI, trying to move a file. And then, she said, and I'm not kidding: "Oh, I prefered DOS, you know, you typed a command, and it worked!"


      Sounds like my wife. She used DOS (up until about two months ago). Until something hosed Windows (used for Quicken). Rather than having only Win '95 (long story short: couldn't reinstall win3.1) she said to set her machine to dual boot: Linux and Win '95.

      See, she likes mutt. And lynx (especially lynx. Can't even get her to look at w3m:) First, like you say, you type something and it happens. Second, she is vision impaired, and likes to have 80x24 (or 80x20) on a 17" monitor. (Hell, I kinda like it too. Really easy to read).

    • "She was struggling with the Windows explorer GUI, trying to move a file. "

      Hmm ...

      Does she often struggle like that?
    • Well, imagine you occasionally go to a foreign country, like once a year. You can get by without learning the language. If you go to a pastry shop and want a croissant, you can point at it and they'll figure it out.

      Or let's say you're a Cuban refugee who now pitches for the Yankees. You don't need to speak much of the language, you just need to know job-related words like "base" and "throw".

      However, let's say you're surrounded by people who speak another language, day in and day out. You work with these people and need to express complex ideas to them. You had better learn the language, because pointing and a 30-word vocabulary isn't going to allow you to convey ideas like, "Fetch me a nice yet casual shirt for under $45 which matches this belt."

      Similarly, GUIs and simple keystrokes (like Alt-F4 or Ctrl-Esc) may work if you only spend time in ComputerLand occasionally or are willing to only express extremely simple ideas or are willing to take a long time to get something done using a ton of simple words when a few complex ones would suffice.

      However, just like you wouldn't move to France without learning the language ASAP, or get a job as a drug cartel kingpin without becoming fluent in Spanish, you should learn to communicate with your computer through "sentences" instead of pointing if you're going to use it for any substantial amount of time at work or elsewhere.
      And as computers become more pervasive, people are going to have to understand their language or else they'll be like the current members of society who can't read. (Actually, worse -- they'd be like the members who can't talk)

    • This doesn't really mean that GUIs are inferior to CLIs - merely that UI in general is still not particularly advanced. Sadly, I can't make one due to a lack of skill and resources, but for a while I've advocated combining both user interfaces together.

      For example: Think of the GUI as being very akin to gesture in a pre-verbal society. People can't talk to one another, but they can point at objects, and either make one of a very, very small number of commonly recognized gestures to indicate some action to be taken on it, or point to some object that represents such an action. (which is a simple noun-verb construction, really) CLIs appear to be more powerful, but this is not entirely true. Although it is akin to verbal communication, in which abstract concepts can be related pretty succinctly, there is a certain amount of complexity that is difficult to remove, without incurring a cost due to human factors. (e.g. it is faster to use the mouse for many commands, particularly rarely used ones, because you can select it from a list faster than you can remember how to invoke it from the CLI with appropriate syntax, or the shortcut for it)

      I propose having a GUI with a one or two line text parser stashed somewhere. (whether there would be a universal one, or multiple ones, based on focus, I can't say without there being some actual work done). We could then avail ourselves of: 1) Ordinary GUI usage, where the CLI is ignored; 2) Ordinary CLI usage, where the GUI is ignored; 3) A partial combination of the two, where text commands can have much more feature-rich visual output (e.g. looking at a graph instead of combing through top) or hopefully 4) where both types of commands are used interchangeably in total cooperation.

      An example of this might be that I want to do some operations on files that I have on my computer. I use the mouse to navigate through a folder structure with plain English names that could be difficult to express in a CLI, depending on its syntax. There are loads of files of all sorts in it, and I only want the files with certain file names. Unfortunately, the names aren't in any spatial organization. So I type in a regular expression to select them in the CLI, and they become selected, with appropriate visual feedback. I can then drag them, with the mouse, to a different directory, as convenient. What's important is that I do NOT have to switch modes between a GUI and CLI, at least as explicitly as one has to do so today. (with a separate shell window that does not follow the focus of the GUI, AFAIK) There are simply two parallel methods of controlling the computer, which I may use at will, with a minimum of additional effort.

      I imagine that it would require a little reworking of GUIs, and an entirely new shell, so as to have a syntax that matched up very well. (e.g. if there is a visual 'desktop' as the root of the GUI, the root of the fs and CLI would be the same. Given that fs's are already abstract from the reality of disk sectors, and directories are a purely human organizational system, further abstraction is entirely possible and not a bad thing) The CLI would also need to be able to respond well to GUI commands... that sort of thing. I think it's all possible, I just wish someone would work on it, as I can't. (but would probably like a good one)
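      The regex-selection step described above is easy to sketch. Assuming a hypothetical hybrid shell that hands its embedded parser the GUI's current folder (the function name and interface here are my own invention, not an existing tool), the matching logic might look like:

```python
import re
from pathlib import Path

def select_files(folder, pattern):
    """Return the files in `folder` whose names match `pattern`,
    mimicking the 'type a regex, see them highlighted' step."""
    rx = re.compile(pattern)
    return sorted(p for p in Path(folder).iterdir()
                  if p.is_file() and rx.search(p.name))
```

      A GUI front end would then highlight the returned paths for dragging, rather than printing them the way a traditional shell would.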
  • by 2Bits (167227)
    After the first few paragraphs, I was getting tired of this "ugh and glug" talk. I'm sure the author is getting paid by the number of words in the article.


    Give me some real tech stuff. If you want to speak in caveman language, buy yourself a time travel module and send yourself to the stone age. We are living in the 21st century here.

  • I for one would like to know about all the tech that never went anywhere.

    I remember when MacOS System 7 (code named Blue) came out (91, 92 or so, I forget when) there were parallel OS teams, working on Pink and Red. Pink was supposed to be near term ideas, a brand new OS based on Object technologies. Red was supposed to be long term groovy stuff, really whizbang.

    Now close to 10 years later MacOS 9 is obviously an updated and freshened Sys 7 with no cutting edge stuff (where are OpenDoc and Cyberdog?), Taligent a distant memory, and Red never heard from again. Microsoft must have similarly killed R&D (you think Bob was the only GUI idea they ever came up with? It's just the one you know about). What ideas are lurking someplace needing a better processor and some code spit and polish?

  • That article was stupid... I stopped reading it after that "ugh" crap, and the rest of the article I saw in Pirates of Silicon Valley. You know you saw it too.
  • by ergo98 (9391)

    One line that I found interesting in the article:


    Like the Amiga, the ST couldn't compete with the big boys, nor with Amiga for gamers, but its sophisticated sound capabilities earned it a niche with audio editors and musicians.


    In reality the sound on the Atari ST was somewhat subpar and it was seriously outmatched by the Amiga (note that I was a massive Atari ST fan so I'm not biased when I say that... The ST ran at 8MHz whereas the lowly Amiga only ran at 7.14!). What they are probably referring to is that the Atari ST, in a very odd piece of design, had a MIDI in and out port on it (no thru though) which single-handedly catapulted it into the upper echelon of PCs for electronic musicians. Pretty silly really as you could inexpensively add MIDI to most other PCs, but in a strange twist of events rather than making musicians buy the ST, it made lots of ST owners musicians (or at least wannabe musicians with their Casio SK1 in tow)...

  • by British (51765)
    The irony of this article, and all the articles it links to is NO PICTURES.

    Not even the history of computer graphics article has any pictures. You'd think that was a sure bet.
  • by scott1853 (194884)
    Since this Israeli company is trying to get rid of Windows, then we won't have a GUI anymore. We'd have a SUI (Speaking User Interface). Since Sooey sounds like a farm animal call or an oriental dish, it can't be used for geek-speak. Therefore we must abandon all research towards making computers talk to us since it will never be adopted by the geeky ones.
  • I had to view this in Lynx instead of my usual web browser (Galeon) - because for some reason, his site only turns up

    <html> <body> </body> </html>

    (That's articles, front page, anything.)

    Was it just a fluke? Or should I suggest that he send an HTML 4.0 compliant webpage as the default, since I'm sure that would work just fine? (If he's sensing browser types. ;)

  • by MO! (13886)
    It's an entertaining, albeit slight, tale. However, the author has a few inaccuracies.


    He refers to X as an OS. It is not an Operating System - it's a Graphical Environment (and even that's putting it simply).


    Also, Windows/386 (a version of 2.0 that used the 80386's virtual 8086 mode) was the first Windows to take advantage of the 386's features. He states that Windows 3.0 was, although 3.0's 386 enhanced mode was essentially an evolution of the W/386 work.


    He also skipped right over IBM LanManager, which was the precursor to OS/2.


    OK, enough nitpicking... I guess the Ugh and Grub or what-have-you got to me more than I thought.

    • LanManager was not the precursor to OS/2. It was the precursor to its networking environment, but not the OS itself. (I've used versions that didn't include networking.)


      Besides, LanServer and LanManager were IBM's and Microsoft's respective products at the same time. Hence they are very similar. (Not identical).

    • Other mistakes you don't mention:
      • Includes the bogus "640K" Gates quote that everybody says he said, but can't back it up with a citation to a book or magazine or first person account.
      • Said that CP/M (or as they put it, CP-M) "had originally been written for the IBM family of PCs"
      • Refers to X as "the main graphics system for most RISC-based UNIX operating systems". I guess all those Vaxes and 68030 based systems I used to program on weren't really running X, then.
  • About this time last year, I saw a video clip of the NLS Demonstration (the one that used hyperlinks, object addressing, and videoconferencing all wrapped in a purty GUI way back in 1968). Has anyone else seen it? And if you have, do you have a link to it?
  • At last, somebody actually gets this right:

    "Apple negotiated a deal with Xerox; in return for a block of Apple stock, Xerox allowed Jobs and his team to tour PARC, take notes, and implement some of the ideas and concepts being bounced around at PARC in their own creations."

    Pirates seriously messed with history in this regard, having never touched on the deal Jobs made with Xerox, and the made-up commentary by the "Wozniak" character.

    But on the downside, the author doesn't spell Jef Raskin correctly.

  • by Animats (122034) on Monday August 20, 2001 @10:40PM (#2200828) Homepage
    I had my first tour of Xerox PARC in 1975, saw the original Alto, the first Ethernet, and the first networked laser printer. In the early 1980s I did some programming on the Alto. So I want to correct a few errors here.

    The Alto itself didn't really have a GUI. What it had were graphical applications. One of them was Smalltalk, which had its own private windowed environment. Another was Bravo, the first WYSIWYG multi-font editor. You could write other applications yourself, in Mesa, Xerox's own language, or BCPL, in which the underlying tools were written. The underlying environment was a single-task command line environment comparable to early DOS.

    Bravo was used as the programmer's editor. The internal representation of Bravo documents was ASCII text, followed by a control-Z, followed by the formatting information. The command-line tools, which understood control-Z as EOF, could thus happily process Bravo documents. Programs for the Alto were normally written in proportional fonts, with boldface and italics as needed.
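    That split-at-control-Z layout meant the same file worked as both plain text and a formatted document: tools that treated Ctrl-Z as EOF simply never saw the formatting. A toy reader for that convention (my own illustration of the idea, not Bravo's actual parser) could be as simple as:

```python
CTRL_Z = b"\x1a"  # the EOF marker old command-line tools honored

def split_bravo(data: bytes):
    """Split a Bravo-style document into (ascii_text, formatting_blob).
    Tools that stop reading at Ctrl-Z see only the first part."""
    text, sep, formatting = data.partition(CTRL_Z)
    return text, (formatting if sep else b"")
```

    A file with no Ctrl-Z is treated as plain text with an empty formatting section, which matches how the command-line tools would have handled it.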

    The Alto hardware itself was built by Data General under contract to Xerox. It was basically a Data General Nova with custom microcode in a desk-height rackmount case containing the computer and a 14" removable disk cartridge drive (equivalent to a DEC RK05, if anybody cares.) The display, a portrait-mode b/w full-page display built at PARC, was the main hardware innovation, along with the 3MB Ethernet and the mouse.

    The Alto project had several components. First was the concept of a number of single-user workstations connected by a network providing dedicated services. Each Alto had very limited disk space, but file servers were available. All serious storage was on the file server. There was also a print server, an Alto connected to a modified Xerox copier. PARC management was working on the assumption that, although all this was far too expensive to deploy, in time the hardware would get cheaper and it would be useful. They were right. The fact that they then blew the business aspect wasn't PARC's fault.

    The other big push was Alan Kay's Dynabook. Kay was big on simulation and teaching kids to use computers. His real direction for Smalltalk was what we today disparagingly call "edutainment", games with educational intent. This seems strange now, but that's where he was going. He's continued with that direction, at the Media Lab and elsewhere. But it never took off.

    PARC tried to commercialize the technology as the Xerox Star. This wasn't a general-purpose system; it was more like a really good dedicated word processor. Wang then ruled that market, with what was called "shared logic word processing": dumb terminals all running one common application on a time-shared host. This was cheap enough that offices could afford it.

    The Star, with a real computer at every desk, a big display, and a LAN, did roughly the same function, but at higher cost. It was cool, but didn't sell. It was a closed system; you ran the Star app, and that was it. PARC didn't trust the users to mess with the system, so you couldn't do anything they hadn't anticipated.

    The computer scientists at PARC didn't see that the future was open systems running unreliable software. Really. That was the missed vision. Nobody dreamed that something like DOS, let alone non-protected mode multitasking, would end up in clerical offices. Obviously, it would break all the time, files would get lost, and the cost to the organization would be enormous. Remember, Xerox was a rental company back then; if the copier broke, it was Xerox's problem. So they took reliability very seriously. And, sadly, it cost them.

  • How can one give a history of a visual medium without showing any pictures?


    "Talking about music is like dancing about architecture."
