The History Behind the Lisa UI

DoenerMord writes "There's an interesting new piece which describes the story behind Apple's doomed pre-Mac system, the Lisa (aren't there thousands of these buried in a landfill somewhere?). It covers the UI, which influenced the original Mac, and just about every other GUI since. It also discusses a bit of the controversial Xerox fiasco. I especially like the comparative OS X Aqua pic at the bottom of the screenshots page. The more things change, the more they stay the same..." Update: 02/13 07:21 by E : The site is up again. Enjoy it while it lasts.
  • Damn that's funny... who needs photos anyway when we have ASCII art?
    -o
  • They are in a landfill in Utah, supposedly for tax write-off reasons, not because of failing hard drives :)
  • I ask again, who says differently? A lot of people were actually alive then (it may surprise you to know) and they remember these events pretty clearly. Again, who ever said Windows came first? Yeesh. The flammage is bad enough without making up nonexistent arguments.

    Oddly enough, there are in fact people who claim Windows came first and Apple ripped off Microsoft's ideas. I've actually encountered a couple of them. The remarkable thing about this particular species of stupidity is how completely unshakable it is; the idea that Windows-came-first seems to derive more from an adolescent worship of His Billness than from any remote link to the real world.

    Even many of the Windowphiles who do acknowledge that the MacOS came first try to minimize Apple's contribution to the Windows UI. These also tend to be the same sort of folks who think that sticky menus and icons are a user interface revolution, so my guess is that they simply don't realize what a big change the Mac UI was from everything else people could buy at the time.

    -Mars

  • Actually, I think it was a bunch of malfunctioning Apple III computers that were scrapped. I'll look around a bit, and if there is a good reference, it will get posted.
  • Greetings! For what it's worth, I found the article in the ACM Digital Library. It originally appeared in the February 1997 issue of ACM [acm.org] Interactions, pp. 40-53. The article includes many more graphics and photos than the article that prompted this posting on /.

    I am not sure about posting this and copyright issues, though, and I'm too lazy to dig out my ACM membership to check the rules. If anyone can confirm a source where ACM says it's OK to post it I'll make the PDF file available.

    For those of you who are ACM members, search for "Inventing the Lisa Interface" under title and "Ludolph Perkins Smith" under author. The PDF file is about 1024 KB.

    Cheers

    E
  • I just checked the base page at http://home.san.rr.com/ [rr.com] and found a list of all the users hosted off this home page. When I clicked on a few of them I received various errors from Apache, though I was able to load some of the other member pages. Some I could load once, then not, then yes again.

    I don't think the author cut us off on purpose. Based on this little experiment, I'd say they just urgently need some help in configuring their web server.

    Cheers!

    E
  • the Lisa (aren't there thousands of these buried in a landfill somewhere?)

    As far as I know most of the Lisa systems were later turned into the "Mac XL". It was out before the Mac II, and was the first Mac that came with a hard drive (~10M). It had a somewhat bigger screen than the normal Mac, and amazingly every Mac program I tried ran on it without problem. Of course the Finder took forever to do stuff after you had tons of docs all over the drive -- the Mac filesystem didn't support folders at that time, it was done in the Finder, but that meant it had to read directory entries for thousands and thousands of files and sort them into in-memory folders using inefficient algorithms tuned for what you would expect on a 400K floppy (i.e. maybe 100 files tops).

    Most of the Lisa functionality was lost, but for a while it was the studliest Mac available.

    I do expect they are mostly in landfills now. The hard drive on that one gave out not long after the Mac II came to market. I would guess most of the others have stopped working as well.

    P.S. I think the 68000 port of DR-DOS which became GEMDOS/TOS in the Atari ST was done on the Lisa first.

  • I've always wondered why 2 buttons on the mouse are "confusing" while 104 buttons on the keyboard are accepted without comment. Has anybody ever tried printing something on the mouse buttons???

    If the right button said "MENU" it might have made popup menus user-friendly and we would not be wasting all our screen space today with menu bars and tool bars. Does anybody know if usability testing was ever done for this?

  • by morbid ( 4258 )
    This just goes to show how the personal computer-buying market has shot itself in the foot repeatedly over the years.

    (reminiscence on)
    When I were a lad I wanted an Amiga or ST but wasn't allowed because, "what use is it if it doesn't run Lotus 1-2-3," said my dad, an IT professional at the time.
    (reminiscence off)

    Over the years this "it must be crap if it isn't IBM compatible because that's what everyone else uses, for better or for worse" attitude has prevailed, and here we are looking back to ~1983 when Apple had a machine that was about as powerful as a 386sx with a modern, innovative, easy to use UI over a technically superior (internally 32-bit but without MMU) architecture arguably a decade ahead of the mass market.

    The lesson : the Great Unwashed (or should that be Unthinking) always screw things up for themselves, and the Market in general.

    At least we have a market where competitors and innovators exist, alongside a free software movement that cares more about producing powerful, useful tools to get stuff done than about conforming to Joe Suit's idea of what the world "should" be.

    I'm glad I left the commercial IT industry in 1996.....
  • I remember the old 68000 chips that were used too.

    All ][ series (and the Apple I) used the 6502 or a variant. The IIgs used a 65816, which was instruction-compatible with your 8-bit 65{,C}02 found in earlier ][ models. The Apple /// used a 6502 as well.

    it involved upgrading the 68000 and a few other chips.

    The enhancer kit is composed of the 65C02, new CD and EF ROMs, and a new chargen ROM (to support uppercase inverse with mousetext, right...). I also have an unenhanced ][e, and can't find a proper enhancer kit these days (I'm just out a couple of ROMs... but that CD and EF ROM are pretty important). But hey, with the chargen and 65C02 from a //c, I have "just enough enhancement" to run ProDOS 2.0! :o)


  • in most of Europe, manual gear shift is the standard

    This is one thing I miss from Germany... you could get just about *any* car of any make with a real... uh, I mean... MANUAL... transmission. Compare and contrast to the US, where I may have to buy a sports car or a truck to get a decent vehicle with standard in it.


  • Don't fret - these things are all coming back from the dead :-)

    Check out Squeak Smalltalk [squeak.org] or Self [sun.com] (now also on the Mac besides Sparc machines). Even GNU Smalltalk has come back from the dead and will be getting great JIT technology real soon now.

  • Eeek! But I forgot about Slashdot's HTML 'translation'. So the spaceship operator didn't appear. Maybe one day Rob will fix the text box to handle these characters properly. Or I could use Tom's posting script.

  • sub schwtr { map { $_->[1] } sort { $_->[0] <=> $_->[0] } map { [$_[0]->($_), $_] } @$_[1] }


    I think that should be $a->[0] <=> $b->[0].
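
    For reference, the corrected transform spelled out as a complete, runnable sketch (an illustration only, assuming plain Perl 5; the schwtr name is kept from the post above, with the arguments passed explicitly so it actually runs):

        use strict;
        use warnings;

        # Schwartzian transform: decorate, sort on the cached key, undecorate.
        sub schwtr {
            my ($keyfn, $list) = @_;
            return map  { $_->[1] }               # undecorate
                   sort { $a->[0] <=> $b->[0] }   # compare cached keys
                   map  { [ $keyfn->($_), $_ ] }  # decorate: [key, value]
                   @$list;
        }

        # Example: sort words by length, computing length() once per word.
        my @sorted = schwtr(sub { length $_[0] }, [qw(pear fig banana kiwi)]);
        print "@sorted\n";    # fig pear kiwi banana

    Note the original's @$_[1] would not dereference the argument as intended; taking an explicit array reference avoids that pitfall.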

  • Cause it's just a link to a porn site, as far as I can tell.

  • The IIe didn't use a 68000 chip -- it used the Rockwell 65HC02, which was an enhanced version of the 6502 used in earlier models.

    What killed the II line was Steve Jobs. The Apple II series was still tremendously popular, and he didn't want it competing with his Macs. This is ironic, as the Apple IIgs models that were contemporary with the early Macs blew them away in terms of performance. It's also what got me to migrate over to the x86 platforms, where users weren't assumed to be drooling morons, just as the Macification of the PC with Windows is what led me to Linux.
  • 5 times this week, Site gets /. then goes poof!

    Nuf said, Mirror it.
  • OpenFirmware isn't the closest thing to a CLI on a mac. There are several CLI shells available for MacOS. One's called MacShell, there's another called nshell.

    OpenFirmware, which was developed mostly by Sun, isn't a shell. It's just Forth with a bunch of stuff stuck on it. It's more like a really nice x86 BIOS than anything else. The idea with OpenFirmware is that all devices have some Forth code that acts as a driver that can be used to operate the device until an OS loads a real driver. It's on all PCI Macs (there are only 3 non-PCI Macs, the x100s). It's on most Suns. It used to be called "OpenBoot."

    The really slick thing about OpenFirmware is that it builds a device tree, which is what makes "hardware...a non-issue."

    In closing, I would like to add that Forth rocks.

  • by dadams ( 9665 )
    If you want to execute 68k code fast, the most sensible option is to make a dynamically recompiling emulator running on Alpha (or x86)

    I'm not sure how true that is. I'm pretty sure PPC chips have some special instructions and such that make it possible to emulate 68k code fairly fast. It's not full hardware emulation or anything, but even old, old Power Macs run 68k code at fairly surprising speeds.

  • Small point but the original Mac screens (from the 128k to the Classic II) were 512x342 -- strange but true.

    512x384 (which is the standard 4:3 ratio) was used for the original LC series color monitor, and the Color Classic series.


  • But there are those of us that prefer words to pretty pictures.

    I for one, cannot begin to distinguish what a "pretty little icon" does until I hover my mouse over it and wait for the help text. And I have to do that repeatedly each time. Something in my brain doesn't recognize pictures, but does just fine with complex sequences of words. Button bars are useless, and are the first thing I turn *off* when I see them.

    So don't be making all interfaces full of these wizzy little pictures. I'll be locked out. :(

    And from my research, it appears that about 10-15% of the people out there are like me.
  • There may be thousands in some landfill, but I have one in my closet.
  • A few things that struck me about this article...

    1) They heavily relied on usability testing to gauge how well the target audience would use the product. Aside from the research done by Engelbart et al. at Xerox, I suspect this is the first real usability testing done in the computer industry. I find it hard to believe that a lot of the early PC stuff was usability tested at all.

    2) Their tests showed what people have claimed all along... that multi-button mice are more productive than single-button mice. But, since single-button mice made the initial learning experience for the naive user easier (no guessing as to which button to click), that's what the Lisa, and eventually the Mac, used. Twenty years later, the Mac platform still defaults to a single-button mouse... all so that computer virgins won't become confused.

    What probably made the Lisa fail was that it was too ambitious. In 1980, trying to create a high-power GUI desktop machine with a hard drive and all the trappings was simply too much to expect. Compare the specs to the first Mac that was rolled out in '84: no hard drive and a paltry amount of memory.

    The thing cost an arm and a leg, and simply didn't make financial sense. Who wants to shell out big bucks just to make some secretary more productive? Not the first or last time that a company has decided to address a market that simply didn't exist (yet).
  • Comment removed based on user account deletion
  • Thanks for mirroring the site! Can you mirror the screenshots too?
  • I dunno, I always thought that the IIgs was a cool machine, but its existence allowed Apple to price Macintoshes in the stratosphere.

    Along the way, Apple developed a Mac-like GUI for the IIgs, "HyperCard" for the IIgs, AppleTalk networking, a GUI word processor, and so on. Most of these things were developed before anything was running on MS Windows -- there were only two consumer GUI computers available at the time, both incompatible, and both from Apple Computer.

    I know quite a few people and schools who invested heavily in the IIgs -- only to be disappointed when it was dropped a few years later. Of course, it was smart to consolidate development on one platform (Mac), rather than having double R+D costs, but it would have been smarter not to bait-and-switch with the IIgs to begin with. Most of the IIgs users I knew never bought another Apple.

    I can't help thinking that history might have been different if there was a cheap, color Macintosh available starting in 1987, and if the IIgs never came into existence. -- The Macintosh would probably have quite a bit more mindshare and marketshare today, if only because an entire generation of people could have afforded one early on, when it was the only GUI system available.

    (They finally got it sorta right in 1990 with the Mac LC with the Apple ][ card.)
  • You have to remember that after Steve Jobs left, the new management was in a rush to obliterate his legacy. They didn't want a high end Lisa/MacXL machine -- they needed everyone to forget about it so they could start moving 128K/512K Macs in volume.

    This is all about the time that Jean-Louis Gassée was driving around with his OPEN MAC license plate, and the Mac II design was being planned.
  • I don't know much about landfill technology - were these things compacted before dumping? Guess there's no chance of being able to excavate them by now, without some pretty sophisticated sonar.

    -lx
    • TopView was a GUI? I thought it was just a task-switcher...?

      (My experience was with DESQview, which allegedly picked up where TopView left off. I loved DESQview. Quarterdeck could hack.)

    I've never used/seen/heard of TopView, so I can't really comment on its GUIness. I never used DESQview either (I wasn't more than six at the time) but I sure do remember my dad using it. IIRC, DESQview was a task-switcher, but also had some simple GUI functions built in. You could display more than one DOS box at once, move between them, and copy text with a cursor controlled (I think) by the arrow keys. (The cursor was activated by holding down the control key for a second or so, I think, which pissed my dad off to no end.) I also seem to recall DESQview being able to show its own menus alongside two or three DOS boxes at the same time, which, combined with a cursor, seems to speak GUI.

  • Yeah, the whole article is password protected now.

    Gee... what a great source of info that was...

    pathetic....
  • If you didn't like that, why didn't you switch to a different view? You know, a nice by-icon view. With the small icons, you can cram an awful lot of stuff in.

    And, as stated before, you can resize columns now.

  • And GUI's can do free-form text completion boxes. In fact, they can do some kind of auto-complete feature, so I can type "bra" and have the "zil" added automatically.

    As can CLI's. I've even done a clumsy form of auto-complete in a 100-line Perl script I use. (A minimal sketch of the same idea appears below.)

    What happens if you want to move several files with different types to a directory several folders away? That requires a lot more typing.

    Not much more typing with a well organized filesystem and a modern Unix shell which supports tab completion. In short, one or two letters and the tab key per directory. And tab completion will do more than just directory names, the example included with the tcsh distribution is enlightening. Some configuration and you could expect that shell to read your mind.

    You select text with the arrow keys? If it's just a couple of words, I can see that, but if you're selecting whole sentences, arrow keys are slow.

    Which is why the key repeat rate under Unixes doesn't crawl like under MS Windows and editors like vi(m) rarely require you to actually highlight the text. The vim tutorial (listed in the help) shows how it works.

    When I want to get normal work done, I run Mac OS. When I have coding to do, I boot into Unix. :)

    And coding isn't "normal" work. *sigh* :)

    cheers,
    sklein
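
    For the curious, the auto-complete trick mentioned above takes far fewer than 100 lines. A minimal sketch (an illustration, not the poster's actual script; it assumes the Term::ReadLine::Gnu module is installed, since the stub Term::ReadLine bundled with Perl doesn't do completion):

        use strict;
        use warnings;
        use Term::ReadLine;

        # Candidate words the completer may expand "bra" and friends into.
        my @words = qw(brazil bravado banana bandana);

        my $term = Term::ReadLine->new('demo');

        # completion_function receives (text, line, start) and returns
        # the candidates matching the partial word under the cursor.
        $term->Attribs->{completion_function} = sub {
            my ($text) = @_;
            return grep { /^\Q$text\E/ } @words;
        };

        # Type "bra" and hit TAB: matching words are completed or listed.
        while (defined(my $line = $term->readline('> '))) {
            print "you said: $line\n";
        }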

  • As far as early GUIs go, does anybody happen to remember a program called Frameworks? I really do not recall much of what it was, as I was just a child when we had it on our IBM "portable" with a 4 inch screen :)

    As I recall, it was a GUI in the respect that it created frames around windows using ASCII characters. It represented drives and folders along the right side of the screen. It was probably based on Xerox's work, I don't know.

    I don't think that it actually had the ability to run programs, as in multitasking or task switching. You could essentially view your drive contents, and use built-in programs to manipulate them. It had a word processor, spreadsheet program, calculators, etc.

    Only thing I really remember is that it had a lot of documentation. I tried to look it up on the internet but it seems to have dropped off the face of the earth.

  • Motorola introduced the M68000 in 1978, but was years late with the support chips, especially the MMU. The M68000 couldn't handle page faults right, either; instruction continuation was broken. So the designers of the Lisa had to build their own MMU (this took many chips) and avoid using 68000 instructions that needed to be interrupted and continued.

    Oh, those instructions... Er, doesn't that include just about every instruction that accesses memory? Doesn't sound practical.

    Maybe they should have waited for the 68010 (& 68851) or just given up on virtual memory altogether, like they did on the Mac.

    It's not like VM was really necessary to be competitive. The only other consumer CPU around to compete with it was the 80286, and no one ever really tried to use its rather lame VM capabilities anyway, until many years later (OS/2 v1.x? (Or maybe XENIX?)).


  • It would be great if we could use Mac's code. They should release their code as an open source product. I believe that MAC is taking their user interfaces seriously and that this is something that Linux is missing. Go Open Mac!

    And how would Apple (not Mac or MAC) make money if the source code for the GUI is open? The entire Apple business model is centered around getting people to buy Apple hardware (Macs) because the user experience is vastly better than other platforms.

    Before anyone points out Red Hat or other companies can give away their source code, remember that their business model is built around SERVICE. The products they are selling are fundamentally hard to use for the average person.

    Anyone who doesn't believe me should sit their grandmother (or pick another computer-illiterate person in your life) down with a copy of Red Hat that's been downloaded (no manuals, remember, we're paying for that support) and then sit that person down with a Mac. See which one they get the hang of first. Imagine which one will generate more support phone calls.

    The open source business model is profits through obscurity. Apple has been fighting obscure computing for 20 years. There's zero chance that Apple will open source the MacOS GUI.

    Open sourcing the underlying OS (Darwin) makes sense, because it is basically Yet Another BSD Flavor. It's a commodity. Just like Linux.

    -jon

  • is because it's some guy's home machine. Notice the URL? san.rr.com is San Antonio Road Runner (Time Warner cable modem).

    Try to be a little more considerate when posting stories. It's not fair to effectively revoke this guy's home net access because he wrote a good article.

  • No, No, No!! It means Cannibalistic Labia Incisors.
  • Connie likes incense.
    cold lumpy intestines
    cunning little Iranians
    cunnilingus, licking Irene
    comatose linedancing idiots

    command line interface.
  • One of the biggest reasons I CANNOT stand the Macintrash is that (on the version of the Finder I used last) when you had a long enough file name, the end of it got hidden under the file size, and that you cannot (could not?) resize the individual columns.

    Gads man, calm down. This is the reason you don't like an operating system? When was the last time you used a Mac? I think this capability has been there since 8.0 or 8.5 at least.

    On other topics, looks like the page has been slashdotted already. I like the other link to the Lisa page at http://galena.tjs.org/tom/ [tjs.org]. Someone mentioned they had a Lisa in a closet, anyone got one that still runs?

    -doenermord
    Don't blame the games, it takes a village to screw up a child.

  • The URL you gave (http://self.sunlabs.com/) gives me:

    Connection refused

    Description: Connection refused

    I would love to scope it out ... any alternatives?
  • Maybe /. could begin a site-caching service: when an article publishes a URL (one not dependent on dynamic code), that URL is cached for the life of the article, and the link -actually- sends users to this cache on /.?

    Frame it with a little "site cached by /. to prevent blah blah blah ... click here to go to the original site".

    ?
  • Why bother with this? Everyone should be using caching HTTP proxies [nlanr.net] anyway, to distribute the load.
  • Efficient coding? On our MacOS 8.6 Fileserver/At Ease server/Web Proxy Server at work, if you simply click on the application selector in the upper right and leave that menu open, all other operations on the computer are halted until the user clicks something. I don't consider that to be terribly efficient coding. I'm not sure, but I also seem to remember that if you click any menu at all, it stops all background operations from continuing... not to mention the terrible multitasking performance of MacOS. Otherwise, it's a much better OS IMHO... stability on that G3 server is a non-issue compared to our NT server.
  • [The exclamation point after 100 is mathematical notation [treasure-troves.com] for factorial [nist.gov].]
  • A Command Line Interface is an outdated user interface only useful for file management, system administration, and sophisticated text processing. Every other useful desktop application is superior when developed for the more modern graphical user interface (perhaps even the above applications can be made superior also). These include word processors, spreadsheets, presentation applications, desktop publishing, computer aided drafting, and image manipulation. GNU/Linux users often prefer the command line interface because the available graphical environments for GNU/Linux suck real bad. Once a far superior graphical interface gets developed for GNU/Linux, look for many CLI diehards to spontaneously "see the light" or become very angry that a GUI can be better than a CLI, even though the CLI is more than two decades old and limited to the ASCII and Extended ASCII character set.
  • Today, yes this would have been a bummer. But you are dealing with a different situation here:

    1. "The computer" at this time was still thought to be a large, mysterious box, not something approchable and useable by the unwashed masses.
    2. The Lisa was not the Mac. (duh) And it suffered greatly from NoApplicationItis.
    3. Most schools wouldn't have been able to do anything with them at the time.
    Remember, this happened in the Paleolithic Era of computing. Heck, IBM was just warming up, so most people were ignorant of the term "Personal Computer" and wouldn't have known what to do with one if they had one.

  • Yeah, truth be told, I would have wanted one too. (Grin) But what I really meant to point out is that 90% of the people you would have offered this to would have just looked at you and said "Huh?" And, being the son of a teacher and a principal, I know what I am talking about. It's shameful to see how underprovided schools are now, but at the time that the Lisa was being landfilled, they were just not as big of an issue, as most schools just didn't have the funding or the knowledge of how to use them properly.
  • Hehe. Well: "Don't drink and drive" ;). What you need is a cupholder and good timing. I do that all the time. As for cellphones, it's actually forbidden to use them in the car here in Switzerland, except if you have some sort of hands-free equipment...

  • Stick shifts, CLIs, books == GOOD. Automatics, GUIs, TV == SUCK. So my prejudices say, anyway

    So true, so true. Amen to that.

  • just as manual transmission is not the favorite way of changing gears.

    Ok, this is just nitpicking and doesn't add anything to the discussion, but in most of Europe, manual gear shift is the standard and will remain so for long.

  • User interface is about much, much more than pretty widgets. This is something quite a few people have trouble understanding. Good UI design is hard, often harder than writing the code that the UI is designed to let the user interact with. It requires tons of knowledge about how humans think and the assumptions they make.

    Go check out http://www.mackido.com/Interface [mackido.com] for some basic information. You'll see that much more goes into designing a UI than you ever thought. (The site is somewhat Mac biased, but gives some good information.)

  • I used to have 5 of them, but my parents got rid of them.
  • Actually it looks like the bandwidth limiter kicked in. There's a mirror in the next message, though.
  • Let's slashdot the slashdot!
    Duh...
  • Hah, looking through your collection was interesting.
    Last week I set a Toshiba T5200/100 out with the rest of the rubbish.
    I once had a Compaq Portable 386 but that preceded the Toshiba to the curb by a month or two.

    Neither of which were all that portable if you ask me.
  • Many people dump on Apple machines (I'm sure lots of you will flame this article) but they really are cool! Apple computers made me want to program and have fun with them. I remember the first PC I tried programming and I gave it up. Apple ]['s and Mac Classics are definitely way more fun than a PC any day.

    I had worked with some other computers before it, but the Apple IIe my parents bought in 1985 was (and is) the most fun and most hackable computer I've ever had. Within a year of getting it, for instance, I had cobbled together for a Boy Scout project a math-drill program that talked (which, given that it used the auxiliary memory solely as wave storage, was in hindsight an early example of bloatware, even though it still fit on a single 140k floppy :-) ). I haven't gotten into the Mac much since fooling around with them some in college around '89 or '90, but I've grown my collection of Apple IIs to three...the IIe my parents bought was upgraded to a IIGS, and then a IIe and II+ were added to the lineup. Fun stuff...they're simple enough that if something goes wrong with the hardware, you stand a good chance of fixing it without resorting to the modern "rip out that board and replace it" mentality. (Reseating all the chips and disassembling and cleaning the keyboard brought the II+ back to life, for instance.)

    I even get some use out of them occasionally, especially the IIGS. It gets used mainly as a terminal for the Linux box here, though I still do some tinkering in BASIC or assembly language periodically. A little while back, I cobbled together some string-math routines in assembly language and used those to calculate the exact value of 100!. Running at 12.5 MHz, it finished in maybe a second or so. The same could've been done in C under Linux, Win98, or whatever, but it wouldn't have been as fun. (Why calculate 100!? Why not? :-) )

  • So what did 100 turn out to be?

    100 equals 100, of course. 100!, on the other hand, was (broken up to get past /.'s "lameness filter," which I didn't know even existed until now):

    9332621544394415268169923885626670049071
    5968264381621468592963895217599993229915
    6089414639761565182862536979208272237582
    51185210916864000000000000000000000000

    The program's good up to 146!; after that, it runs out of digits (maximum is 256 digits) and produces garbage results.
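
    Anyone who wants to double-check those digits without writing string-math assembly can do it in a couple of lines of Perl (a minimal sketch; assumes the core Math::BigInt module):

        use strict;
        use warnings;
        use Math::BigInt;

        # 100! via arbitrary-precision integers; prints the same
        # 158-digit value quoted above.
        print Math::BigInt->new(100)->bfac(), "\n";

        # The 256-digit ceiling mentioned above checks out: 146! has
        # 255 digits, and 147! is the first factorial to overflow it.
        print Math::BigInt->new(146)->bfac()->length(), "\n";    # 255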

  • Since...what, OS 8.1 or was it 8.5? And weren't there shareware/freeware extensions that allowed you to do this all the way back to 7.5?

    Well ain't that sumptin'!?! I swear I never knew you could resize the columns (and more importantly, reorder them) until I tried it on a whim after reading your message.

    Here I've been writing code on the Mac since '85, and this has to be the best Easter Egg of all. (And clearly, it has to be an Easter Egg because I never saw it mentioned in the docs anywhere, and of course, I always read all of the manuals. ;)

  • In any event, aside from Apple, there were a number of companies working on bringing GUIs to personal computers, including GEM, TopView, and the Microsoft/IBM collaboration that was to become OS/2 and Windows after they parted company.

    TopView was a GUI? I thought it was just a task-switcher...?

    (My experience was with DESQview, which allegedly picked up where TopView left off. I loved DESQview. Quarterdeck could hack.)
  • Of course, the beauty of the Lisa was that Apple was actually trying to give this power to the average user.

    Well, the "average user" that had ten grand to blow on a new system.
  • cypherpunk/cypherpunk
  • I'll agree with you that, in the context in which you speak, OpenFirmware isn't a shell. I said in my original post that the closest thing to a CLI on a Mac is OpenFirmware (if I said shell, I didn't mean to), to which I should have added that OF is the closest to a CLI that is built-in.

    And don't forget about Mac06 (I think that's what it's called)...it aims to make a POSIX compliant layer for Mac programmers to aid in the transition to Mac OS X Consumer.

    Anyway, sorry I wasn't clearer in my original post.
  • : What killed the II line was Steve Jobs. The Apple
    : II series was still tremendously popular, and he
    : didn't want it competing with his Macs.

    You've got your timeline screwed up. Steve Jobs was betrayed by Sculley and forced out of Apple little more than a year after the introduction of the Macintosh.

    The Apple ][ series was continued for years following Jobs' expulsion, with the ][e and ][gs in production until mid-1993, MANY years after Sculley's coup.

    john
  • VA Linux Systems. [valinux.com]

    'Nuff said.

    -E

  • I for one, cannot begin to distinguish what a "pretty little icon" does until I hover my mouse over it and wait for the help text. And I have to do that repeatedly each time. Something in my brain doesn't recognize pictures, but does just fine with complex sequences of words. Button bars are useless, and are the first thing I turn *off* when I see them.
    1. I agree completely that most GUIs are useless and have seen dozens of people in the "You are in a maze of twisty icons, all alike" trance
    2. A well-designed GUI doesn't need to be like that. Frankly, if you have to look at the help text, the GUI designer hasn't done his job. Another sign of a GUI developer who spent more time at www.bustybabes.com than building a usable interface is poor keyboard navigation.

      A properly designed GUI could take advantage of the flexibility you get from a higher resolution display and non-keyboard input, even if you end up using it just for shoving bits of text around. Unfortunately, most of the GUIs are horribly designed; I'm rather disappointed at how frequently I wish the developer had included a SQL query tool as it would be easier to use that than fight a constricting interface.

  • Well you can now.

    Since...what, OS 8.1 or was it 8.5? And weren't there shareware/freeware extensions that allowed you to do this all the way back to 7.5?

    Sounds like a really nitpicky problem to me.

    I use OS X Server at work and I really dislike the NeXT browser in there.
  • It wasn't just that the slowness was only by comparison to DOS; the perception was wrong.

    The Mac was generally faster than PC/XT class machines. At one point, we had my 128k Mac next to the PC of someone convinced that his was faster and better. They were running identical BASIC code. Well, almost identical -- it was a numerical integration program, and we stripped the point-by-point graphing from the version on the PC. It was still substantially slower than the Mac.
  • The Self page has moved to http://www.sun.com/research/self/index.html [sun.com]. (Actually, the entire 'SunLabs' tree has moved to http://www.sun.com/research, it appears.) The project is described as no longer being active, although the last release of Self, 4.1, was last month.

  • wasn't even a processor powerful enough to make all the pretty widgets work.

    The Lisa was based on the Xerox Alto (See here [xerox.com], here [spies.com], here [brunel.ac.uk], and here [atari-computer.de]) from the early 70's, so it was certainly doable, although perhaps not with the single-chip-CPU concept that seems to be the only thing the kids of today can conceive of.

    And no, I don't have one in my collection [sinasohn.com]. Yet.

  • There are quite a few still surviving, including the one in my collection [sinasohn.com]. There are resources out there as well, if you want to look for them.

    But yes, many were scrapped, by Sun Remarketing [sunrem.com], on Apple's order, iirc. They still sell Mac parts and used to have some Lisa stuff.

  • Speaking as one who has one [sinasohn.com]...
    The boxes themselves looked.. unconventional.

    Actually, there are those who say its appearance was derived from the (for the time) nearly ubiquitous IBM terminals that littered desks throughout corporate America. This was to make sure it was "immediately recognizable as a computer."

    Furthermore, there was no real standard for microcomputer appearance at the time -- the IBM PC slab was not yet universal -- many businesses had Apple II's, Radio Shack Model II, III, and 12's, and Sol-20's, none of which were necessarily computer-ish looking. (Most people in the early 80's thought of computers as huge things (PDP-11, HP-3000, IBM 360) with spinning (reel-to-reel) tape drives.)

    In terms of functionality, the Lisa lacked a lot of commonly desired features which were in demand at the time (heh, like color)

    Um, the target market was business. It has only been in the last 10 years that color has started to become an important part of business computing; hard copy is still mostly black and white. Perhaps you are thinking of video games?

    I know that at the time, I was recommending avoiding color monitors (CGA) for business use as the resolution was terrible (320x240, iirc) as compared to Hercules monochrome (720x?)

    Whatever you _could_ do with one often took a great deal of time to accomplish, and the box itself would crash fairly frequently.

    Sure, it was slow, as were most personal computers then. The Lisa was trying to do an awful lot with the limited hardware available. And yes, like the rest of the personal computer industry in those days, it was not the epitome of reliability. (Like that has changed much...)

    Above that, it wasn't abundantly clear to the first-time user how to go about operating one,

    Excuse me, but do you expect a first-time user to be able to do anything at all with Unix/Linux the first time they sit down in front of it? With a GUI, the user can at least move the mouse, notice a correlation between its movement and the movement of something on the screen. When the Lisa was introduced, most people had no experience with a computer at all. The Lisa was intended to get them up to speed in the shortest possible time. The first 10 minutes might have been sheer hell, but after that it would make sense.

    (This, of course, is where the MacOS succeeded and Windows failed -- there is one key combination, for example, that will close any program on the Mac. (Command-Q, iirc) Under Windows, you might have Ctrl-Q, Ctrl-X, Alt-F4, or something completely different. On the Mac, once you knew one program, you kinda knew them all. Not so under Windows.)

    and why this sort of design was better than the conventional command-line driven concept used in personal computers in common usage at that time.

    For new users, there was nothing to remember. No secret incantations to be typed. Click on a menu, then select an option. Click on icons. It's all there. With CLI's, you need to remember the commands, the options, etc. Much more efficient in the long run, but not easy to use at first.

    I think Apple's main motivation for killing the Lisa was that it would have been a public-relations disaster anyway. Better to drop the curtain on a bad product than to have the public drop the curtain on you.

    We'll never really know for sure, but the reasons I've heard (with reasonable credibility) include internal politics and competition with the Macintosh group.

  • Last week I set a Toshiba T5200/100 out with the rest of the rubbish. I once had a Compaq Portable 386 but that preceded the Toshiba to the curb by a month or two.

    Both of those machines would have been great for someone who couldn't afford the latest and greatest. Certainly, there are collectors out there who would be happy to take them.

    Folks, please, before you toss a computer, look around to see if someone might either be able to use it or if someone wants to save it for posterity!

  • Couple of comments:
    They heavily relied on usability testing to gauge how well the target audience would use the product. Aside from the research done by Engelbart et al. at Xerox, I suspect this is the first real usability testing done in the computer industry. I find it hard to believe that a lot of the early PC stuff was usability tested at all.

    I don't recall Engelbart as being at Xerox; he's best known for his work 10-15 years prior at Stanford. Of course, I know that members of his team definitely did go on to work at Xerox PARC and worked on the Alto and such, which of course led to the Lisa, Macintosh, Windows, and a host of other things we take for granted today.

    However, please remember that the computer industry was around for at least 30 years before the Lisa (take a look at this page [blinkenlights.com] for a bit of PC history.) It might have been the first such testing for the Personal Computer industry, but certainly not for the computer industry in general.

    Their tests showed what people have claimed all along... that multi-button mice are more productive than single button mice. But, since single-button mice made the initial learning experience for the naive user easier

    And therein lies one of the fundamental differences between GUI's and CLI's -- the former is much easier to figure out, while the latter is far more efficient.

  • A Command Line Interface is an outdated user interface only useful for file management, system administration, and sophisticated text processing.

    Except everyone I know who seriously uses AutoCAD is constantly going to the keyboard. Once you become an expert on a complex program's functionality, a command line is much faster than a GUI.

  • In a GUI, all the commands are VISIBLE, as you have said, unlike a CLI where all the commands are usually HIDDEN. CLI's allow for very efficient commands for the power user.

    On a Mac, all of the menu items (along with a clock and a task switcher) are in the menu bar, which eats only about 20 pixels. The commands themselves are hidden until one needs them, and 20 pixels is a trivial amount of screen space.

    Power comes at a price. Why waste valuable screen real estate when the user already knows the commands (or hotkeys)?

    Two reasons. First, like I said, you're only losing a strip about 20 pixels high. More importantly, there are very few power users who actually memorize the keystrokes for every possible operation. A well-designed GUI will provide keystrokes for frequently used ops, but having them in the menu bar makes it easier to learn what they are.

    ONLY when you have a small number of options! Otherwise you have multiple 'tabs' or 'pages' of config options. Ever see a drop-down box with 200 items? That's not elegant. (A country selection is one bad example.)

    What's the alternative? A CLI will likely have a list of country codes, which is going to be every bit as cumbersome. And GUI's can do free-form text completion boxes. In fact, they can do some kind of auto-complete feature, so I can type "bra" and have the "zil" added automatically.

    That DEPENDS on what you are doing.
    Try moving *.bat, *.com, *.exe files into another directory. I could type: move *.bat *.com *.exe progdir. With a mouse that would take you _THREE_ operations.


    It does depend, but even in your example I prefer a GUI. In the Finder, you would switch to list mode, sort by kind, and then rubber-band to select the files. Yes, you'd have to do it 3 times, but you get better feedback, and it still isn't much slower than a CLI. And that's a simplified case. What happens if you want to move several files with different types to a directory several folders away? That requires a lot more typing. (A runnable sketch of the CLI side of this example appears at the end of this post.)

    if you spend a lot of time copying text, do you use the arrow keys to select the text, and then ctrl-c ctrl-v

    You select text with the arrow keys? If it's just a couple of words, I can see that, but if you're selecting whole sentences, arrow keys are slow. Besides, what you just described can be done in a GUI as well as a CLI.

    What I really would love to see is a system that COMBINES the power and intuitiveness of CLI's and GUI's

    Sure, which is why I have a dual-boot machine. When I want to get normal work done, I run Mac OS. When I have coding to do, I boot into Unix. :)
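
    For comparison, the quoted move *.bat *.com *.exe progdir example as a runnable Perl sketch (an illustration only; assumes the core File::Copy module and an existing progdir subdirectory):

        use strict;
        use warnings;
        use File::Copy qw(move);

        # Gather every file matching the three patterns in the current
        # directory, then move each into progdir -- one command's worth
        # of typing for any number of files.
        for my $file (glob '*.bat *.com *.exe') {
            move($file, "progdir/$file") or warn "could not move $file: $!";
        }
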
  • "pretty little icons" are a Microsoft "innovation" that has unfortunately taken over the GUI world. However, if you are taking Windoze as a standard for GUI's, then of course you are going to come to the conclusion that they are useless.

    GUI's have three major advantages over CLI's. The first is that they dramatically reduce the learning curve for new apps. In a good GUI, all possible operations are available using the menu bar. Thus if you want to do something to a piece of data, you select it, and then look through the menus to find the command you want.

    Similarly, GUI's allow elegant setting of configuration options. Command-line switches simply can't match the simplicity and usefulness of a preferences dialog, which doesn't require wading through man pages to use.

    Secondly, GUI's allow for more efficient use of screen real estate and allow more rapid entry of some kinds of data. Windows and scrollbars allow you to fit multiple apps on the screen, and to use as much or as little space as needed for a given window. Doing this with a keyboard would be a nightmare. And imagine web surfing without a mouse.

    Finally, GUI's allow you to do things simply and elegantly which cannot be done without a lot of grief with a CLI. For example, rearranging a file system with a CLI just can't match the simplicity of doing it with a GUI. And graphics and word processing would be a nightmare if we still had to do that without a mouse.

    CLI's have their strengths, but there are some things that they just can't do. It's too bad that Microsoft did such a lousy job implementing their GUI, because not all GUI's are that bad.
  • Quoting sklein's "Re:CLIs vs. GUIs" (http://members.mint.net/sklein/):

    And GUI's can do free-form text completion boxes. In fact, they can do some kind of auto-complete feature, so I can type "bra" and have the "zil" added automatically.

    As can CLI's. I've even done a clumsy form of auto-complete in a 100-line Perl script I use.

    What happens if you want to move several files with different types to a directory several folders away? That requires a lot more typing.

    Not much more typing with a well organized filesystem and a modern Unix shell which supports tab completion. In short, one or two letters and the tab key per directory. And tab completion will do more than just directory names, the example included with the tcsh distribution is enlightening. Some configuration and you could expect that shell to read your mind.

    Sure, with a lot of practice one can make a CLI match the speed of a GUI. But only a small fraction of users would actually have the patience to learn how to do this, and even then CLI's don't have a compelling speed advantage, just not much of a disadvantage.

    Which is why the key repeat rate under Unixes doesn't crawl like under MS Windows and editors like vi(m) rarely require you to actually highlight the text. The vim tutorial (listed in the help) shows how it works.

    I use vi extensively, and with a lot of practice, you can do a lot of things efficiently. But I can select an arbitrary piece of text in a word processor in under 2 seconds. No matter how fast vi is, it isn't going to match that. And this is assuming that we're editing plain text. If one is editing formatted text, then the only way to do it with vi is to use something like HTML, in which case you need to have Netscape open in a separate window to make sure everything looks OK.

    vi is great for editing code, because it gives you precise control over formatting and allows you to work without taking your hands off the keyboard. But for writing papers and such, there's just no comparison with a good word processor.

    And coding isn't "normal" work. *sigh* :)

    By "normal work" I mean writing papers, surfing the web, using email, ICQ, tracking finances, etc. Yes, you can do this in Linux, but most of the tools are GUI's anyway, and most Linux GUI's are attrocious. The Open Source movement has yet to produce a desktop environment that approaches the care, attention to detail, consistency and style that goes into the Mac OS. Linux makes a great server and a decent workstation, but its user interface still needs a lot of work.
  • >Heh, also from a Mac vs PC chicken or egg debate,
    >they said that the Lisa UI was first introduced
    >at the National Computer Convention in 1980,
    >about 5 years before Windows 1 was released.

    Chicken/Egg? Is there anyone who denies that the Lisa was the first graphical personal computer on the market? In any event, aside from Apple, there were a number of companies working on bringing GUIs to personal computers, including GEM, TopView, and the Microsoft/IBM collaboration that was to become OS/2 and Windows after they parted company. (It might have hit the market even sooner if it had not been for their infighting.)

    >Shouldn't windows 1 have been released in the >year 1? Just a question.

    Nyuck nyuck.

    >At any rate, im sure this will get flamed to
    >hell, but at least now we have evidence to set
    >the record straight.

    I ask again, who says differently? A lot of people were actually alive then (it may surprise you to know) and they remember these events pretty clearly. Again, who ever said Windows came first? Yeesh. The flammage is bad enough without making up nonexistent arguments.

    >Ironically enough, now we seem to be moving back
    >towards a CLI, simply for the sheer power that
    >comes with CLI. Gui is like putting a blanket
    >over a puzzle and trying to put the puzzle
    >together by moving the blanket around.

    Well, CLI is certainly never going to be the interface of choice for your average desktop worker, just as manual transmission is not the favorite way of changing gears. The GUI is faster and easier for certain tasks. (I certainly prefer writing in a GUI vs. writing in vi.) The CLI is faster and easier for certain other tasks (e.g. managing a server, importing lists of users and rights). If you're referring to Linux, well, keep in mind that even Linux as it grows will have to encompass more than one interface paradigm.

    The Microsoft method is to (as you said) put a blanket over the top, hide the complexity. I don't think it's emblematic of all approaches to writing a GUI, however. The GUI is one interactive way of accessing functionality, if it's written right. It's only when it begins to get in the way of the functionality (try adding ten users in a row, or setting up a piece of hardware without the wizard) that the GUI becomes a liability.
  • When they say browser here, they don't mean a _web_ browser; instead they mean what became the Finder. The _web_ browser paradigm is based on hypertext within the document. The _document_ browser paradigm is based on a hierarchical file list and simultaneous (or triggered) display of the document.

    Put another way, the document browser lacked the interactivity of a web browser. But it certainly was a precursor; it simply awaited the invention of a viable hypertext system by Tim Berners-Lee et al. to enhance its capabilities. These were all gradual evolutionary steps that had their roots in academic thinktanks long before the average person could make use of them.

    Of course, the beauty of the Lisa was that Apple was actually trying to give this power to the average user.
  • >Guys, what Linux needs is a better approach to it's GUI. It would be great if we
    >could use Mac's code. They should release their code as an open source product.
    >I believe that MAC is taking their user interfaces seriously and that
    >this is something that Linux is missing. Go Open Mac!

    Linux DOES need a better approach to its GUI(s). Let's not oversimplify, though.

    The power of the Macintosh GUI isn't in its code, per se, but in the Apple Human Interface Guidelines [google.com] that were developed in large part as a result of Jobs bringing all those Xerox people over to Cupertino. They were academics and serious about these issues, so they diligently tested and honed their principles. As a result the Mac interface is clean and usable, in many ways just the opposite of the Microsoft marketing-driven Windows GUI, where everybody is supposed to follow the rules but Redmond (MS programmers are notorious for having side-by-side widgets with entirely different interface approaches). The original HIG were excessively rigid (i.e. requiring only ONE way to do something, which meant you couldn't have keyboard shortcuts -- but those were introduced later), but by being based on principles they remained a useful foundation for interface design right up until recently.

    The sad part is that Apple has apparently abandoned these principles; the new MacOS X interface is a graphic designer's wet dream, and a horrifying sight to usability people like Bruce "Tog" Tognazzini [asktog.com], who was on the original team. If you need to know what the new Mac will look like, check out the QuickTime client; the new philosophy seems to be to make every application like a little handheld Sony appliance, common interface elements and mouseable operation be damned.

    Thus, what I believe the Gnome/KDE folks should do is carefully read those Apple Human Interface Guidelines from 15 years ago, and apply as much of that as they can to building a properly usable interface for Linux that doesn't feel like a crazy-quilt mix of styles. I'm not saying it's in bad shape now, but aside from skins and themes, it's far from pretty. A more consistent interface will go a long way toward allowing Linux to creep out of the server market and onto non-hacker desktops.
  • >Ok, this is just nitpicking and doesn't add anything to the discussion, but
    >in most of Europe, manual gear shift is the standard and will remain so for long.

    Yep, I forgot that TWIAVBP. But my original point still holds: what works for one person doesn't necessarily work for another.

    As much as I'm a geek myself, I also know something of design and usability. And the last person I'd want designing my User Interface, GUI or CLI, is an engineer!

    "Sure, see, just type cwer72 -woiuerxcvz and it'll do it right every time. Oh, yeah, it's mnemonic, you know the 1997 top ten hit by the frogmen?"

    :)

  • Apple was right to bulldoze these. They were bad machines in a lot of ways.

    The boxes themselves looked... unconventional. It looked more like a keyboard-driven beige oscilloscope than anything immediately recognizable as a computer. In terms of functionality, the Lisa lacked a lot of commonly desired features which were in demand at the time (heh, like color) and seemed more like a machine that was trying desperately to be unique rather than truly functional. Whatever you _could_ do with one often took a great deal of time to accomplish, and the box itself would crash fairly frequently. Above that, it wasn't abundantly clear to the first-time user how to go about operating one, and why this sort of design was better than the conventional command-line driven concept used in personal computers in common usage at that time.

    I think Apple's main motivation for killing the Lisa was that it would have been a public-relations disaster anyway. Better to drop the curtain on a bad product than to have the public drop the curtain on you. If you're going to make a splash with a new product and a new idea, you don't package it in the form of a failure.

    (FYI, this was in a public library in my home town, in 1984. First GUI I ever saw, that's why I remember it.)


    Bowie J. Poag
    Project Manager, PROPAGANDA For Linux (http://propaganda.themes.org [themes.org])
  • The Self page has moved to http://www.sun.com/research/self/index.html.

    Thanks. I was looking for that URL.

    The project is described as no longer being active, although the last release of Self, 4.1, was last month.

    The Self project is still active, as is the associated self-interest mailing list. There'll also be a Self Hack Weekend in San Francisco later this year.
  • Eeeeek! You're right!!! And to think I committed that huge gaffe in presence of the Master!!! I'm so ashamed! I deserve to be punished!!!!!!

    (Thanks for the correction, Ed.)
  • CLIs only seem "powerful" by comparison because text provides context-free representation, i.e., anything can be expressed in the same way (because the text ontology is streamlined and meta-identical), whereas a graphical representation is by definition more complex and therefore different ontologies must be used for different purposes.

    However, I think that most people's notion of GUIs come from their experience with "traditional" GUIs (Mac, NeXT, Windoze, and the UNIX windowing systems). Many fine people have been working to introduce new paradigms for graphical representation; one such group is the Self gang at Sun Labs and Stanford. Self is an extremely powerful classless, message-passing based OO language, designed around the concept of "programming as experience", which attempts to immerse the user/programmer in a homoiconic, consistent and all-around graspable "world"; according to this philosophy, the Self graphical environment (ported to Squeak Smalltalk as Morphic) is one in which all objects are graphically represented (by way of "morphs"), and in which any object can directly interact with the user in a number of standard ways, having its properties easily accessed or modified. What all this means is that the Self environment is radically different from the traditional GUI, and easily provides at least as much power and flexibility as a CLI.

    Self can be found at http://self.sunlabs.com, IIRC.
  • Hey, Randal. Y'know, since I came to Slashdot in late 1998, I seem to have been attacked by just about every Perl god around except Larry himself. It's quite an honor. So, heil sub schwtr { map { $_->[1] } sort { $_->[0] <=> $_->[0] } map { [$_[0]->($_), $_] } @$_[1] }!

    Back to the subject at hand: the Self GUI does not rely on those annoying, subjectively meaningful, almost unintelligible little picture widgets at all. So you are safe. :)
  • !100 means not 100;

    If you're in a binary system, say 8 bit, then you actually have 11111011 = !100
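
    For what it's worth, here's a minimal C sketch of that 8-bit complement (an illustration only; note that C's logical ! operator would just give 0 here, while the bitwise complement operator is ~):

        #include <stdio.h>

        int main(void) {
            unsigned char x = 0x04;                  /* binary 00000100 */
            unsigned char notx = (unsigned char)~x;  /* bitwise NOT: 11111011 */
            printf("0x%02X\n", notx);                /* prints 0xFB, i.e. 251 */
            return 0;
        }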

    =)

    -AS
  • True. Good GUI apps should have a very good keyboard interface. GUI != WIMP Interface.
    Well, I'd call myself a hardcore Mac fan, at least until something better comes along. I like Aqua, and I like it because I think it's good UI, not just because I like the candy coating. I think the new UI is more consistent and better designed than what the Mac has now. Remember that the existing Mac OS was never designed for running multiple programs at once. That was hacked in later. It was never designed for color. That was hacked in later. It was never designed for large screens (the typical screen these days has 4 times the area of the 512x342 screen that Mac OS was originally designed for). And it was never designed for computers as fast as today's. Much of the flash is just a logical extension of things that have been in Mac OS for years (like window zooming, menu blinking, and shadows under windows).

    Mac OS X is certainly different from Mac OS, and will take some getting used to, but I think it has an excellent UI. Of course I reserve final judgment until I've had a chance to use it for a few weeks.

    --
    I remember being absolutely WOWed by the Lisa when I saw it introduced at the Personal Computer World show in London. This was the first GUI most of us (or anyone outside of Xerox) had ever seen, and it was really mind-blowing.

    Still, as far as its success goes, not only was it really slow, but it also cost around 10,000 pounds (UK), which back then was a LOT of money (my salary as a programmer was 6,000 at that time). I've always thought of the Lisa as more of a prototype for the Mac than the real production machine it was meant to be.

  • But the Lisa had these attributes in its favor... Memory protection/preemptive multitasking/virtual memory. These are exciting new technologies that will finally reach mainstream Mac users in sum with the release of MacOS X.

    NEW?!? ROTFL... I remember buying OS/2 Warp in late 1994 with these features. You can hardly call them NEW technologies...
    --
  • Someone may knock this down as overrated, since I'm taking advantage of my default-setting here, but I'll take that risk in the interest of interest :)

    Anyhow, I followed the link helpfully inserted in this message's parent to blinkenlights [blinkenlights.com] and was amused, impressed, informed, delighted. I recommend that you go there for some interesting, thought-provoking trivia. I like the fact that in answering the question posed on this page ("What was the first personal computer?"), the underlying assumptions about what each of those words means are parsed, and the ambiguity inherent in the question is addressed forthrightly. I cannot guarantee that the answer given on this page is the absolute best one, but it seems well-justified. (And surprising, to me, since I'd never heard of their winner before.)

    Hope someone else enjoys reading it like I did!

    timothy
  • Seeing as the article was originally about Lisa, I thought it might be fitting to add to this that Macs don't really have CLIs in the same respect that DOS incorporates a CLI and bash (one of the more common shells in Linux) is a CLI. I can't say for certain about Lisa, as I've never used one myself, but Macs boot straight to "graphics mode." The closest thing to a CLI on Macs is hardware-based, called OpenFirmware [openfirmware.org] (which was developed by Apple, Sun, and Motorola, I believe). Holding Command-Option-O-F at boot time on a newer Power Mac will drop you to OpenFirmware. The OpenFirmware website says that:
    OpenFirmware is essentially a specification for a largely machine-independent BIOS based on ANS Forth that is capable of probing and initializing plug-in cards
    that have on-board IEEE-1275 compliant Fcode in their ROMs.
    Cool, huh? And you wonder why hardware is a non-issue on Macs in comparison with Windows et al. :)

    Anyway, back to my original point. Macs don't really have a CLI, and while not explicitly stated, the original post implied that GUIs are wholly dependent on a CLI.
  • Try Google [google.com]'s cache. I was getting the same error, so I went to Google, and searched using this term:

    cache:self.sunlabs.com
    You can browse most of the site (minus images, which is sort of a problem in this case), though the interface is a little awkward. For one, Google doesn't modify the links so that you stay within the cache (like Babelfish does), so you have to modify the query every time you want to see another page.
  • It should be noted that 'pretty little icons' mean several things. What you're referring to is Microsoft's 'contribution' to the GUI, and, in large part, what seems to give GUIs a bad rap at least among the /. crowd. Early GUIs had pictures of folders that obviously meant directories or folders. The sheet of paper obviously meant a document. A simple concept was represented simply, clearly and quickly. What you're referring to is the legions of button bars in MS Word/Excel, etc. 16x16 pixel pictures cannot adequately convey the meaning 'Add New Row to table'. They just can't. That's why the early Mac interface did not have any cryptic buttons. MacPaint had the arrow, the paintbrush, the eraser, etc. Simple and obvious. Photoshop is a good example of this properly executed. Common, easily understood tasks have buttons. Anything else has a pulldown menu with a descriptive text pick. A similar example is with CAD software. Where Pro/ENGINEER says Extrude and Revolve, SolidWorks (a Windows app through and through) says 'Picture of Block' and 'Picture of Donut'. Just because most of the world uses a *bad* GUI doesn't mean that GUIs are inherently bad.
  • In late 1981, due to my prior work at the Plato project [thinkofit.com], I was given responsibility for developing an authoring system for the Viewtron videotex network planned for nation-wide deployment by AT&T and Knight-Ridder. At about that time, the cover story for Byte Magazine by Larry Tesler of Xerox PARC was about Smalltalk. [byte.com] Since I had been looking for a decent language upon which to base a network programming environment, Dennis Hall, then technical director of the Viewtron pilot, arranged a trip to Xerox PARC to see a demonstration of their system. We met with the Xerox PARC Smalltalk team in November of 1981.

    We were having difficulty with the standards committee controlling the North American Presentation Level Protocol Syntax [faqs.org] -- the graphics protocol upon which the Viewtron videotex terminals built by Western Electric were based. Specifically, there wasn't enough programmability. The Western Electric terminal was so limited in capacity that we had to fit the graphics interpreter into a very small number of bytes, and could afford only a few thousand bytes of dynamically downloadable store. I had been enamored with Forth ever since the Byte magazine article about it a year or so earlier (my first digital purchase was an HP35, so reverse Polish didn't bother me perhaps as much as it should have). Even so, I was hunting around for options. Jim Thompson, another senior staff member with the Viewtron project, was also interested in Forth -- enough so that he had subscribed to the Forth newsletter, which he shared with me. Jim was supposed to develop a menu system to run on the central system. I had specifically asked that his menu system never achieve Turing Machine equivalence, because I knew what sort of horrors lay in wait for us if it did. Nevertheless, Jim eventually implemented GOLFBAL ("Game Oriented Language for Business And Leisure") -- and it was a Forth derivative. I had rejected Forth as anything but the low-level protocol and engine for the telesoftware graphics system and was fairly horrified to discover what he had done. In any case, it was this immersion in Forth we brought with us to our meeting with the Xerox PARC folks.

    Now, I swear on a stack of bibles that after I met with the PARC folks and discussed the problems of graphics communications, I had no idea the industry could end up being stuck with Postscript as a type-setting standard. I can say this for a certainty because:

    I wanted to see a Novix-style reduction-to-hardware of the Forth virtual machine [bournemouth.ac.uk] so that Forth would become the macro assembly language. Then we could use the Forth silicon machine to start running dynamically downloaded Smalltalk -- or some similar high level language -- compiled for the Forth stack machine which would provide much more powerful graphics specifications than Forth itself.

    I never imagined the Smalltalk guys would actually depart from Smalltalk itself as a graphical specification language.

    By the time the PARC guys spun off Adobe with Postscript and its Forth-like engine, I had become more interested in constraint [washington.edu]/relational [mozart-oz.org] programming semantics than object oriented semantics because it more naturally fits graphics description, distribution, nondeterminism and parallelism not to mention databases.

    It was summer of 1982 when I met with Tesler for the last time -- and he had just left PARC to go work on Lisa. We were sitting in the empty Astrodome, I think it was, next to the convention center where the Commodore 64 was being introduced to the world market as part of the precursor to Comdex. 64K of memory! At any rate, Tesler and I discussed the reason he had abandoned Smalltalk for the Lisa. I had thought that type inference coupled with artful use of assembly language libraries would be sufficient on the Motorola 68000 family, but Tesler was insistent that Object Pascal was necessary for adequate speed. Frankly, I was appalled that Tesler had so easily abandoned Smalltalk with type inference, since he had made specific mention of it as an optimization technique in his Byte article. But in a recent email exchange about this history, he told me type inference was never of much interest to him -- that others at Apple were hooked on Object Pascal.

    The horrifying thing about all this is that when Steve Jobs took off from Apple to found NeXT, instead of correcting the nonsense with Postscript and going straight for Smalltalk with type inference, he repeated the mistake, only this time with Objective-C. Then, as I understand it, Objective-C was the precursor to Java with its reliance on declaration rather than inference for type checking. This despite the fact that Sun already had the Self programming language [iphil.net] in house with type inference and dynamic optimization technologies that realized the potential of Smalltalk at long last. Unfortunately, the only technology to make it bigtime from Sun's Self project was the Hotspot JVM.

    Although these aren't exactly the same mistakes over and over, we're still struggling to get a decent, widely-used dynamically typed language "for everyone" that includes a pure OO library for graphics. Python isn't easily deployable, and although I'm a Perl bigot, even I realize we're unlikely to get Perlscript installed in every browser anytime soon. Anyway, I'm partial to prototype languages like Self when it comes to Smalltalk offspring. I do have hopes for TIBET [technicalpursuit.com] as a way of turning Javascript into a powerful programming system across many platforms -- as outrageous as that sounds. I know Bill Edney and Scott Shattuck were some of the first NeXT hackers, but we can all pray for a swift recovery. This isn't an official announcement or anything -- but Bill and Scott did do a presentation at Hackers so I figure I can mention it in the mode of a "hot rumor".

    As I said, I'm more into constraint/relational stuff these days myself, but it sure would be nice if someone gave the power originally in Smalltalk the ubiquity it deserved almost 20 years ago.

  • I've read the article, and I think the most interesting thing is just how ambitious these guys were. I mean, in 1980, when they started, there wasn't even a processor powerful enough to make all the pretty widgets work. Heh, also, for the Mac vs. PC chicken-or-egg debate, they said that the Lisa UI was first introduced at the National Computer Convention in 1980, about 5 years before Windows 1 was released. Shouldn't Windows 1 have been released in the year 1? Just a question. At any rate, I'm sure this will get flamed to hell, but at least now we have evidence to set the record straight. Ironically enough, now we seem to be moving back towards the CLI, simply for the sheer power that comes with it. A GUI is like putting a blanket over a puzzle and trying to put the puzzle together by moving the blanket around. Just my two bits.


  • I've mirrored the original site here [umd.edu]

  • The big problems with the Lisa were hardware, not software.
    • Apple tried building a hard drive for the Lisa. It was expensive, stored only 20MB, and broke frequently. (The original Mac didn't even have a hard drive, which really sucked.)
    • Motorola introduced the M68000 in 1978, but was years late with the support chips, especially the MMU. The M68000 couldn't handle page faults right, either; instruction continuation was broken. So the designers of the Lisa had to build their own MMU (this took many chips) and avoid using 68000 instructions that needed to be interrupted and continued. This ran the cost up and hurt code density and performance.

    Around the time the Mac was in deep trouble (no hard drive, one floppy, really slow, lousy sales, no laser printer), Jobs killed off the Lisa division. This may have been done to make his Mac project look good.

  • Oh, those instructions... Er, doesn't that include just about every instruction that accesses memory? Doesn't sound practical.

    It was a pain. Here are the gory details, assuming anybody still cares.

    When you get a page fault on a reference to memory on an M68000 from an instruction that increments a register (the corresponding C syntax is x = *p++;), the instruction is aborted but the register incrementation still takes place. So returning control to the same instruction after paging in the desired page, or growing the stack, will increment the register again.

    The fix was to keep the compiler from generating memory-referencing instructions with the increment bit set. The equivalent in C is to rewrite x = *p++; as x = *p; p++;.

    This reduced performance and bloated the code a bit, especially in tight loops. But it worked.
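
    To make the workaround concrete, here is a minimal C sketch (my illustration, not actual Lisa compiler output) of the rewrite in a tight loop; the mnemonics in the comments are the sort of 68000 instructions a compiler would typically emit:

        /* Unsafe on a faulting 68000: the post-increment addressing mode
           (e.g. MOVE.B (A0)+,(A1)+) bumps the address registers even when
           the access faults, so restarting the instruction skips a byte. */
        void copy_fast(char *dst, const char *src, int n) {
            while (n--)
                *dst++ = *src++;
        }

        /* Restart-safe form: the memory access and the pointer increments
           compile to separate instructions (MOVE.B (A0),(A1), then ADDQ.L
           #1 on each pointer), so a fault leaves the registers untouched. */
        void copy_safe(char *dst, const char *src, int n) {
            while (n--) {
                *dst = *src;
                src++;
                dst++;
            }
        }

    The extra increment instructions inside the loop are exactly the code bloat and slowdown described above.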
