
The Future of Human-Computer Interaction 107

ChelleChelle writes "Starting with the Xerox Alto and the Star, ACM Queue briefly covers the history of human-computer interaction from past to present. It doesn't stop there, however. Using a hypothetical situation involving context-awareness in cellphones, the author lays out his thoughts on the future of HCI, offering opinions, advice, and examples."
This discussion has been archived. No new comments can be posted.

  • by ackthpt ( 218170 ) * on Tuesday August 29, 2006 @06:40PM (#16003681) Homepage Journal

    Most of mine consists of:

    • wtf?
    • You #&%*!!
    • #$*@ %&*@!!
    • $$&*^ piece of $*&^#@!
    • Dropped carrier again? You #&^%*@ *#&%&@ pile of $&^@#!
    • [Fist on keyboard] NO! That's not what I meant!
    • [Mouse against wall] &#%*#$@ you Microsoft!
    • Another $#$*@#&%# seven?!?! (when playing Catan and have enough for a City, at last(!) and have 3 other resources)
      • And that's usually on a pretty good day. Right now I'm experiencing a lot more gremlins than usual.

    • by Anonymous Coward on Tuesday August 29, 2006 @06:48PM (#16003719)
      I work at a worldwide software engineering company. We recently had a programmer from China visit to learn how to implement our system over there. He spoke fairly poor English, yet when asked how he was coping he replied:
      Oh, I understand OK, it's all computer terms separated by swear words, same as back home. :D
    • by Anonymous Coward on Tuesday August 29, 2006 @07:07PM (#16003799)
      # wtf? stupid humans!
      # You #&%*!! humans!
      # #$*@ %&*@!! humans!!!
      # $$&*^ piece of $*&^#@! human!!
      # Dropped battery again? You #&^%*@ *#&%&@ pile of $&^@#! human!
      # [Fist on human's head] NO! That's not what I meant!
      # [Human against wall] &#%*#$@ you God!
    • by Memnos ( 937795 )
      I agree with your well-reasoned and properly circumspect comments. I would add some methods that I look to employ, and that I think are the future of HC interaction that we all seek -- fully immersed tactile interaction with relevant emotional components. Some helpful ones are (IMHO), using gravity as an aid to computer responsiveness, sugary soft-drinks applied through an input device (keyboard) to provide the machine with that extra energy boost, grabbing the nearest human and "interfacing" with him/her
    • Top statements made by software developers when their software falls over:

      4. Oh. It's never done that before.
      3. It's an undocumented feature.
      2. #!!*ing Windows!

      And number one (you know it's true):

      1. Strange...

      (blatantly stolen from the usually unfunny Keeper Of Lists)
  • Picasso (Score:1, Funny)

    by mentaldingo ( 967181 )
    As Picasso said, "Good artists borrow from the work of others, great artists steal."

    That was about the only useful information I got FTA.

    Now off to go and steal some artworks...

    • Re: (Score:2, Interesting)

      by ackthpt ( 218170 ) *

      As Picasso said, "Good artists borrow from the work of others, great artists steal."

      That was about the only useful information I got FTA.
      Now off to go and steal some artworks...

      Unfortunately what *I* seem to see is the stealing of a lot of ideas which really don't have that much day-to-day value OR are really bad, annoying ideas. Whereas really good ideas seem to have been lost.

      I really couldn't give a care about a 3D desktop or pretty icons. I really want to know why the heck some task ke

      • by EvanED ( 569694 )
        Where's the log on this alleged operating system?

        Well, there's a system-wide log at Control Panel -> Administrative Tools -> Event Viewer...

        (Whether that has what you're looking for, I dunno. But then again, if an app doesn't leave something there, it probably doesn't log anything on Unix either.)
        • Sorry... just can't help putting in the stupid Linux is great plug...

          In Linux, I'd forget the logs, download the source, and run the darned tool in the debugger. Of course, then I'd fail to fix it (because I'm not THAT good), and even if I did the author would never integrate my patch. Even if he did, the 100 guys who I might have to support would each have to recompile their kernels to get the fix.

          Gotta love it!
    • Picasso said, "Mediocre artists borrow, great artists steal." I think he meant that real advances require an imaginative leap from the artist, which is almost the opposite of how the quote was interpreted here.

      How about "I'm Feeling Much Better Now, Dave: The Future of HCI" instead? Nah... borrowing.
    • by Anonymous Coward on Tuesday August 29, 2006 @08:12PM (#16004081)

      I'm fed up with people misquoting that Picasso line. What he actually said was this:

      <p>Good artists borrow from the work of others.</p><p>Great artists steal.</p>

      (HTML used for emphasis)

      They were two unrelated statements that happened to be said at the same time. Picasso wasn't talking about internalization, plagiarism, or even art theft. Picasso wasn't talking specifically about the everyday kind of theft cat burglars and shoplifters are prosecuted for, either (though it does help maintain credibility to keep a criminal record). Picasso was talking about hucksters and grifters, about con-artists and the IRS: people who convince others to hand over money and offer little in return. A piece of paper that's already got some scribbles on it, some canvas that's already been spoiled, a CD that's filled with 48 minutes of senseless noise. Picasso knew something true artists should never forget: real art is in the money. Anyone can splash a few blobs of paint on a canvas and claim it worthy of wall space. If you can convince someone else it's worthy of their wall space, though... if you can convince them to pay you thousands of dollars for something that took you 25 minutes and three cans of Dulux... then you are an artist. Not before.

      Great artists steal our pocketbook lining, and the pocketbook linings of one another. Don't be fooled by this deceptive philosophy called art, and don't set foot into its seedy underworld. It is criminal stuff; an artist-eat-artist mire of corruption. The RIAA treats artists as the American Mafia treats store owners through protection rackets. Universal/EMG, Sony/BMG, WB treat artists as pimps treat their hos, Britney Spears and Christina Aguilera (arguably the finest, most critically acclaimed artists since N*Sync) treat one another as fighting cocks.

      Mothers, tell your sons to choose lawyering, life, love, and any of the fine sciences over this corrupt influence we call "art".

      This is The Voice Of Fate signing off. Have a Pleasant Evening.

      • by slyvren ( 989423 )
        I whole-heartedly disagree. I personally think he's talking about ideas and inspiration. It's along the lines of David Bowie proclaiming to be a "crafty thief". Your statements must be based upon growing up in modern society. I don't agree with warfare marketing, but that doesn't justify art as being useless corruption. I greatly enjoy art, but it's not like I spend much money on it. I go to a museum now and then, I look at stuff online that people slave over and admire and appreciate it. p.s. It's "P
      • We are all familiar with the life and work of Pablo Picasso.
      • by everett ( 154868 )
        Then I guess the RIAA/MPAA is the greatest artist America has ever seen.
    • All you have to do is get a job at the Hermitage, apparently.
  • Bona fides. (Score:1, Interesting)

    by Anonymous Coward
    How about we start this story off with a proper question. How many here are in the HCI field?

    Now. On your mark. Get set. Post!
    • I have a masters in HCI. Does that count?
    • Re: (Score:3, Interesting)

      by Lux ( 49200 )
      If you're writing software, but don't feel competent to have an informed opinion about HCI-related topics, you need to read up on the topic.

      And I'm not just talking about UIs and user-facing stuff, either. I work on a backend storage system. I have a web browser, a front-end server, and a middle-tier server separating my back-end servers from my end-users, and I still feel that having taken a cogsci class that presented HCI principles well has been invaluable to my job. Examples include solid API design
  • Good article (Score:3, Interesting)

    by bunions ( 970377 ) on Tuesday August 29, 2006 @06:54PM (#16003741)
    I was surprised by this:

    "Smart-phone sales are about 15 percent of the market now (around 100 million units), but with their faster growth should outnumber PCs by 2008."
    • by ackthpt ( 218170 ) *

      "Smart-phone sales are about 15 percent of the market now (around 100 million units), but with their faster growth should outnumber PCs by 2008."

      My cell phone tries to help me by spelling out words as I type in letters. Invariably it doesn't have the option of the word I want, but comes up with some pretty wild stuff I didn't think was in the English language or jargon dictionary. Seems every 'smart' thing I get, I spend a while un-smarting it. Notably these days is my (relatively) new digital SLR which does

      • by bunions ( 970377 )
        my treo autochanges "theyre" to "they're" and similar stuff to prevent me from having to resort to a function key, which I view as pretty smart.
      • by k3vlar ( 979024 )
        Funny you say that, since the autotype on my cellphone seems to be perfect. Sure, I backtrack on a few words, but the time saved doing it this way is greater than pressing 4433555-pause-555666 to spell 'hello' when I only have to type 43556. I also end up with much more coherent English messages, such as "Hello, are we still meeting at 6?" as opposed to "lo r we mtg @ 6 kthxbi". Mine also seems to pick up names in my phonebook too, so odd names like "Heintz van Leeuwen" are actually predicted. Maybe yo
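
The keypress arithmetic in this comment can be sketched directly. A minimal Python sketch, assuming the standard ITU E.161 keypad letter assignments; the dictionary a real predictive engine would consult to resolve ambiguous digit strings is omitted here:

```python
# Multi-tap vs. predictive (T9-style) entry, compared by keypress count.
KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

def multitap_presses(word):
    """Multi-tap: one press per position of the letter on its key."""
    return sum(KEYPAD[LETTER_TO_DIGIT[c]].index(c) + 1 for c in word.lower())

def t9_digits(word):
    """Predictive entry: one digit per letter; a dictionary disambiguates."""
    return ''.join(LETTER_TO_DIGIT[c] for c in word.lower())

print(t9_digits('hello'))        # 43556 — five presses
print(multitap_presses('hello')) # 13 presses with multi-tap
```

Five presses versus thirteen for one word, which is the time saving the commenter describes.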
      • I want to point my cell phone at an ad from a store and get a review of that product, and after deciding to purchase have it determine which store has the product on hand. I want to have my shopping list in the cell phone so that it can direct me to the products once I am in the store. As for the home, I want my computer to know the status of every device in my house and me. I want it to determine if there are any problems with any of those devices or with me, and be able to contact me or if that fails someone
    • Re: (Score:3, Insightful)

      I was more surprised by this "Personal computing launched with the IBM PC." WTF!? Writing about the history of personal computers and they can't even get that right?
    • Re: (Score:1, Insightful)

      by Anonymous Coward
      It's nice that researchers are looking to the future, but HCI has a long way to go as it is. Cell phones are probably one of the best examples we have right now. Almost everyone has one, and they aren't going away any time soon. But by and large, they still suck. We have arguably cool hardware like the Razr and LG's Chocolate and the Treo, but oh man does the software suck. I've yet to meet anyone who doesn't have some complaint about the way their cell phone behaves; for most people, it's a rather lon
  • Wiimote (Score:5, Interesting)

    by Constantine XVI ( 880691 ) <trash.eighty+slashdot@gm a i l . com> on Tuesday August 29, 2006 @07:01PM (#16003773)
    The mouse in its current form is about to be rendered obsolete. With XGL, Quartz3D, AeroGlass, and Looking Glass, we are most assuredly moving towards fully 3D computer environments. As the mouse only moves in two dimensions, it will be time for some change. The Wiimote is perfect for a 3D environment. It also has very little learning curve (as much as the adjustment to 3D will have, anyway).
    • I'm afraid using a Wiimote for navigation for a long time will turn out to be _much_ more painful than using a mouse.
    • Re:Wiimote (Score:5, Insightful)

      by demonbug ( 309515 ) on Tuesday August 29, 2006 @07:20PM (#16003871) Journal
      Of course, for the vast majority of tasks done on a computer there is absolutely no advantage to a fully 3D environment. How does a fully 3d environment help me write a report? Track expenses? Find data (unless the data is 3d)? Unless you are working with 3d datasets (or games), there really isn't much (if any) advantage to a 3d environment.

      I'd be happy to be proved wrong, but I don't see much real advantage for the vast majority of computer users from the examples you gave - or any other 3d environment.

      That said, I've had an opportunity to use a CAVE [wikipedia.org] system for a significant amount of time, and there is no doubt in my mind that a true 3d environment makes it a thousand times easier to work with 3d data - I just don't think that most people would gain anything from this, and trying to force 2d applications into 3d is generally pointless and in fact counterproductive. Makes great eye candy, though.
      • I'll venture that a 3D environment is more natural and intuitive to use, even with 2D documents. After all, writing a document on paper is (more or less) a 2D thing, but we perform that 2D task in a 3D environment.

        Let's say you're researching an essay and are switching between writing your paper and reading multiple reference documents. While you can use a 2D environment for this task, it might be useful (or more intuitive) to move the objects around in 3D. Even though there's little technical difference be
        • less experienced users have a smaller mental leap to get into the "PC space" that geeks are so used to

          If you're living in the industrialized world, you have about twenty years before that sentence is laughable in every social, racial, economic, and cultural category. Twenty years from now it will only apply to children raised by wild dogs. I'm not saying the distinction between computer geeks and casual users will disappear, only that tomorrow's casual user will be pretty savvy by yesterday's standards.

      • by tknd ( 979052 )
        I agree and I think they're going in the wrong direction, even Apple with their multiple desktop implementation.

        One thing that really bugs me today about applications is how I have to spend most of my time organizing windows. Multiple desktops help, but I still find that I spend a lot of my time garbage-collecting windows I haven't used in a while myself. One of the funniest things I saw was in my HCI class where we had to observe users as they used a computer system. We were observing someone and sh
        • by grumbel ( 592662 )

          For example, if I have Outlook open, why can't I bind a keyboard shortcut to focus on that window rather than simply opening a new instance? So say I hit ctrl+alt+O and up comes Outlook if it's already open. If there were multiple instances of that process, then hitting the combination again would simply rotate through each instance like alt+tab and if one instance wasn't open then the program would be started.

          NeXTSTEP and today Mac OS X already do that for the most part, i.e. if you click something it will laun
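
The focus-or-launch-or-cycle binding described above can be simulated without a real window manager. A hedged Python sketch: the window IDs and the launch callable are stand-ins, not any actual WM API, but the selection logic is the behaviour the commenter asks for:

```python
# Simulate a "ctrl+alt+O"-style binding: launch if absent, focus if unfocused,
# cycle through instances if already focused.
def focus_or_launch(windows, focused, launch):
    """Return the window to focus next; launch a new one if none exist.

    windows: window IDs belonging to the app, in stacking order
    focused: currently focused window ID, or None
    launch:  zero-arg callable that starts the app and returns a window ID
    """
    if not windows:
        return launch()                      # not running: start it
    if focused not in windows:
        return windows[0]                    # running but unfocused: raise first
    i = windows.index(focused)
    return windows[(i + 1) % len(windows)]   # already focused: cycle instances

# Pressing the shortcut repeatedly cycles through the open instances:
wins = ['outlook-1', 'outlook-2']
print(focus_or_launch(wins, None, lambda: 'outlook-new'))         # outlook-1
print(focus_or_launch(wins, 'outlook-1', lambda: 'outlook-new'))  # outlook-2
print(focus_or_launch([], None, lambda: 'outlook-new'))           # outlook-new
```

On a real desktop the same effect is often scriptable with window-manager tools, but the tool names and flags vary, so this sketch stays abstract.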

        • by Zoxed ( 676559 )
          I pretty much agree with all you wrote, and would like to add the observation that most of the above can be achieved using the FVWM window manager. However the configuration would be non-trivial, including probably adding some scripts, but I think it would be do-able. It may not help if you do not use a UNIX-type OS, but it would allow you to experiment.

          (So far, out of your list, I have only got as far as forcing most applications to open full-screen, and open on a predetermined Virtual Desktop.)
        • by guinsu ( 198732 )
          I've found Taskbar Shuffle to be the answer to the taskbar order problem:

          http://www.freewebs.com/nerdcave/taskbarshuffle.htm [freewebs.com]

          I just found this after years of saying "Why doesn't windows just do this by default?".
      • Unless you are working with 3d datasets (or games), there really isn't much (if any) advantage to a 3d environment.

        Moreover: how well do most humans perform in 3D? We like to think we are smart, but how many people have problems reading a map upside down (which still is only 2D), finding the shortest route from one point to another in a (complex) building?... All I'm saying is: our brain is adapted to our 2D eyes, there is a good chance '3D computing' will be too difficult for many (most likely includ

    • The mouse in its current form is about to be rendered obsolete.

      Do you have a source you could quote or is this just your forecast? I'm hesitant to believe that, since both the mouse and the screen are 2D. Until we have holographic displays, there's not really three dimensions of anything going on. There just aren't a lot of truly unique 3D applications (not actual software packages but software concepts) that can't easily and intuitively be tackled with a mouse.

      I mean, data gloves have been around now for
      • What I can see rendering the mouse less relevant is improvements in touch screens, in better tablets. Apple applied recently for a strange patent where the screen is a big camera, and a lot of people think this could mean that Apple is going to introduce a "touch" screen that actually uses optical sensors. Wacom had a solution with display and a graphic tablet integrated into one unit, but that was too expensive and never caught on.

        Ah well, my guess is that even if touch-sensitive monitors take off, the mou
    • It was once envisioned that the mouse would be something you held and moved around in space. It was quickly realized that this was annoying and tiring. Now the best tool we have for moving things in 3 dimensions is the SpaceBall [3dconnexion.com]. Oh shit. There goes the planet.
    • by 4D6963 ( 933028 )

      The mouse in its current form is about to be rendered obsolete

      Two points :

      - You'd rather have your hand resting on a table than in the air waving some 3D device
      - Nobody needs a 3D environment

      Just because something new becomes possible doesn't mean we must move on. Newer != Better

    • by osi79 ( 805499 )
      > The mouse in its current form is about to be rendered obsolete. With XGL, Quartz3D, AeroGlass, and Looking Glass, we are most assuredly moving towards fully 3D computer environments.

      A bold statement. Do you have any good arguments that suggest such a change? Most HCI and visualization people would disagree. Shneiderman, for example, says that 3D is only useful where you have data that is best visualized as 3D. He certainly does not suggest navigation in 3D for everyday tasks. First, having so
    • > With XGL, Quartz3D, AeroGlass, and Looking Glass, we are most assuredly moving towards fully 3D computer environments.

      Maybe I am missing something, but none of these provide a 3D environment; they all just *simulate* 3D effects on your 2D screen.
  • It Burns... (Score:1, Funny)

    by Anonymous Coward
    Get that HCl off me...it burns...oh my skin...
  • Hey, baby (Score:2, Funny)

    by Anonymous Coward
    Wanna kill all humans?
  • by rufusdufus ( 450462 ) on Tuesday August 29, 2006 @07:11PM (#16003821)
    I participated in a study done at..a major research lab..that studied the future of speech as a computer interface. The study was done in such a way as to ignore technology limitations and assume a perfectly working speech and AI system. The conclusion of this study was that speech was not a very good interface for most applications, and would remain a niche forever. The gist was that other modalities, most notably direct manipulation, had less cognitive load, lower latencies, caused less cognitive dissonance, and incited less social friction (e.g., there is a reason people text message on their phones) compared to speech.
    As you might expect, these results were never published, but instead replaced by another more..paycheck oriented..paper that extolled the bright future of speech interfaces.
    This article is very similar to the researcher-paycheck oriented paper. It appeals to anecdotal fantasies about technology that don't actually work well in reality. Think about the location-context phone for example; their example sounds nice but is it really useful? How many people walking around San Francisco would actually be helped by a "dinner, taxi, rapid transit" menu on their cell phone? Even if it was useful in theory, in practice that list would simply be more advertising spam intruding into your life.
    • How about a brain-controlled system. You know, like how the primitive versions today require a huge-ass cable to the brain, something of that variety.

      I also wonder how Lisp Machines would have worked on HCI, specifically for programmers.
    • In another Queue article, IBM researchers published the difficulties they encountered while trying to determine "context". Good example is someone using PowerPoint. Heuristic was that if someone was talking and PPT was active, they were presenting to a group. Except for the fact that someone might be talking with PPT active but merely rehearsing. Or one person could be reading through slides while someone else was talking, but that didn't necessarily mean that the person was reading and could be interrupted
    • Scotty: Computer. Computer?
      *Bones hands him a mouse and he speaks into it*
      Scotty: Hello, computer.
      Dr. Nichols: Just use the keyboard.
      Scotty: Keyboard. How quaint.
      • by NateTech ( 50881 )
        I always thought that scene sucked. Scotty can fix the Enterprise with duct tape and baling wire and knows systems like the back of his hand, but he can't see the keyboard sitting on the desk, nor does he recognize there's not a microphone on the mouse?

        (Ahem... bullshit... ahem...)
        • I don't want to start a "Star Trek" flame war, as I am not a ST geek to know everything about everything in the series (I do enjoy it though), but I think that scene isn't too far from what might actually occur.

          Think of it this way: What Scotty knows about engineering and such is in the context of his time period, which in that movie was pretty far displaced from "now". True, he probably does know of older computer technology, but most of his day-to-day knowledge centers around the technology available to h

          • by NateTech ( 50881 )
            Humbug.

            Any Engineer that could fix a starship would know what a damn microphone looked like, even if it came from a number of centuries before. Physics doesn't change.

            The writers put it in for a laugh and to be entertaining, which ultimately of course, IS the point of any movie.

            But it was totally unrealistic. Maybe that's good in the context of entertainment.
    • The "context-aware" phone described in the article would be a nightmare to use. It's basically got a bunch of different modes (contexts, same thing) that it selects automatically based on what it thinks you're doing. I guarantee it will be wrong all the time, and then you will have to figure out how to manually switch to a different mode to get the option you want. The best user interfaces *reduce* the number of possible modes of the system and strive for predictability. Automatic switching between hund
    • Re: (Score:2, Insightful)

      You're wrong. We don't need AI or awesome speech recognition technology. What we have is enough.

      I blew out the ulnar nerves in my elbows... not as bad as carpal tunnel syndrome, but enough to force me to write code by voice for three years instead of typing. For example, the initial version of HDL Analyst in Synplify was written almost entirely by voice. By the end I'd written over 1,600 emacs macros that I could speak to help me do my job.

      So, I'm not completely unfamiliar with voice interfaces to compu
    • I think you're absolutely right; the article (yes, I RTFA) contains quite a lot of the ghost of good old-fashioned AI (GOFAI).

      I am working on a degree in HCI, and I am getting more and more annoyed at the lack of serious critical thinking in the field. Instead, we're getting buzzwords from marketing departments. Terms like "smart"-phones, context-"aware" computing, computers "perceiving" the world etc. sound to me like unsuccessful AI from the '50s reborn in new shiny clothes.

      The problem with articles

  • Once some less clunky human-interface devices arrive on the scene, the possibilities for new computer games and genres will open up.
    -bendodge
    • by ackthpt ( 218170 ) *

      Once some less clunky human-interface devices arrive on the scene, the possibilities for new computer games and genres will open up.

      I must say, I'm a Catan fan and like the jsettlers (Java implementation you can get off SourceForge) interface. Easy to read, simple to use.

      So along comes MSN with big press release from Mayfair and MSN that they're going to develop an online version which will be added to the stable of games on MSN. I tried it. Gawd. Clunky to the extreme. There's a decent example out there of a

  • by noidentity ( 188756 ) on Tuesday August 29, 2006 @07:27PM (#16003900)
    Will the future of HCI

    Next

    be a bunch of dialogs

    Next

    or will it be one page?

    Next

    Ahh, here we go:

    http://www.acmqueue.com/modules.php?name=Content&pa=printer_friendly&pid=402&page=1 [acmqueue.com]
  • by Rockinsockindune ( 956375 ) on Tuesday August 29, 2006 @07:56PM (#16004025) Journal
    You're sitting in a strip club, at 3:00 a.m. on a Wednesday night, you look down at your phone and it has three buttons that say:
    1. Cab
    2. 24-hour drive-throughs
    3. Divorce Lawyer
    • Re: (Score:3, Insightful)

      by saridder ( 103936 )
      With GPS, your phone could at least have "location awareness" to know you are in a strip club.
  • Natural Language (Score:4, Informative)

    by Baldrson ( 78598 ) * on Tuesday August 29, 2006 @08:03PM (#16004048) Homepage Journal
    There's a good chance for natural language interfaces to computers given recent theoretical and practical breakthroughs.

    Until recently there was no rigorous metric for the power of a natural language understanding system, but that has changed with The Hutter Prize for Lossless Compression of Human Knowledge [hutter1.net]. Since the introduction of the Hutter Prize here at Slashdot [slashdot.org] there has already been as much progress as ordinarily occurs in a year (actually a bit more, since an average year sees 3% progress in compression of natural language and the current contestants may have already achieved a 4% improvement since the /. announcement [hutter1.net]).

    The theory is simple enough and the mathematical proof has been done: If you can sufficiently compress a large, general-knowledge natural language corpus like Wikipedia, you can competently articulate and understand natural language.

    It's a hard problem but with the metric and the prize competition driving progress there's a good chance human-level [newsvine.com] understanding of natural language will start to emerge within the next few years.

    BTW: This revolutionizes software development in more ways than one. Think about it like this: When Alan Kay first dreamt of Smalltalk, he was dreaming of a system anyone could program. Well, if you can just say what you want and the system is good enough at comprehending you, program specification just became very natural -- natural enough that your child could perform feats of programming not practical with corporate teams of software developers before.
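
The compression-as-understanding idea can be illustrated, very crudely, with an off-the-shelf compressor. This is only a sketch of the metric's shape, not of Hutter Prize-grade modeling: zlib captures only shallow regularities, but it already compresses structured English far better than letter noise, which is the direction of the claim:

```python
# Compression ratio as a (weak) proxy for how much structure a model captures.
import random
import zlib

def compressed_fraction(text: str) -> float:
    """Compressed size divided by raw size; lower = more structure captured."""
    raw = text.encode('utf-8')
    return len(zlib.compress(raw, 9)) / len(raw)

english = "the quick brown fox jumps over the lazy dog " * 40
random.seed(0)  # reproducible "noise" text of the same length
noise = ''.join(random.choice('abcdefghijklmnopqrstuvwxyz ')
                for _ in range(len(english)))

# Structured text compresses far better than noise: the compressor has, in a
# weak sense, modeled its regularities. Stronger models compress further.
print(compressed_fraction(english) < compressed_fraction(noise))  # True
```

The Hutter Prize metric is the same quantity taken seriously: the better your model of the corpus, the fewer bits you need to encode it.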

    • I was very intrigued by your assertion:

      The theory is simple enough and the mathematical proof has been done: If you can sufficiently compress a large, general-knowledge natural language corpus like Wikipedia, you can competently articulate and understand natural language.

      Is there any reference where this connection between text compression and understanding natural language is better explained for the layperson?

    • I dunno, it seems to me that we do philosophy in natural language, but that doesn't stop the majority of people from being really bad at it. I think that some things are just hard to describe, because they require more precision than a human is used to thinking with. Now, you could just say, "Oh, well then just let the computer decide for itself the best implementation method..." but at that point, you've gone beyond just semantically understanding a text to having the artificial intelligence to creatively
    • by foqn1bo ( 519064 )
      I've been reading through the references you've given (including the rationale for the Hutter Prize) and I'm still a little fuzzy on the purpose of this competition. Granted, my training is in Theoretical Linguistics and not Math, so I may be misunderstanding some key points. If you have a maximally compressed corpus of linguistic data, it makes sense that this would effectively act as an extremely condensed version the "rules" that govern the patterns in that data. Not all compression operates on the s
      • There clearly are two very different ideas of how to go about attacking natural language systems, and the Hutter Prize doesn't decide which contestants should use:
        1. Human generated language models.
        2. Machine generated language models.

        There has been a lot of work on human generated language models and some work on machine generated language models. What has been lacking from both projects, has been a single standard for comparison so that however one approaches the problem, the result can be judged. Well, t

  • by neax ( 961176 ) on Tuesday August 29, 2006 @08:13PM (#16004087)

    Most of the new and upcoming technology that people *like* to talk about is all the fluff. This is what marketing executives use to sell their products. The reality is that it amounts to nothing more than eye candy. People are attracted by good-looking things and great-sounding features, but ultimately stay with something because of ease of use and performance.

    People appreciate things that look nice, like buying a new car...if it is a really great-looking car, it is great for the first week, but if it is slow, drives like a dog and guzzles the fuel (a little heavy on the resources, methinks...) then ultimately you end up hating the car, and if there are better options available you tend to go and find something better, because ultimately we derive more satisfaction from a car that performs well....looks alone do not do it.

    Too much candy will give you a fat head. People are initially drawn into things that look nice because they are visually appealing and easy to figure out how to use. But once you know how to use it, you then want to cut the bull and find out the fastest and easiest way to do it.

    • by osi79 ( 805499 )
      It's not that "useful" and "good-looking" are alternatives. Good technology is both good-looking and useful. Why must a useful device look ugly? If the designer adheres to "form follows function", it should be possible to have a device that is both useful and elegant.
      • by neax ( 961176 )
        Good point. Unfortunately I think that technology should be made useful and easy to use first, and then good-looking afterwards. We unfortunately seem to do things the other way around...
  • take these fancy UIs and use them to control a calculator, and then decide if it's right for the job.

    "Right for the Job" is the key phrase.

    There are three primary UIs:

    the command line (CLI)

    the Graphical User Interface (GUI)

    and the side-door port used to tie functionality together, known by many different names but in essence an Inter-Process Communication port (IPC).

    Together they are like the primary colors of light or paint: take away one and you greatly limit what users can do for themselves.

    But if they a
  • From previous things I have read about the Xerox Alto, I thought that it had a rather rudimentary GUI. There were only pop-up menus and windows, which could overlap, but no pull-down menus and no drag and drop. Files and folders were still shown as text (not icons) in a window. I haven't actually seen an Alto, but maybe there's someone here who can verify whether I'm right about it.

    The author calls the Macintosh a "close clone" of the Alto, but I thought it was supposed to be a massive improvement over the Alto design.
    • "The author calls the Macintosh a "close clone" of the Alto but I thought it was supposed to be a massive improvement over the Alto design."

      Yes, you are absolutely right. But somehow Bill Gates managed to convince people that it was OK to copy the Mac GUI because "Apple stole it first".

      Before Apple, the GUI looked something like this [bitsavers.org] (PDF). No bar with drop-down menus, no icons for files (instead icons for actions), no desktop metaphor, no trashcan. Maybe Steve and his team got to see an even more low-ke

    • by saddino ( 183491 )
      The Alto did not have true overlapping windows. However, Bill Atkinson saw the Smalltalk demo, assumed that the windows were indeed overlapping, and then invented Quickdraw regions to take care of it.

      From folklore.org [folklore.org]:
      Smalltalk has no Finder, and no need for one, really. Drag-and- drop file manipulation came from the Mac group, along with many other unique concepts: resources and dual-fork files for storing layout and international information apart from code; definition procedures; drag-and-drop system ex
  • Interaction with hydrochloric acid will only result in sudden death.
  • I'd say the next major leap in human-computer interface will most likely involve us boning our computers.
  • by euxneks ( 516538 ) on Wednesday August 30, 2006 @12:23AM (#16005171)
    I'm certain it was posted here before for some other article, but I fail to see posts for this article that link to this [nyu.edu]?

    Multi-touch interface is where it's at. Personally, I wouldn't want to talk to my computer to get it to do something; the processing power can be put to much better use calculating a cure for cancer. Plus, it's an annoyance to everyone around. Sure, it's cool in that Star Trek kind of way, but for constant use? No way, man. It would suck having to tell your computer what to do, really.
    Think about it, late at night, you want the computer to do something: "Find me porn" "Honey? What are you doing?", uh, yeah, that's not gonna fly.
    Playing games: "shoot that creature there! Damnit! This is much too slow!".
    Or programming: "int main bracket argh.. no. Delete. No. Delete del- ah screw it."
    Or word processing: "Dear mom, fix aunt, delete that, delete that, delete that, select all, I think it's picking, I think it's picking up a bit of echo here, delete- select all" yeah, no. I think I'll stick with my multi-touch interface.

    Multi-touch is completely natural and has virtually no learning curve. We all have fingers or limbs, or feet, or noses or whatever else you use to interact with things, and a multi-touch interface takes that into account. Plus, information has traditionally been shown on a screen. How often do you hear the story of the newb trying to use the screen as a touch interface?

  • What struck me in the article was the option of optical mouse interaction with the phone, something we missed as an application for the optical Nokia mouse [slashdot.org] project. This would hugely increase the usability of my phone; the joystick thingy gets crudified and unresponsive very quickly.

    Voice recognition, however, is a while from being usable, particularly where strong regional accents exist in a market (though Friends etc. are quickly disposing of these).

    And finally, the opportunity to charge spammers per messag

  • Starting with the Xerox Alto and the Star, ACM Queue briefly covers the history of human-computer interaction from past to present. It doesn't stop there, however. Using a hypothetical situation involving context-awareness in cellphones, the author lays out his thoughts on the future of HCI, offering opinions, advice, and examples.

    What I really hate about these musings about the future is the tendency to always make it a push for unwelcome, mind-bending crap, like adverts. Don't these people realize that ad
  • I want the window I'm looking at to gain focus, so that if I start typing while looking at it, the text appears where I am looking. This of course should be easy to turn on and off; for instance, it would be bad when looking at one window and paraphrasing it in another.
    • I hear ya, but really, imagine the chaos it would cause given the relatively different sensitivities of tracking devices. Great on your machine, but if you go to the library where it's calibrated differently, your eyeball tracking might be all over the place.
  • Hah. Human-computer interaction. Screen: "Firefox has caused a serious error and will now close." Me: Wtf. *opens iTunes* Screen: "iTunes has caused a serious error and will now close." Me: "..." Screen: "Windows has caused a serious error and will now close. Would you like to send an error report?" Me: "Sure." *starts typing* "I...can't...wait...for...better...Linux...support ." Screen: "...ouch."
