The Future of Human-Computer Interaction 107
ChelleChelle writes "Starting with the Xerox Alto and the Star, ACM Queue briefly covers the history of human-computer interaction from past to present. It doesn't stop there, however. Using a hypothetical situation involving context-awareness in cellphones, the author lays out his thoughts on the future of HCI, offering opinions, advice, and examples."
Much of my Human-Computer-Interaction (Score:5, Funny)
Most of mine consists of:
And that's usually on a pretty good day. Right now I'm experiencing a lot more gremlins than usual.
It's the global IT language (Score:5, Funny)
Oh, I understand, OK: it's all computer terms separated by swear words, same as back home.
Future of your Human-Computer-Interaction (Score:5, Funny)
# You #&%*!! humans!
# #$*@ %&*@!! humans!!!
# $$&*^ piece of $*&^#@! human!!
# Dropped battery again? You #&^%*@ *#&%&@ pile of $&^@#! human!
# [Fist on human's head] NO! That's not what I meant!
# [Human against wall] &#%*#$@ you God!
Re: (Score:1)
This rings a bell... (Score:2)
4. Oh. It's never done that before.
3. It's an undocumented feature.
2. #!!*ing Windows!
And number one (you know it's true):
1. Strange...
(blatantly stolen from the usually unfunny Keeper Of Lists)
Picasso (Score:1, Funny)
That was about the only useful information I got FTA.
Now off to go and steal some artworks...
Re: (Score:2, Interesting)
That was about the only useful information I got FTA.
Now off to go and steal some artworks...
Unfortunately, what *I* seem to see is the stealing of a lot of ideas which really don't have that much day-to-day value OR are really bad, annoying ideas, whereas really good ideas seem to have been lost.
I really couldn't give a care about a 3D desktop or pretty icons. I really want to know why the heck some task ke
Re: (Score:1)
Well, there's a system-wide log at Control Panel -> Administrative Tools -> Event Viewer...
(Whether that has what you're looking for, I dunno. But then again, if an app doesn't leave something there, it probably doesn't log anything on Unix either.)
Re: (Score:1)
In Linux, I'd forget the logs, download the source, and run the darned tool in the debugger. Of course, then I'd fail to fix it (because I'm not THAT good), and even if I did the author would never integrate my patch. Even if he did, the 100 guys who I might have to support would each have to recompile their kernels to get the fix.
Gotta love it!
Re: (Score:2)
How about "I'm Feeling Much Better Now, Dave: The Future of HCI" instead? Nah... borrowing.
Great artists Just Steal (Score:4, Informative)
I'm fed up with people misquoting that Picasso line. What he actually said was this:
(HTML used for emphasis)
They were two unrelated statements that happened to be said at the same time. Picasso wasn't talking about internalization, plagiarism, or even art theft. Picasso wasn't talking specifically about the everyday kind of theft cat burglars and shoplifters are prosecuted for, either (though it does help maintain credibility to keep a criminal record). Picasso was talking about hucksters and grifters, about con artists and the IRS: people who convince others to hand over money and offer little in return. A piece of paper that's already got some scribbles on it, some canvas that's already been spoiled, a CD that's filled with 48 minutes of senseless noise. Picasso knew something true artists should never forget: real art is in the money. Anyone can splash a few blobs of paint on a canvas and claim it worthy of wall space. If you can convince someone else it's worthy of their wall space, though... if you can convince them to pay you thousands of dollars for something that took you 25 minutes and three cans of Dulux... then you are an artist. Not before.
Great artists steal our pocketbook lining, and the pocketbook linings of one another. Don't be fooled by this deceptive philosophy called art, and don't set foot into its seedy underworld. It is criminal stuff; an artist-eat-artist mire of corruption. The RIAA treats artists as the American Mafia treats store owners through protection rackets. Universal/EMG, Sony/BMG, WB treat artists as pimps treat their hos, Britney Spears and Christina Aguilera (arguably the finest, most critically acclaimed artists since N*Sync) treat one another as fighting cocks.
Mothers, tell your sons to choose lawyering, life, love, and any of the fine sciences over this corrupt influence we call "art".
This is The Voice Of Fate signing off. Have a Pleasant Evening.
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Not only is it illegal, it is immoral (Score:2, Funny)
Re: (Score:1)
Bonifieds. (Score:1, Interesting)
Now. On your mark. Get set. Post!
Re: (Score:1)
Re: (Score:3, Interesting)
And I'm not just talking about UIs and user-facing stuff, either. I work on a backend storage system. I have a web browser, a front-end server, and a middle-tier server separating my back-end servers from my end-users, and I still feel that having taken a cogsci class that presented HCI principles well has been invaluable to my job. Examples include solid API design
Good article (Score:3, Interesting)
"Smart-phone sales are about 15 percent of the market now (around 100 million units), but with their faster growth should outnumber PCs by 2008."
Re: (Score:1)
"Smart-phone sales are about 15 percent of the market now (around 100 million units), but with their faster growth should outnumber PCs by 2008."
My cell phone tries to help me by spelling out words as I type in letters. Invariably it doesn't have the word I want, but comes up with some pretty wild stuff I didn't think was in the English language or jargon dictionary. Seems every 'smart' thing I get, I spend a while un-smarting it. Notable these days is my (relatively) new digital SLR which does
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:1, Insightful)
Wiimote (Score:5, Interesting)
Re: (Score:1)
Re:Wiimote (Score:5, Insightful)
I'd be happy to be proved wrong, but I don't see much real advantage for the vast majority of computer users from the examples you gave - or any other 3d environment.
That said, I've had an opportunity to use a CAVE [wikipedia.org] system for a significant amount of time, and there is no doubt in my mind that a true 3d environment makes it a thousand times easier to work with 3d data - I just don't think that most people would gain anything from this, and trying to force 2d applications into 3d is generally pointless and in fact counterproductive. Makes great eye candy, though.
Re: (Score:2)
Let's say you're researching an essay and are switching between writing your paper and reading multiple reference documents. While you can use a 2D environment for this task, it might be useful (or more intuitive) to move the objects around in 3D. Even though there's little technical difference be
Re: (Score:2)
If you're living in the industrialized world, you have about twenty years before that sentence is laughable in every social, racial, economic, and cultural category. Twenty years from now it will only apply to children raised by wild dogs. I'm not saying the distinction between computer geeks and casual users will disappear, only that tomorrow's casual user will be pretty savvy by yesterday's standards.
Re: (Score:1)
One thing that really bugs me today about applications is how I have to spend most of my time organizing windows. Multiple desktops help, but I still find that I spend a lot of my time manually garbage-collecting windows I haven't used in a while. One of the funniest things I saw was in my HCI class, where we had to observe users as they used a computer system. We were observing someone and sh
Re: (Score:2)
NeXTSTEP, and today Mac OS X, already do that for the most part, i.e., if you click something it will laun
Re: (Score:1)
(So far, out of your list, I have only got as far as forcing most applications to open full-screen, and open on a predetermined Virtual Desktop.)
Re: (Score:2)
http://www.freewebs.com/nerdcave/taskbarshuffle.h
I just found this after years of saying "Why doesn't Windows just do this by default?".
Re: (Score:2)
Moreover: how well do most humans perform in 3D? We like to think we are smart, but how many people have problems reading a map upside down (which is still only 2D), or finding the shortest route from one point to another in a (complex) building? All I'm saying is: our brain is adapted to our 2D eyes, and there is a good chance '3D computing' will be too difficult for many (most likely includ
Re: (Score:2)
Do you have a source you could quote, or is this just your forecast? I'm hesitant to believe that, since both the mouse and the screen are 2D. Until we have holographic displays, there's not really a third dimension to anything going on. There just aren't a lot of truly unique 3D applications (not actual software packages, but software concepts) that can't easily and intuitively be tackled with a mouse.
I mean, data gloves have been around now for
Re: (Score:2)
Ah well, my guess is that even if touch-sensitive monitors take off, the mou
Re: (Score:1)
Re: (Score:2)
The mouse in its current form is about to be rendered obsolete
Two points :
-You'd rather have your hand resting on a table than waving some 3D device in the air
-Nobody needs a 3D environment
Just because something new becomes possible doesn't mean we must move on. Newer != Better
Re: (Score:1)
Re: simulated 3D (Score:1)
Maybe I am missing something but none of these provide a 3D environment; they all just *simulate* 3D effects on your 2D screen.
It Burns... (Score:1, Funny)
Re: (Score:2)
Hey, baby (Score:2, Funny)
Speech is not the future (Score:5, Interesting)
As you might expect, these results were never published, but were instead replaced by another, more... paycheck-oriented... paper that extolled the bright future of speech interfaces.
This article is very similar to that researcher-paycheck-oriented paper. It appeals to anecdotal fantasies about technology that don't actually work well in reality. Think about the location-context phone, for example; it sounds nice, but is it really useful? How many people walking around San Francisco would actually be helped by a "dinner, taxi, rapid transit" menu on their cell phone? Even if it were useful in theory, in practice that list would simply be more advertising spam intruding into your life.
Re: (Score:2)
I also wonder how Lisp Machines would have worked on HCI, specifically for programmers.
Speech is not the future (and neither is Context) (Score:2)
Keyboard. How quaint. (Score:3, Funny)
*Bones hands him a mouse and he speaks into it*
Scotty: Hello, computer.
Dr. Nichols: Just use the keyboard.
Scotty: Keyboard. How quaint.
Re: (Score:2)
(Ahem... bullshit... ahem...)
Ever seen a punched card...? (Score:2)
Think of it this way: What Scotty knows about engineering and such is in the context of his time period, which in that movie was pretty far displaced from "now". True, he probably does know of older computer technology, but most of his day-to-day knowledge centers around the technology available to h
Re: (Score:2)
Any engineer who could fix a starship would know what a damn microphone looked like, even if it came from a number of centuries before. Physics doesn't change.
The writers put it in for a laugh and to be entertaining, which ultimately of course, IS the point of any movie.
But it was totally unrealistic. Maybe that's good in the context of entertainment.
Agreed (Score:2)
Re: (Score:2, Insightful)
I blew out the ulnar nerves in my elbows... not as bad as carpal tunnel syndrome, but enough to force me to write code by voice for three years instead of typing. For example, the initial version of HDL Analyst in Synplify was written almost entirely by voice. By the end I'd written over 1,600 emacs macros that I could speak to help me do my job.
So, I'm not completely unfamiliar with voice interfaces to compu
Re: (Score:1)
I think you're absolutely right; the article (yes, I RTFA) contains quite a lot of the ghost of good, old-fashioned AI (GOFAI).
I am working on a degree in HCI, and I am getting more and more annoyed at the lack of serious critical thinking in the field. Instead, we're getting buzzwords from marketing departments. Terms like "smart"-phones, context-"aware" computing, computers "perceiving" the world, etc. sound to me like unsuccessful AI from the '50s reborn in new shiny clothes.
The problem with articles
Computer Games (Score:1)
-bendodge
Re: (Score:1)
Once some less clunky human-interface devices arrive on the scene, the possibilities for new computer games and genres will open up.
I must say, I'm a Catan fan and like the JSettlers (the Java implementation you can get off SourceForge) interface. Easy to read, simple to use.
So along comes MSN with big press release from Mayfair and MSN that they're going to develop an online version which will be added to the stable of games on MSN. I tried it. Gawd. Clunky to the extreme. There's a decent example out there of a
Only one question: (Score:3, Funny)
Next
be a bunch of dialogs
Next
or will it be one page?
Next
Ahh, here we go:
http://www.acmqueue.com/modules.php?name=Content&
Re: (Score:2)
Context this... (Score:3, Funny)
Re: (Score:3, Insightful)
Natural Language (Score:4, Informative)
Until recently there was no rigorous metric for the power of a natural language understanding system but that has changed with The Hutter Prize for Lossless Compression of Human Knowledge [hutter1.net]. Since the introduction of the Hutter Prize here at Slashdot [slashdot.org] there has already been as much progress as ordinarily occurs in a year (actually a bit more since an average year progresses 3% in compression of natural language and the current contestants may have already achieved 4% improvement since the /. announcement [hutter1.net]).
The theory is simple enough and the mathematical proof has been done: If you can sufficiently compress a large, general-knowledge natural language corpus like Wikipedia, you can competently articulate and understand natural language.
It's a hard problem but with the metric and the prize competition driving progress there's a good chance human-level [newsvine.com] understanding of natural language will start to emerge within the next few years.
BTW: This revolutionizes software development in more ways than one. Think about it like this: when Alan Kay first dreamt of Smalltalk, he was dreaming of a system anyone could program. Well, if you can just say what you want and the system is good enough at comprehending you, program specification just became very natural -- natural enough that your child could perform feats of programming not practical for corporate teams of software developers before.
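The compression-as-understanding idea above can be illustrated with a toy sketch (my own illustration, not taken from the Hutter Prize materials): a compressor is, in effect, a predictive model of its input, so text with learnable structure compresses far better than structureless noise. Here zlib stands in for a real language model, which would capture far deeper regularities than repeated phrases.

```python
import random
import zlib

def compression_ratio(text: str) -> float:
    """Compressed size over raw size: lower means more structure was modeled."""
    raw = text.encode("utf-8")
    return len(zlib.compress(raw, level=9)) / len(raw)

# Highly redundant English vs. uniformly random characters of the same length.
english = "the quick brown fox jumps over the lazy dog " * 200
rng = random.Random(0)
noise = "".join(rng.choice("abcdefghijklmnopqrstuvwxyz 0123456789")
                for _ in range(len(english)))

# The English text compresses to a small fraction of its size;
# the noise stays close to incompressible.
print(compression_ratio(english) < compression_ratio(noise))  # True
```

The Hutter Prize applies the same principle at scale: the better your model of Wikipedia's content, the fewer bits you need to encode it.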
Re: (Score:1)
I was very intrigued by your assertion:
Is there any reference where this connection between text compression and understanding natural language is better explained for the layperson?
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Word Perplexity Measures (Score:2)
There has been a lot of work on human-generated language models and some work on machine-generated language models. What has been lacking from both is a single standard for comparison, so that however one approaches the problem, the result can be judged. Well, t
Too much Eye Candy makes your head Fat (Score:3, Insightful)
Most of the new and upcoming technology that people *like* to talk about is all fluff. This is what marketing executives use to sell their products. The reality is that it amounts to nothing more than eye candy. People are attracted by good-looking things and great-sounding features, but ultimately stay with something because of ease of use and performance.
People appreciate things that look nice. Buying a new car is like that: if it is a really great-looking car, it is great for the first week, but if it is slow, drives like a dog, and guzzles fuel (a little heavy on the resources, methinks...), then ultimately you end up hating the car. If there are better options available you tend to go and find something better, because ultimately we derive more satisfaction from a car that performs well. Looks alone do not do it.
Too much candy will give you a fat head. People are initially drawn into things that look nice because they are visually appealing and easy to figure out how to use. But once you know how to use it, you then want to cut the bull and find out the fastest and easiest way to do it.
Re: (Score:1)
Re: (Score:1)
Apply it to a calculator... (Score:2)
"Right for the Job" is the key phrase.
There are three primary UIs:
the command line (CLI)
the Graphical User Interface (GUI)
and the side-door port used to tie functionality together, known by many different names but in essence an Inter-Process Communication (IPC) port
Together they are like the primary colors of light or paint: take away one and you greatly limit what the user can do for themselves.
But if they a
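The "side door" IPC idea in the comment above can be sketched in Python (an illustrative stand-in; the comment names no particular mechanism): one process drives another over stdin/stdout pipes, so a script, a GUI, or another tool can all reuse the same functionality a CLI exposes.

```python
import subprocess
import sys

# One process drives another over stdin/stdout pipes -- a minimal IPC
# "side door" that lets any front-end reuse CLI-style functionality.
# The child here is a trivial word-sorting "tool" run via the interpreter.
proc = subprocess.run(
    [sys.executable, "-c",
     "import sys; print(' '.join(sorted(sys.stdin.read().split())))"],
    input="banana apple cherry",
    capture_output=True,
    text=True,
)
print(proc.stdout.strip())  # apple banana cherry
```

This is the composition the comment is pointing at: because the tool reads one port and writes another, the GUI, the CLI, and automation scripts all get the same behavior for free.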
The Xerox Alto / Macintosh comparison (Score:2)
The author calls the Macintosh a "close clone" of the Alto, but I thought it was supposed to be a massive improvement over the Alto.
Re: (Score:2)
Yes, you are absolutely right. But somehow Bill Gates managed to convince people that it was OK to copy the Mac GUI, because "Apple stole it before".
Before Apple, the GUI looked something like this [bitsavers.org] (PDF). No bar with drop-down-menus, no icons for files, instead rather for actions, no desktop metaphor, no trashcan. Maybe Steve and his team got to see an even more low ke
Re: (Score:2)
Re: (Score:2)
From folklore.org [folklore.org]:
Smalltalk has no Finder, and no need for one, really. Drag-and-drop file manipulation came from the Mac group, along with many other unique concepts: resources and dual-fork files for storing layout and international information apart from code; definition procedures; drag-and-drop system ex
HCl (Score:1)
Re: (Score:2)
If history is a reliable indicator... (Score:2, Insightful)
Re: (Score:2, Informative)
Speech? Where's the multi-touch? (Score:3, Interesting)
Multi-touch interface is where it's at. Personally, I wouldn't want to talk to my computer to get it to do something; the processing power can be put to much better use calculating a cure for cancer. Plus, it's an annoyance to everyone around. Sure, it's cool in that Star Trek kind of way, but for constant use? No way, man. It would suck having to tell your computer what to do, really.
Think about it, late at night, you want the computer to do something: "Find me porn" "Honey? What are you doing?", uh, yeah, that's not gonna fly.
Playing games: "shoot that creature there! Damnit! This is much too slow!".
Or programming: "int main bracket argh.. no. Delete. No. Delete del- ah screw it."
Or word processing: "Dear mom, fix aunt, delete that, delete that, delete that, select all, I think it's picking, I think it's picking up a bit of echo here, delete- select all" yeah, no. I think I'll stick with my multi-touch interface.
Multi-touch is completely natural, with virtually no learning curve. We all have fingers or limbs, or feet, or noses, or whatever else you use to interact with things; a multi-touch interface takes that into account. Plus, information has traditionally been shown on a screen. How often do you hear the story of the newbie trying to use the screen as a touch interface?
Re: (Score:2)
http://www.catb.org/jargon/html/G/gorilla-arm.htm
ook!
Optical mouse interaction w phone (Score:1)
Voice recognition, however, is a while from being usable, particularly where strong regional accents exist in a market (though Friends etc. are quickly disposing of these).
And finally the opportunity to charge spammers per messag
How about just leaving people alone? (Score:2)
What I really hate about these musings about the future is the tendency to always make it a push for unwelcome, mind-bending crap, like adverts. Don't these people realize that ad
Eyeball tracking for focus (Score:2)
Re: (Score:1)
I can see it now... (Score:1)