Attention Sensitive User Interface

griffjon writes "The NYT (lame free reg blah blah) is running an article on Microsoft research into an attention-oriented UI that will use cameras and mics as well as software to monitor where a user's attention is focused and query other software (like e-mail notification, IM, etc.) to keep it from interrupting their chain of thought." This strikes me as being a really cool idea if properly implemented. Even simple things like not letting your biff update until you change focus out of a word processor. (mind you the anti-MS block on Slashdot will of course equate Microsoft's involvement with the project to mean that this is really about mind control or the corporately financed return of the plague, but what are ya gonna do?)
This discussion has been archived. No new comments can be posted.
  • You realize that this will completely destroy most of the (commercial) web? Suddenly, companies will have proof of just how little time people spend looking at web advertisements. Since nobody watches them, they'll stop paying people for them, sites will vanish for not having the funds to pay for a server, and eventually every good clean site out there will vanish. As will every porn site that survives simply on advertising revenues.

    Or, there's the flipside: If you aren't looking at an ad, the UI will move the ad into your line of sight or make it expand until you _have_ to look at a page's banner ads for five seconds before reading anything else on the page.
  • I dunno if this has already been said, but do we want the OS to control access to mics and video cameras for UI purposes, especially with Microsoft's entrenchment in the computer OS market? We have companies like RealPlayer that HAVE been known to send back information without notifying us; do we really need some company hijacking feeds from my computer?
    I know this sounds overly paranoid, but hey, food for thought.
    -Brian Peace
  • Actually, I'm afraid of what they'll do with the singing dancing paperclip when they implement this new "innovation". I can see it now:

    "Hi! It looks like you're trying to get some work done! Want some help?"
    or:
    "Hi! It looks like you're using an Oracle database! Did you know that SQL Server is used in 90% of Fortune 500 companies?"
    or, worse yet:
    "Hi! It looks like you're surfing Slashdot again! Would you like some help trolling?"

  • COOL then! Is there any source or binary or any kind of info about it anywhere? I'd like to try it out...
  • So when I chuck MSOffice the Bird... will it undo?
  • Excellent point!!

    IMHO, this isn't about MS being incapable of implementing something like that properly. This is about a passive interface vs. an active interface.

    Who is supposed to be in control here? the computer or the user? Surely the user. Now of course, it can be argued that this attention-sensitive UI is only receiving orders from the user, just that instead of detecting mouse-clicks, it's detecting eye-glances. But there is a vast difference between consciously clicking the mouse, and unconsciously glancing at something that caught your attention on the screen. When you click a mouse button or hit a key on the keyboard, it at least goes through your brain first and gives you a chance to consciously decide to do it. But your eyes are often distracted by unexpected events, and often this reaction is sub-conscious.

    There is a vast difference between interpreting what a user means by a mouse click and what you think a user means by looking at something. Interfaces should be passive: waiting for explicit orders, and not active: *guessing* what the orders might be from vague signs.


    ---
  • >mind you the anti-MS block on
    >Slashdot will of course equate
    >Microsoft's involvement with the project to
    >mean that this is really about mind
    >control or the corporately financed return
    >of the plague, but what are ya gonna
    >do?

    Poo poo. Hey! If you don't like the anti-MS slant
    around these parts then go post to some other
    web log. Damn, MS shills.

    So does anybody know who this CmdrTaco fella is?
    We need to figure it out and break his virtual
    knees. /. just isn't the same since all of those
    Windows-wienies started showing up around here.

    ;)
  • by tealover ( 187148 ) on Tuesday July 18, 2000 @03:44AM (#925194)
    Their involvement with the project is really about mind control or the corporately financed return of...

    Never mind.
  • Call me cynical, but when I see Microsoft working on something that will sit between the OS and the user and filter out what gets the user's attention, I don't think of it being used for good.

    I think of it being used to crush rival companies' instant messenger programs.

    "Windows ME Isn't Done Till ICQ Doesn't Run"

    Jon
  • Of course, they'll patent it, but it may not do them any good - I read this this morning in Nolo Press's excellent book, Patent it Yourself [nolo.com]:

    "The patent right isn't an absolute monopoly for the period that is in force....
    It can be lost if:
    [a few other reasons...]
    • the patent owner engages in certain defined types of illegal conduct, that is, commits antitrust or other violations connected with the patent;..."

    (emphasis mine)

    Sounds like they better be careful with the DoJ - it would be "interesting" to see many of their patents invalidated for antitrust reasons.
  • They will have to think very carefully about how to implement this
    because software that thinks for you can be really annoying. On the other hand,
    if the software can remove my Lynx window to help me focus
    on my work, that would be really nice :-)
  • by dashmaul ( 108555 ) on Tuesday July 18, 2000 @03:48AM (#925204)
    To come up with something that works well would take enormous amounts of time and resources, as well as adding huge overhead to the system. Hmm, wonder if Intel is gonna push this.

    This kind of technology would be in serious danger of doing nothing but annoying the end user. Ever gotten into a fight with Microsoft Word over some formatting issue? It can be dang near impossible to get it to do what you want because it is being so helpful.
    Still, if it ever got implemented correctly, and wasn't annoying, it would be nice.
  • Great... now my computer is gonna start complaining that I don't pay enough attention to it.
    Thanks M$
  • Yeah, you can do a similar thing. A Windows program will "blink" in the taskbar to say it wants focus. It's very similar to the Mac thing (didn't the application menu flash? It's been a while since I used a Mac). I like it. It's relatively unobtrusive. I was looking up the API call for it the other day to make a program do it. Pretty easy.
    ---
  • Even just removing items like that can be REALLY annoying. Recently I was trying to figure out how to compare documents in Word for a friend - now, she had never used that feature before, so that item might have been removed from view. In looking for a feature I do not hang around menus long enough for anything to figure out I want more choices, I just flip through them - how aggravating it would have been to search and search and then find later that Word had hidden it from me!

    I think UIs that are really easily altered by the user are the way to go - application developers should look at games to see how simple and configurable UIs can be.
  • by beth_linker ( 210498 ) on Tuesday July 18, 2000 @05:41AM (#925217)
    Personally, I'm inclined to think that the most useful tools are the highly customizable ones - while I don't go near the Microsoft paperclip, tools like Outlook or procmail give me fine-tuned control over how I receive and organize my mail and that's good. So a big thing that I look for in new software is how easily I can customize it to work with me. Microsoft Word nearly always seems to work against me.

    I can't imagine trying to set up rules for something that was responding to my hand and/or eye movements. They're often way too subconscious and I don't understand them well enough to formulate rules that are as useful as "Beep when I get mail from my boss." Also, what if I want to wear mirrored sunglasses while I code? (I just got a new desk at which I face big windows with no blinds - around 3 or 4 pm, I'm very tempted to put on dark glasses). Is that going to break the UI? There are also a lot of variables in a person's behavior - sitting in a different chair, not getting enough sleep, and drinking too much coffee can all change one's movements. Although I've got to say that a UI which could detect when I hadn't had enough coffee and brew me a fresh cup would be a huge improvement.

    So, I think this idea is barking up the wrong tree. The things I'd rather see in a new UI paradigm are some integration of voice commands, easier methods for customization (so that it's not just for geeks anymore. Outlook's Rules Wizard is actually moderately good at this.), and an interface with some sort of ability to learn from interactions with a user (while maintaining enough consistency so the guy doesn't feel like his computer is schizophrenic).
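    Rules of the "beep when I get mail from my boss" kind are the easy, explicit sort this comment prefers over behavior-guessing. A minimal sketch of such a rule engine (the message fields and rule format below are invented for illustration, not procmail's or Outlook's actual syntax):

```python
# Minimal sketch of explicit, user-authored notification rules of the
# "beep when mail arrives from my boss" kind. The message format and
# rule fields are hypothetical, not any real mail client's API.

def matches(rule, message):
    """A rule matches when every field it specifies equals the
    message's value for that field."""
    return all(message.get(k) == v for k, v in rule["when"].items())

rules = [
    {"when": {"from": "boss@example.com"}, "action": "beep"},
    {"when": {"list": "linux-kernel"}, "action": "file"},
]

def actions_for(message):
    """Collect the actions of every rule the message satisfies."""
    return [r["action"] for r in rules if matches(r, message)]

print(actions_for({"from": "boss@example.com", "subject": "status?"}))
# → ['beep']
```

    The point of the explicit form is exactly the one made above: the user wrote the rule, so the computer never has to guess.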
  • Just me, or does this sound like a marketer's dream come true? Forget about click-through rate- we're talking about exact demographics about who looks at what for how long.

    Anyone else worried about privacy issues here? MS doesn't have the best track record when it comes to consumer privacy.
  • The roots of the paperclip as Bob still show: Ask office help about "Bob" and it takes you right to the "Office Assistant" entries...
  • so let's try something more complicated.

    Seriously, I think this is a great research project, but the reason that UIs stink is not because we have fully exploited the capabilities of GUI and have to look beyond it to a new class of capabilities. The word "agent" when applied to UI is always a red flag. Think about the paperclip tracking where your eyes are going. "I see you are looking for some nasty Pr0n..."

    The reason interfaces stink is that developers have little understanding of or deference to user needs and wants. I don't doubt that a dash of attentional interfacing could be a good thing in limited amounts, but the idea that the same people who are slapping GUIs together today are going to be able to aggressively intuit what users are about to do is at best a pipe dream and at worst a nightmare.

  • by Phrogman ( 80473 ) on Tuesday July 18, 2000 @05:48AM (#925230)

    But just think of the boon this could represent to script kiddies everywhere....

    M$ AI: It looks like you are trying to crack a system, can I offer you some help from my knowledge base? There's an excellent source of exploits covering most M$ products located here [securityfocus.com].

  • an attention-oriented UI that will use cameras and mics as well as software to monitor where a user's attention is focused and query other software

    Has anyone else given any thought to this? I've got Big Brother circulating in my computer! Granted, it doesn't hurt if it stays in my computer, but who says that this information couldn't be leaked out, like through IE? I could be playing some flight sim, and the next thing I know when I switch over to my web browser, there's a web pop-up ad advertising Microsoft Flight Simulator!

    The software would monitor what I am doing at my computer, both by camera and by mouse clicks. The moment this information leaves my computer (Microsoft innovation, anybody?), my privacy goes up in smoke.
  • by Animats ( 122034 ) on Tuesday July 18, 2000 @07:38AM (#925237) Homepage
    I saw that at IBM Almaden a few years back myself. Back then, the sensor worked, but they didn't have an application for it. My comment was that all the applications I could think of for it were awful, like web pages where you had to read the ads before the content would appear.

    Eye trackers have been around for about two decades; the only new thing about this one is how simple it is. It depends on the fact that human eyes have quite different reflective properties in IR than the rest of the body or clothing. So they take a video frame with IR lights on, then one with it off, and subtract. Eyes show up as dots.
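    The IR frame-subtraction trick described above is simple enough to sketch. A toy version with plain nested lists standing in for grayscale video frames (the threshold and pixel values are made up):

```python
# Toy sketch of IR frame differencing for eye detection, as described
# above: take a frame with the IR illuminator on, one with it off,
# and subtract. Eyes retroreflect IR, so they show up as bright dots.
# The threshold and frame contents here are illustrative only.

def find_eye_dots(frame_ir_on, frame_ir_off, threshold=100):
    """Return (row, col) of pixels whose brightness jumps past
    `threshold` when the IR light is switched on."""
    dots = []
    for r, (row_on, row_off) in enumerate(zip(frame_ir_on, frame_ir_off)):
        for c, (on, off) in enumerate(zip(row_on, row_off)):
            if on - off > threshold:
                dots.append((r, c))
    return dots

# Synthetic 4x6 frames: the background reflects little extra IR,
# but the "eye" at row 1, column 2 lights up strongly.
ir_off = [[10] * 6 for _ in range(4)]
ir_on = [row[:] for row in ir_off]
ir_on[1][2] = 200  # retroreflection from a pupil

print(find_eye_dots(ir_on, ir_off))  # → [(1, 2)]
```

    A real tracker would of course work on camera frames and cluster adjacent bright pixels, but the core of the method really is just this subtraction.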

  • Now, here's the question:

    Aside from the frequent crashes and the inefficiency, my other great frustration with Windows 9x/NT is that when I'm banging away at Eudora and waiting for Internet Exploiter to load a busy webpage, Internet Exploiter will steal the focus away from Eudora while I'm typing. With no time for me to respond, it just goes and takes the next character I was typing to Eudora as being my response to its question.

    Of course, these aren't the only programs affected by this design shortcoming; when you have ten applications running at once, the frustration can be immense.

    Now, has anyone figured out a way to make Windows behave differently when an app wants focus? How about flashing the window's taskbar presence and beeping a couple of times?

    Is this a problem that I'll see when I'm running X on my Linux box? (I'm still very new to Linux, and I'm not yet at the point where I've ever had more than about three X applications open at once.)

    Thanks.

  • by Saib0t ( 204692 ) <saibot@h[ ]eria-mud.org ['esp' in gap]> on Tuesday July 18, 2000 @04:55AM (#925242)
    Well, this sounds to me very much like we're going to be monitored more and more.

    We have rich decision software called Bayesian Inference Software that we can build down into the system that can track your usage and adjust in an automatic fashion

    As someone mentioned already, this can be used for many things other than what it is intended for.

    A couple of these uses c(w)ould be:
    - Permanent monitoring of users with the camera. It can already be done right now actually, but a boss deciding to put a webcam on every machine for supervision purposes would make everyone feel spied on. This system would provide an 'excuse' for having webcams on every machine.
    - Advertisement banners can now position themselves where you're looking.
    - Since the thing would monitor the user's activities in order to determine what to give the focus to (or what to prevent being given focus, actually), it'll be easy to keep track of the user's activities: Slashdot reading 2 hours, coding 4 hours, swapping jokes with colleagues 30 minutes...

    This thing really raises a couple of disturbing issues. I may be paranoid but I don't like monitoring systems. At least they are aware of it: And Horvitz and his researchers themselves acknowledge that the information collected by the notification manager software -- potentially, information on the personal activities and movements of millions of people that would be stored on the Internet -- raises privacy and security issues that have yet to be resolved. But I doubt those issues will be resolved.

    Ignoring the system requirements to run the thing - which certainly aren't trivial - the system could be useful, if the user is allowed to set the "disturbance value" (or "worth") of possibly disturbing events. But that would be hell to configure; imagine every morning having to tell the program "I'm waiting for urgent messages from person X and Y" and changing that every day. I doubt very much that a program will be able to determine what I think important and what not.
    A simple example: a message I receive from a colleague might very well be the information I've been waiting on for 2 days, but it might also just be an email notifying me of a colleague's birthday party.

    Another thing I'm skeptical about in the article is: He expects that the system will be able to greet and converse with new visitors. The conversation, he says, will be on par with speaking to a person who is hard-of-hearing.

    AFAIK (As Far As I Know [I realised that acronym's meaning only very recently...]) no current software is able to converse with a human being. Answer a couple of predefined questions maybe, but certainly not converse.
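    The "disturbance value" scheme this comment describes could be sketched as a simple threshold filter. All names, activities, and numbers below are invented for illustration, not from the article:

```python
# Hypothetical sketch of a notification filter driven by a per-event
# "disturbance value" (worth) and the cost of interrupting the user's
# current activity. Every name and number here is illustrative.

FOCUS_COST = {"coding": 8, "email": 3, "idle": 0}

def should_interrupt(event_worth, activity, urgent_senders, sender=None):
    """Deliver a notification only if its worth outweighs the cost of
    breaking the current activity, or if the sender is whitelisted
    (the 'urgent messages from person X and Y' case above)."""
    if sender is not None and sender in urgent_senders:
        return True
    return event_worth > FOCUS_COST.get(activity, 0)

urgent = {"wife", "boss"}
print(should_interrupt(5, "coding", urgent))          # → False
print(should_interrupt(5, "coding", urgent, "wife"))  # → True
print(should_interrupt(5, "idle", urgent))            # → True
```

    The comment's objection stands: even this trivial filter needs the whitelist and the weights kept up to date by hand, which is exactly the configuration burden it predicts.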

  • I propose that all X window manager authors add the EyeballFocus setting, complementing MouseFocus, SloppyFocus, and ClickToFocus, immediately, thus avoiding any possible claims of "innovation" from Microsoft. Just create an X extension that accepts x,y screen coordinates from an arbitrary source, and worry about pesky issues like hardware later.

    Alright, so this was meant to be funny, but there are actually eyeball tracker->mouse replacements out there, I saw one in use ten or more years ago. Has anyone made one work under X?
  • Lots of Windows applications are already very presumptuous about their importance in your life.

    That seems to be a consequence of a MS Windows bug, not a feature. If you bring up a dialog box under Win32 with the default settings, it often comes up behind the current window. The default ought to be "in front of the windows for this application", but the default that's supposed to do that doesn't work reliably. Unfortunately, "in front of everything" is easy to specify, does work, and gets used far too often.

    It was once a rule of the Macintosh user interface guidelines that you never grab the focus unsolicited. You were supposed to use the Mac's "Notification Manager", which did no more than put little blinking icons in the upper right of the screen, hinting that some background app wanted attention, when you got around to it. This was very polite. Unfortunately, as screens became cluttered with too much blinking stuff, notifications became ignored. This led to a more aggressive style of notification. Today, it's more like MS Windows. Things were slower-paced back in the Mac's heyday.

  • I can see the new office feature now...

    It looks like you're eyeing up that secretary over there! Would you like some help with
    chatting up clerical staff?

    (*) Suggest subtle approach.
    (*) Suggest crude approach.
    (*) I'm married, don't bother me again.

    -- Now that WOULD be an office assistant!!

  • You could wink. That's probably even better because one eye is still pointing at your target so you could do drag and drop.

    --
  • Or ICQ clients which reappear on the top at a new message coming in

    I hope you're not referring to Mirabilis ICQ, it's got UI options galore to suit everyone's taste. It knows how to shut the hell up and mind its own business if you click a few widgets. Too bad it's *7 megs*.

  • Apparently the software will even read your email and try to schedule appointments for you, etc.

    "Dave, you have an appointment with Microsoft to arrange selling your soul for my next upgrade."
    "Cancel that appointment"
    "I'm sorry, Dave. I'm afraid I can't do that"

  • Waggoner-Edstrom (Microsoft PR agency) has a talent for hyping old news that Microsoft suddenly discovers it has (re)invented.

    If you read the article and web page of this technology you can see that Bayesian analysis was used during the 1970s in medicine extensively. The programmer in question was a physician. "Artificial intelligence" at the time was an attempt to use computers to solve complicated problems such as diagnosis in medicine. The program "MYCIN" was able to give advice on antibiotic use to physicians, based on expert rules.

    This technique never quite succeeded. The answers were probably right, but physicians didn't have full confidence in them, since the argument process was a bit mysterious. This was disturbing to psychologists, who had concluded that humans were conservative Bayesian in decision making under uncertainty.

    A new wave of psychologists had to come along and show that humans are not Bayesian in their thinking. It might be that computer aids could help humans to decide matters better, and help solve complex problems, but pure Bayesian analysis would not provide a full answer to our problems. This unfortunate conclusion should have been foreseen when one considers that the rationalist agenda to reduce arithmetic and logic to a complete and consistent decision theory was doomed by Goedel long ago.

    Human-computer interface theory has moved away from Bayesian analysis in another respect. The complexity of modern technology has caused enormous problems, even disasters, for which we would like to have some procedure to prevent such errors in the future. It turns out that the great technology errors such as Chernobyl or Challenger were not human-operator errors or technology-operation errors, so much as human-organizational errors and technology-design errors. The errors were latent in the design and just waiting for the random events to be exposed. Bayesian analysis would only predict what could be foreseen and calculated--these errors were overlooked in the design stage, and Bayesian analysis could never reveal them.

    Experts such as Don Norman and James Reason have written extensively on the failures of rationalist Bayesian analysis to manage complex technology. They propose concrete solutions for the design stage. I wish Microsoft would pay more attention to their ideas, instead of fooling around with this Bayesian stuff.

    One example: I sat in the audience in the big tent at the launch of Windows 95 and watched Bill Gates and Jay Leno show off the spell-checking ability of WinWord 95. Bill typed in something like, "We are at the premierr of Windows 95" and the word beginning with "p" was underlined with a red wavy line. Jay and Bill clicked on the word and accepted Word's suggested correction "premier". Based on Bayesian analysis, quite likely that would be the most probable spelling. However, it is wrong in this context: instead, the more unusual spelling "premiere" is correct. Since Jay and Bill didn't know any better, they took the word of the computer expert.

    If we follow strict Bayesian tools, this situation of exposure to unexpected errors will increase. That is why users turn off Microsoft Bob--the technology is wrong, annoying, and causes more problems than it is worth.
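    For reference, the Bayesian updating that MYCIN-era diagnostic systems relied on reduces to Bayes' rule. A toy example with made-up numbers shows why a confident-looking answer can still mislead, which is part of why physicians distrusted the mysterious argument process:

```python
# Toy Bayes' rule update in the spirit of 1970s diagnostic systems:
# prior probability of a condition, likelihood of an observed symptom
# under each hypothesis, posterior after observing the symptom.
# All numbers are made up for illustration.

def posterior(prior, p_symptom_given_cond, p_symptom_given_not):
    """P(condition | symptom) by Bayes' rule."""
    p_symptom = (p_symptom_given_cond * prior
                 + p_symptom_given_not * (1 - prior))
    return p_symptom_given_cond * prior / p_symptom

# A condition with a 1% base rate and a symptom seen in 90% of cases,
# but also in 10% of healthy people: the posterior is only about 8%,
# far lower than the 90% figure naively suggests.
p = posterior(0.01, 0.9, 0.1)
print(round(p, 3))  # → 0.083
```

    The arithmetic is sound, but as the comment argues, it only covers what was foreseen and encoded; errors latent in the model itself stay invisible to it.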

  • If you are looking into using LaTeX, you might want to have a look at the excellent LaTeX frontend LyX [lyx.org].

    LyX is an advanced open source document processor running on many Unix platforms and OS/2, and experimentally under Windows/Cygwin. Unlike standard word processors, LyX encourages an approach to writing based on the structure of your documents, not their appearance. LyX lets you concentrate on writing, leaving details of visual layout to the software.

    LyX produces high quality, professional output -- using LaTeX, an industrial strength typesetting engine, in the background; LyX is far more than a front-end to LaTeX, however. No knowledge of LaTeX is necessary to use LyX, although it will give a user more power.

    LyX is stable and fully featured. It has been used for documents as large as a thesis, or as small as a business letter. Despite its simple GUI interface (available in many languages), it supports tables, figures, and hyperlinked cross-references, and has a best-of-breed math editor.

  • I think the technology you mention would be a good step forward for immersive VR. Now that computers are starting to get to the point where a dynamic level of detail can be calculated for objects from their underlying geometric data, that would provide a nice boost to realism.
    Another neat thing that could be done with this: in a VR environment, one of the hardest things to deal with is how to model interaction with a very limited sensory/motor bandwidth. If objects had lists of actions the user could perform on or with them, then when a user looks at something, the system could pop up a translucent menu over it with a list of hotkeys for logical actions =:-)
  • I think you are being quick to condemn this work based on treating
    some rules of thumb as carved in stone. There was a nice article a
    while back in the CACM The Anti-Mac [acm.org]
    which was about what a user interface would be like if we threw out
    the desktop metaphor, one of whose assumptions is this idea of the
    passive interface. Think how useful non-passive interfaces are, like
    xbiff...

    I'm really interested in new work on user interfaces. I don't
    like the idea of hiding what programs are doing that comes with the
    desktop metaphor, and by extension to almost all GUIs, but on the
    other hand, I wouldn't go back to text-only, mouse-free, console
    experience. So I use my machine in an unprincipled mess of GUI and
    CLI. Consistency isn't so important, but surely there has to be a
    better way...

  • by HiQ ( 159108 ) on Tuesday July 18, 2000 @03:51AM (#925285)
    So now the system will actually *know* when I'm staring that stupid paperclip to death; I hope they will implement a new feature: when you stare madly at the paperclip, it will catch fire, and will be reduced to a pile of ashes! Kewl!
    How to make a sig
    without having an idea
  • Just because the HR people have their heads up their rear-ends, doesn't mean that the rest of the company isn't pretty cool. Many a good company makes the mistake of letting non-technical people determine who to hire in a technical field. (Of course, that beats the alternative mistake, which is to force techies to pull themselves from their work to make hiring decisions.) Trust me, I can speak from experience that sometimes the people who you interview with aren't always really representative of the whole company and who you'll be working with.
  • by FascDot Killed My Pr ( 24021 ) on Tuesday July 18, 2000 @03:51AM (#925290)
    So kids with Attention Deficit Disorder AND automatic power management are going to have their computers shut down on them every 10 seconds...

    Seriously, this sounds neat--if it works. But I can imagine new programs trying to compete for my attention by flashing, showing nudie photos, or whatever in an attempt to boost a Nielsen-style rating.
    --
  • For the last seven years or so, companies have competed with each other for our attention on the desktop by flashing and alarming, etc. However, these competing companies rarely gave us a choice of what they planned to take over, and if they did it was an annoyance (would you like IE to be your default browser? ~ sheesh, I answered that question a gazillion times!)

    One of the potential problems I see with this interface is that I will be dependent on my computer prioritizing for me. That could be a great problem, because that skill isn't just used for dealing with my computer at work, but in problem solving and time management. If we don't learn how to deal with divvying up attention, there is a lot we might lose (we only use a small percentage of our brains now... what's gonna happen when the computer tells us how and where we get vital info??)

    My point is that I believe a new and efficient interface would be one that makes it more natural to perform actions, not one that locks you into a certain sequence to get something done. For example, telling the computer to shut down, as opposed to looking directly at the shutdown icon (for those systems that track eye movement), or left-clicking this button and answering some questions (what we use now), or leaving the room for 5 minutes, etc.

    It would be foolish to say that one kind of approach can solve the problem, but I do think that the emphasis, if they wish to make us more productive, should be on making how we interact with our computers more natural (be it speech, tactile response, etc.)


    Nuff Respec'

    DeICQLady
    7D3 CPE
  • After they've mastered figuring out exactly what you're paying attention to, they'll come up with the next big, amazing thing...

    A bio-feedback monitor! Yeah, that's right, with a little head-strappie thing and a monitor that apparently picks up your brainwaves, you can control games, graphics, and much, much more. Coming soon for your *COCO 3*!!!

    So much of this user-over-friendly technology is a half-step backwards... A lot of people, while they want computers to be friendlier, don't want them to get much more helpful.
  • ADD is a bit of a misnomer. It's not really an attention deficit disorder, but wild oscillations between the ability to hyperfocus for a long time and distractibility. Yes I know, this happens to everyone, but more so in people with ADD -- in fact, based on the symptoms, I think you almost have to have ADD to be a geek. (It's not a bad thing; the symptoms of ADD are closely related to the symptoms of genius and creativity -- they think Einstein, Edison and Mozart had it.)

    You know, I had always thought that ADD was one of these conditions created by schoolteachers and stuff as an excuse for not being very good at classroom control.

    Then, my best friend of 11 years was diagnosed with it, and lives on prescription dexamphetamines. He's got a brilliant mind, but all the way through high school, and, in fact, until the diagnosis, he couldn't stick with anything.

    The change in his personality occurred within a few days of starting his prescriptions. He's a mechanic, and on a professional front, he moved from changing oil and mufflers at a Toyota dealership to working at the best automotive restoration shop east of California. Now, he repairs and restores big-buck collectible cars like 1930s Rolls Royces, Bugattis, and 1960s-1970s musclecars.

    One of his latest projects has been working on a 1950 Ford sedan that has been converted into a $500,000+ show car. He's at the top of his profession.

    He's come a long way from hammering rusted exhaust systems off ten-year-old Tercels.

    More power to you if you can beat this thing. It's truly the plague of genius. I don't know how I dodged that bullet. <grin>

  • "The system would also observe whether the computer user was typing, talking on the phone or speaking with someone face to face in the office. Helping keep tabs of the user would be a small camera that could determine whether the person was present and looking at the computer screen."

    Sounds like that could easily link into SMS (or similar) to keep track of what all employees are doing - that's probably not their plan, but I'm sure some companies would want it, and where there's a market...
  • My surfing style is having several /. and RemarQ windows open in Infernal Exploiter while keeping my newsreader (not Outhouse) and a paint program (GIMP, silly!) open for when I need them. This way, my ADD doesn't automatically turn into swapping on this stupid Windows 98 installation that allocates 160 MB of RAM (out of the 64 MB in my machine) on boot-up.

    --
    I don't use Windows; I tolerate it.
    <O
    ( \
  • by (void*) ( 113680 ) on Tuesday July 18, 2000 @03:54AM (#925303)
    I want to know when email comes, even in the heavy midst of coding, because it could be email from my wife. I want IM to tell me when the boss logs on, because I have Important Stuff I wish to report to him.

    Why is MS always thinking "how cool this..." or "how cool that..."? Don't they realize that much of this just straitjackets people into one set of actions or options? Perhaps a droid might like it, but I am not a droid. I am a human being with priorities that cannot be turned into a well-ordered list.

  • How about electric shocks for inattentive office workers?

    heh!

  • It is pretty expensive trying to render every pixel on screen with the same amount of quality 60 times a second... why not use this technology to only render in the nicest possible way and in maximum detail the small area the player is looking at, and render the rest with all possible shortcuts?

    Not all of the 3D engine can be made faster this way, but significant gains in fill rate and geometry (using x/y-specific LOD) can be obtained. Plus it becomes more attractive to use very expensive rendering techniques (high-quality antialiasing, per-pixel dynamic lighting) for the focal point.
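The gaze-dependent falloff described above can be sketched in a few lines. This is only an illustration: the tile coordinates, radii, and LOD tiers are made-up values, not anything from a real engine.

```python
import math

# Sketch of gaze-dependent level of detail: regions near the focal point
# render at full quality, and quality falls off with distance. The radii
# (100 and 300 pixels) and the three LOD tiers are arbitrary examples.
def lod_for_tile(tile_x: float, tile_y: float,
                 gaze_x: float, gaze_y: float) -> int:
    """Return a level of detail: 0 = full quality, larger = cheaper."""
    dist = math.hypot(tile_x - gaze_x, tile_y - gaze_y)
    if dist < 100:   # foveal region: maximum detail, expensive effects
        return 0
    if dist < 300:   # near periphery: moderate shortcuts
        return 1
    return 2         # far periphery: aggressive shortcuts

# A tile under the gaze point renders fully; a distant one renders cheaply.
print(lod_for_tile(512, 384, 512, 384))  # 0
print(lod_for_tile(900, 700, 512, 384))  # 2
```

A renderer would call something like this per tile (or per vertex cluster) each frame, spending the saved fill rate on the foveal region.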
  • Some reasons why this is bad:

    One of the reasons I can get faster and faster CPUs every year, yet still see no difference in how fast my computer runs, is that too many application writers are getting off on programming in "tasks" for their applications to do even though I, the user, never asked for them. I run application x to do word processing, but little do I know, x is also listening for mail, watching the file system, watching my every action to see if it can help, etc. Now, they'll be watching where I'm looking, too. I would appreciate it if a standard developed whereby application writers provided enable/disable options for this stuff. There are times when I want my machine to run fast. Period.

    On a related note, software takes advantage of faster and faster computers by doing more and more, rather than just going faster. Sometimes this is great, but I'd like to have some choice in the matter, as a user.

    Computers doing things I never asked them to do - thinking on their own and then acting on their own decisions is not a good trail to go down for everyday usage. It's like the argument between the console and the GUI - developers like to know what is happening. I believe it actually is possible to have both - a nice GUI so that it isn't necessary to memorize obscure commands, and an understandable program that doesn't do more than you asked it to. This is the kind of "GUI innovation" I'd like to see.

  • There's also a salon.com article about this here [salon.com].
  • It's just one more way that Microsoft insults the intelligence of their customers. I don't need an idiot paperclip popping up to tell me that it "looks like" I'm writing a letter. I know whether I'm writing a letter, thanks... But Microsoft makes the blanket assumption that ALL its customers are functionally illiterate.

    A lot of my time at work is actually spent supporting new-to-intermediate Windows users, and, believe it or not, the paperclip and all the stuff that you and I, as more advanced users, consider to be the bane of our existences, is actually useful to a lot of them.

    I just wish, in the control panel, there was a little setting called "User Skill". Drag the little control halfway for an intermediate user, all the way to 0 if you know that the user is a complete Windows newbie. If all the applications followed this lead, it would be the best of both worlds.

    I use Internet Exploiter as my browser. Yes, it's evil, but since it's already there on my hard disk, like it or not, it saves me time and resources. And it seems to crash less often than any version of Netscape I've ever installed in Windows. And it didn't add that stupid AOL Instant Messenger the way Netscape did.

    I just wish that, when an URL fails because the server is busy, Internet Exploiter didn't open up that stupid "Navigation Cancelled" screen. It wouldn't be so bad if the long URL I'd just (mis)typed into the address bar didn't get replaced with "About: Navigation Cancelled".

    One would hope a User Skill control would, when cranked to the max, let me see the 404 error from the server.

    Grrrrr...

  • nobody likes me. you don't even notice i'm here. i'm a failure with the first post, and i'm a failure with everything else.

    maybe i should kill myself... but not until i've taken a few classmates with me. don't bother trying to catch me, jonkatz -- you'll never find me, fucker.

    Hey dude, I'd be more worried about one of your classmates reporting you to Pinkerton's Thought Police.

    Listen, life sucks. It's tough, it's frustrating, it's annoying. But no one ever said that life would be easy.

    Go talk to your guidance counsellor at school or whatever. If you play your cards right, you'll get some nice and legal happy pills, paid for by your HMO. Then things start to look better.

    Best of luck.

  • In my Office 2000, the only thing that happens is that some entries are not shown if I don't use them enough. Same thing in Win2K. I can easily show all entries with the click of a button, and it is "smart" enough to do it automatically if I'm sitting there for a few seconds looking around.

    Honestly, why the hell do I need to defend Microsoft against FUD?
  • Loop without recursion? You can use a lambda calculus trick:
    \def\loop#1{#1#1}    % apply the argument to itself
    \message{begin loop}
    \loop\loop           % expands to \loop\loop again -- forever
    \message{end loop}   % never reached
  • by (void*) ( 113680 ) on Tuesday July 18, 2000 @08:37AM (#925331)
    Oh RIGHT! It's email from my wife that is important. Maybe it's not - maybe it's just that mail from my wife with "NEED SEX NOW" in the subject is what I want to see, but "NEED MONEY NOW" I don't want to see. :-)

    Honestly, do you people actually think this Bayesian classifier/neural network/etc. stuff will actually work as advertised, and that it would not be a major harassment to use on autopilot? Do you understand the technology, or do you just see the marketing hype?

    I mean - you are all geeks right?? Don't you understand that we already have the tools to do what I want, except that it may take some amount of mastery? (And hence more documentation or a better UI) I mean you are geeks right? Don't you understand the need to learn focus and mental discipline, that there are no software shortcuts for these things?

    Let me tell you how I am running this wife/boss filter right now: I fire up pine in an xterm and let it sit on my mailbox. Pine beeps when I receive new mail. When I see it does, I use [Ctrl-Left] to switch desktops, look - not open - at the last piece of mail, and if it is not from my wife, I switch back with [Ctrl-Right]. There - simple, quick and expedient. If I had more mail, I would hack up a quick, easily grokkable procmail filter. Is MS-bloat necessary at all, or is it the case that MS again is selling you fluff that you can't see?? I mean you are all geeks right?
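For what it's worth, the sender/subject triage described above really is just a few lines of logic. Here is a rough sketch (the addresses and keywords are made-up placeholders - in practice this would live in a procmail recipe, not a program):

```python
# Sketch of the "wife/boss filter": plain keyword rules on the From and
# Subject headers, no Bayesian machinery required. The addresses and
# subject keywords below are hypothetical placeholders.
def priority(sender: str, subject: str) -> str:
    """Classify an incoming message as 'interrupt', 'later', or 'ignore'."""
    sender = sender.lower()
    subject = subject.upper()
    if "wife@example.com" in sender:
        # Even a priority sender can be triaged further by subject.
        return "later" if "NEED MONEY NOW" in subject else "interrupt"
    if "boss@example.com" in sender:
        return "interrupt"
    return "ignore"

print(priority("wife@example.com", "NEED SEX NOW"))    # interrupt
print(priority("wife@example.com", "NEED MONEY NOW"))  # later
print(priority("spamco@ads.example", "BUY NOW!!!"))    # ignore
```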

  • Mac OS has a "notify" widget. When your program needs attention from the user and the frontmost window belongs to another program, create a "notify" widget. It makes the Application menu (the one you pull down and get a taskbar) flash with an icon the programmer chooses and plays a loud sound (often system beep but can be overridden).
    <O
    ( \
  • This is great! Now, with the proper implementation by the major browsers, pop-up ads will be guaranteed to appear right where you are looking!

    Mojotoad
  • Anyone want to bet that all things labeled "Microsoft" or from MS-friendly companies will eventually take priority? Think about it: MS sends a spam e-mail which happens to be labeled "urgent" because if it's from MS, it must be important (probably a security patch to a poorly written program). I could be paranoid, but this could get really evil.

    If MS really wanted to stop the information overload, they should find a way to stop the 200 daily spam messages to my hotmail account and not just complain that my account is too large.



    Being with you, it's just one epiphany after another
  • Two years later, the Horvitz team saw its first commercial program become part of Microsoft's Office software. The program, based on Bayesian techniques, was a relatively simple tool known as the Answer Wizard, which tried to anticipate the needs of users looking up topics in the software's electronic documentation.

    Okay, so this same group of whackos was in charge of 'innovating' the Answer Wizard. They're the ones who ensure that you don't find what you're searching for. Great. Now they're designing software that will decide how important my incoming email is and either deliver it or hold onto it until it thinks I'm not busy.

    Yippee! So what happens when it decides I'm always busy and I never get that email I'm waiting for, etc etc. Exchange keeps doing this NOW - 'forgetting' to deliver email.

  • by fiziko ( 97143 ) on Tuesday July 18, 2000 @03:59AM (#925359) Homepage

    Ever gotten into a fight with Microsoft Word over some formatting issues? It can be dang near impossible to get it to do what you want because it is being so helpful.

    That's why I switched to LaTeX. Well, first I switched to StarOffice for the equation editor, and then I switched to LaTeX for the excellent cross-referencing, table of contents generation, and damn near everything else Word wouldn't get right the first half a dozen times I tried. God, Word's automatic outline generator? I still haven't figured out how to keep the numbering accurate automatically when I change the document. Maybe there's a way, but I never found it. LaTeX is, in my opinion, a far superior product, with much lower system requirements. It's basically a markup language, but anyone with a modest combination of IQ and, e.g., HTML experience should be able to pick it up in under half a day.

  • GUI's, or at least those of the Windows kind, were designed badly from the start, in that interaction for the most part was designed to be a series of pop-up dialog boxes to which the user had to respond.

    Right now, using the Windows GUI is frustrating if what you want to do is different from what it wants to do. For instance, I can be in the middle of typing this sentence, and if some other notice window pops up just before I hit the space bar, I will probably tell that window to perform its default action. Even if that doesn't happen, at the very least I will lose the focus, and I will have to click back to the window I want to give input to, or to be able to see completely.

    If I am using the "Start" menu, and any window activity happens on the screen (which happens a lot for the first 30 seconds after boot up, when I usually want to start some programs) the Start menu pops back to its non-visible state. (This is probably just bad implementation on MS's part, there's no real reason it had to be implemented that way.)

    I think actions of the computer in a GUI, which the user is required to evaluate and respond to, should only be based on direct actions of the user.
  • No login link Link here [nytimes.com]
  • Surely if you don't want your programs to be constantly interfering with what you're doing then you have them minimized? Outlook will stick a mail icon in the system tray when mail arrives; if you don't want to be bothered by it then don't look at it...

    Still, this kind of thing can only be a good thing for people with a lot to do and not a lot of time to do it in. Probably over half of the mail you receive during the day is pointless, but currently you still have to check it just in case. And when you're in the middle of something important, the last thing you want is your train of thought interrupted by someone mailing you about some sports result you already knew about.

    So priority systems like this that rely on contextual information are likely to be a great help to anybody in a busy office, but only if they work! If the system fails to notify you of urgent tasks whilst letting through rubbish then this will make your job even more difficult.

    So it's really all down to how well it works. If it works, MS will win big on this, if it doesn't then it'll be Bob Mk 2...

  • by Posting=!Working ( 197779 ) on Tuesday July 18, 2000 @06:36AM (#925369)
    I have read and agree to the EULA...>Click

    Paperclip: "You have read nothing. No software will be installed until you've read everything, including the procedures for sacrificing the chicken. Since you lied to me you must also read all the marketing brochures for Win 2K."

    Me: "Hey is that an error message?"

    Paperclip: "Here, read your e-mail. No wait here's some spreadsheets. Urgent database coming up!"

    The BSOD will be replaced by an urgent e-mail from your boss "get in here or you're fired" Once you left it would reboot. Paperclip: "You did not write 1200 lines of code this morning. Really, this is where you were when you left to see your boss."
  • Would you really want a turing-complete language that can read and write files to be the primary language of the WWW? It's bad enough that we have Java and Javascript. Imagine if raw HTML had those capabilities! TeX is a document language, nothing more, nothing less.

    Btw, I prefer LaTeX to raw TeX.

    Also..
    Is there a way to loop in TeX without using recursion?
  • by 11223 ( 201561 ) on Tuesday July 18, 2000 @04:01AM (#925393)
    Y'know what keeps stealing my attention? Those stupid freei.net banners at the top of the screen - how annoying. I can just see it now - all of a sudden, my screen starts flashing like crazy because of some stupid ad, and when I look up, *wham* it opened up the web site for the ad! Aaagh!

    Wait a second... I wonder how much they're being paid by the ad-mongers to develop this...

  • Not only will it come up when you don't want it to, it will also come up WHERE you don't want it to. I can see some serious abuse of this kind of technology.

    Lots of Windows applications are already very presumptuous about their importance in your life. Many an email package deems itself to be of such import that, upon receiving new mail, it brings itself to the foreground and opens up a dialogue box informing you of your urgent life-or-death mail from XYZ-Spamco asking you to buy their product. Never mind the fact that you were mid-command, telneted into a unix box trying to stop a circuit board etcher before it starts, because you just realized there was a fault in your layout. But no, that spam mail is more important than your carefully prepped sheet of copper-laminated fiberglass.

    Now, not only can applications bother you with this sort of thing, they can make sure the dialog box comes up right where you're looking, intentionally breaking your train of thought.
  • Apparently the software will even read your email and try to schedule appointments for you, etc. Of course this is somewhat beyond the state of the art - but who'd want it anyway? I haven't even let human secretaries make appointments for me; let's face it, our time is one of the few things we can't increase, and I take issue with anyone who'd (inadvertently) waste it.
    Chris @ chrisworth.com [chrisworth.com]
  • The trouble is not that Microsoft is trying to do this, but that anybody is trying to do it at all.

    Interface is a constant problem with computer software, and there are many approaches- the Linux "Do whatever, the user is a geek and can learn or reprogram anything", the Mac "Thou Shalt Code It THIS Way To Fit With The HIG", the Windows "If the user has a problem we'll put in a wizard to do it for them".

    People seem to be analyzing this idea from Linux standards- the assumption being that you're going to spend hours reprogramming it to get it to work the way you want. That may not be an option...

    The problem is this- defining interface systems in such a way that they can be learned by a user. Everything is learned- even the simplest things are learned (ever see someone use a mouse for the first time and not be able to immediately correlate sliding movement with pointer movement?).

    The Mac approach (somewhat weakened with age but still alive and kicking) was to define everything beforehand and force UI to go a certain predictable way. All Edit menus will have cut copy paste IN THAT ORDER, followed by Clear if present. If Select All is present it is under Clear. File menu contains new, open, close, print, and quit. Print has an ellipsis (...) after it because ALL THINGS that pause for further input have ellipsis after them. They also ALWAYS allow the user to cancel out of the operation, meaning 'you can always select a thing with an ellipsis, even if you don't know what it is, because you get to cancel it if you didn't mean to do that'. And so on, for an inch-thick book... In this paradigm, all the energy is spent organising the UI passively. The user is the prime mover and everything sits there until you use it.

    The Linux approach is similar- except that the user is expected to do their own organizing! All systems, passive or active, are for being customised by the user. The ones who get the most enthusiastic about Linux (or any Unix) tend to be the most adept at defining interfaces for themselves and improvising new functionality that doesn't necessarily exist in consumer OSes. The elaborate shell script is the highest peak of this art (barring the writing of entire programs) because it is encapsulating whole known behaviors into the computer's interface, behaviors that are personal to that particular user. Defining/configuring X is very similar.

    When you get into Windows, some of the rules change. There's always been a profoundly influential desire to out-convenience all other OSes by second-guessing the user and doing stuff for them, like an automatic door that opens when you walk towards it (hey, it works for supermarkets). This desire is also expressed in the use of, and concept for, Wizards: the problem is not seen as making the UI comprehensible for specific tasks (which might assume user learning), the problem is seen as DOING the tasks for the user and requiring no learning at all.

    Here is the fatal flaw: systems of this nature are complex! The tasks being done are likely to be complicated (prioritizing email and workflow is VERY complicated). While being 'wizard walked' through a task is time-consuming but mindless (with every step explained as you get to it), having your life actively managed by such a software process BEFORE you get to it is another story. It cannot fall back on being mindless- it must try to be clever, but the more clever it is, the less predictable it is likely to be.

    That is the fatal flaw in a nutshell: there will always be a mental model of the process being done. Failing to have a mental model is basically resigning yourself to cluelessness about what is going to happen- you'd need to at least think "It is sending personal email to my home and work email to work" or _something_. As these systems get more complicated and intrusive, the model becomes more complicated, and the more 'clever' it becomes, the less predictable it will be- the goal is 'do what I mean' but the result cannot be other than 'why the hell did it do that?' because it is an externalisation of a process that is also, inevitably, running inside the user's head. There will always be that parallel evaluation of significance and priorities- and the computer does not have the mind's ability to associate, does not have X many years of back data to associate with. The computer is bound to lose because it's not playing chess- it's trying to run parallel with a _particular_ very associative and dynamic human brain, that of its owner. It can't possibly win. Other HUMANS don't always win at this game (long-married couples still misinterpret each other fairly often and renegotiate things as a result). No computer can do it- unless its 'owner' is simply another computer- because it is not a purely logical process, and the computer can't match human bandwidth.

    A quick example from a book by Grant Fjermedal: "What's the population of Kampala?" Most people will respond immediately, "I don't know". How do you know so quickly that you don't have that information in your head? It's much the same as remembering "Aunt Millie's cousin Fred lost both knees in the War, causing him to be unable to walk and needing to travel in a special van with lifts. Aunt Millie does all the shopping for her cousins. Aunt Millie has no clue about any sort of technology." Now, take that background, and when you read in Uncle Bob's email, "Millie is van-shopping and is all in a frazzle", the human reaction would be to quickly associate all these things and perhaps inquire if it was Fred's van that died, and if Millie can use some techie help with the special-van issues. But how are you going to load all the information of a lifetime into a computer and expect it to make priority calls like this- and who is going to KEEP loading the data in as life goes on? It's flat impossible, unreasonable.

    This is why the only reasonable approach to future interface is finding ways to make it predictable and understandable by the intended user- which of course is how Apple survived the '90s when by all rights it should have been crushed. The Mac interface is only one of many possible interfaces but it was rigorously defined in a consistent and predictable way. Almost any interface will do if you define it consistently enough and STICK to it. In something like Linux, what ends up happening is you either end up defining your personal idea of an interface and sticking with it happily, or you give up.

    In something like Windows, people learn (sometimes with the aid of community college courses!) the rules of an ill-defined interface much of which is not intended to be clearly understood by the user- when things are supposed to 'do themselves' there isn't the same motivation to make the process clear and visible. This attempt by Microsoft to go still further in that direction is DOOMED, because it will either end up so trivial as to seem a total joke, or it will proliferate the 'problem space' of possible computer actions so vastly that the resulting behavior is entirely unpredictable, yet still grossly inadequate for matching its user's priority 'rules'. That is the worst of both worlds- and if this ever ships, expect a certain amount of excuse-making by its users along the lines of 'my computer screwed up' to explain the user's worse-than-average ability to interact with others, and their failures to deal with important things. And there's only so much of that you can get away with.

    Nobody should attempt to copy this 'feature' for Linux. Instead, figure out ways of making the 'map' for computer systems' behavior more clear and predictable, get some consistent rules in there and then let the user use those rules and take their own actions.

  • "The notification manager software, which would reside on a remote computer out on the Internet,..."

    BZZZZZZT! Wrong answer please try again.

    Just what I'm sure everyone wants: a professional system that tracks their likes and dislikes, controls dissemination of digital intercommunication, and knows where you are all the time... all this and no personal control of the device. The privacy concerns alone are staggering and I certainly wish to have no part of it. If it was sitting on my home network behind an OpenBSD firewall I'd give it a second thought, but who can trust Microsoft?.. Either ethically in regards to privacy or technically in regards to security?

    "But Horovitz said he was confidant that adequate security and privacy safegaurds would be created."

    This guy sounds like a smart guy, so why would he come to this conclusion? Microsoft has never before provided adequate security (How many holes in NT last year?) or privacy (tracking numbers on Word docs).

  • by FascDot Killed My Pr ( 24021 ) on Tuesday July 18, 2000 @04:04AM (#925414)
    So what if I'm using Outlook? A mail comes in, I glance at Outlook--it opens the mail. It has an attachment, I look at the filename--it launches. Oh no! A virus! Don't infect Word (glance at Word). Crap! Don't send to the people in my address book (glance). Dammit!

    There are times when you want to study something passively...
    --
  • Imagine: you're in the middle of an important videoconference and you can't participate because your database interface insists on having all of your attention.

    Hmmm, Alt-Tab should solve that problem, although I'd think that it wouldn't arise in the first place since the notification program will be aware of the fact that you're in a videoconference and will assign a higher priority to that. Your example is overly simplistic and doesn't add to the argument.

    Worse yet, you have to make a live update to the database for the sake of the conference and the conferencing software won't let you change focus to make the change.

    See above. This isn't some kind of "you must do this now" scheme, it's merely something to help you make the most of your time. If need be you'll be able to override the program.

  • by larsal ( 128351 ) on Tuesday July 18, 2000 @04:05AM (#925419)

    Novice users are already distracted from the work they're desperately trying to figure out by every blink on their DSL modems, every whirr of their hard drives, and every change in the "helpful" indicators, telling them that they're on line 8.6", no, wait, 9.4" of their document.

    Now try to imagine the same people believing, thanks to a new, ritalin-demanding UI, that they're supposed to be dealing with all the random odds and ends of software and background apps [already needlessly numerous] the UI decides they've been paying attention to!

    "Am I supposed to deal with the 'Task . . . scheduler' now?"

    "No, you're writing an essay."

    "But it came up and. . .look! The calculator just started! Oh! 'Help'! That must be useful. . ."

    "It's the 'Help' function for the calculator. . ."

    "I wonder if it can help me write the essay?"

    Larsal [It's Worse than an Animated Einstein]

  • There are so many basic things wrong with current UI and systems software that incorporating this kind of technology seems very premature.

    With the current level of reliability and interoperability, you'll probably be able to conjure up a BSOD or dozens of dialog boxes just by staring at the wrong part of the screen. You know, something like "Cannot notify InstantMessenger of your inactivity. Click [OK] to continue.", or "You aren't paying enough attention to me. Look at me, dammit. (beep) [OK]".

    As for the idea itself, sure, it's good research. But the ideas have been around for a number of years, and lots of people are using Bayesian modeling, decision theory, user modeling, and affective computing, so this is not some kind of breakthrough hatched at Microsoft Research.

  • Well, when I think of game interfaces as models for application interfaces, I think of two separate realms:

    Customizability
    Information Presentation

    For customizability, one example I can give that comes VERY fresh to mind is Diablo II (of course!). You have a lot of complex things you can do involving attacks with spells or weapons, and drinking potions, and frankly all of the actions are really repetitive, just like real applications (don't flame me please, I love Diablo II!!). All of these are really easy to customize to the mouse in various ways to make it really easy to switch to doing what you want, and it also does some things for you automatically when they are not intrusive - an example is the case of throwing potions, where once you run out the current weapon is re-equipped. Since you obviously do not want to be bare-handed, that is a case of an obvious choice performed for you that makes sense and is welcome when it occurs. The kinds of things I could imagine in a word processor or something like that are dragging commands into and out of a "hotbox" so a small set of keys could do a lot of repetitive stuff for you with ease. Also the mouse could be a lot more effective if it were not so chained to simply being a menu activation block.

    In general note the way many FPS games have gone the path of letting you bind a few keys or mouse strokes to anything you want to do at all - that kind of total control in customizability is lacking in applications.

    As far as information presentation goes, I'm mostly thinking of flight sims and RTS games like Uprising or others in that category. You have to see a lot of information easily and be able to respond just as fast - in general the better games in these areas seem to flow easily between letting you see the information you need and getting to the response areas you need just as easily.

    I'd agree that I don't like the UT or Quake 3 configuration UI all that much (though the play UI is great), but it just seems like there are many lessons that could be learned about UI from games and applied to real applications. I personally think one of them is that it is not necessary to offer a consistent interface across all applications, but I know many many people would disagree with that!
  • This opens up a world of possibilities in interfaces controlled by gestures and facial expressions. Imagine a more general interface that would disable options that cause wrong behavior in response to a simple gesture immediately following any program behavior triggered by that option. Examples are left to the users' imaginations.
  • I don't think Microsoft needs any help insulting their customers' intelligence. They do it themselves, with every "For Dummies" or "For the Complete Idiot" or "For the Barely Verbal" book they purchase.

    I've never quite understood why people would buy these books, and still be insulted when I told them that they were a complete idiot on the tech support line :)
    --
  • Is there a way to loop in TeX without using recursion?

    Yes, see the examples for the output routines in the TeXbook. Would you really want a turing-complete language that can read and write files to be the primary language of the WWW?

    Absolutely not. Actually, God save us from TeX becoming more popular with the braindamaged/brainwashed/lobotomized part of the population. It is bad enough with Word viruses as it is. And TeX's capabilities for code obfuscation and penetrating scanning are way beyond VBA or Word Basic.
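Incidentally, plain TeX ships its own looping construct, \loop ... \repeat, so the macro author never has to write the recursion by hand (it is hidden inside the plain-format macros). A minimal counting loop:

```tex
% Count from 1 to 3, printing a message each time through.
\newcount\n \n=1
\loop
  \message{iteration \the\n}
  \advance\n by 1
\ifnum\n<4 \repeat
\bye
```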

  • It's bad enough when I go to a site that opens up ten windows I can't close, now they want me to use a system that opens and closes apps for me? Fsck that! I want CONTROL over what I'm doing. It goes back to the saying 'just because you CAN do something doesn't mean you SHOULD.'
    "Where do you want to - oh, never mind. You'll go where we take you."

    The Divine Creatrix in a Mortal Shell that stays Crunchy in Milk
  • by Chris Worth ( 18843 ) on Tuesday July 18, 2000 @04:08AM (#925434) Homepage
    The social interface BOB, the paperclip's accursed ancestor, tried to introduce a virtual valet that did this sort of thing- and flopped utterly. Like to know what happened to the MS executive dealing with the project?

    Bill Gates married her.

    Chris @ chrisworth.com [chrisworth.com]
  • Shut up and stop talking B.S.

    The company you imply did not even invent the paperclip. It was outsourced - outside of the US, for that matter. The thing is, the guys who wrote it do not even need an NDA. They know that they will get lynched on the spot if their friends, colleagues and customers understand that they have done it. And actually the library is obviously non-MS software.

    Guess why... It does not blow up as often as the standard Redmond B.S.

  • The first two versions of MS stuff are notoriously bad, bad, bad... so I think that I'll wait for the third version before judging...

    Anyway... it's about time that someone /anyone/ moved away from the WIMP type interface. Right now, the only /reasonable/ input devices are the keyboard and mouse-like pointing thingees. This hasn't changed since the mid 70's.

    Allowing the user to have more ways of communicating with the computer will eventually make it easier for users to do work on the computer.

    Right now, input must be explicitly provided to the computer, i.e. you type a command, you click on a widget. If you curse out the stupid paper clip, the computer doesn't know anything. If you get a frown on your face every time Word mucks up your formatting, and then you need to go to help, you have to do this explicitly.

    But if the computer could tell that you were puzzled in Word, maybe it could pre-emptively prepare help. That is, get it ready for you to use, and not necessarily pop up the help window saying, "You look like you don't know what the fsck is going on."

    thats the key - keeping things behind the scenes. Like checking email when you look towards your mail program, or preparing a document to print when you start looking at the print button - but not doing it until you say so.

    The computer needs to anticipate what you are going to do, but not distract you from your task at hand while its doing it - to paraphrase Alan Cooper - the computer can not stop the workflow with stupidity.
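    A rough sketch of that "prepare, but don't act" idea: gaze dwelling on a widget quietly prefetches, and nothing visible happens until the user explicitly clicks. All names here are invented for illustration.

    ```python
    import time

    class GazePrefetcher:
        """Prepare work when the gaze lingers; act only on an explicit click."""

        def __init__(self, dwell_seconds=0.5):
            self.dwell_seconds = dwell_seconds
            self.target = None        # widget currently being looked at
            self.since = 0.0          # when the gaze settled on it
            self.prepared = set()     # widgets whose work was prefetched

        def on_gaze(self, widget, now=None):
            now = time.monotonic() if now is None else now
            if widget != self.target:
                self.target, self.since = widget, now
            elif now - self.since >= self.dwell_seconds:
                self.prepared.add(widget)   # e.g. quietly poll the mail spool

        def on_click(self, widget):
            # Nothing visible happened until now; the prefetch just makes
            # the response feel instant if the gaze lingered first.
            return widget in self.prepared

    p = GazePrefetcher()
    p.on_gaze("mail_icon", now=0.0)
    p.on_gaze("mail_icon", now=0.6)   # dwelled long enough, so prefetch fires
    ```

    The dwell threshold keeps a stray glance from triggering work; only a sustained look counts.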

    It sounds like a great improvement, but I still think that the user interface will not make improvements until we find a better way of obscuring the inner workings of the file system.

  • Well, as far as email, Outlook makes a nice sound and puts a little mail icon in my system tray that I double-click to open the new mail. Don't all mail programs do something similar? Also, I have a rule that plays a special sound when it is from my wife, and another for my boss.
    ---
  • by cs668 ( 89484 ) <cservin&cromagnon,com> on Tuesday July 18, 2000 @04:12AM (#925445)
    and.... It sucks.

    The menus in Office 2000 apps already try to guess what you want to do. They are always moving menu entries around based on what you are doing.

    The end result is that you always have to read the pull-down menu -- you can never learn the position of a selection.

    So by trying to anticipate what you want, they make you less efficient. Wouldn't this just be more of the same?
  • by jonathanclark ( 29656 ) on Tuesday July 18, 2000 @04:21AM (#925446) Homepage
    I went by the IBM software research labs here in San Jose and got to see some neat demos of exactly this (attention-sensitive UI).

    The nice thing about this is that eye tracking is very cheap. The eye reflects IR very well so all you need is an IR strobe and a cheap IR CCD. An end product could cost less than $50.

    One demo let you click on things faster by automatically moving the mouse to the approximate location on the screen where you are looking.

    They had one demo that would track your eye and blur the screen except for where the eye was focused. Everyone else sees a blurry screen, but you (the person being tracked) can't see a difference. Could be very cool in 3D games if the game rendered the areas of the screen you were looking at in more detail and those you weren't in less detail. The military has been experimenting with this on high-end flight sims, with good success. But if you're playing on a 13" monitor then pretty much everything is in focus. :)
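    A toy sketch of that level-of-detail idea: pick how sharply to render each screen tile from its distance to the tracked gaze point. Tile coordinates and thresholds here are made up.

    ```python
    def lod_for_tile(tile_center, gaze, thresholds=(100, 300)):
        """Return 0 (full detail), 1 (medium), or 2 (blurry) by pixel distance."""
        dx = tile_center[0] - gaze[0]
        dy = tile_center[1] - gaze[1]
        dist = (dx * dx + dy * dy) ** 0.5
        for lod, limit in enumerate(thresholds):
            if dist <= limit:
                return lod
        return len(thresholds)          # beyond all thresholds: cheapest render

    gaze = (400, 300)                       # where the eye tracker says you look
    print(lod_for_tile((410, 310), gaze))   # near the fovea -> 0
    print(lod_for_tile((800, 600), gaze))   # periphery -> 2
    ```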

    Check out their project page for a little more info.

    http://www.almaden.ibm.com/cs/blueeyes/ [ibm.com]

    --
  • "I am not a droid"

    Oh, but apparently you are.

    Instead of reading the article and learning that you can set priorities for different people, you go off and spout some "Microsoft is bad, Microsoft can't do anything right, Why is Microsoft always wrong?" chant to make yourself seem cool.

    And Big Surprise!, you even got a moderator to agree with you.

    Wiwi
    "I trust in my abilities,
  • Of course not. We already have two very effective attention managers: Junkbusters and Procmail, and I don't feel like I need some fucking paperclip on top of these.
  • "Even simple things like not letting your biff update until you change focus out of a word processor. (mind you the anti-MS block on Slashdot will of course equate Microsoft's involvement with the project to mean that this is really about mind control or the corporately financed return of the plague, but what are ya gonna do?)"

    Besides the fact that most MSFT technologies sound good on paper (the Paperclip is an artificially intelligent agent that molds itself to the user's behavior and acts accordingly), they are usually horrible in implementation. Somehow I have a problem with software that will stop IMs from popping up because it's 2 AM and I'm reading Slashdot, or that will suppress email notifications because I'm coding. The fact is there are more variables in whether I want certain activities to occur than where my eyes are positioned and what time of day it is; after all, if such things were so easily distilled into algorithms, we would all love the paper clip.

    From what I saw (talk of Bayesian networks and agents), this is the same team that brought us the Paper Clip. I would hold off on applauding what will more than likely be another highly disparaged piece of MSFT bloatware until it is actually implemented and no longer vapor. I remember taking a class where our professor described how some company had started research on using cameras to manage user interfaces, but the project was a failure because people do not act predictably when using a computer, and a camera also distracts them. It turned out that the best thing it was useful for was tracking how long users read banner ads.

    From the article: "There have been other missteps indicating that Bayesian techniques must be added to software with great care. In December 1998, for example, Blue Mountain Arts, the Internet greeting card company, filed suit after it discovered that a preliminary version of Microsoft's electronic mail software mistakenly filtered Blue Mountain's e-mail greeting cards into users' trash cans. The filter, which had been based on software developed by Horvitz's researchers, was repaired in the final release of the program. It was an important lesson, he said, in the risk of artificial intelligence making poor judgments."

    PS: I wish them luck, but this is one piece of bloat I'll definitely be avoiding.

  • yeah, i can completely relate. i mean, what were those people thinking who created the first computer? i mean, i'm a human, i want to work in the sun, plant fields, do manual labor, build crap with my hands. don't they know this?

    get serious. b/c this firm (which just happens to be MS so it's flame bait to all the little linux/bsd/unix/wack-wack-wack crowd) is trying something new, you slag it. god, reading most of the posts on this board is like hearing people in the middle ages speak on Da Vinci's works. it's called experimentation and it is what gives us new, cool and sometimes useful sh*t that gets used in ways originally never thought of.

  • So if there's all this monitoring software out there now that watches your keystrokes and sees which Web sites you're going to, what happens now that your boss can actually tell what part of the screen you're staring at? I mean, he's got a camera pointing straight at you now--now he can enforce the damned dress code through your computer.

    Otherwise, yes, quite an interesting and innovative project. Even without my paranoia over being monitored (serves me right, posting from work), there's another interesting application: testing the effectiveness of banner advertisements and UIs in general.

    The first is a fairly obvious one. Advertisers want to know exactly what sorts of ads grab and hold a reader's attention. This could be bad, considering how many seizure-inducing flashy whirly banners there are already.

    The second, though, is a very good thing. In a good UI, you should be able to focus on your work without being distracted by anything else. Furthermore, you should be able to find what you're looking for instantly. If GUI designers have the opportunity to see how long your eyes wander around the screen before you click on a particular button, they can use this data to rethink the button's appearance and location.

    Auto-expanding dialogue boxes! What if you set up a toolkit in the upper-left corner of the screen and it opened automatically when you stared at it? And if you're too lazy for point-and-click, wait till we do stare-and-blink. I kind of like that last one; my biggest peeve with GUIs is having to take my hands off the keyboard to get anything done.

    Okay, enough random commentary. This is definitely an interesting project, no matter who wrote it.

    --

  • This is precisely my concern though. I have a particular way of working which involves interruptions from things I consider important (messages from friends and family, or simpler distractions, such as what EQ zone I will explore tonight); I get my job done and I have enough flexibility to work in this manner. This kind of software could, benevolently or malevolently, enforce a "preferred" way of working/living, resulting in something like fighting with Word autoformatting but on a grander scale. Even if it were completely configurable to my priorities, that might not be very cool either, because I don't want to be interrupted in the middle of a meeting just because a friend mailed me. I think it would take quite a bit of time for it to learn what I preferred, and it would be just plain annoying in the meantime. I don't need a giant paperclip trying to run my life.
  • What about a virus that takes some snapshots and broadcasts them to everyone in your address book? Might be embarrassing for some managers who sleep after lunch!
  • At first this seems great. I often shift my attention between windows without shifting focus and end up typing in the wrong window. (This is a big problem when I have a window with gcc open and am writing a /. post: gcc finds a syntax error, so I fix it, then look back to lynx without shifting the focus.)

    On second thought though, there are problems. When I sit down for a long coding session I've been known to have 2 big windows on my screen, each with references for the functions/error codes I'll be working with, and just one line of my editor showing. (I hate the way Windows makes the window with focus be on top.)

    I like the idea, if it is powerful enough to tell when I want to shift window focus vs. when I want to keep it while reading a reference.

    And a big AOL-style "me too" to what others have said about filtering email from someone important vs. a less important company newsletter vs. spam (but don't mistake an emergency message from the sysadmin).

  • You can view NYTimes articles by changing www to www10; changing it to partners no longer works.

    http://www10.nytimes.com/library/tech/00/07/biztech/articles/17lab.html [nytimes.com]
  • Sorry, but I can't agree with you here. This piece of software is attempting to solve a problem that people in general should have gotten over past puberty: attention and focus control. Moreover, relying on this piece of crap does not solve anything, because the computer is not the world. You may rely on it all you want, but if you have ever tried to interview people "on the go" in the real world, as a journalist needs to do, no computer will help you keep all the distractions of the world at bay.

    Mental discipline - is that such a foreign concept that a computer has to teach it to me?

  • by cbogart ( 154596 ) on Tuesday July 18, 2000 @04:18AM (#925473)
    The problem with attempts at "smart" behavior in software is that its behavior becomes less predictable. I think this is an attempt to patch up previous attempts at smart behavior -- like the paper clip's interruptions based on "smart" analysis of what I'm doing. I see why they're trying to fix it, but now if I look away from the computer for a second, its state will be different when I look back. I can no longer easily control what input I give the computer; testers can't reproduce bugs because they don't remember exactly when they looked away from the screen; the MSDN Knowledge Base fills up with bugs about which noises and body movements you shouldn't make while FoxPro is saving a large file. Awareness of a user's drifting attention level could be interesting in a game, but I don't want it in the OS until it's *really* mature.
  • by Anonymous Coward on Tuesday July 18, 2000 @04:28AM (#925479)
    I don't think this is stupid because Microsoft is involved; I think it's stupid because it's stupid.

    This is nothing more than Bob with hardware. It's that idiot paperclip on steroids. It's just one more way that Microsoft insults the intelligence of their customers. I don't need an idiot paperclip popping up to tell me that it "looks like" I'm writing a letter. I know whether I'm writing a letter, thanks. If I were functionally illiterate then perhaps it might be useful to have dopey software "help" me write a letter. But Microsoft makes the blanket assumption that ALL its customers are functionally illiterate.

    I certainly don't need my computer deciding for me whether I should be notified that I have mail. I think I'm capable of making that decision for myself.

    No thanks, Mr. "Chief Software Architect" Gates.

    Two weeks of running Windows 2000. Two Blue Screens of Death. Thank goodness for Linux!

  • Oh yeah. So should I put my wife first, or my boss first? That's kind of a hard decision to make, isn't it? The system forces me to decide when, in fact, my priorities are more fluid than that.

    Maybe you'd like to spend the better part of a day assigning priorities to every person who might mail you. Sorry, but I need to work. Giving this kind of useless meta-data to a machine is, IMHO, a distraction.

  • by tjwhaynes ( 114792 ) on Tuesday July 18, 2000 @04:32AM (#925484)

    The problem with a lot of the software I see around today is that in the desire to make it more open and friendly, it has gotten a lot more distracting to use. It's difficult to Zen out when using a piece of software when every minor adjustment triggers an animated effect, be it a spinning hourglass, a back-lit button or a piece of paper flying across the screen. In an attempt to give the user more feedback about what is active and what is not, software designers have taken away the "quiet" interface and jazzed it up.

    And this is not restricted to the application itself. Applications often demand attention like some spoilt brat - the "HELLO? YOU HAVE MAIL!!!" syndrome. In some cases, such as Lotus Notes, the default is to rise to the top of the window stack and throw up a modal window demanding your input every time there is new mail, though you can tone this down to an audible bell only. Or ICQ clients which pop back on top whenever a new message comes in. And there are others - visual alarms on calendaring tools and probably more that I have forgotten.

    When I have the option, these programs get pushed into the bit bucket as fast as possible. Using them is a dire waste of productivity. Where there is no choice about using that software, I try to tone down the alarms to just audible effects which I can acknowledge without having to press a key, move the mouse or otherwise stir from whatever I'm doing.

    So really, this research sounds like a patch for the problem rather than a cure. The problem is with the UI design - programs are increasingly "rude" in their attempts to get attention. At least when I hold the source, annoying habits in essential software can be trimmed to a minimum. But rarely on the Unix side of the world do I have to worry about annoying software - 95% of the stuff which irks me is Windows-ware. Maybe the art of Zen is dead on the MS platform...

    Cheers,

    Toby Haynes

  • Very good point. One thing that really turns me off about software like MS Word (besides the fact it's MS, of course) is that it tries to be smarter than it is. I mean, gimme a break, do they seriously think treating their users like dumb people who need the computer to pamper them equals "good" user interface?!

    I mean, I've nothing against a better UI, or an easier-to-use UI. But a "UI" that tries to guess what you mean (emphasis on tries) is extremely annoying 80% of the time, and only results in people finding ways to work around it. Gee, thanks. Like I need the stupid "word processor" to totally screw up my formatting so that I have to move my hand to the mouse and hunt through the pull-down menus to manually tweak the margins for the 100th time, when the report is due in the next hour. And now they're trying to change things on you based on what the program thinks you're focusing your attention on?!

    For sanity's sake, MS, stop having your software treat users like babies until you actually have technology that works. I hate the utter presumption of these so-called "smart" features that try to guess (emphasis on guess) what you're trying to do, get it wrong most of the time, get in the way of getting work done, and are just plain annoying. Why add another level of frustration to an already frustrating work environment?!


    ---
  • Personally, I'm skeptical of their use of Bayesian statistics. I remember reading a book called Blind Man's Bluff (ISBN: 006103004X), an account of many of the supersecret, dangerous missions the US sub force has been tasked with over the past 50 years. In a sort of Hunt for Red October vein, the US DoD was aware of a number of downed Russian subs and wanted to go looking for them, but where to look? One of their guys used Bayes' theories to come up with approximate locations of the sub in question - but the input data came not from sensor readings, but from best guesses. Basically, each expert decided where they would search for the sub and placed a weight on that position, like they were betting, and then those points, along with the weights, were fed into a Bayesian equation. The sub ended up being a couple hundred feet from where it was predicted to be. It was pretty cool.

    Now, this seems kind of absurd, right? I'll tell you where I think the boat is, you'll tell me where you think it is, and together we'll be right? Huh? I mean, it's just a guess... What the guy who came up with this system relied on was that these were really educated best guesses, making those points better choices than random points. We each have a bit more knowledge than we can vocalize, which is where a lot of intuition comes from.

    So, what I don't remember was whether relying on human best guesses was part of the Bayesian model, or if it was just something this subhunter came up with.

    Which leads us to the problem at hand - I as a human could probably come up with best guesses as to whether or not you want this piece of mail, but can an AI-based piece of software do that? Would Bayes approve?
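
    For what it's worth, the weighted-guess scheme described above is easy to sketch: treat each expert's bet as a Gaussian centered on their guess, weight it by how strongly they bet, and search the cell where the combined belief peaks. All numbers here are invented for illustration.

    ```python
    import math

    def combined_belief(x, guesses):
        """Weighted mixture of Gaussian 'bets': (position, spread, weight)."""
        return sum(w * math.exp(-((x - pos) ** 2) / (2 * s * s))
                   for pos, s, w in guesses)

    # Three experts bet on where the sub sank (miles along the search track);
    # the weights reflect how strongly each expert backed their own guess.
    experts = [(10.0, 3.0, 0.5), (14.0, 2.0, 0.3), (11.0, 5.0, 0.2)]

    # Search the half-mile cell with the highest combined belief first.
    cells = [i * 0.5 for i in range(61)]            # 0..30 miles
    best = max(cells, key=lambda x: combined_belief(x, experts))
    ```

    Each bet alone is just a guess, but the mixture peaks between the experts' positions, which is exactly the "together we'll be right" effect described above.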

  • Usually they are OK with HTML. You can get a LaTeX2HTML converter from http://cbl.leeds.ac.uk/nikos/tex2html/doc/latex2html/latex2html.html [leeds.ac.uk].
  • First, I don't like this idea for a selfish reason. I have congenital nystagmus, which means that my eyes move back and forth constantly. The muscles around my eyes are in constant seizure, so it is quite hard to tell where I am looking based on eye position :) And, knowing MS, they will make this feature mandatory for the whole computer to work right.
    OK, now, for my other rant:
    IF YOU DON'T WANT TO BE INTERRUPTED, TURN STUPID NOTIFICATIONS OFF. THEY SUCK. The only thing you need to be notified of is your MB/processor/video card overheating. Everything else should be done without popups; a beep or a flashing alarm in the taskbar/corner of the desktop/wherever would suffice. Software designers should be more conscious of users' needs: if there is an error, don't bring the application into focus, or have an easily accessible button (on the title bar?) that lets users regulate interruptions. No need for cameras.

To the systems programmer, users and applications serve only to provide a test load.

Working...