Eye-based Navigation Research From IBM

leviramsey writes: "The Jakarta Post (through Lexis-Nexis) has an article on MAGIC, an eye-tracking component of BlueEyes, a project at IBM's Almaden Research Center to add greater sensory abilities to computers. Oddly enough, IBM's site has very little on MAGIC under that name, though a reference is made to PupilFinder, which seems to be the technology underlying MAGIC. The article speculates about possible applications, including in cars (gulp!), and goes into detail on other components of the project, several of which are very interesting."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    No! Use touch screens!

    "Gorilla Arm" is waaay better than "Web eye"

    (or will users of this technology gain a reputation of being "shifty eyed"?)

    Eyes and the brain's perception system do some funny things. Try playing Textmode Quake for, say, half an hour...even just walking around a level -- then stop and try reading something.

    It's so trippy, I swear...your brain is trying to translate the text into some kind of 3D image for several minutes afterward.
  • It's all a plot by the porn industry:
    1) Get everyone using 'harmless' eye tracking systems.
    2) Replace the 'click here to subscribe to our $599.95/month service' links with 'blinkable' images.
    3) Watch the cash roll in!

    Be afraid.. be very afraid..
  • You can't identify a police state by counting the number of cameras owned by the state. The way to identify a police state is to count the number of cameras owned by the citizens. In police states, only the police have cameras.

    Who are the people who get in trouble with cameras? Ordinary Joe? Why would we care? The people who get in trouble with cameras are politicians and police. Would the people who beat up Rodney King have been convicted without a citizen's video tape evidence?
  • I am part of a research project [it-c.dk] at the IT University of Copenhagen [www.itu.dk], where we design eye-controlled communication aids for disabled persons. We are currently working towards a low-resolution - but cheap - solution based on webcams, which then interfaces to a communication program. Our goal, which seems to be within reach, is to achieve an entry speed of approx. 60 chars per minute, using a mix of eye control and prediction functions. My job is adapting my commercial communication program [secondguess.dk] for eye tracking.


    And yes, this is great technology, but with quite a few limitations, which aren't all obvious. First and foremost is the problem that eye movements aren't 100% consciously controlled. Your eyes always jitter slightly, and you have a tendency to track events outside the screen with your eyes. The canonical illustration we use in discussions is that you certainly don't want the program to print your text just because a bird flying close by your window startled you and made you look.


    Second is the whole problem of when you "click" (see the sketch at the end of this comment). Dwell-time is quite popular, but has the drawback of not making you feel in control, and leaving you all too often without a "safe harbour" on the screen. Using a "long blink" for clicking is a better option, but leaves you with another problem - distinguishing a shadow falling on the camera from a closed eye. Thus we often fall back on using a secondary device for clicking and double-clicking.


    Last is the whole issue of precision. Eye tracking today is rather imprecise, even under optimal conditions and with expensive "best of breed" equipment. As a rule of thumb, you shouldn't expect to be able to activate a button smaller than one square inch with any kind of precision, especially using dwell time for clicking.


    So there you have it. Eye tracking certainly is a useful tool, but - for now - is rather far from the ultimate goal of a direct mind interface to the computer. Actually it's a problem that eye control sounds so fantastic, as it jacks the level of expectation from the users far too high for the state of the art. Right now you need specialized programs, and a lot of design effort, to make the experience a good one - even under optimal conditions.


    Just my 2 cents.
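    As a rough illustration of the dwell-time clicking described above, here is a minimal Python sketch. The gaze source, the sample rate, and the radius/threshold constants are assumptions for illustration only, not values from our project:

        import math
        import time

        DWELL_SECONDS = 0.8   # gaze must rest this long to count as a "click"
        JITTER_RADIUS = 40.0  # pixels of natural eye jitter tolerated per fixation

        class DwellClicker:
            def __init__(self):
                self.anchor = None    # centre of the current fixation
                self.started = None   # when the current fixation began
                self.fired = False    # suppress repeat clicks within one fixation

            def update(self, x, y, now=None):
                """Feed one gaze sample; returns (x, y) when a dwell-click fires."""
                now = time.monotonic() if now is None else now
                if self.anchor is None or math.dist(self.anchor, (x, y)) > JITTER_RADIUS:
                    # Gaze left the tolerance circle: start a fresh fixation here.
                    self.anchor, self.started, self.fired = (x, y), now, False
                elif not self.fired and now - self.started >= DWELL_SECONDS:
                    self.fired = True
                    return self.anchor
                return None

    Feeding this webcam-tracker samples at, say, 30 Hz also makes the "safe harbour" problem concrete: any screen region where a fixation should never fire can simply be excluded before calling update().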

  • We are doing this at the IT University of Copenhagen [www.itu.dk] right now. See my post further down the page for more information.
  • And if you got an eyelash in your eye, your accuracy would go WAY down... blasting at everything around you... and using up all your ammo.

    --
  • I don't know about you, but I'd be scared of people driving with their eyes.

    There's an accident on the side of the road, and all the people slowing down to look at it would cause lots of other accidents....

    You go cruising by the beach and cause a fender bender because you can't keep your eyes off the hot chick in the bikini!

    Just my concerns...

    Jeremy
    Glutious
  • Imagine this software installed on your PC and your webcam on top of your monitor. After some calibration ("Now look at the left corner of your screen and press enter", etc.), the device will be able to tell which part of the screen you are looking at. This can easily replace the mouse! How about focus-follows-eye instead of focus-follows-mouse :)
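    A hedged sketch of that calibration step: record raw pupil coordinates while the user looks at a few known screen points, then fit an affine pupil-to-screen map by least squares. The function names and the five-point pattern are illustrative, not any real tracker's API:

        import numpy as np

        def fit_affine(pupil_pts, screen_pts):
            """Fit pupil -> screen as a 2x3 affine matrix by least squares."""
            P = np.hstack([np.asarray(pupil_pts, float),
                           np.ones((len(pupil_pts), 1))])   # N x 3 homogeneous
            S = np.asarray(screen_pts, float)                # N x 2 screen targets
            A, *_ = np.linalg.lstsq(P, S, rcond=None)        # 3 x 2 solution
            return A.T                                       # 2 x 3 affine map

        def pupil_to_screen(A, px, py):
            return tuple(A @ np.array([px, py, 1.0]))

        # e.g. calibrate on the four screen corners plus the centre, then feed
        # live pupil samples through pupil_to_screen to drive focus-follows-eye.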
  • You can use such a camera to monitor boys/girls in a café: who pays attention to you? It would be very interesting to meet that one...

    I think this camera system is a killer thing, with breakthrough applications!
  • If I'm not terribly mistaken, researchers at the University of Virginia did a lot of work in this area in the biomedical engineering department in the early 1990's.

    The entire goal of their work was to provide ways for quadriplegic patients to control a mouse and interact, particularly for those whose speech was damaged beyond speech recognition technology capability. I can only imagine how much better it works now with the additional computing cycles of modern processors.
  • Great, so now I get RSI in my *eyelids*...

    Just what I need!

    -Andy
  • I totally agree.

    But my perception of a 'Police State' is one of enforced curfews, Big Brother style red tape, etc. The UK is definitely not that bad.......yet

    ----------------------------
  • You mentioned the problem of needing a second device for "clicking". IMHO, the way to go is definitely to combine multiple devices as much as possible. I was imagining a set-up for a person with full abilities, who can type, look, talk and listen at the same time -- if you coordinate the various devices properly, you can make things a lot easier. For example, I think using your hands for both keyboard and mouse is extremely awkward. And a lot of the time, you don't need precise mouse movements, for instance when choosing which dialog to respond to. Or, you can have faces on the screen that will return eye contact when you look at them, and then process what you say into a microphone.

    In the case of a disabled person, perhaps one should try to take advantage of as many abilities as are remaining, and combine them to substitute for the missing ability. Thus, eye and tongue movements could substitute for a point-and-click device. I think there are certain things that eyes do very naturally, such as pointing at interesting things. A "natural" person usually looks at what he finds interesting, and then "acts" on the thing by using some other muscle. It is not natural to have a time-triggered click, because people normally look at a thing while deciding whether to do something with it. Maybe there is a breathing pattern that is naturally associated with performing an action on an object. I think it might be useful to observe everyday activity to see what gestures correspond to abstract concepts like "examine", "activate" etc. And then, one has to find which of those gestures are available to disabled people. The goal, I think, is to make them feel as though they weren't disabled, but rather were acting naturally.

    By mimicking natural activity with some analogous user interface, I think the experience would be very comfortable and efficient.
  • I think you could rig up an amazing user interface with this. The simplest application, obviously, would be to replace the mouse. For those of us *NIX types who use X-windows, this means we won't have to take our hands off the keyboard in order to respond to dialog boxes. So the whole either/or problem of whether to use text or GUI is eliminated: we'll get the best of both worlds with zero switching time between the two, if you know what I mean.

    I can also imagine how this could be amazingly useful for creative work. The simple version would be a paint program where you push buttons in order to control and spray colours at the place you're looking at. A cooler version would combine the visual mouse with keyboard commands in order to build 3-d worlds: you can look at the place where you want to build a structure, and press the appropriate keys. This would open up an amazing area for UI design.

    Anyone know of good places to do this research? I'm looking to do a grad degree, and I think this might be an interesting project.
  • And, we'll have yet another way of determining which window should have the focus. I, for one, would love to be able to just look at the window to make it active.
  • "Eye 'n Buy" (TM, R, Patent pending)

    "One-blink-shopping" (TM, R, Patent pending)

    "You blink it, you buy it!" (TM, R, Patent pending)

    "Eye for an aye"(TM, R, Patent pending)

  • "oooooh look at the pretty girl!"
    *swerve*
    aaaaaaaaaaah!!!
    *CRASH*

    ========================
    63,000 bugs in the code, 63,000 bugs,
    ya get 1 whacked with a service pack,
  • OK, I'm considering... Let's see... Something about trying to type with your eyeball... Not sure there... Maybe - inhaled dust(?) from inside keyboard?... 3rd century Indian dynasties?... Something about sneezing... A definite sense of growing annoyance with the whole process... With a frozen turkey?... Hemorrhoids?... Something about your left ear...
  • Just think, troll, one day my instinctive flinch away from the goatse.cx pictures will signal my computer to launch a DOS attack on the server!


    My mom is not a Karma whore!
  • One shortcoming of using a mouse is in moving one's hand away from the keyboard and back. Taking information from a user's gaze for cursor positioning seems to be a natural development. IBM's MAGIC is a step in the right direction but still suffers from the shortcoming mentioned above.

    An alternative might be using head movement for the second step of the positioning (a sketch of this two-step scheme follows). This two-step mechanism could be activated by an additional key (e.g. those ones on laptops beneath the space bar). Look here [agrl.ethz.ch] for further information about this idea.
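    One way that two-step scheme could look in code - a sketch only, with a hypothetical trigger key and fine-pointing device, not IBM's actual implementation:

        class CascadedPointer:
            """Gaze warps the cursor coarsely; a second device fine-tunes."""
            def __init__(self, screen_w, screen_h):
                self.x, self.y = screen_w // 2, screen_h // 2

            def on_trigger(self, gaze_x, gaze_y):
                # Coarse step: jump the cursor to the (imprecise) gaze estimate.
                self.x, self.y = gaze_x, gaze_y

            def on_fine_move(self, dx, dy):
                # Fine step: small corrections from head movement or a pointing stick.
                self.x += dx
                self.y += dy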

  • I don't think you quite understand what they were studying when you were an ERICA subject... The reason you, and many others, were looking at porn was that they were studying where you looked when you saw porn. If I remember right, most males looked at the breasts right away, and then moved on (most of the time) to the genitalia. I can't remember where the females tended to look. Somebody at UVA had the perfect job (although most likely unpaid) of looking at a bunch of porn and identifying which areas were breasts, genitalia, etc. Then they ran a bunch of subjects through the pictures, and compiled the results. The kittens and other crap were just there so you wouldn't expect porn all the time! Cleanse the palate between courses, so to speak. Does anybody out there have any details on the study? A link would be fabulous and contribute greatly to the betterment of society. I for one am kicking myself trying to remember which parts of the images the girls were looking at... I also would love to know who commissioned the study.
  • If I recall, there was a story a few months ago about similar research at Microsoft.
  • A research project called dasher [cam.ac.uk] allows you to type with only a moving cursor. It employs clever language models to make it efficient. An eyeball tracker on a PDA with this software may be easier and faster than fiddling around with a stylus. It is GPL'd, source and binaries are available.
  • I could see a number of ways that something like this would be useful.

    Consider if it was used in conjunction with collision warning radar. The warning wouldn't have to be as obnoxious if the car knew you were looking directly at the object in question. This would definitely cut down on the number of cry-wolf situations.

    Also, what about a rear-view mirror that tracked your gaze a little. The mirror could adjust for blind spots just by your looking near the edge of the mirror.

    What about the extreme case of hooking it in with the airbag system. This way the car would be able to detect whether you saw what was hitting you or not. If you were completely surprised by the collision, the airbags could take into account the fact that your body would be a little more relaxed, and deploy appropriately.

    And finally, what about something that made the dashboard more readable. When you look at the speed, the numbers are displayed a little larger and brighter. Kinda like the Mac OS X menu bar.

  • Although this may help a few of the disabled, quite a few people do not have fine motor control over the motion of their heads. I think this technology will be nice for some, and that is good, but there are still people out of reach.

    There are a lot of people who do not have eyes with which to make use of this technology. This is similar to the proposals of having retina scanners for ATM access.

    It's a nice idea, and I think that the eye-scanning technology is cool, but I still think that the ultimate solution will be the brain/machine interface. Although waaaaay off in the future, it would truly allow all people access to computers and the future.

    As an aside - coupling this with the enforced ads replacing pop-ups on webpages, marketers could truly tell whether or not people were really looking at their ads.


  • This would be great for dynamic porn sites that automatically adapt to viewer preferences.
  • A lot of suggestions here seem to be made without any real consideration of what it would really mean to control a GUI with your eyes.

    It's evident that using your eyes is not the best way to scroll or zoom or aim in Quake. Think about how much you move your eyes in front of the screen for a second. You would get very jerky movements on screen if the computer tried to adapt to your eyes all the time, OR you would get delays before every action if the computer tried to filter out rapid movements (see the sketch after this comment).

    Think about Quake, for example. In Quake you want to look at the center of the screen at all times. You don't want to be looking at the _sides_ of your screen every time you need to turn around. What if you need to turn around real fast? Should you look away from the screen altogether then? Brilliant.

    And what about scrolling. How many times in a day do you look at the bottom of the screen without wanting it to start scrolling? And if it did, would you ever have time to read the last sentence? The interface would be annoying as hell!

    These techniques are clearly better suited for wireless devices and other apps. And perhaps most of all for voice-recognition, because about 30% of our interpretation of what people say comes from subconscious lip-reading. Try listening to a conversation in another room without looking at the speakers, and then try again with eye-contact. Even if you are not looking directly at the speakers lips while making eye-contact, you will get a much clearer perception of what they are saying. Because you are lip-reading and decoding facial expressions to interpret words and meaning.

    To sum it up: The voice and the eyes are extremely overrated as controlling devices. Perhaps the biggest reason for this is you need both these 'devices' for other things, they are much too important to you to be used exclusively on your computer. You need to be able to look around freely without thinking about how your computer will react, and you need to be able to speak freely. By comparison, your hands are crude unsophisticated tools that aren't needed at all times.
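    For what it's worth, the jitter-versus-lag trade-off described above is usually attacked by treating fast movement as a saccade (ignore it) and smoothing only within fixations. A minimal sketch, with thresholds that are pure assumptions:

        SACCADE_SPEED = 1000.0  # px/s; any faster and the eye is assumed in flight
        ALPHA = 0.3             # smoothing weight applied to fixation samples

        class GazeFilter:
            def __init__(self):
                self.pos = None   # filtered pointer position
                self.last = None  # (x, y, t) of the previous raw sample

            def update(self, x, y, t):
                """Feed one timestamped gaze sample; returns the pointer position."""
                if self.last is not None:
                    lx, ly, lt = self.last
                    speed = ((x - lx) ** 2 + (y - ly) ** 2) ** 0.5 / max(t - lt, 1e-6)
                    if speed > SACCADE_SPEED:
                        self.last = (x, y, t)
                        return self.pos   # mid-saccade: hold the pointer still
                self.last = (x, y, t)
                if self.pos is None:
                    self.pos = (x, y)
                else:
                    px, py = self.pos
                    self.pos = (px + ALPHA * (x - px), py + ALPHA * (y - py))
                return self.pos

    Neither half removes the fundamental objection, of course: the pointer still goes wherever you look.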

  • They're going to use this technology in cars? Somebody stop them! Cell phones are a bad enough distraction; now we'll have to deal with a stupid contraption from Big Blue that prevents drivers from keeping their eyes on the road? Somebody stop them before a Mercedes-Benz with MAGIC at the helm runs me over.
  • This is where ease of use gets dangerous.

    While I was in college I studied with Daniel Dennett, an amazing philosopher who has done a bunch of work at MIT helping them with such projects as the ambitious Cog project [mit.edu]. One of the more amazing tidbits he passed on to us (the most amazing was the genetically evolving virtual gladiator robots who after a few thousand iterations really learned how to pummel each other's brains out) was a prototype of an eye-tracking system being developed for security personnel: the idea is that the screen displays only what you are looking at and nothing else, so onlookers can't see the information. The principle this relies on is where the danger lies: people looking at the screen and having their eyes tracked think the screen is completely full; they don't know that only a small fraction of the information they think they see as a contiguous whole is being displayed at any one time.

    Furthermore, if they are looking at a picture, the picture can change and they will not realize it has changed... their minds will actually "rewrite history" and they will be convinced that the picture of the Mona Lisa they are looking at was always frowning.

    Using this technology we will get some of the scariest advertisements we have ever seen (where that babe changes imperceptibly to fit where you're looking, so, somehow, she's always perfect and exactly what you want), and I don't want to think about what the more brilliant and savvy admen/psychologists/brainwashers of the future are going to think up.

    But hey, as long as I'm not going to be able to tell, I guess I'll just sit back and enjoy.
  • And blowing someone skyhigh is just a blink away...
  • sweet christ on a pogo stick, if i get into a vehicle and it says, "where do you want to go today", i am going to summarily beat the shit out of it.

    My .02,

  • Those companies that pay people to watch banner ads would want this stuff, wouldn't they?! They could then pay you 1c/second that you keep your eyes on the ad.
  • I know for one that my university created a machine which tracks eye movement about 7 years ago. They tied it to a computer and thus created an input device.

    With this, disabled people not able to type on a keyboard were able to type letters and use a computer. It still works nowadays, though it is much more refined.

    Now it comes to the big market, we'll see where it leads to in several years...
  • I have used eye control in my EOS 50E since 1997, although it is a bit rough, as the camera can only choose between 3 fixed focusing points.

    AFAIR, the latest EOS 30 has 12 focusing points that can be chosen either manually or using the eye-control feature.

    The problem with eye control is that you have to get used to it. Just like you would get used to dictation software. It is not just a matter of tweaking and training the system, but also of training yourself.

  • K, what scares me is the fact that development for cars was mentioned. K, nice feature, sure it will come in handy, but I for one hope that feature never makes it to cars or any kind of public vehicle.

    Why? Because computers don't come with 100% uptime guarantees. Sure, that thing will work great for a certain amount of time, but like a home or office computer, it's going to go weird on you and/or crash. And if I am assuming correctly, if this MAGIC computer system is used in automobiles to aid a driver in steering or moving that vehicle, then I am selling my car and getting a bike. California freeways are already crazy enough.
  • Surely you just close one eye and move the other?
  • Your every action could be profiled and filed for use in targeted marketing schemes based on what you look at in stores.

    Stores are already doing this. I work for a large retailer that I can't name because I don't want to get fired. But our marketing department frequently uses security tapes to see who is buying what with what. Like people who buy brown shorts like blue shirts and stuff. It's really no big deal.
  • Use the technology as a mouse. One blink would mean a single click. Two blinks is a double click. Great!
    But then again, how to perform accurate drag-n-drop?
  • Of course, if they used this for driving a car, people like myself who have nystagmus would constantly be arrested for driving dangerously!
  • What about using this inside a cockpit on a helo, slaving the gun on the nose to whatever the gunner happens to be focused on? The military has used a helmet sight system for years, but it depends on a tracking arm linked to the gunner's helmet. If this could be turned into a weapon, it would result in a needed boost in our recession-bound economy. Our number one export is indeed arms.
  • What about roadside distractions, though? One of the busier roads in my part of town is popular with joggers. Especially female joggers. Good-looking female joggers. I don't want the car to do something it shouldn't because my gaze lingers lustfully off of the road for a couple of seconds.
  • I've a squint in both eyes... hope they wouldn't pick me for the wrong reasons!!!
  • But don't people tend to look at obstacles they wish to avoid? Is this just a habit they would have to unlearn quickly?


    -------
  • by jilles ( 20976 )
    LOL: please blink at Clippy to continue

  • >With eye tracking, software can notice that the user is looking all over the screen, probably trying to find the right menu item or command. This is a signal to pop up help or (on Windows machines) advertise instruction manuals for sale at Amazon.

    Oh my god, no matter where you look... Clippy pops up!

    - Muggins the Mad
  • Do you have a reference for that? There was a cool experiment designed to measure people's response time that went as follows:

    The subject sat looking at a screen with a slide projected on it. They were given a button which advanced the slide, and told to press it when they felt like it - when they were bored by the image they were currently looking at. A set of electrodes measured various brain waves - there was an obvious signal before the button was pressed that was described as the onset potential: it was assumed to be related to the decision to change the slide.

    The interesting thing happened when instead of using the subject's button-press to advance the slide, the experimenters secretly started using the onset-potential (plus a time delay) as the trigger instead.

    When this happened, subjects reported that the slide moved on before they'd decided to press the button, and that the projector seemed to know what they were about to do....
    Freaky, and similar to your menu story. I collect this kind of thing; I would love a reference if you have one...
  • If they can scan your pupils, chances are they can tell if you're drunk/stoned/under the influence.

    IANAPE (I am not a pupil expert), but I think our buddies in law enforcement and our kind neighbors in the insurance companies would be interested if that level of detail were practical.
  • I remember reading a paper that came out of USC (I think) by Bart Kosko and the fuzzy logic people.

    They were able to make vehicle sims behave more or less like a flock of birds -- weaving this way or that, without coming into contact.

    There was also some mention of tiny low-power, solar-powered chips that you could simply toss along the roads and program with positional data.

    Then any car traveling the road knows just where it is located.

    Driving can be very monotonous/dangerous... especially long trips. I hope someone can move technology forward enough that (as a start) vehicles would at least come to a controlled stop if the driver were incapacitated.

  • but what if these cameras could not only track where you go and what you do, but everything you even look at?!

    "Shades." Wear sunglasses. A little problematic in dim environments, but I'm willing to risk fracturing a shinbone to stick it to Big Brother.


  • I think it's a nice idea, but I can see some downfalls in the way of pupils being used as any kind of identification, which should also be noted by those interested in biometric security as well.

    What will happen when, say, a slightly blind person using their products has his pupils deteriorate? Are there any thoughts on the sensors and their reactions to this?

    As for their emotion mouse, I doubt it will give an accurate view of someone's psychological profile, as there are heavy-handed people, light-handed people, etc. Will they have a certain buffer for values such as this, or will they market it as a stand-alone solution for determining this which is 'f(oo)ullproof'? What about persons with an abnormal perspiration problem - will it flag them as nervous wrecks? And more importantly, will it clean itself after they've used that mouse (hey, I'm not sharing my mouse with a sweaty mo' fo now)?

    Seriously though, many factors will make some of these things hard, and though some may seem like a great idea, I think many are jumping the gun into some sort of Star Trek-style environment filled with overhyped products.

    Antioffline -- Putting the Hero in Heroin [antioffline.com]

  • High-end cameras can use eye-tracking to decide which part of the picture to focus on. You look in the viewfinder and concentrate on a particular spot, and that spot gets focus.

    And here's something to think about: every single photographer I know switches this feature off after the first five minutes of use.
    --

  • and designer sunglasses made to be worn inside. Not to mention differently patterned contact lenses.

    Given biological variability, effects of some drugs on pupillary reactions (there are drugs that slow pupil contractions, but do not affect concentration or reaction times), different corneal shapes, etc. this sort of technology will have a stack of false positives and negatives plenty high. Is someone going to be arrested for DWI because they take high blood pressure medicine?

    Even the effects of race will have to be taken into account. I can tell you that most "face parts isolator" software will not recognise most Asian eyes.

    As far as marketing goes, Yeah, I looked at it. What was my total reaction, though? Did I look at it because I liked it, or because cognitive dissonance told me that it didn't belong here, or because the packaging was so ugly I hated it? All the marketer gets is an eye time. Hope they have ESP to figure out what I was thinking during that period.

    On the whole, though, I think you can pass me that pair of mirrored Serengetis.

  • The idea behind this technology is at least 15 years old; the first time I ever heard about it was from that phenomenon of Americanized anime: Robotech! [yahoo.com] Somewhere in the "second generation," the clever/nerdy character (Louie, I think his name was) invented goggles that tracked pupil movement, which could in turn automatically target weapons systems at whatever the wearer was looking at. The result? Our heroes were able to avoid getting their clocks completely cleaned by the Robotech Masters and their armies of new-and-improved Bioroids (see if you ever get THAT level of inventiveness out of Pokemon or DragonBallz).

    The apps I thought of after watching that were similar to some mentioned here, including hands-free computing (I had never used a mouse, though, so I was thinking of a CLI based on a gaze-selected alphabet. Arg. :). But I wonder if we're going to start seeing Pupil Pistols, and looks really will be able to kill.

    Ah, speculation...



    --
  • For example, a BlueEyes-enabled television could become active when the user makes eye contact, at which point the user could tell the television to "turn on CNN". The television would respond to the user's request by changing the channel to CNN. If the television then "sees" the user smile or nod, it would know that it had satisfied the request. If the television "sees" the user frown and complain, it would explain that it didn't understand the request and ask for clarification, at which point you could explain that you meant CNN Headline News.

    What would happen if the whole family sits down to watch TV, with papa wanting football, mama a popular soap, and junior cartoons?

  • by ndf9f ( 248171 )
    ERICA (Eyegaze Response Interface Computer Aid) computing already exists, and is commercially available at ERICA [ericainc.com]. It was developed at the University of Virginia, and I know that both Stephen Hawking and Christopher Reeve use it. It can track your eye movements, move the cursor on the screen, and detect when you want to click on something. I remember helping to test it a few years ago by sitting at a computer with the system installed and looking at a series of pictures. Some were nice things like kittens (aww, how cute, eye relaxes) and some were really graphic porn (yikes! eye contracts).
  • by NTSwerver ( 92128 ) on Tuesday December 19, 2000 @01:35AM (#549700) Journal
    I'd just like to point out to our international friends that England has not been turned into a 'virtual police state' (as Mr Hayes knows very well, being, as he is, an Englishman).

    In fact, since the current government have come into power, the level of policing in the UK has diminished to the same sort of levels recorded back in the 1970s.

    I for one (living in south east London) would feel a lot safer if there were more police cameras.

    ----------------------------
  • by troxey ( 195156 ) on Tuesday December 19, 2000 @12:44AM (#549701)
    Actually, it seems to me that this technology could have a substantial impact on several kinds of disabilities. There are folks who can move their eyes but can't control keyboards and mice. See for instance anyone with Lou Gehrig's disease or any number of other injuries or illnesses. Although this would also be of use to the normally sighted and functional as well. Just my thoughts - I could be wrong..
  • by Paul Crowley ( 837 ) on Tuesday December 19, 2000 @03:53AM (#549702) Homepage Journal
    I read a paper on gaze-tracking interfaces a few years ago. When they implemented the hierarchical menus, they decided that simply gazing at a menu option briefly should trigger the display of the sub-menu, since it was easy to reverse if you looked elsewhere in the menu.

    It turned out that people tended to look up and down on a menu to choose an option, and their gaze would fall on the option that they'd eventually select long before the decision to select it was complete. This had the unnerving effect that users felt that the machine knew what they wanted before they did.
    --
  • by gunner800 ( 142959 ) on Tuesday December 19, 2000 @12:20AM (#549703) Homepage
    Once this technology is dependable and affordable, I think it will be a big deal. Not only is eye motion easy for a human to perform, but it's very natural. That means newbie users can (in theory) use an old form of communication rather than having to depend entirely on a new skill set.

    With eye tracking, software can notice that the user is looking all over the screen, probably trying to find the right menu item or command. This is a signal to pop up help or (on Windows machines) advertise instruction manuals for sale at Amazon. If you know what you're doing, turn that feature off. If you don't, your software is a lot friendlier with it on.

    Maybe no more scrolling? Your computer tracks how quickly you read and moves the text at that rate (see the sketch after this comment). This would be a boon for people with weak hands, for whatever reason.

    And, of course, aiming in Quake will never be the same after the "eyes of death" patch.

    Anyway, cool tech.


    My mom is not a Karma whore!
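    The "no more scrolling" idea above could be prototyped very simply: scroll only while the gaze dwells in the bottom band of the window, at a rate that grows with how deep into the band it sits. A sketch, where scroll_step and the window API are hypothetical:

        def scroll_step(gaze_y, view_height, band=0.2, max_px_per_tick=12):
            """Pixels to scroll this tick, given gaze_y in window coordinates."""
            band_top = view_height * (1.0 - band)
            if gaze_y < band_top:
                return 0  # reader is still mid-page; don't move the text
            depth = (gaze_y - band_top) / (view_height - band_top)
            return int(max_px_per_tick * min(depth, 1.0))

        # Called once per frame, e.g.: window.scroll_by(scroll_step(gaze.y, window.h))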

  • by DrWiggy ( 143807 ) on Tuesday December 19, 2000 @12:25AM (#549704)
    On the car front, Volvo are apparently keen on eye-tracking technology. As you probably know, Volvo pride themselves on making the safest cars in the world (seeing as they more or less invented crash testing, I think we should let them keep that title), and there have been reports on the sort of technology they want to implement in the near future.

    One of these technologies is eye tracking. A small sensor would be mounted in the ceiling above the driver's seat and track the movement of the head and in particular the pupils of the eyes. The details are sketchy, particularly with regard to how this information would be used, and as to what happens when the person is wearing glasses or corrective lenses.

    I suppose in principle you could detect drowsiness, lack of concentration, etc., and that information may be useful to the driver there and then. The only problem is, if it's all going to a black box, insurance companies are going to want the information to work out how often you checked your mirrors, whether you constantly look at your passenger as you are talking to them, etc., and I'm not sure what the safety advantage is in doing that.
  • by morie ( 227571 ) on Tuesday December 19, 2000 @12:41AM (#549705) Homepage

    If they fit a M$ box with this, you can't even look at it anymore without crashing it...

    Mmmm. So how is this new?
  • by irktruskan ( 235269 ) <robb.livewiretech@net> on Tuesday December 19, 2000 @01:07AM (#549706) Homepage
    Just to throw my 2 cents in on this, as is the tradition of all /. types - the first thing that popped into my head was using it to determine the window focus in X. I have a nasty habit of turning to my Linux PC, typing, and thinking whatever window I was looking at was the active one.

    Then again, maybe I'm lazy beyond belief.

    -r0bb

  • by jsse ( 254124 ) on Tuesday December 19, 2000 @12:50AM (#549707) Homepage Journal

    During my research we used a similar eye-tracking device to help change the viewpoint of an observer in a VR environment. Our test subjects all experienced nausea and vomiting.

    We failed to get more healthy subjects to continue our research; we couldn't solve the problem of stabilizing fast eye movement for use in a control device. I am really interested in knowing what technology they have deployed to make it useful.

  • by Dan Hayes ( 212400 ) on Tuesday December 19, 2000 @12:20AM (#549708)

    This kind of technology worries me. I mean, it's bad enough that places like England have been turned into virtual police states through the installation of vast numbers of CCTV cameras throughout urban areas, but what if these cameras could not only track where you go and what you do, but everything you even look at?!

    Sure, this technology is far too primitive to deal with that kind of surveillance at the moment, but once the initial proof of concept is there, advances come quickly, and in a few years' time a camera may be able to track your eye movements from a hundred yards away.

    This really sounds like a horrible application to me, but you can just bet that law enforcement agencies and intelligence agencies around the world would love to be able to install these devices in as many places as possible. And can you imagine what advertisers and market research people would pay for this data? Your every action could be profiled and filed for use in targeted marketing schemes based on what you look at in stores.

    I'm not saying it'll happen, but it's still a damn scary idea.
