
Eye-Based Videogame Control

Posted by Zonk
from the i-can-see-you dept.
dsmith3689 writes "Researchers at Queen's University in Kingston, Ontario have explored the use of an eye tracker as a control device for a handful of commercial video games. To do this, they integrated a Tobii 1750 desktop eye tracker with Quake 2, Neverwinter Nights, and a Flash adaptation of Missile Command called Lunar Command. Their study indicates that direct feedback from eye movements can drastically increase the feeling of immersion (pdf) in the virtual world."
This discussion has been archived. No new comments can be posted.

  • *twitch* (Score:2, Informative)

    by cheese-cube (910830) <cheese.cube@gmail.com> on Wednesday August 02, 2006 @09:00PM (#15836420) Homepage
    They have had technology like this for a while, although not as a method of input or control. Special eye-tracking machines are used for hazard-perception experiments with automobiles. I think eye tracking as an input device would be very hard to get used to. The human eye is a pretty amazing piece of hardware, and I think a machine would have a hard time utilising it. Additionally, you'd have to have a special filter for crack addicts who have developed twitches. They could also implement shortcuts, where rolling your eyes opens Firefox and navigates to Slashdot =P
  • Re:*twitch* (Score:4, Informative)

    by ohmypolarbear (774072) on Wednesday August 02, 2006 @10:59PM (#15836973)
    To add to this: we use eye-tracking systems in my brain lab (at a major research university). It is, in fact, highly unusual for subjects to look at only one thing, or even to look at whatever they are acting on at the moment they act on it. There are many extra eye movements (saccades) to other areas of a scene for planning and multitasking, even before the person is conscious of their plans. Here are two papers relating to eye tracking and games in particular:

    motion tracking and planning:
    Ripoll H. Percept Mot Skills. 1989 Apr;68(2):507-12. [nih.gov]

    multitasking:
    Cavanagh P, Alvarez GA. Trends Cogn Sci. 2005 Jul;9(7):349-54. [nih.gov]

    Needless to say, any successful attempt at eye-tracking control for something like video games would require a lot of sophisticated programming to figure out the user's intentions. From my own personal experience, especially in FPS games, I rarely look where I'm shooting. I would like to keep my sensors (eyeballs) and effectors (hands/feet/other body parts) separate, so I can take in more information and perform more actions simultaneously. It would also prevent any weird interactions if the training provided by the games affects the way hardcore gamers interact with the real world (although those would be very interesting to study).
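
As a rough illustration of the kind of filtering the parent comment describes, here is a minimal sketch in Python of a dispersion-threshold (I-DT) fixation filter, one common way to separate deliberate fixations from saccades before treating gaze as input. This is not code from the Queen's study or from any Tobii SDK; the GazeSample fields and the dispersion/duration thresholds are illustrative assumptions.

    # Minimal I-DT fixation filter sketch. Thresholds and data layout are
    # illustrative assumptions, not values from the study or the eye tracker.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class GazeSample:
        t: float  # timestamp in seconds
        x: float  # horizontal gaze position in pixels
        y: float  # vertical gaze position in pixels

    def _dispersion(window: List[GazeSample]) -> float:
        # Spatial spread of a window: (max_x - min_x) + (max_y - min_y).
        xs = [s.x for s in window]
        ys = [s.y for s in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    def detect_fixations(samples: List[GazeSample],
                         max_dispersion: float = 30.0,   # pixels
                         min_duration: float = 0.10      # seconds
                         ) -> List[Tuple[float, float, float, float]]:
        """Return (start_t, end_t, centroid_x, centroid_y) for each fixation.

        A window counts as a fixation when its dispersion stays under
        max_dispersion for at least min_duration; everything else is
        treated as a saccade (or twitch) and ignored as input.
        """
        fixations = []
        i, n = 0, len(samples)
        while i < n:
            # Grow a window until it spans at least min_duration.
            j = i
            while j < n and samples[j].t - samples[i].t < min_duration:
                j += 1
            if j >= n:
                break
            if _dispersion(samples[i:j + 1]) <= max_dispersion:
                # Extend the window while the gaze stays put.
                while j + 1 < n and _dispersion(samples[i:j + 2]) <= max_dispersion:
                    j += 1
                window = samples[i:j + 1]
                cx = sum(s.x for s in window) / len(window)
                cy = sum(s.y for s in window) / len(window)
                fixations.append((window[0].t, window[-1].t, cx, cy))
                i = j + 1
            else:
                i += 1
        return fixations

A game layer could then map fixation centroids to aim or selection targets while discarding raw saccades, which is one simple way to keep stray eye movements from being interpreted as commands.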
