Media outlets are reporting that Google is creating some form of consumer electronics glasses, with the ambitious goal of launching a product based on the technology within the year. While many of these reports have included sketchy details on how the glasses might present information to the wearer, the true novelty will be the way the devices empower their owners to interact with real-world objects.
Many traditional interfaces, from steering wheels to touch screens to voice recognition to computer mice, demand a relatively high degree of deliberate effort to manipulate the objects around us. By comparison, an interface based on tracking the motion of the human eye could be so intuitive that in many cases it would be barely perceptible.
Let's consider a simple example of how this might work. Imagine an elevator designed to be controlled by riders wearing gaze-tracking glasses. When a passenger wearing such glasses enters the elevator, she could look at the panel of buttons to select her floor. Through eye tracking, the glasses could identify which button she is focused on, "read" its label (e.g., "floor 14"), and overlay a user interface element in her visual field. By simply continuing to stare at the button, she could indicate her desired destination; the glasses could wirelessly transmit that selection to a receiver installed in the elevator, and the real-world button's light could illuminate to confirm that the floor was selected.
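The interaction above boils down to a simple loop: detect a labeled button under the wearer's gaze, wait for a long enough stare, then transmit the request. Here's a minimal sketch of that logic; the dwell threshold, message format, and function names are all invented for illustration, since no real Glasses API has been published.

```python
# Hypothetical sketch of the elevator interaction described above.
# The dwell threshold and message format are assumptions, not a real protocol.

DWELL_THRESHOLD_S = 1.5  # assumed: stare this long to confirm a selection

def select_floor(fixation_label, dwell_seconds, send):
    """If the wearer has stared at a labeled button long enough,
    transmit the request via `send` and report success."""
    if fixation_label is None:
        return False                     # no button under the gaze point
    if dwell_seconds < DWELL_THRESHOLD_S:
        return False                     # still deciding; keep the overlay visible
    send({"action": "select_floor", "label": fixation_label})
    return True

# The glasses would call this each frame with the current fixation.
sent = []
assert select_floor("floor 14", 0.4, sent.append) is False   # glance: too brief
assert select_floor("floor 14", 2.0, sent.append) is True    # sustained stare
assert sent == [{"action": "select_floor", "label": "floor 14"}]
```

In a real device, the `send` callback would hand the message to a wireless radio, and the fixation label would come from recognizing the button's text in the scene.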
To an outside observer watching the above scenario unfold, it might appear that the elevator's passenger had some kind of magical, telekinetic ability. However, the interaction described could be implemented via a refined integration of existing technologies, and there are many hints that Google is working to build and deploy exactly this kind of device.
The human eye is in nearly constant motion, performing rapid saccades and vergence movements to bring objects of interest into focus. During typical use of a modern personal computer, the number of eye movements dwarfs the number of keystrokes on a keyboard or the movements and clicks of a computer mouse. These many subtle eye movements constitute a rich expression of the viewer's thoughts and intent. For this reason, eye tracking is second only to direct neural activity monitoring in its capacity to immediately reveal a wealth of cognitive data, and many media and advertising companies have been using eye tracking technology internally in product studies for years.
Google blogged about its own use of eye tracking for web search usability testing in 2009. Given the strong potential of this technology, it makes sense that Google and other companies are eager to be first-to-market with consumer products.
It's likely that Google is building a device that will implement electrooculography (EOG). EOG uses small electrodes, which can be incorporated into the rims of glasses, to determine the position of the eyes by measuring the standing electric potential between the cornea and the retina. Eye position can be measured in a broad range of lighting conditions, even in total darkness and when the eyelids are closed. A video from ETH Zürich demonstrates prototype EOG glasses in action.
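Within a modest range, the voltage EOG electrodes pick up varies roughly linearly with horizontal gaze angle, so a device can calibrate by recording the voltage while the wearer fixates two known points. The following is a simplified sketch of that idea; the linear model, sample voltages, and angles are illustrative assumptions (real systems must also cope with drift, blinks, and noise).

```python
# Simplified sketch of EOG calibration: map electrode voltage (microvolts)
# to horizontal gaze angle (degrees) using two calibration fixations.
# Voltages and angles below are made-up example values, not measured data.

def calibrate(v_left, v_right, angle_left=-30.0, angle_right=30.0):
    """Fit a line through two (voltage, angle) calibration points and
    return a function converting voltage readings to gaze angles."""
    slope = (angle_right - angle_left) / (v_right - v_left)
    offset = angle_left - slope * v_left
    return lambda v: slope * v + offset

# Wearer looks 30 degrees left, then 30 degrees right, while voltage is recorded.
to_angle = calibrate(v_left=-600.0, v_right=600.0)  # ~20 uV/degree, assumed
assert abs(to_angle(0.0)) < 1e-9            # centered gaze reads as zero
assert abs(to_angle(300.0) - 15.0) < 1e-9   # halfway right reads 15 degrees
```

This kind of arithmetic is cheap enough to run continuously on a battery-powered wearable, which is part of EOG's appeal over camera-based tracking.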
There are other eye tracking methods which have better accuracy (e.g., special tracking contact lenses) or which don't require users to wear tracking devices on their person (e.g., video vision tracking). However, these methods have several downsides, ranging from requiring more invasive eye tracking equipment to being subject to interference from lighting conditions to requiring lots of computational analysis that's not currently possible in ultra-portable electronics. For all of these reasons, EOG is the best candidate tracking technology at the moment.
Successfully deploying this technology will require the development of powerful, efficient image analysis and eye tracking software coupled with a revolutionary interface designed for sight navigation. While many media outlets are speculating about augmented reality applications for the glasses, and these are surely a possibility, a successful interface would have to remain unobtrusive in order to avoid blocking sight lines or distracting wearers.
In the elevator scenario above, for example, as the passenger stares at the button for the floor she wants to visit, a red stop sign icon could appear in the periphery of her visual field. Looking directly at the icon would avert her gaze from the button and cancel the action. If she continued to stare at the button instead, the icon could change from red stop sign to yellow triangle to green circle, confirming her selection before the glasses transmit her request to the elevator.
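The stop-sign-to-green-circle progression is essentially a tiny state machine driven by how long the wearer has held her gaze. A minimal sketch, with state names and timing thresholds assumed purely for illustration:

```python
# Sketch of the dwell-to-confirm overlay described above. State names and
# the timing thresholds are assumptions for illustration only.

def overlay_state(dwell_seconds, caution_at=1.0, confirm_at=2.0):
    """Map how long the wearer has stared at a button to an overlay icon.
    Glancing away would reset the dwell timer (handled by the caller)."""
    if dwell_seconds >= confirm_at:
        return "confirmed"   # green circle: transmit the request
    if dwell_seconds >= caution_at:
        return "caution"     # yellow triangle: selection is imminent
    return "stop"            # red stop sign: look here to cancel

assert overlay_state(0.2) == "stop"
assert overlay_state(1.4) == "caution"
assert overlay_state(2.5) == "confirmed"
```

Keeping the icon in the periphery, and letting a direct glance at it cancel the action, is what keeps such an interface unobtrusive rather than a distraction.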
From a marketing and sales perspective, the opportunity for Google to integrate the device with its other technologies (Android phones, Chrome OS computers, etc.) and to license the technology for other devices is huge. Owners of the glasses could be eager to replace the light switches in their homes with ones they can activate by sight, or to buy robotic vacuum cleaners that navigate to whatever dirty spot on the floor they stare at. Paraplegics and those with limited mobility could gain new independence.
Given that wireless interfaces for the glasses could be incorporated into so many other electronics, Google's acquisition of Motorola Mobility could be significant beyond enabling first-party manufacturing of Android phones. Imagine Google saying to Sony "You want to make a stereo that can be controlled by Google Glasses? Fine, just buy one of these custom Glasses receivers from us for every unit."
While the expense and immaturity of the technology described may prevent Google from including all the eye tracking features in its first-generation products, considering the difficulty of marketing head-mounted displays, it's almost guaranteed that Google has vision tracking in mind as the "killer app" that would let it overcome the "nerd factor" of pitching people on electronic goggles. Virtual reality and personal display devices have failed commercially in the past because they're perceived as isolating the user and offering a retreat from reality. By contrast, an eye-tracking system that extends a person's reach and influence in the real world could be celebrated and coveted. That's surely what Google is looking to build.