Submission Summary: 0 pending, 2 declined, 2 accepted (4 total, 50.00% accepted)

White House Responds to Net Neutrality Petition

bostonidealist (2009964) writes "The White House has officially responded to a We The People petition created on January 15, 2014, which urged "...the President to direct the FCC to classify ISPs as 'common carriers'" after the U.S. Court of Appeals for the D.C. Circuit "struck down the Federal Communications Commission's open internet rules." The White House statement indicates that "absent net neutrality, the Internet could turn into a high-priced private toll road that would be inaccessible to the next generation of visionaries," but notes that "The FCC is an independent agency. Chairman Wheeler has publicly pledged to use the full authority granted by Congress to maintain a robust, free and open Internet — a principle that this White House vigorously supports.""
Cloud

Amazon Matches iTunes Match With New "Audio Upgrade" Feature

bostonidealist (2009964) writes "Just after the July 6th one-year anniversary of its unlimited music storage promotion (and presumably after early subscribers have all renewed their annual subscriptions), Amazon.com has changed the way its Cloud Player and Cloud Drive services work. Starting today, music uploaded to a Cloud Drive will count against its owner's Cloud Drive quota and will not be accessible through Cloud Player. Further, music files previously uploaded to Cloud Player or Cloud Drive are being automatically converted to 256 Kbps audio whenever Amazon "has the rights to do so," and new audio files uploaded to Cloud Player will automatically be checked against Amazon's music database in iTunes Match-like fashion. One of the appeals of Amazon's Cloud Player service up to this point has been that users could pay a flat fee and store an unlimited number of their own music files (with their own tags, artwork, and audio data intact). Now, Amazon is automatically replacing users' previously uploaded data with its own, without allowing users to opt in or out."
Android

Google Eyeing Vision Tracking For Glasses?

bostonidealist (2009964) writes "Google has been using eye tracking technology in internal product testing for years. Given the company's plans to release electronic glasses by year's end, is it possible that Google intends to include electrooculography (EOG) sensors in the product, such as those demonstrated in this prototype from ETH Zürich? A refined integration of EOG, machine vision, and wireless technologies could allow wearers to manipulate real-world objects just by staring at them."
Google

Where Google Is Going

bostonidealist (2009964) writes "Soon, we'll be able to move objects just by staring at them.

Media outlets are reporting that Google is creating some form of consumer electronics glasses, with the ambitious goal of launching a product based on the technology within the year. While many of these reports have included sketchy details on how the glasses might present information to the wearer, the true novelty will be the way the devices empower their owners to interact with real-world objects.

Many traditional interfaces, from steering wheels to touch screens to voice recognition to computer mice, require a relatively high degree of exertion to manipulate the objects around us. By comparison, an interface based on tracking the motion of the human eye could be so intuitive that, in many cases, the interface itself would be barely perceptible.

Let's consider a simple example to see how this might work. Imagine an elevator designed to be controllable by riders wearing gaze-tracking glasses. When a passenger wearing such glasses enters the elevator, she could look at the panel of buttons to select the floor she wants to visit. The glasses could identify the wearer's focus on a particular button through eye tracking, "read" the label of the button she was staring at (e.g., "floor 14"), and overlay a user interface element in her visual field. By simply continuing to stare at the elevator button, she could indicate her desired destination; her glasses could wirelessly transmit this information to a receiver installed in the elevator, and the real-world button's light could then illuminate, indicating that the floor had been selected as her destination.
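
To make the sequence concrete, here is a minimal Python sketch of the dwell-to-select loop described above. Everything in it (the GazeSelector class, the tracker/reader/transmitter objects, and the 1.5-second dwell threshold) is a hypothetical illustration of how such glasses might be wired together, not a description of any actual Google hardware or API.

    import time

    DWELL_SECONDS = 1.5  # assumed fixation time needed to confirm a selection

    class GazeSelector:
        """Dwell-to-select loop: fixate on a labeled button long enough and a
        selection message is sent to a nearby receiver (e.g., the elevator)."""

        def __init__(self, tracker, reader, transmitter):
            self.tracker = tracker          # yields the current (x, y) gaze point
            self.reader = reader            # OCRs the label under that point, e.g. "floor 14"
            self.transmitter = transmitter  # sends the selection over a wireless link
            self.current_label = None
            self.dwell_start = None

        def update(self):
            """Call this on every frame of gaze data."""
            gaze_point = self.tracker.current_gaze()
            label = self.reader.label_at(gaze_point)  # None if not looking at a button
            if label != self.current_label:
                # Gaze moved to a new target (or away from all targets); restart the timer.
                self.current_label = label
                self.dwell_start = time.monotonic() if label else None
            elif (label and self.dwell_start is not None
                  and time.monotonic() - self.dwell_start >= DWELL_SECONDS):
                self.transmitter.send({"action": "select_floor", "label": label})
                self.dwell_start = None  # require a fresh fixation before selecting again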

To an outside observer watching the above scenario unfold, it might appear that the elevator's passenger had some kind of magical, telekinetic ability. However, the interaction described could be implemented via a refined integration of existing technologies, and there are many hints that Google is working to build and deploy exactly this kind of device.

The human eye performs vergence and accommodation movements to focus on objects of interest nearly every waking moment. During typical use of a modern personal computer, the number of motions made by the eyes dwarfs the number of keystrokes made on a keyboard or the movements and clicks of a computer mouse. These many subtle eye movements constitute a rich expression of the viewer's thoughts and intent. For this reason, eye tracking is second only to direct neural activity monitoring in its capacity to immediately reveal a wealth of cognitive data, and many media and advertising companies have been using eye tracking technology internally in product studies for years.

Google blogged about its own use of eye tracking for web search usability testing in 2009. Given the strong potential of this technology, it makes sense that Google and other companies are eager to be first-to-market with consumer products.

It's likely that Google is building a device that will implement electrooculography (EOG). EOG uses small electrodes, which can be incorporated into the rims of glasses, to determine the position of the eyes by measuring the electric potential field of the eyes themselves. Eye position can be measured in a broad range of lighting conditions, even in total darkness and when the eyelids are closed. You can watch a video demonstrating prototype EOG glasses from ETH Zürich here.
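
As a rough illustration of the signal involved, the sketch below maps the voltage differences across two electrode pairs to approximate gaze angles. The linear model and the sensitivity constant (a ballpark figure on the order of tens of microvolts per degree) are simplifying assumptions; real EOG processing also has to filter out drift, blinks, and muscle noise.

    def gaze_angles(h_left_uV, h_right_uV, v_lower_uV, v_upper_uV, uV_per_degree=16.0):
        """Estimate horizontal and vertical gaze angles (in degrees) from the
        potentials measured by electrode pairs beside and above/below the eye.

        The cornea is electrically positive relative to the retina, so rotating
        the eye toward an electrode raises the potential it measures; to a first
        approximation the difference across a pair grows linearly with gaze angle."""
        horizontal_deg = (h_right_uV - h_left_uV) / uV_per_degree
        vertical_deg = (v_upper_uV - v_lower_uV) / uV_per_degree
        return horizontal_deg, vertical_deg

    # Example: +160 microvolts across the horizontal pair maps to roughly
    # 10 degrees of rightward gaze under the assumed sensitivity.
    print(gaze_angles(0.0, 160.0, 0.0, 0.0))  # -> (10.0, 0.0)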

There are other eye tracking methods that have better accuracy (e.g., special tracking contact lenses) or that don't require users to wear tracking devices on their person (e.g., video-based gaze tracking). However, these methods have significant downsides: they require more invasive equipment, are subject to interference from lighting conditions, or demand computational analysis that isn't currently feasible in ultra-portable electronics. For all of these reasons, EOG is the best candidate tracking technology at the moment.

Successfully deploying this technology will require the development of powerful, efficient image analysis and eye tracking software coupled with a revolutionary interface designed for sight navigation. While many media outlets are speculating about augmented reality applications for the glasses, and these are surely a possibility, a successful interface would have to remain unobtrusive in order to avoid blocking sight lines or distracting wearers.

In the above elevator scenario, for example, as the passenger stares at the button for the floor she wants to visit, a red stop sign icon could appear in the periphery of her visual field. Looking directly at the stop sign icon would avert her gaze from the target elevator button and cancel the action. Alternatively, if she continued to stare at the button, the icon could change from red stop sign to yellow triangle to green circle, confirming her selection before transmitting her request to the elevator.
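
One way to picture that confirmation sequence is as a small dwell-driven state machine, sketched below. The icon names and timing thresholds are illustrative assumptions, not a description of any real interface.

    from enum import Enum

    class Icon(Enum):
        RED_STOP = "red stop sign"
        YELLOW_TRIANGLE = "yellow triangle"
        GREEN_CIRCLE = "green circle"  # reaching this stage confirms the selection

    # Assumed dwell time (seconds) on the target button needed to reach each stage.
    STAGES = [(0.0, Icon.RED_STOP), (1.0, Icon.YELLOW_TRIANGLE), (2.0, Icon.GREEN_CIRCLE)]

    def confirmation_icon(dwell_seconds, looked_at_cancel_icon):
        """Return the icon to overlay in the wearer's periphery, or None if the
        action was cancelled by glancing at the stop sign icon."""
        if looked_at_cancel_icon:
            return None
        icon = STAGES[0][1]
        for threshold, stage_icon in STAGES:
            if dwell_seconds >= threshold:
                icon = stage_icon
        return icon

    # Only once GREEN_CIRCLE is reached would the glasses transmit the floor
    # request to the elevator's receiver.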

From a marketing and sales perspective, the opportunity for Google to integrate the device with its other technologies (Android phones, Chrome OS computers, etc.) and to license the technology for other devices is huge. Users of the glasses could become eager to replace light switches in their homes with ones that they can sight-activate and purchase new robotic vacuum cleaners that can navigate to a dirty spot on the floor that's being stared at. Paraplegics and those with limited mobility could gain new independence.

Given that wireless interfaces for the glasses could be incorporated into so many other electronics, Google's acquisition of Motorola Mobility could be significant beyond enabling first-party manufacturing of Android phones. Imagine Google saying to Sony, "You want to make a stereo that can be controlled by Google Glasses? Fine, just buy one of these custom Glasses receivers from us for every unit."

The expense and immaturity of the technology described may prevent Google from including all of these eye tracking features in its first-generation products. But considering the difficulty of marketing head-mounted display technology, it's almost guaranteed that Google has vision tracking in mind as a "killer app" that would allow it to overcome the "nerd factor" of trying to pitch people on electronic goggles. Virtual reality and personal display devices have failed commercially in the past because they're perceived as a way of isolating the user and retreating from reality. By contrast, an eye-tracking system that extends a person's reach and influence in the real world could be celebrated and coveted. That's surely what Google is looking to build."
