That Kindle Fire you got for Christmas may be pretty cool, but what if it could flip the book pages without requiring you to lift a finger?
A Copenhagen-based company is working to commercialize software, called Senseye, that enables eye-movement-based navigation. A small camera tracks users' eye movements and moves the cursor accordingly. Right now, the product is still in the early stages, but Senseye's creators hope to have it packaged with smartphones by 2013, with a separate attachment for tablets possibly debuting earlier. Watch Senseye in action below.
As with other nascent technologies, like exoskeletons, Senseye demonstrates how need-based initiatives can move from serving the disabled to serving a wide customer base. There are already similar technologies for those who cannot use traditional hand-based navigation systems; for instance, former football player Steve Smith, who suffers from ALS, or Lou Gehrig's disease, communicates with the aid of a computer that detects his eye movements. As the Wall Street Journal has reported, the program QuickGlance has helped a student with cerebral palsy to study at the Rochester Institute of Technology. Watching 20-year-old Devin Hamilton--who also uses his elbow to control a computer mouse--do his engineering homework with his eyeballs isn't so different from the demonstration of Senseye.