Researchers at the Fraunhofer Institute for Industrial Engineering in Stuttgart have developed an Eye Controlled Interaction system (EYCIN) that tracks users’ eye movements, letting them interact hands-free with any screen-based device…
EYCIN consists of a monitor linked to a camera that tracks the eye, using optical-recognition software to detect the pupil. An on-screen pointer follows the user’s eye movements; when the gaze dwells on a designated area for half a second, EYCIN interprets this as a mouse click.
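The dwell-to-click idea is simple enough to sketch. The class below is purely illustrative (the names, thresholds, and interface are my assumptions, not EYCIN's actual design): it consumes timestamped gaze samples and fires a click when the gaze stays within a small radius for half a second.

```python
import math

DWELL_TIME = 0.5      # seconds the gaze must rest to register a "click"
DWELL_RADIUS = 30.0   # pixels the gaze may wander and still count as one fixation

class DwellClickDetector:
    """Hypothetical sketch: turn (timestamp, x, y) gaze samples into clicks."""

    def __init__(self, dwell_time=DWELL_TIME, radius=DWELL_RADIUS):
        self.dwell_time = dwell_time
        self.radius = radius
        self._anchor = None  # (t, x, y) where the current fixation began

    def update(self, t, x, y):
        """Feed one gaze sample; return the (x, y) of a click, or None."""
        if self._anchor is None:
            self._anchor = (t, x, y)
            return None
        t0, x0, y0 = self._anchor
        if math.hypot(x - x0, y - y0) > self.radius:
            # Gaze moved away: start a new fixation at the current point.
            self._anchor = (t, x, y)
            return None
        if t - t0 >= self.dwell_time:
            self._anchor = None  # reset so one fixation fires only one click
            return (x0, y0)
        return None
```

In practice the detector would be fed at the camera's sample rate (say 30–60 Hz), and the click coordinates handed to the windowing system as a synthetic mouse event.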
Dr Fabian Hermann, a human-computer interface researcher with a background in psychology, is a usability engineer on the project. ‘The project originated from a German industrial company that funded research into a visor and controls to operate a GUI with the eyes,’ he said. ‘It has since been developed into other areas.’
‘Although the first papers we produced concerned hands-free applications for maintenance engineers, EYCIN could also be used for interaction in the home,’ Hermann added.
Hermann said a marketable version of EYCIN is one or two years away. Applications for the final version will depend on how much funding the team receives, but he believes EYCIN could have important uses for people with paraplegia and other conditions that limit computer use.
‘One major European project is focusing on technology interfaces for disabled people. We hope in the course of this project to bring in eye-based interaction,’ he said.
As always with user interfaces, the problems lie less in the technical details than in intuitive usability. Too bad we engineers are too busy making fun of psych students to realize they might have something useful to offer.
More from the Engineer Online…
Link to the Fraunhofer Institute of Industrial Engineering press release…