Fully paralyzed folks are severely limited in their ability to interact in the real world. Luckily, the virtual world of gaming and 3D environments like Second Life does not require working arms and legs. All that is really needed there is a viable interface for easy control of an avatar on the screen. To make these environments accessible to people with locked-in syndrome resulting from injury or stroke, a project called COGAIN, or Communication by Gaze Interaction, has brought European researchers together to tackle the issue.
From CORDIS News:
According to the researchers, the gaming-with-gaze software runs together with eye trackers currently available on the market. The eye trackers use cameras to monitor users’ eye movements as they gaze at a computer screen.
Eye movements of able-bodied gamers were evaluated by the developers in order to set up a visual heat map that would trigger commands depending on where users look, the team said. The various eye movement patterns are converted into ‘gaze gestures’ which are used to activate movement or action commands.
‘In the current set-up, we have programmed 12 gesture sequences to activate different keyboard or mouse events,’ Professor Istance [Professor Howell Istance, De Montfort University in the UK] told ICT Results. ‘Many more commands are possible but the total number is limited by the users’ memory and the need to differentiate between when someone wants to input a command and when they are just looking at the screen.’
The team said the gaming-with-gaze software should make the avatars of people with disabilities nearly indistinguishable in their behaviour and abilities from those of able-bodied people in online games and environments.
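The excerpt doesn't include any code, but the idea behind dwell-filtered gaze gestures is easy to sketch. The Python below is a hypothetical illustration, not the COGAIN software: the screen-region names, the three-entry gesture table, and the dwell and time-window thresholds are all invented for the example. The real system reportedly maps 12 gesture sequences to keyboard and mouse events; the key shared idea is that a region only counts toward a gesture after the gaze has dwelled on it long enough, so ordinary looking around doesn't fire commands.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical screen regions a gaze gesture can pass through (invented for this sketch).
REGIONS = ("N", "E", "S", "W", "CENTER")

# Illustrative gesture table: a sequence of fixated regions -> a command name.
# The actual software maps 12 sequences to keyboard/mouse events; these three
# entries are made up purely to show the shape of such a mapping.
GESTURE_COMMANDS = {
    ("CENTER", "N", "CENTER"): "move_forward",
    ("CENTER", "E", "CENTER"): "turn_right",
    ("CENTER", "S", "CENTER"): "jump",
}

@dataclass
class GazeGestureRecognizer:
    """Accumulate dwelled-on regions and fire a command when a known
    sequence is completed within a time window."""
    dwell_threshold: float = 0.25   # seconds a region must be fixated to count
    window: float = 2.0             # max seconds allowed for a whole gesture
    _sequence: List[str] = field(default_factory=list)
    _current: Optional[str] = None
    _dwell_start: float = 0.0
    _gesture_start: float = 0.0

    def feed(self, region: str, timestamp: float) -> Optional[str]:
        """Feed one gaze sample (region under the gaze point, timestamp in seconds).
        Returns a command name when a gesture completes, otherwise None."""
        # Drop a stale, half-finished gesture.
        if self._sequence and timestamp - self._gesture_start > self.window:
            self._sequence.clear()

        # The gaze moved to a new region: restart the dwell timer.
        if region != self._current:
            self._current = region
            self._dwell_start = timestamp
            return None

        # Only count the region once the dwell threshold is passed, so that
        # merely scanning the screen does not trigger commands.
        if timestamp - self._dwell_start >= self.dwell_threshold:
            if not self._sequence:
                self._gesture_start = timestamp
            if not self._sequence or self._sequence[-1] != region:
                self._sequence.append(region)
            command = GESTURE_COMMANDS.get(tuple(self._sequence))
            if command:
                self._sequence.clear()
                return command
        return None


if __name__ == "__main__":
    rec = GazeGestureRecognizer()
    # Simulated gaze samples: the user looks at the centre, glances up, returns.
    samples = [("CENTER", 0.0), ("CENTER", 0.3),
               ("N", 0.5), ("N", 0.8),
               ("CENTER", 1.0), ("CENTER", 1.3)]
    for region, t in samples:
        cmd = rec.feed(region, t)
        if cmd:
            print("gesture recognized ->", cmd)   # prints: move_forward
```

In a real eye-tracking pipeline the samples would come from the tracker's camera at a fixed rate, and the recognized command would be translated into the keyboard or mouse event the game expects; both of those pieces are outside this sketch.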
More from CORDIS News…
Link: Communication by Gaze Interaction…