Researchers at Korea University and TU Berlin have been working on integrating brain monitoring into a robotic exoskeleton to allow severely disabled people to move on their own. The system uses an EEG cap to detect steady-state visual evoked potentials (SSVEPs), brain signals generated when a person looks at a flashing light.
In the experimental setup, an array of lights strobes at different frequencies, with each light mapped to a specific motion such as “turn left.” The user of the exoskeleton simply focuses on one of the lights; because the brain’s response locks onto the flicker, the EEG system can tell which light the user is watching by finding the matching frequency in the recorded brain signals.
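The core idea can be sketched in a few lines. This is an illustrative example, not the researchers’ actual pipeline: the function name, sampling rate, and candidate frequencies are all assumptions, and the simplest possible detector (peak spectral power via an FFT) stands in for their processing.

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs):
    """Pick the stimulus frequency with the most spectral power in the EEG."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Sum power in a narrow band around each candidate flicker frequency.
    scores = [power[(freqs >= f - 0.2) & (freqs <= f + 0.2)].sum()
              for f in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]

# Simulated 3-second recording: a noisy 13 Hz flicker response.
np.random.seed(0)
fs = 256
t = np.arange(0, 3, 1.0 / fs)
eeg = np.sin(2 * np.pi * 13 * t) + 0.5 * np.random.randn(len(t))
print(classify_ssvep(eeg, fs, [9.0, 11.0, 13.0, 15.0]))  # → 13.0
```

In a real brain-computer interface the winning frequency would then be looked up in a table mapping each flicker rate to an exoskeleton command.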
While the individual components of the system are nothing new, integrating them into a working prototype was a serious challenge because the exoskeleton produces a lot of electrical noise that washes out the EEG signal. The researchers developed a special algorithm to isolate the target signals from that noise; in tests with healthy subjects it achieved an activation accuracy of about 91 percent, with a mean response time of three seconds.
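The paper does not spell out the algorithm here, but a common noise-robust approach for SSVEP detection is to correlate the recording against clean sine/cosine reference signals at each candidate frequency (a simplified, single-channel form of the widely used canonical correlation method). The sketch below assumes that technique; the command names and frequencies are invented for illustration.

```python
import numpy as np

def reference_correlation(eeg, fs, f, harmonics=2):
    """Score how strongly the EEG segment correlates with sine/cosine
    references at frequency f and its harmonics."""
    t = np.arange(len(eeg)) / fs
    score = 0.0
    for h in range(1, harmonics + 1):
        for ref in (np.sin(2 * np.pi * h * f * t),
                    np.cos(2 * np.pi * h * f * t)):
            score += np.corrcoef(eeg, ref)[0, 1] ** 2
    return score

def detect_command(eeg, fs, freq_to_command):
    """Return the command whose flicker frequency best matches the EEG."""
    best = max(freq_to_command, key=lambda f: reference_correlation(eeg, fs, f))
    return freq_to_command[best]

# Simulated 3-second segment: a 13 Hz response buried in heavy noise,
# standing in for electrical interference from the exoskeleton motors.
np.random.seed(0)
fs = 256
t = np.arange(0, 3, 1.0 / fs)
eeg = np.sin(2 * np.pi * 13 * t) + 2.0 * np.random.randn(len(t))
commands = {9.0: "walk forward", 11.0: "turn right",
            13.0: "turn left", 15.0: "stop"}
print(detect_command(eeg, fs, commands))  # → turn left
```

Because the reference templates carry no noise, the correlation stays detectable even when the raw signal-to-noise ratio is poor, which is the regime a motorized exoskeleton creates.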
Here’s a video of a volunteer trying out the new brain-controlled exoskeleton:
Study in Journal of Neural Engineering: A lower limb exoskeleton control system based on steady state visual evoked potentials…