Methods for controlling robotic arms are closely related to those used to manipulate powered prosthetic arms. At MIT, scientists have been working on improving how robots interact with humans, and this research should help make prostheses more intuitive for amputees to control.
The team from the institute’s Computer Science and Artificial Intelligence Laboratory has developed a way to link a person’s brainwaves and hand gestures to the movement of an advanced robotic arm. The system is effective enough to let the user make real-time corrections to the robot’s actions, guiding it to the precise move intended.
The human operator wears an EEG (electroencephalography) cap that detects brain activity, while an EMG (electromyography) sensor system placed on the forearm monitors muscle movement. A computer scans the wearer’s brain-wave activity for so-called “error-related potentials” — signals the brain naturally produces when a person notices a mistake. Once a mistake is detected, the system “listens” to the user via the EMG sensors, letting the human indicate, with a gesture, the corrective action the robot should take.
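The decision flow described above — watch the EEG for an error signal, then fall back to EMG gestures to pick the correction — can be sketched roughly as follows. This is a minimal illustration, not the team’s actual implementation: the function names, the threshold-based ErrP “detector” (standing in for a trained classifier), and the two-channel gesture comparison are all assumptions made for clarity.

```python
# Hypothetical sketch of the EEG + EMG supervisory loop described above.
# All thresholds, signal shapes, and names are illustrative assumptions.

def detect_errp(eeg_window, threshold=0.5):
    """Flag an error-related potential (ErrP) when the mean amplitude of
    the EEG window exceeds a calibrated threshold. A real system would
    use a classifier trained on the wearer's brain activity instead."""
    score = sum(eeg_window) / len(eeg_window)
    return score > threshold

def classify_gesture(emg_left, emg_right):
    """Pick the corrective target by comparing muscle activation (RMS)
    measured by forearm EMG sensors on each side."""
    rms = lambda channel: (sum(x * x for x in channel) / len(channel)) ** 0.5
    return "left" if rms(emg_left) > rms(emg_right) else "right"

def supervise(eeg_window, emg_left, emg_right, planned_target):
    """If the EEG signals that the operator perceived a mistake,
    override the robot's planned target with the gestured one;
    otherwise let the robot proceed as planned."""
    if detect_errp(eeg_window):
        return classify_gesture(emg_left, emg_right)
    return planned_target
```

The key design point the sketch captures is that the EEG channel acts only as a binary “something is wrong” trigger, while the richer directional information comes from the EMG gesture.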
By applying this technology to prostheses, it should be possible to train these powered devices to respond more faithfully to what their users actually intend.
The overall functionality of the system is explained and demonstrated in this video:
Project page: Supervising Robots with Muscle and Brain Signals…