Researchers from the Duke University Center for Neuroengineering have developed the first two-way interface between a brain and a machine. The work, published in the journal Nature, describes how the novel brain-machine-brain interface was demonstrated with monkeys controlling their on-screen avatars.
Two monkeys were trained to use the electrical activity in their motor cortex to control the arm of an on-screen avatar without making any physical movement. When a monkey placed the avatar's virtual arm over one of three objects, tactile feedback was delivered via continuous electrical stimulation of the monkey's primary somatosensory cortex, which the monkeys interpreted as texture. A different electrical stimulus was fed back depending on which object the virtual arm was placed over, giving the monkeys the impression of a distinct texture for each object. The monkeys were then trained to pick out a particular object across repeated trials based on its texture alone.
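The closed-loop scheme described above can be sketched in a few lines: decoded motor-cortex activity moves the virtual arm, and whichever object it rests on selects a distinct stimulation pattern. This is a minimal illustrative sketch, not the study's actual implementation; all names and parameter values (object labels, pulse frequencies, the decoder stub) are hypothetical.

```python
# Hypothetical mapping from virtual objects to stimulation patterns.
# Each object is paired with a distinct pattern (here, a pulse frequency
# in Hz) that the monkey learns to interpret as a texture. Values are
# illustrative, not taken from the study.
TEXTURE_PATTERNS = {
    "object_a": 100,  # high-frequency pulse train
    "object_b": 50,   # low-frequency pulse train
    "object_c": 0,    # no stimulation (null texture)
}

def decode_arm_position(motor_cortex_activity):
    """Placeholder for the decoder mapping neural activity to arm position."""
    ...

def feedback_for(hovered_object):
    """Return the stimulation frequency for the object under the virtual arm."""
    if hovered_object is None:
        return None  # arm is not over any object: no feedback
    return TEXTURE_PATTERNS[hovered_object]
```

In this sketch the monkey's task of discriminating objects by texture reduces to discriminating stimulation patterns, since the objects themselves are only distinguished by the feedback they trigger.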
From the Nature press release:
A major challenge, the authors say, was to keep the sensory input and the motor output from interfering with each other, because the recording and stimulating electrodes were placed in connected brain tissue. The researchers solved the problem by alternating between a situation in which the brain–machine–brain interface was stimulating the brain and one in which motor cortex activity was recorded; half of every 100 milliseconds was devoted to each process.
“This enforces some constraints on the exchange of information between the sensory and motor areas,” says neuroscientist Stefano Panzeri of the Italian Institute of Technology in Genoa, who was not involved in the study. But because the animals learn to use the information, this experiment shows that the brain can exchange information even under these constraints, he explains.
This bidirectional communication is a critical step in the development of brain–machine interfaces, says Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester, UK, who was also not involved in the study. Previous brain–machine interfaces have relied on visual feedback, a less-than-ideal situation for someone trying to use a robotic prosthetic, he says. “If you want to reach and grasp a glass, visual feedback won’t help you,” says Quian Quiroga. “It’s the sensory feedback that tells you if you have a good grip or if you are about to drop it.”
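The alternation scheme described in the press-release excerpt is a simple form of time-division multiplexing: within each 100-millisecond cycle, one half is devoted to recording motor cortex activity and the other half to stimulating sensory cortex, so the two processes never interfere. The sketch below shows only that scheduling idea; the cycle length and 50/50 split come from the article, while the function names are illustrative.

```python
# Minimal sketch of the record/stimulate time-multiplexing described in
# the article: half of every 100 ms cycle is devoted to each process.
CYCLE_MS = 100
RECORD_WINDOW_MS = 50  # first half records; second half stimulates

def phase_at(t_ms):
    """Return which process owns the interface at time t (in milliseconds)."""
    return "record" if (t_ms % CYCLE_MS) < RECORD_WINDOW_MS else "stimulate"
```

For example, the interface records during 0–49 ms, stimulates during 50–99 ms, and the pattern repeats every cycle, which is the constraint Panzeri notes the animals learned to work within.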
This early experimental work is part of a larger project to develop prosthetic technology capable of restoring motor control and tactile feedback to patients with spinal cord injuries.
Abstract in Nature: Active tactile exploration using a brain–machine–brain interface