Brain-computer interfaces have the potential to give severely disabled people easy control of their wheelchairs, televisions, and other devices. Existing technologies, however, suffer from limitations that make them impractical for real-world use.
One limitation is that non-invasive brain wave monitoring currently requires bulky, uncomfortable electroencephalography (EEG) caps with wet electrodes, wires, and adhesives. Putting all of this on is difficult and cumbersome, a far cry from simply donning a hat and having things work right away.
Now, researchers at the Georgia Institute of Technology, the University of Kent in the UK, and Wichita State University in Kansas have collaborated to develop the first truly portable, comfortable, and wireless brain-computer interface. Already tested in six healthy human volunteers, the technology shows clear potential for direct brain control of wheelchairs and other devices by the patients who need it most.
The system brings together flexible electronics, nanomembrane electrodes, and a deep learning algorithm to sense relevant brain waves and accurately translate their meaning. As with similar systems, the new brain-computer interface relies on classifying the visually evoked potentials generated as users look at a flashing screen.
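To make the visually evoked potential idea concrete, here is a minimal sketch of the underlying principle: when a user fixates on a target flickering at a given frequency, EEG over the visual cortex shows elevated power at that frequency, so the attended target can be identified from the spectrum. The flicker frequencies, sampling rate, and simulated signal below are illustrative assumptions, not parameters from the study, which uses a deep learning classifier rather than this simple spectral method.

```python
import numpy as np

FS = 250                                # sampling rate in Hz (assumed)
TARGETS_HZ = [8.0, 10.0, 12.0, 15.0]    # hypothetical flicker frequencies, one per command

def classify_ssvep(eeg_window: np.ndarray) -> int:
    """Return the index of the target whose flicker frequency dominates
    the spectrum of a single-channel EEG window."""
    freqs = np.fft.rfftfreq(eeg_window.size, d=1.0 / FS)
    power = np.abs(np.fft.rfft(eeg_window)) ** 2
    # Sum power in a narrow band around each candidate flicker frequency.
    scores = [power[(freqs > f - 0.5) & (freqs < f + 0.5)].sum() for f in TARGETS_HZ]
    return int(np.argmax(scores))

# Simulate 2 s of noisy EEG while the user looks at the 10 Hz target.
t = np.arange(2 * FS) / FS
eeg = 0.8 * np.sin(2 * np.pi * 10.0 * t) + np.random.randn(t.size)
print(TARGETS_HZ[classify_ssvep(eeg)])  # -> 10.0 (with high probability)
```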
“This work reports fundamental strategies to design an ergonomic, portable EEG system for a broad range of assistive devices, smart home systems and neuro-gaming interfaces,” said Woon-Hong Yeo, an assistant professor at Georgia Tech. “The primary innovation is in the development of a fully integrated package of high-resolution EEG monitoring systems and circuits within a miniaturized skin-conformal system.”
The package consists of a headband with dry electrodes that make contact with the scalp even through hair, a skin-conformal nanomembrane electrode, flexible electronics for power and control, and a deep learning neural network running on those electronics to interpret the signals.
“Deep learning methods, commonly used to classify pictures of everyday things such as cats and dogs, are used to analyze the EEG signals,” said Chee Siang (Jim) Ang, senior lecturer in Multimedia/Digital Systems at the University of Kent. “Like pictures of a dog which can have a lot of variations, EEG signals have the same challenge of high variability. Deep learning methods have proven to work well with pictures, and we show that they work very well with EEG signals as well.”
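The image-classification analogy suggests one way such a classifier might look: a small convolutional network that treats a multichannel EEG window (channels by time samples) as a two-dimensional "image" and outputs one of several SSVEP classes. The sketch below follows that idea; the layer sizes, channel count, and four-class setup are assumptions for illustration, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    def __init__(self, n_channels=4, n_samples=500, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution along the time axis, applied per channel row.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.ReLU(),
            # Spatial convolution mixing all EEG channels together.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.ReLU(),
            nn.AvgPool2d(kernel_size=(1, 10)),
            nn.Flatten(),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            n_feats = self.features(torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        self.classifier = nn.Linear(n_feats, n_classes)

    def forward(self, x):                  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

model = EEGConvNet()
window = torch.randn(8, 1, 4, 500)         # batch of 8 two-second windows at 250 Hz
logits = model(window)                     # (8, 4) class scores
```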
A built-in Bluetooth chip provides wireless communication, allowing the system to connect easily to a wide variety of devices.
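As a rough illustration of that wireless link, here is how a host application might subscribe to EEG packets from a Bluetooth Low Energy device using the Python `bleak` library. The device address, characteristic UUID, and packet handling are placeholders; the study's actual wireless protocol and packet format are not described in this article.

```python
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                    # placeholder address
EEG_CHAR_UUID = "0000abcd-0000-1000-8000-00805f9b34fb"  # placeholder UUID

def on_eeg_packet(_sender, data: bytearray):
    # Hand the raw packet to the classifier pipeline (parsing is device-specific).
    print(f"received {len(data)} bytes of EEG data")

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(EEG_CHAR_UUID, on_eeg_packet)
        await asyncio.sleep(10.0)          # stream notifications for 10 seconds
        await client.stop_notify(EEG_CHAR_UUID)

asyncio.run(main())
```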
Study in the journal Nature Machine Intelligence: “Fully portable and wireless universal brain–machine interfaces enabled by flexible scalp electronics and deep learning algorithm”