Brain-computer interfaces have the potential to rehabilitate paralyzed people, help bring Parkinson’s disease under control, and mitigate symptoms of other neurological conditions. Interfacing with the brain is one challenge, but transmitting the large amounts of data the brain produces has so far meant wires protruding through the scalp: doing it wirelessly would require a lot of power, and therefore a large battery or frequent recharging. Researchers at Nanyang Technological University in Singapore have found a way around this limitation, creating a chip that pre-processes and compresses brain signal data until it is small enough for wireless transmission.
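To get a feel for why on-chip decoding matters, here is a rough back-of-the-envelope comparison. The sampling rate, ADC resolution, and decoded-label size below are assumed typical values rather than figures from the study; only the 50 Hz classification rate comes from the abstract quoted further down.

```python
# Rough comparison of raw neural data rate vs. decoded output rate.
# Sampling rate, ADC resolution, and label size are assumptions, not study figures.
channels = 128
sample_rate_hz = 30_000      # assumed per-channel sampling rate
bits_per_sample = 10         # assumed ADC resolution

raw_bps = channels * sample_rate_hz * bits_per_sample      # ~38.4 Mbit/s of raw data

classification_rate_hz = 50  # from the study abstract
bits_per_decision = 8        # assumed size of one decoded motor-intention label

decoded_bps = classification_rate_hz * bits_per_decision   # 400 bit/s after decoding

print(f"raw: {raw_bps / 1e6:.1f} Mbit/s -> decoded: {decoded_bps} bit/s")
```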
The 128-channel so-called “neural decoder” uses machine learning algorithms to recognize patterns in the data and thereby compress the signal. The approach was verified on a monkey performing a finger-movement task, with the chip decoding the signal at 99.3% accuracy.
Some more details from the study abstract in Biomedical Circuits & Systems:
A machine learning coprocessor in 0.35-μm CMOS for the motor intention decoding in the brain-machine interfaces is presented in this paper. Using Extreme Learning Machine algorithm and low-power analog processing, it achieves an energy efficiency of 3.45 pJ/MAC at a classification rate of 50 Hz. The learning in second stage and corresponding digitally stored coefficients are used to increase robustness of the core analog processor.
The same coprocessor is also used to decode time of movement from asynchronous neural spikes. With time-delayed feature dimension enhancement, the classification accuracy can be increased by 5% with limited number of input channels. Further, a sparsity promoting training scheme enables reduction of number of programmable weights by …
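For readers unfamiliar with Extreme Learning Machines: the idea is a single-hidden-layer network whose first-layer weights are fixed and random (on this chip, realized by the low-power analog core), while only the second-stage output weights are learned and stored digitally. The sketch below is a generic software illustration of that two-stage scheme under those assumptions, not the chip’s implementation; the feature extraction, hidden-layer size, and toy labels are placeholders.

```python
import numpy as np

# Minimal Extreme Learning Machine classifier, as a software illustration of the
# two-stage scheme described in the abstract (not the chip's actual design).
# X: one row of spike-derived features per time window; y: integer class labels.

def train_elm(X, y, n_hidden=64, seed=0):
    rng = np.random.default_rng(seed)
    # Stage 1: fixed random input weights -- on the chip this projection is
    # performed by the analog core and is never trained.
    W_in = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)                  # hidden-layer activations
    # Stage 2: only the output weights are learned (closed-form least squares);
    # these play the role of the digitally stored coefficients in the abstract.
    T = np.eye(y.max() + 1)[y]                 # one-hot targets
    W_out = np.linalg.pinv(H) @ T
    return W_in, b, W_out

def predict_elm(X, W_in, b, W_out):
    H = np.tanh(X @ W_in + b)
    return np.argmax(H @ W_out, axis=1)

# Toy usage with random "features" standing in for binned spike counts.
X = np.random.default_rng(1).standard_normal((200, 128))    # 128 channels
y = (X[:, 0] > 0).astype(int)                               # placeholder labels
W_in, b, W_out = train_elm(X, y)
print("training accuracy:", (predict_elm(X, W_in, b, W_out) == y).mean())
```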
Study in Biomedical Circuits & Systems: A 128-Channel Extreme Learning Machine-Based Neural Decoder for Brain Machine Interfaces…