A team of students at the University of Toronto has developed a system that harnesses a commercial EEG headset’s ability to “read” the state of a person’s brain in order to control a robotic arm. They used the Emotiv EPOC headset, whose software can identify common actions like the wink of an eye, to correlate different patterns of brain activity with specific movements of the robotic arm. Here’s a quick video showing off the system in action:
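The core idea — translating classified EEG events into arm commands — can be sketched as a simple lookup. Note this is a minimal illustrative sketch: the event names and arm commands below are assumptions, not the students’ actual implementation or the Emotiv SDK’s API.

```python
# Hypothetical mapping from detected EEG events to robotic-arm commands.
# Both the event labels and the command names are illustrative assumptions.
EVENT_TO_COMMAND = {
    "wink_left": "rotate_left",
    "wink_right": "rotate_right",
    "clench": "close_gripper",
    "raise_brow": "open_gripper",
}

def translate_events(events):
    """Map a stream of detected EEG events to arm commands,
    silently dropping events that have no assigned command."""
    return [EVENT_TO_COMMAND[e] for e in events if e in EVENT_TO_COMMAND]

# Example: a left wink and a jaw clench produce two commands; an
# unmapped "blink" event is ignored.
print(translate_events(["wink_left", "blink", "clench"]))
# → ['rotate_left', 'close_gripper']
```

In a real system each command would be sent to the arm’s controller rather than returned as a string, but the mapping layer between the headset’s classifier output and the actuator is the essential piece.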