A new system enables users to control the movements of robots using brain signals and simple hand gestures.

The new system, created by researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL), works by harnessing the power of electroencephalography (EEG) and electromyography (EMG). A human user is outfitted with a series of electrodes on their scalp and forearm, which are then linked to the robot.

Traditionally, EEG signals are not always reliably detectable, and EMG signals are difficult to map to motions more specific than a simple "move left" or "move right." By merging the two signals, however, the researchers created a system with more robust biosensing.
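To see why fusing the two channels helps, consider a minimal sketch of that idea in Python. This is not the CSAIL team's actual pipeline; the scoring functions, weights, and threshold below are invented stand-ins for trained decoders, meant only to show how averaging two noisy evidence sources tolerates a weak reading on either one.

```python
import numpy as np

def eeg_error_confidence(eeg_window: np.ndarray) -> float:
    """Crude stand-in: treat a large mean amplitude as evidence of an
    error-related brain response. A real decoder would be a trained classifier."""
    return float(np.clip(np.abs(eeg_window.mean()) / 10.0, 0.0, 1.0))

def emg_gesture_confidence(emg_window: np.ndarray) -> float:
    """Crude stand-in: treat high RMS muscle activity as evidence of a
    deliberate hand gesture."""
    rms = np.sqrt(np.mean(emg_window ** 2))
    return float(np.clip(rms / 50.0, 0.0, 1.0))

def fused_decision(eeg_window, emg_window, threshold=0.45) -> bool:
    """Flag an intervention only when the combined evidence is strong enough.
    Averaging the two scores keeps the system usable when one channel is noisy."""
    score = 0.5 * eeg_error_confidence(eeg_window) + \
            0.5 * emg_gesture_confidence(emg_window)
    return score > threshold

# Example: an ambiguous EEG window plus a clear EMG burst still triggers a decision.
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 2.0, size=256)   # weak, ambiguous brain signal
emg = rng.normal(0.0, 60.0, size=256)  # strong muscle activity
print(fused_decision(eeg, emg))        # -> True
```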

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” PhD candidate Joseph DelPreto, lead author on the study, said in a statement.

By monitoring brain activity, the new system can detect in real time whether a user notices an error as the robot performs a task. A second interface, which measures muscle activity, lets the user make hand gestures to scroll through and select the correct option for the robot to execute.
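A hedged sketch of that interaction loop, again in Python: the function names, the gesture labels, and the `supervise_robot` structure are assumptions made for illustration, not the published system's API. The EEG-based error check gates the loop, and EMG gestures then drive the scroll-and-select step.

```python
import itertools

def supervise_robot(options, errp_detected, read_gesture):
    """Halt when the EEG stream flags a perceived error, then let EMG hand
    gestures scroll through candidate corrections and pick one.
    `errp_detected` and `read_gesture` are hypothetical callables standing in
    for live EEG/EMG decoders."""
    if not errp_detected():
        return None  # the user saw no problem; let the robot continue
    for option in itertools.cycle(options):
        if read_gesture() == "select":
            return option  # command the robot to execute this choice
        # any other gesture ("scroll") advances to the next candidate

# Toy run: the simulated user flags an error, scrolls once, then selects.
gestures = iter(["scroll", "select"])
print(supervise_robot(
    options=["option A", "option B", "option C"],
    errp_detected=lambda: True,
    read_gesture=lambda: next(gestures),
))  # -> option B
```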
