Neural Prosthetic Asynchronous Control using EEG

M. Farid, N. Rizkalla, M. Fahmy, M. Daoud, and K. El-Ayat (Egypt)


Biocomputing, Brain Computer Interface, Mu Rhythms, Prosthetic Control, Electroencephalography (EEG), Classification.


A Brain Computer Interface (BCI) provides a direct neural interface and communication pathway between the human brain and an external device such as a computer or artificial limb. With such an interface, a severely handicapped person, such as an Amyotrophic Lateral Sclerosis (ALS) patient with a severe muscle disorder, may still communicate with and even control their environment using brain activity alone. Even patients suffering from locked-in syndrome, who cannot move or communicate due to complete paralysis, can communicate and exercise control through BCI technology. The Neural Prosthetic Asynchronous Control (NPAC) system described here asynchronously controls neural prosthetic devices for paralyzed patients. It captures EEG signals emitted from the brain, performs signal processing and classification on the captured raw signals, and produces the appropriate control actions. NPAC analyzes Mu signals, whose activity is associated with the motor cortex and is altered by motion or intended (imagined) motion. In this paper, we describe our method and show empirically that NPAC achieves good accuracy for trained subjects in a real-time environment. NPAC classifies right hand “imagined” movement, the brain's rest state, eye blinks, and downward eye movement. It was tested on a trained, non-paralyzed subject who managed to perform several control activities, such as controlling a wheelchair, moving a prosthetic device (both emulated on a robot), playing simple games, and navigating through virtual reality.
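The processing chain described above, extracting Mu-band (roughly 8–12 Hz) activity from an EEG window and classifying it, can be sketched in minimal form as follows. This is an illustrative sketch only: the sampling rate, band edges, power threshold, and synthetic test signals are assumptions for demonstration, not parameters from the paper, and the simple threshold rule stands in for the paper's classifier. It exploits event-related desynchronization: Mu power drops when movement is imagined.

```python
import math

def goertzel_power(samples, fs, freq):
    """Spectral power at a single frequency bin via the Goertzel algorithm."""
    n = len(samples)
    k = int(0.5 + n * freq / fs)          # nearest DFT bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def mu_band_power(samples, fs):
    """Mean power over the Mu band (assumed here to be 8-12 Hz)."""
    return sum(goertzel_power(samples, fs, f) for f in range(8, 13)) / 5.0

def classify(samples, fs, threshold):
    """Strong Mu power -> resting state; suppressed Mu (ERD) -> imagined movement.

    The fixed threshold is a stand-in for a trained, per-subject classifier.
    """
    return "rest" if mu_band_power(samples, fs) > threshold else "imagined_move"

# Synthetic 1-second windows at an assumed fs = 128 Hz.
fs, n = 128, 128
rest = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]        # strong 10 Hz Mu
move = [0.1 * math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]  # suppressed Mu

print(classify(rest, fs, threshold=100.0))  # rest
print(classify(move, fs, threshold=100.0))  # imagined_move
```

In a real asynchronous system this classification would run continuously on sliding EEG windows, emitting a control command (e.g. to a wheelchair or prosthetic emulator) only when a non-rest class is detected.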
