
Decoding Movement and Speech from the Brain of a Tetraplegic Person


Every year, hundreds of thousands of people lose the ability to move or speak as a result of spinal cord injury, stroke, or neurological disease. At Caltech, neuroscientists in the laboratory of Richard Andersen, James G. Boswell Professor of Neuroscience and Leadership Chair and Director of the Tianqiao & Chrissy Chen Brain-Machine Interface Center, are studying how the brain encodes movements and speech, with the goal of restoring these functions to individuals who have lost them.


In 2015, the Andersen team worked with a tetraplegic participant, implanting recording electrodes into a part of the brain that forms intentions to move. This brain-machine interface (BMI) enabled the participant to direct a robotic limb to reach out and grasp a cup just by thinking about those actions.


Now, new research from the Andersen lab has identified a region of the brain, called the supramarginal gyrus (SMG), that codes for both grasping movements and speech, making it a promising candidate site for implanting more efficient BMIs that can control multiple types of prosthetics in both the grasp and speech domains.


The research is described in a paper that appears in the journal Neuron on March 31. Sarah Wandelt, a graduate student in the computation and neural systems program at Caltech, is the study's first author.


Read more on the TCCI® for Neuroscience website