When you listen to someone speak, your brain interprets the sounds so they make sense to you. At the same time, your brain is gathering visual information from the speaker’s face, body language and hand gestures.
The sound of the words also carries with it information about the speaker’s tone and inflection. And all this happens at lightning speed.
Because so many different brain functions are involved when you hear and understand speech, scientists don’t have a complete understanding of how the whole process is coordinated. A group of Columbia researchers, including neurosurgeon Dr. Sameer Sheth, has just been awarded a National Institutes of Health grant of $1 million a year for the next three years to study exactly that.
Neuroscientist Charles E. Schroeder, Ph.D., is the Principal Investigator for this study. He and Dr. Sheth are Co-Directors of the Cognitive Science & Neuromodulation Program at Columbia University. The program’s goal is to integrate surgical and noninvasive methods to gain a better understanding of how the brain operates when performing seemingly basic processes, such as hearing and understanding speech.
Usually when scientists want to see what happens in the brain during an activity like communication, they use non-invasive measures such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG) or magnetoencephalography (MEG).
These modalities can show researchers which parts of the brain are active when you listen to someone talk. But because understanding speech is such a complex process, even those tools can't pinpoint which brain functions handle which specific parts of the process.
Drs. Schroeder and Sheth, along with colleagues at Northwell Health in Long Island and Baylor College of Medicine in Houston, Texas, are using a unique neurosurgical opportunity to study how speech perception works in our brains, one they say sidesteps some of the issues that have made speech perception difficult to study. In addition to using non-invasive imaging measures, they will directly record the activity of neurons from inside the brain.
Neurons make up most of the “gray matter” of the brain. Neurons gather and transmit the electrochemical signals that control everything we think and do.
Dr. Sheth and Dr. Schroeder will directly measure what neurons do during speech perception by using electrodes placed directly on the brain, in a process called electrocorticography (ECoG). This will give them a better idea of which specific brain mechanisms are involved in speech perception.
This research has far-ranging applications. A better model of how our brains handle speech perception could help people who are experiencing hearing loss due to aging, or patients with impaired language abilities resulting from autism or stroke.