How the brain processes speech
A review of human and non-human primate studies suggests that scientists are close to a conclusive theory of how the brain processes speech and language.
Dr Josef Rauschecker of Georgetown University and his co-author Sophie Scott, a neuroscientist at University College London, say that both human and animal studies confirm that speech is processed in the brain along two parallel pathways, each of which runs from lower- to higher-functioning neural regions.
The authors describe these pathways as the "what" and "where" streams, analogous to the streams the brain uses to process sight, but located in different regions.
Both pathways begin with the processing of signals in the auditory cortex, located inside a deep fissure on the side of the brain underneath the temples - the so-called "temporal lobe".
Information processed by the "what" pathway then flows forward along the outside of the temporal lobe; the job of this pathway is to recognise complex auditory signals, including communication sounds and their meaning (semantics).
The "where" pathway lies mostly in the parietal lobe, above the temporal lobe. It processes the spatial aspects of a sound - its location and its motion in space - and also provides feedback during the act of speaking.
Rauschecker says that auditory perception - the processing and interpretation of sound information - is tied to anatomical structures. "Sound as a whole enters the ear canal and is first broken down into single tone frequencies, then higher-up neurons respond only to more complex sounds, including those used in the recognition of speech, as the neural representation of the sound moves through the various brain regions," he says.
"In both species, we are using species-specific communication sounds for stimulation, such as speech in humans and rhesus-specific calls in rhesus monkeys. We find that the structure of these communication sounds is similar across species," he adds.
Rauschecker believes that the findings of this research may ultimately yield some valuable insights into disorders that involve problems in comprehending auditory signals, such as autism and schizophrenia.
"Understanding speech is one of the major problems seen in autism, and a person with schizophrenia hears sounds that are just hallucinations. Eventually, this area of research will lead us to better treatment for these issues," Rauschecker says.
The study is published in the June issue of Nature Neuroscience.