Pier and Beer Alike to the Ear
By Caroline Sheedy
When visiting a different city or country, travelers may find themselves slipping into the local accent. Timothy Murphy wants to find out why.
Researchers have been studying this phenomenon — phonetic convergence — for years. But Murphy, a graduate student studying psychology in Carnegie Mellon University's Holt Lab, said there is currently no explanation for why an American may sound a little British after a vacation across the pond.
"It's a basic science question with a lot of different applications," Murphy said. "Most interesting to me is examining how adults are able to pronounce and understand sounds when they learn new languages."
Murphy was recently awarded the Raymond H. Stetson Scholarship in Phonetics and Speech Science from the Acoustical Society of America. The scholarship will support his quest to understand the link between speech perception (the way words are heard) and speech production (the way words are said).
"There are a lot of models that explain how people can understand other speakers when there's lots of noise in the environment or the person has an accent, but none of those models can explain how people are able to learn a new language or be able to speak a new language," Murphy explained. "People think speaking is like writing, but the process in the brain is completely different."
To investigate this, Murphy will be conducting experiments built on speech perception work by his adviser, Lori Holt, a professor in CMU's Neuroscience Institute and Department of Psychology. In that work, Holt examined how people can use subtle differences in acoustic cues to understand speakers with different accents, like a nonnative speaker of a language.
"There are often multiple acoustic cues that allow listeners to make distinctions between speech sounds," Murphy said. "If a particular cue is not available, our brains can rapidly adjust to the accented speech by ignoring the unreliable cue and focusing on another. Our lab is interested in how the brain knows when to make these adjustments."
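The cue-weighting idea Murphy describes can be sketched in a few lines of code. The sketch below is purely illustrative, not the lab's actual model: it imagines a listener deciding between "beer" and "pier" from two acoustic cues that are known to distinguish /b/ from /p/ in English, voice onset time (VOT) and onset pitch (f0). All of the numbers, thresholds and weights are hypothetical.

```python
# Illustrative sketch (not the Holt Lab's actual model): a listener
# categorizing "beer" vs. "pier" from two acoustic cues. Higher VOT and
# higher onset f0 both favor the voiceless /p/ of "pier". All values
# below are hypothetical.

def categorize(vot_ms, f0_hz, w_vot=0.8, w_f0=0.2):
    """Weighted-cue decision between 'beer' (/b/) and 'pier' (/p/)."""
    # Normalize each cue to roughly [0, 1] around a made-up boundary:
    # ~10 ms VOT = clear /b/, ~50 ms = clear /p/;
    # ~200 Hz onset f0 after /b/, ~240 Hz after /p/.
    vot_score = (vot_ms - 10) / 40
    f0_score = (f0_hz - 200) / 40
    evidence = w_vot * vot_score + w_f0 * f0_score
    return "pier" if evidence > 0.5 else "beer"

# A token with a long VOT and high onset f0 is heard as "pier".
print(categorize(vot_ms=50, f0_hz=240))               # -> pier

# An ambiguous VOT with default weights falls on the "beer" side...
print(categorize(vot_ms=15, f0_hz=240))               # -> beer

# ...but if exposure has shown VOT to be unreliable (say, an accented
# talker), down-weighting it and leaning on f0 flips the percept.
print(categorize(vot_ms=15, f0_hz=240, w_vot=0.1, w_f0=0.9))  # -> pier
```

The last two calls capture the adjustment Murphy describes: the acoustic input is identical, but re-weighting an unreliable cue in favor of a reliable one changes which word the listener reports hearing.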
In one experiment, Holt and her team played participants a sequence of recorded words, followed by a target word they were asked to categorize as either "beer" or "pier," two English words that sound very similar. By subtly changing the acoustic cues of the words in the sequence just before the target, the researchers found they could drastically shift participants' perception of the target word.
Taking Holt's work a step further, Murphy will ask participants to repeat the words "beer" and "pier" into their microphones, bringing speech production into the experiment.
"We have people repeat the word they hear out loud, and what we're seeing is that people will actually stop making the distinction in their speech production when these subtle manipulations are made, so it's not only that their perceptions change, but the speech production that they're making is also changing as well," Murphy explained.
Murphy asked participants in a trial to identify the words "beer" and "pier" from a recording. He and his colleagues found that they could drastically change participants' perception of these target words. Video courtesy of Timothy Murphy.
Holt said that Murphy's work has been a valuable addition to CMU, even sparking collaboration with Nazbanou (Bonnie) Nozari, an associate professor in the Department of Psychology with expertise in speech production.
"Tim Murphy's research is a terrific example of how collaboration generates new ideas. It was a natural match for Tim, Bonnie and me to begin working closely together to better understand how the brain manages the challenges of listening and talking," Holt said.
Murphy's experiments have shown that speech perception and speech production are linked. In future work, he hopes to explore how they arise from interacting processes in the brain.