CMU News - Carnegie Mellon University

Monday, January 5, 2015

International Research Effort Gives Neuroscientists Better Feeling About Sense of Touch

Discovery Will Help Scientists To Understand the Neuroscience Behind Somatosensation

By Jocelyn Duffy / 412-268-2900                   

PITTSBURGH—Our sense of touch is one we often take for granted, until our leg falls asleep and we aren't able to stand, or when we experience acute pain. The sense of touch also has been taken for granted in neuroscience, where it's the sense scientists know the least about.

An international group of researchers, including Carnegie Mellon University's Alison Barth, is changing that. For the first time, researchers have linked a group of neurons to a specific type of somatosensation, a finding that opens the door to a deeper understanding of our sense of touch. The research is published in the Dec. 3 issue of Neuron.

"Somatosensation is critical. You can somewhat overcome losing your sense of smell, sight, taste, or hearing. But if you lose your sense of touch, you wouldn't be able to sit up or walk. You wouldn't be able to feel pain," said Barth, a professor of biological sciences and a member of Carnegie Mellon's BrainHub℠ research initiative. "We know less about the features that make up our rich tactile experience than we do about any other sense, yet it's such a critical sense."

Somatosensation, which is another word for our sense of touch, occurs in a number of forms, like feeling texture, temperature, pressure, pain or vibration. It's responsible for proprioception, which helps us know where we are within our environment. It tells us if our feet are firmly planted on the floor, or if we're holding a paper cup tightly enough that it won't slip out of our hand, but loosely enough that we don't crush the cup. Scientists know a good deal about the molecular receptors that mediate the different types of somatosensation, but they know little about how touch is represented in the brain.

"When someone gets pricked by a pin, we know how information about that sensation travels from the skin to the spinal cord. But what happens in the brain has been much less clear — it seems like all different sorts of touch information get jumbled together," said Barth, who also is a member of the joint Carnegie Mellon/University of Pittsburgh Center for the Neural Basis of Cognition (CNBC).

It was a jumble — until now.

In previous studies, Barth had discovered that certain groups of neurons in the brain's neocortex were reliably more active than others. Using the fos-GFP mouse, a transgenic mouse model Barth created to study activity in live neurons, she and her colleagues set out to see if these neurons were generally more excitable, or if they responded specifically to one tactile stimulus. They found that these neurons reacted much more quickly and strongly when a puff of air was directed at the mouse's whiskers, while other neurons had little or no response.

"This is the first time we've been able to visualize neurons in the somatosensory cortex that 'like' a specific tactile stimulus," Barth said. "It shows that neurons are individuals. They have different jobs to do in the cortex. In this case these neurons had a special feature: they responded when all of the mouse's whiskers moved at once."

They also found that the neurons in question received direct synaptic input from the posteromedial nucleus of the brain's thalamus. This shows that the neurons that react to the puff-of-air stimulus have a dedicated, unique sub-network of connections that enable them to communicate with one another and amplify the information they are receiving from the stimulus.

"Now that we have isolated the neural underpinnings of a certain feature, we can try to manipulate and change the interactions between cells. Can we train the mouse and strengthen the connections between neurons? What happens to perception if we remove the connections? It's really the frontier of truly understanding somatosensory function," Barth said.

This research also could lead to work that will identify how somatosensory information is coded, which could be used to incorporate sensory information into brain-machine interfaces. This could allow robotic limbs and prosthetics to actively sense and receive tactile input.

Co-authors of the study include: Jean-Sébastien Jouhanneau, Leiron Ferrarese, Luc Estebanez and James F.A. Poulet from the Department of Neuroscience at the Max Delbrück Center for Molecular Medicine in Berlin and the Neuroscience Research Center at the Charité-Universitätsmedizin Berlin; Nick J. Audette, a doctoral student in the Department of Biological Sciences at Carnegie Mellon and the CNBC; and Michael Brecht from the Bernstein Center for Computational Neuroscience at Humboldt University in Berlin and the Neuroscience Research Center at the Charité-Universitätsmedizin Berlin.

The research was funded by the Deutsche Forschungsgemeinschaft, the European Research Council, the Alexander von Humboldt Foundation, the European Union and the National Institutes of Health (DA 0171-88).

As the birthplace of artificial intelligence and cognitive psychology, Carnegie Mellon has been a leader in the study of brain and behavior for more than 50 years. The university has created some of the first cognitive tutors, helped to develop the Jeopardy-winning Watson, founded a groundbreaking doctoral program in neural computation, and completed cutting-edge work in understanding the genetics of autism. Building on its strengths in biology, computer science, psychology, statistics and engineering, CMU recently launched BrainHub℠, a global initiative that focuses on how the structure and activity of the brain give rise to complex behaviors.

###

For the first time, researchers, including Carnegie Mellon's Alison Barth (pictured above), have linked a group of neurons to a specific type of somatosensation, a finding that opens the door to a deeper understanding of our sense of touch.