01/07/2025 | Press release
Chewing a bagel while reading the morning news, speaking while driving, dislodging a piece of food stuck between two teeth: In these and other tasks, the tongue and brain coordinate intricate movements without conscious attention, yet the neural pathway involved has remained largely unexplored.
Now, Cornell scientists have identified the neural pathway mice use to direct the tongue to tactile targets: the superior colliculus, the same brain region that primates - including humans - use to direct their gaze to visual targets. It's likely that humans use the same neural pathways for touch-guided tongue control.
"The essence of motor control is the monitoring of behavior in real time to make what is called online adjustments. These adjustments are very automatic, but they go awry in all sorts of diseases," said Jesse Goldberg, professor and Robert R. Capranica Fellow in the Department of Neurobiology and Behavior in the College of Arts and Sciences (A&S).
"If you remove tactile feedback from the human tongue, speech is slurred, swallowing is messed up," Goldberg said. "In fact, the number one cause of death in neurological disease like Parkinson's, dystonia or ALS, is actually aspiration pneumonia. Because with poor tongue control, you don't handle food and water well and accidentally inhale them."
Identifying this neural pathway represents progress toward treating neurological disorders, said Goldberg, corresponding author of "A Collicular Map for Touch-Guided Tongue Control," published Jan. 1 in Nature. Co-first authors are Brendan Ito, NSF Graduate Research Fellow in neurobiology and behavior (also a corresponding author), and Yongjie (Jason) Gao, doctoral student in neurobiology and behavior.
Much of the brain is devoted to online motor control tasks like chewing and reaching for a cup without looking, Ito said. But tactile feedback, including from the tongue, has largely been unexplored because it's difficult to observe. New technologies including high-speed cameras and machine learning are allowing researchers to make new connections.
A few years ago, Cornell researchers including Ito and Goldberg developed a technique for studying on-the-spot motor control in mice by focusing on a mouse's tongue as it licks a water spout. During this 2021 study, Ito noticed that mice respond to remarkably subtle tactile events - a slight brush to the side of the tongue - and that, observed on millisecond time scales, the tongue seemed to move intelligently.
"Mice lick in bouts - six to eight licks per bout," Ito said. "In these newer experiments, we moved the spout to the left or the right of the animal. The mouse can't see the spout, so it has to rely on a guess, and the tactile feedback at the end of that guess, as to where that spout is in space."
Using tiny bits of tongue tactile feedback, a mouse re-aims its next lick to find the spout, much as a person grasping a cup uses a pinky finger to zero in on its location. Ito and the team determined this by using a deep learning algorithm to analyze hours of video footage captured at two images per millisecond - 2,000 frames per second.
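The analysis described here - tracking the tongue in high-speed video, then measuring how each lick is re-aimed relative to the last - can be illustrated with a short sketch. The Python code below is hypothetical, not the authors' actual pipeline: it assumes a pose-tracking network (not shown) has already extracted tongue-tip coordinates from each frame, and all function names and toy values are illustrative.

```python
import numpy as np

FRAME_RATE_HZ = 2000  # two images per millisecond, as described above

def lick_angle(tip_xy, mouth_xy):
    """Angle (degrees) of one lick's endpoint relative to the mouth.
    tip_xy is the tongue tip at maximum protrusion; mouth_xy is the
    resting mouth position. 0 = straight ahead, positive = rightward."""
    dx = tip_xy[0] - mouth_xy[0]
    dy = tip_xy[1] - mouth_xy[1]
    return np.degrees(np.arctan2(dx, dy))

def reaim_per_lick(protrusion_tips, mouth_xy):
    """Change in lick angle between consecutive licks in a bout.
    protrusion_tips: (n_licks, 2) array of tongue-tip positions, one per
    lick, as a hypothetical pose-tracking step might produce."""
    angles = np.array([lick_angle(tip, mouth_xy) for tip in protrusion_tips])
    return np.diff(angles)  # degrees of re-aiming, lick to lick

# Toy bout: the first lick misses to the left of an unseen spout, and
# subsequent licks steer rightward toward it.
tips = np.array([[-1.2, 4.0], [-0.4, 4.1], [0.5, 4.0], [0.9, 4.1]])
mouth = np.array([0.0, 0.0])
print(reaim_per_lick(tips, mouth))  # positive values = rightward corrections
```

In this toy bout, a run of positive values would indicate the mouse progressively re-aiming toward the spout after a leftward miss, the kind of touch-driven correction the study quantified.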
The next question: How does the mouse brain solve this problem?
Using optogenetics (inactivating specific brain regions using light), the researchers were surprised to find that frontal cortical regions of the brain - known for tongue control in mice - were not required at all for re-aiming the tongue in response to touch, Ito said. Instead, "the hot spot we found, the region really required for these tactile feedback corrections, is the superior colliculus, which is known for being involved in visually guided eye movements," such as turning the head to watch a bird.
The superior colliculus contains a map of the visual field on its surface, Ito said. The new discovery indicates that it contains a map of the tongue's surface, too.
"It's not intuitive," Goldberg said. "When you move your tongue around, you don't think you're repurposing circuits for visually guided eye movement. But it turns out that in mice, this is a very conserved architecture of the brain. And usually things discovered at this level in mice extend to humans."
This discovery also opens new channels of research into speech, another intricate use of touch-guided tongue control, and into robotics, where progress is hindered by the vast computational complexity of even a simple human or animal gesture.
A research group at Johns Hopkins University, inspired by the Cornell team's work, is currently testing the connection in marmosets to see whether it carries over to primates.
Brian Kardon, research support specialist in the Goldberg lab, also contributed to this study, which was funded by the National Institutes of Health and a gift from Jean Sheng '77 and Kent G. Sheng '78.
Kate Blackwood is a writer for the College of Arts and Sciences.