We're one step closer to deciphering rodent languages

UW researchers developed a software tool called DeepSqueak, derived from self-driving car technology, to demystify mouse and rat communication and monitor how our furry analogs fare in the lab.

A file photograph of a white laboratory rat during a 2003 study. (Photo by Winfried Rothermel/AP)

Mice and rats have been staples of laboratory research for nearly a century, and for good reason: As human proxies for study, they share more than 97 percent of their DNA with our species. They also live shorter lives, have more babies, are cheaper to purchase and maintain, and don’t need to sign waivers to give their lives to science. No wonder nearly 85 percent of the 25-million-plus lab animals used today are either rats or mice.

But despite their omnipresence in lab settings, rodent culture itself is still relatively understudied, especially the combination of chirping, bruxing and other behaviors that constitutes rodent language. Now a new software tool called DeepSqueak, outlined in Neuropsychopharmacology and developed by researchers at the University of Washington, could help demystify rodent language to better monitor how our furry analogs fare during experiments.

There’s more at stake for humans than just learning how to say "Hi" to Stuart Little in his native tongue. By cracking their communication, we stand to keep tabs on rodents less invasively and support research that depends on tracking their emotional states.

Rodents: Songbirds with fur?

Nearly 40 years ago, researchers realized that rats and mice use language in the form of ultrasonic vocalizations (USVs) that we would need specialized equipment to hear.

“What they do is they whistle, and when you slow it down 10 or 20 times, it sounds just like a bird call,” says Kevin Coffey, a postdoctoral researcher in the Psychiatry and Behavioral Science department at the University of Washington School of Medicine. Coffey has researched rodents (and owned them as pets) for more than a decade, giving him ample time to observe their habits. Researchers like him have discovered that rodents make some 20 types of whistles, which scientists call syllables, depending on the scenario and what they want to accomplish, and string them together in many different combinations, much like language.

Rats, they noticed, make two distinct categories of calls: “happy” calls, or calls that happen during positive events (like being given sugar); and “sad” calls, which happen during unpleasant events. But researchers still aren’t sure what all the different kinds of happy calls mean — individually or in different combinations.

“There is so much complexity going on,” says Coffey. “We just don't know yet what all of it means. You have to relate it back to behavior to figure it out.”

A better understanding of these combinations of USVs could give researchers more concrete insight into how rodents feel during testing. This is especially important in Coffey’s lab, which studies treatments for psychiatric illnesses like depression, anxiety and drug abuse.

“We often infer how an animal is feeling from their behavior, but we want to just listen and let them tell us precisely how they're feeling,” he says. “But first we've gotta figure out what it all means.”

Kevin Coffey, left, and Russell Marx at a Society for Neuroscience Conference where they first presented DeepSqueak. (Courtesy of Kevin Coffey)

The dirt on DeepSqueak

The first step in creating an eventual Rat Rosetta Stone — before you can even compare the vocalizations against those behaviors — is collecting and categorizing all of the USVs. This is easier said than done.

To study USVs, researchers visualize them by turning audio from rodent habitats into sonograms. Existing software captures all of the sound in a rodent habitat, though, which makes it tough to distinguish between ambient noise, rodent behaviors (flicking levers, drinking) and the actual vocalizations researchers want to study.
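For readers curious what a sonogram actually is, here is a minimal sketch of the conversion from audio to a frequency-versus-time image. It is illustrative only, not DeepSqueak’s own code; the file name, sample rate and frequency band are placeholder assumptions.

```python
# Illustrative only: turn a (hypothetical) cage recording into a sonogram.
# File name, sample rate and frequency band are assumptions, not DeepSqueak values.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("cage_recording.wav")   # ultrasonic mic, e.g. 250 kHz
if audio.ndim > 1:                                  # keep a single channel
    audio = audio[:, 0]

# Short-time Fourier transform: rows are frequencies, columns are time frames
freqs, times, power = spectrogram(audio, fs=rate, nperseg=512, noverlap=256)

# Rodent USVs sit roughly in the 20-120 kHz band, far above human hearing
band = (freqs >= 20_000) & (freqs <= 120_000)

plt.pcolormesh(times, freqs[band] / 1000,
               10 * np.log10(power[band] + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (kHz)")
plt.title("Sonogram of an ultrasonic recording")
plt.show()
```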

“Some people used [the software] but it wasn't very trustworthy,” Coffey says. “Because [rodent USVs] are so high pitched, you have to slow down the audio [and check it] by hand, so if you have 10 hours of recording, it'd be 200 hours of trying to find calls in it. It wasn't feasible.”

Dr. Nathaniel Rieger, a researcher who also studies USVs at Boston College, agrees.

“In many cases, setting up an experiment that can capture both complex behaviors and isolate the production of USVs is difficult, and the analysis of these USVs is time intensive,” says Rieger, who is not involved with DeepSqueak. “Therefore, USVs are often not collected and their potential importance to behavior remains unknown.”

That’s why, however valuable researchers thought USVs might be, the field stalled out for several decades after their initial discovery: The cost and effort weren’t worth it, especially for scientists whose work didn’t even focus on USVs.

But a better understanding of USVs could have a direct impact on the addiction and depression issues Coffey’s lab seeks to address. With the encouragement of his boss, pharmacology professor Dr. John Neumaier, Coffey teamed up with lab technician Russell Marx to develop software that makes USV identification and categorization faster, simpler and less error-prone.

The duo turned to Faster-RCNN, an object-detection network used in self-driving car technology. To drive safely, self-driving cars have to quickly and accurately identify road features like pedestrians, trees and stop signs. Coffey and Marx theorized that by turning rodent USVs into sonogram images, they might be able to teach a neural network to separate USVs from background noise. The project quickly ballooned: DeepSqueak goes a step beyond recognizing noises as “USV/not USV” by recognizing specific calls, identifying them in syllable sequences and categorizing those sequences, all of which creates a framework for understanding rodent language.

Identifying whether a single noise is a mouse call is not a huge task. “The harder part is to take this big 10 seconds of audio pictures and say, ‘Where the heck are the mouse calls?’” Coffey says. “And that's what Faster-RCNN is really, really good at doing. We built our whole package around that.”
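To make the idea tangible (and only the idea; this is not DeepSqueak’s own implementation), a hypothetical sketch using the stock Faster-RCNN detector in torchvision shows how a sonogram can be treated as an image and scanned for call-shaped boxes. The tensor sizes, class count and score threshold are assumptions for demonstration, and the network here is untrained.

```python
# Hypothetical sketch: treat a sonogram as an image and ask a Faster-RCNN
# object detector "where are the calls?" Not DeepSqueak's code; this uses
# torchvision's stock detector, untrained, purely to show the shape of the
# problem. Sizes, classes and the threshold are assumptions.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights=None, num_classes=2)  # background + "USV"
model.eval()

sonogram = torch.rand(1, 400, 1200)   # (frequency bins, time frames), placeholder data
image = sonogram.repeat(3, 1, 1)      # tile to 3 channels to match the detector's input

with torch.no_grad():
    detections = model([image])[0]    # dict with "boxes", "labels", "scores"

# Each box is (x1, y1, x2, y2) in pixels: x spans time, y spans frequency,
# so a box says when a call starts and ends and which band it occupies.
for box, score in zip(detections["boxes"], detections["scores"]):
    if score > 0.5:                   # keep only confident detections
        print(box.tolist(), float(score))
```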

Others have tried to build software that automates the disparate parts of what DeepSqueak does, but never all of it together and without bias. DeepSqueak’s efficiency and low margin of error are also creating standards for recognizing individual USVs: Until now, people in different labs may have disagreed on what even counted as a syllable.

DeepSqueak in the field

To test DeepSqueak’s efficiency, Coffey and Marx sought to replicate USV identification work that had previously been done by hand. They selected a study in which the authors manually parsed USVs in audio of male mice talking to each other, and then talking in the presence of female mice.

“They’re kind of silly: You put two [mice] together and they make these really simple conversations, repeating the same syllables over and over again; and you put a female mouse near them, and they turn into songbirds and have these very fancy calls with lots of transitions,” Coffey says.

That study took years of work, Coffey says. DeepSqueak took the same audio and identified all the different USVs in an afternoon.

The initial concept of DeepSqueak came to Coffey and Marx in a day, but hashing out the program and beta testing took a year. Researchers at UW and beyond have been contributing rodent audio data to DeepSqueak and testing out the software in their labs, adding vocalizations from different environments to the puzzle.

Dr. Aaron M. Johnson, a speech-language pathologist at New York University Langone Health who has studied rodent vocalizations for the past decade, beta-tested DeepSqueak and provided audio samples after a mutual friend of Coffey’s introduced the two. Today, Johnson’s lab is using DeepSqueak as its primary USV analysis tool.

“I have used three different programs for analyzing USVs and DeepSqueak is now my preferred method,” says Johnson, citing the program’s reliability and accuracy and the team’s timeliness in responding to bugs.

Rieger hasn’t used DeepSqueak yet, but is impressed with the concept and foresees a range of possible positive impacts for humans and rodents alike.

“Our understanding of rodent vocalizations remains behind that of many species. ... But we do know that changes in frequency, duration and type of USV all pass along important information from signaler to receiver,” he says. “A new technology, like DeepSqueak, that can help us parse out this information could lead to major new insights on behavior and health.”

DeepSqueak for all

Coffey and Marx are offering DeepSqueak to other labs for free, in an effort to collect as many types of vocalizations as they can in an open repository for global research use.

“Researchers around the world hopefully will relate [USVs] back to all the behaviors they observe; we only study a couple [behaviors, related to psychological illness], so everybody else can fill in pieces of the puzzle just using the tool that we're giving out,” Coffey says. The software is point-and-click with a simple interface, which the team hopes will make it easy for people without machine-learning backgrounds to use.

“At the time when I first started [requesting audio files and outside help], it was mostly because we didn't have that much money, but I've just seen a growing community of people who want to share all of their tools, all their data and resources,” he says. “It moves science along so much faster. The faster we can get this out to people, the more people that use it, the better.”

The rodents will benefit from it, too. For people like Coffey who both care about these animals and understand the sacrifices rodents make for scientific efforts they never volunteered for, developing less-invasive monitoring tools to use while they’re alive is a serious imperative. Animal welfare activists aren’t the only people concerned with these lab animals’ quality of life: Researchers like Coffey spend their days with test animals and appreciate their unique personalities and feelings more intimately than most. Language-analyzing programs like DeepSqueak only deepen these sentiments.

“You do this value judgment about life and human suffering and what's worth doing to try to reduce that, so as much as we love rats and mice, and think they're great, there's value in what we do,” Coffey says. “It's never easy.”
