How artificial intelligence could help us talk to animals
What might sperm whales and other animals be saying to each other?
A sperm whale surfaces, exhaling a cloud of misty air. Its calf comes in close to drink milk. When the baby has had its fill, mom flicks her tail. Then, together the pair dive down deep. Gašper Beguš watches from a boat nearby. "You get this sense of how vast and different their world is when they dive," he says. "But in some ways, they are so similar to us."
Sperm whales have families and other important social relationships. They also use loud clicking sounds to communicate. It seems as if they might be talking to each other.
Beguš is a linguist at the University of California, Berkeley. He got the chance, last summer, to observe sperm whales in their wild Caribbean habitat off the coast of the island nation of Dominica. With him were marine biologists and roboticists. There were also cryptographers and experts in other fields. All have been working together to listen to sperm whales and figure out what they might be saying.
They call this Project CETI. That's short for Cetacean Translation Initiative (because sperm whales are a type of cetacean).
Sperm whale clicks
A CETI tag attached to a sperm whale captured these sounds in January 2022. You can hear several different whales clicking. Does it seem like it could be a conversation?
To get started, Project CETI has three listening stations. Each one is a cable hanging deep into the water from a buoy at the surface. Along the cable, several dozen underwater microphones record whale sounds.
From the air, drones record video and sounds. Soft, fishlike robots do the same underwater. Suction-cup tags on the whales capture even more data. But just collecting all these data isn't enough. The team needs some way to make sense of it all. That's where artificial intelligence, or AI, comes in.
A type of AI known as machine learning can sift through vast amounts of data to find patterns. Thanks to machine learning, you can open an app and use it to help you talk to someone who speaks Japanese or French or Hindi. One day, the same tech might translate sperm-whale clicks.
Project CETI's team is not the only group turning to AI for help deciphering animal talk. Researchers have trained AI models to sort through the sounds of prairie dogs, dolphins, naked mole rats and many other creatures. Could their efforts crack the codes of animal communication? Let's take a cue from the sperm whales and dive in head first.
Watch out for the person wearing blue!
Long before AI came into the picture, scientists and others were working to understand animal communication. Some learned that vervet monkeys have different calls when warning of leopards versus eagles or pythons. Others discovered that elephants communicate in rumbles too low for human ears to hear. Bats chatter in squeaks too high for our hearing. Still other groups have explored how hyenas spread scents to share information and how bees communicate through dance. (By the way: Karl von Frisch won a Nobel Prize in 1973 for his work in the 1920s that showed bees can communicate information through dance.)
Do these types of communication systems count as language? Among scientists, that's a controversial question. No one really agrees on how to define language, says Beguš.
Prairie dog alarm calls
Listen to these Gunnison's prairie dog alarm calls. Can you hear differences between them?
Call 1:
Call 2:
Call 3:
The first is the call for a coyote. The second is for a dog. The sight of a human triggered the third call.
Human language has many important features. It's something we must learn. It allows us to talk about things that aren't in the here and now. It allows us to do lots more, too, such as invent new words. So a better question, Beguš says, is "what aspects of language do other species have?"
Con Slobodchikoff studied prairie dog communication for more than 30 years. Working as a biologist near Flagstaff, Ariz., he turned up many surprising language-like features in the rodents' alarm calls.
Prairie dogs live in large colonies across certain central U.S. grasslands. Like vervet monkeys, these animals call out to identify different threats. They have unique "words" for people, hawks, coyotes and pet dogs.
But that's not all they talk about. Their call for a predator also includes information about size, shape and color, Slobodchikoff's research has shown.
In one 2009 experiment, three women of similar size took turns walking through a colony of Gunnison's prairie dogs. Their outfits were identical except that on each walk, a woman wore either a green, blue or yellow T-shirt. Each woman passed through about 30 times. Meanwhile, an observer recorded the first alarm call a prairie dog made in response to the intruder.
All the calls fit a general pattern for "human." However, shoutouts about people in blue shirts shared certain features. The calls for yellow or green shirts shared different features. This makes sense, because prairie dogs can see differences between blue and yellow but can't see green as a distinct color.
"It took us a long time to measure all these things," says Slobodchikoff. AI, he notes, has the potential to greatly speed up this type of research. He has used AI to confirm some of his early research. He also suspects AI may discover "patterns that [we] might not have realized were there."
What carries meaning?
AI's ability to find hidden patterns is something that excites Beguš, too. "Humans are biased," he says. "We hear what is meaningful to us." But the way we use sound in our languages may not be anything like how animals use sound to communicate.
To build words and sentences, human languages use units of sound called phonemes (FOH-neems). Meaning comes from the order of these phonemes. In English, swapping the phoneme "d" for "fr" makes the difference between "dog" and "frog." In tonal languages, such as Mandarin Chinese, saying a word with a high or low voice can also change its meaning.
Animal languages could use any aspect of sound to carry meaning. Slobodchikoff found that prairie dog calls contain more layers of frequencies than human voices do. He suspects that the way these layers interact helps shape meaning.
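For readers curious what "layers of frequencies" look like in practice, here is a minimal Python sketch (using SciPy) of how one might break a recorded call into its frequency bands over time. The file name and the choice of the top three bands are made-up examples for illustration, not part of Slobodchikoff's method.

```python
# Hypothetical sketch: peek at the frequency "layers" in a recorded call.
# Assumes an audio file named "prairie_dog_call.wav" (a placeholder name).
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, samples = wavfile.read("prairie_dog_call.wav")  # sample rate (Hz) and raw audio
if samples.ndim > 1:                                  # keep one channel if the file is stereo
    samples = samples[:, 0]

freqs, times, power = spectrogram(samples, fs=rate)   # energy per frequency band, over time

# For each moment in time, find the three strongest frequency bands.
# Several strong bands stacked on top of each other are the "layers."
top_bands = freqs[np.argsort(power, axis=0)[-3:, :]]
print(top_bands[:, :5])                               # strongest bands in the first time slices
```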
Before Project CETI began, researchers had already collected and studied lots of sperm whale clicks. They call a group of clicks a "coda." A coda sounds a bit like Morse code. (Patterns of short and long beeps represent letters in Morse code.) Researchers suspected that the number and timing of the clicks in codas carried meaning.
Beguš built an AI model to test this.
His computer model contained two parts. The first part learned to recognize sperm whale codas. It worked from a collection of sounds recorded in the wild. The second part of the model never got to hear these sounds. It trained by making random clicks. The first part then gave feedback on whether these clicks sounded like a real coda.
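The article does not give the model's code, but a two-part setup like the one described is commonly built as a generative adversarial network. Below is a minimal, hypothetical sketch in Python with PyTorch. The data, model sizes and training details are all assumptions for illustration; here each coda is represented as a short list of click timings.

```python
# Minimal sketch of a two-part ("adversarial") model like the one described above.
# Assumption: each coda is summarized as 8 inter-click intervals. Everything here
# (sizes, fake data, names) is illustrative, not Project CETI's actual model.
import torch
import torch.nn as nn

CODA_LEN, NOISE_LEN = 8, 16

critic = nn.Sequential(            # part 1: learns to recognize real codas
    nn.Linear(CODA_LEN, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
generator = nn.Sequential(         # part 2: starts from random noise, makes "clicks"
    nn.Linear(NOISE_LEN, 32), nn.ReLU(), nn.Linear(32, CODA_LEN))

opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

real_codas = torch.rand(256, CODA_LEN)   # stand-in for recorded coda timings

for step in range(1000):
    # Part 1: tell real codas (label 1) apart from generated ones (label 0).
    fake = generator(torch.randn(64, NOISE_LEN)).detach()
    real = real_codas[torch.randint(0, 256, (64,))]
    loss_c = loss_fn(critic(real), torch.ones(64, 1)) + \
             loss_fn(critic(fake), torch.zeros(64, 1))
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()

    # Part 2: adjust the generator so its codas fool part 1.
    fake = generator(torch.randn(64, NOISE_LEN))
    loss_g = loss_fn(critic(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```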
Over time, the second part of the model learned to create brand-new codas that sounded very real. These brand-new codas might sound like nonsense to whales. But that doesn't matter. What Beguš really wanted to know was: How did the model create realistic codas?
The number and timing of clicks mattered, just as the researchers had suspected. But the model revealed new patterns that the experts hadn't noticed. One has to do with how each click sounds. It seems important that some frequencies are louder than others.
Beguš and his colleagues shared their findings March 20 at arXiv.org. (Studies posted on that site have not yet been vetted by other scientists.)
Part of their world
Once researchers know which features of sperm whale sounds are most important, they can begin to guess at their meaning.
For that, scientists need context. That's why Project CETI is gathering so much more than just sounds. Its equipment is tracking everything from water temperatures around the whales to whether there are dangerous orcas or tasty squid nearby. "We are trying to have a really good representation of their world and what is important to them," Beguš explains.
This gets at a tricky aspect of animal translation, one that has nothing to do with technology. It's a philosophical question. To translate what whale sounds mean, we need to figure out what they talk about. But how can we understand a whale's world?
In 1974, the philosopher Thomas Nagel published a famous essay: "What is it like to be a bat?" No matter how much we learn about bats, he argued, we'll never understand what it feels like to be one. We can imagine flying or sleeping upside-down, of course. But, he noted, "it tells me only what it would be like for me to behave as a bat behaves."
Imagining life as any other species poses the same problem. Marcelo Magnasco is a physicist at the Rockefeller University in New York City who studies dolphin communication. He notes that linguists have made lists of words common to all human languages. Many of these words, such as sit, drink and fire, would make no sense to a dolphin, he says. "Dolphins don't sit," he notes. "They don't drink. They get all their water from the fish they eat."
Similarly, dolphins likely have concepts for things that we never talk about. To get around, they make pulses of sound that bounce off certain types of objects nearby. This is called echolocation.
In water, sound waves pass through some objects. When a dolphin echolocates off a person or fish, it "sees" right through to the bones! What's more, Magnasco notes, dolphins might be able to repeat the perceived echoes to other dolphins. This would be sort of like "communicating with drawings made with the mouth," he says.
People don't echolocate, so "how would we be able to translate that?" he asks. These and other experiences have no human equivalent. And that could make it very difficult to translate what dolphins are saying.
Chatting with dolphins
Yet dolphins do share some experiences with people. "We're social. We have families. We eat," notes marine biologist Denise Herzing. She is the founder and research director of the Wild Dolphin Project in Jupiter, Fla. For almost 40 years, she has studied a group of wild Atlantic spotted dolphins. Her goal has been to find meaning in what they're saying to each other.
This is very slow work. She points out, for instance, "We don't know if they have a language, and if so, what it is."
One thing researchers do know is that dolphins identify themselves using a signature whistle. It's a bit like a name.
Dolphin signature whistles
A signature whistle is a bit like a dolphin's name. Luna and Trimy are two adult female Atlantic spotted dolphins that Denise Herzing works with. But they don't call themselves "Luna" or "Trimy." They use these whistles.
Luna:
Trimy:
Early on, Herzing gave herself a signature whistle. She also records the whistles of the dolphins she works with. She uses a machine called CHAT box to play these whistles back to dolphins. "If they show up, we can say, 'Hi, how are you?'" she explains.
Today, CHAT box runs on a smartphone. It contains more than just signature whistles. Herzing and her team invented whistle-words to identify things dolphins like to play with, including a rope toy. When the researchers and dolphins play with these items, the people use CHAT box to say the whistle-words. Some dolphins may have figured out what they mean. In a 2013 TED Talk, Herzing shared a video of herself playing the rope sound. A dolphin picked up the rope toy and brought it to her.
The next level would be for a dolphin to use one of these invented "words" on its own to ask for a toy. If it does, the CHAT box will decode it and play back the English word to the researchers. It already does this when the researchers play the whistle-words to each other. (One time a dolphin whistled a sound that the CHAT box translated as "seaweed." But that turned out to be a CHAT box error, so the team is still waiting and hoping.)
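The article doesn't say how CHAT box recognizes a whistle, but here is one hedged Python sketch of the general idea: trace the shape of an incoming whistle and compare it with stored whistle-words. The file names, contour-tracing and simple distance measure are all assumptions, not the real CHAT software.

```python
# Hypothetical sketch of whistle matching: compare an incoming whistle to stored
# whistle-words and report the closest one. File names are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def freq_contour(path):
    """Trace the loudest frequency over time; a rough sketch of a whistle's shape."""
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:
        audio = audio[:, 0]
    freqs, _, power = spectrogram(audio, fs=rate)
    return freqs[np.argmax(power, axis=0)]

stored = {"rope": freq_contour("rope_whistle.wav"),       # invented whistle-words
          "seaweed": freq_contour("seaweed_whistle.wav")}
heard = freq_contour("incoming_whistle.wav")

def distance(a, b):
    n = min(len(a), len(b))               # crude alignment: compare only the overlap
    return float(np.mean((a[:n] - b[:n]) ** 2))

best = min(stored, key=lambda word: distance(stored[word], heard))
print("Closest stored whistle-word:", best)
```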
Herzing is also using AI to sift through recorded dolphin calls and sort them into different types of sound. Like the Project CETI team, she's also matching sounds with behaviors. If a certain type of sound always comes up when mothers are disciplining calves, for example, that might help reveal structure or meaning. She notes that "the computer is not a magic box." Asking it to interpret all this, she says, is "so much harder than you think."
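As a rough illustration of what "sorting calls into types" can look like, here is a small sketch using scikit-learn to cluster recordings by their average frequency profile. The features, file names and number of clusters are assumptions; this is not the Wild Dolphin Project's actual pipeline.

```python
# Hypothetical sketch: group recorded calls into sound types without labels.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.cluster import KMeans

def call_features(path):
    """Boil one recording down to an average frequency profile."""
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:
        audio = audio[:, 0]
    _, _, power = spectrogram(audio, fs=rate, nperseg=512)
    return power.mean(axis=1)          # average energy in each frequency band

files = ["call_001.wav", "call_002.wav", "call_003.wav"]   # placeholder file names
features = np.vstack([call_features(f) for f in files])

labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
for f, label in zip(files, labels):
    print(f, "-> sound type", label)
```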
Robot chirps
Alison Barker uses AI to study the sounds of another animal: the naked mole rat. These hairless critters live in underground colonies. "They sound like birds. You hear them chirping and tweeting," says this neurobiologist. She works at the Max Planck Institute for Brain Research. It's in Frankfurt, Germany.
Naked mole rats use a particular soft chirp when they greet each other. In one study, Barker's team recorded more than 36,000 soft chirps from 166 animals that lived in seven different colonies. The researchers used an AI model to find patterns in these sounds. AI, says Barker, "really transformed our work." Without the tool, she says, it would have taken her team more than 10 years to go through the data.
Each colony had its own distinct dialect, the model showed. Baby naked mole rats learn this. And pups raised in a colony different from where they were born will adopt the new colony's dialect, Barker found.
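One common way to test for a dialect, sketched below with made-up data, is to train a model to guess a chirp's home colony from its acoustic features. If the model guesses far better than chance, the colonies really do sound different. This illustrates the general approach only; it is not the Barker team's exact analysis.

```python
# Hypothetical sketch: can a model guess which colony a chirp came from?
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_chirps, n_features, n_colonies = 600, 5, 3

# Pretend features (e.g., pitch, duration, how the pitch rises) for each chirp.
X = rng.normal(size=(n_chirps, n_features))
colony = rng.integers(0, n_colonies, size=n_chirps)
X += colony[:, None] * 0.5          # give each colony a slight acoustic "accent"

scores = cross_val_score(RandomForestClassifier(), X, colony, cv=5)
print("How often the model guesses the right colony:", scores.mean())
# Guessing far above chance (1/3 here) would hint at a real colony dialect.
```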
She also used an AI model to create fake soft chirps. These fit the pattern of each dialect. When her team played these sounds to naked mole rats, they responded to sounds that matched their dialect and ignored sounds that didn't.
This means that the dialect itself, and not just an individual's voice, must help these critters understand who belongs in their group.
Does the soft chirp mean "hello"? That's Barker's best translation. "We don't understand the rules of their communication system," she admits. "We're just scratching the surface."
Naked mole rat toilet call
The soft chirp seems to be how naked mole rats say "hello." But they make other sounds. "They have a specific 'toilet' call," Alison Barker says. Only the queen and breeding males use it. It's like a song they sing as they urinate, she says.
Time will tell
AI has greatly reduced the time it takes to sort, tag and analyze animal sounds, as well as to figure out which aspects of those sounds might carry meaning. Perhaps one day we'll be able to use AI to build a futuristic CHAT box that translates animal sounds into human language, or vice versa.
Project CETI is just one organization working toward this goal. The Earth Species Project and the Interspecies Internet are two others focused on finding ways to communicate with animals.
"AI could eventually get us to the point where we understand animals. But that's tricky and long-term," says Karen Bakker. She's a researcher at the University of British Columbia in Vancouver, Canada.
Sadly, Bakker says, time is not on our side when it comes to studying wild animals. Across the planet, animals are facing threats from habitat loss, climate change, pollution and more. "Some species could go extinct before we figure out their language," she says.
Plus, she adds, the idea of walking around with an animal translator may seem cool. But many animals might not be interested in chatting.
"Why would a bat want to speak to you?" she asks. What's interesting to her is what we can learn from how bats and other creatures talk among themselves. We should listen to nature in order to better protect it, she argues. For example, a system set up to record whales or elephants can also track their locations. This can help boats steer clear of whales, or help protect elephants from poachers.
Conservation is one goal driving Project CETI. "If we understand [sperm whales] better, we will be better at understanding what's bothering them," says Beguš. Learning that a species has something akin to language or culture could also inspire people to work harder to protect it.
For example, some people consider prairie dogs to be pests. But Slobodchikoff has found that when he explains that prairie dogs talk to each other, "people's eyes light up" in seeming appreciation of the species' value.
When you protect an animal that has some version of language or culture, you're not merely conserving nature. You're also saving a way of life. Herzing says that dolphins deserve a healthy environment so their cultures can thrive.
In the future, instead of guessing at what animals might need, we might just be able to ask them.