In 2017, a group of scientists was struck by a startling finding: sperm whale vocalizations, which sound like clicks, closely resemble Morse code. The discovery sowed the seeds for an ambitious project – the Cetacean Translation Initiative, or Project CETI – that would use artificial intelligence to translate these whale sounds so humans could understand them. Introducing technology into the study of animal behavior not only helps us understand animals better; paradoxically, it also reveals our own limitations as a species. And it could cut one of two ways: enabling greater conservation efforts, or breeding a hubris that turns newfound knowledge of animal communication against the animals themselves.
Cetacean communication is not the only subject of such translation initiatives. Researchers have developed an algorithm that can assess pigs' emotional states from their calls, while others have used audio and video recordings to uncover the context of bat calls. Until recently, attempts to understand animal communication were based largely on observation; the advent of digital technology has greatly expanded the scope of such research. "Combined, these digital devices function like a planetary-scale hearing aid: they allow people to observe and study the sounds of nature beyond the limits of our sensory abilities," wrote Karen Bakker in her book, The Sounds of Life: How Digital Technology Is Bringing Us Closer to Wildlife. However, these devices have also generated vast amounts of data that are difficult and time-consuming to analyze manually.
The scientific community is therefore increasingly using technological tools such as drones, recorders, robots and AI to study the calls of a range of species, from chickens and rodents to cats and lemurs.
As The New York Times reported, machine learning algorithms can detect subtle patterns that human researchers might miss. These programs can distinguish the calls of individual animals, identify the sounds made under different circumstances, and break calls down into smaller parts – an important step in deciphering meaning. One example is DeepSqueak, software that uses deep learning algorithms to detect, process, and categorize the ultrasonic squeaks of rodents, which are otherwise inaudible to the human ear.
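To give a flavor of what "categorizing" a call can mean in practice, here is a minimal sketch of one common first step: estimating a call's dominant frequency and sorting calls into coarse types. The sample rate, frequency boundary, and synthetic pure-tone "calls" are all illustrative assumptions – this is not DeepSqueak's actual pipeline, which uses deep neural networks on spectrograms:

```python
import numpy as np

SAMPLE_RATE = 250_000  # ultrasonic audio needs a high sample rate (illustrative value)

def synth_call(freq_hz, duration_s=0.01):
    """Generate a pure-tone stand-in for a recorded rodent call."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq_hz * t)

def peak_frequency(signal):
    """Estimate a call's dominant frequency from its FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def categorize(signal, boundary_hz=50_000):
    """Toy categorizer: split calls into 'low' vs 'high' ultrasonic types."""
    return "high" if peak_frequency(signal) >= boundary_hz else "low"

# Four synthetic calls at different pitches, sorted into two coarse categories.
calls = [synth_call(f) for f in (22_000, 30_000, 60_000, 70_000)]
labels = [categorize(c) for c in calls]
```

Real systems replace the hand-picked frequency boundary with a learned classifier, but the underlying idea – mapping each call to acoustic features and then to a category – is the same.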
Such tools could support conservation efforts in new ways. Project CETI is working to decipher the syntax and semantics of communication among sperm whales, which are classified as federally endangered in the United States. Meanwhile, the Earth Species Project (ESP) — a nonprofit dedicated to using AI to decode nonhuman communication — is cataloging the calls of Hawaiian crows and even trying to develop new technologies that could help humans talk to animals.
With reports that climate change is affecting bird populations—reducing the complexity and variation of birdsong and leading to the collapse of "song culture"—ESP's study of the critically endangered, captive-bred Hawaiian crow has multiple implications for conservation. It could help researchers study lost bird calls and reintroduce those thought to be most critical for captive birds. "The goal we're working towards is, can we decode animal communications, discover non-human language… Along the way, and just as importantly, we are developing technology that supports biologists and conservation now," Aza Raskin, co-founder and president of ESP, told The Guardian.
The wealth of information hidden in animal communication can also aid in understanding the social dynamics within a species. When machine learning was used to analyze around 36,000 naked mole rat chirps, the researchers found that each mole rat had its own unique vocal signature and each colony had its own dialect that was passed down through the generations. In cases where a colony queen was deposed, these dialects were erased. With a new queen, a new dialect would emerge.
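Findings like the mole-rat dialects rest on unsupervised grouping of acoustic features. The sketch below illustrates that idea with synthetic two-dimensional chirp features (pitch, duration) for two hypothetical colonies and a bare-bones k-means clusterer – the feature values and method are illustrative assumptions, not those of the actual study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic (peak pitch in kHz, duration in ms) features for chirps from two
# hypothetical colonies; real studies extract far richer acoustic features.
colony_a = rng.normal(loc=[8.0, 40.0], scale=[0.3, 2.0], size=(50, 2))
colony_b = rng.normal(loc=[12.0, 60.0], scale=[0.3, 2.0], size=(50, 2))
chirps = np.vstack([colony_a, colony_b])

def kmeans2(points, iters=20):
    """Bare-bones 2-cluster k-means: seed with the first point and the point
    farthest from it, then alternate assignment and centroid updates."""
    far = np.argmax(np.linalg.norm(points - points[0], axis=1))
    centroids = np.array([points[0], points[far]])
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        centroids = np.array([points[assign == j].mean(axis=0) for j in (0, 1)])
    return assign

labels = kmeans2(chirps)
```

If chirps from the same colony consistently land in the same cluster – as they do here by construction – that is evidence of a shared, colony-level "dialect" in the feature space.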
Additionally, research suggests that decoding animal language also has the potential to help us understand “the evolution, neurobiology, and cognitive basis of human language.” Several scholars have challenged the popular belief that animal communication is entirely independent of the evolution of human language, noting that songbirds “show a human-like ability to learn complex vocal patterns.”
However, deciphering animal language poses several difficulties. As Raskin noted, animals don't communicate through sound alone, which means translating across different modes of communication – as when bees inform one another about a nectar source through a "waggle dance." Moreover, such research raises a more fundamental question that has polarized the scientific community for years: do animals even have language?
Bakker highlights in her book that Indigenous communities have long been aware of animal communication. She notes that researchers investigating it have met stiff opposition from the Western scientific community, which "historically has totally rejected the idea of animal communication."
"So much of the 20th-century attempt to teach primates human speech or sign language was underpinned by the assumption that language is unique to humans, and that if we were to prove animals have language, we would have to show they can learn human language. And looking back, that's a very human-centric view," Bakker told Vox, adding that today's research takes a very different approach and has led to fascinating discoveries – including that elephants have different signals for threatening and non-threatening humans.
While the ability to understand animals could help deepen our relationship with our environment and guide conservation efforts, Bakker also raised several ethical concerns. "…[T]he ability to talk to other species sounds intriguing, but it could be used either to create a deeper sense of kinship, or a sense of dominion and a manipulative ability to domesticate wild species that we as humans have never before been able to control," she said. Researchers have also cautioned that AI may not be able to fully decipher the complex nature of animal languages. Still, it offers a glimpse of what might be possible in the future.
While we're still a long way from a Google Translate for animal languages – one that can decipher the nuances of intraspecies communication – technology, particularly machine learning, is keeping that hope alive. The ability to understand animal languages could open up a wealth of possibilities: shaping conservation efforts, determining our future relationship with other species, and even offering insight into the evolution of human language itself.