Sperm whales (Physeter macrocephalus) are known for their intricate social structures and behaviors, supported by their unique vocalizations. These whales produce codas: series of fast, Morse code-like clicks used during social interactions. These are not random bursts of clicks but purposeful sounds that convey meaning, much like human words. For example, sperm whales use codas to identify themselves and to recognize other groups. This knowledge is passed down culturally: calves cannot produce codas at birth, and they learn by copying their parents.
Previously, scientists believed that sperm whale codas were interesting but fairly simple sets of messages. However, researchers recently used AI to analyze and interpret a large dataset of whale clicks, somewhat like using Google Translate to make sense of a phrase in a foreign language. This study suggests that codas are much more than simple, repetitive signals.
They are actually an advanced language made up of nearly ten times more identifiable patterns than previously acknowledged.
The AI algorithms used by the researchers at MIT analyzed and categorized over 8,700 codas, identifying subtle differences in rhythm and structure that define the different “dialects” among whale groups. AI was vital here because the task played to its core strength: pattern recognition. The machines could dissect nuanced variations in coda sequences that would be almost impossible for human analysts to discern.
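To give a flavor of what this kind of pattern recognition involves, here is a deliberately simplified sketch. It is not the MIT team’s actual pipeline; the click timings and function names are invented for illustration. The idea is to describe each coda by the gaps between its clicks, normalized by total duration, so that two codas with the same rhythm performed at different tempos end up with the same signature and can be grouped together.

```python
# Hypothetical sketch of rhythm-based coda grouping.
# All click times below are invented for illustration.

def inter_click_intervals(click_times):
    """Gaps between consecutive clicks: the raw material of a coda's rhythm."""
    return [b - a for a, b in zip(click_times, click_times[1:])]

def rhythm_signature(click_times):
    """Normalize intervals by total duration, so codas with the same rhythm
    but different overall tempo map to the same signature."""
    icis = inter_click_intervals(click_times)
    total = sum(icis)
    return tuple(round(ici / total, 2) for ici in icis)

def group_by_rhythm(codas):
    """Bucket codas that share a normalized rhythm signature."""
    groups = {}
    for times in codas:
        groups.setdefault(rhythm_signature(times), []).append(times)
    return groups

# Two codas with the same long-short-short-short rhythm at different
# tempos, plus one coda of evenly spaced clicks.
codas = [
    [0.0, 0.5, 0.7, 0.9, 1.1],      # slow rendition
    [0.0, 0.25, 0.35, 0.45, 0.55],  # same rhythm, faster tempo
    [0.0, 0.2, 0.4, 0.6, 0.8],      # evenly spaced clicks
]
groups = group_by_rhythm(codas)
print(len(groups))  # prints 2: the first two codas share a signature
```

A real analysis works with thousands of recordings and far subtler features, but the principle is the same: turn each coda into a compact numerical description and let the machine find which descriptions recur.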
The study’s results are astonishing: sperm whale codas not only vary by group but also change contextually within conversations. Two key features — rubato (temporal variations) and ornamentation (additional clicks) — combine with rhythm and tempo to form a rich vocal repertoire. This combined system allows whales to express a wide range of information and emotions, from social cues to environmental interactions, the authors reported in the journal
Nature Communications.

Deep sea chatter
However, while the researchers were able to categorize the structure of these clicks, they have still not uncovered their semantics: the meanings associated with specific vocal patterns. The authors focused on identifying the complexity and variability in the vocal patterns rather than interpreting what each pattern specifically communicates.
The bottom line is that sperm whale vocalizations are not just pleasant sounds in the deep blue ocean. They serve a clear purpose in intricate communication systems, perhaps not all that different from human language. This may also explain
the findings of biologists at Dalhousie University in Nova Scotia, who analyzed newly digitized logbooks kept by whalers during their hunting voyages in the North Pacific. Sperm whales were the main target of the commercial whaling industry from 1800 to 1987, as depicted in the legendary novel Moby-Dick. The biologists found that the strike rate of the whalers’ harpoons decreased by 58% in just a few years, and they believe this was because the whales shared information among themselves.
Not long ago, people thought only humans could use symbolic language. Now, the focus is on understanding animal communication without putting humans at the center of it.
The way AI is changing how we understand how animals communicate
Scientists are discovering that many species have complex ways of communicating. For example,
a study from 2016 applied deep learning to over 15,000 recordings of Egyptian fruit bats, linking the bats’ sounds to specific behaviors. The bats not only compete for resources but also distinguish between males and females in their communications, use distinct “signature calls” that work like individual names, and engage in vocal learning. Interestingly, mother bats lower their voices when talking to their pups, unlike human mothers, who use higher-pitched “motherese”. The lower pitch in bats elicits a babble response from the young, helping them learn specific sounds. Most of these sounds are ultrasonic, beyond our hearing range. Scientists can’t hear and understand bat ‘speech’, but our computers can.
At the Free University of Berlin, researchers used AI that combines computer vision with natural language processing to understand the intricate movements and sounds of honeybees. The bees use specific signals, including instructions to stop or keep quiet. Then, the researchers used this information and
created RoboBee, a small robot placed inside a beehive. The RoboBee acts like a bee and uses bee ‘language’.

A robotic “bee” does a waggle dance. Credit: Freie Universität Berlin.
A new method for finding communication in nature
The knowledge gained from these studies is as revolutionary as when the microscope revealed the microbial world centuries ago, according to Karen Bakker, a professor at the University of British Columbia and a fellow at the Harvard Radcliffe Institute for Advanced Study.
“When Dutch scientist Antonie van Leeuwenhoek started looking through his microscopes, he discovered the microbial world, which led to countless future discoveries. So, the microscope allowed humans to see in a new way with both our eyes and our imaginations. Similarly, digital bioacoustics, combined with artificial intelligence, is like a planetary-scale hearing aid that allows us to listen in a new way with both our enhanced ears and our imagination.”
“This is gradually expanding our understanding not only of the remarkable sounds that animals make, but also of a fundamental set of questions about the supposed divide between humans and animals, and our connection to other species. It’s also introducing fresh ways to think about conservation and our relationship with the planet. It’s quite significant.”