A dolphin handler makes a gesture with her hands. Two trained dolphins dip underwater, exchange sounds, then emerge and flip on to their backs: they have devised a new trick of their own and performed it in tandem. Aza Raskin doesn't think this proves the dolphins have language, but he argues that a rich, symbolic way of communicating would certainly make such a task easier.
The Earth Species Project (ESP), the California non-profit Raskin co-founded, has a bold ambition: to decode non-human communication using a form of artificial intelligence called machine learning. A 1970 album of whale song helped spark the movement that led to the ban on commercial whaling; what might a way to translate the whole animal kingdom set in motion?
The organisation published a scientific paper last December. The goal, Raskin says, is to unlock animal communication within our lifetimes: can we decode it and discover non-human language? Along the way, ESP is developing technology that supports biologists and conservation work now.
Understanding animal vocalisations has long fascinated humans. Various primates give alarm calls that differ according to the predator; dolphins address one another with signature whistles; and some songbirds can rearrange elements of their calls to communicate different messages. But most experts stop short of calling it language, because no animal communication meets all the criteria.
In the past, decoding has relied mostly on painstaking observation. More recently, machine learning has been applied to handle the huge volumes of data collected by modern animal-borne sensors. An associate professor at the University of Copenhagen who studies vocal communication in mammals and birds says that researchers are starting to use it, but that it is not yet clear how much it can achieve.
One algorithm analyses pig grunts to tell whether the animal is experiencing a positive or negative emotion. Another, called DeepSqueak, judges from rodents' ultrasonic calls whether they are in a stressed state. A further initiative plans to use machine learning to translate the communication of sperm whales.
ESP, though, aims to decode the communication not of one species but of all of them. While rich, symbolic communication is more likely among social animals, the goal is to develop tools that could be applied to the entire animal kingdom. Raskin says the group is species agnostic: the tools it develops can be used across all of biology.
The motivating insight is that machine learning can already translate between different, sometimes distant, human languages without any prior knowledge of either language.
The process starts with an algorithm that represents words as points in a geometric space, where the distance and direction between points describe how the words relate to one another: "king" relates to "man" with the same distance and direction that "queen" relates to "woman". The mapping is built not from knowing what the words mean, but from looking at how often they occur near one another.
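As a rough illustration of this idea, and not ESP's own code, pretrained word embeddings reproduce the king/queen relationship directly. The sketch below assumes the gensim library and a small public GloVe model; the particular model is just one readily available choice.

    # A minimal sketch: pretrained GloVe word embeddings illustrate the
    # "distance and direction" idea described above.
    import gensim.downloader as api

    # Load a small set of 50-dimensional GloVe vectors (downloads on first use).
    vectors = api.load("glove-wiki-gigaword-50")

    # king - man + woman ~ queen: the offset between "king" and "man" points in
    # roughly the same direction as the offset between "queen" and "woman".
    print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

    # Co-occurrence is what produces this geometry: words used in similar
    # contexts end up close together, e.g. "dog" and "cat".
    print(vectors.similarity("dog", "cat"))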
It was later noticed that these "shapes" are similar across different languages, and two groups of researchers, working independently, found a technique for translating by aligning them: to get from English to Urdu, align the two shapes and find the Urdu point closest to the word's point in English. Most words can be translated reasonably well this way.
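One standard way to align two embedding spaces, when a handful of seed word pairs is available, is an orthogonal Procrustes fit. The sketch below uses toy random data standing in for the two languages; it illustrates the general alignment idea rather than the researchers' fully unsupervised method.

    import numpy as np
    from scipy.linalg import orthogonal_procrustes

    # Toy stand-ins: 100 "English" word vectors, and "Urdu" vectors that are a
    # rotated copy of them (real embeddings are only approximately related).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))
    W_true = np.linalg.qr(rng.standard_normal((50, 50)))[0]
    Y = X @ W_true

    # Find the rotation that best maps the English shape onto the Urdu shape.
    R, _ = orthogonal_procrustes(X, Y)

    def translate(vec_en, urdu_vectors):
        # Map an English vector into Urdu space and return its nearest neighbour.
        mapped = vec_en @ R
        sims = urdu_vectors @ mapped / (
            np.linalg.norm(urdu_vectors, axis=1) * np.linalg.norm(mapped))
        return int(np.argmax(sims))

    print(translate(X[0], Y))  # recovers index 0 in this toy setup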
ESP's aspiration is to create representations of animal communication, working both with individual species and with many species at once, and then to explore questions such as whether there is overlap with the universal human "shape". We don't know how animals experience the world, Raskin says, but there are emotions, grief and joy for example, that some animals appear to share with us and may well communicate about with others in their species. Where the shapes overlap, we may be able to communicate or translate; elsewhere, we may not.
Animals don't only communicate vocally, he notes. Bees, for example, convey a flower's location to one another through a "waggle dance". Translating across these different modes of communication will also be necessary.
Raskin likens the goal to going to the moon: the idea is not to get there all at once, but to solve a series of smaller problems needed for the bigger picture to be realised. That approach can be seen in ESP's development of general tools intended to help researchers studying particular species.
ESP recently published a paper on the so-called "cocktail party problem" in animal communication, in which it is difficult to discern which individual in a group of animals of the same species is vocalising.
ESP says this end-to-end detangling of overlapping animal sound had not been done before. The model worked best when calls came from individuals it had been trained on, but it was also able to disentangle calls from animals that were not in the training set.
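For readers unfamiliar with the cocktail party problem, the classic textbook illustration is separating mixed recordings into their source signals. The minimal sketch below uses independent component analysis on synthetic signals; it illustrates the problem itself, not ESP's neural model.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 2000)

    # Two hypothetical "callers": a slow chirp and a faster trill.
    source_a = np.sin(2 * np.pi * 5 * t)
    source_b = np.sign(np.sin(2 * np.pi * 23 * t))
    S = np.c_[source_a, source_b]

    # Two "microphones" each record a different mixture of the two callers.
    A = np.array([[1.0, 0.6], [0.4, 1.0]])
    X = S @ A.T + 0.02 * rng.standard_normal(S.shape)

    # Recover the underlying sources from the mixtures alone.
    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)
    print(S_est.shape)  # (2000, 2): two separated signals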
Another project uses AI to generate novel animal calls, which can be played back to the animals to see how they respond. If the AI can identify what makes a change random rather than semantically meaningful, Raskin explains, that brings us closer to meaningful communication: it is having the AI speak the language, even though we don't yet know what it means.
A further project aims to use self-supervised machine learning, which requires no labelling of data by human experts, to determine how many call types a species has at its disposal. In an early test case, it will mine audio recordings made by a team led by Christian Rutz, a professor of biology at the University of St Andrews, to produce an inventory of the vocal repertoire of the Hawaiian crow.
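As a much simpler stand-in for that self-supervised approach, a rough inventory can be obtained by clustering acoustic features of recorded calls. The sketch below assumes the librosa and scikit-learn libraries; the folder of recordings and the number of clusters are hypothetical placeholders.

    # Cluster MFCC features of recorded calls into putative call types.
    import glob
    import librosa
    import numpy as np
    from sklearn.cluster import KMeans

    features = []
    for path in glob.glob("crow_calls/*.wav"):       # hypothetical recordings
        y, sr = librosa.load(path, sr=22050)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        features.append(mfcc.mean(axis=1))           # one vector per call

    X = np.vstack(features)
    labels = KMeans(n_clusters=8, random_state=0).fit_predict(X)  # 8 is assumed
    print(np.bincount(labels))  # how many calls fall into each putative type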
Rutz is enthusiastic about the project. The Hawaiian crow survives only in captivity, where it is being bred for reintroduction to the wild. It is hoped that by comparing recordings made at different times, it will be possible to track whether specific alarm calls are being lost in captivity, which could have consequences for the species' return to the wild. Detecting and classifying the calls manually would be labour-intensive and error-prone, Rutz says, and the new tools could produce a step change in our ability to help these birds come back from the brink.
Another project seeks to understand the functional meanings of vocalisations. It involves the laboratory of Ari Friedlaender at the University of California, Santa Cruz, which studies how wild marine mammals, difficult to observe directly, behave underwater. Small electronic tags attached to the animals capture their location, type of motion and even what they see, and the lab also has sound recorders deployed in the ocean.
ESP aims to apply machine learning to the tag data to gauge automatically what an animal is doing, and then to add the audio data to see whether functional meaning can be attached to calls tied to that behaviour. Playback experiments could then be used to validate any findings. Because the lab has tagged several animals in the same group, it is also possible to see how signals are received. Friedlaender says he has been hitting a ceiling in terms of what current methods can extract from the data, and hopes ESP's work will deliver new insights.
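One plausible, simplified version of that pipeline, not ESP's actual method, is to classify behaviour from tag-derived features and then cross-tabulate the predicted behaviours with the call types detected in the synchronised audio. The file names, column names and behaviour labels below are hypothetical.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical per-window features from a tag, with a small hand-labelled
    # subset of behaviours ("feeding", "travelling", ...) for training.
    tags = pd.read_csv("tag_windows.csv")        # depth_mean, accel_var, pitch_mean, ...
    labelled = tags.dropna(subset=["behaviour"])

    features = ["depth_mean", "accel_var", "pitch_mean"]
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(labelled[features], labelled["behaviour"])

    # Predict behaviour for every window, then cross-tabulate with the call
    # type detected in the synchronised audio for that window.
    tags["behaviour_pred"] = clf.predict(tags[features])
    calls = pd.read_csv("call_windows.csv")      # window_id, call_type
    merged = tags.merge(calls, on="window_id")
    print(pd.crosstab(merged["behaviour_pred"], merged["call_type"]))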
Not everyone is so enthusiastic about the power of AI to achieve such aims. Robert Seyfarth, a professor of psychology at the University of Pennsylvania, has studied social behaviour and vocal communication in primates for more than 40 years. While he believes machine learning can be useful for some problems, such as identifying an animal's vocal repertoire, he is not convinced it will help with others, such as discovering the meaning and function of vocalisations.
The problem, he explains, is that while many animals can have sophisticated, complex societies, they have a much smaller repertoire of sounds than humans. The result is that exactly the same sound can be used to mean different things in different contexts, and it is only by studying the context that the meaning can be established. Seyfarth thinks the AI methods on their own are insufficient: you have to go out and watch the animals.
The idea that the "shape" of animal communication will overlap meaningfully with human communication is also questioned. Applying computer-based analyses to human language is one thing; doing the same for other species may be quite different, says Kevin Coffey, a neuroscientist at the University of Washington who co-created the DeepSqueak algorithm.
Raskin acknowledges that AI alone may not be enough to unlock communication with other species. But he points to research showing that many species communicate in more complex ways than previously thought. The stumbling blocks, he argues, have been our ability to gather enough data and analyse it at scale; these are the tools, he says, that let us take off the human glasses.