CRBC News
Science

DolphinGemma: The New AI Bringing Humans Closer to Understanding Dolphin Speech

Happy dolphin face peeking out of the water - Andrey E. Donnikov/Shutterstock

DolphinGemma is a new AI collaboration between Google, the Wild Dolphin Project and Georgia Tech that trains a language model on decades of recordings of Atlantic spotted dolphins in the Bahamas. The model analyzes whistles, clicks and burst pulses to identify patterns and predict sequences, with the longer-term goal of inferring possible meanings and building a shared vocabulary. Google plans to release DolphinGemma openly, and similar AI efforts (DeepSqueak, MeowTalk) show growing interest in decoding animal communication.

For decades people have imagined a real-life Doctor Dolittle who could converse with animals. Recent advances in machine learning are moving that fantasy closer to reality — at least for one of the ocean's most vocal inhabitants. DolphinGemma, a new AI project from Google in collaboration with the Wild Dolphin Project and the Georgia Institute of Technology, is trained to analyze decades of recordings from a Bahamian pod of Atlantic spotted dolphins and search for meaningful patterns in their whistles, clicks and burst pulses.

How Dolphins Communicate

Dolphins use three primary sound types: whistles (often linked to social calls), clicks (used for echolocation and possibly short-range signaling) and burst pulses (rapid groups of clicks). Since 1985 the Wild Dolphin Project has recorded audio and video of a single pod of Atlantic spotted dolphins in the Bahamas, building a rare long-term dataset of social interactions and vocal behavior.

What Is DolphinGemma?

DolphinGemma is a domain-specific large language model trained on that long-term dataset. The model looks for recurring structures and sequences in dolphin vocalizations, with two main ambitions: to predict the next sound in a sequence (similar to how LLMs predict the next word in a sentence) and to begin mapping sounds to potential meanings or functions. The project aims to create a shared vocabulary or framework that could let humans and dolphins better interpret one another's signals.

Scuba diver and dolphin facing each other underwater - Flicketti/Shutterstock

“So long, and thanks for all the fish.” — Douglas Adams, whose fictional dolphins departed Earth with a final message that humans never understood.

Why This Matters — And What To Watch For

If DolphinGemma succeeds at reliably detecting patterns and associating them with behavioral context, it could transform our understanding of dolphin social life and cognition. It would also be a landmark example of machine learning applied to interspecies communication. However, translating animal sounds into human-like language is challenging: patterns do not always equal meaning, datasets can be limited, and researchers must avoid overinterpretation. Ethical considerations — such as welfare, consent for research subjects, and how we act on any intelligence we detect — are also important.

Broader Context

Dolphins are not the only species being studied with AI. Projects such as DeepSqueak (rodent ultrasonic calls) and MeowTalk (cat vocalizations) apply machine-learning tools to animal sounds, showing wider scientific interest in decoding nonhuman communication. Google says it plans to make DolphinGemma openly available upon release, which could accelerate collaboration and independent validation.

What Comes Next

Work on DolphinGemma is ongoing. Early results will need peer review, replication on additional populations, and careful interpretation. Even if the project does not produce a full “translation” of dolphin language, it may reveal structure and signals that deepen our respect for — and ability to protect — these social, intelligent animals.

Related Animal Language Milestones: Koko the gorilla learned more than 1,000 signs based on American Sign Language; Alex the grey parrot demonstrated conceptual understanding of shapes, colors and numbers; Chaser the border collie learned more than 1,000 object names and showed basic grammar-like comprehension.
