DolphinGemma: The New AI Bringing Humans Closer to Understanding Dolphin Speech

DolphinGemma is a new AI collaboration between Google, the Wild Dolphin Project and Georgia Tech that trains a language model on decades of recordings of Atlantic spotted dolphins in the Bahamas. The model analyzes whistles, clicks and burst pulses to identify patterns and predict sequences, with the longer-term goal of inferring possible meanings and building a shared vocabulary. Google plans to release DolphinGemma openly, and similar AI efforts (DeepSqueak, MeowTalk) show growing interest in decoding animal communication.

For decades people have imagined a real-life Doctor Dolittle who could converse with animals. Recent advances in machine learning are moving that fantasy closer to reality, at least for one of the ocean's most vocal inhabitants. DolphinGemma, a new AI project from Google in collaboration with the Wild Dolphin Project and the Georgia Institute of Technology, is trained on decades of recordings from a Bahamian pod of Atlantic spotted dolphins and searches for meaningful patterns in their whistles, clicks and burst pulses.
How Dolphins Communicate
Dolphins use three primary sound types: whistles (often linked to social calls), clicks (used for echolocation and possibly short-range signaling) and burst pulses (rapid groups of clicks). Since 1985 the Wild Dolphin Project has recorded audio and video of a single pod of Atlantic spotted dolphins in the Bahamas, building a rare long-term dataset of social interactions and vocal behavior.
What Is DolphinGemma?
DolphinGemma is a domain-specific large language model trained on that long-term dataset. The model looks for recurring structures and sequences in dolphin vocalizations, with two main ambitions: to predict the next sound in a sequence (similar to how LLMs predict the next word in a sentence) and to begin mapping sounds to potential meanings or functions. The project aims to create a shared vocabulary or framework that could let humans and dolphins better interpret one another's signals.
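To make the next-sound analogy concrete, here is a minimal sketch of next-token prediction over symbolic sound units. This is not DolphinGemma's actual architecture, which the project has not detailed here; the token labels such as "whistle_A" and "burst_1" are invented purely for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical tokenized vocalization sequences: each symbol stands for a
# discrete sound unit (a whistle type, a click train, a burst pulse).
# The labels are made up; a real system would derive tokens from audio.
sequences = [
    ["whistle_A", "click_train", "whistle_A", "burst_1"],
    ["whistle_A", "click_train", "burst_1"],
    ["whistle_B", "whistle_A", "click_train", "whistle_A"],
]

# Count how often each token follows each preceding token: a bigram model,
# the simplest stand-in for "predict the next sound in a sequence".
transitions = defaultdict(Counter)
for seq in sequences:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def predict_next(token):
    """Return the most frequently observed next sound unit after `token`."""
    counts = transitions.get(token)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("whistle_A"))    # -> "click_train"
print(predict_next("click_train"))  # -> "whistle_A"
```

A real system would replace the hand-labeled tokens with units derived automatically from the audio and the simple counts with a trained neural sequence model, but the underlying prediction task is the same in spirit.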
“So long, and thanks for all the fish.” — Douglas Adams, whose fictional dolphins left Earth with a farewell message that humans never understood.
Why This Matters — And What To Watch For
If DolphinGemma succeeds at reliably detecting patterns and associating them with behavioral context, it could transform our understanding of dolphin social life and cognition. It would also be a landmark example of machine learning applied to interspecies communication. However, translating animal sounds into human-like language is challenging: patterns do not always equal meaning, datasets can be limited, and researchers must avoid overinterpretation. Ethical considerations — such as welfare, consent for research subjects, and how we act on any intelligence we detect — are also important.
Broader Context
Dolphins are not the only species being studied with AI. Projects such as DeepSqueak (rodent ultrasonic calls) and MeowTalk (cat vocalizations) apply machine-learning tools to animal sounds, showing wider scientific interest in decoding nonhuman communication. Google says it plans to make DolphinGemma openly available upon release, which could accelerate collaboration and independent validation.
What Comes Next
Work on DolphinGemma is ongoing. Early results will need peer review, replication on additional populations, and careful interpretation. Even if the project does not produce a full “translation” of dolphin language, it may reveal structure and signals that deepen our respect for — and ability to protect — these social, intelligent animals.
Related Animal Language Milestones: Koko the gorilla learned more than 1,000 signs based on American Sign Language and understood roughly 2,000 spoken English words; Alex the African grey parrot demonstrated conceptual understanding of shapes, colors and numbers; Chaser the border collie learned over 1,000 object names and showed basic grammar-like comprehension.