Google’s new AI wants to decode dolphin-speak, and it might actually work
Forget translating Spanish or French: Google’s next big language project might just be dolphin. No, seriously.
The company has just announced DolphinGemma, a foundational AI model built to understand and recreate the complex sounds dolphins make, and it isn’t gatekeeping it. Google says the model will be released as open source later this summer, so researchers and ocean nerds everywhere can join in the decoding mission.
Not just another AI flex: this one’s been four decades in the making
While this sounds straight out of a Pixar plot, DolphinGemma’s roots go deep. The AI has been trained on real-world data from the Wild Dolphin Project (WDP), a non-profit that has been doing underwater research on Atlantic spotted dolphins in the Bahamas since 1985. That’s nearly four decades of dolphin drama, recorded and analysed.
The WDP has been observing how dolphins communicate naturally, not in tanks or labs but out in the wild. By diving with them (literally), the team has worked out how certain sounds match up with specific behaviours. So when a dolphin clicks, whistles or lets out a burst pulse, there’s a good chance it means something, and now, thanks to AI, we might finally figure out what.
The tech that makes it tick
So how does DolphinGemma actually work? Google is using its SoundStream tokenizer to break dolphin sounds down into sequences of audio tokens the AI can learn from. A model trained on those complex sound sequences can then predict what comes next, and even generate entirely new dolphin-like sounds. Think of it as Google Translate, but for squeaks.
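To make that pipeline a little more concrete, here’s a toy Python sketch of the tokenize, predict, decode loop. Everything in it (the crude amplitude quantiser, the echo-with-noise “model”, the class names) is invented purely for illustration; it is not the real SoundStream codec or DolphinGemma, it only shows the shape of the data flow.

```python
# Toy illustration of the tokenize -> predict -> decode pipeline.
# All classes below are made-up stand-ins, NOT real SoundStream or DolphinGemma APIs.

import numpy as np

class ToyAudioTokenizer:
    """Crude stand-in for a neural audio codec: quantises samples into 256 discrete tokens."""
    def encode(self, waveform: np.ndarray) -> np.ndarray:
        # Map samples in [-1, 1] to integer token IDs in [0, 255].
        return np.clip(((waveform + 1.0) * 127.5).astype(int), 0, 255)

    def decode(self, tokens: np.ndarray) -> np.ndarray:
        # Map token IDs back to approximate waveform samples.
        return tokens.astype(float) / 127.5 - 1.0

class ToySequenceModel:
    """Stand-in for the sequence model: here it just echoes recent tokens with a little noise."""
    def generate(self, prompt_tokens: np.ndarray, n_new: int = 100) -> np.ndarray:
        rng = np.random.default_rng(0)
        tail = prompt_tokens[-n_new:]
        jitter = rng.integers(-3, 4, size=tail.shape)
        return np.clip(tail + jitter, 0, 255)

if __name__ == "__main__":
    waveform = np.sin(np.linspace(0, 40 * np.pi, 1600))   # fake "dolphin whistle"
    tokenizer = ToyAudioTokenizer()
    model = ToySequenceModel()

    prompt = tokenizer.encode(waveform)        # recording -> discrete tokens
    new_tokens = model.generate(prompt)        # "predict" what might come next
    synthetic = tokenizer.decode(new_tokens)   # tokens -> playable audio samples
    print(f"{len(prompt)} prompt tokens -> {len(synthetic)} generated samples")
```

The point of the token step is that once audio becomes a sequence of discrete symbols, the problem starts to look like language modelling: predict the next token, then turn the prediction back into sound.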
The model isn’t massive in AI terms — it has 400 million parameters — but it’s been optimised to run even on Pixel phones, which WDP field researchers use in the wild. Basically, this tech is small enough to go swimming.
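For a rough sense of why 400 million parameters can live on a phone, here’s some back-of-the-envelope arithmetic. The precision choices below are generic assumptions, not published details of how Google actually deploys the model:

```python
# Rough memory footprint of a 400M-parameter model at common precisions.
# These precisions are assumptions for illustration, not Google's deployment setup.
params = 400_000_000
for name, bytes_per_param in [("float32", 4), ("float16/bfloat16", 2), ("int8", 1)]:
    gb = params * bytes_per_param / 1e9
    print(f"{name:>16}: ~{gb:.1f} GB of weights")
# float32: ~1.6 GB, float16: ~0.8 GB, int8: ~0.4 GB -- small enough, once
# quantised, to fit comfortably in a modern phone's memory.
```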
Can we talk to dolphins now?
Not quite — but this is a huge step toward understanding one of the most complex animal communication systems on the planet. Dolphins don’t just make random noises — researchers believe their clicks, whistles and pulses carry meaning, possibly even structured like a language.
The problem? We’ve never had the right tools to decode it all. That’s where DolphinGemma steps in. With the help of AI, scientists can now study those sounds at scale — spotting patterns, linking them to behaviours, and maybe one day, cracking the code of inter-species communication.
It’s not just about chatting with dolphins for fun. Understanding how non-human animals communicate could reshape how we view intelligence and consciousness, and even how we build future AI systems that can better understand context, emotion, and nuance — not just in humans, but in other species too.
So no, we’re not doing real-time dolphin Duolingo just yet. But with DolphinGemma, we’ve officially entered our “talking to animals” era — powered by AI.