Google has developed a new AI tool to help marine biologists better understand coral reef ecosystems and their health, which can aid in conservation efforts. The tool, SurfPerch, created with Google Research and DeepMind, was trained on thousands of hours of reef audio recordings that allow scientists to “hear reef health from the inside,” track reef activity at night, and monitor reefs in deep or murky waters.
The project began by inviting the public to listen to reef sounds via the web. Over the past year, visitors to Google’s Calling in Our Corals website listened to over 400 hours of reef audio from sites around the world and were asked to click when they heard a fish sound. This resulted in a “bioacoustic” data set focused on reef health. By crowdsourcing this activity, Google was able to create a library of new fish sounds that were used to fine-tune the AI tool, SurfPerch. Now, SurfPerch can be quickly trained to detect any new reef sound.

“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation of these,” notes a Google blog post about the project. The post was co-authored by Steve Simpson, a professor of marine biology at the University of Bristol in the U.K., and Ben Williams, a marine biologist at University College London, both of whom study coral ecosystems with a focus on areas like climate change and restoration.
What’s more, the researchers realized they were able to boost SurfPerch’s model performance by leveraging bird recordings. Although bird sounds and reef recordings are very different, there were common patterns between bird songs and fish sounds that the model was able to learn from, they found.
After combining the Calling in Our Corals data with SurfPerch in initial trials, researchers were able to uncover differences between protected and unprotected reefs in the Philippines, track restoration outcomes in Indonesia, and better understand relationships within the fish community on the Great Barrier Reef.
The project continues today, as new audio is added to the Calling in Our Corals website, which will help to further train the AI model, Google says.