Google has developed a new AI tool to help marine biologists better understand coral reef ecosystems and their health, which can aid in conservation efforts. The tool, SurfPerch, created with Google Research and DeepMind, was trained on thousands of hours of reef audio recordings that let scientists studying a reef “hear reef health from the inside,” track reef activity at night, and monitor reefs in deep or murky waters.
The project began by inviting the public to listen to reef sounds via the web. Over the past year, visitors to Google’s Calling in Our Corals website listened to over 400 hours of reef audio from sites around the world and were asked to click when they heard a fish sound. This crowdsourcing produced a “bioacoustic” dataset focused on reef health, giving Google a library of new fish sounds that was used to fine-tune the AI tool, SurfPerch. Now, SurfPerch can be quickly trained to detect any new reef sound.
![](https://techcrunch.com/wp-content/uploads/2024/06/calling-in-our-corals.png?w=680)
“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation of these,” notes a Google blog post about the project. The post was co-authored by Steve Simpson, a professor of marine biology at the University of Bristol in the U.K., and Ben Williams, a marine biologist at University College London, both of whom study coral ecosystems with a focus on areas like climate change and restoration.
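The GPU-free efficiency the researchers describe typically comes from keeping a large pretrained audio model frozen and training only a small classifier on top of its embeddings. The sketch below illustrates that general pattern, not SurfPerch's actual code: the embeddings and labels here are synthetic stand-ins, and the dimensions are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are embedding vectors produced once by a frozen,
# pretrained audio model: 200 reef clips, 128 dims each, with
# crowdsourced labels (1 = fish sound heard, 0 = not).
n, dim = 200, 128
X = rng.normal(size=(n, dim))
y = rng.integers(0, 2, size=n).astype(float)
X[y == 1] += 0.8  # inject a weak synthetic class signal

# Train a tiny logistic-regression "head" with plain gradient
# descent -- cheap enough for a laptop CPU, no GPU fine-tuning.
w, b = np.zeros(dim), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    grad_w = X.T @ (p - y) / n              # cross-entropy gradient
    grad_b = (p - y).mean()
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

preds = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
acc = (preds == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

Because only the small head is trained, retargeting the system to a new reef sound means relabeling a handful of clips and rerunning a loop like this, rather than retraining the underlying network.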
What’s more, the researchers found they could boost SurfPerch’s model performance by leveraging bird recordings. Although birdsong and reef recordings are very different, they discovered common patterns between bird songs and fish sounds that the model was able to learn from.
After combining the Calling in Our Corals data with SurfPerch in initial trials, researchers were able to uncover differences between protected and unprotected reefs in the Philippines, track restoration outcomes in Indonesia, and better understand relationships with the fish community on the Great Barrier Reef.
The project continues today, as new audio is added to the Calling in Our Corals website, which will help to further train the AI model, Google says.