Google Develops New AI Tool to Help Save Coral Reefs

General

Google has developed a new AI tool to help marine biologists better understand coral reef ecosystems and their health, which can aid in conservation efforts.

The tool, SurfPerch, created with Google Research and DeepMind, was trained on thousands of hours of reef audio recordings, allowing scientists to “hear reef health from the inside,” track reef activity at night, and monitor reefs in deep or murky waters.

The project began by inviting the public to listen to reef sounds via the web. Over the past year, visitors to Google’s Calling in our Corals website listened to over 400 hours of reef audio from sites around the world and were asked to click whenever they heard a fish sound. This produced a “bioacoustic” dataset focused on reef health. By crowdsourcing this activity, Google was able to build a library of new fish sounds that was used to fine-tune the AI tool, SurfPerch. Now, SurfPerch can be quickly trained to detect any new reef sound.

“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation of these,” notes a Google blog post about the project. The post was co-authored by Steve Simpson, a professor of Marine Biology at the University of Bristol in the U.K., and Ben Williams, a marine biologist at University College London, both of whom study coral ecosystems with a focus on areas like climate change and restoration.
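The efficiency gain described above comes from a common pattern in bioacoustics: instead of retraining a large audio model, researchers keep the model frozen and fit only a small classifier on the embeddings it produces, which a laptop CPU can handle. The sketch below is a hypothetical illustration of that pattern using synthetic data; the variable names, embedding size, and training setup are assumptions for demonstration and are not taken from SurfPerch's actual code.

```python
# Hypothetical sketch: fitting a lightweight "fish sound" classifier on fixed
# audio embeddings, the kind of CPU-only training the blog post describes.
# All dimensions and data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for embeddings from a frozen audio model: 200 clips, 128 dims each.
embeddings = rng.normal(size=(200, 128))

# Synthetic labels (1 = fish sound, 0 = background), made linearly separable
# so the toy example can actually be learned.
true_w = rng.normal(size=128)
labels = (embeddings @ true_w > 0).astype(float)

# Logistic regression trained by plain gradient descent -- no GPU required.
w = np.zeros(128)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(embeddings @ w)))   # predicted probabilities
    w -= 0.1 * embeddings.T @ (p - labels) / len(labels)

accuracy = ((embeddings @ w > 0) == labels.astype(bool)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because only the small weight vector is trained while the heavy embedding model stays fixed, adding a detector for a new reef sound amounts to repeating this cheap fit on newly labeled clips.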

What’s more, the researchers found they could boost SurfPerch’s model performance by leveraging bird recordings.

Source: Qatar News Agency