AI fish counter monitors reef’s regeneration progress
A joint project from Accenture, Intel and the Sulubaaï Environmental Foundation is using AI to monitor the resiliency of coral reefs in the Philippines.
Project: CORaiL uses hardware from Intel in an Accenture-designed AI-powered underwater camera system to monitor marine life around the reef, a key indicator of reef health. In the pilot phase, a prototype camera was deployed on the reef at Pangatalan Island. Since deployment in May 2019, it has taken more than 40,000 images.
Coral reefs around the world are rapidly perishing due to a combination of overfishing, bottom trawling and warming ocean temperatures. These reefs are an ecosystem for 25% of the planet’s marine life, and without them, miles of coastline become vulnerable to tropical storms. They also provide food and income for 1 billion people and generate $9.6 billion in tourism and recreation, according to figures provided by Intel.
The first part of Project: CORaiL saw the installation of a reef prosthesis developed by Sulubaaï: an underwater concrete structure that incorporates fragments of living coral to encourage regrowth.
The underwater system uses AI, specifically a convolutional neural network (CNN), to classify and count fish in images taken automatically by the camera. This information feeds into analysis of the abundance and diversity of marine life on the reef, tracking the reef's progress as it regenerates. Such monitoring previously relied on human divers capturing video footage for later analysis, but divers can typically only film for 30 minutes at a time. A permanent in-situ camera gives researchers 24/7 access to real-time data via a 4G wireless connection.
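The project has not published its analysis code; as a rough illustration of the abundance-and-diversity step, per-frame classifier outputs could be aggregated like this (the species names and the choice of the Shannon diversity index are illustrative assumptions, not details from the project):

```python
from collections import Counter
from math import log

def summarise_detections(frames):
    """Aggregate per-frame species labels into abundance and diversity.

    `frames` is a list of lists: the species labels a classifier
    assigned to the fish detected in each image.
    """
    abundance = Counter(label for frame in frames for label in frame)
    total = sum(abundance.values())
    # Shannon diversity index: higher means a richer, more even community.
    shannon = -sum((n / total) * log(n / total) for n in abundance.values())
    return abundance, shannon

# Illustrative detections from three frames (species labels are made up).
frames = [
    ["damselfish", "damselfish", "parrotfish"],
    ["damselfish", "wrasse"],
    ["parrotfish", "damselfish"],
]
abundance, shannon = summarise_detections(frames)
```

Running a summary like this over weeks of images is what turns raw fish counts into a trend line for reef health.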
In the pilot stage of the project, a single camera is moved once a week to capture different angles and locations. The camera system is equipped with the Accenture Applied Intelligence Video Analytics Services Platform (VASP), which uses an Intel Xeon CPU, Intel FPGA and Intel Movidius vision processing unit (VPU).
“The Movidius VPU was used to accelerate the CNN to identify marine life under water. The centralized server with Xeon CPU and Intel FPGA was used to do the heavy-lifting for classification of species,” said Ewen Plougastel, managing director and ASEAN delivery lead at Accenture Applied Intelligence. “[The system] at the edge [can process] about 2 frames per second from the camera. However, the central server is capable of processing the input from tens of cameras simultaneously.”
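Those throughput figures imply the edge device cannot process every frame a camera produces, so frames presumably have to be subsampled before inference. A minimal sketch of that kind of throttling (the 30 fps camera rate below is an assumption; only the ~2 fps edge figure comes from the quote):

```python
def select_frames(camera_fps: float, budget_fps: float):
    """Pick which frame indices to process each second, spacing them
    evenly so a slow edge device still samples the whole second."""
    stride = max(1, round(camera_fps / budget_fps))
    return list(range(0, int(camera_fps), stride))

# A hypothetical 30 fps camera throttled to the ~2 fps the edge can handle:
indices = select_frames(camera_fps=30, budget_fps=2)  # every 15th frame
```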
Count and Classify
Custom CNNs carry out motion detection, classification and counting of marine life.
“Our data gathering and model training was custom-developed in two steps. First, a classic motion detector computer vision [network] to detect movement, and second, to clean and annotate the dataset to train a CNN model,” Plougastel said.
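Plougastel does not detail the motion detector, but a classic computer-vision approach of the kind he describes is simple frame differencing, sketched below (the thresholds are illustrative, and the gating logic stands in for the project's actual models):

```python
import numpy as np

def moving_pixels(prev, curr, threshold=25):
    """Classic frame differencing: flag pixels whose grayscale
    intensity changed by more than `threshold` between frames."""
    return np.abs(curr.astype(int) - prev.astype(int)) > threshold

def has_motion(prev, curr, min_pixels=10):
    """Stage 1 gate: only pass a frame on to the CNN classifier
    (stage 2) if enough pixels changed to suggest something swam past."""
    return int(moving_pixels(prev, curr).sum()) >= min_pixels

# Two tiny synthetic 8x8 'frames': a bright 4x4 blob appears in the second.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:6, 2:6] = 200
```

Gating the CNN this way keeps the expensive classifier from running on thousands of empty-water frames.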
The next generation of the camera system will include a further optimised CNN and a backup power supply. The partners are also considering an infra-red camera to better capture nocturnal marine life.
“The same algorithm will be used and adapted to the infra-red camera,” Plougastel said. “The advantage of the infra-red camera is that it detects the size/mass of the marine life, which is an important input that scientists need.”
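The article does not explain how size would be derived from the imagery. One common approach, if the lens focal length and the distance to the fish are known, is the pinhole projection model; every number below is an illustrative assumption, not a project specification:

```python
def estimate_length_cm(pixel_length: float, distance_cm: float,
                       focal_length_px: float) -> float:
    """Pinhole-camera estimate: real size = image extent * distance / focal length.

    `pixel_length` is the fish's extent in the image in pixels,
    `distance_cm` its distance from the camera, and `focal_length_px`
    the lens focal length expressed in pixels.
    """
    return pixel_length * distance_cm / focal_length_px

# A hypothetical fish spanning 120 px at 150 cm, with an 800 px focal length:
length = estimate_length_cm(120, 150, 800)  # 22.5 cm
```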
Future uses for the technology could include studying the migration rates of tropical fish and monitoring intrusion in protected underwater areas, such as reefs.