Somewhere in the North Atlantic, more than a kilometre beneath its surface, a cold-water coral reef stretches across an unnamed seamount. Despite never appearing on a chart, this underwater forest has existed for centuries, growing a centimetre or two each year.
The reef is a home and feeding ground for dozens of species that depend on it the way a woodland creature depends on trees. It has survived ice ages – but whether it will survive increasing pressures from industrial fishing, deep-sea mining and climate change is, in part, a question about data. If we don’t know it exists, how can we protect it?
A new project called Deep Vision could fundamentally transform our understanding of the deep ocean by digging into pictures and videos that have sat largely unexamined in research archives around the world. Using AI, the project will analyse thousands of hours of seafloor footage to produce the first comprehensive maps of vulnerable marine ecosystems across the entire Atlantic basin.
Over the past two decades, robotic and autonomous underwater vehicles have collected vast quantities of footage from the deep sea. This represents an extraordinary resource – a record of ecosystems that most humans will never see.
The difficulty is that less than half of this imagery has ever been analysed. A single dive can take a trained human analyst two months to process. Multiply that by thousands of dives and you begin to appreciate why this treasure trove of information has remained largely locked away.
The solution, I am convinced, is artificial intelligence.
In research published in 2022, my colleagues and I showed that AI could be trained to successfully analyse over 58,000 deep-sea images in under ten days. The AI model helped us map the distribution of a fragile xenophyophore – a giant single-celled organism that is a recognised indicator of vulnerable marine ecosystems – at a depth of 1,200 metres in the north-east Atlantic. What would have taken a human analyst many months was accomplished in days.
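The workflow behind this kind of analysis is, at heart, batch inference: a trained model scores every archived frame, and frames above a confidence threshold become biodiversity observations. A minimal sketch of that pattern is below, with a stub classifier standing in for a trained deep-learning model; the labels, filenames and threshold are illustrative assumptions, not the project's actual model.

```python
# Batch-inference pattern for archived seafloor imagery. The `classify`
# stub stands in for a trained model; a real system would return
# per-label confidence scores from a neural network.
LABELS = ["xenophyophore", "coral", "sponge", "background"]

def classify(frame):
    """Stub: deterministic pseudo-scores in [0, 1) for each label."""
    return {label: (hash((frame, label)) % 100) / 100 for label in LABELS}

def analyse_archive(frames, threshold=0.5):
    """Scan every frame and keep detections above the score threshold."""
    detections = []
    for frame in frames:
        scores = classify(frame)
        for label, score in scores.items():
            if label != "background" and score >= threshold:
                detections.append((frame, label, round(score, 2)))
    return detections

# Hypothetical dive archive: 1,000 frames from a single dive.
hits = analyse_archive([f"dive_frame{i:05d}.jpg" for i in range(1000)])
```

The speed-up reported above comes from exactly this loop: once trained, the model processes frames far faster than a human analyst, and every detection carries a score that can be audited later.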
AI also provides consistency. Human analysts, however expert, do not always agree with one another. Indeed, they do not always agree with themselves: a researcher identifying marine species may classify specimens differently at different times. A machine makes errors but it makes them consistently, which means these errors can be identified, corrected and accounted for.
Forests of the deep
Deep Vision is focusing specifically on what we call vulnerable marine ecosystem indicator taxa, such as deep-sea corals and sponges.
These are the organisms I think of as the forests of the deep. In an environment where there are no plants to provide habitats, these animals fulfil this role. They are keystone organisms in the most literal sense: remove them and the ecosystem collapses.
Once AI has extracted biodiversity observations from the imagery, the next stage is to build habitat-suitability models – predictive maps that extend our understanding beyond the specific locations where cameras have surveyed.
Our research shows that high-resolution habitat suitability models are a useful tool in spatial management, capable of informing decisions about where marine-protected areas should be located. However, the quality of the underlying seafloor data remains critical to how well they perform.
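In essence, a habitat-suitability model relates the presence of an organism to environmental conditions, then predicts a suitability score for places no camera has visited. The sketch below shows the simplest version of that idea, a logistic function of a few seafloor covariates; the coefficients and variables here are invented for illustration, not fitted values from our research.

```python
import math

# Toy habitat-suitability model: a logistic function of environmental
# covariates. Coefficients are illustrative assumptions, not fitted
# values from any real survey.
COEFS = {"depth_km": -1.2, "slope_deg": 0.8, "temp_c": -0.5}
INTERCEPT = 1.0

def suitability(cell):
    """Suitability score in (0, 1) for one grid cell of seafloor."""
    z = INTERCEPT + sum(COEFS[k] * cell[k] for k in COEFS)
    return 1.0 / (1.0 + math.exp(-z))

# Two hypothetical grid cells never surveyed by cameras: a sloped
# seamount flank versus a flat abyssal plain.
grid = [
    {"depth_km": 1.2, "slope_deg": 4.0, "temp_c": 4.0},
    {"depth_km": 3.0, "slope_deg": 0.5, "temp_c": 2.0},
]
scores = [suitability(cell) for cell in grid]
```

Real models are fitted to the camera-derived observations and to much richer seafloor data, which is why the quality of that underlying data is so critical to how well the predictive maps perform.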
As a marine biologist, I sometimes get asked why people should care about a sponge living two kilometres beneath the surface of the Atlantic. It is a fair question, and the answer is more immediate than most people expect. These animals recycle essential nutrients and play a key role in the carbon cycle, and that affects us all.
The ocean is the engine room of a planetary life-support system, and effective management of it relies on having the best possible understanding of the species and ecosystems within it.
If this project succeeds in the Atlantic, the methods could be replicated in other ocean basins. The Pacific, the Indian Ocean and the Southern Ocean all present the same challenges of insufficient data and vast unexplored territory.

