A learning approach for river debris detection

Àlex Solé Gómez, Leonardo Scandolo, Elmar Eisemann

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Plastic pollution in the sea is an environmental hazard that negatively impacts marine life and causes economic damage all over the world. It is estimated that each year 8 million tonnes of plastic are deposited in seas, the vast majority arriving via rivers. In recent years, publicly available satellite imagery has been used to attempt to track floating plastic debris using specialized hand-crafted features. In this work, we present an automatic learning approach based on satellite imagery that can detect floating plastic debris in rivers with high precision. This approach is based on well-proven image segmentation architectures, U-Net (Ronneberger et al., 2015) and DeepLabV3+ (Chen et al., 2018), which we adapt to process high-dimensional multispectral images. To train and test the approach, we also present a dataset of images from different rivers around the world containing floating plastic debris, which is a key step toward creating an automated learning solution. We test the predictive accuracy of our network, showing that our approach can correctly identify floating debris in images from regions not seen in the training set. Our results also show that a more extensive labeled dataset is necessary to generalize the approach to some types of rivers. Furthermore, we demonstrate how our solution can be used to monitor single areas over time to understand and predict floating debris accumulation.
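The abstract notes that U-Net and DeepLabV3+ are adapted to process high-dimensional multispectral images. The paper itself specifies how; as a hypothetical illustration only, the sketch below shows one common adaptation strategy for such architectures: per-band standardization of a multiband satellite patch, followed by a learned 1×1 channel projection from the spectral bands down to the channel count an RGB backbone expects. The band count (13, Sentinel-2-like), patch size, and random weights are illustrative assumptions, not the authors' method; numpy stands in for a deep-learning framework.

```python
import numpy as np

# Illustrative assumption: a 13-band multispectral patch (Sentinel-2-like),
# shaped (bands, height, width). Standard RGB segmentation front ends
# expect 3 input channels, so the channel count must be adapted.
NUM_BANDS = 13
PATCH = 64

rng = np.random.default_rng(0)
patch = rng.random((NUM_BANDS, PATCH, PATCH)).astype(np.float32)

# Per-band standardization: each spectral band is normalized independently,
# since reflectance ranges differ strongly across bands.
mean = patch.mean(axis=(1, 2), keepdims=True)
std = patch.std(axis=(1, 2), keepdims=True)
normed = (patch - mean) / (std + 1e-8)

# A learned 1x1 convolution mapping the 13 bands to 3 channels is one
# common way to reuse an RGB-pretrained backbone; here the weights are
# random placeholders for what training would learn.
w = rng.standard_normal((3, NUM_BANDS)).astype(np.float32) * 0.1
projected = np.einsum('ob,bhw->ohw', w, normed)

print(projected.shape)  # (3, 64, 64)
```

An alternative adaptation, also widely used, is to widen the network's first convolution to accept all bands directly rather than projecting them first; which variant the paper uses is detailed in the full text.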

Original language: English
Article number: 102682
Pages (from-to): 1-10
Number of pages: 10
Journal: International Journal of Applied Earth Observation and Geoinformation
Volume: 107
DOIs
Publication status: Published - 2022

Keywords

  • Hyperspectral Imaging
  • Machine learning
  • Segmentation
