Multi-sensor land cover classification with sparsely annotated data based on Convolutional Neural Networks and Self-Distillation

Gbodjo Jean Eudes, Montet Didier, Ienco Dino, Gaetano Raffaele, Dupuy Stéphane. 2021. Multi-sensor land cover classification with sparsely annotated data based on Convolutional Neural Networks and Self-Distillation. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 15 p.

Journal article; Research article; Journal article with impact factor
Published version - English
Under a Creative Commons license.
Multi-sensor_land_cover_classification_with_sparsely_annotated_data_based_on_Convolutional_Neural_Networks_and_Self-Distillation.pdf (16 MB)

Dataset URL (Dataverse Cirad): https://doi.org/10.18167/DVN1/TOARDN

Quartile: Q1, Subject: GEOGRAPHY, PHYSICAL / Quartile: Q2, Subject: ENGINEERING, ELECTRICAL & ELECTRONIC / Quartile: Q2, Subject: IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY / Quartile: Q2, Subject: REMOTE SENSING

Abstract: Extensive research studies have been conducted in recent years to exploit the complementarity among multisensor (or multimodal) remote sensing data for prominent applications such as land cover mapping. To go a step further than previous studies, which investigate multitemporal SAR and optical data or multitemporal/multiscale optical combinations, here we propose a deep learning framework that simultaneously integrates all these input sources, specifically multitemporal SAR/optical data and fine-scale optical information at their native temporal and spatial resolutions. Our proposal relies on a patch-based multibranch convolutional neural network (CNN) that exploits different per-source encoders to deal with the specificity of the input signals. In addition, we introduce a new self-distillation strategy to boost the per-source analyses and exploit the interplay among the different input sources. This strategy leverages the final prediction of the multisource framework to guide the learning of the per-source CNN encoders, allowing the network to learn from itself. Experiments are carried out on two real-world benchmarks, namely, the Reunion island (a French overseas department) and the Dordogne study site (a southwestern department of France), where the annotated reference data were collected under operational constraints (sparsely annotated ground-truth data). The obtained results, with an overall classification accuracy of about 94% (respectively, 88%) on the Reunion island (respectively, the Dordogne) study site, highlight the effectiveness of our framework based on CNNs and self-distillation to combine heterogeneous multisensor remote sensing data and confirm the benefit of multimodal analysis for downstream tasks such as land cover mapping.
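
To make the general pattern described in the abstract concrete, the PyTorch sketch below illustrates a multi-branch network with one encoder per input source, per-source auxiliary classifiers, a fused multi-source classifier, and a self-distillation term in which the fused prediction serves as a soft target guiding each per-source head. All module names, layer sizes, the use of 1-D convolutions (the paper's framework is patch-based and also handles fine-scale 2-D spatial input), and the KL-divergence formulation of the distillation term are illustrative assumptions, not the authors' exact implementation.

# Minimal sketch (assumptions, not the authors' architecture): one encoder per
# source, per-source auxiliary heads, a fused head, and a self-distillation loss
# where the fused prediction acts as a soft target for each per-source head.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SourceEncoder(nn.Module):
    """Small 1-D CNN encoder for one time-series source (e.g. SAR or optical)."""

    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, feat_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global temporal pooling
        )

    def forward(self, x):  # x: (batch, channels, time)
        return self.net(x).squeeze(-1)  # (batch, feat_dim)


class MultiSourceSelfDistillNet(nn.Module):
    def __init__(self, in_channels_per_source, n_classes: int, feat_dim: int = 128):
        super().__init__()
        self.encoders = nn.ModuleList(
            [SourceEncoder(c, feat_dim) for c in in_channels_per_source]
        )
        # one auxiliary classifier per source, plus one classifier on fused features
        self.aux_heads = nn.ModuleList(
            [nn.Linear(feat_dim, n_classes) for _ in in_channels_per_source]
        )
        self.fused_head = nn.Linear(feat_dim * len(in_channels_per_source), n_classes)

    def forward(self, sources):
        feats = [enc(x) for enc, x in zip(self.encoders, sources)]
        aux_logits = [head(f) for head, f in zip(self.aux_heads, feats)]
        fused_logits = self.fused_head(torch.cat(feats, dim=1))
        return fused_logits, aux_logits


def self_distillation_loss(fused_logits, aux_logits, labels, alpha=0.5, tau=2.0):
    """Cross-entropy on all heads + KL from the fused prediction to each per-source head."""
    loss = F.cross_entropy(fused_logits, labels)
    soft_target = F.softmax(fused_logits.detach() / tau, dim=1)
    for logits in aux_logits:
        loss = loss + F.cross_entropy(logits, labels)
        loss = loss + alpha * tau ** 2 * F.kl_div(
            F.log_softmax(logits / tau, dim=1), soft_target, reduction="batchmean"
        )
    return loss


# Hypothetical usage with two sources (e.g. SAR and optical time series):
# net = MultiSourceSelfDistillNet(in_channels_per_source=[2, 10], n_classes=11)
# fused, aux = net([sar_batch, optical_batch])
# loss = self_distillation_loss(fused, aux, labels)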

Agrovoc keywords: remote sensing, neural networks, land use mapping, land cover mapping, remote sensing equipment, sensors

Agrovoc geographic keywords: France, La Réunion

Additional keywords: Convolutional neural networks (CNNs), Multi-temporal and multi-scale remote sensing, Land use and land cover mapping

Agris classification: U30 - Research methods
P01 - Nature conservation and land resources

Cirad strategic field: CTS 5 (2019-) - Territories

Authors and affiliations

  • Gbodjo Jean Eudes, IRSTEA (FRA)
  • Montet Didier, CIRAD-PERSYST-UMR Qualisud (FRA)
  • Ienco Dino, INRAE (FRA)
  • Gaetano Raffaele, CIRAD-ES-UMR TETIS (FRA)
  • Dupuy Stéphane, CIRAD-ES-UMR TETIS (FRA) ORCID: 0000-0002-9710-5364

Source: Cirad-Agritrop (https://agritrop.cirad.fr/599530/)
