Reconstructing Cloud-Contaminated Multispectral Images with Autoencoder Neural Networks

S. Malek; F. Melgani; Y. Bazi; N. Alajlan
2017-01-01

Abstract

The accurate reconstruction of areas obscured by clouds is among the most challenging problems in remote sensing, since a significant percentage of the images archived worldwide are affected by cloud cover that limits their exploitability. This paper proposes new methods to recover data missing from multispectral images because of clouds, relying on a formulation based on an autoencoder (AE) neural network. We assume that clouds are opaque and that their detection is performed by dedicated algorithms. The AE models the relationship between a given cloud-free image (source image) and a cloud-contaminated image (target image). In particular, two strategies are developed: the first performs the mapping at the pixel level, while the second operates at the patch level to exploit spatial contextual information. Moreover, to address the problem of choosing the hidden-layer size, a new solution combining the minimum description length (MDL) criterion and a Pareto-like selection procedure is introduced. The results of experiments conducted on three different data sets are reported and discussed, together with a comparison against reference techniques.
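
As a rough illustration of the pixel-level strategy summarized above, the following sketch trains a small single-hidden-layer network to map the spectral vector of each cloud-free pixel of the source image onto the corresponding pixel of the target image, and then applies it to the pixels flagged as cloudy. This is not the authors' implementation: the function names, the synthetic data, the training settings, and the fixed hidden-layer width of 16 are illustrative assumptions (the paper instead selects that size with the minimum description length criterion combined with a Pareto-like procedure).

import numpy as np

rng = np.random.default_rng(0)

def train_pixel_mapping(X_src, X_tgt, n_hidden=16, lr=0.05, epochs=2000):
    # Fit a single-hidden-layer regression network so that
    # sigmoid(X_src @ W1 + b1) @ W2 + b2 approximates X_tgt.
    # X_src, X_tgt: (n_pixels, n_bands) arrays of co-registered values,
    # restricted to pixels that are cloud-free in both acquisitions.
    n_in, n_out = X_src.shape[1], X_tgt.shape[1]
    W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
    b2 = np.zeros(n_out)
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-(X_src @ W1 + b1)))   # hidden activations
        Y = H @ W2 + b2                                # predicted target pixels
        E = Y - X_tgt                                  # prediction error
        # Gradient descent on the mean squared reconstruction error.
        dW2 = H.T @ E / len(X_src)
        db2 = E.mean(axis=0)
        dH = (E @ W2.T) * H * (1.0 - H)
        dW1 = X_src.T @ dH / len(X_src)
        db1 = dH.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict(X_src, params):
    W1, b1, W2, b2 = params
    H = 1.0 / (1.0 + np.exp(-(X_src @ W1 + b1)))
    return H @ W2 + b2

# Toy usage with synthetic data standing in for two co-registered images
# (n_pixels x n_bands); cloud_mask marks the target pixels to reconstruct.
n_pixels, n_bands = 5000, 4
source = rng.random((n_pixels, n_bands))
target = np.tanh(source @ rng.normal(size=(n_bands, n_bands)))
cloud_mask = rng.random(n_pixels) < 0.2
params = train_pixel_mapping(source[~cloud_mask], target[~cloud_mask])
target[cloud_mask] = predict(source[cloud_mask], params)

The patch-level variant described in the abstract would replace the per-pixel spectral vectors with flattened neighborhoods of pixels, so that spatial context enters the mapping.
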
Reconstructing Cloud-Contaminated Multispectral Images with Autoencoder Neural Networks / Malek, S.; Melgani, F.; Bazi, Y.; Alajlan, N. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 1558-0644. - 2017:(2017).
Files associated with this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/202415