Φ-Net: Deep Residual Learning for InSAR Parameters Estimation / Sica, Francescopaolo; Gobbi, Giorgia; Rizzoli, Paola; Bruzzone, Lorenzo. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 0196-2892. - 59:5(2021), pp. 3917-3941. [10.1109/tgrs.2020.3020427]

Φ-Net: Deep Residual Learning for InSAR Parameters Estimation

Gobbi, Giorgia; Bruzzone, Lorenzo
2021-01-01

Abstract

Deep learning (DL) is now applied in a large number of scientific fields, among them the estimation and enhancement of signals corrupted by noise of different natures. In this article, we address the problem of estimating interferometric parameters from synthetic aperture radar (SAR) data. In particular, we combine convolutional neural networks with the concept of residual learning to define a novel architecture, named Φ-Net, for the joint estimation of the interferometric phase and coherence. Φ-Net is trained on synthetic data obtained by an innovative strategy based on theoretical modeling of the physics behind the SAR acquisition principle. This strategy allows the network to generalize the estimation problem with respect to: 1) different noise levels; 2) the nature of the imaged target on the ground; and 3) the acquisition geometry. We then analyze the performance of Φ-Net on an independent data set of synthesized interferometric data, as well as on real InSAR data from the TanDEM-X and Sentinel-1 missions. The proposed architecture provides better results than state-of-the-art InSAR algorithms on both synthetic and real test data. Finally, we perform an application-oriented study on the retrieval of topographic information, which shows that Φ-Net is a strong candidate for the generation of high-quality digital elevation models at a resolution close to that of the original single-look complex data.
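As context for the training strategy the abstract describes, the snippet below is a minimal sketch of the standard statistical model commonly used to synthesize single-look InSAR data: given a ground-truth phase and coherence map, two jointly circular Gaussian single-look complex (SLC) images are drawn so that their complex correlation equals the prescribed coherence times a phase term. This illustrates only the underlying physics-based noise model, not the authors' actual simulation pipeline; the function name, fringe pattern, and coherence ramp are hypothetical.

```python
import numpy as np

def simulate_insar_pair(true_phase, coherence, rng):
    """Draw a single-look SLC pair (s1, s2) whose complex correlation
    equals coherence * exp(1j * true_phase), following the standard
    jointly circular Gaussian model of interferometric SAR data.
    (Hypothetical helper for illustration, not the paper's code.)"""
    shape = np.shape(true_phase)
    # Independent, unit-variance circular complex Gaussian speckle.
    c1 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    c2 = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    s1 = c1
    # Mixing c1 into s2 with weight `coherence` gives
    # E[s1 * conj(s2)] = coherence * exp(1j * true_phase).
    s2 = (coherence * np.exp(-1j * true_phase) * c1
          + np.sqrt(1.0 - coherence**2) * c2)
    return s1, s2

# Example: wrapped linear fringes, coherence increasing left to right.
rows, cols = 256, 256
y, x = np.mgrid[0:rows, 0:cols]
true_phase = np.angle(np.exp(1j * (0.08 * x + 0.03 * y)))  # ground-truth phase
coherence = np.clip(0.2 + 0.7 * x / cols, 0.0, 0.99)       # ground-truth coherence

rng = np.random.default_rng(seed=0)
s1, s2 = simulate_insar_pair(true_phase, coherence, rng)
interferogram = s1 * np.conj(s2)
noisy_phase = np.angle(interferogram)  # network input; (true_phase, coherence) are the targets
```

In a residual-learning setup such as the one the abstract describes, a convolutional network would be trained on pairs like (`noisy_phase`, `true_phase`, `coherence`), learning to remove the noise component of the interferogram rather than to reproduce the clean signal directly.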
Year: 2021
Issue: 5
Authors: Sica, Francescopaolo; Gobbi, Giorgia; Rizzoli, Paola; Bruzzone, Lorenzo
Files in this record:
  • File: Net_ Deep Residual Learning for InSAR Parameters Estimation.pdf
  • Access: open access
  • Type: publisher's version (publisher's layout)
  • License: Creative Commons
  • Size: 10.05 MB
  • Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/401507
Citations
  • PMC: ND
  • Scopus: 34
  • Web of Science: 30