A Deep Learning Framework for the Estimation of Forest Height from Bistatic TanDEM-X Data / Carcereri, Daniel; Rizzoli, Paola; Ienco, Dino; Bruzzone, Lorenzo. - In: IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING. - ISSN 1939-1404. - 16:(2023), pp. 8334-8352. [10.1109/JSTARS.2023.3310209]
A Deep Learning Framework for the Estimation of Forest Height from Bistatic TanDEM-X Data
Carcereri, Daniel (first author); Bruzzone, Lorenzo
2023
Abstract
Up-to-date canopy height model (CHM) estimates are of key importance for forest resources monitoring and disturbance analysis. In this work, we present a study on the potential of deep learning (DL) for the regression of forest height from TanDEM-X bistatic interferometric SAR (InSAR) data. We propose a novel fully convolutional neural network (CNN) framework, trained in a supervised manner using reference CHM measurements derived from NASA's airborne LVIS LiDAR sensor. The reference measurements were acquired during the joint NASA-ESA 2016 AfriSAR campaign over five sites in Gabon, Africa, characterized by different kinds of vegetation, ranging from tropical primary forests to mangroves. Together with the DL architecture and training strategy, we present a series of experiments to assess the impact of different input features on the network's estimation accuracy (in particular of the bistatic InSAR-related ones). When tested on all considered sites, the proposed DL model achieves an overall performance of 1.46 m mean error, 4.2 m mean absolute error, and 15.06% mean absolute percentage error. Furthermore, we perform a spatial transfer analysis aimed at deriving preliminary insights into the generalization capability of the network when trained and tested on data sets acquired over different locations, combining different kinds of tropical vegetation. The obtained results are promising and already in line with state-of-the-art methods based on both physics-based modelling and data-driven approaches, with the remarkable advantage of requiring only a single TanDEM-X acquisition at inference time.
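The abstract names the building blocks of the method (a fully convolutional CNN regressing per-pixel canopy height from TanDEM-X bistatic InSAR input features, trained in a supervised manner against LiDAR-derived CHM references, and evaluated with mean error, mean absolute error, and mean absolute percentage error) without detailing them; the full specification is given in the paper. The minimal PyTorch sketch below is therefore only illustrative: the layer layout, number of input channels, loss, and hyper-parameters are assumptions rather than the authors' configuration, while the three metric definitions follow their standard formulas.

# Illustrative sketch only: architecture, feature set, and training details
# are hypothetical stand-ins for those specified in the paper.
import torch
import torch.nn as nn

class FullyConvCHMRegressor(nn.Module):
    """Minimal fully convolutional regressor mapping a stack of
    TanDEM-X InSAR-derived feature channels to per-pixel canopy height [m]."""
    def __init__(self, in_channels: int = 4, base: int = 32):
        super().__init__()
        def block(cin, cout):
            return nn.Sequential(
                nn.Conv2d(cin, cout, kernel_size=3, padding=1),
                nn.BatchNorm2d(cout),
                nn.ReLU(inplace=True),
            )
        self.body = nn.Sequential(
            block(in_channels, base),
            block(base, base * 2),
            block(base * 2, base * 2),
            block(base * 2, base),
        )
        # 1x1 convolution to a single CHM channel; ReLU keeps heights non-negative.
        self.head = nn.Sequential(nn.Conv2d(base, 1, kernel_size=1), nn.ReLU(inplace=True))

    def forward(self, x):
        return self.head(self.body(x))

def chm_metrics(pred, ref, min_height=1.0):
    """Mean error (ME), mean absolute error (MAE), and mean absolute
    percentage error (MAPE) between predicted and reference CHM [m]."""
    diff = pred - ref
    me = diff.mean()
    mae = diff.abs().mean()
    # MAPE only over pixels with non-negligible reference height, to avoid division by ~0.
    valid = ref > min_height
    mape = 100.0 * (diff[valid].abs() / ref[valid]).mean()
    return me.item(), mae.item(), mape.item()

# Example supervised training step against LiDAR-derived reference CHM patches
# (random tensors stand in for the real TanDEM-X and LVIS data).
model = FullyConvCHMRegressor(in_channels=4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
features = torch.rand(8, 4, 128, 128)              # batch of InSAR feature patches
reference_chm = 50.0 * torch.rand(8, 1, 128, 128)  # reference CHM patches
loss = nn.functional.l1_loss(model(features), reference_chm)
optimizer.zero_grad()
loss.backward()
optimizer.step()

At inference time such a network needs only the single-acquisition TanDEM-X feature stack as input, which is the operational advantage highlighted in the abstract.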