A General Framework for Change Detection Using Multimodal Remote Sensing Data

Chirakkal, S.; Bovolo, F.; Misra, A. R.; Bruzzone, L.; Bhattacharya, A. - In: IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING. - ISSN 1939-1404. - 14 (2021), pp. 10665-10680. [10.1109/JSTARS.2021.3119358]

2021

Abstract

A general framework for change detection is proposed to analyze multimodal remotely sensed data utilizing the Kronecker product between two data representations (vectors or matrices). The proposed method is sensor independent and provides results comparable to techniques that exist for specific sensors. The proposed fusion technique is a pixel-level approach that incorporates inputs from different modalities, yielding an enriched multimodal data representation. Thus, the proposed hybridization procedure helps to assimilate multisensor information in a meaningful manner. A novel change index (zeta) is defined for the general multimodal case. This index is then used to quantify change in bitemporal remotely sensed data. This article explores the usability, consistency, and robustness of the proposed multimodal fusion framework, including the change index, with proper validation on two multimodal cases: 1) the dual-frequency (C- and L-band) fully polarimetric Danish EMISAR data and 2) the dual-polarimetric synthetic aperture radar and Sentinel-2 multispectral data. Detailed analysis and validation using extensive ground-truth data are presented to establish the proposed framework.
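As a rough illustration of the pixel-level Kronecker-product fusion named in the abstract, the sketch below fuses two per-pixel feature vectors from different modalities (e.g. a SAR-derived vector and a multispectral one) into a single enriched vector. The change index zeta defined in the paper is not reproduced here; the Euclidean distance between bitemporal fused vectors is only a hypothetical placeholder for how a change measure could be computed from the fused representation.

```python
# Sketch: Kronecker-product fusion of two per-pixel feature vectors,
# followed by a placeholder bitemporal change measure. The feature
# values and the distance-based measure are illustrative assumptions,
# not the paper's actual zeta index.

def kron_fuse(x, y):
    """Kronecker product of two pixel feature vectors (lists of floats)."""
    return [xi * yj for xi in x for yj in y]

def toy_change_measure(fused_t1, fused_t2):
    """Placeholder: Euclidean distance between bitemporal fused vectors."""
    return sum((a - b) ** 2 for a, b in zip(fused_t1, fused_t2)) ** 0.5

# Hypothetical per-pixel features: SAR (2 channels), multispectral (3 bands)
sar_t1, opt_t1 = [0.2, 0.5], [0.1, 0.3, 0.4]
sar_t2, opt_t2 = [0.8, 0.1], [0.2, 0.3, 0.6]

f1 = kron_fuse(sar_t1, opt_t1)  # fused vector of length 2 * 3 = 6
f2 = kron_fuse(sar_t2, opt_t2)
print(len(f1), toy_change_measure(f1, f2))
```

The Kronecker product of an m-vector and an n-vector yields an mn-vector containing every pairwise product, so the fused representation retains cross-modality interactions rather than simply concatenating the inputs.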
File: A_General_Framework_for_Change_Detection_Using_Multimodal_Remote_Sensing_Data (1) (1).pdf
Access: Open access
Type: Editorial version (Publisher's layout)
License: Creative Commons
Size: 1.6 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.