GCFnet: Global Collaborative Fusion Network for Multispectral and Panchromatic Image Classification / Zhao, Hui; Liu, Sicong; Du, Qian; Bruzzone, Lorenzo; Zheng, Yongjie; Du, Kecheng; Tong, Xiaohua; Xie, Huan; Ma, Xiaolong. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 0196-2892. - 60:(2022), pp. 563281401-563281414. [10.1109/TGRS.2022.3215020]
GCFnet: Global Collaborative Fusion Network for Multispectral and Panchromatic Image Classification
Liu, Sicong; Bruzzone, Lorenzo; Zheng, Yongjie
2022-01-01
Abstract
Among various multimodal remote sensing data, the pairing of multispectral (MS) and panchromatic (PAN) images is widely used in remote sensing applications. This article proposes a novel global collaborative fusion network (GCFnet) for the joint classification of MS and PAN images. In particular, a global patch-free classification scheme based on an encoder-decoder deep learning (DL) network is developed to exploit context dependencies in the image. The proposed GCFnet is designed based on a novel collaborative fusion architecture, which mainly contains three parts: 1) two shallow-to-deep feature fusion branches related to the individual MS and PAN images; 2) a multiscale cross-modal feature fusion branch of the two images, where an adaptive loss-weighted fusion strategy is designed to compute the total loss over the two individual branches and the cross-modal branch; and 3) a probability-weighted decision fusion strategy that fuses the classification results of the three branches to further improve the classification performance. Experimental results obtained on three real datasets covering complex urban scenarios confirm the effectiveness of the proposed GCFnet in terms of higher accuracy and robustness compared to existing methods. By exploiting data at both sampled and non-sampled positions during feature extraction, the proposed GCFnet achieves excellent performance even in small sample-size cases.
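To make the loss-level and decision-level fusion ideas in the abstract concrete, the sketch below gives a minimal, hypothetical PyTorch-style illustration. It is not the authors' implementation: the learnable softmax weighting of the three branch losses and the fixed decision-fusion coefficients (`alphas`) are assumptions introduced only for demonstration.

```python
import torch
import torch.nn as nn

class AdaptiveLossFusion(nn.Module):
    """Hypothetical sketch of an adaptive loss-weighted fusion:
    the losses of the MS, PAN, and cross-modal branches are combined
    with learnable, softmax-normalized weights (assumption, not the
    paper's exact scheme)."""

    def __init__(self, num_branches: int = 3):
        super().__init__()
        # One learnable logit per branch; softmax keeps weights positive
        # and summing to one.
        self.logits = nn.Parameter(torch.zeros(num_branches))

    def forward(self, losses):
        # losses: iterable of scalar losses [loss_ms, loss_pan, loss_cross]
        weights = torch.softmax(self.logits, dim=0)
        return sum(w * l for w, l in zip(weights, losses))

def decision_fusion(prob_ms, prob_pan, prob_cross, alphas=(0.3, 0.3, 0.4)):
    """Hypothetical probability-weighted decision fusion: average the
    per-pixel class-probability maps of the three branches with fixed
    weights and take the per-pixel argmax to obtain class labels."""
    fused = alphas[0] * prob_ms + alphas[1] * prob_pan + alphas[2] * prob_cross
    return fused.argmax(dim=1)  # (B, H, W) label map from (B, C, H, W) inputs
```

In this sketch the decision-fusion coefficients are fixed placeholders; the abstract states only that the fusion is probability-weighted, so the actual weighting used in GCFnet may differ.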
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| Pre-print-GCFnet_Global_Collaborative_Fusion_Network_for_Multispectral_and_Panchromatic_Image_Classification.pdf | Open access | Refereed author's manuscript (post-print) | All rights reserved | 2.56 MB | Adobe PDF |
| GCFnet_Global_Collaborative_Fusion_Network_for_Multispectral_and_Panchromatic_Image_Classification_compressed.pdf | Repository administrators only | Publisher's layout (editorial version) | All rights reserved | 3.93 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.