
ZeRGAN: Zero-Reference GAN for Fusion of Multispectral and Panchromatic Images

Author: Bruzzone, Lorenzo
Date: 2023-01-01

Abstract

In this article, we present a new pansharpening method, a zero-reference generative adversarial network (ZeRGAN), which fuses low spatial resolution multispectral (LR MS) and high spatial resolution panchromatic (PAN) images. In the proposed method, zero-reference indicates that it requires neither paired reduced-scale images nor unpaired full-scale images for training. To obtain accurate fusion results, we establish an adversarial game between a set of multiscale generators and their corresponding discriminators. The multiscale generators progressively produce the fused high spatial resolution MS (HR MS) images from the LR MS and PAN images, while the discriminators aim to distinguish differences in spatial information between the HR MS images and the PAN images. In other words, after the optimization of ZeRGAN, the HR MS images are generated from the LR MS and PAN images. Furthermore, we construct a nonreference loss function that includes an adversarial loss, spatial and spectral reconstruction losses, a spatial enhancement loss, and an average constancy loss. By minimizing the total loss, the spatial details in the HR MS images can be enhanced effectively. Extensive experiments are conducted on datasets acquired by different satellites. The results demonstrate the effectiveness of the proposed method compared with state-of-the-art methods. The source code is publicly available at https://github.com/RSMagneto/ZeRGAN.
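To make the loss design concrete, below is a minimal PyTorch sketch of how a nonreference total loss combining the five terms named in the abstract could be composed. The specific formulations, the least-squares adversarial form, the intensity-matching assumption, and all names and weights here are illustrative assumptions, not the paper's exact definitions; refer to the official implementation at https://github.com/RSMagneto/ZeRGAN for the actual losses.

import torch
import torch.nn.functional as F

def total_loss(hr_ms, lr_ms, pan, d_fake_logits, weights=(1.0, 1.0, 1.0, 1.0, 1.0)):
    """Illustrative nonreference loss: hr_ms (B,C,H,W), lr_ms (B,C,h,w), pan (B,1,H,W)."""
    w_adv, w_spec, w_spat, w_enh, w_avg = weights

    # Adversarial term: the generator pushes the discriminator's score on the
    # fused image toward "real" (least-squares GAN form assumed here).
    l_adv = torch.mean((d_fake_logits - 1.0) ** 2)

    # Spectral reconstruction: downsampling the fused HR MS image should
    # recover the LR MS input.
    hr_ms_down = F.interpolate(hr_ms, size=lr_ms.shape[-2:], mode='bilinear',
                               align_corners=False)
    l_spec = F.l1_loss(hr_ms_down, lr_ms)

    # Spatial reconstruction: the band-averaged intensity of the fused image
    # should match the PAN image (a common pansharpening assumption).
    intensity = hr_ms.mean(dim=1, keepdim=True)
    l_spat = F.l1_loss(intensity, pan)

    # Spatial enhancement: align the gradients of the fused intensity with
    # the gradients of the PAN image to sharpen spatial detail.
    def grads(x):
        return x[..., :, 1:] - x[..., :, :-1], x[..., 1:, :] - x[..., :-1, :]
    gx_f, gy_f = grads(intensity)
    gx_p, gy_p = grads(pan)
    l_enh = F.l1_loss(gx_f, gx_p) + F.l1_loss(gy_f, gy_p)

    # Average constancy: keep each band's mean close to that of the LR MS input.
    l_avg = F.mse_loss(hr_ms.mean(dim=(2, 3)), lr_ms.mean(dim=(2, 3)))

    return (w_adv * l_adv + w_spec * l_spec + w_spat * l_spat +
            w_enh * l_enh + w_avg * l_avg)

In ZeRGAN this kind of total loss is minimized jointly with the multiscale adversarial game between generators and discriminators; the sketch only shows how the five abstract-level terms could combine into a single training objective.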
Year: 2023
Issue: 11
Authors: Diao, Wenxiu; Zhang, Feng; Sun, Jiande; Xing, Yinghui; Zhang, Kai; Bruzzone, Lorenzo
ZeRGAN: Zero-Reference GAN for Fusion of Multispectral and Panchromatic Images / Diao, Wenxiu; Zhang, Feng; Sun, Jiande; Xing, Yinghui; Zhang, Kai; Bruzzone, Lorenzo. - In: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS. - ISSN 2162-237X. - 34:11(2023), pp. 8195-8209. [10.1109/TNNLS.2021.3137373]
Files in this record:

File: ZeRGAN_Zero-Reference_GAN_for_Fusion_of_Multispectral_and_Panchromatic_Images.pdf
Access: Archive administrators only
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 7.59 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/401171
Citations:
  • PMC: N/A
  • Scopus: N/A
  • Web of Science: 26