Spatial and Spectral Extraction Network With Adaptive Feature Fusion for Pansharpening / Zhang, Kai; Wang, Anfei; Zhang, Feng; Diao, Wenxiu; Sun, Jiande; Bruzzone, Lorenzo. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 0196-2892. - 60:(2022), Art. no. 5410814, pp. 1-14. [10.1109/TGRS.2022.3187025]
Spatial and Spectral Extraction Network With Adaptive Feature Fusion for Pansharpening
Bruzzone, Lorenzo
2022-01-01
Abstract
Pansharpening methods based on deep neural networks (DNNs) have attracted great attention due to their powerful representation capabilities. In this article, to combine the feature maps from different subnetworks efficiently, we propose a novel pansharpening method based on a spatial and spectral extraction network (SSE-Net). Different from other DNN-based methods, which directly concatenate the features from different subnetworks, we design adaptive feature fusion modules (AFFMs) to merge these features according to their information content. First, spatial and spectral features are extracted by the subnetworks from the low spatial resolution multispectral (LR MS) and panchromatic (PAN) images. Then, by fusing the features at different levels, the desired high spatial resolution MS (HR MS) images are generated by a fusion network consisting of AFFMs. In the fusion network, the features from different subnetworks are integrated adaptively, and the redundancy among them is reduced. Moreover, a spectral ratio loss and a gradient loss are defined to ensure the effective learning of spatial and spectral features. The spectral ratio loss captures the nonlinear relationships among the bands of the MS image to reduce spectral distortions in the fusion result. Extensive experiments were conducted on QuickBird and GeoEye-1 satellite datasets. Visual and numerical results demonstrate that the proposed method produces better fusion results than techniques in the literature. The source code is available at https://github.com/RSMagneto/SSE-Net.
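To make the pipeline described in the abstract more concrete, below is a minimal, hypothetical PyTorch sketch of (i) an attention-based adaptive fusion module in the spirit of the AFFMs and (ii) one possible form each for the gradient loss and the spectral ratio loss. This is not the authors' implementation (the official code is at https://github.com/RSMagneto/SSE-Net); the module design and the loss definitions are assumptions made purely for illustration.

```python
# Hypothetical sketch of AFFM-style fusion and the two auxiliary losses.
# Not the official SSE-Net code; design choices below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AFFM(nn.Module):
    """Adaptive feature fusion: re-weight spatial/spectral features with
    learned channel attention instead of plain concatenation (assumed design)."""

    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # global context per channel
            nn.Conv2d(2 * channels, channels, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 2 * channels, 1),
            nn.Sigmoid(),                          # per-channel fusion weights
        )
        self.merge = nn.Conv2d(2 * channels, channels, 3, padding=1)

    def forward(self, spatial_feat, spectral_feat):
        x = torch.cat([spatial_feat, spectral_feat], dim=1)
        x = x * self.gate(x)                       # weight by information content
        return self.merge(x)                       # fuse and reduce redundancy


def gradient_loss(fused, pan):
    """L1 distance between image gradients of the band-averaged fused MS
    and the PAN image (one assumed form of a gradient loss)."""
    def grads(img):
        gx = img[..., :, 1:] - img[..., :, :-1]
        gy = img[..., 1:, :] - img[..., :-1, :]
        return gx, gy

    fx, fy = grads(fused.mean(dim=1, keepdim=True))
    px, py = grads(pan)
    return F.l1_loss(fx, px) + F.l1_loss(fy, py)


def spectral_ratio_loss(fused, ref, eps=1e-6):
    """L1 distance between pairwise band-ratio maps of the fused and reference
    MS images, one possible way to encode inter-band relationships (assumption)."""
    loss = 0.0
    bands = fused.shape[1]
    for i in range(bands):
        for j in range(i + 1, bands):
            loss = loss + F.l1_loss(fused[:, i] / (fused[:, j] + eps),
                                    ref[:, i] / (ref[:, j] + eps))
    return loss / (bands * (bands - 1) / 2)
```

In a training loop, terms like these would typically be added to a pixel-level reconstruction loss with weighting coefficients chosen on a validation set.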
File | Access | Type | License | Size | Format
---|---|---|---|---|---
TGRS3187025.pdf | Open access | Refereed author's manuscript (post-print) | All rights reserved | 2.76 MB | Adobe PDF
Spatial_and_Spectral_Extraction_Network_With_Adaptive_Feature_Fusion_for_Pansharpening.pdf | Restricted (archive administrators only) | Publisher's layout (editorial version) | All rights reserved | 7.29 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.