Segmentation-Based Fine Registration of Very High Resolution Multitemporal Images

Bovolo, Francesca; Bruzzone, Lorenzo
2017-01-01

Abstract

In this paper, a segmentation-based approach to the fine registration of multispectral and multitemporal very high resolution (VHR) images is proposed. The approach aims at estimating and correcting the residual local misalignment [also referred to as registration noise (RN)] that often affects multitemporal VHR images even after standard registration. The method automatically extracts a set of object-representative points associated with regions having homogeneous spectral properties (i.e., objects in the scene). These points are distributed over the whole scene and account for the high spatial correlation of pixels in VHR images. The amount and direction of the residual local misalignment are then estimated for each object-representative point by exploiting the properties of residual misalignment in a multiple-displacement analysis framework. To this end, a multiscale differential analysis of the multispectral difference image is employed to model the statistical distribution of pixels affected by residual misalignment (i.e., RN pixels) and to detect them. The detected RN is then used to perform a segmentation-based fine registration that exploits both temporal and spatial correlation. Accordingly, the method is particularly suitable for images with a large number of border regions, such as VHR images of urban scenes. Experimental results obtained on both simulated and real multitemporal VHR images confirm the effectiveness of the proposed method.
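
As an informal illustration of the kind of per-point displacement estimation outlined in the abstract, the sketch below tests a small set of candidate shifts around a hypothetical object-representative point and keeps the one that minimizes the local magnitude of the difference image. The function name, window size, and search range are assumptions chosen for illustration only; this is not the authors' RN model or their multiscale differential analysis.

```python
# Minimal sketch (assumed setup, not the paper's implementation): estimate the
# residual local shift at one object-representative point by exhaustively
# testing candidate displacements and minimizing a local difference measure.
import numpy as np

def local_shift(ref, tgt, point, win=15, search=3):
    """Estimate the (dy, dx) shift of `tgt` w.r.t. `ref` around `point`.

    ref, tgt : 2-D arrays (one band of the two co-registered images)
    point    : (row, col) object-representative point
    win      : half-size of the local analysis window (assumed value)
    search   : maximum displacement in pixels tested per direction (assumed)
    """
    r, c = point
    ref_patch = ref[r - win:r + win + 1, c - win:c + win + 1].astype(float)
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            tgt_patch = tgt[r + dy - win:r + dy + win + 1,
                            c + dx - win:c + dx + win + 1].astype(float)
            # mean absolute difference as a simple stand-in for an RN measure
            err = np.mean(np.abs(ref_patch - tgt_patch))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Toy usage: a synthetic image shifted by (1, -2) pixels.
rng = np.random.default_rng(0)
ref = rng.random((128, 128))
tgt = np.roll(ref, shift=(1, -2), axis=(0, 1))
print(local_shift(ref, tgt, point=(64, 64)))  # expected (1, -2)
```

In the paper, such local estimates at many object-representative points would feed a segmentation-based correction over the whole scene; the brute-force search above is only a stand-in for that step.
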
2017
5
Han, Youkyung; Bovolo, Francesca; Bruzzone, Lorenzo
Segmentation-Based Fine Registration of Very High Resolution Multitemporal Images / Han, Youkyung; Bovolo, Francesca; Bruzzone, Lorenzo. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 0196-2892. - STAMPA. - 55:5(2017), pp. 2884-2897. [10.1109/TGRS.2017.2655941]
Files in this product:

File: Segmentation-Based Fine Registration of Very High Resolution Multitemporal Images.pdf
Access: Restricted to repository administrators
Type: Publisher's version (publisher's layout)
Licence: All rights reserved
Size: 6.34 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/193566
Citations
  • Scopus: 18
  • Web of Science: 13