
Set-to-Set Distance-Based Spectral-Spatial Classification of Hyperspectral Images / Lu, Ting; Li, Shutao; Fang, Leyuan; Bruzzone, Lorenzo; Benediktsson, Jon Atli. - In: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING. - ISSN 0196-2892. - 54:12(2016), pp. 7122-7134. [10.1109/TGRS.2016.2596260]

Set-to-Set Distance-Based Spectral-Spatial Classification of Hyperspectral Images

Lu, Ting; Li, Shutao; Fang, Leyuan; Bruzzone, Lorenzo; Benediktsson, Jon Atli
2016-01-01

Abstract

A novel set-to-set distance-based spectral-spatial classification method for hyperspectral images (HSIs) is proposed. In an HSI, the spatially connected and spectrally similar pixels within a homogeneous region can be considered one set of test samples, i.e., a test set, whose members should belong to the same class. Likewise, each class of labeled pixels can be regarded as one set of training samples, i.e., a training set. The proposed method therefore measures the similarity between test and training sets with set-based distance criteria and assigns the classification label of each test set accordingly. Specifically, a superpixel-based oversegmentation technique first divides the HSI into multiple perceptually uniform regions by jointly exploiting spatial similarity and structural information, so that each segmented region corresponds to one test set. Each test and training set is then represented with an affine hull (AH) model, which exploits both the similarity and the variance of the pixels within the set to characterize it adaptively. Finally, the class label of each test set is determined by the closest geometric distance between the test and training AHs. Experimental results on real HSI data sets demonstrate the superiority of the proposed algorithm over several well-known classification approaches in terms of both classification accuracy and computational speed.
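To make the classification rule concrete, the following minimal NumPy sketch (not taken from the paper's own code) illustrates the affine-hull step described in the abstract: each set of pixel spectra is modeled as its mean plus a truncated basis of its centered samples, the distance between two hulls is obtained by linear least squares, and a test set is assigned to the class whose training hull is geometrically closest. The function names, the energy_kept truncation threshold, and the dictionary-based interface are illustrative assumptions; in the actual method the test sets come from a superpixel oversegmentation of the HSI, which is not reproduced here.

import numpy as np

def affine_hull(samples, energy_kept=0.98):
    # samples: (n_bands, n_pixels) array, one pixel spectrum per column.
    # Returns the set mean and an orthonormal basis of the centered samples,
    # truncated so the kept singular values explain `energy_kept` of the energy
    # (an illustrative regularization choice, not the paper's exact setting).
    mu = samples.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(samples - mu, full_matrices=False)
    if s.size and s[0] > 0:
        ratio = np.cumsum(s ** 2) / np.sum(s ** 2)
        U = U[:, : int(np.searchsorted(ratio, energy_kept) + 1)]
    else:
        U = U[:, :0]  # degenerate set (one pixel or identical pixels): hull is its mean
    return mu, U

def ah_distance(mu1, U1, mu2, U2):
    # Smallest Euclidean distance between the two affine hulls
    # {mu1 + U1 a} and {mu2 + U2 b}, found by least squares over the
    # stacked coefficients [a; b].
    A = np.hstack([U1, -U2])
    rhs = (mu2 - mu1).ravel()
    if A.shape[1] == 0:
        return np.linalg.norm(rhs)  # both hulls reduce to their means
    coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.linalg.norm(A @ coef - rhs)

def classify_test_set(test_pixels, training_sets):
    # Assign one test set (e.g. the pixels of one oversegmented region)
    # to the class whose training-set affine hull is closest.
    mu_t, U_t = affine_hull(test_pixels)
    hulls = {label: affine_hull(X) for label, X in training_sets.items()}
    return min(hulls, key=lambda label: ah_distance(mu_t, U_t, *hulls[label]))

For example, given training_sets = {"grass": X_grass, "road": X_road}, where each X is a bands-by-pixels array of labeled spectra, classify_test_set(region_pixels, training_sets) returns the label of the closest training hull for one oversegmented region.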
Files in this item:
File: 5.07548355.pdf (access restricted to archive administrators)
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 3.05 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/168534
Citations
  • PMC: not available
  • Scopus: 53
  • Web of Science (ISI): 49