Spectral-spatial classification of hyperspectral imagery based on partitional clustering techniques

Research output: Contribution to journal › Article › peer-review

Abstract

A new spectral-spatial classification scheme for hyperspectral images is proposed. The method combines the results of a pixel-wise support vector machine (SVM) classification and the segmentation map obtained by partitional clustering, using majority voting. The ISODATA algorithm and Gaussian mixture resolving techniques are used for image clustering. Experimental results are presented for two hyperspectral airborne images. The developed classification scheme improves classification accuracies and provides classification maps with more homogeneous regions, compared with pixel-wise classification. The proposed method performs particularly well for classification of images with large spatial structures, and when different classes have dissimilar spectral responses and a comparable number of pixels.
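To make the scheme described in the abstract concrete, the sketch below illustrates the general idea under simplifying assumptions: a pixel-wise SVM is trained, the spectra are partitioned by clustering (a Gaussian mixture is used here as a stand-in for the paper's ISODATA and mixture-resolving steps), spatially connected components of each cluster form the segments, and each segment is assigned the majority SVM label. Function and variable names, parameter choices, and the use of scikit-learn and SciPy are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of a spectral-spatial classification via majority voting.
# Assumes a caller supplies a hyperspectral cube and a labelled training mask.
import numpy as np
from scipy import ndimage
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC


def spectral_spatial_classify(cube, train_mask, train_labels, n_clusters=8):
    """cube: (rows, cols, bands) hyperspectral image.
    train_mask: boolean (rows, cols) array marking labelled pixels.
    train_labels: class labels of the pixels where train_mask is True."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)

    # 1) Pixel-wise SVM classification of every pixel.
    svm = SVC(kernel="rbf", gamma="scale")
    svm.fit(cube[train_mask], train_labels)
    pixel_map = svm.predict(X).reshape(rows, cols)

    # 2) Partitional clustering in the spectral domain
    #    (Gaussian mixture here; the paper also uses ISODATA).
    gmm = GaussianMixture(n_components=n_clusters, covariance_type="diag")
    cluster_map = gmm.fit_predict(X).reshape(rows, cols)

    # 3) Spatially connected components of each cluster act as segments;
    # 4) every segment takes the majority vote of its pixel-wise SVM labels.
    result = pixel_map.copy()
    for c in range(n_clusters):
        regions, n_regions = ndimage.label(cluster_map == c)
        for r in range(1, n_regions + 1):
            region = regions == r
            labels, counts = np.unique(pixel_map[region], return_counts=True)
            result[region] = labels[np.argmax(counts)]
    return result
```

The majority-voting step is what regularizes the classification map spatially: isolated SVM misclassifications inside a large homogeneous segment are overruled by the dominant label of that segment, which is consistent with the abstract's observation that the method works best for images with large spatial structures.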

Original language: English
Article number: 4840429
Pages (from-to): 2973-2987
Number of pages: 15
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 47
Issue number: 8
DOIs
Publication status: Published - Aug 2009

Bibliographical note

Funding Information: Manuscript received October 3, 2008; revised December 16, 2008. First published April 24, 2009; current version published July 23, 2009. This work was supported in part by the Marie Curie Research Training Network “HYPER-I-NET.” Y. Tarabalka is with the Faculty of Electrical and Computer Engineering, University of Iceland, 107 Reykjavik, Iceland and also with the GIPSA-Lab-Grenoble Institute of Technology, Domaine Universitaire, 38402 Saint-Martin-d’Hères Cedex, France (e-mail: [email protected]).

Other keywords

  • Clustering
  • Hyperspectral images
  • Majority vote
  • Segmentation
  • Spectral-spatial classification
