
Fast and robust dynamic hand gesture recognition via key frames extraction and feature fusion / Tang, H.; Liu, H.; Xiao, W.; Sebe, N.. - In: NEUROCOMPUTING. - ISSN 0925-2312. - 331:(2019), pp. 424-433. [10.1016/j.neucom.2018.11.038]

Fast and robust dynamic hand gesture recognition via key frames extraction and feature fusion

H. Tang; N. Sebe
2019-01-01

Abstract

Gesture recognition is an active topic in computer vision and pattern recognition, and it plays a vital role in natural human-computer interaction. Although great progress has been made recently, fast and robust hand gesture recognition remains an open problem, since existing methods have not balanced accuracy and efficiency well. To bridge this gap, this work combines image entropy and density clustering to extract key frames from hand gesture videos for further feature extraction, which improves the efficiency of recognition. Moreover, a feature fusion strategy is proposed to further improve the feature representation, which elevates recognition performance. To validate our approach in a “wild” environment, we also introduce two new datasets, called HandGesture and Action3D. Experiments consistently demonstrate that our strategy achieves competitive results on the Northwestern University, Cambridge, HandGesture and Action3D hand gesture datasets. Our code and datasets will be released at https://github.com/Ha0Tang/HandGestureRecognition.
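The abstract describes scoring frames by image entropy before a density-clustering step selects key frames. A minimal sketch of the entropy side of that idea is shown below; `frame_entropy` and `key_frame_indices` are hypothetical names, and plain top-k ranking stands in for the paper's density clustering, which is not reproduced here.

```python
import numpy as np

def frame_entropy(frame, bins=256):
    """Shannon entropy (bits) of a grayscale frame's intensity histogram."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def key_frame_indices(frames, k=3):
    """Rank frames by entropy and return the indices of the top-k, in order.

    Illustrative stand-in for the paper's entropy + density-clustering
    selection: here we simply keep the k most information-rich frames.
    """
    scores = [frame_entropy(f) for f in frames]
    return sorted(np.argsort(scores)[-k:])
```

A constant frame has zero entropy, while a noisy frame scores high, so the selector naturally discards uninformative frames and keeps those likely to contain hand motion detail.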
2019
Tang, H.; Liu, H.; Xiao, W.; Sebe, N.
Files in this record:
File: 1-s2.0-S0925231218313663-main.pdf (access restricted to archive administrators)
Type: Publisher's version (Publisher's layout)
License: All rights reserved
Size: 2.2 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/250727
Citations
  • PMC: n/a
  • Scopus: 98
  • Web of Science: 68