Toward Robust Gait Identification: A Frequency Domain Approach in Varied Surveillance Environments

Luca Quagliato (co-first author)
2026-01-01

Abstract

This study introduces a frequency domain-based gait recognition framework designed to improve the accuracy of identifying and tracking individuals in diverse surveillance environments. Our approach transforms 2D gait variables, extracted from 25 key points, into frequency domain representations using the Fast Fourier Transform (FFT), effectively integrating video and sensor data. We developed the E-GAITS (Ewha GAit Identification Towards Surveillance system) dataset, which includes a wide range of environmental variables, to support this framework. The proposed methods, sensor value imitation and feature encoding, were extensively tested under various conditions, including different lighting scenarios and indoor/outdoor settings. The results demonstrate high recognition accuracy, with the integration of Inertial Measurement Unit (IMU) sensor data significantly enhancing performance, especially in low-light conditions. Despite relying on an off-the-shelf pose estimation algorithm, which has limitations in poor lighting, the framework sets a new benchmark for gait-based identification systems and shows significant potential for applications in security and forensic analysis. Future work will focus on improving robustness and exploring additional biometric modalities to further enhance accuracy.
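
The record does not detail the paper's actual pipeline (sensor value imitation and feature encoding are not specified here), but the core idea of turning per-frame 2D keypoints into frequency-domain features can be sketched as follows. This is a minimal illustrative example, assuming NumPy, a pose estimator that yields 25 (x, y) keypoints per frame, and an arbitrary choice of 16 low-frequency magnitude coefficients; the function name gait_frequency_features is hypothetical and not from the paper.

import numpy as np

def gait_frequency_features(keypoints, n_coeffs=16):
    # keypoints: (T, 25, 2) array of per-frame 2D joint coordinates
    # (the shape and coefficient count here are illustrative assumptions)
    T, K, C = keypoints.shape
    signals = keypoints.reshape(T, K * C).astype(float)
    signals -= signals.mean(axis=0, keepdims=True)   # remove the DC offset per channel
    spectrum = np.fft.rfft(signals, axis=0)          # real FFT along the time axis
    mags = np.abs(spectrum[:n_coeffs])               # low frequencies capture gait periodicity
    return mags.T.ravel()                            # flatten into one feature vector

# toy usage: 120 frames of 25 (x, y) keypoints
rng = np.random.default_rng(0)
walk = rng.normal(size=(120, 25, 2))
print(gait_frequency_features(walk).shape)  # (800,)

Magnitude-only features of this kind are invariant to the phase of the gait cycle, which is one common motivation for frequency-domain gait representations; the paper's own encoding may differ.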
2026
Song, Yeojin; Quagliato, Luca; Jang, Sewon; Chung, Egene; Ko, Seoyeon; Hwang, Seoyeong; Noh, Junhyug; Lee, Taeyong
Toward Robust Gait Identification: A Frequency Domain Approach in Varied Surveillance Environments / Song, Yeojin; Quagliato, Luca; Jang, Sewon; Chung, Egene; Ko, Seoyeon; Hwang, Seoyeong; Noh, Junhyug; Lee, Taeyong. - In: IEEE ACCESS. - ISSN 2169-3536. - 2026, 14:(2026), pp. 12484-12497. [10.1109/ACCESS.2026.3653781]
Files in this item:
Toward_Robust_Gait_Identification_A_Frequency_Domain_Approach_in_Varied_Surveillance_Environments_compressed (1).pdf
Description: IEEE Access - Research article
Type: Publisher's version (Publisher's layout)
License: All rights reserved
Size: 961.79 kB
Format: Adobe PDF
Access: archive managers only

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/477251
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science (ISI): 0
  • OpenAlex: 0