
Brain and music: Music genre classification using brain signals

Ghaemmaghami Tabrizi, Pouya;Sebe, Niculae
2016-01-01

Abstract

Genre classification can be considered an essential part of music and movie recommender systems. So far, various automatic music genre classification methods have been proposed based on diverse audio features. However, such content-centric features are not capable of capturing the personal preferences of the listener. In this study, we provide preliminary experimental evidence for the possibility of music genre classification based on the recorded brain signals of individuals. A brain decoding paradigm is employed to classify recorded brain signals into two broad genre classes: Pop and Rock. We compare the performance of our proposed paradigm on two neuroimaging datasets that contain the electroencephalographic (EEG) and magnetoencephalographic (MEG) data of subjects who watched 40 music video clips. Our results indicate that the genre of the music clips can be retrieved significantly above chance level using the brain signals. Our study provides a first step towards user-centric music content retrieval by exploiting brain signals. © 2016 IEEE.
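The abstract does not specify the features or classifier used. As a purely illustrative sketch (not the authors' pipeline), the following shows the general shape of such a decoding analysis: per-clip feature vectors with two balanced genre labels, a simple nearest-centroid classifier, and leave-one-trial-out accuracy compared against the 50% chance level. The feature dimensions, the synthetic data, and the classifier choice are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_features = 40, 160             # 40 clips; e.g. 32 channels x 5 bands (assumed)
labels = np.repeat([0, 1], n_trials // 2)  # 0 = Pop, 1 = Rock (balanced classes)

# Synthetic per-trial features with a small class-dependent shift,
# standing in for real EEG/MEG band-power features.
X = rng.normal(size=(n_trials, n_features))
X[labels == 1] += 0.8

# Leave-one-trial-out nearest-centroid decoding.
correct = 0
for i in range(n_trials):
    mask = np.arange(n_trials) != i        # hold out trial i
    c0 = X[mask & (labels == 0)].mean(axis=0)
    c1 = X[mask & (labels == 1)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
    correct += pred == labels[i]

acc = correct / n_trials
print(f"decoding accuracy: {acc:.2f} (chance = 0.50)")
```

Accuracy meaningfully above 0.50 on held-out trials is the kind of above-chance result the abstract reports; the paper itself would establish significance against chance with a proper statistical test.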
2016
EUSIPCO
Paris
European Signal Processing Conference, EUSIPCO

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/166697

Citations
  • PMC: n/a
  • Scopus: 9
  • Web of Science (ISI): 4