Brain and music: Music genre classification using brain signals
Ghaemmaghami Tabrizi, Pouya; Sebe, Niculae
2016-01-01
Abstract
Genre classification is an essential component of music and movie recommender systems. So far, numerous automatic music genre classification methods based on audio features have been proposed. However, such content-centric features are not capable of capturing the personal preferences of the listener. In this study, we provide preliminary experimental evidence for the feasibility of music genre classification based on individuals' recorded brain signals. A brain decoding paradigm is employed to classify recorded brain signals into two broad genre classes: Pop and Rock. We compare the performance of our proposed paradigm on two neuroimaging datasets containing the electroencephalographic (EEG) and magnetoencephalographic (MEG) data of subjects who watched 40 music video clips. Our results indicate that the genre of the music clips can be retrieved significantly above chance level using the brain signals. Our study provides a first step towards user-centric music content retrieval by exploiting brain signals. © 2016 IEEE.
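The brain decoding paradigm referred to in the abstract amounts to trial-wise binary classification of brain-signal features against genre labels, evaluated against chance level. The sketch below illustrates this idea only under stated assumptions; it is not the authors' actual pipeline, and the feature dimensions, classifier choice, and variable names are illustrative.

```python
# Minimal sketch of a brain decoding paradigm for Pop-vs-Rock classification.
# NOTE: this is an illustrative assumption, not the pipeline from the paper.
# Shapes, feature choices, and the classifier are hypothetical placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 40 trials (one per music video clip),
# 32 channels x 5 frequency-band power features per channel.
X = rng.standard_normal((40, 32 * 5))   # trial-wise brain-signal features
y = rng.integers(0, 2, size=40)         # 0 = Pop, 1 = Rock (genre labels)

# Linear classifier with feature standardization, scored by cross-validation.
clf = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f} (chance level ~0.50)")
```

With real EEG/MEG recordings, the decoded accuracy would be compared against the 50% chance level of the two-class problem, as the abstract describes.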