Rossi, Michele; Stefani, Domenico; Pauwels, Johan; Iacca, Giovanni; Turchet, Luca. "Towards Explainable Music Emotion Recognition for Guitar Improvisations." In 2025 IEEE 6th International Symposium on the Internet of Sounds (IS2), L'Aquila, Italy, 29-31 October 2025, pp. 1-5. DOI: 10.1109/is264627.2025.11284553.
Towards Explainable Music Emotion Recognition for Guitar Improvisations
Rossi, Michele; Stefani, Domenico; Iacca, Giovanni; Turchet, Luca
2025
Abstract
Explainability has gained significant attention across various domains, yet it remains relatively underexplored in the field of music, particularly in Music Emotion Recognition. This paper presents XMERApp, a web application designed to provide interpretability for a deep learning model that classifies classical and acoustic guitar improvisations into four emotional states. Our system employs a deep learning architecture trained on improvised musical performances to classify emotions, while providing comprehensive explainability through multiple complementary approaches. The application offers users three levels of interpretability: (1) detailed breakdowns of prediction probabilities across different emotion categories, enabling users to understand the confidence and uncertainty in model predictions; (2) temporal visualization of emotion evolution throughout the improvisation, revealing how the model's understanding of emotional content develops over time; and (3) LIME-based explanations that highlight the specific spectrogram regions most influential to the model's decisions within focused time windows. Additionally, users can listen to the spectrogram regions identified as critical for the emotion classification, gaining insight into which parts of the performance and which frequency ranges contributed most to the model's output. The web-based nature of XMERApp enables deployment across many devices, including smart musical instruments, enhancing the interpretability of intelligent features embedded within them.
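The third interpretability level described above perturbs spectrogram regions and observes how the model's prediction changes. A minimal self-contained sketch of that idea is shown below, using a simple occlusion-based surrogate rather than the LIME library proper: the `classify` function is a hypothetical stand-in for the paper's trained model, and the `EMOTIONS` label set, patch sizes, and spectrogram shape are illustrative assumptions, not details from the paper.

```python
import numpy as np

EMOTIONS = ["aggressive", "relaxed", "happy", "sad"]  # assumed 4-class label set


def classify(spec: np.ndarray) -> np.ndarray:
    """Toy stand-in for the trained emotion model: spectrogram -> 4 probabilities.

    The real system would run a deep network here; this toy version only
    exists so the explanation loop below is runnable.
    """
    logits = np.array([spec.mean(), -spec.mean(), spec.std(), -spec.std()])
    e = np.exp(logits - logits.max())  # softmax over the 4 emotion logits
    return e / e.sum()


def patch_importance(spec: np.ndarray, ph: int = 16, pw: int = 32) -> np.ndarray:
    """Occlusion saliency (a LIME-like perturbation analysis): zero out each
    time-frequency patch and record the drop in the top-class probability.

    Large positive values mark spectrogram regions the prediction depends on,
    i.e. the regions a user could then listen to in isolation.
    """
    base = classify(spec)
    top = int(np.argmax(base))
    n_rows, n_cols = spec.shape[0] // ph, spec.shape[1] // pw
    heat = np.zeros((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            masked = spec.copy()
            masked[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw] = 0.0
            heat[i, j] = base[top] - classify(masked)[top]
    return heat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mel = rng.random((128, 256))  # fake 128-band mel spectrogram, 256 frames
    heat = patch_importance(mel)
    print(heat.shape)  # -> (8, 8) grid of patch importances
    print(EMOTIONS[int(np.argmax(classify(mel)))])
```

LIME itself fits a local linear surrogate over many random super-pixel maskings instead of occluding one patch at a time, but the occlusion loop above conveys the same core mechanism: regions whose removal changes the prediction are the ones reported as influential.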
| File | Type | License | Size | Format | Access |
|---|---|---|---|---|---|
| paper_74.pdf | Non-refereed preprint | All rights reserved | 1.37 MB | Adobe PDF | Archive managers only |
| Towards_Explainable_Music_Emotion_Recognition_for_Guitar_Improvisations.pdf | Publisher's layout | All rights reserved | 2.05 MB | Adobe PDF | Archive managers only |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.



