Cloud-Smart Musical Instrument Interactions: Querying a Large Music Collection with a Smart Guitar / Turchet, Luca; Pauwels, Johan; Fischione, Carlo; Fazekas, György. - In: ACM TRANSACTIONS ON THE INTERNET OF THINGS. - ISSN 2691-1914. - 2020:3(2020). [10.1145/3377881]
Cloud-Smart Musical Instrument Interactions: Querying a Large Music Collection with a Smart Guitar
Turchet, Luca; Pauwels, Johan; Fischione, Carlo; Fazekas, György
2020-01-01
Abstract
Music in large online databases released under Creative Commons licenses is rarely recorded by well-known artists; conventional metadata-based search is therefore insufficient to adapt these collections to instrument players' needs. The emerging class of smart musical instruments (SMIs) can address this challenge. Thanks to direct internet connectivity and embedded processing, SMIs can send requests to repositories and reproduce the retrieved content for improvisation, composition, or learning purposes. We present a smart guitar prototype that enables song retrieval from large online music databases using criteria that differ from conventional music search and were derived from interviews with 30 guitar players. We investigate three interaction methods, coupled with four search criteria (tempo, chords, key, and tuning), that exploit the intelligent capabilities of the instrument: (i) keyword-based retrieval using an embedded touchscreen; (ii) cloud computing, where recorded content is transmitted to a server that extracts relevant audio features; (iii) edge computing, where the guitar detects audio features and sends the request directly. Overall, the evaluation of these methods with beginner, intermediate, and expert players showed a strong appreciation for the direct connectivity of the instrument with an online database and for a search approach based on the actual musical content rather than on conventional textual criteria, such as song title or artist name.
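As a rough illustration of the edge-computing interaction described in the abstract, the sketch below shows how a query built from features extracted on the instrument might be sent to an online repository. The endpoint URL, parameter names, and license filter are hypothetical placeholders for this example only; they are not the actual API used by the prototype.

# Minimal sketch of an edge-computing style query: audio features are
# extracted on the instrument, so only compact descriptors are transmitted.
# NOTE: the URL and parameter names are hypothetical, not the paper's API.
import requests

def query_music_repository(tempo_bpm, chords, key, tuning):
    """Query an online music database by musical content rather than by
    textual metadata such as song title or artist name."""
    params = {
        "tempo": tempo_bpm,          # e.g. 120 beats per minute
        "chords": ",".join(chords),  # e.g. "Am,F,C,G"
        "key": key,                  # e.g. "A minor"
        "tuning": tuning,            # e.g. "standard" or "drop D"
        "license": "cc",             # restrict results to Creative Commons tracks
    }
    response = requests.get("https://example.org/api/tracks/search",
                            params=params, timeout=10)
    response.raise_for_status()
    return response.json()  # candidate tracks for playback on the instrument

if __name__ == "__main__":
    # Placeholder call; with a real endpoint this would return matching tracks.
    tracks = query_music_repository(120, ["Am", "F", "C", "G"], "A minor", "standard")
    print(tracks)

In the cloud-computing variant described in the abstract, the instrument would instead upload the recorded audio and the server would extract these features before performing the same kind of search.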