Rhodes, Chris; Allmendinger, Richard; Climent, Ricardo (2022) Classifying Biometric Data for Musical Interaction Within Virtual Reality. In: Artificial Intelligence in Music, Sound, Art and Design. EvoMUSART 2022. Lecture Notes in Computer Science (pp. 385-400). Springer International Publishing: Cham, Switzerland.
Text: Rhodes_OpenAccess.pdf - Accepted Version (15MB)
Abstract
Since 2015, commercial gestural interfaces have widened accessibility for researchers and artists to use novel Electromyographic (EMG) biometric data. EMG data measures muscular amplitude and allows us to enhance Human-Computer Interaction (HCI) by providing natural gestural interaction with digital media. Virtual Reality (VR) is an immersive technology capable of simulating the real world and abstractions of it. However, current commercial VR technology is not equipped to process and use biometric information. Using biometrics within VR allows for better gestural detailing and the use of complex custom gestures, such as those found within instrumental music performance, compared to the optical sensors used for gesture recognition in current commercial VR equipment. However, EMG data is complex, and machine learning must be used to employ it. This study uses a Myo armband to classify four custom gestures in Wekinator and observe their prediction accuracies across gesture representations (including or omitting signal onset) to compose music within VR. Results show that specific regression and classification models are the most accurate, depending on gesture representation type, when classifying four music gestures for advanced music HCI in VR. We apply and record our results, showing that EMG biometrics are promising for future interactive music composition systems in VR.
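The sketch below illustrates the general idea of the pipeline described in the abstract: windowed multi-channel EMG features classified into four gesture classes, with and without signal onset. It is not the authors' method; the paper uses a Myo armband streaming into Wekinator, whereas this sketch substitutes synthetic data, a mean-absolute-value feature per channel, and a scikit-learn k-NN classifier, all of which are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's Wekinator pipeline): 8-channel EMG windows
# are reduced to per-channel mean absolute value (MAV) features and classified
# into four gesture classes. Data, window length, feature and model choices
# are assumptions for illustration only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
N_CHANNELS, WINDOW, WINDOWS_PER_GESTURE, N_GESTURES = 8, 50, 100, 4

def synthetic_emg(gesture_id):
    """Fake EMG window: amplitude/offset pattern differs per gesture class."""
    base = 0.2 + 0.15 * gesture_id
    return base * rng.standard_normal((WINDOW, N_CHANNELS)) + 0.05 * gesture_id

def mav_features(window, skip_onset=0):
    """Mean absolute value per channel, optionally omitting onset samples."""
    return np.abs(window[skip_onset:]).mean(axis=0)

for skip in (0, 10):  # 0 = include signal onset; 10 = omit first 10 samples
    X = np.array([mav_features(synthetic_emg(g), skip)
                  for g in range(N_GESTURES)
                  for _ in range(WINDOWS_PER_GESTURE)])
    y = np.repeat(np.arange(N_GESTURES), WINDOWS_PER_GESTURE)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"skip_onset={skip}: accuracy={acc:.2f}")
```

In practice the comparison of "gesture representation type" in the paper concerns how the gesture signal is presented to the learner (e.g. with or without its onset); the loop above mimics that comparison only in shape, not in substance.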
Type: | Proceedings paper
---|---
Title: | Classifying Biometric Data for Musical Interaction Within Virtual Reality
Event: | 11th International Conference, EvoMUSART 2022, Held as Part of EvoStar 2022
ISBN-13: | 9783031037887
Open access status: | An open access version is available from UCL Discovery
DOI: | 10.1007/978-3-031-03789-4_25
Publisher version: | https://doi.org/10.1007%2F978-3-031-03789-4_25
Language: | English
Additional information: | This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.
Keywords: | EMG, Interactive music, Machine learning, Music composition, Myo, Biometrics, Wekinator, VR, Virtual reality
UCL classification: | UCL > Provost and Vice Provost Offices > School of Education > UCL Institute of Education > IOE - Culture, Communication and Media
URI: | https://discovery.ucl.ac.uk/id/eprint/10154935



