Abstract
This paper demonstrates the mechanisms of a music recommendation system, and its accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual's emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon 'how the user is feeling' and 'what the user is doing' by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature, in conjunction with ambient light, temperature and Global Positioning System (GPS) data, could be used to infer, to a degree, one's current situation and corresponding mood.
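The paper does not publish its model, but the kind of inference the abstract describes can be illustrated with a minimal, rule-based stand-in. Everything below, including the class name, thresholds, and mood labels, is a hypothetical sketch rather than the authors' method:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Hypothetical bundle of the sensor channels named in the abstract."""
    heart_rate_bpm: float      # physiological
    skin_temp_c: float         # physiological
    ambient_light_lux: float   # contextual
    ambient_temp_c: float      # contextual

def infer_context(r: SensorReading) -> str:
    """Very coarse heuristic mapping raw readings to a mood/context label.
    The thresholds are illustrative assumptions, not values from the paper;
    the real system applies machine learning to such features instead."""
    if r.heart_rate_bpm > 120:
        return "energetic"
    if r.ambient_light_lux < 50 and r.heart_rate_bpm < 80:
        return "relaxed"
    return "neutral"

# Example: high heart rate in a bright room reads as an energetic context.
print(infer_context(SensorReading(130, 36.5, 800.0, 22.0)))
```

A learned classifier would replace the hand-written rules, but the input/output shape, sensor features in and a coarse context label out, stays the same.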
| Original language | English |
| --- | --- |
| DOIs | |
| Publication status | Published - 12 Jul 2016 |
| Externally published | Yes |
| Event | Electronic Visualisation and the Arts (EVA 2016) - Duration: 7 Dec 2016 → … |
Conference
| Conference | Electronic Visualisation and the Arts (EVA 2016) |
| --- | --- |
| Period | 7/12/16 → … |