An Interactive Music Playlist Generator that Responds to User Emotion and Context

Jonathan Weinel

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper aims to demonstrate the mechanisms of a music recommendation system, and an accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual’s emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon ‘how the user is feeling’ and ‘what the user is doing’ by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature in conjunction with ambient light, temperature and the Global Positioning System (GPS) could be used, to a degree, to infer one’s current situation and corresponding mood.
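To illustrate the kind of pipeline the abstract describes (sensor readings → inferred mood/context → playlist), the following is a minimal sketch. It is not taken from the paper: the feature set, the rule-based `infer_state` stand-in for the machine-learning model, the thresholds, and the song catalogue are all hypothetical and chosen only for illustration.

```python
# Hypothetical sketch of an emotion/context-aware playlist generator.
# The sensor features, thresholds, states, and catalogue below are illustrative
# assumptions; the paper's actual model uses machine learning, not fixed rules.
from dataclasses import dataclass
from typing import List


@dataclass
class SensorSnapshot:
    heart_rate_bpm: float      # physiological
    skin_temp_c: float         # physiological
    ambient_light_lux: float   # contextual
    ambient_temp_c: float      # contextual
    speed_mps: float           # derived from successive GPS positions


def infer_state(s: SensorSnapshot) -> str:
    """Rough rule-based stand-in for a learned mood/context classifier."""
    if s.speed_mps > 2.5 and s.heart_rate_bpm > 120:
        return "exercising"
    if s.ambient_light_lux < 50 and s.heart_rate_bpm < 65:
        return "relaxing"
    return "neutral"


# Hypothetical catalogue mapping inferred states to candidate tracks.
CATALOGUE = {
    "exercising": ["High-tempo track A", "High-tempo track B"],
    "relaxing": ["Ambient track C", "Ambient track D"],
    "neutral": ["Pop track E", "Pop track F"],
}


def generate_playlist(s: SensorSnapshot, length: int = 10) -> List[str]:
    """Build a playlist of the requested length from the inferred state."""
    tracks = CATALOGUE[infer_state(s)]
    # Cycle the candidate tracks until the playlist reaches the target length.
    return (tracks * (length // len(tracks) + 1))[:length]


if __name__ == "__main__":
    snapshot = SensorSnapshot(130.0, 34.5, 12000.0, 22.0, 3.1)
    print(generate_playlist(snapshot, length=4))
```

In the system described by the paper, the rule-based classifier above would be replaced by a trained machine-learning model operating on the real-time sensor stream, with the GUI exposing the resulting playlist to the user.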
Original language: English
DOIs
Publication status: Published - 12 Jul 2016
Externally published: Yes
Event: Electronic Visualisation and the Arts (EVA 2016)
Duration: 7 Dec 2016 → …

Conference

Conference: Electronic Visualisation and the Arts (EVA 2016)
Period: 7/12/16 → …
