Daly, Ian and Williams, Duncan and Hallowell, James and Hwang, Faustina and Kirke, Alexis and Malik, Asad and Weaver, James and Miranda, Eduardo and Nasuto, Slawomir J (2015) Music-induced emotions can be predicted from a combination of brain activity and acoustic features. Brain and Cognition, 101. pp. 1-11. DOI https://doi.org/10.1016/j.bandc.2015.08.003
Abstract
It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions will be induced in a given individual by a piece of music. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 5% of the variance of the participants' music-induced emotions can be predicted by their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, the combination of measures of brain activity and acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracies than either feature type alone (p < 0.01).
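The abstract describes fitting regression models on a combination of EEG-derived and acoustic features, then scoring predictions by their Pearson correlation with the reported emotional responses. A minimal sketch of that evaluation pipeline is below; the feature names, dimensions, and synthetic data are illustrative assumptions, not the paper's actual features or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-trial features (dimensions are hypothetical):
# EEG features (e.g. band power per electrode) and acoustic descriptors
# (e.g. tempo, spectral centroid). Not the paper's actual feature set.
n_trials = 200
eeg_feats = rng.normal(size=(n_trials, 8))
acoustic_feats = rng.normal(size=(n_trials, 4))

# Combine both feature types into one design matrix, as the abstract
# reports doing, and simulate a noisy emotion rating that depends on both.
X = np.hstack([eeg_feats, acoustic_feats])
true_w = rng.normal(size=X.shape[1])
y = X @ true_w + rng.normal(scale=5.0, size=n_trials)  # heavy noise

# Fit ordinary least-squares regression on a training split.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]
Xb_train = np.hstack([X_train, np.ones((len(X_train), 1))])  # intercept
w, *_ = np.linalg.lstsq(Xb_train, y_train, rcond=None)

# Predict held-out ratings and score with Pearson correlation between
# actual and predicted responses, the metric quoted in the abstract.
y_pred = np.hstack([X_test, np.ones((len(X_test), 1))]) @ w
r = np.corrcoef(y_test, y_pred)[0, 1]
print(f"Pearson r between actual and predicted responses: {r:.3f}")
```

With real EEG and music data the correlations are small (up to r = 0.234 in the paper), reflecting the noise and non-stationarity the abstract notes.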
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Brain; Humans; Electroencephalography; Acoustic Stimulation; Brain Mapping; Emotions; Auditory Perception; Music; Adolescent; Adult; Aged; Middle Aged; Female; Male; Young Adult |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 27 May 2021 12:14 |
| Last Modified: | 30 Oct 2024 20:32 |
| URI: | http://repository.essex.ac.uk/id/eprint/25442 |