Cardiac Response to Live Music Performance: Computing Techniques for Feature Extraction and Analysis

Elaine Chew1, Peter Taggart2, Pier Lambiase2
1CNRS-STMS (IRCAM), 2University College London


Abstract

Strong emotions and mental stress have been linked to deadly ventricular arrhythmias. Music evokes strong emotion through the regulation of tension and release and the modulation of change and transition. We exploit this in a novel study in which patients with implanted cardiac defibrillators listen to live music performance (the live context heightens emotional response) so that its impact on cardiac electrophysiology can be examined. We make continuous recordings directly from the heart muscle whilst the patients are listening to music, which is concurrently recorded in a separate stream. The patients also provide annotations of felt tension and perceived boundaries/transitions. The advantage of using music over other stimuli is the quantifiability of its expressive parameters; these parameters lend themselves readily to analysis and direct comparison with the corresponding cardiac signal time series. Here, we describe the computing techniques for representing and automatically analyzing the recorded cardiac and music information. The cardiac reaction is measured by the action potential duration (APD), an electrical parameter of major importance in the development of serious and potentially fatal rhythm disturbances. The APD is approximated by the activation recovery interval (ARI). We describe the steps to automatically extract the ARI and its associated features from the intracardiac recordings. The music parameters of interest are the time-varying loudness, tempo, and harmonic tension. We also describe the computational music analysis techniques for extracting and representing these properties. The synchronized information layers thus constructed allow for systematic and quantitative analysis of cardiac response to performed music, so as to provide insight into the musical events that induce stress.
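
To make the ARI step concrete, the sketch below illustrates one standard approach, the Wyatt method, in which activation time is taken at the steepest negative slope of the QRS complex and repolarization time at the steepest positive slope of the T wave in a unipolar electrogram. This is a minimal illustrative sketch, not the paper's implementation: the function name, the pre-segmented beat windows, and the use of NumPy are all assumptions.

    import numpy as np

    def activation_recovery_interval(egm, fs, qrs_win, t_win):
        """Estimate the ARI (ms) for one beat of a unipolar electrogram.

        Wyatt method: activation time (AT) at the minimum dV/dt within the
        QRS window; repolarization time (RT) at the maximum dV/dt within
        the T-wave window; ARI = RT - AT.

        egm     : 1-D array of electrogram samples (mV)
        fs      : sampling rate (Hz)
        qrs_win : (start, end) sample indices bracketing the QRS complex
        t_win   : (start, end) sample indices bracketing the T wave
        """
        dvdt = np.gradient(egm) * fs                              # slope, mV/s
        at = qrs_win[0] + np.argmin(dvdt[qrs_win[0]:qrs_win[1]])  # steepest QRS downstroke
        rt = t_win[0] + np.argmax(dvdt[t_win[0]:t_win[1]])        # steepest T-wave upstroke
        return (rt - at) * 1000.0 / fs                            # samples -> milliseconds

Beat-by-beat application of such a function yields the ARI time series that is then aligned with the music feature streams.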
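
The music-side features can likewise be sketched with off-the-shelf tools. The snippet below assumes the librosa library (not necessarily the authors' toolchain) and extracts a time-varying loudness proxy (short-time RMS energy in dB; a perceptual loudness model such as EBU R128 could be substituted) together with a tempo estimate and beat times. Harmonic tension has no off-the-shelf equivalent and is omitted here.

    import numpy as np
    import librosa

    def loudness_and_tempo(audio_path, hop_length=512):
        """Return frame times, a loudness proxy (dB), a global tempo
        estimate (BPM), and beat times for a recorded performance."""
        y, sr = librosa.load(audio_path, sr=None, mono=True)
        # Short-time RMS energy mapped to dB as a coarse loudness curve
        rms = librosa.feature.rms(y=y, hop_length=hop_length)[0]
        loudness_db = librosa.amplitude_to_db(rms, ref=np.max)
        times = librosa.frames_to_time(np.arange(len(rms)), sr=sr,
                                       hop_length=hop_length)
        # Beat tracking: global tempo plus beat positions; a local tempo
        # curve can be read off as 60 / inter-beat interval
        tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr,
                                                     hop_length=hop_length)
        beat_times = librosa.frames_to_time(beat_frames, sr=sr,
                                            hop_length=hop_length)
        return times, loudness_db, tempo, beat_times

Resampling the loudness and tempo curves onto the cardiac timeline then gives the synchronized information layers described above.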