
University of Coimbra - Portugal
Faculty of Science and Technology

Department of Informatics Engineering

MUSIC PSYCHOLOGY

António Pedro Oliveira

[email protected]

31st January 2007


Contents

1 Music Psychology
  1.1 Emotions and Music
    1.1.1 Emotions Theories
    1.1.2 Emotions Modelling
    1.1.3 Psychophysiology
    1.1.4 Cognitive Sciences
    1.1.5 Neuroscience
    1.1.6 Music Features
  1.2 Music Perception
    1.2.1 Melody
    1.2.2 Harmony
    1.2.3 Rhythm
    1.2.4 Instrumentation
    1.2.5 Tools
  1.3 Music Cognition
  1.4 Music Performance
  1.5 Music Theory
  1.6 Music Therapy

Bibliography


1 Music Psychology

Music Psychology is a sub-field of psychology that seeks to identify the main processes involved in our brain when we listen to music. Its central purpose is to understand how humans perceive music. Many research groups are involved, focusing on emotional, cognitive and physical processes: Affective Science (Emotions and Music), Music Cognition, Music Perception, Music Performance, Music Theory and Music Therapy (some of these can be seen in figure 1.1). The subsequent sections give a systematic overview and provide insight into the relations between music and emotions, drawn from the main works consulted. Most of these works do not take into account all the aspects that affect emotional perception in music. For this reason, an integrated review is made; nevertheless, the focus is on musical content and the emotions induced on listeners, because this is the main object of study of the thesis in which this work is embedded. Besides musical content, other aspects, like listener variables (cognitive, emotional and physical states) and environment variables (context and time of day), also influence emotion induction. Consequently, some variables that influence the induction process will not be controlled. Before describing works of Music Psychology focused on musical content, it is important to have an integrated view of the interdisciplinarity of Music Psychology and the study of sound (figure 1.1, according to [Hod96]). From this figure we can see that the sound phenomenon is studied from the perspectives of diverse disciplines: Psychology, Education, Music, Sociology, Anthropology, Biology, Philosophy, Physics and other sub-disciplines (Music Psychology being one of them).

1.1 Emotions and Music

Relations between musical features and emotions or psychophysiological responses are presented in this section, based on the study of works on Music Psychology, Psychophysiology, Cognitive Science, Neuroscience and Psychology, among others. One aspect to keep in mind is that the dynamics of a music performance seems to be closely related to the emotional state of the musician. The human voice is the principal instrument for expressing musical dynamics, and humans are very sensitive to its sound. Therefore, there is a close relationship between voice (speech) and music, which is manifested in some of the works explored in the following paragraphs. The next sub-sections cover emotions theories, modelling of emotions elicited by musical stimuli, psychophysiological, cognitive and neurological aspects of emotion induction through musical stimuli and, finally, the study of relations between musical content and emotions.


Figure 1.1: Spaces of connection between Psychology, Music and other scientific disciplines [Hod96]

1.1.1 Emotions Theories

This section gives background on the emotions theories central to Music Psychology research. There is no clear consensus on the order in which brain processes occur after a specific stimulus (event). Table 1.1 presents the main emotions theories and the flow of processes in our body (particularly our brain) from the moment the event is perceived.

Theory: emotions acquisition flow

James-Lange: Event → Physiological Arousal → Interpretation → Emotion
Cannon-Bard: Event → Physiological Arousal and Emotion (simultaneously)
Schachter-Singer: Event → Physiological Arousal → Reasoning → Emotion
Lazarus: Event → Thought → Emotion and Physiological Arousal (simultaneously)
Facial Feedback: Event → Facial Changes → Emotion

Table 1.1: Emotions theories and their acquisition flows

Shanley [Sha04] presented a summary of the main emotions theories, their basic emotions and the approaches used, based on the work of [OT90] (figure 1.2). From this summary we can infer 7 central emotions: anger, happiness, fear, sadness, surprise, disgust and love.

Figure 1.2: Emotions theories as presented in [OT90]

Another thing that has to be clarified is what distinguishes emotions from moods, and both from other affective states. Scherer [Sch00b] suggested 5 affective states: emotions, moods, interpersonal stances, preferences and affect dispositions, which are presented in figure 1.3. The main differences between these states are their duration and intensity.

Russell [Rus89] proposed a 2 Dimensional Emotion Space (valence and arousal) to categorize 28 emotions, as can be seen in figure 1.4. Multidimensional scaling (MDS) methods were used to select these emotions. The majority of works in Affective Computing use a 2 Dimensional Emotion Space like this one; nevertheless, an intensity dimension is sometimes also used.
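Russell's space can be pictured as points in a valence-arousal plane. The following sketch shows how an arbitrary (valence, arousal) point can be mapped back to the nearest labelled emotion; the coordinates below are illustrative assumptions, not Russell's published values.

```python
from math import hypot

# Hypothetical coordinates on the valence-arousal plane, each in [-1, 1].
EMOTIONS = {
    "happy":   ( 0.8,  0.5),
    "excited": ( 0.6,  0.8),
    "calm":    ( 0.6, -0.6),
    "sad":     (-0.7, -0.4),
    "angry":   (-0.6,  0.8),
    "bored":   (-0.5, -0.7),
}

def nearest_emotion(valence, arousal):
    """Return the labelled emotion closest to a point in the 2D space."""
    return min(EMOTIONS, key=lambda e: hypot(EMOTIONS[e][0] - valence,
                                             EMOTIONS[e][1] - arousal))

print(nearest_emotion(0.7, 0.6))   # a point in the high-valence, high-arousal quadrant
```

This nearest-neighbour reading of the space is one simple way a categorical label can be recovered from continuous dimensional ratings.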

1.1.2 Emotions Modelling

Figure 1.3: Scherer's affective states [Sch00b]

Figure 1.4: Russell's emotions categorization [Rus89]

Schubert [Sch99] studied the relations between emotions and musical features (melodic pitch, tempo, loudness, timbral sharpness and texture) using a 2 Dimensional Emotion Space. This study focused on how to measure emotions expressed by music and on which musical features have an effect on the valence and arousal of emotions. He developed an application to measure emotions in a 2 Dimensional Emotion Space and compared the results with similar works. The work has the following limitations: only western tonal music was used; the study was confined to a 2 Dimensional Emotion Space (valence and arousal), with no other dimensions analysed; the relationship between emotion and music was explained only at a philosophical level; only emotions expressed by music were considered (not those felt by listeners or performers); only self-report measures were used; and the study addressed the relationship between musical features and emotional response, so human variables were not examined. Three principal self-report measures of emotional response to music were used: open-ended responses, checklists (ranking and matching) and scales (ranking, matching and rating). A summary of relations between the arousal and valence of emotions and musical features is presented in table 1.2. An extensive review of previous studies of the relations between musical features and emotions was done to support the study of continuous emotional responses to musical stimuli. Different kinds of musical stimuli were covered in this review: isolated non-musical sounds, isolated musical sounds, specially composed melodies, pre-existing melodies, specially composed pieces, pre-existing pieces with modification and pre-existing pieces. Figure 1.5 presents more details about these musical stimuli, namely the kind of control, validity and stimulus. Time series analysis was used to model the dynamics of the emotional response to music. He inferred, for example, that changes in loudness and tempo influence arousal, and changes in melodic pitch influence valence. Later, Schubert [Sch04b] presented some methods used to establish continuous relations between changes in musical features and emotional response: serial correlation, psychoacoustic measurement and time series analysis.
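The serial-correlation idea, relating a time-varying musical feature to a continuous emotional response, can be sketched on synthetic data. The series, the lag and the noise level below are invented for illustration; real studies use extracted features and continuous listener ratings.

```python
import numpy as np

# Toy series: per-sample "loudness" of an excerpt and a continuous
# "arousal" rating that follows loudness with a 2-sample lag plus noise.
rng = np.random.default_rng(0)
loudness = rng.normal(size=200)
arousal = np.roll(loudness, 2) + 0.1 * rng.normal(size=200)

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

# Scan candidate lags and keep the one where loudness best predicts arousal.
best = max(range(6), key=lambda k: lagged_corr(loudness, arousal, k))
print(best)
```

The recovered lag is the kind of quantity a time-series analysis of continuous responses is after: how long after a feature change the emotional response moves.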

Musical feature: high valence / low valence / high arousal / low arousal

Loudness: - / - / high / low
Average pitch: high / low / high / low
Pitch range: - / - / high / low
Pitch variation: high / low / high / low
Melodic contour variation: rising / falling / rising / falling
Register: high / low / - / -
Mode: major / minor / - / -
Timbre: piano, strings, few harmonics, bright, soft / brass, low-register instruments, timpani, harsh, violin, woodwind, voice / brass, low-register instruments, timpani, harsh, violin, bright, strings / woodwind, voice, few harmonics, soft
Harmony: consonant / dissonant, melodic or harmonic sequence, melodic appoggiatura / complex, dissonant, diminished seventh chord / -
Tempo: - / - / fast / slow
Articulation: staccato / legato / non-legato with sharp contrasts between long and short notes, staccato / legato
Note onset: - / - / rapid onset / slow onset
Vibrato: intense / deep / fast / deep and intense
Rhythm: rhythmic activity, smooth, flowing motion / rough / sophisticated, rough, rhythmic activity, smooth, flowing motion / -
Meter: - / - / triple / duple

Table 1.2: Relations between arousal and valence of emotions and musical features
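Rows of table 1.2 can be read as simple rules. The following deliberately reduced sketch maps a few boolean feature judgments to a valence-arousal quadrant; the boolean inputs, the reduction to four quadrants and the quadrant labels are assumptions made for illustration, not part of the reviewed studies.

```python
# Rule-of-thumb lookup derived from a few rows of table 1.2:
# loudness and tempo drive arousal; mode and contour drive valence.
def emotion_quadrant(loud, fast, major_mode, rising_contour):
    arousal = "high" if (loud or fast) else "low"
    valence = "high" if (major_mode or rising_contour) else "low"
    return {
        ("high", "high"): "happy/excited",
        ("low", "high"): "angry/fearful",
        ("high", "low"): "calm/serene",
        ("low", "low"): "sad/bored",
    }[(valence, arousal)]

print(emotion_quadrant(loud=True, fast=True, major_mode=True, rising_contour=True))
# -> happy/excited
```

A real system would of course use continuous feature values and learned weights rather than hand-set boolean rules.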

Figure 1.5: Categorization of musical stimuli used by Schubert [Sch99]

Likewise, Korhonen [Kor04] tried to model people's perception of emotion in music. Models to estimate emotional appraisals of musical stimuli were reviewed (e.g. [Sch99, LO03]), and system identification techniques were applied to build linear models of the emotional appraisals. Methods of measuring and modelling emotional appraisals of music were presented. Korhonen selected, estimated and validated ARX (Auto-Regression with eXtra inputs) and State-Space models. These models predicted the output (valence and arousal) using 20 subsets of musical features as input. Musical features difficult to quantify meaningfully (e.g. rhythm) were not included in the mathematical model. He distinguished between global features (dynamic range, musical genre, etc.) and local features (loudness, pitch, etc.). Dynamics, mean pitch, pitch variation, timbre, harmony, tempo and texture were the musical properties used in this study. Two tools (Marsyas [TC00] and PsySound [Cab99]) were used to extract features related to these properties, as can be seen in figure 1.6. Other features were derived from them: articulation was described by sharpness, vibrato by pitch variation, register by mean pitch and timbre, mode by harmony, note onset by sharpness and melodic contour by pitch variation and mean pitch. Both Schubert and Korhonen studied the dynamics of the relations between musical stimuli and listeners' emotional expression.
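A first-order ARX model of the general kind Korhonen estimated can be sketched by least squares. The data below are synthetic, and the single input and model order are simplifications of his actual multi-feature setup.

```python
import numpy as np

# Fit y[t] = a * y[t-1] + b * u[t] + e[t], where u is a musical feature
# (e.g. loudness) and y is the arousal rating. Synthetic ground truth:
# a = 0.7, b = 0.5.
rng = np.random.default_rng(1)
u = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.7 * y[t - 1] + 0.5 * u[t] + 0.05 * rng.normal()

# Build the regressor matrix [y[t-1], u[t]] and solve for (a, b).
X = np.column_stack([y[:-1], u[1:]])
a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(round(a, 2), round(b, 2))  # estimates close to the true 0.7 and 0.5
```

The autoregressive term is what lets the model capture the sluggish, history-dependent character of continuous emotional responses.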

Lavy [Lav01] presented a model of emotional response to music, considering music as a sound, a human utterance and a narrative in a context. Expectancy, energy, timbre, patterns (Gestalt theory [Wer23]) and intensity are key aspects when talking about sound. What can be inferred is that even everyday sounds from the environment induce emotions in us; alarms, animals, cars and machines are some examples. The human voice can encode different emotions, which can be observed in its frequency spectrum.

Bosse et al. [BJT06] used a simulation model to study the dynamics of emotions elicited by music. This work takes Damasio's theory of core consciousness [Dam00] as background. Damasio described an emotional state as a collection of neural dispositions in the brain. These dispositions are activated when stimulated, e.g. when listening to music, and trigger body responses (e.g., shivers). According to Damasio's theory, a stimulus (music) leads to an emotion, then to a feeling and finally to feeling a feeling (core consciousness).


Figure 1.6: Korhonen's musical features [Kor04]

1.1.3 Psychophysiology

Sloboda [Slo91] studied the occurrence of physical symptoms (shivers, tears and a lump in the throat) when listening to music. He found that syncopation, enharmonic changes and melodic appoggiaturas are related with weeping and piloerection. Tears were induced by music with sequences and melodic appoggiaturas, and shivers were evoked by new or unexpected harmonies.

Bradley and Lang [BL00] studied emotional reactions to natural sounds. They compared the two-dimensional distributions of arousal and pleasure for sound and picture stimuli, and also assessed subjects' physiological reactions, measured using facial electromyography (EMG), heart rate and skin conductance. The results showed that sounds activate motivational circuits implicated in emotional expression, and that emotional sounds produce physiological patterns similar to those induced by emotional pictures. This means that some findings from the study of pictures as emotional stimuli can also be applied to sound as an emotional stimulus.

Klein [Kle03] had participants annotate a 2 dimensional emotion space according to their emotional responses to musical stimuli. Psychophysiological measurements of skin conductance and corrugator EMG were also used. He found that corrugator EMG is negatively correlated with pleasantness, and skin conductance is positively correlated with activation.


1.1.4 Cognitive Sciences

Wedin [Wed72] stimulated 49 subjects with 20 music samples, which were labelled using a group of 150 words. Nonmetric multidimensional scaling was then applied to the rank correlations of the subjects' answers, and 3 dimensions were extracted and labelled: tension/energy (arousal), happiness (valence) and solemnity. It was suggested that tension is related to dynamics (volume), dissonant-consonant harmonies and rhythmic complexity, and happiness to tempo, pitch and key mode (major or minor). Bigand et al. [BFL05] tested listeners' emotional responses to 27 musical samples1. These responses were stored in a matrix of emotional dissimilarity that was analysed using multidimensional scaling (MDS) methods. The purpose was to determine the amount of time needed to originate (different) emotional responses, and it was shown that less than 1 second of music is enough.
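The MDS step in these studies recovers spatial coordinates from a dissimilarity matrix. The sketch below uses the classical (metric) variant, which is simpler than the nonmetric algorithms actually used, and an invented dissimilarity matrix among 4 excerpts.

```python
import numpy as np

# Invented emotional dissimilarities: excerpts 0,1 are alike, 2,3 are alike,
# and the two pairs are far apart.
D = np.array([[0.0, 1.0, 4.0, 4.2],
              [1.0, 0.0, 4.1, 4.0],
              [4.0, 4.1, 0.0, 1.1],
              [4.2, 4.0, 1.1, 0.0]])

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D ** 2) @ J              # double-centred squared dissimilarities
vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
order = np.argsort(vals)[::-1][:2]       # the 2 largest eigenvalues
coords = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# The recovered 2D layout places excerpts 0 and 1 close together,
# far from excerpts 2 and 3.
print(np.round(coords, 2))
```

Labelling the recovered axes (e.g. as arousal or valence) is then an interpretive step, as in Wedin's study.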

Fischer and Krehbiel [FK01] studied college students' ratings of music samples in a 2 Dimensional Emotion Space (valence and arousal). They found that: the valence and arousal dimensions are independent; listeners' emotional perception and the emotional intentions of composers and performers are partially independent; and music-elicited feelings are similar to the intent of composers and performers.

Ritossa and Rickard [RR04] studied the roles of pleasantness and liking in predicting emotions expressed by music. Songs were rated by 121 subjects according to pleasantness, liking, arousal and familiarity. Positive correlations were found between pleasantness and liking, and between familiarity and liking, and pleasantness was found to be the better predictor. This study confirmed the usefulness of pleasantness (valence) and arousal as dimensions for classifying emotions in music. Familiarity was found to be equally important in predicting the emotions expressed in music.

Plack [Pla06] tested the role of the performance medium in listeners' emotional experience. 143 participants were divided into 5 groups using different performance media: voice, wind ensemble, marching band, piano and popular dance music. The aim was to study listeners' capacity to evaluate the emotional content of performances from these groups. Results showed a strong relationship between the groups and their corresponding excerpts.

Vickhoff and Malmgren [VM04] used action-perception theory to understand the communication of emotions in music. This theory is based on 3 principal constructs: present-moment perception, implicit knowledge and imitation. One thing to keep in mind is that there is a 2-way connection between emotions and movements. Empathy is important to understand the feelings of other people. These feelings can be categorized into 3 groups: categorical (happiness, fear, etc.), vitality (crescendo, pulsing and other kinetic terms) and relational (being loved, esteemed, etc.). There are 3 empathy catalysts: similarity, familiarity and cue salience. Entrainment is another important concept, which can be seen as a route to emotional contagion. In conclusion, the following model is proposed: emotion → motor activity → audio information → motor activity → emotion. For example, the emotional state of a composer is shared through dynamic musical patterns when he is performing. The listener's sensorimotor scheme then resonates with these dynamics, which are transformed into an emotional state.

1 http://www.u-bourgogne.fr/LEAD/people/bigand/timeemo.html


1.1.5 Neuroscience

Blood and Zatorre [BZ01] used positron emission tomography (PET) to study the brain regions involved in pleasant emotional responses to music. It was found that these responses are related to brain regions involved in reward/motivation, chills, emotion and arousal: the ventral striatum, midbrain, amygdala, orbitofrontal cortex and ventral medial prefrontal cortex. Blood et al. [BZBE99] also used PET and found that activity in paralimbic and neocortical regions is associated with musical dissonance and emotional (pleasant/unpleasant) responses to music. However, they also suggest that different emotions have different neural substrates and that other components of music perception are processed in other brain regions.

Panksepp and Bernatzky [PB02] reviewed the most important works studying the function of emotional systems during music appreciation, as well as the influence of music on memory, mood and brain activity. They proposed that subcortical emotional circuits affect the emotional impact of music. It seems that the dynamics of body movements (dance) are related to the expression of emotions perceived in music. As music expression and perception usually entail many psychological and psychophysiological processes, music is suitable as a therapeutic tool in which emotional expression has a central role.

J. and R. Goguen [GG04] studied musical qualia (non-objective aspects of experience) and inferred that consciousness has a hierarchical decomposition, where each part has a specific emotional tone. They drew an analogy between emotions induced by music and emotions induced by other phenomena; for instance, a correct anticipation is rewarded with pleasure. Panksepp [Pan98] suggested that basic emotions and motivations can be the foundation of consciousness.

Koelsch [Koe05] used imaging techniques to study emotions elicited by musical stimuli. He suggested that the processing of emotion has a temporal dynamic, so the dynamics of emotions is a good way to describe emotional processes. He also studied emotional responses to unexpected musical events and found that irregular chords provoke changes in electrodermal activity and activate brain structures involved in emotion processing. Limbic and paralimbic structures are involved in the processing of music with emotional valence.

1.1.6 Music Features

Meyer [Mey56] studied the meaning of emotions in music by bringing some gestalt principles into music. He argued that music is meaningless when it is in a style unfamiliar to us. Some principles of pattern perception in music are presented in his book [Mey56]. He started by presenting musical characteristics that affect melodic, rhythmic and metric continuity (table 1.3). Next, he presented the principles of completeness and closure in melody, rhythm and harmony (table 1.4). The roles of musical structure and shape were studied, and some ideas were drawn from them: a sound stimulus is conceived as part of a structure; a weak or bad shape leads to tension; one way to weaken shapes is excessive similarity between them; pitch uniformity is characterized by equidistant series of tones; harmonic uniformity is characterized by equal vertical intervals, unchanging harmony or repetitive progressions; and formal completeness is characterized by texture. From this study, expectation seems to be a relevant musical characteristic for inducing emotions. Expressive variations in pitch, tempo, rhythm (rubato, vibrato, etc.), ornamentation and tonality are also important musical characteristics related to the induction of emotions. Meyer analyzed the structural characteristics of important music and their relation to emotional meaning in music, which can be very useful in the process of inducing emotions with music.

Musical law: music characteristics

Melodic continuity: delay; acceleration; contrast of parts; ornaments; shape; expectation of harmonic structures
Rhythmic continuity: pulse; meter; rhythm; accent; hierarchical organization (iamb, anapest, trochee, dactyl, amphibrach); rhythmic reversals
Metric continuity: hierarchical organization; time signature; metric changes (hemiola); polymeters

Table 1.3: Meyer's laws of music continuity

Musical law: music characteristics

Melodic completeness and closure: tonality; instrument tessitura; higher-level analysis (Schenker analysis); relaxation of closure linked to lower pitches
Rhythmic completeness and closure: string of accented/unaccented
Metric completeness and closure: tonic/key

Table 1.4: Meyer's laws of music completeness and closure

Some of Meyer's ideas [Mey56] were applied in Lindstrom's work [Lin04], where he varied some musical features (melody, but also rhythm and harmony) to learn how music expresses emotions to listeners. The perception of both musical structure (instability, complexity and tension) and emotions in music (sadness, anger and expressivity) was assessed after accenting structures and stressing notes. Musical structure and emotions seem to be closely related. Performers showed that some notes are important for inducing emotions by varying articulation, loudness and timing. The following results were obtained: an accent on a tense note enhances anger; notes essential to the minor mode affect the perception of sadness, and the identification of the major mode affects the perception of happiness.

Gabrielsson and Lindstrom's [GL01] findings on the relations between happiness and sadness and musical features can be seen in table 1.5. It was also found that: major mode is associated with grace, serenity and solemnity; minor mode with tension, disgust and anger; staccato articulation with gaiety, energy, activity, fear and anger; legato articulation with sadness, tenderness, solemnity and softness; high loudness with joy, intensity, power, tension and anger; low loudness with sadness, softness, tenderness, solemnity and fear; high register (pitch level) with happiness, grace, serenity, dreaminess, excitement, surprise, potency, anger, fear and activity; and low register (pitch level) with sadness, dignity, solemnity, boredom and pleasantness.

Emotion: articulation / harmony / loudness / melodic range / melodic direction / mode / pitch level / rhythm / tempo / timbre

Sadness: legato / complex and dissonant / low / narrow / falling / minor / low / firm / slow / few harmonics, soft
Happiness: staccato / simple and consonant / high / wide / rising / major / high / regular or smooth / fast / few harmonics

Table 1.5: Gabrielsson and Lindstrom's relations between happiness and sadness and musical features

Shanley [Sha04] studied the influence of musical features on emotional types and emotional intensity. Mode, tempo and note density, pitch, loudness and texture were the musical features analysed, and fear, love, happiness and sadness were the emotional types analysed. Correlations between these variables were computed to infer the relations between emotional types and musical features. He reviewed the works of: Gabrielsson and Lindstrom [GL01] on tempo and emotions (figure 1.7); Scherer and Oshinsky [SO77] on texture and emotions (figure 1.8); and Juslin [Jus97a] on sound level, frequency spectrum, tone attack, tempo and articulation, and emotions (figure 1.9).

Figure 1.7: Gabrielsson and Lindstrom discoveries on tempo and emotion [Sha04]

Figure 1.8: Scherer and Oshinsky discoveries on texture and emotions [Sha04]

Figure 1.9: Juslin discoveries on musical features and emotions [Sha04]

Based on the analysis of experimental results and Maslow's hierarchy of needs, a hierarchy of the influence of musical features on emotional intensity was developed (figure 1.10). This work supported the theory that the brain responds to relative levels rather than absolute ones. On the whole, Shanley's findings on the relations between emotional types and musical features are the following: in minor mode, fear is related to a fast ascending melodic contour, high pitch levels, low-medium tempo and low note density; in major mode, fear is related to rapidly increasing tempo, medium loudness, medium tempos and medium texture; in major mode, love is related to a long descending melodic contour and tempo, low pitch levels, low-medium tempo and loudness, and low note density; in minor mode, love is related to long descending tempo, fast descending loudness, low-medium tempo, low note density and medium texture; happiness is related to major mode, a long ascending melodic contour, long ascending tempo, ascending texture levels, medium-high pitch and medium tempo; and sadness is related to minor mode, a long descending melodic contour, descending tempo, low pitch levels, low note density, low-medium tempo and medium loudness. Sadness and love have similar relations with musical features.

Figure 1.10: Hierarchy of the influence of musical features on emotional intensity [Sha04]

Collier and Hubbard [CH01] studied the effects of pitch, melodic contour direction and scalemode on happiness. It was found that happiness can be induced by music with ascending scales,major keys and high pitch.

Dalla Bella et al. [DBPRG01] evaluated the effects of tempo and mode on the recognition of emotions in music 2. This work suggests that tempo is more important than mode for making emotional judgments about music. Gagnon and Peretz [GP03] established that both tempo and mode affect the distinction between happy and sad musical excerpts, with tempo being the more salient. In addition, this work supports the idea that emotionally meaningful structural features (e.g. tempo and mode) are easy to isolate.

Scherer and Zentner [SZ01] proposed rules mapping musical features to emotions. They proposed the following factors that influence emotional perception when listening to music:

2 http://www.brams.umontreal.ca/plab/research/Stimuli/Dalla%20Bella%20et%20al%20(2001)/dallabella_2001_stimulis.html


➢ experienced emotion = structural features × performance features × listener features × contextual features;

➢ structural features = W1 (segmental features) × W2 (suprasegmental features);

➢ performance features = W3 (performer skills) × W4 (performer state);

➢ listener features = W5 (musical expertise) × W6 (stable dispositions) × W7 (mood state);

➢ contextual features = W7 (location) × W8 (event).
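Because the factors combine multiplicatively, a single weak factor suppresses the experienced emotion regardless of the others. A toy computation (the weights and component scores below are invented, with all scores in [0, 1]) makes this visible:

```python
# Scherer and Zentner's multiplicative combination rule, sketched with
# made-up component scores.
def experienced_emotion(structural, performance, listener, contextual):
    return structural * performance * listener * contextual

strong = experienced_emotion(0.9, 0.8, 0.9, 0.8)      # attentive listener
distracted = experienced_emotion(0.9, 0.8, 0.2, 0.8)  # weak listener factor
print(round(strong, 2), round(distracted, 2))  # -> 0.52 0.12
```

Dropping a single factor from 0.9 to 0.2 cuts the product by more than three quarters, which is the qualitative behaviour a multiplicative (rather than additive) rule predicts.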

They also studied the routes through which music generates emotions. These routes can be central (involving the central nervous system) or peripheral (involving the somatic and autonomic nervous systems). Appraisal processes, memory recall and empathy belong to the central route; motor expression manipulation (proprioceptive feedback) and the facilitation of the expression of pre-existing emotions belong to the peripheral route. Finally, 3 components of the measurement of affect were reviewed: cognitive changes, physiological arousal, and motor expressive behaviour and action tendencies.

Juslin and Vastfjall [JV06] proposed a model of 6 psychological mechanisms that can be used to induce emotions: arousal potential, evaluative conditioning, emotional contagion, mental imagery, episodic memory and musical expectancy. Later, Scherer [Sch04a] suggested new ways to measure affective states induced by music. Additionally, he suggested that the affects induced by music should be studied as feelings that integrate cognitive and physiological effects, and represented with production rules.

Kimura [Kim02] based his work on Juslin's study [Jus01] (figure 1.11) of the musical features used by performers to communicate emotions through music. The work used instrumental pieces of music to induce 7 emotions: fear, sadness, anger, tenderness, happiness, frustration and surprise. Violinists' expressions of sadness, tenderness and happiness were perceived by listeners with a success rate above 70%.

Krumhansl [Kru02] emphasized the significance of the relation between musical tension and relaxation and the expectations about the sounds played. Another idea to keep in mind is the relation between musical emotion and the cognition of musical structure. Tables 1.6, 1.7 and 1.8 present, respectively, relations between emotions and musical features, between emotions and psychophysiological responses, and between other music-related concerns and emotions. This work systematically reviewed results from disciplines like Psychophysiology, Cognitive Science and Affective Science.

Emotion: tempo / harmony / ranges of pitch / ranges of dynamics / rhythms

Sadness: slow / minor / constant / constant / -
Fear: rapid / dissonant / large / large / -
Happiness: rapid / major / constant / constant / dancelike

Table 1.6: Relations between emotions and musical features


Figure 1.11: Representation of musical features on a 2 dimensional emotion space [Jus01]

Thimm and Fischer [TF03] analysed the relationship between emotions induced by music and psychoacoustic features of music (dissonance, sharpness, multiplicity, tonalness and volume). 15 subjects rated music samples (jazz, classical, and music from the 17th to 20th centuries) in a 2 Dimensional Emotion Space (valence and arousal). From the analysis of the subjects' ratings it was concluded that activation is predicted by dissonance and/or loudness, while pleasantness is predicted by tonalness and multiplicity.

Berg and Wingstedt [BW05] used REMUPP [WLLB05] to investigate relations between musical parameters (articulation, harmony, loudness, melodic range, mode, pitch level, rhythm, tempo and timbre) and the emotions of happiness and sadness. Figure 1.12 presents the ranges of these parameters. The results of this work corroborated the findings of previous works: major mode, medium and bright instrumentation, fast tempo, short note lengths, high loudness and high register are associated with happiness; minor mode, dark instrumentation, slow tempo, long note lengths, low loudness and low register with sadness.

Webster and Weir [WW05] studied emotional responses to variations of mode, texture and


Emotion     Heart rate   Blood pressure   Skin conductance   Temperature   Respiration   Rate of blood flow   Amplitude of blood flow
Sadness     Change       Change           Change             Change        Normal        Normal               Normal
Fear        Normal       Normal           Normal             Normal        Normal        Change               Change
Happiness   Normal       Normal           Normal             Normal        Change        Normal               Normal

Table 1.7: Relations between emotions and psychophysiological responses

Musical concerns                            Other concerns
Global aspects of musical structure         Overall mood of the music
Tension                                     Mostly fear, but also happiness and sadness
Tension                                     Heart rate, blood pressure, pitch height of the melody,
                                            density of notes, dissonance, dynamics and key changes
Tension                                     Musical form (Lerdahl's tree model [Ler06], chromatic
                                            tones, interruption of harmonic processes, denial of
                                            stylistic expectations)
Emotional expression in music               Emotional expression in dance and speech
Pattern of temporal organization in music   Patterns of intonational units in discourse

Table 1.8: Relations between other concerns related to music and emotions

tempo of music. As a result, it was inferred that music with major keys, simple melodic texture and faster tempos is associated with happiness, and music with minor keys, thick harmonized texture and slow tempos is associated with sadness.

Lucassen [Luc06] studied the influence of timbre on the induction of emotions with music. He found that the piano is emotionally neutral, the marimba is very joyful, the cello invokes strong sad emotions and the alto saxophone provokes both negative and positive emotions.

Ilie and Thompson [IT06] studied emotional responses to variations of intensity, rate and pitch height in both music and speech. Participants rated sound excerpts in a 3-dimensional emotion space: valence (happy-sad), energy arousal (excited-calm) and tension arousal (tense-relaxed). Loud excerpts were rated as happy, energetic and tense; fast music was rated as having more energy than slow music; fast speech was rated as happier than slow speech; fast music was judged as tenser than slow music; and high-pitched speech and low-pitched music were associated with happiness.

1.2 Music Perception

According to Koelsch and Siebel [KS05], music perception affects emotions. With this in mind, this section presents works on Music and Speech Perception, Music Semantics and Psychoacoustics.

Deutsch [Deu82] related findings in psychology to music theory. Table 1.9 summarizes psychological inferences made about timbre perception. It is said that pitch, and even timbre and loudness, are more important to our perception of melody than the localization of instruments; temporal relationships between tones from different


Figure 1.12: Range interval of musical parameters [BW05]

spatial locations are important to our perception of the melody; for right-handers there is a tendency to hear the high tones on the right ear and the low tones on the left ear, because their left brain hemisphere is more developed (the opposite applies to left-handers); pitch proximity affects the recognition of individual tones in a sequence. The law of stepwise progression, which is based on the Gestalt principle of proximity [Wer23], states that melodic progression should be by steps (a half step or a whole step) instead of skips (more than one step). As a result of this principle, sounds with similar frequency spectra are likely to emanate from the same source and sounds with dissimilar frequency spectra from different sources. The connectedness of a sequence of tones is affected by factors like pitch relationship, tempo, attentional set and the length of the sequence presented. Von Ehrenfels [vE90] argued that melodies are like visual shapes (Gestalt principle) and consequently pointed out that transposed melodies retain their essential form. It is concluded that interval class can be perceived in a successive context, as an example of top-down shape analysis (hypothesis testing) by the listener. It is argued that music, like other information, is stored in a hierarchical structure, as a product of our processing mechanisms. This principle was applied in Schenker's 3-level system [Sch73], in which notes at one level are prolonged by a sequence of notes at the next-lower level. This system was explained based on the tree-based approach of Lerdahl and Jackendoff [LJ77], where the fundamental relationship expressed in the tree is the elaboration of a single pitch event by a sequence of pitch events. However, Narmour [Nar77] argued that these tree-based approaches were not appropriate and proposed that musical pieces should be conceptualized as interlocking networks. Deutsch [Deu99] presented many grouping mechanisms in music, like musical features that influence fusion and separation of spectral components, based on his previous work [Deu82].
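The law of stepwise progression lends itself to a direct check on symbolic melodies. A small sketch over MIDI pitch numbers (the function name and input format are my own):

```python
# Label successive melodic intervals as steps (1-2 semitones), skips
# (more than 2 semitones) or repeats, following the law of stepwise
# progression described above.
def classify_intervals(midi_pitches):
    labels = []
    for prev, cur in zip(midi_pitches, midi_pitches[1:]):
        interval = abs(cur - prev)
        if interval == 0:
            labels.append("repeat")
        elif interval <= 2:
            labels.append("step")
        else:
            labels.append("skip")
    return labels
```

For example, classify_intervals([60, 62, 64, 69]) returns ['step', 'step', 'skip']: two whole-step motions followed by a leap of a fourth.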

Peretz and Coltheart [PC03] proposed a modular architecture for music processing, with the pathways of information flowing among modules. This architecture was developed with the help of studies of neurological patients. Figure 1.13 illustrates the modules involved in music processing, where temporal and pitch organization assume a central role.

Koelsch and Siebel proposed a new music perception model [KS05] built on what had been done by Peretz and Coltheart [PC03]. They studied brain functions involved in music perception: acoustic analysis, auditory memory, auditory scene analysis, and processing of musical syntax and semantics. They also studied the influence of music perception on emotions, the autonomic


Timbre perception tasks              Psychology findings
Identification of timbre             Differences in timbre are related to the strengths of the various
                                     harmonics of complex tones;
                                     Simple tones are pleasant, but dull at low frequencies;
                                     Tones with strong upper harmonics sound rough and sharp;
                                     Complex tones with only odd harmonics sound hollow;
                                     The critical band [Plo64], the attack segment and the steady-state
                                     segment (if timbre varies with time) play an important role in
                                     timbre perception;
                                     Geometric models with at least 2 dimensions were developed
                                     [Gre75, Wes78] to represent the timbral space, the first dimension
                                     being related to the spectral energy distribution of the sound and
                                     the second to temporal features, such as details of the attack.
Spectral fusion and separation       Musical tones from the same source usually fuse together and
                                     musical tones from different sources are usually separated, so
                                     that distinct sound images are perceived [Hel09];
                                     Onset synchronicity of spectral components, coordinated modulation
                                     in a steady state and harmonicity of the components of a complex
                                     spectrum promote spectral fusion.
Perception of sequences of timbres   The Warren effect [WOFW69]: sounds are organized into separate
                                     streams according to sound type, which makes it hard to form
                                     temporal relationships across streams.

Table 1.9: Psychological inferences about timbre perception

nervous system (ANS), the hormonal and immune systems and motor representations. Figure 1.14 presents an overview of the cognitive modules involved in music perception and the temporal relations between them. The cochlea translates the initial acoustic information into neural activity that reaches the auditory brainstem. Then an initial feature extraction is done in the auditory cortex. This activity is followed by the grouping of these features (pitch height, timbre, etc.) using Gestalt principles and the analysis of intervals (pitch relations, time relations, etc.). Meaning and emotions are essentially induced when musical structures (harmony, meter, rhythm and timbre) are built. After all these processes, body reactions and immune system variations are induced via the autonomic nervous system (ANS).

Martin et al. [MSV98] presented the limitations of approaches based on music signal processing and music theory for music content analysis. They showed the advantages of a research framework based on a music listening approach by analysing various case studies on the extraction of rhythm, timbre and harmony from audio signals, and argued that this is the adequate way to analyse music content. In the same direction, Scheirer et al. [SWV00] developed a psychoacoustic model to analyse the human perception of music complexity. This model extracts 16 musical features, which are based on loudness, tempo, pitch and auditory scene analysis. Pampalk, Dixon and Widmer [PDW03] evaluated some music perception features important for music similarity. Different approaches were used to compare 5 audio music similarity measures.

Scheirer's [Sch00a] Computational Auditory Scene Analysis (CASA) framework was


Figure 1.13: Music processing model [PC03]

developed to embed the most important music perception theories. Psychoacoustic theories of human listening were tested through the use of computer-modeling approaches. Signal-processing techniques were used to extract important musical features from audio music. On the whole, he tried to connect studies from Music Psychology, Psychoacoustics and Musical Signal Processing to study sound.

Dawes [Daw99] studied the elements that make up music in order to use them to produce emotional and physiological responses to music. He started by presenting a study of the biology of hearing and an auditory process model, followed by a study of the origins of hearing and music. This work tried to bring together knowledge from the various disciplines that study music perception, using a biological perspective.

Meredith [Mer03] made a review of the most important aspects of the auditory system that influence the perception of music. Technologies like MEG, MRI and PET are becoming important to study the neurophysiology of music perception. The anatomy of the auditory pathway was explained, from the moment sound is received in the outer ear until it reaches the auditory cortex. Frequencies of tones are represented in pitch-based tonotopic maps and the frequencies of complex tones are unified into a unitary pitch in the auditory cortex. From the study of auditory neurons it was found that melodies have a distributed representation in the auditory cortex.

Cariani et al. [COT04] presented important concepts and mechanisms of music psychology from a perceptual and cognitive perspective. Perceptual dimensions of hearing (pitch, timbre, consonance/roughness, loudness and auditory grouping) used to represent high-level musical features (melody, harmony and rhythm) in the brain were presented. This can be seen in figure 1.14 in the feature extraction and structure building modules. Music was also presented from


Figure 1.14: Overview of the cognitive modules involved in music perception [KS05]

perspectives of different disciplines (Psychology, Biology, Linguistics, Neurophysiology, History, etc.). Figure 1.15 presents areas of the brain and their roles in music perception and performance. The autonomic nervous system (ANS) is the part of the nervous system that influences physiological responses according to specific (musical) stimuli.

Plack [Pla04] presented the most important auditory perception mechanisms using psychoacoustic concepts. The cochlea is responsible for spectral analysis, which decomposes the perceived signal into different frequency components. This decomposition is useful for pitch perception of complex tones and for sound segregation and identification. Temporal patterns of vibration are encoded on the basilar membrane and then in the auditory nerve. Interaural time differences are then used to localise sounds. In summary, the foundations of sound are presented to help understand the auditory pathway from ear to cortex and how high-level sound features are extracted and represented in the brain.

Large and Tretakis [LT05] proposed a theory of tonality founded on the nonlinear transformation of frequency in the cochlea and the central auditory nervous system (CANS). This kind of transformation is characterized by extreme sensitivity to weak stimuli, sharp frequency tuning, amplitude compression, frequency detuning, natural phase differences and nonlinear distortions. These properties conform to some psychoacoustic phenomena, like hearing thresholds, frequency discrimination, loudness scaling, Stevens' rule, harmonic distortion, combination tones and pitch perception.

1.2.1 Melody

“A mind is fundamentally an anticipator, an expectation-generator” -Daniel Dennett

Melody expectation is correlated to feelings of surprise, disappointment, fear and closure. This


Figure 1.15: Brain areas and their roles in music cognition [COT04]

section intends to provide some insight into this by reviewing important works in the area.

Schellenberg et al. [SAPM02] compared the Implication-Realization (I-R) [Nar90] and 2-factor [Sch97] models of melodic expectation using 3 criteria: simplicity, scope and selectivity. They also examined how melodic expectation changes over time. The implication-realization model analyses registral direction, intervallic difference, registral return, proximity and closure. The 2-factor model, on the other hand, analyses pitch proximity and pitch reversal. They supported the importance of the I-R model for studies of music perception and cognition, namely by associating this model with musical structures. A comparison is established between the importance of the I-R model for Music Psychology and the universal grammar proposed by Chomsky for Psycholinguistics: it is said that the I-R model is a universal grammar for melodies.
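The 2-factor idea (pitch proximity plus pitch reversal) is simple enough to sketch as a scoring function. The following is only a toy rendering of the idea: the proximity weighting, the 7-semitone "large interval" cutoff and the reversal bonus are illustrative assumptions of mine, not Schellenberg's fitted parameters.

```python
# Toy 2-factor expectancy score for a continuation tone, given the two
# previous tones (all MIDI pitches). Higher = more expected.
def expectancy(prev, last, continuation):
    implicative = last - prev          # the interval that sets up expectation
    realized = continuation - last     # the interval actually heard
    proximity = -abs(realized)         # closer continuations are more expected
    reversal = 0.0
    if abs(implicative) >= 7:          # after a large leap... (assumed cutoff)
        # ...a repeat or a change of direction is more expected
        if realized == 0 or realized * implicative < 0:
            reversal = 3.0             # assumed bonus weight
    return proximity + reversal
```

With these toy weights, after a leap C4 to G4 a continuation two semitones below G4 scores higher than one two semitones above it (reversal), and near continuations always beat distant ones (proximity).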

Eerola [Eer03] tested cross-cultural and statistical approaches to understanding melodic expectancy, through the study of processes used for structuring, interpreting, remembering and performing music. This work supports the idea that cultural background shapes the influence of these processes during music perception. Melodic expectancies can be of 2 types: pitch-related or temporal. Short-term auditory priming, auditory stream segregation, sensitivity to frequencies of occurrence and rule-based heuristics of melodic continuations are pitch-related processes linked to the storage of musical events in sensory memory. On the other


hand, there is also pitch-related stylistic (cultural) knowledge that is important for melodic expectation: tonal hierarchy, Western schematic expectations, harmony, melody anchoring and melodic archetypes. All the models used in this work are available in the MIR toolbox [ET04].

Larson [Lar04] proposed the theory of musical forces for melodic expectation and described 2 computational models of this theory. A multilevel model was proposed based on the musical forces of gravity, magnetism and inertia. Computer-generated and participant-generated expectations were compared and the results showed a positive correlation between them. Later, Margulis [Mar05] elaborated a hierarchical model to evaluate melodic expectation based on 4 factors: stability, proximity, direction and mobility. This model links expectancy ratings to listeners' experience of musical tension, as well as to theorized expectations and dynamics, the affective contours of musical experience.

1.2.2 Harmony

Toiviainen and Krumhansl [TK03] used 2 self-organizing models to study the dynamics of tonality induction. One model was based on pitch-class distributions and the other on tone-transition distributions. Auditory scene analysis principles were used to design a dynamics matrix for the tone-transition model. The dynamic process of tonality induction was associated with musical tension. Tension was measured by taking into account key distance and dissonance. Both the computer model used and the subjects' responses can be consulted on the web 3.
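The pitch-class-distribution approach to tonality induction builds on the classic profile-correlation idea from Krumhansl's probe-tone work: correlate the piece's pitch-class distribution with a key profile rotated to each of the 24 keys and pick the best match. A minimal sketch, using the standard Krumhansl-Kessler profile values from the probe-tone literature (the input format and function names are my own):

```python
# Krumhansl-Kessler probe-tone profiles (tonic = index 0).
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def find_key(pc_dist):
    # pc_dist: total duration (or count) of each of the 12 pitch classes
    best = (-2.0, None, None)
    for mode, profile in (("major", MAJOR), ("minor", MINOR)):
        for tonic in range(12):
            rotated = [profile[(pc - tonic) % 12] for pc in range(12)]
            r = pearson(pc_dist, rotated)
            if r > best[0]:
                best = (r, NAMES[tonic], mode)
    return best[1], best[2]
```

For instance, a duration profile concentrated on the C major scale tones, such as [4, 0, 2, 0, 3, 1, 0, 3, 0, 2, 0, 1], comes out as ('C', 'major').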

Steinbeis et al. [SKS06] studied the role of harmonic expectation in emotional experience. Some important results from the literature were reviewed, e.g. that harmonic expectations are based on relations of harmonic distance (circle of fifths). It was argued that musical tension can be related to the experienced emotion and, moreover, that the expectedness of a harmonic event is inversely proportional to the expected tension. Results from this work supported the idea that music with irregular chords evokes brain responses associated with musical structure processing. Additionally, it supported Meyer's [Mey56] idea that musical emotions arise through the suspension and fulfilment of expectations. Furthermore, harmonic expectancy violations are related to an increase in the listener's arousal.

1.2.3 Rhythm

Rhythm is a musical feature that influences human behaviour and the rhythmicity of our body's physiology, particularly heartbeats (see section 1.6 for more details).

Desain and Honing [DH03] worked on the categorization of rhythmic patterns. From their study, the space of musical performances was partitioned into a small set of connected rhythmic regions.

1.2.4 Instrumentation

Padova et al. [PBLB03] examined the effect of timbre on listeners' emotions. They varied spectral energy, spectral structure and spectral density and analysed the emotional responses to these

3http://www.perceptionweb.com/misc/p3312/


variations. It was found that changes of harmonic dynamics and harmonic ratios induced negative emotions and spectral energy variations induced high levels of happiness. Moreover, the repetition of stimuli induces an intensity decrease in positive emotions and an intensity increase in negative emotions, fear and surprise. Another study by Padova et al. [PSB05] found that different timbres are associated with different emotions: piano and hybrid sounds induced negative emotions and the flute sound induced another pattern of emotions. This study supported the idea that salience is more important than tonality for music recognition. Lastly, it was found that timbre is very useful to analyse memory task performance.

1.2.5 Tools

This section is dedicated to the presentation of important tools to extract perceptual features from music signals.

The Auditory MATLAB toolbox [Sla94] is widely used to study auditory models. It offers the possibility to represent sound as a waveform, spectrogram/cochleagram or correlogram. This toolbox is particularly useful to understand how the auditory periphery works and to compare and test theories. It comes with 6 types of auditory time-frequency representations. Feldbusch [Fel03] made an auditory model simulator for MATLAB. It accepts acoustic signals and calculates the outputs of the ear at 5 different stages: outer ear signal envelope transformation, middle ear band-pass filter, basilar membrane filterbank, inner hair cells and, finally, auditory nerve spike generation (auditory neurons). The Auditory Image Model is another MATLAB toolbox [BIP04] with built-in models of the auditory system. This toolbox has an interface to visualize the output of 6 main models (applied in sequence to the original acoustic signal): Pre-cochlear Processing (PCP), Basilar Membrane Motion (BMM), Neural Activity Pattern (NAP), strobe times to construct the auditory image (strobes), the Stabilised Auditory Image (SAI) produced in the inferior colliculus, and a user-defined process applied to the auditory image (user module).

Cabrera [Cab99] created a Mac program, named PsySound, to extract psychoacoustical features from music audio signals. This program comes with several models that can be applied to sound files to obtain psychoacoustic measures: level, spectrum, cross-channel, loudness, dissonance and pitch. The IPEM MATLAB toolbox [LLT01] aims to computationally model the human auditory system to help in the process of perception-based music analysis. The toolbox allows analysing music at 3 different levels: sensorial (roughness and onset modules), perceptual (pitch completion, rhythm and echoic memory modules) and cognitive (contextuality module).

Oreja [Est05] is a MATLAB toolbox to study the psychoacoustics of speech signals. Firstly, it allows the manipulation of the input signal: decomposition into different channels, analysis, selection and concatenation with other signals. Secondly, this signal can be transformed in different ways (attenuation of the amplitude of the channels and noise addition). Ellis [Ell05] made a MATLAB auditory perception toolbox to extract the following features from audio: MFCCs and the Relative Spectral Transform - Perceptual Linear Prediction (RASTA-PLP). These features are fundamentally used for speech analysis tasks.
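At their core, tools like these slide a frame over the signal and compute a perceptual correlate per frame. A minimal illustration of that pattern, not taken from any of the toolboxes above: per-frame RMS level in dB, a crude stand-in for loudness (frame and hop sizes are arbitrary choices).

```python
import math

# Frame a mono signal (list of floats in [-1, 1]) and compute each
# frame's RMS level in decibels, a rough correlate of loudness.
def frame_levels_db(samples, frame=1024, hop=512, eps=1e-12):
    levels = []
    for start in range(0, len(samples) - frame + 1, hop):
        chunk = samples[start:start + frame]
        rms = math.sqrt(sum(s * s for s in chunk) / frame)
        levels.append(20 * math.log10(rms + eps))
    return levels
```

A full-scale sine wave has an RMS of about 0.707, so its frames come out near -3 dB; real feature extractors refine this raw level with frequency weighting and temporal integration.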


1.3 Music Cognition

This section explains some of the mental representations, processes and structures involved in the interpretation of music, by reviewing important works in this field.

Minsky [Min81] cogitated on how listening to music engages listener knowledge. He explained how the mind recognizes music by using a society of musical agents, and compared music recognition with visual phenomena. Various things about music were analysed: theme, sentic significance (the association between sensory patterns and emotional states), rhythm and repetition, music composition and conduction, space and tune, syntactic theories, musical usage and sonatas (pieces played). He proposed that music can be used to learn about time, to fit things together, to get along with others and to suppress troubles.

Chew [Che01] proposed the Spiral Array model for tonality, which gives a geometric representation of pitch, interval, chord and key relations. It is a mathematical model that can be used to test music cognition theories by organizing and aggregating appropriate musical information (in points of a spiral). The model provided a framework suitable for designing computational algorithms to analyse and manipulate musical information, e.g. for key-finding problems. Chew's course [Che06] has focused some minor projects on research in music cognition topics. This course intends to give practical insight into music cognition topics, such as expressive performance, composition and improvisation, and computational analysis. Some of the project goals were: visualizing the dynamics of emotional content in audio music 4, analysis of movie music 5, music similarity using the spiral array model 6, vibrato generation in music performance 7, pitch, time and velocity visualization 8, popular music segmentation 9, beat detection 10 and music structure analysis 11.
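Chew's full model places pitches, chords and keys on nested, calibrated helices; the following is only a toy sketch of the core pitch-helix idea. Pitches are laid out along the line of fifths, a quarter turn per fifth; the radius r and height step h here are arbitrary illustrative parameters, not Chew's calibrated values.

```python
import math

# Toy pitch helix in the spirit of the Spiral Array: index k counts
# steps along the line of fifths (C=0, G=1, D=2, A=3, ...).
def spiral_point(k, r=1.0, h=0.4):
    return (r * math.sin(k * math.pi / 2),
            r * math.cos(k * math.pi / 2),
            k * h)

def tonal_distance(k1, k2):
    # Euclidean distance between two pitches on the helix.
    return math.dist(spiral_point(k1), spiral_point(k2))
```

With this layout, pitches that are close on the circle of fifths (C and G) end up nearer in space than distant ones (C and F#, six fifths apart), which is what lets spatial distance stand in for tonal relatedness.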

Temperley [Tem04] explored cognitive processes involved in the perception of musical structures: metrical, melodic phrase, contrapuntal, tonal-pitch-class, harmonic and key. To do this, he implemented preference rule systems for each of them [TS99] (see section ?? for more details). Povel [Pov02] presented a computational model to simulate human perception of tonal melodies. It has a key finder to select the key of the piece of music, which is the key with the highest activation (from 24 keys). Furthermore, it has a chord recognizer process, divided into 4 steps, to identify the chord with the highest activation in successive segments.

Whitman [Whi05] presents ways to computationally represent information from the musical signal and its context, and tries to build a model that links this information. One of the goals of this thesis was to represent contextual and extra-signal data in the form of community metadata. The proposed framework then works with 2 kinds of musical data (audio signal and community metadata) to obtain music meaning. Cepstral modulation is used to extract musical meaning from the audio signal, and Natural Language Processing (NLP) and statistics are used to extract meaning from community metadata. In the end, the framework gives the following semantics of music information: funky, cool, loud, romantic, etc. Figure 1.16 presents an overview of this framework.

4 http://www-scf.usc.edu/~ise575/b/projects/mosst_miles/concept.htm
5 http://www-scf.usc.edu/~ise575/b/projects/mardirossian/1.html
6 http://www-classes.usc.edu/engr/ise/599muscog/2004/projects/emmanuel/JoannEmmanuel/
7 http://www-classes.usc.edu/engr/ise/599muscog/2004/projects/yang/
8 http://www-scf.usc.edu/~ise575/a/projects/li/
9 http://www-scf.usc.edu/~ise575/a/projects/lee/Final%20Project.htm
10 http://www-scf.usc.edu/~ise575/a/projects/mooser/BeatDetector.htm
11 http://www-scf.usc.edu/~ise575/a/projects/shiu/website/

Figure 1.16: Whitman's framework to obtain musical meaning [Whi05]

Tristan [Jeh05] developed a music cognition framework to generate and create new music by using audio examples and by applying machine listening and learning techniques. He tried to automate the cycle of listening, composing and performing using a song database. Sounds and structures of music were analysed and musical parameter extractors were built. These parameters are then used to help the process of synthesizing new meaningful musical structures. The main goal of this thesis was to create music; nevertheless, much attention was dedicated to the representation and analysis of music. This is a scientific (analysis to understand), engineering (modeling to build) and artistic (synthesis to create) framework. Figure 1.17 represents the main topics used in this framework and the relations between them. Music Listening, Learning of Musical Signals and Music Composition are the most important areas of application of this work. Firstly, it has scientific contributions: music cognition by combining listening and learning approaches; downbeat prediction; recursive structural analysis of music; automatic listening, composing and performing; perceptual segmentation. Secondly, it has engineering contributions: an autonomous DJ application; music restoration; music textures. Lastly, it has artistic contributions: the creation of new musical artifacts.

Justus and Bharucha [JB02] reviewed the major findings in music perception and cognition. Perception and cognition of pitch, time-based perceptual organization in music, musical performance, cognitive neuroscience foundations of music, and musical universals and origins are the most important. In the pitch domain the following topics are studied: pitch height, pitch class, pitch categorization, relative pitch, absolute pitch, consonance, dissonance, scales and tonal hierarchies of stability, chords and harmonic hierarchies of stability, harmonic perception, harmonic representation, harmonic expectation, melodic perception, melodic representation and melodic expectation. In the time domain the following topics are studied: tempo, rhythmic pattern, grouping, meter, event hierarchies and reduction, and the relationship between time and pitch. Similarly, musical performance is centered on topics like interpretation and planning, communication of structure, and musical expertise and skill acquisition. Cognitive neuroscience of music studies neuropsychology, neuroimaging and electroencephalography. Finally, musical universals and origins are based on developmental music cognition, cross-cultural music cognition and the evolutionary psychology of music. On the whole, the major breakthroughs in the various fields of study of music cognition and perception are presented.

Figure 1.17: Representation of the main topics used in Tristan's framework [Jeh05]

Similarly, Palmer and Jungers [PJ03] also reviewed some important discoveries in music cognition. They said that neural bases, music perception, memory and motor components are essentially similar for all people, which does not apply to musical behaviour. They also said that rhythm and pitch may be the most important musical features for psychology. Besides this, pitch, duration, loudness and timbre are said to be important properties in music perception.

1.4 Music Performance

This section deals with musical aspects studied, essentially, during the expression of performers' inner feelings.

Baraldi [Bar03] considered both cognitive and cultural factors with the aim of explaining aspects of musical experience and analysing expressive intentions in musical improvisation. He proposed that there is a common code for controlling musical expressivity by changing musical features (pitch, intensity, articulation and tempo) and suggested that listeners are able to recognize a performer's expressive intentions even when very few musical means are used. Figure 1.18 presents the sensorial adjectives and the corresponding musical features used to study musical expressiveness.


Figure 1.18: Bonini's adjectives and musical features [Bar03]

Widmer et al. [WGIV04] studied the role, principles and assumptions of 4 models of music performance (KTH, Todd, Mazzola and Machine Learning) used to express emotions. Some empirical evaluations of these models were presented. These models seem complementary, so the most important features of each should be combined to make better models.

Friberg et al. [FBS06] presented the KTH rule system, used to relate music performance features and emotional expression. Figure 1.19 presents the relation between emotions (happiness, sadness, anger and tenderness) and performance features (tempo, sound level, articulation, phrase arch, final ritardando, punctuation and duration contrast).

Figure 1.19: KTH rules used to relate emotions with performance features [FBS06]

Bresin and Friberg [BF00] proposed rules for emotional expression, tested in music performances with the Director Musices program [FCFS00]. This program is based on the KTH rule system [FBS06]. They analysed the previous work of Gabrielsson [Gab93, Gab95] and Juslin [Jus97b, Jus97a] relating musical features (tempo, sound level, articulation and time deviations) and emotions (fear, anger, happiness, sadness, solemnity and tenderness), with the aim of comparing it with the rules used in their program. This comparison is presented in figure 1.20.
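The flavour of such rule systems can be sketched as a transformation over a note list: each emotion selects a tempo factor, a sound-level change and an articulation setting. All numeric values below are illustrative guesses in the spirit of the happiness/sadness contrasts reported earlier in this chapter, not the published KTH rule values.

```python
# Illustrative emotion-dependent performance transformation, loosely in
# the spirit of KTH-style macro rules; every numeric value is an assumption.
RULES = {
    "happiness": {"tempo_factor": 1.15, "level_db": +3.0, "articulation": 0.8},
    "sadness":   {"tempo_factor": 0.80, "level_db": -3.0, "articulation": 1.0},
}

def render(notes, emotion):
    # notes: list of (midi_pitch, duration_s, level_db) tuples
    r = RULES[emotion]
    return [(pitch,
             dur / r["tempo_factor"],   # faster tempo -> shorter durations
             level + r["level_db"],     # overall sound-level change
             r["articulation"])         # fraction of the duration sounded
            for pitch, dur, level in notes]
```

A "sad" rendering of a melody thus comes out slower, softer and legato, while a "happy" one comes out faster, louder and more detached, matching the broad contrasts in figure 1.19.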


Figure 1.20: Rules for emotional expression during music performance (comparison with findings of Gabrielsson and Juslin) [BF00]

1.5 Music Theory

A theoretical perspective to study music is presented in this section by reviewing important works.

Lerdahl and Jackendoff [LJ83] proposed a set of rules for tonal music. In addition, they analysed the syntax of music using hierarchical trees of relaxation/tension (right-branching means more tension, left-branching more relaxation), taking into account the chord distance to a tonic. Tillmann et al. [TBB00] proposed a self-organizing neural network to embed knowledge of Western musical grammar, e.g. pitch dimension regularities. The learning process used in this system intends to internalize the correlational structure of tonal music. They had empirical findings on the processing of tone, chord and key relationships, relatedness judgments, memory


judgments and expectancies. Toussaint [Tou03] studied melody and rhythm algorithmically. The similarity between rhythms and melodies was studied by treating music as strings of symbols and using the edit distance, the swap distance and the area-difference distance. Rhythm visualization, oddity and evenness were topics used to study rhythm.
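The edit-distance idea applies directly once rhythms are written as strings, e.g. "x" for an onset and "." for a rest (that onset notation is my own shorthand, not Toussaint's). A standard dynamic-programming Levenshtein distance:

```python
# Classic Levenshtein (edit) distance between two rhythm strings such
# as "x..x..x." (onset/rest notation): the minimum number of insertions,
# deletions and substitutions turning one string into the other.
def edit_distance(a, b):
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[m][n]
```

For example, edit_distance("x.x", "xx.") is 2: shifting the second onset one position earlier costs a deletion plus an insertion (or two substitutions). The swap distance mentioned above refines this by counting such adjacent exchanges as a single operation.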

Patel [Pat03] compared the hierarchical syntactic structures of language and music, using approaches from neuroscience and cognitive science. Music and language both have structural rules, harmonic and phrasal respectively. Nevertheless, it seems that there is not a large overlap in their syntactic representations. Above all, a comparison was made to test the hypothesis that language and music share a set of frontal brain areas for syntactic processing, which was confirmed.

Chattah [Cha06] used models of semiotics and pragmatics to analyse film music, taking into consideration formal design, melodic contour, pitch content, harmonic gestures and cadential formulas, among other structural aspects of music. The messages that music communicates are analysed through semiotics and pragmatics, as can be seen in figure 1.21. Leitmotifs and topics (symbols), and music and sound parameters (icons), are studied using semiotic constructs. On the other hand, qualitative and structural aspects of music, as well as similarities and dissimilarities between film narrative and music, are studied from a pragmatics perspective.

Coutinho et al. [CGMM05] proposed an Artificial Life computational model to study music ontogeny, in particular the evolution of rhythms and emotional systems. The evolution of musical cognition is studied using an artificial society of agents, so this work is essentially centred on computational musicology.

Rentfrow and Gosling [RG03] examined the role of personality in music preferences. They established correlations between music genres and four dimensions of music preference: reflective and complex, intense and rebellious, upbeat and conventional, and energetic and rhythmic (figure 1.22).

1.6 Music Therapy

Music is widely accepted as a primary language for expressing emotions. As emotions are closely related to physiological mechanisms, music has progressively been used as a therapeutic means. On the whole, this section studies music as a medical/health science by reviewing important works in the area.

Aldridge [Ald93] studied music as medicine. Throughout the work, a comparison is established between musical elements and human behaviours, specifically for the treatment of diseases. Figure 1.23 presents two different types of behaviour that can be induced by musical elements during sessions of music therapy.

Pacchetti et al. [PMA+00] used music to induce motor and emotional responses in patients with Parkinson's disease (PD). This study consisted of sessions of music and physical therapy in two groups of 16 patients, aimed at reducing their clinical disability and improving their quality of life. Music therapy (MT) sessions consisted of choral singing, voice exercises, rhythmic and free body movements, and active music. It was concluded that MT is effective on motor, affective and behavioural functions, and in PD rehabilitation programs. La Torre [LT03] presents a case study illustrating the importance of the therapeutic use of music. Firstly, music can be a way to connect client and therapist if they are both familiar with musical constructs. Secondly, language and words can be a powerful way to communicate therapeutic messages through music. Lastly, listening to or making sounds and music can be helpful activities for harmonizing body and mind.

Figure 1.21: Semiotics and pragmatics processes used in Chattah's framework [Cha06]


Figure 1.22: Correlations between music preferences and music genres [RG03]

Salamon et al. [SKBS03] studied the role of music in inducing positive emotions and subsequent relaxation, in reducing stress and anxiety, and in relieving pathologies. Nitric oxide (NO) is proposed as the molecule (neurotransmitter) responsible for the physiological and psychological effects of relaxation. The effect of NO on the auditory system and on emotion centres within the central nervous system is also analysed. Chiu and Kumar [CK03] reviewed work done on music therapy, in particular to find which physiological mechanisms are affected by music. Some studies supported the idea that listening to (adequate) music reduces heart rate and blood pressure. Similarly, serum levels of ACTH, noradrenaline and cortisol showed significant reductions when listening to music. This review supported the idea that music listening is helpful in music therapy.

Cadigan et al. [CCH+01] showed that music can be used to reduce blood pressure, respiratory rate, heart rate, skin temperature and pain perception. This test was done with cardiac patients on bed rest. Mok and Wong [MW03] found that music stimulates relaxation during surgery. The relaxation effect of music was demonstrated by the reduction of patients' anxiety levels, heart rate and blood pressure.

Hilliard [Hil05] reviewed 11 empirical studies of music therapy (MT) and its emergent role in hospice and palliative care. This review found that the following variables are positively influenced by MT: pain, physical comfort, fatigue and energy, anxiety and relaxation, time and duration of treatment, mood, spirituality, and quality of life. The analysed literature can be seen in figure 1.24, in particular the variables affected by MT in each study.


Figure 1.23: Two different types of behaviours and musical elements used in music therapy [Ald93]

Figure 1.24: Music therapy review [Hil05]

Erkkilä et al. [ELL+04] built a music analysis system to analyse improvisations in clinical music therapy. This work intends to develop models (statistical or neural networks) to estimate perceived musical qualities (activity, valence, tension, salience, congruence, etc.) from extracted musical features (texture, articulation, melody, rhythm, harmony, tonality and interactions between features). This tool and similar tools can support the musical interaction and experience shared by the client and therapist. The clinical relevance of the extracted features is still unknown.
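A mapping from extracted features to perceived qualities can be prototyped as a simple statistical model. The sketch below fits an ordinary least-squares regression from feature values to valence ratings; the feature names echo the text, but the data, feature choice and model choice are illustrative assumptions, not the actual system of Erkkilä et al.

```python
# Minimal statistical sketch: estimating a perceived quality (valence) from
# extracted musical features, in the spirit of the models discussed in [ELL+04].
# The data below are invented placeholders purely for illustration.
import numpy as np

# Rows: improvisation excerpts; columns: tempo (bpm), mean loudness, dissonance.
features = np.array([
    [ 60.0, 0.2, 0.8],
    [ 90.0, 0.5, 0.4],
    [120.0, 0.7, 0.2],
    [150.0, 0.9, 0.1],
])
valence = np.array([-0.6, -0.1, 0.4, 0.8])  # hypothetical listener ratings

# Least-squares fit with an intercept column appended to the features.
X = np.hstack([features, np.ones((len(features), 1))])
coef, *_ = np.linalg.lstsq(X, valence, rcond=None)

# Predict valence for a new excerpt (tempo, loudness, dissonance, intercept).
new = np.array([100.0, 0.6, 0.3, 1.0])
print(float(new @ coef))
```

In a real system the features would come from an audio or MIDI feature extractor, and a neural network could replace the linear model; the fitting-and-prediction structure stays the same.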

Liljedahl et al. [LSL05] developed two interactive music systems, which used REMUPP [WLLB05] to generate music aimed at improving users' physical health. The Digiwall system uses music to assist the user in a climbing activity and to encourage the user to make music through physical motions. Body Rest is a biofeedback system that encourages users to control their physiology through visualization, muscle relaxation and other techniques. Biofeedback techniques are used to reduce stress, anxiety and harmful physiological responses. Sensors are used to monitor and control heart rate,


blood pressure, brainwave activity and respiration. The music generator used in this system maps musical parameters (tempo, instrumentation and rhythmic complexity) to physiological parameters (heart rate and heart rate variability); e.g., the system starts by generating music with a tempo equal to the heart rate in beats per minute. The Body Rest system is presented in more detail by Fagerlönn [Fag05]. It is a prototype that induces relaxation by controlling heart rate with relaxing music. This kind of music is essentially characterized as melodious, delicate, harmonic and romantic; classical music is adequate for this purpose. Stimulative music, on the other hand, is characterized as loud, dynamic and rhythmic. It was concluded that heart rate is a stress indicator and that music matching the user's preference readily induces states of relaxation.
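The tempo-to-heart-rate coupling described above can be sketched as a small feedback mapping: start at the listener's measured heart rate, then nudge the tempo toward a relaxation target on each sensor update. The function name, step size and target value are hypothetical; the only grounded detail is that generation starts at a tempo equal to the heart rate in beats per minute.

```python
# Hedged sketch of a Body Rest-style mapping between heart rate and tempo
# [LSL05, Fag05]: music starts at a tempo equal to the measured heart rate
# (beats per minute) and is gradually slowed toward a relaxation target.
# The update rule and its parameters are illustrative assumptions.

def next_tempo(current_tempo_bpm, heart_rate_bpm, target_bpm=60.0, step=0.1):
    """Move the tempo a fraction `step` toward the relaxation target,
    but never above the listener's current heart rate."""
    tempo = current_tempo_bpm + step * (target_bpm - current_tempo_bpm)
    return min(tempo, heart_rate_bpm)

# Simulated session: tempo starts at the initial heart rate reading.
heart_rates = [88, 86, 84, 80, 76]   # hypothetical sensor readings
tempo = float(heart_rates[0])        # start at heart rate, as in the text
for hr in heart_rates[1:]:
    tempo = next_tempo(tempo, hr)
print(round(tempo, 1))
```

Capping the tempo at the current heart rate keeps the music from "leading" a listener whose heart rate is already falling, which is the intended direction of the biofeedback loop.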

Kemper and Danhauer [KD05] surveyed the main uses of music as a therapy, reviewing the medical effects of music. MT is used to enhance well-being, reduce stress and anxiety, distract patients from unpleasant symptoms, improve patients' mood, reduce pain, increase the sense of comfort and relaxation, and improve empathy and compassion. It was found that different types of music induce different types of emotions, which also depends on the patients' age. Classical music decreases tension. Grunge rock increases hostility, fatigue, sadness and tension, and decreases relaxation, mental clarity, vigour and compassion. New age music increases relaxation and decreases hostility, mental clarity, vigour and tension.


Bibliography

[Ald93] D. Aldridge. The music of the body: Music therapy in medical settings. Advances: The Journal of Mind-Body Health, 9, 1993.

[Bar03] F.B. Baraldi. An experiment on the communication of expressivity in piano improvisation and a study toward an interdisciplinary research framework of ethnomusicology and cognitive psychology of music. University Lectures at Université Pierre et Marie Curie, page 72, 2003.

[BF00] R. Bresin and A. Friberg. Emotional coloring of computer-controlled music performances. Computer Music Journal, 24(4):44–63, 2000.

[BFL05] E. Bigand, S. Filipic, and P. Lalitte. The time course of emotional responses to music. Annals of the New York Academy of Sciences, 1060(1):429–437, 2005.

[BIP04] S. Bleeck, T. Ives, and R.D. Patterson. Aim-mat: The auditory image model in Matlab. Acta Acustica united with Acustica, 90:781–787, 2004.

[BJT06] T. Bosse, C.M. Jonker, and J. Treur. Formal analysis of Damasio's theory on core consciousness. International Conference on Cognitive Modeling, 2006.

[BL00] M.M. Bradley and P.J. Lang. Affective reactions to acoustic stimuli. Psychophysiology, 37(2):204–215, 2000.

[BW05] J. Berg and J. Wingstedt. Relations between selected musical parameters and expressed emotions - extending the potential of computer entertainment. International Conference on Advances in Computer Entertainment, Valencia, Spain, page 8, 2005.

[BZ01] A.J. Blood and R.J. Zatorre. Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proceedings of the National Academy of Sciences, 98(20):11818–11823, 2001.

[BZBE99] A.J. Blood, R.J. Zatorre, P. Bermudez, and A.C. Evans. Emotional responses to pleasant and unpleasant music correlate with activity in paralimbic brain regions. Nature Neuroscience, 2:382–387, 1999.

[Cab99] D. Cabrera. Psysound: A computer program for psychoacoustical analysis. Proceedings of the Australian Acoustical Society Conference, 24:47–54, 1999.

[CCH+01] M.E. Cadigan, N.A. Caruso, S.M. Haldeman, M.E. McNamara, D.A. Noyes, M.A. Spadafora, and D.L. Carroll. The effects of music on cardiac patients on bed rest. Progress in Cardiovascular Nursing, 16(1):5–13, 2001.

[CGMM05] E. Coutinho, M. Gimenes, J.M. Martins, and E.R. Miranda. Computational musicology: An artificial life approach. Workshop on Artificial Life and Evolutionary Algorithms, 2, 2005.

[CH01] W.G. Collier and T.L. Hubbard. Musical scales and evaluations of happiness and awkwardness: Effects of pitch, direction, and scale mode. The American Journal of Psychology, 114(3):355–375, 2001.

[Cha06] J.R. Chattah. Semiotics, Pragmatics, and Metaphor in Film Music Analysis. PhD thesis, The Florida State University College of Music, 2006.

[Che01] E. Chew. Modeling tonality: Applications to music cognition. Annual Meeting of the Cognitive Science Society, 23:206–211, 2001.

[Che06] E. Chew. Topics in engineering approaches to music cognition. 2006.

[CK03] P. Chiu and A. Kumar. Music therapy: Loud noise or soothing notes? International Pediatrics, 18(4):204–208, 2003.

[COT04] P. Cariani, A. Oxenham, and M. Tramo. Music perception and cognition. University Lectures at MIT OpenCourseWare, 2004.

[Dam00] A. Damasio. The Feeling of What Happens: Body, Emotion and the Making of Consciousness. Harvest Books, October 2000.

[Daw99] A.T. Dawes. Mind, Music and Emotion. PhD thesis, University of British Columbia, 1999.

[DBPRG01] S. Dalla Bella, I. Peretz, L. Rousseau, and N. Gosselin. A developmental study of the affective value of tempo and mode in music. Cognition, 80:B1–B10, 2001.

[Deu82] D. Deutsch. The Psychology of Music. Academic Press, 1982.

[Deu99] D. Deutsch. Grouping mechanisms in music. The Psychology of Music, pages 299–348, 1999.

[DH03] P. Desain and H. Honing. The formation of rhythmic categories and metric priming. Perception, 32(3):341–365, 2003.

[Eer03] T. Eerola. The Dynamics of Musical Expectancy: Cross-Cultural and Statistical Approaches to Melodic Expectations. PhD thesis, University of Jyväskylä, 2003.

[ELL+04] J. Erkkilä, O. Lartillot, G. Luck, K. Riikkilä, and P. Toiviainen. Intelligent music systems in music therapy. Music Therapy Today, 5, 2004.

[Ell05] Daniel P. W. Ellis. PLP and RASTA (and MFCC, and inversion) in Matlab. http://www.ee.columbia.edu/~dpwe/resources/matlab/rastamat/, 2005.

[Est05] R. Esteban. Oreja... an environment for the design of psychoacoustic experiments. European Society for Cognitive Psychology, 14, 2005.

[ET04] T. Eerola and P. Toiviainen. MIR in Matlab: The MIDI Toolbox. International Symposium on Music Information Retrieval (ISMIR), 2004.

[Fag05] J. Fagerlönn. A prototype using music responding to heart rate for stress reduction. Master's thesis, Luleå University of Technology, June 2005.

[FBS06] A. Friberg, R. Bresin, and J. Sundberg. Overview of the KTH rule system for musical performance. Advances in Cognitive Psychology, 2(2-3):145–161, 2006.

[FCFS00] A. Friberg, V. Colombo, L. Frydén, and J. Sundberg. Generating musical performances with Director Musices. Computer Music Journal, 24(3):23–29, 2000.

[Fel03] F. Feldbusch. Auditory model simulator for Matlab (AFM), 2003.

[FK01] B.R. Fischer and S.J. Krehbiel. Dynamic rating of emotions elicited by music. National Conference for Undergraduate Research, Lexington, KY, pages 15–17, 2001.

[Gab93] A. Gabrielsson. Intention and emotional expression in music performance. Stockholm Music Acoustics Conference, pages 108–111, 1993.

[Gab95] A. Gabrielsson. Expressive intention and performance. Music and the Mind Machine, pages 35–47, 1995.

[GG04] J. Goguen and R. Goguen. Time, structure and emotion in music. University Lectures at Keio University, 2004.

[GL01] A. Gabrielsson and E. Lindström. The influence of musical structure on emotional expression. Music and Emotion: Theory and Research, pages 223–248, 2001.

[GP03] L. Gagnon and I. Peretz. Mode and tempo relative contributions to "happy-sad" judgements in equitone melodies. Cognition & Emotion, 17(1):25–40, 2003.

[Gre75] J.M. Grey. An Exploration of Musical Timbre. PhD thesis, Stanford University, 1975.

[Hel09] H.V. Helmholtz. Physiological Optics. Rochester, New York: Optical Society of America, 1925 (original work published 1909).

[Hil05] R.E. Hilliard. Music therapy in hospice and palliative care: a review of the empirical data. Evidence-based Complementary and Alternative Medicine, 2(2):173–178, 2005.

[Hod96] D.A. Hodges. Handbook of Music Psychology. IMR Press, 1996.

[IT06] G. Ilie and W.F. Thompson. A comparison of acoustic cues in music and speech for three dimensions of affect. Music Perception, 23:319–329, 2006.

[JB02] T. Justus and J. Bharucha. Music perception and cognition. In Stevens' Handbook of Experimental Psychology, Volume 1: Sensation and Perception, pages 453–492. New York: Wiley, 2002.

[Jeh05] Tristan Jehan. Creating Music by Listening. PhD thesis, Massachusetts Institute of Technology, MA, USA, September 2005.

[Jus97a] P.N. Juslin. Emotional communication in music performance: A functionalist perspective and some data. Music Perception, 14(4):383–418, 1997.

[Jus97b] P.N. Juslin. Perceived emotional expression in synthesized performances of a short melody: Capturing the listener's judgment policy. Musicae Scientiae, 1(2):225–256, 1997.

[Jus01] P.N. Juslin. Communicating emotion in music performance: A review and a theoretical framework. Music and Emotion: Theory and Research, pages 309–337, 2001.

[JV06] P. Juslin and D. Västfjäll. How does music induce emotions in listeners? The AMUSE model. International Conference on Music Perception and Cognition, August 2006.

[KD05] K.J. Kemper and S.C. Danhauer. Music as therapy. Southern Medical Journal, 98(3):282–288, 2005.

[Kim02] J. Kimura. Analysis of emotions in musical expression, 2002.

[Kle03] M.W. Klein. Psychophysiological and emotional dynamic responses to music: An exploration of a two-dimensional model. National Conference on Undergraduate Research (NCUR), 2003.

[Koe05] S. Koelsch. Investigating emotion with music: Neuroscientific approaches. Annals of the New York Academy of Sciences, 1060(1):412–418, 2005.

[Kor04] M.D. Korhonen. Modeling continuous emotional appraisals of music using system identification. University of Waterloo, page 163, 2004.

[Kru02] C.L. Krumhansl. Music: A link between cognition and emotion. Current Directions in Psychological Science, 11(2):45–50, 2002.

[KS05] S. Koelsch and W. Siebel. Towards a neural basis of music perception. Trends in Cognitive Sciences, 9:578–584, 2005.

[Lar04] S. Larson. Musical forces and melodic expectations: Comparing computer models and experimental results. Music Perception, 21(4):457–498, 2004.

[Lav01] M.M. Lavy. Emotion and the experience of listening to music: A framework for empirical research. University of Cambridge, April 2001.

[Ler06] F. Lerdahl. Tonal Pitch Space. Oxford University Press, USA, 2006.

[Lin04] E. Lindström. A Dynamic View of Melodic Organization and Performance. PhD thesis, Acta Universitatis Upsaliensis, Uppsala, 2004.

[LJ77] F. Lerdahl and R. Jackendoff. Toward a formal theory of tonal music. Journal of Music Theory, 21(1):111–171, 1977.

[LJ83] F. Lerdahl and R.S. Jackendoff. A Generative Theory of Tonal Music. MIT Press, Cambridge, MA, 1983.

[LLT01] M. Leman, M. Lesaffre, and K. Tanghe. Introduction to the IPEM Toolbox for perception-based music analysis. Mikropolyphonie - The Online Contemporary Music Journal, 7, 2001.

[LO03] T. Li and M. Ogihara. Detecting emotion in music. International Conference on Music Information Retrieval (ISMIR), 4:239–240, 2003.

[LSL05] M. Liljedahl, C. Sjömark, and N. Lefford. Using music to promote physical well-being via computer-mediated interaction. Musicnetwork Open Workshop, 5:5, 2005.

[LT03] M.A. La Torre. The use of music and sound to enhance the therapeutic setting. Perspectives in Psychiatric Care, 39(3):129–132, 2003.

[LT05] E.W. Large and A.E. Tretakis. Tonality and nonlinear resonance. Annals of the New York Academy of Sciences, 1060(1):53–56, 2005.

[Luc06] T. Lucassen. Emotions of musical instruments. Twente Student Conference on IT, 4, 2006.

[Mar05] E.H. Margulis. A model of melodic expectation. Music Perception, 22(4):663–713, 2005.

[Mer03] D. Meredith. Music and the auditory system. University of London, April 2003.

[Mey56] L.B. Meyer. Emotion and Meaning in Music. University of Chicago Press, 1956.

[Min81] M. Minsky. Music, mind and meaning. Computer Music Journal, 5(3), 1981.

[MSV98] K.D. Martin, E.D. Scheirer, and B.L. Vercoe. Music content analysis through models of audition. ACM Multimedia Workshop on Content Processing of Music for Multimedia Applications, Bristol, UK, 12, 1998.

[MW03] E. Mok and K.Y. Wong. Effects of music on patient anxiety. AORN Journal, February 2003.

[Nar77] E. Narmour. Beyond Schenkerism: The Need for Alternatives in Music Analysis. University of Chicago Press, 1977.

[Nar90] E. Narmour. The Analysis and Cognition of Basic Melodic Structures: The Implication-Realization Model. University of Chicago Press, 1990.

[OT90] A. Ortony and T.J. Turner. What's basic about basic emotions? Psychological Review, 97(3):315–331, 1990.

[Pan98] J. Panksepp. The periconscious substrates of consciousness: Affective states and the evolutionary origins of the self. Journal of Consciousness Studies, 5(5-6):566–582, 1998.

[Pat03] A.D. Patel. Language, music, syntax and the brain. Nature Neuroscience, 6(7):674–681, 2003.

[PB02] J. Panksepp and G. Bernatzky. Emotional sounds and the brain: the neuro-affective foundations of musical appreciation. Behavioural Processes, 60(2):133–155, 2002.

[PBLB03] A. Padova, L. Bianchini, M. Lupone, and M.O. Belardinelli. Influence of specific spectral variations of musical timbre on emotions in the listeners. 5th Triennial ESCOM Conference, September 2003.

[PC03] I. Peretz and M. Coltheart. Modularity of music processing. Nature Neuroscience, 6(7):688–691, 2003.

[PDW03] E. Pampalk, S. Dixon, and G. Widmer. On the evaluation of perceptual similarity measures for music. International Conference on Digital Audio Effects, 2003.

[PJ03] C. Palmer and M.K. Jungers. Music cognition. Encyclopedia of Cognitive Science, 3:155–158, 2003.

[Pla04] C.J. Plack. Auditory perception. In Psychology: An International Perspective (PIP). Psychology Press, 2004.

[Pla06] D.S. Plack. The Effect of Performance Medium on the Emotional Response of the Listener as Measured by the Continuous Response Digital Interface. PhD thesis, The Florida State University College of Music, 2006.

[Plo64] R. Plomp. The ear as a frequency analyzer. The Journal of the Acoustical Society of America, 36:1628, 1964.

[PMA+00] C. Pacchetti, F. Mancini, R. Aglieri, C. Fundaro, E. Martignoni, and G. Nappi. Active music therapy in Parkinson's disease: an integrative method for motor and emotional rehabilitation. Psychosomatic Medicine, 62(3):386–393, 2000.

[Pov02] D.J. Povel. A model for the perception of tonal melodies. International Conference on Music and Artificial Intelligence, 2:144–154, 2002.

[PSB05] A. Padova, R. Santoboni, and M.O. Belardinelli. Influence of timbre on emotions and recognition memory for music. Conference on Interdisciplinary Musicology, March 2005.

[RG03] P.J. Rentfrow and S.D. Gosling. The do re mi's of everyday life: Examining the structure and personality correlates of music preferences. Journal of Personality and Social Psychology, 84:1236–1256, 2003.

[RR04] D.A. Ritossa and N.S. Rickard. The relative utility of pleasantness and liking dimensions in predicting the emotions expressed by music. Psychology of Music, 32(1):5–22, 2004.

[Rus89] J.A. Russell. Measures of emotion. Emotion: Theory, Research, and Experience, 4:83–111, 1989.

[SAPM02] E.G. Schellenberg, M. Adachi, K.T. Purdy, and M.C. McKinnon. Expectancy in melody: Tests of children and adults. Journal of Experimental Psychology: General, 131(4):511–537, 2002.

[Sch73] H. Schenker. Harmony. MIT Press, 1973.

[Sch97] E.G. Schellenberg. Simplifying the implication-realization model of melodic expectancy. Music Perception, 14(3):295–318, 1997.

[Sch99] E. Schubert. Measurement and Time Series Analysis of Emotion in Music. PhD thesis, University of New South Wales, 1999.

[Sch00a] E.D. Scheirer. Music-Listening Systems. PhD thesis, Massachusetts Institute of Technology, 2000.

[Sch00b] K.R. Scherer. Psychological models of emotion. The Neuropsychology of Emotion, pages 137–162, 2000.

[Sch04a] K.R. Scherer. Which emotions can be induced by music? What are the underlying mechanisms? And how can we measure them? Journal of New Music Research, 33(3):239–251, 2004.

[Sch04b] E. Schubert. Research in expressing continuous emotional response to music as a function of its psychoacoustic parameters: Current and future directions. International Congress on Acoustics, 2004.

[Sha04] P. Shanley. Music and emotion: The creation of a continuous response network in order to evaluate the extent of specific musical element expressiveness and the make-up of music's affective personality, August 2004.

[SKBS03] E. Salamon, M. Kim, J. Beaulieu, and G.B. Stefano. Sound therapy induced relaxation: down regulating stress processes and pathologies. Medical Science Monitor, 9(5):116–121, 2003.

[SKS06] N. Steinbeis, S. Koelsch, and J.A. Sloboda. The role of harmonic expectancy violations in musical emotions: Evidence from subjective, physiological, and neural responses. Journal of Cognitive Neuroscience, 18(8):1380, 2006.

[Sla94] M. Slaney. Auditory Toolbox. Apple Computer, Inc. Technical Report, 45, 1994.

[Slo91] J.A. Sloboda. Music structure and emotional response: Some empirical findings. Psychology of Music, 19(2):110, 1991.

[SO77] K.R. Scherer and J.S. Oshinsky. Cue utilization in emotion attribution from auditory stimuli. Motivation and Emotion, 1(4):331–346, 1977.

[SWV00] E.D. Scheirer, R.B. Watson, and B.L. Vercoe. On the perceived complexity of short musical segments. International Conference on Music Perception and Cognition, 6, August 2000.

[SZ01] K.R. Scherer and M.R. Zentner. Emotional effects of music: Production rules. Music and Emotion: Theory and Research, pages 361–392, 2001.

[TBB00] B. Tillmann, J.J. Bharucha, and E. Bigand. Implicit learning of tonality: A self-organizing approach. Psychological Review, 107(4):885–913, 2000.

[TC00] G. Tzanetakis and P. Cook. MARSYAS: a framework for audio analysis. Organised Sound, 4(3):169–175, 2000.

[Tem04] D. Temperley. The Cognition of Basic Musical Structures. MIT Press, 2004.

[TF03] K. Thimm and B. Fischer. Emotional responses to music: Influence of psycho-acoustical features. National Conference on Undergraduate Research (NCUR), 2003.

[TK03] P. Toiviainen and C.L. Krumhansl. Measuring and modeling real-time responses to music: The dynamics of tonality induction. Perception, 32(6):741–766, 2003.

[Tou03] G.T. Toussaint. Algorithmic, geometric, and combinatorial problems in computational music theory. Encuentros de Geometría Computacional, 10:101–107, 2003.

[TS99] D. Temperley and D. Sleator. Modeling meter and harmony: A preference-rule approach. Computer Music Journal, 23(1):10–27, 1999.

[vE90] C. von Ehrenfels. Über Gestaltqualitäten [On Gestalt qualities]. Vierteljahresschrift für wissenschaftliche Philosophie, 14:249–292, 1890.

[VM04] B. Vickhoff and H. Malmgren. Why does music move us? Dept. of Philosophy, Göteborg University, Sweden, 2004.

[Wed72] L. Wedin. Multidimensional scaling of emotional expression in music. Swedish Journal of Musicology, 54:115–131, 1972.

[Wer23] M. Wertheimer. Untersuchungen zur Lehre von der Gestalt II [Investigations into the theory of Gestalt]. Psychological Research, 4:301–351, 1923.

[Wes78] D.L. Wessel. Low dimensional control of timbre, 1978.

[WGIV04] G. Widmer and W. Goebl. Computational models of expressive music performance: The state of the art. Journal of New Music Research, 33(3):203–216, 2004.

[Whi05] B.A. Whitman. Learning the Meaning of Music. PhD thesis, Massachusetts Institute of Technology, 2005.

[WLLB05] J. Wingstedt, M. Liljedahl, S. Lindberg, and J. Berg. REMUPP: an interactive tool for investigating musical properties and relations. New Interfaces for Musical Expression, pages 232–235, 2005.

[WOFW69] R.M. Warren, C.J. Obusek, R.M. Farmer, and R.P. Warren. Auditory sequence: Confusion of patterns other than speech or music. Science, 164(3879):586, 1969.

[WW05] G.D. Webster and C.G. Weir. Emotional responses to music: Interactive effects of mode, texture, and tempo. Motivation and Emotion, 29(1):19–39, 2005.