
Since humans evolved as a social species, it has been reasoned that due to their high biological relevance, emotional facial expressions are processed automatically, i.e. independent of attention. However, to what degree emotional expressions are in fact automatically encoded by the human brain remains highly debated in the literature.

In an attempt to shed light on the temporal dynamics and functional topography of human emotional face processing and its relation to attention, the current symposium brings together results from a broad range of techniques used in the field of Cognitive Neuroscience, such as Event-related Potentials (ERPs), intracranial Local Field Potentials (iLFPs) and functional Magnetic Resonance Imaging (fMRI), applied in various experimental paradigms. It will be shown that automaticity cannot necessarily be considered an ‘all-or-none phenomenon’; instead, different sub-processes and functionally distinctive neural substrates seem to exhibit different degrees of automatization.

doi:10.1016/j.ijpsycho.2010.06.342

Automaticity in emotional processing? Manipulating voluntary attention and implicit semantic priming to bias processing of threatening signals in actions and faces

Swann Pichon, Patrik Vuilleumier
Lab. for Neurology and Imaging of Cognition, Department of Neuroscience, University of Geneva, Switzerland

Perception of threatening signals can influence a wide range of perceptual and behavioral processes: these stimuli benefit from a perceptual advantage and also interrupt ongoing behavior while prompting the motor system for adaptive action. An important issue in cognitive neuroscience is to understand to what extent these processes are immune to attention. It has been shown that the processing of threatening information occurs relatively independently of attention (Dolan & Vuilleumier 2003). Yet, the perceptual advantage for threat can vanish when the perceptual demand of a task is very high (Pessoa et al 2002). A first question we sought to address is whether attentional influence on threat processing is limited to perceptual processes or whether it extends to its behavioral consequences (Pichon, de Gelder, Grèzes, in prep). Neuroimaging studies have indeed reported increased hemodynamic responses to threat in regions associated with motor preparation (premotor cortex) and defensive behavior (periaqueductal gray, hypothalamus), but it remains unknown whether these regions are sensitive to attentional demand. Using fMRI, we scanned subjects while they observed threatening or neutral actions. We show that responses in temporal cortices, and especially the amygdala, were considerably attenuated during unattended, as compared to attended, threat, whereas behavioral interference and activity in motor and defense-related regions were independent of attention. A second question we sought to address is whether alteration of threat processing, as previously shown in the amygdala, necessarily requires voluntary attention or if it can occur outside a subject's awareness (Pichon, Rieger, Vuilleumier, in prep). Indeed, in social psychology, an extensive literature has shown that implicit priming of semantic representations or goals can later bias information processing or behavior (Bargh 2005). Using fMRI, we show that implicit priming with an emotionally laden semantic field modifies the subsequent processing of facial expressions in the fusiform gyrus and the amygdala, but not the processing of pictures of places in the parahippocampal gyrus. Together, these data suggest that: a) although attentional demand impacts the processing of threat in the temporal cortices and amygdala, it does not influence threat responses in regions associated with motor preparation and defensive behaviours; and b) biases in information processing do not necessarily require explicit attention, but can occur following implicit semantic priming. This may be the result of long-lasting changes in brain states, which occur without reaching explicit conscious knowledge.
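To make the attention-by-threat logic concrete, the sketch below shows how one might test, per region of interest, whether the threat response is reduced when attention is directed away from the stimuli. It is not the authors' actual pipeline; the regions, condition ordering and synthetic values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-subject mean responses (arbitrary units) for four conditions,
# shape (n_subjects, 4); column order: attended threat, unattended threat,
# attended neutral, unattended neutral.
rois = {"amygdala": rng.normal(0.5, 0.3, (20, 4)),
        "premotor_cortex": rng.normal(0.5, 0.3, (20, 4))}

for name, betas in rois.items():
    # Attention x threat interaction: (attended threat - attended neutral)
    # minus (unattended threat - unattended neutral), tested against zero.
    contrast = (betas[:, 0] - betas[:, 2]) - (betas[:, 1] - betas[:, 3])
    t, p = stats.ttest_1samp(contrast, 0.0)
    print(f"{name}: attention x threat interaction t = {t:.2f}, p = {p:.3f}")
```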

doi:10.1016/j.ijpsycho.2010.06.343

Modulation of face processing by emotional expression during intracranial recordings in right fusiform cortex and amygdala

Gilles Pourtois a, Patrik Vuilleumier b,c
a Department of Experimental Clinical and Health Psychology, Ghent University, Belgium
b Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, University of Geneva, Geneva, Switzerland
c Clinic of Neurology, University of Geneva, Geneva, Switzerland

We recorded intracranial Local Field Potentials (iLFPs) from structurally intact human visual cortex during several face processing tasks in a patient prior to brain surgery. iLFPs were measured from subdural electrodes implanted in a right fusiform region with face-sensitive activity, and a more medial location in the posterior parahippocampal gyrus with house-selective activity. This electrode implantation allowed us to compare neural responses to different facial properties within two adjacent, but functionally distinct, cortical regions. Several experiments were conducted to determine the temporal dynamics of perceptual and emotional effects on face-specific responses in the right fusiform. Our findings showed an early negative deflection (N200) that primarily reflected category-selective perceptual encoding of facial information, whereas higher-order effects of face individuation and emotional expression produced selective modulations in the same face-specific region during a later time period (from 200 up to 1000 ms post-onset). These results shed new light on the time course of face recognition mechanisms in the human visual cortex, and reveal anatomically overlapping, but temporally distinct, influences of identity or emotional factors on face processing in the right fusiform gyrus, which presumably reflect top-down feedback effects from distant brain areas, including the amygdala. This conjecture was verified by recording iLFPs in a second patient implanted with deep electrodes in the left lateral amygdala. Results disclosed an early emotional effect in the amygdala arising prior to, and independent of, attentional modulation. Altogether, these results suggest that the amygdala is involved in the early visual processing of emotional expression, while face-selective responses within the anterior fusiform gyrus are gated by emotional expression during a later time interval.
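As a rough illustration of the time-window logic described above (an early N200 reflecting categorical face encoding versus later emotional modulation at the same contact), the sketch below averages hypothetical epoched iLFP trials by condition and compares mean amplitudes in an early and a late window. The sampling rate, window limits and condition labels are assumptions, not the recording parameters of the study.

```python
import numpy as np

sfreq = 1000  # Hz, assumed sampling rate
times = np.arange(-0.2, 1.0, 1.0 / sfreq)  # epoch from -200 ms to 1000 ms post-onset
rng = np.random.default_rng(1)

# Hypothetical single-trial LFPs (microvolts) from one fusiform contact,
# shape (n_trials, n_samples), one entry per condition.
epochs = {"fearful_face": rng.normal(0.0, 5.0, (60, times.size)),
          "neutral_face": rng.normal(0.0, 5.0, (60, times.size))}

def window_mean(trials, tmin, tmax):
    """Mean amplitude of the trial-averaged waveform within [tmin, tmax] seconds."""
    mask = (times >= tmin) & (times <= tmax)
    return trials.mean(axis=0)[mask].mean()

for cond, trials in epochs.items():
    n200 = window_mean(trials, 0.15, 0.25)   # early, category-selective window
    late = window_mean(trials, 0.30, 1.00)   # later window where emotion effects emerged
    print(f"{cond}: N200 window = {n200:.2f} uV, late window = {late:.2f} uV")
```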

doi:10.1016/j.ijpsycho.2010.06.344

Adaptation effects of emotional expressions in anxiety: Evidence for an enhanced late positive potential

Anne Richards a, Emily Bethell a, Phil Pell a, Amanda Holmes b
a Department of Psychological Science, School of Science, Birkbeck College, United Kingdom
b School of Human and Life Sciences, Roehampton University, United Kingdom

Prolonged exposure to a stimulus biases perception of a subsequent stimulus away from the adapting stimulus (Clifford & Rhodes, 2005). These aftereffects have been observed with a wide variety of stimuli, including facial identity (Rhodes & Jeffery, 2006) and emotional facial expression (e.g., Webster et al., 2004). In the current experiment, we used this paradigm to examine the effect of anxiety on the perception of emotional expression. Photographs of a male and a female model from the Karolinska Directed Emotional Faces set (KDEF; Lundqvist et al. 1998) were selected, and a series of morphs was created for each model by interpolating between a fear exemplar and a neutral exemplar. A pilot study established a range of morphed expressions falling on either side of the fear/neutral boundary, and these were then used in the main study. There were 16 adaptation blocks in the experiment proper, and each block began with an adapting phase in which participants viewed (and categorized) 35 presentations of one endpoint from the morph continuum. This was immediately followed by a test phase, during which participants viewed and categorized 33 images of the same model (8 dummy trials, 5 top-up trials and 20 test trials: 10 from either side of the categorical boundary). We predicted that all test trials would be perceived as having an emotional expression opposite to that of the adapting stimulus (e.g., adaptation to fear would produce ‘neutral’ classifications for test stimuli that were originally on the ‘fear’ side of the boundary). EEG was recorded using a NeuroScan NuAmps system, and stimulus presentation was controlled using E-Prime. An analysis of the behavioural classification data revealed that adaptation to fear created a shift in the classification of morphs towards ‘neutral’, and adaptation to neutral created a shift towards ‘fear’. This shift was equivalent in high and low anxiety. An analysis of the ERP data, however, revealed a more pronounced late positive potential in the high, but not the low, anxiety group following adaptation to neutral compared to fear, and this effect was more pronounced for male than for female expressions.
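As a minimal sketch of how such a fear-neutral morph continuum could be produced, the code below blends two aligned grayscale photographs by pixel-wise linear interpolation. Dedicated morphing software additionally warps facial geometry, so this is a simplification, and the file names are placeholders rather than the actual KDEF images used.

```python
import numpy as np
from PIL import Image

def morph_continuum(neutral_path, fear_path, n_steps=11):
    """Return images blending a neutral exemplar into a fear exemplar.

    Assumes the two photographs are already spatially aligned and equal in size;
    step 0 is fully neutral and the last step is fully fearful.
    """
    neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
    fear = np.asarray(Image.open(fear_path).convert("L"), dtype=float)
    morphs = []
    for w in np.linspace(0.0, 1.0, n_steps):
        blend = (1.0 - w) * neutral + w * fear   # pixel-wise linear interpolation
        morphs.append(Image.fromarray(blend.astype(np.uint8)))
    return morphs

# Placeholder file names; the KDEF exemplars used in the study are not specified here.
# continuum = morph_continuum("model_neutral.png", "model_fear.png")
```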

doi:10.1016/j.ijpsycho.2010.06.345

Automaticity of emotional face processing as revealed by task independence

Julian Rellecke, Annekathrin Schacht, Werner Sommer
Humboldt-Universität zu Berlin, Department of Psychology, Berlin, Germany

To what extent processing of emotional facial expressions takes place in an automatic fashion is still a matter of debate. One of the most important criteria of automatic processes is their independence from the subject's processing intention. In the current study, intention was manipulated by varying the task directed at angry, happy and neutral facial expressions while emotion-related processing, as indicated by event-related potentials (ERPs), was observed. According to this logic, task-independent emotion effects in ERPs signal automatic processing of emotional stimuli. We focused on two emotion-sensitive ERP components: the Early Posterior Negativity (EPN) and the Late Positive Complex (LPC). While the EPN has been related to increased perceptual encoding, the LPC has been suggested to reflect higher cognitive operations, such as enhanced stimulus encoding into working memory. Using Independent Component Analysis (ICA), task dependence was assessed separately for the EPN and the LPC. While the EPN did not show any variation across conditions, the LPC was affected by the task. This suggests that the increased perceptual encoding of emotional stimuli is a largely automatic process, whereas the influence of emotion on higher cognitive processes depends on the intentional state of the observer. Moreover, the same ICs that accounted for the EPN also accounted for emotion effects in early visual evoked potentials and at later time points. Similarly, the ICs reflecting the LPC showed emotion-specific activity temporally overlapping that of EPN-related components. Such a pattern indicates that processing of emotional facial expressions at perceptual and cognitive levels occurs in parallel.
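To illustrate the analysis logic (decomposing ERPs with ICA and then asking whether a component's activity in a given window varies with the task), here is a sketch using FastICA on synthetic grand-average ERPs. The channel count, sampling rate, time windows and condition labels are assumptions, and the authors' actual pipeline may differ.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
sfreq, n_channels = 250, 32                      # assumed sampling rate and montage size
times = np.arange(-0.1, 0.8, 1.0 / sfreq)

# Hypothetical grand-average ERPs, one per (task, emotion) cell,
# each of shape (n_samples, n_channels).
cells = {(task, emotion): rng.normal(0.0, 1.0, (times.size, n_channels))
         for task in ("emotion_task", "superficial_task")
         for emotion in ("angry", "happy", "neutral")}

# Fit ICA on all cells concatenated in time, then inspect component
# activity in an assumed EPN window (200-300 ms) for each cell.
ica = FastICA(n_components=10, random_state=0)
ica.fit(np.vstack(list(cells.values())))
epn_mask = (times >= 0.2) & (times <= 0.3)

for (task, emotion), erp in cells.items():
    sources = ica.transform(erp)                  # component time courses for this cell
    epn_activity = np.abs(sources[epn_mask]).mean(axis=0)
    print(task, emotion, np.round(epn_activity[:3], 2))
```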

doi:10.1016/j.ijpsycho.2010.06.346

Symposium 17: Sensory Processing in Developmental Psychopathology
Symposium Chair: Nicole Bruneau (France)

Sensory processing in developmental disorders

Nicole Bruneau
UMRS INSERM U 930, CNRS ERL 3106, Université François-Rabelais de Tours, Centre de Pédopsychiatrie, CHRU de Tours, IFR 135 « Imagerie Fonctionnelle », 37000 Tours, France

Studies in sensory neuroscience have revealed the critical importance of accurate sensory perception for cognitive development. Studying sensory processes might therefore improve our understanding of behavioural and cognitive difficulties in developmental disorders, and electrophysiology is well suited to such an approach. This symposium comprises two talks on sensory processing in autism and two on sensory processing in children with dyslexia. Atypical brain reactivity associated with automatic change detection has been demonstrated in both auditory and visual modalities in children with autism and might be related to their behavioural difficulties in adapting to environmental change. The second talk shows atypical voice processing in autism and proposes that communication and language disorders in autism might have their roots in this abnormal sensory processing. Processing of the rate of onset of the amplitude envelope (rise time) of auditory tone stimuli has been found to be abnormal in children with dyslexia, and this is proposed to be the sensory correlate of the phonological impairment underlying their reading disorders. Single-letter reading is an early predictor of later reading abilities; the brain mechanisms involved in this elementary reading task as performed by children with dyslexia will be described in the fourth presentation.

doi:10.1016/j.ijpsycho.2010.06.347

Visual automatic change detection in children with autism: An electrophysiological study

H. Cléry *,a, N. Bruneau a, S. Roux a, C. Barthélémy a, P. Lenoir b, F. Bonnet-Brilhault a, M. Gomot a
a UMRS Imagerie et Cerveau, Inserm U930, CNRS ERL 3106, Université François Rabelais de Tours, CHRU de Tours, IFR 135 de Tours, France
b Centre de Ressources Autisme, CHRU de Tours, France

Introduction: Clinical observations of autistic patients show that they react in an unusual way to unattended sensory stimuli that appear in the environment. These atypical behaviors can be observed in all sensory modalities and could be due to a dysfunction of a perceptual nature. Several studies have investigated auditory change detection in children with autism (CWA) and have highlighted atypical change processing in this population. Gomot et al. (2002, Psychophysiology, 39:577-584) studied the brain processes involved in automatic change detection in CWA using scalp potentials and SCD mapping. Their study showed a reduced MMN latency associated with an atypical topography involving the left frontal area in autism. The aim of the present study was to determine whether these abnormalities in change detection are independent of the sensory
