
Tactile stimulation and hemispheric asymmetries modulate auditory perception and neural responses in primary auditory cortex


NeuroImage 79 (2013) 371–382

Contents lists available at SciVerse ScienceDirect

NeuroImage

journal homepage: www.elsevier.com/locate/ynimg

Tactile stimulation and hemispheric asymmetries modulate auditory perception and neural responses in primary auditory cortex

M. Hoefer a,b,⁎, S. Tyll a,b, M. Kanowski b, M. Brosch c, M.A. Schoenfeld b,c, H.-J. Heinze b,d, T. Noesselt a,d

a Department of Biological Psychology, Otto-von-Guericke-University Magdeburg, Postfach 4120, 39106 Magdeburg, Germany
b Department of Neurology, Otto-von-Guericke-University Magdeburg, Leipziger Str. 44, 39120 Magdeburg, Germany
c Leibniz Institute of Neurobiology, Brenneckestraße 6, 39118 Magdeburg, Germany
d Center for Behavioral Brain Sciences, Magdeburg, Germany

⁎ Corresponding author at: Otto-von-Guericke-University Magdeburg, Department of Biological Psychology, Postfach 4120, 39106 Magdeburg, Germany. Fax: +49 391 6711947.

E-mail address: [email protected] (M. Hoefer).

1053-8119/$ – see front matter © 2013 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.neuroimage.2013.04.119

Article info

Article history: Accepted 27 April 2013; Available online 9 May 2013

Keywords: Audiotactile; Human; fMRI; Multisensory; Temporal integration

Abstract

Although multisensory integration has been an important area of recent research, most studies have focused on audiovisual integration. Importantly, however, the combination of audition and touch can guide our behavior just as effectively, which we studied here using psychophysics and functional magnetic resonance imaging (fMRI). We tested whether task-irrelevant tactile stimuli would enhance auditory detection, and whether hemispheric asymmetries would modulate these audiotactile benefits, using lateralized sounds. Spatially aligned task-irrelevant tactile stimuli could occur either synchronously or asynchronously with the sounds. Auditory detection was enhanced by non-informative synchronous and asynchronous tactile stimuli, if presented on the left side. Elevated fMRI-signals to left-sided synchronous bimodal stimulation were found in primary auditory cortex (A1). Adjacent regions (planum temporale, PT) expressed enhanced BOLD-responses for synchronous and asynchronous left-sided bimodal conditions. Additional connectivity analyses seeded in right-hemispheric A1 and PT for both bimodal conditions showed enhanced connectivity with right-hemispheric thalamic, somatosensory and multisensory areas that scaled with subjects' performance. Our results indicate that functional asymmetries interact with audiotactile interplay, which can be observed for left-lateralized stimulation in the right hemisphere. There, audiotactile interplay recruits a functional network of unisensory cortices, and the strength of these functional network connections is directly related to subjects' perceptual sensitivity.

© 2013 Elsevier Inc. All rights reserved.

Introduction

During the last decade scientific interest in multisensory interplay (MSI) and its neural foundation has increased remarkably (see Driver and Noesselt, 2008, for review). At first, higher-order association cortex and subcortical areas like the superior colliculi were perceived as key regions in which MSI might take place (Lewis and Van Essen, 2000; Meredith and Stein, 1986; Wallace et al., 1993). However, this view has been challenged by more recent studies indicating that MSI also takes place in low-level sensory-specific areas (for review see e.g. Cappe et al. (2009b)) and can even occur at subcortical stages (Cappe et al., 2009a; Musacchia et al., 2006; Noesselt et al., 2010).

So far, most human imaging studies on MSI focused on audiovisual and visuotactile pairings (e.g. Kim and James, 2010; Macaluso et al., 2000; Molholm et al., 2002; Ramos-Estebanez et al., 2007; Shams et al., 2002; Werner and Noppeney, 2010a). Relatively few imaging studies have used audiotactile combinations (Foxe et al., 2002; Schürmann et al., 2006; for electrophysiological effects see e.g. Caetano and Jousmäki, 2006; Foxe et al., 2000; Gobbelé et al., 2003; Murray et al., 2005), though auditory and tactile signal transduction times from the ear and skin are much faster than visual signal transduction times. Therefore, audiotactile integration may rely on a mechanism distinct from those of audiovisual and visuotactile interplay. In accord, early modulations in macaques' primary auditory cortex have been found for tactile but not visual stimulation (Lakatos et al., 2007). Thus, we chose to identify the neural basis of audiotactile interplay in humans and specifically focused here on auditory detection performance.

Previous electrophysiological studies in macaques reported tactileinfluences on low-level auditory cortex (e.g. Lakatos et al., 2007; for re-view see Kayser and Logothetis, 2007). Further, according to tracingstudies inmacaques, several sources of somatosensory inputs to audito-ry cortex exist, including insular (Hackett et al., 2007), somatosensory(Cappe and Barone, 2005) and multisensory cortex plus posterior tha-lamic nuclei (De la Mothe et al., 2006; for review see Musacchia andSchroeder, 2009; Smiley and Falchier, 2009). However, none of thesestudies has directly linked their neurophysiological/anatomicalmarkerswith behavior.

Like in many cortical functions, hemispheric asymmetries may also play a role in MSI: it seems to predominantly recruit right-hemispheric cortex (e.g. Downar et al., 2000; Giard and Peronnet, 1999; Molholm et al., 2002), though others have found less lateralized responses and evidence for integration in the left hemisphere (e.g. Kayser et al., 2005; Murray et al., 2005; see Ramos-Estebanez et al., 2007 for left-hemispheric effects in visuotactile interplay). In line with the notion of right-hemispheric dominance, Bushara et al. (2001) found a right-sided network of regions for audiovisual temporal judgments; and Noppeney and colleagues reported correlations of the strength of fMRI-signals in the right STS, auditory cortex and MT+ with behavioral responses using audiovisual stimuli (Lewis and Noppeney, 2010; Werner and Noppeney, 2010a). This indicates that in a multisensory context neural responses in the right hemisphere may be more closely linked to behavioral effects. However, none of the studies above directly tested for a hemispheric specialization with lateralized stimuli.

On the behavioral level several effects of audiotactile stimulus combinations have been reported: for instance, the ventriloquist illusion reportedly also works for audiotactile pairings (Bruns and Röder, 2010). Moreover, task-irrelevant tactile stimuli may also improve auditory sensitivity (Gillmeister and Eimer, 2007), though effects of hemispheric asymmetries were not tested there.

In a first human psychophysical experiment we directly tested for an increase in auditory sensitivity due to touch and for an effect of hemispheric dominance, using left- and right-sided stimulation outside the MR scanner. We then investigated in an fMRI experiment how any increase in auditory sensitivity by a co-occurring touch, and any effects of stimulated side, relate to the modulation of regional fMRI-signals and their inter-regional effective connectivity, and how these changes in connectivity scale with subject-specific behavioral performance.

Materials and methods

Participants

All subjects were paid volunteers (6 €/h) and naïve with respect to the purpose of the study (except for one subject, who is an author). They provided informed consent in accord with local ethics clearance and had normal hearing and normal or corrected-to-normal vision. Recent studies also indicate that musical experience can shape multisensory integration (see e.g. Paraskevopoulos et al., 2012a, 2012b). However, this influence was less well known when we started our experiment and was therefore not assessed.

Behavioral experiment outside the scanner

This audiotactile experiment was run outside the scanner to confirm that any behavioral effects could not be attributed to the scanner environment. 24 subjects (eleven female; age range: 19–30 years, one left-handed) participated. Two subjects (one female) were excluded due to performance below chance level in the auditory detection trials. Thus, the data of 22 subjects were analyzed.

fMRI experiment

23 subjects (eleven female; age range: 21–31 years, two left-handed) participated in an fMRI experiment, to identify the neural underpinnings of the behavioral effects. Two subjects (both female) were excluded from the fMRI analysis because they stopped the experiment prior to the completion of all 6 experimental sessions; a third subject (male) was excluded because his movement parameters exceeded 5 mm in abrupt head movements. We included the behavioral data of all subjects in the later behavioral analysis, so as not to unduly bias the behavioral results; for fMRI analysis the data of 20 subjects were used.

Stimuli and apparatus

The following stimulus conditions were employed (see Fig. 1a): (a) unisensory: auditory stimulation (A), tactile stimulation (T), and (b) multisensory: synchronous audio–tactile stimulation (ATS) and asynchronous audio–tactile stimulation with auditory stimulation preceding the tactile stimulus by 200 ms (ATAS). Further, there was a baseline condition in which no stimulus was presented (N). Auditory stimuli were presented on the left or right side (behavior: 11° visual angle; fMRI: 10° visual angle). Note that the lateralized auditory stimulation used here was similar to that used in earlier auditory studies, which yielded approximately 80% correct responses in an auditory localization task (see Bonath et al., 2007). Left- and right-sided presentations were counterbalanced and tactile stimulation was equally likely for target and non-target trials. This design led to ten experimental conditions (synchronous bimodal stimulation, asynchronous bimodal stimulation, unimodal tactile stimulation, and unimodal auditory stimulation, plus the baseline condition, each for the left and the right side). Although the two baseline conditions are virtually identical, we treated them as two separate conditions since they were intended as control conditions for each side. Following a suggestion by an anonymous reviewer, we additionally calculated the behavioral data with a collapsed baseline condition (see Supplementary Tables S1, S4 and S5), which gave virtually identical results.

Note that bimodal conditions were always presented on the same side. This was realized by attaching two piezo-electric speakers to the left and the right side of the central fixation point at the top of the scanner bore. Thus, free-field auditory stimulation was used, instead of monaural stimulation with headphones (Jäncke et al., 2002; Scheffler et al., 1998). Monaural stimulation may have yielded a different activation pattern, but was not used in order to maximize audiotactile spatial congruence, which is essential for multisensory integration to occur (e.g. Stein and Stanford, 2008). Thus, we chose free-field stimulation because of its high ecological validity.

The irrelevant tactile stimuli were non-vibratory pressure pulses. They were delivered via diaphragms inflated by pulses of pressurized air, controlled by a somatosensory stimulus generator (4-D Neuroimaging, San Diego, California). Two lip clips were applied to the lower lip (left and right corner of the mouth). The pneumatic bursts caused a 50 ms deflection of the membrane of the lip clips, which felt like a soft touch to the lip (see below for the pressures employed in the two experiments). We chose the lips instead of the fingertips for stimulation because they are very sensitive, have a robust contralateral cortical representation (see e.g. Iannetti et al., 2003; Nguyen et al., 2004; Penfield and Boldrey, 1937), and because they are located close to the auditory stimulation, which was close to the head. Hence, we minimized audiotactile spatial disparity in our experiments.

Behavior outside scanner

Visual stimuli were presented on a 21 inch CRT monitor (Samsung SyncMaster 1100MB). A central fixation point was presented throughout the experiment and subjects were instructed to fixate. Auditory stimuli were presented via two piezo-electric speakers (to mimic stimulus presentation inside the scanner) attached to the left and right side of the monitor (11° visual angle). Each auditory target stimulus was a white noise burst with a duration of 50 ms (35 dB on average).
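The speaker eccentricities above are given in degrees of visual angle, which translate into a lateral offset on the setup via simple trigonometry. A minimal sketch; the helper name is hypothetical and the 60 cm viewing distance is an assumed value for illustration (the actual distance is not reported here):

```python
import math

# Hypothetical helper: lateral offset (in cm) of a speaker placed at a given
# visual angle. The default viewing distance is an assumption, not a value
# taken from the study.
def lateral_offset_cm(angle_deg, viewing_distance_cm=60.0):
    # offset = distance * tan(angle), with the angle measured from fixation
    return viewing_distance_cm * math.tan(math.radians(angle_deg))

offset = lateral_offset_cm(11.0)   # ~11.7 cm from fixation at 60 cm distance
```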

fMRI

Stimulation inside the scanner was identical to that outside, with the following exceptions: visual response cues were presented via a mirror system onto a rear-projection screen located at the base of the scanner bed. Auditory stimuli were presented via two piezo-electric speakers attached to the top of the scanner bore (±10° visual angle). All stimuli were presented during silent periods (2 s) interleaved with periods of scanning (2 s) to prevent scanner noise from interfering with the perception of the auditory and tactile stimuli (rapid-sparse sampling protocol; see below for the scanning protocol).

Fig. 1. Experimental conditions and design. A: Left column depicts unisensory conditions with sounds from the left or right (top) or with tactile stimuli delivered to the lower left or right lip. Middle column depicts bisensory conditions with synchronous (left) or asynchronous stimulation (right). Note that bisensory stimuli were always in spatial alignment. Right column depicts the no-target baseline condition, in which only the response cue was presented. B: Temporal sequence of sound-only target trials (left) and touch-only no-target trials (right). In all conditions a visual response cue (question mark) indicated that a manual response was required.


Design and procedure

Participants performed a forced-choice auditory detection task and were instructed to ignore any co-occurring tactile stimulus. They were told to report via button press, using the right hand, whether or not they had heard a sound whenever a question mark appeared right above the fixation point (see Fig. 1b). The question mark always appeared 350 ms after a stimulus onset or a virtual no-stimulus onset. It was introduced to signal when a trial was over and a response was required, thereby enabling us to obtain responses in all trials. In both experiments subject-specific accuracy was set to approximately 60–80% in a threshold-determination run prior to the main experiment by adjusting the sound pressure level accordingly. To this end, subjects were presented with white-noise bursts at seven different dB levels (ranging from 28 to 40 dB for each side) in random order. After each trial they responded via one of two buttons whether they had heard a sound. These data were analyzed before the main experiment and the dB level with a detection rate closest to 75% was chosen. Further, subjects wore ear plugs not only during the fMRI experiment but also during the behavioral experiment, to keep the experimental settings as similar as possible. It was therefore very unlikely that any sound made by the inflation of the lip clips would cause activations in auditory cortex, and all subjects reported hearing no sounds associated with tactile stimulation.
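The dB-level selection described above can be sketched as follows; the function name and the example detection rates are illustrative, not values from the study:

```python
import numpy as np

# Illustrative sketch of the threshold-determination step: present white-noise
# bursts at seven dB levels and keep the level whose detection rate is closest
# to the 75% target.
def pick_db_level(db_levels, detection_rates, target=0.75):
    rates = np.asarray(detection_rates, dtype=float)
    idx = int(np.argmin(np.abs(rates - target)))  # smallest |rate - target|
    return db_levels[idx]

levels = [28, 30, 32, 34, 36, 38, 40]               # dB levels, as in the text
rates = [0.42, 0.55, 0.64, 0.73, 0.81, 0.90, 0.96]  # example detection rates
chosen = pick_db_level(levels, rates)               # -> 34 (0.73 closest to 0.75)
```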

Behavioral experiment

All stimuli were presented with a mean ISI of 2000 ms (range: 1700–2300 ms, rectangular distribution). Tactile stimuli were delivered by a Neuroscan 4-D tactile stimulator (San Diego, USA). Every subject completed six runs (320 trials per run; 20 (ATS), 20 (ATAS), 40 (A), 40 (T), 40 (N) per side). The number of trials with touch plus sound (40), sound (40), touch (40) and no stimulation (40) was held identical to ensure unbiased responses.

fMRI

After the subject-specific threshold determination, a functional localizer was run to identify brain areas specialized in processing the auditory and tactile stimuli employed here. This run had a block design with 4 conditions (sound left, touch left, sound right, touch right; block length 20 s; 4 blocks per condition, with blocks separated by a 12 s baseline condition). The order of the unisensory blocks was pseudo-randomized. Finally, the data of the six experimental runs were collected; subjects responded to 960 stimuli (160 trials per run; for each hemifield: 10 (ATS), 10 (ATAS), 20 (A), 20 (T), and 20 no-stimulus trials (N) per side). Again, the number of trials for each side with touch plus sound (20), sound (20), touch (20) and no stimulation (20) was held identical to ensure unbiased responses. Stimuli were presented with a mean ISI of 4000 ms (range: 1600–6400 ms, Poisson distributed) within silent interscan periods. Stimulus jittering was optimized for event-related response estimation (Dale, 1999; Friston et al., 1999; Hinrichs et al., 2000).
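One way to generate ISIs with the stated mean, range and Poisson profile is to draw Poisson counts on a discrete grid above the minimum ISI and clip to the range. This is a hedged sketch: the 800 ms grid step is an assumption, not a parameter reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def jittered_isis(n_trials, mean_ms=4000, lo_ms=1600, hi_ms=6400, step_ms=800):
    """Draw ISIs with a truncated-Poisson profile on a discrete grid,
    matching the stated mean (4000 ms) and range (1600-6400 ms)."""
    lam = (mean_ms - lo_ms) / step_ms       # Poisson mean in grid steps (= 3)
    k = rng.poisson(lam, size=n_trials)     # number of steps above the minimum
    return np.clip(lo_ms + k * step_ms, lo_ms, hi_ms)

isis = jittered_isis(160)                   # one run's worth of trials
assert isis.min() >= 1600 and isis.max() <= 6400
```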

fMRI data acquisition and analysis

All fMRI data were collected using a Siemens TRIO 3 Tesla MR scanner equipped with an eight-channel head coil (Siemens, Erlangen, Germany). A rapid-sparse sampling protocol was used; it has been successfully used before in other studies (see e.g. Bonath et al., 2007, 2013; Noesselt et al., 2007, 2010; Tyll et al., 2013) (131 volumes for the localizer run and 123 volumes per experimental run, with 32 slices covering the whole brain; TR: 4000 ms including a silent pause of 2000 ms; TE: 30 ms; spatial resolution: 3.5 × 3.5 × 4 mm; FOV: 224 × 224 mm).

The fMRI data were analyzed with SPM 5 (Wellcome Department of Cognitive Neurology, University College London, London, UK) using standard preprocessing procedures including correction for motion artifacts, slice-time acquisition correction, normalization into standard stereotactic space using the MNI template (Montreal Neurological Institute), and spatial smoothing using a 6 mm full-width-at-half-maximum Gaussian kernel. Subject-specific effects were analyzed in a first-level model with all experimental conditions per run plus the realignment parameters as nuisance regressors to account for residual motion artifacts. A second-level analysis was then performed across subjects in MNI space. Condition-specific effects for each subject were estimated according to the general linear model and passed to the second-level analysis as contrasts. This involved creating the contrast images (pooled, i.e., summed over runs) pertaining to the conditional effects.

To obtain contrasts devoid of any remaining effects due to scanner vibrations, our analysis utilized differential modulations.

Group data were analyzed using a random-effects ANOVA with the behavioral data (hit rate/correct rejection) as further covariates. In addition, the psycho-physiological interaction (PPI) approach (Friston et al., 1997) was used for the analysis of effective connectivity, in which the parameter d-prime was used as a covariate. The PPI connectivity analysis was seeded in a spherical region (4 mm diameter) surrounding specific maxima in auditory cortex identified in the group ANOVA.
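Conceptually, a PPI regressor is the element-wise product of the (centered) seed time course and the (centered) psychological condition vector. The sketch below illustrates only this core idea; SPM's actual implementation additionally forms the product at the neuronal level after hemodynamic deconvolution, which is omitted here, and all data are made up.

```python
import numpy as np

# Simplified PPI sketch: interaction term = centered seed signal x centered
# condition vector. (No deconvolution; illustrative data only.)
def ppi_regressor(seed_ts, condition):
    seed = np.asarray(seed_ts, dtype=float)
    psych = np.asarray(condition, dtype=float)
    # mean-center both terms so the product captures condition-dependent
    # coupling rather than main effects
    return (seed - seed.mean()) * (psych - psych.mean())

seed = [1.0, 2.0, 1.5, 2.5, 3.0, 2.0, 3.5, 2.5]   # seed BOLD signal (8 scans)
cond = [0, 0, 0, 0, 1, 1, 1, 1]                   # condition off/on
ppi = ppi_regressor(seed, cond)
```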


A two-step approach was used for the analysis of the fMRI data. In the first step (contrast of interest), the p-value was set to p < .05 (corrected for multiple comparisons at the voxel level). Second, a subsequent characterization was done within these modulated areas at p < .05 (uncorrected).

For the exact identification of brain regions, the Matlab-based anatomy toolbox for SPM with probability maps for primary auditory and somatosensory regions (Eickhoff et al., 2007) and the Talairach Daemon (especially for subcortical regions; Lancaster et al., 1997, 2000) were used. Throughout this article, A1 labeling refers to voxels within area TE 1.0 with a probability of 50% or higher.

Results

Behavioral results

Fig. 2 (left side; see Supplementary Figs. S1–S2 for data split by inside/outside scanner) depicts the mean percentage of correct responses for sound trials (mean hit rate) and no-sound trials (mean correct rejection) with and without task-irrelevant concurrent or asynchronous touch. The behavioral results from all subjects participating in the fMRI experiment (n = 23) and subjects participating in the behavioral experiment (n = 22) were collapsed, since no significant effect between groups inside and outside the scanner was found (see below for statistical details). Whenever a task-irrelevant tactile stimulus was presented on the subjects' left side, their auditory sensitivity was improved. Moreover, for the touch-only conditions the correct rejection rate was higher than for the no-stimulus condition. The behavioral data (hits and correct rejections for sound and no-sound conditions, respectively) were subjected to a three-way repeated-measures ANOVA with the within-subject factors sound (present/absent), touch (present/absent) and side (left/right) and the between-subject factor experiment (inside/outside scanner) (see S1 for an ANOVA with the factors sound and side). The ANOVA revealed a significant main effect of sound (F(1, 43) = 29.7, p < .001) and significant interaction effects between the two factors sound and side (F(1, 43) = 6.7, p < .02), as well as the three factors sound, touch and side (F(1, 43) = 7.2, p < .01). No significant main effect and no interactions were found for the between-subject factor inside/outside scanner (all p's > .15). Additionally, post-hoc paired-sample t-tests were performed and revealed the following results.

Fig. 2. Behavioral results of experiments 1–2. Left: Bar graph shows the mean percentage (±SEM) of correct responses (n = 45) for the five conditions ATS, ATAS, A, T and N presented on the left (L) and the right (R) side. Right: Bar graph shows the mean d-prime (±SEM) (i.e. perceptual sensitivity) for the three conditions ATS, ATAS and A, presented on the left and the right side. *p < .05, **p < .01. Subjects reacted significantly faster in the bimodal synchronous condition compared to the unimodal auditory condition (ATS L > AL: t(1, 44) = −2.7, p < .005; ATS R > AR: t(1, 44) = −2.4, p < .02) and to the bimodal asynchronous condition (ATS L > ATAS L: t(1, 44) = 4.2, p < .001; ATS R > ATAS R: t(1, 44) = 4.0, p < .001), in the unimodal auditory conditions compared to the bimodal asynchronous conditions (ATAS L > AL: t(1, 44) = 2.0, p < .02; ATAS R > AR: t(1, 44) = 3.1, p < .002), as well as in the unimodal tactile conditions compared to the control conditions (TL > NL: t(1, 44) = −9.9, p < .001; TR > NR: t(1, 44) = −9.4, p < .001).

For the left side, subjects' performance in synchronous bimodal stimulus detection was significantly increased relative to unimodal auditory stimulus detection (ATS L > AL: t(1, 44) = 2.1, p < .02); detection rates for the asynchronous bimodal condition (not included in the factorial ANOVA) were also significantly increased relative to sounds alone (ATAS L > AL: t(1, 44) = 2.6, p < .01) but did not differ from the synchronous bimodal condition (ATAS L vs. ATS L: p = .68). The hit rate for the unimodal auditory conditions did not differ significantly between the left and the right side (AL > AR: t(1, 44) = 1.3, p = .1), and no significant correlation between subject-specific auditory performance and audiotactile performance was evident. In contradistinction, on the right side the hit rate was significantly higher for the unimodal auditory condition compared to the asynchronous bimodal presentation (ATAS R > AR: t(1, 44) = −2.0, p < .03) but not statistically different from the synchronous bimodal condition (ATS R > AR: t(1, 44) = −1.1, p > .25). The correct rejections were significantly higher for the tactile condition than for the control condition (T > N: t(1, 44) = 3.942, p < .001, collapsed across sides; TL > NL: t(1, 44) = 2.3, p < .02; TR > NR: t(1, 44) = 3.1, p < .005), indicating that subjects were not only more accurate in identifying a trial as a sound trial on the left side when a task-irrelevant and completely uninformative tactile stimulus was presented, but were also better in identifying a no-sound trial when a tactile stimulus was presented (see Fig. 2, left side). In total, 29 of our subjects showed a larger effect of audiotactile compared to auditory stimulation on the left than on the right side, 11 subjects showed an opposite effect, and 5 out of 45 participants showed no lateralization. Further, we directly compared the performance in the touch-only condition and found it to be significantly higher on the right side (TR > TL: t(1, 44) = 2.5, p < .01). Nevertheless, the means of those two conditions differed by less than 2% (TR: 91.7 ± 0.8; TL: 90.1 ± 0.8).

To formally test the assumption that an improvement in sensory sensitivity had taken place, we also computed and analyzed perceptual sensitivity and response bias, as indexed by the signal-detection parameters d-prime (d′) and criterion (Green and Swets, 1966; Noesselt et al., 2010; Stanislaw and Todorov, 1999). Paired t-tests (see Fig. 2, right side, and S2–S4 for d′ data with two no-stimulus conditions and a collapsed baseline condition) revealed a significantly greater d′ for both bimodal conditions on the left side relative to the left unimodal auditory condition (ATS L > AL: t(1, 44) = 1.7, p < .05; ATAS L > AL: t(1, 44) = 1.9,


p < .05). No further significant effects were found, neither for conditions with stimulation on the right side nor for the criterion on either side. Together, our accuracy results consistently demonstrate that task-irrelevant tactile stimulation can enhance auditory perceptual sensitivity if bimodal stimuli are presented on the left side.
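The signal-detection indices used above follow the standard formulas (Stanislaw and Todorov, 1999): d′ = z(hit rate) − z(false-alarm rate) and criterion c = −(z(hit rate) + z(false-alarm rate))/2. A minimal sketch; note that rates of exactly 0 or 1 would need a correction before taking z:

```python
from statistics import NormalDist

# Standard signal-detection computation: z is the inverse of the standard
# normal CDF. Example rates below are illustrative, not study data.
def d_prime_and_criterion(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

dp, c = d_prime_and_criterion(0.75, 0.25)   # symmetric case: c = 0, d' ~ 1.35
```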

In addition to accuracy measures, reaction-time measures have also been used to characterize MSI (Diederich and Colonius, 2004; Rach et al., 2011). Hence, we also analyzed reaction times, again using a three-way repeated-measures ANOVA with the within-subject factors sound (present/absent), touch (present/absent) and side (left/right) and the between-subject factor experiment (inside/outside scanner). The ANOVA revealed a significant main effect of sound (F(1, 43) = 341.8, p < .001), a significant main effect of touch (F(1, 43) = 124.5, p < .001) and a significant interaction effect between the two factors sound and touch (F(1, 43) = 72.6, p < .001). No significant main effect was found for the factor side, and no interaction effect with it was significant (all p's > .19) (see Fig. 3 for details and post-hoc tests). Thus, unlike the lateralized increase in accuracy, speeded RTs for synchronous bimodal stimuli were found for both the left and the right side, whereas slowed RTs were found for asynchronous stimuli, again on both sides. The decreased RTs for the synchronous condition regardless of side, together with the lateralized accuracy gain, suggest that two potentially independent mechanisms of audiotactile interplay may exist: one mechanism speeds up RTs, the other increases accuracy (see e.g. Noesselt et al., 2010 for similar findings with audiovisual stimuli). This pattern of results is not in line with the classical notion of a speed-accuracy trade-off. Further studies are needed to explore whether speed-accuracy trade-offs are modulated in multisensory contexts.

Imaging data

Effects of local BOLD-response

Functional localizer results. We then queried our fMRI-data for the neural basis of these behavioral effects. First, to identify unisensory auditory and tactile, plus putative multisensory regions, fMRI-data from subjects' functional localizer runs were analyzed (thresholded at p < .05, FDR-corrected). Auditory stimuli activated bilateral auditory core, belt and parabelt regions, as expected. Tactile stimuli activated bilateral primary and secondary somatosensory cortex, as well as frontal regions (Brodmann areas 6 and 44), the parietal opercula/insula and the posterior superior temporal gyrus (pSTS; plus the central thalamus at p < .05, uncorr.). Contrasting the sensory modalities on each side of stimulation (sound left > touch left, sound right > touch right, and touch left > sound left, at p < .05, FDR-corr.), differences in the BOLD-signal were found mainly in sensory-specific regions (see Table 1). Potential multisensory regions with enhanced signals for both stimulus types were found bilaterally within the pSTS and the insula (dysgranular insula Id1, granular insula Ig2), as well as in the right precentral gyrus (Brodmann area 6). These t-maps were used as inclusive masks for the experimental runs and served as regions of interest. Note that this selection of regions was orthogonal to the analysis of the main experiment (Kriegeskorte et al., 2009).

Fig. 3. Reaction time results of experiments 1–2. Bar graphs show the influence of auditory and tactile stimulation on mean reaction times (±SEM) in ms for the five conditions ATS, ATAS, A, T and N presented on the left (L) and the right (R) side (n = 45). *p < .05, **p < .01, ***p < .001.

Effects of tactile stimulation on region-specific fMRI-signal. Turning to the experimental runs, group-level voxel-specific modulations (within regions of interest independently identified in the localizer run) were calculated for the multisensory relative to the mean of the unisensory conditions (thresholded at p < .05, FDR-corrected). Due to the lateralized accuracy effects, our primary focus was on testing the data acquired by stimulating the subjects' left side. We tested for the mean criterion (Beauchamp, 2005) by contrasting the left synchronous bimodal condition (ATS L) with the mean of both left unimodal conditions (mean (AL + TL)). A mask (AL > TL) was used to identify effects within areas specialized in lateralized sound processing. BOLD-responses in the left synchronous bimodal condition were enhanced compared to the left unisensory conditions within bilateral STG (planum temporale (PT): contralateral: [x = 58, y = −32, z = 12], t = 5.1; ipsilateral: [x = −54, y = −32, z = 8], t = 5.5) and contralateral primary auditory cortex (as defined by the SPM anatomy toolbox (Eickhoff et al., 2007), A1/TE 1.0; [x = 44, y = −22, z = 8]; t = 4.8; see Fig. 4, upper left panel). Mean beta estimates (proportional to percent signal change) for A1 are plotted in Fig. 4 (lower left panel). A three-way repeated measures ANOVA on the extracted beta estimates in A1 revealed a significant main effect of sound (F(1, 19) = 9.4, p < .01) and of touch (F(1, 19) = 8.4, p < .01), a significant interaction between the factors side and touch (F(1, 19) = 5.1, p < .05) and, most notably, between all three factors (sound, touch, side; F(1, 19) = 4.6, p < .05). No interaction of sound and touch or of sound and side was found (all p's > .29). Additionally, post-hoc paired sample t-tests on the A1 beta estimates were performed. We found that the fMRI-signal estimates for the left bimodal synchronous condition were significantly higher than for the left bimodal asynchronous condition (ATS L > ATAS L: t(1, 19) = 2.5, p < .05) and the unimodal conditions (ATS L > AL: t(1, 19) = 4.1, p < .001; ATS L > TL: t(1, 19) = 4.5, p < .001). Also, the estimates for the left bimodal asynchronous condition were significantly higher than for the left tactile condition (ATAS L > TL: t(1, 19) = 1.7, p < .05), indicating that the audiotactile response exceeded the maximum of the unisensory responses. Finally, we tested for non-linear effects (superadditivity) in A1: the fMRI-signal estimates for the left bimodal synchronous condition were significantly higher than the sum of the unimodal auditory and tactile conditions (ATS L > sum (AL + TL): t(1, 19) = 2.2, p < .05). This pattern was observed neither for the left-sided bimodal asynchronous condition nor for the right-sided bimodal stimulations (all p's > .05).
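These comparisons instantiate the standard criteria for multisensory enhancement: the bimodal response versus the mean, the maximum, and the sum (superadditivity) of the unimodal responses, each assessed with a paired t-test, which reduces to a one-sample t-test on per-subject differences. A minimal sketch with hypothetical beta estimates (not the published data):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """t-statistic of a paired comparison, i.e. a one-sample
    t-test on the differences x - y (positive t: x > y)."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

def msi_criteria(at, a, t):
    """t-values for the three bimodal-vs-unimodal criteria."""
    return {
        "mean": paired_t(at, [(ai + ti) / 2 for ai, ti in zip(a, t)]),
        "max":  paired_t(at, [max(ai, ti) for ai, ti in zip(a, t)]),
        "sum":  paired_t(at, [ai + ti for ai, ti in zip(a, t)]),  # superadditivity
    }

# Hypothetical beta estimates for five subjects (arbitrary units)
at = [1.9, 2.1, 1.7, 2.3, 2.0]   # synchronous audiotactile
a  = [1.0, 1.2, 0.9, 1.1, 1.0]   # auditory alone
t  = [0.4, 0.5, 0.3, 0.6, 0.5]   # tactile alone
res = msi_criteria(at, a, t)
```

The three criteria are ordered by stringency: the mean rule is the most liberal and superadditivity the most conservative, which is why the asynchronous condition reported above can pass the max rule while failing the sum rule.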

Outside A1, additional voxel-based group signal increases for synchronous bimodal stimulation relative to the mean of the unimodal conditions were also found within low-level tactile and multisensory areas (see Table 2 for details).

We then tested whether the modulation in the BOLD-signal observed for left-sided stimulation indeed reflects changes in accuracy (which were observed only for left-sided stimulation) or rather changes in RTs (which were observed for left- and right-sided stimulation). Hence, we analyzed the data from the right-sided stimulation. If we found similar increases in the BOLD-signal in left auditory cortex for right-sided bimodal stimuli as we found in right auditory cortex for left-sided stimulation, this would suggest that the BOLD-response reflects the RT pattern. However, if no BOLD-increase for right-sided stimuli were observed, this would suggest that the BOLD-modulation closely resembles our accuracy results. In accord with our accuracy results, no significant modulation of fMRI-signals was found for the contrast ATS R > mean (AR + TR) in auditory, tactile or multisensory brain regions, even at p < .05 uncorrected. This lack of significant results suggests that the observed RT effect does not arise in low-level auditory or somatosensory regions, unlike the right-hemispheric accuracy effect, but may rather reflect unspecific arousal.

Table 1
Local maxima for sound vs. touch derived from functional localizer scans, independent of the main experiment. MNI-coordinates (x, y, z in mm), t- and p-values for the local maxima plus extent of clusters (p < .05, FDR-corrected; k ≥ 500 unless otherwise mentioned).

SL > TL
Right superior temporal gyrus  58 −30 6  k = 1093  t = 4.56  p = .001
Right superior temporal gyrus (TE 1.1 80%)  42 −28 12  t = 4.25  p = .001
Left superior temporal gyrus  −44 −34 8  k = 1220  t = 7.40  p = .001
Left caudate nucleus  −20 4 22  t = 5.43  p = .001

SR > TR
Right superior temporal gyrus  42 −30 12  k = 771  t = 7.48  p = .001
Left superior temporal gyrus  −42 −36 10  k = 1093  t = 7.20  p = .001
Left superior temporal gyrus (TE 1.1 70%)  −36 −32 14  t = 7.17  p = .001
Left superior occipital/parietal gyrus  −20 −70 26  k = 11,624  t = 5.84  p = .001

TL > SL
Right postcentral gyrus (BA 1 70%) — S1  58 −14 38  k = 3851  t = 9.57  p = .001
Right supramarginal gyrus  60 −26 24  t = 8.01  p = .001
Right insula  42 −4 −2  t = 7.35  p = .001
Right middle temporal gyrus#  54 −60 0  k = 158  t = 4.21  p = .002
Left supramarginal gyrus  −58 −28 26  k = 1174  t = 7.23  p = .001
Left postcentral gyrus — S1  −60 −18 30  t = 7.05  p = .001
Left insula  −38 −4 −4  k = 678  t = 6.28  p = .001
Left precentral gyrus (BA 6 70%)  −58 4 32  t = 5.64  p = .001

TR > SR
Right supramarginal gyrus  58 −26 26  k = 2667  t = 7.96  p = .000
Right insula  42 −4 −8  t = 7.87  p = .000
Left insula  −40 −2 −2  k = 4162  t = 9.28  p = .000
Left supramarginal gyrus  −56 −28 22  t = 8.93  p = .000
Left postcentral gyrus — S1  −60 −18 30  t = 8.00  p = .000

% = percentage of activation allocated to area X (Eickhoff et al., 2007).
# k ≥ 150.

Third, we tested for an effect of asynchronous bimodal stimulation. For the voxel-wise group-level comparison of asynchronous audiotactile stimulation (ATAS L) minus the unisensory mean (AL + TL), enhanced fMRI-signals were also found in auditory, tactile and multisensory regions (see Table 3; Fig. 4, upper right panel). In contrast to the synchronous bimodal condition, we did not find enhanced activity in A1, but in the contralateral planum temporale ([x = 58, y = −32, z = 12], p < .01). Mean beta estimates extracted from right PT are plotted in Fig. 4 (lower right panel). A three-way repeated measures ANOVA revealed a significant main effect of sound (F(1, 19) = 27.7, p < .001) and of touch (F(1, 19) = 16.6, p < .001), as well as significant interactions between sound and touch (F(1, 19) = 7.2, p < .05) and between touch and side (F(1, 19) = 8.8, p < .01). Neither the main effect of side, nor the sound × side interaction, nor the three-way interaction reached significance (all p's > .12). Additionally, post-hoc t-tests were performed (see above) and revealed the following results: the fMRI-signal estimates for both left-sided bimodal conditions were significantly higher than for the unimodal conditions (ATS L > AL: t(1, 19) = 3.7, p < .001; ATS L > TL: t(1, 19) = 3.6, p < .01; ATAS L > AL: t(1, 19) = 2.6, p < .05; ATAS L > TL: t(1, 19) = 2.5, p < .05). Also, we tested for superadditivity: none of the fMRI-signal estimates for the bimodal conditions was significantly higher than the sum of the unimodal auditory and tactile conditions (all p's > .27).

For completeness, the right side was also tested, but the contrast ATAS R > mean (AR + TR) did not reveal significant activations (p < .05, FDR-corrected, k > 20).

In sum, our group results indicate that the somatosensorily induced lateralized gain in auditory sensitivity is related to an increase in fMRI-signals in low-level auditory and tactile as well as multisensory regions. For synchronous stimulation this increase can already be observed in primary auditory cortex. Here, the response to synchronous audiotactile stimulation exceeded the mean, the maximum and the sum of the unisensory responses. For asynchronous stimulation we find enhanced responses in auditory belt regions, but not primary auditory cortex. Moreover, these responses to asynchronous bimodal stimulation are significantly higher than the maximal unisensory response, but not statistically different from the summed unisensory responses. Finally, changes in BOLD-response appear to be lateralized, and thus resemble behavioral accuracy measures, but not reaction time measures.

Analysis of effective connectivity between brain areas (PPI). A1 was chosen as the seed for a connectivity analysis (psycho-physiological interaction; Friston et al., 1997), because significant effects were found in this region when comparing ATS minus the unisensory mean (A + T) for left-sided stimulation. As an additional seed for a second PPI analysis, PT was chosen, because a local maximum was found there for the contrast ATAS > mean (A + T) for left-sided stimulation. Other multisensory regions like the STS did not show significant effects when comparing bimodal and auditory stimulation and were thus not chosen as seeds.

Right A1 served as the seed region for the first connectivity analysis, with the subject-specific d′ included as a covariate. We tested whether we would find enhanced connectivity that scaled with the subject-specific audiotactile benefit (i.e., the increase in sensitivity). For this comparison, we found enhanced coupling of A1 with the superior temporal sulcus (STS), the supramarginal gyrus (SMG), the thalamus and the primary somatosensory cortex (S1), which scaled with subjects' sensitivity (see Fig. 5, left column, and Table 4). Since d′ is a sensitivity index, our connectivity results seeded in A1 support the notion that enhanced detection of a near-threshold auditory stimulus in the context of tactile stimulation is related to increased effective connectivity of A1 with tactile and multisensory regions.
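Conceptually, a psycho-physiological interaction asks whether the coupling (regression slope) between a seed time course and other voxels changes with the psychological context. The sketch below illustrates only the core idea of forming the interaction regressor from toy values; a full SPM-style PPI additionally deconvolves the seed signal to the neural level before forming the product, a step omitted here:

```python
def ppi_regressor(seed, context):
    """Interaction term for a psycho-physiological interaction (PPI).

    seed    : time course extracted from the seed region (e.g. A1)
    context : psychological vector, contrast-coded
              (+1 = bimodal AT scan, -1 = auditory-only scan)

    The seed is mean-centered so the product term is (approximately)
    orthogonal to the main effect of context.
    """
    m = sum(seed) / len(seed)
    centered = [s - m for s in seed]
    return [s * c for s, c in zip(centered, context)]

# Toy example: six scans alternating bimodal (+1) and auditory (-1)
seed    = [1.0, 2.0, 1.5, 2.5, 1.0, 2.0]
context = [+1, -1, +1, -1, +1, -1]
ppi     = ppi_regressor(seed, context)
```

In the group model, a significant regression of a target voxel on this interaction term (over and above seed and context main effects) indicates context-dependent coupling; adding d′ as a covariate, as done here, then tests whether that coupling scales with each subject's sensitivity.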

Right PT was chosen as the seed for a second connectivity analysis, again with d′ as covariate. As before, we tested whether subject-specific performance covaried with connectivity in the context of audiotactile relative to auditory stimulation, but this time in the context of asynchronous stimulation. Here, we found enhanced coupling of PT with the STS, SMG, the thalamus and S1 (see Fig. 5, right column, and Table 5) that scaled with subjects' sensitivity.

Fig. 4. fMRI results in auditory cortex. Top row, left: Enhanced fMRI-signals within auditory cortex overlaid on subjects' mean anatomical image for the contrast ATS L > mean (AL + TL) at [x = 44, y = −24, z = 8]. Right: contrast ATAS L > mean (AL + TL) at [x = 58, y = −32, z = 12]. Bottom row, left panel: Mean beta estimates (±SEM) (proportional to percent signal change) derived from contralateral A1 (yellow circle); right panel: beta estimates (±SEM) of contralateral planum temporale (PT, yellow circle). *p < .05, **p < .01, ***p < .001.


In sum, our results demonstrate that task-irrelevant tactile stimuli enhance perceptual sensitivity in an auditory detection task both when presented at the same time as the sounds and when presented slightly apart in time. This effect was only found for left-sided stimulation, suggesting that hemispheric asymmetries modulate audiotactile interplay. The neural basis of this behavioral gain in accuracy comprises primary auditory cortex for synchronous audiotactile stimuli but extends into auditory belt regions. For the asynchronous stimuli we found enhanced fMRI-signals only in auditory belt regions, especially planum temporale. These fMRI-results were specific to left-sided stimuli, whereas no BOLD-increases for right-sided stimuli were found.

The results from the connectivity analyses (seeded in A1 and PT in the context of synchronous and asynchronous audiotactile stimulation, respectively) indicate that right-hemispheric A1, as well as PT, is functionally coupled with sensory-specific areas like the ventrolateral thalamus and S1, and also with heteromodal areas like the pSTS and the SMG, and that this enhanced connectivity scaled with subjects' behavioral performance.

Discussion

The present study investigated whether the detection of near-threshold sounds can be facilitated by presenting task-irrelevant tactile stimuli and whether functional asymmetries may modulate this effect. Our results consistently demonstrate an improvement in performance when a task-irrelevant tactile stimulus is presented together with an auditory stimulus on subjects' left side. This result was replicated in a second fMRI-experiment with new subjects and showed an increase for synchronous and asynchronous tactile co-stimulation. Importantly, our results imply that the gain in accuracy was accomplished through an improvement in perceptual sensitivity (as demonstrated by a significantly greater d′ for AT relative to A on the left side) rather than through a mere response bias, in accord with earlier studies using audio–visual stimulation (e.g. Noesselt et al., 2010; Werner and Noppeney, 2010a, 2010b). This indicates that the behavioral effect in our study reflects a modulation of early sensory processes rather than a late decisional process, according to the classical hierarchical model (Green and Swets, 1966; see also Odgaard et al., 2003).

Notably, our somatosensorily induced increase in auditory sensitivity was only observed for left-sided stimulation. Our results accord well with previous studies (Downar et al., 2000; Giard and Peronnet, 1999; Molholm et al., 2002), which reported hemispheric differences in MSI, with a stronger involvement of the right hemisphere during multisensory stimulus processing (Molholm et al., 2002) and during change detection in a multisensory context (Downar et al., 2000). Our results are also in line with several fMRI-studies on multisensory integration which reported larger BOLD-modulations in the right hemisphere for multisensory benefits (Lewis and Noppeney, 2010; Werner and Noppeney, 2010a). Others, however (e.g. Ramos-Estebanez et al., 2007), have reported left-hemispheric advantages for visuotactile interplay measured with TMS-phosphene induction, suggesting that either effects of modality combination (visuotactile vs. audiotactile) or task-related effects may significantly shape hemispheric asymmetries. Alternatively, the auditory detection task might itself be lateralized (see e.g. Stracke et al., 2009), with some studies reporting left-hemispheric dominance for auditory detection (Levänen et al., 1996). Given that multisensory interplay should show a stronger effect for weaker unisensory representations (rule of 'inverse effectiveness', Stein and Stanford, 2008), the results by Levänen et al. (1996) suggest that the right hemisphere might show a higher gain when integrating additional information from other senses (though note that detection performance in the auditory task did not differ between left and right sounds in the current study). In accord with the notion of auditory lateralization, it has also been reported that responses in the left hemisphere during auditory spatial processing may rely more strongly on inter-hemispheric transfer, whereas responses in the right hemisphere are driven predominantly by subcortical input (Hausmann et al., 2005; Krumbholz et al., 2007), which may hint at the possibility that audiotactile interplay can occur at a subcortical level (see also Noesselt et al., 2010; Van der Burg et al., 2013 for subcortical mechanisms instrumental in audiovisual interplay). Future studies which directly compare audiotactile interplay under different task-regimes are needed to decide between these alternative explanations.

Table 2
Local maxima for the contrast ATS L > mean (AL + TL). MNI-coordinates (x, y, z in mm), t- and p-values for the local maxima plus extent of clusters within auditory (SL > TL), tactile (TL > SL) and multisensory (Multi) regions identified by the independent functional localizer (p < .05, FDR-corrected; k ≥ 20, unless otherwise mentioned).

SL > TL
Right superior temporal gyrus  58 −30 12  k = 232  t = 5.09  p = .001
Right Heschl gyrus (TE 1.0/1.1 70%) — A1  44 −22 8  t = 4.77  p = .002
Left middle temporal gyrus  −54 −32 8  k = 501  t = 5.50  p = .001
Left superior temporal gyrus  −56 −22 4  t = 4.81  p = .002

TL > SL
Right supramarginal gyrus  52 −32 24  k = 1538  t = 6.05  p = .001
Right superior temporal gyrus  58 −34 18  t = 5.82  p = .001
Right inferior frontal gyrus (p. triangularis)  48 36 4  k = 88  t = 4.12  p = .008
Right middle frontal gyrus  50 44 4  t = 3.47  p = .026
Right precentral gyrus (BA 6 50%)  42 −10 52  k = 28  t = 3.96  p = .010
Right thalamus (VLN)#  12 −14 10  k = 11  t = 3.84  p = .013
Left superior temporal gyrus#  −66 −36 22  k = 16  t = 3.97  p = .010
Left middle temporal gyrus  −40 −62 12  k = 57  t = 4.32  p = .005
Left superior temporal gyrus  −58 −16 12  k = 65  t = 4.21  p = .007
Left insula  −38 −6 2  k = 20  t = 3.58  p = .021

Multi
Right superior temporal gyrus  52 −34 22  k = 660  t = 5.96  p = .001
Right parietal operculum (OP4 60%) — S2  64 −12 12  t = 5.81  p = .001
Right insula (Ig2 30%)  42 −18 8  t = 5.04  p = .001
Left superior temporal gyrus  −56 −32 10  k = 463  t = 5.49  p = .001

% = percentage of activation allocated to area X (Eickhoff et al., 2007).
# k ≥ 10.

Further, we found improved performance not only for synchronous bimodal stimulation, but also when the stimuli were presented slightly apart in time (SOA = 200 ms) with our non-vibratory pulsed stimuli delivered to the lower lip. Recently, Gillmeister and Eimer (2007) reported a behavioral benefit between hearing and touch for auditory stimuli paired with synchronously presented vibrotactile stimuli delivered to the hands, but no effect was observed for asynchronous stimuli, which is in clear contradistinction to our results. Differences in stimulation type (vibrotactile vs. pressure pulse, see e.g. Schürmann et al., 2006) and place of stimulation (hand vs. lip, see Bergenheim et al., 1996; Fu et al., 2003; Menning et al., 2005) may account for the observed differences in the asynchronous condition.

Another reason for our behavioral gain in the non-coincident condition could be the short inter-stimulus interval of 150 ms between the preceding sound and the tactile stimulus, which may still be perceived as synchronous in the majority of trials; thus subjective coincidence might be related to the observed behavioral effects. Further, it has to be considered that the distance neural signals have to travel is shorter from the lips to the brain than, for example, from the finger tips to the brain (Bergenheim et al., 1996), and it has been suggested that more attentional resources are deployed to monitor closer body parts (like the face) than more distant parts (like the hand; Menning et al., 2005). The short transduction times and the deployment of attentional resources could therefore explain the difference we found for non-coincident stimulation. Hence, future studies which directly compare tactile co-stimulation of different body parts are needed.

Table 3
Local maxima for the comparison ATAS L > mean (AL + TL). MNI-coordinates (x, y, z in mm), t- and p-values for the local maxima plus extent of clusters within auditory (SL > TL), tactile (TL > SL) and multisensory (Multi) regions identified by the independent functional localizer (p < .05, FDR-corrected; k ≥ 10, unless otherwise mentioned).

SL > TL
Right superior temporal gyrus#  58 −30 12  k = 7  t = 4.34  p = .015

TL > SL
Right supramarginal gyrus  60 −24 36  k = 10  t = 4.42  p = .016
Right parietal operculum (OP1 30%) — S2  52 −22 22  k = 14  t = 4.17  p = .022
Right insula  42 −2 −2  k = 14  t = 4.42  p = .013

Multi
Right superior temporal gyrus  58 −34 18  k = 213  t = 4.77  p = .003
Right parietal operculum (OP1 70%) — S2  64 −18 16  t = 5.37  p = .004

% = percentage of activation allocated to area X (Eickhoff et al., 2007).
# k ≥ 5.

Besides the increase in perceptual sensitivity, faster reaction times for the synchronous bimodal conditions were observed. Remarkably, the RT results show a different pattern than those of hit rate and d′. First, there is no difference between the sides of stimulation. Second, subjects responded more slowly in the auditory-only condition than in the synchronous bimodal conditions, but they were faster in the auditory than in the asynchronous bimodal condition. This suggests that accuracy and RT may index distinguishable processes (Noesselt et al., 2010), and that the RT decrease in the synchronous condition may rather reflect a general warning effect, whereas differences in accuracy may reflect multisensory interplay, an interpretation that is supported by our neuroimaging findings discussed below.

Effects of local BOLD-response modulations

The neural basis of our behavioral effect was found already in primary auditory cortex, with increased fMRI-responses for synchronous audio–tactile stimulation relative to all other conditions. This enhancement was highest in auditory regions of the right hemisphere. In accord, for audiovisual stimuli a similar effect was found in right-hemispheric STS, MT+ and auditory cortex (Lewis and Noppeney, 2010; Werner and Noppeney, 2010a).

Fig. 5. Covariation of effective connectivity with subject-specific perceptual sensitivity in somatosensory and multisensory regions. Left column: Effective connectivity with A1 in the context of ATS L minus AL, covaried with subject-specific perceptual sensitivity. Right column: Effective connectivity with PT in the context of ATAS L minus AL, covaried with subject-specific perceptual sensitivity. 1 (red) = postcentral sulcus, 2 (orange) = central sulcus, 3 (yellow) = precentral sulcus, 4 (magenta) = lateral sulcus, 5 (purple) = superior temporal sulcus; blue circles = somatosensory regions, green circles = multisensory regions.


Moreover, fMRI-responses to left-sided synchronous bimodal stimulation in right-sided A1 were also enhanced relative to the mean, the maximum and the sum of the unisensory conditions (AL + TL). For the asynchronous conditions we found enhanced fMRI-signals in right-hemispheric PT, which was also modulated by synchronous bimodal stimulation. Here, however, fMRI-signals only exceeded the maximal unisensory response, but not the sum of the unisensory responses. Thus, despite the similar increase in perceptual sensitivity for coincident and non-coincident bimodal stimulation, the neural bases of these effects differed not only in the local maximum of the effect but also in the observed response profiles: while we found non-linear enhancement in A1 for coincident bimodal stimulation, this effect was not observed in PT for either coincident or non-coincident bimodal stimulation. This suggests that neural superadditive effects may be related to coincident bimodal stimulation, but do not always directly translate to behavioral effects. Moreover, no effects in auditory cortex were found for right-sided stimulation, even with a more lenient threshold (p < .05, uncorrected), again in accord with our accuracy gains, but not our RT-decreases.

The distinct response profiles for synchronous and asynchronous stimuli in auditory core and (para-)belt areas might be explained by their differential response latencies to sounds: the only auditory region showing enhanced activity to asynchronous bimodal stimulation was PT. Potential homologues in macaque auditory cortex are the caudomedial (CM) and the caudolateral (CL) regions. Many neurons in these areas respond to somatosensory stimulation (Fu et al., 2003; Kayser et al., 2005; Schroeder et al., 2001), possibly due to somatosensory input from the retroinsular cortex and granular insula (Smiley et al., 2007). Importantly, CM can be distinguished from A1 by its longer response latencies to sounds (Recanzone et al., 2000). Thus, these auditory belt regions might have broader temporal integration windows, leading to enhanced responses to non-coincident bimodal stimulation.

We also found increased fMRI-responses in somatosensory regions for both bimodal stimulations relative to all other conditions (synchronous: insula, SMG, thalamus (VLN); asynchronous: insula, SMG, parietal operculum; see Tables 2 and 3), suggesting that processing is enhanced both in the 'driven' auditory modality and in the 'driving' tactile modality.

No enhanced fMRI-signals for bimodal stimuli were found in the multisensory STS, which is known to be a convergence zone for audio–visual stimuli (Beauchamp et al., 2004; Noesselt et al., 2007, 2010; Sadaghiani et al., 2009; Schroeder and Foxe, 2002; Werner and Noppeney, 2010a, 2010b; for review see: Calvert and Thesen, 2004; Ghazanfar and Schroeder, 2006; Smiley and Falchier, 2009), though Beauchamp et al. (2008) reported increased fMRI-signals for the integration of visual, auditory and tactile information in STS. Their hypothesis that STS-responses reflect sensory processing rather than task performance is well supported by our data (though see Werner and Noppeney, 2010b). We found an overlap of tactile and auditory BOLD-responses in the STS for the localizer scans (no task), but not for differential BOLD-responses comparing experimental conditions (detection task). Likewise, no significant effects were observed in the superior colliculi, which have been implicated in some human studies (Leo et al., 2008; Maravita et al., 2008; Stevenson et al., 2010; Werner and Noppeney, 2010b); this could be due to the detection task used here, or to the fMRI-data acquisition, which was not optimized for signal detection in the superior colliculi. Although we tried to minimize vibrations and scanner noise during stimulus presentation by using rapid-sparse sampling (silent period of 2 s), scanner noise may nevertheless have interfered with our data. Shah et al. (1999) indicated that the duration of the periods without scanner noise is relevant, because longer silent periods lead to more pronounced BOLD responses. Hence, further studies are needed to corroborate our findings in auditory cortex and to ascertain whether slower sparse sampling might yield higher BOLD responses in the areas found in this study, or activation of additional areas (though note that the behavioral data inside and outside the scanner were very similar, so scanning-induced stimulation may have played a minor role in our experiment).

Table 4
PPI results seeded in A1 with d′ as covariate. MNI-coordinates (x, y, z in mm), t- and p-values for the local maxima plus extent of clusters for PPI analysis 1 seeded in primary auditory cortex (A1), with d′ as covariate, within auditory (SL > TL), tactile (TL > SL) and multisensory (Multi) regions identified by the independent functional localizer (p < .05 uncorrected; k ≥ 30, unless otherwise mentioned).

SL > TL
Right cerebellum  20 −54 −18  k = 88  t = 3.69  p = .001
Right lingual gyrus  26 −54 −6  t = 3.08  p = .003
Left middle temporal gyrus  −52 −10 −10  k = 50  t = 3.89  p = .001
Left postcentral gyrus (BA 3b 50%) — S1  −40 −24 54  k = 50  t = 3.85  p = .001
Left superior occipital gyrus  −24 −88 30  k = 80  t = 2.89  p = .005
Left cerebellum  −24 −48 −24  k = 105  t = 3.57  p = .001

TL > SL
Right middle temporal gyrus  50 −50 4  k = 36  t = 2.93  p = .004
Right supramarginal gyrus  58 −32 34  k = 53  t = 2.31  p = .016
Right parietal operculum — S2  52 6 6  k = 515  t = 3.41  p = .002
Right precentral gyrus  58 2 18  t = 3.35  p = .002
Right postcentral gyrus — S1  64 −10 20  t = 2.91  p = .005
Right postcentral gyrus — S1  60 −14 46  k = 39  t = 2.42  p = .013
Right precentral gyrus (BA 6 80%)  56 −8 50  t = 2.12  p = .024
Right insula  38 12 −6  k = 33  t = 2.80  p = .006
Right putamen  34 −16 0  t = 2.20  p = .021
Right thalamus (VLN)#  12 −10 4  k = 10  t = 2.46  p = .012
Left postcentral gyrus — S1  −50 −20 28  k = 199  t = 3.88  p = .001
Left postcentral gyrus (BA 1 40%) — S1  −60 −14 34  t = 2.28  p = .011
Left supramarginal gyrus  −52 −28 28  t = 1.94  p = .026
Left parietal operculum — S2  −56 10 4  k = 57  t = 3.36  p = .002
Left insula (OP3 50%)  −38 −14 12  k = 38  t = 3.26  p = .002
Left cerebellum  −14 −66 −24  k = 33  t = 2.97  p = .004

Multi
Right middle temporal gyrus/sulcus  48 −50 6  k = 33  t = 3.14  p = .003
Right superior temporal gyrus/sulcus#  44 −12 −4  k = 17  t = 2.53  p = .010
Left insula (Id1 60%)  −38 −14 −10  k = 55  t = 2.75  p = .007

% = percentage of activation allocated to area X (Eickhoff et al., 2007).
# k ≥ 10.

Table 5
PPI results seeded in PT with d′ as covariate. MNI-coordinates (x, y, z in mm), t- and p-values for the local maxima plus extent of clusters for PPI analysis 2 seeded in planum temporale (PT), with d′ as covariate, within auditory (SL > TL), tactile (TL > SL) and multisensory (Multi) regions identified by the independent functional localizer (p < .05 uncorrected; k ≥ 50, unless otherwise mentioned).

SL > TL
Right superior temporal gyrus  56 −18 −6  k = 62  t = 3.74  p = .001
Right superior temporal gyrus (TE 1.1 60%)  48 −24 6  t = 2.38  p = .014
Right middle occipital gyrus  36 −76 22  k = 262  t = 3.57  p = .001
Right superior occipital gyrus  26 −76 22  t = 3.26  p = .002
Right precuneus  14 −44 48  k = 51  t = 2.28  p = .017
Right fusiform gyrus  28 −64 −12  k = 223  t = 3.98  p = .001
Right lingual gyrus  26 −50 −4  t = 3.32  p = .002
Right hippocampus  26 −40 −2  t = 2.44  p = .013
Right cuneus  12 −86 32  k = 93  t = 3.23  p = .002
Left insula  −28 −30 14  k = 83  t = 3.32  p = .002
Left parietal operculum — S2  −38 −34 18  t = 2.66  p = .008

TL > SL
Right middle temporal gyrus  58 −54 4  k = 79  t = 3.17  p = .003
Right supramarginal gyrus  64 −32 36  k = 167  t = 3.27  p = .002
Right inferior parietal gyrus  62 −36 46  t = 2.17  p = .022
Right postcentral gyrus — S1  66 −10 18  k = 239  t = 3.91  p = .001
Right precentral gyrus  44 4 46  k = 61  t = 3.26  p = .002
Right insula  36 10 4  k = 283  t = 4.80  p = .001
Right parietal operculum — S2  56 6 10  t = 4.45  p = .001
Right thalamus#  12 −18 10  k = 14  t = 2.45  p = .012
Left inferior frontal gyrus (BA 44 50%)  −58 10 10  k = 166  t = 5.61  p = .001
Left temporal pole  −52 8 0  t = 2.70  p = .007
Left precentral gyrus (BA 6 40%)  −56 2 26  t = 2.45  p = .012

Multi
Right superior temporal gyrus/sulcus#  62 −36 10  k = 33  t = 3.12  p = .003
Right middle temporal gyrus/sulcus#  58 −56 4  k = 34  t = 2.49  p = .011
Right postcentral gyrus (OP4 60%)  66 −14 14  k = 59  t = 2.88  p = .005
Right supramarginal gyrus  60 −26 18  t = 2.85  p = .005

% = percentage of activation allocated to area X (Eickhoff et al., 2007).
# k ≥ 10.

Together, our voxel-based and ROI-analyses of regional fMRI-signals indicate that a functional network of right-sided auditory and tactile regions is involved in audiotactile integration, with differential response profiles for synchronous and asynchronous bimodal stimulation in auditory cortex.

Effective connectivity between brain areas

Based on the results of the local fMRI modulations, we directly tested for effective connections of A1 and PT (in the right hemisphere) with other regions in the context of synchronous and asynchronous bimodal stimulation. A1 and PT both showed enhanced functional coupling with the postcentral gyrus (primary tactile regions), the insula and parietal operculum (secondary tactile regions), the superior temporal gyrus and the supramarginal gyrus, and the ventrolateral nucleus (VLN) of the thalamus (all located in the right hemisphere). Further studies are needed to reveal the intrinsic connectivity of auditory and tactile regions during rest; acquiring such resting-state data would have increased the scanning time in our experiment beyond reasonable limits. Importantly, the enhanced connectivity in our study scaled with subjects' behavioral sensitivity. Thus, sensory-specific auditory regions form a network with subcortical, sensory-specific tactile and hetero-modal areas, and the strength of these functional connections reflects subjects' perceptual sensitivity.
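The effective-connectivity analyses referenced above are based on psychophysiological interactions (Friston et al., 1997). As a minimal illustration only (not the authors' actual SPM pipeline; HRF deconvolution and nuisance regressors are omitted), the interaction regressor is the element-wise product of the mean-centered seed time course and the centered psychological (condition) regressor:

```python
import numpy as np

def ppi_regressor(seed_ts, condition_onoff):
    """Schematic PPI interaction term: product of the mean-centered
    seed time course and the centered task regressor.
    (Real analyses deconvolve the HRF first; omitted here.)"""
    seed = seed_ts - seed_ts.mean()
    psych = condition_onoff - condition_onoff.mean()
    return seed * psych

# Toy example: 10 scans of seed signal and a boxcar for bimodal trials
rng = np.random.default_rng(0)
seed = rng.standard_normal(10)
boxcar = np.array([0, 0, 1, 1, 0, 0, 1, 1, 0, 0], dtype=float)
ppi = ppi_regressor(seed, boxcar)  # entered into the GLM alongside seed and boxcar
```

A significant weight on this interaction term in a target region indicates that the seed-target coupling differs between the two psychological contexts (here, synchronous vs. asynchronous bimodal stimulation).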

A potential mechanism underlying our effects has recently been reported in macaque studies: multisensory interplay in primary areas is (predominantly) caused by modulatory influences in non-granular cortical layers (Cappe et al., 2009; Lakatos et al., 2007). The (non-specific) thalamic system (Jones, 1998) might also play a key role, considering the timing and laminar profile of multisensory interplay in A1 (Budinger et al., 2006; Lakatos et al., 2007; Schroeder and Foxe, 2002). In accord, Beauchamp and Ro reported a rare case of auditory–tactile synesthesia caused by an infarct restricted to the right VLN (Beauchamp and Ro, 2008; Ro et al., 2007). They were able to show that the output of the VLN is not restricted to motor areas but also includes efferent projections to somatosensory areas. Based on their findings they suggested that the reduced somatosensory thalamic input caused by the lesion in the VLN might have led to short-term unmasking of latent cross-modal connections between auditory and somatosensory areas. In a further experiment similar to ours they showed that task-irrelevant auditory stimuli increase the sensitivity to near-threshold tactile stimuli (Ro et al., 2009). Our findings accord well with these observations and further suggest that the closer the right-hemispheric thalamus, somatosensory,

381 M. Hoefer et al. / NeuroImage 79 (2013) 371–382

multisensory and auditory cortex work together (as indexed by the strength of effective connectivity), the higher the gain in auditory accuracy will be.
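The behavioral gain quantified here is the standard signal detection theory sensitivity index d′ (Green and Swets, 1966; Stanislaw and Todorov, 1999): d′ = z(hit rate) − z(false-alarm rate). A minimal sketch (not the authors' analysis code; the rates are illustrative):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    Rates must lie strictly in (0, 1); extreme rates are typically
    corrected (e.g. 1/(2N) adjustment) before applying the inverse CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# e.g. a subject with 80% hits and 20% false alarms
print(round(d_prime(0.80, 0.20), 2))  # -> 1.68
```

Correlating per-subject d′ with per-subject connectivity estimates then yields the brain-behavior relationship described above.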

In conclusion, we found that audiotactile interplay facilitates auditory perceptual sensitivity by enhancing neural processing in low-level auditory, somatosensory and multisensory cortex, plus the thalamus. Moreover, audiotactile interplay is influenced by hemispheric asymmetries and could only be observed for left-sided audiotactile stimulation in right-hemispheric regions. Effective connectivity of right-hemispheric auditory cortex with somatosensory cortex, multisensory cortex and the thalamus scaled with d′, providing further evidence that multisensory interplay helps the auditory system to enhance stimulus representations and makes it easier to detect near-threshold stimuli.

Acknowledgments

This work was supported by the DFG-SFB-TR31/TP A8 & TP A4. We wish to thank two anonymous reviewers for valuable comments and suggestions.

Conflict of interest

None of the authors has a conflict of interest with regard to the manuscript.

Appendix A. Supplementary data

Supplementary data to this article can be found online at http://dx.doi.org/10.1016/j.neuroimage.2013.04.119.

References

Beauchamp, M.S., 2005. Statistical criteria in FMRI studies of multisensory integration. Neuroinformatics 3, 93–113.

Beauchamp, M.S., Ro, T., 2008. Neural substrates of sound–touch synesthesia after a thalamic lesion. J. Neurosci. 28, 13696–13702.

Beauchamp, M.S., Argall, B.D., Bodurka, J., Duyn, J.H., Martin, A., 2004. Unraveling multisensory integration: patchy organization within human STS multisensory cortex. Nat. Neurosci. 7, 1190–1192.

Beauchamp, M.S., Yasar, N.E., Frye, R.E., Ro, T., 2008. Touch, sound and vision in human superior temporal sulcus. NeuroImage 41, 1011–1020.

Bergenheim, M., Johansson, H., Granlund, B., Pedersen, J., 1996. Experimental evidence for a synchronization of sensory information to conscious experience. In: Hameroff, S., Kaszniak, A., Scott, A. (Eds.), Toward a Science of Consciousness: The First Tucson Discussions and Debates. MIT Press, Cambridge, Mass., pp. 303–310.

Bonath, B., Noesselt, T., Martinez, A., Mishra, J., Schwiecker, K., Heinze, H.-J., Hillyard, S.A., 2007. Neural basis of the ventriloquist illusion. Curr. Biol. 17, 1697–1703.

Bonath, B., Tyll, S., Budinger, E., Krauel, K., Hopf, J.-M., Noesselt, T., 2013. Task-demands and audio–visual stimulus configurations modulate neural activity in the human thalamus. NeuroImage 66, 110–118.

Bruns, P., Röder, B., 2010. Tactile capture of auditory localization: an event-related potential study. Eur. J. Neurosci. 31, 1844–1857.

Budinger, E., Heil, P., Hess, A., Scheich, H., 2006. Multisensory processing via early cortical stages: connections of the primary auditory cortical field with other sensory systems. Neuroscience 143, 1065–1083.

Bushara, K.O., Grafman, J., Hallett, M., 2001. Neural correlates of auditory–visual stimulus onset asynchrony detection. J. Neurosci. 21, 300–304.

Caetano, G., Jousmäki, V., 2006. Evidence of vibrotactile input to human auditory cortex. NeuroImage 29, 15–28.

Calvert, G.A., Thesen, T., 2004. Multisensory integration: methodological approaches and emerging principles in the human brain. J. Physiol. Paris 98, 191–205.

Cappe, C., Barone, P., 2005. Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. Eur. J. Neurosci. 22, 2886–2902.

Cappe, C., Morel, A., Barone, P., Rouiller, E.M., 2009a. The thalamocortical projection systems in primate: an anatomical support for multisensory and sensorimotor interplay. Cereb. Cortex 19, 2025–2037.

Cappe, C., Rouiller, E.M., Barone, P., 2009b. Multisensory anatomical pathways. Hear. Res. 258, 28–36.

Dale, A.M., 1999. Optimal experimental design for event-related fMRI. Hum. Brain Mapp. 8, 109–114.

De la Mothe, L.A., Blumell, S., Kajikawa, Y., Hackett, T.A., 2006. Thalamic connections of the auditory cortex in marmoset monkeys: core and medial belt regions. J. Comp. Neurol. 496, 72–96.

Diederich, A., Colonius, H., 2004. Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time. Percept. Psychophys. 66, 1388–1404.

Downar, J., Crawley, A.P., Mikulis, D.J., Davis, K.D., 2000. A multimodal cortical network for the detection of changes in the sensory environment. Nat. Neurosci. 3, 277–283.

Driver, J., Noesselt, T., 2008. Multisensory interplay reveals crossmodal influences on “sensory-specific” brain regions, neural responses, and judgments. Neuron 57, 11–23.

Eickhoff, S.B., Paus, T., Caspers, S., Grosbras, M.-H., Evans, A.C., Zilles, K., Amunts, K., 2007. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited. NeuroImage 36, 511–521.

Foxe, J.J., Morocz, I.A., Murray, M.M., Higgins, B.A., Javitt, D.C., Schroeder, C.E., 2000. Multisensory auditory–somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Cogn. Brain Res. 10, 77–83.

Foxe, J.J., Wylie, G.R., Martinez, A., Schroeder, C.E., Javitt, D.C., Guilfoyle, D., Ritter, W., Murray, M.M., 2002. Auditory–somatosensory multisensory processing in auditory association cortex: an fMRI study. J. Neurophysiol. 88, 540–543.

Friston, K.J., Buechel, C., Fink, G.R., Morris, J., Rolls, E., Dolan, R.J., 1997. Psychophysiological and modulatory interactions in neuroimaging. NeuroImage 6, 218–229.

Friston, K.J., Zarahn, E., Josephs, O., Henson, R.N., Dale, A.M., 1999. Stochastic designs in event-related fMRI. NeuroImage 10, 607–619.

Fu, K.-M.G., Johnston, T.A., Shah, A.S., Arnold, L., Smiley, J., Hackett, T.A., Garraghty, P.E., Schroeder, C.E., 2003. Auditory cortical neurons respond to somatosensory stimulation. J. Neurosci. 23, 7510–7515.

Ghazanfar, A.A., Schroeder, C.E., 2006. Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285.

Giard, M.H., Peronnet, F., 1999. Auditory–visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J. Cogn. Neurosci. 11, 473–490.

Gillmeister, H., Eimer, M., 2007. Tactile enhancement of auditory detection and perceived loudness. Brain Res. 1160, 58–68.

Gobbelé, R., Schürmann, M., Forss, N., Juottonen, K., Buchner, H., Hari, R., 2003. Activation of the human posterior parietal and temporoparietal cortices during audiotactile interaction. NeuroImage 20, 503–511.

Green, D.M., Swets, J.A., 1966. Signal Detection Theory and Psychophysics. Wiley, New York.

Hackett, T.A., Smiley, J.F., Ulbert, I., Karmos, G., Lakatos, P., De la Mothe, L.A., Schroeder, C.E., 2007. Sources of somatosensory input to the caudal belt areas of auditory cortex. Perception 36, 1419–1430.

Hausmann, M., Corballis, M.C., Fabri, M., Paggi, A., Lewald, J., 2005. Sound lateralization in subjects with callosotomy, callosal agenesis, or hemispherectomy. Brain Res. 25, 537–546.

Hinrichs, H., Scholz, M., Tempelmann, C., Woldorff, M.G., Dale, A.M., Heinze, H.J., 2000. Deconvolution of event-related fMRI responses in fast-rate experimental designs: tracking amplitude variations. J. Cogn. Neurosci. 12 (Suppl. 2), 76–89.

Iannetti, G., Porro, C., Pantano, P., Romanelli, P., Galeotti, F., Cruccu, G., 2003. Representation of different trigeminal divisions within the primary and secondary human somatosensory cortex. NeuroImage 19, 906–912.

Jäncke, L., Wüstenberg, T., Schulze, K., Heinze, H.J., 2002. Asymmetric hemodynamic responses of the human auditory cortex to monaural and binaural stimulation. Hear. Res. 170, 166–178.

Jones, E.G., 1998. Viewpoint: the core and matrix of thalamic organization. Neuroscience 85, 331–345.

Kayser, C., Logothetis, N.K., 2007. Do early sensory cortices integrate cross-modal information? Brain Struct. Funct. 212, 121–132.

Kayser, C., Petkov, C.I., Augath, M., Logothetis, N.K., 2005. Integration of touch and sound in auditory cortex. Neuron 48, 373–384.

Kim, S., James, T.W., 2010. Enhanced effectiveness in visuo-haptic object-selective brain regions with increasing stimulus salience. Hum. Brain Mapp. 31, 678–693.

Kriegeskorte, N., Simmons, W.K., Bellgowan, P.S.F., Baker, C.I., 2009. Circular analysis in systems neuroscience: the dangers of double dipping. Nat. Neurosci. 12, 535–540.

Krumbholz, K., Hewson-Stoate, N., Schönwiesner, M., 2007. Cortical response to auditory motion suggests an asymmetry in the reliance on inter-hemispheric connections between the left and right auditory cortices. J. Neurophysiol. 97, 1649–1655.

Lakatos, P., Chen, C.-M., O'Connell, M.N., Mills, A., Schroeder, C.E., 2007. Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron 53, 279–292.

Lancaster, J.L., Rainey, L.H., Summerlin, J.L., Freitas, C.S., Fox, P.T., Evans, A.C., Toga, A.W., Mazziotta, J.C., 1997. Automated labeling of the human brain: a preliminary report on the development and evaluation of a forward-transform method. Hum. Brain Mapp. 5, 238–242.

Lancaster, J.L., Woldorff, M.G., Parsons, L.M., Liotti, M., Freitas, C.S., Rainey, L., Kochunov, P.V., Nickerson, D., Mikiten, S.A., Fox, P.T., 2000. Automated Talairach atlas labels for functional brain mapping. Hum. Brain Mapp. 10, 120–131.

Leo, F., Bolognini, N., Passamonti, C., Stein, B.E., Làdavas, E., 2008. Cross-modal localization in hemianopia: new insights on multisensory integration. Brain 131, 855–865.

Levänen, S., Ahonen, A., Hari, R., McEvoy, L., Sams, M., 1996. Deviant auditory stimuli activate human left and right auditory cortex differently. Cereb. Cortex 6, 288–296.

Lewis, R., Noppeney, U., 2010. Audiovisual synchrony improves motion discrimination via enhanced connectivity between early visual and auditory areas. J. Neurosci. 30, 12329–12339.

Lewis, J.W., Van Essen, D.C., 2000. Corticocortical connections of visual, sensorimotor, and multimodal processing areas in the parietal lobe of the macaque monkey. J. Comp. Neurol. 428, 112–137.

Macaluso, E., Frith, C.D., Driver, J., 2000. Modulation of human visual cortex by crossmodal spatial attention. Science 289, 1206–1208.

Maravita, A., Bolognini, N., Bricolo, E., Marzi, C.A., Savazzi, S., 2008. Is audiovisual integration subserved by the superior colliculus in humans? Neuroreport 19, 271–275.

Menning, H., Ackermann, H., Hertrich, I., Mathiak, K., 2005. Spatial auditory attention is modulated by tactile priming. Exp. Brain Res. 164, 41–47.


Meredith, M.A., Stein, B.E., 1986. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J. Neurophysiol. 56, 640–662.

Molholm, S., Ritter, W., Murray, M.M., Javitt, D.C., Schroeder, C.E., Foxe, J.J., 2002. Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Cogn. Brain Res. 14, 115–128.

Murray, M.M., Molholm, S., Michel, C.M., Heslenfeld, D.J., Ritter, W., Javitt, D.C., Schroeder, C.E., Foxe, J.J., 2005. Grabbing your ear: rapid auditory–somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb. Cortex 15, 963–974.

Musacchia, G., Schroeder, C.E., 2009. Neuronal mechanisms, response dynamics and perceptual functions of multisensory interactions in auditory cortex. Hear. Res. 258, 72–79.

Musacchia, G., Sams, M., Nicol, T., Kraus, N., 2006. Seeing speech affects acoustic information processing in the human brainstem. Exp. Brain Res. 168, 1–10.

Nguyen, B.T., Tran, T.D., Hoshiyama, M., Inui, K., Kakigi, R., 2004. Face representation in the human primary somatosensory cortex. Neurosci. Res. 50, 227–232.

Noesselt, T., Rieger, J.W., Schoenfeld, M.A., Kanowski, M., Hinrichs, H., Heinze, H.-J., Driver, J., 2007. Audiovisual temporal correspondence modulates human multisensory superior temporal sulcus plus primary sensory cortices. J. Neurosci. 27, 11431–11441.

Noesselt, T., Tyll, S., Boehler, C.N., Budinger, E., Heinze, H.-J., Driver, J., 2010. Sound-induced enhancement of low-intensity vision: multisensory influences on human sensory-specific cortices and thalamic bodies relate to perceptual enhancement of visual detection sensitivity. J. Neurosci. 30, 13609–13623.

Odgaard, E.C., Arieh, Y., Marks, L.E., 2003. Cross-modal enhancement of perceived brightness: sensory interaction versus response bias. Percept. Psychophys. 65, 123–132.

Paraskevopoulos, E., Kuchenbuch, A., Herholz, S.C., Pantev, C., 2012a. Evidence for training-induced plasticity in multisensory brain structures: an MEG study. PLoS One 7, e36534.

Paraskevopoulos, E., Kuchenbuch, A., Herholz, S.C., Pantev, C., 2012b. Musical expertise induces audiovisual integration of abstract congruency rules. J. Neurosci. 32, 18196–18203.

Penfield, W., Boldrey, E., 1937. Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 60, 389–443.

Rach, S., Diederich, A., Colonius, H., 2011. On quantifying multisensory interaction effects in reaction time and detection rate. Psychol. Res. 75, 77–94.

Ramos-Estebanez, C., Merabet, L.B., Machii, K., Fregni, F., Thut, G., Wagner, T.A., Romei, V., Amedi, A., Pascual-Leone, A., 2007. Visual phosphene perception modulated by subthreshold crossmodal sensory stimulation. J. Neurosci. 27, 4178–4181.

Recanzone, G.H., Guard, D.C., Phan, M.L., 2000. Frequency and intensity response properties of single neurons in the auditory cortex of the behaving macaque monkey. J. Neurophysiol. 83, 2315–2331.

Ro, T., Farnè, A., Johnson, R.M., Wedeen, V., Chu, Z., Wang, Z.J., Hunter, J.V., Beauchamp, M.S., 2007. Feeling sounds after a thalamic lesion. Ann. Neurol. 62, 433–441.

Ro, T., Hsu, J., Yasar, N.E., Elmore, L.C., Beauchamp, M.S., 2009. Sound enhances touch perception. Exp. Brain Res. 195, 135–143.

Sadaghiani, S., Maier, J.X., Noppeney, U., 2009. Natural, metaphoric, and linguistic auditory direction signals have distinct influences on visual motion processing. J. Neurosci. 29, 6490–6499.

Scheffler, K., Bilecen, D., Schmid, N., Tschopp, K., Seelig, J., 1998. Auditory cortical responses in hearing subjects and unilateral deaf patients as detected by functional magnetic resonance imaging. Cereb. Cortex 8, 156–163.

Schroeder, C.E., Foxe, J.J., 2002. The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Cogn. Brain Res. 14, 187–198.

Schroeder, C.E., Lindsley, R.W., Specht, C., Marcovici, A., Smiley, J.F., Javitt, D.C., 2001. Somatosensory input to auditory association cortex in the macaque monkey. J. Neurophysiol. 85, 1322–1327.

Schürmann, M., Caetano, G., Hlushchuk, Y., Jousmäki, V., Hari, R., 2006. Touch activates human auditory cortex. NeuroImage 30, 1325–1331.

Shah, N.J., Jäncke, L., Grosse-Ruyken, M.L., Müller-Gärtner, H.W., 1999. Influence of acoustic masking noise in fMRI of the auditory cortex during phonetic discrimination. J. Magn. Reson. Imaging 9, 19–25.

Shams, L., Kamitani, Y., Shimojo, S., 2002. Visual illusion induced by sound. Cogn. Brain Res. 14, 147–152.

Smiley, J.F., Falchier, A., 2009. Multisensory connections of monkey auditory cerebral cortex. Hear. Res. 258, 37–46.

Smiley, J.F., Hackett, T.A., Ulbert, I., Karmas, G., Lakatos, P., Javitt, D.C., Schroeder, C.E., 2007. Multisensory convergence in auditory cortex, I. Cortical connections of the caudal superior temporal plane in macaque monkeys. J. Comp. Neurol. 502, 894–923.

Stanislaw, H., Todorov, N., 1999. Calculation of signal detection theory measures. Behav. Res. Methods Instrum. Comput. 31, 137–149.

Stein, B.E., Stanford, T.R., 2008. Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266.

Stevenson, R.A., Altieri, N.A., Kim, S., Pisoni, D.B., James, T.W., 2010. Neural processing of asynchronous audiovisual speech perception. NeuroImage 49, 3308–3318.

Stracke, H., Okamoto, H., Pantev, C., 2009. Interhemispheric support during demanding auditory signal-in-noise processing. Cereb. Cortex 19, 1440–1447.

Tyll, S., Bonath, B., Schoenfeld, M.A., Heinze, H.-J., Ohl, F.W., Noesselt, T., 2013. Neural basis of multisensory looming signals. NeuroImage 65, 13–22.

Van der Burg, E., Awh, E., Olivers, C.N.L., 2013. The capacity of audiovisual integration is limited to one item. Psychol. Sci. 24, 345–351.

Wallace, M.T., Meredith, M.A., Stein, B.E., 1993. Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J. Neurophysiol. 69, 1797–1809.

Werner, S., Noppeney, U., 2010a. Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. J. Neurosci. 30, 2662–2675.

Werner, S., Noppeney, U., 2010b. Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cereb. Cortex 20, 1829–1842.