
RESEARCH ARTICLE

Daniel Sanabria · Salvador Soto-Faraco · Charles Spence

Spatiotemporal interactions between audition and touch depend on hand posture

Received: 10 June 2004 / Accepted: 26 February 2005 / Published online: 8 June 2005
© Springer-Verlag 2005

Abstract We report two experiments designed to assess the consequences of posture change on audiotactile spatiotemporal interactions. In Experiment 1, participants had to discriminate the direction of an auditory stream (consisting of the sequential presentation of two tones from different spatial positions) while attempting to ignore a task-irrelevant tactile stream (consisting of the sequential presentation of two vibrations, one to each of the participant's hands). The tactile stream presented to the participants' hands was either spatiotemporally congruent or incongruent with respect to the sounds. A significant decrease in performance in incongruent trials compared with congruent trials was demonstrated when the participants adopted an uncrossed-hands posture but not when their hands were crossed over the midline. In Experiment 2, we investigated the ability of participants to discriminate the direction of two sequentially presented tactile stimuli (one presented to each hand) as a function of the presence of congruent vs incongruent auditory distractors. Here, the crossmodal effect was stronger in the crossed-hands posture than in the uncrossed-hands posture. These results demonstrate the reciprocal nature of audiotactile interactions in spatiotemporal processing, and highlight the important role played by body posture in modulating such crossmodal interactions.

Keywords Crossmodal interactions · Spatiotemporal processing · Audition · Touch · Posture change

Introduction

In recent years there has been rapid growth of research demonstrating audiotactile interactions in the spatial and temporal domains (e.g. Adelstein et al. 2003; Caclin et al. 2002; Foxe et al. 2000, 2002; Fu et al. 2003; Lloyd et al. 2003; Murray et al. 2005; Schroeder and Foxe 2002; Schroeder et al. 2001, 2003; Spence et al. 1998; Zampini et al. 2005). However, although it is now widely agreed that body posture plays a crucial role in the processing of tactile information, little is known about the effects that changes in body posture may have on audiotactile interactions.

In this study we addressed the question of how changes in body posture modulate audiotactile spatiotemporal processing, with participants adopting either a crossed or uncrossed-hands posture. Experiment 1 was designed to study the effect of tactile distractors on auditory perception, whereas in Experiment 2 touch acted as the target modality and audition as the distractor modality. Most audiotactile studies have reported that tactile information tends to affect auditory perceptual judgments more than auditory stimuli affect tactile judgments (e.g. Sherrick 1976; Soto-Faraco et al. 2004b; though see Hötting and Röder 2004). We therefore predicted a similar pattern of results in this study, that is, a greater effect of tactile distractors on the perception of auditory target stimuli than vice versa. However, we thought it possible that body posture might also play an important role in any audiotactile spatiotemporal interactions reported.

By looking at the effect of changes in body posture on audiotactile interactions, we were able to investigate the spatial frame of reference involved in crossmodal interactions between auditory and tactile information processing. The spatial location of a tactile stimulus (and consequently also the direction of tactile motion) can be encoded according to a number of different frames of reference (e.g. somatotopic, body-centred, and/or allocentric; see Burton and Sinclair 1996; Graziano and Gandhi 2000; Graziano et al. 2004; Penfield and Rasmussen 1950).

D. Sanabria (&) · S. Soto-Faraco · C. Spence
Department of Experimental Psychology, University of Oxford,
South Parks Road, Oxford, OX1 3UD, UK
E-mail: [email protected]
Tel.: +44-1865-271307
Fax: +44-1865-310447

S. Soto-Faraco
Cognitive Neuroscience Group-Parc Científic,
Universitat de Barcelona, Spain

Exp Brain Res (2005) 165: 505–514
DOI 10.1007/s00221-005-2327-5

However, although the frame of reference involved in the representation of static tactile stimuli has been widely investigated in recent years (e.g. Eimer et al. 2001; Röder et al. 2000), far less research has been conducted regarding the perception of tactile stimuli that vary in both space and time (e.g. Soto-Faraco et al. 2004b). Therefore, the question of whether the representation of tactile spatiotemporal information is coded somatotopically, or in a more externally based frame of reference, remains an open and interesting issue that has yet to be explored fully. One way to address this issue behaviourally is by measuring the consequences of manipulating body posture (hands uncrossed vs crossed), as highlighted in this study (see also Soto-Faraco et al. 2004b).

In Experiment 1, we used a variant of the crossmodal dynamic capture task (e.g. Soto-Faraco et al. 2002), in which participants had to discriminate the direction of two auditory stimuli presented in temporal sequence from different spatial locations while attempting to ignore distracting tactile stimuli that could be either spatiotemporally congruent or incongruent with the auditory target stimuli. The participants performed this task with their hands placed in either an uncrossed or crossed-hands posture. When the participants' hands were uncrossed we predicted that there would be a significant crossmodal interaction, with participants responding more accurately in congruent trials (where the target and distractor were spatiotemporally coincident) than in incongruent trials (where the target and distractor were spatiotemporally incongruent). When the participants' hands were crossed, however, we predicted a significant reduction of any crossmodal interaction. Numerous recent studies have shown that crossing the hands can result in an impairment of tactile temporal processing (e.g. Röder et al. 2004; Shore et al. 2002; Yamamoto and Kitazawa 2001; Wada et al. 2004). For instance, Shore et al. demonstrated that when crossing their hands participants needed a much larger interval between the onset of two sequentially presented vibrotactile stimuli (one delivered to each of the participants' hands) to reach a criterion of 75% correct judgments regarding the temporal order in which the tactile stimuli were presented, than when their hands were uncrossed.

Therefore, if crossing the hands gives rise to an impairment in the spatiotemporal processing of tactile information, one might also expect a significant reduction of the effect of touch on the spatiotemporal perception of auditory information in the crossed-hands condition (see Soto-Faraco et al. 2004b on this point).

Experiment 1

Methods

Participants

Eighteen students (age range 18–27 years, mean 21 years) at the University of Oxford took part in Experiment 1. All of the participants reported normal tactile sensitivity at the fingertips, normal hearing, and normal or corrected-to-normal vision. Twelve of the participants were undergraduates at the University of Oxford and received course credit in return for their participation. The other six participants received a £5 (UK Sterling) gift voucher in return for their participation. The experiment took approximately 30 min to complete and was performed in accordance with the ethical standards laid down in the 1964 Declaration of Helsinki. All of the participants gave their informed consent prior to their inclusion in the study.

Apparatus and materials

Two loudspeaker cones (Creative SBS15; Singapore) positioned on the table top in front of the participant were used to present the auditory stimuli (Fig. 1). The loudspeaker cones were placed approximately 50 cm from the participant's body, 15 cm to either side of their midline. Two wooden boxes were used to mount two vibrotactile stimulators (bone conduction vibrators, Oticon-A, 100 Ohm; Hamilton, Scotland), one placed in front of each loudspeaker cone, to ensure that the sounds and vibrations came from the same location. Participants used footpedals located under the table to respond (one beneath the toes of the right foot and the other beneath the toes of the left foot). The loudspeaker cones, vibrotactile stimulators, and footpedals were all controlled via a computer parallel port using the Expe6 programming language (Pallier et al. 1997), and a custom-built relay box. The experiment was conducted in a dimly illuminated room.

Fig. 1 Schematic illustration of the set-up used in Experiments 1 and 2. The footpedals used for responding (placed on the floor, beneath the table) are not shown

The auditory stimuli consisted of two 50-ms pure tones (5-ms amplitude envelope, 60-dB(A) sound pressure level as measured from the participant's head position), one presented from each loudspeaker cone, separated by an inter-stimulus-interval (ISI) of 100 ms (this ISI remained constant across all conditions). To discourage the participants from focusing on any potential subtle acoustic differences between the loudspeaker cones that might have helped them to perform the task, the frequency of the two pure tones comprising the auditory sequence in each trial was varied between three possible values (450, 500, and 550 Hz; cf. Soto-Faraco et al. 2002). The tactile displays consisted of two 50-ms suprathreshold, 200-Hz vibrations, one presented from each vibrator separated by an ISI of 100 ms.
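The stimuli were generated and triggered with the Expe6 software and dedicated hardware described above; purely for illustration, the following Python sketch shows how one trial's two-tone auditory stream could be synthesised from the parameters just listed. The sampling rate, the linear ramp shape of the 5-ms envelope, and the function names are assumptions, not part of the original set-up.

```python
import numpy as np

SAMPLE_RATE = 44100          # Hz; assumed value, not specified in the paper
TONE_DUR = 0.050             # 50-ms pure tones
RAMP_DUR = 0.005             # 5-ms amplitude envelope (linear ramps assumed)
ISI = 0.100                  # 100-ms inter-stimulus interval
FREQS = (450, 500, 550)      # possible tone frequencies (Hz)

def pure_tone(freq_hz, dur_s, ramp_s, sr=SAMPLE_RATE):
    """Generate a ramped sine wave of the given frequency and duration."""
    t = np.arange(int(dur_s * sr)) / sr
    tone = np.sin(2 * np.pi * freq_hz * t)
    envelope = np.ones_like(tone)
    n_ramp = int(ramp_s * sr)
    envelope[:n_ramp] = np.linspace(0, 1, n_ramp)    # onset ramp
    envelope[-n_ramp:] = np.linspace(1, 0, n_ramp)   # offset ramp
    return tone * envelope

def auditory_stream(direction, rng=np.random.default_rng()):
    """Build a two-channel (left/right) array holding one trial's two-tone stream.

    direction: 'left_to_right' or 'right_to_left'.
    Each tone's frequency is drawn independently from FREQS.
    """
    sr = SAMPLE_RATE
    total = int((2 * TONE_DUR + ISI) * sr)
    signal = np.zeros((total, 2))                    # columns: left, right channel
    first, second = (0, 1) if direction == 'left_to_right' else (1, 0)
    t1 = int((TONE_DUR + ISI) * sr)                  # second tone starts after tone + ISI
    signal[:int(TONE_DUR * sr), first] = pure_tone(rng.choice(FREQS), TONE_DUR, RAMP_DUR)
    signal[t1:t1 + int(TONE_DUR * sr), second] = pure_tone(rng.choice(FREQS), TONE_DUR, RAMP_DUR)
    return signal
```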

Procedure

The participants sat in front of the loudspeakers with their hands positioned by the wooden boxes on which the vibrotactile stimulators were mounted. In the uncrossed-hands condition the participants rested their left index fingertip on the left vibrotactile stimulator and their right index fingertip on the right vibrotactile stimulator. In the crossed-hands condition the participants placed their left index finger on the right stimulator and their right index finger on the left stimulator. They were instructed to rest their feet on the footpedals and to look straight ahead throughout each block of experimental trials. White noise was presented at 50 dB(A) from a loudspeaker positioned 60 cm behind the loudspeaker cones used to present the target stimuli, to mask any subtle auditory cues elicited by the activation of the vibrotactile stimulators. This loudspeaker was hidden from view behind a black curtain.

In a typical trial the participants were presented with the target auditory stream, to which they had to make an unspeeded footpedal discrimination response, and a distractor tactile stream, which they were instructed to try and ignore. The distractor stream could either be presented at the same time as the target auditory stream (synchronous) or 500 ms later (asynchronous) and in either the same (congruent) or opposite (incongruent) direction (from right-to-left or left-to-right). The participants were instructed to respond to the direction of the auditory stream by releasing the corresponding footpedal (left for leftward targets, and right for rightward targets) and to ignore the tactile distractors as much as possible. The participants were asked to prioritize response accuracy over response speed. Responses were only collected after 750 ms from the beginning of the trial, to ensure that any lack of an effect of the tactile distractors on the perception of the auditory stream in the asynchronous condition was not simply caused by participants responding to the auditory target before the presentation of the tactile distractors. Note that while the participants might have been tempted to respond before the presentation of the distractor stimulus in the asynchronous condition, they were told they had to wait until the end of stimulus presentation (i.e. after 750 ms in the asynchronous condition) before making their response. This was stressed to the participants before the start of the experiment. Also, the computer program controlling stimulus presentation always waited for a response before proceeding to the next trial. After a participant's response was recorded there was a 2,000 ms interval before the start of the next trial.
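The factorial structure of a block described above can be made concrete with a short sketch. The experiment itself was run in Expe6; the Python below is only illustrative, and the assumption of 12 repetitions per cell (which would yield the 96 trials per posture block reported next) is ours, not stated explicitly in the text.

```python
import itertools
import random

# Factor levels taken from the Procedure; 12 repetitions per cell is an
# assumption that would yield the reported 96 trials per posture block.
SYNCHRONY = {'synchronous': 0, 'asynchronous': 500}   # distractor onset delay (ms)
CONGRUENCY = ('congruent', 'incongruent')
DIRECTIONS = ('left_to_right', 'right_to_left')
REPS_PER_CELL = 12

def build_block(posture):
    """Return a shuffled list of trial dictionaries for one posture block."""
    trials = []
    for sync, cong, target_dir in itertools.product(SYNCHRONY, CONGRUENCY, DIRECTIONS):
        distractor_dir = target_dir if cong == 'congruent' else (
            'right_to_left' if target_dir == 'left_to_right' else 'left_to_right')
        for _ in range(REPS_PER_CELL):
            trials.append({
                'posture': posture,
                'target_direction': target_dir,           # auditory stream in Experiment 1
                'distractor_direction': distractor_dir,   # tactile stream in Experiment 1
                'distractor_delay_ms': SYNCHRONY[sync],
                'congruency': cong,
                'response_window_opens_ms': 750,          # responses collected only after 750 ms
                'inter_trial_interval_ms': 2000,
            })
    random.shuffle(trials)
    return trials

block = build_block('uncrossed')
assert len(block) == 96
```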

The participants completed one block of 12 practice trials at the start of their experimental session in which the auditory streams were presented in the absence of any distractors, to familiarize them with the task at hand. The experimenter made sure the participants had understood the task, and the practice trials were repeated if participants made more than one error. The participants then completed one block of 96 trials in the crossed-hands posture and one block of 96 trials in the uncrossed-hands posture, with block order counterbalanced across participants.

Results

The accuracy data were submitted to a three-way repeated-measures analysis of variance (ANOVA) with the factors of Hand Posture (Uncrossed vs Crossed), Synchrony (Synchronous vs Asynchronous), and Congruency (Congruent vs Incongruent) (Fig. 2). The ANOVA revealed a significant main effect of Synchrony, F(1,17)=24.44, MSE=74.61, P<.001, with participants responding more accurately in the asynchronous condition than in the synchronous condition overall (M=92 vs 86%, respectively). Participants were also more accurate in the crossed-hands condition than in the uncrossed-hands condition (M=93 vs 84%, respectively), giving rise to a significant main effect of Hand Posture, F(1,17)=43.39, MSE=75.63, P<.001. The main effect of Congruency was also significant, F(1,17)=21.01, MSE=75.86, P<.001, with participants responding more accurately in congruent trials than in incongruent trials overall (M=92 vs 86%, respectively). The analysis revealed a larger effect of Hand Posture (measured as the difference in accuracy between the crossed and uncrossed-hands conditions) in the synchronous condition than in the asynchronous condition (M=14 vs 5%, respectively), as shown by the significant interaction between Synchrony and Hand Posture, F(1,17)=12.55, MSE=66.16, P<.001. The effect of Congruency was significant in the synchronous condition, t(17)=5.84, P<.001, but not in the asynchronous condition, |t|<1, giving rise to a significant interaction between Congruency and Synchrony, F(1,17)=28.82, MSE=42.67, P<.001. Finally, the interaction between Hand Posture and Congruency was also significant, F(1,17)=22.50, MSE=75.86, P<.001. This interaction reflects the fact that there was a significant congruency effect in the uncrossed-hands condition, t(17)=5.28, P<.001, but not in the crossed-hands condition, |t|<1.

Analysis of the results from Experiment 1 also revealed a three-way interaction between Hand Posture, Synchrony, and Congruency, F(1,17)=43.41, MSE=55.21, P<.001. We therefore decided to perform separate ANOVAs for each level of the Synchrony factor (cf. Soto-Faraco et al. 2004b; see Fig. 2). In the synchronous condition, there was a main effect of Hand Posture, F(1,17)=52.65, MSE=70.40, P<.001, with participants responding more accurately in the crossed-hands condition than in the uncrossed-hands condition overall (92 vs 78%, respectively), and of Congruency, F(1,17)=34.21, MSE=82.20, P<.001, with participants' discrimination of the direction of the auditory stream being more accurate in congruent trials than in incongruent trials overall (M=91 vs 80%, respectively). Finally, the interaction between Hand Posture and Congruency was significant, F(1,17)=41.23, MSE=98.83, P<.001. Participants responded significantly more accurately in congruent trials than in incongruent trials in the uncrossed-hands condition, t(17)=6.63, P<.001, but no such congruency effect emerged in the crossed-hands condition, t(17)=1.47, P=.15. In the asynchronous condition there was a significant main effect of Hand Posture, F(1,17)=5.67, MSE=71.38, P<.05, with better performance in the crossed-hands condition than in the uncrossed-hands condition (mean accuracy levels of 94 vs 90%, respectively). Neither the main effect of Congruency nor the interaction between Congruency and Hand Posture reached significance, both Fs<1. A similar ANOVA on the reaction time data revealed no significant terms.
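For readers who wish to reproduce this style of analysis on their own data, the sketch below shows one way the repeated-measures ANOVA and the follow-up paired t-tests could be run in Python; the same structure applies to the Experiment 2 accuracy data. The long-format data frame 'df' (one mean accuracy per participant and condition cell) is assumed, as the raw data are not included in the paper, and the original analyses were not necessarily conducted with these tools.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# 'df' is assumed to be a long-format table of per-participant mean accuracies,
# one row per participant x posture x synchrony x congruency cell, e.g.:
#   participant  posture    synchrony     congruency   accuracy
#   1            uncrossed  synchronous   congruent    0.95
# The real data are not included in the paper; this only illustrates the analysis.

def exp1_analysis(df: pd.DataFrame):
    # Three-way repeated-measures ANOVA: Hand Posture x Synchrony x Congruency.
    anova = AnovaRM(df, depvar='accuracy', subject='participant',
                    within=['posture', 'synchrony', 'congruency']).fit()
    print(anova)

    # Follow-up: congruency effect within each posture of the synchronous condition,
    # mirroring the separate per-synchrony ANOVAs and paired t-tests reported above.
    sync = df[df['synchrony'] == 'synchronous']
    for posture in ('uncrossed', 'crossed'):
        sub = sync[sync['posture'] == posture]
        cong = sub[sub['congruency'] == 'congruent'].sort_values('participant')['accuracy']
        incong = sub[sub['congruency'] == 'incongruent'].sort_values('participant')['accuracy']
        t, p = stats.ttest_rel(cong.to_numpy(), incong.to_numpy())
        print(f'{posture}: t({len(cong) - 1})={t:.2f}, p={p:.3f}')
```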

Discussion

The results of Experiment 1 demonstrate that participants were unable to ignore task-irrelevant tactile distractors when responding to the direction of the target auditory stream, thus giving rise to a significant crossmodal interaction. This effect was, however, only seen in the synchronous condition, when the target and distractor stimuli were presented at the same time, and not in the asynchronous condition, when the onset of the tactile distractor stream occurred 500 ms after the onset of the auditory target stream. The results of the asynchronous condition in Experiment 1 therefore provide a baseline measure of performance in the auditory discrimination task, showing that the perception of the direction of the auditory stream was unambiguous when the tactile distractors were presented at a temporal lag that minimized the possibility of any crossmodal interaction between the target and distractor streams (e.g. Soto-Faraco et al. 2002).

If tactile information really does affect the perceived direction of an auditory stream, one might expect this crossmodal effect to be reflected both in congruent trials (in which an improvement in people's accuracy would be expected) and in incongruent trials (where an impairment of people's performance would be expected). To determine whether this was, in fact, true, we conducted further statistical comparisons to test whether performance in the synchronous condition was improved in congruent trials and impaired in incongruent trials, compared with the baseline asynchronous condition. These tests revealed that participants' performance was significantly worse in synchronous-incongruent trials, t(17)=5.96, P<.001, but that their performance was not significantly better in synchronous-congruent trials. However, it should be noted that this non-significant difference between the synchronous and asynchronous conditions in congruent trials may simply reflect the fact that participants were already responding at near-ceiling levels in the baseline (asynchronous) trials (see Experiment 2 for a fuller discussion of this point).

Perhaps the most important result to emerge from Experiment 1 was that the interfering effect of tactile distractors on the spatiotemporal perception of auditory target stimuli was not found when participants adopted a crossed-hands posture (see Soto-Faraco et al. 2004b for a similar result). Consequently, the participants in Experiment 1 were able to discriminate the direction of the auditory streams more accurately in the crossed-hands posture than in the uncrossed posture. Our results therefore provide one of the few documented cases where crossing the hands has actually been shown to facilitate performance rather than impairing it. Indeed, as noted earlier, crossing the hands can often lead to quite severe deficits in tactile perception (e.g. Röder et al. 2004; Shore et al. 2002; Yamamoto and Kitazawa 2001; Wada et al. 2004). Because touch acted as the distractor modality rather than the target modality, crossing the hands would presumably have reduced the reliability of any tactile spatiotemporal information, hence effectively eliminating its interfering effect on the perception of the auditory stimuli. This, in turn, suggests that the effect of crossing the hands on the spatial representation of tactile stimuli occurs regardless of whether or not touch is the attended modality.

Fig. 2 Mean percentage of correct responses (error bars represent the SE) in the auditory direction discrimination task as a function of Hand Posture and Congruency in the synchronous (a) and asynchronous (b) conditions of Experiment 1

There is, however, an alternative explanation of the crossed-hands benefit reported in Experiment 1; it could be argued that crossing the hands might somehow inhibit crossmodal interactions (cf. Soto-Faraco et al. 2004b), and that this might have made it easier for participants to ignore the distractor stimuli. Experiment 2 was designed to try and discriminate between these two possible explanations of the results of the crossed-hands condition of Experiment 1. We explored the effect of changes in body posture on audiotactile spatiotemporal interactions by assessing the effect of auditory distractors on the perception of pairs of tactile stimuli, with participants once again adopting either a crossed or an uncrossed-hands posture.

Experiment 2

Experiment 2 was designed to assess people's ability to discriminate the direction of tactile streams as a function of changes in hand posture and auditory distractor congruency. One of the two competing hypotheses proposed to account for the crossed-hands results of Experiment 1 was that crossing the hands simply inhibited crossmodal interactions. If this explanation were to prove correct, the auditory distractors should not have any differential effect on the spatiotemporal perception of the tactile target stimuli in the crossed-hands compared with the uncrossed-hands posture of Experiment 2. Alternatively, if the results of Experiment 1 were caused by a reduction in the reliability of the tactile signal when the participants crossed their hands, we would expect the crossmodal effect to be larger when the hands are crossed in Experiment 2, because now the (weakened) tactile signal is the target.

Methods

Participants

Fourteen different participants (age range 18–33 years, mean 24 years) from the University of Oxford took part in Experiment 2.

Apparatus, materials, design, and procedure

These were the same as for Experiment 1, with the following exceptions: in Experiment 2, the participants had to report the direction in which the vibrotactile stimuli seemed to move while trying to ignore the auditory distractors (i.e. the roles of the two modalities as target and distractor were reversed). Before starting the experiment the participants completed one block of 12 practice trials with their hands in an uncrossed posture. The tactile stimuli were presented in isolation (i.e. without the distracting sounds) to ensure that the participants could correctly discriminate the direction of the tactile stream. Next, the participants completed a further block of 12 practice trials in which the auditory stimuli were presented in isolation and in which participants had to respond to the direction of the auditory streams. We were thus able to ensure that if the auditory distractors were found to have no effect on tactile perception this could not simply be attributed to the participants merely being unable to discriminate the direction of the auditory stream. These first two blocks of practice were repeated if participants made more than one error. Finally, the participants completed a third block of practice trials in which they responded to the direction of the tactile streams with their hands in a crossed posture. The accuracy of participants' performance in this block served as an index of the effect of crossing the hands on discriminating the direction of the tactile streams in the absence of any distractors (see Footnote 1).

Results

The accuracy data were submitted to a three-way repeated-measures ANOVA with the factors of Hand Posture (Uncrossed vs Crossed), Synchrony (Synchronous vs Asynchronous), and Congruency (Congruent vs Incongruent). The ANOVA revealed a significant main effect of Hand Posture, F(1,13)=80.83, MSE=669.69, P<.01, with participants responding more accurately when their hands were placed in the uncrossed posture (M=95%) than when they were placed in the crossed posture (M=51%). This result demonstrates the difficulty people have discriminating the direction of a tactile stream when their hands are crossed (cf. Shore et al. 2002). There was also a significant main effect of Congruency, F(1,13)=21.86, MSE=202.47, P<.001, with participants responding more accurately in congruent trials (79%) than in incongruent trials (67%). The interaction between Synchrony and Hand Posture was also significant, F(1,13)=6.14, MSE=70.83, P<.05, revealing a larger effect of Hand Posture (measured as the difference between the uncrossed-hands and the crossed-hands conditions) in the asynchronous condition than in the synchronous condition (M=48 vs 40%, respectively). The interaction between Synchrony and Congruency, F(1,13)=14.38, MSE=168.42, P<.01, also reached significance, reflecting the fact that there was a Congruency effect in the synchronous condition, t(13)=4.54, P<.001, but not in the asynchronous condition, t(13)=1.78, P=.09. Finally, the interaction between Hand Posture and Congruency was significant, F(1,13)=15.66, MSE=117.58, P<.05, showing that the magnitude of the congruency effect was larger in the crossed-hands condition than in the uncrossed-hands condition (M=20 vs 4%, respectively).

Footnote 1: The participants responded less accurately in the crossed-hands practice block (M=57%) than in the uncrossed-hands block (M=92%). This result supports the idea that crossing the hands impairs discrimination of multiple tactile stimuli presented to the two hands even in the absence of any distractor stimuli being presented (cf. Shore et al. 2002).

As in Experiment 1, analysis of the accuracy data revealed a significant three-way interaction between Hand Posture, Synchrony, and Congruency, F(1,13)=5.71, MSE=129.02, P<.05. Once again, we performed separate ANOVAs on the data from each level of the Synchrony factor (see Fig. 3). In the synchronous trials the interaction between Hand Posture and Congruency was significant, F(1,13)=12.61, MSE=194.71, P<.01. In contrast with Experiment 1, however, both the crossed-hands and the uncrossed-hands conditions now showed a typical pattern of crossmodal interaction (i.e. a significant difference between congruent and incongruent trials; both P<.01). Crucially, the magnitude of the crossmodal capture effect was significantly larger in the crossed-hands posture than in the uncrossed-hands posture (M=35 vs 8%, respectively). Note that this is the opposite pattern of results to that reported in Experiment 1, in which crossmodal interactions were reported in the uncrossed-hands posture but not in the crossed-hands posture. Participants performed more accurately in the uncrossed-hands condition than in the crossed-hands condition overall (93 vs 53%, respectively), resulting in a significant main effect of Hand Posture, F(1,13)=57.11, MSE=392.74, P<.001. The Congruency factor also reached significance, F(1,13)=20.69, MSE=323.68, P<.001, with a mean accuracy of 84% in the congruent trials and 62% in the incongruent trials. Analysis of the asynchronous trials revealed a significant main effect of Hand Posture, F(1,13)=92.39, MSE=347.88, P<.001. This term reflects the fact that participants responded more accurately in the uncrossed-hands condition (M=97%) than in the crossed-hands condition, where they performed at chance levels (M=49%). Neither the main effect of Congruency, F(1,13)=3.17, MSE=47.21, P=.09, nor the interaction between Hand Posture and Congruency, F(1,13)=2.38, MSE=51.89, P=.14, reached statistical significance in the asynchronous trials.

A similar ANOVA on the RT data revealed a significant main effect of Hand Posture, F(1,13)=11.12, MSE=290,003, P<.01, with participants responding more rapidly in the uncrossed-hands condition (M=876 ms) than in the crossed-hands condition (M=1,215 ms). This result again supports the view that crossing the hands reduces the reliability of any tactile information, such that participants become confused about the direction of the tactile stimuli. None of the other terms in the analysis of the RT data reached significance.

Discussion

The results of Experiment 2 reveal that the extent to which auditory distractors influence the perception of the direction in which a pair of tactile stimuli appears to move over time depends on the posture adopted by participants. When participants adopted the uncrossed-hands posture a significant crossmodal interaction was reported, albeit reduced in magnitude compared with that seen in Experiment 1 (in which the target and distractor modalities were reversed). However, this crossmodal effect was much larger in the crossed-hands condition (M=35%) than in the uncrossed-hands condition (M=8%). This latter result is interesting because it supports the idea that crossing the hands effectively reduced the reliability of any tactile information. In fact, previous studies have shown that people are often uncertain when asked to localize two rapidly presented vibrotactile stimuli (one presented to each hand) when they adopt the crossed-hands posture (e.g. Röder et al. 2004; Shore et al. 2002; Yamamoto and Kitazawa 2001). Therefore, the results of Experiment 2 are incompatible with the hypothesis that crossing the hands simply inhibits audiotactile crossmodal interactions per se. Rather, they support the idea that crossing the hands reduced the reliability of the processing of any tactile information, thereby eliminating the crossmodal interaction reported in Experiment 1. We discuss the implications of these results in the "General discussion".

As in Experiment 1, crossmodal interactions were only seen in the synchronous condition, with the asynchronous condition once again providing a baseline measure of participants' performance in the tactile discrimination task, both in the crossed and uncrossed-hands postures. We conducted statistical comparisons to test whether performance was improved in congruent trials, impaired in incongruent trials, or whether both effects were taking place in the synchronous condition when compared with the baseline asynchronous condition. These tests revealed that participants' performance was impaired in synchronous-incongruent trials compared with asynchronous-incongruent trials (M=62 vs 71%, t(13)=3.14, P<.01). This result suggests that the presentation of a pair of auditory stimuli at the same time as the target tactile stimuli impaired people's ability to discriminate the direction of a tactile stream when the sounds originated from the opposite locations to the tactile stimuli. The comparison between the synchronous and asynchronous conditions for the congruent trials was also significant, t(13)=3.26, P<.01, revealing that participants performed more accurately in the synchronous condition than in the asynchronous condition (M=84 vs 75%, respectively). It seems, therefore, that the effect of auditory distractors on the perception of tactile stimuli can be shown both in congruent trials and in incongruent trials. We believe that the non-significant difference between the synchronous and asynchronous conditions in congruent trials shown in Experiment 1 was probably attributable to participants responding at a near-ceiling level in that experiment, rather than to any general absence of an effect of tactile information on auditory discrimination in congruent trials.

Fig. 3 Mean percentage of correct responses (error bars represent the SE) in the tactile direction discrimination task as a function of Hand Posture and Congruency in the synchronous (a) and asynchronous (b) conditions of Experiment 2

General discussion

The most important result to emerge from the two experiments reported in this study is that audiotactile interactions in spatiotemporal perception are modulated by changes of body posture. When participants adopted an uncrossed-hands posture in Experiment 1, tactile information had a strong effect on the perception of the direction of an auditory stream (see Soto-Faraco et al. 2004b, for similar results). The results from the uncrossed-hands condition in Experiment 2 revealed a reciprocal, albeit somewhat weaker, interaction between touch and audition. An independent samples t-test comparing the magnitude of the crossmodal capture effect in the synchronous uncrossed-hands conditions revealed that the effect of tactile distractors on auditory perception was significantly larger than the effect of auditory distractors on tactile perception, t(30)=3.88, P<.001. Although this result seems to suggest that touch "dominates" over audition in the processing of spatiotemporal information, it is important to note that the auditory and tactile stimuli used in this study were not matched in certain respects, such as, for example, their intensity, and that this may, perhaps, have led to differences in the strength of any crossmodal interactions in the two modalities. However, given that the same asymmetry has now been reported using several different behavioural models (e.g. Sherrick 1976; Soto-Faraco et al. 2004b), it would seem at least somewhat more likely that it reflects a genuine inequality between the modalities.

The crossed-hands condition in this study revealed a different pattern of results from the uncrossed-hands condition. While no influence of tactile distractors on auditory perception was detected in Experiment 1, auditory distractors actually exerted a stronger effect on tactile perception when the hands were crossed in Experiment 2. This result suggests that crossing the hands does not impair crossmodal interactions per se (cf. Soto-Faraco et al. 2004b), but that instead it reduces the reliability of any tactile information, thereby making it more susceptible to the effect of distractors presented in another modality.

The results of the two experiments reported here demonstrate that crossmodal interactions cannot be viewed as reflecting a complete "dominance" or capture of one sensory modality by another (cf. Alais and Burr 2004; Andersen et al. 2004; Battaglia et al. 2003; Caclin et al. 2002; Ernst and Banks 2002; Welch and Warren 1980). It would seem, instead, that multisensory interactions in audiotactile information processing are both flexible and adaptive. Consistent with this idea are the results of a recent study by Hötting and Röder (2004), who demonstrated that auditory distractors have a weaker effect on blind people than on sighted individuals when they try to judge the number of rapidly presented tactile stimuli. Although both groups of participants were likely to report having perceived more than one tactile stimulus when two or more tones were presented simultaneously with a single tactile stimulus (cf. Bresciani et al. 2005; Shams et al. 2000), the blind participants performed significantly more accurately than did the sighted group overall. Hötting and Röder argued that the accuracy of tactile information processing was higher in the blind participants (cf. Röder et al. 2004), thus reducing the effect of the auditory distractors on tactile discrimination responses. These results are entirely consistent with the conclusions of the current study, namely, that the outcome of crossmodal interactions depends upon the relative reliability of the processing within the sensory modalities involved in the perception of a particular multisensory event (see Ernst and Bülthoff 2004, for a recent discussion of this issue).

In terms of the level (or levels) of information processing at which such interactions occur, it would seem that in the current experiments crossing the hands modulated (or interfered with) tactile processes that occurred before, or at the same time as, the crossmodal interaction between auditory and tactile information (cf. Sanabria et al. 2004a, b, 2005). As noted earlier, this view is consistent with previous studies that have demonstrated that crossing the hands impairs the spatiotemporal processing of two tactile stimuli presented in a rapid sequence (i.e. within 200 ms, as in this study; e.g. Röder et al. 2004; Shore et al. 2002; Yamamoto and Kitazawa 2001). At present, by contrast, no one has shown (and, a priori, there is no reason to believe) that crossing the hands impairs the processing of auditory stimuli. This would explain the lack of any distracting effect of the tactile information on the perception of the auditory information in the crossed-hands condition of Experiment 1, and the increased effect of auditory distractors on the perception of the tactile targets in the crossed-hands condition of Experiment 2, both effects compared with the respective uncrossed-hands conditions and restricted to the synchronous condition.

One alternative hypothesis that should also be considered here is that hand posture modulates the interaction between already integrated pairs of auditory and tactile stimuli (i.e. the first pair of simultaneously presented stimuli is integrated before the appearance of the second pair of audiotactile stimuli). Support for this view comes from recent neurophysiological data from both human (e.g. Foxe et al. 2000, 2002; Gobbelé et al. 2003; Lütkenhöner et al. 2002; Murray et al. 2005; Schroeder et al. 2003) and animal studies (e.g. Fu et al. 2003; Schroeder and Foxe 2002; Schroeder et al. 2001) suggesting that auditory and tactile stimuli can, under certain conditions, be integrated within 100 ms of stimulus onset; that is, before the presentation of the second pair of stimuli in the synchronous condition of our experiments. However, this hypothesis alone cannot fully explain the pattern of results found in the two experiments described here. In fact, even if hand posture only modulates the processing of multisensory information (i.e. constituted by the integration of an auditory and a tactile stimulus), one would still need to explain why crossing the hands had a different effect depending on the modality of the target. It therefore seems more plausible to argue that crossing the hands must be impairing unimodal processing of the tactile stimuli, which opens an interesting question about the relationship between unimodal and crossmodal information processing (see Sanabria et al. 2004a, b, 2005, for recent examples demonstrating the modulation of crossmodal audiovisual interactions by variations in the patterns of perceptual grouping taking place unimodally; see also Footnote 2).

Another relevant question addressed by this study relates to the spatial frame of reference in which time-varying patterns of tactile information are represented during audiotactile interactions. Although examination of the literature for the case of static tactile events reveals apparently conflicting results (see Röder et al. 2000, for evidence in favour of a somatotopic frame of reference in the phenomenon of inhibition of return; see Lloyd et al. 2003, for evidence supporting the influence of a more external frame of reference in endogenous spatial attention), our results do not fully support either of these two possible accounts (i.e. a purely somatosensory or a more externally based frame of reference). If either frame of reference had prevailed completely in the experiments reported here, there would have been no reason to predict any significant difference in the magnitude of the crossmodal capture effects as a function of hand posture in either experiment. If tactile information had been coded relative to a somatotopic frame of reference, the congruency effect should have reversed (with participants making more correct responses in incongruent trials than in congruent trials, congruency always being defined relative to the direction in external space) in the crossed-hands condition compared with the uncrossed-hands condition (i.e. an "external" left-to-right tactile stream would have been perceived as a "somatotopic" right-to-left stream). On the other hand, if tactile information were represented in terms of a more externally based frame of reference, one would not have predicted there to be any significant differences in either the magnitude or direction of the congruency effects for the two postures (i.e. an "external" left-to-right tactile stream would have remained a left-to-right stream in the crossed-hands posture). In addition, any account of tactile spatiotemporal processing based on either a somatotopic or a more externally based frame of reference would not have predicted a significant difference in performance in tactile discrimination as a function of hand posture in the asynchronous baseline condition (see the "Results" of Experiment 2).

There are several possible explanations of these results. It might be that a somatotopic frame of reference prevailed in both experiments, but that visual spatial information (note that even in the dark, people can imagine the position of their hands) interfered in the processing of tactile information (see Röder et al. 2004 for a similar explanation). Alternatively, however, it could be that the significant impairment in performance measured in the (baseline) asynchronous condition of Experiment 2 between the crossed and uncrossed-hands conditions reflects the conjoint influence of two different frames of reference in coding tactile motion (e.g. Kim and Cruse 2001; Pouget et al. 2004; Spence and Walton 2005). In fact, although both a somatotopic and a more externally based frame of reference code tactile spatiotemporal information in the same way in the uncrossed-hands posture, they come into conflict when people adopt a crossed-hands position (e.g. an external leftward tactile stream equates to a somatotopic rightward stream). In any case, no matter what the correct explanation of this finding turns out to be, these results suggest that crossing the hands reduced the reliability of any tactile information presented in these experiments (cf. Röder et al. 2004; Shore et al. 2002; Wada et al. 2004; Yamamoto and Kitazawa 2001; see also Spence et al. 2003).

Footnote 2: To determine whether the two stimuli presented in each of the tactile sequences were perceived as a unique perceptual event (rather than as two individual tactile stimuli presented sequentially), another group of observers (N=8) was presented with 12 repetitions of the tactile displays used here and asked to rate on a Likert scale (1–7) whether they experienced a connected tactile stimulus travelling from one hand to the other (a response of 7), or else two completely independent vibrations (a response of 1). The average rating for the 150-ms SOA was 4.1, thus confirming that some impression of motion was present. It should be noted, however, that given this average rating, our tactile display presumably elicited a weak (or broken) perception of apparent motion (i.e. in contrast to the strong apparent motion seen when large numbers of closely spaced stimuli are presented; cf. Strybel and Vatakis 2004). Note, however, that the auditory apparent motion conditions used here have also been shown to elicit the impression of motion (e.g. Soto-Faraco et al. 2004a, 2005). Moreover, a group of 11 observers was tested using a staircase procedure (PEST algorithm; Taylor and Creelman 1967) designed to find the upper and lower SOA thresholds at which tactile apparent motion breaks down (i.e. the 50% threshold) for pairs of 50-ms vibrations presented one to either hand. The thresholds were 34 ms (SD=31) and 623 ms (SD=313), respectively, that is, encompassing the SOA of 150 ms used in our experiments.
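To make the threshold procedure mentioned in Footnote 2 concrete, the sketch below illustrates an adaptive staircase for the upper breakdown threshold. The authors used the PEST algorithm (Taylor and Creelman 1967); what follows is a simpler 1-up/1-down rule, which also converges on the 50% point, shown purely for illustration. The starting SOA, step sizes, and trial budget are assumptions, and for the lower threshold the step direction would be reversed.

```python
# Minimal adaptive-staircase sketch for estimating the SOA at which tactile
# apparent motion breaks down. NOT the PEST algorithm used in the study; a
# simple 1-up/1-down rule that also targets the 50% point, for illustration.

def staircase(report_motion, start_soa_ms=300, step_ms=64, min_step_ms=4, n_trials=40):
    """Estimate the upper 50%-threshold SOA.

    report_motion(soa_ms) -> bool: True if the observer reports apparent motion
    at this SOA on the current trial (e.g. collected via the footpedals).
    """
    soa = start_soa_ms
    last_response = None
    reversals = []
    for _ in range(n_trials):
        saw_motion = report_motion(soa)
        if last_response is not None and saw_motion != last_response:
            reversals.append(soa)
            step_ms = max(min_step_ms, step_ms / 2)   # halve the step at each reversal
        # Motion reported -> lengthen the SOA (harder); otherwise shorten it (easier).
        soa = soa + step_ms if saw_motion else max(0, soa - step_ms)
        last_response = saw_motion
    # Average the SOAs at the last few reversals as the threshold estimate.
    tail = reversals[-6:] if len(reversals) >= 6 else reversals
    return sum(tail) / len(tail) if tail else soa
```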

In summary, our results show that body posture affects the processing of spatiotemporal tactile information, modulating the way in which it interacts with spatiotemporal auditory information, and suggesting that crossmodal interactions depend upon the reliability of the unimodal sensory information.

Acknowledgements The authors thank Nicholas P. Holmes for providing comments on an earlier version of this manuscript. This study was supported by a Network Grant from the McDonnell-Pew Centre for Cognitive Neuroscience in Oxford to Salvador Soto-Faraco and Charles Spence.

References

Adelstein BD, Begault DR, Anderson MR, Wenzel EM (2003) Sensitivity to haptic-audio asynchrony. Proceedings of the Fifth International Conference on Multimodal Interfaces, New York. The Association for Computing Machinery Press, pp 73–76

Alais D, Burr D (2004) The ventriloquist effect results from near-optimal bimodal integration. Curr Biol 14:257–262

Andersen TS, Tiippana K, Sams M (2004) Factors influencing audiovisual fission and fusion illusions. Cogn Brain Res 21:301–308

Battaglia PW, Jacobs RA, Aslin RN (2003) Bayesian integration of visual and auditory signals for spatial localization. J Opt Soc Am 20:1391–1397

Bresciani JP, Ernst MO, Drewing K, Bouyer G, Maury V, Kheddar A (2005) Feeling what you hear: auditory signals can modulate tactile tap perception. Exp Brain Res 162:172–180

Burton H, Sinclair R (1996) Somatosensory cortex and tactile perceptions. In: Kruger L (ed) Pain and touch. Academic, San Diego, pp 105–177

Caclin A, Soto-Faraco S, Kingstone A, Spence C (2002) Tactile 'capture' of audition. Percept Psychophys 64:616–630

Eimer M, Cockburn D, Smedley B, Driver J (2001) Cross-modal links in endogenous spatial attention are mediated by common external locations: evidence from event-related brain potentials. Exp Brain Res 139:398–411

Ernst MO, Banks MS (2002) Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415:429–433

Ernst MO, Bülthoff HH (2004) Merging the senses into a robust percept. Trends Cogn Sci 8:162–169

Foxe JJ, Morocz IA, Murray MM, Higgins BA, Javitt DC, Schroeder CE (2000) Multisensory auditory–somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Cogn Brain Res 10:77–83

Foxe JJ, Wylie GR, Martinez A, Schroeder CE, Javitt DC, Guilfoyle D, Ritter W, Murray MM (2002) Auditory–somatosensory multisensory processing in auditory association cortex: an fMRI study. J Neurophysiol 88:540–543

Fu KMG, Johnston TA, Shah AS, Arnold L, Smiley J, Hackett TA, Garraghty PE, Schroeder CE (2003) Auditory cortical neurons respond to somatosensory stimulation. J Neurosci 23:7510–7515

Gobbelé R, Schürmann M, Forss N, Juottonen K, Büchner H, Hari R (2003) Activation of the human posterior and temporoparietal cortices during audiotactile interaction. Neuroimage 20:503–511

Graziano MSA, Gandhi S (2000) Location of the polysensory zone in the precentral gyrus of anesthetized monkeys. Exp Brain Res 135:259–266

Graziano MSA, Gross CG, Taylor CSR, Moore T (2004) A system of multimodal areas in the primate brain. In: Spence C, Driver J (eds) Crossmodal space and crossmodal attention. Oxford University Press, Oxford, pp 51–69

Hötting K, Röder B (2004) Hearing cheats touch, but less in congenitally blind than in sighted individuals. Psychol Sci 15:60–64

Kim D-H, Cruse H (2001) Two kinds of body representation are used to control hand movements following tactile stimulation. Exp Brain Res 139:76–91

Lloyd DM, Merat N, McGlone F, Spence C (2003) Crossmodal links between audition and touch in covert endogenous spatial attention. Percept Psychophys 65:901–924

Lütkenhöner B, Lammertmann C, Simões C, Hari R (2002) Magnetoencephalographic correlates of audiotactile interaction. Neuroimage 15:509–522

Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, Javitt DC, Schroeder CE, Foxe JJ (2005) Grabbing your ear: rapid auditory–somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex (in press)

Pallier C, Dupoux E, Jeannin X (1997) EXPE: an expandable programming language for psychological experiments. Behav Res Methods Instrum Comput 29:322–327

Penfield W, Rasmussen T (1950) The cerebral cortex of man. Macmillan, New York

Pouget A, Deneve S, Duhamel JR (2004) A computational neural theory of multisensory spatial representations. In: Spence C, Driver J (eds) Crossmodal space and crossmodal attention. Oxford University Press, Oxford, pp 123–141

Röder B, Spence C, Rösler F (2000) Inhibition of return and oculomotor control in the blind. Neuroreport 11:3043–3045

Röder B, Rösler F, Spence C (2004) Early vision impairs tactile perception in the blind. Curr Biol 14:121–124

Sanabria D, Soto-Faraco S, Chan J, Spence C (2004a) When does intramodal perceptual grouping affect multisensory integration? Cogn Affect Behav Neurosci 4:218–229

Sanabria D, Soto-Faraco S, Chan J, Spence C (2004b) Exploring the role of visual perceptual grouping on the audiovisual integration of motion. Neuroreport 15:2745–2749

Sanabria D, Soto-Faraco S, Chan J, Spence C (2005) Intramodal perceptual grouping modulates crossmodal interaction: evidence from the crossmodal dynamic capture task. Neurosci Lett 377:59–64

Schroeder CE, Foxe JJ (2002) Timing and laminar profile of converging inputs in multisensory areas of the macaque neocortex. Cogn Brain Res 14:195–207

Schroeder CE, Lindsley RW, Specht C, Marcovici A, Smiley JF, Javitt DC (2001) Somatosensory input to auditory association cortex in the macaque monkey. J Neurophysiol 85:1322–1327

Schroeder CE, Smiley J, Fu KMG, McGinnis T, O'Connell MN, Hackett TA (2003) Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing. Int J Psychophysiol 50:5–18

Shams L, Kamitani Y, Shimojo S (2000) What you see is what you hear. Nature 408:788

Sherrick CE (1976) The antagonisms of hearing and touch. In: Hirsh SK, Eldredge DH, Hirsh IJ, Silverman SR (eds) Hearing and Davis: essays honouring Hallowell Davis. Washington University Press, St. Louis, pp 149–158

Shore D, Spry E, Spence C (2002) Confusing the mind by crossing the hands. Cogn Brain Res 14:153–163

Soto-Faraco S, Lyons J, Gazzaniga M, Spence C, Kingstone A (2002) The ventriloquist in motion: illusory capture of dynamic information across sensory modalities. Cogn Brain Res 14:139–146

Soto-Faraco S, Spence C, Kingstone A (2004a) Congruency effects between auditory and tactile motion: extending the crossmodal dynamic capture effect. Cogn Affect Behav Neurosci 4:208–217

Soto-Faraco S, Spence C, Kingstone A (2004b) Crossmodal dynamic capture: congruency effects of motion perception across sensory modalities. J Exp Psychol Hum Percept Perform 30:330–345

Soto-Faraco S, Spence C, Kingstone A (2005) Assessing automaticity in the audio-visual integration of motion. Acta Psychol 118:71–92


Spence C, Walton M (2005) On the inability to ignore touch when responding to vision in the crossmodal congruency task. Acta Psychol 118:47–70

Spence C, Baddeley R, Zampini M, James R, Shore D (2003) Multisensory temporal order judgments: when two locations are better than one. Percept Psychophys 65:318–328

Spence C, Nicholls MER, Gillespie N, Driver J (1998) Crossmodal links in exogenous covert spatial orienting between touch, audition, and vision. Percept Psychophys 60:544–557

Strybel TZ, Vatakis A (2004) A comparison of auditory and visual apparent motion presented individually and with crossmodal moving distractors. Perception 33:1033–1048

Taylor MM, Creelman CD (1967) PEST: efficient estimates on probability functions. J Acoust Soc Am 41:782–787

Wada M, Yamamoto S, Kitazawa S (2004) Effects of handedness on tactile temporal order judgments. Neuropsychologia 42:1887–1895

Welch RB, Warren DH (1980) Immediate perceptual response to intersensory discrepancy. Psychol Bull 88:638–667

Yamamoto S, Kitazawa S (2001) Reversal of subjective temporal order due to arm crossing. Nat Neurosci 4:759–765

Zampini M, Brown T, Shore DI, Maravita A, Röder B, Spence C (2005) Audiotactile temporal order judgments. Acta Psychol 118:277–291
