Charles Spence, Department of Experimental Psychology, Oxford University
Crossmodal Attention & Multisensory Integration:
Implications for Multimodal Interface Design
In the Realm of the Senses
Majority of information presented visually
Wickens (1980, 1984, 1988, 1992...): structure of human processing resources
Are there any costs of monitoring more than one sensory channel?
• Left/right discrimination task with visual (target light), tactile (target vibrator) & auditory (target loudspeaker) targets
Touch is Sticky
Performance Costs Associated with Attending to Multiple Modalities
Possible → Presented modality: cost in ms
• Audition → Vision: 55
• Touch → Vision: 104
• Touch → Audition: 98
• Vision → Audition: 68
• Audition → Touch: 67
• Vision → Touch: 66
[Figure: shadowing conditions — relevant & irrelevant auditory and visual streams presented from the same or different positions]
Shadowing Performance
[Figure: shadowing performance (% correct), chewing-lips vs speaking-lips, same vs different position]
1) Lip-reading facilitates shadowing
2) Better performance when auditory & visual information come from the same position

Don’t Dial & Drive? Leeds Advanced Driving Simulator
Spence & Read (2003)
[Figure: simulator setup — relevant speech (RS) and irrelevant speech (IS) sources; speech shadowed from the side or from the front, relative to the simulator screen]
Multisensory Warning Signals
Multisensory Integration
‘…there is no animal in which there is known to be a complete segregation of sensory processing’ (Stein et al., 1996)
Superadditivity Multisensory Enhancement
Multisensory Suppression
You simply cannot predict multisensory perception by studying senses in isolation
Multisensory Perception
Multisensory Motion Perception
Displays presented every 2 s until response
Task: report direction of auditory motion
[Figure: apparatus — loudspeaker cones and LED lights spaced 30 cm apart, with central fixation]
[Figure: percentage correct and number of display updates before response, for synchronous vs asynchronous presentation, same vs opposite direction of visual motion]
Crossmodal Dynamic Capture
Auditory motion perception compromised by synchronous presentation of visual motion
Rules of Multisensory Integration
• Superadditivity: weak stimuli interact synergistically when presented from the same location at about the same time
• Subadditivity: when these conditions are not met
• Sensory dominance: vision for space, hearing for time, olfaction for appetitive judgments, touch & olfaction for affective judgments
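The superadditivity/subadditivity rules above are commonly quantified with the multisensory enhancement index of Meredith & Stein, the percent gain of the combined response over the best unisensory response. The response counts below are invented purely for illustration; only the formula is standard:

```python
# Multisensory enhancement index (Meredith & Stein):
#   ME = (CM - max(A, V)) / max(A, V) * 100
# where CM is the response to the combined audiovisual stimulus.
# All response magnitudes below are made-up illustrative numbers.

def enhancement_index(multi: float, best_uni: float) -> float:
    """Percent gain of the multisensory response over the best unisensory one."""
    return (multi - best_uni) / best_uni * 100.0

# Weak stimuli, same place & time: combined response far exceeds either alone
weak = enhancement_index(multi=18.0, best_uni=5.0)

# Strong stimuli: combined response barely beats the best single modality
strong = enhancement_index(multi=32.0, best_uni=30.0)

print(f"weak-stimulus enhancement:   {weak:.0f}%")    # superadditive: 260%
print(f"strong-stimulus enhancement: {strong:.0f}%")  # near-additive: 7%
```

The index makes the "inverse effectiveness" pattern explicit: the weaker the unisensory inputs, the larger the proportional multisensory gain.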
Virtual Body Effect
• Virtual body effect (shadows)
• Tool-use (computer mice / laser pointers)
Incorporation & Embodiment
Changing the perception of touch with sound
[Figure: experimental setup — headphones, microphone, product, dimension scale (dry to hydrated), footpedals]
Multisensory Synchronization
When should you present multisensory stimuli?
Perception of Simultaneity
[Figure: proportion of ‘simultaneous’ responses as a function of stimulus onset asynchrony (-200 to +200 ms; sound first vs vision first), for same vs different positions]
• Wide temporal window of multisensory integration
• Perception of simultaneity enhanced when stimuli come from the same location
Biophysics: Transduction Latencies
Speed of neural processing varies across the senses, from slow (vision) to fast (audition)
Physics
Light travels faster than sound, so distant events are seen before they are heard
‘Horizon of Simultaneity’
Physics cancels out biophysics at 10m
• Most interfaces are closer than 10 m
• Simultaneous presentation of multisensory signals doesn’t guarantee perception of simultaneity
• Desynchronizing inputs might enhance multisensory integration & perception (warning signals)
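The "horizon of simultaneity" follows from simple arithmetic: sound’s travel delay grows with distance, while audition’s faster neural transduction gives it a roughly constant head start. The ~30 ms auditory processing advantage used below is an illustrative assumption, chosen so that the crossover lands near the 10 m figure from the slide:

```python
# Back-of-the-envelope "horizon of simultaneity": the distance at which the
# acoustic travel delay exactly offsets audition's faster transduction.
# TRANSDUCTION_ADVANTAGE is an assumed, illustrative value (not from the talk).

SPEED_OF_SOUND = 343.0           # m/s in air at ~20 degrees C
SPEED_OF_LIGHT = 3.0e8           # m/s (light's travel time is negligible here)
TRANSDUCTION_ADVANTAGE = 0.030   # s: assumed auditory-over-visual processing lead

def sound_travel_delay(distance_m: float) -> float:
    """Extra time sound needs to cover the distance, relative to light."""
    return distance_m / SPEED_OF_SOUND - distance_m / SPEED_OF_LIGHT

def horizon_of_simultaneity(advantage_s: float = TRANSDUCTION_ADVANTAGE) -> float:
    """Distance at which physics cancels out biophysics."""
    # d / v_sound ~= advantage  =>  d ~= advantage * v_sound
    return advantage_s * SPEED_OF_SOUND

if __name__ == "__main__":
    print(f"Horizon of simultaneity ~ {horizon_of_simultaneity():.1f} m")
    # Closer than the horizon, the sound is perceived first; beyond it, vision wins.
    for dist in (1.0, 5.0, 20.0):
        net = sound_travel_delay(dist) - TRANSDUCTION_ADVANTAGE
        print(f"{dist:5.1f} m: audio lags vision by {net * 1000:+6.1f} ms (perceived)")
```

With a 30 ms advantage the crossover comes out at about 10.3 m, which is why simultaneous physical presentation inside a typical interface (well under 10 m) does not produce perceived simultaneity.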
Multisensory Entertainment
‘Most designers have gotten to the point in production where the decision is made to hit the viewer with everything they’ve got. The big sounds, the dramatic slam of music from the dead silence, the sudden appearance of the beast. And the kids sit there saying ‘been there... done that... ho hum...’’ (Ralph Thomas, ‘Nothing to sniff at?’, 2002)
Olfactory Interfaces?
• Reducing symptoms of road rage
• Alerting drowsy drivers
• Burnt-rubber smell for bad drivers
• Olfactory console so drivers can choose a smell to suit mood / surroundings
• Technology available to introduce PC smell (Digiscents failed; Arvel, Japan)
Conclusions
• Attention & multisensory integration critically determine perception & behavior
• Spatial constraints on focused & divided attention between hearing, sight & touch
• Multisensory temporal synchrony
• Understanding multisensory interactions will lead to better interface design
• From intuition to understanding via cognitive neuroscience
Aging & Multisensory Perception
• By 2025, more than a billion people over 60 (US Senate Special Committee on Aging, 1985-1986)
5th Annual Meeting, Barcelona, June 2-5, 2004
www.multisense.info/2004
Contact: [email protected]