
JSLHR

Research Article

Coordination of Gaze and Speech in Communication Between Children With Hearing Impairment and Normal-Hearing Peers

Olof Sandgren,a Richard Andersson,a Joost van de Weijer,a

Kristina Hansson,a and Birgitta Sahléna

Purpose: To investigate gaze behavior during communication between children with hearing impairment (HI) and normal-hearing (NH) peers.
Method: Ten HI–NH and 10 NH–NH dyads performed a referential communication task requiring description of faces. During task performance, eye movements and speech were tracked. Using verbal event (questions, statements, back channeling, and silence) as the predictor variable, group characteristics in gaze behavior were expressed with Kaplan-Meier survival functions (estimating time to gaze-to-partner) and odds ratios (comparing number of verbal events with and without gaze-to-partner). Analyses compared the listeners in each dyad (HI: n = 10, mean age = 12;6 years, mean better ear pure-tone average = 33.0 dB HL; NH: n = 10, mean age = 13;7 years).

Results: Log-rank tests revealed significant group differences in survival distributions for all verbal events, reflecting a higher probability of gaze to the partner's face for participants with HI. Expressed as odds ratios (OR), participants with HI displayed greater odds for gaze-to-partner (ORs ranging between 1.2 and 2.1) during all verbal events.
Conclusions: The results show an increased probability for listeners with HI to gaze at the speaker's face in association with verbal events. Several explanations for the finding are possible, and implications for further research are discussed.

Key Words: referential communication, eye tracking, child hearing impairment, gaze behavior, survival analysis

Face-to-face interaction consists of more than the verbal exchanges that make up the sound stream. The interaction is replete with gazes to the conversational partner. In a previous study, Sandgren, Andersson, van de Weijer, Hansson, and Sahlén (2012) demonstrated that children with normal hearing displayed a higher probability of gaze-to-partner when asking questions than when making statements, results indicating that the linguistic and pragmatic content of the conversation influences gaze behavior. But does a hearing impairment additionally affect the use of gaze in conversation? In the present study, we test the hypothesis that children with hearing impairment gaze more to the conversational partner than normal-hearing peers by examining gaze-to-partner during different verbal events (questions, statements, back channeling, and silence) in a referential communication task.

Gaze Behavior in Typical Populations

Gaze behavior during conversation has been the topic of a number of studies, and many provide consistent results (Bavelas, Coates, & Johnson, 2002; Kendon, 1967; Mirenda, Donnellan, & Yoder, 1983; Turkstra, Ciccia, & Seaton, 2003). In free and unscripted dialogue, listeners look more at the speaker than vice versa (Kendon, 1967; Turkstra et al., 2003). Turkstra et al. (2003), for example, in a study of interactive behaviors in adolescents, reported that participants look at their partners 65% of the time when listening compared with 40% of the time when speaking. Kendon (1967), reporting ranges and providing few details on participant characteristics, found that adults looked at their partners 30%–80% of the time when listening versus 20%–65% when speaking. Differences between the studies regarding data collection and

aLund University, Lund, Sweden

Correspondence to Olof Sandgren: [email protected]

Editor: Rhea Paul
Associate Editor: Elizabeth Crais

Received October 23, 2012
Revision received April 19, 2013
Accepted September 24, 2013
DOI: 10.1044/2013_JSLHR-L-12-0333

Disclosure: The authors have declared that no competing interests existed at the time of publication.

Journal of Speech, Language, and Hearing Research • Vol. 57 • 942–951 • June 2014 • © American Speech-Language-Hearing Association

Downloaded From: http://jslhr.pubs.asha.org/ by a Proquest User on 08/05/2014
Terms of Use: http://pubs.asha.org/ss/Rights_and_Permissions.aspx


analysis should, however, be considered when comparing results. Whereas both studies investigated gaze exchanges between unacquainted interlocutors, Turkstra et al. (2003) used video-based analysis of an adolescent population, and Kendon (1967) studied adults using technology with limited temporal resolution, likely increasing the variability of gaze-to-partner across individuals and obscuring rapid gaze exchanges between participants. The present study uses eye-tracking technology and a semistructured task to address the need for temporal resolution and ecological validity.

Several attempts have been made to define the communicative and social role played by gaze to the partner in conversation. Kendon (1967), who used a conversation analytic approach, found a lower rate of gaze-to-partner at the beginning of utterances and a higher rate at the end, causing him to suggest a relation between gaze behavior and turn taking in conversation. This turn-regulating function has also been described by Cummins (2012) and Bavelas et al. (2002), who, in addition, went on to propose that gaze not only regulates turn exchanges but also creates short time windows for back channeling responses (for example, "Yeah," "Uh-huh," "Mhm") within an ongoing speaking turn. Previous studies, reviewed and summarized by Mirenda et al. (1983), have proposed additional, pragmatically related functions of gaze, including declaring an interest in the topic, expressing the degree of intimacy with the conversational partner, providing the speaker with information on misunderstandings and communicative breakdowns, and expressing emotions. By analogy, Turkstra (2005) concluded that an atypical gaze behavior can cause "negative outcomes that range from missing a cue to end a conversation to misperceiving the emotional communication of one's partner" (Turkstra, 2005, p. 1430).

Gaze Behavior in Atypical Populations

Gaze behavior is known to be affected by various clinical disorders, including autism spectrum disorders (ASDs), which are associated with a lower rate of gaze-to-partner (Corden, Chilvers, & Skuse, 2008) and/or a distorted timing of gazes (Willemsen-Swinkels, Buitelaar, Weijnen, & van Engeland, 1998), and Williams syndrome, for which atypically high levels of gaze-to-partner are reported (Doherty-Sneddon, Riby, & Whittle, 2012). Norbury et al. (2009), in a study investigating the focus of gaze when viewing video clips of emotionally engaging dialogue, found normal rates of gaze to the characters' eyes in teenagers with ASD but, interestingly, no relation between the gaze behavior and the level of social competence. Instead, gaze behavior was related to communicative competence. The authors concluded that the role of gaze behavior, and in particular gaze to the partner's eyes, in the social competence of the teenagers with ASD may have been exaggerated and that linguistic ability, attention, and rigidity of behavior better predict social competence outcome. It has also been proposed that gaze-to-partner in ASD may reflect more of an orienting function and less of a reciprocal social exchange (Nadig, Lee, Singh, Bosshart, & Ozonoff, 2010). This stems from an experiment in which children with high-functioning autism, speaking about personal interests as opposed to generic topics, exhibited more atypical verbal production but more typical gaze behavior. Nadig et al. (2010) concluded that speaking about a personal interest made participants with ASD more stereotypical and monologue-like. Higher, more typical rates of gaze-to-partner were only made possible because talking about a highly practiced topic made cognitive resources available to be directed at, for example, the partner.

Nadig et al.'s (2010) conclusion, which implies that an appropriate gaze behavior requires sufficient allocation of cognitive resources, finds support in the cognitive load hypothesis proposed by Glenberg, Schroeder, and Robertson (1998). The cognitive load hypothesis has been investigated in association with gaze aversion, that is, the deliberate avoidance of gaze-to-partner, in children with both typical and atypical development. Doherty-Sneddon and Phelps (2005) investigated why the recipient of a question looks away from the partner and tested whether this behavior serves to reduce the cognitive load or to alleviate the social stress associated with the risk of giving the wrong answer. Using a design comparing face-to-face and video-linked questioning, the authors found that the largest impact on gaze aversion was the degree of difficulty of the questions, indicating that gaze aversion serves more to manage cognitive load than social stress. It is possible that the same mechanisms can help explain Kendon's (1967) finding of lower rates of gaze-to-partner at the beginning of utterances. The linguistic planning required to form an utterance is more easily performed when blocking out unnecessary visual stimuli. On the other hand, turn-regulation mechanisms could also explain gaze aversion in the recipient of a question. By avoiding gaze-to-partner, the recipient claims the speaking turn and signals that a response is imminent.

Present Study

From previous research, it can be concluded that gaze-to-partner is actively used in face-to-face conversation. Furthermore, gaze behavior is affected by linguistic and/or social deficits, and results have indicated a possible link between linguistic proficiency and gaze-to-partner. The present study addresses the paucity of research on gaze-to-partner in participants with hearing impairment, a population reported to use visual cues more than normal-hearing peers (Skelt, 2006) and more often exhibiting language delay (Gilbertson & Kamhi, 1995; Hansson, Forsberg, Löfqvist, Mäki-Torkko, & Sahlén, 2004; Yoshinaga-Itano & Sedey, 1998). An increased use of visual cues has been suggested as a compensatory strategy in children with hearing impairment, aiding language processing and comprehension and compensating for the degraded auditory input and restricted ability to use incidental hearing for learning (Blamey et al., 2001). This suggestion is further supported by findings of improved speech perception for audiovisual speech over speech presented through the auditory modality only (Bergeson, Pisoni, & Davis, 2003; Garcia & Dagenais, 1998; Most,

Sandgren et al.: Coordination of Gaze and Speech 943


Rothem, & Luntz, 2009; Woodhouse, Hickson, & Dodd, 2009). Furthermore, visual cues are used in the perspective-taking crucial for efficient communication, as shown by children taking into account the partner's field of view when interpreting instructions (Nadig & Sedivy, 2002; Nilsen & Graham, 2009). Together, the results illustrate the necessity of instant integration of visual information in the perception and processing of speech and in the pragmatic understanding of communication.

In a detailed analysis of the gaze behavior of adults with severe-to-profound hearing impairment while speaking to either their audiologist or a family member, Skelt (2006) found the participants with hearing impairment behaved qualitatively similarly to the participants with normal hearing in Kendon's (1967) study, however with higher rates of gaze-to-partner when listening and lower rates when speaking. Skelt (2006) described how the participants with hearing impairment, through use of gaze initiations and gaze withdrawals, controlled the turn exchanges in the conversations. Although gaze is only one of several tools used in turn regulation, Skelt (2006) emphasized its role in displaying readiness to accept or reject the speaking turn.

Participants with hearing impairment are clinically relevant to study since this population in many countries is educated in inclusive settings (Hyde & Power, 2003; Stinson & Antia, 1999). Modern teaching often involves classroom tasks requiring collaboration between individual students or work in small groups (Toe & Paatsch, 2010), for which questioning and responding is necessary. Consequently, the school setting demands well-functioning interaction, verbal as well as nonverbal. Previous studies have found students with hearing impairment to be less prone to request clarifying information despite misunderstanding (Marschark et al., 2007) and to exhibit difficulties with turn taking, at least when normal-hearing turn-taking behavior is considered the norm (Duchan, 1988). To address these issues, the present study reports data on participants in middle childhood, a period of increasing demands on independence in school work and peer interaction, and investigates gaze behavior for its role in the buildup and management of social interaction.

The present study used a referential communication task requiring collaboration between participants for successful completion. In order to ensure ecological validity, a semistructured, unscripted paradigm was used, defining the procedure without restricting the participants' production or choice of conversational partner. In a previous study of children with normal hearing, Sandgren et al. (2012) showed the verbal production of questions, statements, back channeling, and silence to influence the probability of gaze-to-partner. In the present study, we investigate the influence of these verbal events on the probability of gaze-to-partner in children with hearing impairment. While expecting the verbal event to influence the probability of gaze-to-partner for all participants, hearing impaired or not, we hypothesized that the participants with hearing impairment would consistently exhibit higher probability of gaze-to-partner than the participants with normal hearing, possibly using gaze-to-partner as a compensatory strategy.

Method

Participants

HI–NH dyads. Twenty children and adolescents (for the sake of brevity, henceforth labeled children), seven girls and 13 boys, ranging in age between 9;8 (years;months) and 15;10 (M = 12;4, SD = 1;9), were recruited to form conversational pairs. Out of these, 10 participants (three girls and seven boys, mean age = 12;6 years, SD = 2;0, labeled HI) had documented bilateral mild-to-moderate sensorineural hearing impairment, that is, pure-tone average (PTA) air-conduction hearing thresholds for octave frequencies from 0.5 to 4 kHz (ISO 8253-1, 2010) between 26 and 55 dB HL (Clark, 1981), and had received bilateral hearing aids. In the collected sample, better ear PTA ranged between 20 and 43 dB HL (M = 33.0, SD = 7.8). All impairments were symmetrical (mean difference = 7.1 dB, SD = 6.1). According to medical records, mean age at identification of the hearing impairment was 3;7 years (SD = 1;1) and mean age at amplification was 5;2 years (SD = 2;7). An outlier, with 25 dB HL, identified at 5;0 years and receiving amplification at 10;0 years, was found not to differ from the other participants on the measures obtained in this study and was included in the analyses. All participants with hearing impairment were raised in oral speaking families and were educated in oral settings, exhibited no speech impairments, and were given no formal training in sign language, visually aided communication, or speech reading.
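The four-frequency PTA described above is simply the mean of the air-conduction thresholds at 0.5, 1, 2, and 4 kHz, and the better ear PTA takes the lower of the two ear averages. A minimal sketch, with invented threshold values (not data from the study):

```python
# Sketch: four-frequency pure-tone average (PTA) and better ear PTA.
# Threshold values below are hypothetical, for illustration only.

def pta(thresholds_db_hl):
    """Average air-conduction thresholds (dB HL) at 0.5, 1, 2, and 4 kHz."""
    assert len(thresholds_db_hl) == 4
    return sum(thresholds_db_hl) / 4

def better_ear_pta(left, right):
    """Better ear PTA: the lower (better) of the two per-ear averages."""
    return min(pta(left), pta(right))

# Example: a mild, roughly symmetrical loss (invented values).
left_ear = [30, 35, 35, 40]   # dB HL at 0.5, 1, 2, 4 kHz
right_ear = [35, 40, 40, 45]
print(better_ear_pta(left_ear, right_ear))  # 35.0
```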

The remaining 10 participants, four girls and six boys (mean age = 12;3, SD = 1;7), were normal-hearing same-age peers invited by the participant with hearing impairment to take part in the study as conversational partners. All participants with hearing impairment chose to bring a classmate, thus a partner familiar with their hearing loss, differing maximally 1 year in age. All except three HI–NH dyads consisted of same-sex participants.

NH–NH dyads. Twenty children and adolescents, 10 girls and 10 boys, ranging in age between 10;2 and 15;4 (M = 13;6, SD = 1;11), were recruited to form normal-hearing control dyads. Half of the participants in the control dyads, five girls and five boys (mean age = 13;7, SD = 1;11, labeled NH), composed a control group, matched to the ages of the HI group. The other half, five girls and five boys (mean age = 13;5, SD = 2;0), were classmates invited by their NH peers to participate as conversational partners. All NH–NH dyads consisted of same-sex participants. Group descriptives are summarized in Table 1.

The HI and NH groups did not differ significantly on age, t(18) = 1.281, p = .22, or receptive grammar, t(18) = 1.469, p = .159, as measured by standardized assessment with the Test for Reception of Grammar, Version 2 (TROG-2; Bishop, 2009). All participants had Swedish as their first language and all had nonverbal IQs within normal limits as measured by Raven's Standard Progressive Matrices (Raven, Raven, & Court, 2004). All participants had normal or corrected-to-normal vision, and all normal-hearing participants passed a 20 dB pure-tone hearing screening at 0.5, 1, 2, 4, and 6 kHz before data collection. Ethical approval for the


study was granted by the Regional Ethics Review Board for southern Sweden, approval number 2009/383.

Materials and Procedure

Experimental task. An unscripted referential communication task was used, in which the target children with hearing impairment and normal-hearing controls acted as the listeners, and the conversational partners acted as the speakers. This study reports data on the listener. The task has previously been used in several studies of conversational strategies and interaction in children with language and/or hearing impairment (Ibertsson, Hansson, Mäki-Torkko, Willstedt-Svensson, & Sahlén, 2009; Sandgren, Ibertsson, Andersson, Hansson, & Sahlén, 2011) and in a study of gaze behavior during linguistic problem solving between children with normal hearing (Sandgren et al., 2012). A screen displaying 16 pictures of faces, visible only to the speaker, was placed between the participants. The listener was provided with 24 pictures of faces. The instructions given were for the speaker to describe each picture and its position with enough detail for the listener to be able to identify the correct picture and place it in the correct position. The pictures of faces differed only in details, and the listener was forced to request further information when confronted with an insufficiently detailed description. The analysis focused on the process of task resolution, not the end result, and a ceiling effect was expected.

Equipment and data collection. During the referential communication task, the participants wore identical SMI iView X HED head-mounted video-based pupil and corneal reflex eye-tracking systems, calibrated with a 9-point calibration procedure. The data from each eye-tracking system were merged with the video of a forward-facing camera, creating an output video showing the participant's field of view with a moving cursor indicating gaze position. The video was filmed at 25 frames per second, creating an effective sampling frequency of 25 Hz. The participants were seated approximately 120 cm from each other, separated by the 30-cm tall picture screen. The height of the screen created real-life-like conversational conditions by allowing eye contact and visual cues. The dialogues were video recorded using a fixed digital video camera capturing both participants from a side view. For audio recording, the camera's built-in microphone was used. Recordings were made in a quiet laboratory setting in the Humanities Laboratory at Lund University.

The dialogues were transcribed orthographically by the first author, and transcriptions were exported to ELAN (Wittenburg, Brugman, Russel, Klassmann, & Sloetjes, 2006), an open-source audio and video annotation software, where each listener's speech was categorized into four types of verbal events: requests, nonrequests, back channeling, and silence. Requests included requests for confirmation of new information ("Has she got blue eyes?"), requests for confirmation of old information ("Did you say she had blue eyes?"), and requests for elaboration ("What color are her eyes?"). In a previous study (Sandgren et al., 2011), these types of requests have been found to account for over 90% of requests. Back channeling included verbal signals of comprehension and interest (for example, "Yeah," "Uh-huh," "Mhm"). The remaining speech, including, for example, statements from the listener not directly related to the task resolution, such as "He looks a bit like your dad," was categorized as nonrequests in order to provide a baseline condition for comparison in the analysis of gaze-to-partner during requests. Similarly, periods of silence, often constituting the partner speaking, were categorized to provide a baseline condition in the analysis of gaze-to-partner during back channeling. The fourth author independently coded the verbal events in 25% of the dialogues. The interrater reliability as estimated with Cohen's kappa was .941. Table 2 provides examples and group data on verbal event types.
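Cohen's kappa, as used above for interrater reliability, corrects observed agreement for the agreement expected by chance given each rater's label distribution. A minimal sketch (the event codes below are illustrative, not the study's data):

```python
# Sketch: Cohen's kappa for two raters coding the same verbal events.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    # Chance agreement: product of each rater's marginal label proportions.
    expected = sum(counts_a[c] * counts_b[c] for c in labels) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["request", "request", "silence", "backchannel", "nonrequest"]
b = ["request", "request", "silence", "backchannel", "silence"]
print(round(cohens_kappa(a, b), 3))  # 0.722
```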

Annotation of eye movements was made by the first author using ELAN (Wittenburg et al., 2006). The output videos of the eye-tracking systems were merged and synchronized with the orthographic transcription, creating an annotation file containing all verbal and gaze annotations. Three areas of interest regarding gaze focus were specified: task (the pictures of faces), face (the partner's face), and off (gaze focused elsewhere). All instances of gaze within the specified areas of interest were recorded, providing information on the participants' gaze focus for the duration of the conversation. The second author independently annotated the eye movements in 20% of the data. Reliability was measured using the overlap calculation in ELAN (Wittenburg et al., 2006), with a modification weighting the annotations on their duration in time, allowing annotations of greater

Table 1. Group descriptives.

Study groups and conversation partners (n; sex)   Mean age (SD)   Mean BEPTA (SD)   Mean age at identification (SD)   Mean age at amplification (SD)
Target HI (n = 10; 3 f, 7 m)                      12;6 (2;0)      33.0 (7.8)        3;7 (1;1)                         5;2 (2;7)
HI partner (n = 10; 4 f, 6 m)                     12;3 (1;7)
Control NH (n = 10; 5 f, 5 m)                     13;7 (1;11)
NH partner (n = 10; 5 f, 5 m)                     13;5 (2;0)

Note. BEPTA = better ear pure-tone average; HI = participants with hearing impairment; HI partner = normal-hearing conversational partners of HI; NH = normal-hearing control group; NH partner = normal-hearing conversational partners of NH; f = female participants; m = male participants.


duration to affect the reliability score more than shorter annotations. The interrater reliability was 88.5%.
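The duration-weighted agreement described above can be approximated by sampling both raters' annotation tiers on a fine time grid, so that longer annotations contribute more samples. This is a sketch of one way such a score could be computed, not the authors' modified ELAN procedure; the intervals below are invented:

```python
# Sketch: duration-weighted agreement between two annotation tiers.
# Each tier is a list of (start_ms, end_ms, label) spans.

def weighted_overlap_agreement(annotations_a, annotations_b, step_ms=10):
    """Fraction of sampled time points where both raters assign the
    same label; long annotations weigh more because they cover more
    sample points."""
    def label_at(annotations, t):
        for start, end, label in annotations:
            if start <= t < end:
                return label
        return None

    end_time = max(end for _, end, _ in annotations_a + annotations_b)
    samples = range(0, end_time, step_ms)
    agree = sum(label_at(annotations_a, t) == label_at(annotations_b, t)
                for t in samples)
    return agree / len(samples)

a = [(0, 1000, "face"), (1000, 3000, "task")]   # rater A (invented)
b = [(0, 800, "face"), (800, 3000, "task")]     # rater B (invented)
print(weighted_overlap_agreement(a, b))
```

The two raters disagree only during 800–1000 ms, so the score is 280/300 of the sampled points.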

Data Analysis

Gaze and verbal annotation data were extracted from ELAN for analysis. In all, 2,946 cases of verbal events were identified and used in the analyses. The dependent variable (gaze to the speaker's face) was scored binarily, at 10-ms intervals, over a 3,000-ms time window centered at the onset of the predictor event (the verbal events). Thus, for each case of a verbal event, 300 measurements of the occurrence of gaze-to-partner (1/0) were made, covering the time span between 1,500 ms preceding and 1,500 ms following the verbal event onset. The raw data were plotted in SPSS to provide probability plots of gaze to the partner's face for the different verbal event types. Figure 1 shows a schematic illustration of the verbal events and the gaze analysis window.
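The binary scoring described above can be sketched as follows: for each verbal event, the 3,000-ms window centered at its onset is sampled every 10 ms, and each sample is 1 if it falls inside a gaze-to-partner span. The gaze spans below are hypothetical, not study data:

```python
# Sketch: 10-ms binary scoring of gaze-to-partner in a 3,000-ms window
# centered at verbal event onset (300 samples per event).

def gaze_window(event_onset_ms, gaze_intervals, window_ms=3000, step_ms=10):
    """Return binary samples (1 = gaze-to-partner) across the window.

    gaze_intervals is a list of (start_ms, end_ms) spans during which
    the listener looked at the partner's face.
    """
    half = window_ms // 2
    return [int(any(s <= t < e for s, e in gaze_intervals))
            for t in range(event_onset_ms - half, event_onset_ms + half, step_ms)]

gaze = [(4200, 5100)]              # one gaze-to-partner span (invented), in ms
window = gaze_window(5000, gaze)   # verbal event onset at 5,000 ms
print(len(window), sum(window))    # 300 90
```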

In order to analyze not only if but also when gaze to the speaker's face occurred, data were fitted to a survival function estimating the probability of the target event to occur. The survival function estimates the event time (the time from the beginning of measurements, in this case 1,500 ms preceding the verbal event onset) to the target event, while statistically accommodating the influence of censored cases (verbal events performed without gaze-to-partner within the time window) and displaying the probability of target event occurrence as a cumulative survival. For group comparisons of the probability of gaze-to-partner during the different verbal event types, Kaplan-Meier survival analyses with Mantel-Cox log-rank tests were performed in SPSS. Three analyses were performed, comparing (a) the probability of gaze-to-partner during requests to the probability during a baseline of nonrequests; (b) the probability of gaze-to-partner during back channeling to the probability during a baseline of silence; and (c) the probability of gaze-to-partner during the two main types of requests, that is, requests for confirmation of new information and requests for confirmation of old information. The raw data (883,800 rows) used for the probability plots were aggregated in two steps to (a) display whether the target event (gaze-to-partner) occurred within the 3,000-ms time window and (b) establish the point in time of the target event. The

Table 2. Verbal event types, descriptions, examples, and distribution.

Verbal event type   Description        Example                               n (HI)   n (NH)
Requests            Questions                                                288      254
                                       "Has she got blue eyes?"a             (194)    (182)
                                       "Did you say she had blue eyes?"b     (54)     (57)
                                       "What color are her eyes?"c           (40)     (15)
Nonrequests         Statements         "He looks a bit like your dad."       176      309
Back channeling     Feedback           "Uh-huh." "Mhm."                      269      165
Silence             Partner speaking                                         745      740
Total                                                                        1,478    1,468

Note. n shows number of verbal events of each type.
a Request for confirmation of new information. b Request for confirmation of old information. c Request for elaboration.

Figure 1. Schematic illustration of verbal and gaze data. a Request for confirmation of new information. b Request for confirmation of old information. c Request for elaboration. d Nonrequests. e Back channeling. f Silence. Gaze analysis window showing 3,000-ms time frame for study of listeners' gaze-to-partner, centered at verbal event onset.


Kaplan-Meier survival analysis eliminates cases as they experience the target event, expressed graphically as a declining slope ultimately showing the "survivors," or censored cases (that is, the cases not experiencing the target event). Quantification of group differences in survival time was expressed as odds ratios, obtained by dividing the ratio of cases experiencing the target event in the two groups by the ratio of censored cases from both groups. Odds ratios with confidence intervals not encompassing 1 indicate significant differences between the groups.
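The odds ratio computation described above, together with a Wald 95% confidence interval on the log-odds scale, can be sketched as follows. The 2 × 2 counts are hypothetical, not the study's data, and the CI formula is the standard large-sample approximation rather than the authors' SPSS output:

```python
# Sketch: odds ratio (HI vs. NH) for experiencing gaze-to-partner,
# with a Wald 95% CI. A CI excluding 1 suggests a group difference.
import math

def odds_ratio_ci(events_hi, censored_hi, events_nh, censored_nh):
    """OR = (events_hi/censored_hi) / (events_nh/censored_nh), with a
    95% CI computed on the log scale from the standard error of log(OR)."""
    or_ = (events_hi / censored_hi) / (events_nh / censored_nh)
    se = math.sqrt(1 / events_hi + 1 / censored_hi +
                   1 / events_nh + 1 / censored_nh)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lower, upper)

# Hypothetical counts: verbal events with vs. without gaze-to-partner.
or_, (lo, hi) = odds_ratio_ci(200, 88, 150, 104)
print(round(or_, 2))  # 1.58
```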

ResultsRaw Data Examination

In an initial analysis, raw data were examined to explore the probability of gaze-to-partner during the various verbal events. Figure 2 provides an example of the raw data, showing probability plots for gaze-to-partner during requests (including requests for confirmation of new information, requests for confirmation of old information, and requests for elaboration) compared with a nonrequest baseline in a 3,000-ms analysis window centered at the request/nonrequest onset. Raw data exploration indicates a higher probability of gaze-to-partner after request onset for HI than for NH.

Analysis of the corresponding raw data graphs generated for gaze-to-partner during back channeling compared with a baseline of silence indicates that back channeling onset entailed a decreased probability of gaze-to-partner, still, however, with higher probability in the HI group.

While indicating a higher overall probability of gaze-to-partner for participants with HI than for participants with NH, raw data graphs displaying gaze-to-partner during the two types of requests for confirmation (of new and old information, respectively) do not demonstrate clear differences between request types.

Survival Function

Data were fitted to a Kaplan-Meier survival analysis estimating the event time from beginning of measurements to occurrence of gaze-to-partner for the verbal events of the participant groups and displaying the estimate as a cumulative survival. Figure 3 presents survival estimates for request and nonrequest, with data labels displaying censored data (that is, verbal events produced without gaze-to-partner). Mantel-Cox log-rank tests of equality of survival distributions revealed significantly lower survival rates for the HI group for requests (χ²(1, N = 542) = 4.826, p = .028) as well as for nonrequests (χ²(1, N = 485) = 6.354, p = .012).
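A Mantel-Cox log-rank statistic of the kind reported here compares, at each observed event time, the number of events in one group with the number expected under equal survival. The following is an illustrative pure-Python sketch of that computation (not the SPSS routine used in the study):

```python
def logrank(times1, observed1, times2, observed2):
    """Mantel-Cox log-rank chi-square (1 df) for two groups.

    times/observed per group: event time for each case, and 1 if the
    target event occurred (0 if censored at that time).
    """
    data = [(t, o, 1) for t, o in zip(times1, observed1)] + \
           [(t, o, 2) for t, o in zip(times2, observed2)]
    event_times = sorted({t for t, o, _ in data if o == 1})
    o_minus_e = 0.0   # sum of (observed - expected) events in group 1
    variance = 0.0
    for t in event_times:
        # risk sets: cases still under observation at time t
        n1 = sum(1 for tt, _, g in data if g == 1 and tt >= t)
        n2 = sum(1 for tt, _, g in data if g == 2 and tt >= t)
        d1 = sum(1 for tt, o, g in data if g == 1 and tt == t and o == 1)
        d2 = sum(1 for tt, o, g in data if g == 2 and tt == t and o == 1)
        n, d = n1 + n2, d1 + d2
        o_minus_e += d1 - d * n1 / n          # expected under equal hazards
        if n > 1:
            variance += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / variance
```

Two groups with identical event times yield a statistic of 0; the more one group's events cluster earlier than expected, the larger the chi-square.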

Survival estimates for back channeling and a baseline of silence are presented in Figure 4. Again, Mantel-Cox log-rank statistics revealed significantly lower survival rates for participants with HI than for participants with NH, regarding both back channeling (χ²(1, N = 434) = 11.801, p = .001) and silence (χ²(1, N = 1485) = 26.881, p < .001).

Figure 2. Mean probability of gaze-to-partner during requests (solid line) and nonrequests (dotted line) for participants with NH (upper panel, n = 10) and participants with HI (lower panel, n = 10). The horizontal axis shows time, with 0 ms marking the onset of the verbal event.

Figure 3. Kaplan-Meier estimates of survival probabilities during requests (left panel) and nonrequests (right panel) for participants with HI (solid line) and participants with NH (dotted line). The horizontal axis shows time, with 0 ms marking the onset of the verbal event. Data labels present censored data.

Figure 4. Kaplan-Meier estimates of survival probabilities during back channeling (left panel) and silence (right panel) for participants with HI (solid line) and participants with NH (dotted line). The horizontal axis shows time, with 0 ms marking the onset of the verbal event. Data labels present censored data.

Mantel-Cox log-rank tests of equality of survival distributions for the two types of requests for confirmation (that is, of new and old information) revealed no significant group differences for either request type (requests to confirm new information: χ²(1, N = 376) = 0.879, p = .348; requests to confirm old information: χ²(1, N = 111) = 2.024, p = .155).

Odds Ratios

Odds ratios were estimated by dividing the ratio of cases experiencing the event by the ratio of censored cases. Table 3 summarizes the data used for the analyses, with significant group differences in odds ratios and survival distributions highlighted. A higher probability of gaze-to-partner in association with requests, nonrequests, back channeling, and silence is shown for the participants with hearing impairment, as indicated by significant differences in Mantel-Cox log-rank tests of equality of survival distributions and significant differences in odds ratios, compared with the participants with normal hearing.

Summary of Results

To summarize the results, an increased probability of concurrent gaze-to-partner during verbal events was shown for participants with hearing impairment compared with participants with normal hearing. Kaplan-Meier survival functions showed significantly reduced survival rates for participants with HI, reflecting a higher propensity to gaze in association with requests, nonrequests, back channeling, and silence. Odds ratios express this finding by showing an increase in the odds to look at the partner of 1.5 to 2.1 for participants with HI during the same verbal events. While displaying similar overall patterns, nonsignificant survival distributions and odds ratios were found for the different types of requests for confirmation.

Discussion and Conclusions

The results presented in this study support the hypothesis that participants with hearing impairment do, indeed, gaze more to their conversational partner during verbal events than do normal-hearing peers (matched in age, nonverbal reasoning, and receptive language skills). The findings, gathered from a task posing demands similar to many school tasks, provide evidence that school-age children and adolescents with HI display significantly lower survival distributions and increased odds for gaze-to-partner when asking questions, when making statements, when providing the speaker with back-channeling responses, and during silence. This discussion proposes directions for the next steps of research, using these basic gaze data to delve into the reasons behind the increased use of gaze.

First, task-dependent characteristics may have influenced the gaze behavior. In this referential communication task, the participants were seated face-to-face in a laboratory setting, with competing auditory and visual stimuli kept to a minimum. This may have made the participants with HI more prone to gaze at the speaker than would otherwise have been the case. It is, however, also easy to envision how the laboratory setting could have resulted in the opposite behavior, that is, that the favorable acoustic conditions would have made gaze-to-partner less necessary. Adequate examination of the influence of the task on the participants' gaze behavior requires the experiment to be replicated in a variety of settings, the most naturalistic being a classroom environment. Future studies should also evaluate the effect of the conversational partner, comparing gaze with known and unknown partners. Furthermore, the referential communication task used in this study should be scrutinized and compared with other tasks encouraging more gaze exchanges between the participants. However, gaze-to-partner during a task requiring visual attention to be directed away from the partner, performed with a friend with whom the child functions well, strengthens, rather than weakens, the argument of gaze to the conversational partner serving an important communicative role. Whereas the present investigation provides necessary groundwork on the gaze behavior in a semistructured task without restrictions on verbal productions, future studies should also use scripted utterances of predetermined duration. By expanding the gaze analysis window, this would provide an experimental control condition to more naturalistic tasks, adding details on the probability of gaze-to-partner over the course of an entire utterance.

Table 3. Data and result summary.

Verbal event                                   Group  Cases with eventa  Censored casesb  Odds ratio [95% CI]  χ²      Log-rank p
Request*                                       HI     136                152              1.5 [1.1, 2.1]        4.826  .028
                                               NH      95                159
Nonrequest*                                    HI      71                105              1.7 [1.1, 2.5]        6.354  .012
                                               NH      89                220
Back channeling*                               HI     107                162              2.1 [1.4, 3.3]       11.801  .001
                                               NH      39                126
Silence*                                       HI     342                403              1.7 [1.4, 2.2]       26.881  < .001
                                               NH     242                498
Requests for confirmation of new information   HI      88                106              1.2 [0.8, 1.9]        0.879  .348
                                               NH      73                109
Requests for confirmation of old information   HI      23                 31              1.7 [0.8, 3.8]        2.024  .155
                                               NH      17                 40

Note. Significant group differences are marked with an asterisk. aNumber of cases experiencing gaze-to-partner. bNumber of cases not experiencing gaze-to-partner. The χ² and p columns give the Mantel-Cox log-rank test of group difference in survival distribution between the HI and NH groups.

Second, the higher probability of gaze-to-partner in participants with HI could be interpreted as a way of compensating for the degraded auditory signal. This should be further explored in different ways. Systematic variation of the participants' access to visual cues would provide information on the possible benefit of visual cues. Previous studies have shown improved speech perception for audiovisual speech compared with speech presented through only one modality in participants with hearing impairment (see, e.g., Woodhouse et al., 2009), a finding deeply rooted in common knowledge and practice. Future studies should further investigate individual characteristics of the participants as a factor influencing gaze-to-partner. If future studies confirm a compensatory role of gaze-to-partner, attempts must be made to tease apart underlying causes that may interact with the hearing impairment, such as cognitive and linguistic ability. Participants from other age groups, and other degrees and etiologies of hearing impairment, should be studied. Although inflated by an outlier who received auditory amplification unusually late, the participants in this study received hearing aids at a mean age of approximately 5 years. Today, newborn infants in Sweden undergo otoacoustic emission screening for hearing impairment, enabling earlier identification and amplification. Thus, an investigation of whether earlier amplification reduces the need for gaze-to-partner is warranted, as is a replication of the study in participants with more severe, and also unilateral, hearing impairments. Furthermore, a possible increased cognitive load on individuals with hearing impairment could be investigated using measures of gaze aversion.

Without ruling out the possibility of the intraindividual benefits described above, a third alternative, encompassing interindividual benefits of gaze-to-partner during conversation, should be investigated. Instead of just a monitoring role for gaze, the combination of gaze and speech can be used to adjust the communicative content to reach conversational objectives. This is similar to how, for example, the number and length of speaking turns and the use of requests can be used by individuals with hearing impairment to control conversation (Caissie, Dawe, Donovan, Brooks, & MacDonald, 1998). Skelt (2006) described how participants with hearing impairment, by maintaining or withholding gaze, exert control over the turn exchanges. The conversational partners in her study adhered to the gaze cue, allowing it to "overrule" syntactic and prosodic cues for turn exchange. In our data, the overall similarities in gaze behavior between the participants with and without hearing impairment can be seen as an indication of gaze serving as a more generally applied turn-regulating mechanism, similar for both groups. The group differences in the probability of gaze-to-partner could, therefore, express the additional need for visual cues imposed by the hearing impairment.

If future studies show gaze-to-partner to compensate for auditory deficits and play a role in turn regulation, it is evident that gaze cues, and the opportunity to use them, must be considered vital for the ability of individuals with hearing impairment to participate in interaction on equal terms with their normal-hearing peers. Clinical and educational implications could include increased awareness of the use of gaze during conversation in the child with hearing impairment, family members, teachers, and friends, as well as classroom modifications optimizing both visual and auditory aspects of communication. With well-informed interlocutors and well-adapted surroundings, individuals with hearing impairment are more likely to be able to show their full potential, using both verbal and gaze cues to participate fully in the interaction.

Acknowledgments

We gratefully acknowledge the support of the Linnaeus Centre Thinking in Time: Cognition, Communication and Learning, financed by the Swedish Research Council (Grant 349-2007-8695). We would also like to thank Jonas Brännström for valuable comments in the preparation of the manuscript and express our sincerest gratitude to all participants.

References

Bavelas, J. B., Coates, L., & Johnson, T. (2002). Listener responses as a collaborative process: The role of gaze. Journal of Communication, 52, 566–580.

Bergeson, T. R., Pisoni, D. B., & Davis, R. A. (2003). A longitudinal study of audiovisual speech perception by children with hearing loss who have cochlear implants. The Volta Review, 103, 347–370.

Bishop, D. V. M. (2009). Test for Reception of Grammar, Version 2 (M. Garsell, Trans.). Stockholm, Sweden: Pearson.

Blamey, P. J., Sarant, J. Z., Paatsch, L. E., Barry, J. G., Bow, C. P., Wales, R. J., . . . Toocher, R. (2001). Relationships among speech perception, production, language, hearing loss, and age in children with impaired hearing. Journal of Speech, Language, and Hearing Research, 44, 264–285.

Caissie, R., Dawe, A. L., Donovan, C., Brooks, H., & MacDonald, S. M. (1998). Conversational performance of adults with a hearing loss. Journal of the Academy of Rehabilitative Audiology, 31, 45–67.

Clark, J. G. (1981). Uses and abuses of hearing loss classification. ASHA, 23, 493–500.

Corden, B., Chilvers, R., & Skuse, D. (2008). Avoidance of emotionally arousing stimuli predicts social-perceptual impairment in Asperger's syndrome. Neuropsychologia, 46, 137–147.

Cummins, F. (2012). Gaze and blinking in dyadic conversation: A study in coordinated behaviour among individuals. Language and Cognitive Processes, 27, 1525–1549.

Doherty-Sneddon, G., & Phelps, F. (2005). Gaze aversion: A response to cognitive or social difficulty? Memory & Cognition, 33, 727–733.

Doherty-Sneddon, G., Riby, D. M., & Whittle, L. (2012). Gaze aversion as a cognitive load management strategy in autism spectrum disorder and Williams syndrome. Journal of Child Psychology and Psychiatry, 53, 420–430.

Duchan, J. F. (1988). Assessing communication of hearing-impaired children: Influences from pragmatics. Journal of the Academy of Rehabilitative Audiology, 21(Mono. Suppl.), 19–40.

Garcia, J. M., & Dagenais, P. A. (1998). Dysarthric sentence intelligibility: Contribution of iconic gestures and message predictiveness. Journal of Speech, Language, and Hearing Research, 41, 1282–1293.

Gilbertson, M., & Kamhi, A. G. (1995). Novel word learning in children with hearing impairment. Journal of Speech and Hearing Research, 38, 630–642.

Glenberg, A., Schroeder, J., & Robertson, D. (1998). Averting the gaze disengages the environment and facilitates remembering. Memory & Cognition, 26, 651–658.

Hansson, K., Forsberg, J., Löfqvist, A., Mäki-Torkko, E., & Sahlén, B. (2004). Working memory and novel word learning in children with hearing impairment and children with specific language impairment. International Journal of Language & Communication Disorders, 39, 401–422.

Hyde, M., & Power, D. (2003). Characteristics of deaf and hard-of-hearing students in Australian regular schools: Hearing level comparisons. Deafness and Education International, 5, 133–143.

Ibertsson, T., Hansson, K., Mäki-Torkko, E., Willstedt-Svensson, U., & Sahlén, B. (2009). Deaf teenagers with cochlear implants in conversation with hearing peers. International Journal of Language & Communication Disorders, 44, 319–337.

International Organization for Standardization. (2010). ISO 8253-1. Acoustics—Audiometric test methods—Part 1: Pure-tone air and bone conduction audiometry. Geneva, Switzerland: Author.

Kendon, A. (1967). Some functions of gaze-direction in social interaction. Acta Psychologica, 26, 22–63.

Marschark, M., Convertino, C. M., Macias, G., Monikowski, C. M., Sapere, P., & Seewagen, R. (2007). Understanding communication among deaf students who sign and speak: A trivial pursuit? American Annals of the Deaf, 152, 415–424.

Mirenda, P. L., Donnellan, A. M., & Yoder, D. E. (1983). Gaze behavior: A new look at an old problem. Journal of Autism and Developmental Disorders, 13, 397–409.

Most, T., Rothem, H., & Luntz, M. (2009). Auditory, visual, and auditory-visual speech perception by individuals with cochlear implants versus individuals with hearing aids. American Annals of the Deaf, 154, 284–292.

Nadig, A., Lee, I., Singh, L., Bosshart, K., & Ozonoff, S. (2010). How does the topic of conversation affect verbal exchange and eye gaze? A comparison between typical development and high-functioning autism. Neuropsychologia, 48, 2730–2739.

Nadig, A. S., & Sedivy, J. C. (2002). Evidence of perspective-taking constraints in children's on-line reference resolution. Psychological Science, 13, 329–336.

Nilsen, E. S., & Graham, S. A. (2009). The relations between children's communicative perspective-taking and executive functioning. Cognitive Psychology, 58, 220–249.

Norbury, C. F., Brock, J., Cragg, L., Einav, S., Griffiths, H., & Nation, K. (2009). Eye-movement patterns are associated with communicative competence in autistic spectrum disorders. Journal of Child Psychology and Psychiatry, 50, 834–842.

Raven, J., Raven, J. C., & Court, J. H. (2004). Manual for Raven's Progressive Matrices and Vocabulary Scales. Section 3: Standard Progressive Matrices: 2000 edition, updated 2004. San Antonio, TX: Pearson.

Sandgren, O., Andersson, R., van de Weijer, J., Hansson, K., & Sahlén, B. (2012). Timing of gazes in child dialogues: A time-course analysis of requests and back channelling in referential communication. International Journal of Language & Communication Disorders, 47, 373–383.

Sandgren, O., Ibertsson, T., Andersson, R., Hansson, K., & Sahlén, B. (2011). "You sometimes get more than you ask for": Responses in referential communication between children and adolescents with cochlear implant and hearing peers. International Journal of Language & Communication Disorders, 46, 375–385.

Skelt, L. (2006). See what I mean: Hearing loss, gaze and repair in conversation (Unpublished doctoral dissertation). Australian National University, Canberra.

Stinson, M., & Antia, S. (1999). Considerations in educating deaf and hard-of-hearing students in inclusive settings. Journal of Deaf Studies and Deaf Education, 4, 163–175.

Toe, D. M., & Paatsch, L. E. (2010). The communication skills used by deaf children and their hearing peers in a question-and-answer game context. Journal of Deaf Studies and Deaf Education, 15, 228–241.

Turkstra, L. S. (2005). Looking while listening and speaking: Eye-to-face gaze in adolescents with and without traumatic brain injury. Journal of Speech, Language, and Hearing Research, 48, 1429–1441.

Turkstra, L. S., Ciccia, A., & Seaton, C. (2003). Interactive behaviors in adolescent conversation dyads. Language, Speech, and Hearing Services in Schools, 34, 117–127.

Willemsen-Swinkels, S. H. N., Buitelaar, J. K., Weijnen, F. G., & van Engeland, H. (1998). Timing of social gaze behavior in children with a pervasive developmental disorder. Journal of Autism and Developmental Disorders, 28, 199–210.

Wittenburg, P., Brugman, H., Russel, A., Klassmann, A., & Sloetjes, H. (2006, May). ELAN: A professional framework for multimodality research. Paper presented at the Fifth International Conference on Language Resources and Evaluation, Genoa, Italy.

Woodhouse, L., Hickson, L., & Dodd, B. (2009). Review of visual speech perception by hearing and hearing-impaired people: Clinical implications. International Journal of Language & Communication Disorders, 44, 253–270.

Yoshinaga-Itano, C., & Sedey, A. L. (1998). Language of early- and later-identified children with hearing loss. Pediatrics, 102, 1161–1171.
