
Pergamon Computers in Human Behavior, Vol. 11, No. 2, pp. 223-239, 1995

Copyright © 1995 Elsevier Science Ltd Printed in the USA. All rights reserved

0747-5632/95 $9.50 + .00

0747-5632(94)00033-6

Using Computer and Video Technologies to Develop Interpersonal Skills

J. Olin Campbell, Cheryl A. Lison, Terry K. Borsook, Jay A. Hoover, and Patricia H. Arnold

Peabody College of Vanderbilt University

Abstract -- Two studies investigated ways in which computer and video technology can support expert human coaches in order to reduce instructor time and potentially increase access to powerful learning environments. The content goal was to train undergraduate students to facilitate others' interpersonal problem solving. In Experiment 1 the standard classroom treatment used instructor lecture and guided discussion, instructor modeling of skills, and role play, with the instructor present the full time. The combined instructor/computer/video treatment substituted computer instruction for lecture and video for instructor modeling. The dependent variable was performance in a role play. Both treatments required 9 hr of subject time, but the combined treatment reduced instructor time from 9 to 4.5 hr. The combined treatment yielded comparable subject outcomes: F(1, 25) = 1.21, p = NS. In Experiment 2, the classroom treatment remained essentially the same. In the optimized treatment subjects scored videotaped examples and applied the scoring technique in teams to their own videotaped role-play performance. Total subject time in both treatments was 6 hr. Instructor time was 6 hr in the classroom treatment and 2 hr in the combined treatment. The optimized treatment yielded significantly better performance: F(1, 23) = 20.27, p < .001. Independent rating of the role plays by three counseling students (who were blind to the experimental treatments and to the subjects assigned to each treatment) indicated a similar result, with t(24) = 2.67, p < .05. We conclude that computer- and video-supported methods have the potential to decrease instructor time and increase learner performance, even for complex interpersonal problem solving skills.

Requests for reprints should be addressed to J. Olin Campbell, Peabody College of Vanderbilt University, Peabody Box 321, Nashville, TN 37203.


Many jobs today require team problem solving and enhanced interpersonal skills. These skills require performance-based learning and assessment that is situated in the types of experiences for which the knowledge and skills will be used (e.g., Brown, Collins, & Duguid, 1989; Cognition and Technology Group at Vanderbilt, 1990). Training in this area is time consuming, however, and the availability of skilled instructors to provide coaching and assessment is limited.

Instructors are usually constrained by high student-to-faculty ratios in the types of modeling, mentoring, and feedback they can provide. When faced with teaching complex skills to many people using scarce resources, instructors often must lecture to "cover the content." Assessing and developing skilled performance, as opposed to knowledge about the skill, however, usually means giving up the economy of scale that a lecture strategy provides. If the lecture is supplemented by having students mentor each other without feedback from an expert, they can rapidly propagate errors.

Learner-centered environments tend to support performance development rather than rote recall, but the low ratios they require are expensive. Without some assistance, students in poorer educational systems are likely to be denied access to the advantages of a learner-centered environment. This set of studies investigates ways in which computer and video technology can support expert human coaches, potentially reducing instructor time and increasing access to powerful learning environments. More specifically, it investigates the efficiency of technology for developing interpersonal skills.

There are now many meta-analyses that are interpreted as indicating the superiority of computer-based instruction and interactive video. These include Orlansky and String (1977), Fletcher (1990), and Kulik and Kulik (1987). A typical reported finding is a savings of 20%-30% in learning time. Few studies, however, have focused specifically on interpersonal skills and problem solving. A review by Cronin and Cronin (1992) discusses empirical studies of interactive video in soft skill areas. They conclude that while there appear to be significant instructional benefits, the absence of a comprehensive theory to explain the results and the methodological weaknesses of some studies mandate a conservative interpretation.

Because interpersonal skills are difficult to characterize or analyze, conducting research, evaluating performance, and providing feedback to learners are all difficult. In addition, there are many types of problem solving interactions. To focus this research problem, we started with a relatively simple two-person interaction, where one person is helping another clarify and resolve an interpersonal conflict.

In these studies we contrast three technologies or delivery system models:

1. Classroom: Instructor-directed activities include lecture and guided discussion, instructor modeling of skills, and role play. The instructor is required at all times. The individual relates to the learning system as a learner who observes, takes notes, and engages in limited interactions. This approach tends to be knowledge oriented and assessed by paper-and-pencil tests.

2. Combined: This treatment, as used in Experiment 1, combines computer instruction (used by a pair of subjects at each station, who practice together as prompted by the system) with instructor-guided discussion, video modeling, and instructor-monitored role play. The instructor is not required full time because a lab proctor, who is not trained in the content area, can facilitate use of the computer and video learning environment. The individual relates to the learning system as a partner and a user. This approach tends to be skill oriented and assessed by performance-based simulations.

3. Optimized: This treatment, as used in Experiment 2, adds a videotaped role play to the combined treatment. Working with others in small teams, subjects learn to use a rating guide to assess their own skill practice and to provide feedback to each other. The individual relates to the learning system as a team member and a user.

BACKGROUND

One of the most important aspects of our approach is to make skilled performance visible to the learner by modeling and critiquing examples of the skills. To do so, we have adopted principle-based learning strategies derived from Merrill (1987) and Clark (1989), whose work extends the instructional design research of Gagné (1985). We classified learning objectives by type (e.g., fact, concept, procedure, rule, principle, or process) and use (e.g., remember vs. apply). Then we matched each type/use with an appropriate instructional strategy and designed a means to provide evaluative feedback on performance.

In our studies, the primary objective type was the application of principles in an interpersonal communication model. Learners were expected to use (i.e., apply) the skill with another person. They had to adapt to different situations based on guidelines (principles). This is the only type of behavior assessed in the pre- and posttests. Subjects in the assessments were not asked to recall knowledge about interpersonal skills, nor could they use a rote procedure. Some supporting skills to help learners attend to another person are procedural (e.g., "sit facing the other person, 3 to 4 feet away, shoulders squared with theirs, leaning forward and looking directly at them"). Other supporting objectives involve concept application, such as recognizing and classifying good and poor examples of using the skills. For example, in Experiment 2 we asked subjects to rate the performance of video models and of each other. A few supporting skills required remembering facts (e.g., the sequence of attending, responding, personalizing, and initiating), but these were practiced only as part of applying the principles.

Research in using technology in the context of interpersonal skills has been limited. Schroeder, Dyer, Czerny, Youngling, and Gillotti (1986) have reported on an extensive simulation designed to train Army second lieutenants in leadership skills. Videodisc interpersonal skills training and assessment (VISTA) used three treatments: (a) instructor-acted role plays, (b) videodisc-simulated role plays with feedback and discussion of the user's selected actions, and (c) programmed text.

Training included 50 min of exposure to the major treatment, then testing. One dependent measure was the written Leadership Principles Test, which asked subjects to list (a) principles for dealing with people, (b) principles for performing their duties, and (c) any Army information they had learned or been reminded of during the treatment. The other dependent measure was user acceptance as measured by a subjective preference questionnaire. The videodisc treatment yielded significantly higher scores on the test, while role play was significantly preferred on the questionnaire. Both role play and videodisc were significantly preferred over the programmed text treatment. These results suggest that a combination of role plays with feedback on video, and instructor-acted role playing, should be considered as components for developing leadership skills.


Alpert's (1986) study on development of reflective response skills for novice counselors provides an additional indication of the value of even text-oriented computer-delivered simulations. The simulation was designed to teach reflective response discrimination. Subjects were undergraduate students. Treatment conditions were: (a) instructor-led discussion and role plays and (b) a text-based simulation delivered by computer, in which the subject read a client statement, then selected the most appropriate response and received a short discussion of the relative merit of that response.

The dependent variable was performance on a written test, designed as an analog measure of empathic response. Alpert (1986) found significant improvement by the computer simulation group but not the instructor-led discussion and role-play group. When the treatments were reversed for the same groups of subjects, the result was the same: the group that received the instructor-led discussion and role plays did not show further significant improvement, while the group that received the computer simulation did improve significantly. These results suggest an increase in empathic skills following computer-delivered simulations.

Limitations of Prior Research

In reviewing the research that has investigated the value of technology to facilitate the learning and use of interpersonal skills, we have become keenly aware of the limited investigation of user performance outcomes. In both the Alpert (1986) and Schroeder et al. (1986) studies, the major dependent variable was the score on a written test. A performance test (such as role play) might have been a more appropriate measure, since the actual skill is used in a face-to-face conversation where physical attending and response to nonverbal cues are important. The authors cited time limitations, the problem of obtaining qualified raters, and the general unreliability of such measures as reasons for not using performance tests. Other problems included using different instructors for each treatment, not using independent raters, and the possibility that the confederate in a role play assessment might differentially facilitate one treatment group over another. Other explanations can be offered for the results found in these studies and in the meta-analyses noted above. For example, new systems have a novelty effect on learners, or differences may occur when the instructor is not the same across treatments.

Confusing Media and Methods

R. E. Clark (1991) notes that media and method are often confused in reports of instructional research, so performance gains or time savings are attributed to a medium (such as computer-based instruction) rather than to the method that was delivered via that medium (e.g., individual tutorial vs. group lecture). At the annual meeting of the American Educational Research Association in 1993, Clark extended this position with reference to factors of cost and feasibility. Specifically, the cost of one medium versus another may favor different methods of instruction. We concur. Tutorials may be impractical for a large class when provided by an instructor, but may become feasible when delivered by computer. By this reasoning, studies that attribute gains to one delivery medium over another may not be inappropriate when economic feasibility is factored in. That is, it may not be feasible to compare the methods of lecture and individual tutoring for a large class without considering cost. Our investigations consider cost (e.g., instructor time) to be part of the data that must be included in a theory of instruction, along with models of the learner, the target system, and the expert (Campbell, 1992).

Issues Addressed

In their meta-analysis of interpersonal skills training using interactive video, Cronin and Cronin (1992) note a number of methodological weaknesses in prior studies in this area. We have attempted to address some of these methodological concerns and to: (a) focus on methods and media in the context of cost and benefits, (b) investigate interpersonal skills needed in teams, (c) measure actual skill performance instead of knowledge or attitude about the skill, (d) assess the extent to which a confederate in a role-play performance assessment may differentially facilitate one treatment group over another, and (e) include assessment by raters independent of the study and blind to treatment groups.

OVERVIEW OF THE EXPERIMENTS

We conducted two experiments, using the first as a pilot of the instruction methods, the assessment procedures, and the research methods. Following the pilot we redesigned each component. One of the goals of the first experiment was to stay as close as possible to the methods and materials used in the standard classroom treatment. In the second experiment we made the assessment more demanding, decreased the learning time, and added independent raters. In the optimized treatment we also decreased the proportion of instructor time and optimized the instructional methods.

In both experiments the treatments were two versions of an interpersonal skills training module designed to help students improve their problem solving and communication skills for interpersonal problems. Subjects in each treatment group were given a role-play pretest that was videotaped for scoring. After each subject received the assigned treatment, a test was administered that repeated the role play but used a different problem.

Content Model

Our content model is based on the Carkhuff (1987) method for developing helping skills. This method, which combines insight and action approaches, emerged from two decades of research and application in counseling and teaching.

The method enables a user to recognize, learn, and practice each skill as a phase for consulting with or guiding another person to problem solution. The person providing the help first attends: looking directly at the other person, leaning forward, and maintaining eye contact. Attending is maintained throughout the interaction. Responding refers specifically to reflecting back both the feeling and the content of what the person with the problem said. The next phase is personalizing, in which the helping person assists the person with the problem to take ownership of it. The final phase, initiating, entails agreeing on some concrete action that would solve the problem.

For novices, it is important to distinguish and to work through the phases sequentially while helping another person. An expert's fluency allows for a less recognizably structured sequence of phases, however. It seems simply that an expert is "helpful." The more expert the level of skill, the more transparent it appears.


In interpersonal skills, expert behavior is often paradoxical: most naive observers expect that a person who is helping another will make suggestions, offer a critique of the other's performance, and generally take control of the situation. While this might be an appropriate response where the goal is information transfer, it is less appropriate where the goal is helping others gain insight into a situation and take ownership of a problem themselves. Since it is often difficult for novices to understand or even recognize expert performance in interpersonal skills, an important factor in designing a learning system is to find a way to make expertise visible.

These studies investigate a two-person interaction: a simple team. One person (termed the helpee) has a problem and must own and solve that problem. The other person, a helper who is not directly part of the problem or solution, will consult by attending, responding, and assisting the person with the problem to generate a solution for it. The helper facilitates insight, ownership, idea generation, and solution planning without taking over the problem. We chose problems for the assessments and practice that were nominated by students as representing common interpersonal issues they were likely to encounter at school, with their families, or at work. Students also created their own problems in role play practice sessions.

Research Hypothesis

Given the qualified positive results for use of interactive video for soft skills instruction, as noted by Cronin and Cronin (1992), in Experiment 1 we hypothesized that, given equivalent learner time and half the instructor time, the combined system would yield user outcomes equivalent to the classroom system. We felt we could accomplish this by substituting computer and video learning methods for some of the instructor's lectures and modeling. We did not expect the combined treatment to yield better results than the classroom treatment, primarily because we were cutting the instructor time in half. In addition, we intended to provide tighter experimental controls of subject assignment, rater bias, and instructor differences than in some prior studies that found positive results.

EXPERIMENT 1

Method

Subjects. Subjects were students enrolled in an undergraduate course on organizational development. Results are reported for N = 29, with 16 subjects in the classroom condition and 13 subjects in the combined condition. One subject in the combined condition withdrew from the course after the pretest.

The course traditionally has included a module on interpersonal teamwork skills. Subjects took part in the experiment as a regular part of the course and were not paid. The experimental module was not included in the subjects' grades, and all subjects were so informed. In addition, all subjects were given the opportunity for individual coaching and for access to the computer software following the experiment.

Materials. We used two problems: (a) the team leader must deal with uncooperative team members and (b) team members must deal with a domineering leader. Subjects who were given problem (a) as their pretest were given problem (b) as their posttest and vice versa. The two problems were counterbalanced, so half of each treatment group received (a) first and the other half received (b).

Procedure. Ives (1990) described an instructional model used to develop videodisc-delivered interpersonal skills training. R. C. Clark (1989) similarly described a model for teaching principle-based skills. We adapted the models for this study as described below.

The classroom treatment was primarily instructor-directed and included lecture and discussion, instructor modeling of the skills, and group activity to practice the skills with the instructor's coaching. Subjects were able to experience an expert responding to them as they discussed their feelings and problems. The instructor was assisted by a graduate student who is an expert on the Carkhuff (1987) model, and for the purpose of this study is thus viewed as a coinstructor. The instructors were involved in the design and delivery of both treatments, although they were not both present in every classroom session.

The combined treatment used computer-based tutorials that included two-person role play exercises with evaluative feedback. Subjects also responded to a statement from a character presented on the computer by selecting from possible responses. In addition, the combined treatment included video to present concepts and to model skills. The video presented an interaction in which use of each skill (i.e., attending, responding, personalizing, and initiating) was labeled with a subtitle to facilitate learner recognition and understanding. The subjects used a rating guide to assess each other's use of the interpersonal problem solving skills while using the computer. The instructor introduced the learning unit to both groups at the same time. The graduate assistant functioned as the instructor for the combined group and provided a limited amount of small-group coaching and feedback at the end of the instruction period. For Experiment 1 we used a sequential approach where subjects learned and practiced one skill before moving to the next.

Subjects were randomly assigned to the classroom or to the combined treatment. Both treatments required 9 hr of contact time, but the combined group received only 4.5 hr of instructor time, with the other portion spent in the computer lab (where proctors were available). Both groups had the same instructor team. Table 1 presents the Experiment 1 contents and treatments by session.

The dependent variable was performance in a two-person role play where a confederate portrayed an individual with a personal problem, while the subject helped the person find a solution. The interaction was videotaped and judged by two trained raters using an observation checklist. The subjects' performance was rated and compared with their performance in a similar role play at the end of the interpersonal skills module.

Before beginning the session, subjects were told that they would be asked to help a "friend," someone in the same course whom they had known for 2 years. Thus the person in the role play did not require a formal introduction, and there was assumed trust between the participants. Subjects were told that there would only be a few minutes for the conversation because the scene was a classroom 5 min before the start of the class.

The role playing sessions lasted between 4 and 5 min. Side-by-side chairs for the participants were separated by a small table "decorated" with a house plant that served as a screen for the microphones. At 4 min the video camera operator cued the confederate, who then brought the session to a conclusion by the end of 5 min.


Table 1
Experiment 1: Content and Treatments by Session

Pretest
  Classroom: Assessment 1: role play
  Combined:  Assessment 1: role play

Class Session I (Overview; Skill: Attending)
  Classroom: Instructor overview; instructor lecture/discussion; instructor modeling
  Combined:  Instructor overview; computer instruction in dyads; role play in dyads

Class Session II (Skill: Responding to content)
  Classroom: Instructor lecture/discussion; role play with instructor
  Combined:  Computer instruction in dyads; role play in dyads

Class Session III (Skill: Responding to feeling)
  Classroom: Instructor lecture/discussion; role play with instructor
  Combined:  Computer instruction in dyads; role play in dyads

Class Session IV (Skill: Responding to meaning)
  Classroom: Instructor lecture/discussion; role play with instructor
  Combined:  Computer instruction in dyads; role play in dyads

Class Session V (Skills: Personalizing and Initiating)
  Classroom: Instructor lecture/discussion; practice with instructor
  Combined:  Computer instruction in dyads

Posttest
  Classroom: Assessment 2: role play
  Combined:  Assessment 2: role play

We selected two problems that group members typically encounter in work teams in business and education. Thus, these problems situated the skills in a context similar to one in which they would ultimately be used. The problems concerned either "easy rider" team members who were not doing their share of the work, or a domineering team leader who would not use the ideas of team members. The confederate offered an initial statement that was consistently presented according to a problem script. Additional aspects of the problem were unveiled as the subject responded.

Following is an excerpt from a pretest transcript where C is the confederate helpee struggling with uncooperative team members, and S is the subject, functioning as a helper:

C: Hey, how you doing?

S: Hey, Tom, how are you?

C: Doing pretty good. Having a few problems with this class.

S: Like what? Tell me.

C: Well, I don't mean to dump all this on you right before class . . . do you mind?

S: No.

C: Well, we've had this team meeting on Thursday and . . . there are four people in my work team and . . . how many people do you have on your work team?

S: We have five.

C: You have five. Well, we have four. So it's myself and the other team leader. We showed up Thursday night, and last Friday . . . Bob let us get in our work teams again to work on the goals and the objectives and the same two people didn't show up again. So it's myself and the other team leader and . . . I don't know what to do here because I don't want to get stuck doing all the work and I think that four brains are better than two.

S: So let me make sure I hear you right. You feel frustrated with this group because . . .

Two raters analyzed the conversation for each subject, categorizing the type of each episode and the sequence of each type of interaction, then computing the number of points based on the type and sequence of subject actions. A maximum of 6 points could be earned based on the type of subject actions (i.e., attending, responding, personalizing, and initiating) and the sequence in which they occurred (e.g., a point was awarded for personalizing only after responding and for initiating only after personalizing). Raters were trained using a rating guide similar to the one given subjects. After independently rating all subjects in either the pretest or the posttest, raters compared their categorization of each episode. If they failed to reach 90% agreement, they refined the rating guide and then used the new guide to independently re-rate all of the subjects. In this way, many ambiguities were clarified, so that both the guide used by raters and the simplified guide/job aid given to subjects were refined for Experiment 2.
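The sequence-gated scoring rule just described can be sketched in code. The sketch below is our own illustration, not the authors' actual scoring procedure: the six-category breakdown (attending, three responding subtypes, personalizing, initiating) is an assumption inferred from the session contents in Table 1, and the function and episode labels are hypothetical.

```python
# Illustrative reconstruction of the sequence-gated scoring rule.
# Assumption: one point per distinct skill category, with "personalizing"
# credited only after a responding episode and "initiating" only after
# personalizing, capped at the stated 6-point maximum.

def score_episodes(episodes, max_points=6):
    """Score an ordered list of rated episode labels for one subject."""
    earned = set()
    responded = False      # has any responding episode occurred yet?
    personalized = False   # has a credited personalizing episode occurred?
    for ep in episodes:
        if ep == "attending" or ep.startswith("responding"):
            if ep.startswith("responding"):
                responded = True
            earned.add(ep)
        elif ep == "personalizing" and responded:
            personalized = True
            earned.add(ep)
        elif ep == "initiating" and personalized:
            earned.add(ep)
    return min(len(earned), max_points)
```

For example, an out-of-sequence episode earns nothing: `score_episodes(["initiating", "personalizing"])` returns 0, while a full attending-responding-personalizing-initiating sequence earns a point for each distinct category.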

Videotapes were transcribed and raters used the transcripts for initial rating and the videotapes to verify intonation and other nonverbal cues, including attending. Ratings were compared for agreement (agree/total) on the presence of an episode and for agreement on category (e.g., responding) assigned to an episode where both agreed on its presence. The performance of subjects in the classroom group was compared to that of subjects in the group that participated in the combined treatment with reduced instructor contact time.
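The agree/total comparisons described above amount to two simple ratios: presence agreement over all episodes, and category agreement over only those episodes whose presence both raters acknowledged. A minimal sketch, with hypothetical ratings (the function names are ours; `None` marks an episode a rater did not record):

```python
# Two agreement ratios (agree / total), as described in the text.

def presence_agreement(a, b):
    """Proportion of episodes where raters agree the episode exists (or not)."""
    pairs = list(zip(a, b))
    agree = sum((x is None) == (y is None) for x, y in pairs)
    return agree / len(pairs)

def category_agreement(a, b):
    """Proportion of same-category calls among episodes both raters recorded."""
    both = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    return sum(x == y for x, y in both) / len(both)

# Hypothetical ratings for five candidate episodes:
rater1 = ["responding", "responding", None, "personalizing", "initiating"]
rater2 = ["responding", "attending", None, "personalizing", "initiating"]
print(presence_agreement(rater1, rater2))  # 1.0
print(category_agreement(rater1, rater2))  # 0.75
```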

Results

The assessment procedure produced a .92 interrater agreement on categorization of episodes. Analysis of subject data is based on the average of the scores assigned by the two raters.

Table 2 presents the mean scores on the Experiment 1 performance test. There was no significant difference between pretest scores for the classroom group and combined group, with t(27) = .45, p = NS. Subjects in both treatment groups made significant gains in performance, with t(15) = 7.54, p < .001 for the classroom group, and t(11) = 8.83, p < .001 for the combined group. An analysis of covariance (with pretest as covariate) showed no significant effect of method, with F(1, 25) = 1.21, p = NS. There was a significant effect of pretest performance, with F(1, 25) = 4.6, p < .05.
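An analysis of covariance of this form can be illustrated by comparing a full regression model (pretest covariate plus a treatment indicator) against a reduced model (pretest only). The sketch below uses made-up scores, not the study's data, and is a generic ANCOVA illustration rather than the authors' analysis:

```python
# ANCOVA via model comparison: F test for a treatment effect on the
# posttest, adjusting for the pretest as a covariate. Data are hypothetical.
import numpy as np

def ancova_f(pre, post, group):
    """Return (F, error df) for a group effect, with pre as covariate."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    g = np.asarray(group, float)                 # 0/1 treatment indicator
    n = len(post)
    X_full = np.column_stack([np.ones(n), pre, g])
    X_red = np.column_stack([np.ones(n), pre])   # drops the group term

    def sse(X):
        beta, *_ = np.linalg.lstsq(X, post, rcond=None)
        resid = post - X @ beta
        return resid @ resid

    df1 = 1                                      # one parameter tested
    df2 = n - X_full.shape[1]                    # error degrees of freedom
    F = ((sse(X_red) - sse(X_full)) / df1) / (sse(X_full) / df2)
    return F, df2

pre = [2, 3, 1, 2, 4, 3, 2, 1]
post = [4, 5, 3, 4, 6, 5, 5, 4]
group = [0, 0, 0, 0, 1, 1, 1, 1]
F, df2 = ancova_f(pre, post, group)
print(round(F, 2), df2)
```

The reported F(1, 25) follows the same pattern: 1 numerator degree of freedom for the treatment factor and n minus the number of fitted parameters for the error term.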

To obtain an indication of whether the interpersonal skills were used outside the classroom, we interviewed subjects following the posttest to determine if they had spontaneously applied the principles. Many of them recounted in detail a conversation with a roommate, friend, or family member where they had used the interpersonal problem solving skills with positive results.

Table 2
Experiment 1: Mean Pretest and Posttest Scores for the Classroom and Combined Conditions

Treatment    n (pretest)   Pretest M (SD)   n (posttest)   Posttest M (SD)
Classroom    16            2.37 (0.85)      16             4.66 (1.21)
Combined     13            2.19 (1.33)      12             4.96 (1.08)


Discussion

Our hypothesis was supported: given equivalent learner time and half the instructor time, the combined treatment yielded results that were not significantly different from the classroom treatment. This finding is congruent with the results of Schroeder et al. (1986) and of Alpert (1986), but extends them by considering cost in instructor time, by using actual performance rather than a written test of knowledge, by using the same instructor for both treatments, and by providing additional methodological rigor. As a pilot, it also showed a number of deficiencies: (a) independent raters were not used, (b) ceiling effects in the rating instrument were encountered, and (c) differences in length of conversations as a factor in the number of appropriate responses were not accounted for.

On the other hand, subjects had little difficulty mastering the basic rating guide, and it was integrated easily into instruction in the form of a job aid. As is typical in protocol analysis, there was considerable discussion and revision of the detailed scoring guide used by the raters in order to achieve a high level of agreement on category assignments.

To make the skills visible to novice learners, we presented each skill in depth separately, in the sequence in which it would be used. We used computer and video to replicate classroom methods as closely as possible, since we wanted to compare the same method using a different medium, while decreasing instructor time requirements.

Given the positive outcome of the pilot, we then asked, "What would we need to do to achieve an order of magnitude improvement in learning productivity?" We identified three desired outcomes: (a) decrease total learning time by one-third in relation to Experiment 1 (from 9 hr to 6 hr), (b) decrease instructor time by two-thirds in relation to the classroom treatment (from 6 hr to 2 hr), and (c) create a method that can be adapted to the classroom. The intent was to compare the new optimized treatment to a classroom treatment that was very similar to that used in Experiment 1.

Following a brainstorming session, we began rapid prototyping of our ideas, trying them out using a live instructor, flipcharts, and paper handouts with single subjects. We quickly realized that in the short time available for subjects to learn, they would need to practice the skills and receive coaching feedback very early. This required us to demonstrate and coach the integrated skill set instead of using a sequential approach. We could use the instructor only sparingly, so we decided on initial computer-led tutoring with two-person simulations (as in Experiment 1, but in less time), followed by brief initial instructor coaching. We would then have subjects in five-person teams assess their own and others' skills, using a job aid with occasional instructor review and correction.

In addition to changing the method in the new optimized treatment, we changed the learning relationships. In Experiment 1, the subjects were partners and their work was mediated and coordinated through the software in the computer-based learning program. We retained the two-person computer component in Experiment 2, but added a coordinated team of five or six people who videotaped each other practicing the skill set, then rated and discussed their performances. This aspect was informed by Bandura's (1977) social learning theory. The result was an integrated learning approach, where subjects would recognize and use the entire skill set rapidly rather than practicing a sequence of skills that would later be combined.


For Experiment 2, we hypothesized that performance in the optimized treatment would at least match that in the standard classroom treatment, while decreasing the amount of learning time for both treatments, and the amount of instructor time in the optimized treatment.

EXPERIMENT 2

Method

The context and problem sets for Experiment 2 were the same as Experiment 1. A module for developing interpersonal skills was presented as part of a course in organizational development. While the problem formats were the same (a domineering leader or uncooperative participants), we used only the former for all of the pretests and only the latter for all of the posttests.

In Experiment 2, our instructional method had subjects learn and practice an integrated set of skills (as they would actually be used), rather than learning one skill at a time. We also reduced the duration of the interpersonal skills module to a total of 6 hr of class time for each treatment in Experiment 2. The instructors participated in all 6 instruction hours of the classroom treatment (although they were not both present for all sessions). They handled the course administration with both groups meeting together (this was counted as instructor time in both treatments). The graduate student instructor provided training for subjects in the optimized treatment.

For scoring purposes, we changed the rating scale from a 0-6 range to a 0-1 ratio scale (the points earned were divided by the number of episodes, thus capturing the density of effective statements). In this way the length of conversation and ceiling effects were diminished in relation to an absolute scale. In addition, we made the pre- and posttests more challenging: we required subjects to give at least three responses to meaning (both content and feeling) before giving a personalizing response. In Experiment 1, the responses to meaning could come either before or after personalizing and subjects could earn the maximum score without either personalizing or initiating. We did this in Experiment 1 because we thought that requiring three meaning responses before personalizing would be too difficult for beginners and they might not be able to progress to personalizing and initiating. That was not the case.
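The revised ratio scale and the new sequencing requirement can be sketched together. This is an illustrative reading of the rules above, not the authors' instrument; in particular, the assumption that attending and responding episodes each earn one point is ours.

```python
def density_score(categories):
    """Points earned divided by number of episodes (the revised 0-1 scale).

    Illustrative reading of the Experiment 2 rules: personalizing is
    credited only after at least three responses to meaning, and
    initiating only after a credited personalizing response.  That
    attending/responding each earn one point is an assumption.
    """
    if not categories:
        return 0.0
    points = 0
    meaning_responses = 0
    personalized = False
    for cat in categories:
        if cat == "responding":
            meaning_responses += 1
            points += 1
        elif cat == "attending":
            points += 1
        elif cat == "personalizing" and meaning_responses >= 3:
            points += 1
            personalized = True
        elif cat == "initiating" and personalized:
            points += 1
    return points / len(categories)
```

Because the total is divided by the number of episodes, a long conversation padded with ineffective statements scores lower than a short, dense one, which is the point of the change.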

We also added a test by independent raters in Experiment 2 to validate the ratings and to check the consistency with which the role player presented problems and responded to subjects. These raters were graduate students in counseling who were blind to which treatment each subject received and to the treatment model itself. They rated performance by responding to the question, "How effective were the skills of the person providing the help?" The independent raters also assessed the degree to which the role player facilitated the effectiveness of the person providing the help. Both ratings used a 9-point scale.

Subjects. The subjects were undergraduates enrolled during the spring semester in the same course as in Experiment 1. There were five class sections of approximately 26 students each. Within the section that participated in this study, subjects were randomly assigned to one of five teams, averaging five people each. Two teams were assigned to the instructor-led (classroom) treatment (N = 11), while three teams (N = 15) were assigned to the optimized treatment. Two men and 24 women participated in the study.

Materials. Subjects in both groups received the same handouts of the Carkhuff (1987) model and related information. Table 3 presents the content and treatments for both groups.

Procedure. Subjects were given a videotaped role play pretest before the treatment began. After each subject received the assigned treatment, a posttest was administered that repeated the role play with a different problem. The treatment required 6 classroom hours. We used two versions of the instructional program. Both used essentially the same materials, including a job aid that described the four skills in terms of how they would be rated.

The classroom version used the following method for facilitating interpersonal skill development: individuals were instructed in a class lecture with modeling and coached in small group role-play sessions by an instructor and graduate assistant for the entire 6-hr period of instruction. One session included a discussion linking the problem solving skills to the assertiveness and conflict management topics that are also addressed in the course.

The optimized version began with an instructor overview and a video that modeled the skills. It used computer and video to present the concepts, model the interactions, and facilitate feedback on performance. The computer portion was a tutorial that required subject interaction and had dyads rate each other; it was less extensive than in Experiment 1. The subjects rated example behaviors presented on video and computer, then practiced the behaviors and rated each other. In this version, the instructor time was reduced from 6 hr to 2 hr, with only lab proctors available to help subjects use the system.

Both treatments included 6 hr (three class periods of 2 hr each) of instruction, which was interspersed with related topics in the course. The same instructor/graduate assistant pair as in the instructor-directed treatment introduced the learning unit and attended to course administrative matters. During this time at the beginning of class sessions, both treatment groups met together. Approximately 45 min of instructor time (out of 6 hr for the classroom group and 2 hr for the optimized group) was spent this way. The graduate student instructor provided a limited amount of small group coaching and feedback at the end of the instruction period for the optimized treatment.

Table 3
Experiment 2: Content and Treatments by Session

Pretest
  Classroom: Assessment 1 (role play)
  Optimized: Assessment 1 (role play)
Class Session I (skill: integrated model)
  Classroom: Instructor overview; instructor lecture and modeling; group interaction
  Optimized: Instructor overview; video overview & skill model (graduate assistant); tutorial and practice; video rating protocol; skill practice (in dyads)
Class Session II (skill: integrated model)
  Classroom: Instructor lecture; instructor modeling (fishbowl); role-play skill practice (triads)
  Optimized: Video review; videotaped role-play skill and rating practice with feedback (in teams)
Class Session III (skill: integrated model)
  Classroom: Instructor lecture; connection to assertiveness and conflict management; role-play skill practice (triads)
  Optimized: Videotaped role-play skill and rating practice with feedback (in teams)
Posttest
  Classroom: Assessment 2 (role play)
  Optimized: Assessment 2 (role play)

In both treatment groups, we provided the subjects with a simplified version of the scoring guide used by the raters for the pre- and posttests. Those in the optimized group practiced using this guide in teams to assess their own performance. One person played the role of helper/consultant, another the helpee, a third rated the helper during the interaction, a fourth ran the video camera, and the others took notes to provide feedback. The team then viewed the videotape and discussed the ratings. An instructor visited each group at least once to ensure that incorrect understandings were not perpetuated. Each team selected its best exemplar for a team competition showcasing use of the skills, and the instructor offered comments on these exemplars.

The performance of subjects in the classroom group was compared to that of subjects in the group that participated in the optimized instructor/computer/video treatment with reduced instructor contact time.

Results

The assessment procedure produced high measures of interrater agreement for the two trained raters: .96 on presence of episodes, and .90 on categorization of episodes. Correlation between raters' scores was .96. The mean of the two raters' scores was used for subsequent analysis.

Table 4 presents the results of the raters using the scoring guide. In Experiment 2 the classroom group did not show a significant pre/post gain, t(10) = 3.41, p = NS, although the optimized group did, t(14) = 4.35, p < .01. Analysis of covariance (with pretest as the covariate) showed a significant treatment effect, F(1, 23) = 20.27, p < .001. It did not indicate a significant pretest effect, F(1, 23) = .02, p = NS. A separate rating of the posttest by three counseling students also indicated a significant difference between groups in favor of the optimized treatment, t(24) = 2.67, p < .05. The independent raters did not perceive a significant difference between groups in terms of the confederate's facilitation of the subjects' effectiveness, t(24) = 1.52, p = NS.

Table 4
Experiment 2: Mean Pretest and Posttest Scores for the Classroom and Optimized Conditions

Treatment    n (pretest)   Pretest M (SD)   n (posttest)   Posttest M (SD)
Classroom    11            .32 (.13)        11             .30 (.16)
Optimized    15            .32 (.16)        15             .63 (.19)
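The analysis of covariance used in both experiments (posttest as outcome, pretest as covariate, treatment as a two-level factor) can be sketched in pure Python as a model comparison: fit the reduced model posttest ~ pretest and the full model posttest ~ pretest + group, then form an F statistic from the drop in error sum of squares. The toy data in the test are illustrative, not the study's data.

```python
# Minimal one-covariate ANCOVA sketch via least squares (pure Python).
# F = ((SSE_reduced - SSE_full) / 1) / (SSE_full / (n - 3)), where the
# full model has 3 parameters: intercept, pretest slope, group effect.

def lstsq(X, y):
    """Solve least squares via normal equations with Gaussian elimination."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

def sse(X, y, beta):
    """Sum of squared residuals for a fitted linear model."""
    return sum((y[i] - sum(X[i][j] * beta[j] for j in range(len(beta)))) ** 2
               for i in range(len(y)))

def ancova_f(pre, post, group):
    """F statistic for the treatment effect with pretest as covariate.
    Assumes the full model does not fit the data perfectly."""
    n = len(post)
    Xr = [[1.0, pre[i]] for i in range(n)]                   # reduced model
    Xf = [[1.0, pre[i], float(group[i])] for i in range(n)]  # full model
    sse_r = sse(Xr, post, lstsq(Xr, post))
    sse_f = sse(Xf, post, lstsq(Xf, post))
    return (sse_r - sse_f) / (sse_f / (n - 3))
```

With the study's combined sample of 26 subjects, the error degrees of freedom are 26 - 3 = 23, matching the reported F(1, 23).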


GENERAL DISCUSSION

Our hypothesis was supported in a way we did not expect: performance in the optimized treatment of Experiment 2, even with one-third the instructor time, at least matched the performance of the classroom treatment group. The classroom group did not show a significant pretest to posttest improvement, however, while the group that received the optimized treatment did. It is noteworthy that this was an assessment of the actual skill, and not knowledge about the skill, as might be assessed in a typical classroom approach.

Possible Reasons for Classroom Group's Lack of Improvement in Experiment 2

The classroom group improved in Experiment 1, but not in Experiment 2, where the group receiving the optimized treatment did improve. We considered several possibilities for the classroom group's lack of improvement:

1. Outliers in the classroom group. One possibility is that some subjects with exceptionally low scores in the classroom group may have brought down the mean. Inspection of the data did not substantiate this idea. On the posttest, 7 of the 11 classroom subjects scored at or below the score of the lowest subject in the optimized treatment.

2. Low density of relevant and focused learning opportunities. There was one-third less time (6 hr) available in Experiment 2 than in Experiment 1. It is possible there was insufficient time using the classroom treatment to develop the skills because there was lower density of relevant learning opportunities than in the optimized treatment. The interpersonal skills module was interwoven with other topics over a period of 3 weeks, and subjects in the classroom treatment may have perceived the skills to be only an extension of other class work. The classroom treatment in Experiment 2 included some content related to assertiveness and conflict management, which are particular strengths of the course instructor. These are vital topics in interpersonal skills, yet they do not directly contribute to performance on the dependent measure. It can be difficult to focus the treatment in the classroom model.

3. Few opportunities for practice and individual feedback. It might be argued that the practice afforded by the pretest should produce improvement by itself. As we have noted, however, the skills are virtually invisible to novices; without evaluative feedback they would not know how to improve. Subjects in the classroom treatment did not use the video recording and feedback of their own performance. Watching oneself on video and then rating the interaction can be very helpful. Indeed, performance feedback is a missing factor for improvement in everyday interactions with others. The feedback afforded by the videotaped practice using the job aid in the optimized treatment offered subjects a means to observe in what way their performance matched the desired characteristics of the rating guide, and what corrections they needed to make. This practice makes visible to learners how the assessment will be scored and supports their learning as they assess the performance of their peers and themselves. While the job aid was also provided to the classroom group, it was less emphasized in practice sessions.

4. Differences in instructors. While both instructors helped design and deliver the treatments, the graduate student instructor was more experienced in use of the Carkhuff (1987) methods and spent a smaller amount of time instructing the classroom group in Experiment 2 than in Experiment 1. It should be noted that the graduate student instructor spent only 1 hr and 15 min coaching the optimized group (45 min of the 2 hr was spent on administrative matters), yet that group showed significant gains.

5. The test was more challenging in Experiment 2. As noted above, we made the test more challenging, reflecting the high level of performance of subjects in Experiment 1 that in some cases resulted in ceiling effects.

Of these possible reasons for lack of improvement in the classroom group, those that seem most compelling to us are (a) the increased challenge of the test in relation to Experiment 1 and (b) the lower density of relevant practice in relation to the optimized treatment. Focused, relevant practice with evaluative feedback is what the optimized system was designed to provide. The optimized treatment, with embedded assessment, allows users to make their own performance visible, to describe its strengths and weaknesses, and to monitor changes in their performance over time as they use the assessment as a learning and performance tool.

Validating the Interpretation of Results

Did our results simply reflect performance in using the Carkhuff (1987) model, or were the skills actually facilitative? We used a second, independent measure to validate the interpretation of the results. This was accomplished by asking counseling students, who were trained in helping skills, to rate the videotaped interactions. Even though these raters were blind to the treatments and to subject assignment, their scoring also indicated a significant difference between the two treatment groups. This finding provides an initial indication of validity: individuals who adhere to Carkhuff's (1987) model of helping behavior are perceived as facilitative by those who are trained to help others solve problems.

Increasing Learner Access to a Master Teacher

Using a computer to teach interpersonal skill concepts and video to model behavior increases learner access to the expertise of the master teacher in two ways: (a) during the computer instruction phase, the user engages with the master teacher's expertise because it is embedded in the design of the computer module and (b) instructor time that is saved early in the instruction sequence is available later when the user is ready for more sophisticated coaching. By using video and job aids to coach performance, peers can practice and improve skills together, again freeing some instructor time for more sophisticated feedback.

Previously, we noted several weaknesses in prior studies of computer- and video-supported interpersonal skills development, together with some ways in which Experiment 1 addressed these issues (albeit with its own limitations). Experiment 2 added independent raters and an assessment of the degree to which the confederate in a performance assessment role play may have differentially facilitated one treatment group over another. Taken together, we believe these studies provide an indication that the learning and assessment techniques used here may be applied to interpersonal skills in a way that will produce high levels of agreement between raters and high levels of achievement by learners.

Future studies are needed to explore ways to increase access to the powerful learning environments that computer and video technology can create for developing interpersonal skills. We have considered only the cost of instructor time because it can be high for this kind of training. We did not address the cost of hardware acquisition because we believe that computing power is becoming a commodity whose price will continue to plummet. Software and video production costs were not addressed because these can be amortized over large numbers of users, unlike coaching by an individual instructor.

We have also not addressed the issue of learner time, a factor that is complicated by whether students pay for their training (e.g., in school) or are paid to attend training (e.g., by their employer). The costs when students are paid a salary while they attend training are typically higher than those for instructors, facilities, or travel because there are many students taking a course over an extended time. As a consequence, reducing learning time to achieve a given level of performance is a highly leveraged activity. Conversely, in a learning situation where a school is paid for providing a given number of hours of education, administrators, course developers, and instructors may have little incentive to decrease learning time.

The approach we have used in these studies has the potential to increase access to the development of interpersonal skills and has implications for training design. We have anecdotal evidence of our subjects' transfer of the interpersonal skills to situations involving friends and family. The explicitness of these positive accounts leads us to speculate that the use of computer simulation and video feedback may support an environment that facilitates transfer. We are pursuing this issue with regard to motivation to learn and use the skills.

Acknowledgments -- We gratefully acknowledge the comments and suggestions of John Bransford, David Cordray, J. R. Newbrough, Dan Rock, and two anonymous reviewers on this research and on earlier drafts of this article. We appreciate the participation of the instructors and students in two sections of the Organizational Development in Human Service Settings class in the Human and Organizational Development program at Peabody. This research was supported by a gift to Peabody College of Vanderbilt University.

REFERENCES

Alpert, D. (1986). A preliminary investigation of computer-enhanced counselor training. Computers in Human Behavior, 2, 63-70.

Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice Hall.

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.

Campbell, J. O. (1992). The opportunities of a changing technology for performance assessment. In What we can learn from performance assessment for the professions: 1992 invitational conference (pp. 69-76). Princeton, NJ: Educational Testing Service.

Carkhuff, R. R. (1987). The art of helping VI. Amherst, MA: Human Resource Development Press.

Clark, R. C. (1989). Developing technical training. New York: Addison-Wesley.

Clark, R. E. (1991). When researchers swim upstream: Reflections on an unpopular argument about learning from media. Educational Technology, 31(2), 34-40.

Cognition and Technology Group at Vanderbilt. (1990). Anchored instruction and its relationship to situated cognition. Educational Researcher, 19(6), 2-10.

Cronin, M. W., & Cronin, K. A. (1992). Recent empirical studies of the pedagogical effects of interactive video instruction in "soft skill" areas. Journal of Computing in Higher Education, 3(2), 53-85.

Fletcher, J. D. (1990). Effectiveness and cost of interactive videodisc instruction in defense training and education (IDA Paper P-2372). Alexandria, VA: Institute for Defense Analyses.

Gagne, R. M. (1985). The conditions of learning (4th ed.). New York: Holt, Rinehart and Winston.

Ives, W. (1990). Soft skills in high tech: Computerizing the development of interpersonal skills. Instruction Delivery Systems, March-April, 12-15.

Kulik, J., & Kulik, C. (1987). Computer-based instruction: What 200 evaluations say. In M. Simonson & S. Zvacek (Eds.), Proceedings of selected research paper presentations at the 1987 convention of the Association for Educational Communications and Technology (pp. 18-24). Ames, IA: Iowa State University.

Merrill, M. D. (1987). The new component design theory: Instructional design for courseware authoring. Instructional Science, 16, 19-34.

Orlansky, J., & String, J. (1977). Cost effectiveness of computer-based instruction in military training (IDA Paper P-1375). Alexandria, VA: Institute for Defense Analyses.

Schroeder, J. E., Dyer, F. N., Czerny, P., Youngling, E. W., & Gillotti, D. P. (1986). Videodisc interpersonal skills training and assessment (VISTA): Overview and findings, volume 1. Final report (Report No. ARI-TR-703). Fort Benning, GA: Litton Mellonics Systems Development Division. (ERIC Document Reproduction Service No. ED 274 329)