DOCUMENT RESUME

ED 076 135    HE 004 109

AUTHOR    Chu, Yu-kuang
TITLE     Study and Evaluation of the Student Response System in Undergraduate Instruction at Skidmore College.
INSTITUTION    Skidmore Coll., Saratoga Springs, N.Y.
SPONS AGENCY   National Science Foundation, Washington, D.C.
PUB DATE  72
NOTE      20p.
EDRS PRICE     MF-$0.65 HC-$3.29
DESCRIPTORS    *Autoinstructional Aids; *Computer Assisted Instruction; Computer Oriented Programs; Computer Science; Educational Innovation; *Educational Technology; Electronic Equipment; *Higher Education; *Instructional Media; Program Descriptions; Teaching Machines; Teaching Methods
IDENTIFIERS    General Electric Company; Skidmore College; *Student Response System

ABSTRACT
The Student Response System (SRS) is discussed and evaluated. This is an electronic system whereby an instructor receives instant feedback from students by asking questions each with up to five multiple-choice answers. The report covers a description of the SRS; a review of the Skidmore College project on computer applications and of the grant from the National Science Foundation; the physical setup for the project; common uses of the SRS; uses of the SRS at Skidmore; combined uses of the SRS in class; uses of computer analysis; individualized uses of the response system; extent of usage; evaluation of the response system; a small statistical evaluation; and a summary of evaluation. (MJM)




FILMED FROM BEST AVAILABLE COPY

Report to the National Science Foundation

Yu-kuang Chu, Ph.D.

Professor Emeritus of Asian Studies and Education
Principal Investigator

Formerly Director of the Computer Applications Center
Skidmore College

Printed and Distributed by
the Education Support Project of
the General Electric Company
With the Permission of Skidmore College

1972

U.S. DEPARTMENT OF HEALTH, EDUCATION & WELFARE
OFFICE OF EDUCATION

THIS DOCUMENT HAS BEEN REPRODUCED EXACTLY AS RECEIVED FROM THE PERSON OR ORGANIZATION ORIGINATING IT. POINTS OF VIEW OR OPINIONS STATED DO NOT NECESSARILY REPRESENT OFFICIAL OFFICE OF EDUCATION POSITION OR POLICY.


ACKNOWLEDGMENT

Grateful acknowledgment is due the Skidmore College faculty users of the Student Response System, whose experiences and comments formed the substance of this Report and without whose cooperation this Project could not have been executed. They were: Claudia T. Assini, Assistant in Geology; Betty Balevic, Instructor in Business; Dr. Daniel Balmuth, Professor of History; Dr. Parker B. Baum, Professor of Chemistry; Dr. Beverly J. Becker, Professor of Physical Education and Chairman of the Department; Dr. Margaret K. Brown, Assistant Professor of Mathematics and Assistant Director of the Computer Applications Center; Joseph Bruchac III, Instructor in English; Dr. Yu-kuang Chu, Professor Emeritus of Asian Studies and Education and Director of the Computer Applications Center; Dr. Denton Crocker, Professor of Biology and Chairman of the Department; Sharon Crow, Instructor in Physical Education; E. Beverly Field, Associate Professor of Biology; Ruth Fleishman, Laboratory Coordinator in Biology; Dr. Robert D. Foulke, Professor of English and Chairman of the Department; Sallie Haven, Instructor in Sociology; Lois Hollenbeck, Associate Professor of Biology; Dr. R. John Huber, Assistant Professor of Psychology; Krishnaji S. Kulkarni, Visiting Lecturer in Asian Studies and Art; Dr. L. P. Lipsitt, Visiting Lecturer in Psychology; Joanne M. Lunt, Assistant Professor of Physical Education; Dr. Robert P. Mahoney, Associate Professor of Biology; Betsy F. McGlashan, Laboratory Instructor in Biology; Dr. Sayra B. Nikoloff, Associate Professor of Education; Dr. Robert M. Oswalt, Assistant Professor of Psychology; Dr. Edward P. Reagen, Professor of Economics; Dr. William C. Spears, Associate Professor of Psychology; Dr. Judithe D. Speidel, Assistant Professor of Education; Dr. John J. Thomas, Assistant Professor of Geology; Dr. Paul H. L. Walter, Associate Professor of Chemistry; Prescott Wintersteen, Jr., Instructor in History; and Dr. Stuart K. Witt, Assistant Professor of Government.

Sincere thanks are also due Dr. E. Lloyd Rivest, Manager, Information Systems Programs, Research and Development Center, and Mr. Alfred LeBlang, Manager, Education Support Project, both of the General Electric Company, for a number of computer analysis programs, programming assistance, maintenance of the Student Response System, and frequent consultations and counselings. More recently, Mr. William R. Griffith, Education Applications Specialist of the Education Support Project, has also given us help. My colleague in the Skidmore Computer Applications Center, Dr. Margaret K. Brown, has been a constant source of support and advice.

Our Study and Evaluation Project could not have been launched without the funding by the National Science Foundation and the financial and moral support given by the Administration of Skidmore College. We owe the inception of the Project to Virginia Lester, Director of Educational Research at the College. The Faculty-Student Committee on Computer Applications performed the important function of periodically reviewing our work from the point of view of the needs and interests of the faculty and the students.

To all of the above, grateful acknowledgment is hereby made.




CONTENTS

I. DESCRIPTION OF THE STUDENT RESPONSE SYSTEM
II. SKIDMORE PROJECT ON COMPUTER APPLICATIONS AND GRANT FROM THE NATIONAL SCIENCE FOUNDATION
III. PHYSICAL SETUP FOR THE PROJECT
IV. COMMON USES OF THE STUDENT RESPONSE SYSTEM
V. USES OF THE STUDENT RESPONSE SYSTEM AT SKIDMORE
VI. COMBINED USES OF THE STUDENT RESPONSE SYSTEM IN CLASS
VII. USES OF COMPUTER ANALYSES
VIII. INDIVIDUALIZED USES OF THE RESPONSE SYSTEM
IX. EXTENT OF USAGE
X. EVALUATION OF THE RESPONSE SYSTEM
XI. A SMALL STATISTICAL EVALUATION
XII. SUMMARY OF EVALUATION



STUDY AND EVALUATION OF THE STUDENT RESPONSE SYSTEM IN UNDERGRADUATE INSTRUCTION AT SKIDMORE COLLEGE

Yu-kuang Chu, Ph.D.

I. DESCRIPTION OF THE STUDENT RESPONSE SYSTEM

A Student Response System, known as SRS 1000C, has been developed in recent years by the Research and Development Center of the General Electric Company. It is essentially an electronic system whereby an instructor can get instant feedback from his students by asking questions each with up to five multiple-choice answers. There is an Input Station in front of each student, containing five white numbered buttons corresponding to numbered multiple-choice answers. (See Fig. 1.) The student pushes one of the buttons to indicate his selected answer. If he wants to change his mind, there is a red button to cancel his previous choice, which he can use before he puts in another answer.

Student responses are scanned and reflected immediately in five percentage meters on the instructor's panel mounted on a lectern, showing him what percentage of the class has chosen each option under a given question. (See Fig. 2.) There is a sixth meter to indicate the total percentage of the class that has answered that question. If there is a correct answer, its number may be shown in a "Display Box" in front of the class. The five buttons can also be used as a five-step rating scale to express degrees of agreement, importance, interest, usefulness, etc.

If desired by the instructor, the Response System can be linked up with a teletypewriter, which will record and print out the student responses, identifiable by seat numbers and question numbers, and at the same time will encode the data on a paper tape. The printout is immediately available after class for inspection by the instructor or students. (See Fig. 3.) The tape may be fed into the teletypewriter, when it is used as a terminal on line with a computer. The computer will analyze the data according to a selected program and the results will be printed on the teletypewriter. Such a printed analysis can be obtained in only a few minutes.

II. SKIDMORE PROJECT ON COMPUTER APPLICATIONS AND GRANT FROM THE NATIONAL SCIENCE FOUNDATION

The Student Response System has been installed in recent years at several large universities, where large classes have found the equipment useful. But GE wanted to explore the capability of this equipment in a small institution, and Skidmore College, which is a four-year liberal arts college located at Saratoga Springs, New York, with most classes enrolling 15 to 25 students each, was interested in introducing computer applications into the teaching-learning process in undergraduate instruction. So the two organizations entered into a cooperative research project. The aim of the cooperative project is to try to use the SRS in as many different ways and academic disciplines as possible and to attempt an evaluation of its usefulness in undergraduate instruction.

Skidmore applied to the National Science Foundation for financial assistance (Proposal No. JO 00317) and received in October 1969 a grant of $70,000 (Grant No. GJ-317) for the following budgeted expenses:

Purchased Equipment                    $53,000
Rental Equipment & Communications       15,700
Travel                                     300
Expendable Supplies                      1,000

Total                                  $70,000

Figure 1. Student Station



[Photograph: the instructor's control panel of the General Electric SRS 1000, with the five percentage meters, question number and correct answer settings, program and response count indicators, and a response time dial marked in seconds.]

Figure 2. Instructor's Control Panel


90010 10120422002220002242400024000000000000000

90020 30330433003320002323100023000000000000000

90030 40440244004440004444400044000000000000000

90040 44440144004440004444400044000000000000000

90050 33320333003330003333300033000000000000000

90060 12310311001110001131100011000000000000000

90070 23220322001120001142300022000000000000000

90080 31340134003330001334200042000000000000000

90090 22140222002210002302200022000000000000000

Printout of Responses on a Teletypewriter. Format is the same for all kinds of questions. In this example, nine questions were used, and there were 16 students in the class. Responses are represented by numbers. The first column, after the Line Numbers, consists of Instructor's Answers, if questions have correct answers. Otherwise, Instructor's response would be "0." Students' responses are identified by Seat Numbers. Some seats were not occupied. Student at Seat 31 failed to answer the first three questions. Perhaps he was late to class.

Figure 3. Raw Data Printout
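The row layout described in the caption can be decoded mechanically. The following Python sketch is purely illustrative: the field layout (a line number, one digit for the instructor's answer, then one response digit per seat, with "0" meaning no response or an empty seat) is inferred from the Figure 3 caption, and `parse_printout_line` is a hypothetical helper, not one of the original analysis programs.

```python
# Decode one line of the Figure 3 teletype printout. Layout assumed from
# the caption: line number, instructor's answer digit, then one response
# digit per seat ("0" = no response or unoccupied seat).

def parse_printout_line(line):
    """Return (line number, instructor's answer, {seat: response})."""
    line_no, digits = line.split()
    instructor = int(digits[0])
    responses = {seat: int(d)
                 for seat, d in enumerate(digits[1:], start=1)
                 if d != "0"}
    return line_no, instructor, responses

# First question of the sample printout: the instructor's answer is 1,
# fifteen students responded, and Seat 31 shows no response.
_, instructor, responses = parse_printout_line(
    "90010 10120422002220002242400024000000000000000")
print(instructor, len(responses), 31 in responses)  # 1 15 False
```

Under this reading, each printout line is one question, which is consistent with the nine lines and nine questions of the example.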

The anticipated contribution by Skidmore to the support of the project, including Indirect Costs, was $78,763, but this figure was actually exceeded by about $25,000 due to unexpected increase in construction costs. The purpose of the grant, as designated by the NSF, was "Study and Evaluation of the Student Response System in Undergraduate Education." The period of the grant was originally for two years (October 1969 - October 1971), but due to the late start of the project caused by delay in construction, the termination date was subsequently extended by NSF, at Skidmore's request, to October 1972. However, there was no additional funding for the extended year though the grant was entirely expended during that time.

III. PHYSICAL SETUP FOR THE PROJECT

With the NSF grant, Skidmore purchased the SRS 1000C equipment from the General Electric Company. To install the System, Skidmore constructed a 40-seat classroom and related projection and terminal rooms as part of the facilities of a newly established Computer Applications Center on the top floor of the library building on the New Campus. The classroom is equipped not only with the SRS but also for movie, slide, and overhead projection as well as with a tape recorder. The Research and Development Center of General Electric provided programming and consulting services. Computing time was obtained, free of charge, on a time-sharing basis by arrangements with Griffiss Air Force Base in Rome, New York, which has a GE 645 computer. Our terminals were linked by leased telephone lines with the Rome computer, and one of the terminals can be connected through an interface with the Response System in the classroom. The whole system was in operating condition in May 1970, just as the academic year was drawing to a close. So real use of the facility could not begin in earnest until September. This fact was the justification for the request for extension of the grant period. This means the present evaluation study was based on our use of the System for one and a half academic years.

IV. COMMON USES OF THE STUDENT RESPONSE SYSTEM

From the start we adopted the policy of going beyond some obvious and common uses of the SRS and of trying to foster the more creative aspects of teaching and learning through the use of the System. Two obvious and common uses, and our intuitive judgments thereof, are as follows:

1. The first is to ascertain the level or degree of immediate comprehension by students of the material presented by the instructor. He may ask questions of the true-false or multiple-choice type, in the course of his presentation or at the end. Students' responses will show how successfully he has communicated with them and whether there are points that need clarification. This use of the SRS is important in large classes with scores or hundreds of students, but the need for this is perhaps not so great in small classes, where the instructor is in intimate rapport with his students. Nevertheless, this use is important in courses with a great deal of highly objective content, as in science and mathematics. If the questions go beyond asking the students to recall information and provoke thinking and discussion, then this use of the SRS is important in all academic disciplines and in classes of any size. It tends to encourage a more active learning process than a straight lecture-test technique would.

2. Another obvious use of the SRS is to administer multiple-choice or true-false tests to large numbers of students with minimum labor and maximum speed in scoring and analysis of results. If this is for the purpose of grading students in courses, we are rather critical of this use. In view of the present-day thinking and attitudes of students regarding their own education, we do not want them to feel that here is a mechanical system to manage their learning and to induce in them conformity in thinking as the average objective tests are wont to do. Testing for experimental purposes, with the cooperation of students and not for grading, would, of course, be a different matter.


V. USES OF THE STUDENT RESPONSE SYSTEM AT SKIDMORE

Following are some of the actual uses of the SRS made by a number of the Skidmore faculty. The selected examples do not exhaust the "repertoire" of any single instructor or the potentiality of the System, but they aim at a variety in uses. It will be observed that none of the ways used the System for grading students and all of them represented attempts to use the SRS to promote active teaching-learning processes, which departed in varying degrees from the simple pattern of straight lectures followed by tests of retention. Many used the student responses as a point of departure for teaching. This tended to increase students' feeling of relevancy of what was taught to their own interests and needs.

The sample uses are grouped under four categories according to four different ways of using the SRS, for each of which there is a distinctive computer analysis program. These programs, except one, were furnished by GE.

A. Using the SRS to Teach Understanding, Knowledge, and Skills by Asking Questions Having Correct Answers

1. Sentence variations to teach punctuation and stylistics. In an English class for culturally deprived high school graduates preparing them for college admission, an instructor used the Response System in one period to teach punctuation, specifically the correct uses of the comma and the semicolon. He projected on a screen a series of sentences, each in several variations, involving the use of the two punctuation marks. Students were asked to pick out the correct version in each case. Discussion of these cases constituted instruction. This was an example of inductive teaching of a grammatical rule. Computer analysis of student responses showed how each student did, as well as which questions were difficult or easy, thus providing a guide for subsequent instruction. (See Fig. 4.)

The same technique was used in a class in Freshman Composition to teach stylistics. A number of short paragraphs were shown to the class by overhead projection. In each paragraph, after several sentences of context material, a "punch-line" sentence was shown in four variant forms, all grammatically correct. The students were asked to indicate their preferred form by pressing the appropriate button, and their choices were immediately reflected in the percentage meters on the instructor's panel. They were asked to explain their choices, which resulted in a lively discussion. Strictly speaking, in this case of stylistics there is no correct choice, although the instructor indicated his own preference for one form, or alternative forms, depending on the purposes of the writer.


GENERAL INFORMATION

9 QUESTIONS SCORED
16 STUDENTS PRESENT
CLASS MEAN = 5.75
STANDARD DEVIATION = 1.60
RELIABILITY COEFFICIENT = 0.21

PROBLEM ANALYSIS

PROBLEM  PERCENT  CORRECT       RESPONSE COUNT
NO.      CORRECT  ANSWER     0    1    2    3    4    5
1          6.2       1       1    1   10    0    4    0
2         56.2       3       1    1    4    9    1    0
3         87.5       4       1    0    2    0   14    0
4         93.7       4       0    1    0    0   15    0
5         93.7       3       0    0    1   15    0    0
6         75.0       1       0   12    1    3    0    0
7         40.0       2       0    4    8    3    1    0
8         43.7       3       0    3    2    7    4    0
9         68.7       2       1    2   11    1    1    0

STUDENT ANALYSIS

SEAT NUMBER   SCORE   PCT. CORRECT   T-SCORE
     6          8        88.89        64.06
     2          7        77.78        57.81
     7          7        77.78        57.81
    10          7        77.78        57.81
    11          7        77.78        57.81
    19          7        77.78        57.81
    25          7        77.78        57.81
    12          6        66.67        51.56
    17          6        66.67        51.56
    24          6        66.67        51.56
     3          5        55.56        45.31
    16          5        55.56        45.31
    20          5        55.56        45.31
    18          4        44.44        39.07
     1          3        33.33        32.82
     5          2        22.22        26.57

Output of Computer Program "ANAL 1" Analyzing Responses to Questions Having Correct Answers. "Discrimination Index" indicates the power of a given question to distinguish between the top quarter and the bottom quarter of the class. The higher the index, the greater the power. The conversion of raw score into "T-score" makes possible comparison of student performance on different tests, if "normal curve" distribution of abilities is assumed.

Figure 4. ANAL 1 Output
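The two derived statistics in Figure 4 reduce to simple formulas. The sketch below is a hypothetical reconstruction, not the original ANAL 1 code: the T-score formula T = 50 + 10(x - mean)/s.d. reproduces the printed values, and the discrimination index is taken, as the caption describes, to be the difference in proportion correct between the top and bottom quarters of the class.

```python
# Illustrative reconstruction of the "ANAL 1" statistics (Figure 4);
# function names are invented, not taken from the original program.

def t_score(score, mean, sd):
    """Convert a raw score to a T-score: mean 50, standard deviation 10."""
    return 50 + 10 * (score - mean) / sd

def discrimination_index(correct_flags, quarter):
    """Proportion correct in the top quarter minus the bottom quarter.

    correct_flags: 0/1 flags for one question, ordered from the
    highest-scoring student down to the lowest.
    """
    return (sum(correct_flags[:quarter]) / quarter
            - sum(correct_flags[-quarter:]) / quarter)

# Reproducing two T-scores from Figure 4 (class mean 5.75, s.d. 1.60):
print(round(t_score(8, 5.75, 1.60), 2))  # 64.06, as printed for Seat 6
print(round(t_score(5, 5.75, 1.60), 2))  # 45.31, as printed for Seat 3
```

A question answered correctly by every top-quarter student and by no bottom-quarter student would score the maximum index of 1.00.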

2. Diagnosis of a class or of individual students. A professor in Foundations of Education used the SRS to administer a true-false test in order to ascertain what the students already knew or did not know as regards the unit she was about to teach. This information enabled her to concentrate on teaching those things which the students did not know. It also served to motivate the students as they could see clearly what they needed to learn and thus guided their readings in assigned references.

A professor of English used the SRS to administer a diagnostic test in English grammar consisting of 50 questions of the objective type. The printout on the computer terminal immediately available upon completion of the test, as well as the later analysis by the computer, showed clearly the strengths and weaknesses of each individual student without tedious analysis performed manually by the instructor. Remedial instruction individually designed for each student could then be given.


3. Other examples under this category of use were: (a) A Chemistry professor used the SRS to test student understanding of a concept and its implications, some of which he did not discuss but expected his students to deduce, if they had correctly understood the concept. (b) A mathematics professor asked students to respond to several questions at the beginning of each period, highlighting the major points of the last lecture and serving as a springboard for the current lecture. (c) A professor of Psychology used the SRS to teach the relation of standard deviation to normal frequency distribution in a variety of statistical operations. (d) A biology instructor conducted near the end of a semester a quick review session during which students responded through the SRS to a long list of objective-type questions covering the major areas of the course, thus revealing to each student his own weaknesses in advance of the real semester examination.

B. Using the SRS to Conduct an Opinion Poll or to Express Preferences

Naturally, under this category of use, questions have no standard answers.

1. Studying contemporary student values and mores. A class of 30 students in a course on Education and Culture answered the questions dealing with present-day student values and mores in a national survey by LIFE Magazine (Vol. 70, No. 1, Jan. 8, 1971, pp. 22-30) and compared their own results with those of a national sample. Since the questions were answered by "Yes" or "No," computer analysis simply gave the percentage of students choosing option #1 (Yes) and that for option #2 (No) under each question. (See Fig. 5.) The results served as a basis for discussing education in relation to culture.

2. Managing discussion in committee meetings. Student organizations also used the SRS. A student committee of 23 members conducted several meetings in the Response System classroom discussing what students wanted to recommend to the faculty in regard to certain all-college requirements, such as a foreign language, Freshman English, and physical education. The chairman presented a question dealing with a specific requirement with four alternative ways of handling it plus a fifth option for a "way not covered by the preceding four," as follows:

(1) Maintenance of the requirement.

(2) Abolition of the requirement.

(3) Reduction of the amount of the requirement.

(4) Exemption from the requirement on demon-stration of proficiency.

(5) A solution not covered by the preceding four.

She used the SRS to find out the initial reactions of committee members. Then they discussed the various alternatives, and students voting for the fifth option aired their views. She took soundings again. On the basis of majority position she formulated a proposition


QUESTION        RESPONSE COUNT
NUMBER      0      1      2     3    4    5
1           6     13     11     0    0    0
                54.2%  45.8%   0%   0%   0%
2           0     25      5     0    0    0
                83.3%  16.7%   0%   0%   0%
3           0      8     22     0    0    0
                26.7%  73.3%   0%   0%   0%
...
49          2      4     24     0    0    0
                14.3%  85.7%   0%   0%   0%
50          1     25      4     0    0    0
                86.2%  13.8%   0%   0%   0%

Output of Computer Program "OPINION" Analyzing Responses to Questions Having No Correct Answers. It gives the number of students choosing each option, and its corresponding percentage, under each question. Percentages are calculated in terms of the total number of students responding to a given question. E.g., six students failed to answer Question #1; this number was ignored in calculating percentages under this question.

Figure 5. "OPINION" Output.
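The percentage rule in the Figure 5 caption amounts to a one-line calculation. A minimal sketch follows (an illustrative reimplementation, not the original OPINION program):

```python
# Illustrative version of the percentage rule from the Figure 5 caption:
# non-responses (column "0") are excluded before percentages are computed.

def opinion_percentages(counts):
    """counts: response counts indexed 0-5, where index 0 = no answer.

    Returns the percentage choosing each of options 1-5, computed over
    the students who actually responded to the question.
    """
    responders = sum(counts[1:])
    return [round(100 * c / responders, 1) for c in counts[1:]]

# Question 1 from Figure 5: 6 non-responses, 13 "Yes" (option 1), 11 "No".
print(opinion_percentages([6, 13, 11, 0, 0, 0]))  # [54.2, 45.8, 0.0, 0.0, 0.0]
```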

and asked the committee members to express agreement or disagreement with it through the SRS. (For technique, see Category C.) She repeated this several times, each time refining or qualifying the proposition until she got a consensus or a large affirming majority. She considered the System useful in both recording the process of opinion formation and eliciting discussion in terms of the breakdown of opinion on a particular question. Each member of the committee could anonymously express his thought as well as vocally defend it, if he desired. Thus, every one participated in some way during the meetings.

3. Other examples of this category of use are: (a) A professor of Physical Education in teaching a course on evaluation procedures and research techniques used multiple-choice questions through the SRS to test student selection of research procedures or formulas to be employed, to determine the sequence of steps in research, or to evaluate research literature. (b) A professor of Elementary Education asked students to select one of five alternative techniques for teaching the meanings of certain words in given passages in textbooks. (c) A psychology professor compared the attitudes towards suicide of two groups of students, one of which had had a "bull" session with him on the subject, but the other had not. (d) A lecturer in Asian Studies discussed with his students their reactions to foreign films monitored by the SRS. (e) An instructor in Chinese history explored student background for and present interest in area studies as well as their attitudes toward public issues relative to Asia. (f) A Geology professor asked his students to evaluate an experimental part of his course anonymously at the end of the term by answering a list of questions through the SRS. (g) An orientation program making explicit the mutual expectations of the instructor and his students in the way of teaching-learning goals, course contents and methods, evaluation of student performance, etc. was developed for use by faculty members at the beginning of a term so as to achieve understanding and rapport between instructor and students.

C. Using the SRS as a Rating Scale to Explore Attitudes and Feelings

1. Using SRS as a rating scale to study attitudes. A visiting lecturer at a Child Psychology Institute, conducted for a group of college teachers of Psychology, explored through the SRS the attitudes of the group towards the behavioral point of view in Psychology. The professor presented orally a list of 19 statements on various implications for child development and education of the point of view in question. He asked members of the group to express their agreement or disagreement with each statement using the five buttons as a five-step scale:

"1" representing "Total Disagreement."
"2" representing "Disagreement."
"3" representing "Neutrality."
"4" representing "Agreement."
"5" representing "Total Agreement."

The percentage meters on the instructor's panel instantly showed him how the group reacted to a given statement. He announced the percentages to the group, followed by comments by him or by participants. It was interesting to note that his first statement was the most general and comprehensive, and his 19th statement was identical to the first. For some participants, their responses on the 19th indicated a significant shift in attitudes from their responses towards the first statement, apparently caused by their reacting to the intervening ones.

In computer analysis of rating responses, Button #1 is arbitrarily chosen not as a standard but as a reference point in scoring. A student's response is scored in terms of the number of steps it is away from "1." "Four" is the maximum score for any one question. (See Fig. 6.) If Button #3 represents "neutrality," as in the case above, then a score of "2" means "neutrality." The farther up a score is above "2," the stronger the tendency towards "total agreement." On the other hand, the lower down a score is below "2," the stronger the tendency towards "total disagreement."

However, if a scale represents linear progression from zero to maximum (e.g., "1" = "Useless"; "2" = "A bit useful"; "3" = "Moderately useful"; "4" = "Quite useful"; and "5" = "Very useful"), then the higher the score, the stronger the tendency towards the upper end of the scale.
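The scoring rule just described reduces to subtracting one from the button number. A minimal sketch (illustrative only; the actual ANAL 2 program was one of those furnished by GE):

```python
# Illustrative sketch of the rating-scale scoring rule described above:
# a response is scored as the number of steps away from Button #1, so
# Button 1 -> 0, Button 3 -> 2, Button 5 -> 4 (the maximum per question).

def rating_score(button):
    """Score a response on the five-step scale as steps away from "1"."""
    return button - 1

def student_mean(buttons):
    """Mean score over all questions answered by one student."""
    return sum(rating_score(b) for b in buttons) / len(buttons)

print(rating_score(3))             # 2, "neutrality" on the agreement scale
print(student_mean([5, 4, 2, 1]))  # 2.0
```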


GENERAL INFORMATION

19 QUESTIONS SCORED
26 STUDENTS PRESENT
CLASS MEAN = 2.02

PROBLEM ANALYSIS

PROBLEM        RESPONSE COUNT       TOTAL   MEAN
NO.         0   1   2   3   4   5   SCORE   SCORE
1           0   4   1   1   4  16     79    3.04
2           0   9  10   3   3   1     29    1.12
3           0   0   2   0  14  10     84    3.23
4           1   7   5   3   6   4     45    1.80
5           1  19   4   1   0   2     14    0.54
6           0   4   7   5   6   4     51    1.96
...
18          0   0   1   4   7  14     86    3.31
19          0  12   5   5   2   2     29    1.12

STUDENT ANALYSIS

RANK ORDER   SEAT NO.   TOTAL SCORE   MEAN SCORE   T-SCORE
1              15           53           2.79       64.28
2              14           49           2.58       61.13
3              25           48           2.53       60.34
3               3           48           2.53       60.34
...

Output of Computer Program "ANAL 2" Analyzing Responses on a Rating Scale. SCORE = No. of steps away from "1." A response on #3 on the scale is scored as "2" (2 steps from "1"). A response on #5 on the scale is scored as "4," which is the maximum score for any one question. In PROBLEM ANALYSIS, TOTAL SCORE is the sum of the scores of all the students on that problem, and MEAN SCORE is obtained by dividing TOTAL SCORE by the number of students. In STUDENT ANALYSIS, TOTAL SCORE is the sum of the scores on all problems achieved by that student, and MEAN SCORE is obtained by dividing TOTAL SCORE by the number of problems.

Figure 6. ANAL 2 OUTPUT

2. Other examples of this category of use are: (a) A Biology professor generated class discussion of an assigned reading by asking students to express through the SRS their agreement or disagreement with the general point of view of the author as well as with specific statements he made. (b) After using the SRS in the teaching of a class or course, the instructor asked the students to rate on a five-point scale the usefulness of the Response System in that particular class or course. (c) A group of students interested in admission procedures of the College was asked to express the degree of agreement or disagreement with the hypothetical proposition that Skidmore henceforth cease using SAT-Verbal and SAT-Math scores in considering an applicant for admission.

D. Using the SRS to Establish Priorities

1. Studying priorities in value systems of dif-ferent societies. A professor of Asian studies asked

Page 11: Study and Evaluation of the Student Response System in

his students, who happened to be all women, each to give him a list of several factors or values she would look for in selecting a prospective mate. From these lists he made a composite list of 22 factors. The first step was to find the five top values for the class as a whole. The students were asked to examine the list of 22 factors and each to select five factors most important for herself. As each factor was announced by the instructor, each student pressed either Button #5 to indicate it was among her top five values or Button #1 to say it was not, but never used the in-between buttons. From the immediately available printout of raw data, the instructor could readily select the five factors with the most "5's" and the least "1's." The second step was to establish an order of priorities among the top five. Students were asked to assign mentally one of the five buttons to each of the five factors, with no button used more than once and with "5" representing the highest priority and "1" the lowest among the top five. Then, as each factor was announced, students pressed the appropriate button. (For computer calculation of weighted scores, see Fig. 7.) The results were: first, love; second, intelligence and education; third, compatibility; fourth, integrity; and fifth, sense of humor.

QUESTION            NO.   RESPONSE COUNT      WEIGHTED   MEAN
                          0  1  2  3  4  5     SCORE     SCORE

Western Marriage:
Compatibility        1    0  1  5  5  6  1       55       3.06
Humor                2    0  9  5  2  2  0       33       1.83
Integrity            3    0  3  7  2  5  1       48       2.67
Intell. & Educ.      4    0  3  0  7  4  4       60       3.33
Love                 5    0  3  0  0  2 13       76       4.22

Chinese Marriage:
Loyalty              6    0  3  5  4  5  1       50       2.78
Health & Fert.       7    0  1  2  5  5  5       65       3.61
Humility & Obed.     8    0  6  2  2  2  6       54       3.00
Love of Children     9    0  3  9  3  1  2       44       2.44
Comp. Fam. Bkgrd.   10    0  5  0  3  6  4       58       3.22

Output of Computer Program "TXSCALE" Establishing Priorities Among Items. "Weighted Score" is obtained by multiplying each button value by the number of persons choosing that button and then summing up the products. E.g., for "Compatibility" it is (0x0) + (1x1) + (5x2) + (5x3) + (6x4) + (1x5) = 55. "Mean Score" is obtained by dividing 55 by the number of students participating (18). Since Button #5 was used to represent the highest priority, the larger the Mean Score, the higher the priority.

Figure 7. TXSCALE OUTPUT
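The "Weighted Score" arithmetic in the caption of Fig. 7 can be sketched as follows, using the Compatibility row as data. This is a sketch of the calculation only, not the TXSCALE program itself:

```python
# TXSCALE-style weighted score: multiply each button value by the number
# of students choosing that button, then sum the products.
def weighted_score(counts):
    # counts[i] = number of students pressing button i (index 0 = no response)
    return sum(button * n for button, n in enumerate(counts))

# "Compatibility" row from Fig. 7, buttons 0-5:
compatibility = [0, 1, 5, 5, 6, 1]
total = weighted_score(compatibility)   # (0x0)+(1x1)+(5x2)+(5x3)+(6x4)+(1x5) = 55
mean = total / sum(compatibility)       # 55 / 18 students
print(total, round(mean, 2))            # prints: 55 3.06
```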

Next he reminded the class of its earlier study of the traditional family system in China, under which marriage was not a personal but a family affair, and as such, it was arranged by parents with the primary objective of producing offspring. The class was asked to suggest a list of values Chinese parents would look for in a prospective daughter-in-law. Again, through the use of the SRS, the top five values and their priorities were ascertained. They were: first, health and fertility; second, comparable family background; third, humility and obedience to parents-in-law; fourth, loyalty to family; and fifth, love of children. It became obvious to the class that traditional


Chinese and contemporary Western marriages had very different value systems. None of the top five Chinese values was found in the composite list of 22 Western values.

It was pointed out to the class that marriage was merely a single illustration of a general principle very important in the study of foreign cultures and societies. What is this principle? Class discussion concluded with this generalization: each society or culture has its own value system appropriate for each of its institutions; you cannot impose the value system of one society upon another unless you change the latter society. The lesson was a good example of having students perform a demonstration of an abstract principle. The SRS with its capability of instant feedback made it possible to compress the whole teaching-learning process into a 50-minute period, with no tedious calculations on the part of the instructor or students.

2. Other examples of this category of use were: (a) In a class in Dance Appreciation, students were shown a film of two Chinese, two Korean, and one Japanese dance and asked to arrange the five dances in descending order of personal likes and explain why. (b) A group of college admissions officers arranged in order of priorities five specific factors governing the admission of applicants.

VI. COMBINED USES OF THE STUDENT RESPONSE SYSTEM IN CLASS

To recapitulate, the above illustrative uses of the SRS are classified into four categories according to the nature of the questions used:

A. Using the SRS to teach understanding, knowledge, and skills by asking questions having correct answers.

B. Using the SRS to conduct an opinion poll or to express preferences by asking questions having no standard answers.

C. Using the SRS as a rating scale to explore attitudes and feelings.

D. Using the SRS to establish priorities.

In any one instructional period, the instructor may use any combination of categories to suit his teaching objective. However, if he wants computer analysis of student responses to be done after class, then the records of responses to different kinds of questions must be separated before transmission to the computer, because each type of question calls for a different computer analysis program. (See Figs. 4 through 7.)

VII. USES OF COMPUTER ANALYSES

For many faculty users of the SRS, the information gained from the percentage meters on the instructor's panel was sufficient for feedback purposes. The instructor would not know which student responded in what way, but he could readily see what percentage


of the class reacted in what way. This is the easiest and the commonest way of using the Response System. Since this information is instantaneously available, it tends to affect the teaching-learning process immediately. This use is important both in teaching basic knowledge and in directing discussion.

Some instructors wanted, in addition to information from the percentage meters, a recording of student responses in the form of a printout on the teletypewriter available almost immediately. It takes only five seconds to record and print out the responses of 40 students as well as that of the instructor to one question. This information is more exact than the readings of percentage meters since one can tell from the printout how the student at each occupied seat responded. (See Fig. 3.) If the printout is posted after class with a list of the questions used and if a student remembers where he sat, he can compare his own responses with those of the instructor and those of his fellow students.

Other instructors definitely wanted computer analysis and regarded the step of recording as incidental to the step of creating a paper tape with the responses encoded on it, by means of which the data could be transmitted through a telephone wire to a distant computer for analysis according to the program selected. Since we had access to the computer in Rome, New York, only in the afternoon of a weekday, the analysis could be made available to the instructor on the day of his class or in the following morning, depending on the hour of his class.

Since the wishes of the faculty users varied in the above way, we found it useful to organize our written instructions on "How to Operate the Response System" into three parts: Part I, How to Operate the Percentage Meters; Part II, How to Obtain a Recording; and Part III, What to Do to Get a Computer Analysis. Part I was included in Part II, and Parts I and II were included in Part III. Each user could pursue the instructions only as far as his interest went. As a user gained confidence in using the System in a simple way, he could move on to a more complex way. This avoided the danger of "frightening" or confusing a new user with a lot of complicated instructions given in one shot. In briefing faculty members, we had to emphasize frequently how easy it was to operate the System.

As to computer analysis, there were four programs to take care of the four major uses of the Response System as outlined in the preceding two sections and a fifth one for changing the format of data. There were other programs to treat data collected by the System in the Free Mode. (See section following.) All programs, except one, for analysis of SRS data were created and furnished to us by General Electric. The exception, called "OPINION," was created by our own staff. Following are some common uses of various programs:

1. ANAL 1: This program scored student responses according to a standard key supplied by the


instructor or correct answers entered by him through the instructor's panel. (See Fig. 4.) Its greatest usefulness was in connection with giving tests or quizzes, in which every question had one and only one correct answer, whether for the purpose of generating discussion or review or for grading students. It yielded statistical information on the performance of the class as a whole as well as that of individual students. A feature of the program not illustrated in Fig. 4 was the capability to make individual diagnosis by instructing the computer to conduct as many "passes" over the data as there were sub-areas of the test, yielding for each student a sub-score for each area as well as a total score for the whole test.

ANAL 1 was also found very useful for item analysis of a test. The section of the output called "Problem Analysis" showed what percentage of the class had answered each question correctly, and from the "Response Count" for each question the instructor could see not only the number of students getting the correct answer but also the most "popular" errors. Such item analysis enabled the instructor to improve his test and his technique of testing.
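The item analysis just described is, in essence, a per-question tally of responses and a percent-correct figure. A minimal sketch under invented data (the key, class size, and answers are hypothetical; ANAL 1's actual output format is that shown in Fig. 4):

```python
# Item analysis in the spirit of ANAL 1's "Problem Analysis": for each
# question, count the responses per option and compute percent correct.
from collections import Counter

key = {1: 3, 2: 5}          # hypothetical answer key: correct button per question
answers = {                 # hypothetical responses: question -> students' buttons
    1: [3, 3, 2, 3, 4],
    2: [5, 1, 5, 5, 5],
}

for q in sorted(answers):
    counts = Counter(answers[q])                   # "Response Count" per option
    pct = 100 * counts[key[q]] / len(answers[q])   # percent answering correctly
    print(q, dict(counts), pct)
```

The per-option counts make the most "popular" wrong answer stand out at a glance, which is the property the report credits with improving the instructor's tests.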

2. OPINION: This program was very useful in analyzing student opinions on questions and issues having no standard answers. It simply gave the number and the percentage of students choosing each option under each question and left it to the instructor and students to interpret the significance. (See Fig. 5.) Though the Response Count in ANAL 1 gave the number of students choosing each option under each question and so could also be used to study the distribution of student opinions, many instructors balked at the use of ANAL 1 for this purpose, because the program presumed one correct answer for each question. So OPINION was created to tabulate opinions from a neutral or open-minded point of view. It also had the advantage of giving both numbers and percentages of students under each option.

A new use of this program has been planned. The author of a new text in General Psychology will let a class use the book in manuscript form. Every major paragraph will be numbered. At the beginning of each class period, the students will be asked to spend a few minutes evaluating each numbered paragraph in the chapter assigned for that class period by answering a question such as the following:

"How would you characterize this paragraph?

"(1) Meaning not clear

"(2) A concrete example would help

"(3) Language too involved or difficult

"(4) Content uninteresting or not relevant

"(5) Satisfactory"

On the basis of student comments the author will revise the manuscript.

OPINION was tailored for data on opinions entered through the SRS in a class situation. For such data


entered through the Response System in the Free Mode (see next section), there was a program, called "NLSCORE," which accomplished the same purpose.

3. ANAL 2: This program analyzed student responses when the five buttons of the Input Station were used as a five-step scale for rating purposes. (See Fig. 6.) It was found very useful in all sorts of student evaluations of instructional materials, courses, and instructors. It was also frequently used to stimulate class discussion by allowing students to express candidly their agreement or disagreement with what was said by the instructor or the textbook, etc. The SRS encouraged 100% student participation.

4. TXSCALE: This program yielded weighted scores for various items to be arranged in order of priority in the judgment of a class. (See Fig. 7.) Since priorities are usually subjective, this was another way of studying student opinions.

5. LSTOMAN: This program was devised to surmount the limitation of ANAL 1 and ANAL 2 to responses from a single class or section of students, a limitation caused by the identification of students by seat numbers. When students from two or more sections of the same course responded to the same questions, we used LSTOMAN to combine the data, inserting section numbers to differentiate students who occupied the same seats but in different sections. The combined data could then be processed by one of the ANAL programs.

In summary, the main point to be emphasized in discussing the use of computer analyses is that analysis by computer saved the instructor a great deal of time and drudgery, and hence tended to encourage him to obtain such analyses more frequently than otherwise, thus hopefully resulting in improvement of instruction.

VIII. INDIVIDUALIZED USES OF THE RESPONSE SYSTEM

The above ways of using the SRS illustrate its operation in the Group Mode, i.e., in class instruction. The System can also operate in the Free Mode, which facilitates individualized instruction and independent study. Students working with individualized instructional material, whether "programmed" or otherwise, may come to the Response System classroom at any time and enter their responses to questions through the SRS. Different students working on material in different courses may work at the same time in the classroom, each at his own pace. Their responses will be continuously scanned and recorded on a teletypewriter in mixed-up order. ID numbers for students and courses will necessarily be used so that the computer may sort out the student responses by courses, by students in each course, and by questions under each student's ID. After sorting, the computer will score the responses and analyze the results in a manner specified


by the analysis program selected. To test this capability of the SRS, a few examinations in Biology, Psychology, etc. were administered through the System operating in the Free Mode. At first, we experienced great difficulty in having Free Mode data processed by the computer we regularly used, but when we shifted to the computer of the GE Research and Development Center for the processing of such data, the results were very satisfactory.
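The Free Mode sorting step described above can be sketched as follows. The record layout is hypothetical (the report does not give the actual teletype format), but the grouping by course, by student, and by question is the operation described:

```python
# Free Mode responses arrive on the teletypewriter in mixed-up order.
# Hypothetical record shape: (course_id, student_id, question_no, button).
from collections import defaultdict

raw = [
    (101, 7, 1, 3), (102, 4, 1, 5), (101, 7, 2, 2),
    (101, 9, 1, 4), (102, 4, 2, 1), (101, 9, 2, 5),
]

# Sort the jumbled records by course, by student within each course,
# and by question under each student's ID, ready for ANAL-style scoring.
by_course = defaultdict(lambda: defaultdict(dict))
for course, student, question, button in raw:
    by_course[course][student][question] = button

print(by_course[101][7])   # prints: {1: 3, 2: 2}
```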

During the spring semester of 1970-71 the Department of English ran an experiment to compare the effectiveness of three different approaches to the teaching of Freshman Composition. It involved 11 sections of this course, with a total enrolment of 191 students. The results of the pre-test and post-test were processed by the combined use of the SRS and the computer.

The SRS is also a convenience to students or faculty doing independent research projects involving quantitative studies. For example, questionnaire returns can be entered through the SRS, provided there are no more than five options under each question and it is understood that any open-ended questions with written-in answers will have to be scored by hand. There is a computer analysis program which will give, on a printout, the number of respondents choosing each option under each question and its corresponding percentage in terms of the total number of people answering that question. This particular analysis program, called "NLSCORE," can handle questionnaires with up to 99 questions and up to 199 respondents.

Several faculty committee questionnaires concerning curriculum questions, one involving as many as 600 returns, were processed in the above way. This saved the committees a lot of tedious work otherwise involved in manual calculation. Again, more than a dozen questionnaire studies conducted as individual projects by students in Business and Nursing were processed, with the students learning the techniques of processing.

There are, of course, other forms of quantitative studies compatible with treatment by the SRS. The point to be emphasized here is that the researcher, whether student or faculty, can be spared from time-consuming and tedious tabulations and calculations and so focus his main attention on analyzing his research problem, formulating his design, collecting his data, thinking through the procedure of treating his data, and, when the results of analysis are delivered by the computer, interpreting the results and drawing his conclusions--precisely those steps in his project that are most educational and least mechanical. The time required for completion of work is shortened immeasurably by the computer.

Plans have been laid to encourage faculty members to produce material for individualized instruction either in connection with their courses or as independent study projects. A group of ten professors from



as many academic disciplines has agreed to author instructional material, accompanied by appropriate media, each from his own disciplinary point of view but all focused on a common theme. Students will be encouraged to achieve some degree of integration based on their exposure to multidisciplinary materials. The Response System will probably be of use in monitoring student responses as feedback information to the student to help him improve his learning and to the faculty writers to enable them to improve their instructional materials. But all this is in the future.

IX. EXTENT OF USAGE

Just how much use did the Student Response System get and in what kinds of courses? Courses were scheduled to meet in the SRS classroom at the request of the instructors concerned. In order to invite faculty requests, general announcements were made to the entire faculty, at least once a semester, urging them to inspect the facility and decide for themselves whether they could usefully employ it in their courses. Surprisingly, such general announcements brought forth little interest. This fact seemed a very interesting commentary on the present general feeling among liberal arts college teachers towards machines and technology.

Accordingly, a second approach was employed. "Hopeful prospects" in the faculty were individually invited to come for a personal briefing in regard to the operation of the System and its potential uses in undergraduate teaching. Each decided for himself whether he would like to try it. In the one and a half years the SRS has been operating, the number of faculty members thus briefed exceeded 70, which amounted to nearly 50% of the whole faculty. Recently, two academic departments requested briefings and demonstrations for their departmental staff. Such leadership tends to encourage faculty use.

As a result of these briefings, 29 different faculty members (about 40% of those briefed) each decided to and did hold one of his classes in the SRS classroom. They taught in this room a total of 35 different courses given to 45 different classes, distributed as follows:

                                      Courses   Classes

Mathematics, Biology,
  Chemistry, Geology                     11        16
Psychology                                5         6
Economics, Government,
  Sociology, History                      5         5
English, Asian Studies,
  Dance, Art                              8
Business, Elementary
  Education, Physical Educ.               6

TOTAL                                    35        45


The average enrolment in these 45 classes was over 15 students each. This would mean no less than 675 students, representing about 37% of the total student body, have participated in the use of the SRS in the last one and a half academic years.

The extent of use among these 45 classes varied widely, ranging from several class periods through a few weeks to steady use throughout the semester. Another way to indicate the amount of use is that during the first academic year the average number of class periods per week, in which classes were conducted in the SRS classroom, was 10, averaging two classes per week day. During the fall semester and winter term of the second year, the average increased to over 15 class periods per week or 3 per week day, representing a 50% increase in use.

Of the 29 faculty users of the SRS, thirteen were "steady" users in the sense that all meetings of their classes took place in the special classroom throughout the duration of the course. The other sixteen were occasional or short-term users. Of the total number of faculty users (29), fourteen were "repeaters" in the sense that, after their first experience with the SRS, they either taught another course using it or the same course to a different class.

Some of the "non-repeaters" first tried out the System in the fall of the second year, and at the time of the writing of this report, they had not yet had an opportunity to repeat. A few had the opportunity but did not choose to repeat.

When all the data on use are considered, we cannot say that the SRS achieved instant popularity. Yet the GE representative cooperating with Skidmore in the project told us that we were doing "fantastically" well, considering the size of our institution as well as the fact that we relied chiefly on voluntary efforts of faculty members in developing instructional material for SRS teaching.*

The reasons for a relatively small start are: (1) college faculty anywhere are usually cautious in adopting changes in curriculum and methods; (2) many instructors are more subject-matter-centered than student-centered in their approach to the problem of college teaching; (3) some instructors are far more concerned with the highly sophisticated outcomes of humanistic learning, which the Response System cannot bring out but which require the writing of essays, or constructed answers to questions, or other complex modes of expression; and (4) Skidmore is in transition from its old campus to a gradually built-up new campus. During the period under review about two-thirds of its classes were held on the New Campus, while the remainder was still conducted on

*For a more detailed discussion of this point, see Question 8 under "X. Evaluation of the Response System."


the old with a class schedule half an hour behind that of the New Campus in order to permit students to commute between the two locations. As a consequence, classes scheduled for the old campus could not have short-term use of the SRS classroom on the New Campus. (5) The capacity of this special classroom being 40 seats precluded very large classes from using the classroom, even if the instructors had so desired.**

X. EVALUATION OF THE RESPONSE SYSTEM

In our original application to the National Science Foundation, it was stated in reference to evaluation: "Evaluation should aim primarily at finding out how many different ways the System can be used to promote 'creative' teaching, in what academic disciplines, and for what educational objectives. ... Evaluations should be specific and relevant to each way it (the System) is used. ...the instructor and his students should rate the System to indicate the extent to which the facility has been helpful in achieving their educational objectives... However, it should be clearly understood that the evaluation... will be mainly qualitative. Statistics will receive a secondary emphasis. ..."

In accord with the above statement, our evaluation will be qualitative and analytical rather than statistical. Three sources of information will serve as the

basis for evaluation: (a) data on usage; (b) faculty and student ratings; and (c) opinions of faculty users obtained by individual interviews and in group discussion. The greatest importance is attached to the third source, for there we try to go behind and beyond numerical figures and to interpret our total experience with the Response System. We shall organize our observations around certain questions, frequently raised by visitors, and around the evaluative remarks made by our faculty users.

1. Is the SRS Better Fitted for the Teaching of Some Academic Disciplines Than Others?

Our data on usage showed that the SRS was used in 35 different courses belonging to 15 different academic disciplines representing the humanities, mathematics and sciences, social sciences, and professional fields. The question seems to imply that the Response System, permitting student response only to objective-type questions, is perhaps better suited to the teaching of the so-called highly objective disciplines such as mathematics and the sciences. But this view overlooks the possibility of using the System to generate class discussion of opinion issues in the less exact disciplines. Also, the nature of any academic discipline and the nature of the teaching-learning process are both so complex that this dichotomy into "objective" and "subjective" disciplines is false. Every

**Very small classes with less than 10 students each would present no problem, as the System could be specially calibrated to take care of small numbers.


discipline has some objective content and some subjective elements. It is our considered opinion that theoretically any instructor in any discipline, possibly with the exception of courses primarily concerned with the development of performance skills, could find some use for the SRS in certain aspects of his teaching, if he chose to do so. Faculty users are among those who are aware of the importance of the role of student responses in the teaching-learning process. So the extent of use varied with the instructor rather than with the discipline.

2. Is the SRS Useful Primarily for the Purpose of Checking Student Understanding of Lectures and Administering Objective Examinations?

As detailed in the section on "Uses of the Student Response System at Skidmore" above, our faculty users have emphasized a variety of ways of using the System to encourage a more active participation in the learning process on the part of the students and a greater adaptation of the teaching process to student interests and needs on the part of the instructor. The uses departed in varying degrees from the usual lecture-test pattern of instruction.

3. In the Opinion of the Faculty Users and the Students in Their Classes, Was the Response System Useful for the Purposes They Had in Mind?

We requested every instructor to ask his students, at the end of a period of use, to answer anonymously the following question through the SRS, and we asked that he himself also answer it:

"How useful has the Response System been to the teaching-learning process as conducted in this classroom in this course? Rate its usefulness according to the following scale:

1  Useless
2  A bit useful
3  Moderately useful
4  Quite useful
5  Very useful"

Our records are not complete since some instructors did not have time or did not choose to participate in the evaluation. Nevertheless, ratings were obtained from over 64% of the classes. We shall resist the temptation to give an average numerical figure as a statistically precise index of usefulness. The purpose for which the System was used, the duration of its use, and the number of students involved differed so widely that any average numerical index would be as meaningful as adding

apples, oranges, and grapes together and taking an average of them. For what they are worth, we may summarize faculty and student ratings of the SRS for their own purposes as follows:

(a) Faculty ratings turned in ranged from "2" (A bit useful) to "5" (Very useful), with the majority for "3" (Moderately useful), and 40% for "4" (Quite useful) and "5" combined. A very few faculty users


known to be critical of the SRS did not turn in ratings for themselves or their students. Had they done so, "1" (Useless) might have been their rating.

(b) As for student ratings, individual ratings ranged from "1" to "5," with the great majority for "3" (Moderately useful). Average student rating was computed for each reporting class. These class averages ranged from "2.1" to "4.7" with the median average rating at "3.3."

(c) In two-thirds of the classes with ratings, the instructors' ratings were somewhat higher than the students' average ratings. In only one class was the instructor's rating the same as the average student rating for that class.

4. In Teaching, What Can Be Done Through the SRS That Cannot Be Achieved Without It?

The Response System is uniquely useful:

(a) when anonymity of student response is essential. For example, in collecting information regarding drug use or sex practice among students, the instructor can assure them of anonymity by using only the percentage meters on the instructor's panel and by not recording. Seats are far enough apart and the buttons in the Input Station are close enough that even neighbors cannot easily observe how a student is responding. If he shields the Station with one hand while responding, he can be absolutely sure of privacy. If the instructor desires recording and subsequent computer analysis of data, then he should allow students to sit anywhere they like and promise not to take any special notice of who sits where. This use of the SRS is better than asking for written reports without a signature, because handwriting can sometimes be recognized. Anonymity also encourages students to express freely views that are apt to be unpopular with their classmates or unfavorable to the instructor;

(b) when 100% student participation in discussion is desired. Some feel that the use of the SRS, while justifiable by the instant feedback not otherwise available in a very large class, is really unnecessary in a small one. It is true that if an instructor maintains good rapport with a small class, there can be a free flow of dialogue between them. Yet, in such a situation very often a few bright or aggressive students tend to monopolize the dialogue with the instructor, the rest of the class just sitting back as silent spectators. With the use of the SRS, every student has to be active to the extent of making a decision on the question raised. After the responses are collected and noted, the instructor may indeed throw open the question for free discussion. Having committed themselves to a certain decision, even the formerly "silent" students would be more inclined to speak up for their positions. Hence, class discussion will be more widespread and the learning process will be more active for all students;


(c) when speedy recording of responses and quick computer analysis are desired. Needless to say, this use saves the instructor a lot of tedious work and therefore will probably encourage him to take more frequent soundings of student responses.

5. Doesn't the Operation of the Response System Tend to "Intervene" Between the Instructor and His Students?

After asking a question, the instructor has to push two or three buttons and the students have to press one before he can talk about their responses. This intervention is disliked by many students and is distracting to the instructor. His train of thought is interrupted by his operation of the machine. This negative feeling is so strong in the case of a few faculty users that they would not use the System again. There is no good solution to this problem. Each faculty user has to decide for himself the question of trading off this undesirable feature for the advantages of the System.

Recently, two faculty users who didn't care to handle machinery had one or two students in their classes trained in the operation of the SRS. At the beginning of each class period, a student would make ready the System. The instructor might on the spur of the moment use the SRS, and the student would operate it for him, or the instructor might never use it in that period. It does not hurt the equipment to be in readiness for any length of time. This solution tended to reduce the feeling of distraction.

6. Are There Any Inherent Weaknesses in the Response System?

One weakness is that it cannot accept a constructed answer to an open-ended question. Nothing can be done to overcome this. Another limitation is that the number of options under any question cannot exceed five, though it may be less than five, as in a true-false question. A partial solution is to use the fifth option of "Other alternatives." When an instructor sees a significant percentage of students choose the fifth option, he should ask students to air their own ideas. Another solution is to repeat a question, each time having up to five choices. Students who find their answers under the first appearance of the question will ignore its second appearance, or vice versa.

7. Shouldn't Students Have the Educational Experience of Asking Questions and Discovering Many Different Ways of Answering a Question, Instead of Having Questions and Multiple-Choice Answers Provided by the Teacher All the Time?

In the experience of some instructors, the multiple-choice answers under a question often sharpened the students' perception of the question, and in the ensuing discussion the students frequently gave more



diverse responses than when a question was asked in the traditional way.

However, the instructor certainly should not be the one to ask questions and provide alternative answers all the time. In a free-wheeling discussion, the questions, as well as the options under each question, may indeed be suggested by the students themselves. They may be written on the blackboard, obviating previous preparation, and then be responded to by the students through the SRS. For example, in a Dance Appreciation class, after seeing two films on Asian dances, students were asked to suggest important characteristics of dances in Asia insofar as they were illustrated in the films. The instructor listed their suggestions on the blackboard and then asked the students each to indicate through the SRS which characteristics were among the five most important in his own opinion, uninfluenced by others in the class. The results stimulated a lively discussion, in which the instructor helped the students distinguish the trivial from the essential.

8. Doesn't Pre-Programming of Questions Take a Lot of Time?

Yes, but hopefully some of the material developed may be of use again when the course is taught the next time. The more creative uses of the Response System do require much more of the instructor in time and effort: defining teaching goals, planning teaching procedures, pre-programming questions, preparing media, etc. While preplanning tends, in itself, to improve instruction, quite apart from the use of any physical facility, many faculty members feel, under pressure of time, that they cannot afford the extra time and effort involved. Ultimately, the question boils down to one of priorities in the use of instructor time. What does the instructor aim to accomplish in teaching? Does the use of the SRS to monitor student responses promptly tend to help him realize his goal? Is it worth the extra time and effort involved?

One way to solve this difficult problem is to get expert college teachers to develop software involving the use of the SRS in various disciplines. Just as the use of computers in undergraduate instruction did not get very far until software was created in various subjects of the curriculum, the same is true of the use of the Response System. Granted that college teachers are usually very reluctant to adopt whole courses worked out by others, this does not preclude the possibility of producing instructional materials in the form of a large number of small modular units on specific topics commonly found in a course for undergraduates, thus allowing the instructor to choose whatever units he likes to incorporate into his own course. A good example of this is the very large number of audio-tutorial units in Biology developed and published by the Burgess Publishing Company, except that those units are intended for individual study, whereas we are talking about units for class instruction through the SRS.


An instructor in Physical Education at Skidmore plans to write a unit on First Aid to be taught entirely through the Response System in a workshop. She thinks that teaching First Aid in the conventional way may be a bit boring, but that teaching it through the SRS will point up areas that need discussion and will result in more efficient learning in less time. While she is willing to make this extra effort on a voluntary basis, to get a large number of teachers to develop software in various disciplines, funds are needed to make possible the giving of "released time" to them or to compensate them for the use of their free time in the summer. Any rapid expansion of the use of the SRS would have to await this development.

Administrators must not think that once the equipment is bought, use of it in teaching will follow. Resources are needed for continuous faculty development, if continuous improvement of instruction is the goal. One of the most important lessons we have learned from our project is that we did not plan adequately for faculty development. While we were highly gratified by so many faculty members making entirely voluntary efforts in developing units for SRS teaching, the use of the System could have been much wider had there been a systematic plan for providing faculty incentives.

9. Doesn't Pre-Programming Result in Inflexibility in Teaching Procedure and Create in the Student a Feeling of Being "Restricted" by the Instructor's Material?

It doesn't have to. The instructor should feel free to depart from a prepared list of questions and let the class discussion go in whatever direction seems desirable. The unused questions may be used some other time. Or, pre-program only one or two questions as a starter and let the next question grow out of the discussion of the preceding one, thus resulting in an open-ended teaching-learning process.

Sometimes it is wise to provide an opportunity for students to "let off steam" by having such options as "Cop out," "A stupid question," "I have a brilliant answer," etc. Oral discussion after entering responses through the SRS will, among other functions, satisfy students who feel the need for vocal response.

10. Does the Use of the Response System Always Involve Media Preparation?

The instructional material to which students are to respond may be presented by the projection of films, slides, or transparencies on an overhead projector, by playing tapes, by distribution of mimeographed material, or by staging an activity in process, such as a demonstration, a debate, etc. But none of these things is required. A simple question may be presented orally in class or written on the board. Whether an instructor is to go in for audiovisual or other technological aids is entirely up to him.


11. Do Students Feel at Ease in Using the Response System?

We have not studied the problem, but some students have remarked on their feeling of "too much competitiveness." Such a feeling may arise in some students, perhaps because they can see their own success or failure in a much more clearly defined way than in the vague self-estimates made without the use of the SRS. Is this good or bad for learning? Any feeling of competitiveness may be lessened by providing strict anonymity and by not using the SRS for grading purposes.

An instructor pointed out that long-term student users of the Response System might differ in their feelings from occasional users. He sensed that some students were a little uneasy in using the System at the beginning of the semester but became perfectly at ease by the end of the term. This observation was in accord with a study made by Beth Sulzer at Southern Illinois University, who at the end of a semester of Educational Psychology asked her class of 322 students to answer the question, "How do you like the student response system?" on a five-point scale from "Favorable" to "Unfavorable." Over 50% of the class responded with "1" or "2."*

12. Is There Any Special Seating Arrangement in a Response System Classroom?

The seats in our classroom are arranged in five rows of eight each, facing the instructor. The instructor's panel is mounted on a fairly large lectern. We chose this arrangement because it would accommodate the greatest number of seats within a limited space. Some of our faculty users have criticized this seating arrangement as too conventional and the lectern as too formal. In these days of informality, irregularity and casualness are preferred.†

13. What Is the Cost of a Student Response System?

Naturally, it will vary with the number of seats involved, among other factors. It would not be useful to mention the price we paid for our system, for the equipment was specially manufactured for Skidmore, not a product of assembly-line production, and the price is several years old. Suffice it to say that the magnitude of the cost was of approximately the same order as that of an electronic language laboratory having the same number of seats.

The GE Student Response System 1000C is fairlyrugged, and service calls have been infrequent.

14. Is the Use of the Computer in Connection with the SRS an Economical One?

In the use of the SRS, students at no time come into direct contact with the computer. The System collects and records student responses, and when these are fed through a terminal into a computer, the latter will analyze the data according to the analysis program selected. Such a program may yield individual analysis as well as group analysis. The computing involves only one terminal on line with a remote computer in a time-sharing system. As compared with CAI (Computer-Assisted Instruction), which requires at any one time a separate terminal and a separate port into the computer for every student, the SRS use of the computer is far more economical. Of course, it does not have the advantage of "immediate turn-about" that CAI has. The mode of computer use in connection with the SRS may be called "Computer-Analyzed Instruction," or "CAnI."
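The kind of "group analysis" such a program produces can be sketched in a few lines. The record format and function name below are invented for illustration; the actual Skidmore/GE analysis programs are not described at this level of detail in this report. The sketch simply tallies, for each question, what percentage of the class pressed each of the five options.

```python
# Hypothetical sketch of a CAnI-style "group analysis": for each question,
# the percentage of students choosing each option (1-5). The record format
# below is invented for illustration only.
from collections import Counter

# Each record: (question number, option pressed by one student)
responses = [
    (1, 3), (1, 3), (1, 2), (1, 3), (1, 5),
    (2, 1), (2, 1), (2, 4), (2, 1), (2, 2),
]

def group_analysis(records):
    """Return {question: {option: percent of respondents choosing it}}."""
    by_question = {}
    for question, option in records:
        by_question.setdefault(question, Counter())[option] += 1
    return {
        q: {opt: 100.0 * n / sum(counts.values())
            for opt, n in sorted(counts.items())}
        for q, counts in by_question.items()
    }

print(group_analysis(responses))
# For question 1 above: option 3 chosen by 60% of the class,
# options 2 and 5 by 20% each.
```

An individual analysis would group the same records by student rather than by question; either way, only one terminal feeds the accumulated data to the remote computer.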


XI. A SMALL STATISTICAL EVALUATION

Despite our de-emphasis of statistical evaluation for our purposes, we are going to present a small study of this nature in a specific instance, with no attempt at broad generalization. Two sections of a course in General Psychology were taught by the same instructor, studied the same textbook and substantially the same subject matter, and at the end of the semester took the identical final examination. This examination was of the multiple-choice type. One section was taught in the fall of 1970 in a conventional classroom, while the other was taught in the fall of 1971 using the Response System regularly throughout the semester. The idea of comparing these two sections was more or less an afterthought, so there was no preplanning to make the two sections equal in all essential respects. Registrations for the sections were uncontrolled. Neither the students nor the instructor was aware of the comparison. The instructor was enthusiastic about the use of the SRS, and most of the students who had such use assigned a high rating to its usefulness.

We used SAT-Verbal and SAT-Math scores, English Composition score, "Converted Rank" in high school, and Grade-Point Average at Skidmore

*Beth Sulzer, Teaching Educational Psychology with the Aid of a Computerized Student Response System, 1968, p. 6.

†Conceivably, students could sit on the floor, each carrying an Input Station box with a long wire connecting it with the scanning machine in the rear projection room, and the instructor's panel could be mounted on the arm of a sofa chair or coffee table. Let your imagination go! But whatever arrangements you choose, make sure that it will be easy to insure privacy of response. If there are rising tiers of seats, students on a higher level should not be able to see how a student on a lower tier responds. Similarly, the instructor's panel should be so placed that students cannot easily see which button the instructor pushes when he puts in a correct response to a question. All of this is not, strictly speaking, pertinent to an evaluation of the Response System, but colleges contemplating its installation may want to take notice of our errors on the side of formality and conventionality.



as measures of scholastic aptitude and academic performance to show the extent to which the two sections were equal. Then we compared the Final Examination Scores in General Psychology (not semester grades in that course) of the two sections in order to discover whether there was any significant difference between them.

The following table summarizes the data:

TABLE 1

Showing the Characteristics of the Students in the Conventional Section and the SRS Section
and Their Final Examination Scores in General Psychology

                                 Conventional    SRS         Difference:
Measures                         Section         Section     SRS over or
                                                             under Conv'l

No. of Students*
  Freshman                       22 (54%)        7 (20%)     -15 (-34%)
  Sophomore                      12 (29%)        19 (54%)    +7 (+25%)
  Junior                         7 (17%)         9 (26%)     +2 (+9%)
  TOTAL                          41              35          -6

SAT-Verbal Scores
  Mean                           573             558         -15
  S.D.                           63              71          +8

SAT-Math Scores
  Mean                           538             542         +4
  S.D.                           68              74          +6

Engl. Comp. Scores
  Mean                           586             584         -2
  S.D.                           70              73          +3

Converted Rank in H.S.**
  Mean                           60              59          -1
  S.D.                                           8           -1

Grade-Point Averages
at Skidmore***
  Mean                           2.76            3.03        +.27
  S.D.                           .58             .50         -.08

Psy. Final Exam Scores
  Mean                           76.4            80.4        +4.2
  S.D.                           9               10          +1

*10 students from the conventional section and 5 students from the SRS section were excluded from the comparison for lack of complete data. Those in the comparison were all females.

**Conversion of Rank in High School into a two-digit figure, taking class size into consideration, followed a formula developed by the Validity Study Service of the College Entrance Examination Board.

***Grade-Point Average was not cumulative but only for the semester in which General Psychology was taken.

It will be noted that the two sections were not too dissimilar except in two respects. One was a less than 3% difference in SAT-Verbal scores in favor of the Conventional Section. The other was the fact that whereas 54% of the Conventional Section consisted of freshman students, in the SRS Section sophomores made up 54% of the class. An additional year of growth and general maturity might be an important factor.

The difference between the means of the two sections in Psychology Final Examination Scores was 4.2 in favor of the SRS Section. In all our statistical calculations, we assumed that the scores in the two sections were normally and independently distributed. The Standard Deviation of the difference between means was found to be 2.2, and the 95% Confidence Interval for this difference was 4.2 ± 4.3, or from -0.1 to 8.5. By hypothesis testing, we found that the probability of a difference this great or greater happening by chance alone was 0.0574. So we are 94% sure that there is a significant difference in the performance of the two sections.
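The interval arithmetic can be checked with a short script. This is a verification sketch, not part of the original study: it recomputes the 95% confidence interval and the two-sided probability from the summary figures reported (mean difference 4.2, standard deviation of the difference 2.2), under the report's own normality assumption.

```python
# Verification sketch (not part of the original study): recompute the
# 95% confidence interval and the two-sided p-value from the report's
# summary figures, assuming a normal distribution for the difference.
from math import erf, sqrt

mean_diff = 4.2   # SRS section minus Conventional section, final exam means
sd_diff = 2.2     # standard deviation of the difference between means

z_crit = 1.96     # two-sided 95% critical value for the normal distribution
ci_low = mean_diff - z_crit * sd_diff
ci_high = mean_diff + z_crit * sd_diff

# Probability of a difference this large or larger in either direction
z = mean_diff / sd_diff
p_two_sided = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

print(round(ci_low, 1), round(ci_high, 1))  # -0.1 8.5, matching the report
print(round(p_two_sided, 3))                # about 0.056, near the reported 0.0574
```

The small discrepancy from the reported 0.0574 may reflect rounding in the summary statistics; either way, the conclusion of being roughly 94% sure is unchanged.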


Could the difference be explained by the preponderance of sophomores in the SRS Section? To test this hypothesis, only the sophomores in the two sections were compared. The data are summarized in the following table:

TABLE 2

Showing the Characteristics of Only Sophomore Students in the Conventional Section and the SRS Section
and Their Final Examination Scores in General Psychology

                                 Conventional    SRS         Difference:
Measures                         Section         Section     SRS over or
                                                             under Conv'l

No. of Sophomores                12              19          +7

SAT-Verbal Scores
  Mean                           584             578         -6
  S.D.                           59              64          +5

SAT-Math Scores
  Mean                           529             552         +23
  S.D.                           63              65          +2

Engl. Comp. Scores
  Mean                           566             603         +37
  S.D.                           57              57          0

Converted Rank in H.S.
  Mean                           60              61          +1
  S.D.                           9               7           -2

Grade-Point Averages
at Skidmore
  Mean                           2.66            3.15        +.49
  S.D.                           .51             .53         +.02

Psy. Final Exam Scores
  Mean                           77.8            82.0        +4.2
  S.D.                           10              10          0

This time there existed a less than 4% difference in SAT-Math Scores, a difference of about 6% in English Composition Scores, and an 18% difference in Grade-Point Average, all in favor of the SRS Section. The difference between the means of the two sections in Psychology Final Examination Scores was 4.2, which was exactly the same as the difference between the two total sections. The Standard Deviation of the difference between the means of the two sophomore groups, by the small-sample method, was 3.8, and the 95% Confidence Interval was 4.2 ± 7.77, or from -3.57 to 11.97. By hypothesis testing, we are less than 75% sure that there is a significant difference in the performance of these two groups of sophomores. The slight superiority of the sophomores in the SRS Section in academic ability, as indicated by their somewhat better mean scores in SAT-Math, English Composition, and Grade-Point Average, could at least partially explain some of the found difference between the means of the two groups. But at least the direction of the difference is in favor of the SRS group.

While a difference of 4.2 points between the mean scores of the two sections on the Psychology final examination in favor of the SRS Section was not large, we must bear in mind an inherent limitation of the "equivalent group comparison" technique, namely, that a common examination of the multiple-choice type had to be based largely on factual content material taught in common to both sections, and it ignored any innovative materials and methods used only in the experimental section, as well as the more subtle and subjective learnings that could have accompanied the innovative features of the teaching-learning process. If there had been, instead, many small tests on specific topics in the course, designed in such a way as to reveal not only mastery of factual content but also more sophisticated learning outcomes, the difference in the performance of the two sections might have been more meaningful, even if not any larger.

Statistical studies seldom yield unequivocal conclusions. This is true of this small study of ours also. Our findings are not out of line with the conclusions of three similar studies previously made by other investigators:

(1) Dr. Henry O. Thompson at Syracuse University taught one section of a course in Religion in the traditional way and another section using a much larger amount of visual material to replace some of his lectures and supplement others, as well as employing the Student Response System. At the end of the semester it was found:


"Achievement, as measured by examinations, did not vary significantly between the groups. This finding is not definitive, however, since the final exam was based on the common lecture material and did not attempt to test the outcomes of the enriched experiences of the experimental group."*

(2) Dr. Martha W. Bradley, in teaching a course in Education, used a manual response method to a visual-verbal mode of instruction in one section and the Student Response System in another section. A comparison of student achievement in the two sections led to the conclusion:

"The difference (in favor of the SRS Section) was significant (at the .01 level of confidence for the quizzes but not significant for the final examination)."†

(3) Dr. William P. Hillgartner compared the effects on student achievement of three treatments in three sections of a course in Teacher Education:

(a) Use of SRS for individual diagnosis plus individualized assignments.

(b) Use of SRS but no individualized assignments.

(c) No use of SRS and no individualized assignments.

He found:

"The results strongly suggest that the regular incorporation of a response system into a program of instruction tends to raise the level of achievement... When the response system is used to collect feedback upon which individual assignments may be based, the level of achievement tends to rise even higher."‡

XII. SUMMARY OF EVALUATION

A large majority of our faculty users have found the Response System either "moderately" or "quite" useful, in some cases "very" useful, for the following purposes:

1. Generating class discussion--breaking the ice, stimulating thought, preventing the degeneration of discussion into a lecture.

2. Collecting information on student opinions, feelings, attitudes, and other subjective aspects

*Center for Instructional Communications, Monograph Series, II: Instructional Systems Development. Syracuse University, 1968. Part III, p. 4.

†Ibid., Part III, p. 5. (Italics inserted by the present writer.)

‡Ibid., Part III, p. 6.


pervading the learning process, whether in non-science or science courses.

3. Student evaluation of instructional materials--films, slides, readings, research, courses, instructors, etc.

4. Monitoring student acquisition of basic knowledge--facts, concepts, procedures, etc.--whether gained from individual study or from class instruction.

5. Administering quizzes, tests, and examinations, whether for grading purposes or not.

6. Monitoring individualized study and analysis of quantitative data entered through the SRS in the Free Mode.

The chief limitation of the System is that it cannot accept constructed answers to open-ended questions. Hence, it cannot monitor the more sophisticated learning outcomes. However, even the most sophisticated learning has to rest on a foundation of basic knowledge of facts and concepts, which can be readily tapped by objective-type questions. If the instructor requires his students to assume greater responsibility for acquiring basic knowledge through individual study, this will free him from going over basic material in lectures, enable him to teach in class only what the students have failed to get for themselves, and allow him to devote more time to sophisticated and creative teaching apart from the SRS.

Obviously, it is unrealistic to expect any device to work perfectly in all situations. There is no panacea for the complicated task of college teaching. In the personal opinion of the Principal Investigator of this project, the outstanding virtue of the SRS is that it directs the attention of the instructor to student responses. Having associated himself with college teachers for over forty years in three different countries, he believes that many instructors, including himself, when they walk into the classroom, are preoccupied with the material they are going to lecture on. Seldom do they think of student responses to the material unless they intend to tell a joke. Yet psychological researchers have demonstrated that students learn, not directly from instructional materials, but from their own reactions to the teaching. This explains why different students often learn different things from the same material or the same teacher. The instant collection of the responses of all students and the speedy analysis of the data by computer make it impossible for the instructor to ignore this evidence. He doesn't wait days, weeks, or even months, until he gives a test, before he knows what the students have learned. With the evidence before him, he can continue to exploit his strengths, and where improvement is called for, he will probably try other ways of teaching. Thus, he is encouraged

continuously to improve his instruction to the point of his own satisfaction. To use a Chinese simile, "The scratch is applied to exactly where it itches."