
Computer Networks and ISDN Systems 30 (1998) 587–590

Doctoral Consortium Paper

Hypermedia learning environments limit access to information

Megan Quentin-Baxter
Faculty of Medicine Computing Centre, University of Newcastle upon Tyne, Newcastle upon Tyne NE2 4HH, UK

Abstract

Audit trails and questionnaires were used to evaluate students’ use of a highly interactive hypermedia learning environment. The learning material under investigation had similar functionality to a WWW interface and was composed of images and text combined with rollover and clickable maps, hypertext links and interactive questions. Every interaction a student made was logged in the audit trail, and a method of analysing the audit trails was applied to measure the amount of information accessed by each student. The most successful student accessed only 32% of the available information in 93 minutes, and each student studied different material from the others. Students overestimated how much information they had accessed from the total available, with those accessing the least overestimating comparatively more. The interactive strategy adopted by learners affected the amount of information accessed, and it was concluded that the increasing use of interactive hypermedia in teaching highlights the need for further research in order to ensure that some students are not systematically disadvantaged. © 1998 Published by Elsevier Science B.V. All rights reserved.

Keywords: Evaluation; Hypermedia; Browsing efficiency; Audit trails; Logging

1. Introduction

With the increasing use of the Internet for delivering “distance” or “open” learning, course designers and administrators need substantial ways to assure the quality of the education received by students. Evaluation of interactive hypermedia learning materials using logging, indexes and audit trails is extremely important for informing developers about the way in which people learn from electronic sources. Hypermedia rewards curiosity learning by providing relevant information behind highlighted links [6], at the expense of presenting a clear learning agenda and placing each new concept to be learned in the context of the last. Students can follow their own path through interconnected information, producing an intricate trail which is difficult to interpret or evaluate numerically. “Evaluation” studies are often confounded in their attempts to measure learning gain resulting from the use of computer-based teaching materials (for a fuller discussion see [1]).

Many studies have assessed the relative value of the teaching materials based on students’ qualitative responses to questionnaires, an approach supported by Wittrock [9], who indicated that student reports of “attention to learning” were a better predictor of learning achievement than time-on-task.

This study used questionnaires and audit trails to investigate how high school biology students accessed subject material embedded in a highly interactive hypermedia learning environment. The results indicated some of the factors affecting ability to access information, and illustrated how students conceptualised their own learning experience in relation to measured achievements. It did not extend to evaluating learning gain or associating learning strategy with learning style. This work was undertaken within a larger PhD study, completed in 1997, aimed at developing and evaluating a hypermedia computer-based learning package in biology. The package was written in an authoring system with features very similar in functionality to many of those currently available on the WWW.

0169-7552/98/$19.00 © 1998 Published by Elsevier Science B.V. All rights reserved. PII S0169-7552(98)00130-5

2. Method

Thirty-five learners used an interactive hypermedia package in order to learn as much as possible about the subject for up to two hours, and 22 of these used it again for a further hour on a separate occasion. The package was styled as an interactive textbook with many hypermedia links and a spatial map (for a discussion of the package see http://www.ncl.ac.uk/~nmqb/rats/). After each occasion each student completed a qualitative questionnaire asking for general details about themselves, and for their comments about aspects of the package, including estimates of the proportion of subject information that they had accessed. Each interaction a student undertook with the package was recorded, creating a complete audit trail (log) of the subject material accessed and preferred interactive methods. Teachers were questioned to obtain their views of how their students had fared.
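The per-interaction logging described above can be sketched as follows. This is a minimal illustration, not the package's actual record format: the field names, the `AuditEvent` type and the `log_event` helper are all assumptions made for the example.

```python
import time
from typing import NamedTuple

class AuditEvent(NamedTuple):
    """One logged interaction (illustrative fields; the paper does not
    specify the actual record layout used by the package)."""
    timestamp: float   # when the interaction occurred
    student_id: str    # which student produced it
    object_id: int     # which subject "object" was touched
    action: str        # e.g. "click", "rollover", "question", "error"

def log_event(trail: list, student_id: str, object_id: int, action: str) -> None:
    """Append a single interaction to the student's audit trail."""
    trail.append(AuditEvent(time.time(), student_id, object_id, action))

# A trail grows by one record per interaction, preserving order.
trail = []
log_event(trail, "s01", 42, "click")
log_event(trail, "s01", 42, "rollover")
```

Because every interaction is appended in order, the trail can later be replayed to recover both what was accessed and how (the student's preferred interactive method).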

Audit trails were quantified by counting the number of times each “object” (a question, clickable image or piece of text, such as hypertext or an annotation presented when the cursor rolled over an image) was accessed [8]. There were over 1200 “objects” of subject material which could be presented either in response to browsing activities or to stimulate the student to respond to questions. The method also quantified “error” interactions which occurred when further information or stimuli were not presented (such as clicking where no hypermedia links were available). Every object presented was either “new”, where the student had never seen that information or question before, or “repeated”, where they were revisiting material that they had previously accessed. Students in this study were permitted to stop using the package when they wanted, in order to investigate whether compensatory learning strategies were employed, such as “quicker” students stopping earlier. Investigation of moderator variables (such as gender or experience of computers) using statistical analysis on the unmodified audit trail data was not possible because each student spent a different amount of time with the package, and it was expected that the amount of information accessed would be positively correlated in a diminishing way with the time spent using the package.
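The quantification step above can be sketched as a single pass over a trail, tallying new, repeated and error interactions and the share of all subject objects seen. The event encoding and the function name are assumptions for illustration; the paper's own method is described in [8].

```python
def quantify_trail(events, total_objects=1200):
    """Tally an audit trail into new, repeated and error interactions,
    and compute the share of all subject objects the student accessed.
    `events` is a sequence of (object_id, kind) pairs, where kind is
    "object" for a presented object and "error" for an interaction
    that presented nothing (illustrative encoding)."""
    seen = set()
    new = repeated = errors = 0
    for object_id, kind in events:
        if kind == "error":
            errors += 1           # e.g. a click where no link existed
        elif object_id in seen:
            repeated += 1         # revisiting previously accessed material
        else:
            seen.add(object_id)
            new += 1              # first presentation of this object
    pct_accessed = 100.0 * len(seen) / total_objects
    return {"new": new, "repeated": repeated, "errors": errors,
            "pct_accessed": pct_accessed}

# Two distinct objects, one revisit, one dead click:
stats = quantify_trail([(1, "object"), (2, "object"), (1, "object"),
                        (None, "error")], total_objects=1200)
```

Keeping the set of seen object ids (rather than just counts) is what makes the between-student and between-occasion comparisons in the next step possible.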

The quantified audit trails were compared to identify the amount of material presented to each student, and what material was presented to students (a) in common with each other and (b) between their first and second occasion of using the package.
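The comparison just described reduces to set operations on the object identifiers each student (or each occasion) covered. A sketch, with hypothetical helper names:

```python
def common_material(trail_a, trail_b):
    """Objects presented to both students, or on both occasions of use.
    Each trail here is simply the set of object ids accessed."""
    return set(trail_a) & set(trail_b)

def coverage(trails, total_objects=1200):
    """Share of all subject objects covered by the group as a whole."""
    covered = set().union(*trails)
    return 100.0 * len(covered) / total_objects

# Toy data: two students overlapping on a single object.
a, b = {1, 2, 3}, {3, 4}
overlap = common_material(a, b)
group_pct = coverage([a, b], total_objects=10)
```

The union-based `coverage` is the kind of measure that can show a group collectively reaching a high percentage of the material even though each individual trail covers far less of it.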

3. Results

The 35 students in this study produced substantial audit trails which, when quantified, illustrated what material they had accessed and their preferred interactive strategy. The maximum amount of information presented to any individual was only 32% in 93 minutes (mean and standard deviation = 19 ± 8% within 73 ± 18 minutes; n = 28), which was subjectively considered to be inefficient as students were expected to have completed the package in this time. This lack of success was surprising to the teachers, as it contrasted with their impression of the students working successfully as a whole. Further analysis showed that, between them, the students accessed over 81% of the available information, indicating that they covered substantially different subject information to each other.

Student estimates of their own achievement were not correlated with the amount of information accessed (Fig. 1). They differentially overestimated the percentage of information, whereby students who had accessed comparatively less information from the package varied considerably in their estimates of achievement. Responses to a similar question indicated that the spatial map may have increased students’ appreciation of the depth and content of the material available.
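A lack of correlation of this kind can be checked with an ordinary Pearson coefficient over the measured and estimated percentages. The sketch below uses made-up sample values purely for illustration; the study's raw data are not reproduced here.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples, e.g.
    measured vs. self-estimated percentage of information accessed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up illustration: estimates that do not track the measured values.
measured = [10, 15, 20, 25, 30]
estimated = [60, 40, 70, 50, 55]
r = pearson_r(measured, estimated)
```

An r near zero with a large p-value, as reported in Fig. 1, is exactly what this pattern of uniformly high estimates over a range of low measured values produces.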

Some students adopted a browsing strategy while others preferred to respond to questions. Students who browsed accessed much more information than those who concentrated on undertaking questions. Students changed their interactive strategy from browsing to questions, which was estimated by examining the composition of the audit logs over time. Audit trails from students who used the package more than once were characterised by “revision” of material covered on the previous occasion. Over these […]

Fig. 1. Relationship between the measured and the estimated percentage of information accessed from the package after the first occasion (x-axis: total information accessed from program (%); n = 35, r = 0.07, p = 0.64).

Acknowledgements

[…] Metropolitan University) and Mrs. G.R. Goldsmith (School of Computing and Management Sciences, Sheffield Hallam University) for their continual encouragement, and the Lord Dowding Fund (NAVS) for their financial support of this work. Thanks also to Dr. G.R. Hammond and Dr. A.M. McDonald (University of Newcastle) for comments on drafts of this script.

References

[1] Clark, R.E., Evidence for confounding in computer-based instructional studies: analyzing the meta-analyses, Educational Communication and Technology Journal, 33(4): 249–262, 1985.
[2] Hammond, N. and J. Allinson, Extending hypertext for learning: an investigation of access and guidance tools, in: Proceedings of the British Computer Society HCI ’89, 1989, pp. 293–304.
[3] Kinzie, M.B. and H.J. Sullivan, Continuing motivation, learner control and CAI, Educational Technology Research and Development, 37(2): 5–14, 1989.
[4] Markle, S.M., Unchaining the slaves: discovery learning is not being told, British Journal of Educational Technology, 23(3): 222–237, 1992.
[5] Miller, G.A., The magical number seven, plus or minus two: some limits on our capacity for processing information, Psychological Review, 63: 81–97, 1956.
[6] Nelson, T.H., Managing immense storage, Byte, 225–238, January 1988.
[7] Nielsen, J., The art of navigating through hypertext, Communications of the ACM, 33: 297–310, 1990.
[8] Quentin-Baxter, M. and D.G. Dewhurst, A method for evaluating the efficiency of presenting information in a hypermedia environment, Computers and Education, 18: 178–182, 1992.
[9] Wittrock, M.C., Students’ thought processes, in: M.C. Wittrock (Ed.), Handbook of Research on Teaching, MacMillan, New York, NY, 1986, pp. 297–314.
[10] Yildiz, R. and M.J. Atkins, Evaluating multimedia applications, Computers and Education, 21(1/2): 133–139, 1993.