
Reflecting on the think-aloud method for evaluating e-learning


© 2005 The Authors. Journal compilation © 2006 British Educational Communications and Technology Agency. Published by Blackwell Publishing, 9600 Garsington Road, Oxford OX4 2DQ, UK and 350 Main Street, Malden, MA 02148, USA.

British Journal of Educational Technology Vol 37 No 1 2006

45–54. doi:10.1111/j.1467-8535.2005.00521.x



Deborah Cotton and Karen Gresty

Deborah Cotton has a DPhil in Education, and her research interests include e-learning, environmental education, and teaching controversial issues. She has a particular interest in developing educational research methodologies. Karen Gresty is senior lecturer in the School of Biological Sciences. Her research interests are e-support for healthcare students in biosciences and public understanding of science. Address for correspondence: Deborah Cotton, Educational Development and Learning Technologies, University of Plymouth, Drake Circus, Plymouth, Devon, PL4 8AA, UK. Email: [email protected]

Abstract

E-learning is increasingly being used in higher education settings, yet research examining how students use e-resources is frequently limited. Some previous studies have used the think-aloud method (an approach with origins in cognitive psychology) as an alternative to the more usual questionnaire or focus groups, but there is little discussion in the educational literature about the advantages and disadvantages of this approach. In this paper, we discuss our experience of using the think-aloud method in a recent study, and we reflect on its potential contribution as a research method. A number of concerns about the method arose during our study, including the level of guidance given to participants, observer influence, and the complexity of data analysis. We conclude, however, that the richness of the data collected outweighs these constraints, and that the think-aloud method has the potential to enhance research in this field.

Introduction

Computer-based learning, or e-learning, is increasingly being employed in educational settings and is a key focus for the development of higher education in the UK. This is in part a response to the 1997 Dearing Report, which emphasised the growth of technology in higher education, and in part due to the need to teach increasing numbers of students with decreasing resources. The recent white paper, The Future of Higher Education, also advocated an increase in e-learning in order to provide more flexible learning opportunities in the context of widening participation (Department for Education and Skills (DFES), 2004), and a national e-learning strategy is currently under development (see DFES, 2003). Broad, Matthews and McDonald (2004) comment on the increasing pressures within universities for academic staff to embrace e-learning, with this providing a key strand of many institutions' teaching and learning strategies.


The development of web-based resources for higher education is a particularly prolific area, offering students the opportunity to access a wide range of information sources at a time and place convenient to them. Such resources have the potential to offer students access to 'open learning', a process described by Race (1994) as involving learning which is: at your own pace; at times and places of your own choosing; when you are in control of how you learn—in terms of methods, structure and revision (pp. 23–24). However, some web-based resources replicate didactic teaching methods or textbooks without offering any substantial advantages, and many lecturers simply use the web to post information for students to access. It is clear that advances in technology will not automatically lead to learning enhancement.

As the use of technology in higher education has increased, so too have the arguments about its relative merits. Advocates claim that e-learning has the potential to transform education and that it provides demonstrable benefits for learners, whilst critics argue that it has not fulfilled its potential and fails to deliver the benefits promised (see Underwood, 2004). Evaluation of e-learning resources is increasingly being viewed as a crucial issue in teaching and learning in higher education (Crowther, Keller & Waddoups, 2004; Oliver, McBean, Conole & Harvey, 2002), yet it has often taken a secondary role, as project funding is primarily directed at development of such resources. Consequently, there is little compelling evidence about the effectiveness of much e-learning, and seemingly contradictory research findings are commonplace. For example, Adams (2004) notes that '...often only low level learning is achieved as a result of using these materials', whilst Kekkonen-Moneta and Moneta (2002) conclude that '...carefully designed interactive e-learning modules foster higher-order learning outcomes'. This may simply reflect the huge range of e-learning materials available, but it may also be indicative of the lack of theoretical underpinning and methodological rigour of much e-learning research (Adams, 2004; Underwood, 2004). In her review of current research in this area, Underwood concludes that, 'Islands of excellence exist, in conjunction with huge oceans of poor practice' (Underwood, 2004, p. 137).

The DFES consultation document provides strong encouragement for action research in higher education settings by emphasising the importance of adequate evaluation of online resources, including the 'intensive evaluation of learning experiences to balance large-scale studies' (DFES, 2003, p. 25). In a recent publication on e-learning by the Higher Education Academy, Conole stressed the importance of developing effective e-learning research methodology:

More rigorous research methods are needed to ensure valid and meaningful findings. This means more systematic research but also a better understanding of the benefits and limitations of different methods. (Conole, 2004)

Given the very strong focus on questionnaires and focus group interviews in previous studies, a need was identified for a research method that focused on the immediate learning experience of students using e-resources and the factors which influence decision making therein. The think-aloud method described below appears to offer this


possibility but—as with any method—has its limitations. In this paper, we aim to contribute to the methodological debate by offering a reflection on our experience of using this approach.

The current study

This paper is based on a project funded by the Learning and Teaching Support Network (LTSN) Subject Centre for Health Sciences and Practice, which involved a detailed evaluation of an online biological resource for student nurses ('Headstart'), developed at the University of Plymouth. The Headstart package is aimed at stage 1 nursing students prior to, and during, their first year of higher education. It aims to offer additional support material and guidance about biosciences within a nursing context, to help students acquire the background knowledge that they will need in order to build a deeper understanding of the subject, and to increase student confidence and motivation. Modelled on the Indonesian Buddhist temple at Borobudur, the resource is structured as a hierarchical pyramid with a series of themed galleries. The lower galleries cover the more basic aspects of biological sciences, building up to the more complex issues in the higher levels and culminating in the 'Bioviews' section, in which some contemporary issues in Biology are explored. (Further details about the resource can be found in Gresty & Cotton, 2003.)

A considerable quantity of evaluation data relating to the resource had already been amassed, largely collected as online feedback or retrospectively in the form of questionnaire responses. Whilst these data are of interest and can provide information on students' views about the resource, we felt that they gave little insight into the ways in which it was actually being used by the students and the possible impact on their learning. We also had access to a vast amount of automatic tracking data but were uncertain how this could be used to our best advantage and were aware that such data were not always reliable. It was apparent that immediate access to students' reasoning about their interactive decisions, alongside data about their use of the different pages of the online resource, would provide valuable insights into the learning process in which they were engaged. We also felt that programme review and development would be greatly assisted by additional information about the nature of the learning experience of individuals, and particularly the learning difficulties they encounter whilst using e-resources.

In this study, we therefore decided to pilot an alternative method of evaluation by using a 'think-aloud protocol'. This method involved observing individual students whilst they used the online resource, noting their navigational decisions and asking them to articulate their thoughts and feelings as they used the resource. The development of the think-aloud protocol is generally attributed to Ericsson and Simon (1984), and the approach has been widely used in cognitive psychology research, often to investigate problem solving. More recently, the approach has been used to study human–computer interactions and to evaluate new software (eg, Crowther et al, 2004; Oliver et al, 2002), often under the guise of 'usability testing'. However, there is little discussion to be found in the educational literature about the advantages and disadvantages of utilising the


methodology in this context. Even in Rethinking University Teaching—a central text concerning the use of learning technologies in higher education—barely a paragraph is devoted to the method, and it is dismissed apparently on the grounds of one negative experience (Laurillard, 2002, p. 42).

The think-aloud method enables the evaluation of the thought processes or decision making of someone performing a specific task (see Ericsson & Simon, 1984 or Van Someren, Barnard & Sandberg, 1994 for a detailed description of this approach). The verbal response can be captured with a tape or video recorder, or by note taking. In this study, we encouraged participants to verbalise their thoughts or feelings as they navigated the online resource in order to understand their interactive behaviour. They were not given a specific task to complete but rather were asked to use the resource as they would do normally. The session was videotaped to provide screen shots of the pages viewed, and notes were taken by the researcher present. We anticipated that this approach would provide us with a rich data set, the benefits of which would outweigh its time-consuming nature. The study also provided insights into the advantages and disadvantages of using a think-aloud technique in an e-learning context, and it is this aspect which is addressed in the following section.

Methodological issues

This study raised a number of issues related to the use of the think-aloud method, which are not addressed adequately in the literature. The think-aloud method is a type of concurrent verbal protocol (as opposed to a retrospective protocol, in which verbalisations are made after the task has been completed). As used by Ericsson and Simon (1984), the method requires participants to verbalise their thoughts whilst performing a task. The researcher does not generally interact with the participants after the initial instructions on how to complete the task, and explicit instructions are given on what should be said to participants, because, 'The subject's TA (think-aloud) protocol... may well be influenced by the exact wording of the TA instructions' (Ericsson & Simon, 1984, p. 80). Researchers are advised to give very general instructions, simply to 'think aloud', and to verbalise 'everything that passes through your head'. Ericsson and Simon urge caution about changing the verbalisation instructions in the light of evidence that this may change the structure of the thought process itself. Following this guidance, our initial plan was to provide a general request that participants 'try to think-aloud—talk as much as you can about what is going through your head as you use the resource'. However, we did have some reservations as to whether this would provide us with meaningful data or merely a rambling discourse.

In attempting to utilise Ericsson and Simon's methodology, we encountered two key problems:

1. It quickly became clear that we needed to increase the level of guidance given to participants (level 1 student nurses). The students simply did not know what kinds of thoughts to articulate in response to this very general instruction. This phenomenon reflects a common problem in social research, which is that participants find


it '...much easier to talk about thought processes if they know on which type of thoughts to focus their attention' (Calderhead, 1981). However, the focus provided must not distort or invalidate the evidence.

2. A large part of the Headstart resource is in text form, and it was simply not realistic to expect participants to read and simultaneously vocalise their thoughts about a piece of text.

In an attempt to resolve these problems, we piloted a slightly different method, opting to be more flexible in our use of instructions to participants, not only by starting with the general request that they 'think-aloud' but also by developing a range of other prompts that might help us to collect the type of data that we needed (eg, 'How are you deciding where to go?' 'What do you think of the information in this section?'). Questions were posed during pauses in the think-aloud commentary, usually when students were making navigational decisions (viewing menus) or when they had finished a section of the resource (in order to elicit specific feedback about individual sections). There was no set time schedule for prompts; instead, they were raised as appropriate with each participant. Whilst the prompts were considered useful, they did not constitute a large proportion of the data collected. Most students were successful at articulating their thoughts to some degree, and the prompts merely acted as a reminder to think-aloud after periods of quiet reading. There is no evidence from our data that the think-aloud procedure acted as a catalyst for metacognition, or that it changed students' thought processes.

In addition, a very short interview (consisting of three questions) was added at the end of the observation period to ensure that we had covered the key issues of concern. The video camera was left running to record the interview, but both researcher and student were off-camera in a face-to-face setting. This additional input raises issues about the status of the researcher, moving from the role of observer to that of interviewer as the observation proceeded. It seems possible that such interventions would increase participants' reactivity to the researcher:

The very fact of observation can have an effect on behaviour. Where the people whose behaviour is being described know that they are being observed, they may adjust what they do in various ways, some of which may be relevant to the research focus. (Foster, Gomm & Hammersley, 1996)

One recent research study suggested that interrupting participants undertaking problem-solving processes had no significant impact on reactivity of the participants when compared to a standard think-aloud protocol (Karahasanovic, Hinkel, Sjøberg & Thomas, 2004). However, it seems implausible that any participant undertaking a think-aloud protocol (irrespective of the prompts used) is not going to be affected by the presence of the researcher (and the video camera). We would argue that this does not negate the value of such data, and discuss the possible influence of the observer in the next section.

In order to avoid interrupting participants whilst they navigated the resource, we considered using the technique of 'stimulated recall' (a form of retrospective protocol),


whereby the video recording is used as an 'aide memoire', enabling participants to access their interactive decision making shortly after the event. This technique, developed by Bloom (1953), aims to aid recall of original events and authentic thoughts by using an audio or video stimulus:

The basic idea underlying the method of stimulated recall is that a subject may be enabled to relive an original situation with vividness and accuracy if he (sic) is presented with a large number of the clues or stimuli which occurred during the original situation. (Bloom, 1953)

This method has obvious advantages in that it would enable participants to focus on what they were doing at a particular time whilst freeing them from having to do the activity and simultaneously talk about it. However, there is some controversy over the extent to which stimulated recall can enable participants to recount their interactive thoughts. Prompted recall after the event is different in quality from concurrent think-aloud even when the time-lapse is slight. Yinger (1986) argues that thoughts are very rapidly lost from short-term memory, and that—particularly when videotape is used as a stimulus—participants will explicate their thinking in relation to the videotape rather than the original event (a form of post hoc rationalisation). These criticisms, together with the exceptionally time-consuming nature of this approach, led us to reject it in this particular study.

In distancing ourselves from the very precise think-aloud techniques of the cognitive psychologists, it can be argued that the focus of our study was rather different from those discussed by Ericsson and Simon (1984). We did not intend to draw conclusions about the participants' cognitive processes and, to a large extent, our aims were rather more pragmatic. Our primary concern was with students' interactive decision making, taking into account the three key evaluation areas outlined by Milton and Lyons (2003):

1. Interface usability (Is it easy to use?)
2. Content validity (Does it make sense?)
3. Educational utility (Do they learn from it?).

The methodology we adopted was therefore a kind of 'prompted think-aloud', with the aim of encouraging students to articulate their thinking as clearly as possible and to enhance the data collected.

Reflections on our experience

The data from our study suggest that some participants were influenced by the observational set-up, and that they did not engage with the resource in exactly the same way as they would do 'normally' (despite instructions to do so!). Statements such as, 'I would write that down', or 'I think I'll go back to that' suggested that there was some element of artefact prompted by the observational situation. It also appeared as though some students had not come prepared to learn (instead they were 'doing research'), and that the social context had influenced their use of the resource. This may have been compounded by a lack of explicit instructions prior to starting the observation session. It would have been possible (and perhaps beneficial) to allow students to listen to an example of a think-aloud protocol by someone of their own level and calibre, working on a task and disciplinary environment that would not immediately encourage direct imitation. However, we rejected the idea of explicitly training students to 'think-aloud', because we did not want to introduce training artefacts prior to the study. Ericsson and Simon (1984) found '...substantial evidence that differences in performance were induced by telling the subject how to verbalise' (p. 107).

There is evidence that some students find think-aloud easier than others, and this too was supported by our experiences. On some occasions, the researcher appeared to be getting a 'tour' of the resource, in which students demonstrated the parts they liked or disliked. Other students appeared to be moving through the resource at such high speed that it was doubtful whether they could have read even a small proportion of the text viewed. Early analysis of the data log indicated that one student had spent an average of only 18 seconds on each page viewed during the observation. This may be an inherent problem with using a hypertext resource for educational purposes because, according to Laurillard (2002), '...the user expects to be doing something every few seconds' (p. 110). Whilst other students navigated the resource at a more leisurely pace, it does seem that fast-paced hyperlinking is a problem for some students using hypertext resources, and one which we suggest is likely to impact on retention of information.
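Dwell-time figures of this kind can be derived from a timestamped server log by averaging the gaps between successive page requests. The following is a minimal sketch only; the log format, timestamps and page names are invented for illustration, as the actual Headstart tracking format is not described here:

```python
from datetime import datetime

# Hypothetical excerpt of a web-server tracking log for one observed session:
# (timestamp, page) pairs recorded as the student requested each page.
log = [
    ("2004-11-03 10:00:00", "gallery1/intro.html"),
    ("2004-11-03 10:00:12", "gallery1/cells.html"),
    ("2004-11-03 10:00:30", "gallery2/quiz.html"),
    ("2004-11-03 10:01:12", "bioviews/ethics.html"),
]

def mean_dwell_time(entries):
    """Average seconds between successive page requests.

    The dwell time on the final page is unknown (there is no later
    request), so only the gaps between consecutive requests are averaged.
    """
    times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t, _ in entries]
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps)

print(mean_dwell_time(log))  # prints 24.0 for this invented excerpt
```

One caveat this sketch makes visible is that server logs record requests, not attention: the final page of a session has no measurable dwell time, and a long gap may mean careful reading or a coffee break.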

It is possible that students usually accessed the resource to find specific information, and it might therefore have been preferable to set a specific learning task for the students to complete during the observation period. However, this was also rejected on the grounds that it would have introduced an even greater degree of artificiality into the situation, as Headstart is not recommended to students in this way. We were primarily interested in students' normal learning behaviours, and the potential educational benefits of using the online resource in these ways. It was notable that few of the participants wrote anything down whilst engaging with the resource, but it was not clear whether this was an artefact of the observational set-up or part of their normal learning behaviour. Several students commented that they had previously printed out parts of the resource (they did not have access to a printer in the observational setting).

Despite some reservations about the influence of the observational set-up, the data obtained via the think-aloud method appear to reflect the students' genuine thoughts (as opposed to simply telling us what they thought we wanted to hear). The participants provided a wealth of information (both positive and negative), ranging from their aversion to certain colours or layouts to the ways in which they felt the resource enhanced their learning. They demonstrated a range of different ways of using the resource, many of which had not been anticipated by the researchers, and enabled us to identify navigational and other issues which would not have been evident from questionnaire or focus group data.


Analysis issues

There remains a substantial problem of analysis to overcome, particularly in relation to the think-aloud data. For each participant in our study, we have accumulated:

• Initial questionnaire response providing background information in terms of biological knowledge, computing experience, age, gender, etc;
• Researcher's notes taken during the observation period;
• Electronic tracking log of pages viewed during the observation, taken from the web-server;
• Video of observation (focused on computer screen only), encompassing pages viewed and accompanying think-aloud comments (30 minutes to 1 hour long).

Bringing these data together and formulating conclusions about the effectiveness of the resource is a complex and time-consuming task that should not be under-estimated. Nonetheless, the study has already revealed some unexpected information about how students use the resource. It has also proved considerably more insightful than previous (overwhelmingly positive) questionnaire responses.

The analysis will incorporate the use of different pages of the resource during each observed session (utilising the available tracking data) alongside the student comments from the think-aloud recordings. We aim to produce a diagrammatic representation of students' navigation of the resource in order to reveal which pages are used most and least. We are also undertaking a thematic analysis of the students' think-aloud comments using QSR N6 (qualitative data analysis software), with the aim of developing and testing a number of propositional statements about the data (see Silverman, 1993). This procedure (analytic induction) involves examining small sections of data to see if they fit the initial hypothesis or proposition, then reformulating the hypothesis to enable the inclusion of all relevant data. Use of this method, according to Silverman, '...offers a powerful tool through which to overcome the danger of purely anecdotal field research' (p. 170).
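Identifying the most- and least-used pages amounts to tallying page visits across the tracked sessions, and counting page-to-page transitions in the same pass yields the edges of a navigation diagram. A hedged sketch of that tally, with session sequences and page names invented for illustration:

```python
from collections import Counter

# Hypothetical page-visit sequences from three observed sessions;
# the real tracking data and page names differ.
sessions = [
    ["home", "gallery1", "quiz", "gallery1", "bioviews"],
    ["home", "quiz", "gallery2"],
    ["home", "gallery1", "gallery2", "quiz"],
]

# Visit counts across all sessions: ranks most- and least-used pages.
visits = Counter(page for session in sessions for page in session)

# Transition counts (page -> next page): the weighted edges of a
# diagrammatic representation of students' navigation.
edges = Counter(
    (a, b) for session in sessions for a, b in zip(session, session[1:])
)

print(visits.most_common())
print(edges.most_common())
```

The same transition counts could be fed to a graph-drawing tool to produce the navigation diagram, with edge thickness proportional to how often each route was taken.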

Provisional findings suggest that students make navigational decisions based on a variety of content and design factors (eg, avoiding topics viewed as 'too difficult'; selecting pages which were seen as relevant in terms of exam success or nursing practice; rejecting pages with lots of small text or few images). A previous study by Crowther et al (2004) found site navigation to be such a substantial problem that it overshadowed any learning issues. However, early indications from our study suggest that we may be able to draw conclusions about the nature of the learning taking place whilst students use the resource. We intend to investigate the ways and extent to which students make sense of the resource information, and use it to enhance their learning. There are indications that 'familiarity' is a key issue in helping students understand and remember biological terms and concepts (repetition of lecture material, revision for exams, and making 'alien' biological terms familiar). The ability to return to information in their own time and at their own pace was seen as crucial by the students, and the repetition of key information was vital in helping them to retain it ('It all does make sense, it's just remembering it!'). The quizzes with instant feedback were also seen as a valuable part


of the learning experience by many students, being used both as an entry point to the resource and as a formative test of knowledge after engagement.

Conclusions

The think-aloud method has the potential to provide a distinctly different and enhanced dimension to e-learning research. Benefits included the collection of very detailed data about the resource in real-time use. This study provided us with a wealth of valuable information on the various ways in which students used the online resource, and their particular likes and dislikes. Each volunteer produced new insights, and the range of previous knowledge and computing experience of the participants provided a fascinating perspective on their varied approaches to e-learning. Some of the information gained from the observations simply could not have been obtained via other methods. For example, one participant managed to overlook much of the information in the resource because of navigational naivety. This problem would not have been immediately obvious from an analysis of the tracking data and might not have come to light in an interview, because the student was simply unaware that more information was available on the site. It was only the think-aloud comments that made the researcher aware that there was a problem and, for this reason alone, the method provides a valuable source of data in e-learning research.

To conclude, we suggest that the think-aloud method can effectively contribute to evaluating e-learning, despite the potential problems outlined above. Analysis can be difficult and time consuming, but the video-tapes do permit different aspects to be studied and revisited as appropriate. Moreover, the data produced via this method are rich and exceptionally revealing. Used in conjunction with more traditional data collection methods (such as questionnaire data, tracking data, etc), the think-aloud method should be considered an effective means of enquiry for studies in this field.

Acknowledgements

This project would not have been possible without generous funding from the LTSN Subject Centre for Health Sciences and Practice. The authors would also like to express their thanks to all the student participants who enabled this research to take place. Finally, we would like to thank Professor Andrew Hannan for his valuable comments leading to the improvement of earlier drafts of this paper.

References

Adams, A. M. (2004). Pedagogical underpinnings of computer-based learning. Journal of Advanced Nursing, 46, 1, 5–12.

Bloom, B. S. (1953). Thought processes in lectures and discussions. Journal of General Education, 7, 160–169.

Broad, M., Matthews, M., & McDonald, A. (2004). Accounting education through an online-supported virtual learning environment. Active Learning in Higher Education, 5, 2, 135–151.

Calderhead, J. (1981). Stimulated recall: a method for research on teaching. British Journal of Educational Psychology, 51, 211–217.

Conole, G. (2004). The role of research in informing practice. Exchange, Issue 6, Spring 2004, 30–31.

Crowther, M. S., Keller, C. C. & Waddoups, G. L. (2004). Improving the quality and effectiveness of computer-mediated instruction through usability evaluations. British Journal of Educational Technology, 35, 3, 289–304.

Dearing, R. (1997). Higher education in the learning society: The Dearing Report (National Committee of Inquiry into Higher Education). London: HMSO.

DFES (2003). Towards a unified e-learning strategy: consultation document. Nottingham: DFES Publications.

DFES (2004). White Paper: The Future of Higher Education. Retrieved 3 November, 2004, from http://www.dfes.gov.uk/hegateway/strategy/hestrategy/foreword.shtml

Ericsson, K. A. & Simon, H. A. (1984). Protocol analysis: verbal reports as data (Revised ed.). London: MIT Press.

Foster, P., Gomm, R. & Hammersley, M. (1996). Constructing educational inequality. London: The Falmer Press.

Gresty, K. A. & Cotton, D. R. E. (2003). Supporting biosciences in the nursing curriculum: development and evaluation of an online resource. Journal of Advanced Nursing, 44, 4, 339–349.

Karahasanovic, A., Hinkel, U. N., Sjøberg, D. & Thomas, R. (2004). Think-aloud versus feedback collection in software engineering research: a controlled experiment (submitted for journal publication). Retrieved 3 November, 2004, from http://www.simula.no/publication_one.php?publication_id=812

Kekkonen-Moneta, S. & Moneta, G. B. (2002). E-learning in Hong Kong: comparing learning outcomes in online multimedia and lecture versions of an introductory computing course. British Journal of Educational Technology, 33, 4, 423–433.

Laurillard, D. (2002). Rethinking university teaching: a conversational framework for the effective use of learning technologies (2nd ed.). London: Routledge/Falmer.

Milton, J. & Lyons, J. (2003). Evaluate to improve learning: reflecting on the role of teaching and learning models. Higher Education Research and Development, 22, 3, 297–312.

Oliver, M., McBean, J., Conole, G. & Harvey, J. (2002). Using a toolkit to support the evaluation of e-learning. Journal of Computer Assisted Learning, 18, 2, 199–208.

Race, P. (1994). Open learning handbook: promoting quality in designing and delivering flexible learning. London: Kogan Page.

Silverman, D. (1993). Interpreting qualitative data: methods for analysing talk, text and interaction. London: SAGE Publications.

Underwood, J. (2004). Research into information and communications technologies: where now? Technology Pedagogy and Education, 13, 2, 135–145.

Van Someren, M. W., Barnard, Y. F. & Sandberg, J. A. C. (1994). The think-aloud method: a practical guide to modelling cognitive processes. London: Academic Press. Retrieved 3 November, 2004, from http://hcs.science.uva.nl/usr/maarten/Think-aloud-method.pdf

Yinger, R. J. (1986). Examining thought in action: a theoretical and methodological critique of research on interactive teaching. Teaching and Teacher Education, 2, 263–282.