

Effects of a metacognitive support device in learning environments

Maria Bannert a,*, Melanie Hildebrand a, Christoph Mengelkamp b

a Educational Media, Chemnitz University of Technology, Reichenhainer Strasse 41, 09126 Chemnitz, Germany
b Center of Educational Research, University of Koblenz-Landau, Buergerstrasse 23, 76829 Landau, Germany


Article history: Available online 8 August 2008

Keywords: Metacognitive instruction; Support device; Hypermedia learning; Knowledge acquisition

0747-5632/$ - see front matter © 2008 Elsevier Ltd. All rights reserved. doi:10.1016/j.chb.2008.07.002

* Corresponding author. E-mail address: [email protected] (M. Bannert).

Abstract

Successful learning is mainly based on metacognitive activities which have to be performed and constantly monitored during learning. Research reveals that many learners have difficulties in performing such metacognitive activities spontaneously, which most probably results in lower learning outcomes. The aim of this study is to experimentally analyse the effects of a metacognitive support device combined with a paper-based prompting scheme. With this support device, students are instructed to activate their repertoire of metacognitive knowledge and skills, which should further enhance learning and transfer.

University students of the experimental group (n = 29) were instructed by means of a metacognitive support device why metacognitive activities are useful and how to apply them during learning. In addition, during learning, they were prompted to apply the metacognitive activities they had just learned. Students of the control group (n = 27) were not instructed why and how to use metacognitive activities, and furthermore, they were not prompted during learning to apply these metacognitive activities. Rather, they were instructed by a computer device how to organise a work place for their studies, so that all groups were treated in a similar way. The students' learning task was to learn about "psychological theories of using pictures in multimedia learning environments" within 60 min. Immediately afterwards, learning outcome was measured with a test. Altogether, 56 university students participated, counterbalanced according to their prior knowledge as well as metacognitive knowledge. As expected, students of the experimental group showed better transfer performance compared with the control group. In addition, the training did increase metacognitive behaviour as measured by subjective ratings.

© 2008 Elsevier Ltd. All rights reserved.

1. Introduction

Hypermedia learning environments usually offer a high degree of learner control in order to support individual learners' needs and preferences in the specific learning context (e.g., Lawless & Brown, 1997). However, research in the last decade has shown repeatedly that students have to possess specific metacognitive, cognitive, and affective pre-requisites in order to learn successfully with such electronic learning environments (e.g., Astleitner & Leutner, 1995; Azevedo, 2005; Dillon & Gabbard, 1998; McManus, 2000; Rouet & Levonen, 1996; Unz & Hesse, 1999). As one major consequence, current educational research is concentrating on the design and evaluation of additional measures that provide students lacking such pre-requisites with support devices integrated in the learning environment (e.g., Clarebout & Elen, 2006). However, most recent hypermedia research indicates that these additional educational measures are not very effective either, since students tend not to use the support devices offered, or do not use them as intended by the designers (Bannert, 2005; Clarebout & Elen, 2006; see also contributions in this issue).



One possible solution is to provide systematic instructions on why and how to use the support devices in order to convince students to comply with these devices integrated in the learning environment.

The focus of this study lies on metacognitive support. In a series of experimental studies, we have repeatedly found that unsuccessful hypermedia learning is partly due to a lack of students' spontaneous use of metacognitive activities during learning (see Bannert, 2007). In consequence of these findings, a metacognitive support device was developed and experimentally evaluated. With this support device, students are instructed to activate their repertoire of metacognitive knowledge and skills, which should further enhance learning and transfer. Thus, the main aim of this paper is to experimentally analyse the effects of this metacognitive support device on learning. In the following, research on metacognitive support will be sketched first in order to present a psychologically founded framework for the design of metacognitive support devices.

2. Metacognitive support

Metacognitive support aims to increase students' learning competence by means of systematic instruction.



Most metacognitive training research has been conducted in learning settings aimed mainly at improving individual (linear) text-based learning (e.g., Hasselhorn & Hager, 1998; Schunk & Zimmerman, 1998; Weinert, 1994; Weinstein, Husman, & Dierking, 2000). This research suggests three general principles for effective metacognitive instruction, which are all based on empirical findings and, thus, may also be relevant for hypermedia learning (Bannert, 2007).

One major design principle is to integrate metacognitive instruction into the domain-specific instruction, that is to say, not to teach metacognitive activities separately as ends in themselves, but to embed their teaching in the subject matter (e.g., Baker, 1994; Lin, 2001). Another design principle of effective instruction is to explain the application and usefulness of all instructed metacognitive strategies. Otherwise, students will not use them spontaneously in the training session or afterwards, which has been pointed out repeatedly in empirical research (e.g., Hasselhorn & Hager, 1998; Pressley, Borkowski, & Schneider, 1989; Weinstein et al., 2000; Winne, 2005). According to the third design principle, it is important that enough training time is provided to students in order to implement and automatise the metacognitive activities that have just been learned (Friedrich & Mandl, 1992; Haller, Child, & Walberg, 1988).

With respect to the directiveness of the instructional intervention, Friedrich and Mandl (1992) distinguish direct and indirect procedures. Whereas direct support is realised by means of explicit metacognitive training, more indirect support is realised by means of adequate learning heuristics integrated into the learning environment. For example, with the aid of metacognitive prompts, which generally demand students to carry out specific activities during learning, learners could be stimulated to activate their own learning heuristics and/or to consider and apply specific learning heuristics implemented in the hypermedia system. In order to be successful, both kinds of intervention should consider the three design principles mentioned above.

The decision whether to design direct or indirect metacognitive instruction strongly depends on the student's metacognitive competence, which consists of metacognitive knowledge and skills (Veenman, 2005). For students lacking metacognitive competence (a so-called mediation deficit, e.g., Hasselhorn, 1995), direct training is necessary in order to teach the metacognitive knowledge and skills extensively. Often, students already possess metacognitive skills, but do not perform them spontaneously (a so-called production deficit, e.g., Hasselhorn, 1995; Weinert, 1984). For these students, metacognitive prompts seem to be an adequate measure which stimulates them to apply their skills during learning (Bannert, 2007).

The target group of this project consisted of university students, who should already possess the metacognitive skills outlined above due to the learning skills acquired during their studies. Hence, we assume that unsuccessful learning in this target group is more a matter of a production deficit than of a mediation deficit. Consequently, to support their learning, direct extensive training is not necessary because they already possess basic metacognitive competence; rather, it is important to support them in using their metacognitive knowledge and skills during learning. In short, when supporting students showing a production deficit, instructional support should not focus on the acquisition of metacognitive competence (or its teaching), but rather on stimulating the use of the metacognitive competence available. Thus, we focused on a short-term intervention (i.e., a pre-training combined with prompts) which was developed on the basis of the available metacognitive training studies (e.g., Bannert, 2007; Hattie, Biggs, & Purdie, 1996; Lin, 2001; Rosenshine, Meister, & Chapman, 1996; Weinstein et al., 2000).

According to the research outlined so far, metacognitive instruction requires students to explicitly reflect on, monitor, and revise their learning process.

Metacognitive support focuses students' attention on their own thoughts and on understanding the activities they are engaged in during their course of learning. In line with this, we could demonstrate in three experiments that prompting students to plan, monitor, and evaluate their own way of learning allows them to activate their repertoire of metacognitive knowledge and strategies, which further enhances hypermedia learning and transfer (Bannert, 2007). However, even though these interventions were successful, the students felt restricted in their scope of learning. After learning, they reported that they felt disturbed in their own way of learning when they had to consider the demanding activities asked of them by the metacognitive prompts. Hence, we designed a metacognitive support device in order to offer more flexible and more effective metacognitive support. Besides, we used a different learning topic and other kinds of learning materials in order to find out whether the positive effect on transfer performance we had repeatedly detected would also appear in other learning scenarios.

3. Research questions and hypotheses

The aim of this study is to experimentally investigate the effects of a metacognitive support device on metacognitive and strategic learning activities and on learning outcome, including transfer. The intervention was realised as a combination of computer-assisted metacognitive instruction and metacognitive prompts, as described below. In particular, the following research questions were addressed in the current study:

1. Does the realised metacognitive support device influence the learning process by engaging students in more metacognitive and strategic learning activities?

2. Does the realised metacognitive support device increase students' learning performance?

It is assumed that the realised metacognitive support device will affect the learning process by inducing students' metacognitive activities, which, in turn, will lead to better learning performance.

4. Method

4.1. Sample and experimental design

Overall, 56 university students enrolled in Educational Media participated (mean age = 21.8; SD = 2.2; female: 80.6%). Participants were matched according to prior knowledge and metacognitive knowledge measured with questionnaires and were randomly assigned to one of the two treatments afterwards.
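The paper does not describe the matching procedure in technical detail. The following minimal sketch (Python; all names, the composite ranking, and the data are purely illustrative assumptions, not the authors' procedure) shows one common way to realise such matched random assignment: rank participants on the matching variables, form neighbouring pairs, and randomly split each pair between the two treatments.

```python
import random

def matched_assignment(participants, seed=1):
    """Matched random assignment (illustrative sketch, not the authors' code):
    rank participants on the matching variables, pair neighbours, and randomly
    assign one member of each pair to EG and the other to CG."""
    rng = random.Random(seed)
    ranked = sorted(
        participants,
        key=lambda p: (p["prior_knowledge"], p["metacognition"]),
    )
    eg, cg = [], []
    for i in range(0, len(ranked) - 1, 2):
        pair = [ranked[i], ranked[i + 1]]
        rng.shuffle(pair)                 # random split within each matched pair
        eg.append(pair[0])
        cg.append(pair[1])
    if len(ranked) % 2 == 1:              # a left-over participant is assigned at random
        (eg if rng.random() < 0.5 else cg).append(ranked[-1])
    return eg, cg

# Hypothetical usage with four fictitious participants:
sample = [
    {"id": 1, "prior_knowledge": 2, "metacognition": 4.1},
    {"id": 2, "prior_knowledge": 1, "metacognition": 4.5},
    {"id": 3, "prior_knowledge": 3, "metacognition": 3.9},
    {"id": 4, "prior_knowledge": 0, "metacognition": 4.3},
]
eg, cg = matched_assignment(sample)
```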

Students in the experimental group (n = 29) were instructed by means of the metacognitive support device why metacognitive activities are useful and how to apply them during learning. In addition, they were prompted to apply these activities during learning by means of a paper-based scheme. Students in the control group (n = 27) were not metacognitively instructed or prompted, but were instead instructed by means of a computer device how to organise a work place for their studies.

4.2. Procedure, instruments and materials

Learner characteristics were obtained by questionnaire about one week before the experiment started. Prior knowledge was measured with a self-constructed multiple-choice test, and metacognitive knowledge with the LIST questionnaire (Wild, Schiefele, & Winteler, 1992); this instrument is similar to the MSLQ questionnaire (Pintrich, Smith, Garcia, & McKeachie, 1993).



The LIST questionnaire consists of several scales, of which we used the metacognition scale, which indicates knowledge about monitoring and controlling one's learning. Furthermore, we used the cognitive strategic learning scales organisation, elaboration and rehearsal in order to assess strategic learning activities. Table 1 presents the corresponding statistics and reveals that the treatment groups do not differ significantly. While the reliabilities of all cognitive strategic learning scales used are good, the reliabilities of the pre-knowledge and metacognition scales are quite low; however, they still fulfil the requirements for experimental group comparisons.
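For reference, the reliability coefficients reported here and in Tables 1-3 are Cronbach's alpha values. For a scale with k items, the coefficient is commonly computed with the standard formula below; this is general background, not a computation specific to this study.

```latex
% Cronbach's alpha for a scale with k items (standard formula, not study-specific):
% sigma^2_{Y_i} is the variance of item i, sigma^2_X the variance of the total score.
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```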

The experiment took about 2 h. Before the learning session started, the participants in the experimental group (n = 29) were instructed with a computer support device why metacognitive activities are useful and how to apply them during learning (see Fig. 1 and description below). The students in the control group (n = 27) were not trained why and how to use metacognitive activities. In order to treat them equally, they were also instructed with a computer device how to organise a work place for their studies. For example, they were instructed about ergonomic aspects of computer desks and office chairs as well as ideal illumination and temperature. For both groups, the instruction lasted about 30 min.

After the specific instruction, the students' learning task was to learn about "psychological theories of using pictures in multimedia learning environments" in such a way that they would be able to teach and explain these concepts and theories to other students afterwards. This specific demand was intended to minimise surface learning and to explicitly stimulate deep learning.

Table 1
Learner characteristics: means and standard deviations of students' prior knowledge and metacognitive knowledge

Questionnaire scale EG (n = 29) CG (n = 27) t(1,54) d
Pre-knowledge (n = 7, Cronbach's Alpha = 0.52) M (SD) 1.69 (1.23) 2.11 (1.67) -1.081 -0.25
Metacognition (n = 10, Cronbach's Alpha = 0.51) M (SD) 4.22 (0.47) 4.17 (0.53) 0.383 0.09
Organisation (n = 8, Cronbach's Alpha = 0.79) M (SD) 4.40 (0.74) 4.55 (0.79) -0.544 -0.18
Elaboration (n = 8, Cronbach's Alpha = 0.78) M (SD) 4.31 (0.73) 4.20 (0.63) 0.561 0.17
Rehearsal (n = 7, Cronbach's Alpha = 0.88) M (SD) 3.71 (1.01) 4.12 (1.12) -1.434 -0.37

Notes: EG = computer-assisted metacognitive instruction, CG = no computer-assisted metacognitive instruction; 1 = min = less activities; 6 = max = high activities; one-tailed testing.

Fig. 1. User interface of the computer-assisted metacognitive support device.

During learning, the students in the experimental metacognitive support device group were instructed to apply the metacognitive activities they had just learned. Therefore, a diagram visualising all metacognitive activities was offered to them on a sheet of paper to serve as a metacognitive prompt during learning (see Fig. 2). For reasons of experimental control, the participants in the control group were offered a similar scheme containing important aspects of an ergonomic work place, such as computer desk and chair or illumination and temperature. Students in both groups were equally instructed about the possibility of writing down information on a sheet of paper during the learning process. The learning session was restricted to 60 min.

Immediately after learning, learning outcome and the students' metacognitive and strategic activities during learning were measured with questionnaires.

Learning outcomes were measured on three different levels by means of recall, knowledge and transfer tasks. To obtain recall performance, the students had to write down on a sheet of paper all the concepts and terms they could remember, e.g., picture, mental model, modality. One point was assigned for each correct concept; altogether, a maximum of 65 points could be reached. Knowledge was measured with 7 multiple-choice items with 4 alternatives about the use of pictures in multimedia. One point was assigned to each correctly answered item; thus, a maximum of 7 points could be achieved. For further analysis, this knowledge score was corrected for guessing. To measure transfer performance, students had to apply their knowledge by giving design recommendations on when and how to use pictures in multimedia learning environments.



Fig. 2. Metacognitive prompting scheme: orientation, goal specification and planning at the beginning of learning; judgement of information, monitoring and regulation during learning; evaluation (checking goals and comprehension) at the end of learning.


For example, they had to answer questions such as "How should pictures be designed to support a learner's mental model construction?" or "Should a multimedia designer use a visual or an audiovisual representation format?". Each of the 6 questions was assigned 5 points when answered correctly; thus, a maximum of 30 points could be achieved. Due to the limited budget, it was not possible to assess how well students would actually have taught the main concepts to others, as demanded in their assignment.
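The text does not specify which correction for guessing was applied to the knowledge score. A commonly used correction for multiple-choice items with k response alternatives (here k = 4) subtracts a fraction of the number of wrong answers W from the number of right answers R, as sketched below; the actual correction used by the authors may differ.

```latex
% Standard correction for guessing (assumed, not stated in the article):
% R = number of right answers, W = number of wrong answers, k = 4 alternatives.
S_{\mathrm{corrected}} = R - \frac{W}{k - 1}
```

Under this assumption, a student answering 5 of the 7 items correctly and 2 incorrectly would receive 5 - 2/3, i.e. about 4.33 points.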

The computer-assisted metacognitive support device was realised with HTML scripts and the use of Windows Explorer. It is a so-called closed web-based learning environment, as no links to external pages were available. Fig. 1 presents a screenshot of the user interface. On 6 pages, students were instructed about the importance of planning, monitoring, and evaluation processes. For each metacognitive activity, a specific guiding question was posed, e.g., "What do I actually want to learn?", "Do I remember and understand the topics I have learned?", "Did I reach my learning goals?". Additionally, a comprehensive exercise was included to give students the possibility to immediately apply these activities.
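The device itself is described only verbally. The sketch below (Python, purely illustrative) merely organises the guiding questions quoted above by learning phase, following the scheme in Fig. 2, to show how such prompt content could be structured in a web-based environment; the phase-to-question mapping is our reading of the text, not an implementation detail reported by the authors.

```python
# Hypothetical organisation of the prompt content; the questions are quoted
# from the article, the phases follow the scheme in Fig. 2.
PROMPTS = {
    "beginning of learning": {
        "activities": ["orientation", "goal specification", "planning"],
        "guiding_question": "What do I actually want to learn?",
    },
    "during learning": {
        "activities": ["judgement of information", "monitoring", "regulation"],
        "guiding_question": "Do I remember and understand the topics I have learned?",
    },
    "end of learning": {
        "activities": ["evaluation: checking goals and comprehension"],
        "guiding_question": "Did I reach my learning goals?",
    },
}

def prompt_for(phase: str) -> str:
    """Return a display string with the activities and guiding question for a phase."""
    entry = PROMPTS[phase]
    return f"{phase}: {', '.join(entry['activities'])}. {entry['guiding_question']}"

if __name__ == "__main__":
    for phase in PROMPTS:
        print(prompt_for(phase))
```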

The learning material consisted of an introductory chapter about pictures in multimedia learning environments (Weidenmann, 2002). This learning topic was presented to the students in a linear text format, i.e., a copy of all 14 pages, including 9 pictures, was presented to the students in A4 paper format. Fig. 3 gives an example of the learning materials.

5. Results

5.1. Metacognitive and strategic learning activities

To test the hypothesis that the metacognitive support device would increase metacognitive activities during learning, the scales of the retrospective LIST questionnaire (Wild et al., 1992) were analysed with respect to group differences (see Table 2).

In line with our assumption, there is a treatment effect with respect to the metacognition scale (t(1,54) = 4.327, p1 < 0.001, d = 1.12). That is, students of the experimental group retrospectively reported a higher amount of metacognitive activities conducted during learning. In addition, they reported a significantly higher degree of cognitive organisation activities (t(1,54) = 2.211, p1 = 0.014, d = 0.49). However, there are no significant group differences with respect to cognitive elaboration and rehearsal activities. Thus, the experimental manipulation had no effect on these types of cognitive strategic learning activities.
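To illustrate the statistics reported in this section, the following sketch computes an independent-samples t-test (one-tailed) and a pooled-standard-deviation Cohen's d. The score arrays are hypothetical and the snippet is not the authors' analysis code.

```python
import numpy as np
from scipy import stats

# Hypothetical self-report scores on a 1-6 scale, for illustration only.
eg = np.array([4.5, 4.1, 4.8, 3.9, 4.3, 4.6, 4.0, 4.4])
cg = np.array([3.6, 3.9, 3.2, 3.8, 3.5, 3.7, 3.4, 3.6])

# Classic independent-samples t-test (equal variances assumed); halve the
# two-tailed p-value for a directional (one-tailed) hypothesis.
t, p_two_tailed = stats.ttest_ind(eg, cg, equal_var=True)
p_one_tailed = p_two_tailed / 2 if t > 0 else 1 - p_two_tailed / 2

# Cohen's d based on the pooled standard deviation.
n1, n2 = len(eg), len(cg)
pooled_var = ((n1 - 1) * eg.var(ddof=1) + (n2 - 1) * cg.var(ddof=1)) / (n1 + n2 - 2)
d = (eg.mean() - cg.mean()) / np.sqrt(pooled_var)

print(f"t({n1 + n2 - 2}) = {t:.3f}, one-tailed p = {p_one_tailed:.3f}, d = {d:.2f}")
```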

5.2. Learning outcome

Table 3 presents the mean scores of all three learning outcome measures broken down by treatment. As one can see, the variation of the metacognitive support had no effect on recall and knowledge test performance.


Fig. 3. Example of the paper-based learning material (source: Weidenmann, 2002).

Table 2
Means and standard deviations of students' self-reported use of metacognitive and strategic learning activities

Questionnaire scale EG (n = 29) CG (n = 27) t(1,54) d

Metacognition (n = 10, Cronbach's Alpha = 0.71) M (SD) 4.32 (0.63) 3.57 (0.67) 4.327*** 1.12
Organisation (n = 8, Cronbach's Alpha = 0.75) M (SD) 4.48 (0.60) 3.98 (1.02) 2.211* 0.49
Elaboration (n = 8, Cronbach's Alpha = 0.83) M (SD) 4.04 (0.83) 3.90 (0.95) 0.609 0.15
Rehearsal (n = 7, Cronbach's Alpha = 0.87) M (SD) 3.06 (1.26) 2.91 (1.05) 0.479 0.14

Notes: EG = computer-assisted metacognitive instruction, CG = no computer-assisted metacognitive instruction; 1 = min = less activities, 6 = max = high activities.
* p < 0.05.

*** p < 0.001, one-tailed testing.

Table 3
Learning outcome: means and standard deviations of students' recall, knowledge, and transfer performance

Learning outcome EG (n = 29) CG (n = 27) t(1,54) d

Recall M (SD) 8.69 (3.30) 8.22 (3.74) 0.497 0.126
Knowledge (n = 7, Cronbach's Alpha = 0.53) M (SD) 5.02 (1.34) 4.48 (2.04) 1.046 0.268
Transfer (n = 6, Cronbach's Alpha = 0.69) M (SD) 14.45 (6.48) 11.70 (4.89) 1.778* 0.562

Notes: EG = computer-assisted metacognitive instruction, CG = no computer-assisted metacognitive instruction.* p < 0.05, one-tailed testing.


However, t-tests for independent groups revealed a significant effect for the transfer task (t(1,54) = 1.78, p1 = 0.04, d = 0.56). As expected, students learning with the metacognitive support device showed better transfer performance compared with the control group.

6. Summary and discussion

In accordance with prior research, the hypotheses were partly confirmed by the results.

As expected, participants who were instructed with the metacognitive support device about why and how to use metacognitive activities, combined with paper-based metacognitive prompts, reached significantly better transfer performance. However, no significant increase in recall and knowledge test performance was obtained. This is in line with several other metacognitive intervention studies. For example, in three experimental studies evaluating the effects of different metacognitive support devices, in which different learning topics as well as different degrees of computer support were used, we likewise obtained significant improvements only in transfer performance (e.g., Bannert, 2003; Bannert, 2005; Bannert, 2006).



In accordance with Lin and Lehman (1999), who also found an effect for transfer tasks only, we argue that solving transfer tasks requires a deeper understanding and that such tasks are therefore affected most strongly by metacognitive activities. Consequently, the evaluation of metacognitive interventions must incorporate transfer tasks as main evaluation criteria in order to investigate their educational effectiveness.

Effects of the metacognitive support device, as realised here with a combination of computer-based instruction and paper-based prompts, on the self-reported use of metacognitive and strategic activities were partly obtained as expected. Participants in the metacognitive instruction group retrospectively reported a higher degree of metacognition and organisation activities during learning, albeit not with respect to elaboration and rehearsal strategies. It should be added, however, that these self-reports did not predict learning performance significantly (Pearson correlation coefficients between the different scales and transfer performance are all smaller than r = 0.17, p = 0.22), which may also explain the unexpected lack of effects on all scales. With regard to this finding, we need additional online measures in further research (such as thinking-aloud protocols, e.g., Bannert & Mengelkamp, 2008; Hofer, 2004; Veenman, 2005; logfile recording, e.g., Winne & Perry, 2000; Wirth, 2004; and/or eye-movement methods, e.g., O'Hara & Sellen, 1997) in order to find out in which quality the supported group actually performed the instructed metacognitive and strategic learning activities. With such online methods, we will also be able to analyse students' learning behaviour in more detail. One could speculate, for instance, that the students did not really use the metacognitive support device because they ignored the metacognitive instruction or the prompts or even both, and thus did not really benefit from the support, as indicated by the missing effects on some of our outcome measures. Thus, process analyses providing data on whether and in which quality students used the metacognitive support device are necessary in future research in order to further optimise the use of tools in learning environments.

Looking at our learning setting, one has to consider that although it was authentic, it was neither too difficult nor too overloading. Due to financial reasons, it included linear text learning materials and paper-based prompts. With respect to other studies in which we mainly used hypermedia learning materials (Bannert, 2007), we assume that the same or even greater effects could be obtained in more complex, open-ended, non-linear hypermedia environments, since we know from research on metacognition that difficult tasks require more metacognitive competence than easy learning tasks in well-structured learning settings (e.g., Hasselhorn, 1995; Pressley et al., 1989; Weinert, 1984). Thus, this study gives first insights into the role and support of metacognitive activities during learning, which is necessary in order to design and use computer-based learning settings successfully.

This research focused on instructional measures with very short interventions, mainly due to the assumption of the existence of a metacognitive production deficit. Maybe this assumption was wrong and instead, at least for some of the participants, a mediation deficit was the case. This was not examined at the beginning of the study because, up to now, there is no economical and valid assessment instrument available for examining these different levels of metacognitive competence. Thus, we were only able to examine the participants' metacognitive knowledge, which is much easier to assess by questionnaire; however, this measure does not necessarily correlate with metacognitive skills and competence (Bannert & Mengelkamp, 2008; Mengelkamp & Bannert, submitted for publication; Veenman, 2005). Moreover, our support device offered no feedback to the students concerning the quality of their metacognitive activities (e.g., van den Boom, Paas, van Merriënboer, & van Gog, 2004).

Although we do not know the participants' metacognitive competence level due to the current lack of valid assessment tools, with regard to current research on self-regulated learning (e.g., Winne, 2005) we assume that instructing students more intensively, with adequate feedback and in a longer training session, should increase their strategic learning behaviour and, in turn, improve their learning outcome.

In this study, some students reported after learning that they felt restricted in their own way of learning when they had to consider the demanding activities asked of them by the metacognitive support. Most probably, this intervention requires additional cognitive capacities, which may also be true for tool use in general (e.g., Calvi & de Bra, 1997). From the perspective of cognitive load theory (Sweller, van Merriënboer, & Paas, 1998), one could consider this measure a procedure to raise germane cognitive load, since learning performance was increased for transfer tasks. However, our support measure is still not optimal, as it was too restricting. Thus, we want to investigate further how computer-assisted metacognitive support could be optimised.

Summing up, this kind of metacognitive support was effective, which cannot be taken for granted given that recent metacognitive intervention studies often reveal no learning effects at all (Hasselhorn & Hager, 1998; van den Boom et al., 2004). To conclude, our results confirm other research showing the need for scaffolding students' learning activities in electronic learning environments in general (e.g., Clarebout & Elen, 2006). Future research should focus on how the different support measures can be optimised by conducting process analyses.

References

Astleitner, H., & Leutner, D. (1995). Learning strategies for unstructured hypermedia – A framework for theory, research, and practice. Journal of Educational Computing Research, 13, 387–400.

Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist, 40, 199–209 (Special Issue on Computers as Metacognitive Tools for Enhancing Student Learning).

Baker, L. (1994). Fostering metacognitive development. In H. Reese (Ed.), Advances in Child Development and Behavior, 25, 201–239.

Bannert, M. (2003). Effekte metakognitiver Lernhilfen auf den Wissenserwerb in vernetzten Lernumgebungen [Effects of metacognitive help devices on knowledge acquisition in networked learning environments]. Zeitschrift für Pädagogische Psychologie, 17, 13–25.

Bannert, M. (2005). Designing metacognitive support for hypermedia learning. In T. Okamoto, D. Albert, T. Honda, & F. W. Hesse (Eds.), The 2nd joint workshop of cognition and learning through media-communication for advanced e-learning (pp. 11–16). Tokyo, Japan: Sophia University.

Bannert, M. (2006). Effects of reflection prompts when learning with hypermedia. Journal of Educational Computing Research, 4, 359–375.

Bannert, M. (2007). Metakognition beim Lernen mit Hypermedien [Metacognition in hypermedia learning]. Münster: Waxmann.

Bannert, M., & Mengelkamp, C. (2008). Assessment of metacognitive skills by means of instruction to think aloud and reflect when prompted. Does the verbalisation method affect learning? Metacognition and Learning, 3, 39–58.

Calvi, L., & De Bra, P. (1997). Proficiency-adapted information browsing and filtering in hypermedia educational systems. User Modelling and User-Adapted Interaction, 7, 257–277.

Clarebout, G., & Elen, J. (2006). Tool use in computer-based learning environments: Towards a research framework. Computers in Human Behavior, 22, 389–411.

Dillon, A., & Gabbard, R. (1998). Hypermedia as an educational technology: A review of the quantitative research literature on learner comprehension, control, and style. Review of Educational Research, 68, 322–349.

Friedrich, H. F., & Mandl, H. (1992). Lern- und Denkstrategien – ein Problemaufriß [Learning and thinking strategies]. In H. Mandl & H. F. Friedrich (Eds.), Lern- und Denkstrategien. Analyse und Intervention (pp. 3–54). Göttingen: Hogrefe.

Haller, E. P., Child, D. A., & Walberg, H. J. (1988). Can comprehension be taught? A quantitative synthesis of metacognitive studies. Educational Researcher, 17, 5–8.

Hasselhorn, M. (1995). Kognitives Training: Grundlagen, Begrifflichkeiten und Desiderate [Cognitive training: Foundations, terms and desiderata]. In W. Hager (Ed.), Programme zur Förderung des Denkens bei Kindern (pp. 14–40). Göttingen: Hogrefe.

Hasselhorn, M., & Hager, W. (1998). Kognitive Trainings auf dem Prüfstand: Welche Komponenten charakterisieren erfolgreiche Fördermaßnahmen? [Cognitive trainings re-examined: Components of success]. In M. Beck (Ed.), Evaluation als Maßnahme der Qualitätssicherung. Pädagogisch-psychologische Interventionen auf dem Prüfstand (pp. 85–98). Tübingen: Deutsche Gesellschaft für Verhaltenstherapie.

Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills interventions on student learning: A meta-analysis. Review of Educational Research, 66, 99–136.

Hofer, B. (2004). Epistemological understanding as a metacognitive process: Thinking aloud during online searching. Educational Psychologist, 39, 43–55.

Lawless, K. A., & Brown, S. W. (1997). Multimedia learning environments: Issues of learner control and navigation. Instructional Science, 25, 117–131.

Lin, X. (2001). Designing metacognitive activities. Educational Technology Research and Development, 49, 1042–1629.

Lin, X., & Lehman, J. D. (1999). Supporting learning of variable control in a computer-based biology environment: Effects of prompting college students to reflect on their own thinking. Journal of Research in Science Teaching, 36, 837–858.

McManus, T. F. (2000). Individualizing instruction in a web-based hypermedia learning environment: Nonlinearity, advance organizers, and self-regulated learners. Journal of Interactive Learning Research, 11, 219–251.

Mengelkamp, C., & Bannert, M. (submitted for publication). Judgements about knowledge: Searching for factors influencing their validity. Metacognition and Learning.

O'Hara, K., & Sellen, A. (1997). A comparison of reading paper and on-line documents. In S. Pemberton (Ed.), Proceedings of ACM CHI 97 conference on human factors in computing systems (pp. 335–342). New York, NY: ACM.

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801–814.

Pressley, M., Borkowski, J. G., & Schneider, W. (1989). Good information processing: What it is and how education can promote it. International Journal of Educational Research, 13, 857–867.

Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66, 181–221.

Rouet, J.-F., & Levonen, J. J. (1996). Studying and learning with hypertext: Empirical studies and their implications. In J.-F. Rouet, J. J. Levonen, A. Dillon, & R. J. Spiro (Eds.), Hypertext and cognition (pp. 9–23). Mahwah, NJ: Lawrence Erlbaum Associates.

Schunk, D. H., & Zimmerman, B. J. (Eds.). (1998). Self-regulated learning. From teaching to self-reflective practice. New York: Guilford.

Sweller, J., van Merriënboer, J., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10, 251–296.

Unz, D. C., & Hesse, F. W. (1999). The use of hypertext for learning. Journal of Educational Computing Research, 20, 279–295.

van den Boom, G., Paas, F., van Merriënboer, J., & van Gog, T. (2004). Reflection prompts and tutor feedback in a web-based learning environment: Effects on students' self-regulated learning competence. Computers in Human Behavior, 20, 551–567.

Veenman, M. V. (2005). The assessment of metacognitive skills: What can be learned from multi-method designs? In C. Artelt & B. Moschner (Eds.), Lernstrategien und Metakognition: Implikationen für Forschung und Praxis. Münster: Waxmann.

Weidenmann, B. (2002). Abbilder in Multimediaanwendungen [Use of pictures in multimedia]. In L. J. Issing & P. Klimsa (Eds.), Information und Lernen mit Multimedia und Internet (pp. 83–96). Weinheim: PVU.

Weinert, F. E. (1984). Metakognition und Motivation als Determinanten der Lerneffektivität: Einführung und Überblick [Metacognition and motivation as determinants of learning effectiveness: Introduction and overview]. In F. E. Weinert & R. H. Kluwe (Eds.), Metakognition, Motivation und Lernen (pp. 9–23). Stuttgart: Kohlhammer.

Weinert, F. E. (1994). Lernen lernen und das eigene Lernen verstehen [Learning to learn and understanding one's own learning]. In K. Reusser & M. Reusser-Weyeneth (Eds.), Verstehen – Psychologischer Prozeß und didaktische Aufgabe (pp. 183–205). Bern: Huber.

Weinstein, C. E., Husman, J., & Dierking, D. R. (2000). Self-regulation interventions with a focus on learning strategies. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 727–747). San Diego, CA: Academic Press.

Wild, K. P., Schiefele, U., & Winteler, A. (1992). LIST. Ein Verfahren zur Erfassung von Lernstrategien im Studium [LIST. A scale for assessing learning strategies] (Gelbe Reihe: Arbeiten zur Empirischen Pädagogik und Pädagogischen Psychologie, Nr. 20). Neubiberg: Universität der Bundeswehr, Institut für Erziehungswissenschaft und Pädagogische Psychologie.

Winne, P. H. (2005). A perspective on state-of-the-art research on self-regulated learning. Instructional Science, 33, 559–565.

Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 531–566). San Diego, CA: Academic Press.

Wirth, J. (2004). Selbstregulation von Lernprozessen [Self-regulation of learning processes]. Münster: Waxmann.