
Student Learning: What Has Instruction Got to Do With It?

Hee Seung Lee and John R. Anderson
Department of Psychology, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213; email: [email protected], [email protected]

Annu. Rev. Psychol. 2013. 64:445–69

First published online as a Review in Advance on July 12, 2012

The Annual Review of Psychology is online at psych.annualreviews.org

This article’s doi:10.1146/annurev-psych-113011-143833

Copyright © 2013 by Annual Reviews. All rights reserved.

Keywords

discovery learning, direct instruction, instructional guidance, explanation, worked examples

Abstract

A seemingly unending controversy in the field of instruction science concerns how much instructional guidance needs to be provided in a learning environment. At one extreme lies the claim that it is important for students to explore and construct knowledge for themselves, which is often called discovery learning, and at the other extreme lies the claim that providing direct instruction is more beneficial than withholding it. In this article, evidence and arguments that support either of the approaches are reviewed. We also review how different instructional approaches interact with other instructional factors that have been known to be important, such as individual differences, self-explanation, and comparison. The efforts to combine different instructional approaches suggest alternative ways to conceive of learning and to test it.


Contents

INTRODUCTION
THE DEBATE OVER DISCOVERY LEARNING
    Instructional Design Based on a Constructivist Theory of Learning
    Practice Facilitates Successful Discovery Learning
PROVISION OF DIRECT INSTRUCTION
    Empirical Evidence on Superiority of Direct Instruction
    Example-Based Learning: Worked Examples
INTERACTION WITH OTHER INSTRUCTIONAL FACTORS
    Individual Difference (Expertise Reversal Effect)
    Self-Explanation
    Cognitive Load Theory and Designing an Effective Worked Example
    Effects of Comparison in Learning by Worked Examples
    Effects of Instructional Explanations in Learning by Worked Examples
COMBINING DISCOVERY LEARNING AND DIRECT INSTRUCTION APPROACHES
    Invention Activity Followed by Direct Instruction
SUMMARY
CONCLUSIONS

INTRODUCTION

Learning results from what the student does and thinks, and only from what the student does and thinks. The teacher can advance learning only by influencing what the student does to learn.

Herbert A. Simon (1916–2001)

In the field of learning science, many efforts have been made to find optimal instructional conditions to promote student learning. As suggested by Herbert Simon, one of the founders of cognitive science, educators must understand how students learn and then translate this understanding into practice when designing a learning environment.

THE DEBATE OVER DISCOVERY LEARNING

A seemingly unending controversy in the field of instruction science concerns how much instructional guidance needs to be provided in a learning environment (Kirschner et al. 2006, Kuhn 2007, Tobias & Duffy 2009). Is it better to try to tell students what they need to know, or is it better to give students an opportunity to discover the knowledge for themselves? Conceiving of this problem as deciding whether to give or withhold assistance, Koedinger & Aleven (2007) called this issue the "assistance dilemma." The contrast between the two positions is best understood as a continuum, and both ends appear to have their own strengths and weaknesses. As a result, it is very difficult to find the right balance between the two extremes. The assistance dilemma is also related to the notion of "desirable difficulty" (Bjork 1994, Schmidt & Bjork 1992). Learning conditions that introduce certain difficulties during instruction appear to slow the rate of learning but often lead to better long-term retention and transfer than learning conditions with less difficulty (e.g., the mixed- or spaced-practice effect).

In this article, we briefly review older evidence and then focus on the more recent studies on this issue. At one end of the continuum, it is argued that minimizing instruction encourages students to discover or construct knowledge for themselves by allowing them to freely explore learning materials (Bruner 1961, Papert 1980, Steffe & Gale 1995). This approach is based on a constructivist theory of learning, and Jean Piaget (1970, 1973, 1980) has often been referenced as a basis for constructivism. For instance, according to Piaget (1973), "To understand is to discover, or reconstruct by rediscovery, and such conditions must be complied with if in the future individuals are to be formed who are capable of production and creativity and not simply repetition" (p. 20).


In a discovery learning environment, students are regarded as active learners and are given opportunities to digest materials for themselves rather than as passive learners who simply follow directions. Discovery-based approaches have been widely accepted as major teaching methods by teachers and educators with constructivist views of learning. In truth, however, most discovery learning environments actually involve some amount of guidance, so this approach might better be referred to as "minimal guidance."

Why has a discovery learning approach been widely advocated by many teachers and researchers? Reiser et al. (1994) summarized possible cognitive and motivational benefits of discovery learning. First, learning through discovery is believed to have several cognitive benefits, such as the development of inquiry skills and the utility of learning from errors. Students are thought to gain more meaningful understanding than with rote learning because of the extensive self-generation involved and because they attempt to explain and understand their mistakes. Generating activity has been known to help long-term retention (Bobrow & Bower 1969, Lovett 1992, Slamecka & Graf 1978). However, these potential cognitive benefits can also be lost when trying to discover the knowledge. For instance, students may not be able to remember how they solved a problem after excessive floundering (Lewis & Anderson 1985). Also, this floundering tends to increase learning time, and students may never be able to discover the important principles that they are expected to learn (Ausubel 1964).

It has also been argued that discovery methods produce benefits for retention and transfer (Bruner 1961, Suchman 1961). To test this proposition, Gutherie (1967) trained students to decipher cryptograms with different forms of instructional methods. Gutherie compared students who were given explicit rules followed by problem practice with students who just tried to solve the problems and had to discover the rules. The discovery students did better on the transfer problems that required new rules. Similarly, McDaniel & Schlager (1990) found that the discovery method provided benefits when students had to generate a new strategy to solve a transfer problem but not when they could apply the learned strategy.

Second, discovery learning is believed to increase students' positive attitudes toward the learning domain (Bruner 1961, Suchman 1961). Learning through exploration allows students to have more control in a task, and this in turn fosters more intrinsic motivation. Intrinsically motivated students are known to find a learning task more rewarding and tend to do more productive cognitive processing in comparison with extrinsically motivated students (Lepper 1988). In addition, it is argued that discovery learning enables students to learn additional facts about the target domain. For instance, when preschool children were given an adult's pedagogical instruction without interruption, they tended to focus on only the target function of a toy that was shown by the adult (Bonawitz et al. 2011). In contrast, when the pedagogical demonstration was experimentally interrupted, children explored the function of the toy more broadly and were more likely to discover novel information.

Despite all the arguments for the benefits of discovery learning, the empirical evidence has been mixed at best. There have now been decades of efforts to dissuade educators of the supposed benefits of discovery learning. For instance, in a 1968 summary of 25 years of research, Ausubel (1968, pp. 497–498) wrote:

[A]ctual examination of the research literature allegedly supportive of learning by discovery reveals that valid evidence of this nature is virtually nonexistent. It appears that the various enthusiasts of the discovery method have been supporting each other research-wise by taking in each other's laundry, so to speak, that is, by citing each other's opinions and assertions as evidence and by generalizing wildly from equivocal and even negative findings.


Reviewing the more recent research in a paper provocatively titled "Should there be a three-strikes rule against pure discovery?", Mayer (2004, p. 17) concludes:

Like some zombie that keeps returning from its grave, pure discovery continues to have its advocates. However, anyone who takes an evidence-based approach to educational practice must ask the same question: Where is the evidence that it works? In spite of calls for free discovery in every decade, the supporting evidence is hard to find.

Kirschner et al. (2006), in their similarly provocatively titled paper "Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching," come to a similar conclusion. However, as a sign that we do not have an all-or-none decision to make between pure discovery and direct instruction, Mayer (2004) concludes that it is important for students to construct their own knowledge and advocates "guided discovery" as the best way to achieve this. Also, Kirschner et al. acknowledge what they call the "expertise reversal effect" (Kalyuga 2007, Kalyuga et al. 2003), whereby experienced learners benefit more from low levels of guidance than from high levels of guidance. In this article, we review some of the previous studies that compare many different forms of direct instruction and discovery learning approaches to investigate the effect of different amounts of instructional guidance.

Instructional Design Based on a Constructivist Theory of Learning

Discovery learning is often aligned with what are called constructivist theories of learning, although as Anderson et al. (1998) note, there is a wide variety of constructivist positions, some of which are mutually contradictory. These various positions often suffer from a lack of clear operational definitions and replicable instructional procedures (Klahr & Li 2005). That qualification being noted, there have been several reports of successful constructivist designs in studies conducted in a school environment (e.g., Carpenter et al. 1998, Cobb et al. 1991, Hiebert & Wearne 1996, Kamii & Dominick 1998, Schwartz et al. 2011).

Cobb and his colleagues (1991) developed a set of instructional activities based on a constructivist view of learning and investigated whether the design helped children's mathematical learning. In project classrooms, children used a variety of physical manipulatives and worked in pairs to solve mathematical problems. After working in pairs, a teacher led a whole-class discussion, and the children talked about their interpretations and solutions. At the end of the project, students were given a standardized achievement test and another arithmetic test developed by the project group. The results showed that students in project and nonproject classrooms did not differ in their level of computational performance. However, the students in the project classrooms demonstrated higher levels of conceptual understanding than those in the nonproject classrooms. Although the ability to perform computational tasks seemed similar between the groups, closer analysis showed that nonproject students depended heavily on the use of standard algorithms. This dependency was consistent with the beliefs nonproject students expressed on a questionnaire asking about reasons for success in mathematics. The nonproject students believed it was important to conform to the solution procedures of others. The project students, however, believed that collaborating (explaining their thinking to others) and understanding were important for success in mathematics.

In another study, Hiebert & Wearne (1996) traced children's development of understanding of mathematical concepts and computational skills over the first three years of school in two different instructional environments. To teach place value and multidigit addition and subtraction, they provided students with either conventional instruction or alternative instruction.


In the alternative instruction classrooms, students were presented with contextualized problem situations and encouraged to represent problem quantities with physical materials (base-ten blocks) and written numbers. Using both representations, students had to develop solution strategies and were encouraged to discuss their strategies with the class. The standard algorithms for addition and subtraction were not formally taught; instead, students discussed how and why invented procedures did or did not work. On the other hand, in the conventional instruction classroom, the instruction was mainly guided by the textbook. Teachers taught students how to find the answer, and students worked individually. The use of physical manipulatives was not required, so they were not used much. Assessments of mathematical understanding supported the superiority of the alternative instruction at the end of the third grade. Also, when the relations between conceptual understanding and computational skill were analyzed, different patterns of development were identified between the two groups. The majority of students who received alternative instruction seemed to show good understanding prior to or concurrent with good computational skill. On the other hand, conventionally instructed students tended to show correct computation skills before they developed good understanding.

As shown in these studies, discovering one's own procedures can lead to better understanding and transfer. Carpenter and his colleagues (1998) investigated how inventing strategies was related to the understanding of mathematical concepts and procedures. In this study, children were followed for three years to assess understanding of concepts and procedures for multidigit addition and subtraction. The study compared students who used an invented strategy with students who used a standard algorithm. Students who invented a strategy were able to use not only their own invented strategy (if asked to do so) but also the standard algorithm after they had learned it. Invention students also showed better understanding of base-ten number concepts and better performance on a transfer task. On the other hand, the algorithm group showed significantly more buggy algorithms in their problem solving than did the invented-strategy group.

Kamii & Dominick (1998) also argued that teaching algorithms could harm understanding of multidigit computation. These investigators compared students who had been taught standard algorithms for multidigit computation with those who had not. The students who did not receive the algorithm instruction outperformed those who were taught algorithms. The algorithm-taught students also tended to produce more unreasonable answers, implying that they depended on the use of learned procedures and lacked a deep conceptual understanding about the computation procedures.

Practice Facilitates Successful Discovery Learning

The studies discussed above were conducted in the classroom, and in this setting it is difficult to control all the factors at play (nor would one want to). Some more-focused laboratory studies suggest that students learn better in a discovery learning environment. Interestingly, these studies involve high levels of practice. When combined with high levels of practice or longer acquisition time, students appear to learn better in a discovery learning environment than in a direct instruction environment.

Brunstein et al. (2009) investigated how learning improves as students become more experienced through a series of learning sessions under different instructional conditions. Algebra-like problems were constructed in a novel graphical representation, and this novel format made it possible to study the learning of equation solving anew in college populations. Participants received different types of instruction according to experimental condition: verbal direction giving a general characterization of the actions, direct demonstration of what to do, both, or neither (the discovery condition). Thus, in the discovery condition, participants were provided with no guidance, and they had to learn from the consequences of their actions.


The results showed that although discovery students showed the worst performance on the early problems by making more errors, they performed best on the later problems, making fewer errors and taking less problem-solving time than the students who received instruction. Also, in the later phase of learning, students in the discovery learning condition showed the best performance regardless of the position of the problems.

In their second study, Brunstein and her colleagues (2009) obtained quite different results when students had only one-quarter as many problems to practice on. About 50% of the participants in the discovery condition felt lost and wanted to quit the study, whereas none had quit in the first study. The remaining discovery participants did worse than those who received direct instruction. Comparing findings from the first and second studies, the discovery learning approach appeared to be effective only with high levels of practice. Without this practice to consolidate their understanding, students in the discovery condition had an especially hard time understanding the problems.

Similar results were obtained by Dean & Kuhn (2006), who investigated the effects of direct instruction and discovery learning on teaching the control-of-variables strategy (CVS) in the science domain. This study followed students' progress (acquisition and maintenance) over approximately six months. Fourth-grade students learned to design unconfounded experiments through computer-based inquiry tasks under one of three conditions: direct instruction only (DI), practice only (PR), or a combination of instruction and practice (DI+PR). To design an unconfounded experiment, students were required to make a comparison by manipulating only one factor while setting all other conditions the same. In the DI condition, students received a single session of instruction without long engagement. In the PR condition, students freely practiced CVS with a computer program over 12 sessions without direct instruction. After general initial instruction, only the direct instruction conditions (both DI and DI+PR) received a series of comparisons between two different experimental conditions, along with comments about whether each comparison was good or bad and an explanation as to why.

tal conditions and comments about whether thecomparison was good or bad and an explanationas to why.

In a replication of the results of Klahr &Nigam (2004), direct instruction proved to beeffective in an immediate assessment. How-ever, in the tests given after the eleventh week,the advantage of direct instruction did not re-main without further practice. In contrast, thepractice group showed continually improvingperformance over time. Dean & Kuhn (2006,p. 394) conclude, “. . .direct instruction appearsto be neither a necessary nor sufficient condi-tion for robust acquisition or for maintenanceover time.” These results are consistent withthe findings of Brunstein et al. (2009) in thatalthough minimal guidance may not be effec-tive in the earlier stage of learning, with highlevels of practice, performance improves overtime. However, Klahr and his colleagues (e.g.,Klahr & Nigam 2004, Matlen & Klahr 2010)repeatedly found positive learning gains fromdirect instruction on teaching CVS; some ofthese studies are reviewed in detail in a latersection.

The finding that discovery learning can be effective when accompanied by high levels of practice also suggests a new interpretation of a previous study (Charney et al. 1990) that investigated three different instructional approaches to teach college students to use a spreadsheet program with a command line interface. The three experimental conditions were tutorials, problem solving, and learner exploration. The tutorial condition was given the highest level of instruction and the exploration condition was given the lowest level of instruction, with the problem-solving condition in between. The results showed that the tutorial condition was worst and the problem-solving condition was best, with exploration coming in between. However, Tuovinen & Sweller (1999) criticized this study because time-on-task was not controlled. Alternatively, the longer training may have enabled minimal guidance to be effective, and the exploration condition might have been superior to direct instruction even if direct instruction had been given more time.


PROVISION OF DIRECT INSTRUCTION

Empirical Evidence on Superiority of Direct Instruction

Although the studies reviewed above might be seen as support for discovery learning, there is no lack of studies showing the superiority of direct instruction in many different domains, such as problem-solving rules (Craig 1956, Gagne & Brown 1961, Kittel 1957), programming (Fay & Mayer 1994, Lee & Thompson 1997), science (Chen & Klahr 1999, Klahr & Nigam 2004, Matlen & Klahr 2010, Strand-Cary & Klahr 2008), mathematics (Carroll 1994, Cooper & Sweller 1987, Sweller & Cooper 1985), and procedure learning (Rittle-Johnson et al. 2001). The success reported by tutoring programs in mathematics (such as the Cognitive Tutor) also supports the importance of providing instructional guidance in response to students' needs (Anderson et al. 1995, Koedinger et al. 1997).

Another good example showing the advantages of direct instruction is the series of studies Klahr and his colleagues have done on CVS. The original study (Chen & Klahr 1999) demonstrated that direct instruction was more effective than discovery learning in improving children's ability to design unconfounded experiments. However, this study has been criticized with respect to its epistemology because high CVS scores do not imply a high level of authentic scientific inquiry (Chinn & Malhotra 2001). Following this criticism, Klahr & Nigam (2004) investigated the effects of direct instruction and discovery learning on CVS in a more authentic context with third- and fourth-grade children. They found that, as in earlier studies, direct instruction was more effective than discovery learning. Moreover, they found that on the "far transfer" science fair assessment, the many children who mastered CVS in the direct condition performed just as well as the few children who mastered it in the discovery condition. Thus, contrary to one of the common claims for the superiority of discovery learning, their study demonstrated that far transfer did not depend on how children learned something, only that they learned it.

Further investigations by Strand-Cary & Klahr (2008) have also bolstered the effectiveness of direct instruction compared with the discovery learning approach. These studies are particularly noteworthy because they show the superiority of direct instruction in a more complex domain and on transfer tasks, which differs from commonly held beliefs that direct instruction is effective only for rote skills and direct tests of knowledge.

Matlen & Klahr (2010) examined the effect of different sequences of high versus low levels of instructional guidance on teaching CVS to find an optimal temporal sequence of guidance. By crossing the amount of instruction with two separate training sessions, four different orderings of instructional guidance were tested. The four conditions were high+high, high+low, low+high, and low+low, depending on whether early and late practice provided high or low instructional guidance. High guidance provided direct instruction and inquiry questions, whereas low guidance provided only inquiry questions. The study found the best learning and transfer when high levels of guidance were repeated in the early and late training sessions (i.e., the high+high condition).

Example-Based Learning: Worked Examples

A particularly interesting class of studies compares example-based learning with problem-based learning conditions. In example-based learning (often referred to as the worked-example condition), learners are provided with a worked example to study. Worked examples are instructional tools that provide an expert's solution that students can emulate. They typically involve a problem statement, step-by-step solution steps, and a final answer to the problem (Atkinson et al. 2000, Renkl et al. 1998). Worked examples are usually alternated with problems. In contrast, in problem-based learning, learners simply practice solving problems after initial instruction.


In this kind of manipulation, the worked examples are often characterized as providing a form of direct instruction (Kirschner et al. 2006), but one could reasonably argue that the examples only provide scaffolding for a discovery process. In any case, as reviewed below, example-based instruction is often effective.

Carroll (1994) examined the effect of worked examples as an instructional support in the algebra classroom. High school students learned to translate words describing mathematical situations into formal equations (e.g., writing "five less than a number" as "x–5") in either a worked-example condition or a conventional practice condition. Initial instruction included three examples and three practice problems in both conditions. In the worked-example condition, students were given a worksheet with 12 pairs of problems, one example followed by one problem. In the conventional practice condition, students had to solve 24 problems in the same order but without any examples. Students in the worked-example condition outperformed those in the practice condition, showing fewer errors on both immediate and delayed posttests (for both learned and transfer problems), decreased need for assistance from the teacher, and less time taken to complete the work. It had been reported earlier in several other studies that the traditional practice of problem solving was not as effective as example-problem pairs (e.g., Cooper & Sweller 1987, Paas & Van Merrienboer 1994b, Sweller & Cooper 1985, Trafton & Reiser 1993).

Tuovinen & Sweller (1999) also compared an exploration-learning condition with a worked-example condition in college students learning to use a database program. The superiority of worked examples was again reported, consistent with the findings of Carroll (1994), but this time the advantage occurred only for inexperienced learners. Tuovinen & Sweller had participants rate the cognitive load they experienced (ratings of the mental effort required to complete the task using a Likert scale). Cognitive load is considered a multidimensional construct that represents the load imposed on the cognitive system while performing a particular task and is often conceptualized in terms of mental load, mental effort, and performance (Paas & Van Merrienboer 1994a,b). The exploration group reported experiencing higher cognitive load than the worked-example group, but again the difference was reliable only for students who had less experience. This suggests that providing examples is effective in part because it lowers cognitive load for challenged learners.

Zhu & Simon (1987) claimed that students could learn from worked examples and from problem solving equally successfully and efficiently, without lectures or other forms of direct instruction, as long as the examples and/or problems are arranged so that students do not have to engage in too much trial-and-error search. When learning from examples, students use the worked examples to induce the relevant procedures and principles and then apply these to new problems. On the other hand, when learning by doing (i.e., problem solving), students have to first generate appropriate worked examples for themselves. When a problem solver correctly solves a problem, the solution path becomes a worked-out example.

In Zhu & Simon’s (1987) study, studentslearned to factor quadratic algebraic expres-sions and showed learning in the problem-solving condition comparable to the learning-from-examples condition without lectures.Three possible explanations were suggestedfor this successful learning in both conditions.First, students had already studied the meaningof factoring and factoring of integers, thus allstudents had the background knowledge thatwas prerequisite for learning current materials.Second, students were provided with proce-dures for checking the correctness of theiranswers. This reduced the probability of stu-dents making errors of induction from incorrectsolutions. Third, all examples and problemswere carefully arranged so that students had toattend to only certain aspects of problems. Thiscould reduce inefficient trial-and-error searchand in turn reduce working memory load.

Besides studies on worked examples, there is a great abundance of studies showing that providing an example is an effective instructional method.


Some studies have compared providing an example with providing procedures or rules. For example, Fong et al. (1986) reported that students who were trained with examples performed as well as students who were trained with explicit rules in learning statistical concepts. Both training methods were equally effective at improving the quality of statistical reasoning, and training with both methods had an additional positive effect. Reed & Bolstad (1991) also found that it was more effective to provide both examples and written procedures than to provide either examples or procedures alone when teaching students to construct equations for word problems about work situations. However, in this study, providing examples was more effective than providing written procedures only.

INTERACTION WITH OTHER INSTRUCTIONAL FACTORS

Individual Difference (Expertise Reversal Effect)

As Tuovinen & Sweller (1999) demonstrated, the effectiveness of an instructional method might differ based on a learner's previous experience or prior knowledge level. One instructional approach might be ideal for experienced learners but might not be effective, or might even be detrimental, for inexperienced learners or novices. The expertise reversal effect (Kalyuga 2007, Kalyuga et al. 2003) is an example of the interaction between the level of the learner and the level of instruction: instructional guidance helps inexperienced learners but is not beneficial for experienced learners. The idea of aptitude-treatment interaction (Cronbach & Snow 1977) has a long history and has been tested by many researchers. For example, Campbell (1964) found that the high-aptitude group benefited from a self-direction learning method, whereas the low-aptitude group benefited from a programmed instruction method. Cronbach & Snow (1977) also reported that when learners were given an opportunity to process the information in their own way, only high-ability learners benefited; low-ability learners appeared to be handicapped by this. Aptitude-treatment interactions have been found in many domains including multimedia learning (Mayer & Sims 1994, Seufert et al. 2007), probability calculation (Renkl 1997), and logic programming (Kalyuga et al. 2001). This idea extends naturally into the adaptive instructional support found in intelligent tutoring systems, where instruction is adapted in response to the learner's progress (Anderson et al. 1995, Salden et al. 2010).

Shute (1992) argued that some studies perhaps failed to produce successful effects of instructional manipulations simply because their manipulations were having different effects on different ability groups. To test this idea, Shute (1992) investigated the effects of two learning environments on mastering the basic principles of electricity and examined how the effects differed depending on the learner's ability. In the rule-application environment, participants were given feedback that explicitly stated the variables and the relationships among those variables that were used to describe a principle for a given problem. In the rule-induction environment, the relevant variables were identified by feedback, but participants had to induce the relationships and generate their own interpretation. While there was no main effect of learning environment on learning outcomes, there was a significant interaction between cognitive ability (an associative learning measure) and the learning environment. Also, the interaction pattern differed across learning outcome measures. For declarative knowledge acquisition, rule induction was more effective for high-ability learners, but rule application was more effective for low-ability learners. In contrast, for procedural skill acquisition, high-ability learners benefited more from the rule-application environment, and low-ability learners showed poor learning outcomes regardless of the type of learning environment.

A possible explanation for this intriguing interaction is that learning is enhanced when there is a good match among learning environment, outcome measure, and cognitive ability.


For example, one will acquire more robust declarative representations if one induces the rules than if one just applies them. Because the high-ability learners possessed the relevant cognitive skills, they were able to understand concepts and formulate a rule in the rule-induction condition. However, the low-ability learners lacked the cognitive skills needed to induce the rules and were able to acquire more declarative knowledge when the rules were explicitly provided in the rule-application condition. In contrast, the rule-application environment supports acquisition of procedural skills. In the rule-application condition, the high-ability learners promptly applied rules and procedures without a demanding induction process. This allows more opportunity to practice procedural skills than in the rule-induction condition. However, the low-ability learners never proceduralized the necessary skills within the training time because of their deficient skills, and they performed poorly regardless of the type of learning environment.

Kalyuga et al. (2001) examined the interaction between instructional guidance and learners' knowledge level. In their study, trade apprentices from manufacturing companies were given general instruction on programmable logic controller programs for relay circuits and then given experimental training sessions. Participants had either pure problem-solving practice or mixed worked-example and problem-solving practice. They found that students benefited less from worked examples as they mastered more material. The worked-example group showed performance superior to that of the problem-solving group in the early phase of learning, but the difference was reversed by the end of learning.

These findings indicate that levels of learner knowledge interact with levels of instructional guidance and suggest that students may learn better if different instructional methods are used depending on the learner's experience through the acquisition phase. Following this rationale, Renkl et al. (2000) proposed combining two instructional methods (worked examples and problem solving) by presenting examples in the early stage of learning and then presenting problems in the later stage of learning. Renkl and his colleagues (2002) tested this proposal by comparing a fading procedure against traditional example-problem pairs. In the fading procedure, a complete example is presented first, and then increasingly incomplete examples are presented by omitting solution steps; finally, a complete problem is presented. The study found positive effects of the fading procedure on near-transfer items. Atkinson et al. (2003) replicated this fading-out example effect by comparing example-problem pair learning with a backward fading procedure (where the last solution steps are omitted first). Schwonke et al. (2007) also found that tutored problem solving combined with gradually faded examples led to better transfer performance than did tutored problem solving alone. However, all of these studies employed a fixed fading scheme, and the fading schedule was not adapted to the student's learning. Schwonke et al. (2007) suggested that fading examples would be more beneficial if worked-out steps were faded adaptively for each individual learner.
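
As an illustration of what a fixed backward-fading schedule might look like (a minimal sketch of our own with hypothetical step labels, not the materials used by Renkl, Atkinson, or Schwonke and colleagues), each successive item omits one more worked-out step from the end until only a pure problem remains:

```python
def backward_fading_schedule(solution_steps):
    """Generate practice items for one problem type under backward fading.

    Item 0 shows every worked-out step; each later item omits one more step
    from the end, and the final item is a pure problem the student solves alone.
    """
    n = len(solution_steps)
    items = []
    for omitted in range(n + 1):
        items.append({
            "worked_out": solution_steps[: n - omitted],     # steps shown to the student
            "student_solves": solution_steps[n - omitted:],  # steps the student completes
        })
    return items


# Hypothetical three-step solution.
steps = ["set up the equation", "isolate the unknown", "compute the answer"]
for i, item in enumerate(backward_fading_schedule(steps)):
    print(i, item)
```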

Following this suggestion, Salden et al. (2010) examined the effects of fading worked-out examples either on a fixed schedule or adaptively within the Geometry Cognitive Tutor. In the fixed fading condition, students were initially provided with complete worked examples, example steps were then gradually faded from the problems, and at the end students received pure problems, all according to a fixed schedule. In the adaptive fading condition, an individual student's mastery of geometry theorems, estimated by a Bayesian knowledge-tracing algorithm (Corbett & Anderson 1995), was used to decide when a worked-out step should be faded. The results showed that the adaptive fading of worked examples led to better performance on delayed posttests than did the fixed fading of worked examples or the standard tutored problem-solving practice.
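
Corbett & Anderson's (1995) knowledge tracing maintains, for each skill, a probability that the student has mastered it and updates that probability after every observed step. The following is a minimal sketch of the standard update; the parameter values, mastery threshold, and fading rule shown here are illustrative assumptions, not the values used in the Geometry Cognitive Tutor study:

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """One step of Bayesian knowledge tracing for a single skill.

    p_known : prior probability the skill is already mastered
    correct : whether the student's step was correct
    Returns the posterior probability of mastery, also allowing for
    learning at this practice opportunity.
    """
    if correct:
        # P(known | correct response)
        posterior = (p_known * (1 - p_slip)) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess
        )
    else:
        # P(known | incorrect response)
        posterior = (p_known * p_slip) / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess)
        )
    # Transition: the student may learn the skill at this opportunity.
    return posterior + (1 - posterior) * p_learn


def should_fade(p_known, threshold=0.95):
    """Adaptive fading rule: stop showing the worked-out step once mastery is likely."""
    return p_known >= threshold


# Example: a student's steps for one skill are correct, correct, incorrect, correct.
p = 0.3  # initial estimate of mastery
for outcome in [True, True, False, True]:
    p = bkt_update(p, outcome)
    print(round(p, 3), "fade step" if should_fade(p) else "keep worked-out step")
```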


Self-Explanation

Researchers have also been interested in how the discovery versus instruction dichotomy interacts with other instructional factors. For instance, the effect of self-explanation has been investigated along with the amount of instructional guidance in several studies (e.g., Atkinson et al. 2003, Rittle-Johnson 2006). The self-explanation effect occurs when students who try to explain example solutions to themselves learn more than those who do not. The original study on self-explanation was performed by Chi et al. (1989) and has been followed up in many laboratory experiments (e.g., Renkl et al. 1998, Siegler 2002) and classroom studies (e.g., Aleven & Koedinger 2002, Hausmann & VanLehn 2007). Self-explanation activity is thought to help learning by invoking the generation effect when students generate their own explanations. For example, Hausmann & VanLehn (2007) showed that students who were prompted to generate their own explanations for examples showed greater learning gains than those who were prompted to paraphrase provided explanations for the same examples. A follow-up study by Hausmann et al. (2009) examined the effects of different types of self-explanation prompts and found that justification and step-focused prompts produced greater benefits from studying examples than did meta-cognitive prompts. It appears that the first two prompts facilitate the acquisition of problem schemas when students generate justifications for each solution step.

Chi and her colleagues (1989) divided students into "good" and "poor" categories based on their problem-solving scores and analyzed the quality of their self-generated explanations collected using the think-aloud method. The analysis revealed that good and poor students differed not only in the amount of verbal protocols they provided, but also in the quality of their explanations. Good students produced more explanations, more idea statements, and more statements that identified their own misunderstandings. While solving problems, good students who ran into difficulty tended to make more specific inquiries to the examples they had studied earlier.

However, in this study, studying time was not controlled. Good students actually spent more time studying the worked-out examples than did the poor students. Therefore, it was not clear whether more time-on-task or better self-explanation produced the successful learning.

A similar differentiation between good and poor students' self-explanations was also reported by Renkl (1997), who controlled time-on-task. With verbal protocol analysis, four groups of participants were identified with respect to self-explanation style, independently of achievement data. The four styles were passive, superficial, principle based, and anticipative. Passive explainers generated poor-quality self-explanations and did not inspect many examples. Superficial explainers inspected many examples, but they spent relatively little time studying each example. These two groups showed worse performance on the posttest in comparison with principle-based explainers and anticipative reasoners. Principle-based explainers attempted to emphasize the meaning and goal of operators and to elaborate on the underlying principles of examples. Anticipative reasoners appeared to use an example to test their problem solving: they anticipated the next step of the example solution and moved on to the next page to check whether their anticipated solution step was actually correct. Pretest score differences suggest that different levels of prior knowledge affected the preferred explanation style. Anticipative reasoners had a relatively high level of prior knowledge, whereas principle-based explainers had a low level of prior knowledge.

Aleven & Koedinger (2002) showed that prompting for self-explanation was beneficial for learning in a class environment by implementing it as part of the Cognitive Tutor Geometry course. Students practiced problem solving in an intelligent tutor program either with or without a prompt to explain solution steps. In the self-explanation condition, students had to type the name of the problem-solving principle that justified the solution step, and the tutor then provided feedback on the correctness of the typed principle.


Students who were prompted to explain their solution steps showed greater understanding and better transfer performance than those who were not asked to explain steps. Students trained without self-explanation prompts appeared to show shallow procedural knowledge.

Rittle-Johnson (2006) investigated whether promoting self-explanation is effective in combination with either direct instruction or discovery learning conditions. Third- through fifth-grade children learned to solve mathematical equivalence problems. Children often understand the equal sign (=) as an operator signal that gets the answer rather than as a relational symbol meaning that the two sides of an equation are the same (Baroody & Ginsburg 1983, Rittle-Johnson & Alibali 1999). The four conditions were constructed by crossing two factors, instruction type (instruction versus invention) and self-explanation prompt (self-explanation versus no explanation). For the instruction groups, a teacher taught a correct add-subtract procedure for solving problems. For the invention groups, no instruction was provided; instead, children were simply told to think of a new way to solve the problem. For the self-explanation groups, children were given an additional screen showing two different answers from two children at another school: one correct and one incorrect answer. The children were asked to explain how the answers were obtained by the other children and why each answer was correct or incorrect. For the no-explanation groups, the additional screen was not provided. The results showed that self-explanation and instruction type did not interact; rather, they simply had an additive effect on learning in that both self-explanation and direct instruction helped children learn a correct procedure.
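
As an illustration of the kind of mathematical equivalence problem at issue and of the relational reading of the equal sign (our own example, not an item from the study), consider a problem of the form 3 + 4 + 5 = 3 + __:

```latex
% A relational (add-subtract) solution makes both sides equal:
\begin{align*}
3 + 4 + 5 &= 3 + x \\
       12 &= 3 + x \\
        x &= 12 - 3 = 9
\end{align*}
% A child who reads "=" as "write the answer" instead tends to write the sum of
% the numbers to the left of the equal sign (12), ignoring the right-hand side.
```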

Cognitive Load Theory and Designing an Effective Worked Example

As we have reviewed, it is often found that conventional problem-solving practice is not an ideal instructional method (e.g., Cooper & Sweller 1987, Sweller & Cooper 1985), and worked examples have been suggested as a better instructional approach (e.g., Carroll 1994, Paas 1992, Renkl 2002, Tuovinen & Sweller 1999; for a review, see Atkinson et al. 2000). Also, worked examples are especially effective for inexperienced learners. Because levels of knowledge tend to interact with levels of instructional guidance, variants of this instructional method, such as the fading procedure, have been suggested to maximize its effect. What makes the worked-example approach effective for inexperienced learners but not for experienced learners? One of the most discussed explanations is cognitive load theory (Sweller 1988). Humans have limited working memory capacity (Baddeley 1992, Miller 1956), and problem solving requires using this limited working memory. Therefore, solving a problem involves high working memory demands (e.g., to keep track of where one is in a search space), and most of the working memory resources are consumed by this activity rather than used to support learning.

Sweller (1988) elaborates this idea in terms of learning domain schemas. Problem solving and schema acquisition both make demands on the mechanisms of selective attention and limited working memory capacity. Problem solvers tend to focus on reducing the difference between the current state and the goal problem state and try to find the right operators to reduce this difference. This focus on specific differences does not help construct the general schemas for a domain. Moreover, learners often flounder as they search for the right operators and lose touch with the important information. However, when direct instruction or a worked example is given, learners do not need to use their working memory resources for an inefficient search and instead can use them to learn the essential relations between problem-solving moves. If a means-ends strategy prevents learners from acquiring schemas, reducing or eliminating goal specificity helps enhance schema acquisition by eliminating the possibility of using a means-ends strategy to solve a problem.


In some studies, when a conventional specific goal was replaced by a nonspecific goal, learning was actually enhanced (Miller et al. 1999, Sweller & Levine 1982, Sweller et al. 1983).

Sweller and his colleagues (1998) further distinguish three types of cognitive load: intrinsic, extraneous, and germane. Intrinsic cognitive load cannot be altered by instructional design because it is intrinsic to the learning material; extraneous and germane cognitive load, in contrast, can be reduced or induced by instructional design. Intrinsic load is the inherent level of difficulty that is directly associated with the material. When learning material involves more elements to consider (e.g., learning to multiply out the denominator in an equation), it carries a higher intrinsic load than when it does not (e.g., memorizing that Fe is the symbol for iron). Extraneous load is often a result of poor instructional design and consumes working memory capacity with irrelevant activity, whereas germane load is a result of mental effort that contributes to schema construction. Thus, an appropriate instructional design should reduce extraneous cognitive load while inducing germane cognitive load within working memory capacity. For example, Paas & Van Merrienboer (1994b) demonstrated that providing worked examples (in comparison to problem solving) enhanced learning by reducing extraneous load, and that introducing variability in examples had positive effects only when the extraneous cognitive load was reduced.

According to cognitive load theory, the expertise reversal effect (Kalyuga et al. 2003) is another instance of this working memory account. Novices do not have sufficient prior knowledge to organize the key information provided in a problem and therefore must engage in unproductive problem-solving searches. However, as learners become more experienced, the knowledge is stored in long-term memory, and the well-organized knowledge structures help overcome working memory limitations. This difference in effective working memory capacity between experienced and inexperienced learners results in different benefits from worked examples.

Can direct instruction or worked examples harm learning by increasing working memory load? Several studies have shown that this is actually possible and suggest that instruction needs to be designed to reduce extraneous working memory load so that learners can focus on essential learning activities. Learning materials are often presented in various modalities, such as text and diagrams. When multiple sources of information are presented together, learners need to integrate the corresponding representations. Difficulty in integrating separate sources of information causes split attention (Tarmizi & Sweller 1988) and prevents learners from constructing a relevant schema by increasing working memory load. Chandler & Sweller (1991) demonstrated that in the design of instruction, a diagram alone was more effective than a diagram with text. Also, presenting text in both visual and auditory formats was less effective than in auditory format only (Craig et al. 2002, Kalyuga et al. 2000, Mayer et al. 2001). However, a dual-mode presentation is not always worse than a single-mode presentation. If integration of different formats of information does not create a working memory burden, it can be used effectively. One of the major reasons that a word-plus-diagram presentation is not superior to a stand-alone diagram is the extensive visual search it requires; in essence, people need to find which part of the text corresponds to which part of the diagram. Based on this idea, Jeung et al. (1997) tried to reduce the visual search by using visual flashes to identify the part of a diagram to which the auditory text was referring. This technique proved to enhance learning. The importance of visual cueing has also been reported in the domain of animation by several researchers (Boucheix & Lowe 2010, de Koning et al. 2010).

Koedinger and his colleagues (2010) provide an alternative account of the worked-example effect and the expertise reversal effect. They argue that problem-solving practice is not effective for novice learners, not because working memory capacity is exhausted (as argued by cognitive load theory), but rather because of a lack of environmental support for filling in their knowledge gaps.


Worked examples provide more input than problem solving and therefore offer beginning learners a better opportunity for induction and sense making. In contrast, advanced learners need refinement and fluency building, and these are better supported by problem-solving practice than by worked examples.

Effects of Comparison in Learning by Worked Examples

Many researchers have emphasized the importance of comparison for learning and transfer (e.g., Gentner et al. 2003, Gick & Holyoak 1983). The National Council of Teachers of Mathematics standards also emphasize the importance of comparing solution methods as an instructional practice (Natl. Counc. Teach. Math. 1989, 2000). Students are encouraged to share and compare their solution methods with their classmates. This comparing method has been used as one of the instructional changes in many constructivism-based classrooms (e.g., Cobb et al. 1991, Hiebert & Wearne 1996).

In a series of studies, Rittle-Johnson and her colleagues investigated when and how comparison helps learning in mathematics with school-age children. Rittle-Johnson & Star (2007) had seventh-grade children learn to solve multistep linear equations [e.g., 3(x + 1) = 15] under one of two conditions, either the comparison or the sequential condition. In the comparison condition, students were provided with sets of two worked examples illustrating different solutions to the same problem and were encouraged to compare and contrast the two examples. The solution steps of the two worked examples were aligned side by side on the same page, and each solution step was labeled (e.g., distribute, combine). In the sequential condition, students studied the identical worked examples, but each worked example was presented on a separate page, and students were prompted to reflect on the solution of each example. After two days of intervention, students were tested on conceptual knowledge, procedural knowledge, and procedural flexibility.

and procedural flexibility. The results showedthat students from the comparison conditiongained more procedural knowledge and flexi-bility than those from the sequential condition,but there was no difference in conceptualknowledge between the two groups. Studentswho compared alternative solution methodswere more likely to use the more efficientnonconventional methods and were better ableto transfer their methods to novel problems.
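To make the comparison condition concrete, the sketch below shows the kind of side-by-side pair of worked examples the sample equation invites, with step labels in the style described above. The particular steps and labels are our own illustration, not reproduced from Rittle-Johnson & Star's materials; Method B stands for the sort of efficient nonconventional shortcut the comparison students tended to adopt.

```latex
% Illustrative side-by-side worked examples for 3(x + 1) = 15 (not the study's materials).
% Method A: conventional distribute-first solution. Method B: divide-first shortcut.
% Requires amsmath for \text.
\[
\begin{array}{llcll}
\multicolumn{2}{l}{\textbf{Method A: distribute first}} & \qquad &
\multicolumn{2}{l}{\textbf{Method B: divide first}} \\
3(x + 1) = 15 &                      & & 3(x + 1) = 15 & \\
3x + 3 = 15   & \text{(distribute)}  & & x + 1 = 5     & \text{(divide both sides by 3)} \\
3x = 12       & \text{(subtract 3)}  & & x = 4         & \text{(subtract 1)} \\
x = 4         & \text{(divide by 3)} & & &
\end{array}
\]
```

Placing the two methods in one display of this kind is what allowed students to align corresponding steps and notice that the shortcut reaches the same answer with fewer operations.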

Although comparison proved to facilitate the learning of multiple solution methods in mathematics, it is also important to know when and how comparison facilitates learning. Rittle-Johnson & Star (2009) showed that the effectiveness of comparison actually depended on what types of things were compared. Eighth-grade children learned to solve equations using worked examples in one of three comparison conditions: comparing solution methods, comparing problem types, and comparing equivalent problems. The first condition was identical to the comparison condition used in the previous study by Rittle-Johnson & Star (2007) and involved learning multiple solution methods for one problem (i.e., one problem with two solution methods). In the comparing problem types condition, students learned to solve different problems with the same solution method (i.e., two different problems with one solution method). In the comparing equivalent problems condition, students learned to solve equivalent problems with the same solution method (i.e., two equivalent problems with one solution method). The posttest results showed that comparing solution methods was more effective for both conceptual knowledge and procedural flexibility than comparing problem types or comparing equivalent problems. The benefits of comparison therefore appear to depend on how the worked examples differ.

Rittle-Johnson and her colleagues (2009) further examined the importance of prior knowledge in learning from comparison. Students were divided into two groups based on whether or not they attempted algebraic methods on a pretest. Students who attempted algebraic methods at pretest (the high prior knowledge group) benefited most from comparing solution methods, but students who did not attempt algebraic methods at pretest (the low prior knowledge group) were harmed by comparing solution methods. Students appear to need sufficient prior knowledge to benefit from comparing alternative solution methods. When students do not have enough prior knowledge, two simultaneously presented worked examples are simply two unfamiliar examples, and the comparison activity just adds to the working memory load. In contrast, when students have enough prior knowledge, they can make an analogy from a familiar example to an unfamiliar one, and the comparison activity can be handled within their working memory resources.

Effects of Instructional Explanations in Learning by Worked Examples

In the studies reviewed so far, there was considerable variation in how much explicit (verbal) instruction accompanied the examples. Do students learn better from worked examples accompanied by instructional explanations, or do they learn better when given only worked examples without them? A large number of studies have compared learning from worked examples with and without instructional explanations, and some have compared the effects of receiving versus generating explanations while studying worked examples. The provision of instructional explanations has shown positive effects (e.g., Atkinson 2002, Lovett 1992, Renkl 2002), negative effects (e.g., Ward & Sweller 1990), and neutral effects (e.g., Gerjets et al. 2006).

Lovett (1992) investigated the benefits of generating versus receiving information when learning by problem solving and when learning by example. Students learned to solve probability calculation problems in one of four experimental conditions, constructed by crossing instruction type (worked example versus problem solving) with explanation type (instructional explanation versus self-explanation). All groups of students demonstrated comparably good performance on the near-transfer test. A far-transfer test, however, showed a significant interaction between instruction type and explanation type. When students learned by worked examples, those who received instructional explanations outperformed those who self-explained. When students learned by problem-solving practice, the pattern reversed: students who self-explained rather than receiving instructional explanations performed better. These results were explained in terms of the consistency of information sources. In example-based learning the source of the solution is the experimenter, whereas in problem-based learning the source of the solution is the subject; likewise, the source of elaboration is the experimenter for instructional explanation and the subject for self-explanation. When the information sources are inconsistent, subjects have to integrate their own information with the experimenter's, which might have increased cognitive load and weakened their problem memories. Whether or not this is the correct explanation, the benefit of providing instructional explanations was found only when students learned in worked-example conditions.

Renkl (2002) also demonstrated that instructional explanations had a positive effect on learning from worked-out examples. In this study, he first compared the favorable features of self-explanation and instructional explanation and then created a learning environment that combined their respective advantages. Relative to self-explaining activities, instructional explanations are not usually adapted to the prior knowledge of individual learners and are more likely to be provided without consideration of students' ongoing cognitive activity1 (Renkl 2002). Also, students lose an opportunity to benefit from the generation effect (Hausmann & VanLehn 2007, Lovett 1992). However, instructional explanations have the important benefit of correctness. Students are known to generate incorrect self-explanations and then suffer from the illusion of understanding (Chi et al. 1989). Also, instructional explanations help students overcome comprehension problems that they cannot resolve for themselves. Renkl developed a learning environment based on this analysis: student teachers learned to solve probability calculation problems under one of two conditions, created by including or excluding an instructional explanation button. Instructional explanations had a positive effect on far transfer but not on near transfer. He also found that the explanations were used mostly by participants with low levels of prior knowledge.

1Intelligent tutoring systems such as Cognitive Tutor often provide instructional explanations that are adapted to an individual learner's knowledge level (Corbett & Anderson 1995).

Catrambone (1998) found that the provision of a simple label for worked examples helps learning. When a label is provided for a group of solution steps that go together, students attempt to self-explain the purpose of the grouped solution steps and organize them with subgoals. This is consistent with the finding that students can understand the general rationale of problems better when problem solutions are broken down into smaller meaningful solution units (i.e., modular examples) rather than when examples focus on problem categories and their associated overall procedures (Catrambone 1994, Gerjets et al. 2004).

Although several studies support the critical role of instructional explanations in example-based learning, this effectiveness does not seem to be guaranteed. For instance, Gerjets et al. (2006) reported no effect of providing instruction on learning probability calculation problems. Although the amount of instruction had no effect on test performance, students who received high levels of instruction erroneously felt more successful at learning than those who received low levels of instruction. As a matter of fact, more instructional explanations increased studying time; thus, less-elaborated example-based instruction was more efficient than more-elaborated example-based instruction.

The provision of instructional explanations sometimes even produces negative effects when they are added in an inappropriate way. Through a series of experiments, Ward & Sweller (1990) demonstrated that when an instructional explanation failed to direct attention appropriately, it failed to reduce cognitive load and thus was not effective. In one experiment, tenth-grade students learned geometric optics problems in one of three experimental learning conditions: conventional worked example, split-attention worked example, and conventional problem. In the split-attention worked-example condition, extra textual explanation was added, but not in an integrated format. This group was no better than the conventional problem-solving group, and both were worse than the conventional example group.

Negative effects of providing instructional explanations also appear to arise because they reduce self-explanation activity. Schworm & Renkl (2006) investigated how generating explanations interacts with receiving explanations in the domain of instructional design. Student teachers learned to design effective worked-out examples for high school students in several domains, including geometry and physics. Four learning conditions were constructed by crossing the presence of self-explanation (self-explanation versus no self-explanation) with the presence of instruction (instructional explanation versus no instruction). Participants were given initial instruction on basic principles of worked-out example design, and they studied solved example problems. Schworm & Renkl (2006) found that instructional explanations hurt learning when participants generated self-explanations but helped learning when they did not; self-explanation activity was reduced in the presence of instructional explanations.

Given the contradictory evidence, it is hard to draw a conclusion about the role of instructional explanations added to worked examples. To address this issue, Wittwer & Renkl (2010) conducted a meta-analysis (see also Wittwer & Renkl 2008). To investigate whether instructional explanations support example-based learning, 21 experimental studies were reviewed and analyzed with various moderating factors, and four major conclusions were reached. First, the overall effect of instructional explanations in example-based learning appears to be minimal. Although the provision of instructional explanations led to significantly better learning outcomes than no instructional explanations, the effect size was small (d = 0.16). The benefit of instructional explanations was greater when the control condition was not supported by self-explanation. Second, instructional explanations were more effective for acquiring conceptual knowledge than procedural knowledge (d = 0.36). Third, the effectiveness of instructional explanations differed by learning domain: in mathematics they had significantly positive effects (d = 0.22), but the effects were not clear in other domains, including science and instructional design. Fourth, instructional explanations were not necessarily more helpful than other supporting methods such as self-explanation; the analysis showed that prompting for self-explanation was as effective as adding instructional explanations to example-based learning.
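For readers less familiar with this metric, the d values above are standardized mean differences. Assuming the conventional Cohen's d form (the meta-analysis may report a corrected variant such as Hedges' g), the statistic is:

```latex
% Standardized mean difference (Cohen's d): the between-condition difference in mean
% learning outcomes, expressed in pooled standard deviation units. On this scale,
% d = 0.16 corresponds to roughly one-sixth of a standard deviation.
\[
d \;=\; \frac{\bar{X}_{\text{with explanations}} - \bar{X}_{\text{without explanations}}}{s_{\text{pooled}}}
\]
```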

COMBINING DISCOVERY LEARNING AND DIRECT INSTRUCTION APPROACHES

Invention Activity Followed by Direct Instruction

Both discovery learning and direct instruction approaches are known to have unique advantages (Koedinger & Aleven 2007), and there have been several attempts to combine the two. For example, worked examples and problem-solving practice have been successfully combined using a fading procedure (e.g., Atkinson et al. 2003, Renkl et al. 2002, Salden et al. 2010): transitioning from worked examples to problem-solving practice as learning progresses led to successful learning. Other attempts have combined invention activity with follow-up direct instruction (e.g., feedback, lecture, and text), and several studies have shown that this combined method is more beneficial than administering only direct instruction and practice without invention activities (e.g., Kapur 2011, Kapur & Bielaczyc 2011, Schwartz & Martin 2004, Schwartz et al. 2011).

For example, Schwartz & Martin (2004) demonstrated that students' invention activities appeared inefficient because the students failed to generate canonical solutions, but when subsequent instruction was embedded in a test, these students actually did better than those who were directly taught and practiced without invention activities. In this study, ninth-grade algebra students studied statistical concepts under one of two instructional conditions (invention versus tell-and-practice) and then were tested under one of two test conditions (presence versus absence of a worked example embedded in the test). Students in the tell-and-practice condition performed at the same level regardless of whether a worked example was embedded in the test. In contrast, students in the invention group outperformed these two groups, but only when a worked example was embedded in the test. This study shows that while one instructional approach may look ineffective, the efficacy of the approach may be hidden in what Bransford & Schwartz (1999) call a sequestered problem-solving paradigm, in which participants are sequestered for tests of their learning to prevent exposure to other sources that might positively or negatively affect their performance. In contrast, in a preparation for future learning paradigm (Schwartz & Bransford 1998), learners are tested not on whether they can generate a finished product but on whether they are prepared to learn to generate a new product.

Schwartz and his colleagues (2011) also reported similar findings with adolescent students. Students learned the concept of density under either a tell-and-practice condition or an invent-with-contrasting-cases condition. In the tell-and-practice condition, students were told the relevant concepts and formulas for density and then practiced with contrasting cases. In the invent-with-contrasting-cases condition, students had to invent formulas from the same contrasting cases first, and the formulas were provided only after they had completed all the inventing tasks. Both groups of students showed a similar level of proficiency at applying a density formula to a word problem; however, the invention students showed better performance on transfer tests that also required an understanding of ratio concepts but involved semantically unrelated topics. Schwartz et al. (2011) argued that the tell-and-practice students did not have a chance to find the deep structure because they simply focused on what they had been told and practiced applying the learned formulas. As in Schwartz & Martin (2004), the inventing activity appeared to serve as preparation for future learning, so that when the expert solutions were provided later, these students could appreciate them better than those who were not prepared.
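The ratio structure at stake here is simple to state; the cases below are our own illustration of the kind of contrast that makes it visible and are not taken from Schwartz et al.'s materials.

```latex
% Density as a ratio: the canonical formula and an illustrative pair of contrasting cases.
% Doubling mass and volume together leaves density unchanged, which only a ratio captures.
\[
\rho = \frac{m}{V}, \qquad
\rho_A = \frac{12\ \text{g}}{4\ \text{cm}^3} = 3\ \text{g/cm}^3, \qquad
\rho_B = \frac{24\ \text{g}}{8\ \text{cm}^3} = 3\ \text{g/cm}^3 .
\]
```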

Kapur (2011; Kapur & Bielaczyc 2011) also tested the effects of combining invention activities with follow-up instruction in multiple classroom studies and showed that this combination was indeed more effective than providing direct instruction without invention activities. Kapur (2008) explains this with what he calls productive failure: even though most students fail to generate valid methods on their own during the invention phase, this failure experience helps prepare students to learn better in the subsequent instruction phase by activating their prior knowledge and directing their attention to critical features of the concept.

Based on the idea that invention activity can bolster learning when combined with follow-up instruction, Roll and his colleagues (2010) integrated the strengths of exploration and invention with the strengths of direct instruction in a computer-based tutor called the Invention Lab. In the Invention Lab, students are given invention tasks in which they have to invent novel methods for computing certain properties of data. Roll et al. (2011) found that students who both designed and evaluated their own methods performed better on conceptual knowledge and debugging tests than those who only evaluated methods without the design activity.
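As a purely hypothetical illustration of what "inventing a method for computing a property of data" can involve (the data, the invented index, and the comparison below are our own, not the Invention Lab's materials), a student-invented variability index can be contrasted with a canonical one:

```python
# Hypothetical invention task: devise an index that captures which of two data sets
# is more variable, then compare it with a canonical measure once that is taught.
# The data and the "invented" index are illustrative only.
from statistics import mean, stdev


def invented_spread(values):
    """A plausible student-invented index: average absolute distance from the mean."""
    m = mean(values)
    return mean(abs(v - m) for v in values)


group_a = [10, 11, 9, 10, 10]  # tightly clustered scores (mean = 10)
group_b = [4, 16, 7, 13, 10]   # same mean, much more spread out

for name, data in (("A", group_a), ("B", group_b)):
    print(f"group {name}: invented index = {invented_spread(data):.2f}, "
          f"canonical SD = {stdev(data):.2f}")
```

Checking whether such an invented index ranks the cases the same way the canonical measure does is the kind of design-plus-evaluation activity that Roll et al. (2011) found most beneficial.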

SUMMARY

We have reviewed several decades of debate and empirical evidence on discovery learning approaches and direct instruction approaches. Both positive and negative effects have been reported. Positive effects of discovery learning have been reported in alternative classroom projects in which students were encouraged to invent their own procedures to solve a problem and to discuss their solutions with their classmates. High levels of practice also facilitate successful discovery learning. Minimal guidance is known to have several cognitive benefits, such as better memory and transfer as a result of the generation effect, and to foster better attitudes toward a learning domain. However, it can be disadvantageous when, as a result of unnecessary and excessive floundering, students fail to discover the principles they are expected to learn.

On the other hand, a number of empirical studies suggest positive effects of providing direct instruction. Strong empirical evidence is found especially in example-based learning, although one can question whether this should be characterized as direct instruction. When students are given step-by-step solutions, they are known to learn better than when they simply practice problem solving. According to cognitive load theory, worked examples help students focus on the relevant problem solution steps by reducing irrelevant search activity, such as the means-ends analysis that mostly occurs when solving unfamiliar problems. In contrast to the strong empirical support for worked examples, it is still not clear whether adding explanations to worked examples helps learning. It appears that extra explanation is helpful only when it is appropriately embedded into a worked example in a way that allows learners to integrate multiple sources of information without burdening their working memory. When multiple sources of information fail to be integrated, a split-attention effect results and hinders learning.

We have also reviewed other instructional factors that might influence the effectiveness of instructional methods. Learner characteristics, such as prior knowledge, interact with levels of instruction: the provision of guidance is sometimes not beneficial for advanced learners (i.e., the expertise reversal effect), whereas it helps inexperienced learners. There is strong support for the idea that self-explanation helps learning through the generation process. Comparing multiple solutions also increases the effectiveness of worked examples, especially when learners have sufficient prior knowledge to make an analogy from one solution to the other.

Several attempts have been made to combine the strengths of both discovery learning and direct instruction approaches. Worked examples and problem-solving practice have been successfully combined using a fading procedure (e.g., Atkinson et al. 2003, Renkl et al. 2002, Salden et al. 2010); transitioning from worked examples to problem-solving practice as learning progresses led to better learning outcomes than administering just one instructional method. Following suggestions from the preparation for future learning approach, having students experience an exploration phase followed by an instruction phase also led to better learning. Productive failure can likewise take advantage of the strengths of both the discovery learning and direct instruction approaches.

CONCLUSIONS

Although there are islands of clarity in this field, it is apparent that there is not a comprehensive understanding that would predict the outcome of different amounts of guidance across different learning situations. We think the fundamental reason for this is that, despite all the pronouncements, there is not a detailed understanding of the mechanisms by which students turn their learning experiences into knowledge.

We have mainly focused on domains where the target competence is the ability to solve problems. In such domains it rarely (if ever) happens that students can simply take the words they hear from a teacher or find on a page and convert these into the sort of knowledge that they can transfer. In some way students must construct the knowledge by understanding how it applies to their problem solving. It is also a rare case that the best way for students to achieve such an understanding is by being left to figure it out entirely for themselves; acquiring knowledge the first time this way took centuries. The key question is how students can be guided to construct this knowledge efficiently in a form that will transfer across the desired range of situations.

Looking back on this review, we are struck by two things. First, there is relatively little evidence (but not none) that verbal instruction helps. Second, there seems to be a great abundance of evidence that providing an example of a problem solution does help. We are tempted to believe that pure discovery learning succeeds only because successful discovery can provide the student with examples to learn from, which they have come to understand through the discovery process. We are equally tempted to believe that pure verbal instruction is effective only to the extent that it helps students understand real or imagined examples. That is, we suspect that learning in problem-solving domains is fundamentally example based and that both instruction and discovery have their effects in helping students understand the examples.

While the acquisition of problem-solving competence may be example based, we have also reviewed ample evidence that not all examples are equally effective and that what accompanies these examples can be critical. The most important role of verbal instruction may be to draw attention to the critical aspects of the examples. It is also important to do this in a way that is efficient and does not burden the student with unnecessary processing. It is often possible to achieve the same effect by nonverbal highlighting mechanisms, and the sequencing and juxtaposition of examples can serve a role similar to highlighting critical features. If students can solve the problem on their own without guidance, this can be an effective way to identify what is critical about the example solution that is generated.

Table 1  Advantages and disadvantages of providing instruction

Advantages                                                            | Disadvantages
• Provides correct solutions and explanations                         | • Solution methods may be rotely learned and poorly remembered
• Guides students to material to be learned                           | • Discourages learning that goes beyond the instruction
• Identifies critical features in the examples                        | • Prevents students from testing the adequacy of their understanding
• Makes time efficient by reducing floundering and irrelevant search  | • Processing verbal instruction can pose a comprehension burden
• Reduces working memory demands created by managing problem solving  | • Splits attention when multiple sources of information are not integrated

With this perspective in mind, we have put together Table 1, which summarizes some of the possible advantages and disadvantages of providing instructional guidance to the learner.2 The biggest advantage of instruction is that it provides learners with correct information that they may never find on their own. On the other hand, this information may be only rotely memorized and poorly remembered. Instruction will focus the student on critical material and pull their focus away from the irrelevant, but it will also discourage types of learning that may be more useful. When studying an example, the instruction can highlight the critical features but can also prevent students from testing whether they really understand what is critical in the example. Typically, instruction will prevent floundering, but processing the instruction itself can be a time sink. Finally, problem solving on one's own can take resources away from learning, but so can trying to integrate the multiple sources of information that are frequently part of instruction.

2A similar table on the benefits and costs of giving and withholding assistance is found in Koedinger & Aleven (2007).

DISCLOSURE STATEMENT

The authors are not aware of any affiliations, memberships, funding, or financial holdings that might be perceived as affecting the objectivity of this review.

ACKNOWLEDGMENTS

Preparation of this review was supported by IES grant R305A100109. We thank Abe Anderson, Jennifer Ferris, Keith Holyoak, David Klahr, Ken Koedinger, Marsha Lovett, and Daniel Schacter for valuable comments on the paper.

LITERATURE CITED

Aleven V, Koedinger KR. 2002. An effective meta-cognitive strategy: learning by doing and explaining with a computer-based cognitive tutor. Cogn. Sci. 26:147–79
Anderson JR, Corbett AT, Koedinger KR, Pelletier R. 1995. Cognitive tutors: lessons learned. J. Learn. Sci. 4:167–207
Anderson JR, Reder LM, Simon H. 1998. Radical constructivism and cognitive psychology. In Brookings Papers on Education Policy 1998, ed. D Ravitch, pp. 227–78. Washington, DC: Brookings Inst. Press
Atkinson RK. 2002. Optimizing learning from examples using animated pedagogical agents. J. Educ. Psychol. 94:416–27
Atkinson RK, Derry SJ, Renkl A, Wortham DW. 2000. Learning from examples: instructional principles from the worked examples research. Rev. Educ. Res. 70:181–214
Atkinson RK, Renkl A, Merrill MM. 2003. Transitioning from studying examples to solving problems: effects of self-explanation prompts and fading worked-out steps. J. Educ. Psychol. 95:774–83
Ausubel DP. 1964. Some psychological and educational limitations of learning by discovery. Arith. Teach. 11:290–302
Ausubel DP. 1968. Educational Psychology: A Cognitive View. New York: Holt, Rinehart & Winston
Baddeley A. 1992. Working memory. Science 255:556–59
Baroody AJ, Ginsburgh HP. 1983. The effects of instruction on children’s understanding of the “equals” sign. Elem. Sch. J. 84:199–212
Bjork RA. 1994. Memory and metamemory considerations in the training of human beings. In Metacognition: Knowing About Knowing, ed. J Metcalfe, A Shimamura, pp. 185–205. Cambridge, MA: MIT Press
Bobrow SA, Bower GH. 1969. Comprehension and recall of sentences. J. Exp. Psychol. 80:455–61
Bonawitz E, Shafto P, Gweon H, Goodman ND, Spelke E, Schulz L. 2011. The double-edged sword of pedagogy: Instruction limits spontaneous exploration and discovery. Cognition 120:322–30
Boucheix JM, Lowe RK. 2010. An eye tracking comparison of external pointing cues and internal continuous cues in learning with complex animations. Learn. Instr. 20:123–35
Bransford JD, Schwartz DL. 1999. Rethinking transfer: a simple proposal with multiple implications. In Review of Research in Education, ed. A Iran-Nejad, PD Pearson, 24:61–101. Washington, DC: Am. Educ. Res. Assoc.
Bruner JS. 1961. The act of discovery. Harv. Educ. Rev. 31:21–32
Brunstein A, Betts S, Anderson JR. 2009. Practice enables successful learning under minimal guidance. J. Educ. Psychol. 101:790–802
Campbell VN. 1964. Self-direction and programmed instruction for five different types of learning objectives. Psychol. Sch. 1:348–59
Carpenter TP, Franke ML, Jacobs VR, Fennema E, Empson SB. 1998. A longitudinal study of invention and understanding in children’s multidigit addition and subtraction. J. Res. Math. Educ. 29:3–20
Carroll WM. 1994. Using worked examples as an instructional support in the algebra classroom. J. Educ. Psychol. 86:360–67
Catrambone R. 1994. Improving examples to improve transfer to novel problems. Mem. Cogn. 22:606–15
Catrambone R. 1998. The subgoal learning model: creating better examples so that students can solve novel problems. J. Exp. Psychol.: Gen. 127:355–76
Chandler P, Sweller J. 1991. Cognitive load theory and the format of instruction. Cogn. Instr. 8:293–332
Charney DH, Reder LM, Kusbit GW. 1990. Goal setting and procedure selection in acquiring computer skills: a comparison of tutorials, problem-solving, and learner exploration. Cogn. Instr. 7:323–42
Chen Z, Klahr D. 1999. All other things being equal: children’s acquisition of the control of variables strategy. Child Dev. 70:1098–120
Chi MTH, Bassok M, Lewis MW, Reimann P, Glaser R. 1989. Self-explanations: How students study and use examples in learning to solve problems. Cogn. Sci. 13:145–82
Chinn CA, Malhotra BA. 2001. Epistemologically authentic scientific reasoning. In Designing for Science: Implications from Everyday, Classroom, and Professional Settings, ed. K Crowley, CD Schunn, T Okada, pp. 351–92. Mahwah, NJ: Erlbaum
Cobb P, Wood T, Yackel E, Nicholls J, Wheatley G, et al. 1991. Assessment of a problem-centered second-grade mathematics project. J. Res. Math. Educ. 22:3–29
Cooper G, Sweller J. 1987. The effects of schema acquisition and rule automation on mathematical problem-solving transfer. J. Educ. Psychol. 79:347–62
Corbett AT, Anderson JR. 1995. Knowledge tracing: modeling the acquisition of procedural knowledge. User Model. User-Adapt. Interact. 4:253–78
Craig R. 1956. Directed versus independent discovery of established relations. J. Educ. Psychol. 47:223–35
Craig S, Gholson B, Driscoll D. 2002. Animated pedagogical agents in multimedia educational environments: effects of agent properties, picture features, and redundancy. J. Educ. Psychol. 94:428–34
Cronbach LJ, Snow RE. 1977. Aptitudes and Instructional Methods: A Handbook for Research on Interactions. New York: Irvington
Dean D, Kuhn D. 2006. Direct instruction versus discovery: the long view. Sci. Educ. 91:384–97
De Koning BB, Tabbers HK, Rikers RMJP, Paas F. 2010. Learning by generating versus receiving instructional explanations: two approaches to enhance attention cueing in animations. Comp. Educ. 55:681–91
Fay AL, Mayer RE. 1994. Benefits of teaching design skills before teaching LOGO computer programming: evidence for syntax-independent learning. J. Educ. Comp. Res. 11:187–210
Fong GT, Krantz DH, Nisbett RE. 1986. The effects of statistical training on thinking about everyday problems. Cogn. Psychol. 18:253–92
Gagne RM, Brown LT. 1961. Some factors in the programming of conceptual learning. J. Exp. Psychol. 62:313–21
Gentner D, Loewenstein J, Thompson L. 2003. Learning and transfer: a general role for analogical encoding. J. Educ. Psychol. 95:393–405
Gerjets P, Scheiter K, Catrambone R. 2004. Designing instructional examples to reduce intrinsic cognitive load: molar versus modular presentation of solution procedures. Instr. Sci. 32:33–58
Gerjets P, Scheiter K, Catrambone R. 2006. Can learning from molar and modular worked examples be enhanced by providing instructional explanations and prompting self-explanations? Learn. Instr. 16:104–21
Gick ML, Holyoak KJ. 1983. Schema induction and analogical transfer. Cogn. Psychol. 15:1–38
Gutherie JT. 1967. Expository instruction versus a discovery method. J. Educ. Psychol. 1:45–49
Hausmann RGM, Nokes TJ, VanLehn K, Gershman S. 2009. The design of self-explanation prompts: the fit hypothesis. In Proc. 31st Annu. Conf. Cogn. Sci. Soc., pp. 2626–31. Amsterdam: Cogn. Sci. Soc.
Hausmann RGM, VanLehn K. 2007. Explaining self-explaining: a contrast between content and generation. In Proc. Artificial Intelligence in Educ., ed. R Luckin, KR Koedinger, J Greer, pp. 417–24. Amsterdam: IOS Press
Hiebert J, Wearne D. 1996. Instruction, understanding, and skill in multidigit addition and subtraction. Cogn. Instr. 14:251–83
Jeung H, Chandler P, Sweller J. 1997. The role of visual indicators in dual sensory mode instruction. Educ. Psychol. 17:329–43
Kalyuga S. 2007. Expertise reversal effect and its implications for learner-tailored instruction. Educ. Psychol. Rev. 19:509–39
Kalyuga S, Ayres P, Chandler P, Sweller J. 2003. The expertise reversal effect. Educ. Psychol. 38:23–31
Kalyuga S, Chandler P, Sweller J. 2000. Incorporating learner experience into the design of multimedia instruction. J. Educ. Psychol. 92:126–36
Kalyuga S, Chandler P, Tuovinen J, Sweller J. 2001. When problem solving is superior to studying worked examples. J. Educ. Psychol. 93:579–88
Kamii C, Dominick A. 1998. The harmful effects of algorithms in grades 1–4. In The Teaching and Learning of Algorithms in School Mathematics: 1998 Yearbook, ed. LJ Morrow, MJ Kenney, pp. 130–40. Reston, VA: Natl. Counc. Teach. Math.
Kapur M. 2008. Productive failure. Cogn. Instr. 26:379–425
Kapur M. 2011. A further study of productive failure in mathematical problem solving: unpacking the design components. Instr. Sci. 39:561–79
Kapur M, Bielaczyc K. 2011. Classroom-based experiments in productive failure. In Proc. 33rd Annu. Conf. Cogn. Sci. Soc., ed. L Carlson, C Hoelscher, TF Shipley, pp. 2812–17. Austin, TX: Cogn. Sci. Soc.
Kirschner PA, Sweller J, Clark RE. 2006. Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educ. Psychol. 41:75–86
Kittel JE. 1957. An experimental study of the effect of external direction during learning on transfer and retention of principles. J. Educ. Psychol. 48:391–405
Klahr D, Li J. 2005. Cognitive research and elementary science instruction: from the laboratory, to the classroom, and back. J. Sci. Educ. Technol. 4:217–38
Klahr D, Nigam M. 2004. The equivalence of learning paths in early science instruction: effects of direct instruction and discovery learning. Psychol. Sci. 15:661–67
Koedinger KR, Aleven V. 2007. Exploring the assistance dilemma in experiments with cognitive tutors. Educ. Psychol. Rev. 19:239–64
Koedinger KR, Anderson JR, Hadley WH, Mark M. 1997. Intelligent tutoring goes to school in the big city. Int. J. Artif. Intell. Educ. 8:30–43
Koedinger KR, Corbett AT, Perfetti C. 2010. The knowledge-learning-instruction (KLI) framework: toward bridging the science-practice chasm to enhance robust student learning. Carnegie Mellon Univ. Tech. Rep. http://pact.cs.cmu.edu/pubs/PSLC-Theory-Framework-Tech-Rep.pdf
Kuhn D. 2007. Is direct instruction an answer to the right question? Educ. Psychol. 42:109–13
Lee M, Thompson A. 1997. Guided instruction in LOGO programming and the development of cognitive monitoring strategies among college students. J. Educ. Comp. Res. 16:125–44
Lepper MR. 1988. Motivational considerations in the study of instruction. Cogn. Instr. 5:289–309
Lewis MW, Anderson JR. 1985. Discrimination of operator schemata in problem solving: procedural learning from examples. Cogn. Psychol. 17:26–65
Lovett MC. 1992. Learning by problem solving versus by examples: the benefits of generating and receiving information. In Proc. 14th Annu. Conf. Cogn. Sci. Soc., pp. 956–61. Hillsdale, NJ: Erlbaum
Matlen BJ, Klahr D. 2010. Sequential effects of high and low guidance on children’s early science learning. In Proc. 9th Int. Conf. Learn. Sci., ed. K Gomez, L Lyons, J Radinsky, pp. 1016–23. Chicago: Int. Soc. Learn. Sci.
Mayer RE. 2004. Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction. Am. Psychol. 59:14–19
Mayer RE, Heiser J, Lonn S. 2001. Cognitive constraints on multimedia learning: when presenting more material results in less understanding. J. Educ. Psychol. 93:187–98
Mayer RE, Sims VK. 1994. For whom is a picture worth a thousand words? Extensions of a dual-coding theory of multimedia learning. J. Educ. Psychol. 84:389–460
McDaniel MA, Schlager MS. 1990. Discovery learning and transfer of problem-solving skills. Cogn. Instr. 7:129–59
Miller C, Lehman J, Koedinger K. 1999. Goals and learning in microworlds. Cogn. Sci. 23:305–36
Miller GA. 1956. The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychol. Rev. 63:81–97
Natl. Counc. Teach. Math. 1989. Curriculum and Evaluation Standards for School Mathematics. Reston, VA: Natl. Counc. Teach. Math.
Natl. Counc. Teach. Math. 2000. Principles and Standards for School Mathematics. Reston, VA: Natl. Counc. Teach. Math.
Paas F. 1992. Training strategies for attaining transfer of problem-solving skill in statistics: a cognitive-load approach. J. Educ. Psychol. 84:429–34
Paas F, Van Merrienboer J. 1994a. Instructional control of cognitive load in the training of complex cognitive tasks. Educ. Psychol. Rev. 6:351–71
Paas F, Van Merrienboer J. 1994b. Variability of worked examples and transfer of geometrical problem solving skills: a cognitive-load approach. J. Educ. Psychol. 86:122–33
Papert S. 1980. Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books
Piaget J. 1970. Genetic Epistemology. New York: Columbia Univ. Press
Piaget J. 1973. To Understand Is to Invent. New York: Grossman
Piaget J. 1980. Adaptation and Intelligence: Organic Selection and Phenocopy. Chicago: Univ. Chicago Press
Reed SK, Bolstad CA. 1991. Use of examples and procedures in problem solving. J. Exp. Psychol.: Learn. Mem. Cogn. 17:753–66
Reiser BJ, Copen WA, Ranney M, Hamid A, Kimberg DY. 1994. Cognitive and Motivational Consequences of Tutoring and Discovery Learning. Tech. Rep., Inst. Learn. Sci., Northwestern Univ.
Renkl A. 1997. Learning from worked-out examples: a study on individual differences. Cogn. Sci. 21:1–29
Renkl A. 2002. Worked-out examples: Instructional explanations support learning by self-explanations. Learn. Instr. 12:529–56
Renkl A, Atkinson RK, Maier UH. 2000. From studying examples to solving problems: Fading worked-out solution steps helps learning. In Proc. 22nd Annu. Conf. Cogn. Sci. Soc., ed. L Gleitman, AK Joshi, pp. 393–98. Mahwah, NJ: Erlbaum
Renkl A, Atkinson RK, Maier UH, Staley R. 2002. From example study to problem solving: smooth transitions help learning. J. Exp. Educ. 70:293–315
Renkl A, Stark R, Gruber H, Mandl H. 1998. Learning from worked-out examples: the effects of example variability and elicited self-explanations. Contemp. Educ. Psychol. 23:90–108
Rittle-Johnson B. 2006. Promoting transfer: the effects of direct instruction and self-explanation. Child Dev. 77:1–15
Rittle-Johnson B, Alibali MW. 1999. Conceptual and procedural knowledge of mathematics: Does one lead to the other? J. Educ. Psychol. 91:175–89
Rittle-Johnson B, Siegler RS, Alibali MW. 2001. Developing conceptual understanding and procedural skill in mathematics: an iterative process. J. Educ. Psychol. 93:346–62
Rittle-Johnson B, Star JR. 2007. Does comparing solution methods facilitate conceptual and procedural knowledge? An experimental study on learning to solve equations. J. Educ. Psychol. 99:561–74
Rittle-Johnson B, Star JR. 2009. Compared with what? The effects of different comparisons on conceptual knowledge and procedural flexibility for equation solving. J. Educ. Psychol. 101:529–44
Rittle-Johnson B, Star JR, Durkin K. 2009. The importance of prior knowledge when comparing examples: influences on conceptual and procedural knowledge of equation solving. J. Educ. Psychol. 101:836–52
Roll I, Aleven V, Koedinger KR. 2010. The invention lab: using a hybrid of model tracing and constraint-based modeling to offer intelligent support in inquiry environments. In Proc. Int. Conf. Intelligent Tutor. Syst., ed. V Aleven, J Kay, J Mostow, pp. 115–24. Berlin: Springer Verlag
Roll I, Aleven V, Koedinger KR. 2011. Outcomes and mechanisms of transfer in invention activities. In Proc. 33rd Annu. Conf. Cogn. Sci. Soc., ed. L Carlson, C Hoelscher, TF Shipley, pp. 2824–29. Austin, TX: Cogn. Sci. Soc.
Salden R, Aleven V, Schwonke R, Renkl A. 2010. The expertise reversal effect and worked examples in tutored problem solving. Instr. Sci. 38:289–307
Schmidt RA, Bjork RA. 1992. New conceptualization of practice: Common principles in three paradigms suggest new concepts for training. Psychol. Sci. 3:207–17
Schwonke R, Wittwer J, Aleven V, Salden R, Krieg C, Renkl A. 2007. Can tutored problem solving benefit from faded worked-out examples? In Proc. European Cogn. Sci. Conf. 2007, ed. S Vosniadou, D Kayser, A Protopapas, pp. 59–64. New York: Erlbaum
Schworm S, Renkl A. 2006. Computer-supported example-based learning: when instructional explanations reduce self-explanations. Comp. Educ. 46:426–45
Schwartz DL, Bransford JD. 1998. A time for telling. Cogn. Instr. 16:475–522
Schwartz DL, Chase CC, Oppezzo MA, Chin DB. 2011. Practicing versus inventing with contrasting cases: the effects of telling first on learning and transfer. J. Educ. Psychol. 103:759–75
Schwartz DL, Martin T. 2004. Inventing to prepare for future learning: the hidden efficiency of encouraging original student production in statistics instruction. Cogn. Instr. 22:129–84
Seufert T, Janen I, Brunken R. 2007. The impact of intrinsic cognitive load on the effectiveness of graphical help for coherence formation. Comp. Hum. Behav. 23:1055–71
Shute VJ. 1992. Aptitude-treatment interactions and cognitive skill diagnosis. In Cognitive Approaches to Automated Instruction, ed. JW Regian, VJ Shute, pp. 15–47. Hillsdale, NJ: Erlbaum
Siegler RS. 2002. Microgenetic studies of self-explanations. In Microdevelopment: Transition Processes in Development and Learning, ed. N Granott, J Parziale, pp. 31–58. New York: Cambridge Univ. Press
Slamecka NJ, Graf P. 1978. The generation effect: delineation of a phenomenon. J. Exp. Psychol.: Hum. Learn. Mem. 4:592–604
Steffe LP, Gale J. 1995. Constructivism in Education. Hillsdale, NJ: Erlbaum
Strand-Cary M, Klahr D. 2008. Developing elementary science skills: instructional effectiveness and path independence. Cogn. Dev. 23:488–511
Suchman JR. 1961. Inquiry training: building skills for autonomous discovery. Merrill-Palmer Q. Behav. Dev. 7:147–69
Sweller J. 1988. Cognitive load during problem solving: effects on learning. Cogn. Sci. 12:257–85
Sweller J, Cooper GA. 1985. The use of worked examples as a substitute for problem solving in learning algebra. Cogn. Instr. 2:59–89
Sweller J, Levine M. 1982. Effects of goal specificity on means-ends analysis and learning. J. Exp. Psychol.: Learn. Mem. Cogn. 8:463–74
Sweller J, Mawer R, Ward M. 1983. Development of expertise in mathematical problem solving. J. Exp. Psychol.: Gen. 112:639–61
Sweller J, Van Merrienboer J, Paas F. 1998. Cognitive architecture and instructional design. Educ. Psychol. Rev. 10:251–96
Tarmizi R, Sweller J. 1988. Guidance during mathematical problem solving. J. Educ. Psychol. 80:424–36
Tobias S, Duffy TM, eds. 2009. Constructivist Instruction: Success or Failure. New York: Routledge
Trafton JG, Reiser RJ. 1993. The contribution of studying examples and solving problems to skill acquisition. In Proc. 15th Annu. Conf. Cogn. Sci. Soc., ed. M Polson, pp. 1017–22. Hillsdale, NJ: Erlbaum
Tuovinen JE, Sweller J. 1999. Comparison of cognitive load associated with discovery learning and worked examples. J. Educ. Psychol. 91:334–41
Ward M, Sweller J. 1990. Structuring effective worked examples. Cogn. Instr. 7:1–39
Wittwer J, Renkl A. 2008. Why instructional explanations often do not work: a framework for understanding the effectiveness of instructional explanations. Educ. Psychol. 43:49–64
Wittwer J, Renkl A. 2010. How effective are instructional explanations in example-based learning? A meta-analytic review. Educ. Psychol. Rev. 22:393–409
Zhu X, Simon HA. 1987. Learning mathematics from examples and by doing. Cogn. Instr. 4:137–66

www.annualreviews.org • Instructional Guidance in Student Learning 469

Ann

u. R

ev. P

sych

ol. 2

013.

64:4

45-4

69. D

ownl

oade

d fr

om w

ww

.ann

ualr

evie

ws.

org

by U

nive

rsity

of

Chi

cago

Lib

rari

es o

n 05

/10/

14. F

or p

erso

nal u

se o

nly.

Page 26: Student Learning: What Has Instruction Got to Do With It?

PS64-FrontMatter ARI 15 November 2012 14:20

Annual Review ofPsychology

Volume 64, 2013 Contents

Prefatory

Shifting Gears: Seeking New Approaches for Mind/Brain MechanismsMichael S. Gazzaniga � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 1

Biological Bases of Behavior

The Endocannabinoid System and the BrainRaphael Mechoulam and Linda A. Parker � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �21

Vision

SynesthesiaJamie Ward � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � �49

Scene Perception, Event Perception, Object Recognition

Visual Aesthetics and Human PreferenceStephen E. Palmer, Karen B. Schloss, and Jonathan Sammartino � � � � � � � � � � � � � � � � � � � � � � � � �77

Attention and Performance

Detecting Consciousness: A Unique Role for NeuroimagingAdrian M. Owen � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 109

Executive FunctionsAdele Diamond � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 135

Animal Learning and Behavior

The Neuroscience of Learning: Beyond the Hebbian SynapseC.R. Gallistel and Louis D. Matzel � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 169

Evolutionary Psychology

Evolutionary Psychology: New Perspectives on Cognitionand MotivationLeda Cosmides and John Tooby � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 201

Origins of Human Cooperation and MoralityMichael Tomasello and Amrisha Vaish � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 231

vi

Ann

u. R

ev. P

sych

ol. 2

013.

64:4

45-4

69. D

ownl

oade

d fr

om w

ww

.ann

ualr

evie

ws.

org

by U

nive

rsity

of

Chi

cago

Lib

rari

es o

n 05

/10/

14. F

or p

erso

nal u

se o

nly.

Page 27: Student Learning: What Has Instruction Got to Do With It?

PS64-FrontMatter ARI 15 November 2012 14:20

Language and Communication

Gesture’s Role in Speaking, Learning, and Creating LanguageSusan Goldin-Meadow and Martha Wagner Alibali � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 257

Nonverbal and Verbal Communication

The Antecedents and Consequences of Human Behavioral MimicryTanya L. Chartrand and Jessica L. Lakin � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 285

Intergroup Relations, Stigma, Stereotyping, Prejudice, Discrimination

Sexual PrejudiceGregory M. Herek and Kevin A. McLemore � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 309

Social Neuroscience

A Cultural Neuroscience Approach to the Biosocial Natureof the Human BrainShihui Han, Georg Northoff, Kai Vogeley, Bruce E. Wexler,

Shinobu Kitayama, and Michael E.W. Varnum � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 335

Organizational Climate/Culture

Organizational Climate and CultureBenjamin Schneider, Mark G. Ehrhart, and William H. Macey � � � � � � � � � � � � � � � � � � � � � � � � 361

Industrial Psychology/Human Resource Management

Employee RecruitmentJames A. Breaugh � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 389

Learning and Performance in Educational Settings

Self-Regulated Learning: Beliefs, Techniques, and IllusionsRobert A. Bjork, John Dunlosky, and Nate Kornell � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 417

Teaching of Subject Matter

Student Learning: What Has Instruction Got to Do With It?Hee Seung Lee and John R. Anderson � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 445

Health Psychology

Bringing the Laboratory and Clinic to the Community: MobileTechnologies for Health Promotion and Disease PreventionRobert M. Kaplan and Arthur A. Stone � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 471

Research Methodology

Multivariate Statistical Analyses for Neuroimaging DataAnthony R. McIntosh and Bratislav Misic � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 499

Contents vii

Ann

u. R

ev. P

sych

ol. 2

013.

64:4

45-4

69. D

ownl

oade

d fr

om w

ww

.ann

ualr

evie

ws.

org

by U

nive

rsity

of

Chi

cago

Lib

rari

es o

n 05

/10/

14. F

or p

erso

nal u

se o

nly.

Page 28: Student Learning: What Has Instruction Got to Do With It?

PS64-FrontMatter ARI 15 November 2012 14:20

Social Network Analysis: Foundations and Frontiers on AdvantageRonald S. Burt, Martin Kilduff, and Stefano Tasselli � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 527

Indexes

Cumulative Index of Contributing Authors, Volumes 54–64 � � � � � � � � � � � � � � � � � � � � � � � � � � � 549

Cumulative Index of Chapter Titles, Volumes 54–64 � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � � 554

Errata

An online log of corrections to Annual Review of Psychology articles may be found athttp://psych.AnnualReviews.org/errata.shtml

viii Contents

Ann

u. R

ev. P

sych

ol. 2

013.

64:4

45-4

69. D

ownl

oade

d fr

om w

ww

.ann

ualr

evie

ws.

org

by U

nive

rsity

of

Chi

cago

Lib

rari

es o

n 05

/10/

14. F

or p

erso

nal u

se o

nly.

Page 29: Student Learning: What Has Instruction Got to Do With It?
