Teaching and Teacher Education 24 (2008) 1426–1440
www.elsevier.com/locate/tate
doi:10.1016/j.tate.2007.11.016

Capturing and assessing evidence of student teacher inquiry: A case study

Peter Rich (a,*), Michael Hannafin (b)

(a) Instructional Psychology and Technology, Brigham Young University, Provo, UT 84602, USA
(b) Educational Psychology and Instructional Technology, University of Georgia, Athens, GA 30602, USA

Received 21 June 2007; received in revised form 7 November 2007; accepted 21 November 2007

*Corresponding author. Tel.: +706 389 6383; fax: +706 542 4123. E-mail addresses: [email protected], [email protected] (P. Rich), [email protected] (M. Hannafin).

Abstract

In this paper, we present five principles underlying teacher inquiry and report a case study during which student teachers in a US teacher education programme used evidence-based decision support, an inquiry method designed to help teachers analyse and adapt their own teaching through the use of a video analysis tool. This case study examined interplay among tools designed to guide teaching practice, student teachers' own self-guided inquiry, and feedback from cooperating teachers as they made instructional decisions. Preservice teachers initially accepted the guidance provided by teaching analysis tools, but abandoned the tools in favour of informal self-assessments and feedback from cooperating teachers when they assumed teaching responsibilities in their own classrooms. We discuss the role of external support and video evidence in guiding preservice teacher inquiry.

© 2008 Elsevier Ltd. All rights reserved.

Keywords: Video; Teacher education; Teacher inquiry; Evidence; Preservice; Mentor; Cooperating teacher

1. Introduction

Inquiry in teacher education involves systematically researching one's practices in context in order to improve teaching and learning. Many forms of inquiry have been reported in teacher education, including action research, critical inquiry, reflective practices, analysis of beliefs, video clubs, and teacher research (Lytle & Cochran-Smith, 1994). Most approaches, however, are consistent with principles initially set forth by John Dewey (1910) and subsequently operationalised in various preservice teacher education programmes (Noffke, 1997). Several authorities have since proposed teacher inquiry methods (see, for example, Hubbard & Power, 2003; Korthagen, 2001). In this paper, we examine how preservice teachers enact teacher inquiry to make instructional decisions using tools and feedback from cooperating teachers (CTs) to analyse their practices.

According to Dewey (1933), an inquiry begins with a question, a "perplexity" (p. 12). Something must challenge the thought process and bring to bear conscious and purposeful thinking about a subject. Dewey (1910) notes that "thinking is not a case of spontaneous combustion; it does not occur just on 'general principles.' There is something



specific which occasions and evokes it" (p. 12). Similarly, Schön (1983) noted that a teacher "reflects in action" when a "puzzling, troubling or interesting phenomenon" (p. 50) occurs for that teacher. Teacher educators seek to help preservice teachers consciously consider the perplexities of their own classrooms and teaching, encouraging them to become problem solvers (Dawson, 2006). Often, these perplexities occur in the teacher's everyday, moment-to-moment activities and actions.

The skills, the arts of looking and listening to those things that happen every day in classrooms and that subsequently tend to be overlooked are invaluable to the teacher. It is so easy to peg one's self into the doldrums of hopelessness in school, the routines of day-to-day life in school along with the bureaucratic "mumbo jumbo" that don't seem to reflect who we want to be as human beings and what we want to do for children in classrooms. (Poetter et al., 1997, p. 184)

Thus, preservice teachers must learn to problematise classroom events in order to understand them and ultimately improve their ability to teach.

Hubbard and Power (2003) suggested that preservice teachers also need support to establish parameters for their inquiries to ensure that the process does more than simply confirm or corroborate a predetermined outcome. Rich et al. (2005) characterised the inquiry process as "purposeful observation" that involves deliberate planning to anticipate what, when, and how methods are to be observed. "It is one thing to have flashes of inspiration and creative insights, but it requires careful planning and rational decision-making to put most novel ideas into practice" (Goodman, 1991). While sufficiently flexible to accommodate unexpected events, systematic inquiry focuses observation on specific qualities and attributes of teacher and student activity.

During reflective thought "judgment [is] suspended during further inquiry" (Dewey, 1910, p. 13). "Stepping back" involves detaching oneself from the observation without becoming detached from the evidence gathered, or the reflective process (Schön, 1983). Hubbard and Power (2003) characterise stepping back as "seeing and seeing again" (p. 88), noting that preservice teachers must first be able to describe observed evidence before making any concluding judgment. "Thinking, in short, must end as well as begin in the domain of concrete observations, if it is to be complete thinking" (Dewey, 1910, p. 96).

Teacher educators have lauded the inquiry process for encouraging preservice teachers to finally see the connections between the theories of learning and the practice of teaching (Poetter et al., 1997), something preservice teachers report as difficult to do (Maloch et al., 2003). Gitlin and Teitelbaum (1983) contend that preservice teachers must "'utilise' their university instruction and other sources of relevant knowledge to consider why particular schooling practices occur and their educational (and ethical) implications" (p. 230). Analysis, therefore, helps to guide preservice teachers' assessments of their own and others' practice (Gitlin, Barlow, Burbank, Kauchek, & Stevens, 1999). Analysis involves weighing concrete evidence of practice with established norms, theories and research.

Dewey (1910) further noted that "demand for the solution of a perplexity is the steadying and guiding factor in the entire process" (p. 11). The purpose of inquiry-oriented teacher education is to increase learning by improving individual teaching practices. If preservice teachers do nothing to improve their practices, then the inquiry remains incomplete. Educational researchers have made increasing calls for a "knowledge-base" in teacher education during the past 10 years (Darling-Hammond, 2000; Gitlin et al., 1999; Hiebert, Gallimore, & Stigler, 2002; Supovitz, 2002). This knowledge-base can only be built on a personal and professional level if teachers find and execute solutions to their inquiries. Gitlin and Teitelbaum (1983) extend this idea by declaring that preservice teachers can validate their analyses and findings by presenting and sharing them with a public audience, moving beyond what intuition and literature already tell them. In order to improve one's practices through inquiry, a teacher must necessarily return to the perplexity that initiated the inquiry by acting out a viable solution to it.

In a recent study (Rich, Recesso, Allexsaht-Snider, & Hannafin, 2007), we examined how preservice teachers analysed and adapted their practices while implementing evidence-based decision support (EBDS)—a scaffolded inquiry approach—via the video analysis tool (VAT). EBDS involves planning, analysing, reflecting, and adapting instructional approaches by comparing evidence of one's practice with accepted norms, conventions, and standards. We found that preservice teachers


who were engaged in video analysis pinpointed discrepancies between their perceived and recorded actions. In the current study, we examined how student teachers engaged EBDS as they planned for, executed, and adapted their instruction.

All names used in this study are pseudonyms.

2. The study

Teacher education programmes in the United States often provide successively intensive opportunities for preservice teachers to observe and practice professional teaching. In the present study, preservice teachers participated in a four-semester, phased experience. During the first semester, they spent up to 2 weeks observing multiple teachers in different classrooms. They then spent 3 weeks working with a single teacher and taught at least one lesson during the second semester. During the third semester, the preservice teachers spent 4 weeks with a single classroom teacher and taught a minimum of one entire unit (i.e., five lessons) on a mutually agreed-upon topic. During the fourth and final semester, the preservice teachers spent 10 weeks in a single classroom, assuming complete responsibility for a minimum of 2 weeks. Supervising teachers are commonly referred to as "Cooperating" or "Mentor" teachers. For the purposes of this study, we refer to these inservice teachers as CTs.

During their student teaching experience, a cohort of 26 student teachers used EBDS to examine a self-defined attribute of their practice. All had previously used the VAT to analyse their teaching during a 1-month internship in their third semester of teacher education (Rich et al., 2007). In the present study, all student teachers engaged in two EBDS cycles during a 10-week student teaching experience. While we collected survey information from the entire cohort, in order to examine individual experiences more closely we sought participants who varied in motivation to engage in this process, based on individual self-reports and the researchers' observations of the prior experience. We offered a stipend for participation, but none of the "less motivated" participants completed the study, owing to anticipated concerns over adding tasks during their student teaching. Thus, we used the complete data for, and draw our conclusions from, four preservice teachers considered motivated to participate and use the approaches.

In addition to individual analyses, we invited (and compensated) the CTs to analyse our four case-study participants' videos. We reasoned that doing so would provide additional clinical expertise in the analysis and interpretation of evidence of student teacher practice. Thus, we hoped to provide multiple triangulation points to assess student teachers' evidence-based decisions. Interestingly, the CTs differed greatly in their experience mentoring preservice teachers. Kristen was her CT's 32nd intern, and Susan was the 10th intern to a CT who was her school's student teacher liaison. In contrast, Lisa and Zoe were the first student teachers their CTs had mentored. Three of the four CTs assessed their protégés' videos using the same framework selected by their respective student teachers.

3. Methods

3.1. Data and instrumentation

The data sources, instruments, and data reduction and analysis methods used in this study are shown in Table 1.

3.1.1. Professional development plan

Prior to student teaching, participants created a professional development plan to document which aspects of their teaching they planned to analyse, what evidence they would collect around that issue, and their rationale for their instructional decisions.

3.1.2. Lenses

In order to interpret evidence of their practices through an established framework, participants were asked to choose a lens to analyse their evidence. A lens provides a specific perspective through which teaching practices can be examined. Teacher educators identified three attributes of a state-developed framework for teacher development that they felt were important for student teachers to focus on (Table 2). We reasoned that providing state-sponsored teacher assessment lenses would help student teachers focus their analyses around important teaching concepts.

3.1.3. Video analysis tool

Participants collected and analysed video evidence of their practice using the VAT (see Fig. 1)


Table 1
Data sources, instruments and analyses used for planning, implementing and adapting practice via EBDS

Planning
- Data sources and instruments: professional development plan; recorded "pre-brief"; written unit/lesson plans
- Analysis: open-coding, looking for evidence of implicit vs. evidence-based decisions; cross comparison to highlight themes across participants

Implementing
- Data sources and instruments: student teacher (ST) VAT comments; cooperating teacher (CT) VAT comments; follow-up interviews
- Analysis: constant comparison of codes, focusing on EBDS stages; triangulation of interviews, surveys, debriefs, and VAT comments to look for references to suggested and enacted decisions, compared within and across participants

Adapting
- Data sources and instruments: follow-up surveys; recorded "debrief"; final reflections; ST follow-up interviews; CT follow-up interviews
- Analysis: comparison of debrief with VAT analyses, looking for evidence of implicit vs. evidence-based decisions; coding for degree of alignment with teacher assessment instrument and frequency of lens use (GTSM code counts of VAT comments); content-based analysis of ST and CT VAT comments; timeline comparison of ST coded events against CT coded events; triangulation of VAT comments regarding assessment framework with interviews


(see Bryan and Recesso (2006) and Rich and Hannafin (2008) for detailed descriptions). Using VAT, student teachers uploaded and segmented evidence of their teaching practices, then analysed them via comments associated with the video segments under review. Student teachers also annotated each event and, if desired, interpreted the specific video clip using their chosen lens; these meta-data were then saved in a database (step 4). We collected and analysed both the student teachers' comments and their CTs' analyses of the same VAT videos.

3.1.4. Pre-briefing and debriefing videos

In order to determine the extent to which student teachers' actions resulted from implicit reflection or detailed analysis of evidence, we asked each case-study participant to video-record their thoughts prior to and following each recorded lesson. The purpose of the pre-brief was to document student teachers' preactive instructional decisions (Jackson, 1968) before they were enacted. The debrief, obtained immediately following their teaching but prior to formal analysis, was designed to document intuitive reactions prior to evidence-based comparison and to compare them with captured evidence of their actions.

3.1.5. Follow-up interviews

Interview protocols were constructed by choosing "big questions" (Mason, 1996) to guide the interview, then constructing sub-questions and breaking these into micro-level questions. The purpose of this final interview was to report on the perceived effect of EBDS inquiry on participants' instructional decisions. We asked participants to describe how (or if) evidence affected their understanding and enactment of their instructional decisions. Participants also reported the extent to which they used the new state-developed teacher assessment instrument and any effect of this on their instructional decisions. After interviewing each student teacher, we then interviewed CTs using a similar semi-structured protocol in order to confirm or contradict evidence gathered from the student teachers.

3.1.6. Final reflection paper

At the conclusion of student teaching, participants completed a 2–3 page reflection paper that synthesised their experience using EBDS and its impact on their instructional decisions. The paper provided first-person, narrative evidence of personal growth during the inquiry process.

3.1.7. Intermediate and follow-up surveys

Student teachers completed one survey at the mid-point of student teaching and another after completing it. The surveys queried the extent to which participants engaged EBDS, developed initial courses of action, and used a state-developed lens.


Fig. 1. Participants analysing video evidence of their practice using the video analysis tool (VAT).

Table 2
Example of a lens available to participants to guide/interpret their inquiries

Attribute H: Identification of student strengths and needs
Performance levels: Not yet evident; Basic; Proficient; Advanced

Indicators:
- Develops differentiated assessment plan/activities
- Applies differentiated assessment to all students
- Needs routine direction and support to develop and implement differentiated assessment
- Is knowledgeable of varied assessment approaches
- Organises assessments based on individual student needs
- Applies methods for assessing individual student needs
- Uses feedback from peers to revise assessments for individual student needs
- Seeks support to revise assessments
- Seeks opportunities to discover new assessment methods
- Dynamically adapts assessments to address specific students' needs
- Implements a range of assessments for the needs of each child
- Develops innovative assessments for specific students
- Modifies assessments on the fly based on "teachable moments" to account for individual student needs
- Is a resource to peers for sharing varied and individualised assessment methods

Domain: Assessment. Teachers understand and use a range of formal and informal assessment strategies to evaluate and ensure the continuous development of all learners.


Through the survey, all cohort members had the opportunity to share the outcomes of their inquiry projects and the perceived benefit of EBDS in this process.

3.1.8. Individual supporting documents

During the course of interviews, when participants referenced resources related to their assertions, we obtained the corresponding documents


and used these for triangulation purposes when available.

3.2. Data analysis

We used an emergent coding scheme (Glaser & Strauss, 1967) and constant comparative analysis to look for themes across participants' actions at each EBDS stage. Using Charmaz's (2002) concept of action verb descriptions, we coded interviews, VAT comments, pre-brief and debrief records, final reflections, and related supporting documents for each participant. Thereafter, we employed a constant comparative process to identify and define themes within and across participants. We conducted our analysis using Atlas.ti because of its hermeneutic approach to coding, which allowed us to connect codes at various levels and to define relationships between and among codes. We used matrix displays (Miles & Huberman, 1994) to triangulate similar and/or contradictory evidence across data sources.

We also analysed similarities and differences between student teachers' and CTs' VAT comments. We aligned the time-stamps indicating when student teachers and CTs coded each video and contrasted their comments, or annotations, related to the specific segments to determine whether they addressed the same issue(s). Finally, we conducted a cross-comparative analysis of the CTs' VAT comments to determine whether their comments were descriptive or evaluative.

Fig. 2. A graphical overview of the study procedures.
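The time-stamp alignment described above amounts to a windowed join of two annotation streams. The following sketch is purely illustrative and not part of the original study: the `Comment` structure, the `align_comments` function, the 30-second window, and the example comments are all our assumptions about how such an alignment might be implemented.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    start: float  # seconds into the video where the annotated segment begins
    text: str     # the annotation itself

def align_comments(st_comments, ct_comments, window=30.0):
    """Pair student teacher (ST) and cooperating teacher (CT) comments whose
    annotated segments begin within `window` seconds of each other, so the
    paired annotations can be contrasted for shared or divergent issues."""
    pairs = []
    for st in st_comments:
        for ct in ct_comments:
            if abs(st.start - ct.start) <= window:
                pairs.append((st, ct))
    return pairs

# Hypothetical example data for illustration only.
st = [Comment(120.0, "Students unclear on directions"),
      Comment(600.0, "Transition ran long")]
ct = [Comment(130.0, "Directions given all at once"),
      Comment(900.0, "Good questioning")]

for a, b in align_comments(st, ct):
    print(f"{a.start:>6.0f}s  ST: {a.text}  |  CT: {b.text}")
```

Unpaired comments (here the ST's 600 s comment and the CT's 900 s comment) would surface issues that only one observer noticed, which is itself analytically useful.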

4. Procedures

Fig. 2 provides a graphical overview of the study procedures. In the following section, we discuss how participants enacted their inquiry during two EBDS iterations.

4.1. Establishing a trigger

Similar to finding and framing a question, a trigger initiates an inquiry about a particular aspect of one's teaching. To help them define a trigger statement succinctly, student teachers developed a professional development plan prior to beginning student teaching. Additionally, because research consistently shows that preservice teachers need guidance and direction when analysing their own



practice (Dawson, 2006; Griffin, 2003; Richardson, 1996), participants framed their inquiries by selecting a lens that represented one of the three attributes of a state-developed framework for teacher practices. Teacher educators identified three attributes on which they wished student teachers to focus their inquiries: one attribute focused on assessing individual students' strengths and needs, another on accommodating individual student needs, and the third on classroom management and learning environments.

4.2. Planning for and collecting evidence

Consistent with the systematic observation inquiry principle, student teachers identified information to collect in order to investigate their triggers. We instructed participants to identify a minimum of two sources of evidence for their inquiries. Since the emphasis was on improving student teachers' practice through video, video was a required source of evidence. Once student teachers outlined their plans, they collected evidence for their first inquiry within the first 6 weeks of student teaching, recording practice related to their trigger and uploading videos to the VAT.

4.3. Analysis

EBDS analysis subsumes the inquiry principles of "stepping back" and "analysis," and involves both the description and interpretation of collected evidence. Recent research (Davis, 2006) demonstrates that guidance helps preservice teachers to benefit from the teaching knowledge experts deem important. During analysis, student teachers used their selected lens, or framework, to codify and interpret their evidence. Using VAT's annotation feature, participants associated comments with their lens-related practices via hyperlinks. The CTs also analysed student teachers' videos using the same trigger and lens, providing additional evidence to guide interpretation.

4.4. Developing and enacting a course of action

Noffke (1997) points out, "The process of personal transformation through the examination of practices and self-reflection may be a necessary part of social change, especially in education; it is, however, not sufficient" (p. 329). Thus, following the interpretation of their collected evidence, student teachers identified, chose, and enacted a solution related to their individual inquiry. In so doing, EBDS becomes a formative process for self-improvement that teachers may engage iteratively to systematically analyse and progressively refine their teaching. As Dewey (1910) noted about outcomes, "the ultimate educative value of all deductive processes is measured by the degree to which they become working tools in the creation and development of new experiences" (p. 96).

During the last 4 weeks of student teaching, participants repeated the process (Round 2) described above. They identified a new or refined trigger statement (via a mid-term survey), created plans for collecting data, video-recorded themselves, uploaded their videos to the VAT system, analysed their videos using the VAT, and decided on a course of action for future teaching. At the end of student teaching, we also collected each participant's final course of action, final reflection paper, and final survey and conducted follow-up interviews.

5. Researcher statement

We have been involved with the E-TEACH initiative funded by a United States Department of Education Preparing Tomorrow's Teachers to use Technology (PT3) grant. The initiative, in part, involves developing and refining a technology innovation—Web-based tools and evidence-based practices—using design-based research. As part of this emerging innovation, our research addresses the adaptation and use of the technology innovation by preservice teachers, teacher educators, mentors, and CTs. We believe that Web-supported, evidence-based approaches can facilitate the formative development of preservice teachers' practices. Consistent with the principles of design-based research, we attempt to harbour a healthy scepticism as to the value of the system so that we might improve both the tools and the methods through deliberate and iterative changes.

6. Findings and discussion

To gauge the degree to which participants enacted the EBDS methods, we asked each to summarise their individual inquiry process; their responses are summarised in Table 3.

Each participant chose a unique trigger statement, or inquiry, and reportedly progressed through each EBDS stage accordingly. Despite uniquely


Table 3
Comparison across participants' reported EBDS processes

Kristen
- Trigger: "You have to teach for a little while before you can even start. 'Cause you have to know what your weaknesses are. You have to know what you're going to work on. You have to know where you are struggling or what you are uncomfortable with…once you have some experience, then you focus more on what you wanna' work on. And choose your topic."
- Evidence planning/collection: "I do the videotapes, I know what I'm looking at, and I'm already going to be better than I was, just because I'm conscious of it."
- Analysis and interpretation: "I analyzed the first video, usually after looking at all that, so I could really tell what I was doing."
- Course of action: "After I found the different ways to make it better…analyze it and then reflect on where it's headed. If it's something that you did perfect in that time, then you can close it and feel like you have that down, but otherwise keep going. Know that that's something that you need to pay attention to in the future."

Lisa
- Trigger: "First I was observed and decided, with the help of my observers, something that I needed help on."
- Evidence planning/collection: "I did research in an informal way by talking to people. And then I implemented the research, and the different ways and strategies."
- Analysis and interpretation: "After you videotape the first time (laughs), I always looked at how I could do better. You know, either talk to somebody…"
- Course of action: "Then I decided what I still needed work on...So kind of back to the beginning of the observation…The first time, in the beginning of my stages, I was observed…after I watched that first video I was able to observe myself and decide what I needed to work on for the next time. And then I did assessment by video."

Susan
- Trigger: "I picked something that I thought I needed to work on or saw the problem. I thought about it."
- Evidence planning/collection: "I collected some evidence by videoing myself and talking to other people and observing my cooperating teacher."
- Analysis and interpretation: "I analyzed what went on in the lesson. I got feedback from my cooperating teacher."
- Course of action: "And then I developed another strategy for combating the problem that was a little different from my first one, just because of what I found out. And then I taped again and that just led me to more thinking."

Zoe
- Trigger: "In the beginning I wasn't really sure what I wanted to focus on…I knew I wanted to focus on something and so I chose something that was almost isolated to the situation I was in, with a very standards-driven school."
- Evidence planning/collection: "Through the process and through that video."
- Analysis and interpretation: "I was able to find something that I really want to focus on in my teaching…because of my goals as a teacher. Not because of the situation I was in."
- Course of action: "I've already begun doing this, kind of working on it in the classroom I'm in now...Working on recognizing the key points that I need to cover beforehand in my planning. And then keeping the pre-assessment minimum so that I can go right into those key points and get the kids started."


individual inquiries, participants shared similar experiences in how they approached the inquiry process.

6.1. Enacting EBDS stages

6.1.1. Establishing and refining a trigger

Of the three lenses available, Kristen originally intended to address assessment-related concerns—an area she most wanted to improve in her teaching. However, prior to recording her first lesson, she changed focus. Kristen's CT noted,

She felt comfortable with assessmentsyso shewanted to choose something that she didn’t feelquite so comfortable withy[As she] wentthrough the student teaching experience and she

Page 9: Capturing and assessing evidence of student teacher inquiry: A case study

ARTICLE IN PRESSP. Rich, M. Hannafin / Teaching and Teacher Education 24 (2008) 1426–14401434

worked more and more with the kids, she realisedthat sheywanted to focus on differentiation.

Kristen's new trigger addressed emergent events in her classroom. Her CT recalls, "I have a very wide range of abilities in the classroom…In reading we work in literacy groups, but it…varies by ability." Because of this range in abilities, Kristen wanted to differentiate instruction for both higher and lower performing learners.

Lisa began her inquiry by observing how her current student teaching context differed from the prior semester. In her prior experience teaching in the fifth grade, she could give several directions and expect students to follow them. But in kindergarten, she noticed, "I just kind of shout out all the directions, and they were lost." In her first pre-briefing video, Lisa reported that the transition between story-time and calendar was difficult. Thus, her initial trigger focused on transitions between floor and seat time. Lisa noted that this decision originated during prior field experiences, but with different implications for her kindergarten classroom. "I'd been…focusing on transitions and directions, the whole time, from my first practicum, and in my second. And then I realised that I had to change it depending on the age level, so I decided to try…focusing on ways to work with my kindergarteners."

Prior to choosing a trigger, Susan also based her inquiry on needs evident in her present classroom. After observing her CT for several weeks, she stated:

I noticed that [the same kids] got called on a lot more than some of the students who didn't really know what was going on, so they didn't want to…embarrass themselves by raising their hands or they weren't engaged enough to know what question was being asked.

Although Susan observed her CT calling on specific students more than others, she noted the same tendency in her own teaching: "the first few times I taught, I was doing the exact same thing as she did. 'Cause it's easy just to call on the kids that are raising their hands to get the answer that you need to move on to the next point that you are going to make."

Similarly, Zoe modified her focus based on evidence in her current student teaching context. Whereas she previously conducted read-aloud activities primarily for enjoyment, she reported feeling constrained by the need to associate her teaching with a standard. "At the school I'm at right now, everything has to be backed up with a standard. And so I want to make sure that I'm able to transition from a read-aloud into an effective activity." Interestingly, Zoe indicated that adherence to standards "wasn't necessarily something that was going to overall enhance my teaching…for years to come." Thus, her first inquiry addressed external expectations more than intent to improve her own practice.

Although asked to identify a trigger prior to beginning student teaching, all four participants subsequently modified their inquiry based on their emergent classroom needs. As Kristen noted, "once you have some experience, then you kind of focus more on what you wanna' work on." Zoe's second inquiry, which emerged as she initially analysed her teaching using the VAT, became more personally relevant than externally relevant, and unlike her initial trigger, a focus she planned to continue in the future. Interestingly, while participating student teachers underscored the importance of identifying a personally relevant trigger during actual teaching experience, their CTs recommended they find and investigate a trigger as soon as possible, preferably within the first 2 weeks of their experience. While participants intended to transition to their inquiries as they became better acquainted with their teaching responsibilities and needs, tension was evident between the student teachers' agency to identify a situationally relevant trigger and their CTs' expectations to investigate using their trigger.

Although directed to develop a trigger a priori, participants established their initial trigger with explicit advice from a CT or a university observer. Kristen recalled, "[my CT] told me pretty much that should be it." Susan relied on observations and conversations regarding her CT's practice, and collaboratively discussed approaches to involve students. As Lisa notes, "first I was observed and decided, with the help of my observers something that I needed help on." While Zoe's second trigger was influenced somewhat by her video analysis, she later wrote in her final paper, "though I recognised this weakness in myself, initially it was brought to my attention by my university supervisor during my first observation." Susan deferred to the CT to develop a trigger: "she is the expert in her classroom, 'cause she knows how I am doing in there, or what's normal for these kids."


6.1.2. Planning for and collecting evidence

In professional development plans written prior to their initial inquiry, each participant identified a minimum of one additional related piece of evidence to corroborate findings. Susan created a checklist to track the frequency with which her CT called on students during class; she then asked her CT to keep a similar tally while Susan was teaching. However, the lists were rarely used because, "we'd half-way start and then wouldn't finish them. So [we] just kind of gave up a little bit on that." Kristen planned to use a structured approach with her CT. She reported:

We did what was called a matrix, and we had what was the three groups and three days. We did what we were gonna' do for each group with the same lesson…When I was preparing, I mentally did it, but I didn't write it down.

Zoe and Lisa, in contrast, did not plan to collect evidence other than informal observations from their CTs and university supervisors. Lisa provided written observations received from outside observers, but reported that she did not use them in her formal self-analysis. Thus, participants collected and analysed primarily their own video evidence.

While video evidence is central to EBDS inquiry, Sherin and van Es (2007) recently cautioned that video's "keyhole" effect—the tendency to focus narrowly on specific events to the exclusion of other, equally important events—may limit the ability to identify critical nuances and complexities related to classroom activities. EBDS attempts to address this by using complementary and converging evidence from non-video sources to augment, complement, or refute video evidence. Despite indicating their intent, our participants did not include such evidence, suggesting that collecting multiple forms of evidence may be impractical for preservice teachers in many classroom settings. David Shum (1994), an authority in the use of evidence, suggests that force—influence on a decision—is more critical than the amount of evidence. Participants in the West et al. (2007) study characterised this as the extent to which evidence adequately represents the critical elements in a given situation. Thus, while we are concerned about our participants' seeming over-reliance on video evidence, it may be that they perceived it to have greater force for their individual inquiries.

6.1.3. Analysing practices: lenses, triggers, and mentors

Having previously examined how participants approach analysis (Rich et al., 2007), we examined how and why participants "noticed" (Sherin & van Es, 2005; van Es & Sherin, 2002) discrepancies in their teaching and generated possible solutions to their actions using a lens from the state framework. Lisa reportedly used all three of the provided lenses for her inquiry on student engagement. An analysis of her VAT comments reveals that Lisa explicitly integrated the language used in the lens to make her comments. She described the lenses as assessment tools for her progress, referring to them as teaching standards:

Those are what we're trying to meet to be effective teachers. So those are the things that we've looked at, that I've looked at every semester, in my teaching…. Last semester I think I would have been here and now after working with different people and trying different strategies, I'm here…. So, it was good to see the growth along the spectrum of the standards, and also to know, ok I'm meeting this standard. I can graduate now.

Interestingly, Lisa and her CT only addressed the same issues in their video analyses 10/21 times (about 48%). Her CT took a different approach to coding video, explaining that she used the video as an opportunity to address aspects of Lisa's teaching that were unrelated to her inquiry. Additionally, of the 21 annotations she made, only 2 could be construed as more than descriptive commentary on what Lisa was doing. Lisa did not receive her CT's feedback in time to implement any of it. Lisa explained, "[My CT] is shy…and really quiet. I have to ask her explicitly, did I do this? And if she says, 'yes,' then I have to say, 'how can I do it differently?' So it's, it was hard for us to communicate through the video…without sitting down and having to talk to her." While the lens was useful for Lisa, it was not the principal mechanism for detecting the influence of her work with individual students in the overall class. Rather, Lisa's refocusing seems to have been influenced by a combination of her video analysis and discussions with her university professor. She reported, "I saw through my video that I had students wiggling in the background and not really paying attention. But I also got that as a suggestion from Dr. Trubach."


Kristen reported difficulty analysing her first set of videos using the provided lens, stating that her classroom set-up worked well for providing individual attention to students, but poorly for video analysis. After attempting to differentiate the reading level of books selected, which was not easily observable on video, she stated, "The video does not capture that very well, but they were almost a whole grade level apart." Video's limited ability to capture differentiated techniques may have influenced her conclusion, "I didn't find very many differences in my instruction between the low level group and the upper level group."

When asked to describe the role of the lens in her analysis, she responded, "I kind of felt like most people are going to be on basic anyway." Interestingly, her CT identified the same limitation: "When I looked at the lens, I thought well, before I even see the videos, just from watching her every day, I know she's pretty much gonna' be [basic]." Thus, rather than using the lens to examine enacted teaching practices, both Kristen and her CT assumed that her practices would be basic because she was a beginning teacher.

Kristen did, however, apply her selected trigger to analyse her practice, as did her CT. Kristen and her CT addressed the same issue 7/10 times (70%). Her CT described her approach to video analysis: "I tried to relate them to what she was working on. Not just make random comments about her lesson. I tried to think about how she was differentiating and meeting student needs." Her comments were mostly evaluative, in which she posed questions to encourage Kristen to think about her teaching. She reported that they discussed Kristen's teaching for at least 1 h every day after school and that their discussions focused on "differentiation" for 20–30 min of that time. Additionally, she reported that video facilitated feedback and discussion when she and Kristen jointly discussed their individual VAT analyses, "because lots of times you teach a lesson and you remember part of what you did, but you don't really remember everything you did…I like being able to give her the feedback right there. Ok, here's the snippet I'm talking about."

Like Kristen, Susan reported the lens to be of little use during her self-analyses: "I think the real comments, or what helped me the most are just the things that I typed in." Both Susan and her CT's comments also focused on the trigger rather than on the lens. Each analysed her first VAT video within the first few days of her lesson. Out of 27 overlapping comments (with regard to time), Susan and her CT addressed the same issue 20/27 times (about 74%). Of the 5 times that Susan and her CT both rated an overlapping clip using the lens, only once did they differ in their rating of Susan's practice. Susan's CT's comments were descriptive and evaluative, offering both recommendations and praise for Susan's practices. Susan reported that her CT's comments helped to reinforce areas where she had been successful, highlighting instances in which Susan had encouraged responses from students even when they initially gave incorrect answers.
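The agreement figures reported for each pair (10/21 for Lisa, 7/10 for Kristen, 20/27 for Susan) are simple proportions of overlapping annotations addressing the same issue. As an illustrative sketch only — the pair labels and helper function are ours, not a tool used in the study — the percentages can be reproduced as follows:

```python
# Illustrative sketch: reproducing the annotator-agreement percentages
# reported in the text. Counts come from the article; the function and
# labels are hypothetical conveniences, not part of the study's method.
def agreement_rate(shared: int, total: int) -> float:
    """Proportion of overlapping annotations that address the same issue."""
    return shared / total

pairs = {
    "Lisa & CT": (10, 21),     # about 48%
    "Kristen & CT": (7, 10),   # 70%
    "Susan & CT": (20, 27),    # about 74%
}

for pair, (shared, total) in pairs.items():
    print(f"{pair}: {shared}/{total} = {agreement_rate(shared, total):.0%}")
```

The differing denominators reflect how many overlapping annotations each dyad produced, so the percentages are comparable across pairs even though the raw counts are not.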

According to Zoe, "I don't remember which [lens] I chose." Like Susan and Kristen, Zoe's VAT comments focused on her selected trigger. While noting during her first video that she took too much time on a single topic, she observed that she originally had many students participating, but dragged a lesson to the point at which only a few students were participating. As with Lisa, Zoe's initial analysis was influenced not by her CT, but by a combination of her video analysis and comments from her university facilitator, who observed a review activity at the conclusion of a lesson that took longer than expected prior to recess.

He could see the kids getting antsy, and he said you want to end it a few minutes before recess, give them a little time extra, so the next time you say, Alright, we're gonna' play a racer slide, everybody's thrilled. So he was the one that brought it to my attention that you need to end things when the kids are still enthusiastic about it.

Zoe also sought outside support in subsequent activities, requesting that her university observer focus on her ability to end lessons on a high note. Interestingly, when probed about whether the observer gave written feedback on her performance, Zoe noted that he used the "checklist that the programmes gave him, but…he checked 'good' on closure." Thus, she underscored the need for face-to-face conversations with observers in order to get in-depth feedback.

6.1.4. Enacting and adapting a course of action

All participants used both video evidence and input from outside observers to develop, enact and revise their course of action. Kristen, for example, reported, "talking with Ms. Bolden was huge. More so than watching the videos…She's able to tell me practical things to do to make it better." Having made successful transitions, Lisa sought practical suggestions from the student teaching supervisor: "I saw through my video that I had students wiggling in the background and not really paying attention. But I also got that as a suggestion from Dr. Trubach…after I had watched the first video." During her first video analysis, Susan noted that her second-language learners were more often disengaged than the rest of the class. She identified the need to "think about setting them up for success more…because you couldn't just ask [students] a question and then expect [them] to answer…that's why they weren't participating…they didn't understand." Together with her CT, she developed and enacted several techniques, including using hand-gestures to communicate, asking simpler questions, and calling on a variety of students. Zoe's course of action involved creating a lesson that would be "efficient and engaging." She began her second videotaped lesson by "immediately let[ting her] students know the plan for the day." Zoe commented, "I definitely think this was better in timing than the last taped activity. I'm glad this is what I focused on." Thus, by incorporating the input she had received from a university observer with her own analysis, Zoe's second course of action proved more effective.

7. General discussion

Several findings emerged as important to evidence-based, video-augmented teacher inquiry. First, neither preservice teachers nor CTs meaningfully used the state-standard lens to examine practice, even though those attributes were considered sufficiently important to include in the state teaching framework. In an era in which teaching standards have become increasingly prominent worldwide, this resistance is of great concern. Perhaps the lens, alone, provided insufficient guidance; alternatively, limited familiarity with the lens and video analysis might have encouraged a return to current practices. Among participants, Lisa, the individual who recognised the lens from her earlier courses, used it most often.

However, this may also reflect a tacit resistance to using "standards-based" approaches to assess teaching practice. During trigger development, for example, participants co-opted the focus of the assigned lens, imposed their own standards for evidence rather than applying those assigned to the lens, or simply disregarded or abandoned their initial lens entirely. In a prior study (Rich et al., 2007), the same participants customised individual lenses. Kristen, who used Bloom's taxonomy as a lens during the previous study, explained: "I liked the first time better. But it might have been because I was able to choose it myself, and it was really easy to measure which level I was on." Susan supported the idea of creating her own lens, something she gained great benefit from in her prior experience: "I think it might have been more useful for me to create some sort of rubric thing, too [because]…what I really ended up doing was typing up more specific comments to what I was looking for and when I did my…refining on the video."

If the state-standards lenses did not guide participants' analyses of their teaching practice, what did? All participants and two CTs reported that their comments were grounded in their identified trigger statements. Participants generally followed one CT's approach to "not just make random comments about her lesson" by focusing on the defined trigger. The lens was designed to aid in identifying those state-specific teaching attributes deemed critical for teacher development. Although Dias, Calandra, and Fox (2007), who conducted extensive video reflection research with preservice teachers, noted that excessively structured guidance may promote scripted responses, the present findings suggest that such guidance may be ignored—even when key teaching standards are emphasised.

However, "more knowledgeable other(s)" (MKO) (Vygotsky, 1987) may have validated what participants "noticed" (van Es & Sherin, 2002) during video analysis. Susan stated that her CT's "comments were the most important part to me…she would give me feedback in the classroom, but when she had to sit down and write comments like that, it was a lot more in-depth than what I had been getting from her." Even when CT feedback was minimal, Lisa and Zoe combined video analysis with input from the university teacher educators who supervised and observed student teachers: Zoe asked her observer to pay special attention to her redefined trigger in future observations; when Lisa believed that she had successfully accomplished her first goal, she sought out the course instructor for further direction. While all participants noted the importance of video analysis for examining their teaching, the CTs and teacher educators were more instrumental in directing and refining their inquiries.

External support was also important as participants sought to identify situationally relevant triggers. Once participants assumed teaching responsibilities in their own classrooms, they redefined their triggers based on observation and negotiation. Additionally, while participants voiced a preference for determining their own assessment framework, they relied more heavily on an MKO's recommendations because "they can give me practical things to do to make it better" (Kristen).

However, consistent with Miller and Carney's (2007) findings, our CTs varied widely in their use of the state-sponsored tool to evaluate student teacher practice. One of the goals of teacher inquiry is to develop educators capable of critiquing and improving their own teaching. While this research highlights the importance of a mentor in student teaching, it is also important to encourage student teachers to learn to examine their practice based on observations and evidence of their own teaching. Several researchers have cited the importance of grounding student teacher inquiries in a specific framework (Dawson, 2006; Loughran, 2002; Parkinson, 2005; Zeichner & Tabachnick, 1991). This study suggests that simply providing a given framework may not provide sufficient guidance and support to facilitate its use. Rather, teacher education programmes may need to balance external guidelines with individual teachers' personal agency.

Finding the balance between student teacher agency and external approaches to inquiry may prove problematic. Davis (2006) analysed 70+ preservice teachers' reflective journals, and highlighted several problems in their self-evaluative and reflective abilities. "They do not consistently provide evidence for their claims, generate alternatives to their decisions, or question their assumptions…Furthermore, their reflection may lack focus and be judgmental rather than evaluative" (p. 282). We employed the state-adopted lens to guide student teachers to focus on issues and assessments considered important to professional teacher educators. In effect, while supportive, we did not determine whether student teachers enacted key teaching skills or how they "measured up" to the state's standards. Since CTs vary in their use of the tools and mentoring experience, it is especially problematic to give preservice teachers the agency to conduct their own inquiries without simultaneously assuring that mechanisms are in place to help them address critical teaching attributes effectively.

Another concern was the reliance on a single source of evidence (video) to assess one's practices. While video capture extended our participants' ability to reflect on their practices beyond anecdote and recollection, it represents only one aspect of their classroom practice (Sherin & van Es, 2007). Given the clamour for increased student evidence (Whitehurst, 2002) for teacher decision-making, we need to examine multiple inquiry-based approaches, such as action research, in which teachers examine evidence of student understanding to enact change (Noffke, 1997). Our participants opted to collect only video data, which fails to address Messick's (1994) cautions about consequential validity: given the complexities of the classroom, video alone does not provide valid evidence of teaching or learning effectiveness. According to Phillip Davies (1999), while we need to incorporate existing evidence from "worldwide research on education and associated subjects" (p. 109), teacher educators need to "establish sound evidence where existing evidence is lacking or of a questionable, uncertain, or weak nature" (p. 109).

The mentor—CT or university observer—was crucial throughout inquiry. Prior to and following the action stages (collecting evidence and course of action), they provided insight to guide how participants considered their inquiries. In their review of nearly 100 studies, Wideen, Mayer-Smith, and Moon (1998) concluded that successful teacher education programmes involved "close collaboration…between the players in teacher education" (p. 152). Our study reveals that student teachers sought out different players. Lisa and Zoe sought out their teacher education professors when the aid provided to them in the classroom was insufficient. Yet, Susan's comment that "when [my CT] had to sit down and write comments, it was a lot more in-depth than what I had been getting from her [orally]" indicates that video and written analysis are vital for promoting collaboration among players. Other researchers (Baker & Milner, 2006) report that when CTs address needs in preservice teachers' zone of proximal development, they focus on pedagogy more than personality. However, the present findings indicate the need to better prepare supporting educators for "conversations between the mentors and the student teachers…[that] have an impact on student teacher classroom practice" (Hawkey, 1998, p. 657).

8. Conclusions

The use of video proved to be beneficial to our participants, but sometimes in ways other than those intended. The role of mentors—CTs and teacher educators—was crucial in helping preservice teachers to both direct and interpret videos of teaching practices. However, based on differences between formal evidence of practice and less-structured feedback garnered from experienced and inexperienced CTs, mentors also need to use analytic tools, interpret formal evidence of teaching, and provide practice-specific guidance accordingly. Additionally, while our participants indicated a preference for creating or choosing their own assessment frameworks, it may prove difficult to balance the agency given to teachers to define their own priorities with external expectations to demonstrate specific competencies. Inquiry-based methods may improve preservice teachers' teaching knowledge and skills, but further study is needed to develop and validate structured, formal approaches to refining inquiry methods that influence their practice.

References

Baker, R. S., & Milner, J. O. (2006). Complexities of collabora-

tion: Intensity of mentors’ response to paired and single

student teachers. Action in Teacher Education, 28(3), 61–72.

Bryan, L. A., & Recesso, A. (2006). Promoting reflection with a

web-based video analysis tool. Journal of Computing in

Teacher Education, 23(1), 31–39.

Charmaz, K. (2002). Qualitative interviewing and grounded

theory analysis. In J. Gubrium, & J. A. Holstein (Eds.),

Handbook of interview research (pp. 675–694). Thousand

Oaks: Sage.

Darling-Hammond, L. (2000). Teacher quality and student

achievement: A review of state policy evidence. Educational

Policy Analysis Archives, 8(1) Retrieved 29 October 2007 from

/http://epaa.asu.edu/epaa/v8n1/S.

Davies, P. (1999). What is evidence-based education? British

Journal of Educational Studies, 47(2), 108–121.

Davis, A. D. (2006). Characterizing productive reflection among

preservice elementary teachers: Seeing what matters. Teaching

and Teacher Education, 22, 281–301.

Dawson, K. (2006). Teacher inquiry: A vehicle to merge

prospective teachers’ experience and reflection during curri-

culum-based, technology-enhanced field experiences. Journal

of Research on Technology in Education, 38(3), 265–292.

Dewey, J. (1910). How we think (1st ed.). Lexington, Massachu-

setts: D.C. Heath.

Dewey, J. (1933). How we think: A restatement of the relation of

reflective thinking to the educative process. New York: D.C.

Heath and Company.

Dias, L., Calandra, B. D., & Fox, D. L. (2007). Teacher

candidates’ experience with digital video editing for reflection:

How much scaffolding do they need? Paper presented at the

annual meeting of the American Educational Research

Association. Chicago, IL.

Gitlin, A., Barlow, L., Burbank, M. D., Kauchek, D., & Stevens,

T. (1999). Pre-service teachers’ thinking on research: Implica-

tions for inquiry oriented teacher education. Teaching and

Teacher Education, 15(7), 753–759.

Gitlin, A., & Teitelbaum (1983). Linking theory and practice: The

use of ethnographic methodology by prospective teachers.

Journal of Education for Teaching, 9(3), 225–234.

Glaser, B., & Strauss, A. L. (1967). The discovery of grounded

theory: Strategies for qualitative research. Chicago: Aldine De

Gruyter.

Goodman, J. (1991). Using a methods course to promote

reflection and inquiry among preservice teachers. In B. R.

Tabachnick, & K. Zeichner (Eds.), Issues and practices in

inquiry-oriented teacher education (pp. 56–76). Bristol, PA:

The Falmer Press.

Griffin, M. L. (2003). Using critical incidents to promote and

assess reflective thinking in preservice teachers. Reflective

Practice, 4(2), 207–220.

Hawkey, K. (1998). Mentor pedagogy and student teacher

professional development: A study of two mentoring relation-

ships. Teaching and Teacher Education, 14(4), 657–670.

Hiebert, J., Gallimore, R., & Stigler, J. W. (2002). A knowledge

base for the teaching profession: What would it look like and

how can we get one? Educational Researcher, 31(5), 3–15.

Hubbard, R. S., & Power, B. M. (2003). The art of classroom

inquiry. A handbook for teacher-researchers (2nd ed.). Ports-

mouth, NH: Heinemann.

Jackson, P. W. (1968). Life in classrooms. New York: Holt,

Reinhart, & Winston, Inc.

Korthagen, F. A. J. (2001). A reflection on reflection. In F. A. J.

Korthagen (Ed.), Linking practice and theory: The pedagogy

of realistic teacher education (pp. 51–68). Mahwah, New

Jersey: Lawrence Erlbaum Associates.

Loughran, J. J. (2002). Effective reflective practice: In search of

meaning in learning about teaching. Journal of Teacher

Education, 53(1), 33–43.

Lytle, S. L., & Cochran-Smith, M. (1994). Inquiry, knowledge,

and practice. In S. Hollingsworth, & H. Sockett (Eds.),

Teacher research and educational reform, Vol. 93 (pp. 22–51).

Chicago, IL: University of Chicago Press (Part 1).

Maloch, B., Flint, A. S., Eldridge, D., Harmon, J., Loven, R.,

Fine, J., et al. (2003). Understanding beliefs, and reported

decision-making of first-year teachers from different reading

teacher preparation programs. Elementary School Journal,

103(5), 431–457.

Mason, J. (1996). Qualitative researching. Thousand Oaks, CA:

Sage.

Messick, S. (1994). The interplay of evidence and consequences in

the validation of performance assessments. Educational

Researcher, 23(2), 13–23.

Miles, M. B., & Huberman, M. (1994). Qualitative data analysis

(2nd ed.). Thousand Oaks, CA: Sage Publications.

Miller, M., & Carney, J. (2007). Using video traces software to

support the statewide assessment and licensure of beginning

teachers. Paper presented at the annual meeting of the

American Association of Colleges of Teacher Education.

New York, NY.

Noffke, S. E. (1997). Professional, personal, and political

dimensions of action research. Review of Educational Re-

search, 22, 305–343.

Parkinson, D. D. (2005). Unexpected student reflections from an

underused genre. College Teaching, 53(4), 147–151.

Page 15: Capturing and assessing evidence of student teacher inquiry: A case study

ARTICLE IN PRESSP. Rich, M. Hannafin / Teaching and Teacher Education 24 (2008) 1426–14401440

Poetter, T. S., Pierson, J., Caivano, C., Stanley, S., Hughes, S., & Anderson, H. D. (1997). Voices of inquiry in teacher education. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Rich, P., & Hannafin, M. (in press). Decisions & reasons: Examining preservice teacher decision-making through video self-analysis. Journal of Computing in Higher Education.

Rich, P., Recesso, A., Allexsaht-Snider, M., & Hannafin, M. (2007). The use of video-based evidence to analyse, act on, and adapt preservice teacher practice. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Rich, P., Recesso, A., Shepherd, C., Deaton, B., Wang, F., & Hannafin, M. J. (2005). Improving teacher educator and preservice teacher practices through technology-enhanced evidence-based inquiry. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.

Richardson, V. (1996). The role of attitudes and beliefs in learning to teach. In J. Sikula (Ed.), Handbook of research on teacher education (2nd ed., pp. 102–119). New York: Simon & Schuster Macmillan.

Schön, D. (1983). The reflective practitioner: How professionals think in action. San Francisco: Jossey-Bass.

Sherin, M. G., & van Es, E. A. (2005). Using video to support teachers’ ability to notice classroom interactions. Journal of Technology and Teacher Education, 13(3), 475–491.

Sherin, M. G., & van Es, E. A. (2007). Using video to document changes in teachers’ professional vision. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Supovitz, J. (2002). Developing communities of instructional practice. Teachers College Record, 104(8), 1591–1626.

van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.

Vygotsky, L. S. (1987). The collected works of L. S. Vygotsky: Problems of general psychology, including the volume, Thinking and speech. New York: Plenum Press.

West, R. E., Recesso, A., Hannafin, M. J., Rich, P., Shepherd, C., & Deaton, B. (2007). Evidential boundaries and the assessment of teacher practices. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Whitehurst, G. (2002). Scientifically based research on teacher quality: Research on teacher preparation and professional development. Paper presented at the White House Conference on Preparing Tomorrow’s Teachers, Washington, D.C.

Wideen, M., Mayer-Smith, J., & Moon, B. (1998). A critical analysis of the research on learning to teach: Making the case for an ecological perspective on inquiry. Review of Educational Research, 68(2), 130–178.

Zeichner, K., & Tabachnick, B. R. (1991). Reflections on reflective thinking. In B. R. Tabachnick & K. Zeichner (Eds.), Issues and practices in inquiry-oriented teacher education (pp. 1–21). Bristol, PA: The Falmer Press.