
Learning to Teach: Developing Assessment Skills When Program and Placement Are Aligned



Journal of Early Childhood Teacher Education, 27:231–247, 2006
Copyright © National Association of Early Childhood Teacher Educators
ISSN: 1090-1027 print / 1745-5642 online
DOI: 10.1080/10901020600843442



AVIVA B. DORFMAN¹, GARY R. GALLUZZO², AND SAMUEL J. MEISELS³

¹University of Michigan-Flint, Flint, Michigan, USA
²George Mason University, Fairfax, Virginia, USA
³Erikson Institute, Chicago, Illinois, USA

This study investigates the development of prospective teachers’ observation skills and understanding of assessment in two teacher education programs that integrate information about performance assessment in varying degrees into their preparation and field experiences. Focusing on eight student teachers, we used interview data to investigate the influence of exposure to principles of a curriculum-embedded performance assessment—the Work Sampling System™—through use of a handbook for teacher educators and student teaching placements. We found that when the preparation experiences were supported by placements in which assessment was ongoing and curriculum-embedded, student teachers demonstrated strengths in their assessment skills and understanding. Implications for future research and teacher preparation programs are discussed.

Introduction

The Integration of Program with Placement

Research concerning how students become teachers is only beginning to address the effects of teacher education programs on aspiring teachers’ classroom performance. Studies address a host of variables including program design (Galluzzo & Pankratz, 1990; Howey, 1996; Howey & Zimpher, 1989); the role preservice teachers play in learning to teach, including their beliefs about teaching, teachers, children, and schooling (Book & Freeman, 1986; Richardson, 1996); and the role of field experiences in shaping the thoughts and behaviors of preservice teachers (Cochran-Smith, 1995; Feiman-Nemser & Buchmann, 1985, 1989). Additional investigations have shown that consistency between program content and the strengths of the cooperating teacher in the field placement is key to student teachers’ ability to demonstrate program goals in their teaching (Copeland, 1975; Feiman-Nemser & Buchmann, 1989).

Received 20 March 2006; accepted 24 May 2006.

We are grateful for the invaluable assistance of the students, faculty, and cooperating teachers who made this study possible. The study was supported by a grant from the Joyce Foundation. The views expressed in this paper are those of the authors and do not necessarily represent the positions of the Foundation.

Address correspondence to Aviva B. Dorfman, Education Department, University of Michigan-Flint, Flint, MI, 48502. E-mail: [email protected]


Implications of these analyses highlight the unpredictable interplay among preservice teachers, their aspirations and beliefs, their program, and their performance in their field experiences. In fact, Heuwinkel (1998) suggests that learning to teach is an individual and personal activity influenced by multiple factors, including the preservice teachers’ beliefs, personal background, and the importance a student gives to practicing a skill during the student teaching experience. In contrast, Zeichner (1999) emphasizes the difficulty of modifying the tacit beliefs, understandings, and worldviews that students bring to teacher education programs.

In short, studies of teacher education demonstrate that the content, contexts, and intent of teacher education programs must be taken into account in order to clarify the impact a particular program has on its participants (see Evertson, Hawley, & Zlotnik, 1985). In addition, the relationship of the preparatory program with field experiences, especially when cooperating teachers model the content of the program, seems to enhance the preservice teachers’ abilities to implement the intent of the program (Feiman-Nemser & Buchmann, 1985). The present study focuses on the interplay among several of these factors.

Theoretical Perspective

This study reflects a model of early childhood and elementary teacher preparation that assumes that teaching and learning are complex, interactive, and coconstructed processes (Bransford, Brown, & Cocking, 1999). How teachers perceive their role and their responsibility for fostering children’s learning and development influences the nature of the learning that occurs (see Meisels, Harrington, McMahon, Dichtelmiller, & Jablon, 2001).

From this perspective, in contrast to testing—for knowledge, skill acquisition, or dispositions—assessment entails the documentation and evaluation of evidence to guide instructional decisions. It is “the process of answering questions about specific aspects of children’s knowledge, skills, personality, and academic achievements” (Meisels, 1994, p. 205).

The connection between continuous performance assessment and the development of teaching skills has been studied from a constructivist position that emphasizes the importance of self-assessment and reflection to active learning (Earl & LeMahieu, 1997). Several studies demonstrate the association between improved instruction and teachers’ use of performance assessments (Baron, 1996; Dorfman, 1997; Gong & Reidy, 1996; Khattri, Kane, & Reeve, 1995; LeMahieu & Eresh, 1996; Mills, 1996). Teachers involved in the development of assessments and evaluation criteria (rubrics and standards) and in evaluating children’s work have shown positive changes in their understanding of teaching, learning, and assessment and in their ability to accommodate student diversity (Darling-Hammond, Ancess, & Falk, 1995; Nicholson, 2000).

The Study

The present study was designed to investigate the influence of exposure to principles of a curriculum-embedded performance assessment—the Work Sampling System (WSS; Meisels, Jablon, Marsden, Dichtelmiller, & Dorfman, 1994, 2001)—through use of a handbook for teacher educators (Meisels, Harrington et al., 2001).

We explored the consequences of exposure to WSS in two different teacher preparation programs and in both WSS and non-WSS student teaching placements. Three research questions guided this research:


1. To what extent did the student teachers have a clear understanding of the connections between teaching, learning, and assessment?

2. How did the student teachers think about and apply concepts of assessment, evidence gathering, and observation in their student teaching placements?

3. What role did experience with WSS play in these student teachers’ construction of the concepts of assessment, evidence gathering, and observation?

The Work Sampling System

WSS is a system of authentic performance assessment for use with children from preschool through Grade 6. Its purpose is to help teachers document and evaluate children’s skills, knowledge, and behaviors, using the experiences, activities, and products of daily classroom life. It comprises three complementary processes: 1) observation with Developmental Guidelines and Developmental Checklists, 2) collection of children’s work in Portfolios, and 3) summarization and evaluation using the WSS Summary Report (see Meisels, 1996, 1997).

The WSS Developmental Guidelines comprise observational criteria based on national standards and knowledge of child development. They describe developmentally appropriate expectations for children at each age/grade level. Teachers use the Developmental Checklists to record ongoing observations three times per year.

WSS advocates a structured approach to Portfolios through the collection of two types of work samples: Core Items document growth and progress, and Individualized Items portray each child’s unique characteristics and learning across the curriculum.

Teachers use Summary Reports to evaluate student performance and progress three times a year. They use information from the Developmental Checklists and Portfolios to summarize a child’s strengths and challenges in ratings and brief written comments.

The three elements constitute an integrated system that helps teachers to structure their assessments, using techniques best suited to their styles, students, and contexts. They provide shared criteria for student evaluation through meaningful, curriculum-based activities and avoid reliance upon on-demand testing.

Preparation Programs

Student teachers in this study were enrolled in two different teacher education programs (referred to as College A and College B) located in a midwestern city. The students were not taught specifically to implement WSS. Rather, they were taught about the system and exposed to principles of WSS through exercises from the Handbook. Some WSS materials and concepts were introduced to provide these prospective teachers with opportunities to practice observing learners, documenting meaningful learning, and reflecting on their own practice. All of the students were placed for student teaching in kindergarten classrooms. Information about the program contexts was obtained from program descriptions and discussions with key faculty in each university.

College A is a branch campus of a large state university system. The students are graded conventionally for their course credit and, although coursework may involve learning to reflect and evaluate their practices, students are not directly involved in grading their own work. Within the undergraduate certification program for early childhood and elementary school teachers, the introduction to WSS has been integrated into a single course on assessment and into the accompanying seminar. Some students in our sample took the assessment course prior to student teaching; for others it was concurrent with their placement. In each of the two courses the introduction to WSS was brief and was associated with a field assignment.

College B is a small private institution that is noteworthy for its unique, innovative approach to teacher preparation. College B calls the program an “ability-based” learning program in teacher education. It is part of a college-wide focus on eight general areas of ability that are considered necessary characteristics of individuals educated in the liberal arts. The education program builds upon these areas and identifies five additional abilities of a professional educator. Students’ competence in these abilities is evaluated by a system of ongoing performance assessments developed by the faculty. Courses do not include exams or tests throughout the program. Rather, student assessment involves the development of rubrics and the application of evaluation and self-evaluation utilizing those rubrics. Students also create portfolios to document their achievements. They evaluate and are evaluated on their performance, not only in terms of the products they produce, but also for all components of the tasks, including the process of developing and using the products.

WSS is integrated into three courses in the early childhood sequence at College B. Students are encouraged to follow up and incorporate what they have learned in their student teaching, depending on the interests of their cooperating teacher, the type of placement they elect, and the wishes of their supervisor. Students use the Work Sampling System Omnibus Guidelines (Jablon, Dichtelmiller, Marsden, & Meisels, 1994, 2001) as a required text for these courses and are expected to complete observations of children and to do other assignments derived from the Handbook using the Guidelines as a resource. The Omnibus Guidelines are a compendium of the Developmental Guidelines that display six age/grade level expectations side by side in a single volume.

As is clear from this description, the student teachers in our sample have received widely disparate introductions to WSS during their preparation for student teaching. Moreover, the institutions themselves and the program experiences provided to the students are not directly comparable. This complication was taken into account during the data analysis and is addressed later in this paper.

Method

Participants

Students in both institutions are required to complete a 20-week period of student teaching before certification. We selected a sample of eight students in the first third to half of their student teaching placements. Four students were chosen from each institution. All were female; their ages ranged from those of typical undergraduates to those of nontraditional returning students. Four students—two from each institution—were placed with kindergarten teachers who were using WSS as their primary method of classroom assessment. Four were placed in kindergarten classrooms that were not using WSS. The two College A (traditional program) students in non-WSS placements were placed in kindergartens that utilized observational assessment. Table 1 provides the names of the students (identified by pseudonyms), their preparation programs, and the nature of their placements with respect to WSS.
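The 2 × 2 sampling design (preparation program crossed with placement type) can be sketched as a small data structure. This is purely an illustration of the design, using the pseudonyms the paper reports in Table 1.

```python
# Illustrative sketch only: the study's 2 x 2 sample (program x placement).
# The pseudonyms are those reported in the paper; the dict itself is ours.
sample = {
    ("College A", "WSS"): ["Heather", "Erin"],
    ("College A", "non-WSS"): ["Megan", "Sarah"],
    ("College B", "WSS"): ["Polly", "Mary"],
    ("College B", "non-WSS"): ["Rachel", "Shirley"],
}

# Two students per cell, eight in total, four per institution.
assert sum(len(names) for names in sample.values()) == 8
for (college, placement), names in sorted(sample.items()):
    print(f"{college} ({placement}): {', '.join(names)}")
```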

Data Sources

One investigator conducted a semistructured, audiotaped interview with each student during weeks 5–9 of the 20-week student teaching placement. The interviews ranged in length from 50–85 minutes. The interview protocol grew out of the major goals and assumptions of the Handbook and the qualities, skills, and attributes that the Handbook aims to foster in teacher education students. We designed the interview as a set of questions intended to collect information about these students’ demonstration of specific target qualities and attributes (see Table 2). In this paper we discuss three qualities in particular: 1) the students’ understanding of assessment, 2) their sources of assessment information, and 3) their observation skills.

Table 1
Students, Programs, and Placements

College      WSS placement    Non-WSS placement
College A    Heather, Erin    Megan, Sarah
College B    Polly, Mary      Rachel, Shirley

Table 2
Qualities, Skills, and Approaches Targeted by the Handbook

1. Thinking like a teacher
   a) Reasoned practice and the role of evidence
   b) Evidence used to justify decisions and plans
2. Connections between teaching, learning, and assessment
   a) Use of assessment information to inform planning and instruction
   b) Sources of assessment information
3. Reflective habits of mind
   a) Critical thinking
   b) Bias and awareness
   c) Self-evaluation
4. Assessment skills
   a) Observation skills
   b) Documenting
   c) Evaluating
   d) Guiding children in self-assessment
5. Connecting theory to practice
   a) Knowledge of child development
   b) Knowledge of individual learners
   c) Knowledge of curriculum
   d) Planning instruction, focus on activity or curricular goal
6. Focus on learners and learning
   a) Individualized goals and instruction
   b) Knowledge of individual students
   c) Development of planning: process and focus
7. Self-efficacy
   a) Continuum of concern: from self, personal performance, and achievement to outcomes for children and their learning
   b) Grounds for student’s sense of what to teach, observe, document, and assess


In order to give the interviews a conversational flow, the protocol was semistructured. The questions asked of all students were the same, though the conversation may have proceeded differently for each interviewee. Each question included an associated list of probes or prompts. The interviewer posed an open-ended main question, allowed the respondent to discuss it in her own terms, and used those terms when following up in a responsive way. The interviewer had a guiding list of probes or prompts to add if the respondent’s spontaneous answer did not address certain aspects of the main question. For example, the first question asked the student teacher to describe her current teaching situation. If a student described the classroom and the relationship with her cooperating teacher, but did not address the learners, she would be asked to describe the learners in her classroom. If she described the children in her class spontaneously, no such prompt would be necessary and none would be offered. Interview questions did not vary with preparation program or placement. The interview questions were finalized after conducting a pilot set of interviews with three student teachers (an independent sample) and a revision process thereafter.¹

On the day of the student interview, we interviewed each cooperating teacher to collect supporting or contrasting evidence for the student teacher interviews. The interviews with cooperating teachers lasted 20–30 minutes. We asked the cooperating teacher to compare this student to other student teachers and to describe aspects of the target qualities demonstrated by the students. Our particular interest was in the cooperating teachers’ perspectives about the students’ classroom practice.

Analysis

All interviews were audiotaped and transcribed. We analyzed the transcripts using the target qualities in Table 2 as a priori categories. During the analysis, we identified the transcripts by number only, to mitigate bias from knowledge of the preparation programs and placements. Using an analytical framework based on the target qualities, we categorized each student’s statements. We then created individual profiles of how each student demonstrated the target qualities. The cooperating teachers’ interview transcripts, organized thematically by the target qualities, were evaluated as triangulating evidence of the students’ actions in practice, to verify the conclusions reached in each individual profile. Due to space limitations, we do not report the cooperating teachers’ perspectives here, but used them to compare with our conclusions about the students’ strengths and practices and to verify whether our impressions of the students were consistent with their classroom functioning. Subsequently, we compared students’ individual profiles for each target quality, within each institution, across institutions, and between the students placed in WSS and non-WSS classrooms.

The analysis in this report is based on the first author’s review of the data. Before completing the study, the second author independently reviewed a sample of half the transcripts for reliability. In the process of creating the individual profiles, we independently analyzed the interview responses for content: each of us categorized each response as pertaining to the target attributes and subsequently ranked the quality of the responses using a shared rubric. While we did not calculate interrater reliability, when we reconciled our ratings we found few instances of disagreement, and the individual discrepancies in ranking were always within one level of one another.
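The authors did not compute a formal interrater statistic, but the check they describe (few disagreements, discrepancies always within one rubric level) could be quantified as exact and within-one-level percent agreement. The sketch below is a hypothetical illustration with invented ratings, not the study's data.

```python
def agreement_stats(rater_a, rater_b):
    """Return (exact agreement, within-one-level agreement) as proportions
    of items on which two raters' ordinal ratings coincide or differ by
    at most one level."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n
    return exact, within_one

# Invented ordinal ratings (1 = weak ... 4 = strong) for eight responses
a = [3, 4, 2, 3, 1, 4, 2, 3]
b = [3, 3, 2, 3, 2, 4, 2, 4]
exact, within_one = agreement_stats(a, b)
print(f"exact agreement: {exact:.2f}, within one level: {within_one:.2f}")
```

A fuller treatment would use a chance-corrected statistic such as weighted kappa, but simple percent agreement is enough to convey the "within one level" criterion.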

¹The interview protocols for the student teacher interview and the cooperating teacher interview are available upon request.


Since we were unable to include in this study a contrast group of students without exposure to WSS in their preparation programs, we cannot disentangle the effects of the teacher preparation program from the differences in Handbook use that resulted in differing levels of exposure to WSS prior to student teaching. There may be additional differences that confound our results, such as the students’ overall academic success and oral expression. In addition, the small sample size limits our ability to generalize from our conclusions. Although our interviews took place at a single point in time and were not accompanied by observations of the students in the classroom, the interviews with the cooperating teachers were used to verify the profile of each student. In the following sections we identify our sample of student teachers as “students” and their pupils as “children” for ease of differentiation.

Results

Understanding Assessment

The distinction between testing and assessment appeared clear to these students, and seemed particularly salient for the College B students, who were accustomed to being assessed by ongoing evaluation, portfolios, and structured self-assessment procedures. A few students referred to the difference between testing and assessment spontaneously during the interview. All spoke about observing purposefully during classroom activities in order to learn about the children they taught. Most were reasonably familiar with the importance of obtaining assessment information from multiple sources, such as reviewing children’s work, watching their ongoing play and interactions with others, and listening to their comments and conversations. None of the students viewed conventional testing as the most appropriate method of instructional assessment, although a few expressed support for on-demand assessment in the form of individually administered evaluations.

Shirley (College B, non-WSS placement) demonstrated her understanding of the role of evidence in practice when she described how she learned to distinguish children who did not pay attention from those who could not process information or follow directions. She realized that these were distinctions developed through ongoing observations during the placement. She described the difference between testing and assessing as follows:

A test is something you give at the end of the week and you get grades on it and if you do badly you get an F, you know? It’s . . . not really to see what you know; it’s just how you can put it on a paper correctly. But I think an assessment is over time. You see how they grow, you see what they can tell you this day, see what they can tell you that day. I think it’s more of an ongoing process, and I think a test is just right then and there.

She attributed her understanding to her experiences in her preparation program and the cycles of assessment and self-assessment as a student. Shirley clearly describes the difference between what can be learned from on-demand testing in contrast to ongoing assessment. As she says, a test gives information about a moment in time, whereas assessment provides continuous information about the process of learning and about children’s repertoires of skills and achievements. Indeed, from ongoing assessment a teacher can derive a sense of “what you know,” as opposed to merely whether or not “you can put it on a paper correctly.”


In addition to being able to differentiate between testing and assessment, these students appeared to recognize the implications of this difference for instruction. When Megan (College A, non-WSS placement), who was placed in a kindergarten that made active use of observational assessment, was asked whether changes in the way she thought about assessment altered the way she thought about teaching in general, she responded that her experience of studying about assessment had motivated her to rethink her own schooling. She said,

I just wonder how things would have been different if I had been in a setting like this, because it’s just so much fun and the kids . . . get to choose what they want to do. . . . And it’s made me change, you know? “Well that’s kind of silly. Why should they have to take tests like that when they can learn through what they want to do?”

Megan realized that if the teacher is not limited to conventional testing, she is free to provide children a broader range of activities, materials, and opportunities to engage in learning. Rachel (College B, non-WSS placement) also spoke about the implications of this view of assessment for children’s motivation to learn. She credited her preparation program with teaching her that multiple ways of assessing allow a teacher to develop a range of motivating instructional methods. Overall, these students understood that the difference between testing and assessment transcends the information a teacher can obtain about children from simplistic, right/wrong examinations, and influences the way that the classroom and instruction can be organized.

Erin (College A, WSS placement) portrayed her knowledge of the relationship between teaching and assessment when she described how she used information from students to design instruction. For example, when asked what she learns from observation, she said,

Well, observation helps a lot because you are given the opportunity to view [children], how they interact with peers, how they interact with adults, by themselves, when they’re not being watched, you know, when they think they’re not being watched, as well as how they work academically. And with that, that gives you an idea of who they are, how they learn, and you know, what their direction could be.

In this example Erin displays awareness of multiple sources of information as well as understanding of the implications of that information and of attending to the whole child when using assessment information to design instruction specifically for each child.

The students demonstrated a good understanding of assessment, how assessment methods include more than testing, and how assessment provides information for designing instruction. Overall, College B students made more spontaneous mention of making assessment a routine and integral part of planning curriculum and instruction than did College A students.

Sources of Assessment Information

There was ample evidence in the interviews about the students’ views of what is important to include in assessments. Planning for assessment was routinely part of the process of curriculum development for most of these student teachers, particularly the College B students. However, a close reading of their transcripts reveals stylistic differences in the emphasis they placed on differing sources of assessment information.

The major difference between students in performance assessment placements as contrasted with those in traditional classrooms was in the students’ working definition of evidence of learning. In general, students in placements where observational methods of assessment were used, particularly WSS, emphasized multiple sources of information. They spoke of observing children’s behaviors, language, and products in differing contexts. Students in more traditional classrooms relied upon a narrower range of evidence.

Retention and recall. For example, when asked how she evaluated a particular lesson, Rachel (College B, non-WSS placement) read from her planning binder and said, “My assessment was how the children raised their hands. What else did I write? [Reading from her notes:] ‘They were eager to answer my questions and participate. Their answers were correct and their predictions were quite accurate.’” When asked what made her feel that a thematic unit she had just described was successful, Rachel said, “Well when I correct their work, when I’m having an oral discussion, they all would raise their hands. They all say the same answer.” These examples display Rachel’s reliance upon children’s engagement in activities, skill mastery, participation in whole group discussions, and the accuracy of their responses.

Shirley (College B, non-WSS placement) also seemed to rely heavily on large group discussion and participation for her assessment. The evidence she presented included children’s attention level, products, and ability to follow directions. Shirley also gauged her sense of children’s learning by their ability to recall information during instructional activities.

In short, students from non-WSS placements spoke about the importance of learning from observation and described multiple sources of information about children. Nevertheless, in practice, their assessment of children’s learning relied upon a relatively limited range of evidence, emphasizing whole group discussions, skill mastery, and recall and retention of information.

Broad and varied sources of information. In contrast, the students in performance assessment placements appeared to use a broader definition of evidence, including children's behavior, language, and products. For example, when asked how she knows students have attained the goals she set for them, Polly (College B, WSS placement) said,

By the work presented, by talking with them about the process. You know, "How did you decide to do it that way?" and, "Gee, that's a unique way to do it. What made you think about that?" . . . I always try to question them and let them explain it to me, because maybe they can't demonstrate it on paper, but they can tell me about it. And that tells me a lot about what they know and where they are too.

Polly's approach is more differentiated than Rachel's and Shirley's. As sources of information she described analyzing children's work and interacting with children individually. Polly also emphasized the importance of the learning process as evidence of children's understanding and sense of accomplishment.

Sarah (College A, non-WSS kindergarten using observational assessment) stated that she knew when children attained the goals she set for them by assessing on different occasions and keeping track of the changes in children's responses. Her discussion of evaluating a lesson for effectiveness demonstrates her reliance upon observation:


Well, you see the kids in different parts of the day talking about it, or showing something that we did during the opening. . . . You see them actually doing it and you know, "Oh wow, they got that."

Polly and Sarah's approaches to assessment are also characteristic of the approaches of the other students in WSS placements and placements where observational assessment was used. On the whole, these six students demonstrated a broad working definition of evidence of learning that went beyond children's ability to retain and recall information. They generally relied upon more sources of information than did Rachel and Shirley, whose placements used a traditional approach to assessment.

Observation Skills

During the interviews student teachers provided or were asked to provide specific descriptions of individual children and events in the classroom. We found that these examples vary in terms of the quality of the student teachers' observations; specifically, in degree of detail included and in students' understanding of the meaning of their observations. This variation appears to be related to student teaching placement, but not to preparation program. The students in WSS placements described observations of individual children and of the class as a whole in particularly insightful ways. Although the non-WSS-placed students displayed a theoretical understanding of the role of observation and were also engaged in active observation, as a group they reported fewer and less insightful individual examples throughout their interviews. The only exception to this was Shirley (College B, non-WSS placement). She provided more examples of individual students than other non-WSS-placed student teachers, demonstrating her attention to observational information. Interestingly, Shirley revealed at the end of her interview that her field placement just prior to student teaching had been in a school using WSS.

WSS placements. When asked to address issues of assessment, Erin (College A, WSS placement) and Mary (College B, WSS placement) both talked about how they collected class-level information. Erin was convincing in her understanding of the role of evidence, and the evidence that she marshaled to make decisions about her practice was chosen and applied carefully. She knew how to exploit instructional opportunities to learn about children's strengths and learning styles. For example, she described using children's responses to the calendar activities at the start of each school day to assess children's understanding of number, numeral, and quantity. She said,

I can pick up the kids that really need to look and to count, and kids that can just point to 25 and they know where 25 is just because they know their numbers, or kids that really need to see that this is what 25 is.

Erin was able to describe the cues children give her to distinguish children who need to count, those who need to see the numerals, and those who are proficient in their understanding of quantity and numerals. Mary demonstrated her use of evidence about the whole class in the following account:

The other week I had a situation where I asked a question during a patterning activity. I asked what patterns were and the kids said, 'Colors.' 'So is there something, can you pattern anything else beside colors?' 'No.' Well those simple questions told me a whole heck of a lot. But you know, I immediately went into a lesson of how you pattern many and different things besides colors. Because often a simple assessment of those questions just, you know, right off the bat you found out so much.

In this example, Mary obtained group-level information through discussion. She immediately used it to address the children's knowledge and to inform her instructional activities.

Heather (College A, WSS placement) called attention to observing patterns over time and the importance of piecing together individual observations into composites. For example, when asked how she knows what to observe about children, she began to list the kinds of things to which she attends. Among them she listed children's affective state and said,

We have a couple of kids who really don't want to go home at night, and I mean, they're really happy in school, but when it's time to go home, no. Mood changes. When you say it's time to get ready to go home, they're just unhappy.

This conclusion appeared to rely on multiple observations combined over time. Heather's propensity for using data from several observations was also revealed in other statements. She noted that she focuses on the "kind of work they enjoy doing. Some of our kids like the fine motor. Other kids, that's when a lot of behavior problems come up, when we're doing fine motor tasks." She concluded, "I mean you've really got to take snapshots of each day and put them together." With this statement, Heather summarized the process of observing repeatedly and learning more about students from combining responses into patterns.

Erin (College A, WSS placement) and Polly (College B, WSS placement) described children with special learning needs and illustrated their use of observational data to engage these children in the classroom. Erin described an autistic child in her class as follows:

She loves to paint. So I take time out and we paint together and she loves it. She's focused and it works on her fine motor, and she's just really set. So I think that's neat. And then you can take painting and moving it on to doing things like pastels, or doing something in different art aspects, and moving that on to writing or whatever. And having movement with a pencil. I mean she's moving it with a crayon, but it's long strokes. But it's neat to have her body come and concentrate on one thing, instead of running, which is a trait that she has. So that helps me think about what can I do next to challenge her.

From this example we learn how Erin takes the time to get to know the children's interests and learning styles and uses this information to plan appropriate instruction for individual children. Polly's story illustrates a similar use of observational data. This illustration concerns a child with limited English language proficiency:

I’ll put on a CD just to fill in time until we’re going to lunch or something likethat, and he’ll sing at the top of his lungs and with such gusto and joy that hislittle face lights up. And he knows every word, pronounces it beautifully, andhe’s just in hog heaven. And so I realize that maybe his verbal participation inclass is not what it should be, but when the music’s on he’s all of it, and hegets all the other kids excited about singing. So, there’s his opportunity to be

Page 12: Learning to Teach: Developing Assessment Skills When Program and Placement Are Aligned

242 A. B. Dorfman et al.

part of the class and equal with the rest and to be able to, you know, feel thatsense of accomplishment. So I try to put singing in everyday, just because Iknow that’s his thing.

By observing the boy's responsiveness and obvious enjoyment of music, Polly found a way to support his English language development and to contribute to his feeling of being an equal in classroom activities.

Another example shows a notably flexible use of observation. Mary (College B, WSS placement) tells of a child whose patterning work looks like

just a bunch of marks, and just glancing at it you think she was just scribbling and doing nothing. But if you were watching her during it, she would be taking one marker and going blue, orange, blue, orange, so she was patterning. But you couldn't see that on her paper.

In this illustration Mary describes the importance of paying attention to the process whereby the child's work was created in order to learn about the child's thinking and skills. She continued,

When you look at the end product you can learn a lot from the kids. . . . One child that—I thought, she must be a special needs child just from the way she interacts with the other kids and her level of participation in class—but when you look at her work, everything is always right, it's done correctly, it's done nicely. You wouldn't get that from just speaking with her and your regular interactions with her. It's not until she proves it on a piece of paper that you see that she really is catching on to things, otherwise you wouldn't suspect it.

This illustration displays Mary's skills as an observer, her ability to collect varied evidence of learning, and her flexible use of this information. She knows that information about process is often very important for understanding children's thinking. She also knows that at other times, the product itself is the key to understanding the learning it demonstrated.

Non-WSS placements. The students who were placed in non-WSS placements provided far fewer examples of individual children's learning in their interviews. Megan (College A, non-WSS placement) did not provide any specific examples or illustrations at all. The students in non-WSS placements also gave examples that differed in character from those of the other students. Their illustrations were descriptive, but students were not as insightful about the meaning of the example, nor did the students discuss how they used their observational knowledge.

For example, when asked to relate how she developed individual goals for a student during an activity, Sarah (College A, non-WSS placement) described the way she included a child with limited language skills during a lesson.

I expected him to be able to choose a picture, and then, of course, he didn't know what the animal was, or wasn't able to express it, and then we modeled it by talking, you know, saying "That's a deer," or "That's a bear" and then he would repeat it back to us. So for him that was just a big step for him to be able to say what the animal was.


Sarah's response consisted of her conclusions about Mark's abilities and the way she worked him into the lesson, but there were few illustrations of what Mark did or discussion of her goals for his learning. Similarly, Rachel (College B, non-WSS placement) told a story of a child who had difficulty writing letters and copying his name. Rachel never articulated a distinct goal for this child, but focused almost entirely on the child's difficulties. When asked if she had developed a specific goal for this child during the thematic unit she prepared, she emphasized that this boy "gets frustrated. He says he's dumb, you know (pause). He recognizes that he's different. But I keep challenging him. You always have to challenge them, no matter what." Although probed repeatedly in varied ways, Rachel did not successfully describe an individual learning objective for the boy. Shirley (College B, non-WSS placement) provided many individual examples about children's learning in the course of her interview. However, her examples lacked articulation of her understanding of the observation, and gave no evidence of effort to solve the problems they posed.

For Shirley and Rachel, the weakness in their observational ability could be related to the emphases in assessment described earlier. It is reasonable to expect that a classroom that primarily encourages skill mastery and information acquisition will focus on these sources of information. It is possible that these two students do not demonstrate more insight about the children in their classrooms because their focus on what counts as evidence of learning is more narrow and fixed than is that of students who are expected to acquire and use a broad range of instructional data.

Summary of observation skills. We found a difference in the quality of observations between students in WSS and in non-WSS placements. Those in WSS placements seem better able to use assessment information flexibly, gain insights and understanding from analyzing patterns of behavior and products over time, and address whole class activity and individual learning more competently than their non-WSS placement counterparts. This was true regardless of which preparation program the students were enrolled in.

This finding was surprising and contrary to our expectations. Given the intensity of introduction to principles of WSS in their preparation in College B and the brief introduction in College A, a discrepancy leaning towards increased observational proficiency for College B students, regardless of placement, would have been more in keeping with our expectations. To our minds, this difference speaks to the intensity of the placement experience and its influence upon student teachers in their developing identity as a teacher. We suggest that the framework for observation provided by the WSS Developmental Guidelines and the structured collection of student work presented in portfolios may create a general approach to observation, assessment, and teaching that influences how student teachers learn.

Conclusions

Our conclusions are presented in terms of the three research questions delineated earlier.

First, to what extent did the student teachers have a clear understanding of the connections between teaching, learning, and assessment? Students from both preparation programs and in both kinds of placements demonstrate a good understanding of assessment. They value the role of evidence in making instructional decisions, as documented in the comments they made about designing instruction to suit children's interests, experiences, and abilities. They draw a clear distinction between testing and assessment, and understand the implications of this distinction for planning curriculum and instruction. Thinking of assessment as a routine part of planning curriculum appeared to vary more with preparation program than with placement, as College B students were more likely to mention this spontaneously than College A students.

Second, how did the student teachers think about and apply concepts of assessment, evidence gathering, and observation? We learned that these concepts are an integral part of the ways these students understand and enact the role of the teacher in the classroom. They discussed using observation to learn about individual children and their classes as a whole. They drew conclusions on the basis of evidence, and they relied upon formal and informal observations, identifying both processes as critical aspects of assessment.

However, there were differences in how these students used formal and informal observational data. In describing our findings about the sources of assessment information that the students relied upon to make judgments, we demonstrated the stylistic differences between the students. The two College B students who were placed in non-WSS classrooms seemed to have access to a narrower range of evidence than the other students, in spite of the more intense exposure to WSS and performance assessment in their preparation program. They relied heavily upon large group discussion and participation, recall and retention of information, and skill mastery when evaluating learning in their classrooms. In contrast, the students in our sample who were placed in WSS classrooms or in classrooms using observational assessment methodology seemed to hold a broader definition of what constitutes evidence of learning and were more flexible about how they collected that information.

College B students received a more comprehensive exposure to principles of WSS in the context of a program that emphasizes assessment and self-evaluation. Interestingly, the two College B students who were placed in non-WSS classrooms had the most restricted working definition of evidence of learning. This finding is in contrast to the two College A students who were also placed in non-WSS classrooms, but in classrooms where observational assessment was practiced routinely. These College A students received a less comprehensive exposure to WSS in their preparation program; yet, their approach to evidence gathering resembled that of the students who were placed in classrooms where WSS was implemented. These students did their student teaching in an environment where they witnessed cooperating teachers demonstrating practices that are consistent with elements of WSS.

In addition, we discovered qualitative differences in the frequency of examples of observations, the detail within the examples, and the understanding of the meaning of the examples that students gave when describing specific classroom events or individual learners. Students placed in classrooms using WSS appeared to be more competent in their ability to collect and relay anecdotal information and to understand its meaning. They appeared to be more likely to use observation to recognize patterns over time. They also seemed to be more flexible in what they considered evidence of learning in different children, as Mary described in attending at times to product and at other times to process in order to develop a sense of children's learning.

These two conclusions—WSS-placed students' broader definition of evidence of learning and their enhanced observational skills—emphasize the importance of the consistency of the student teaching placement with the goals of the preparation program. This finding is consistent with Heuwinkel's (1998) conclusion that a major influence on learning to teach is the student's perception of the importance of a skill and its demonstration during the student teaching experience. In an environment where the cooperating teacher routinely models observation and evidence gathering, it is reasonable to assume that the student teacher would perceive the acquisition of these skills as important.


Third, what role did experience with WSS play in these student teachers' construction of the concepts of assessment, evidence gathering, and observation? The most significant role that exposure to WSS had in the students' construction of these concepts came about as a result of the overlap between preparation program and placement type. Because of the differences between the preparation programs, and the students' varied exposure to WSS, it is impossible to disentangle the influence of the program from exposure to WSS per se. Again, our finding that College B students placed in non-WSS classrooms seemed to hold a narrow working definition of "evidence of learning" is intriguing. This finding was unexpected, given that College B students had firsthand experience with performance assessment in their college preparation and they received the most exposure to WSS.

Our evidence seems to suggest that consistency between preparation program and the student teaching placement is extremely important. In this case, only when there was a match between the preparation program and the student teaching placement did WSS fulfill its role in the development of the student teachers. This finding echoes that of Copeland (1975), who described student teachers demonstrating target abilities when preservice training and the cooperating teachers' strengths and preparation were consistent with the program's goals. In the present study the prospective teachers who were placed in WSS classrooms appeared to be well on their way to becoming professionals who understand the role of evidence in decision-making and use observation and ongoing assessment to support reasoned practice in the classroom. In particular, they seem to be keen observers of learners and learning.

Some indirect support for this conclusion emerges from an examination of two of the students, Shirley and Erin. Of the students placed in non-WSS placements, Shirley included in her responses the highest number of examples or illustrations concerning specific children or class events. Upon further examination she revealed that her previous field placement, just prior to student teaching, was in a classroom where WSS was used.

When we created the individual profiles of the students across all the target qualities, Erin stood out as a student who was strong in almost all of the characteristics examined. Erin took an active stance as a learner and was willing to take risks and revise her plans on the basis of experience. She solicited feedback from her cooperating teacher frequently, yet took a critical perspective on this feedback. Her profile was strong across the target qualities, although not necessarily the strongest in any particular category. Erin, a College A student, was placed in a WSS placement and had also completed a field placement in a WSS classroom prior to the beginning of student teaching.

Erin and Shirley's strengths, in combination with their additional WSS experiences, add support to our preliminary conclusion that for WSS to fulfill its potential in the preparation of student teachers, exposure in the preparation program should be associated with student teaching placements in classrooms of cooperating teachers using WSS or another comparable observational assessment. It appears that the consistency between program and placement enhances the effectiveness of WSS's role in the preparation of prospective teachers.

We believe that the results presented here provide support for developing studies to evaluate integrated preparation programs that link specific academic content with student teaching placements that are consistent with and exemplify the principles of the preparation program. The differences we found between the students' demonstrated abilities were contrary to what might have been expected, given the discrepancies between the preparation programs.

We recognize the limitations of our study—the differences in the two preparation programs and among the students, as well as the small sample size—and bear in mind that our conclusions are preliminary. Yet, they are suggestive of directions for future research and for the development of integrated programs of academic preparation and field placement practice.

To investigate the link between program and placement further, such a study would include preparation programs that are comparable to one another, as well as one without an introduction to WSS. This would facilitate the researcher's ability to disentangle the influence of program differences from that of the field placement. Moreover, rather than rely upon a single interview, the inclusion of observations of the student teachers in their classrooms would greatly enrich the portrait of students' understandings of assessment and how these are or are not incorporated into their practice. Additional data might be collected in interviews or in the form of students' writing about assessment at different points of their academic preparation, and especially just prior to student teaching.

Such a study could also be enhanced by the addition of university supervisors' reports on the students during the field placement in addition to descriptions of the settings in which the placements occur. This structure would enable the researcher to follow the development of students' understandings over time and to articulate the students' understandings at the beginning of the field placement. It would allow the researcher to take into account individual differences between and among students, such as teaching ability, overall success in the teacher preparation program, prior field experiences, and communicative abilities, as well as to compare the content of student teaching roles and responsibilities. Increased knowledge about students prior to student teaching and of the contexts of their placements would enable direct evaluation of the influence of the field placement and strengthen the conclusions of the study. The balance of multiple data sources and data collected at different time points could more sensitively explain the complex relationship between students' academic preparation, practical field experiences, and their development as professional teachers.

References

Baron, J. B. (1996). Developing performance-based student assessments: The Connecticut experience. In J. B. Baron & D. P. Wolf (Eds.), Performance-based student assessment: Challenges and possibilities (95th Yearbook of the National Society for the Study of Education, pp. 166–191). Chicago, IL: The National Society for the Study of Education.

Book, C., & Freeman, D. (1986). Differences in entry characteristics of elementary and secondary teacher candidates. Journal of Teacher Education, 37(2), 47–51.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Cochran-Smith, M. (1995). The power of teacher research in teacher education. In S. Hollingsworth & H. Sockett (Eds.), Teacher research and educational reform. Chicago, IL: University of Chicago Press.

Copeland, W. D. (1975). The relationship between microteaching and student teacher classroom performance. Journal of Educational Research, 68, 289–293.

Darling-Hammond, L., Ancess, J., & Falk, B. (1995). Authentic assessment in action: Studies of schools and students at work. New York, NY: Teachers College Press.

Dorfman, A. B. (1997). Performance assessment and teacher professional development. Unpublished doctoral dissertation, University of Michigan, Ann Arbor.

Earl, L. M., & LeMahieu, P. G. (1997). Rethinking assessment and accountability. In A. Hargreaves (Ed.), Rethinking educational change with heart and mind (1997 ASCD Yearbook, pp. 149–168). Alexandria, VA: Association for Supervision and Curriculum Development.

Evertson, C. M., Hawley, W. D., & Zlotnik, M. (1985). Making a difference in educational quality through teacher education. Journal of Teacher Education, 36(3), 2–12.


Feiman-Nemser, S., & Buchmann, M. (1985). Pitfalls of experience in teacher education. Teachers College Record, 87(1), 53–66.

Feiman-Nemser, S., & Buchmann, M. (1989). Describing teacher education: A framework and illustrative findings from a longitudinal study of six students. Elementary School Journal, 89, 365–377.

Galluzzo, G. R., & Pankratz, R. S. (1990). Five attributes of a program knowledge base. Journal of Teacher Education, 41(4), 7–14.

Gong, B., & Reidy, E. F. (1996). Assessment and accountability in Kentucky's school reform. In J. B. Baron & D. P. Wolf (Eds.), Performance-based student assessment: Challenges and possibilities (95th Yearbook of the National Society for the Study of Education, pp. 215–233). Chicago, IL: The National Society for the Study of Education.

Heuwinkel, M. K. (1998). An investigation into preservice teachers' interactive assessment of student understanding in a traditional program and a professional development school. Unpublished doctoral dissertation, University of Northern Colorado, Greeley.

Howey, K. R. (1996). Designing coherent and effective teacher education programs. In J. S. Sikula (Ed.), Handbook of research on teacher education (2nd ed.). New York, NY: Macmillan.

Howey, K. R., & Zimpher, N. L. (1989). Profiles of preservice teacher education: Inquiry into the nature of programs. Albany, NY: State University of New York Press.

Jablon, J. R., Marsden, D. B., Meisels, S. J., & Dichtelmiller, M. L. (1994; 2001). Work Sampling System Omnibus Guidelines. Ann Arbor, MI: Rebus, Inc.

Khattri, N., Kane, M. B., & Reeve, A. L. (1995). How performance assessments affect teaching and learning. Educational Leadership, 53(3), 80–83.

LeMahieu, P. G., & Eresh, J. T. (1996). Coherence, comprehensiveness, and capacity in assessment systems: The Pittsburgh experience. In J. B. Baron & D. P. Wolf (Eds.), Performance-based student assessment: Challenges and possibilities (95th Yearbook of the National Society for the Study of Education, pp. 125–142). Chicago, IL: The National Society for the Study of Education.

Meisels, S. J. (1994). Designing meaningful measurements for early childhood. In B. L. Mallory & R. S. New (Eds.), Diversity in early childhood education: A call for more inclusive theory, practice, and policy (pp. 205–222). New York, NY: Teachers College Press.

Meisels, S. J. (1996). Performance in context: Assessing children's achievement at the outset of school. In A. J. Sameroff & M. M. Haith (Eds.), The five to seven year shift: The age of reason and responsibility (pp. 410–431). Chicago, IL: University of Chicago Press.

Meisels, S. J. (1997). Using Work Sampling in authentic performance assessments. Educational Leadership, 54, 60–65.

Meisels, S. J. (2000). On the side of the child: Personal reflections on testing, teaching, and early childhood education. Young Children, 55(6), 16–19.

Meisels, S. J., & Harrington, H. L., with McMahon, P., Dichtelmiller, M. L., & Jablon, J. R. (2001). Thinking like a teacher: Using observational assessment to improve teaching and learning. Boston, MA: Allyn & Bacon.

Meisels, S. J., Jablon, J. R., Marsden, D. B., Dichtelmiller, M. L., & Dorfman, A. B. (1994; 2001). The Work Sampling System: An Overview. Ann Arbor, MI: Rebus, Inc.

Mills, R. P. (1996). Rewriting the tests: Lessons from the California state assessment system. In J. B. Baron & D. P. Wolf (Eds.), Performance-based student assessment: Challenges and possibilities (95th Yearbook of the National Society for the Study of Education, pp. 166–191). Chicago, IL: The National Society for the Study of Education.

Nicholson, J. M. (2000). Examining evidence of the consequential aspects of validity in a curriculum-embedded performance assessment. Unpublished doctoral dissertation, University of Michigan, Ann Arbor.

Richardson, V. (1996). The role of attitudes and beliefs in learning to teach. In J. S. Sikula (Ed.), Handbook of research on teacher education (2nd ed.). New York, NY: Macmillan.

Zeichner, K. (1999). The new scholarship in teacher education. Educational Researcher, 28(9), 4–15.
