
Assessment for learning in science: Issues in learning to interpret student work



Page 1: Assessment for learning in science:   Issues in learning to interpret student work


Assessment for learning in science:

Issues in learning to interpret student work

Center for the Assessment and Evaluation of Student Learning (CAESL)

University of California, Berkeley
University of California, Los Angeles

Lawrence Hall of Science

Page 2: Assessment for learning in science:   Issues in learning to interpret student work


WestEd
Diane Carnahan
Karen Cerwin
Kathy DiRanna
Jo Topps

U.C. Berkeley
Maryl Gearhart
Jennifer Pfotenhauer
Cheryl Schwab

UCLA
Shaunna Clark
Joan Herman
Sam Nagashima
Ellen Osmundson
Terry Vendlinski

Lawrence Hall of Science
Lynn Barakos
Craig Strang

Page 3: Assessment for learning in science:   Issues in learning to interpret student work


Presentation

• Program and participants

• Research framework
• Design and methods
• Selected findings
• Implications

Page 4: Assessment for learning in science:   Issues in learning to interpret student work


Impetus for program

Situation
• assessments in materials of variable quality
• teachers lack expertise to revise
• professional practices not well established

Argument
• science education reform (NRC/NSES)
• known benefits of classroom assessment (e.g., Black & Wiliam, 1998; Sloane & Wilson, 2000)
• value of reflective practice and long-term collaboration (Garet et al., 2001)

Page 5: Assessment for learning in science:   Issues in learning to interpret student work


CAESL Leadership Academy, 7/03 - 12/04

Principles
• integrated with practice
• long term
• collaborative
• reflective practice

Core strategy
• assessment portfolio

Page 6: Assessment for learning in science:   Issues in learning to interpret student work


Academy participants

N = 19 (23)
Years teaching (mean): 14.6

Education (percentage)
MA+: 53

Recent participation in (percentage)
professional organizations: 89
assessment: 95

Perceived [weak (1) – very strong (5)]
content knowledge: 4.5
knowledge of standards: 4.5
knowledge of assessment: 4.3
confidence in teaching science: 4.7

Page 7: Assessment for learning in science:   Issues in learning to interpret student work


Program organization

Interwoven structures
• district vertical teams w/ administrators
• cross-district grade-level teams
• independent classroom implementation

Series of portfolios
• repeated opportunities to build expertise

Page 8: Assessment for learning in science:   Issues in learning to interpret student work


Portfolio: I. Plan

Establish learning goals
• analyze ‘conceptual flow’ of materials
• align with standards

Select assessments
• choose key assessments to track progress: pre -> junctures -> post
• identify the concepts assessed
• anticipate ‘expected student responses’

Page 9: Assessment for learning in science:   Issues in learning to interpret student work


Portfolio: II. Implementation

Interpret student work
• refine assessment criteria
• score
• chart and identify trends

Use evidence
• document instructional follow-up and feedback

Page 10: Assessment for learning in science:   Issues in learning to interpret student work


Portfolio: III. Evaluate & revise

Evaluate using student work
• alignment with goals and instruction
• quality of tasks and criteria
• methods of analysis

Revise and strengthen

Page 11: Assessment for learning in science:   Issues in learning to interpret student work


Portfolio Completion

Rated for completeness:
Complete: I, II (some student work), III
Partial: I or III, II (some)
Minimal: I or III only
None (but participating)

Page 12: Assessment for learning in science:   Issues in learning to interpret student work


Portfolio Completion

Level       Fall 2003 (22)   Spring 2004 (23)   Fall 2004 (19)
Complete    .50              .39                .47
Partial*    .32              .30                .16
Minimal     .18              .26                .16
None        .00              .00                .21

*but includes student work

Page 13: Assessment for learning in science:   Issues in learning to interpret student work


Study

Focus
• Growth in understanding and practice
• Supports and barriers

Longitudinal, nested design
• 18 months = 3 portfolios
• Cohort: Surveys, focus groups, portfolios
• Cases: Interviews and observations

Page 14: Assessment for learning in science:   Issues in learning to interpret student work


Framework for classroom assessment expertise

• Understanding of assessment concepts
• Facility with assessment practices

Page 15: Assessment for learning in science:   Issues in learning to interpret student work

UNDERSTANDING ASSESSMENT CONCEPTS

[Diagram: three components of understanding: quality goals for student learning and progress, quality tools, and quality use]

Page 16: Assessment for learning in science:   Issues in learning to interpret student work

UNDERSTANDING ASSESSMENT CONCEPTS

[Diagram: quality goals for student learning and progress, quality tools, and quality use, with sound interpretation at the center]

Page 17: Assessment for learning in science:   Issues in learning to interpret student work

CLASSROOM ASSESSMENT PRACTICES

[Diagram: assessment cycle: establish goals & design assessment plan -> provide instruction -> assess -> interpret student work -> use information to guide teaching & learning -> revise assessments]

Page 18: Assessment for learning in science:   Issues in learning to interpret student work

USING CONCEPTS TO GUIDE PRACTICE

[Diagram: the assessment cycle, with a callout at the ‘interpret student work’ step]

Soundness of interpretation:
• do the criteria capture student understanding?
• is scoring consistent?
• is the interpretation appropriate to purpose?

Page 19: Assessment for learning in science:   Issues in learning to interpret student work

[Repeats the cycle diagram and ‘soundness of interpretation’ callout from page 18]

Page 20: Assessment for learning in science:   Issues in learning to interpret student work


Selected findings

Portfolios: Patterns of change
• assessment criteria
• analysis of whole class patterns
• alignment

Exit survey and focus groups
• perceived growth
• supports, barriers, needs

Page 21: Assessment for learning in science:   Issues in learning to interpret student work


Patterns in portfolios

Source
• series of 2 or 3 portfolios (n ≈ 10)

Issues & constraints
• burden of documentation
• paper & pencil assessments
• professional choice

Page 22: Assessment for learning in science:   Issues in learning to interpret student work


Assessment criteria


+from global/holistic to more specific, differentiated, and assessable

+from focus on surface features to efforts to capture student understanding

+from dichotomous (right/wrong) to attention to qualitative levels of understanding

=but … quality variable despite teacher interest (example: reliance on content analysis or notes)
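A minimal sketch of the shift this slide describes, from dichotomous scoring toward qualitative levels of understanding (the rubric levels and evidence flags are invented for illustration, not drawn from the CAESL materials; Python is used only as a convenient notation):

```python
# Hypothetical leveled rubric: instead of right/wrong, a response is
# placed at the highest qualitative level the evidence supports.
LEVELS = {
    0: "no evidence of the target concept",
    1: "surface features only (vocabulary, copied phrasing)",
    2: "partial understanding, with gaps or misconceptions",
    3: "full, connected understanding of the concept",
}

def score_dichotomous(correct: bool) -> int:
    """Old style: right/wrong only."""
    return 1 if correct else 0

def score_leveled(evidence: dict) -> int:
    """Leveled style: the evidence flags are hypothetical stand-ins
    for what a teacher actually looks for in a written response."""
    if evidence.get("connected_explanation"):
        return 3
    if evidence.get("partial_concept"):
        return 2
    if evidence.get("surface_vocabulary"):
        return 1
    return 0

print(score_leveled({"partial_concept": True}))  # -> 2
```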

Page 23: Assessment for learning in science:   Issues in learning to interpret student work


Whole class analysis


+from few to efforts at systematic analysis using charting or content analysis

+from global patterns toward more differentiated analysis (item analysis, item clustering) and efforts to coordinate group & individual patterns

+efforts to analyze progress (especially pre-post)

=but … information often unintegrated, inferences unsystematic, comparisons inappropriate
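The charting and item-level analysis described above can be pictured with a small sketch (the class set, item names, and scores are invented; pandas is assumed available, though any tabular tool would do):

```python
import pandas as pd

# Hypothetical class set: one row per student, one column per item,
# each scored 0-3 on a leveled rubric, for a pre and a post assessment.
pre = pd.DataFrame(
    {"item1": [1, 0, 2, 1], "item2": [0, 1, 1, 0]},
    index=["ana", "ben", "cho", "dev"],
)
post = pd.DataFrame(
    {"item1": [2, 1, 3, 2], "item2": [1, 2, 2, 1]},
    index=["ana", "ben", "cho", "dev"],
)

# Item analysis: class means per item show which concepts lag.
print(pre.mean())
print(post.mean())

# Progress analysis (pre -> post): per-student gains coordinate group
# and individual patterns (who moved, and on which concepts).
gain = post - pre
print(gain)                # positive entries = growth on that item
print(gain.mean(axis=1))   # each student's average gain
```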

Page 24: Assessment for learning in science:   Issues in learning to interpret student work


Alignment


+efforts to align interpretations with learning goals, tasks, and criteria

+efforts to revise criteria to strengthen alignment

+fewer inferences about ancillary skills not assessed

=but … problematic alignment of assessments and inferences to track progress

Page 25: Assessment for learning in science:   Issues in learning to interpret student work


Exit survey

Understanding of CAESL
• 1 (none) <--> 5 (full)

Implementation of CAESL
• 1 (none) <--> 4 (full) <--> 5 (beyond)

Page 26: Assessment for learning in science:   Issues in learning to interpret student work


UNDERSTANDING: INTERPRETATION OF WORK

Item (examples)               Mean [None (1) – Full (5)]   Percentage below full (5)
Analyzing whole class sets    4.3                          .50
Comparing pre and post        4.4                          .45
Comparing pre and juncture    4.4                          .50

Page 27: Assessment for learning in science:   Issues in learning to interpret student work


IMPLEMENTATION: INTERPRETATION

Item (examples)                          Mean [None (1) – Full+ (5)]   Percentage below full (4)

Evaluating and developing criteria
Evaluating criteria – content quality    3.5                           .40
Developing using ESRs                    3.6                           .40
  (expected student responses)
Refining using student work              3.9                           .25

Scoring
Using criteria to score                  4.0                           .20
Checking reliability of scoring          3.8                           .30
Charting scores                          3.2                           .55

Charting and analyzing patterns
Charting conceptual understanding        3.2                           .65
Analyzing whole class patterns           3.8                           .30
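One standard way to check scoring consistency (the ‘checking reliability of scoring’ item above) is to double-score a sample of responses and compute agreement. A sketch with invented scores; percent agreement and Cohen's kappa are common statistics for this, not necessarily the ones the academy used:

```python
from collections import Counter

def percent_agreement(a, b):
    """Share of responses given the same score by both raters."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two teachers double-score the same ten responses on a 0-3 rubric.
rater1 = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
rater2 = [0, 1, 2, 3, 3, 1, 1, 2, 3, 1]
print(percent_agreement(rater1, rater2))  # 0.8
print(cohens_kappa(rater1, rater2))       # ~0.73, chance-corrected
```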

Page 28: Assessment for learning in science:   Issues in learning to interpret student work


Exit focus groups

Page 29: Assessment for learning in science:   Issues in learning to interpret student work

PRACTICES STRENGTHENED MOST?

[The assessment cycle diagram, used as a focus-group prompt]

Page 30: Assessment for learning in science:   Issues in learning to interpret student work

PRACTICES STRENGTHENED LEAST?

[The assessment cycle diagram, used as a focus-group prompt]

Page 31: Assessment for learning in science:   Issues in learning to interpret student work

PRACTICES MOST IMPORTANT?

[The assessment cycle diagram, used as a focus-group prompt]

Page 32: Assessment for learning in science:   Issues in learning to interpret student work


Supports

Portfolio
• establishing goals
• revision and re-application
• resource for collaboration

Resources
• CAESL framework
• articles and books
• grade-level teams & facilitators
• long term program

Page 33: Assessment for learning in science:   Issues in learning to interpret student work


Barriers

Portfolio
• assessment development
• only paper and pencil
• focus on performance assessments
• time

Resources
• weak assessments
• limited frameworks
• no clear models for progress
• gaps in teacher knowledge

Context
• standards
• testing
• school & district

Page 34: Assessment for learning in science:   Issues in learning to interpret student work

34

Portfolio• assessment development• only paper and pencil• focus on perf. assessments• time

Resources• weak assessments• limited frameworks• no clear models for progress• gaps in teacher knowledge

Barriers

34

Context• standards• testing• school & district

Unnamed challenges• inquiry• ancillary skills

Page 35: Assessment for learning in science:   Issues in learning to interpret student work


Requests

Portfolio … if …
• streamline
• focus on goals, interpretation, and use
• refinement not development
• expand assessment types

Resources
• embedded assessments
• handbook
• conceptual development
• grade-level collaboration
• coaching and facilitation

Context
• align with district and state assessments

Page 36: Assessment for learning in science:   Issues in learning to interpret student work


Implications

• Strengthen materials & resources
• Expand to unit assessment systems
• Align assessment content and quality
• Modify program organization