
MODULE 3

The Backward Design

[Diagram: the three stages of backward design: 1st, identify desired results; 2nd, determine acceptable evidence; 3rd, plan learning experiences and instruction.]

Learning Objectives

• What is the purpose of assessment?

• How do we determine what kind of evidence to look for?

• What methods can be used, and when?

• How do we create assessment tasks and evaluation criteria?

• How do we make sure an assessment is valid and reliable?

Why Assess?

The purpose of assessment is to measure understanding, not to generate grades!

Assessment provides professors with:

• Reliable information from which to infer student learning

• Feedback to improve their teaching methods

Assessment provides students with:

• Feedback on how well they understand the content

• Feedback to improve their learning

How to create assessments?

Creating an assessment involves five steps:

1. Assessment Objectives

2. Evidence of Learning

3. Assessment

4. Evaluation Criteria

5. Validity and Reliability

Step 1: Assessment Objectives

[Diagram: nested circles of curricular priorities, from "Worth being familiar with" (superficial knowledge), through "Important to know and do", to the core "Big Ideas / Core Concepts".]

Step 2: Evidence of Learning

EVIDENCE refers to something that can be DEMONSTRATED: it is related to the ABILITY to do something.

[Diagram: the same nested priorities mapped to evidence of learning.
• Worth being familiar with: knows concepts and definitions (micro, descriptive)
• Important to know and do: ability to apply a specified framework to contexts approached in class (micro, domain-specific)
• Big Ideas / Core Concepts: ability to transfer knowledge to different contexts (macro, across domains, multi-disciplinary)]

Evidence of Learning (continued)

From lower- to higher-order evidence (Bloom, 2001):

• Recall definitions

• Summarize ideas, explain concepts

• Apply concepts to situations similar to the ones approached in class

• Break concepts into parts and understand their relationships

• Apply concepts to situations different from the ones approached in class; create new applications or interpretations of the concepts

• Judge the results of applying the concepts and make decisions about the quality of the application
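For illustration only (not something from the slides), this ladder can be encoded as a lookup from revised-Bloom level to the evidence expected at that level, e.g. to tag assessment items; the pairing of bullets with level names is my assumption:

```python
# Illustrative sketch: the evidence ladder above, keyed by the levels of
# the revised Bloom taxonomy (Anderson & Krathwohl, 2001). The pairing of
# slide bullets with level names is an assumption, not from the slides.
BLOOM_EVIDENCE = {
    "remember":   "Recall definitions",
    "understand": "Summarize ideas, explain concepts",
    "apply":      "Apply concepts to situations similar to those seen in class",
    "analyze":    "Break concepts into parts and understand their relationships",
    "create":     "Apply concepts to new situations; create new applications",
    "evaluate":   "Judge applications of the concepts and assess their quality",
}

def evidence_for(level: str) -> str:
    """Observable evidence expected at a given Bloom level."""
    return BLOOM_EVIDENCE[level.lower()]

for level, evidence in BLOOM_EVIDENCE.items():
    print(f"{level:>10}: {evidence}")
```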

Step 3: Assessment

This step answers two questions: when to assess, and which method (which assessment tasks) to use.

When to Assess?

Snapshot vs. photo album: a single summative exam gives only a snapshot, while combining formative and summative assessment builds a photo album of student learning.

Formative

• The objective is to give feedback to students

• Builds learning

• Students can adjust

Summative

• More focused on the grade

• Comes at the end of the grading period: there is no opportunity to adjust and show improvement

Both are necessary, at least 50% of each! Combining the two leads to a good result!

[Timeline: several formative assessments (F) followed by a summative assessment (S).]

Continuous assessment: different moments and different methods!

Assessment Tasks

[Diagram: the nested priorities mapped to task types. "Worth being familiar with" and "Important to know and do": traditional quizzes and tests (paper-and-pencil, multiple-choice, constructed response). "Big Ideas / Core Concepts": performance tasks and projects (complex, open-ended, authentic).]

Adapted from "Understanding by Design", Wiggins and McTighe

Assessment Tasks (Bloom, 2001)

From lower- to higher-order levels:

• Quizzes and traditional tests: ask about definitions

• Open-ended questions: summarize ideas, explain concepts

• Simple performance tasks: straightforward application, exercises

• Analytical tasks: experiments, scenarios, simulations, cases

• Complex performance tasks: application to new contexts and situations; create an artifact or project

• Results of analysis (decisions): pros vs. cons, costs vs. benefits, reflection

Authentic Tasks

An authentic task reflects possible real-world challenges. It is a performance-based assessment: problem-based, NOT an exercise!

An authentic task:

• Is realistic and contextualized

• Replicates key challenging real-life situations

• Requires judgment and innovation

• Asks students to "do" the subject

• Assesses students' ability to integrate concepts and ideas

• Gives the opportunity to practice and get feedback

From "Understanding by Design", Wiggins and McTighe

Authentic Task vs. Exercise

Authentic Task:

• The question is "noisy" and complicated

• Various approaches can be used

• Requires integration of concepts and skills

• There is an appropriate solution: the arguments are what matter

Exercise:

• Accuracy is what matters

• There is a right approach

• There is a right solution and answer

Exercises suit in-class, formative use; authentic tasks suit out-of-class, summative use.

From "Understanding by Design", Wiggins and McTighe

How to Formulate an Authentic Task?

Use the GRASPS template:

• Goal: What is the goal of the task? What problem has to be solved?

• Role: What is the student's role? What will students be asked to do?

• Audience: Who is the audience? Who is the client? Whom do students need to convince?

• Situation: What is the situation or context? What challenges are involved?

• Performance: What product or performance will students create?

• Standards: Against which standards or criteria will the work be judged?

From "Understanding by Design", Wiggins and McTighe
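To make the template concrete, here is a minimal sketch (my own illustration, not from the book) of GRASPS as a fill-in data structure; the example task and all its field values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GraspsTask:
    """One authentic task described with the GRASPS template
    (Wiggins & McTighe). Field meanings follow the questions above;
    the class itself is a hypothetical illustration."""
    goal: str         # the problem to be solved
    role: str         # what the student is asked to be / do
    audience: str     # client or audience to convince
    situation: str    # context and its challenges
    performance: str  # product or performance to create
    standards: str    # criteria the work will be judged by

# Hypothetical example task:
task = GraspsTask(
    goal="Recommend a database design for a small clinic",
    role="Consultant hired by the clinic",
    audience="The clinic's (non-technical) director",
    situation="Legacy paper records, limited budget, privacy rules",
    performance="A written proposal and a short oral pitch",
    standards="Feasibility, clarity, and soundness of the arguments",
)
print(task.goal)
```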

Step 4: Evaluation Criteria

Evaluation criteria must:

• Provide feedback for students

• Be clear

• Be communicated in advance

• Consist of independent variables (traits)

• Focus on the central cause of performance

• Focus on the understanding and use of the Big Ideas

Types of Evaluation Criteria

Two types of criteria are covered below: check lists and rubrics.

Check Lists

There are two types of check lists:

1. A list of questions and their correct answers

2. A list of individual traits with the maximum points associated with each of them

Check List Type 1: Questions and Answers

This type is used for multiple-choice, true/false, and similar formats; in other words, wherever there is a single correct answer.

Example answer key: 1. A, 2. C, 3. D, 4. B, 5. B, 6. D

Check List Type 2: Traits and Their Values

Performance is decomposed into traits, each carrying a weight (%) or a number of points:

Trait 1: weight (%) or points
Trait 2: weight (%) or points
Trait ...: weight (%) or points

Grade = weighted average, or Grade = sum of points
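A minimal sketch of the grade computation above; the trait names, weights, and scores are hypothetical:

```python
# Illustrative sketch: computing a grade from a traits check list.
# Trait names, weights, and scores below are hypothetical.
def weighted_grade(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-trait scores (each score on a 0-100 scale).
    Weights are fractions that must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[t] * weights[t] for t in weights)

weights = {"ideas": 0.5, "organization": 0.3, "grammar": 0.2}
scores  = {"ideas": 80, "organization": 90, "grammar": 70}
print(weighted_grade(scores, weights))  # 80*0.5 + 90*0.3 + 70*0.2 = 81.0
```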

Analytic Rubrics Are Better

An analytic rubric:

• Provides more detailed feedback for students

• Tells students in advance how they will be evaluated

• Is clearer

• Evaluates independently each characteristic that composes performance

On the other hand, a holistic rubric is used when only an overall impression is required.

How to Create Analytic Rubrics?

Example: a simple rubric to evaluate an essay.

Traits (rows): Ideas, Organization, Grammar
Levels of achievement (columns): Excellent, Satisfactory, Poor

An Analytic Rubric Can Be Created from a Check List!

Take the traits check list (Trait 1, Trait 2, ..., each with its weight or points) and break each trait down into levels of achievement, for example Excellent / Acceptable / Unacceptable.

The difference is that each trait is broken down into levels of achievement, each with a detailed description!
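As a sketch of the resulting structure (my own illustration, with hypothetical traits, weights, level points, and descriptions), an analytic rubric can be held as a table of traits, each with a weight and described levels; the grade is then the weighted average of the points of the chosen levels:

```python
# Illustrative sketch of an analytic rubric: each trait has a weight and
# per-level descriptions; the grade combines the chosen levels by weight.
# All names, weights, points, and descriptions are hypothetical.
LEVEL_POINTS = {"excellent": 100, "acceptable": 70, "unacceptable": 30}

RUBRIC = {
    "ideas": {
        "weight": 0.5,
        "levels": {
            "excellent": "Original thesis, well supported by evidence",
            "acceptable": "Clear thesis, partially supported",
            "unacceptable": "No discernible thesis",
        },
    },
    "organization": {
        "weight": 0.3,
        "levels": {
            "excellent": "Logical flow with clear transitions",
            "acceptable": "Mostly ordered, some abrupt jumps",
            "unacceptable": "No recognizable structure",
        },
    },
    "grammar": {
        "weight": 0.2,
        "levels": {
            "excellent": "Virtually error-free",
            "acceptable": "Minor errors that do not impede reading",
            "unacceptable": "Errors that obscure meaning",
        },
    },
}

def rubric_grade(chosen_levels: dict[str, str]) -> float:
    """Grade = weighted average of the points of the chosen levels."""
    return sum(RUBRIC[t]["weight"] * LEVEL_POINTS[lvl]
               for t, lvl in chosen_levels.items())

print(rubric_grade({"ideas": "excellent",
                    "organization": "acceptable",
                    "grammar": "acceptable"}))  # 50 + 21 + 14 = 85.0
```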

How to Define Traits?

Traits can be defined based on experience or on historical data:

1. Get samples of students' previous work

2. Classify the samples into different levels (strong, middle, poor, ...) and write down the reasons

3. Cluster the reasons into traits

4. Write down the definition of each trait

5. Select, among the samples, the ones that illustrate each trait

6. Continuously refine the traits' definitions

Traits can also be defined based on specific objectives and learning questions.

From "Understanding by Design", Wiggins and McTighe

How to Build an Analytic Rubric?

The following website is a free tool that helps to create rubrics:

http://rubistar.4teachers.org/index.php

Step 5: Validity and Reliability

[Image: a shooting target (http://ccnmtl.columbia.edu/projects/qmss/images/target.gif). The target stands for the desired understandings/objectives; the shots stand for assessment outcomes. Shots centered on the target mean the assessment is valid; shots clustered tightly together mean it is reliable.]

Checking for Validity

Self-assess your assessment tasks by asking the following questions (for both, consider the task characteristics and the rubrics used for evaluation):

• Could a student do well on the assessment task without really demonstrating the understandings you are after?

• Could a student do poorly while still having significant understanding of the ideas? Would this student be able to show that understanding in other ways?

If the answer is yes, the assessment is not valid: it does not provide good evidence from which to make inferences.

Adapted from "Understanding by Design", Wiggins and McTighe

Checking for Validity (continued)

The previous questions can be broken down into more detailed ones. How likely is it that a student could do well on the assessment by:

• Making clever guesses based on limited understanding?

• Plugging in what was learned, with accurate recall but limited understanding?

• Making a good effort, with a lot of hard work, but with limited understanding?

• Producing lovely products and performances, but with limited understanding?

• Applying natural ability to be articulate and intelligent, but with limited understanding?

From "Understanding by Design", Wiggins and McTighe

Checking for Validity (continued)

How likely is it that a student could do poorly on the assessment by:

• Failing to meet performance goals despite having a deep understanding of the Big Ideas?

• Failing to meet the grading criteria despite having a deep understanding of the Big Ideas?

Make sure all the answers are "very unlikely"!

From "Understanding by Design", Wiggins and McTighe
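Taken together, these self-check questions amount to a screening list: a task passes only if every answer is "very unlikely". A minimal sketch of that workflow (my own illustration; the question texts paraphrase the slides):

```python
# Illustrative sketch: operationalizing the validity self-check.
# Question texts paraphrase the slides; the screening routine is my own.
VALIDITY_QUESTIONS = [
    "Could a student do well via clever guesses, with limited understanding?",
    "Could a student do well by accurate recall alone?",
    "Could a student do well through sheer hard work, with limited understanding?",
    "Could a student do well via lovely products, with limited understanding?",
    "Could a student do well by being naturally articulate, with limited understanding?",
    "Could a student with deep understanding miss the performance goals?",
    "Could a student with deep understanding miss the grading criteria?",
]

def screen_task(answers: dict[str, str]) -> bool:
    """Pass only if every answer is 'very unlikely'; print what to revise."""
    flagged = [q for q in VALIDITY_QUESTIONS if answers.get(q) != "very unlikely"]
    for question in flagged:
        print("Revise task or rubric:", question)
    return not flagged

# Hypothetical usage: answer each question, then screen the task.
answers = {q: "very unlikely" for q in VALIDITY_QUESTIONS}
answers[VALIDITY_QUESTIONS[0]] = "somewhat likely"
print("valid?", screen_task(answers))
```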

Checking for Reliability

Assess rubric reliability by asking:

• Would different professors grade the same exam similarly?

• Would the same professor give the same grade if he graded the test twice, at different moments?

Assess task reliability by asking:

• If a student did well (or poorly) on one exam, would he do well (or poorly) on a similar exam?

Task reliability can be achieved by applying continuous assessment.

From "Understanding by Design", Wiggins and McTighe
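As an illustrative sketch (not from the book), the rubric-reliability questions can be checked empirically by comparing grades two professors gave the same exams; the scores below are made up:

```python
import statistics

# Illustrative sketch: a crude inter-rater reliability check for a rubric.
# Scores below are hypothetical grades two professors gave the same 6 exams.
grader_a = [85, 70, 92, 60, 78, 88]
grader_b = [82, 74, 90, 55, 80, 86]

# Exact agreement is too strict for 0-100 grades; use the mean absolute
# difference and the correlation between the two graders instead.
mad = statistics.mean(abs(a - b) for a, b in zip(grader_a, grader_b))
corr = statistics.correlation(grader_a, grader_b)  # Pearson r, Python 3.10+

print(f"mean absolute difference: {mad:.1f} points")
print(f"correlation between graders: {corr:.2f}")
```

A small mean difference and a high correlation suggest different professors would score and rank students similarly; large gaps point to a rubric whose level descriptions need sharpening.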

Summary

Learning Objectives → Evidence of Learning: evidence must be observable and demonstrable.

[Timeline: formative assessment tasks over the course of the term, followed by a summative assessment task at the end.]

• Task complexity depends on the desired level of understanding

• Evaluation criteria must be clear (rubrics)

• Task and criteria must provide accurate and consistent judgments (validity and reliability)

Learning Objectives (revisited)

• What is the purpose of assessment?

• How do we determine what kind of evidence to look for?

• What methods can be used, and when?

• How do we create assessment tasks and evaluation criteria?

• How do we make sure an assessment is valid and reliable?

References

• The main source of information used in this module:

Wiggins, Grant, and Jay McTighe. Understanding by Design. 2nd edition. ASCD, Virginia, 2005.

• Rubrics:

http://rubistar.4teachers.org/index.php