Page 1: Graduate Attribute Assessment Workshop

Graduate Attribute Assessment Workshop
Dalhousie University

Brian Frank
Director (Program Development)
Faculty of Engineering and Applied Science
Queen's University

May 31, 2010

Page 2: Graduate Attribute Assessment Workshop

Objectives
• Apply accepted assessment principles to create a process for informing program improvement and meeting CEAB requirements
• Review specific processes and tools used at Queen's University

Be familiar with:
• Tools
• Technology
• Terminology

Page 3: Graduate Attribute Assessment Workshop

Administrative issues

• Questions/issues/discussion? Ask throughout
• Mobile email: [email protected]
• Compiled resources available at http://bit.ly/9OSODq (short link)
• Summary 1-page handout for reference

Page 4: Graduate Attribute Assessment Workshop

Perspective

“The institution must demonstrate that the graduates of a program possess the attributes under the following headings... There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.”

Page 5: Graduate Attribute Assessment Workshop

Graduate attributes

The twelve CEAB graduate attributes:
• Knowledge base
• Problem analysis
• Investigation
• Design
• Use of engineering tools
• Individual and team work
• Communication skills
• Professionalism
• Impact on society and environment
• Ethics and equity
• Economics and project management
• Lifelong learning

[Diagram: attributes mapped to course types: engineering science, technical laboratory, and project/experiential]

Page 6: Graduate Attribute Assessment Workshop

Graduate attribute assessment

Outcomes assessment is used to answer questions like:
• What can students do?
• How does their performance compare to our stated expectations?

It identifies gaps between our perceptions of what we teach and what students learn program-wide.

Page 7: Graduate Attribute Assessment Workshop

Inputs and Outcomes

Inputs:
• Student pre-university background
• Faculty education, professional status
• Ongoing faculty development
• Class sizes
• Content
• Campus resources
• Contact hours
• Laboratory equipment
• Support services

Outcomes:
• Demonstrated abilities (cognitive, skills, attitudes)
• Indirect measures of abilities

Page 8: Graduate Attribute Assessment Workshop

Instructors: “We do assess outcomes – by grades”

[Figure: sample student transcript listing course grades: Electric Circuits I, Electromagnetics I, Signals and Systems I, Electronics I, Electrical Engineering Laboratory, Engineering Communications, Engineering Economics, ..., Electrical Design Capstone]

• How well does the program prepare students to solve open-ended problems?
• Are students prepared to continue learning independently after graduation?
• Do students consider the social and environmental implications of their work?
• What can students do with knowledge (plug-and-chug vs. evaluate)?

Course grades usually aggregate assessment of multiple objectives, and are indirect evidence for some expectations.

Page 9: Graduate Attribute Assessment Workshop

Outcomes assessment widely used
• Common in the Canadian primary, secondary, and community college educational systems
• Required by most engineering accreditation bodies
• Recommended Undergraduate Degree-Level Expectations created by provincial Ministers of Education (required for all Ontario post-secondary programs)
• CDIO Syllabus includes specific learning outcomes

Page 10: Graduate Attribute Assessment Workshop

Good news:

Most programs probably already have people doing this on a small scale:
• Some instructors use course learning objectives
• Design course instructors often assess design, communications, and teaming skills separately
• Rubrics are becoming common for assessing non-analytical objectives

Can identify innovators and key instructors (e.g. project-based design courses, communications, economics)

Page 11: Graduate Attribute Assessment Workshop

Setting up a process (without overwhelming faculty, irritating staff, and going deeper into debt)

Page 12: Graduate Attribute Assessment Workshop

Questions for programs:

Given requirements: assess in 12 broad areas (graduate attributes), and create a process for program improvement.

• What are your program's specific and measurable expectations?
• How to measure the students against specific expectations?
• Where to measure the expectations (courses, internships, extra-curriculars, ...)?
• What process for analyzing data and using it for improvement?

Page 13: Graduate Attribute Assessment Workshop

Example of comprehensive curriculum design overview by P. Wolf at U Guelph

[Figure from P. Wolf, New Directions for Teaching and Learning, Volume 2007, Issue 112 (pp. 15-20). Used with permission.]

Page 14: Graduate Attribute Assessment Workshop

Assessment principles
• Assessment works best when the program has clear objectives.
• Assessment requires attention to both outcomes and program.
• Assessment should be periodic, not episodic.
• Assessment should be part of instruction.

Page 15: Graduate Attribute Assessment Workshop

Sample process flow

Identify major objectives (including graduate attributes)
→ Create measurable assessment criteria
→ Map to courses/experiences
→ Identify appropriate measures to assess (reports, simulation, tests, ...)
→ Measure
→ Analyze and evaluate data
→ Program improvement
→ Course changes, then measure again (the cycle repeats)

Stakeholder input feeds the objectives.

Page 16: Graduate Attribute Assessment Workshop

Program objectives

[Process flow diagram repeated; current step: identify major objectives (including graduate attributes)]

Page 17: Graduate Attribute Assessment Workshop

Creating program objectives
• CEAB graduate attributes
• Strategic plans
• Advisory boards
• Major employers of graduates
• Input from stakeholders
• Focus groups, surveys
• SWOT (strengths, weaknesses, opportunities, threats) analysis

What do you want your program to be known for?

Page 18: Graduate Attribute Assessment Workshop

Create specific assessment criteria

[Process flow diagram repeated; current step: create specific and measurable assessment criteria]

Page 19: Graduate Attribute Assessment Workshop

Why do we need assessment criteria?

Lifelong learning: An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

• Can this be directly measured?
• Would multiple assessors be consistent?
• How meaningful would the assessment be?

Page 20: Graduate Attribute Assessment Workshop

Why measurable criteria

Lifelong learning: An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

Can this be directly measured? Would multiple assessors be consistent? How meaningful would the assessment be?

Probably not, so more specific measurable criteria are needed. This allows the program faculty to decide what is important.

Page 21: Graduate Attribute Assessment Workshop

Measurable criteria: examples

Graduate attribute (lifelong learning): An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.

Assessment criteria. The student:
• Describes the types of literature of their field and how it is produced
• Develops a research plan to meet information needs
• Critically evaluates information for authority, currency, and objectivity
• Uses information ethically and legally to accomplish a specific purpose

Page 22: Graduate Attribute Assessment Workshop

Creating measurable criteria
• What specific things should students demonstrate?
• What do they need to be able to do?
• Are they measurable and meaningful?
• Can involve cognitive processes (recalling, analyzing, creating), attitudes, and skills

A criterion pairs a content area with a level of expectation (“describes”, “compares”, “applies”, “creates”, etc.).

Example: Critically evaluates information for authority, currency, and objectivity (level: evaluates; content area: information sources).

Page 23: Graduate Attribute Assessment Workshop

Problematic criteria

Example: “Learn static physics principles including Newtonian laws for linear motion”

What does the author mean? Students can state the laws? Plug numbers into equations? Apply laws to solve conceptual problems? ... The content area is clear, but the level of expectation is not.

Page 24: Graduate Attribute Assessment Workshop

Taxonomy (from highest- to lowest-order):

• Creating (design, construct, generate ideas)
• Evaluating (critique, judge, justify decision)
• Analyzing (compare, organize, differentiate)
• Applying (use in new situation)
• Understanding (explain, summarize, infer)
• Remembering (list, describe, name)

Anderson, L. W., Krathwohl, D. R., et al. (Eds.) (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn & Bacon (Pearson Education Group).

Page 25: Graduate Attribute Assessment Workshop

Verbs for cognitive expectations (roughly lower- to higher-order):

Define, List, State, Recall, Identify, Recognize, Calculate, Label, Locate, Interpret, Compare, Contrast, Solve, Estimate, Explain, Classify, Modify, Integrate, Analyze, Hypothesize, Evaluate, Justify, Develop, Create, Extrapolate, Design, Critique

(Verbs later in the list indicate higher-order skills; a rough checker is sketched below.)
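
To make the “measurable verb” idea concrete, here is a minimal sketch in Python (an illustration, not part of the workshop materials): it flags draft criteria that open with a vague verb, such as the “Learn static physics principles” example from the problematic-criteria slide. The verb sets are abridged from the slides; the helper itself is an invention.

```python
# Minimal sketch (illustration only): flag draft criteria that open
# with a vague verb instead of a measurable one.
MEASURABLE = {"define", "list", "state", "calculate", "interpret", "compare",
              "solve", "explain", "analyze", "evaluate", "design", "critique"}
VAGUE = {"learn", "understand", "know", "appreciate"}

def check(criterion: str) -> str:
    verb = criterion.split()[0].lower()
    if verb in VAGUE:
        return f"vague verb '{verb}': what should students demonstrably do?"
    if verb in MEASURABLE:
        return "ok"
    return "unrecognized verb: check against the taxonomy"

print(check("Learn static physics principles including Newtonian laws"))
print(check("Solve conceptual problems using Newtonian laws"))
```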

Page 26: Graduate Attribute Assessment Workshop

Create criteria (10 min)

In groups of 2-4, select a graduate attribute to focus on, then:
1. Independently create some measurable criteria for that attribute reflecting your objectives
2. Discuss the criteria as a table: are the criteria measurable? Are they meaningful? Would they be consistent from one rater to another?

Page 27: Graduate Attribute Assessment Workshop

Follow-up to creating criteria

Any points for discussion?

Page 28: Graduate Attribute Assessment Workshop

Assessment criteria resources
• EC2000, ABET 2009
• UK-SPEC, Engineering Subject Centre Guide
• Engineers Australia
• CDIO
• Foundation Coalition
• UDLEs
• Discipline-specific (Civil Engineering Body of Knowledge, IET criteria for electrical and computer engineering, etc.)

Note: assessment criteria may also be known as performance criteria, outcomes, competencies, or objectives.

Many linked at: http://bit.ly/9OSODq (case sensitive, no zeros)

Page 29: Graduate Attribute Assessment Workshop

Measurable criteria

[Process flow diagram repeated; current step: create measurable assessment criteria]

Page 30: Graduate Attribute Assessment Workshop

Criteria mapping

Assessment criteria are mapped across the program: first year, middle years, and graduating year.

Example: first year courses (design, physics, calculus, chemistry, etc.). Within a design project course:
• Assignment 1 used to assess: Criteria 1, 2, 3
• Assignment 2 used to assess: Criteria 1, 4, 5
• Team proposal used to assess: Criteria 1, 6, 7
• etc.

A data-structure sketch of this mapping follows.
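
A minimal sketch of this mapping as a data structure, assuming Python. The deliverable names follow the slide; the criterion IDs (C1..C8) are invented. Inverting the map answers “where is each criterion assessed?” and exposes criteria with no coverage.

```python
from collections import defaultdict

# Deliverables follow the slide; criterion IDs C1..C8 are invented.
assessment_map = {
    "Assignment 1":  ["C1", "C2", "C3"],
    "Assignment 2":  ["C1", "C4", "C5"],
    "Team proposal": ["C1", "C6", "C7"],
}

# Invert the map to answer: where is each criterion assessed?
coverage = defaultdict(list)
for deliverable, criteria in assessment_map.items():
    for criterion in criteria:
        coverage[criterion].append(deliverable)

# Criteria assessed nowhere are gaps in the curriculum map.
all_criteria = {f"C{i}" for i in range(1, 9)}
print({c: coverage[c] for c in sorted(coverage)})
print("Unassessed:", sorted(all_criteria - set(coverage)))
```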

Page 31: Graduate Attribute Assessment Workshop

Mapping opportunities
• Courses
• Co-ops/internships
• Co-curricular activities (competitive teams, service learning, etc.)
• Exit or alumni surveys/interviews
• ...

Page 32: Graduate Attribute Assessment Workshop

Example: ABET recommends mapping tables

Assessment criteria | Development (courses) | Assessment method | Where measured | Time measured | Assessment coordinator | Evaluation
Produces research information for the team | ME113, EM213, ME213, ME235, ME333, ME412 | Portfolios; peer evaluations; faculty evaluations | ME213, ME412 | ME213 even years; ME412 odd years | Even: Armaly; Odd: Richards | Curriculum Committee
Demonstrates understanding of team roles when assigned | ME113, EM213, ME213, ME235, ME333, ME412 | Peer evaluations; faculty evaluations | ME213, ME412 | ME213 even years; ME412 odd years | Even: Armaly; Odd: Richards | Curriculum Committee
Shares in the work of the team | ME113, EM213, ME213, ME235, ME333, ME412 | Peer evaluations; faculty evaluations | ME213, ME412 | ME213 even years; ME412 odd years | Even: Armaly; Odd: Richards | Curriculum Committee
Demonstrates good listening skills | ME113, EM213, ME213, ME235, ME333, ME412 | Peer evaluations; faculty evaluations | ME213, ME412 | ME213 even years; ME412 odd years | Even: Armaly; Odd: Richards | Curriculum Committee

Page 33: Graduate Attribute Assessment Workshop

Assessment measures

[Process flow diagram repeated; current step: identify appropriate measures to assess (reports, simulation, tests, ...)]

Page 34: Graduate Attribute Assessment Workshop

Assessment measures

How to measure the students against specific expectations?

• Direct measures: directly observable or measurable assessments of student learning
  • E.g. student exams, reports, oral examinations, portfolios, etc.
• Indirect measures: opinions or self-reports of student learning or educational experiences
  • E.g. grades, student surveys, faculty surveys, focus group data, graduation rates, reputation, etc.

Page 35: Graduate Attribute Assessment Workshop

Assessment measures
• Local written exam (e.g. question on final)
• Standardized written exam (e.g. Force Concept Inventory)
• Performance appraisal (e.g. lab skill assessment)
• Simulation (e.g. emergency simulation)
• Behavioural observation (e.g. team functioning)
• External examiner (e.g. reviewer on design projects)
• Oral exam (e.g. design project presentation)
• Focus group
• Surveys and questionnaires
• Oral interviews
• Portfolios (student-maintained material addressing outcomes)
• Archival records (registrar's data, previous records, ...)

Page 36: Graduate Attribute Assessment Workshop

External assessment measures
• Concept inventories (Force Concept Inventory, Statics Concept Inventory, Chemistry Concept Inventory, ...)
• Surveys of learning, engagement, etc.:
  • National Survey of Student Engagement (national data sharing, allowing internal benchmarking), E-NSSE
  • Course Experience Questionnaire
  • Approaches to Studying Inventory
  • Academic Motivation Scale
  • Engineering attitudes survey

Page 37: Graduate Attribute Assessment Workshop

Measuring using rubrics

• Creating defined levels (“scales”) of expectations reduces variability between graders and makes expectations clear to students

Page 38: Graduate Attribute Assessment Workshop

Rubrics

An analytic rubric has dimensions (rows) and scales (columns). Scales: Not demonstrated | Marginal | Meets expectations | Exceeds expectations

Dimension 1, Information (example descriptors):
• Not demonstrated: information sources not credible or critically evaluated
• Marginal: information sources credible but not explicitly evaluated
• Meets expectations: variety of sources used, critically evaluated for currency, authority, ...
• Exceeds expectations: uses sources generally accepted as best in field, ...

Dimension 2 (Design) and Dimension 3 (Communications) each have their own descriptor per scale.

Descriptors are specific descriptions of what is seen at a particular scale (a data-structure sketch follows).
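
A minimal sketch of the analytic rubric above as a data structure, assuming Python. The Information descriptors are abbreviated from the slide; the other dimensions use placeholder descriptors. Scoring a submission means choosing one scale level per dimension.

```python
# The rubric above as a data structure; Information descriptors are
# abbreviated from the slide, the other dimensions use placeholders.
SCALES = ["Not demonstrated", "Marginal", "Meets expectations",
          "Exceeds expectations"]

rubric = {
    "Information": ["sources not credible or evaluated",
                    "credible but not explicitly evaluated",
                    "variety of sources, critically evaluated",
                    "best-in-field sources"],
    "Design": ["...", "...", "...", "..."],
    "Communications": ["...", "...", "...", "..."],
}

def score(selection: dict[str, str]) -> dict[str, int]:
    """Map the chosen scale label per dimension to its level index (0-3)."""
    return {dim: SCALES.index(level) for dim, level in selection.items()}

print(score({"Information": "Meets expectations",
             "Design": "Marginal",
             "Communications": "Exceeds expectations"}))
```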

Page 39: Graduate Attribute Assessment Workshop

Rubrics
• Holistic: assess a work as a whole, rather than by components
  • More appropriate when performance tasks require students to create some sort of response and where there is no definitive correct answer (Nitko 2001)
  • Often faster to use, but provide less feedback
• Analytic: assess a work by multiple components
  • Preferred when a fairly focused type of response is required; there may be one or two acceptable responses and creativity is not an essential feature of the students' responses (Nitko 2001)
  • Slower, but often more defensible in an engineering context

Page 40: Graduate Attribute Assessment Workshop

Example: Analytic rubric

Page 41: Graduate Attribute Assessment Workshop

Task: Assessment measures (5 min)
• Take some assessment criteria developed by your group previously:
  • Determine three ways that they could be assessed (a list of assessment measures is on the summary sheet), at least one using a direct assessment measure
  • If any are difficult to measure, consider whether the criteria should be modified

Page 42: Graduate Attribute Assessment Workshop

Measure

[Process flow diagram repeated; current step: measure]

Page 43: Graduate Attribute Assessment Workshop

Measuring principles
• Not required to measure every criterion every year. Could measure in years of the accreditation cycle as follows:
  • Design: years 1, 4
  • Communications: years 2, 5
  • Knowledge: years 3, 6
  • ...
• No requirement to assess every student; sampling may be appropriate for some assessment measures (see the sketch below)
• Assessment is for the program; there is no requirement to demonstrate that every student meets all expectations
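
Both ideas above (the staggered schedule and sampling) are straightforward to express; a minimal sketch assuming Python, with an invented cohort size and sample size.

```python
import random

# The staggered schedule from the slide: each attribute measured twice
# in a six-year accreditation cycle.
schedule = {"Design": {1, 4}, "Communications": {2, 5}, "Knowledge": {3, 6}}

def attributes_for_year(year: int) -> list[str]:
    return [attr for attr, years in schedule.items() if year in years]

# Sampling instead of assessing everyone (cohort and sample sizes invented).
students = [f"student{i:03d}" for i in range(400)]
sample = random.sample(students, k=60)

print(attributes_for_year(4))   # ['Design']
print(len(sample))              # 60
```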

Page 44: Graduate Attribute Assessment Workshop

Data gathering and storage
• Modern learning management systems are able to link outcomes to learning activities (e.g. Moodle, Blackboard, Desire2Learn)
• Reports, assignments, and quizzes in the LMS can be linked to outcomes and simultaneously graded for course marks and assessment criteria (see the sketch below)
• Examples using Moodle later
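
A minimal sketch of the “grade once, record twice” idea, assuming Python. This is not the Moodle API, only an illustration of one grading pass feeding both the course gradebook and an outcomes record; the criterion codes FYDE2c and FYCO1c are taken from the Queen's criteria table later in the deck.

```python
# Not the Moodle API: an illustration of one grading pass feeding both
# the course gradebook and the outcomes record.
course_grades: dict[tuple[str, str], float] = {}
outcome_records: list[tuple[str, str, int]] = []   # (student, criterion, level)

def grade(student: str, item: str, mark: float,
          outcomes: dict[str, int]) -> None:
    """Record the course mark and, at the same time, per-criterion levels."""
    course_grades[(student, item)] = mark
    for criterion, level in outcomes.items():
        outcome_records.append((student, criterion, level))

# Criterion codes FYDE2c and FYCO1c appear in the criteria table later.
grade("s001", "Team proposal", 82.0, {"FYDE2c": 3, "FYCO1c": 2})
print(course_grades)
print(outcome_records)
```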

Page 45: Graduate Attribute Assessment Workshop

Analyze and Evaluate

[Process flow diagram repeated; current step: analyze and evaluate data]

Page 46: Graduate Attribute Assessment Workshop

Analyze and evaluate
• Longitudinal comparison of students' competency
• Histogram of results by level (did or did not meet expectations)
• Analysis of which areas are weakest, and variation by student experience/background
• Triangulation: examination of the correlation between results on multiple assessments of a single criterion (histogram and correlation sketched below)
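
A minimal sketch of the histogram and the triangulation step, assuming Python 3.10+ (for statistics.correlation); the rubric levels are invented data.

```python
from collections import Counter
from statistics import correlation   # Python 3.10+

# Invented rubric levels (0-3) for one criterion across a small cohort.
levels = [2, 3, 1, 2, 2, 0, 3, 2, 1, 2]
print(Counter(levels))   # histogram of results by level

# Triangulation: correlate two assessments of the same criterion for the
# same students (e.g. a report score and an exam-question score).
report_scores = [2, 3, 1, 2, 2, 0, 3, 2, 1, 2]
exam_scores   = [2, 2, 1, 3, 2, 1, 3, 2, 0, 2]
print(correlation(report_scores, exam_scores))
```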

Page 47: Graduate Attribute Assessment Workshop

Results: example

Page 48: Graduate Attribute Assessment Workshop

Is this data useful?
• Validity: how well an assessment measures what it is supposed to
  • Direct measures vs. indirect
  • Authentic assessment (emulating professional practice)
• Reliability: the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects
  • The repeatability of the measurement
  • A measure is considered reliable if a person's score on the same test given twice is similar
  • Estimated by test/retest, or by internal consistency using multiple methods to assess the same criteria (sketched below)
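
Reliability of a categorical rubric score can be estimated from two ratings of the same students; a minimal sketch assuming Python, computing raw agreement and Cohen's kappa (a standard chance-corrected agreement statistic, not something named in the slides) on invented data.

```python
from collections import Counter

# Two ratings of the same students on the same criterion (invented data):
# test/retest, or two raters for inter-rater reliability.
first  = [2, 3, 1, 2, 2, 0, 3, 2, 1, 2]
second = [2, 3, 2, 2, 1, 0, 3, 2, 1, 2]

n = len(first)
observed = sum(a == b for a, b in zip(first, second)) / n

# Chance agreement from each rating's marginal distribution.
p1, p2 = Counter(first), Counter(second)
expected = sum((p1[c] / n) * (p2[c] / n) for c in set(first) | set(second))

kappa = (observed - expected) / (1 - expected)
print(f"raw agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```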

Page 49: Graduate Attribute Assessment Workshop

Program improvement

[Process flow diagram repeated; current step: program improvement]

Page 50: Graduate Attribute Assessment Workshop

Program improvement
• Changes in existing courses
• New courses/streams (Queen's is going through this)
• New approaches (service learning, co-ops, case studies, problem-based learning, ...)

Page 51: Graduate Attribute Assessment Workshop

Faculty buy-in to process

Scenario 1: Instructors of certain courses are assigned assessment criteria and rubrics, developed by {X} and {Y} from other universities/countries, to be used in their classes.

Scenario 2: Instructors are asked to be part of a committee, with a facilitator, to answer the question “what do we want our students to be able to do, and how do we assess it”, and to establish a process.

Page 52: Graduate Attribute Assessment Workshop

Faculty buy-in
• Blindly using another group's criteria may not be appropriate
• It often takes some time to understand the rationale for criteria
• Going through the process of developing criteria aids in using them
• Even if the list of criteria is similar, the faculty buy-in and comfort with using it is not

Page 53: Graduate Attribute Assessment Workshop

Anecdotal info from US experience
• Takes about 18 months to set up an assessment process
• Faculty reaction is skeptical to negative at first, but after 4-5 years value is often perceived in outcomes assessment
• Capitalize on what you're already doing: innovators, first adopters, experimenters; pick battles that are (a) necessary and (b) winnable
• Go for early wins
• Don't generate reams of data that you don't know what to do with: create information, not data

Page 54: Graduate Attribute Assessment Workshop

Institutional examples

Page 55: Graduate Attribute Assessment Workshop

Institutional examples
• U Calgary: for programs that are part of CDIO (MECH), use the CDIO syllabus and expectations and map them to the CEAB graduate attributes
• U Sherbrooke: some programs are primarily problem-based and have been using program objectives for years, so they will be adapting current processes
• Queen's U: next

Page 56: Graduate Attribute Assessment Workshop

Queen's Engineering: long-term goals

• Specific: establish criteria that are meaningful and measurable (directly or indirectly)
• Triangulated: criteria measured using multiple methods or events to assess validity
• Leveled: graduate attributes measured at multiple times in students' program, with leveled expectations
• Multi-use: satisfy expectations of both CEAB and the province (OCAV UDLEs)

Page 57: Graduate Attribute Assessment Workshop

Queen's University process
• Spring 2009: gathered relevant resources (e.g. research, other accreditation guidelines)
• Summer 2009: working groups of faculty, students, and topical experts created specific program-wide assessment criteria (next slide)
• Summer 2009: set up learning management system (Moodle) to manage assessments
• Sept 2009 - April 2010: piloted assessment in first year courses (design, calculus, chemistry)
• May - August 2010: analyzing data, working with departments to plan for assessing fourth year

Page 58: Graduate Attribute Assessment Workshop

Working groups
• Queen's process is driven by faculty for improving the program, not just for accreditation purposes
• “What do we want our students to be able to do, and what data would we like to have to improve the program?”
• Asked instructors, undergraduate chairs, and topical experts, including some “skeptics”, to participate in creating assessment criteria
• 7 working groups formed, covering the 12 graduate attributes
• Working groups met for 1 hour 3-4 times and developed initial assessment criteria

Page 59: Graduate Attribute Assessment Workshop

[Screenshot of a criteria document, annotated: graduate attribute, categories, levels, assessment criteria, and linkage to OCAV UDLEs]

Page 60: Graduate Attribute Assessment Workshop

Assessment criteria mapping

[Criteria mapping diagram repeated from page 30: assessment criteria mapped across first year, middle years, and graduating year; first year design project deliverables (Assignment 1, Assignment 2, team proposal, ...) each assess specific criteria]

Page 61: Graduate Attribute Assessment Workshop

Course implementation
• Criteria compared to the first year design and professional skills course (APSC-100), and course deliverables adapted to meet objectives
• Assessment criteria entered into Moodle and linked to assignment submissions
• When students uploaded assignments/reports, grading was automatically tied to outcomes in Moodle (discussed later)
• Two first year engineering science and math courses assessed on final exams

Page 62: Graduate Attribute Assessment Workshop

Linking outcomes to courses

CEAB # | Attribute | Category | Code | Assessment criteria | Event(s)
3.04 | Design | Process overview | FYDE1 | Iterates steps in a defined design process to design a system, component, or process to solve an open-ended complex problem | Team proposal; team final
3.04 | Design | Problem definition | FYDE2a | Accurately identifies and describes the presented problem | Team assignment 1
3.04 | Design | Problem definition | FYDE2b | Identifies customer and user needs | Team assignment 1
3.04 | Design | Problem definition | FYDE2c | Gathers and uses information from appropriate sources, including applicable standards, patents, regulations as appropriate (AECS) (3a; 3c) (similar to LL1FYb and IN1FYa) | Team proposal
3.04 | Design | Conceptual design | FYDE3 | Produces a variety of potential design solutions suited to meet functional specifications (ESC) (2a; 3bii) | Team proposal; team final
3.04 | Design | Preliminary design | FYDE4a | Performs systematic evaluations of the degree to which several design concept options meet project criteria (2) (AECS) (1cii; 3biv) | Team proposal; team final
3.04 | Design | Preliminary design | FYDE4b | Feasible proposal for implementation and testing | Team proposal; team final
3.04 | Design | Evaluation | FYDE7 | Compares the design solution against the functional specifications (AECS) | Team final
3.06 | Teamwork | Teamwork | FYTE2a | Recognizes a variety of working and learning preferences | Peer/individual evaluation; team assignment 1
3.06 | Teamwork | Teamwork | FYTE2b | Applies principles of conflict management to resolve team issues | Peer/individual evaluation
3.06 | Teamwork | Teamwork | FYTE2c | Assumes responsibility for own work; is self-directed (6ai) | Peer/individual evaluation
3.06 | Teamwork | Teamwork | FYTE2d | Describes own temperament | Individual assignment 1
3.06 | Teamwork | Teamwork | FYTE2e | Analyzes impact of own temperament on group work | Individual assignment 1
3.06 | Teamwork | Leadership | FYTE3 | Exercises initiative and contributes to team goal-setting | Peer/individual evaluation
3.07 | Communications | Written | FYCO1a | Identifies and repeats standard formats | Written
3.07 | Communications | Written | FYCO1b | Recalls and reproduces standard grammar and mechanics | Written
3.07 | Communications | Written | FYCO1c | Summarizes and paraphrases written work accurately with appropriate citations | Team assignment 2; written; team proposal; team final
3.07 | Communications | Oral | FYCO2 | Delivers clear and organized formal presentation following established guidelines | Oral
3.07 | Communications | Graphical communications | FYCO3 | Uses figures and tables appropriately to complement text; standard conventions employed | Written
3.08 | Professionalism | Professionalism | FYPR1a | Demonstrates punctuality, responsibility and appropriate communication etiquette | Peer/individual evaluation
3.08 | Professionalism | Professionalism | FYPR1b | Participates actively in meetings, helps to generate ideas | Peer/individual evaluation

A record-representation sketch follows.
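
The linkage table is naturally represented as plain records; a minimal sketch assuming Python, using a few (code, event) pairs from the table above. Grouping by event yields a grading checklist for each deliverable.

```python
from collections import defaultdict

# A few (code, event) links taken from the table above; grouping by
# event yields a grading checklist per deliverable.
links = [
    ("FYDE2c", "Team proposal"),
    ("FYDE3",  "Team proposal"),
    ("FYCO1c", "Team proposal"),
    ("FYDE7",  "Team final"),
    ("FYCO1c", "Team final"),
]

checklist = defaultdict(list)
for code, event in links:
    checklist[event].append(code)

for event, codes in checklist.items():
    print(f"{event}: assess {', '.join(codes)}")
```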

Page 63: Graduate Attribute Assessment Workshop

Specific examples
• Professional skills assessment using a rubric
• Communication skills assessment using a rubric
• Knowledge assessment on a calculus exam
• Problem analysis assessment on a chemistry exam

Page 64: Graduate Attribute Assessment Workshop

Example 1: Professional skills assessment on an assignment

Avoid extra work: assess criteria as part of assignment/report grading.

The assignment uses program-wide assessment criteria as the expectations for grades, allowing simultaneous grading for (1) the course and (2) graduate attribute assessment.

Page 65: Graduate Attribute Assessment Workshop

Example 2: Communication assessment on a report

Report requirements use program-wide assessment criteria as the expectations for grades, allowing simultaneous grading for (1) the course and (2) graduate attribute assessment.

Page 66: Graduate Attribute Assessment Workshop

Example 3: Knowledge assessment

The calculus instructor asked questions on the exam that specifically targeted 3 assessment criteria for “Knowledge”:
1. “Create mathematical descriptions or expressions to model a real-world problem”
2. “Select and describe appropriate tools to solve mathematical problems that arise from modeling a real-world problem”
3. “Use solution to mathematical problems to inform the real-world problem that gave rise to it”

Page 67: Graduate Attribute Assessment Workshop

Example 3: Knowledge assessment (continued)

The calculus exam was organized to allow the three criteria to be assessed independently:
• Students asked to create equation(s) that model a physical system
• Students asked to solve a mathematical expression (“typical” mathematics)
• Students asked to interpret the meaning of an equation that models a physical system

Marks on those questions were used to assess student ability to meet expectations (sketched below).
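
A minimal sketch of turning per-question exam marks into met/not-met results for the three criteria, assuming Python; the question numbers and the 60% threshold are assumptions, not from the deck.

```python
# Question numbers and the 60% threshold are assumptions, not from the deck.
QUESTION_TO_CRITERION = {"Q3": "create model", "Q4": "solve", "Q5": "interpret"}
THRESHOLD = 0.6

def assess(marks: dict[str, float]) -> dict[str, bool]:
    """marks: fraction of available marks earned on each mapped question."""
    return {QUESTION_TO_CRITERION[q]: m >= THRESHOLD for q, m in marks.items()}

print(assess({"Q3": 0.75, "Q4": 0.50, "Q5": 0.90}))
# {'create model': True, 'solve': False, 'interpret': True}
```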

Page 68: Graduate Attribute Assessment Workshop

Use to evaluate how well students meet expectations and to improve the program.

[Figure: histogram of results in a first year design course]

Page 69: Graduate Attribute Assessment Workshop

Resources/time commitment
• Creating assessment criteria: 7 committees of approximately 5 people, each meeting about 4 times
• Mapping criteria to a course and creating rubrics for assessment: ~10 hours
• Large-scale curricular changes: ~10 person committee, most of whom had 1 course relief bought out by the dean
• Coordination (resource gathering, planning, curricular planning): ~40% of a position

Page 70: Graduate Attribute Assessment Workshop

Technology support:Learning management systems and outcomes

Page 71: Graduate Attribute Assessment Workshop

Learning management systems
• Blackboard: measurement instruments, reporting and tracking
• Desire2Learn: “Competencies” tools
• Sakai: outcomes, portfolio strength
• Moodle: outcomes (the following slides show how outcomes are managed in Moodle)

Page 72: Graduate Attribute Assessment Workshop

[Screenshot: selected assessment criteria (“outcomes”) in Moodle, with Queen's identifiers for the assessment criteria]

Page 73: Graduate Attribute Assessment Workshop

Moodle online assignments

Page 74: Graduate Attribute Assessment Workshop

Assignment upload

Page 75: Graduate Attribute Assessment Workshop

Outcome grading for assignments

Page 76: Graduate Attribute Assessment Workshop

Outcome grading - popup

Page 77: Graduate Attribute Assessment Workshop

Gradebook report by outcome

Page 78: Graduate Attribute Assessment Workshop

Moodle development
• Customizing Moodle for our purposes:
  • Group upload/grading of assignments
  • Peer evaluation
  • Class response system (“clickers”)
• Future collaboration

Page 79: Graduate Attribute Assessment Workshop

Graduating year process (September 2010)
• Forming a group of capstone course instructors to look at sharing resources
• Develop discipline-specific expectations on top of program-wide expectations
• Assess most professional skills in capstone courses

Page 80: Graduate Attribute Assessment Workshop

Plans
• Broaden assessment to upper years: capstone courses, economics, technical communications
• Start triangulation: e.g. assess the ability to make assumptions and approximations in 3 courses in the same semester
• Assess inter-rater reliability
• Seek faculty and student feedback after the pilot year

Page 81: Graduate Attribute Assessment Workshop

Conclusion
• National collaboration:
  • Group of engineering education specialists and educational developers tasked by the NCDEAS Education Committee and CEAB to develop materials/guidelines
  • Resource sharing via the web
  • Regional collaboration and workshops
  • Publication of processes/plans at Canadian Engineering Education Association (CEEA) conferences
• Training opportunities for curriculum chairs, etc.:
  • Workshop at CEEA (June 7-9 at Queen's University)
  • ABET Institute for Development of Excellence in Assessment Leadership

Page 82: Graduate Attribute Assessment Workshop

Discussion/questions?

Brian Frank
[email protected]