
Page 1: Tips for Writing SACS Program Assessment Reports

Tips for Writing SACS Program Assessment Reports

Office of Planning, Institutional Research, and Assessment (PIRA)

March 2013

Page 2: Tips for Writing SACS Program Assessment Reports

• Ideally you already assess students’ learning
• Ideally you already improve your program to increase student achievement
• SACS reports should simply describe these activities
• Reports should follow SACS guidelines and use SACS terminology to assist SACS reviewers
• Data or other findings that measure student learning should be included, as should interpretation of findings
• But don’t create a special data collection process for SACS; just summarize existing processes
• Initiatives to improve should be included

Page 3: Tips for Writing SACS Program Assessment Reports

Show reviewers that your program has:
• defined desired mission, student learning outcomes (SLOs), and related measures,
• collected and evaluated results from ongoing assessment,
• undertaken actions to continuously improve learning.

Help reviewers find key components quickly & easily.

[Cycle diagram: Define SLOs & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]

Page 4: Tips for Writing SACS Program Assessment Reports

• mission and program outcomes (objectives)
• student learning outcomes (3+) and related measures (2+ each, 1 should be direct)
• assessment findings: results for measures of student learning from multiple years (if feasible)
• discussion of results: faculty review of findings, including whether performance of students meets expectations
• discussion of changes: initiatives to improve student learning and/or program
• evidence continuous improvement has occurred (new for 2013)
• clear narrative and organization to make compliance obvious (does everything make sense?)

Page 5: Tips for Writing SACS Program Assessment Reports

[Diagram: Mission Statement & Program Outcomes/Objectives feed Student Learning Outcomes 1, 2, and 3 (each with its definition). Each outcome is tied to Assessment Measures. Direct measures (dark blue): exam questions that clearly relate to outcomes; capstone reviewed with a faculty-developed rating grid; Graduate School Dissertation & Thesis Rating Grid; other direct measures. Indirect measures (light blue): course evaluations; graduating student surveys; other indirect measures. Results flow into a Discussion for Continuous Improvement: 1) Assessment Findings: data for EACH measure for 2+ years; 3) Faculty Review: do findings show continuous improvement?; 4) Program Improvement: what changes should be made?]

• For each outcome 2-3 measures are required; at least one must be a direct measure (direct - dark blue, indirect - light blue)
• A single measure (e.g., rating grid) can assess more than one outcome.
• Build operationally realistic assessments into your annual departmental calendar.

Program Assessment at the University of Miami, Office of Planning, Institutional Research, and Assessment (Rev 3-2013)

• Assessment findings should assist in identifying areas for improvement within programs.

• Changes that have been identified and resolved should be reflected in the discussion section of reports to PIRA.

• A program should have 3-5 measurable outcomes, each tied to the program mission.

• Student learning outcomes relate to attainment of knowledge, skills, behaviors, or values.

• Common outcomes include: knowledge of theory and research in the field, ability to think critically about the field of study, oral and written communication skills.

Your mission statement and program outcomes (objectives) should align with the mission of the University and your program’s strategic plan.


Page 6: Tips for Writing SACS Program Assessment Reports

• tie it to the UM Mission: “The University of Miami’s mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world.” and to your strategic plan

• describe program outcomes/objectives (e.g., prepare graduates to . . ., teach gen-ed courses, research, service)

Page 7: Tips for Writing SACS Program Assessment Reports

• describe reasonable expectations for student learning (knowledge, skills, values, and behaviors)
• include at least 3 SLOs, each with correct structure and language
• make SLOs easy to identify (e.g., use bolding & numbering) and clearly stated (follow the expected structure)

Most common error: Programs describe what they do. Solution: Describe what you want students to learn.

Page 8: Tips for Writing SACS Program Assessment Reports

• Start with words like Students… Graduates… We want students to…

• Include verbs or phrases like will demonstrate… should have ability to … will analyze and synthesize…

• Include words like …breadth of understanding of… …mastery of… …a capacity for…

• Describe expected competence (e.g., broad knowledge, communication, critical thinking)

Page 9: Tips for Writing SACS Program Assessment Reports

Instead of
“Help students develop research skills by providing opportunities for supervised laboratory practice.”
write
“Graduates will demonstrate the ability to conduct laboratory research.”
----------------------
Instead of
“Students will participate in interpersonal, interpretative, and presentational communicative activities and be guided in the development of literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types.”
write
“Students will demonstrate literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types.”

Page 10: Tips for Writing SACS Program Assessment Reports

• Students should demonstrate an overall knowledge and understanding of the core concepts in [insert program here], including the essential skills to conduct research in the [insert program here].

• We want students to graduate with strong written [and/or oral] communication skills.

• Our doctoral students should be able to conduct independent research worthy of publication.

• Graduates should have an understanding and capability to work with the systems and hardware components that support software.

• Students should demonstrate critical thinking, including the ability to analyze, synthesize, and draw valid conclusions.

Page 11: Tips for Writing SACS Program Assessment Reports

• ensure each SLO has 2+ measures
• ensure at least 1 direct measure (objective outside source—see p. 4 of Resources)
• ensure any indirect measure (usually a self-reported measure) is accompanied by a direct one—see p. 4 of Resources
• instead of course grades or pass rates (which SACS discourages), substitute project grades (plus a description relating the exam/project to the SLO)
• consider rating grids, since they are easier to trend over time and 1 grid can be used for all SLOs—see pp. 8 & 9 of Resources and the sketch below

Most common error: Programs describe how faculty provide feedback to help individual students. Solution: Describe aggregate measures used to evaluate student learning.
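
As a minimal illustration of the “trend over time” point above, here is a sketch in Python (the records, the 1-5 scale, and the layout are hypothetical assumptions for illustration, not a PIRA tool or format) showing how scores from one rating grid can be aggregated by SLO and year:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical rating-grid records: (academic year, SLO, faculty score on a 1-5 grid).
    ratings = [
        ("2011-12", "SLO 1", 3), ("2011-12", "SLO 1", 4), ("2011-12", "SLO 2", 2),
        ("2012-13", "SLO 1", 4), ("2012-13", "SLO 1", 5), ("2012-13", "SLO 2", 3),
    ]

    # Group scores by (SLO, year) so each outcome can be trended across years.
    by_slo_year = defaultdict(list)
    for year, slo, score in ratings:
        by_slo_year[(slo, year)].append(score)

    # Print the aggregate (mean) rating per SLO per year; a rising mean across
    # years is one simple form of evidence of continuous improvement.
    for slo, year in sorted(by_slo_year):
        scores = by_slo_year[(slo, year)]
        print(f"{slo}, {year}: n={len(scores)}, mean={mean(scores):.2f}")

Because every outcome shares the same grid and scale, the same few lines of aggregation cover all SLOs, which is part of what makes a single grid easy to maintain year over year.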

Page 12: Tips for Writing SACS Program Assessment Reports

Instead of
“Students are given tests…”
write
“Grades from tests that measure the students’ ability to [describe what test is for] will be used to assess [SLO].”
----------------------
Instead of a table of grades for the course,
use a table of grades for the final paper
(plus a description of the assignment using the language of the SLO)

Page 13: Tips for Writing SACS Program Assessment Reports

• Graduate School Rating Grid at the final defense (programs are already supposed to be using this); fast and easy (PIRA will analyze—see pp. 8 & 9 of Resources)

• Same rating grid, but used for the proposal defense (and/or for each year in the program)—use the same standards for both to show students’ progress

• Qualifying/comprehensive exam (but need to explain what’s tested so the link to the SLO is clear)

• Rating grids from supervisors of TAs, RAs, GAs, internships
• Ratings from the audience for presentations on student research
• Number of publications, conference presentations, grants
• Graduating Master’s Student Survey (items similar to the ones on p. 10 of Resources, available from PIRA)

Page 14: Tips for Writing SACS Program Assessment Reports

• Graduating Senior Survey—very easy (PIRA/Toppel collect, analyze, send); small programs should use combined years (green column) rather than trends (orange columns)—see p. 10 of Resources

• Rating grids for capstone papers, projects, etc. (see p. 8 of Resources for sample you can adapt)

• Grades from items on tests or assignments that directly measure a given SLO

• Rating grids from supervisors of internships, practica
• Additional items relating to improvement in each SLO that are added to faculty evaluations or final exams
• Existing items on the New General Form for faculty/course evaluations that relate to critical thinking or communicating on the subject

Page 15: Tips for Writing SACS Program Assessment Reports

• ensure each measure has corresponding findings (and no findings without an earlier measure)

• insert the corresponding outcome/measure as a heading for each set of results

• ensure multiple years of data, or insert an explanation that data are not provided for a new program/revised measures:

“As part of the major three-year “continuous improvement update” of our program assessment report in 2012, we decided to start using rating grids in conjunction with XXX [e.g., senior projects] to allow us to more easily monitor changes in student learning over time. Because this is a new measure, we have data for only the 2012-13 academic year, but we will continue to update the data in upcoming years to monitor continuous improvement in student learning.”

Page 16: Tips for Writing SACS Program Assessment Reports

• if a measure is a narrative rather than data, ensure a summary plus sample evaluations, or insert a statement (see p. 6 of Resources)
• ensure results are presented clearly (tables)—see the sample layout below
• decide if an appendix of data, survey instrument, etc. is necessary (usually not)
• put findings related to Program Outcomes (new for 2013) under a new sub-heading: Findings Relating to Program Outcomes

Most common errors: Programs simply state they evaluate student learning or omit measure(s). Solution: You should provide evidence of assessment activity (table/text summary of findings).
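
For instance, a findings table for one measure might be laid out as follows (a hypothetical sketch: the SLO wording reuses the sample from Page 9, and the numbers are illustrative only, not actual program data):

    SLO 1: Graduates will demonstrate the ability to conduct laboratory research.
    Measure 1a (direct): faculty rating grid for capstone projects (1-5 scale)

                        2011-12    2012-13
    Students rated         14         17
    Mean rating           3.4        3.8
    Rated 4 or higher     43%        59%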

Page 17: Tips for Writing SACS Program Assessment Reports

• a statement that faculty as a group reviewed the findings (e.g., dates/minutes of meeting)

• discussion of whether faculty think students demonstrated the desired level of learning

• initiatives you implemented to improve student learning—see p. 6 of Resources

• whenever possible, an indication of which SLO is affected

• whether improvements seem to be working (new for 2013)

Page 18: Tips for Writing SACS Program Assessment Reports

Most common errors:
o No statement indicating faculty reviewed
o No statement of how faculty think students are doing
o No mention of which SLO is affected by improvement initiatives
o No mention of whether there has been improvement over time

Solutions include:
o Dates or minutes of faculty meetings
o Evaluation of how well each SLO was achieved
o Which SLO will benefit from improvement (if relevant)
o Effectiveness of prior initiatives and how learning will be improved

Page 19: Tips for Writing SACS Program Assessment Reports

• Add bold, indents, and/or underlines to assist reviewers
• Nest measures under related SLOs
• Label/nest Outcomes/Measures in Findings section
• Include discussion of improvements/changes in Discussion section, not in SLO or Findings sections
• Remove yellow template instructions
• Use SACS terminology (Student Learning Outcomes, Measures of SLOs, etc.)
• Delete extraneous text and data (clarity more important than length)
• Expand acronyms (e.g., RSMAS, PRISM)
• Spell check; fix typos

Page 20: Tips for Writing SACS Program Assessment Reports

• Study the resources and checklist before starting
• Use existing assessments and available student work whenever possible (saves time and effort)
• Consider developing a rating grid with 1-2 items per learning outcome—see p. 8 of Resources
• Contact PIRA for a summary of results from the Graduate School Rating Grid; email PIRA scanned forms for students we don’t have
• Use the Graduating Senior Survey (GSS) or Graduating Master’s Student Survey (GMSS) summary
• Consider starting with measures and then writing SLOs to go with them, instead of the traditional order

Page 21: Tips for Writing SACS Program Assessment Reports

• Need to provide evidence of improvement based on initiatives, wherever possible (though sometimes hard to see, especially with small Ns and short time periods)

• New emphasis from SACS: Need to add material (Findings, Improvements, and Discussion) related to Program Outcomes (NOT Processes). Examples:
  • Retention/graduation rates, average time to degree (from PIRA)
  • Ratings from GSS or GMSS (from PIRA)
  • Job placement (from Grad Program Review profile)
  • Graduate program review, professional accreditation
  • Efforts to improve quality, interdisciplinarity, etc.