Tips for Writing SACSCOC Academic Program Assessment Reports Office of Planning, Institutional Research, and Assessment (PIRA) Fall 2015


Page 1

Tips for Writing SACSCOC Academic Program Assessment Reports

Office of Planning, Institutional Research, and Assessment (PIRA)

Fall 2015

Page 2

Relation Between Existing Assessment and SACSCOC Reports

• Ideally you already assess students’ learning
• Ideally you already improve your program to increase student achievement

Page 3

Relation Between Existing Assessment and SACSCOC Reports

• Program Assessment Reports should describe these activities using SACSCOC guidelines and terminology

• Data or other findings that measure student learning should be included, as should interpretation of findings

• But don’t create a special data collection process for SACSCOC; just summarize existing processes

• Initiatives to improve should be included

Page 4

Recent Changes

• SACSCOC reviewers seek evidence of improvement based on initiatives
• Add material on Program Outcomes, such as:
  • Retention/graduation rates, average time to degree
  • Data from the Graduating Senior Survey (GSS) or Graduating Master’s Student Survey (GMSS)
  • Job placement (from Grad Program Review profile)
  • Graduate Program Review, professional accreditation
  • Other measures of program success (e.g., quality, effectiveness, interdisciplinary opportunities)

Page 5

Tips for Writing an Efficient Report

• Study resources and template before starting
• Use existing assessments of student work and program outcomes
• Start with your existing assessments (measures) and then write outcomes to go with them

Page 6

Tips for Writing an Efficient Report

• Consider developing a rating grid with 1-2 items per learning outcome—see p. 8 of Resources
• Contact PIRA for a summary of results from the Graduate School Rating Grid
• Use the Graduating Senior Survey (GSS) or Graduating Master’s Student Survey (GMSS) summary

Page 7

We Offer Help (Seriously)

Page 8

Show Reviewers Clear Evidence that You Have . . .

• defined desired mission, student learning outcomes (SLOs), and related measures,

• collected and evaluated results from ongoing assessment,

• undertaken actions to continuously improve learning.

Help reviewers find key components quickly & easily

[Assessment cycle diagram: Define SLOs & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]

Page 9

Use PIRA Template for Key Elements

• Mission and program outcomes (objectives)
• Student learning outcomes (3+) and related measures (2+ each, 1 should be direct)
• Assessment findings: results for measures of student learning from multiple years (if feasible)
• Discussion of results: faculty review of findings, including whether performance of students meets expectations
• Discussion of changes: initiatives to improve student learning and/or program
• Evidence continuous improvement has occurred
• Clear narrative and organization to make compliance obvious (does everything make sense?)

Page 10

[Diagram: three Student Learning Outcomes (with definitions), each linked to direct measures (dark blue: exam questions that clearly relate to outcomes; capstone reviewed with faculty-developed rating grid; Graduate School Dissertation & Thesis Rating Grid; other direct measures) and indirect measures (light blue: course evaluations; graduating student surveys; other indirect measures), feeding a Discussion for Continuous Improvement — 3) Faculty Review: Do findings show continuous improvement? 4) Program Improvement: What changes should be made? 5) Assessment Findings: Data for EACH measure for 2+ years]

• For each outcome, 2-3 measures are required; at least one must be a direct measure (direct = dark blue, indirect = light blue)

• A single measure (e.g., rating grid) can assess more than one outcome.

• Build operationally realistic assessments into your annual departmental calendar.

Program Assessment at the University of Miami, Office of Planning, Institutional Research, and Assessment (Rev 3-2013)

• Assessment findings should assist in identifying areas for improvement within programs.

• Identified and resolved changes should be reflected in the discussion section of reports to PIRA.

• A program should have 3-5 measurable outcomes, each tied to the program mission.

• Student learning outcomes relate to attainment of knowledge, skills, behaviors, or values.

• Common outcomes include: knowledge of theory and research in the field, ability to think critically about the field of study, oral and written communication skills.

Your mission statement and program outcomes (objectives) should align with the mission of the University and your program’s strategic plan.

[Diagram labels: Mission Statement & Program Outcomes/Objectives; Assessment Measures]

Page 11

Your Mission Statement Should

• tie to the UM Mission: “The University of Miami’s mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world.” and your strategic plan
• describe program outcomes/objectives (e.g., prepare graduates to . . ., research, service)

Page 12

When Writing Student Learning Outcomes:

• Describe reasonable expectations for student learning (knowledge, skills, values, and behaviors)
• Include at least 3 SLOs, each with correct structure and language
• Make SLOs easy to identify (e.g., use bolding & numbering) and clearly stated (follow expected structure)

Most common error: Programs describe what they do. Solution: Describe what you want students to learn.

Page 13

Possible SLOs

• Students should demonstrate an overall knowledge and understanding of the core concepts in [insert program here].

• We want students to develop strong communication skills, including writing skills.

• Our doctoral students should be able to conduct independent research worthy of publication.

• Graduates should understand and be able to work with the systems and hardware components that support software.

Page 14

When Writing Measures:

• Ensure each SLO has 2+ measures
• Include at least one direct measure (e.g., rating grid, exam or project grade) and add any indirect measures (e.g., student surveys, course evals)—see p. 4 of Resources
• Instead of course grades or pass rates, substitute project grades (plus description relating exam/project to SLO)
• Consider rating grids, since they are easier to trend over time and one grid can be used for all SLOs—see pp. 8 & 9 of Resources

Most common error: Programs describe how faculty provide feedback to help individual students. Solution: Describe aggregate measures used to evaluate student learning.

Page 15

Examples of Bad and Good Measures

Instead of . . .
“Students are given tests…”
write
“Grades from tests that measure the students’ ability to [describe what test is for] will be used to assess [SLO].”

Instead of . . .
Table of grades for course
use
Table of grades for final paper (plus description of assignment using language of SLO)

Page 16

Good Graduate Program Measures

• Graduate School Rating Grid at final defense; fast and easy (PIRA will analyze—see pp. 8 & 9 of Resources)
• Same rating grid, but used for proposal defense—use same standards for both to show students’ progress
• Qualifying/comprehensive exam construct linked to SLO
• Rating grids from supervisors of TAs, RAs, GAs, internships
• Ratings from audience for presentations on student research
• Number of publications, conference presentations, grants
• Graduating Master’s Student Survey (items similar to ones on p. 10 of Resources, available from PIRA)

Page 17

Good Undergraduate Measures

• Graduating Senior Survey—very easy (PIRA/Toppel collect, analyze, send); small programs should use combined years rather than trends—see p. 10 of Resources
• Rating grids for capstone papers, projects, etc. (see p. 8 of Resources for sample)
• Grades from items on tests or assignments that directly measure a given SLO
• Rating grids from supervisors of internships, practica
• Additional items relating to improvement in each SLO that are added to faculty evaluations or final exams
• Existing items on the New General Form for faculty/course evaluations that relate to critical thinking or communicating on the subject

Page 18

When Writing Assessment Findings:

• Ensure each measure has corresponding findings (and no findings without an earlier measure)
• Insert the corresponding outcome/measure as a heading for each set of results
• Include multiple years, or insert an explanation that data are not provided for a new program/revised measures:

“As part of the major three-year “continuous improvement update” of our program assessment report in 2014, we decided to start using rating grids in conjunction with XXX [e.g., senior projects] to allow us to more easily monitor changes in student learning over time. Because this is a new measure, we have data for only the 2014-15 academic year, but we will continue to update the data in upcoming years to monitor continuous improvement in student learning.”

Page 19

When Writing Assessment Findings:

• If a measure is a narrative, provide a summary plus sample evaluations or insert statement (see p. 6 of Resources)
• Ensure results are presented clearly (tables)
• Decide if an appendix of findings, survey instrument, etc. will be necessary (usually not)
• Put findings related to Program Outcomes under a new sub-heading: Findings Relating to Program Outcomes

Most common errors: Programs simply state they evaluate student learning, or omit measure(s). Solution: Provide evidence of assessment activity (table/text summary of findings).

Page 20

When Writing the Discussion Section:

• Statement that faculty reviewed findings as a group (e.g., dates/minutes of meeting)
• Discussion of whether faculty think students demonstrated desired level of learning
• Initiatives you implemented to improve student learning—see p. 6 of Resources
• Whether improvements seem to be working and which SLO is affected

Page 21

In Discussion Section

Most common errors:
o No statement indicating faculty reviewed
o No statement of how faculty think students are doing
o No mention of which SLO is affected by improvement initiatives
o No mention of whether there has been improvement over time

Solutions include:
o Dates or minutes of faculty meetings
o Evaluation of how well each SLO is achieved
o Which SLO will benefit from improvement (if relevant)
o Effectiveness of prior initiatives and how learning will be improved

Page 22

Help Reviewers Find What They Need

• Add bold, indents, and/or underlines to assist reviewers
• Nest measures under related SLOs
• Label/nest Outcomes/Measures in Findings section
• Remove yellow template instructions
• Expand acronyms (e.g., RSMAS, PRISM)
• Spell check and fix typos