
Kristian L. Gordon, MPH, CHES

December 8, 2011

DEMYSTIFYING EVALUATION

Make It Plain, Make It Useful, Make It Simple, and Make It Happen!

• Describe the different types of evaluation in non-technical terms. (Make It Plain)

• Make the case for program evaluation as an essential element of program sustainability. (Make It Useful)

• Examine the evaluation framework and its application using real-world examples. (Make It Simple)

• Review the South Carolina Online Reporting and Evaluation System (SCORES) and its association with the Obesity State Plan Evaluation. (Make It Happen)

LEARNING OBJECTIVES

Describing the different types of evaluation in non-technical terms

MAKE IT PLAIN

Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.

Patton, M.Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.

WHAT IS EVALUATION?

Engage stakeholders

Describe the program

Focus the evaluation design

Gather credible evidence

Justify conclusions

Ensure use and share lessons learned

SYSTEMATIC PROCESS

Standards: Utility, Feasibility, Propriety, Accuracy

What is evaluation?

• Systematic process

• Purposeful

• Grounded in the realities of “real world” practice

• Investigative

• Discovering

• Revealing

• Enhancer

• Information

MAKING IT PLAIN

• Formative – conducted to refine or improve a program, often by people closely involved with the program.

• Performance Management – focuses on activities, direct products and services delivered, and the results of those products or services.

• Process/Monitoring – checks the program’s implementation and appropriate use of resources for program improvement, accountability, and the development of effective implementation models.

TYPES OF EVALUATION

Outcome Evaluation – assesses changes in knowledge, attitudes, and practices as a result of an implemented change.

Impact Evaluation – assesses what happens as a result of the implemented change such as behavior change.

Summative (Outcome/Impact) – used to determine the merit, worth, or value in a way that leads to making a final evaluative judgment.

TYPES OF EVALUATION CONTINUED

Evaluation Models & Approaches

Goal-Based Approach

Goal-Free Approach

Utilization-Focused Approach

Four-Level Model

Evaluation Designs

Non-experimental

Pre-experimental – (survey administered after the completion of a training)

Quasi-experimental – (nonrandom assignment)

True experimental – (random assignment)

Return on Investment (ROI)

EVALUATION JARGON

QUESTIONS?

Identifying the role of evaluation in sustainability

MAKE IT USEFUL

The existence of structures and processes which allow a program to leverage resources to most effectively implement evidence-based policies and activities over time.

WHAT IS SUSTAINABILITY?

Source: Center for Tobacco Policy Research

Strategic Planning

Funding Stability

Political Support

Partnerships

Organizational Capacity

Program Improvement

Surveillance & Evaluation

Communications

Public Health Impacts

SUSTAINABILITY COMPONENTS

Source: Center for Tobacco Policy Research

• An essential element of program sustainability

• Continuously ask questions that matter

• Use evaluation findings for decision-making and action

• If you don’t systematically collect information on what your program is doing, you will not be able to communicate what is working and why, what improvements need to be made, or how to make the case for future funding.

Preskill, H., & Boyle, S. (2008). A multidisciplinary model of evaluation capacity building. American Journal of Evaluation, 29, 443-459.

WHY EVALUATION?

QUESTIONS?

Identifying how to implement evaluation in your everyday practice

MAKE IT SIMPLE

What is your evaluation experience?

Beginner

Intermediate

Advanced

POLL

WHERE DO I FIND AN EVALUATOR?

Source: University of Wisconsin-Extension Publications

Evaluation Question

Indicator/Measure

Data Source

Data Collection Method

Schedule/Timeline

EVALUATION PLAN TEMPLATE

• Create a logic model to describe the program

• Identify stakeholders and brainstorm the evaluation purpose and what you want to know/learn about the program. (Engage)

• Write questions that reflect what you want to know. (Focus)

– Be specific and clear

– Who, What, When, Where, How, and Why

– Ask, “How will this information be used?”

• Carefully consider what information is needed in order to answer each question. (Data Source)

• Consider how you can collect the information needed and in what format you want the information to be presented. (Data Collection Methods)

– Simultaneously consider the use of the information

• Prepare a plan for how the information will be compiled, analyzed, and interpreted. (Analysis)

• Prepare a plan for how the information will be formatted and disseminated. (Use) A sketch of the full plan structure follows this slide.

GENERATING AN EVALUATION PLAN
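Many teams keep this plan in a spreadsheet. For readers who prefer to track it programmatically, here is a minimal sketch in Python; the field names simply mirror the template columns above, and the example row is hypothetical, loosely modeled on the garden scenario that follows.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluationPlanRow:
    """One row of the evaluation plan template: a question and how it will be answered."""
    question: str            # Evaluation question (what do you want to know?)
    indicators: List[str]    # Measures/indicators (evidence)
    data_source: str         # Who has this information?
    collection_method: str   # How the information will be collected
    schedule: str            # Schedule/timeline

# Hypothetical example row, for illustration only.
plan = [
    EvaluationPlanRow(
        question="How much did it cost to implement the program?",
        indicators=["Cost of supplies", "Volunteer in-kind hours donated"],
        data_source="Program coordinator",
        collection_method="Expense spreadsheet; volunteer sign-in sheet",
        schedule="Quarterly",
    ),
]

for row in plan:
    print(f"{row.question} -> {'; '.join(row.indicators)} ({row.data_source})")
```

Keeping the plan in one structure like this makes it easy to print a checklist of open questions or to pair each question with the data collected for it later.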

Inputs → Activities → Outputs → Short-term Outcomes → Intermediate Outcomes → Long-term Outcomes

BASIC LOGIC MODEL

Process Evaluation covers inputs, activities, and outputs; Outcome Evaluation covers the short-term, intermediate, and long-term outcomes.

A logic model is a visual representation of the program. It shows what is put into the program, what you intend to do, what is expected to result from the program activities, and what is intended to change over the short, intermediate, and long term.
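Before any diagramming tool is involved, a logic model can also be captured as a plain data structure. In the sketch below, only the six column names come from the slide; the example entries are hypothetical.

```python
# A basic logic model as a plain data structure. The keys follow the slide's
# columns; the example entries are hypothetical, not from the presentation.
logic_model = {
    "inputs": ["Funding", "Staff", "Volunteers", "Partners"],
    "activities": ["Build garden beds", "Deliver nutrition lessons"],
    "outputs": ["# of beds built", "# of people reached"],
    "short_term_outcomes": ["Increased knowledge of healthy eating"],
    "intermediate_outcomes": ["Increased fruit and vegetable consumption"],
    "long_term_outcomes": ["Reduced burden of obesity"],
}

# Process evaluation looks at inputs, activities, and outputs;
# outcome evaluation looks at the three outcome components.
for component, examples in logic_model.items():
    print(f"{component}: {'; '.join(examples)}")
```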

SCHOOL/COMMUNITY GARDEN EVALUATION

The Health in Action Coalition has to report to their funder how they used $1,000 to help provide seniors at the Georgetown Senior Center with an environment supportive of healthy living.

What does the funder want to know?

How much did it cost to implement a school/community garden?

What volunteers/partners engaged in the garden planning and implementation?

To what extent is the garden integrated in the school curriculum/senior citizen activities?

What challenges/barriers were encountered during the planning and implementation of the garden?

DEVELOPING EVALUATION QUESTIONS

Evaluation Question: How much did it cost to implement a school/community garden?
Measures/Indicators: Cost of supplies; volunteer in-kind hours donated

Evaluation Question: What volunteers/partners engaged in the garden planning and implementation?
Measures/Indicators: Partners; staff; volunteers; students

Evaluation Question: To what extent is the garden integrated in the school curriculum/community activities?
Measures/Indicators: Sustainability plan for garden upkeep and maintenance; # of classrooms/subjects/grades that use the garden in the curriculum

Evaluation Question: What challenges/barriers were encountered during the planning and implementation of the garden and how were they resolved?
Measures/Indicators: Challenges; solutions

DETERMINING AND DEVELOPING EVALUATION MEASURES AND/OR INDICATORS

Evaluation Question: How much did it cost to implement a school/community garden?
Measures/Indicators: Cost of supplies; volunteer in-kind hours donated
Data Source: Garden Coordinator

Evaluation Question: What volunteers/partners engaged in the garden planning and implementation?
Measures/Indicators: Partners; staff; volunteers; students
Data Source: Garden Coordinator

Evaluation Question: To what extent is the garden integrated in the school curriculum/community activities?
Measures/Indicators: Sustainability plan for garden upkeep and maintenance; # of classrooms/subjects/grades that use the garden in the curriculum
Data Source: Community members; teachers/principals

Evaluation Question: What challenges/barriers were encountered during the planning and implementation of the garden and how were they resolved?
Measures/Indicators: Challenges/barriers encountered; workaround solutions
Data Source: Garden Coordinator

IDENTIFYING DATA SOURCES

Evaluation Question: How much did it cost to implement a school/community garden?
Measures/Indicators: Cost of supplies; volunteer in-kind hours donated
Data Source: Garden Coordinator
Data Collection Methods: Spreadsheet tracking expenses; volunteer hours sign-in sheet (a small sketch of summarizing these follows this slide)

Evaluation Question: What volunteers/partners engaged in the garden planning and implementation?
Measures/Indicators: Partners; staff; volunteers; students
Data Source: Garden Coordinator
Data Collection Methods: Volunteer sign-in sheet; partner list/log

Evaluation Question: To what extent is the garden integrated in the school curriculum/community activities?
Measures/Indicators: Sustainability plan for garden upkeep and maintenance; # of classrooms/subjects/grades that use the garden in the curriculum
Data Source: Community members; teachers/principals
Data Collection Methods: Review of community sustainability plan; review of classroom lesson plans/curriculum; classroom observations

Evaluation Question: What challenges/barriers were encountered during the planning and implementation of the garden and how were they resolved?
Measures/Indicators: Challenges/barriers encountered; workaround solutions
Data Source: Garden Coordinator
Data Collection Methods: Review of work/plan status column

DATA COLLECTION METHODS
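As an illustration of the first row above, the sketch below totals an expense spreadsheet and a volunteer sign-in sheet. The file names, column names, and the dollar value assigned to an in-kind volunteer hour are assumptions for the example, not part of the presentation.

```python
import csv

# Hypothetical file and column names, for illustration only:
# expenses.csv has columns "item" and "cost";
# volunteer_signin.csv has columns "volunteer" and "hours".
def summarize_garden_cost(expense_file="expenses.csv",
                          signin_file="volunteer_signin.csv",
                          in_kind_rate=10.0):  # assumed dollar value per volunteer hour
    """Total cash spent plus an estimated value of donated volunteer time."""
    with open(expense_file, newline="") as f:
        cash = sum(float(row["cost"]) for row in csv.DictReader(f))
    with open(signin_file, newline="") as f:
        hours = sum(float(row["hours"]) for row in csv.DictReader(f))
    return {"cash_spent": cash,
            "volunteer_hours": hours,
            "in_kind_value": hours * in_kind_rate}

# Example use: print(summarize_garden_cost())
```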

What type of evaluation design do the evaluation questions and data collection methods suggest?

Quasi-Experimental

Experimental

Random

Summative

Non-Experimental

POLL

WALKING TRAIL EVALUATION

ESMM Forest Acres has been funded to implement a community walking trail. They want to ensure that the walking trail is implemented as planned and to document how resources are used for this project. They hope to document lessons learned during planning and implementation in order to justify an effective implementation model to future funders.

What do you want to know?

• Who was involved in the development of the walking trail?

• What was the process for getting the walking trail developed in Forest Acres?

• What challenges/barriers were encountered during the planning and implementation of the walking trail?

• How is the walking trail being advertised/promoted in the community?

DEVELOPING EVALUATION QUESTIONS

What type of evaluation does this scenario suggest?

Impact Evaluation

Performance Management

Process Evaluation

Outcome Evaluation

All the above

POLL

Evaluation Question: What assessments or audits were performed to establish the need?
Measures/Indicators: Type of assessment(s) used

Evaluation Question: What resources (financial) were necessary to implement the program?
Measures/Indicators: Amount of money spent on program planning and implementation

Evaluation Question: Who was involved in the development of the walking trail?
Measures/Indicators: Partners; staff; volunteers

Evaluation Question: What was the process for getting the walking trail developed in Forest Acres?
Measures/Indicators: Work plan listing specific activities conducted and successfully completed

Evaluation Question: To what extent was the walking trail implemented as planned?
Measures/Indicators: Work plan documenting challenges/barriers with completing specific work plan activities

Evaluation Question: To what extent is the walking trail being advertised/promoted in the community?
Measures/Indicators: Mechanisms/outlets/venues used to promote the trail

DETERMINING AND DEVELOPING EVALUATION MEASURES AND/OR INDICATORS

Evaluation Question: What assessments or audits were performed to establish the need?
Measures/Indicators: Type of assessment(s) used
Data Source: Walking Trail Coordinator; report of the assessment findings

Evaluation Question: What resources (financial) were necessary to implement the program?
Measures/Indicators: Amount of money spent on program planning and implementation
Data Source: Walking Trail Coordinator

Evaluation Question: What was the process for getting the walking trail developed in Forest Acres?
Measures/Indicators: Tasks carried out to implement the walking trail
Data Source: Walking Trail Coordinator

Evaluation Question: To what extent was the walking trail implemented as planned?
Measures/Indicators: Challenges/barriers encountered during the planning and implementation of the walking trail
Data Source: Walking Trail Coordinator

IDENTIFYING DATA SOURCES

Evaluation Question: What assessments or audits were performed to establish the need?
Measures/Indicators: Type of assessment(s) used
Data Source: Walking Trail Coordinator; report of the assessment findings
Data Collection Methods: Document review

Evaluation Question: What resources (financial) were necessary to implement the program?
Measures/Indicators: Amount of money spent on program planning and implementation
Data Source: Walking Trail Coordinator
Data Collection Methods: Spreadsheet tracking expenses

Evaluation Question: What was the process for getting the walking trail developed in Forest Acres?
Measures/Indicators: Tasks carried out to implement the walking trail
Data Source: Walking Trail Coordinator
Data Collection Methods: Project work/action plan

Evaluation Question: To what extent was the walking trail implemented as planned?
Measures/Indicators: Challenges/barriers encountered during the planning and implementation of the walking trail
Data Source: Walking Trail Coordinator
Data Collection Methods: Review of the project work/action plan

DATA COLLECTION METHODS

Evaluation Questions

Evaluator skills/experience

Resources

Stakeholder preference

Level of acceptable intrusiveness

Accuracy of Information

Availability

Timeliness

Funding requirements

WHAT TO CONSIDER WHEN SELECTING DATA COLLECTION METHODS

Questionnaires, surveys, checklists

Interviews

Document review

Observation

Focus Groups

Expert or peer review

Photographs/videos

Logs

Community Forum

DATA COLLECTION METHODS

Not all evaluation data collection methods require formal analysis.

Quantitative data analysis:

Calculate basic descriptive statistics (frequency, mean, median, mode); a short sketch follows this slide.

Additional statistical analysis – correlations, chi-square, t-test, and analysis of variance (ANOVA)

Qualitative data analysis:

Content analysis

“Pre-set” or “emergent” categories

Multiple reviews to make meaning of the data

DATA ANALYSIS
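The quantitative and qualitative steps above can be demonstrated with Python's standard library alone. The survey ratings, comments, and keyword categories below are hypothetical, and the keyword tally is only a crude stand-in for real content analysis, which requires multiple careful reviews of the data.

```python
import statistics
from collections import Counter

# Quantitative: basic descriptive statistics on hypothetical 1-5 survey ratings.
ratings = [4, 5, 3, 4, 4, 2, 5, 4]
print("n =", len(ratings))
print("frequency:", dict(Counter(ratings)))
print("mean:", statistics.mean(ratings))
print("median:", statistics.median(ratings))
print("mode:", statistics.mode(ratings))

# Qualitative: a crude content analysis that tallies comments against
# "pre-set" categories by keyword.
categories = {
    "access": ["trail", "garden", "close", "nearby"],
    "cost": ["afford", "price", "cost"],
}
comments = ["The trail is close to my house", "I can't afford fresh produce"]
theme_counts = Counter(
    category
    for comment in comments
    for category, keywords in categories.items()
    if any(word in comment.lower() for word in keywords)
)
print("theme counts:", dict(theme_counts))
```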

It is important to plan for how data will be analyzed before it is collected.

Consider:

What data analysis method (e.g., frequency, mean) would be appropriate and why?

Will this data analysis method answer my question?

What do you hope to learn by analyzing the data using this method?

PLANNING FOR DATA ANALYSIS

Quantitative

Number of respondents

Data tables that include frequency, percentage, mean, etc. (a minimal reporting-table sketch follows this slide)

Charts/graphs

Qualitative

Number of people who provided comments

Recurring themes (provide quotes)

Unexpected findings

Ensure quotes remain anonymous unless permission is obtained in advance

DATA REPORTING
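Here is a minimal sketch of the quantitative reporting items above, the number of respondents plus a frequency/percentage table, using hypothetical responses to a single survey question.

```python
from collections import Counter

# Hypothetical responses to one yes/no/unsure survey question.
responses = ["Yes", "Yes", "No", "Yes", "Unsure", "No", "Yes"]
counts = Counter(responses)
n = len(responses)

print(f"Number of respondents: {n}")
print(f"{'Response':<10}{'Frequency':>10}{'Percent':>10}")
for answer, freq in counts.most_common():
    print(f"{answer:<10}{freq:>10}{freq / n:>10.0%}")
```

The same counts feed directly into a chart or graph if a visual format better suits the primary intended users.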

Report how you conducted the evaluation and what was discovered

Not all evaluations will result in a formalized written report.

The format in which the evaluation findings are packaged should be based on the primary intended users and uses.

Group discussions

Executive summaries

Newsletter, bulletin, brief, brochures

Video/Media communication

COMMUNICATING THE EVALUATION PROCESS AND FINDINGS

University of Wisconsin-Extension Publications http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html

http://www.uwex.edu/ces/pdande/resources/index.html

Patton, M.Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage Publications.

RESOURCES

QUESTIONS?

How putting program monitoring data into SCORES helps monitor the implementation of the Options for Action (SC Obesity State Plan)

MAKE IT HAPPEN

Change happens at the local level.

Building local-level capacity to conduct effective program monitoring and evaluation will enable South Carolina to better understand:

• What is going on to address obesity?

• What strategies work and why?

• What strategies do not work and why?

Partners → Best Practice Implementation → Evaluation → Change

PUTTING IT ALL TOGETHER

QUESTION

How do we show what the state of SC is doing to make healthy eating and active living essential to the everyday culture where we live, work, learn, pray, and play?

Monitoring

Evaluation

Tracking, documenting, and summarizing the inputs, activities, and outputs of the South Carolina Obesity State Plan

Examples:

• Types of objectives/activities implemented to work toward policy and environmental change

• Policy adoption

• Environmental changes

PROGRAM MONITORING

Have you used SCORES to input your coalition/organization accomplishments?

Yes

No

POLL

A centralized online reporting system used to document how the South Carolina Obesity State Plan (Options for Action) Objectives and Activities are being implemented by community- and state-level partners.

A consistent framework to guide reporting of nutrition, physical activity, and obesity prevention efforts.

SOUTH CAROLINA ONLINE REPORTING & EVALUATION SYSTEM

Inputs: Partners; Funding; Tools

Activities: Objectives; Activities; Action Types

Outputs: # of People Reached; Resources Leveraged

Short Term: Policy & Environmental Change; Implementation Evaluation

Intermediate Term: Outcome Evaluation

Long Term: Advanced Evaluation

SCORES IN LOGIC MODEL TERMS

Primary Focus of SCORES

OBESITY STATE PLAN EVALUATION

Evaluation Question: What has occurred on the local level to address policy, systems, or environmental change related to nutrition, physical activity, and obesity?

Indicators/Measures: # of OFA objectives being implemented; # and type of local-level policy changes adopted; # and type of local-level environmental changes adopted

Data Sources (who has this information?): ESMMSC Chapters; local coalitions; SCDHEC Region Staff; Working Well Centers of Excellence; partners

Data Collection Methods: Review of SCORES data

Schedule/Timeline: Quarterly

To help develop models of implementation so that other local coalitions can replicate what worked well

To demonstrate what your local community is doing to impact the burden of obesity in South Carolina

To make contributions towards County Profile Sheets

WHY ENTER INTO SCORES?

Enables the state of South Carolina to answer the following questions:

Who is implementing Options for Action?

What is being implemented?

Where is implementation occurring?

What policies and environmental changes are occurring?

What partners are supporting implementation of nutrition, PA, and obesity prevention efforts?

How much money is being leveraged?

What challenges/barriers are being encountered during implementation of OFA Activities?

And many more….

WHAT CAN BE LEARNED FROM SCORES DATA?

Make It Plain – Purposeful and systematic process of understanding what you do.

Make It Useful – Evaluation will help sustain your efforts.

Make It Simple – Ask questions and outline the process for how to answer those questions.

Make It Happen – Evaluation occurring at the local level will pave the way for evaluation at the state level.

SUMMARY

Kristian L. Gordon

Evaluation Coordinator

gordonkl@dhec.sc.gov

803-545-4210

QUESTIONS

SURVEY MONKEY: 4 brief questions