Evaluation: Practical Evaluation (Michael Quinn Patton)

  • Slide 1
  • Evaluation
  • Slide 3
  • Practical Evaluation, Michael Quinn Patton
  • Slide 4
  • Systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products, to be used by specific people to reduce uncertainties, improve effectiveness, and make decisions
  • Slide 5
  • What have we done? How well have we done it? Whom have we done it to? How much have we done? How effective has our program been? What could we do better or differently?
  • Slide 8
  • Benefits of Program Evaluation: reflect on progress (where we're going, where we're coming from); improve programs; influence policy makers and funders to ensure funding and sustainability; build community capacity and engage the community; share what works and what doesn't with others; strengthen accountability
  • Slide 9
  • 4 Standards: Useful, Feasible, Proper, Accurate (Joint Committee on Standards for Educational Evaluation, 1994)
  • Slide 10
  • Useful: Will results be used to improve practice or allocate resources better? Will the evaluation answer stakeholders' questions?
  • Slide 11
  • Feasible: Does the political environment support this evaluation? Do you have the personnel, time, and monetary resources to do it in-house? Do you have resources to contract with outside consultants? If you can't evaluate all parts of the program, what parts can you evaluate?
  • Slide 12
  • Proper: Is your approach fair and ethical? Can you keep individual responses confidential?
  • Slide 13
  • Accurate: Are you using appropriate data collection methods? Have interviewers been trained if you are using more than one? Have survey questions been tested for reliability and validity?
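Reliability testing has a standard quantitative form. Below is a minimal Python sketch of one common internal-consistency check, Cronbach's alpha, run on pilot survey responses; the pilot data and the 0.7 rule of thumb are illustrative assumptions, not from the slides.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Rows = respondents, columns = survey items on a common scale."""
    k = responses.shape[1]                         # number of items
    item_vars = responses.var(axis=0, ddof=1)      # per-item variance
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 5 respondents answering 4 Likert-scale items.
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
# Values near or above 0.7 are often taken as acceptable consistency.
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```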
  • Slide 15
  • Step 1: Engage Stakeholders
  • Slide 16
  • Those involved in program operations: administrators, managers, staff, contractors, sponsors, collaborators, coalition partners, funding officials
  • Slide 17
  • Those served or affected by the program: clients, family members, neighborhood organizations, academic institutions, elected officials, advocacy groups, professional organizations, skeptics, opponents
  • Slide 18
  • Primary intended users of the evaluation: those in a position to do or decide something regarding the program. In practice, usually a subset of all the stakeholders already listed.
  • Slide 19
  • Step 2: Describe the Program
  • Slide 20
  • Mission; need; logic model components (inputs, outputs, outcomes); objectives (outcome and process); context (setting, history, environmental influences)
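A program description along these lines can be captured as a simple structure. The sketch below arranges the slide's components into a Python dict; every entry is a hypothetical example for an imagined program, not drawn from the deck.

```python
# Hypothetical program description organized by the components above.
logic_model = {
    "mission": "Reduce teen tobacco use in the county",
    "need": "Teen smoking rate is above the state average",
    "inputs": ["staff", "funding", "school partnerships"],
    "outputs": ["classroom sessions delivered", "students reached"],
    "objectives": {
        "process": "Deliver 90% of planned sessions on schedule",
        "outcome": "Cut the teen smoking rate by 5 points in 3 years",
    },
    "context": ["urban school district", "history of prior campaigns",
                "new statewide advertising restrictions"],
}

for component, value in logic_model.items():
    print(f"{component}: {value}")
```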
  • Slide 21
  • Step 3: Focus the Design
  • Slide 22
  • Goals of Focusing: the evaluation assesses the issues of greatest concern to stakeholders and, at the same time, uses time and resources as efficiently as possible
  • Slide 23
  • Questions to be answered to focus the evaluation: What questions will be answered? (i.e. what is the real purpose? What outcomes will be addressed?) What process will be followed? What methods will be used to collect, analyze, and interpret the data? Who will perform the activities? How will the results be disseminated?
  • Slide 24
  • Step 4: Gather Credible Evidence
  • Slide 25
  • Data must be credible to the evaluation audience: data-gathering methods are reliable and valid; data analysis is done by credible personnel; triangulation (applying different kinds and sources of data) is used to answer the question
  • Slide 26
  • Indicators translate general program concepts into specific measures. Sample indicators: participation rates, client satisfaction, changes in behavior or community norms, health status, quality of life, expenditures
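As a concrete illustration of translating a concept into a measure, the sketch below computes two of the sample indicators (participation rate and client satisfaction) from program records; the record format and values are hypothetical.

```python
# Hypothetical program records; one dict per enrolled client.
records = [
    {"enrolled": True, "attended": True,  "satisfaction": 4},
    {"enrolled": True, "attended": False, "satisfaction": None},
    {"enrolled": True, "attended": True,  "satisfaction": 5},
    {"enrolled": True, "attended": True,  "satisfaction": 3},
]

attended = [r for r in records if r["attended"]]
participation_rate = len(attended) / len(records)  # participation-rate indicator

scores = [r["satisfaction"] for r in attended if r["satisfaction"] is not None]
mean_satisfaction = sum(scores) / len(scores)      # client-satisfaction indicator

print(f"Participation rate: {participation_rate:.0%}")      # 75%
print(f"Mean satisfaction (1-5): {mean_satisfaction:.1f}")  # 4.0
```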
  • Slide 27
  • Data Sources: routine statistical reports (census, vital statistics, NHANES); program reports (log sheets, service utilization, personnel time sheets); special surveys
  • Slide 28
  • Sources of Data: people (participants, staff, key informants, representatives of advocacy groups); documents (meeting minutes, media reports, surveillance summaries); direct observation
  • Slide 29
  • Selected Techniques for Gathering Evidence
  • Slide 30
  • Step 5: Justify Conclusions
  • Slide 31
  • Justification Steps: What are the findings? What do the findings mean? How do the findings compare with the objectives for the program? What claims or recommendations are indicated for program improvement?
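Comparing findings against program objectives becomes mechanical once both are expressed as measures. A minimal sketch, assuming the hypothetical targets and observed values below (they continue the indicator example above):

```python
# Hypothetical targets (from program objectives) and observed findings.
objectives = {"participation_rate": 0.80, "mean_satisfaction": 4.0}
findings = {"participation_rate": 0.75, "mean_satisfaction": 4.2}

for measure, target in objectives.items():
    observed = findings[measure]
    status = "met" if observed >= target else "not met"
    print(f"{measure}: observed {observed} vs. target {target} -> {status}")
```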
  • Slide 32
  • Step 6: Ensure Use and Share Lessons Learned
  • Slide 33
  • Evaluations that are not used or are inadequately disseminated are simply not worth doing. The likelihood that evaluation findings will be used increases through deliberate planning, preparation, and follow-up. (Practical Evaluation of Public Health Programs, Public Health Training Programs)
  • Slide 34
  • Activities to Promote Use and Dissemination: designing the evaluation from the start to achieve intended uses; preparing stakeholders for eventual use by discussing how different findings will affect program planning; scheduling follow-up meetings with primary intended users; disseminating results using targeted communication strategies
  • Slide 36
  • Group Work: Describe the ideal stakeholder group for your project evaluation. What questions will be answered? (Include the questions inherent in your objectives.) What data will be collected, analyzed, and interpreted? How will this get done, and by whom? How will you disseminate your findings?