Evaluating Innovation
Part 1 of 2

December 16, 2011
Cricket Mitchell, PhD
CIMH Consultant

Introduction

• CIMH developed a two-part eLearning course on evaluating MHSA Innovation projects
  – Course 1 reviews the Guidelines for Innovation and outlines the foundation for solid evaluation
  – Course 2 reviews strategies for measurement and determining success
• Today's webinar will 'hit the highlights'


Overview

• Having a Solid Foundation for Evaluating Innovation Projects
  – Articulating program elements
  – Articulating program goals
• Measuring Innovation program goals
• Determining the most appropriate points of comparison for findings
• Linking program elements with the achievement of program goals
  – Building and Utilizing an Innovation Logic Model
• How and when to apply various evaluation strategies


Having a Solid Foundation for Evaluating Innovation Projects

• Very broadly defined, evaluation is the systematic collection and assessment of information (data) to provide useful feedback about an activity or an event
  – Within this broad definition we can identify two prongs, or two approaches, to evaluation:
    • Process evaluation, and
    • Outcome evaluation


Having a Solid Foundation for Evaluating Innovation Projects

• Process evaluation
  – Investigates the process of delivering the activity or service or event
  – Incorporates data collection from multiple perspectives or opinions, at multiple points in time
  – The data collected are largely qualitative, yet can be summarized quantitatively
    • e.g., 75% of participants believed that having a peer assist them with required paperwork greatly improved their ability to access necessary supportive services
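To illustrate how coded qualitative feedback can be summarized quantitatively, here is a minimal Python sketch; the data, field names, and wording are hypothetical, not from the CIMH materials.

```python
# Hypothetical coded responses from questionnaires or focus groups:
# True = participant indicated that peer assistance improved access.
responses = [
    {"participant": 1, "peer_help_improved_access": True},
    {"participant": 2, "peer_help_improved_access": True},
    {"participant": 3, "peer_help_improved_access": False},
    {"participant": 4, "peer_help_improved_access": True},
]

endorsed = sum(r["peer_help_improved_access"] for r in responses)
pct = 100 * endorsed / len(responses)
print(f"{pct:.0f}% of participants reported that peer assistance improved access")
```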


Having a Solid Foundation for Evaluating Innovation Projects

• Outcome evaluation
  – Investigates the extent to which an activity or service or event achieved the intended goals or targets
  – Can incorporate data from multiple perspectives
  – Time points for data collection are generally at the beginning (pre-) and at the end (post-) of an activity or service or event
    • Sometimes outcome evaluation involves data collection only at the end


Articulating Program Elements

• Who is the target population?

• What are the specific program activities?
  – What are the program elements?
  – What are the processes or strategies by which the elements are achieved?

• What is the timeline?

• Who delivers the activities?

• Do any of the above vary across recipients or target populations?


Articulating Program Elements

• Testing something new

– Is there more than one new practice being tested?

• If so, steps for articulating program elements must be repeated for each


Articulating Program Elements

• Testing an adaptation of an effective mental health practice
  – What is the adaptation? Where is the change being made?
  – Is there more than one change or adaptation being tested?
  – Will the effective practice also be delivered without the adaptation?


Articulating Program Elements

• Bringing something new to mental health that has been effective in another field
  – What differences will there be as a result of delivering this in mental health?
    • Will there be changes in the target population, in the program elements, in the timeline, or in the staffing?
    • Is there more than one change necessary or indicated as a result of delivering this in mental health?


Articulating Program Goals

• Regardless of the Innovation approach, program learning goals related to process as well as outcomes should be articulated
  – As previously described, Innovation evaluation combines two prongs of evaluation, process and outcome


Articulating Program Goals

• One approach counties will apply in Innovation evaluation focuses on process goals, to learn about their novel and creative innovation activities
  – To understand the "how, what, when, and who"


Articulating Program Goals

• Counties will also employ an approach to Innovation evaluation that is focused on outcome goals, to learn about the impact of their novel and creative innovation activities on mental health outcomes


Articulating Program Goals

• Process learning goals and outcome learning goals combine to inform a broader set of findings and recommendations from Innovation programs


Diverse stakeholder input is important in all aspects of Innovation evaluation


Measurement

• Measuring process learning goals

– Most likely form of measurement will be subjective

• e.g., project-developed questionnaires, interviews, focus groups


Measuring Process Goals

• Include, when possible and appropriate: program personnel (at various levels), recipients or members of target population, family members, broad range of stakeholders – try to view the program from a 360° perspective and be broad and inclusive in gathering subjective data

• Ensure that questions included (in questionnaires, interviews, and/or focus groups) cover the breadth of the learning goals – those at the individual, program, and/or system level


Measurement

• Most likely forms of measurement for outcome learning goals are objective
  – Using objective measurements provides opportunities for points of comparison (which will be discussed in further detail...)


Measuring Outcome Goals

• Standardized questionnaires may exist that assess the intended outcomes of the Innovation program
  – If adapting an existing, effective mental health practice, best to use the same standardized questionnaires used in the original research, if applicable
  – If testing a new practice, or testing the application of an effective practice from another field to mental health, a review of the literature to identify standardized questionnaires that assess the desired goals/outcomes of the program would be worthwhile


Measuring Outcome Goals

• Examples of Standardized Questionnaires
  – Innovation programs that have improved mental health as part of their outcome learning goals could consider a self-report measure of general mental health functioning, e.g.:
    • The Behavior and Symptom Identification Scale (BASIS-24®)
      – Adult self-report for ages 18 and older
    • The Youth Outcome Questionnaires (YOQ® and YOQ-SR®)
      – Parent or caregiver report for ages 4-18
      – Youth self-report for ages 12-18


Measuring Outcome Goals

• Behavioral indicators may be available in existing program, county, or state databases that would reflect the intended outcomes of the Innovation program
  – If adapting an existing, effective mental health practice, best to use the same behavioral indicators used in the original research, if applicable
  – If testing a new practice, or testing the application of an effective practice from another field to mental health, a review of existing data sources within the program, county, or state to identify those that may reflect the desired outcomes of the program would be worthwhile
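As a minimal sketch of working with such a data source, the following Python/pandas snippet counts one behavioral indicator by year; the file name, column names, and event label are hypothetical placeholders, not an actual program, county, or state data system.

```python
import pandas as pd

# Hypothetical export of event-level records (one row per event per client).
events = pd.read_csv("county_events.csv", parse_dates=["event_date"])

# Count one behavioral indicator, e.g. psychiatric hospitalizations, per year.
hosp = events[events["event_type"] == "psychiatric_hospitalization"]
per_year = hosp.groupby(hosp["event_date"].dt.year)["client_id"].count()
print(per_year)
```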


Measuring Outcome Goals

• Examples of Behavioral Indicators
  – Innovation programs that have improved aspects of behavioral functioning as part of their outcome learning goals could consider a variety of behavioral indicators, e.g.:
    • Independent living
    • Foster care placement
    • Enrollment in educational or vocational program
    • Psychiatric hospitalization
    • Emergency room visits for mental health and/or substance abuse issues
    • Incarceration


Measuring Outcome Goals

• In addition to objective measures, subjective measures may also be used to assess the extent to which outcome learning goals of the Innovation program have been met according to the perceptions of a wide range of stakeholders – think about that 360° perspective
  – Ensure diverse inclusion and cultural relevance when designing questions that relate to subjective opinion of service or practice goals


Measurement

• Timing of Measurement
  – Program learning goals should be assessed frequently
    • Quarterly measurement may prove most informative
  – Findings at one measurement point may indicate the need for modifications and/or additions to future learning goals (and program elements)


Timing of Measurement

• Learning goals may be short-term as well as long-term, and measurement for each goal may or may not occur within the INN program's timeline (e.g., long-term desired outcomes)
  – Innovation programs that implement mental health practices or services that have a discrete beginning and a clear end should employ pre-/post-assessment. Whenever possible, the subjective and/or objective indicators of practices with discrete start and stop points should be measured prior to the beginning of the specific mental health practice and immediately following the cessation of the practice.


Timing of Measurement

• Project-developed questionnaires, interviews, and/or focus groups can simultaneously assess process and outcome learning goals, depending upon the timing of the measurement


Determining the Most Appropriate Point(s) of Comparison

• What do data tell us?

– Data, in and of themselves, don’t mean very much. It’s important to have at least one relative point of comparison, particularly when using objective measures, to determine the extent to which a program’s findings are impactful or meaningful.


Determining the Most Appropriate Point(s) of Comparison

• Examples of the Need for a Point of Comparison in Understanding Data...
  – Let's say that an Innovation program is testing interagency collaboration to promote stabilization after discharge from an inpatient psychiatric facility
    • After one year of implementing the Innovation, their rate of subsequent psychiatric crises among discharged clients is 35%
    • Without a point of comparison, we don't know if this is an improvement, a worsening, or about the same as before the Innovation


– Let’s say that an Innovation program is testing an adaptation of an effective mental health intervention that has been adapted for the specific cultural needs of the community

• Pre- and post-intervention scores on a standardized questionnaire of mental health functioning indicate a 20% reduction in distressing symptomotology

• Without a point of comparison, such as the average decrease in symptoms demonstrated in the existing research on the effective mental health practice without the adaptation, we don’t know if the adaptation is performing as well as the original practice

Determining the Most Appropriate Point(s) of Comparison
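As a minimal sketch of this kind of comparison, the Python snippet below turns hypothetical pre/post means into a percent reduction and checks it against an assumed benchmark from the original research; all numbers are illustrative.

```python
# Hypothetical average symptom scores before and after the adapted intervention.
pre_mean, post_mean = 60.0, 48.0
observed_reduction = (pre_mean - post_mean) / pre_mean    # 0.20 -> a 20% reduction

# Assumed reduction reported in the research on the unadapted practice (illustrative).
research_benchmark = 0.25

print(f"Observed reduction: {observed_reduction:.0%}")
print(f"Benchmark from original research: {research_benchmark:.0%}")
print("Adaptation performing at least as well as the original practice:",
      observed_reduction >= research_benchmark)
```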


Determining the Most Appropriate Point(s) of Comparison

– Testing something new
  – Given that this is something entirely new, the most logical comparison would be to usual care or business as usual without this new practice
    • Historical data – if available, compare both subjective and objective data from the program, setting, or service before the new practice is implemented
    • Current data – whenever possible, collect the same subjective and objective data on a similar program, setting, or service that does not have the new mental health practice introduced


Determining the Most Appropriate Point(s) of Comparison

• Example of Historical and Current Data for Points of Comparison in Testing Something New
  – Let's say that an Innovation program is testing a new approach to engaging an underserved cultural group in available mental health services and supports
    • Rates of initiation in services and continued participation after one month are collected
      – Historical Data: Compare these rates to those for the prior year, before the Innovation program
      – Current Data: Compare these rates to those for a different area within the same county, one that has similar representation of this cultural group as the community in which the Innovation is implemented
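A minimal sketch, with entirely hypothetical counts, of how the initiation and one-month continuation rates might be computed and set side by side for the Innovation community and a comparison area:

```python
# Hypothetical counts for each area: referrals, service initiations, and
# clients still participating one month after initiation.
def rates(referred, initiated, engaged_at_one_month):
    return initiated / referred, engaged_at_one_month / initiated

innovation = rates(referred=120, initiated=78, engaged_at_one_month=60)
comparison = rates(referred=110, initiated=41, engaged_at_one_month=22)

print(f"Innovation area: initiation {innovation[0]:.0%}, 1-month continuation {innovation[1]:.0%}")
print(f"Comparison area: initiation {comparison[0]:.0%}, 1-month continuation {comparison[1]:.0%}")
```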


Determining the Most Appropriate Point(s) of Comparison

– Testing an adaptation of an effective practice
  – Since this is an adaptation of an existing, effective mental health practice, one logical comparison is to the findings from the basic research that established the effectiveness of the original practice
    • Expectations from the research – what data were used to establish this practice as effective? To what extent can this Innovation program collect the same or similar data on the adapted practice?


Determining the Most Appropriate Point(s) of Comparison

• Example of Expectations from the Research as Points of Comparison when Testing an Adaptation
  – Let's say that an Innovation program is testing a family-based adaptation to an effective individual intervention for first-time juvenile offenders
    • Recidivism is tracked
      – Expectations from the Research: Compare this rate to the rate of recidivism for the effective individual intervention in the existing research on the practice


Determining the Most Appropriate Point(s) of Comparison

– Testing an adaptation of an effective practice (cont'd)
  – Another logical comparison would be to the county or agency's implementation of the original practice, without the adaptation
    • Historical data – if applicable, compare data from the agency's implementation of the original practice prior to implementing the adaptation, to data collected after the adaptation
    • Current data – if applicable, compare data from concurrent implementations of the original practice and the adapted practice within the same agency


Determining the Most Appropriate Point(s) of Comparison

• Example of Historical and Current Data from the Effective Practice for Points of Comparison when Testing an Adaptation
  – Using the earlier example of testing a family-based adaptation to an effective individual intervention for first-time juvenile offenders
    • Recidivism is tracked
      – Historical Data: Compare this rate to the rate of recidivism for the County's prior implementation of the effective individual intervention, without the adaptation, if possible
      – Current Data: Compare this rate to the rate of recidivism for the County's concurrent implementation of the effective individual intervention, without the adaptation, if possible


Determining the Most Appropriate Point(s) of Comparison

– Testing an adaptation of an effective practice (cont'd)
  – Another possible point of comparison would be to usual care or business as usual without this adapted practice (follow the same steps as described above under 'testing a new practice' with regard to historical and/or current data)


Determining the Most Appropriate Point(s) of Comparison

• Example of Historical and Current Data from Usual Care for Points of Comparison when Testing an Adaptation
  – Using the same example of testing a family-based adaptation to an effective individual intervention for first-time juvenile offenders
    • Recidivism is tracked
      – Historical Data: Compare this rate to the County's overall rate of recidivism prior to the implementation of this adapted practice
      – Current Data: Compare this rate to the rate of recidivism among those in the County who do not participate in the adapted practice


Determining the Most Appropriate Point(s) of Comparison

– Bringing something new to mental health that has been effective in another field
  – Since this is a mental health application of an existing practice from another field, the most logical comparison is to the findings from the basic research that established its effectiveness in the original field in which it was implemented (though not necessarily so if the practice is being used for an entirely new purpose versus being used for a similar purpose applied to the field of mental health)
    • Expectations from the research – what data were used to establish this practice as effective? To what extent can this Innovation program collect the same or similar data when this practice is applied in mental health?


Determining the Most Appropriate Point(s) of Comparison

• Example of Expectations from the Research as a Point of Comparison when Bringing Something New to Mental Health from Another Field
  – Let's say that an Innovation project is testing data analysis and reporting systems that are traditionally used in business settings to increase the efficiency of and decrease overall resources necessary for reporting mental health outcomes across the County
    • Initial and annual resource investment is tracked
      – Expectations from the Research: Research from the business field provides an indication of the cost savings introduced by these analysis and reporting systems


Determining the Most Appropriate Point(s) of Comparison

– Bringing something new to mental health that has been effective in another field (cont'd)
  – Is there anyone else in the community who has implemented this practice, either historically or currently, in the original field in which it was found to be effective?
    • Historical data – if possible, compare data from the other field's historical implementation of the original practice in the same community, to data collected from the mental health implementation
    • Current data – if possible, compare data from the other field's current implementation of the original practice in the same community, to data collected from the mental health implementation


Determining the Most Appropriate Point(s) of Comparison

– Bringing something new to mental health that has been effective in another field (cont'd)
  – Another possible point of comparison would be to usual care or business as usual without this mental health application of an effective practice from another field (follow the same steps as described above under 'testing a new practice' with regard to historical and/or current data)


Determining the Most Appropriate Point(s) of Comparison

• Example of Historical and Current Usual Care Data for Points of Comparison when Bringing Something New to Mental Health
  – Using the same example of testing data analysis and reporting systems that are traditionally used in business settings to increase the efficiency of and decrease overall resources necessary for reporting mental health outcomes across the County
    • Initial and annual resource investment is tracked
      – Historical Data: Compare these to the County's resource investment in this area prior to the implementation of these new systems... Data on resource investment will also need to be balanced with perceived efficiency and utility of outcome reporting


Determining the Most Appropriate Point(s) of Comparison

      – Current Data: If the County tests the application of these business analysis and reporting systems in a subset of mental health service provision, they can compare the resource investment and the perceived efficiency and utility to these same data elements in the rest of their mental health service provision (usual care/business as usual)


Linking Innovation Program Elements with the Achievement of Program Learning Goals: Building and Utilizing a Logic Model

• Linking the findings from process and outcome learning goals back to independent program elements is the heart of evaluation for Innovation
  – The ultimate goal is to identify program elements that are related to the achievement of desired goals and can be adopted or replicated by others


Linking Innovation Program Elements with the Achievement of Program Learning Goals: Building and Utilizing a Logic Model

• A Logic Model is a “logical” means by which linkages, or hypothesized relationships, can be examined

• The Logic Model is a pictorial representation, a flow chart, of the hypothesized relationships between program Inputs, Activities, Outputs, and Outcomes


Linking Innovation Program Elements with the Achievement of Program Learning Goals: Building and Utilizing a Logic Model

• Inputs
  – Resources necessary to achieve objectives
• Activities
  – What the program does with the resources to meet the objectives
• Outputs
  – Direct products of program activities
• Outcomes
  – Changes that result from the program's activities and outputs
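Purely as an illustration (not part of the CIMH materials), a logic model can be kept as a simple data structure so that each component can later be tied to specific elements and measures; the field names below are hypothetical.

```python
# A minimal sketch of a logic model as a dictionary of components.
logic_model = {
    "inputs":     {"description": "Resources necessary to achieve objectives",
                   "elements": [], "measures": []},
    "activities": {"description": "What the program does with the resources to meet the objectives",
                   "elements": [], "measures": []},
    "outputs":    {"description": "Direct products of program activities",
                   "elements": [], "measures": []},
    "outcomes":   {"description": "Changes that result from the program's activities and outputs",
                   "elements": [], "measures": []},
}

# Attaching a hypothetical element and one form of measurement to a component.
logic_model["outputs"]["elements"].append("Client engagement in coordinated services")
logic_model["outputs"]["measures"].append("Number of service contacts per client per quarter")
```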


Logic Model Example (slide 1 of 4)

• Let’s say we have an Innovation program

that pairs a TAY peer mentor with an older

TAY as they are discharged from the

foster care system...


Logic Model Example (slide 2 of 4)

– Inputs: TAY peer mentors; representatives from collaborating agencies; guiding principles, policies and training

– Activities: Pairing TAY with peer mentor at time of discharge from FC; peer-provided emotional support, information, and assistance with completing paperwork and forms; coordination of service options by collaborating agencies; coordination of service delivery by collaborating agencies


Logic Model Example (slide 3 of 4)

– Outputs: TAY engagement in supports offered by peer mentor; TAY engagement in coordinated services and supports from collaborating agencies

– Outcomes: Increased utilization of services and supports by TAY exiting the foster care system; Increased enrollment in educational and/or vocational opportunities; Increased employment; Increased rates of independent living


Logic Model Example (slide 4 of 4)

[Flow chart: Inputs → Activities → Outputs → Outcomes]

Inputs:
• TAY Peer Mentors
• Reps from Collaborating Agencies (e.g., CW/SS, MH/BH, Ed, WFTD)
• Guiding principles, policies, and training

Activities:
• TAY Peer Mentor paired with TAY at time of d/c from FC
• Peer-provided assistance with forms
• Peer-provided emotional support
• Coordinated service options
• Coordinated service delivery

Outputs:
• TAY engagement in emotional and informational supports from Peer Mentor
• TAY engagement in coordinated services and supports from community-based agencies

Outcomes:
• Increased utilization of services and supports by TAY exiting the FC system
• Increased enrollment in educational and/or vocational opportunities
• Increased rates of Independent Living
• Increased rates of employment
• Improved Quality of Life


Utilizing Your Innovation Logic Model

• Utilizing the Logic Model involves employing various measurement tools, both subjective and objective, measuring process goals as well as mental health outcome goals


Utilizing Your Innovation Logic Model

• Each component of the Logic Model should be associated with one or more forms of measurement
  – As we review examples of measurement for each component in our Example Logic Model, consider how data collection could be accomplished


Examples of Measurement for Inputs

Inputs:
• TAY Peer Mentors
• Reps from Collaborating Agencies (e.g., CW/SS, MH/BH, Ed, WFTD)
• Guiding principles, policies, and training

Measurement questions:
• How many TAY Peer Mentors are hired?
• Are they retained/do they stay employed?
• How many agencies are represented in the collaboration?
• How many representatives participate from each agency?
• Are they retained/do they consistently participate?
• How are the Guiding Principles, Policies, and Training protocols developed and enacted?

Examples of Measurement for Activities

Activities:
• TAY Peer Mentor paired with TAY at time of d/c from FC
• Peer-provided assistance with forms
• Peer-provided emotional support
• Coordinated service options
• Coordinated service delivery

Measurement questions:
• How is the older TAY discharging from Foster Care paired with a Peer Mentor?
• How often does the Peer Mentor provide emotional support?
• Does the emotional, informational, and instrumental support provided meet the needs of the TAY?
• How are service options presented to the TAY?
• Are services delivered in a coordinated manner?


Examples of Measurement for Outputs

Outputs:
• TAY engagement in emotional and informational supports from Peer Mentor
• TAY engagement in coordinated services and supports from community-based agencies

Measurement questions:
• Does the TAY discharging from Foster Care access and utilize the emotional, informational, and instrumental supports offered/provided by the Peer Mentor?
• Does the TAY find them helpful?
• Does the TAY engage in any of the coordinated service options presented?
• Does the TAY stay enrolled in any of the services?


Examples of Measurement for Outcomes

Outcomes:
• Increased utilization of services and supports by TAY exiting the FC system
• Increased enrollment in educational and/or vocational opportunities
• Increased rates of Independent Living
• Increased rates of employment
• Improved Quality of Life

Measurement questions:
• Does the TAY enroll in any educational or vocational opportunities?
• Does the TAY seek employment?
• Does the TAY gain employment?
• Does the TAY retain employment?
• Does the TAY utilize available community-based services and supports?
• What is the TAY's living environment status six months, 12 months, and 24 months after discharge from Foster Care?
• What are the TAY's perceptions of his/her Quality of Life?


Examples of Measurement for Process Goals

[Flow chart repeated from the Logic Model Example: Inputs → Activities → Outputs → Outcomes, with the same elements listed above]


Examples of Measurement for Process Goals

• Hold focus groups, conduct interviews, or develop questionnaires to get the perceptions of those involved with the Innovation on the hypothesized relationships between program Inputs, Activities, Outputs, and Outcomes
  – TAY clients of the Innovation program
  – Peer Mentors
  – Representatives from collaborating agencies
  – Other stakeholders
• Track dates, number of encounters, and other objective data elements for use when you employ various evaluation strategies


Utilizing Your Innovation Logic Model

• Timelines should be established for review of the logic model, likely linked with the timing of measurement
  – Document the progression of each review of the logic model, including evidence that supports proposed relationships as well as evidence that either does not support or suggests unexpected relationships


Utilizing Your Innovation Logic Model

• Utilizing the Logic Model involves employing various evaluation strategies
  – Evaluation strategies cannot be implemented without data collected through various methods of measurement, both objective and subjective


Employing Various Evaluation Strategies

• There are a number of evaluation strategies, some more complicated than others
  – Today's training will review a range that can be used; however, this is not meant to be exhaustive
• The selection of an evaluation strategy(ies) is driven by the question(s) of interest


Employing Various Evaluation Strategies

• Recognizing the variability with which counties have resources for evaluation (either internal or external), this review will emphasize two aspects:
  1. Strategies that can be readily employed without complex software or extensive evaluation knowledge
  2. Strategies that require either dedicated research and evaluation staff and/or an evaluation consultant


Strategies that can be readily employed:

To answer questions such as...
• How many?
• What proportion?
• What is the average?
• How does this compare to another group (e.g., to business as usual, either historical or current, or to findings from existing research)?
• How does this compare to a pre-determined criterion or benchmark?
• How does the finding after an activity compare to the finding before an activity?
• What is the proportion of change or difference or improvement?
• What proportion of the group meets or exceeds a pre-determined criterion or benchmark?
• Do the findings look different for different subgroups (e.g., gender, ethnicity)?


Strategies that can be readily employed:

• Descriptive statistics, such as frequency counts, percentages, means and ranges
  – Can be simply calculated; can also all be done in Microsoft Excel – user-friendly, many options for displaying findings, easy-to-learn tutorials
  – Frequency counts can be displayed as bar graphs/frequency distributions
  – Percentages can be displayed as pie charts
  – Means, or averages, can be graphed (e.g., bar graphs)
• Side-by-side comparisons (e.g., to another group, to findings from existing research, the same group pre and post, to a pre-determined criterion)
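For those who prefer scripting to a spreadsheet, the same descriptive statistics can be produced with a few lines of Python using only the standard library; this is a minimal sketch with hypothetical values (the hospitalization counts echo the first example chart that follows).

```python
from collections import Counter
from statistics import mean

# Frequency counts (e.g., psychiatric hospitalizations per year).
hospitalizations = {"Year prior": 237, "1st year after": 168, "2nd year after": 132}

# Percentages from categorical ratings (e.g., helpfulness of the peer mentor).
ratings = ["Very", "Very", "Somewhat", "Not", "Somewhat", "Very", "A little"]
counts = Counter(ratings)
percentages = {rating: 100 * n / len(ratings) for rating, n in counts.items()}

# Means for a side-by-side pre/post comparison.
pre_scores, post_scores = [60, 58, 63, 55], [34, 30, 36, 29]

print("Frequency counts:", hospitalizations)
print("Percentages:", percentages)
print("Mean pre vs. post:", mean(pre_scores), mean(post_scores))
```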


• Descriptive Statistic: Frequency Counts, Bar Chart
  [Bar chart: Psychiatric Hospitalizations – Year Prior to Innovation: 237; 1st Year After Innovation: 168; 2nd Year After Innovation: 132]


• Descriptive Statistic: Percentages, Pie Chart
  [Pie chart: "How helpful was the peer mentor in assisting you with completing forms?" – Very helpful: 45%; Somewhat helpful: 35%; A little helpful: 13%; Not helpful: 7%]


• Descriptive Statistic: Percentages, Bar Chart
  [Bar chart: Percentage of Mental Health Clients with a Primary Care Physician – Year Prior to Innovation: 16%; 1st Year After Innovation: 48%; 2nd Year After Innovation: 69%]


• Descriptive Statistic: Means/Averages, Bar Chart, Side-by-Side Comparisons
  [Bar chart: Youth Outcome Questionnaires Total Score – average pre and post total scores for the Effective Practice without the Adaptation and the Effective Practice with the Adaptation; solid line indicates the clinical cutpoint]


Strategies that require research and evaluation expertise:

• To answer questions such as...

– Are the differences observed statistically significant, meaning that we can be confident they are not likely to be due to chance?

– Are there statistically significant differences in the achievement of outcomes across different groups (e.g., by gender or ethnicity)?


Strategies that require research and evaluation expertise:

• Paired t-tests
  – Same group pre and post
  – Outcome/finding is continuous, quantitative
• Independent t-tests
  – Two different groups
  – Outcome/finding is continuous, quantitative
• Crosstab analyses/chi-square tests
  – More than two groups (e.g., differences by ethnicity)
  – Relationships between two categorical variables
• Regression analyses
  – To examine the relationship of more than one variable with an outcome
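As a minimal sketch of what such analyses look like in practice, the Python snippet below runs each test on hypothetical data using scipy and statsmodels; it is illustrative only, and in a real project the choice and interpretation of tests would rest with research/evaluation staff or a consultant.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

# Paired t-test: same clients measured pre and post (hypothetical scores).
pre = np.array([60, 58, 63, 55, 61])
post = np.array([34, 30, 36, 29, 33])
print(stats.ttest_rel(pre, post))

# Independent t-test: two different groups (hypothetical outcome values).
group_a = np.array([35, 40, 38, 42, 37])
group_b = np.array([28, 25, 30, 27, 29])
print(stats.ttest_ind(group_a, group_b))

# Chi-square test on a crosstab of two categorical variables
# (rows: subgroup; columns: met benchmark yes/no).
crosstab = np.array([[30, 10],
                     [22, 18]])
print(stats.chi2_contingency(crosstab))

# Regression: relationship of more than one variable with an outcome.
rng = np.random.default_rng(0)
predictors = rng.random((20, 2))
outcome = 1.0 + 2.0 * predictors[:, 0] - 1.0 * predictors[:, 1] + rng.normal(size=20)
model = sm.OLS(outcome, sm.add_constant(predictors)).fit()
print(model.summary())
```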


Strategies that require research and evaluation expertise:

• Key points:
  – It is not necessary to use complex strategies to conduct meaningful evaluation
  – It is helpful to know a bit about each when working with an internal research and evaluation department and/or an external evaluation consultant


Summary

• Innovation evaluation combines both process evaluation and outcome evaluation, with the ultimate goal of contributing to learning
• Evaluation for all Innovation projects should include measurement of process and outcome learning goals
• In order to understand data collected, the most appropriate point(s) of comparison should be identified
  – This will be informed by the general approach of the Innovation project
• As a flow chart representing the components of an Innovation program, a Logic Model serves as a guide for testing hypothesized relationships between program elements and program goals
• A number of strategies can be readily employed that don't require extensive knowledge of research or complex evaluation techniques in order to evaluate an Innovation program


Questions

Next Steps

• Part 2 of this webinar series will explore representative examples of Innovation evaluation within each of the three cohort areas:
  – New roles for peers
  – Culturally relevant services
  – Primary care/behavioral health integration
• Please send specific questions and/or examples to me in advance, and I will incorporate them into our discussion


The End

Contact Information

• Cricket Mitchell, PhD
• Email: [email protected]
• Cell phone: 858-220-6355