Ten Years of Evidence-based Education: A Status Report
Ronnie Detrich
Wing Institute
Goals for Today
• Review the mandates in NCLB and IDEIA that emphasize scientifically-based instruction.
• Place them in the broader context of evidence-based practice.
• Assess the status of evidence-based education after 10 years.
2001 No Child Left Behind
By 2014 every student will be at grade level.
Instructional methods will be scientifically based.
Educators will be held accountable for outcomes.
A Closer Look at Scientifically-based Instruction
• NCLB: interventions to improve educational performance are to be based on scientific research. NCLB contains over 100 references to scientific research.
• IDEIA (2004): interventions are scientifically based instructional practices.
A Closer Look at Scientifically-based Instruction
• Specific requirements of IDEIA include: pre-service and professional development for all who work with students with disabilities, to ensure such personnel have the skills and knowledge necessary to improve the academic achievement and functional performance of children with disabilities, including the use of scientifically based instructional practices, to the maximum extent possible.
A Closer Look at Scientifically-based Instruction
• Scientifically based early reading programs, positive behavioral interventions and supports, and early intervention services to reduce the need to label children as disabled in order to address the learning and behavioral needs of such children.
A Closer Look at Scientifically-based Instruction
• The Individualized Education Program (IEP) shall include a statement of the special education and related services and supplementary aids and services, based on peer-reviewed research to the extent practicable, to be provided to the child, or on behalf of the child, and a statement of the program modifications or supports that will be provided for the child.
A Closer Look at Scientifically-based Instruction
• In determining if a child has a specific learning disability, a local education agency may use a process that determines if a child responds to a scientific, research-based intervention as part of the evaluation procedures.
After 10 Years: How Are We Doing?
• Student performance has not changed in the last decade, as measured by NAEP.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, 2000, 2002, 2003, 2005, and 2007 Reading Assessments.
[Charts: NAEP reading scale scores for ages 9, 13, and 17, and age-17 proficiency levels]
Are We Getting Our Money’s Worth?
SOURCE: U.S. Department of Education, National Center for Education Statistics. (2009). Digest of Education Statistics, 2008 (NCES 2009-020), Chapter 2 and Table 179.
We were doing better in 1970 than in 2009 because we were getting the same effect for half the cost.
After 10 Years: How Are We Doing?
• NCLB and IDEIA increased interest in evidence-based education, placing it in the broader context of evidence-based practice and data-based decision making.
• "Evidence-based" has developed two meanings:
- Practices that meet evidence standards.
- A process for practitioner decision-making.
What is Evidence-based Practice? (Validated Practices)
• Often perceived as a list of interventions practitioners must use.
- National Reading Panel: 5 elements of scientifically-based reading.
- Reading First: many states established lists of approved reading programs.
- Insurance companies will not fund autism services unless the intervention is on the list.
What is Evidence-based Practice? (The Process)
• At its core, the EBP movement is a consumer protection movement. It is not about science per se. It is a policy to use science for the benefit of consumers.
• "The ultimate goal of the 'evidence-based movement' is to make better use of research findings in typical service settings, to benefit consumers and society…." (Fixsen, 2008)
What Is Evidence-based Practice?
• EBP is a decision-making approach that places emphasis on evidence to:
- guide decisions about which interventions to use;
- evaluate the effects of an intervention.
Professional Judgment
Best available evidence
Client Values
Sackett et al (2000)
[Diagram: the Evidence-based Intervention sits at the intersection of Professional Judgment, Best Available Evidence, and Client Values, and cycles through Identify, Implement, and Evaluate phases]
Phases of Evidence-based Education
How Are We Doing?
• The term "evidence-based" has become ubiquitous in the last decade.
- There is no consensus about what it means; at issue is what counts as evidence.
- The federal definition emphasizes experimental methods, with a preference for randomized trials.
- The definition has been criticized as being positivistic.
What Counts as Evidence?
• Ultimately, this depends on the question being asked.
• In EBP the goal is to identify causal relations between interventions and outcomes. Experimental methods do this best.
What Counts as Evidence?
• Even if we accept causal demonstrations to be evidence, there is no consensus.
- Randomized Clinical Trials (RCTs) have become the "gold standard."
- There is controversy about the status of single-case designs.
- The WWC has recently established standards for single-case designs.
- There is no well-established method for calculating effect sizes for single-case designs (one candidate metric is sketched below).
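As context for the effect-size point above, one family of candidate metrics discussed in the single-case literature is nonoverlap statistics. Below is a minimal sketch of Nonoverlap of All Pairs (NAP); the data and the choice of NAP are illustrative assumptions on my part, not content from the presentation.

```python
# Minimal sketch of Nonoverlap of All Pairs (NAP) for a single-case A-B design.
# Illustrative only: the data below are invented, not taken from the presentation.

def nap(baseline, treatment):
    """Share of all baseline/treatment pairs in which the treatment point is higher (ties count half)."""
    pairs = [(b, t) for b in baseline for t in treatment]
    score = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
    return score / len(pairs)

if __name__ == "__main__":
    baseline = [12, 10, 14, 11]        # phase A: e.g., words read correctly per minute
    treatment = [18, 20, 17, 22, 19]   # phase B: after the intervention begins
    print(f"NAP = {nap(baseline, treatment):.2f}")  # 1.00 here: complete nonoverlap
```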
What is an Evidence-based Intervention?
• Identification is more than finding a study to support an intervention.
• Identification involves distilling a body of knowledge to determine the strength of evidence, i.e., a systematic review.
How Are Evidence-based Interventions Identified?
• Distillation requires standards of evidence for reviewing the literature. Standards specify:
- the quantity of evidence
- the quality of evidence
Relationship Between Quality and Quantity of Evidence
• High quality and high quantity of evidence: Evidence-based.
• High quality but low quantity: may meet evidence standards.
• Low quality but high quantity: may meet evidence standards.
• Low quality and low quantity: Not evidence-based.
(A toy classification rule along these lines is sketched below.)
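To make the quality-by-quantity idea concrete, here is a toy classification rule. The thresholds are hypothetical and do not correspond to the standards of any actual review organization.

```python
# Toy illustration of a quality-by-quantity evidence standard.
# The thresholds are hypothetical assumptions, not any organization's actual criteria.

def rate_evidence(high_quality_studies: int, lower_quality_studies: int) -> str:
    """Classify an intervention by the quantity and quality of supporting studies."""
    if high_quality_studies >= 2:
        return "Evidence-based"                 # ample high-quality evidence
    if high_quality_studies == 1 or lower_quality_studies >= 3:
        return "May meet evidence standards"    # some evidence, limited in quality or quantity
    return "Not evidence-based"                 # little or no credible evidence

if __name__ == "__main__":
    print(rate_evidence(high_quality_studies=2, lower_quality_studies=0))  # Evidence-based
    print(rate_evidence(high_quality_studies=0, lower_quality_studies=4))  # May meet evidence standards
    print(rate_evidence(high_quality_studies=0, lower_quality_studies=1))  # Not evidence-based
```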
How Are Evidence-based Interventions Identified?
• A number of organizations publish evidence-based reviews:
- What Works Clearinghouse
- Best Evidence Encyclopedia
- Promising Practices Network
- Coalition for Evidence-based Policy
• They share commonalities but also have differences; their ratings show only modest correlations (Briggs, 2008).
Implications of EBP Reviews
How are consumers to decide?
• Intervention X: Validated under Standard 1, but Not Validated under Standard 2.
Assessed effectiveness vs. actual effectiveness:
• Assessed effective, actually effective: True Positive.
• Assessed effective, actually ineffective: False Positive.
• Assessed ineffective, actually effective: False Negative.
• Assessed ineffective, actually ineffective: True Negative.
[The slide also marks one error type as most likely with a hierarchy approach and the other as most likely with a threshold approach.]
Choosing Between False Positives and False Negatives
• At this stage, it is better to have more false positives than false negatives.
- False negatives: effective interventions will not be selected for implementation, so we are less likely to ever learn that they are actually effective.
- False positives: progress monitoring will identify interventions that are not effective.
How the question is framed determines how much research is available:
• Is Reading Mastery 1 an effective program for beginning readers? (Little research available specific to Reading Mastery 1.)
• Is Reading Mastery an effective reading program for beginning readers? (More research available.)
• Are Direct Instruction reading programs effective for beginning readers? (Even more research available.)
• Are direct instruction reading programs effective for beginning readers? (Even more research available, but the expansion has changed the initial question.)
Implications of EBP Reviews
• Emphasis on "best" evidence, which in many instances is non-existent.
• In its absence, what is the basis for decision making? "Best available" evidence is the standard in evidence-based practice (the process).
What Works Clearinghouse Practice Guides: Best Available Evidence
14 Practice Guides; 78 total recommendations
Level of support across recommendations:
• Strong: 22%
• Moderate: 33%
• Minimal: 45%
(Tim Slocum, 2011)
Implementing Evidence-based Interventions
Where Good Ideas go to Die
This is where the research-to-practice gap is most evident.
Research to Practice
• The gap is a concern in many disciplines.
• Education is no exception.
• Scientist/Practitioner model aimed to close the gap.
Gap or Chasm?
ImplementImplement
Scope of the Problem
550 named interventions for children and adolescents
Behavioral, cognitive-behavioral
Empirically evaluated?
Evidence-based interventions are less likely to be used than interventions for which there is no evidence or there is evidence about lack of impact.
Kazdin (2000)
Scurvy in the British Royal Navy: An Example of the Research-to-Practice Gap
• 1601: James Lancaster conducted the first experiment demonstrating how to prevent scurvy.
• 1747: James Lind again experimentally demonstrated the effectiveness of citrus in preventing scurvy.
• 1795: The British Navy adopted a policy of carrying citrus on all ships in the Royal Navy.
Research to Practice Issues
• The lag time from efficacy research to effectiveness research to dissemination is 10-20 years.
(Hoagwood, Burns & Weisz, 2002)
• Practitioners often view research as irrelevant and impractical.
(Hoagwood, Burns & Weisz, 2002)
• Only 4 of 10 Blueprint Violence Prevention programs had the capacity to disseminate to 10+ sites in a year. (Elliott & Mihalic, 2004)
Challenges of Implementation
• The average life span of an educational innovation is 18-48 months (Latham, 1988). Why?
- The innovation is more difficult than expected.
- It causes too much change.
- It takes too much time.
- Supporters leave.
- Personnel lack training.
- External funds run out.
- Supervision is inadequate.
80% of initiatives ended within 2 years
90% of initiatives ended within 4 years
Data from Center for Comprehensive School Reform
Implementation is Fundamental
Implementation is Multiplicative
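One way to read this slide's title (the framing and numbers below are my illustration, not a formula from the presentation): the benefit students actually receive is roughly the product of the intervention's potential effect and the fidelity with which it is implemented.

```latex
\text{Realized benefit} \;\approx\; \text{Intervention effect} \times \text{Implementation fidelity},
\qquad \text{e.g. } 0.8 \times 0.5 = 0.4
```

On this reading, even a strong program delivered with weak fidelity yields only a fraction of its potential benefit.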
Diffusion of Innovation (Rogers, Diffusion of Innovation, 2003)
• Diffusion of innovation is a social process, even more than a technical matter.
• The adoption rate of innovation is a function of its compatibility with the values, beliefs, and past experiences of the individuals in the social system.
Principles for Effective Diffusion: Improving the Odds (Rogers, 2003)
• Innovation has to solve a problem that is important for the “client.”
• Innovation must have relative advantage over current practice.
• It is necessary to gain support of the opinion leaders if adoption is to reach critical mass and become self-sustaining.
• Innovation must be compatible with existing values, experiences and needs of the community.
Principles of Effective Diffusion: Improving the Odds
• Innovation is perceived as being simple to understand and implement.
• Innovation can be implemented on a limited basis prior to broad scale adoption.
• Results of the innovation are observable to others.
Effective Programs Are Not Effectively Implemented
• Elliott & Mihalic (2004) review Blueprint Model Programs (violence prevention and drug prevention programs) replication in community settings.Programs reviewed across 5 dimensions
Site selectionTrainingTechnical assistanceFidelitySustainability
Keys to Effective Implementation
• Critical elements in site readiness:
- Well-connected local champion
- Strong administrative support
- Formal organizational commitments
- Organizational staffing stability
- Up-front commitment of necessary resources
- Program credibility within the community
- Program sustained by the existing operational budget
Keys to Effective Implementation
• Critical elements of training:
- Adhere to requirements for training, skills, and education.
- Hire all staff before scheduling training.
- Encourage administrators to attend training.
- Plan and budget for staff turnover.
- Implement the program immediately after training.
Keys to Effective Implementation
• Critical elements of technical assistance: a proactive plan for technical assistance.
• Critical elements of fidelity: monitor fidelity.
• Critical elements of sustainability: a function of how well the other dimensions are implemented.
Phases of Implementation
• Adoption of practice
• Implementation: initial to full scale
• Sustainability
• 2-4 years to achieve full implementation
(Fixsen, Naoom, Blasé, Friedman, & Wallace, 2005)
Barriers to Effective Implementation
(Training)
• Teachers are the primary means of exposure to interventions.
• Students will not benefit from effective practices if they are not exposed to them.
• Data suggest that preparation programs are not preparing trainees to use evidence-based practices.
Barriers to Effective Dissemination
[Chart: survey of school psychology directors of training regarding knowledge of and training in evidence-based interventions; figures of 29% and 41% of programs/directors are shown (Shernoff, Kratochwill, & Stoiber, 2003)]
Barriers to Effective Implementation
• Adoption vs. adaptation:
- Programs almost always require some adaptation to fit local circumstances.
- What can be changed without doing "violence" to the evidence-based program?
- The usual advice is to implement the core components; the rest can be changed.
- Rarely are the core components known.
Evaluating Evidence-based Interventions
Practice-based evidence about evidence-based practices
Evaluating Evidence-based Interventions: Progress Monitoring
• Implementation of an evidence-based intervention does not assure success; it is necessary to evaluate impact in the local context.
• No intervention will be effective for all students, and we cannot predict who will benefit.
Evaluating Evidence-based Interventions: Progress Monitoring
• Two methods of evaluation: formative and summative.
• Formative evaluation facilitates real-time decision-making.
General Outcome Measures (GOMs)
• The larger community is concerned with measures such as academic achievement, bullying, and substance abuse.
• Curriculum-based measurement is well established for assessing academic performance, especially in the early grades.
• There are no comparable measures for social behavior; SWPBS relies on Office Discipline Referrals.
Evaluating Evidence-based Interventions
• Curriculum-based measurement is a powerful means of evaluating the impact of academic interventions.
- Scores on CBM are correlated with scores on high-stakes tests.
- CBM can be used to predict how students will perform on high-stakes tests (a progress-monitoring sketch follows below).
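A minimal sketch of how CBM data can drive progress-monitoring decisions, assuming weekly oral-reading-fluency probes, a linear aim line, and a simple least-squares slope; the data, goal, and decision rule are illustrative assumptions, not from the presentation.

```python
# Sketch of CBM progress monitoring: fit a trend line to weekly fluency probes
# and compare it to an aim line. Data, goal, and rule are illustrative assumptions.

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

if __name__ == "__main__":
    weeks = [1, 2, 3, 4, 5, 6]
    wcpm = [32, 35, 34, 38, 41, 43]      # words correct per minute on weekly probes
    aim_slope = (60 - 30) / 18           # aim line: grow from 30 to 60 wcpm over 18 weeks
    actual_slope = slope(weeks, wcpm)
    print(f"aim slope = {aim_slope:.2f} wcpm/week, actual slope = {actual_slope:.2f}")
    if actual_slope >= aim_slope:
        print("On track: continue the current intervention.")
    else:
        print("Below the aim line: consider adjusting the intervention.")
```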
Benefits of Formative Assessment
• Progress monitoring 2-5 times per week in math and reading is:
- 4 times as effective as a 10% increase in per-pupil spending;
- 6 times as effective as voucher programs;
- 64 times as effective as charter schools;
- 6 times as effective as increased accountability.
(Yeh, 2007)
Benefits of Formative Assessment
[Chart: effect sizes for formative assessment (Hattie, Visible Learning, 2009; Fuchs & Fuchs, 1986)]
Evidence-based Education, Progress Monitoring and Treatment Integrity
• Student data provide feedback about progress.
• If we know about the adequacy of treatment integrity, then we can make decisions about:
- the adequacy of the intervention;
- the adequacy of implementation.
• If implementation is inadequate, the focus should be on improving educator behavior.
• If implementation is adequate, the focus should be on changing the intervention so the student can succeed.
• Decisions can also be made about increasing or decreasing the intensity of the intervention (see the sketch below).
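The decision logic in the bullets above can be summarized in a small sketch; the 80% fidelity threshold and the notion of "adequate progress" are placeholder assumptions, not values from the presentation.

```python
# Sketch of the progress-monitoring / treatment-integrity decision rules described above.
# The 0.80 fidelity threshold and the "adequate progress" judgment are illustrative assumptions.

def next_step(fidelity: float, adequate_progress: bool) -> str:
    """Recommend a focus given implementation fidelity and student progress data."""
    if adequate_progress:
        return "Intervention is working: consider maintaining or fading its intensity."
    if fidelity < 0.80:
        return "Implementation is inadequate: focus on improving educator behavior (training, coaching)."
    return "Implementation is adequate but progress is not: change or intensify the intervention."

if __name__ == "__main__":
    print(next_step(fidelity=0.55, adequate_progress=False))
    print(next_step(fidelity=0.92, adequate_progress=False))
    print(next_step(fidelity=0.88, adequate_progress=True))
```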
Summing Up
• Evidence-based interventions provide the best chance to improve student outcomes.
• Federal policy encourages their use.
• Processes exist for identifying effective practices.
• There is no apparent systematic plan for implementing the policy; without a plan, the policy is likely to fail.
Reasons for Hope
• An emerging science of implementation:
- National Implementation Research Network
- Global Implementation Conference, August 2011
• A federally funded project: State Implementation and Scaling-up of Evidence-based Practices (SISEP)
http://scalingup.org/
Thank you
Copies can be downloaded at www.winginstitute.org