Evidence-Based Practice-JCOLE

BEST APPLICATION OF EVIDENCE-BASED METHODS

Implementing Evidence-Based Practices Into Daily Decisions & Applied Work

Jennifer Cole, MS, ATC, CSCS

Justifying Professional Practice
Think of 3 reasons why you have chosen a specific action/practice. Are your reasons:
1. Valid?
2. Reliable?
3. From a high-quality source?


Evidence-Based Practice
■ Integrates individual expertise with the best available external evidence from systematic research, combining science and professional practice
■ Ultimate goals:
– Support practitioners in their decision making
– Eliminate the use of ineffective, inappropriate, overly expensive & potentially dangerous practices


Using EBP to Achieve Outcomes
■ Identify gaps between what is & what should be:
– Specify what your lecture requires to be more effective
– Measurable desired outcomes of lectures
– Achievable steps to meet the desired outcome
– Relevant: the outcome will meet the needs of your class and align with learning objectives
– Time needed to achieve the outcome will fit into the semester



Best Practice #1: Asking the Question
■ Construct a well-built practical question (PICO):
– P: Population
– I: Intervention
– C: Comparison
– O: Outcome

Example: “Compared to traditional lectures, what are the most effective interactive/active learning strategies to improve learning outcomes in Millennial students?”

Best Practice #2: Necessary Data Sources & Search Skills
• Cochrane Library
• MEDLINE
• Gray literature
• NHS Centre for Reviews & Dissemination

Best Practice #3: Utilize the Highest Standard of Resources
■ Systematic reviews systematically compare results from different studies to establish the generalizability and consistency of findings (a sketch of how results are typically pooled follows below).
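A minimal illustration of that pooling step, not taken from the original slides: under a standard fixed-effect meta-analysis model, study estimates are combined using inverse-variance weights,

    \[
    \hat{\theta}_{\text{pooled}} = \frac{\sum_{i=1}^{k} w_i \,\hat{\theta}_i}{\sum_{i=1}^{k} w_i},
    \qquad
    w_i = \frac{1}{\operatorname{Var}(\hat{\theta}_i)},
    \]

where \(\hat{\theta}_i\) is the effect estimate from study \(i\) and \(k\) is the number of included studies, so more precise studies contribute more to the pooled estimate.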


Best Practice #4: Interpret the Forest Plot
■ Graph used in meta-analyses to illustrate the treatment effect sizes of the studies (a minimal illustrative sketch follows below).
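The original slide showed this with an image; as a rough stand-in (not from the presentation), the following Python/matplotlib sketch draws a minimal forest plot from hypothetical effect sizes and 95% confidence intervals. All study names and numbers are made up for demonstration only.

    import matplotlib.pyplot as plt

    # Hypothetical studies: (label, effect size, lower 95% CI, upper 95% CI)
    studies = [
        ("Study A", 0.40, 0.10, 0.70),
        ("Study B", 0.15, -0.20, 0.50),
        ("Study C", 0.55, 0.25, 0.85),
        ("Pooled",  0.38, 0.20, 0.56),
    ]

    fig, ax = plt.subplots(figsize=(6, 3))
    for row, (label, effect, lo, hi) in enumerate(studies):
        y = len(studies) - row                       # draw rows from top to bottom
        ax.plot([lo, hi], [y, y], color="black")     # 95% confidence interval
        marker = "D" if label == "Pooled" else "s"   # diamond marks the pooled estimate
        ax.plot([effect], [y], marker, color="black")
        ax.text(-1.05, y, label, va="center")        # study label in the left margin

    ax.axvline(0, linestyle="--", color="gray")      # vertical line of no effect
    ax.set_xlim(-1.1, 1.1)
    ax.set_ylim(0, len(studies) + 1)
    ax.set_yticks([])
    ax.set_xlabel("Effect size (clickers vs. traditional lecture)")
    plt.tight_layout()
    plt.show()

Studies whose confidence intervals cross the line of no effect are individually inconclusive; the pooled marker summarizes the combined evidence.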


Best Practice #5: Identify Valid Results
■ Critical appraisal of the review:
– Selection bias
– Performance bias
– Attrition bias
– Detection bias
■ Appraisal of included studies:
– Appropriately addresses validity
– Appropriately combined findings


Best Practice #6: Implications for Practicing the Evidence
■ The problem is a priority
■ A large desirable effect is anticipated
■ Only a small undesirable effect is anticipated
■ The evidence reduces inequalities
■ Applying the evidence is feasible


Research vs. EBP: Clickers in the Classroom
■ “The effects of audience response systems on learning outcomes in health professions education: A systematic review” (evidence-based systematic review)
■ “Classroom Questioning with Immediate Electronic Response: Do Clickers Improve Learning?” (randomized block experimental design)

Best Practice #1: Well-built Question
■ Systematic review:
• What are the effects of ARS on learning outcomes in health professions education?
• P: Undergraduate health professions students
• I: Audience response system
• C: Traditional lecture
• O: Effects on learning outcomes
■ Randomized block design:
• Focused on the substantive difference in learning outcomes between traditional classrooms and classrooms using clickers in undergraduates
• P: Undergraduates
• I: Clickers
• C: Traditional lectures
• O: Differences in learning outcomes

Best Practice #2: Data Sources/Search Skills
■ Systematic review:
– Data source: MEDLINE
– Search: MEDLINE, systematic review + classroom response system
■ Randomized block:
– Data source: Decision Sciences Journal of Innovative Education
– Search: clickers in the classroom + learning outcome

Best Practice #3: Utilize the Highest Standard of Resources
■ “The effects of audience response systems on learning outcomes in health professions education: A systematic review” (systematic review, 21 studies)
■ “Classroom Questioning with Immediate Electronic Response: Do Clickers Improve Learning?” (randomized block design, 1 study)

Best Practice #4: Analyzing the Evidence (Forest Plots)

Best Practice #5: Identify Valid Results (Systematic Review)

• Most of the studies were at a high risk of bias due to inadequate blinding of participants and/or outcome assessors. In addition, many included trials presented outcome data that were incomplete or not clearly described.

• Inclusion bias was minimized by prospectively establishing the search strategy and by having two authors screen all potential studies, maximizing the likelihood that the review includes all relevant studies.

• Magnitude of effect was smaller for randomized trials compared to nonrandomized studies.

Best Practice #6: Implications for Practicing the Evidence
■ Are learning outcomes in my traditional lectures a priority problem?

■ What large desirable effects are anticipated?

■ Are there any small undesirable effects?

■ Will inequalities be reduced?

■ Are clickers feasible?

Conclusion
■ Nearly impossible to keep up with the professional literature (350,000 RCTs in PubMed)
■ The development of improved technologies reduces the time necessary to fill gaps in the evidence base & reduces uncertainty in the decision-making process

■ Ability to justify decisions on the basis of valid information by explaining the strength of evidence used to make decisions

■ Better practice, better outcomes
