
Page 1: Evidence-Informed Practice Training

Evidence-Informed Practice Training

Module 3: CRITICAL APPRAISAL

Evaluating the Evidence

Research and Evaluation Department

Page 2: Evidence-Informed Practice Training

Learning Objectives

Participants will be able to identify and classify different types of research studies according to the hierarchy of evidence

Participants will be able to determine the level of evidence according to the hierarchy of evidence

Participants will use the relevant Critical Appraisal tool to appraise a systematic review, a practice guideline, and a qualitative report.

Page 3: Evidence-Informed Practice Training


Why Critically Appraise?

Page 4: Evidence-Informed Practice Training


Page 5: Evidence-Informed Practice Training


What is critical appraisal when applied to research?

Critical appraisal is the process of systematically assessing and interpreting research studies by asking 3 key questions:
– Is the study valid?
– What are the results?
– How relevant are the results from this study to my workplace?

Critical appraisal is an essential part of evidence-informed practice and allows us to assess the quality of research evidence and decide whether a body of research evidence is good enough to be used in decision making.

Kostoris Library, The Christie NHS Foundation Trust. Retrieved October 28, 2009, from http://www.christie.nhs.uk/pro/library/Critical-Appraisal.aspx

Page 6: Evidence-Informed Practice Training

6

What are you Evaluating?

The Three Broad Questions

Research Design → VALIDITY
Treatment Effect → RESULTS
Applicability → RELEVANCE

Page 7: Evidence-Informed Practice Training

Why critically appraise?

Determine whether research deserves your consideration
…not just to expose shoddy research methods!

Critical appraisal doesn’t mean rejecting papers or evidence outright…some findings may be relevant

Critical appraisal does require some knowledge of research design
…but not detailed knowledge of statistics!

Critical appraisal may not provide you with the “easy” answer

Critical appraisal may mean you find that a favored intervention is in fact ineffective

Page 8: Evidence-Informed Practice Training

To determine the strength of the evidence found...

in order to make sound decisions...

about applying the findings to clinical practice.

Page 9: Evidence-Informed Practice Training
Page 10: Evidence-Informed Practice Training


How to identify and classify different types of Research

Page 11: Evidence-Informed Practice Training


All research falls into these two broad categories…

QUANTITATIVE QUALITATIVE

Page 12: Evidence-Informed Practice Training


Evidence pyramid (QUANTITATIVE)

[Pyramid figure distinguishing primary research (unfiltered) from secondary research (filtered)]
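
The pyramid image itself does not survive in this text export. As a rough stand-in, the Python sketch below encodes one commonly taught ordering of quantitative evidence; the exact levels and labels vary between published hierarchies, so treat this as illustrative only:

```python
# One commonly taught ordering of quantitative evidence, strongest first.
# "secondary" = filtered/synthesized evidence, "primary" = unfiltered studies,
# mirroring the two bands of the pyramid on this slide.
EVIDENCE_PYRAMID = [
    ("systematic review / meta-analysis", "secondary"),
    ("randomized controlled trial", "primary"),
    ("cohort study", "primary"),
    ("case-control study", "primary"),
    ("case series / case report", "primary"),
]

def evidence_level(design: str) -> int:
    """Return the rank of a study design (1 = strongest evidence)."""
    for rank, (name, _band) in enumerate(EVIDENCE_PYRAMID, start=1):
        if design.lower() in name:
            return rank
    raise ValueError(f"unknown design: {design!r}")

print(evidence_level("cohort study"))  # -> 3
```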

Page 13: Evidence-Informed Practice Training


Page 14: Evidence-Informed Practice Training


Categories of Evidence – Levels

Primary – research study report (unfiltered)

Secondary – some type of synthesis of several research study reports (filtered, pre-appraised):
– Literature Reviews (traditional literature or narrative review)
– Evidence Syntheses (systematic review with or without meta-analysis, qualitative review, quantitative review)
– Clinical Practice Guidelines

Page 15: Evidence-Informed Practice Training

Quantitative evidence - Primary (unfiltered)

Experimental vs. Observational

– Randomized controlled trial (researcher determines exposure and observes outcomes prospectively) - E

– Cohort (researcher studies and compares exposed & unexposed groups retrospectively or prospectively) - O

– Case-control study/case series (researcher retrospectively examines cases and may look for controls or comparisons) - O

Page 16: Evidence-Informed Practice Training

Quantitative evidence - Primary (unfiltered)

The quick tip:

Did the investigator CHOOSE the treatment? 

YES: If the treatment was assigned, this is an experiment.

NO: If the treatment was not assigned, this is an observational study, and we are simply observing what happens naturally.
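
Expressed as logic, the quick tip is a single branch. Here is a minimal illustrative sketch (the function and argument names are invented for this example):

```python
def classify_study(investigator_assigned_treatment: bool) -> str:
    """Quick tip: if the investigator CHOSE (assigned) the treatment, the
    study is an experiment; otherwise it is observational, and we are
    simply watching what happens naturally."""
    return "experimental" if investigator_assigned_treatment else "observational"

print(classify_study(True))   # e.g., a randomized controlled trial
print(classify_study(False))  # e.g., a cohort or case-control study
```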

Page 17: Evidence-Informed Practice Training

Study Designs

PRIMARY

Observational
  Descriptive
    – Case Reports / Case Series
    – Cross-Sectional (Surveys)
    – Ecological (Correlational)
  Analytical / Epidemiological
    – Case-Control Studies
    – Cohort Studies

Interventional
  – Randomized Controlled Trials
  – Controlled Clinical Trials

SECONDARY

– Systematic Reviews
– Guidelines
– (Narrative Reviews)

Page 18: Evidence-Informed Practice Training

Qualitative evidence – primary (unfiltered)

Major qualitative research approaches:

Grounded theory – looks to develop theory about basic social processes (LoBiondo-Wood & Haber, 2009)

Ethnography – focuses on scientific description and interpretation of cultural or social groups and systems (Creswell, 1998)

Phenomenology – goal is to understand the meaning of a particular experience as it is lived by the participant (LoBiondo-Wood & Haber, 2009)

Case Study Method – reviews the peculiarities and the commonalities of a specific case (Creswell, 1998)


Page 19: Evidence-Informed Practice Training

REVIEW

Research Design → VALIDITY: Is the quality of the study good enough to use the results?

Treatment Effect → GENERALIZABILITY of RESULTS: What do the results mean for my patients (clinical significance)?

Applicability → RELEVANCE: Are the findings applicable in my setting?

Page 20: Evidence-Informed Practice Training


REVIEW

Meta-synthesis

Page 21: Evidence-Informed Practice Training


Individual Work – Classifying research

Page 22: Evidence-Informed Practice Training


How to appraise systematic reviews, practice guidelines & other publications

Page 23: Evidence-Informed Practice Training

REVIEW

Systematic Reviews: a research summary of all evidence that relates to a particular question; the question could be about intervention effectiveness, causation, diagnosis, or prognosis (Cullum et al., 2008). Quantitative; sits at the top of the evidence pyramid.

Practice Guidelines: systematically developed statements to assist practitioner and patient decisions about appropriate health care for specific circumstances. Quantitative and qualitative; also at the top of the pyramid.

Meta-Syntheses: the synthesis of findings across multiple qualitative reports. Qualitative.

Meta-Analysis: combines quantitative data across studies.
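
To make “combines quantitative data across studies” concrete, here is a minimal sketch of fixed-effect inverse-variance pooling, one standard way a meta-analysis combines effect estimates (the study numbers below are made up):

```python
def pooled_effect(estimates, variances):
    """Fixed-effect inverse-variance pooling: each study's effect estimate
    is weighted by 1/variance, so more precise studies count for more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    return pooled, 1.0 / sum(weights)  # pooled estimate and its variance

# Three hypothetical studies reporting the same effect measure:
est, var = pooled_effect([0.40, 0.25, 0.55], [0.04, 0.09, 0.16])
print(f"pooled estimate = {est:.2f}, variance = {var:.3f}")  # ≈ 0.38
```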

Page 24: Evidence-Informed Practice Training


Appraisal of Guidelines Research & Evaluation (AGREE)

Page 25: Evidence-Informed Practice Training

EB CLINICAL PRACTICE GUIDELINE

Definition: A systematically developed statement based on scientific literature that helps practitioners and their patients to make appropriate health care decisions specific to their clinical circumstances.

Purpose: To make explicit recommendations with a definite intent to influence what clinicians do.

Page 26: Evidence-Informed Practice Training

EB CLINICAL PRACTICE GUIDELINES

INTENDED GOALS

Enhance patient / health outcomes
Increase cost-effectiveness of health care delivery
Synthesize large volumes of information
Codify optimal practice as an education tool

CPGs must be flexible and reflect the unique nature of each patient and practice setting.

Page 27: Evidence-Informed Practice Training

Appraisal of Guidelines Research & Evaluation Instrument (AGREE)

Methodology → VALIDITY
Final Recommendations → RESULTS
Potential Uptake → RELEVANCE

Page 28: Evidence-Informed Practice Training

Appraisal of Guidelines Research & Evaluation Instrument (AGREE)

Consists of 23 key items in 6 domains:

1. Scope and Purpose (3)
2. Stakeholder Involvement (4)
3. Rigour of Development (7)
4. Clarity and Presentation (4)
5. Applicability (3)
6. Editorial Independence (2)
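
Because the domains contain different numbers of items, AGREE domain scores are standardized rather than simply summed. A sketch of that calculation, assuming the original AGREE 4-point item scale (1 = strongly disagree .. 4 = strongly agree):

```python
def standardized_domain_score(item_ratings, scale_min=1, scale_max=4):
    """Standardize an AGREE domain score to the 0..1 range:
    (obtained - minimum possible) / (maximum possible - minimum possible),
    where item_ratings holds one rating per item per appraiser."""
    n = len(item_ratings)
    obtained = sum(item_ratings)
    lo, hi = scale_min * n, scale_max * n
    return (obtained - lo) / (hi - lo)

# Example: one appraiser rating the three Scope and Purpose items (made up):
print(standardized_domain_score([4, 3, 3]))  # (10 - 3) / (12 - 3) ≈ 0.78
```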

Page 29: Evidence-Informed Practice Training


AGREE continued

1. Scope and Purpose – concerned with the overall aim of the guideline, the specific clinical questions, and the target patient population

2. Stakeholder Involvement – focuses on the extent to which the guideline represents the views of its intended users

3. Rigour of Development – relates to the process used to collect and synthesize the evidence, and the methods used to formulate recommendations and update the guideline

4. Clarity and Presentation – looks at the language and format of the guideline

Page 30: Evidence-Informed Practice Training

30

AGREE continued

5. Applicability – pertains to the potential organizational, behavioral and cost implications of the guideline

6. Editorial Independence – is concerned with the independence of the recommendations and the acknowledgement of possible conflicts of interest within the guideline development group

Page 31: Evidence-Informed Practice Training


Appraising Systematic Reviews and Other Study Types

Critical Appraisal Skills Programme (CASP)

Since its ‘birth’ in 1993, the Critical Appraisal Skills Programme (CASP) has helped to develop an evidence-based approach in health and social care, working with local, national and international groups.

CASP aims to enable individuals to develop the skills to find and make sense of research evidence, helping them to put knowledge into practice.

See http://www.phru.nhs.uk/pages/PHD/CASP.htm for appraisal tools.

Page 32: Evidence-Informed Practice Training


Group work – Appraising Research