Evidence-Informed Practice Training
Module 3: CRITICAL APPRAISAL
Evaluating the Evidence
Research and Evaluation Department
Learning Objectives
Participants will be able to identify and classify different types of research studies according to the hierarchy of evidence
Participants will be able to determine the level of evidence according to the hierarchy of evidence
Participants will use the relevant Critical Appraisal tool to appraise a systematic review, a practice guideline and a qualitative report.
Why Critically Appraise
What is critical appraisal when applied to research?
Critical appraisal is the process of systematically assessing and interpreting research studies by asking 3 key questions:
– Is the study valid?
– What are the results?
– How relevant are the results from this study to my workplace?
Critical appraisal is an essential part of evidence-informed practice and allows us to assess the quality of research evidence and decide whether a body of research evidence is good enough to be used in decision making.
Kostoris Library; The Christie. NHS Foundation Trust. Retrieved October 28, 2009 from http://www.christie.nhs.uk/pro/library/Critical-Appraisal.aspx
What are you Evaluating?
The Three Broad Questions
– Research Design – VALIDITY
– Treatment Effect – RESULTS
– Applicability – RELEVANCE
Why critically appraise?
– Determine whether research deserves your consideration …not just to expose shoddy research methods!
– Critical appraisal doesn’t mean rejecting papers or evidence outright…some findings may be relevant
– Critical appraisal does require some knowledge of research design …but not detailed knowledge of statistics!
– Critical appraisal may not provide you with the “easy” answer
– Critical appraisal may mean you find that a favored intervention is in fact ineffective

To determine the strength of the evidence found... in order to make sound decisions... about applying the findings to clinical practice.
How to identify and classify different types of Research
All research falls into these two broad categories…
QUANTITATIVE QUALITATIVE
Evidence pyramid (QUANTITATIVE)
[Pyramid diagram spanning primary research and secondary research]
Categories of Evidence – Level
Primary – research study report (unfiltered)
Secondary – some type of synthesis of several research study reports (filtered, pre-appraised):
– Literature Reviews (traditional literature or narrative review)
– Evidence Syntheses (systematic review with or without meta-analysis, qualitative review, quantitative review)
– Clinical Practice Guidelines
Quantitative evidence - Primary (unfiltered)
Experimental vs. Observational
– Randomized controlled trial (researcher determines exposure and observes outcomes prospectively) - E
– Cohort (researcher studies and compares exposed & unexposed groups retrospectively or prospectively) - O
– Case controlled study/case series (researcher retrospectively examines cases and may look for controls or comparisons) - O
Quantitative evidence - Primary (unfiltered)
The quick tip:
Did the investigator CHOOSE the treatment?
YES: If the treatment was assigned, this is an experiment.
NO: If the treatment was not assigned this is an observational study, and we are simply observing what happens naturally.
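The quick tip above is a yes/no decision rule, which can be sketched as a small function (a hypothetical illustration, not part of any appraisal tool):

```python
def classify_study(investigator_assigned_treatment: bool) -> str:
    """Quick tip: did the investigator CHOOSE (assign) the treatment?"""
    if investigator_assigned_treatment:
        return "experimental"   # e.g. a randomized controlled trial
    return "observational"      # e.g. a cohort or case controlled study

print(classify_study(True))   # experimental
print(classify_study(False))  # observational
```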
Study Designs
PRIMARY
– Observational
  – Descriptive: Case Reports, Case Series, Cross-Sectional (Surveys), Ecological (Correlational)
  – Analytical / Epidemiological: Case Controlled Studies, Cohort Studies
– Interventional: Randomized Controlled Trials, Clinical Controlled Trials
SECONDARY
– Systematic Reviews, Guidelines, (Narrative Reviews)
Qualitative evidence – primary (unfiltered)
Major qualitative research approaches:
– Grounded theory – looks to develop theory about basic social processes (Lobiondo-Wood & Haber, 2009)
– Ethnography – focuses on scientific description and interpretation of cultural or social groups and systems (Creswell, 1998)
– Phenomenology – goal is to understand the meaning of a particular experience as it is lived by the participant (Lobiondo-Wood & Haber, 2009)
– Case Study Method – reviews the peculiarities and the commonalities of a specific case (Creswell, 1998)
REVIEW
– Research Design – VALIDITY: Is the quality of the study good enough to use the results?
– Treatment Effect – GENERALIZABILITY of RESULTS: What do the results mean for my patients (clinical significance)?
– Applicability – RELEVANCE: Are the findings applicable in my setting?
REVIEW
Meta-synthesis
Individual Work – Classifying research
How to appraise systematic reviews, practice guidelines & other publications
REVIEW
Systematic Reviews are: a research summary of all evidence that relates to a particular question, the question could be on an intervention effectiveness, causation, diagnosis or prognosis. (Cullum et al, 2008)
Practice Guidelines are: systematically developed statements to assist practitioner and patient decisions about appropriate health care for specific circumstances.
Meta-Syntheses: the synthesis of findings across multiple qualitative reports
[Pyramid labels: Systematic Reviews – TOP, Quantitative; Practice Guidelines – TOP, Quantitative and Qualitative; Meta-Syntheses – Qualitative]
Meta- Analysis: combines quantitative data across studies
Appraisal of Guidelines Research & Evaluation
EB CLINICAL PRACTICE GUIDELINE
Definition: A systematically developed statement based on scientific literature that helps practitioners and their patients to make appropriate health care decisions specific to their clinical circumstances.
Purpose: To make explicit recommendations with a definite intent to influence what clinicians do.
EB CLINICAL PRACTICE GUIDELINES
INTENDED GOALS
– Enhance patient / health outcomes
– Increase cost effectiveness of health care delivery
– Synthesize large volumes of information
– Codify optimal practice as an education tool
CPGs must be flexible and reflect unique nature of each patient and practice setting.
Appraisal of Guidelines Research & Evaluation Instrument (AGREE)
– Methodology – VALIDITY
– Final Recommendations – RESULTS
– Potential Uptake – RELEVANCE
Appraisal of Guidelines Research & Evaluation Instrument (AGREE)
Consists of 23 key items in 6 domains:
– Scope and Purpose (3)
– Stakeholder Involvement (4)
– Rigour of Development (7)
– Clarity and Presentation (4)
– Applicability (3)
– Editorial Independence (2)
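As a quick sanity check on the slide’s arithmetic, the per-domain item counts can be tallied in a short sketch (the dictionary below is just an illustration; the counts come from the slide):

```python
# AGREE instrument: item counts per domain, as listed on the slide.
agree_domains = {
    "Scope and Purpose": 3,
    "Stakeholder Involvement": 4,
    "Rigour of Development": 7,
    "Clarity and Presentation": 4,
    "Applicability": 3,
    "Editorial Independence": 2,
}

# The six domain counts should sum to the 23 key items.
print(sum(agree_domains.values()))  # 23
```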
1. Scope and purpose – is concerned with the overall aim of the guideline, the specific clinical questions and the target patient population
2. Stakeholder involvement – focuses on the extent to which the guideline represents the views of its intended users.
3. Rigour of development – relates to the process used to collect and synthesize the evidence, the methods used to formulate recommendations, and to update the guideline
4. Clarity and Presentation – looks at the language and format of the guideline
AGREE continued
5. Applicability – pertains to the potential organizational, behavioral and cost implications of the guideline
6. Editorial Independence – is concerned with the independence of the recommendations and the acknowledgement of possible conflicts of interest within the guideline development group
Appraising Systematic Reviews and Other Study Types
Critical Appraisal Skills Programme (CASP)
Since its ‘birth’ in 1993, the Critical Appraisal Skills Programme (CASP) has helped to develop an evidence-based approach in health and social care, working with local, national and international groups.
CASP aims to enable individuals to develop the skills to find and make sense of research evidence, helping them to put knowledge into practice.
See http://www.phru.nhs.uk/pages/PHD/CASP.htm for appraisal tools
Group work – Appraising Research