
Data Distancing Series: Systematic Reviews & Meta-Analyses - Part 1

Dr. Michal Perlman, Samantha Burns & Gabriella Nocita May 13, 2020

Outline – Day 1

1. Integration of Evidence: Why and How?

2. Overview of Types of Integration of Evidence

3. Benefits of Systematic Reviews and Meta-Analyses

4. Being a Critical Consumer

5. Searching for Systematic Reviews and Meta-Analyses

Integration of Evidence

● Science requires replication of findings

● Need to see consistency across studies

● But sometimes it’s hard to tell what the pattern of results is

● Need to integrate across results from different studies

● There are different ways to do this that vary in how systematic and labour-intensive they are

● Focusing today on systematic review and meta-analysis

Types of Integration of Evidence: Narrative Review Paper

● Qualitative summaries of evidence

● Usually written by experts

● Typically involve informal, subjective methods of collecting and interpreting studies

● Often do not explicitly describe how the reviewers searched for or selected their studies

● Often do not include quality assessments of the selected studies

● Tend to selectively cite literature that reinforces preconceived notions

Types of Integration of Evidence: Systematic Review

● Exhaustive search for primary studies on a focused topic

● Selection of studies using clear and reproducible eligibility criteria

● Critical appraisal of studies for quality

● Synthesis of results according to a pre-determined and explicit method

● Usually qualitative in nature

Types of Integration of Evidence: Meta-Analysis

● Usually the final step in a systematic review

● The statistical pooling of data across studies to generate summary (pooled) estimates of effects

● The term ‘effect’ refers to any measure of association between exposure and outcome (e.g., odds ratio)
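To make "statistical pooling" concrete, here is a minimal sketch (my illustration, not from the slides) of fixed-effect, inverse-variance pooling of log odds ratios from three hypothetical studies; all numbers are made up:

```python
import math

# Hypothetical per-study log odds ratios and their variances
log_ors = [0.45, 0.30, 0.60]
variances = [0.04, 0.09, 0.06]

weights = [1 / v for v in variances]  # inverse-variance weights
pooled_log_or = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled OR: {math.exp(pooled_log_or):.2f}")
print(f"95% CI: {math.exp(pooled_log_or - 1.96 * pooled_se):.2f} "
      f"to {math.exp(pooled_log_or + 1.96 * pooled_se):.2f}")
```

The same idea underlies more elaborate random-effects models; the key point is that studies are weighted by how precise their estimates are.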

Comparative Overview: Systematic Review vs. Narrative Review

● Question
  ○ Systematic review: focused on a single question
  ○ Narrative review: not necessarily focused on a single question, but may describe an overview

● Protocol
  ○ Systematic review: a peer-review protocol or plan is included
  ○ Narrative review: no protocol is included

● Background
  ○ Both provide summaries of the available literature on a topic

● Objectives
  ○ Systematic review: clear objectives are identified
  ○ Narrative review: objectives may or may not be identified

● Inclusion and exclusion criteria
  ○ Systematic review: criteria stated before the review is conducted
  ○ Narrative review: criteria not specified

● Search strategy
  ○ Systematic review: comprehensive search conducted in a systematic way
  ○ Narrative review: strategy not explicitly stated

● Process of selecting articles
  ○ Systematic review: usually clear and explicit
  ○ Narrative review: not described

● Process of evaluating articles
  ○ Systematic review: comprehensive evaluation of study quality
  ○ Narrative review: evaluation of study quality may or may not be included

● Process of extracting relevant information
  ○ Systematic review: usually clear and specific
  ○ Narrative review: not clear or explicit

● Results and data synthesis
  ○ Systematic review: clear summaries of studies based on high-quality evidence
  ○ Narrative review: summary based on studies whose quality may not have been evaluated; may also be influenced by the reviewer's theories, needs and beliefs

● Discussion
  ○ Written by an expert or group of experts with a detailed and well-grounded knowledge of the issues

Process for Conducting Systematic Reviews/Meta-Analysis

What Makes Systematic Reviews and Meta-Analyses Great Research Tools?

Great Methodology For...

● Systematic reviews and meta-analyses integrate findings across all available research in a given area
● This makes them great for:
  ○ Determining the extent to which a topic has been researched
  ○ Quickly getting up to speed on a topic
  ○ Finding key researchers and studies in a given area
  ○ Understanding the pattern of results across studies
  ○ Identifying gaps in the literature and future directions
  ○ Coming up with your own research question
  ○ Using them as a starting point for your own literature review

● To reap these benefits, we have to be critical consumers

Important Considerations

1. Quality of Studies: Risk of Bias

2. Publication/Reporting Bias

3. Heterogeneity

4. Interpretation of Results

Assessment of Study Quality

● Very important step in determining how to interpret the results

● Trying to avoid: “garbage in/garbage out”

● There are standardized tools for doing this
  ○ Can use scores as inclusion criteria
  ○ Common examples are the Newcastle-Ottawa Scale and the Cochrane Risk of Bias tool
  ○ A 2017 systematic review found 18 different risk of bias assessments

Ask yourself:

● Did the researchers assess study quality? If so, how?

● If not, of what quality do the included studies appear to be?

Publication/Reporting Biases

Significant/positive studies are more likely to be:

● Submitted for publication

● Accepted for publication

● Written in English

● Published more quickly

● Cited more often

Ask yourself:

● Did the authors account for publication or reporting bias? If so, how?

● What types of studies did they include? Do these studies introduce bias?
  ○ If so, do they mention this as a limitation in the discussion?
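One common statistical check for publication/reporting bias, not named on the slides but widely used in meta-analyses, is Egger's regression test for funnel-plot asymmetry. A minimal sketch with made-up numbers:

```python
import numpy as np

effects = np.array([0.45, 0.30, 0.60, 0.10, 0.75])  # hypothetical log odds ratios
ses     = np.array([0.20, 0.30, 0.25, 0.15, 0.35])  # their standard errors

standardized = effects / ses   # each effect divided by its standard error
precision    = 1 / ses         # inverse of the standard error

# Regress standardized effect on precision; an intercept far from zero
# suggests small-study effects / funnel-plot asymmetry
slope, intercept = np.polyfit(precision, standardized, 1)
print(f"Egger intercept: {intercept:.2f} (values far from 0 suggest asymmetry)")
```

A full analysis would also report a confidence interval or p-value for the intercept; this sketch only shows the core idea.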

Heterogeneity in Study Results

● Issues of heterogeneity in studies and results are paramount

● Combining heterogeneous effects (e.g., some strong positive associations and some negative associations) can be very misleading

● If you combine apples and oranges you will end up with fruit salad. We are social scientists and don’t want to make fruit salad…

Heterogeneity in Study Results

● There are statistics that test for heterogeneity (a minimal sketch follows this list)
● If heterogeneity is too great, researchers should not force results into a single meta-analysis
● Instead, they should try to find explanations for the observed heterogeneity
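As an illustration of the kind of statistics referred to above (my sketch, with made-up numbers), two standard heterogeneity measures are Cochran's Q and I²:

```python
import math

effects   = [0.45, 0.30, 0.60, -0.10]   # hypothetical per-study effects
variances = [0.04, 0.09, 0.06, 0.05]

weights = [1 / v for v in variances]
pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)

# Cochran's Q: weighted squared deviations of study effects from the pooled effect
q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
df = len(effects) - 1

# I²: proportion of total variation attributable to heterogeneity (in percent)
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"Q = {q:.2f} on {df} df, I² = {i_squared:.0f}%")
```

Large Q relative to its degrees of freedom, or a high I², is the kind of signal that should make reviewers explain the heterogeneity rather than force a single pooled estimate.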

Ask yourself:

● Did the authors make “fruit salad”?
● Do the effects/outcomes the authors combined seem to make sense?
● Were the authors too conservative in what they combined? Were they too lax in what they combined?
● Was their combination strategy different for the systematic review vs. the meta-analysis?

Interpretation of Results

● Discussing limitations in the existing literature and in the current study is key
  ○ Ask yourself:
    ■ Do the authors include limitations of the literature? What about limitations of their own study?
    ■ If not, what are some such limitations (e.g., outdated studies, all cross-sectional)?
● Implications
  ○ Ask yourself:
    ■ Do the authors discuss the implications of their integration of results?
    ■ If so, do they make sense given what else is known in the field?
    ■ If not, how might the results influence areas important to the field (e.g., public policy, resource allocation, how variables are defined)?
● Future directions for research
  ○ Ask yourself:
    ■ Do the authors discuss where to go from the results?
    ■ How can you use the results to inform gaps in the literature and future studies?

Making Sure It’s Relevant: Where and How to Search for a Meta-Analysis

How to Search for a Systematic Review/ Meta-Analysis

1. Cochrane Reviews: https://www.cochranelibrary.com/search

2. Campbell Systematic Reviews: https://onlinelibrary.wiley.com/journal/18911803

3. Library Databases with Meta-Analysis as a search term

4. Search the topic of interest in a database and filter by publication type

Activity

Search for a Meta-Analysis in your field of interest

OR

Watch Demo of a Search

Coming Up Next…

● How to conduct a systematic review and meta-analysis

○ Step-by-step guide

● Interactive activities to practice the steps

○ Participation through Google docs

Questions?

Data Distancing Series: Systematic Reviews & Meta-Analyses - Part 2

Dr. Michal Perlman, Samantha Burns & Gabriella Nocita May 15, 2020

Outline – Day 2

1. Steps for Conducting Systematic Reviews & Meta-Analyses

2. Getting Started
  ○ Developing a research question
  ○ Defining key variables
  ○ Determining inclusion criteria

3. Putting it All Together
  ○ PRISMA Flow Chart
  ○ Forest Plot

4. General Conclusions

Steps for Conducting Systematic Reviews & Meta-Analyses

Starting a Systematic Review & Meta-Analysis

Developing a Research Question

What is a Research Question?

● The initial step in a research project

● An answerable inquiry into a topic

Systematic Review Research Questions

1. What are the currently implemented curricula that target developing creativity in early childhood education?

2. What are the most important skills required in the 21st century digital economy?

3. What are effective interventions for reducing nausea and vomiting in adults with cancer receiving chemotherapy?

Meta-Analysis Research Questions

1. Is educator professional development associated with child outcomes in early childhood education?

2. What interventions are effective for preventing falls in people after stroke?

3. Are cannabinoids effective for reducing nausea and vomiting in adults with cancer receiving chemotherapy?

Activity #1

Create a research question, related to your field, that is appropriate for both systematic reviews and meta-analyses.

https://docs.google.com/document/d/1VtA4JAQndHtQpc2s48la3YZhXm9KEm-QDmQt9C1n9IQ/edit?usp=sharing

Defining Key Variables

What is a Definition?

Conceptual Definition

Describes qualities of the variables independent of time and space

OR: Tells you what the concept means (dictionary definition)

(e.g., Intelligence: the capacity for learning, reasoning and mental activity)

Operational Definition

Clear, concise, detailed and measurable definition of a concept

OR: Tells you how to measure the concept, specifically for your study

(e.g., Intelligence: the total score on a standardized Stanford Binet IQ test)

Deriving Definitions From Research Question

Is educator early childhood specialization associated with child outcomes in early childhood education?



Definitions of Early Childhood Education

Conceptual

● Out of home care provided to young children by adults

Operational

● Licensed, centre-based programs serving preschool-aged children (30-72 months)
  ○ Daycare
  ○ Nursery school
  ○ Preschool
  ○ Pre-kindergarten
  ○ Head Start programs

Deriving Definitions From Research Question

Is educator early childhood specialization associated with child outcomes in early childhood education?


Definitions of Child Outcomes

Conceptual

● Children’s cognitive, academic, socio-emotional, health, and motor development

Operational

● Academic
  ○ Language
    ■ Receptive Language: scores on the Peabody Picture Vocabulary Test (PPVT)

Activity #2

Using your research question from Activity #1, determine a conceptual and operational definition for one of your key terms.

https://docs.google.com/document/d/1R9tlOziBig_B7S1dGkLa1Lk-dH9c_mKgrNaCOr01iZg/edit?usp=sharing

Determining Inclusion Criteria

SPICE – Synonyms

● S – Setting: the context for the question (where)
  ○ Our study: ECEC
  ○ Examples: ECE, nursery, preschool, Head Start, licensed centre-based care

● P – Perspective: the users, potential users, or stakeholders of the service (for whom)
  ○ Our study: children and educators
  ○ Examples: people in these settings

● I – Intervention: the action taken for the users, potential users, or stakeholders (what)
  ○ Our study: factors in ECEC settings
  ○ Examples: teacher responsiveness, curriculum, materials, activities, environment (note: the intervention)

● C – Comparison: the alternative actions or outcomes (what else)
  ○ Our study: control or comparison
  ○ Examples: business as usual, different programs/interventions

● E – Evaluation: the result or measurement that will determine the success of the intervention (what result or how well)
  ○ Our study: effect of the intervention on creativity, collaboration and problem-solving
  ○ Examples: soft skills, how they are operationalized, effectiveness of interventions on soft skills (or their components)

Putting It Together: Final Product

PRISMA Flow Chart

Forest Plot
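For orientation, here is a minimal sketch (my illustration, not from the slides) of what a forest plot encodes: each study's effect estimate with its 95% confidence interval and a pooled estimate, drawn with matplotlib using made-up numbers:

```python
import matplotlib.pyplot as plt

studies = ["Study A", "Study B", "Study C", "Pooled"]
effects = [0.45, 0.30, 0.60, 0.44]          # hypothetical effect estimates
ci_low  = [0.06, -0.29, 0.11, 0.20]         # lower 95% CI bounds
ci_high = [0.84, 0.89, 1.09, 0.68]          # upper 95% CI bounds

ys = list(range(len(studies)))[::-1]        # plot studies top to bottom
errors = [[e - lo for e, lo in zip(effects, ci_low)],
          [hi - e for e, hi in zip(effects, ci_high)]]

plt.errorbar(effects, ys, xerr=errors, fmt="s", capsize=4)
plt.axvline(0, linestyle="--")              # line at the null (no effect)
plt.yticks(ys, studies)
plt.xlabel("Effect size (95% CI)")
plt.tight_layout()
plt.show()
```

Reading a real forest plot, you would look at how the study intervals scatter around the pooled estimate (heterogeneity) and whether the pooled interval crosses the null line.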

Conclusions

● Systematic reviews and meta-analyses are rigorous ways of synthesizing research findings

● They offer many benefits, but require careful interpretation

● They save consumers of research lots of time and heartache!

● Being a critical consumer is key

● Many resources are available within and outside of the University of Toronto to support you in completing a systematic review and meta-analysis

Thank you! Stay safe and healthy :)

gabriella.nocita@mail.utoronto.ca
samantha.burns@utoronto.ca
