Evaluating Evidence in Medicine: What Can Go Wrong? Skeptic’s Toolbox 2012 Harriet Hall, MD The SkepDoc


Page 1: Evaluating Evidence in Medicine: What Can Go Wrong?

Evaluating Evidence in Medicine: What Can Go Wrong?

Skeptic’s Toolbox 2012

Harriet Hall, MD, The SkepDoc

Page 2: Evaluating Evidence in Medicine: What Can Go Wrong?

Overview

• What constitutes evidence in medicine?
• What can go wrong in clinical studies?
• Why even “evidence-based medicine” is flawed.

Page 3: Evaluating Evidence in Medicine: What Can Go Wrong?

Is This Evidence?

Page 4: Evaluating Evidence in Medicine: What Can Go Wrong?

Is This Evidence? MRI Study of Salmon

• A salmon was shown photographs of humans in social situations. It was asked to think about what emotion the individual in the photo must have been experiencing.

• The salmon couldn’t talk, but:
• On the fMRI scan, areas in the salmon’s brain lit up, indicating increased blood flow, indicating that the salmon was thinking.

Page 5: Evaluating Evidence in Medicine: What Can Go Wrong?

Is This Evidence That:

• Salmon can see pictures?
• Salmon know what human emotions are?
• Salmon can identify emotions from pictures?
• Salmon can respond to requests of what to think about?

Page 6: Evaluating Evidence in Medicine: What Can Go Wrong?

What’s Wrong With This Picture?

The Salmon Was Dead and Gutted!

Page 7: Evaluating Evidence in Medicine: What Can Go Wrong?

Statistical Artifact

• Each fMRI scan measures 50,000 voxels (3-D pixels) and each study involves thousands of scans.

• If you mine the data, you can find practically anything you want.

• Brain scans are the new phrenology
– A blunt instrument
– Scans are pooled to establish normal average
– Often don’t mean what people think they mean
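The voxel-count problem lends itself to a quick simulation (a hedged sketch, not the salmon study’s actual analysis pipeline; the 50,000-voxel figure comes from the slide, and the uncorrected p < 0.001 threshold is an assumed, commonly cited per-voxel cutoff):

```python
import random

random.seed(1)  # reproducible run

# 50,000 voxels of pure noise: under the null hypothesis, each voxel
# independently crosses an uncorrected p < 0.001 threshold by chance.
n_voxels = 50_000
alpha = 0.001  # assumed uncorrected per-voxel cutoff

false_positives = sum(1 for _ in range(n_voxels) if random.random() < alpha)

print(f"Expected false positives: {n_voxels * alpha:.0f}")  # about 50
print(f"Observed in this run:     {false_positives}")
# Dozens of voxels "light up" with no signal at all -- which is how a
# dead salmon can appear to think unless multiple comparisons are corrected.
```

The fix the actual salmon paper argued for is a multiple-comparisons correction across voxels, after which the salmon’s “activations” disappear.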

Page 8: Evaluating Evidence in Medicine: What Can Go Wrong?

Amen poster

Page 9: Evaluating Evidence in Medicine: What Can Go Wrong?

Would You Accept This Evidence?

• I tried it. I got better. It worked for me.
• Lots of people tried it and got better.
• We gave it to a lot of people in a study and they improved.
• We compared it to a no-treatment group or a usual-treatment group and it worked better.
• We compared it to a placebo and it worked better.
• The weight of evidence from a large body of studies shows that it works better than placebo.

Page 10: Evaluating Evidence in Medicine: What Can Go Wrong?

Is This Evidence?

• I tried it. I got better. It worked for me.
– Anecdote. Plural of anecdote is not data.
– Post hoc ergo propter hoc fallacy
– Does Echinacea prevent colds?
– Removing glucosamine didn’t remove effects
• We gave it to a lot of people in a study and they improved.
– Uncontrolled study. Maybe they would have improved without treatment.
– Cold got better in a week with treatment, lasted 7 days without treatment.

Page 11: Evaluating Evidence in Medicine: What Can Go Wrong?

Is This Evidence?

• Our study compared it to a no-treatment group or a usual-treatment group and it worked better.
– Hawthorne effect: Doing something is better than doing nothing.
• Our study compared it to a placebo and it worked better.
– Was the study blinded?
– A double-blind, placebo-controlled randomized study is the Gold Standard.

• BUT: What if we do a Gold Standard study on something totally implausible and it works better than a placebo?

Page 12: Evaluating Evidence in Medicine: What Can Go Wrong?

There’s A Lot of Evidence: A Fire Hose of Information

• 21 million papers are listed in PubMed:
– 700,000 more each year
– More than one a minute

• PubMed lists 23,000 journals, and there are many more not listed

• You can find a study to support any belief.

Page 14: Evaluating Evidence in Medicine: What Can Go Wrong?

Ioannidis

• The smaller the study, the less likely the research findings are to be true.

• The smaller the effect, the less likely the research findings are to be true.

• The greater the financial and other interests, the less likely the research findings are to be true.

• The hotter a scientific field (with more research teams involved), the less likely the research findings are to be true.

Page 15: Evaluating Evidence in Medicine: What Can Go Wrong?

Evaluating a Study

• Ask a lot of questions
• I’ll cover some

Page 16: Evaluating Evidence in Medicine: What Can Go Wrong?

Skeptics Question Everything

Page 17: Evaluating Evidence in Medicine: What Can Go Wrong?

What Kind of Study?

• Case report
• Case series
• Case-control
• Cohort
• Epidemiologic
• RCT
• Placebo-controlled
• Blinded (single or double)

Page 18: Evaluating Evidence in Medicine: What Can Go Wrong?

Who’s Paying?

• Studies sponsored by pharmaceutical companies are more likely to be positive
– Subtle bias
– Unpublished negative information
• Studies by researchers with financial conflicts of interest (consulting fees, honoraria from pharmaceutical companies) are more likely to be positive: 91% vs. 67%

Page 19: Evaluating Evidence in Medicine: What Can Go Wrong?

Big Pharma Distortion

• Turner looked at all antidepressant studies registered with the FDA
– Published studies: 94% positive
– Unpublished studies: 51% positive
• Evidence that antidepressants don’t work? No.

Page 20: Evaluating Evidence in Medicine: What Can Go Wrong?

Effect Size

Page 21: Evaluating Evidence in Medicine: What Can Go Wrong?

Turner vs. Kirsch

• Kirsch said an effect size < .5 means ineffective
– Effect size from journals: .41
– True effect size: .31
– Therefore antidepressants are not effective
• Turner said the glass is not empty, but 1/3 full
• Patients’ responses are not all-or-none; partial responses can be meaningful
• Antidepressants DO work, just not as well as originally thought.
• Kirsch supports psychotherapy, but its effect size is much less than .5.

Page 22: Evaluating Evidence in Medicine: What Can Go Wrong?

Scam Product Testing

• In-house: by non-academics on the company’s payroll
– Worthless. Tweaked to get desired results
• Independent testing companies: guns for hire
• Minuscule effects touted as significant
• Effects found, but not specific to product
– Amino acids may improve muscle strength
• Effects may not apply to average people (e.g., taping injuries)

Page 23: Evaluating Evidence in Medicine: What Can Go Wrong?

Are the Researchers Biased?

• Homeopathy studies done by homeopaths
• Chiropractic studies done by chiropractors
• Surgical studies done by surgeons
• Studies published in specialty journals for a biased audience

Page 24: Evaluating Evidence in Medicine: What Can Go Wrong?

Who Are the Subjects?

• Self-selection bias: who volunteers?
– Believers?
– Professional subjects?
• Select group not typical of the general population.
– Men only? No children? Limited age group?
– Subjects with concurrent diseases not accepted
– Subjects taking other medications not accepted

Page 25: Evaluating Evidence in Medicine: What Can Go Wrong?

Were Negative Studies Suppressed?

• File drawer effect
– Negative studies not submitted for publication.
– What if 4/5 studies were negative but only the positive one published?
• Publication bias
– Journals don’t like to publish negative studies.
– Journals don’t like to publish replications that debunk original results. (Bem, Wiseman)

Page 26: Evaluating Evidence in Medicine: What Can Go Wrong?

Did Workers Mislead Author?

• Technicians and subordinates know what the researcher hopes to find.
– May try to please the boss, consciously or unconsciously
– May circumvent blinding procedures
– Can record 4.5 as 4 or 5.
– Faking to make the job easier (homeopathy prep)

Page 27: Evaluating Evidence in Medicine: What Can Go Wrong?

Did Workers Mislead Author?

• Benveniste homeopathy study
• Counting basophil degranulation under the microscope is somewhat subjective
• Only one technician got positive results

Page 28: Evaluating Evidence in Medicine: What Can Go Wrong?

What Are the Odds?

• 9 out of 10 drugs in Phase I clinical trials fail.
• 50% of drugs that reach Phase III trials fail.
• A far higher percentage of promising drugs never make it to clinical trials; they fail in animal and in vitro studies.

Page 29: Evaluating Evidence in Medicine: What Can Go Wrong?

Do the Data Justify the Conclusion?

• Teaching exercise:
1. Read the data section first
2. Draw your own conclusions
3. Read the paper’s conclusions
4. Scratch your head

Page 30: Evaluating Evidence in Medicine: What Can Go Wrong?

Do the Data Justify the Conclusion?

Conclusion: low cholesterol kills children. The higher the cholesterol, the better for health.

Page 31: Evaluating Evidence in Medicine: What Can Go Wrong?

Do the Data Justify the Conclusion?

• Sample of opportunity: data not collected systematically
• Too few points to show correlation
• Correlation doesn’t prove causation
• Other explanations:
– Hygiene
– Poverty
– Disease
– Starvation
– Genetic factors
– Less access to medical care
• Better explanation: undernourished children have abnormally low cholesterol levels

Page 32: Evaluating Evidence in Medicine: What Can Go Wrong?

Do the Data Justify the Conclusion?

Conclusion: by the year 2038, 100% of children will be autistic

Page 33: Evaluating Evidence in Medicine: What Can Go Wrong?

What Aren’t They Telling Us?

• Selection methods
• Randomization methods
• Identity of placebo
• Whether people were fooled by placebo
• Proper blinding procedures?
• Other factors
– Glassware not thoroughly washed?
– Contaminants in lab?
– Mouse XMRV virus contaminated cell cultures in CFS study
– Did they really do what they said they did?

Page 34: Evaluating Evidence in Medicine: What Can Go Wrong?

How Many Dropouts?

• 10 total patients: 7 negative, 3 positive = 30% positive
• 6 drop out because it’s not working
• The 30% success rate now looks like 75% (3 of the 4 who remained)
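The slide’s dropout arithmetic, as a minimal sketch (intention-to-treat analysis, which keeps dropouts in the denominator, is the standard guard against this inflation):

```python
# 10 patients, 3 respond. If the 6 non-responders drop out "because it's
# not working" and are excluded, the apparent success rate inflates.
total = 10
responders = 3
dropouts = 6  # all dropouts were non-responders

intention_to_treat = responders / total            # counts everyone enrolled
completers_only = responders / (total - dropouts)  # counts only who stayed

print(f"Intention-to-treat rate: {intention_to_treat:.0%}")  # 30%
print(f"Completers-only rate:    {completers_only:.0%}")     # 75%
```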

Page 35: Evaluating Evidence in Medicine: What Can Go Wrong?

Where Was the Study Done?

Percent of Acupuncture Trials with Positive Results

• Canada, Australia, New Zealand: 30%
• US: 53%
• Scandinavia: 55%
• UK: 60%
• Rest of Europe: 78%
• Asia: 98%
• Brazil, Israel, Nigeria: 100%

Page 36: Evaluating Evidence in Medicine: What Can Go Wrong?

What was the sample size?

• 1/3 of the chickens got better• 1/3 of the chickens stayed the same• What about the other third?

Page 37: Evaluating Evidence in Medicine: What Can Go Wrong?

Were There Errors in Statistics?

• Wrong statistical test used• Errors in calculation

Page 38: Evaluating Evidence in Medicine: What Can Go Wrong?

What About Noncompliance?

• Did all subjects take their pills?• Did they take them on time?

Page 39: Evaluating Evidence in Medicine: What Can Go Wrong?

Noncompliance

• HIV prophylaxis study in Africa
– 95% said they usually or always took meds on time
– Pill count data: 88%
– Tests showed adequate plasma levels of drug: 15-26%

Page 40: Evaluating Evidence in Medicine: What Can Go Wrong?

Tooth Fairy Science

• Are they trying to study something that doesn’t exist?

Page 41: Evaluating Evidence in Medicine: What Can Go Wrong?

Emily Rosa and the Emperor's New Clothes

Page 42: Evaluating Evidence in Medicine: What Can Go Wrong?

Inaccurate Measuring Methods?

• Questionnaires rely on unreliable memories and patient honesty.
– “30% less pain”
– “I eat like a bird”
– “Only one drink”

Page 43: Evaluating Evidence in Medicine: What Can Go Wrong?

Using a Bogus Test? Measuring the Components of ASEA

• Claimed to be a mixture of 16 chemically recombined products of salt and water with completely new chemical properties.

• They used a fluorescent indicator as a probe for unspecified “highly reactive oxygen species.”

Page 44: Evaluating Evidence in Medicine: What Can Go Wrong?

How Many Endpoints Were There?

• Multiple endpoints: some will show false correlations just by chance

• Statistical corrections applied?
• Inappropriate data mining?
• The heart prayer study
– 6 positive out of 26 factors studied
– Inconsistent pattern

Page 45: Evaluating Evidence in Medicine: What Can Go Wrong?

Were Goalposts Moved?

• AIDS prayer study: endpoint was death
• Not enough subjects died: AIDS drugs kept them alive
• They went back and looked at a lot of other factors and found some apparent successes (e.g., fewer doctor visits) but no change in objective tests like CD4 count.
• Only 40 patients. The study wasn’t designed to test non-death outcomes.

Page 46: Evaluating Evidence in Medicine: What Can Go Wrong?

Statistical Significance ≠ Clinical Significance

• Did the drug lower the BP by 1% or 30%?
• Was the endpoint a lab value or a clinical benefit?
• B vitamin supplements lower homocysteine but don’t lower the risk of heart disease
• PSA screening finds cancers; doesn’t improve survival
• Are the results POEMs – Patient-Oriented Evidence that Matters?

Page 47: Evaluating Evidence in Medicine: What Can Go Wrong?

Was There Fraud?

• Dipak Das, resveratrol researcher
– Review board found him guilty of 145 counts of fabrication or falsification of data
– 12 of his papers retracted so far

Page 48: Evaluating Evidence in Medicine: What Can Go Wrong?

“I was blinded by work and my drive for achievement”

• Hwang Woo-suk, stem cell researcher in South Korea, claimed to have cloned human embryonic stem cells
– Fabricated crucial data
– Embezzlement and bioethics law violations
– Prison sentence (suspended)
– 2 papers in Science retracted
– Fired from his job

Page 49: Evaluating Evidence in Medicine: What Can Go Wrong?

Columbia Prayer Study

• Prayer doubled success of in vitro fertilization
– Seriously flawed study
– Convoluted design with 3 levels of overlapping prayer groups
– No controls for prayers outside the study
– Investigated for lack of informed consent
• Authors
– Lobo, lead author, only learned of the study 6-12 months after it was completed. Denied any involvement other than editorial help.
– Cha severed his relationship with Columbia, refused to comment
– Wirth:
• Paranormal researcher with no medical degree
• Con man who went to federal prison for fraud and conspiracy
• Bruce Flamm debunked it in Skeptical Inquirer
• Retracted by the journal, but only years later
• Still being cited as a valid study

Page 50: Evaluating Evidence in Medicine: What Can Go Wrong?

How Were the Data Reported?

• NNT and NNH
– Lipitor for primary prevention of heart attacks:
• 19% reduction
• NNT 75-250, NNH 200
• Absolute risk vs. relative risk
– Cellphones increase the risk of acoustic neuroma. Relative risk 200%.
– Baseline risk is 1:100,000
– 200% of 1 is 2
– Absolute risk: 1 more in 100,000, or 0.001%

Page 51: Evaluating Evidence in Medicine: What Can Go Wrong?

What Are the Confidence Intervals?

• Confidence interval of 95%:
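The slide’s confidence-interval figure isn’t reproduced in this transcript. As a hedged illustration with made-up numbers (30 responders out of 100 patients, normal approximation), a 95% CI for a proportion can be computed like this:

```python
import math

# Hypothetical data, not from the talk: 30 of 100 patients respond.
n, responders = 100, 30
p_hat = responders / n

se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of a proportion
z = 1.96                                 # multiplier for 95% coverage
low, high = p_hat - z * se, p_hat + z * se

print(f"Observed rate {p_hat:.0%}, 95% CI {low:.1%} to {high:.1%}")
# A 95% CI means: if the study were repeated many times, about 95% of
# such intervals would contain the true rate. Wide interval = imprecise study.
```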

Page 52: Evaluating Evidence in Medicine: What Can Go Wrong?

Where Was the Study Published?

• Acupuncture studies in acupuncture journals?
• Homeopathy studies in homeopathy journals?
• Acupuncture or homeopathy study published in a major medical journal?

Page 53: Evaluating Evidence in Medicine: What Can Go Wrong?

Were the Results Misinterpreted?

• True acupuncture = sham acupuncture
• Both better than placebo pill
• Acupuncturist’s interpretation:
– Sham acupuncture must work too
• Better interpretation: both are impressive placebos.

Page 54: Evaluating Evidence in Medicine: What Can Go Wrong?

Were Recommendations Justified?

• X didn’t work, but it didn’t cause harm, and since we have no other effective treatment for Y, we should continue to use X.

• New drug X didn’t work better than the placebo, but we didn’t see any side effects, and since we have no other effective treatment for Y, X should be approved for marketing.

Page 55: Evaluating Evidence in Medicine: What Can Go Wrong?

Do we really know what the study showed?

• Peer critiques, letters to the editor
• Media distortions
– Presenting preliminary evidence as definitive
– Misinterpreting results of study

Page 56: Evaluating Evidence in Medicine: What Can Go Wrong?

Glucosamine/Chondroitin Study

• Overall: not effective
• Subgroup analysis (10 subgroups)
– Positive in patients with moderate to severe arthritis
– Negative in patients with mild to moderate arthritis
• Reported in the media as both positive and negative
• Authors said the study was not powered to show effectiveness in subgroups

Page 57: Evaluating Evidence in Medicine: What Can Go Wrong?

What does p value mean?

• “It’s significant at the p=0.05 level, so it must be true.”

• p=0.05
– Means: if the treatment is ineffective, there is a 5 in 100 chance of getting a positive result by chance.
– Doesn’t mean a 95% chance that a positive result is true.

• Says nothing about the meaning of a positive result

Page 58: Evaluating Evidence in Medicine: What Can Go Wrong?

4 possible outcomes

Page 59: Evaluating Evidence in Medicine: What Can Go Wrong?

P Value Threshold (α) = 1 − Specificity: Probability that an ineffective treatment will give a false positive result

Page 60: Evaluating Evidence in Medicine: What Can Go Wrong?

Sensitivity (the study’s power): The probability that an effective treatment will show a positive result

Page 61: Evaluating Evidence in Medicine: What Can Go Wrong?

Positive Predictive Value: if the study is positive, how likely is it to be true?

PPV = TP / (TP + FP)


Page 62: Evaluating Evidence in Medicine: What Can Go Wrong?

Prior Probability 50%

If researchers don’t consider prior probability, they are automatically assigning a PP of 50%. With 80% power and α = 0.05: PPV = 80/(80 + 5) = 94%, so there is a 6% chance that positive results are wrong.

Page 63: Evaluating Evidence in Medicine: What Can Go Wrong?

PPV with Prior Probability 5%

TP: 80% of 5% = 4%

FP: 5% of 95% = 4.75%

PPV = TP/(TP + FP) = 4%/8.75% = 46%

Chance that a positive result is true: 46%

Slightly less than a coin toss

Page 64: Evaluating Evidence in Medicine: What Can Go Wrong?

PPV with Prior Probability 5% and 1%

PP 5%:
TP: 80% of 5% = 4%
FP: 5% of 95% = 4.75%
PPV = TP/(TP + FP) = 4%/8.75% = 46%
Chance that a positive result is true: 46% (slightly less than a coin toss)

PP 1%:
TP: 80% of 1% = 0.8%
FP: 5% of 99% = 4.95%
PPV = 0.8%/(0.8% + 4.95%) = 0.8/5.75 = 14%
Chance that a positive result is true: 14%
Chance that a positive result is false: 86%

The Plausibility Problem, by David Weinberg, on SBM

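The PPV arithmetic on these slides generalizes to a one-line function (a sketch assuming, as the slides do, 80% power and α = 0.05 unless overridden):

```python
# Positive predictive value: P(effect is real | study came out positive).
def ppv(prior, power=0.80, alpha=0.05):
    true_pos = power * prior         # real effects correctly detected
    false_pos = alpha * (1 - prior)  # null effects passing p < alpha
    return true_pos / (true_pos + false_pos)

for prior in (0.50, 0.05, 0.01):
    print(f"Prior {prior:.0%}: PPV = {ppv(prior):.0%}")
# Prior 50%: PPV = 94%
# Prior 5%: PPV = 46%
# Prior 1%: PPV = 14%

# Lower power (typical of small studies) drops the PPV further:
print(f"Prior 5%, power 0.30: PPV = {ppv(0.05, power=0.30):.0%}")  # 24%
```

This is the quantitative core of the talk: for implausible treatments (low prior), most “statistically significant” positive results are false.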

Page 65: Evaluating Evidence in Medicine: What Can Go Wrong?

Statistical Significance

• The p=0.05 cutoff is arbitrary
• Statistical significance doesn’t mean clinical significance
• Be wary of studies that say the outcome was positive but admit it was not statistically significant.
– “X worked better than Y, but the results didn’t reach significance.”

Page 66: Evaluating Evidence in Medicine: What Can Go Wrong?

Power of a study

• A power of .8 in medical research is considered a very powerful study and would require a large number of patients.

• If the power is less than that, the Positive Predictive Value drops even more.

Page 67: Evaluating Evidence in Medicine: What Can Go Wrong?

When CAN You Believe Research?Bausell’s Quick Checklist

• Randomized with a credible control group
• At least 50 subjects per group
• Dropout rate 25% or less
• Published in a high-quality, prestigious, peer-reviewed journal

Page 68: Evaluating Evidence in Medicine: What Can Go Wrong?

When CAN You Believe Research?

• Confirmed by other studies• Consistent with other knowledge• Prior probability

Page 69: Evaluating Evidence in Medicine: What Can Go Wrong?

Systematic Reviews and Meta-analyses

• A flawed method to sort out conflicting research results.

• A few high quality studies trump the conclusions of a meta-analysis.

• The results of a meta analysis usually fail to predict the results of future good clinical trials.

Page 70: Evaluating Evidence in Medicine: What Can Go Wrong?

Does It Make Sense?

• Energy medicine proponents claim to have measured a 2 milligauss magnetic field emanating from practitioners’ hands
• Reproducible measurements by other scientists fall in the range of 0.004 milligauss.
• The magnetic field of the earth is 500 milligauss.
• Refrigerator magnet: 50,000 milligauss.
• Even if the 2 milligauss measurement were accurate, it would be many orders of magnitude below the cell’s noise level and billions of times less than the energy received by your eye when viewing the brightest star.

Page 71: Evaluating Evidence in Medicine: What Can Go Wrong?

Evidence-Based Medicine Isn’t Enough

EBM is working hard, but it got something wrong.

Page 72: Evaluating Evidence in Medicine: What Can Go Wrong?

What’s missing from the EBM pyramid?

Carl Sagan

Page 73: Evaluating Evidence in Medicine: What Can Go Wrong?

Don’t forget prior plausibility

• “Extraordinary claims require extraordinary evidence.”

• If basic science says a treatment is implausible, we must set the bar for clinical evidence higher.

Page 74: Evaluating Evidence in Medicine: What Can Go Wrong?

EBM Founders’ Assumptions

• The rigorous clinical trial is the final arbiter of any claim that has already demonstrated promise by all other criteria—basic science, animal studies, legitimate case series, small controlled trials, “expert opinion,” etc.
• Claims lacking in promise were not even part of the discussion.

Page 75: Evaluating Evidence in Medicine: What Can Go Wrong?

Plausibility Spectrum for CAM

• Homeopathy: close to zero
• Acupuncture: intermediate
– Underlying Oriental concepts have low plausibility
– But it’s plausible that inserting needles in the skin could cause physiological effects
• Herbal medicine: high plausibility, because plants produce drugs

Page 76: Evaluating Evidence in Medicine: What Can Go Wrong?

EBM Accepts Tooth Fairy Science

• Typical fairy tales:
– Reiki (faith healing that substitutes Eastern mysticism for Christian beliefs)
– Homeopathy (essentially a form of sympathetic magic)
– Therapeutic touch (a misnomer for smoothing out wrinkles in a mythical human energy field without actually touching the patient)

• Trying to apply the tools of science to these therapeutic modalities based on fantasy just produces a lot of confusing noise.

Page 77: Evaluating Evidence in Medicine: What Can Go Wrong?

Tooth Fairy Science:Studying Therapeutic Touch

• Therapeutic Touch is said to manipulate the alleged “human energy field.”
• Controlled TT studies have been done for:
– Pain
– Bone marrow transplant
– Recovery from cardiac surgery
• Positive results are due to the effects of suggestion and attention, not to a mythical energy field.

Page 78: Evaluating Evidence in Medicine: What Can Go Wrong?

Tooth Fairy Science: Therapeutic Touch

• 2008 study: Randomized trial of healing touch to speed recovery from coronary artery bypass surgery:
– Decrease in anxiety and length of stay
– No significant differences for other endpoints
• Lab study: therapeutic touch affects DNA synthesis and mineralization of human osteoblasts in culture.
• Cochrane review: positive

Page 79: Evaluating Evidence in Medicine: What Can Go Wrong?

Another EBM Pitfall: Pragmatic Studies

• Clinical trials select patients to minimize possible confounders: Subjects tend to be healthier and on fewer medications than the average patient.

• Pragmatic studies look at the outcome of treatments in real-world settings.

• Clotbuster drugs worked well for strokes in clinical trials, but with more extensive use in ERs they caused more strokes from bleeding complications.

Page 80: Evaluating Evidence in Medicine: What Can Go Wrong?

Pragmatic Trials May Not Be Appropriate for CAM Treatments

• Intended to evaluate practical real-world use of treatments that have already been proven to work in clinical trials.
• Pragmatic trials can’t provide objective evidence that a treatment has effects beyond placebo.
• CAM proponents favor pragmatic studies because:
– They don’t control for placebo effects.
– They can bypass good science.
– They can make CAM look better than it really is.

Page 81: Evaluating Evidence in Medicine: What Can Go Wrong?

Pragmatic Trial

• Acupuncture vs. usual care for low back pain.
• Acupuncture wins.
• Have you proved acupuncture is really more effective?
• No, this is Cinderella Science.

Page 82: Evaluating Evidence in Medicine: What Can Go Wrong?

Cinderella Revised

Before: Cinderella in rags and ashes. After: the ugly stepsister after a complete makeover.

Page 83: Evaluating Evidence in Medicine: What Can Go Wrong?

Pragmatic Trial of Acupuncture for Low Back Pain

(Chart: the pragmatic trial compares full acupuncture against usual care; the fair comparison is needles alone against the full acupuncture ritual.)

Page 84: Evaluating Evidence in Medicine: What Can Go Wrong?

Evaluating Evidence in Medicine: What Can Go Wrong?

• Placebo effect
• Therapeutic effect of consultation (suggestion, expectation)
• Unassisted natural healing (natural course of disease)
• Unrecognized treatments (the spaghetti sauce factor)
• Regression toward the mean
• Other concurrent conventional treatment
• Cessation of unpleasant treatment
• Lifestyle changes
• Publishing standards (p=0.05)
• Publication bias

Page 85: Evaluating Evidence in Medicine: What Can Go Wrong?

What Can Go Right?

• Large, well-designed studies
• Prior plausibility
• Strongly positive results
• Consensus of experts
• Coherent body of evidence

Page 86: Evaluating Evidence in Medicine: What Can Go Wrong?

Evaluating Evidence in Medicine: What Can Go Wrong?

Page 87: Evaluating Evidence in Medicine: What Can Go Wrong?

Dr. Jay Gordon Pediatrician to the Stars

• “My very strong impression is that children with the fewest vaccines, or no vaccines at all, get sick less frequently and are healthier in general. I truly believe they also develop less autism.”

Page 88: Evaluating Evidence in Medicine: What Can Go Wrong?

Take-Home Points

• There are various kinds of evidence, some more credible than others.

• Most published clinical studies are wrong.
• Even “evidence-based medicine” can get it wrong.
• Statistical significance ≠ clinical significance.
• Prior probability is important.
• Clinical studies don’t “prove” – they only change the probabilities.

Page 89: Evaluating Evidence in Medicine: What Can Go Wrong?

The End