[screen 1]
“Scientists discover chocolate helps weight loss!” The headline claims a study proves it. You read the actual study - it was 12 people for two weeks with no control group.
Headlines routinely overstate research findings. Understanding how to read studies helps you evaluate scientific claims without needing a PhD.
[screen 2]
Why This Matters
Research is increasingly used to support claims in public discourse. But most people encounter research through media headlines, not original studies.
This creates opportunities for misrepresentation:
- Headlines exaggerate findings
- Important caveats are omitted
- Single studies are treated as definitive
- Preliminary research is presented as proven
- Corporate-funded studies downplay conflicts of interest
Basic research literacy is self-defense.
[screen 3]
Anatomy of a Research Paper
Most research papers follow standard structure:
Abstract: Brief summary of entire study
Introduction: Background and research question
Methods: How the study was conducted
Results: What the data showed
Discussion: Interpretation and limitations
Conclusion: What researchers conclude
You don’t need to read everything - abstract, methods, and discussion/limitations are most important for evaluation.
[screen 4]
Peer Review Explained
Peer review means other experts in the field evaluate research before publication:
- Reviewers check methodology and reasoning
- They can request changes or reject publication
- The process is usually anonymous
- Not perfect - flawed studies get through, and politics can interfere
Peer-reviewed publication in reputable journals is a quality signal, but not a guarantee of truth. Non-peer-reviewed work (preprints, press releases) should be treated more skeptically.
[screen 5]
Types of Studies
Different study types provide different levels of evidence:
Randomized Controlled Trial (RCT): Gold standard - random assignment to treatment or control group
Observational: Watching what happens naturally without intervention (can show correlation, not causation)
Meta-analysis: Combining results from multiple studies (strong evidence when done well)
Case study: Detailed examination of individual cases (useful for rare phenomena, not generalizable)
Survey: Asking people questions (subject to response bias)
Understanding study type helps evaluate strength of conclusions.
[screen 6]
Sample Size and Selection
Two critical questions:
How many participants?
- Small (under 30): Very preliminary
- Medium (30 to a few hundred): Suggestive
- Large (1,000+): More reliable
- Very large (10,000+): Can detect small effects
These ranges are rough rules of thumb - how many participants a study needs depends on the size of the effect being measured.
How were they selected?
- Random sampling from target population: Best
- Convenience samples (students, volunteers): Limited generalizability
- Self-selected: Highly biased
A study of 20 college students doesn’t tell you much about humans in general.
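To see why small samples are so shaky, here is a minimal simulation sketch (the 50% "true rate," the sample sizes, and the number of pretend studies are all made-up illustration values, not from any real study). It draws small and large samples from the same population and compares how much the estimates swing:

```python
# Minimal sketch (made-up numbers): why small samples are unreliable.
# Assume a trait that 50% of the population really has, then see how much the
# measured rate swings when a study surveys 20 people versus 2,000 people.
import random

random.seed(1)

def measured_rates(sample_size, true_rate=0.5, studies=5):
    """Run several pretend studies and return the rate each one observes."""
    rates = []
    for _ in range(studies):
        hits = sum(random.random() < true_rate for _ in range(sample_size))
        rates.append(hits / sample_size)
    return rates

print("n=20:   ", [f"{r:.0%}" for r in measured_rates(20)])
print("n=2000: ", [f"{r:.0%}" for r in measured_rates(2000)])
# The n=20 studies scatter widely around the true 50%, while the
# n=2000 studies cluster tightly around it.
```

The point isn't the exact numbers - it's that a 20-person study can land far from the true value purely by chance, while a 2,000-person study rarely does.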
[screen 7]
Control Groups and Randomization
Quality experiments include:
Control group: Comparison group receiving placebo or standard treatment (without this, you can’t know if your intervention caused the effect)
Randomization: Participants randomly assigned to groups (prevents selection bias)
Blinding: Participants and/or researchers don’t know who’s in which group (reduces expectation effects)
Studies without these features are much weaker evidence.
[screen 8]
Reading the Limitations Section
The most honest part of a research paper is often the “limitations” section, where researchers acknowledge weaknesses:
- Small sample size
- Short duration
- Limited generalizability
- Potential confounding factors
- Measurement challenges
Headlines rarely mention limitations. Reading them yourself provides essential context for evaluating claims.
[screen 9]
Correlation vs. Causation Revisited
Observational studies can show correlation but not causation. Just because A and B occur together doesn’t mean A causes B:
- B might cause A (reverse causation)
- C might cause both A and B (confounding variable)
- It might be coincidence
“Ice cream sales correlate with drowning deaths” - doesn’t mean ice cream causes drowning. Both increase in summer (confounding variable: warm weather).
Only controlled experiments can reliably establish causation.
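If you want to see the ice cream example as data, here is a minimal simulation sketch (the temperatures, sales figures, and drowning counts are invented for illustration). A hidden common cause - warm weather - makes two causally unrelated quantities correlate strongly:

```python
# Minimal sketch (invented numbers): a hidden common cause can make two causally
# unrelated things correlate. Daily temperature drives both ice cream sales and
# drownings here; ice cream never influences drownings at all.
import random
from statistics import mean

random.seed(7)

def pearson(xs, ys):
    """Plain Pearson correlation, kept dependency-free for the sketch."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

days = 365
temps = [random.uniform(0, 35) for _ in range(days)]        # the confounder (°C)
ice_cream = [3.0 * t + random.gauss(0, 10) for t in temps]  # driven by temperature
drownings = [0.2 * t + random.gauss(0, 1) for t in temps]   # also driven by temperature

print(f"ice cream vs drownings: r = {pearson(ice_cream, drownings):.2f}")
# Prints a strong positive correlation even though the simulation contains no
# causal link between ice cream and drowning - only the shared cause (weather).
```

Controlling for the confounder - for example, comparing only days with similar temperatures - would make the apparent relationship largely disappear.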
[screen 10]
Conflicts of Interest
Who funded the research matters:
- Tobacco companies funded studies downplaying smoking risks
- Pharmaceutical companies fund trials of their own drugs (which raises the risk of positive publication bias)
- Food industry funds nutrition studies favoring their products
- Political advocacy groups fund research supporting their positions
This doesn’t automatically invalidate research, but it’s a red flag requiring extra scrutiny. Look for independent replication.
[screen 11]
How Media Distorts Research
Common patterns when research becomes news:
- Causation claimed from correlation: “Coffee causes cancer” from an observational study
- Exaggerated effect sizes: “50% reduction!” sounds dramatic but can be nearly meaningless if the baseline risk is tiny (a worked example follows this list)
- Ignoring limitations: Small sample, short duration, or conflicts of interest omitted
- Extrapolating beyond the data: Animal studies presented as if the findings were proven in humans
- Single study treated as definitive: Science requires replication
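The exaggerated-effect-size pattern is easiest to see with actual numbers. A minimal sketch using invented figures (a hypothetical condition affecting 2 in 10,000 people) shows how a headline-grabbing relative reduction can hide a tiny absolute one:

```python
# Minimal sketch (invented figures): why "50% reduction!" can mean very little.
# Suppose a condition affects 2 people in 10,000 and a treatment halves that risk.
baseline_risk = 2 / 10_000          # 0.02% affected without the treatment
treated_risk = baseline_risk * 0.5  # the headline's "50% reduction"

relative_reduction = (baseline_risk - treated_risk) / baseline_risk
absolute_reduction = baseline_risk - treated_risk

print(f"Relative risk reduction: {relative_reduction:.0%}")   # 50% - the headline number
print(f"Absolute risk reduction: {absolute_reduction:.3%}")   # 0.010% - the real-world change
print(f"People treated to prevent one case: {1 / absolute_reduction:,.0f}")  # 10,000
```

The absolute risk reduction and the number of people treated per case prevented are the figures headlines rarely report.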
Always try to find the original study when evaluating research claims.
[screen 12]
Evaluating Research Claims
Before accepting research-based claims:
- Find the original study (not just media coverage)
- Check if it’s peer-reviewed
- Look at sample size and selection
- Check the study type (observational or experimental?)
- Read the limitations section
- Identify who funded it
- Check for conflicts of interest
- See if it’s been replicated
- Look for systematic reviews or meta-analyses
- Consider whether media coverage matches the actual findings
[screen 13]
Red Flags in Research
Be skeptical when you see:
- Press release but no published paper
- Extraordinary claims with weak methodology
- Conflicts of interest not disclosed
- Refusing to share data or methods
- Results never replicated
- Published in low-quality or predatory journals
- “Breakthrough” claims based on preliminary research
- Researchers making claims beyond their data
Quality research acknowledges uncertainty and limitations.
[screen 14]
Building Research Literacy
You don’t need to become a scientist to evaluate research claims:
- Learn to read abstracts and limitations
- Understand basic study types
- Know that single studies prove little
- Check funding and conflicts
- Find the original source before accepting headlines
- Look for systematic reviews over single studies
Science is a cumulative process. Individual studies are pieces of larger puzzles, not definitive answers. Media literacy requires understanding this distinction.