
Module: Why People Believe - Cognitive Biases and Psychology

By SAUFEX Consortium, 23 January 2026

[screen 1]

“How could anyone believe that?” we ask when we see people accept obvious disinformation. But believing false information isn’t about intelligence - it’s about how human brains work.

We all have cognitive biases: mental shortcuts that usually help us but can lead us astray. Understanding these biases is essential for understanding disinformation vulnerability - including our own.

[screen 2]

The Cognitive Miser Model

Humans conserve mental energy:

Reality: Our brains face overwhelming information

Response: Mental shortcuts (heuristics) for efficiency

  • Quick judgments without deep analysis
  • “Good enough” answers, not perfect
  • Automatic processing over deliberate

Usually helpful: Shortcuts enable fast decisions

Problem: Shortcuts can be exploited

Implication: Disinformation works because it exploits how we normally think

Everyone: Even smart, educated people use shortcuts

Not stupidity - efficiency with vulnerabilities.

[screen 3]

Confirmation Bias

We seek information confirming existing beliefs:

Definition: Tendency to search for, interpret, and recall information that confirms beliefs

Mechanisms:

  • Selective attention: Notice confirming evidence
  • Selective interpretation: Interpret ambiguity favorably
  • Selective recall: Remember confirming information

Example: If you believe X, you’ll notice evidence supporting X and overlook evidence against it

Why it exists: Maintaining consistent worldview is cognitively easier

Disinformation exploitation: False claims confirming beliefs spread faster than corrections

One of the most powerful and pervasive biases.
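
To make the mechanism concrete, here is a toy sketch in Python (the update rule, step size, and discount factor are invented for illustration, not a validated psychological model): an agent that discounts disconfirming evidence drifts toward its prior even when the evidence it sees is perfectly balanced.

    # Toy model of confirmation bias as asymmetric evidence weighting.
    # All parameters are illustrative assumptions.

    def update_belief(belief, evidence, discount=0.2, step=0.1):
        """Nudge belief (0..1) toward evidence (1 = supports, 0 = contradicts).

        Confirming evidence gets full weight; disconfirming evidence
        is discounted - that asymmetry is the bias.
        """
        confirms = (evidence == 1) == (belief >= 0.5)
        weight = 1.0 if confirms else discount
        target = 1.0 if evidence == 1 else 0.0
        return belief + step * weight * (target - belief)

    belief = 0.7  # starts mildly convinced of the claim
    for e in [1, 0, 1, 0, 1, 0, 1, 0]:  # perfectly balanced evidence
        belief = update_belief(belief, e)
    print(round(belief, 2))  # 0.75 - higher than 0.7 despite balanced input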

[screen 4]

Motivated Reasoning

We reason to reach desired conclusions:

Definition: Goal-directed reasoning where the goal is reaching a preferred conclusion

Process:

  1. Desired conclusion exists
  2. Search for evidence supporting it
  3. Uncritically accept supporting evidence
  4. Critically scrutinize contradicting evidence
  5. Conclude what you wanted

Differs from confirmation bias: More active, goal-directed

Identity-protective cognition: Motivated reasoning to protect group identity

Example: Evaluating the same evidence differently depending on whether it supports your political team

Implication: Facts alone won’t change minds when identity is at stake

[screen 5]

The Illusory Truth Effect

Repetition increases belief:

Finding: Repeated statements rated as more truthful than new ones

Mechanism: Familiarity confused with truth

  • Familiar feels easier to process
  • Processing fluency interpreted as truth

Disturbing: Works even when:

  • People know the statement is false
  • The statement contradicts their prior knowledge
  • Source is unreliable

Disinformation exploitation: Repeat lies until they feel true

Defense: Awareness helps but doesn’t eliminate effect

Why disinformation campaigns repeat the same false claims.
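
As a rough sketch of the repetition dynamic (the curve and all numbers below are invented for illustration; real effect sizes vary), perceived truth can be modeled as a base plausibility plus a fluency boost that grows and saturates with exposures:

    import math

    # Toy sketch of the illusory truth effect. The fluency curve and
    # parameters are illustrative assumptions, not measured values.

    def perceived_truth(exposures, base=0.35, gain=0.4, rate=0.6):
        """Base plausibility plus a familiarity boost that saturates."""
        fluency = 1 - math.exp(-rate * exposures)  # rises with repetition
        return base + gain * fluency

    for n in [0, 1, 3, 10]:
        print(n, round(perceived_truth(n), 2))
    # 0 -> 0.35, 1 -> 0.53, 3 -> 0.68, 10 -> 0.75
    # The statement never changes; only its familiarity does.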

[screen 6]

Availability Heuristic

We judge probability by what easily comes to mind:

Definition: Estimating likelihood based on mental availability of examples

Examples:

  • Overestimating terrorism risk after news coverage
  • Fearing plane crashes more than car accidents, though driving is far riskier
  • Believing crime increasing when coverage increases

Why it exists: Often what’s easily recalled is actually common

Problem: Vivid, emotional, recent events are more mentally available than common but mundane ones

Disinformation exploitation: Dramatic false stories create inflated threat perceptions

“If I can think of examples, it must be common.”

[screen 7]

The Backfire Effect

Corrections can strengthen false beliefs:

Definition: Correcting misinformation sometimes increases belief in it

Mechanism:

  • Correction perceived as threat to identity
  • Defensive processing activated
  • Counter-arguments generated
  • Belief strengthened

Contested: Recent research suggests the effect is less common than initially thought

When it occurs: Identity-central beliefs, distrusted sources

Implication: Simply stating facts can backfire

Solution approaches: Affirmation before correction, trusted messengers

Why “just correct it” doesn’t work.

[screen 8]

Source Confusion

We remember claims but forget context:

Sleeper effect: Over time, a message’s persuasiveness increases as its source is forgotten

Mechanism:

  • Remember the claim
  • Forget it was debunked
  • Forget source was unreliable
  • Claim feels true

Implication: Debunking has limited durability

Example: Remembering seeing a claim about vaccines, but forgetting it appeared in a fact-check showing it to be false

Why repeating a false claim in order to debunk it is risky: it amplifies the claim and feeds source confusion

[screen 9]

Dunning-Kruger Effect

Incompetence includes not recognizing incompetence:

Finding: People with low expertise overestimate their knowledge

Mechanism: Lacking the knowledge needed to recognize one’s own gaps

Curve:

  • Low knowledge = high confidence
  • Increasing knowledge = decreasing confidence (the complexity becomes visible)
  • High knowledge = appropriate confidence

Disinformation relevance: People confident in false beliefs may lack knowledge to recognize falsity

Everyone vulnerable: Experts in one domain can be overconfident in another

“I did my own research” - without the expertise to evaluate it.
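
The curve above can be sketched as a simple piecewise function (shape and breakpoints invented purely to illustrate the described pattern, not fitted to any data):

    # Toy confidence-vs-knowledge curve echoing the popular
    # Dunning-Kruger depiction. Numbers are illustrative only.

    def confidence(knowledge):
        """Map knowledge (0..1) to self-rated confidence (0..1)."""
        if knowledge < 0.2:                        # unaware of the gaps
            return 0.8
        if knowledge < 0.6:                        # complexity becomes visible
            return 0.8 - 1.0 * (knowledge - 0.2)   # confidence dips
        return 0.4 + 0.9 * (knowledge - 0.6)       # recovers with expertise

    for k in [0.1, 0.4, 0.9]:
        print(k, round(confidence(k), 2))  # 0.8, 0.6, 0.67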

[screen 10]

Proportionality Bias

Big events must have big causes:

Definition: Expecting event magnitude to match cause magnitude

Rejection of randomness: Hard to accept chance or small causes for major events

Example: Conspiracy theories about assassinations, pandemics, terrorist attacks

  • “A lone gunman couldn’t possibly…”
  • “There must be powerful forces behind this…”

Why: Desire for the world to make sense; big events should have big explanations

Disinformation exploitation: Conspiracy theories offer proportional explanations for major events

[screen 11]

Continued Influence Effect

Misinformation influences beliefs even after correction:

Finding: False information continues to shape thinking even after people learn it is false

Mechanism:

  • Information integrated into mental model
  • Removing it leaves explanatory gap
  • Gap uncomfortable, so information persists

Example: Continuing to believe Iraq had WMDs even after learning the intelligence was wrong

Implication: First impression matters enormously

Solution: Provide alternative explanation when debunking

Why rapid correction is important, and why correction alone isn’t enough.

[screen 12]

Social Proof

We follow what others do:

Definition: Looking to others’ behavior to determine correct action

Usually helpful: Crowd wisdom often accurate

Vulnerability: Can be manipulated

  • Fake accounts creating false consensus
  • Bot amplification of fringe views
  • Coordinated inauthentic behavior

Mechanism: “If everyone believes this, it must be true”

Disinformation exploitation: Creating illusion of consensus

Why FIMI operations use bot networks.
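
The arithmetic of manufactured consensus is simple. In this hedged toy calculation (all figures invented), a view held by 5% of genuine users looks several times more popular once bot accounts join in:

    # Toy illustration of false consensus via bot amplification.
    # All figures are invented; real operations are far more complex.

    genuine_users = 10_000
    fringe_share = 0.05                    # 5% genuinely hold the view
    fringe_voices = genuine_users * fringe_share
    bots = 2_000                           # coordinated inauthentic accounts

    visible_support = fringe_voices + bots
    visible_total = genuine_users + bots

    print(f"Actual support:   {fringe_share:.0%}")                     # 5%
    print(f"Apparent support: {visible_support / visible_total:.0%}")  # 21%
    # Engagement ranking and reposts can inflate visibility even further.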

[screen 13]

Group Identity and Belief

Beliefs signal belonging:

Expressive function: Beliefs as identity markers

  • Not about truth, about belonging
  • “People like us believe this”
  • Dissent risks exclusion

Identity-protective cognition: Protecting group identity over individual accuracy

Implication: Correcting a false belief can be perceived as an attack on identity

Political tribalism: Partisan identity drives belief more than evidence

Example: Same claim believed or rejected based on partisan source

Why polarization makes disinformation harder to counter.

[screen 14]

The Role of Emotion

Feelings shape belief:

Affect heuristic: Feeling influences judgment

  • Positive feeling = low risk, high benefit judgment
  • Negative feeling = high risk, low benefit judgment

Emotional content advantages:

  • More attention
  • Better memory
  • Wider sharing
  • Bypasses critical thinking

Fear and anger: Particularly powerful

  • Amplify sharing
  • Reduce deliberation
  • Increase susceptibility

Disinformation pattern: Emotionally charged false claims

Not rational actors - emotional beings.

[screen 15]

System 1 vs. System 2 Thinking

Two modes of thought:

System 1 (fast, automatic):

  • Intuitive
  • Effortless
  • Emotional
  • Stereotyping
  • Heuristic-based

System 2 (slow, deliberate):

  • Logical
  • Effortful
  • Analytical
  • Critical
  • Evidence-based

Default: System 1 (easier)

Vulnerability: Disinformation designed for System 1

  • Quick, emotional judgments
  • No deep analysis

Defense: Activate System 2

  • Pause before sharing
  • Critical evaluation
  • Fact-checking

[screen 16]

Cognitive Load and Vulnerability

Mental exhaustion increases susceptibility:

Cognitive load: The mental effort currently in use

Finding: High cognitive load increases vulnerability

  • Fewer mental resources for critical evaluation
  • More reliance on shortcuts
  • Easier to manipulate

Modern life: Constant cognitive load

  • Information overload
  • Decision fatigue
  • Stress and distraction

Implication: Tired, stressed, and overwhelmed people are more vulnerable

Strategy: Disinformation often exploits moments of crisis when load is highest

[screen 17]

Who Is Most Vulnerable?

Universal vulnerability with variation:

Not just the “gullible”: Everyone is vulnerable to some disinformation

Risk factors:

  • Strong prior beliefs (confirmation bias)
  • Low trust in institutions (alternative sources)
  • High cognitive load (less critical evaluation)
  • Strong group identity (identity-protective cognition)
  • Age extremes (digital literacy, cognitive changes)
  • Isolation (no corrective input)

Education: Not as protective as expected

  • Educated people can be better at rationalizing
  • Motivated reasoning works regardless of intelligence

Context matters: The same person can be vulnerable in some contexts and not in others

[screen 18]

Building Resilience

Understanding biases is the first step:

Individual strategies:

  • Awareness of own biases
  • Slow down, activate System 2
  • Seek disconfirming evidence
  • Consider alternative explanations
  • Check sources
  • Notice emotional manipulation

Social strategies:

  • Diverse social networks
  • Trusted correctors
  • Open discussion norms

Structural strategies:

  • Media literacy education
  • Platform design for deliberation
  • Information environment quality

Reality: We can’t eliminate biases, but we can recognize and compensate for them

Humility: Recognize everyone is vulnerable, including you

Understanding why people believe disinformation - including ourselves - is essential for effective counter-messaging. Not “those foolish people” but “we humans with predictable vulnerabilities.” Empathy and understanding enable better responses than condemnation.