
Module: What is Disinformation?

By SAUFEX Consortium, 23 January 2026

[screen 1]

“Fake news.” “Propaganda.” “Lies.” We use many terms to describe false information, but precise definitions matter. If we want to address information manipulation, we must understand what we’re fighting.

This module defines disinformation and related concepts, providing the foundation for everything that follows.

[screen 2]

Core Definitions

Three key terms describe problematic information:

Disinformation: False information deliberately created and spread with intent to deceive

Misinformation: False information spread without intent to deceive (often by mistake)

Malinformation: Genuine information shared to cause harm (leaked private info, out-of-context truth)

Key distinction: Intent. Disinformation requires deliberate deception; misinformation doesn’t.

[screen 3]

The Intent Problem

How do we know someone intended to deceive?

Challenge: We can’t read minds. Intent is inherently difficult to establish.

Reality: Unless there is direct evidence of planning (leaked communications, coordinated campaigns, financial motives), intent must be inferred.

Approach: Analysts assess intent based on:

  • Consistency of false claims
  • Coordination patterns
  • Financial or political motives
  • Sophistication of operation
  • Response to corrections

Intent assessment is usually a confidence level, not a certainty; one way to make such judgments explicit is sketched below.
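
To make the idea of a confidence level concrete, here is a minimal, purely illustrative Python sketch. The indicator names, weights, and thresholds are hypothetical placeholders, not a SAUFEX or analyst-community standard; a real assessment would weigh such signals qualitatively rather than mechanically.

    # Illustrative only: hypothetical indicator names and weights,
    # not an established scoring methodology.
    INDICATOR_WEIGHTS = {
        "consistent_false_claims": 0.25,
        "coordination_patterns": 0.25,
        "financial_or_political_motive": 0.20,
        "operational_sophistication": 0.15,
        "ignores_corrections": 0.15,
    }

    def intent_confidence(observed_indicators: set) -> str:
        """Map observed indicators to a coarse confidence band, not a verdict."""
        score = sum(weight for name, weight in INDICATOR_WEIGHTS.items()
                    if name in observed_indicators)
        if score >= 0.7:
            return "high confidence of deceptive intent"
        if score >= 0.4:
            return "moderate confidence"
        return "low confidence; treat as possible misinformation"

    # Example: coordination plus refusal to correct lands in the middle band.
    print(intent_confidence({"coordination_patterns", "ignores_corrections"}))

The value of writing indicators down this way is not the score itself but the transparency: it records which observations drove a given confidence call.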

[screen 4]

What Makes Information “False”?

Establishing falsity requires standards:

Scientific topics: Contrary to scientific consensus

  • Example: “Vaccines cause autism” contradicts overwhelming evidence

Verifiable facts: Contradicts documented reality

  • Example: “The election had 3 million illegal votes” is contradicted by the documented evidence

Contested claims: Harder to adjudicate

  • Interpretation vs fact
  • Emerging issues without consensus

Context matters: Same information can be true, false, or misleading depending on context and framing.

[screen 5]

Truth, Facts, and Trust

How do we establish truth?

Science: Repeatable, testable, consensus-based

Journalism: Multiple sources, verification, evidence

Intelligence: Source evaluation, corroboration, confidence levels

Law: Evidence, witness testimony, standards of proof

Common thread: Trust in sources and methods

Reality: Fact-checking ultimately relies on trusted authorities and evidence. When trust erodes, establishing truth becomes harder.

[screen 6]

“Fake News”: A Problematic Term

Why “fake news” is inadequate:

Originally: Entirely fabricated stories presented as news

Weaponized: Now used to dismiss any inconvenient reporting

Problems:

  • Exploited by politicians to attack legitimate journalism
  • Implies all false information looks like “news”
  • Oversimplifies a complex phenomenon

Better terms: Disinformation, misinformation, false information, information manipulation

Precise language matters.

[screen 7]

Types of False Information

Disinformation takes many forms:

Fabricated content: Completely false, invented from nothing

Manipulated content: Genuine content altered (edited images, doctored documents)

Imposter content: Impersonating genuine sources

Misleading content: Selective truth, misleading framing

False context: Genuine content with false context (old photo presented as recent)

Satire/parody: Humorous content mistaken for the real thing (not disinformation unless there is intent to deceive)

[screen 8]

Disinformation vs. Propaganda

Related but distinct concepts:

Propaganda:

  • Communication to influence opinion
  • May use truthful or false information
  • Often one-sided, not necessarily false
  • Historically associated with governments

Disinformation:

  • Specifically false information
  • Deliberately deceptive
  • Can be from any actor
  • Modern focus on deception

Overlap: Propaganda campaigns often include disinformation, but propaganda isn’t always disinformation.

[screen 9]

Online vs. Offline Disinformation

Digital changes the game:

Traditional disinformation:

  • Pamphlets, radio, TV
  • Limited reach without resources
  • Slower spread
  • Centralized creation

Digital disinformation:

  • Anyone can create and spread
  • Instant global reach
  • Algorithmic amplification
  • Permanent (archived, screenshotted)
  • Unprecedented scale and speed

Same basic concept, transformed by technology.

[screen 10]

Harm from Disinformation

Why disinformation matters:

Individual harms:

  • Bad decisions based on false beliefs
  • Health risks (medical misinformation)
  • Financial losses (scams)
  • Psychological distress

Societal harms:

  • Undermining democratic processes
  • Public health crises (vaccine hesitancy)
  • Erosion of trust in institutions
  • Polarization and conflict
  • Violence (motivated by false beliefs)

Cumulative effect: Degraded information environment where truth becomes harder to identify.

[screen 11]

Not All False Information is Disinformation

Important distinctions:

Honest mistakes: Journalists occasionally err, then correct the record

Satire: Clearly intended as humor (The Onion, etc.)

Contested interpretations: Disagreement about implications

Evolving information: Scientific understanding changes

Opinion: Subjective views aren’t “false”

Disinformation requires: Falsity + intent to deceive + deliberate spread

Overuse of the “disinformation” label can itself be problematic.

[screen 12]

Who Creates Disinformation?

Multiple actors with different motives:

State actors: Foreign influence operations, domestic propaganda

Political actors: Campaigns, partisans seeking advantage

Financial actors: Profit from engagement, advertising, scams

Ideological actors: True believers spreading aligned false narratives

Malicious actors: Chaos, entertainment, attention-seeking

Hybrid operations: Combinations of the above

Understanding actor motivations helps analysts anticipate and counter disinformation.

[screen 13]

Information Disorder Framework

Claire Wardle’s comprehensive framework:

Three types (disinformation, misinformation, malinformation)

Seven types of content (fabricated, manipulated, imposter, etc.)

Three elements:

  • Agent (who created or spread it)
  • Message (what the content says)
  • Interpreter (who receives it and how they understand it)

Value: Comprehensive view of information ecosystem problems

The framework is widely adopted by researchers and policymakers; a minimal annotation sketch based on it follows.
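
To show how the framework’s elements might be recorded in practice, here is a small, hypothetical annotation schema in Python. The class and field names are inventions for illustration, not an official coding scheme; the content categories follow Wardle’s seven types, adding “false connection” to the six named earlier in this module.

    from dataclasses import dataclass
    from enum import Enum

    class DisorderType(Enum):
        DISINFORMATION = "false, spread with intent to deceive"
        MISINFORMATION = "false, spread without intent to deceive"
        MALINFORMATION = "genuine, shared to cause harm"

    class ContentType(Enum):
        # Wardle's seven content categories.
        SATIRE_PARODY = "satire/parody"
        FALSE_CONNECTION = "false connection"
        MISLEADING = "misleading content"
        FALSE_CONTEXT = "false context"
        IMPOSTER = "imposter content"
        MANIPULATED = "manipulated content"
        FABRICATED = "fabricated content"

    @dataclass
    class InformationDisorderRecord:
        agent: str            # who created or spread the item
        message: str          # what the content claims
        interpreter: str      # who receives it and how they read it
        disorder_type: DisorderType
        content_type: ContentType

    # Hypothetical example: an old photo recirculated as breaking news.
    record = InformationDisorderRecord(
        agent="coordinated anonymous accounts",
        message="years-old photo presented as a current event",
        interpreter="general social media audience",
        disorder_type=DisorderType.DISINFORMATION,
        content_type=ContentType.FALSE_CONTEXT,
    )
    print(record.disorder_type.value)

Encoding items this way mainly helps keep agent, message, and interpreter distinct: the same message can fall into different categories depending on who spreads it and with what intent.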

[screen 14]

Disinformation vs. Conspiracy Theories

Overlapping but distinct:

Conspiracy theories:

  • Explain events through secret plots
  • May be believed sincerely by promoters
  • Often unfalsifiable
  • Psychological and social functions

Disinformation:

  • False information deliberately spread
  • Creator knows it’s false
  • Specific claims, not just theories

Overlap: Disinformation campaigns often exploit and amplify conspiracy theories

Not all conspiracy theorists are disinformation agents, and not all disinformation involves conspiracy theories.

[screen 15]

The Definitional Challenge

Why definitions are contested:

Politics: “Disinformation” used to delegitimize opponents

Epistemology: What counts as “false” is sometimes unclear

Free speech: Concerns about who decides truth

Cultural differences: Varying norms about truth and deception

Evolution: Phenomenon evolving faster than definitions

Reality: Perfect definitions impossible, but working definitions necessary for research and policy

Acknowledge limitations while working with best available definitions.

[screen 16]

Why Precise Definitions Matter

Consequences of imprecise language:

Policy: Vague laws can be abused for censorship

Research: Incomparable studies if measuring different things

Public discourse: “Disinformation” loses meaning through overuse

Trust: False accusations damage credibility

Effectiveness: Can’t address a problem that isn’t clearly defined

Balance: Precision without paralysis; act while refining understanding

[screen 17]

Beyond “True” and “False”

Information can be problematic without being false:

Misleading but true: Selective facts creating false impression

Manipulative framing: True information presented deceptively

Harmful truth: Private information weaponized

Context stripping: Facts without necessary context

Focus: Harm and manipulation, not just falsity

Information integrity is about more than just factual accuracy.

[screen 18]

Working Definition for This Programme

For EMoD purposes:

Disinformation: False, misleading, or deceptively framed information deliberately created or spread to deceive, manipulate, or cause harm

Broad enough: Captures various problematic information

Specific enough: Excludes honest mistakes and legitimate disagreement

Acknowledges: Gray areas and edge cases

Practical: Enables policy and intervention design

With this foundation, we can explore how disinformation spreads, how to recognize it, and how to counter it.