
Module: Cross-Border Disinformation

By SAUFEX Consortium, 25 January 2026

Purpose: Build plans that survive jurisdiction mismatch and censorship optics.

Output format: Assessment → Confidence (low/med/high) → Next action


[screen 1]

Borders for Law, Not Content

Disinformation campaigns cross borders effortlessly. Liability and legitimacy don’t.

A campaign can originate in Country A, be hosted in Country B, amplified through a platform headquartered in Country C, target audiences in Countries D through G, and evade jurisdiction everywhere.

Your legal authority stops at your border. The content doesn’t.


[screen 2]

Export Logic Exists

Some state and non-state actors treat influence operations as export products:

  • Playbooks developed domestically, applied internationally
  • Reusable assets (networks, infrastructure, personnel)
  • Distribution services available for hire
  • Proven techniques licensed to allied actors

This is an industry, not a series of isolated incidents. Campaigns that worked in one country get adapted for others.


[screen 3]

Platforms Are Geopolitical Actors

Major platforms operate globally but are headquartered in specific jurisdictions.

What this means:

  • They optimize for scale, not national interest
  • Fragmented governance creates operational flexibility
  • Different rules in different markets (when enforced at all)
  • Lobbying power concentrated in home jurisdiction

Platforms navigate between governments. Understanding their positioning helps predict their behavior.


[screen 4]

Fragmentation Is a Strategy

For malign actors, fragmented response is a feature, not a bug.

Exploitation patterns:

  • Forum shop for weakest jurisdiction
  • Exploit gaps between national enforcement
  • Use sovereignty arguments to block cooperation
  • Play democracies against each other

If responses split, campaigns win by default. Coordination is necessary, not optional.


[screen 5]

Minimum Viable Alignment

Perfect international agreement is impossible. Minimum viable alignment is achievable.

Elements of MVA:

  • Shared thresholds (what constitutes actionable content)
  • Shared reporting format (how to communicate about incidents)
  • Shared escalation triggers (when to coordinate response)
  • Shared attribution standards (what evidence counts)

You don’t need identical laws. You need compatible responses.
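
To make these elements concrete, here is a minimal sketch of what a shared reporting format and escalation trigger could look like in machine-readable form. All field names, the severity scale, and the two-country trigger are illustrative assumptions, not an agreed standard:

    from dataclasses import dataclass
    from datetime import datetime
    from enum import Enum

    # Hypothetical severity scale; the real shared thresholds would be negotiated.
    class Severity(Enum):
        MONITOR = 1      # below the shared actionable threshold
        ACTIONABLE = 2   # meets the shared threshold for platform reporting
        ESCALATE = 3     # triggers coordinated cross-border response

    @dataclass
    class IncidentReport:
        """Illustrative shared reporting format; field names are assumptions."""
        incident_id: str
        detected_at: datetime
        reporting_country: str             # ISO code of the reporting state
        affected_countries: list[str]      # jurisdictions where content circulates
        platforms: list[str]               # platforms where the content appears
        severity: Severity                 # mapped against the shared threshold
        attribution_evidence: list[str]    # evidence items meeting the shared standard
        recommended_action: str            # e.g. "platform report", "joint statement"

    def should_escalate(report: IncidentReport) -> bool:
        """Shared escalation trigger: actionable severity plus multi-country spread."""
        return (
            report.severity.value >= Severity.ACTIONABLE.value
            and len(report.affected_countries) >= 2
        )

A format like this only helps if every party fills it in the same way; agreeing on that before an incident is the whole point.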


[screen 6]

Legitimacy Constraints

Here’s the hard truth:

If your plan looks like censorship, it will fail politically — even if “effective” by narrow metrics.

Legitimacy requirements:

  • Due process (clear rules, appeal mechanisms)
  • Transparency (public about what’s being addressed)
  • Proportionality (response matches harm)
  • Non-partisan application (not just targeting opponents)

Effective isn’t enough. Legitimate and effective is the standard.


[screen 7]

Operational Coordination

Vague cooperation fails. Operational coordination requires answers:

Who decides?

  • Which entity triggers response?
  • What’s the decision-making process?
  • Who has authority in ambiguous cases?

Who acts?

  • Platform? Government? Civil society?
  • What’s the sequence of actions?

Who communicates?

  • Single voice or coordinated messaging?
  • Who talks to media?

Who measures?

  • What counts as success?
  • How is effectiveness assessed?

Name it. Don’t vibe it.
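
One way to force the naming is to write the answers down as a role matrix before an incident. The sketch below is illustrative only; every entity name is a placeholder to be replaced with a real counterpart:

    # Illustrative coordination matrix. Every entity name is a placeholder;
    # the exercise is to replace each one with a real, named counterpart.
    COORDINATION_ROLES = {
        "decides": {
            "trigger_authority": "National rapid-response unit (placeholder)",
            "ambiguous_cases": "Joint coordination cell (placeholder)",
        },
        "acts": {
            "sequence": ["platform report", "prebunking push", "public statement"],
        },
        "communicates": {
            "public_voice": "Single designated spokesperson (placeholder)",
            "media_contact": "Press officer per participating state (placeholder)",
        },
        "measures": {
            "success_criteria": ["reach reduction", "no defection from joint line"],
            "review_owner": "Post-incident review team (placeholder)",
        },
    }

    # Any empty answer means a role was vibed, not named.
    unnamed = [question for question, answers in COORDINATION_ROLES.items() if not answers]
    assert not unnamed, f"Unassigned coordination roles: {unnamed}"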


[screen 8]

DIM Application for Cross-Border

Cross-border scenarios usually need a blended toolkit:

Gen 4 (Reach reduction)

  • Platform action across jurisdictions
  • Shared enforcement standards
  • Coordinated moderation requests

Gen 3 (Prebunking)

  • Resilience building before campaigns hit
  • Shared early warning
  • Cross-border media literacy

Gen 5 (Cohesion)

  • Building aligned civil society
  • Cross-border fact-checking networks
  • Shared professional standards

Gen 2 (Debunking) — use sparingly

  • Rapid response when claims cross borders
  • But a debunk issued in one country can itself be amplified across borders, drawing fresh attention to the original claim elsewhere

[screen 9]

Practical Scenario

Situation: A coordinated campaign is spreading election disinformation across three EU member states. The content originates from servers in a non-EU country. Platforms are US-based. Each member state has different legal frameworks for addressing online harms.

Your task (20 minutes):

Write a 12-line cross-border memo covering:

  • Key constraints (legal, political, operational)
  • Minimum shared actions (what all parties should do)
  • What you’ll explicitly avoid (overreach, legitimacy risks)
  • Success metrics (how you’ll know if response worked)
  • Assessment + Confidence + Next action

[screen 10]

Sample Memo Structure

CROSS-BORDER RESPONSE MEMO: Election Disinfo Campaign

Constraints:

  • Jurisdiction mismatch: origin outside EU, platforms in US, targets in 3 member states (MS)
  • Legal variation: each MS has different takedown authority
  • Timing: election in 14 days limits coordination window
  • Legitimacy: any response will be characterized as “censorship” by some

Minimum Shared Actions:

  • All 3 MS report to platform via existing channels (24hr)
  • Shared attribution standard: “coordinated inauthentic behavior” (requires evidence)
  • Single public statement from EU-level entity (avoids fragmented messaging)
  • Synchronized prebunking content through existing fact-check networks

Explicit Avoidance:

  • Will NOT claim foreign state attribution without strong evidence
  • Will NOT pursue content removal for opinion (only coordinated inauthenticity)
  • Will NOT make independent national statements (coordination priority)

Success Metrics:

  • Reach reduction >50% within 72 hours
  • No legitimate speech chilled (measured by false positive reports)
  • Coordinated response maintained (no defection)
  • Post-action review with lessons learned

Assessment: Manageable cross-border incident requiring coordination discipline. Confidence: Medium (depends on platform cooperation and MS alignment). Next action: Initiate MS coordination call; parallel platform escalation.
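
A note on the first metric: "reach reduction >50% within 72 hours" only means something if the baseline and the measurement window are fixed in advance. A minimal sketch of the arithmetic, assuming reach is approximated by platform-supplied daily impression counts (a simplifying assumption):

    from datetime import datetime, timedelta

    def reach_reduction(baseline_impressions: int, current_impressions: int) -> float:
        """Fraction by which estimated daily reach has fallen versus the baseline."""
        if baseline_impressions == 0:
            return 0.0
        return 1.0 - (current_impressions / baseline_impressions)

    def metric_met(baseline: int, current: int,
                   response_started: datetime, measured_at: datetime,
                   target: float = 0.5, window: timedelta = timedelta(hours=72)) -> bool:
        """True if the reduction target is reached inside the agreed window."""
        within_window = measured_at - response_started <= window
        return within_window and reach_reduction(baseline, current) > target

    # Example: 1.2M daily impressions at baseline, 480k after 48 hours = 60% reduction.
    print(metric_met(1_200_000, 480_000,
                     datetime(2026, 5, 1, 9, 0), datetime(2026, 5, 3, 9, 0)))  # True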


[screen 11]

Common Cross-Border Failures

Failure 1: Going it alone

  • Single country acts, others don’t follow
  • Creates perception of overreach
  • Doesn’t address root problem

Failure 2: Waiting for perfect alignment

  • Coordination is delayed while the campaign grows
  • The perfect becomes the enemy of the good

Failure 3: Ignoring legitimacy

  • Effective action that appears authoritarian
  • Political backlash exceeds disinformation harm

Failure 4: Platform dependence

  • Assuming platform will act on request
  • No contingency for non-cooperation

[screen 12]

Building Cross-Border Capacity

Long-term investments that pay off:

  • Relationships: Know your counterparts before incidents
  • Standards: Agreed protocols before they’re needed
  • Technology: Shared detection and monitoring infrastructure
  • Trust: Built through repeated successful cooperation
  • Learning: Systematic post-incident review and improvement

You can’t build coordination during a crisis. Build it before.


[screen 13]

The Sovereignty Trap

Malign actors exploit sovereignty rhetoric:

  • “You’re interfering in our domestic affairs”
  • “This is censorship by foreign powers”
  • “Democratic nations shouldn’t coordinate against speech”

Counter-framing:

  • Defending democracy isn’t threatening it
  • Coordinated defense isn’t coordinated censorship
  • Sovereignty includes the right to protect the information environment

But this only works if your actions are genuinely legitimate. Overreach validates the critique.


[screen 14]

Module Assessment

Scenario: Intelligence indicates a foreign state actor is preparing a disinformation campaign targeting upcoming elections in 5 European countries. The campaign will use US-based platforms, content created in the foreign state, and amplification through a mix of authentic local partisans and inauthentic accounts.

Task (15 minutes):

  1. List 3 jurisdiction-related constraints
  2. Propose minimum viable alignment (3 elements)
  3. What platform cooperation would you seek?
  4. What legitimacy guardrails would you implement?
  5. How would you handle domestic actors who amplify the foreign campaign?
  6. Assessment + Confidence + Next action

Scoring:

  • Credit realistic acknowledgment of constraints
  • Reward legitimacy awareness
  • Penalize responses that ignore sovereignty concerns

[screen 15]

Key Takeaways

  • Disinformation crosses borders; governance doesn’t
  • Export logic: influence operations are a repeatable product
  • Platforms are geopolitical actors navigating between governments
  • Fragmentation is a strategy for malign actors; coordination is necessary defense
  • Minimum viable alignment beats perfect agreement that never happens
  • If it looks like censorship, it will fail politically
  • Operational coordination requires named roles: who decides, acts, communicates, measures
  • Cross-border usually needs blended DIM: Gen 4 + Gen 3 + Gen 5, with Gen 2 used sparingly
  • Build coordination capacity before crises, not during them

Next Module

Continue to: Attack the Rows — Externalities, internalisation, and the “solved ledger.” How to stop subsidizing spillovers.