
Module: Platform Cooperation and Information Sharing

By SAUFEX Consortium, 23 January 2026

[screen 1]

A foreign influence operation is detected on Facebook. Does Twitter know to look for it? When researchers identify manipulation tactics, do platforms adapt? Can governments share threat intelligence with platforms effectively?

Addressing information threats requires cooperation across platforms, between platforms and governments, and with civil society. Understanding this cooperation ecosystem is essential for effective defense.

[screen 2]

Why Cooperation Matters

Most threats span multiple platforms:

  • Foreign influence operations run across Facebook, Twitter, YouTube, and TikTok simultaneously
  • Banned actors migrate between platforms
  • Manipulation tactics spread across services
  • Harmful content gets cross-posted

Effective response requires coordination that no single actor can achieve alone.

[screen 3]

Platform-to-Platform Cooperation

Platforms share information about threats through various mechanisms:

Hash-sharing databases: Fingerprints of known child exploitation imagery and terrorist content

Coordinated inauthentic behavior: Information about removed networks

Malware and phishing: Security threat intelligence

Crisis response: Coordinating during breaking events

Cooperation helps, but competitive dynamics and different policies complicate coordination.

[screen 4]

The Global Internet Forum to Counter Terrorism (GIFCT)

Created in 2017 by Facebook, Microsoft, Twitter, and YouTube:

Purpose: Coordinate responses to terrorist content

Key activity: Hash-sharing database for known terrorist content

Evolution: Expanded beyond original members to 20+ platforms

Effectiveness: Rapid removal of terrorist propaganda after attacks

Concerns: Who defines “terrorism”? Scope creep? Accountability?

GIFCT demonstrates potential for platform cooperation but also raises governance questions.
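
To make the mechanism concrete, here is a minimal Python sketch of how a shared hash database works: one platform contributes a fingerprint of confirmed violating content, and other members check new uploads against the shared set. The functions and the in-memory set are hypothetical; real deployments such as GIFCT's database use perceptual hashes (e.g. PhotoDNA or PDQ) so that slightly altered copies still match, whereas the SHA-256 digest here only catches byte-identical files.

```python
import hashlib

# Hypothetical shared database: fingerprints contributed by member platforms.
# Real systems use perceptual hashes (PhotoDNA, PDQ) so near-duplicates match;
# SHA-256 here only matches byte-identical files.
shared_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Return a hex digest used as the content's fingerprint."""
    return hashlib.sha256(content).hexdigest()

def contribute(content: bytes) -> None:
    """Called after a platform confirms the content violates shared policy."""
    shared_hashes.add(fingerprint(content))

def check_upload(content: bytes) -> str:
    """Called by any member platform when new content is uploaded."""
    if fingerprint(content) in shared_hashes:
        return "match: hold for human review"
    return "no match: handle normally"

if __name__ == "__main__":
    sample = b"example of previously identified violating content"
    contribute(sample)           # platform A shares the fingerprint
    print(check_upload(sample))  # platform B detects the re-upload
```

The key point of the design is that only fingerprints are exchanged, never the underlying content, which limits both privacy exposure and re-circulation of the material itself.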

[screen 5]

Crisis Response Cooperation

During crises (attacks, disasters, conflicts), platforms sometimes coordinate:

  • Elevated content review for misinformation
  • Boosting authoritative sources
  • Sharing information about manipulation operations
  • Cross-platform policy enforcement

Examples include responses to terrorist attacks, natural disasters, and election interference attempts. Ad hoc coordination during the war in Ukraine showed the potential of such arrangements.

[screen 6]

Platform-Government Cooperation

Platforms interact with governments in multiple ways:

Legal requests: Court orders for data or content removal

Threat intelligence: Government sharing information about foreign operations

Consultation: Input on policy development

Enforcement cooperation: Supporting implementation of the DSA and other regulations

Crisis communication: Coordinating during security threats

Cooperation varies by jurisdiction and political relationship.

[screen 7]

Government Requests for Content Removal

Governments regularly request content removal:

Legal process: Court orders based on national law

Emergency requests: During crises without full legal process

Voluntary requests: Asking platforms to remove content not clearly illegal

Statistics: Tens of thousands of requests annually to major platforms

Transparency: Platforms disclose request volumes in periodic transparency reports

Balancing respect for local law with resistance to authoritarian censorship is challenging.

[screen 8]

Threat Intelligence Sharing

Governments share intelligence about foreign influence operations:

FIMI detection: Government intelligence about foreign information manipulation and interference (FIMI) operations

Attribution support: Helping platforms understand actor identities

Trend analysis: Pattern recognition across classified sources

Advance warning: Alerts about anticipated operations

Challenges:

  • Classification concerns
  • Timeliness
  • Platform trust in government information
  • Political sensitivity

Such sharing has become more systematic in recent years but remains largely ad hoc.
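
Making that sharing less ad hoc generally starts with agreed machine-readable formats. The sketch below builds a record loosely modeled on the OASIS STIX 2.1 indicator object, which is widely used for structured threat intelligence; the helper function, labels, and example domain are invented for illustration, not taken from any real exchange.

```python
import json
import uuid
from datetime import datetime, timezone

def make_indicator(pattern: str, description: str) -> dict:
    """Build a shareable threat-intelligence record loosely modeled on a
    STIX 2.1 indicator. Field names follow the STIX convention; the values
    used below are illustrative only."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "description": description,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
        "labels": ["fimi", "inauthentic-behavior"],  # hypothetical labels
    }

if __name__ == "__main__":
    record = make_indicator(
        pattern="[url:value = 'https://example.com/suspect-site']",
        description="Domain linked to a suspected foreign influence operation.",
    )
    print(json.dumps(record, indent=2))
```

A common format lets a government agency, a platform, and a researcher all consume the same record without bespoke translation, which addresses part of the standardization problem discussed later in this module.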

[screen 9]

Platform-Researcher Cooperation

Researchers need platform data; platforms need external validation:

Data access partnerships: Providing data for academic research

Bug bounty programs: Rewarding security vulnerability discovery

Research collaborations: Joint studies on platform effects

Consultation: Researchers advising on policy

Tensions:

  • Privacy vs research needs
  • Platform control over research agenda
  • Selective data provision
  • Researcher independence

The DSA mandates researcher access to platform data, which may improve cooperation.

[screen 10]

Fact-Checker Partnerships

Platforms partner with independent fact-checkers:

Third-party fact-checking programs: IFCN-certified fact-checkers review content

Platform response: Reducing distribution of false content, adding labels

Funding: Platforms often fund fact-checking operations

Coverage: Multiple countries and languages

Effectiveness debates:

  • Corrections typically reach fewer people than the original false claims
  • Labels occasionally strengthen belief (the contested “backfire effect”)
  • Fact-checkers are overwhelmed by the volume of content
  • But: there is some evidence that labels and downranking reduce spread
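
Much of this cooperation runs on machine-readable fact-check data. The example below shows an illustrative record in the schema.org ClaimReview format that many fact-checkers publish and platforms ingest to match verdicts to circulating claims; the outlet, URL, claim, and rating values are invented.

```python
import json

# Illustrative ClaimReview record (schema.org); outlet, URL, claim, and rating
# are invented. Fact-checkers publish markup like this so platforms can link a
# published verdict to copies of the claim circulating on their services.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://factcheck.example.org/reviews/1234",  # hypothetical URL
    "claimReviewed": "Claim that drinking bleach cures influenza.",
    "author": {"@type": "Organization", "name": "Example Fact Check"},
    "datePublished": "2026-01-15",
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "alternateName": "False",
    },
}

print(json.dumps(claim_review, indent=2))
```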

[screen 11]

Civil Society Cooperation

NGOs and advocacy groups work with platforms:

  • Reporting harmful content and coordinated campaigns
  • Providing expertise on specific issues (hate speech, child safety, etc.)
  • Pressure campaigns for policy changes
  • Monitoring platform commitments
  • Collaborative policy development

Relationships range from partnership to adversarial, depending on the issue and the platform.

[screen 12]

Cross-Border Cooperation

Information threats ignore national borders, requiring international cooperation:

EU-US cooperation: Information sharing on foreign influence

Five Eyes intelligence sharing: Extended to platform threat intelligence

Interpol and Europol: Law enforcement coordination

OECD and G7: Policy development and coordination

Regional networks: Southeast Asian and Latin American cooperation initiatives

Effectiveness is limited by differing legal frameworks and political relationships.

[screen 13]

Information Sharing Challenges

Cooperation faces multiple obstacles:

Competition: Platforms are business rivals

Privacy: Legal limits on data sharing

Trust: Concerns about how information will be used

Standardization: Platforms use different data formats and categories

Speed: Urgent threats require rapid sharing

Volume: Too much information to process effectively

Liability: Legal risk from sharing potentially wrong information

[screen 14]

The API Access Question

Researchers and watchdogs need programmatic access to platform data:

Arguments for open APIs:

  • Enable independent monitoring
  • Support research on platform effects
  • Democratize platform oversight

Platform concerns:

  • Privacy violations
  • System abuse and scraping
  • Competitive intelligence extraction
  • Security risks

Many platforms have restricted API access in recent years, hampering research. The DSA requires researcher access, but implementation details will determine how useful that access is.
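
As a sketch of what programmatic access looks like in practice, the following Python example fetches data from a hypothetical research endpoint using bearer-token authentication, cursor pagination, and a simple rate limit. The URL, parameters, and response fields are assumptions for illustration, not any particular platform's API.

```python
import json
import time
import urllib.parse
import urllib.request

API_URL = "https://api.example-platform.com/research/v1/posts"  # hypothetical endpoint
TOKEN = "RESEARCHER_ACCESS_TOKEN"  # issued under a hypothetical data-access agreement

def fetch_posts(query: str, max_pages: int = 3) -> list[dict]:
    """Fetch matching posts page by page, following a cursor and pausing
    between requests to respect a (hypothetical) rate limit."""
    results: list[dict] = []
    cursor = None
    for _ in range(max_pages):
        params = {"query": query}
        if cursor:
            params["cursor"] = cursor
        url = f"{API_URL}?{urllib.parse.urlencode(params)}"
        req = urllib.request.Request(url, headers={"Authorization": f"Bearer {TOKEN}"})
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        results.extend(page.get("data", []))
        cursor = page.get("next_cursor")
        if not cursor:
            break
        time.sleep(1)  # simple rate limiting
    return results

if __name__ == "__main__":
    posts = fetch_posts("coordinated OR inauthentic")
    print(f"retrieved {len(posts)} posts for analysis")
```

The same pattern, with stricter vetting, logging, and privacy safeguards, is roughly what DSA-style researcher access regimes are meant to formalize.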

[screen 15]

Private Messaging Challenges

End-to-end encrypted messaging creates cooperation dilemmas:

  • Platforms can’t access content to moderate
  • Law enforcement can’t access for investigations
  • But: Encryption protects privacy and security
  • Proposals for compromises (client-side scanning) are contentious

No consensus on balancing privacy and safety in encrypted spaces.
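
To illustrate what client-side scanning proposals envisage: the device compares content against a list of known-harmful fingerprints before encryption, so the server never sees the message. The sketch below is a deliberately simplified Python illustration under that assumption; it ignores the perceptual hashing, list governance, and circumvention problems that make the approach contentious.

```python
import hashlib

# Hypothetical list of fingerprints of known-harmful media, shipped to the device.
# Who compiles this list, and how it is audited, is central to the controversy.
known_harmful_hashes: set[str] = set()

def client_side_check(attachment: bytes) -> bool:
    """Compare the attachment against the on-device list before encryption,
    so the server never sees message content."""
    return hashlib.sha256(attachment).hexdigest() in known_harmful_hashes

def send_message(attachment: bytes) -> str:
    if client_side_check(attachment):
        return "flagged on device"  # what happens next is exactly what is contested
    return "encrypted and sent end-to-end"
```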

[screen 16]

Best Practices for Cooperation

Effective cooperation requires:

Clear protocols: Standardized processes for information sharing

Timely communication: Rapid sharing during urgent threats

Appropriate scope: Focused on genuine threats, not overreach

Accountability: Oversight of how shared information is used

Reciprocity: Information flowing in both directions

Privacy protection: Safeguarding user data

Transparency: Public reporting on cooperation activities

Trust building: Sustained relationships, not just crisis response

[screen 17]

Future of Platform Cooperation

Cooperation is likely to increase:

  • Regulatory requirements (the EU’s DSA, the UK’s Online Safety Act)
  • Growing sophistication of threats
  • Public pressure for coordination
  • Learning from successes and failures

Key questions:

  • How to balance cooperation with competition?
  • Can small platforms participate meaningfully?
  • How to prevent authoritarian abuse of cooperation mechanisms?
  • What governance ensures cooperation serves public interest?

The cooperation ecosystem will continue evolving as threats and governance mature.

[screen 18]

Your Role

As users and citizens, you can support effective cooperation by:

  • Reporting suspicious coordinated activity
  • Supporting independent researchers and fact-checkers
  • Advocating for meaningful transparency
  • Holding both platforms and governments accountable
  • Understanding the complexity of cooperation rather than demanding perfect solutions

Effective defense requires a whole-of-society approach. Individual contributions create collective resilience.