Module: The Market for Bad News

By SAUFEX Consortium, 25 January 2026

Purpose: Shift the frame from “wrong beliefs” to “transactions that clear.”

Output format: Assessment → Confidence (low/med/high) → Next action


[screen 1]

Reality Check

Disinformation spreads because it clears trades: attention for status, outrage for belonging, clicks for money. Truth is optional.

This isn’t a moral observation. It’s a market observation. Understanding the market is the first step to intervening in it.


[screen 2]

The Bedtime Story

“If people knew the facts, they’d believe differently.”

This is comforting. It’s also largely wrong.

Markets don’t care. They price behavior, not correctness. People don’t consume information to be correct — they consume it to feel connected, informed, validated, or entertained.

If fact-based information competed on pure merit, cable news would look very different.


[screen 3]

Product vs. By-product

Here’s the framing shift:

Often the “content” isn’t the product. It’s the by-product.

The actual products:

  • Engagement (keeps you on platform)
  • Recruitment (builds community or customer base)
  • Donations (political or ideological fundraising)
  • Authority (positions creator as expert)
  • Distraction (pulls attention from something else)

The false claim is just the delivery vehicle.


[screen 4]

First Question

Stop asking: “Is it true?”

Start asking: “What trade does this enable, and for whom?”

This reframes the problem. You’re not fact-checking beliefs — you’re analyzing transactions.

Who is giving what? Who is getting what? What market does this content serve?
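The three questions above can be written down as a tiny ledger of trades. This is a sketch only; the `Trade` class and the example entries are hypothetical illustrations, not part of the module:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    giver: str    # who is giving
    gives: str    # what they give up
    getter: str   # who is getting
    gets: str     # what they receive

# Hypothetical ledger for a single outrage post on a social platform
ledger = [
    Trade("reader", "attention", "creator", "ad/affiliate revenue"),
    Trade("reader", "a share", "creator", "reach and status"),
    Trade("creator", "engagement", "platform", "time-on-site"),
]

for t in ledger:
    print(f"{t.giver} gives {t.gives} -> {t.getter} gets {t.gets}")
```

If you cannot fill in all four fields for a piece of content, you have not yet identified the transaction it enables.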


[screen 5]

Who Profits

Multiple actors extract value from the disinformation ecosystem:

  • Creators: attention, followers, influence, money
  • Platforms: engagement, time-on-site, ad revenue
  • Campaigners: political or ideological gains
  • Opportunists: financial exploitation of attention
  • The “counter” ecosystem: sometimes even fact-checkers and researchers extract value from the problem’s persistence

This isn’t cynicism. It’s incentive mapping.


[screen 6]

Who Pays

Costs are distributed, often to those who didn’t choose to participate:

  • Social trust: degraded by competing “truths”
  • Institutional legitimacy: eroded when authorities are routinely questioned
  • Public attention: consumed by noise instead of signal
  • Security budgets: organizations pay for defensive measures
  • Mental health: anxiety, confusion, polarization

The bill goes downstream to people who never clicked.


[screen 7]

Confidence Discipline

A critical distinction:

You can be high-confidence on the market framing while low-confidence on attribution.

“This content serves an engagement market and generates ad revenue” — assessable.

“This content was created by Russian intelligence” — requires different evidence.

Don’t mix them. Market analysis doesn’t require knowing who’s behind it.
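One way to keep the two tracks from bleeding into each other is to record them as separate fields of the assessment. A minimal sketch; the `Assessment` class and field names are hypothetical, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    market_claim: str
    market_confidence: str       # low / med / high
    attribution_claim: str
    attribution_confidence: str  # tracked separately, never inherited from the market side

a = Assessment(
    market_claim="Serves an engagement market; generates ad revenue",
    market_confidence="high",
    attribution_claim="Creator identity unknown",
    attribution_confidence="low",
)

print(f"Market: {a.market_claim} ({a.market_confidence})")
print(f"Attribution: {a.attribution_claim} ({a.attribution_confidence})")
```

Forcing the two confidence ratings into separate slots makes it structurally impossible to let a strong market read launder a weak attribution claim.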


[screen 8]

DIM Application

The market framing informs response selection:

If amplification is the harm channel:
  → Gen 4 (moderation, reach reduction)
  → Target the distribution, not the belief

If persuasion is the channel:
  → Gen 3 (prebunking) or Gen 5 (community resilience)
  → Work on the demand side

If it’s noise without significant impact:
  → Sometimes “do nothing” wins
  → A response can create the market that wasn’t there
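The branching above reduces to a simple lookup. A sketch only: the channel keys and the fallback string are my own, while the “Gen” labels come from the screen:

```python
# Hypothetical mapping from dominant harm channel to a response option.
RESPONSE_BY_CHANNEL = {
    "amplification": "Gen 4: moderation / reach reduction (target distribution)",
    "persuasion": "Gen 3 prebunking or Gen 5 community resilience (demand side)",
    "noise": "Do nothing (avoid creating the market that wasn't there)",
}

def select_response(channel: str) -> str:
    # Unknown channel: hold off rather than guess at an intervention
    return RESPONSE_BY_CHANNEL.get(channel, "Assess further before acting")

print(select_response("amplification"))
```

The point of the fallback is the same as the third branch: when the channel is unclear, intervening is itself a market signal, so the default is to assess, not act.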


[screen 9]

Practical Scenario

Situation: A health misinformation account with 200,000 followers promotes unproven supplements. Posts get high engagement. Account has affiliate links to supplement sales.

Your task (8 minutes):

Write 8 lines covering:

  • Transaction framing (1 sentence)
  • Who profits / who pays (3 bullets)
  • 2 hypotheses (organic belief vs. incentivized promotion)
  • 1 incentive lever you could pull
  • Assessment + Confidence + Next action

[screen 10]

Sample Response

Transaction framing: Account converts health anxiety into supplement sales via trust-building content.

Who profits:

  • Account holder: affiliate revenue, influence
  • Supplement companies: sales
  • Platform: engagement, ad revenue adjacent to content

Who pays:

  • Followers: money, potentially health
  • Healthcare system: treating preventable harms
  • Trust environment: erosion of legitimate health information

Hypotheses:

  1. True believer who discovered monetization opportunity
  2. Purely commercial operation optimizing for sales

Incentive lever: Affiliate link disclosure requirements; platform demonetization for health misinformation

Assessment: Commercial disinformation operation. High confidence on market structure, low confidence on creator intent. Next action: Document revenue model, consider platform report for undisclosed paid promotion.


[screen 11]

Key Takeaways

  • Disinformation spreads because it clears trades, not because people are stupid
  • “Is it true?” matters less than “What trade does this enable?”
  • Identify who profits and who pays — follow the incentives
  • Market analysis doesn’t require attribution — you can be confident about economics while uncertain about actors
  • Response should target the market structure, not just the beliefs
  • Sometimes the best intervention is changing what pays

Next Module

Continue to: TTF Simplest Form: Write the Ledger — Turn slogans into trades. If you can’t write the ledger, you can’t change the ledger.