Divergence score
An integer from 0 to 100 that answers one question: how much do outlets disagree about this story?
Higher = more disagreement. Lower = more consensus.
What goes into it
Four signals, combined:
- Framing spread. How many distinct framing tags the sources used. An event framed by three outlets as `critical`, three as `pro-action`, and three as `neutral` scores higher than one where all nine used `neutral`.
- Fact agreement. From the fact ledger: the ratio of `confirmed` to `disputed` and `omitted` claims across sources.
- Sentiment spread. Standard deviation of per-article sentiment. High spread means some outlets are enthusiastic while others are grim.
- Bias coverage. Whether the story is covered across the spectrum. A story covered only by one side gets a lower spread score (there is less to disagree about) but is flagged separately in `/divergence/gaps`.
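Two of these signals are simple enough to sketch. The shape below is illustrative only: the `Article` interface, the normalization of framing spread, and the `[-1, 1]` sentiment range are assumptions, not the actual implementation.

```typescript
// Hypothetical per-article shape; field names are assumptions.
interface Article {
  framing: string;   // e.g. "critical", "pro-action", "neutral"
  sentiment: number; // assumed per-article sentiment in [-1, 1]
}

// Framing spread: distinct framing tags, normalized to [0, 1]
// (one assumption of many possible normalizations).
function framingSpread(articles: Article[]): number {
  const tags = new Set(articles.map((a) => a.framing));
  return articles.length > 1 ? (tags.size - 1) / (articles.length - 1) : 0;
}

// Sentiment spread: population standard deviation of per-article sentiment.
function sentimentSpread(articles: Article[]): number {
  const n = articles.length;
  if (n === 0) return 0;
  const mean = articles.reduce((s, a) => s + a.sentiment, 0) / n;
  const variance =
    articles.reduce((s, a) => s + (a.sentiment - mean) ** 2, 0) / n;
  return Math.sqrt(variance);
}
```

Nine articles all tagged `neutral` give a framing spread of 0; three-way split framing pushes it toward 1, matching the example above.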
The weights are:
| Signal | Weight |
|---|---|
| Framing spread | 0.40 |
| Fact agreement | 0.30 |
| Sentiment spread | 0.15 |
| Bias coverage | 0.15 |
Weights are not configurable in v2. They may be tuned between releases.
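The combination itself is a weighted sum. A minimal sketch, assuming each signal has already been normalized to [0, 1] and that fact agreement enters inverted (more disputed and omitted claims means a higher divergence contribution); the weights are the fixed v2 values from the table, everything else here is an assumption.

```typescript
// Assumed pre-normalized signals in [0, 1]; names are illustrative.
interface Signals {
  framingSpread: number;
  factDisagreement: number; // assumed inversion of fact agreement
  sentimentSpread: number;
  biasCoverage: number;
}

// Weighted sum with the fixed v2 weights, scaled to an integer 0-100.
function divergenceScore(s: Signals): number {
  const raw =
    0.4 * s.framingSpread +
    0.3 * s.factDisagreement +
    0.15 * s.sentimentSpread +
    0.15 * s.biasCoverage;
  return Math.round(raw * 100);
}
```

Because the weights sum to 1, a story maxing out every signal scores 100 and full consensus scores 0.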
Reading the score
| Range | Label | What it means |
|---|---|---|
| 0 - 33 | Agreement | Outlets substantially agree on facts and framing |
| 34 - 66 | Some divergence | Framing varies, maybe one or two disputed facts |
| 67 - 100 | High divergence | Outlets tell meaningfully different stories about the same events |
The homepage maps these three ranges to the green, amber, and red colors of the divergence pill. The API does not return the label; compute it client-side from the number.
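The client-side mapping is a straight range lookup over the table above; the function name is illustrative.

```typescript
type DivergenceLabel = "Agreement" | "Some divergence" | "High divergence";

// Map the 0-100 score onto the three labeled ranges from the table.
function divergenceLabel(score: number): DivergenceLabel {
  if (score <= 33) return "Agreement";
  if (score <= 66) return "Some divergence";
  return "High divergence";
}
```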
What it is not
- Not a measure of quality. Low divergence does not mean good reporting.
- Not a measure of truth. The fact ledger is a separate signal.
- Not a measure of political bias. An outlet scoring high on divergence on one story is not "more biased" - it is reporting a story others reported differently.