
Q003 — Assessment

Query

Has the academic literature identified and documented the absence of formal evidence evaluation frameworks in fact-checking as a gap?

Hypotheses Tested

  • H1 (Documented Gap): Multiple academic papers explicitly identify the absence of formal evidence evaluation frameworks in fact-checking.
  • H2 (No Documentation): The academic literature has not identified this as a gap.
  • H3 (Nuanced): Some papers note methodological limitations in passing, but no sustained research program addresses this gap explicitly.

Finding

H3 is best supported by the evidence, but the literature is closer to H1 than initially expected.

The academic literature HAS identified epistemological and methodological limitations of fact-checking, but the identification is scattered across multiple disciplines and framed in different terms rather than constituting a unified research program focused on the absence of formal evidence evaluation frameworks.

Confidence: HIGH

Synthesis

The gap has been identified from multiple disciplinary perspectives:

1. Philosophy and Epistemology (Direct identification)

  • Uscinski & Butler (2013) (SRC-Q3-06): The earliest and most explicit identification. Argued fact-checking methods "fail to stand up to the rigors of scientific inquiry." Identified eight fundamental problems including epistemology, bias, ambiguity, and objectivity. However, the paper's hostile stance toward fact-checking (recommending it be "condemned to the dustbin of history") reduced its constructive impact.

  • Fernandez-Roldan & Teira (2024) (SRC-Q3-10): The most analytically rigorous identification from philosophy of science. Concluded that fact-checkers' "verification protocols" fail to achieve reproducibility or accountability. Argued that fact-checking is presented as "quasi-scientific" but does not meet the epistemological standards this implies.

  • Seeck et al. (2025) (SRC-Q3-02): Identified three "deep-rooted challenges threatening the epistemological basis of fact-checking" (objectivism, truth regimes, causal relations). The paper's title — "An Epistemological Framework" — signals that such a framework was NEEDED because one did not exist.

2. Journalism Studies (Implicit identification)

  • Koliska & Roberts (2024) (SRC-Q3-11): Found shared epistemological BELIEFS across 40 organizations but documented these as informal norms, not formal methodology — implicitly documenting the gap between shared beliefs and a codified framework.

  • Steensen, Kalsnes & Westlund (2024) (SRC-Q3-09): Documented how the absence of a formal framework allows fact-checking to degenerate into "confirmative epistemology" under time pressure. This is documentation of the gap through its CONSEQUENCES rather than through direct statement.

  • Cazzamatta (2025) (SRC-Q3-03): Documented variation in verification approaches across countries, organizations, and topics — empirical evidence of absent standardization.

  • "Methodology Used by Fact-Checkers" (2024) (SRC-Q3-12): Explicitly noted that the IFCN "does not provide more detailed guidance" beyond its five principles — direct documentation that the leading practitioner standard lacks methodological depth.

3. Computational Research (Technical identification)

  • Vladika & Matthes (2023) (SRC-Q3-01): Explicitly identified that no existing datasets account for "disagreeing evidence, or differing levels and strength of evidence." Framed this as a "promising research direction," directly recognizing the gap in computational fact-checking resources.
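To make the dataset gap concrete, the sketch below shows what an evidence-aware claim record could look like if it did account for disagreeing evidence and evidence strength. This is a hypothetical illustration, not a schema from Vladika & Matthes or any existing benchmark; the `Stance` labels and the 0.0–1.0 `strength` scale are assumptions chosen for clarity.

```python
from dataclasses import dataclass, field
from enum import Enum


class Stance(Enum):
    """Relation of a piece of evidence to the claim (hypothetical label set)."""
    SUPPORTS = "supports"
    REFUTES = "refutes"
    NEUTRAL = "neutral"


@dataclass
class Evidence:
    text: str
    stance: Stance
    strength: float  # assumed scale: 0.0 (weak) to 1.0 (strong)


@dataclass
class Claim:
    statement: str
    evidence: list = field(default_factory=list)

    def has_disagreement(self) -> bool:
        """True if the record contains both supporting and refuting evidence."""
        stances = {e.stance for e in self.evidence if e.stance != Stance.NEUTRAL}
        return len(stances) > 1


claim = Claim("Example claim under verification")
claim.evidence.append(Evidence("Study A reports a supporting result.", Stance.SUPPORTS, 0.8))
claim.evidence.append(Evidence("Study B disputes the finding.", Stance.REFUTES, 0.4))
print(claim.has_disagreement())  # True: disagreeing evidence is represented, not collapsed
```

Existing verification datasets, per the survey, collapse such records into a single claim-level label, which is exactly the information this structure would preserve.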

  • "Misinformation as a Harm" (2023) (SRC-Q3-15): Stated explicitly that "fact-checking processes overall are still young and not standardized."

4. Emerging Gap-Filling Attempts

  • Grut (2026) (SRC-Q3-14): Documented fact-checkers adopting intelligence/OSINT practices for visual verification, producing "substantiated verification." This is the closest evidence of practitioners organically filling the gap, though limited to visual verification.

  • "What is a fact in the age of generative AI?" (2026) (SRC-Q3-13): Proposed three categories of facts plus "emergent facts" — a theoretical contribution that uses fact-checking as an epistemic lens rather than identifying it as epistemologically deficient.

Counter-Evidence

Two forms of counter-evidence weaken the gap thesis:

  1. The sociotechnical framing (SRC-Q3-05): If fact-checking is properly understood as a sociotechnical problem-solving practice rather than a quasi-scientific truth-finding enterprise, then the absence of formal epistemological frameworks may be APPROPRIATE rather than a gap. The comparison to GRADE/IPCC/ICD 203 may be a category error — those frameworks serve disciplines that explicitly position themselves as scientific, which fact-checking does not necessarily claim to be.

  2. The practitioner confidence finding (SRC-Q3-11): Koliska & Roberts found that fact-checkers share high confidence in their ability to determine objective truth through transparent, reproducible processes. From the practitioner perspective, there may be no PERCEIVED gap even if scholars identify one.

Assessment of Gap Documentation Quality

The gap has been identified by:

  • At least 3 papers that EXPLICITLY argue fact-checking lacks formal epistemological rigor (SRC-Q3-06, SRC-Q3-10, SRC-Q3-15)
  • At least 3 papers that PROPOSE frameworks to address epistemological challenges (SRC-Q3-02, SRC-Q3-13, SRC-Q3-14)
  • At least 4 papers that IMPLICITLY document the gap through empirical findings of inconsistency, variation, or informal methodology (SRC-Q3-03, SRC-Q3-09, SRC-Q3-11, SRC-Q3-12)

However, these papers:

  • Come from different disciplines (philosophy, journalism studies, computational linguistics) and rarely cross-reference one another
  • Frame the gap in different terms (epistemological failure, reproducibility deficit, standardization absence, methodological variation)
  • Do not converge on a single proposed solution

JUDGMENT: The gap is DOCUMENTED but DISPERSED. It has been identified by enough independent sources to constitute a genuine finding, but it has not been consolidated into a recognized research program with a shared problem definition and solution path.

Researcher Bias Warning

The researcher's declared interest in demonstrating that fact-checking lacks formal frameworks creates a risk that the assessment overweights gap-identifying papers and underweights counter-evidence. Specific mitigations applied:

  • The sociotechnical counter-argument (SRC-Q3-05) is given full consideration.
  • Amazeen's defense (SRC-Q3-07) is acknowledged as a legitimate challenge.
  • The practitioner confidence finding (SRC-Q3-11) is noted as potentially undermining the gap thesis from a practitioner perspective.
  • The assessment distinguishes between EXPLICIT and IMPLICIT gap identification to avoid over-claiming.

Gaps and Limitations

  1. The Uscinski & Butler (2013) paper is the most cited gap-identification source, but its hostile framing may have discouraged constructive follow-up on the specific evidence-evaluation gap.
  2. Non-English academic literature was not searched.
  3. The computational fact-checking literature was searched primarily through surveys; individual system papers may contain more specific gap identifications.
  4. The distinction between "the gap exists" and "the gap has been documented" is important — this assessment is about documentation, not about whether the gap objectively exists.