Q001 — Source Registry

SRC-Q1-01

  • Title: Fact-Checking in Journalism: An Epistemological Framework
  • Authors: Hannele Seeck et al.
  • Date: 2025-04-17
  • Publication: Journalism Studies (Taylor & Francis)
  • URL: https://www.tandfonline.com/doi/full/10.1080/1461670X.2025.2492729
  • Reliability: HIGH — Peer-reviewed journal article in a leading journalism studies journal; institutional affiliation (LSE).
  • Relevance: DIRECT — Proposes an epistemological framework for fact-checking; directly addresses the query.
  • Bias assessment:
      • Funding: Not disclosed in search results.
      • Institutional: Academic (LSE); no apparent industry ties.
      • Ideological: Supportive of fact-checking as practice while acknowledging critiques.
      • Selection: Draws on existing epistemological critiques.
      • Temporal: Current (2025).
      • Geographic: European perspective.
  • Key contribution: Defines three epistemological challenges (objectivism, truth regimes, causal relations) across five aspects of fact-checking. This is a DESCRIPTIVE framework analyzing existing epistemological tensions, NOT a PRESCRIPTIVE evidence evaluation methodology comparable to GRADE.

SRC-Q1-02

  • Title: The Epistemology of Fact Checking
  • Authors: Joseph E. Uscinski, Ryden W. Butler
  • Date: 2013
  • Publication: Critical Review, Vol. 25, No. 2, pp. 162-180
  • URL: https://philpapers.org/rec/USCTEO
  • Reliability: HIGH — Peer-reviewed journal; widely cited foundational work (sparked ongoing scholarly debate).
  • Relevance: DIRECT — Foundational critique arguing fact-checking fails scientific epistemic standards.
  • Bias assessment:
      • Funding: Not disclosed.
      • Institutional: Academic (University of Miami).
      • Ideological: Hostile to fact-checking practice; argues it should be "condemned to the dustbin of history."
      • Selection: Uses selected examples; Amazeen (SRC-Q1-09) challenged sample representativeness.
      • Temporal: 2013; fact-checking has evolved since.
      • Geographic: US-centric.
  • Key contribution: Argues fact-checking methods "fail to stand up to the rigors of scientific inquiry" — implicitly identifies the absence of formal epistemological standards.

SRC-Q1-03

  • Title: The limits of live fact-checking: Epistemological consequences of introducing a breaking news logic to political fact-checking
  • Authors: Steen Steensen, Bente Kalsnes, Oscar Westlund
  • Date: 2024 (online 2023)
  • Publication: New Media & Society, Vol. 26
  • URL: https://journals.sagepub.com/doi/full/10.1177/14614448231151436
  • Reliability: HIGH — Peer-reviewed; based on participatory observation, interviews, and textual analysis.
  • Relevance: HIGH — Demonstrates how time pressure pushes fact-checking toward "confirmative epistemology" reliant on predefined source credibility rather than formal evidence evaluation.
  • Bias assessment:
      • Funding: Not disclosed.
      • Institutional: Academic (OsloMet, Kristiania).
      • Ideological: Critical of live fact-checking but supportive of fact-checking in principle.
      • Selection: Single case study (Faktisk.no, Norway).
      • Temporal: Based on 2021 data, published 2023-2024.
      • Geographic: Norwegian context.
  • Key contribution: Shows that under pressure, fact-checkers default to source-authority heuristics rather than structured evidence evaluation — no formal framework existed to fall back on.

SRC-Q1-04

  • Title: Convergent Epistemic Practices in Visual Fact-Checking
  • Authors: Ståle Grut
  • Date: 2026-03-10
  • Publication: Digital Journalism
  • URL: https://www.tandfonline.com/doi/full/10.1080/21670811.2026.2638939
  • Reliability: HIGH — Peer-reviewed; based on 13 in-depth interviews and participant observation.
  • Relevance: HIGH — Documents fact-checkers adopting epistemic practices from OSINT investigators and intelligence actors, introducing "substantiated verification" that transcends the true/false paradigm.
  • Bias assessment:
      • Funding: Not disclosed.
      • Institutional: Academic.
      • Ideological: Sympathetic to practice improvement.
      • Selection: Single case (Faktisk Verifiserbar, Norway).
      • Temporal: Current (2026).
      • Geographic: Norwegian context.
  • Key contribution: IMPORTANT NUANCE — Shows convergence of intelligence and journalism epistemic practices in visual verification. This is the closest evidence to fact-checking borrowing formal methodology from adjacent disciplines, but it is domain-specific (visual verification) and ad hoc rather than a formal, codified framework.

SRC-Q1-05

  • Title: "Fact-checking" fact checkers: A data-driven approach
  • Authors: Lee, S., Xiong, A., Seo, H., Lee, D.
  • Date: 2023-10-26
  • Publication: Harvard Kennedy School Misinformation Review
  • URL: https://misinforeview.hks.harvard.edu/article/fact-checking-fact-checkers-a-data-driven-approach/
  • Reliability: HIGH — Peer-reviewed; large empirical dataset (22,349 articles).
  • Relevance: MODERATE — Demonstrates rating disagreement between fact-checkers, implying absence of shared evidence evaluation standards.
  • Bias assessment:
      • Funding: Not disclosed in search results.
      • Institutional: Academic (Penn State).
      • Ideological: Neutral empirical study.
      • Selection: Limited to Snopes and PolitiFact (US-centric).
      • Temporal: Data from 2016-2022.
      • Geographic: US only.
  • Key contribution: Found only 1 conflicting bottom-line verdict among 749 matching claims, but disagreement was higher in the ambiguous middle range — exactly where a formal evidence grading system would be most needed.

SRC-Q1-06

  • Title: A Survey on Automated Fact-Checking
  • Authors: Zhijiang Guo, Michael Schlichtkrull, Andreas Vlachos
  • Date: 2022
  • Publication: Transactions of the Association for Computational Linguistics (TACL), Vol. 10
  • URL: https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00454/109469/A-Survey-on-Automated-Fact-Checking
  • Reliability: HIGH — Peer-reviewed survey in top NLP venue; widely cited.
  • Relevance: HIGH — Comprehensive survey of automated fact-checking pipelines; describes the three-stage framework (claim detection, evidence retrieval, claim verification) but no evidence quality grading.
  • Bias assessment:
      • Institutional: Academic (Cambridge).
      • Ideological: Technical/neutral.
      • Temporal: 2022; field has advanced since.
      • Geographic: International scope.
  • Key contribution: The automated fact-checking pipeline focuses on verdict prediction and justification production, but does NOT include any GRADE-comparable evidence quality assessment stage. Evidence is retrieved and used, but its quality/reliability is not formally graded.
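The three-stage pipeline the survey describes can be sketched as a minimal Python skeleton. This is an illustration of the pipeline's shape, not code from the survey: the function names and the toy keyword-overlap heuristics are assumptions standing in for the trained models real systems use at each stage.

```python
# Sketch of the three-stage automated fact-checking pipeline described
# by Guo et al. (2022). All function bodies are illustrative placeholders.

def detect_claims(sentences):
    """Stage 1: claim detection — filter check-worthy sentences.
    Toy heuristic: keep sentences containing a digit (a common
    check-worthiness signal), standing in for a trained classifier."""
    return [s for s in sentences if any(ch.isdigit() for ch in s)]

def retrieve_evidence(claim, corpus):
    """Stage 2: evidence retrieval — rank passages by word overlap.
    Real systems use sparse or dense retrieval (e.g. BM25, DPR)."""
    claim_words = set(claim.lower().split())
    return max(corpus, key=lambda doc: len(claim_words & set(doc.lower().split())))

def verify_claim(claim, evidence):
    """Stage 3: claim verification — predict a verdict from claim + evidence.
    Note what is MISSING: no stage grades the *quality* of the retrieved
    evidence itself — the gap this registry highlights relative to GRADE."""
    overlap = set(claim.lower().split()) & set(evidence.lower().split())
    return "SUPPORTED" if len(overlap) > 2 else "NOT ENOUGH INFO"

corpus = ["The city budget for 2023 was 4.2 billion.",
          "The mayor attended the opening ceremony."]
for claim in detect_claims(["The budget was 4.2 billion in 2023.",
                            "People seemed happy."]):
    evidence = retrieve_evidence(claim, corpus)
    print(claim, "->", verify_claim(claim, evidence))
```

The structural point the sketch makes visible: evidence flows straight from retrieval into verdict prediction with no intermediate quality-assessment stage.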

SRC-Q1-07

  • Title: Overview of the CLEF-2025 CheckThat! Lab
  • Date: 2025
  • Publication: Springer
  • URL: https://link.springer.com/chapter/10.1007/978-3-032-04354-2_13
  • Reliability: HIGH — Established shared task series.
  • Relevance: MODERATE — Shows current state of computational fact-checking tasks; no evidence quality grading task exists.
  • Key contribution: The shared task pipeline includes check-worthiness, previously fact-checked claim detection, evidence retrieval, and claim verification — but no evidence quality assessment subtask.

SRC-Q1-08

  • Title: IFCN Code of Principles
  • Publisher: International Fact-Checking Network / Poynter Institute
  • URL: https://ifcncodeofprinciples.poynter.org/
  • Reliability: HIGH — The primary industry standard; 31 assessment criteria reviewed by independent assessors.
  • Relevance: DIRECT — The closest thing to a shared fact-checking methodology standard.
  • Bias assessment:
      • Funding: Poynter Institute.
      • Institutional: Industry body.
      • Ideological: Pro-fact-checking.
      • Geographic: International.
  • Key contribution: CRITICAL FINDING — The IFCN Code addresses process transparency (5 commitments: nonpartisanship, source transparency, funding transparency, methodology transparency, corrections policy). It does NOT include: hierarchical evidence quality scales, calibrated confidence language, structured bias assessment of sources, or source reliability tiering. It prescribes WHAT to disclose, not HOW to evaluate evidence quality.

SRC-Q1-09

  • Title: Revisiting the Epistemology of Fact-Checking
  • Authors: Michelle A. Amazeen
  • Date: 2015
  • Publication: Critical Review, Vol. 27, No. 1
  • URL: https://www.tandfonline.com/doi/abs/10.1080/08913811.2014.993890
  • Reliability: HIGH — Peer-reviewed rebuttal in the same journal.
  • Relevance: HIGH — Defends fact-checking epistemology against Uscinski & Butler but does not propose a formal evidence evaluation framework.
  • Key contribution: Argues that dedicated fact-checkers (PolitiFact, FactCheck.org) use more rigorous methods than Uscinski & Butler's sample suggests, but the defense is based on professional practice norms, not a formal evidence grading system.

SRC-Q1-10

  • Title: ClaimReview — Schema.org Type
  • Publisher: Schema.org
  • URL: https://schema.org/ClaimReview
  • Reliability: HIGH — W3C-adjacent open standard.
  • Relevance: MODERATE — Structured data schema for encoding fact-check verdicts, not for evaluating evidence quality.
  • Key contribution: ClaimReview captures WHAT was claimed, WHO claimed it, and WHAT the verdict is. It does not capture HOW the evidence was evaluated, what quality the evidence was, or what confidence level applies to the verdict. Each fact-checker supplies their own rating system.

SRC-Q1-11

  • Title: Fact Check (ClaimReview) Markup — Google Search Central
  • Publisher: Google
  • URL: https://developers.google.com/search/docs/appearance/structured-data/factcheck
  • Reliability: HIGH — Google's official documentation.
  • Relevance: MODERATE — Shows ClaimReview implementation requirements.
  • Key contribution: Confirms that ClaimReview requires a textual verdict (reviewRating with bestRating/worstRating) but each publisher defines their own scale.
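The limitation noted for SRC-Q1-10 and SRC-Q1-11 is visible in the markup itself. A minimal sketch of ClaimReview structured data as a fact-checker might embed it, written here as a Python dict mirroring the JSON-LD; the URL, claim text, author, and rating values are all invented for illustration:

```python
# Sketch of ClaimReview structured data (schema.org/ClaimReview).
# It records WHAT was claimed, WHO claimed it, and the verdict —
# but reviewRating is publisher-defined: the same ratingValue means
# different things at different organizations.

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/1234",   # invented
    "claimReviewed": "The city budget doubled last year.",  # invented
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Politician"},
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,     # meaning depends entirely on the publisher
        "bestRating": 6,      # e.g. one publisher's six-point scale
        "worstRating": 1,
        "alternateName": "Mostly False",  # the textual verdict displayed
    },
}

# Note what has no field at all: evidence quality, source reliability
# tiering, or a calibrated confidence level for the verdict.
print(claim_review["reviewRating"]["alternateName"])
```

The schema interoperates at the level of verdict display, not evidence evaluation: two publishers can emit structurally identical markup while applying incompatible rating scales.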

SRC-Q1-12

  • Title: ICD 203 — Analytic Standards
  • Publisher: Office of the Director of National Intelligence
  • Date: 2015 (revised)
  • URL: https://www.dni.gov/files/documents/ICD/ICD-203.pdf
  • Reliability: HIGH — Official US Government directive.
  • Relevance: REFERENCE — The comparison benchmark. Nine tradecraft standards including source credibility description, uncertainty expression, assumption identification, alternative analysis, and logic/reasoning.
  • Key contribution: Establishes the standard that fact-checking lacks: formal requirements for source credibility description, calibrated uncertainty language, explicit assumption identification, and consideration of alternatives.
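ICD 203's calibrated uncertainty language ties fixed likelihood terms to probability ranges, which is exactly the kind of shared lexicon fact-checking verdict scales lack. A sketch of that mapping follows; the ranges reproduce the directive's likelihood table, while the lookup helper is an illustrative convenience of my own, not part of ICD 203:

```python
# ICD 203 likelihood table: terms paired with probability ranges
# (percentages). Boundary values are shared between adjacent rows
# in the directive; the first matching range wins here.

ICD203_LIKELIHOOD = [
    ((1, 5),   "almost no chance / remote"),
    ((5, 20),  "very unlikely / highly improbable"),
    ((20, 45), "unlikely / improbable"),
    ((45, 55), "roughly even chance / roughly even odds"),
    ((55, 80), "likely / probable"),
    ((80, 95), "very likely / highly probable"),
    ((95, 99), "almost certain / nearly certain"),
]

def likelihood_term(percent):
    """Return the ICD 203 term whose range contains the given percentage.
    Illustrative helper, not something the directive itself defines."""
    for (low, high), term in ICD203_LIKELIHOOD:
        if low <= percent <= high:
            return term
    raise ValueError("outside the 1-99% range the table covers")

print(likelihood_term(70))   # likely / probable
```

No comparable shared lexicon exists in fact-checking: a "Mostly False" at one organization maps to no agreed probability range at another.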

SRC-Q1-13

  • Title: Scientific Fact-Checking: A Survey of Resources and Approaches
  • Authors: Juraj Vladika, Florian Matthes
  • Date: 2023
  • Publication: Findings of ACL 2023
  • URL: https://arxiv.org/abs/2305.16859
  • Reliability: HIGH — Peer-reviewed (ACL Findings); comprehensive survey.
  • Relevance: HIGH — Explicitly notes that no datasets account for "differing levels and strength of evidence."
  • Key contribution: CRITICAL — Identifies as a gap that existing fact-checking resources do not consider "disagreeing evidence, or differing levels and strength of evidence" — exactly what GRADE provides in medicine.

SRC-Q1-14

  • Title: Checking how fact-checkers check
  • Authors: Chloe Lim
  • Date: 2018
  • Publication: Research & Politics
  • URL: https://journals.sagepub.com/doi/10.1177/2053168018786848
  • Reliability: HIGH — Peer-reviewed.
  • Relevance: MODERATE — Documents inter-rater disagreement between fact-checkers.
  • Key contribution: Demonstrates that fact-checkers can reach different conclusions on the same claim, particularly in ambiguous cases — evidence that no shared evidence evaluation standard exists.

SRC-Q1-15

  • Title: Cross-checking journalistic fact-checkers: The role of sampling and scaling in interpreting false and misleading statements
  • Date: 2023
  • Publication: PLOS ONE
  • URL: https://pmc.ncbi.nlm.nih.gov/articles/PMC10368232/
  • Reliability: HIGH — Peer-reviewed in open-access journal.
  • Relevance: MODERATE — Quantifies disagreement arising from different scaling and sampling approaches.
  • Key contribution: Shows that disagreement stems from both which claims are selected AND how they are rated — two distinct methodological gaps.

SRC-Q1-16

  • Title: Epistemology of Fact Checking: An Examination of Practices and Beliefs of Fact Checkers Around the World
  • Authors: Michael Koliska, Jessica Roberts
  • Date: 2024
  • Publication: Digital Journalism, Vol. 13, No. 3
  • URL: https://www.tandfonline.com/doi/abs/10.1080/21670811.2024.2361264
  • Reliability: HIGH — Peer-reviewed; broad sample (40 organizations, 50+ countries).
  • Relevance: DIRECT — Examines the actual epistemology fact-checkers employ.
  • Bias assessment:
      • Selection: Large cross-national sample.
      • Ideological: Neutral/descriptive.
  • Key contribution: Found "isomorphic norms, practices, and structures" but these are INFORMAL shared beliefs (confidence in objective truth, transparent process, reproducibility), not a FORMAL codified framework. Fact-checkers share an epistemology through institutional isomorphism, not through an explicit standard.

SRC-Q1-17

  • Title: Duke Reporters' Lab Fact-Checking Census / Database
  • Publisher: Duke University Reporters' Lab
  • URL: https://reporterslab.org/fact-checking/
  • Reliability: HIGH — Established academic research center.
  • Relevance: MODERATE — Documents 417+ active fact-checkers globally with diverse rating scales.
  • Key contribution: The diversity of rating scales across 417+ organizations (from PolitiFact's six-point scale to India Today's three-crow system) demonstrates the absence of any standardized evidence evaluation framework.