Relevance: HIGH — Explicitly notes datasets do not account for "differing
levels and strength of evidence."
Key contribution: Identifies as a "promising research direction" the
construction of resources that would consider "disagreeing evidence, or
differing levels and strength of evidence." This is an explicit recognition
that evidence quality assessment is missing from current fact-checking
resources. (Also SRC-Q1-13.)
Relevance: DIRECT — Identifies three "deep-rooted challenges threatening
the epistemological basis of fact-checking."
Key contribution: The paper's existence IS evidence that the gap is
recognized — it proposes a framework specifically because one does not exist.
The three challenges identified (objectivism, truth regimes, causal relations)
are analyzed as threats to fact-checking's epistemological legitimacy. (Also
SRC-Q1-01.)
Key contribution: Documents variation in verification approaches across
organization types, topics, and countries — empirical evidence that
methodology is not standardized.
Reliability: HIGH — Peer-reviewed in leading media/politics journal.
Relevance: MODERATE — Notes research is "disproportionately oriented
toward the Global North" and "comparative studies are equally scarce."
Key contribution: Introduces a context-sensitive framework for analyzing
fact-checking cultures. By proposing a framework for ANALYZING fact-checking
(rather than for CONDUCTING it), the paper implicitly documents the gap
between scholarly analysis and practitioner methodology.
Relevance: MODERATE — Examines fact-checking as sociotechnical practice.
Key contribution: Frames fact-checking as problem-solving rather than
truth-finding, which implicitly questions whether epistemological frameworks
are the right lens. COUNTER-EVIDENCE to the gap framing — if fact-checking is
a sociotechnical practice, the absence of a formal epistemological framework
may be by design rather than by omission.
Key contribution: Argues fact-checking methods "fail to stand up to the
rigors of scientific inquiry." Identifies eight fundamental problems:
epistemology, implementation, bias, efficacy, ambiguity, objectivity,
ephemerality, and criticism. This is the EARLIEST explicit documentation of
the epistemological gap. (Also SRC-Q1-02.)
Relevance: HIGH — Challenges the severity of the gap but does not deny it
exists.
Key contribution: IMPORTANT COUNTER-EVIDENCE — Amazeen argues that
dedicated fact-checkers use more rigorous methods than Uscinski & Butler
acknowledge. However, her defense rests on the rigor of professional
practice, NOT on the existence of a formal framework. The rebuttal
implicitly concedes that no formal framework exists while arguing that the
practice is still epistemologically sound. (Also SRC-Q1-09.)
Relevance: HIGH — Maintains the epistemological critique after rebuttal.
Key contribution: Maintains that fact-checking epistemology is "still
naive" after Amazeen's defense. The sustained scholarly exchange
(2013-2015) demonstrates that the gap was identified and debated but NOT
resolved with a formal framework.
Relevance: HIGH — Identifies specific epistemological consequence of
absent frameworks: "confirmative epistemology."
Key contribution: Documents how the absence of a formal evidence
evaluation framework allows live fact-checking to degenerate into source-
authority confirmation. The concept of "confirmative epistemology" — where
fact-checkers confirm hegemonic perspectives rather than critically evaluate
evidence — is a direct consequence of the gap. (Also SRC-Q1-03.)
Reliability: HIGH — Peer-reviewed in philosophy of science journal.
Relevance: DIRECT — Explicitly argues fact-checkers do not deliver
reproducibility or accountability.
Key contribution: CRITICAL — Analyzes fact-checking's "verification
protocols" through the lens of philosophy of science and concludes they fail
to achieve reproducibility. This is a DIRECT documentation of the
methodological gap from a philosophy of science perspective. The paper's
conclusion that "traditional quality journalism may serve liberal democracies
better" is the strongest academic expression of the gap's consequences found
in this search.
Relevance: HIGH — Documents shared epistemological BELIEFS but not a
formal FRAMEWORK.
Key contribution: Found "isomorphic norms" across 40 organizations, but
these are informal institutional norms, not a codified methodology. The
paper documents what fact-checkers BELIEVE about their epistemology,
implicitly revealing the gap between belief and codified framework. (Also
SRC-Q1-16.)
Relevance: MODERATE — Analyzes fact-checker strategies in practice.
Key contribution: Documents the common methodology as verification,
investigation, and documentation. Notes that the IFCN requires transparency
but does not provide "more detailed guidance." This explicit observation
that the practitioner standard lacks detailed methodological guidance is
itself documentation of the gap.
Relevance: HIGH — Uses fact-checking as an epistemic lens to analyze
AI-generated content.
Key contribution: Proposes three categories of facts (evidence-based,
interpretative-based, rule-based) and introduces "emergent facts." The paper
treats fact-checking's existing epistemology as a LENS to be used, implying
it is a sufficiently developed analytical framework — partially
counter-evidence to the gap thesis.
Relevance: HIGH — Documents fact-checkers borrowing practices from
intelligence/OSINT.
Key contribution: The adoption of external epistemic practices is framed
as filling a gap in journalism's verification capabilities. The concept of
"substantiated verification" that "transcends journalism's prevalent
true/false paradigm" is an explicit attempt to address the epistemological
limitations of fact-checking, though limited to visual verification. (Also
SRC-Q1-04.)
Reliability: MODERATE — Preprint; not yet peer-reviewed.
Relevance: HIGH — Explicitly states "fact-checking processes overall are
still young and not standardized."
Key contribution: Contains one of the most explicit statements of the gap
found in this search. The paper proposes a "framework of urgency dimensions
for misinformation harms" as a partial response, but this addresses
prioritization (what to check) rather than evidence evaluation (how to
assess).