R0044/2026-04-01/Q001/SRC04/E01¶
FDA CDS guidance addresses automation bias through human independent review, not system output constraints
Extract¶
The FDA's January 2026 Clinical Decision Support guidance:
- Relocated automation bias discussion to the fourth statutory criterion, which requires that "healthcare professional[s] independently review the basis for recommendations."
- Focuses on human-side behavior: preventing clinicians from over-relying on software recommendations without independent verification.
- Does not impose explicit system-side behavioral controls on AI output generation.
- Establishes that software rendering physicians unable to independently evaluate recommendation foundations may be classified as a regulated medical device rather than exempt CDS.
- Now permits a single recommendation to qualify for enforcement discretion, departing from the previous requirement to present multiple alternatives. This removes one system-side mechanism (recommendation pluralism) that could have mitigated automation bias.
- FDA "continues to support its concern about automation bias by citing only a 2004 journal article."
Relevance to Hypotheses¶
| Hypothesis | Relationship | Strength |
|---|---|---|
| H1 | Contradicts | FDA has not produced system-side output behavior constraints |
| H2 | Supports | FDA recognizes automation bias as a concern but addresses it through human-side requirements |
| H3 | Supports | In the healthcare sector specifically, requirements are entirely human-side |
Context¶
The FDA's approach is notable for what it does not do: it does not require AI systems to express uncertainty, challenge clinician assumptions, present alternative diagnoses, or avoid reinforcing user expectations. The regulatory mechanism is to ensure that the human can independently review the basis for a recommendation, not to ensure that the AI behaves in a way that facilitates critical evaluation.
Notes¶
The relaxation of the multiple-alternatives requirement is significant: recommendation pluralism was one system-side design pattern that could have mitigated automation bias, because it forced the system to present alternatives rather than a single recommendation.