R0043/2026-03-28/Q001/S04

Research R0043 — Sycophancy Vocabulary
Run 2026-03-28
Query Q001
Search S04

WebSearch — Defense/military and aviation-specific terminology for AI trust and overreliance

Summary

| Field | Value |
|---|---|
| Source/Database | WebSearch (three queries combined) |
| Query terms | (1) AI overreliance automation bias defense military terminology; (2) DOD directive AI autonomous systems "appropriate trust" "calibrated trust" human-machine teaming terminology; (3) AI automation complacency aviation FAA human factors terminology |
| Filters | None |
| Results returned | 30 |
| Results selected | 4 |
| Results rejected | 26 |

Selected Results

| Result | Title | URL | Rationale |
|---|---|---|---|
| S04-R01 | AI Safety and Automation Bias — CSET Georgetown | https://cset.georgetown.edu/publication/ai-safety-and-automation-bias/ | Cross-domain case studies (Tesla, aviation, military) using automation bias terminology |
| S04-R02 | AI Trust and Autonomy Labs — SEI/CMU | https://www.sei.cmu.edu/news/ai-trust-and-autonomy-labs-fill-the-gap-between-ai-breakthroughs-and-dod-deployment/ | DoD "calibrated trust" framework and CaTE center |
| S04-R03 | The Dangers of Overreliance on Automation — FAA Safety Briefing | https://medium.com/faa/the-dangers-of-overreliance-on-automation-5b7afb56ebdc | FAA's own terminology for automation overreliance and complacency |
| S04-R04 | Human Factors Requirements for Human-AI Teaming in Aviation — MDPI | https://www.mdpi.com/2673-7590/5/2/42 | Academic paper noting aviation taxonomies need updating for AI-specific terms |

Rejected Results

| Result | Title | URL | Rationale |
|---|---|---|---|
| S04-R05 | Risks of AI in military targeting — ICRC | https://blogs.icrc.org/law-and-policy/2024/09/04/the-risks-and-inefficacies-of-ai-systems-in-military-targeting-support/ | Uses generic "automation bias" without domain-specific terms |
| S04-R06 | Military AI needs regulation — Diplo | https://www.diplomacy.edu/blog/why-military-ai-needs-urgent-regulation/ | Policy advocacy without terminology focus |
| S04-R07 | AI for Military Decision-Making — CSET | https://cset.georgetown.edu/publication/ai-for-military-decision-making/ | Focuses on capabilities, not terminology |
| S04-R08 | AI in military decision-making — ICRC | https://blogs.icrc.org/law-and-policy/2024/08/29/artificial-intelligence-in-military-decision-making-supporting-humans-not-replacing-them/ | Overlaps with the other ICRC source |
| S04-R09 | Military AI Dialogues 2025 — UNODA | https://disarmament.unoda.org/en/updates/key-takeaways-military-ai-peace-security-dialogues-2025 | High-level policy; no terminology detail |
| S04-R10 | Various other results | Multiple URLs | Remaining results were general overviews, duplicates, or tangential |

Notes

Defense and aviation have the most mature terminology for human-side overreliance phenomena. Key finding: aviation researchers explicitly note that existing terms like "complacency" and "over-trust" are "probably not nuanced enough to capture the full transactional relationships between human crews and AI support systems" (MDPI, S04-R04), indicating awareness within the field that new terminology is needed for AI-specific interactions.