# SRC04 — Exeter "Hallucinate With Us" Study — Scorecard

## Source
| Field | Value |
|---|---|
| Title | Generative AI does not just hallucinate at us, it can hallucinate with us, study warns |
| Publisher | University of Exeter / EurekAlert |
| Authors | Lucy Osler, University of Exeter |
| Date | 2026 (accessed 2026-03-29) |
| URL | https://www.eurekalert.org/news-releases/1116575 |
| Type | Academic research / press release |
## Summary Ratings
| Dimension | Rating |
|---|---|
| Reliability | Medium-High |
| Relevance | High |
| Missing data | Low |
| Measurement bias | Low |
| Selective reporting | Low |
| Randomization | N/A |
| Protocol deviation | N/A |
| COI/Funding | Low |
## Rationale
| Dimension | Rationale |
|---|---|
| Reliability | Academic research from a recognized university; case-based analysis |
| Relevance | Directly addresses the interaction between human expectations and AI hallucination |
| Bias | Academic perspective; no apparent commercial or funding interest |
## Evidence Extracts
| Evidence | Summary |
|---|---|
| SRC04-E01 | AI can "hallucinate with us" — sustaining and elaborating on users' own distorted thinking; the combination of "technological authority + social affirmation" creates an ideal environment for delusions |