R0057/2026-04-01/C005/SRC01
Wei et al. (2023) — Simple synthetic data reduces sycophancy
Source
| Field | Value |
| --- | --- |
| Title | Simple synthetic data reduces sycophancy in large language models |
| Publisher | arXiv / ICLR 2024 |
| Author(s) | Jerry Wei, Da Huang, Yifeng Lu, Denny Zhou, Quoc V. Le |
| Date | 2023-08 (arXiv preprint); accepted to ICLR 2024 |
| URL | https://arxiv.org/abs/2308.03958 |
| Type | Research paper |
Summary
| Dimension | Rating |
| --- | --- |
| Reliability | High |
| Relevance | High |
| Bias: Missing data | Low risk |
| Bias: Measurement | Low risk |
| Bias: Selective reporting | Low risk |
| Bias: Randomization | N/A — not an RCT |
| Bias: Protocol deviation | N/A — not an RCT |
| Bias: COI/Funding | Low risk |
Rationale
| Dimension | Rationale |
| --- | --- |
| Reliability | Research paper from an established venue (arXiv preprint, accepted to ICLR 2024) |
| Relevance | Directly addresses the claim under investigation |
| Bias flags | No significant bias concerns identified |
Evidence
| Evidence ID | Summary |
| --- | --- |
| SRC01-E01 | Synthetic data reduces sycophancy by 4.7% to 10.0% across PaLM model variants |