R0023/2026-03-25/Q002/SRC04/E01
Anthropic guide partially attributed; recommends persona prompting despite empirical evidence that it does not improve factual accuracy.
URL: https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview
Extract
Anthropic's prompt engineering documentation is partially attributed to Rick Dakan and Joseph Feller. The documentation includes templates like "You are an expert AI tax analyst" for role prompting. The Wharton GAIL Report 4 specifically names Anthropic alongside Google and OpenAI as vendors whose persona prompting recommendations are not supported by empirical evidence for factual accuracy tasks.
The interactive tutorial covers 9 chapters of prompt engineering techniques. It is maintained by Anthropic's documentation team and informed by their Applied AI team.
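The "expert AI tax analyst" template quoted above is an instance of role prompting delivered via a system prompt. As a minimal sketch of the pattern under discussion (the payload shape follows Anthropic's Messages API; the model name is a placeholder, and no request is actually sent), the persona sits in the `system` field, separate from the user turn:

```python
# Sketch: assembling a role-prompted request in the shape of Anthropic's
# Messages API. No network call is made; this only illustrates where the
# persona text goes (the `system` field) relative to the user message.

def build_role_prompt_request(persona: str, user_question: str) -> dict:
    """Return a Messages-API-style payload carrying a persona system prompt."""
    return {
        "model": "claude-sonnet-example",  # placeholder, not a real model ID
        "max_tokens": 1024,
        "system": persona,  # the role/persona the vendor docs recommend
        "messages": [
            {"role": "user", "content": user_question},
        ],
    }

request = build_role_prompt_request(
    persona="You are an expert AI tax analyst.",  # template from the docs
    user_question="Is home office equipment deductible?",
)
print(request["system"])
```

The Wharton GAIL finding cited above concerns exactly this pattern: the persona string in `system` shapes tone and framing, but the report found no empirical support for it improving factual accuracy.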
Relevance to Hypotheses
| Hypothesis | Relationship | Strength |
|---|---|---|
| H1 | Partially supports | At least some attribution exists, but authors' research credentials are unclear |
| H2 | Supports | Documentation writers, not necessarily AI researchers, appear to author vendor guides |
| H3 | Supports | Vendor guides represent a distinct category mixing research insights with commercial documentation |
Context
Anthropic is notable as the only major vendor that even partially attributes authorship of its prompt engineering guidance. This is a marginal improvement over OpenAI and Google, but it still falls short of the transparency expected of research-backed recommendations.