# R0049/2026-03-31/Q003-SRC03 — Scorecard
## Source

| Field | Value |
|---|---|
| Title | Elicit: AI for scientific research |
| Publisher | Elicit Inc. |
| Authors | Elicit team |
| Date | 2022-2026 |
| URL | https://elicit.com/ |
| Type | Commercial product |
## Ratings
| Dimension | Rating |
|---|---|
| Reliability | Medium-High |
| Relevance | Medium |
| Missing data | Some concerns |
| Measurement | Low risk |
| Selective reporting | Some concerns |
| Randomization | N/A |
| Protocol deviation | N/A |
| COI/funding | Some concerns |
## Rationale
| Dimension | Rationale |
|---|---|
| Reliability | Commercial product with independent evaluation studies, but proprietary implementation details limit verification |
| Relevance | Leading AI tool for systematic reviews, but it optimizes for screening efficiency rather than analytical rigor |
| Missing data | Proprietary product; internal methodology is not fully documented |
| Selective reporting | Company-reported metrics may emphasize strengths |
| COI/funding | Commercial vendor reporting on its own product's capabilities |
## Evidence Extracts
| Evidence | Summary |
|---|---|
| SRC03-E01 | Elicit reports 81.4% accuracy on systematic review tasks; the product optimizes for screening efficiency rather than analytical-rigor features |