# SRC01 — Elicit — Scorecard
## Source Information
| Field | Value |
|---|---|
| Title | Elicit: AI for scientific research |
| URL | https://elicit.com/ |
| Authors | Elicit (formerly Ought) |
| Publication | Commercial platform + multiple academic evaluations |
| Date | Ongoing (evaluated 2025-2026) |
| Type | AI research platform |
## Summary Ratings
| Dimension | Rating |
|---|---|
| Reliability | High |
| Relevance | High |
| Overall | High |
## Rationale
| Domain | Assessment |
|---|---|
| Authority | Leading research tool; evaluated in Cochrane, SAGE, and multiple academic studies |
| Methodology | Structured data extraction (reported 99.4% accuracy); systematic review workflow |
| Currency | Current (2026) |
| Objectivity | Multiple independent academic evaluations |
| Scope | 138M+ papers, 545K+ clinical trials |
| Corroboration | Consistent across multiple independent reviews |
## Evidence Extracts
| Evidence ID | Summary |
|---|---|
| SRC01-E01 | Structured extraction and systematic review workflow; no built-in analytical rigor framework |