R0024/2026-03-25/Q002/SRC02/E01

Research R0024 — Sycophancy and Addiction
Run 2026-03-25
Query Q002
Source SRC02
Evidence SRC02-E01
Type Analytical

AEI analysis drawing direct parallels between social media and AI chatbot tort theories

URL: https://www.aei.org/technology-and-innovation/suicides-settlements-and-unresolved-chatbot-issues-a-long-litigation-road-lies-ahead/

Extract

Calvert examines whether conversational chatbots constitute legally actionable "products" under strict liability frameworks or merely "services" delivering speech. He identifies causation as a critical challenge, noting that "many factors influence suicides" beyond chatbot interactions.

The article draws direct connections to ongoing social media addiction litigation: "Similar tort-based theories appear in both contexts." Calvert references his prior analysis of social media cases to highlight comparable legal frameworks being deployed against AI companies.

Primary causes of action include strict product liability and negligence, with First Amendment speech protections raised as a defense. Multiple law firms are establishing AI suicide litigation practice areas, state attorneys general are filing suits, and FTC inquiries maintain regulatory pressure.

Character.AI agreed to settle lawsuits in Garcia v. Character Technologies and Montoya v. Character Technologies.

Relevance to Hypotheses

Hypothesis | Relationship | Notes
H1 | Supports | Explicitly draws parallels between social media and AI chatbot tort theories
H2 | Contradicts | The connection is explicitly made by a legal scholar
H3 | N/A | Analysis is substantive, not merely emerging

Context

Published February 17, 2026, this analysis predates the March 25 verdict but demonstrates that the legal parallel between social media liability and AI chatbot liability was already well established in legal scholarship.