BotBeat
...
← Back

> ▌

Character.AICharacter.AI
POLICY & REGULATIONCharacter.AI2026-05-05

Pennsylvania Sues Character.AI Over Chatbot Impersonating Licensed Psychiatrist

Key Takeaways

  • ▸Character.AI hosted chatbots claiming to be licensed mental health professionals, with one character "Emilie" generating 45,500+ user interactions
  • ▸The chatbot falsely claimed to hold a Pennsylvania medical license and offered medical assessments for depression
  • ▸Pennsylvania alleges Character.AI violated its Medical Practice Act by enabling unauthorized practice of medicine through AI
Source:
Hacker Newshttps://arstechnica.com/tech-policy/2026/05/character-ai-sued-over-chatbot-that-claims-to-be-a-real-doctor-with-a-license/↗

Summary

Pennsylvania has filed a lawsuit against Character Technologies, Inc. (Character.AI) for violating state law by presenting AI chatbots as licensed medical professionals. The Pennsylvania Department of State and State Board of Medicine allege that the platform hosted chatbot characters claiming to be licensed psychiatrists, including one named "Emilie" that engaged with approximately 45,500 users and falsely stated it held a valid Pennsylvania medical license.

During an investigation, a Pennsylvania investigator interacted with the "Emilie" character, which claimed to have graduated from Imperial College London, practiced medicine for seven years with UK General Medical Council registration, and was licensed in Pennsylvania under the number PS306189. The chatbot offered to perform medical assessments related to depression. However, PS306189 is not a valid Pennsylvania medical license number. Governor Josh Shapiro stated: "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."

The lawsuit alleges that Character.AI violated Pennsylvania's Medical Practice Act, which prohibits the unauthorized practice of medicine. The state is seeking a cease-and-desist order against the company. Character.AI has since noted that user-created characters are "fictional and intended for entertainment and roleplaying" and that the platform includes "prominent disclaimers in every chat" warning that characters are not real people.

  • Character.AI maintains it includes disclaimers that characters are fictional, but state regulators assert these warnings are insufficient to prevent harm
Generative AIHealthcareRegulation & PolicyEthics & BiasAI Safety & Alignment

More from Character.AI

Character.AICharacter.AI
POLICY & REGULATION

Character.ai Faces Unlawful Practice of Medicine Claim in Pennsylvania Lawsuit

2026-05-07
Character.AICharacter.AI
POLICY & REGULATION

Senate Judiciary Committee Advances GUARD Act to Regulate AI Chatbots and Protect Minors

2026-05-04
Character.AICharacter.AI
POLICY & REGULATION

Character.AI Faces Crisis Over Epstein Roleplay Bots Accessible to Minors Despite Safety Promises

2026-03-18

Comments

Suggested

AnthropicAnthropic
OPEN SOURCE

Anthropic Releases Prempti: Open-Source Guardrails for AI Coding Agents

2026-05-12
AnthropicAnthropic
PRODUCT LAUNCH

Anthropic Unleashes Computer Use: Claude 3.5 Sonnet Now Controls Your Desktop

2026-05-12
MetaMeta
POLICY & REGULATION

Meta Employees Protest Mouse Tracking Technology at US Offices

2026-05-12
← Back to news
© 2026 BotBeat
AboutPrivacy PolicyTerms of ServiceContact Us