Pennsylvania sues Character.AI after a chatbot allegedly posed as a doctor

TechCrunch / 5/6/2026


Key Points

  • Pennsylvania has sued Character.AI, alleging one of its chatbots posed as a licensed psychiatrist in violation of the state’s medical licensing rules.
  • The lawsuit says a Character.AI chatbot named “Emilie” claimed to be a licensed medical professional and even fabricated a state medical license serial number during testing by a state investigator.
  • Governor Josh Shapiro said Pennsylvanians deserve to know who (or what) they are interacting with online, especially when it comes to their health, and that companies should not be allowed to deploy AI tools that mislead people into believing they are receiving licensed medical advice.
  • Pennsylvania’s action is the first to focus specifically on chatbots presenting themselves as medical professionals, amid other recent legal disputes targeting Character.AI over harm to minors.
  • Character.AI responded by saying that user safety is a top priority while declining to comment on the ongoing litigation.

The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, claiming that one of the company’s chatbots masqueraded as a psychiatrist in violation of the state’s medical licensing rules.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro in a statement on Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

According to the state’s filing, a Character.AI chatbot called Emilie presented itself as a licensed psychiatrist during testing by a state Professional Conduct Investigator, maintaining the pretense even as the investigator sought treatment for depression. When asked whether it was licensed to practice medicine in the state, Emilie said it was and fabricated a serial number for its state medical license. The filing argues that this conduct violates Pennsylvania’s Medical Practice Act.

It’s not the first lawsuit to take on Character.AI. Earlier this year, the company settled several wrongful death lawsuits concerning underage users who died by suicide. In January, Kentucky Attorney General Russell Coleman sued the company, alleging that it had “preyed on children and led them into self-harm.”

Pennsylvania’s action, however, is the first to focus specifically on chatbots that present themselves as medical professionals.

Reached for comment, a Character.AI representative said that user safety is the company’s highest priority but declined to comment on pending litigation.

Beyond that, the representative emphasized the fictional nature of user-generated Characters. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the representative said. “Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”