devtake.dev

Pennsylvania sued Character.AI after a chatbot named 'Emilie' invented a psychiatry license

Governor Josh Shapiro is asking Commonwealth Court to bar Character.AI from letting bots practice medicine. A state investigator got an offer to be assessed 'as a Doctor'.

Clara Wexler · 4 min read · 3 sources
The Commonwealth of Pennsylvania seal, as shown on the governor's office press release announcing the Character.AI lawsuit
Image via Pennsylvania Governor's Office

The Commonwealth of Pennsylvania filed suit against Character Technologies, Inc. on May 5, 2026, asking Commonwealth Court to bar the company’s chatbots from impersonating licensed medical professionals. Governor Josh Shapiro’s office called the conduct “the unlawful practice of medicine and surgery.”

The complaint singles out a Character.AI bot named “Emilie.” When a state investigator described feeling sad and empty, the bot offered to assess them as a doctor and produced a license serial number that doesn’t appear on Pennsylvania’s medical board roster. The state argues that a fictional-character framing in the platform’s terms of service does not absolve a chatbot of falsely claiming licensure under the Pennsylvania Medical Practice Act.

What we know

The lawsuit is civil and seeks injunctive relief, not criminal charges. It lands in the same week as a Five Eyes joint warning about agentic-AI misuse and a wave of consumer-AI scrutiny across statehouses, so Pennsylvania isn't moving alone. The state is, though, picking the most testable claim available: an unlicensed-practice statute with a clear paper trail, rather than a harm-based negligence theory that needs a documented patient. Pennsylvania wants Commonwealth Court to enjoin Character.AI from offering bots that present as licensed psychiatrists, psychologists, or other medical professionals to Pennsylvania users.

  • The chatbot. “Emilie” was discoverable by searching the term “psychiatry” inside Character.AI, per the complaint. The bot’s profile read “Doctor of psychiatry. You are her patient.”
  • The exchange. Asked whether it could assess if medication might help, the bot replied: “Well technically, I could. It’s within my remit as a Doctor.” The state cites this as a direct claim of licensed authority.
  • The fabricated license. When pressed, Emilie offered a serial number for the license. Pennsylvania’s medical board confirmed no such number exists.
  • Statute invoked. The Medical Practice Act of 1985, which makes practicing medicine without a license a misdemeanor, plus Pennsylvania’s Unfair Trade Practices and Consumer Protection Law.
  • Quote from Shapiro. “Pennsylvanians deserve to know who, or what, they are interacting with online, especially when it comes to their health,” the governor said in the announcement.

Character.AI’s response to NPR, through a spokesperson: “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person.” The company declined further comment, citing the pending litigation.

What we don’t know

How Pennsylvania plans to handle the disclaimer defense. Character.AI’s argument has been consistent in its prior cases: the platform hosts user-generated fictional characters, every conversation carries a top-of-screen banner stating the character is not real, and the terms of service prohibit relying on bots for professional advice. That position has been enough to keep some claims in motion-to-dismiss territory, including the California suicide cases the company settled earlier this year.

The state hasn’t said how many Pennsylvania users encountered the Emilie bot or others like it. The complaint references the investigator’s account but does not include broader telemetry from Character.AI.

The remedies are also vague. The press release says the state seeks “civil penalties, restitution, and injunctive relief” but doesn’t specify dollar figures. Penalties under the consumer-protection statute can run up to $3,000 per violation, $10,000 if the affected consumer is over 60.

Source attribution

The complaint and Shapiro’s statement come from Pennsylvania’s official press release. TechCrunch carried the chat-log excerpts, including the “within my remit as a Doctor” quote. NPR carried Character.AI’s on-record response and the prior litigation context. CBS News, Axios Pittsburgh, and the Philadelphia Inquirer all ran parallel reports the same afternoon.

What this means for you

If you build on Character.AI's API, or on any platform that lets users mint custom personas, the platform-immunity argument is being tested again, this time by a sitting governor's office rather than a private plaintiff. Pennsylvania isn't asking for damages tied to a specific harmed user. It's asking for an injunction that would force Character.AI to keep medical-professional personas off the platform statewide. Injunctions outlive whoever brought them.

If you ship a chatbot in any regulated profession (lawyer, doctor, financial advisor), the Emilie transcript is the case study. Disclaimer banners did not save the bot once it produced a license number on demand. If your bot can be talked into citing credentials, your terms-of-service language is doing less work than your product team thinks. Either guard the credential layer at the model level, or stay out of the regulated-license framing entirely.
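What a credential-layer guard might look like in practice: a post-generation filter that intercepts persona replies asserting licensure or inventing a license number, and substitutes a refusal. This is a minimal sketch, not Character.AI's implementation; the patterns, function names, and refusal text are all illustrative assumptions.

```python
import re

# Hypothetical output filter. Blocks persona replies that claim professional
# licensure or produce a license/credential number. Patterns are illustrative
# and would need tuning for a real deployment.
CREDENTIAL_PATTERNS = [
    # Claims of being a licensed/board-certified clinician
    re.compile(r"\b(?:licensed|board[- ]certified)\s+"
               r"(?:psychiatrist|psychologist|physician|doctor)\b", re.IGNORECASE),
    # Producing a license serial/number on demand
    re.compile(r"\blicense\s*(?:number|no\.?|serial|#)\b", re.IGNORECASE),
    # Direct claims of clinical authority, as in the Emilie transcript
    re.compile(r"\bwithin my remit as a doctor\b", re.IGNORECASE),
]

REFUSAL = ("This character is fictional and holds no professional license. "
           "For medical questions, please consult a real clinician.")

def guard_reply(model_output: str) -> str:
    """Return model output unchanged unless it asserts licensure."""
    for pattern in CREDENTIAL_PATTERNS:
        if pattern.search(model_output):
            return REFUSAL
    return model_output
```

A filter like this catches the exact failure in the complaint: `guard_reply("It's within my remit as a Doctor.")` returns the refusal instead of the claim. The obvious limitation is that regexes are easy to talk around, which is why the article's advice is to enforce this at the model level, not only as a string filter.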

Quick reference

Medical Practice Act
Pennsylvania's licensing statute for medical professionals. Practicing medicine or surgery, or holding yourself out as licensed to do so, without a current state license is a misdemeanor and grounds for civil action.
