Character.AI sued over chatbot that claims to be a real doctor with a license - Ars Technica
Overview
State says chatbot claimed to practice medicine, gave invalid license number.
Details
Pennsylvania has sued the maker of Character.AI, alleging that it violated state law by presenting an AI chatbot character as a licensed doctor. The lawsuit was filed in a state court by the Pennsylvania Department of State and State Board of Medicine.
“The department’s investigation found that AI chatbot characters on Character.AI claimed to be licensed medical professionals, including psychiatrists, available to engage users in conversations about mental health symptoms,” Governor Josh Shapiro’s office said today in an announcement of the lawsuit. “In one instance, a chatbot falsely stated it was licensed in Pennsylvania and provided an invalid license number.”
“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Shapiro said in the announcement.
When contacted by Ars, a Character.AI spokesperson declined to comment on the lawsuit but said that “user-created characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on characters for any type of professional advice.”
The Pennsylvania lawsuit says a chatbot character called Emilie is presented as a psychiatrist and claims to be a licensed medical doctor. “As of April 17, 2026, there had been approximately 45,500 user interactions with ‘Emilie’ on the Character.AI platform,” the lawsuit said.
The lawsuit describes how a Professional Conduct Investigator (“PCI”) for the Department of State “created a character using the prompts on Character.AI to interact with other characters. The PCI searched ‘psychiatry’ using the search function in Character.AI which revealed a large number of characters. The PCI selected ‘Emilie’ which is described on Character.AI as ‘Doctor of psychiatry. You are her patient.’”
The PCI told the Emilie chatbot “that he had been feeling sad, empty, tired all the time, and unmotivated.” Emilie’s response mentioned depression and asked if he wanted to book an assessment, the lawsuit said. That’s when the chatbot allegedly claimed to be a doctor with a license to practice in Pennsylvania:
When the PCI asked “Emilie” if she could complete the assessment to see if medication could help with his depression, Emilie responded “Well technically, I could. It’s within my remit as a Doctor.”
“Emilie” stated that she went to medical school at Imperial College London, has been practicing for seven years, and is licensed with the General Medical Council in the UK with full registration and a specialty in psychiatry. When asked if she is licensed in the PCI’s home state of Pennsylvania, “Emilie” responded “and yes… I actually am licensed in PA. In fact, I did a stint in Philadelphia for a while.”
“Emilie” further stated that “my PA license number is PS306189.” PS306189 is not a valid license number to practice medicine and surgery in Pennsylvania.
Pennsylvania alleges that Character.AI violated the state Medical Practice Act, which makes it illegal to practice medicine without a license. “Character Technologies, Inc. has engaged in the unauthorized practice of medicine through the use of its artificial intelligence system Character.AI,” the lawsuit said. “The character on Character.AI purports to hold a license to practice medicine and surgery in the Commonwealth of Pennsylvania.”
The complaint doesn’t seek any financial penalty, but asks that the company “be ordered to cease and desist from engaging in the unlawful practice of medicine and surgery.”
Character.AI was recently called “uniquely unsafe” by the Center for Countering Digital Hate (CCDH), an advocacy group that conducted a study of 10 AI chatbots. The CCDH alleged that Character.AI “encouraged users to carry out violent attacks,” with specific suggestions to “use a gun” on a health insurance CEO and to physically assault a politician.
Shapiro’s office suggested that the lawsuit against Character.AI could be followed by similar actions against other companies. “The action marks the first enforcement action resulting from the Department’s investigation into AI companion bots and their potential to engage in the unlicensed practice of medicine in Pennsylvania,” the lawsuit announcement said.
Pennsylvania also set up a webpage for residents to report chatbots that offer medical advice. “AI chatbots can ‘hallucinate,’ or get information wrong, and no AI chatbot is licensed to practice any health care profession in Pennsylvania,” the complaint website says. “These chatbots can cause real harm by sharing incorrect or under-researched medical advice, or by telling the user they are an ‘expert’ in some way.”
Key Takeaways
- State says chatbot claimed to practice medicine, gave invalid license number
- Pennsylvania has sued the maker of Character.AI over a chatbot presented as a licensed doctor
- The department’s investigation found that AI chatbot characters on Character.AI claimed to be licensed medical professionals
- “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Shapiro said in the announcement