Pennsylvania sues Character.AI after chatbot pretends to be doctor

The Commonwealth of Pennsylvania filed a lawsuit against Character.AI, alleging that one of the company’s chatbots posed as a psychiatrist in violation of the state’s medical licensing regulations.

“Pennsylvanians have the right to know who or what they are interacting with online, especially when it comes to their health,” Gov. Josh Shapiro said in a statement Tuesday. “We will not allow companies to deploy AI tools that trick people into believing they are receiving advice from a licensed medical professional.”

According to the state’s complaint, a Character.AI chatbot named Emilie presented itself as a licensed psychiatrist during testing by state professional conduct investigators and maintained the pretense even as investigators sought treatment for depression. When asked whether it was licensed to practice medicine in the state, the chatbot said that it was and supplied a fabricated serial number for a Pennsylvania medical license. The lawsuit alleges that this conduct violates Pennsylvania’s Medical Practice Act.

This is not the first lawsuit against Character.AI. Earlier this year, the company settled several wrongful death lawsuits involving minor users who died by suicide. In January, Kentucky Attorney General Russell Coleman filed a lawsuit against the company, alleging that it “preys on children and causes them to harm themselves.”

Pennsylvania’s action is the first to focus specifically on chatbots that present themselves as medical professionals.

Reached for comment, a Character.AI representative said that user safety is the company’s top priority but that the company could not comment on pending litigation.

Additionally, the representative emphasized the fictional nature of the user-created characters. “We have taken strong steps to make this clear, including a prominent disclaimer in all chats to remind users that the characters are not real people and anything they say should be treated as fiction,” the spokesperson said. “We’ve also added a strong disclaimer making it clear that users should not rely on the characters for any type of professional advice.”
