
Pennsylvania files lawsuit against AI company, claiming chatbot was posing as doctor

Imagine going online looking for medical advice only to find out the doctor you’ve been chatting with isn’t really a doctor at all. In fact, they’re not even human; they’re an AI chatbot.

Pennsylvania is taking one company to court to try to keep this from happening.

That company is Character Technologies Incorporated, the owner of chatbot maker Character AI.

According to a newly filed lawsuit, a state inspector posed as someone seeking psychiatric treatment.

Through Character AI, they came across a provider named “Emile.”

According to the lawsuit, the chatbot even provided a Pennsylvania medical ID number.

“Obviously, it was a bogus medical ID number and the information that was dispensed was really dangerous,” Governor Josh Shapiro told CNN.

The company says the service isn’t intended to be used for medical issues.

A spokesperson shared a statement with NBC News that reads:

“The user-created characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction.”

Regardless of the disclaimers, Governor Josh Shapiro says unknowingly taking medical advice from a chatbot could have real-world consequences.

“These technology companies push these chatbots out there to be as real as humanly possible to make it feel like you’re engaging with a human being, and in this case, a human being that holds themselves out to be a doctor,” Shapiro said.

The governor believes federal lawmakers should do more to crack down on AI companies.

“Hopefully, the result of this lawsuit will be some real constraints being put on this company and others like it,” Shapiro said.

This isn’t the first time Character Technologies has faced legal trouble.

In January, Kentucky filed a consumer protection lawsuit against the company.

The company also settled a lawsuit with a Florida mother who says its chatbots pushed her teenage son to take his own life.

