
Scammers are using AI to clone voices of your loved ones and fake emergency calls

PITTSBURGH — Beth Royce of Pittsburgh got a disturbing call in March.

The caller ID showed the call was from her sister, but when she answered, the voice on the line was an unknown man screaming that he had taken her sister and wanted $1,500.

“I heard muffled sobs in the background that sounded like a woman’s voice. So of course, I was like, Oh my God, that’s my sister,” said Royce.

With help from other family members, Royce was eventually able to determine the call was a scam.

But now the FTC is warning that criminals are starting to use artificial intelligence to make these kinds of scams far more believable.

“This has completely flipped these scams on its head,” says Chip Muir, a South Park native who served as an assistant attorney general of Virginia and worked in the White House, and who now works as a cybersecurity software consultant.

The new danger is voice cloning.

And there are many websites online, some even free, that allow you to upload a clip of someone’s voice, write a script, and get that voice to say absolutely anything.

“I actually think this is quite a substantial risk,” said Muir, “and the reason why I think that is that it is going to be the most credible and easily manipulatable way to go out and commit new scams.”

We used one of these websites to upload the voice of Channel 11 News Anchor Gordon Loesch and played the clips for Muir.

“Absolutely incredible. I have been sitting here talking to you for about 20 minutes, and if you’d left the room and turned off the lights, (I) would have thought I’m still talking with you,” said Muir. “It has the inflections of your voice correct. It basically has your pauses down.”

The technology isn’t designed for malicious or fraudulent intent, but that doesn’t stop scammers from using it that way.

We did have to check a box confirming that we had consented to clone the voice and that we wouldn’t use it for fraud.

“Absolutely not a deterrent,” said Muir. “If you’re actually going to use this for nefarious purposes, the absolute least of your concerns is the ‘I’ve lied on the check box’ on the terms of service, right?”

Jennifer Destefano of Arizona testified before Congress in Washington, D.C., this summer after she got a call from a scammer using her daughter’s cloned voice.

“‘Mom, these bad men have me. Help me, help me, help me,’” said Destefano of the voice on the call, which she soon realized was not her daughter’s.

Now, she’s warning lawmakers of the dangers of the technology.

“All that kept going through my head was, how are they going to use this, how are they going to manifest it,” said Destefano. “How are they going to use it to lure a kid?”

Right now, the technology limits a cloned voice to reading a script the user types out.

But experts say, in the future, AI could allow scammers to respond and interact using a cloned voice in a real-time conversation.

“There’s really no limit to the technology,” said Muir. “When we talk about the speed and capability, eventually it’s going to get there.”

So, how do you protect yourself and your family?

Experts say you should limit how much of your voice you post publicly on social media.

You can also set up codewords that only your family members would know.

But the most practical advice is to pause and take a moment to think when something doesn’t seem right.

Experts say you can’t just trust a voice any longer.

You should try to call back the loved one who supposedly contacted you at their phone number and verify the situation with them or other family members.
