HOUSTON - Law enforcement says kidnapping scams are on the rise with a new twist. Scammers are using artificial intelligence to clone the voice or video of a loved one to trick you into paying a ransom for their release.
Victims say these deepfakes were so convincing, they left them scared for their loved ones' lives.
"I heard my mom’s voice kind of fading away like someone was taking the phone away from her," said TikTok blogger @citylivingsoutherngirl. "And I heard weeps; this guy then gets on the phone, and he goes, 'Hey, I have your mom, and if you don’t send me money, I'm going to kill this (expletive).'"
She posted the warning, saying scammers cloned her mother's voice to convince her to pay a $1,000 ransom.
"Send it to me on Venmo, send it to me CashApp," she said in her post. "And I can hear what I think is my mom in the background, weeping."
GuidePoint cybersecurity expert Felix Simmons showed us how artificial intelligence can clone his voice.
"This is just me talking right now, but when I go to generate," said Simmons, clicking on the generate button.
"This is a computer generation of my voice, this is how this works," said the computer in Simmons' voice.
Georgia State University criminology professor David Maimon says he found a video clip on the dark web. He told us, "the guy who posted it seems to be a vendor who is selling this tool for other fraudsters to use."
"Listen, I want to meet with you, I really do want to meet with you and this is one of the reasons why I'm coming over to Canada," says a voice on the video.
In the video clip, a male face in the upper left corner appears to be speaking, but the words appear to come from the female face in the lower right corner, speaking to an elderly man in the middle.
"I just want to do this and when I get to over ... to you, I will pay you instantly," said the voice.
Haywood Talcove with LexisNexis Risk Solutions says scammers are lifting people's videos and voices off the internet and social media posts.
"For the most part, it’s already out on the internet," said Talcove. "You’re not going to get your face back. I’m not going to get my voice back. It’s already there."
Talcove says AI can even fool some security systems.
"It was able to get past facial recognition systems, it was able to get past systems where your voice is your password," he said.
The misuse of AI is taking scams to a whole new level.
"I called the number back, and it called my mom. And I was like, ‘Mom, are you OK?’ And she’s like, ‘What are you talking about?’" said TikToker @citylivingsoutherngirl.
Here are tips from law enforcement to protect yourself from a kidnapping scam:
- Establish a safe word with your family. Ask the caller for that word. If they can't give it, that's a sign the call is fake.
- Ask the caller to describe what your loved one looks like.
- Call or text the victim from another phone.
- Don't send money. Call the police on another phone.