“I swear it was her cry”: How AI tricked a woman into believing her sister was kidnapped

Police are issuing renewed warnings about a disturbing scam that uses artificial intelligence to mimic the voices of loved ones, convincing victims that a family member has been kidnapped.

FOX 5 NY's Teresa Priolo spoke to Stephanie, who asked to go by her first name. She says she was extorted after receiving a phone call that appeared to come from her sister's number.

What we know:

When she picked up, she heard a man threatening to kill her sister, and a woman crying in the background who sounded exactly like her sister.

A horrifying call that felt real

Timeline:

The caller claimed he had just been released from jail and needed money to get home. Then, in the background, Stephanie heard what sounded like her sister screaming, crying and begging for help.

"She has this specific way she cries," Stephanie said. "And I swear to God, that’s what I heard in the cry as well. So to me, it was real."

Panicked, she logged on to Find My iPhone and saw her sister’s phone pinging from her apartment. That should have been a sign that nothing was wrong, but the voice on the phone was so convincing, she couldn’t risk ignoring it.

Stephanie made a payment, but the man demanded more, threatening to kill her sister if police were contacted. Then the call suddenly disconnected.

Desperate, Stephanie and her cousin tried calling her sister from another phone, but there was no answer.

They contacted the doorman at her sister’s building and begged for a welfare check. Moments later, he confirmed what seemed impossible: her sister was fine. She had been asleep the entire time.

A scam powered by AI

Local perspective:

Investigators say the call was part of a growing wave of AI-assisted "virtual kidnapping" scams. Scammers harvest audio from social media, often short clips posted to Instagram, Snapchat, or Facebook Reels, and use it to create realistic voice clones that sound exactly like a victim's loved one.

"They grab information off of social media, any video with audio, and put it into a ‘FraudGPT’ generator," cyber security expert, Robert Siciliano explained. "That allows them to create the exact likeness of that person’s voice."

These scams are growing more sophisticated, using advanced language models to carry on full, natural-sounding conversations without the pauses, glitches, or robotic tones that once gave fakes away.

How to know if something is AI 

Experts say the best way to protect yourself is to stay calm and ask questions only your loved one would know the answer to, such as what you ate for dinner last night, a shared inside joke, or the name of a favorite restaurant.

"To tell if it’s real, ask something only a human, your human, would know," Siciliano advised.

Questions like ‘What did we have for dinner?’ or ‘What’s my sister’s favorite color?’ can help identify whether the threat is real.

What you can do:

Police are now urging New Yorkers to think twice before sharing personal videos publicly and to verify any alarming calls before sending money.

"You have to be calm, cool, and collected," one expert said. "That’s the only way to outsmart the technology."

The Source: This article is based on reporting by FOX 5 NY's Teresa Priolo.
