Deepfake Kidnapping Calls: The Chilling New Face of AI Voice Scams
Written by: Charles Bethea, staff writer at The New Yorker
One night in Brooklyn, Robin and her husband Steve were asleep when her phone rang twice — her mother-in-law, Mona, calling after midnight. Concerned, Robin answered and heard Mona’s voice crying, “I can’t do it, I can’t do it.” Moments later, a man’s voice came through: “Get Steve.” Within minutes, the couple was convinced that Mona and her husband Bob were being held hostage.
Steve, a law-enforcement officer, called a colleague while staying on the line. The male caller threatened, “I’ve got a gun to your mom’s head.” The ransom? Just $500 via Venmo — an absurdly low sum that nonetheless seemed real enough to pay. When the payment went through, the caller demanded another $250 “for travel.” Only after the ordeal did Steve and Robin reach Mona — safe in bed, unaware of any call. The “hostage situation” had been faked with AI.
AI Voice Cloning: A Disturbing Evolution
Robin and Steve had fallen victim to a new kind of scam: AI voice cloning. Advances in generative AI have made it possible to reproduce anyone’s voice from a short audio clip. Hany Farid, a computer scientist at the University of California, Berkeley, explains: “I can now clone the voice of just about anybody and get them to say just about anything.”
While synthetic voices have existed for decades — from Apple’s early “Hello, I’m Macintosh” demo to Siri and Alexa — they’ve only recently become indistinguishable from real ones. In 2022, New York-based startup ElevenLabs unveiled software that can clone a voice using as little as 45 seconds of audio. Today, such technology is available for as little as $5 per month.
How Deepfake Scams Work
The scammer’s method is simple: scrape short voice clips from social media, generate an AI clone, and place a call that mimics the victim’s loved one. The emotional impact of hearing a child’s or a parent’s voice in distress can override logic — a psychological manipulation as old as crime itself, now powered by AI.
Victims like Robin and Steve are not alone. Arizona mother Jennifer DeStefano received a call from an unknown number. Her teenage daughter’s voice sobbed, “Mom, they have me!” A man then threatened violence and demanded $50,000. The daughter’s voice was an AI clone; she had been safe the entire time. DeStefano later testified before the U.S. Senate Judiciary Committee, warning lawmakers about the terrifying realism of these scams.
In another case, RaeLee Jorgensen, a teacher’s aide, got a call that appeared to come from her son’s number. “Hey, Mom, this is Tate,” said the familiar voice. Then another voice interjected, “I have your son and I’m going to shoot him.” Minutes later, she learned Tate was safe at school — but the emotional damage was real.
The Legal and Ethical Lag
Governments are struggling to keep pace. U.S. Senator Jon Ossoff described the issue as “urgent,” asking whether society can “get good enough fast enough at discerning real from fake.” The proposed QUIET Act aims to increase penalties for those who use AI to impersonate others, while states like Arizona seek to classify AI as a weapon when used in crimes.
The Federal Trade Commission (FTC) reports Americans lost over $2 billion to impostor scams in 2022. In 2024, it offered a $25,000 prize to developers who could build real-time deepfake detection tools. Some entries use AI to detect voice watermarking or metadata inconsistencies — but “there are no silver bullets,” says FTC official Will Maxson.
AI Voice Cloning’s Legitimate Uses
Not all applications are harmful. Companies like The Voice Keeper use the technology to preserve the voices of patients with ALS or throat cancer. Film studios deploy AI dubbing to sync actors’ voices across multiple languages. Even public figures such as Keith Byars and Mayor Eric Adams have used cloned voices for commercial or outreach purposes — though privacy advocates have raised concerns about consent and misuse.
Protecting Yourself from AI Voice Scams
- Pause before reacting. If a loved one sounds panicked, verify their safety via another channel (text or video call).
- Establish a family password. Use a pre-agreed code word only real relatives know.
- Never send money on the spot. Scammers rely on emotional urgency — step back and confirm.
- Be cautious with voice data. Limit what personal audio you share publicly.
Robin’s family now has their own verification plan. “It doesn’t seem like this scam is going to stop anytime soon,” she said. “So we came up with an extended-family password.” When asked if Mona remembered it, she laughed: “I’m going to have to go over it again. I still can’t believe I was worth only $750.”