‘I’ve got your daughter’: Mom warns of terrifying AI voice cloning scam that faked kidnapping

Published: Apr. 10, 2023 at 5:31 PM EDT

SCOTTSDALE, Ariz. (KPHO/Gray News) – A mother in Arizona is warning others about a terrifying phone scam involving artificial intelligence that can clone a loved one’s voice.

Jennifer DeStefano said she got a call from an unfamiliar phone number and almost let it go to voicemail. However, her 15-year-old daughter was out of town skiing, so she picked up the phone, fearing maybe there had been an accident.

“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano recalled. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

DeStefano said she then heard a man’s voice say, “Put your head back, lie down.”

DeStefano’s confusion turned to terror.

“This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico,’” DeStefano recalled. “And at that moment, I just started shaking. In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”

The man on the phone then demanded money, first asking for $1 million, then lowering his demand to $50,000 when DeStefano said she did not have the funds.

DeStefano kept him talking. During the phone call, she happened to be at her other daughter’s dance studio, surrounded by worried moms who wanted to help. One called 911, and another called DeStefano’s husband.

Within just four minutes, they confirmed her daughter was safe.

“She was upstairs in her room going, ‘What? What’s going on?’” DeStefano said. “Then I get angry, obviously, with these guys. This is not something you play around with.”

Once she realized her daughter was safe, DeStefano hung up. But there had been no doubt in her mind that it was her daughter's voice on the phone.

“It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

But it turns out, the 15-year-old never said any of it. The voice on the phone was just a clone created by artificial intelligence.

Subbarao Kambhampati, a computer science professor at Arizona State University specializing in AI, said voice cloning technology is rapidly improving.

“You can no longer trust your ears,” Kambhampati said.

Previously, cloning a voice required a large number of audio samples from the person being cloned. Nowadays, Kambhampati said, a voice can be cloned from just three seconds of audio.

“And with the three seconds, it can come close to how exactly you sound,” Kambhampati said. “Most of the voice cloning actually captures the inflection as well as the emotion.”

Deep learning technology currently has very little oversight, and according to Kambhampati, it is becoming easier to access and use.

“It’s a new toy, and I think there could be good uses, but certainly there can be pretty worrisome uses too,” he said.

Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, said scammers who use voice cloning technology often find their prey on social media.

To avoid becoming a victim of scams like this, Mayo urges everyone to set their social media profiles to private so they are not visible to the public.

“You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this, because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you,” Mayo said.

According to the Federal Trade Commission, scammers will often ask victims to wire money, send cryptocurrency or pay ransom with gift cards. Once the money is transferred, getting it back is almost impossible.

Mayo said red flags to look for include the phone number coming from an area code that you’re not familiar with, the phone number being an international number, and the person on the phone not allowing you to talk to other family members for help.

“Just think of the movies. Slow it down. Slow the person down. Ask a bunch of questions,” Mayo said. “If they have someone of interest to you, you’re going to know a lot of details about them that this scam artist isn’t going to know. You start asking questions about who it is and different details of their background that are not publicly available, you’re going to find out real quick that it’s a scam artist.”

It’s unknown how many people have received similar scam calls about a family emergency or fake kidnapping using an AI voice clone, but Mayo said it “happens on a daily basis,” though not everyone reports the call.

Mayo said he believes people are so relieved that their family members are safe that they forget to report the scam.

“However, there are some people who give in to these and they end up sending the money to these individuals,” Mayo said. “Trust me, the FBI is looking into these people, and we find them.”

As for DeStefano, she’s thankful she didn’t send the scammers any money, but the experience still left her traumatized.

“I literally just sat down and broke down crying,” she said. “They were tears for all of the what ifs. It all just seemed so real.”