Mom Says Kidnapping Scammers Used AI To Recreate Her Daughter's Voice

Kidnapping scams are becoming more and more common, so much so that both the National Institutes of Health (NIH) and the FBI have recently issued warnings for families to be on the lookout for such calls. And now scammers are taking advantage of new AI technology that can make their calls more convincing (and frightening) than ever.

This week, an Arizona mom reported that she got a call from fake kidnappers who used an AI recreation of her 15-year-old daughter’s voice to make her believe her daughter was being held captive and would only be released if she paid a ransom.

Jennifer DeStefano told Arizona CBS News Channel 5 that when she picked up a call from an unknown number, she immediately recognized her teen daughter’s voice, and the girl sounded like she was in distress.

“I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano told the local news station. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

Then a man got on the phone and demanded $1 million from the family, or else he would take advantage of her and then drop her in Mexico. In the background, she could hear her daughter calling for help and crying.

Because her daughter was away on a ski trip, DeStefano feared that the call could be real.

“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” she told Channel 5. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

Luckily, DeStefano was at her other daughter’s dance studio at the time, and the women around her jumped into action as soon as they heard what was happening. One called 911 and another called DeStefano’s husband, who was quickly able to locate their daughter, safe and sound.

In the past, it took lengthy recordings and a lot of processing time for a computer to reliably imitate a specific person’s voice. But now, with AI technology taking off, experts say that a voice can be cloned from only a few seconds of recorded material.

“You can no longer trust your ears,” Subbarao Kambhampati, a computer science professor at Arizona State University specializing in AI, told the news channel. “Most of the voice cloning actually captures the inflection as well as the emotion. Obviously, if you spoke in your normal voice, I wouldn’t necessarily be able to clone how you might sound when you’re upset, but if I also had three seconds of your upset voice, then all bets are off.”

Experts also urge parents to be extra careful on social media, especially about sharing too much private information or voice clips, both on their own accounts and their kids’ accounts. But DeStefano said in a Facebook post that her daughter was careful and this still happened.

“Brie does NOT have any public social media accounts that has her voice,” she wrote on Facebook. “She has a few public interviews for sports/school that have a large sampling of her voice.”

Authorities say to be wary of calls from numbers outside your local area code as well as international calls. The scammers will also try to dissuade you from calling your significant other, who might be in contact with your child, or who might recognize the scam. Finally, the scammers will try to keep you on the phone and ask you to wire them money or send gift cards instead of meeting them at a physical location. They might also ask for a more reasonable amount of money than actual kidnappers would — in the DeStefano case, they originally asked for $1 million before quickly downshifting to $50,000.

To slow down the situation, ask for solid proof they have your child, and, of course, call the police as soon as possible.

DeStefano had another tip: “Have a family emergency word or question that only you know so you can validate you are not being scammed with AI,” she suggested on Facebook.
