Monday, May 08, 2023

Arizona Mother Warns About AI Voice Cloning After Kidnapping Scam

From TheEpochTimes.com (April 14):

An Arizona mother received an unexpected phone call from her daughter, only it wasn’t actually her daughter calling.

Jennifer DeStefano’s 15-year-old daughter was out of town on a ski trip, so when a call appeared to come from her, DeStefano didn’t assume anything was out of the ordinary.

“I pick up the phone, and I hear my daughter’s voice, ‘Mom!’ and she’s sobbing,” DeStefano told a local news station affiliated with CBS.

When DeStefano asked, “What happened?” her daughter’s voice replied: “‘Mom, I messed up,’ and she’s sobbing and crying,” DeStefano told the outlet.

DeStefano then began to panic as she heard a man’s voice in the background.

“I hear a man’s voice say, ‘Put your head back. Lie down,’” DeStefano said, confused as to what was happening to her daughter.

“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her, and I’m going to drop her off in Mexico,’” DeStefano explained.

The frightened mother began shaking as she heard her daughter yelling, “Help me, Mom. Please help me. Help me,” DeStefano said.

In reality, the voice on the call was not her daughter’s at all, but a clone created by artificial intelligence.

How Are AI Voice Clones Created?

Subbarao Kambhampati, a computer science professor at Arizona State University who specializes in AI, explained how realistic deepfake voices have become and how difficult they can be to detect.

“You can no longer trust your ears,” Kambhampati said in an interview with WKYT.

As the technology advances, voice cloning makes it ever easier to fabricate a convincing replica of another person’s voice.

“In the beginning, it would require a larger amount of samples. Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound,” Kambhampati said.

Protecting Yourself From Scam Attacks

According to the Federal Trade Commission, scammers often ask victims to wire money, send cryptocurrency, or pay the ransom with gift cards. That way, the scammer can collect the money without being traced.

Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, alerted the public about scammers scouting for victims through public social media profiles.

“You’ve got to keep that stuff locked down. The problem is, if you have it public, you’re allowing yourself to be scammed by people like this because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you,” Mayo told WKYT.

Mayo offered some insight into how to recognize whether you’re being targeted by a scam and what to look out for.

“If the phone number is coming from an area code you’re unfamiliar with, that should be one red flag,” Mayo said. “Second red flag: international numbers. Sometimes they will call from those as well. The third red flag: they will not allow you to get off the phone and talk to your significant other. That’s a problem.”

With AI voice cloning on the rise, Mayo says the FBI is keeping track of scammers’ every move.

“However, there are some people who give in to these and they end up sending the money to these individuals,” Mayo said. “Trust me, the FBI is looking into these people, and we find them.”

The AI clone of her daughter’s voice was so convincing that DeStefano never doubted she was actually speaking with her daughter.

“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” DeStefano said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

Safe and Sound

With help from family members, DeStefano quickly confirmed that her daughter was safe.

“She was upstairs in her room going, ‘What? What’s going on?’” DeStefano said. “Then I get angry, obviously, with these guys. This is not something you play around with.”

DeStefano posted a warning message to all social media users on her Facebook.

“The only way to stop this is with public awareness!!,” she said. “Also, have a family emergency word or question that only you know so you can validate you are not being scammed with AI! Stay safe!!!” [source]

How long will it be until someone figures out how to make a cloned voice sing, bringing back John Lennon or Michael Jackson? Then you could have a ChatGPT program write new lyrics for the dead singers.

1 comment:

Andy said...

Also, book publishing companies can use this technology to make audio books. Just sample someone's voice (or even an author's voice) and have an AI read the book. It would be cheaper and the AI doesn't need breaks. It can read the whole book in one session. The publisher could (if it had voice samples) even have a dead author read his or her works.