AI kidnapping scam copied teen girl’s voice in $1M extortion attempt – National

An Arizona mother was left reeling after narrowly avoiding paying thousands of dollars to a scammer who convinced her he was holding her 15-year-old daughter hostage.

Jennifer DeStefano told local news station KPHO that she never suspected for a moment that the voice on the phone wasn’t her daughter’s.

“It was totally her voice,” the Scottsdale resident said in a video interview last week.

DeStefano said she got a call from an unknown number while she was out at her other daughter’s dance studio. She thought about sending it to voicemail, but her 15-year-old daughter was out of town skiing and she worried there had been an accident, so she answered the call.

“I picked up the phone and I heard my daughter’s voice saying, ‘Mom!’ and she was sobbing,” DeStefano said. “I said, ‘What’s wrong?’ And she said, ‘Mommy, I messed up,’ and she was sobbing and crying.”


At that moment, a man’s voice took over the call and appeared to order DeStefano’s daughter to lie down.

“This man gets on the phone and he’s like, ‘Listen here. I have your daughter. This is how it’s going to go down. You call the police, you call anybody, and I’m going to pump her so full of drugs, I’m going to have my way with her, and I’m going to drop her off in Mexico,’” DeStefano said.

“And at that moment, I just started shaking. In the background, she’s screaming, ‘Help me, Mom. Help me. Help me.’”

DeStefano said the voice was indistinguishable from that of her real daughter.

“It was totally her voice. It was her inflection. It was the way she cried,” DeStefano said.


The man on the phone demanded US$1 million for her daughter’s safe return. When DeStefano said she didn’t have that much money, he lowered the “ransom” to US$50,000.


Because DeStefano was at her other daughter’s dance studio, she was surrounded by other concerned parents who quickly caught on to the situation. One called 911; another called DeStefano’s husband.

Within four minutes, KPHO reported, they were able to confirm that DeStefano’s supposedly kidnapped daughter was safe.

DeStefano hung up on the scammer and broke down in tears.

“It all seemed so real,” she said. “I never for a second doubted it was her. That’s the weird part that really got me to my core.”


In reality, the 15-year-old hadn’t said any of the things her mother heard on the phone that day. Police are still investigating the extortion attempt, but it is believed the crooks used artificial intelligence (AI) software to clone the teenager’s voice.

AI-generated voices are already being used on-screen to replicate actors. One recent example is Star Wars actor James Earl Jones, now 92 years old. Last year, Jones signed a deal allowing Disney to use AI to recreate his voice, drawing on his earlier performances, for use in the TV series Obi-Wan Kenobi.


Experts say AI speech generation is becoming easier for ordinary people to access and use as the technology improves. It is no longer limited to Hollywood studios and computer programmers.

Where a convincing voice clone once required lengthy recordings, one can now be created from just seconds of recorded speech.

ElevenLabs, an AI lab that released a beta synthetic-voice tool in January, shared a Twitter thread reporting that a “set of actors” had been using the company’s technology for “malicious purposes.”

ElevenLabs wrote that its VoiceLab technology was being used in a growing number of “voice cloning misuse cases.” That led the company to roll out a series of new features to make synthetic speech easier to verify as AI-generated, and to move the tool behind a paywall.


Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, told KPHO that scams using AI voice-cloning techniques are becoming more common. Not all victims of AI fraud report it, but it “occurs on a daily basis,” he said.

“Believe me, the FBI is investigating these people and we will find them,” Mayo said.


Mayo is urging people to keep their social media profiles private rather than visible to the public.

“If you have [your social media account] public, you can get scammed by people like this, because they’re looking for profiles that reveal as much information about you as possible, and they’re going to dig into you,” Mayo said.

Earlier this month, a Canadian couple was swindled out of $21,000 after a man claiming to be a lawyer persuaded them that their son was in jail for killing a diplomat in a car accident.


Scammers used an AI-generated voice to impersonate the couple’s son and beg the parents to pay the fake legal fees, The Washington Post reported.

The son told the outlet that the voice was “close enough” that his parents genuinely believed they had spoken with him.

The couple sent the money to the scammers in bitcoin, only realizing their son was never in danger when he called to check in later that night.

Fraud and cybercrime in Canada can be reported to the Canadian Anti-Fraud Centre.

© 2023 Global News, a division of Corus Entertainment Inc.
