Scammers Using AI to Copy Voices of Family and Friends to Trick You

Fraudsters are using AI to mimic the voices of loved ones in an effort to deceive their targets.

A mom targeted by an alleged AI voice scam spoke at a Senate hearing in June.


A mom was tricked by a scam that used a fake AI-generated voice to claim her daughter had been kidnapped. Experts say just a few seconds of audio pulled from social media can give scammers enough material to clone someone’s voice with AI.

Earlier this year, Jennifer DeStefano got a call from an unfamiliar number. She was shocked to hear her daughter’s voice, sounding desperate and scared, as if she was being kidnapped.

“I hear her saying, ‘Mom, these bad men have me. Help me. Help me. Help me.’ And even just saying it just gives me chills,” DeStefano told ABC News.

In truth, her daughter, Briana, was perfectly fine. But scammers had apparently used AI to clone Briana’s voice in an attempt to frighten her family into handing over money.

This case is just one example of a worrying pattern. Check Point Technologies, a major cybersecurity company, reports a significant rise in scams and attacks using AI over the past year. In 2022, phone and online scams cost Americans roughly $10 billion, according to the FBI’s Internet Crime Complaint Center.

When DeStefano got that disturbing call on Jan. 20, her 15-year-old daughter Briana had been away on a ski trip.

“This man gets on and he says, ‘Listen here. I have your daughter. If you call anybody, you call the police, I’m gonna pump your daughter so full of drugs, I’m gonna have my way with her, I’m gonna drop her in Mexico, and you’re never gonna see your daughter again,’” DeStefano said.

Then, DeStefano says, the scammer asked for $1 million.

“That’s when I went into panic mode. And I just opened up the door, put the phone on mute, and started screaming for help,” DeStefano said.

A person nearby heard the noise and called 911. The emergency operator said it seemed like DeStefano was caught in a common scam and asked if she had talked directly to her daughter.

Eventually, DeStefano reached her husband, who was also on the ski trip, and he confirmed that Briana was safe.

However, there were still unanswered questions. Who or what was really speaking on the phone, and how did they manage to imitate Briana’s voice so convincingly that even her own mother was fooled?

Experts caution that even a short snippet of social media content can provide scammers with what they need to replicate someone’s voice using artificial intelligence.

Reports of this sophisticated scam have risen in recent months. In May, a Texas man told “Good Morning America” that his father had been tricked into sending $1,000 to a scammer who posed as the father’s grandson, Christian, claiming he was in trouble in Mexico and needed money to get out of it.

Pete Nicoletti of Check Point Technologies suggests that every family establish a “safe word” that can be used to verify a loved one’s identity during an apparent kidnapping call.

Rich Frankel, a former FBI special agent and ABC News contributor, explains that this type of cybercrime is difficult to stop. As a precaution, he advises recording any suspicious calls and trying to reach your loved one directly.

“I would call law enforcement right away, because if it is a real kidnapping, you want law enforcement involved. And if it’s a scam, you wanna know about it right away,” Frankel said.

DeStefano is taking steps to prevent others from becoming victims of such scams. In June, she testified at a Senate Judiciary subcommittee hearing focused on AI and human rights.

“Is this the future we are creating by enabling the use of artificial intelligence without consequence and without regulation?” DeStefano asked lawmakers.
