[Image: A middle-aged man receiving a call. © fizkes/Shutterstock.com]

It’s official: new AI-driven scams are here.

A recent slew of generative artificial intelligence applications, including image generators, chatbots, and now voice cloners, is being leveraged by cybercriminals to dupe victims.

In this case, highly convincing voice-emulation technology is being abused by cybercriminals to spoof countless unsuspecting victims and defraud them of millions of dollars. All generative AI voice software needs is a brief snippet of a person’s voice to convincingly reproduce it; a scammer can then call a victim and pretend to be someone close to them. Meanwhile, people upload their voices to apps and social media every day, offering plenty of samples.
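To illustrate how low the barrier has become, here is a minimal sketch using the open-source Coqui TTS package, one of several freely available tools; the article does not name a specific one, so the library, model identifier, and file names below are assumptions:

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS package
# (pip install TTS). The model name and file paths are illustrative assumptions.
from TTS.api import TTS

# Load a multilingual model that supports zero-shot voice cloning.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A reference clip of just a few seconds is enough to condition the cloned voice.
tts.tts_to_file(
    text="This sentence will be spoken in the cloned voice.",
    speaker_wav="reference_clip.wav",  # e.g. audio pulled from a public post
    language="en",
    file_path="cloned_output.wav",
)
```

The point is not the specific tool but the workflow: one short recording, one function call, and an arbitrary script read back in the target’s voice.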

In just one real-life example, revealed by The Washington Post on Sunday, an elderly couple rushed to withdraw thousands of dollars from their bank after a call from an AI voice scammer led them to believe that their grandson had been arrested and was calling them from jail. “We were convinced that we were talking to Brandon,” the couple said.

Not Just a ‘Deepfake’

The use of AI to reproduce a person’s voice, also known as a voice deepfake, is not exactly new. However, full-blown generative AI voice cloning is now practically indistinguishable from the real thing.

An early example of a stitched-together voice, well before full-blown generative AI became widely available, came in 2019, when scammers convinced the CEO of a U.K.-based energy company that he was speaking with his boss.

This resulted in the CEO transferring $243,000 to a fraudulent supplier. At the time, officials said it was the first scam they had heard of that was carried out entirely with the help of AI.

Deepfakes also come in video and text form. For instance, just before Valentine’s Day this year, researchers at McAfee Corp. warned that revolutionary AI tools like ChatGPT and Google Bard would make online dating scams such as catfishing much harder to detect.

Scammers are also using AI to propagate sophisticated LinkedIn job scams. As for voice replication, several websites already offer voice generation and/or cloning for free, such as Resemble.ai.

Another company that offers such services, ElevenLabs, noted in a Jan. 30 tweet that it was “seeing an increasing number of voice cloning misuse cases.” The next day, ElevenLabs stopped offering voice cloning on the free version of its VoiceLab tool.

Consumers Lost Over $8 Billion to Fraud in 2022

AI tools are an incredible aid to scammers looking to orchestrate impersonation schemes. Rapid innovations in machine learning, such as large language models (LLMs) and multimodal artificial intelligence frameworks, have led to a boom in the generative AI products market.

For cybercriminals, it is easier, quicker, and more effective to defraud someone with a cloned voice than to send an old-school phishing email in the hope of nabbing a victim’s account password. Unfortunately, this trend is driving a sharp rise in impersonation scams this year. According to the U.S. Federal Trade Commission (FTC), consumers reported losing almost $8.8 billion to fraud in 2022.

On the other hand, this technology has more positive uses as well. According to Gartner, conversational AI is expected to revolutionize contact centers, cutting agent labor costs by $80 billion by 2026.

“Gartner projects that one in 10 agent interactions will be automated by 2026, an increase from an estimated 1.6% of interactions today that are automated using AI. Conversational AI can automate all or part of a contact center customer interaction through both voice and digital channels, through voicebots or chatbots, and it is expected to have transformational benefits to customer service and support organizations within two years.”

Precautions and Safety Measures

The implications of being able to convincingly mimic any voice from a brief recording are alarming. To keep your voice out of a scam, avoid posting recordings of yourself to social media, especially publicly. It may also be wise to permanently delete voice recordings from other apps you’ve downloaded once they have served their purpose.

If you receive a call from someone you supposedly know but who seems to be acting strangely, ask questions only the real person could answer, and be wary of anyone requesting financial assistance.

If you receive a call from someone claiming to represent a company, never give out your password or any authentication codes, even if prompted. If anything seems suspicious, do not engage at all; only deal with representatives when you initiated the contact. When emailing, check that addresses come from the company’s official domain, as in the sketch below.
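As a concrete illustration of that last check, here is a minimal sketch in Python. The allowlisted domains are hypothetical placeholders; in practice, use the domains printed on official correspondence or the company’s verified website, never ones taken from the suspicious email itself:

```python
# Minimal sketch: check whether a sender address belongs to a known official domain.
from email.utils import parseaddr

OFFICIAL_DOMAINS = {"bank.example.com", "support.example.com"}  # hypothetical allowlist

def is_official_sender(raw_from_header: str) -> bool:
    """Return True only if the sender's domain exactly matches an allowlisted domain."""
    _, address = parseaddr(raw_from_header)
    if "@" not in address:
        return False
    domain = address.rsplit("@", 1)[1].lower()
    # Exact matching guards against lookalikes such as "bank.example.com.evil.io",
    # which contain the official name but resolve to a different domain.
    return domain in OFFICIAL_DOMAINS

print(is_official_sender("Support <help@bank.example.com>"))     # True
print(is_official_sender("Support <help@bank-example-com.io>"))  # False
```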

To find out more about the various types of AI-generated scams, take a look at our in-depth guide on deepfakes.
