AI scam calls that mimic familiar voices are a growing problem – here’s how they work

Call scams that use artificial intelligence to mimic the voices of people you may know are being used to exploit unsuspecting members of the public. These calls use what is known as generative AI, which refers to systems that can create text, images, or other media such as video, based on a user’s prompts.

Deepfakes have gained notoriety in recent years with a number of high-profile incidents, such as actress Emma Watson’s likeness being used in a series of suggestive ads appearing on Facebook and Instagram.

There was also the widely shared and debunked video from 2022 in which Ukrainian President Volodymyr Zelensky appeared to be telling Ukrainians to lay down their arms.

Now, the technology for creating an audio deepfake – a lifelike copy of a person’s voice – is becoming increasingly common. To create a realistic copy of someone’s voice, you need data to train the algorithm. This means having many audio recordings of your intended target’s voice. The more examples of the person’s voice you can feed into the algorithms, the better and more convincing the final copy will be.

Many of us are already sharing details of our daily lives on the internet. This means the audio data needed to create a realistic copy of a voice could be readily available on social media. But what happens once a copy is out there? What’s the worst that can happen? A deepfake algorithm could allow anyone who holds the data to make “you” say whatever they want. In practice, this can be as simple as typing out some text and having the computer read it aloud in what sounds like your voice.
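To give a sense of how simple the “type text, hear speech” step can be, here is a minimal sketch using the open-source pyttsx3 Python library. It produces only a generic synthetic voice: the cloning step, in which a model is trained on recordings of the target, is deliberately not shown, and the library choice and example phrase are assumptions for illustration.

```python
# Minimal text-to-speech sketch (generic voice only, no voice cloning).
# Assumes the pyttsx3 library is installed; the phrase below is illustrative.
import pyttsx3

engine = pyttsx3.init()          # initialise the system's speech engine
engine.setProperty("rate", 160)  # speaking speed, in words per minute
engine.say("Hi Mum, it's me. I need your help.")  # queue the text to be spoken
engine.runAndWait()              # read the queued text aloud
```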

Big challenges

This capability risks increasing the prevalence of audio misinformation and disinformation. It can be used to try to sway international or national public opinion, as seen with the Zelensky video.

But the ubiquity and availability of these technologies also pose significant challenges at the local level, particularly in the growing trend of AI scam calls. Many of us will have received a scam or phishing call informing us, for example, that our computer has been compromised and that we need to log in immediately, potentially giving the caller access to our data.

Image: Real and deepfake voices can be distinguished by their spectrogram, or voiceprint. (Brastock/Shutterstock)

It is often very easy to spot a hoax, especially when the caller is making requests that someone from a legitimate organization would not. However, now imagine that the voice on the other end of the phone is not just a stranger, but sounds exactly like a friend or loved one. This injects a whole new level of complexity and panic for the hapless recipient.

A recent story reported by CNN highlights an incident in which a mother received a call from an unknown number. When she answered the phone, it was her daughter. The daughter had supposedly been kidnapped and was phoning her mother to pass on a ransom demand.

In fact, the girl was safe and sound. The crooks had faked her voice. This is not an isolated incident, and variations of the scam include an alleged car accident, in which the “victim” phones their family to ask for money to help them after a crash.

Old trick using new technology

This is not a new scam per se; the term “virtual kidnapping scam” has been around for several years. It can take many forms, but a common approach is to trick victims into paying a ransom to free a loved one they believe is under threat.

The scammer tries to establish unquestioning compliance, in order to get the victim to pay a quick ransom before the deception is discovered. However, the dawn of powerful and available AI technologies has upped the ante significantly and made things more personal. It’s one thing to hang up on an anonymous caller, but it takes real trust in your judgment to hang up on a call from someone who sounds just like your child or partner.

There is software that can be used to identify audio deepfakes by creating a visual representation of the audio called a spectrogram. When you listen to the call it might seem impossible to distinguish it from the real person, but real and faked voices can be told apart when their spectrograms are analyzed side by side. At least one group has offered detection software for download, although such tools may still require some technical knowledge to use.
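For readers curious what that analysis involves, here is a minimal sketch that plots a spectrogram from an audio file using the open-source librosa and matplotlib Python libraries. The file name and display settings are assumptions for illustration; dedicated detection tools build on this kind of representation rather than on this exact script.

```python
# Minimal spectrogram sketch: turn an audio recording into the time-frequency
# picture described above. Assumes a local file named "call_recording.wav".
import numpy as np
import librosa
import librosa.display
import matplotlib.pyplot as plt

# Load the recording (librosa resamples to 22,050 Hz by default)
audio, sample_rate = librosa.load("call_recording.wav")

# Short-time Fourier transform, converted to magnitude in decibels
stft = librosa.stft(audio)
spectrogram_db = librosa.amplitude_to_db(np.abs(stft), ref=np.max)

# Plot with time on the x-axis and frequency on the y-axis
librosa.display.specshow(spectrogram_db, sr=sample_rate, x_axis="time", y_axis="hz")
plt.colorbar(format="%+2.0f dB")
plt.title("Spectrogram of call_recording.wav")
plt.tight_layout()
plt.show()
```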

Most people won’t be able to generate spectrograms, so what can you do when you’re not sure whether what you’re hearing is real? As with any other form of media you may encounter: be skeptical.

If you get a call from a loved one out of the blue and they ask you for money or make requests that seem out of the ordinary, call them back or text them to confirm that you are indeed talking to them.

As the capabilities of AI expand, the lines between fact and fiction will blur more and more. And it’s not likely that we’ll be able to put the technology back in the box. This means that people will have to become more cautious.
