Beware! ‘Sophisticated’ AI voice cloning scams are skyrocketing
By VanessaC
We know all too well how quickly technology can change, and it can be both a blessing and a curse—depending on how you look at it.
But there are some malicious people out there who also take advantage of technological advancements and use them for ill-gotten gains.
Case in point: The emergence of more sophisticated Artificial Intelligence (AI) voice cloning scams.
This type of scam, which has seen a sharp rise in the past year, involves cybercriminals stealing a victim’s voice from social media and using it to fool unsuspecting family members with a fearmongering phone call. This is done in the hopes of receiving money or access to private information.
Mike Scheumack, Chief Innovation Officer at IdentityIQ, an identity theft protection and credit score monitoring firm, said: 'AI has been around for a long time, and software companies have been using it to advance technology for a while.'
'We’ve seen it start entering into this kind of cybercriminal space slowly, then all of the sudden just ramp up very quickly over the past year or so.’
'We’ve seen a lot in terms of advanced phishing scams, targeted phishing scams, we’ve seen where AI is being used to generate very specific emails, and the language is very specific as to who the target is,’ he continued.
'We’ve seen AI voice cloning scams increase over the past year as well, which is a very scary topic.'
So, how exactly is AI voice cloning done?
'All they need is as little as three seconds—10 seconds is even better—to get a very realistic clone of your voice,' Scheumack said.
A scammer takes this short clip of someone's voice—usually lifted from a social media post—and runs it through an AI program that replicates it.
They can then make the clone say whatever they want by typing in words, and, depending on how the scam is scripted, they can even add more emotions like fear or laughter into the voice.
To make this scam even more credible, scammers will use AI programs to search the internet for more data on the person they are targeting—this may include data about what they do for a living and more.
'The scary thing is, is that this is not your next-door neighbour doing this…This is a sophisticated organisation, it’s not [just] one person doing it. You have people that are researching on social media and gathering data about people,' Scheumack added.
'Those are not the same people that are going to plug in your voice. You have somebody else that’s going to clone the voice. You have somebody else that’s going to actually commit the act of calling. And you have somebody come to the victim’s house and pick up money if the scam is working.'
Scheumack recounted how this tactic was used against a woman the firm recently interviewed, who received what she believed was a panicked call from her daughter, then away at camp. The call turned out to be an AI-generated clone of her daughter's voice, built from an audio sample the scammers found on social media.
The scammers also found a social media post of her daughter about leaving for camp and had used that information to make the call more credible.
Another telling trait of AI voice cloning scams is that the calls are usually short. The scammer will cut off any real conversation by saying something like, 'I can’t talk right now,' to induce panic, then exploit the victim’s emotions to gain access to money or sensitive information.
'The goal of the scammer is to get you into fight or flight and to create urgency in your mind that your loved one is in some sort of trouble. So the best way to deal with those situations is to hang up and immediately call your loved one to verify if it’s them or not,' Scheumack explained.
In another example, a couple in Texas was scammed out of $5,000 through a call they believed was from their son.
The scammer reportedly went to extreme lengths to persuade them, fabricating a story that their son had been in a car accident and that the other person involved was a pregnant woman who had suffered a miscarriage.
‘I could have sworn I was talking to my son. We had a conversation,’ Kathy, the mother, recalled.
So, with an upsurge of these scams reported, what actions can you take to ensure you do not fall prey to them?
Scheumack said the first step is to be cautious about what you post online that’s visible to the public. Secondly, if you receive a call from an unknown number and someone you know claims to be in an urgent situation—hang up and call that person back on a number you know to verify it’s actually them.
'Generally, take caution with that—that should be a red flag to you if you’re receiving a call from an unknown number and it’s a relative or a loved one and there’s an urgent situation.'
'You should definitely take a second to think about that.'
Scheumack also suggested that families consider agreeing on a password they can use to verify a caller’s identity in an emergency.
Members, what do you think about this recent development in scams? Have you encountered something similar? Let us know in the comments below!
Key Takeaways
- Scammers are increasingly using Artificial Intelligence (AI) tools to clone their victims’ voices and dupe their loved ones into sending money or sharing private information.
- In order to clone voices, scammers either record a person’s voice or find an audio clip on social media. These audio samples are then run through an AI program to produce realistic voice clones.
- Mike Scheumack, the Chief Innovation Officer at identity theft protection firm IdentityIQ, revealed that AI voice cloning scams have surged over the past year.
- As a precaution, Scheumack suggested everyone be wary of what they post online, be careful when receiving urgent calls from unfamiliar numbers, and consider creating a family password for verification purposes in emergency situations.