Your voice is at risk: Scammers find new way to steal identities
By Seia Ibanez
With technology advancing at an unprecedented pace, it's no surprise that the tools of tricksters and con artists are evolving as well.
The latest threat to our security doesn't come in the form of a suspicious email or a dodgy text message but from something far more personal—our own voices.
Scammers are now using sophisticated AI technologies to clone voices and commit fraud, a concerning development that has implications for all of us.
Imagine receiving a call from a loved one claiming they're in trouble and urgently need money.
But what if the voice pleading for your assistance wasn't actually your family member or friend but a scammer using a cloned version of their voice?
This is the reality of voice cloning technology, which has become the latest tool in a scammer's arsenal.
Criminals are now using sophisticated artificial intelligence (AI) to recreate people's voices with chilling accuracy.
These audio deepfakes can be weaponised for financial fraud, defamation, and even to produce cheap voice work by replicating someone's voice without their consent.
The technology has evolved to the point where it's difficult to distinguish between a real voice and an AI-generated one.
This has caught the attention of authorities and consumer advocacy groups, who are urging the public to be vigilant.
‘At the moment, we worry about people that call us that try and get our information or tell us that we have to pay something like a toll or whatever,’ said Monica Whitty, a Professor of Human Factors in Cybersecurity and Head of the Department of Software Systems and Cybersecurity at Monash University.
‘But now it could be that they try to garner bits of your voice that then can be used and trained with AI…to sound like yourself.’
‘They (the scammers) might ring a family member of mine using my voice to say, “Look, I've had my wallet stolen, I'm in Spain—can you send me some money? It's really urgent. I've been kidnapped. They've got me for hostage,”’ Whitty said as she explained how audio deepfakes work.
‘So they'll set up a fake scenario with your voice to legitimise that big scenario.’
Although voice fraud is not new, Whitty and others emphasised that technological advancements have expanded its range and effectiveness.
‘In scams like romance scams or investment scams, it used to be the case that you'd only bring in voice when it looked like the person was doubting the person creating the fake relationship…the voice helps develop more trust,’ Whitty said.
‘It's evolved to the fact that the technology can mimic someone's voice that you actually know. So now that makes it even better for the criminal to use that in these scams.’
Shahriar Kaisar, a Cybersecurity Researcher and Leader in the Information Security and Business Department at the Royal Melbourne Institute of Technology (RMIT), said that the rise of generative AI systems such as ChatGPT has made these scams significantly more sophisticated.
These scams have evolved to ‘a very different level, where it has become very difficult to distinguish between what is real and what is not.’
‘They use a machine learning model where they would collect voice samples or sometimes even video images from online public sources—it could be a YouTube video, it could be something that you have shared on TikTok,’ Kaisar explained.
The Australian Association of Voice Actors (AAVA) also felt the impact, with voice actors discovering their voices cloned and used without permission.
‘People come to us and say, “Oh, I just found out that my voice is being used on a YouTube video episode that I had nothing to do with, and what can I do about it?”’ Simon Kenny, the President of AAVA, said.
‘And the answer is, at this stage, not much, sadly, because the legislation hasn't caught up yet.’
The industry is pushing for legislation to protect individuals' voices and likenesses from being used unethically.
So, how can you protect yourself from falling victim to these voice-cloning scams?
Kaisar warned people to be careful about what they share online, especially videos and voice recordings, to avoid becoming victims of voice cloning.
‘We would want some help from the platforms that are being used for developing deepfakes as well, and also platforms that are being used for sharing those,’ he added.
Whitty further urged people to remain vigilant and not to spend too much time talking to someone suspicious or unknown.
‘If you've got a family member or someone that's ringing you for something urgent, it's better to stop, pause and…then ring that person up again or contact them in some other way just to make sure that it was them talking to you,’ Whitty said.
‘If it says unknown, just ignore it altogether.’
This comes after Sunshine Coast Mayor Rosanna Natoli became a victim when her likeness was used without her consent: a friend believed she was speaking with Natoli over Skype when, in fact, it was an impersonator.
Have you or someone you know encountered a voice cloning scam? Share your experiences in the comments below.
Key Takeaways
- Criminals are utilising sophisticated AI to clone people’s voices and commit fraud and other malicious acts.
- Experts warned individuals to be vigilant and cautious when answering calls from unknown numbers and sharing information online.
- Voice actors and others in the voice industry are discovering unauthorised use of their voices and are pushing for legal protections.
- The public is advised to be careful with the personal content they share online and to verify urgent requests from family or friends to prevent falling victim to these scams.