Your voice is at risk: Scammers find new way to steal identities

With technology advancing at an unprecedented pace, it's no surprise that the tools of tricksters and con artists are evolving as well.

The latest threat to our security doesn't come in the form of a suspicious email or a dodgy text message but from something far more personal—our own voices.

Scammers are now using sophisticated AI technologies to clone voices and commit fraud, a concerning development that has implications for all of us.



Imagine receiving a call from a loved one claiming they're in trouble and urgently need money.

But what if the voice pleading for your assistance wasn't actually your family member or friend but a scammer using a cloned version of their voice?

This is the reality of voice cloning technology, which has become the latest tool in a scammer's arsenal.


Scammers use your voice as a new way to scam people. Credit: Shutterstock


Criminals are now using sophisticated artificial intelligence (AI) to recreate people's voices with chilling accuracy.

These audio deepfakes can be weaponised for financial fraud, defamation, and even as a source of cheap labour, replicating someone's voice without their consent.

The technology has evolved to the point where it's difficult to distinguish between a real voice and an AI-generated one.

This has caught the attention of authorities and consumer advocacy groups, who are urging the public to be vigilant.



‘At the moment, we worry about people that call us that try and get our information or tell us that we have to pay something like a toll or whatever,’ said Monica Whitty, Professor of Human Factors in Cybersecurity and Head of the Department of Software Systems and Cybersecurity at Monash University.

‘But now it could be that they try to garner bits of your voice that then can be used and trained with AI…to sound like yourself.’

‘They (the scammers) might ring a family member of mine using my voice to say, “Look, I've had my wallet stolen, I'm in Spain—can you send me some money? It's really urgent. I've been kidnapped. They've got me for hostage,”’ Whitty said as she explained how audio deepfakes work.


An audio deepfake is a sophisticated scam that uses someone else’s voice. Credit: Shutterstock


‘So they'll set up a fake scenario with your voice to legitimise that big scenario.’

Although voice fraud is not new, Whitty and others emphasised that technological advancements have expanded its range and effectiveness.

‘In scams like romance scams or investment scams, it used to be the case that you'd only bring in voice when it looked like the person was doubting the person creating the fake relationship…the voice helps develop more trust,’ Whitty said.

‘It's evolved to the fact that the technology can mimic someone's voice that you actually know. So now that makes it even better for the criminal to use that in these scams.’



Shahriar Kaisar, a cybersecurity researcher and leader in the Information Security and Business Department at the Royal Melbourne Institute of Technology (RMIT), said that the rise of generative AI systems such as ChatGPT has led to a significant advancement in the sophistication of these scams.

These scams have evolved to ‘a very different level, where it has become very difficult to distinguish between what is real and what is not.’

‘They use a machine learning model where they would collect voice samples or sometimes even video images from online public sources—it could be a YouTube video, it could be something that you have shared on TikTok,’ Kaisar explained.



The Australian Association of Voice Actors (AAVA) has also felt the impact, with voice actors discovering their voices cloned and used without permission.

‘People come to us and say, “Oh, I just found out that my voice is being used on a YouTube video episode that I had nothing to do with, and what can I do about it?”’ Simon Kenny, the President of AAVA, said.

‘And the answer is, at this stage, not much, sadly, because the legislation hasn't caught up yet.’

The industry is pushing for legislation to protect individuals' voices and likenesses from being used unethically.



So, how can you protect yourself from falling victim to these voice-cloning scams?

Kaisar warned people to be careful about what they share online, especially videos and voice recordings, to avoid becoming victims of voice cloning.

‘We would want some help from the platforms that are being used for developing deepfakes as well, and also platforms that are being used for sharing those,’ he added.

Whitty further urged people to remain vigilant and to avoid spending too much time talking to suspicious or unknown callers.



‘If you've got a family member or someone that's ringing you for something urgent, it's better to stop, pause and…then ring that person up again or contact them in some other way just to make sure that it was them talking to you,’ Whitty said.

‘If it says unknown, just ignore it altogether.’

This comes after Sunshine Coast Mayor Rosanna Natoli became a victim of the technology: her image was used without consent, and a friend believed she was talking to Natoli on Skype when, in fact, she was not.

Key Takeaways
  • Criminals are utilising sophisticated AI to clone people’s voices and commit fraud and other malicious acts.
  • Experts warned individuals to be vigilant and cautious when answering calls from unknown numbers and sharing information online.
  • Voice actors and others in the voice industry are discovering unauthorised use of their voices and are pushing for legal protections.
  • The public is advised to be careful with the personal content they share online and to verify urgent requests from family or friends to prevent falling victim to these scams.
Have you or someone you know encountered a voice cloning scam? Share your experiences in the comments below.
 
If you listen to the voice, there is no personality in the speaker's tone. AI deepfake scams all sound robotic, talk too fast, and don't speak like the actual person. Phrases are the big giveaway with these scams.
 
The wonders of the digital age. It will only get better for scammers, hackers and the Secret Police.
 
