Scammers can steal your voice from social media videos—protect yourself now!

In today's digital age, where sharing personal moments on social media has become second nature, a new threat looms over the unsuspecting user.

Scammers, with the aid of advanced artificial intelligence (AI), are now capable of cloning voices from social media videos to orchestrate elaborate scams.

This alarming trend has prompted warnings for consumers to be vigilant about the content they share online.


AI voice scams, which first surfaced overseas last year, appear to be an evolution of the ‘Hi Mum’ text message scam that gained widespread attention in Australia in 2022.

The process is frighteningly simple and efficient. Scammers scour the internet for videos containing clear audio of their targets.

With just a few seconds of voice recording, they can replicate the way a person speaks.

The cloned voice is then used to deceive friends and family members through phone calls or voicemails, pleading for urgent financial help.


Consumers were warned that scammers might use social media videos to clone voices with AI for fraud. Credit: Shutterstock


Recent research from Starling Bank has shed light on the prevalence of this scam, revealing that 28 per cent of people in the United Kingdom have encountered an AI voice cloning scam attempt in the past year.

What's more concerning is that nearly half of the population is unaware that such a scam even exists.

This lack of awareness is compounded by the fact that 8 per cent of individuals admitted they would likely comply with a request for money, even if the call seemed unusual.

Meanwhile, in Australia, according to the ACCC's National Anti-Scam Centre, there have been fewer than five reports of suspected voice or video cloning by scammers using AI technology since 2022.

However, experts and businesses are warning that this threat is increasing, with the National Australia Bank (NAB) highlighting AI voice cloning as a major scam risk for Australians this year.


Lisa Grahame, Chief Information Security Officer at Starling Bank, cautioned, ‘People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.’

The bank recommended using a secure phrase with trusted friends and family to verify the authenticity of a call.

‘Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them,’ Ms Grahame explained.

‘So it’s more important than ever for people to be aware of these types of scams being perpetuated by fraudsters and how to protect themselves and their loved ones from falling victim.’


While a safe phrase can significantly reduce the risk of falling prey to these scams, it's not foolproof.

Scammers are constantly evolving their tactics, and there's always a chance that safe words could be compromised.

Therefore, it's crucial to remain sceptical of any unexpected financial requests received over a call.

Double-checking by calling your bank back on a known and trusted number can provide an additional layer of security, and suspected scams can be reported through Australia’s dedicated cybersecurity helpline, 1300 CYBER1 (1300 292 371).


These sophisticated scams have not only targeted individuals but have also duped large international businesses.

A notable case involved a Hong Kong company employee who was conned into transferring HK$200 million (AU$37.8 million) after fraudsters used a deepfake video conference call to impersonate senior company officers.

The suspect is thought to have pre-downloaded videos and then employed artificial intelligence to insert counterfeit voices into the video conference.

‘AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud,’ Lord Hanson, Home Office minister with responsibility for fraud, stated.


Meanwhile, one of the first ways AI voice scams have been used against Australians is to promote fraudulent investment schemes.

For instance, when Sydney advertising executive Dee Madigan received a message from a social media account impersonating Queensland Premier Steven Miles, she immediately recognised it as a scam.

‘Steven's a friend of mine, so I knew it wasn't him, [but] he had a fantastic idea about an investment,’ she recalled.

Curious to see how the scammer would respond, Ms Madigan requested a phone call, expecting it would end the interaction. However, she was taken aback by what happened next.

‘All of a sudden, my phone suddenly rang and on the other end was his voice,’ she said.

During a short phone call and a subsequent audio message, the voice claimed to be too busy to chat but assured her that more details about the investment opportunity would be sent.

‘It was surprisingly good,’ Ms Madigan remarked.

‘[It was] slightly robotic, but much better than I thought it would have been. It definitely did sound like Steven.’
Key Takeaways
  • Consumers were warned that scammers could use social media videos to clone voices with AI for fraudulent purposes.
  • A study by Starling Bank revealed that 28 per cent of the UK population experienced an AI voice cloning scam attempt, and 46 per cent were unaware of such scams.
  • Starling Bank recommended the use of a safe phrase among friends and family to verify the authenticity of calls.
  • Even big businesses have been duped by sophisticated AI scams, with a case in Hong Kong involving a HK$200 million fraud through a deepfake video conference.
  • AI voice scams targeting Australians initially involved endorsing fake investment schemes, as demonstrated when Sydney advertising executive Dee Madigan received a convincing cloned-voice call impersonating Queensland Premier Steven Miles, despite having already recognised the approach as a scam.
Have you or someone you know encountered a voice cloning scam? How do you safeguard your digital presence? Share your thoughts and experiences with us in the comments below, and let's help each other stay one step ahead of the scammers.
 
I don't understand people's obsession with plastering things all over the internet on Facebook, etc.
If you want to tell someone something, ring them or send an SMS. If you want to tell several people do a share SMS.
Simple. The whole world really doesn't need to know
Exactly. It looks like technology has come back and bitten the "look at me" brigade on the arse.
 
My police officer brother has been telling me for years not to enter into conversation with telemarketers, as they use syllables and language from your conversation to make up voice scams.
Neither should you answer "Yes" when they ask "Is this [whatever your name is]?", as that yes can be used to agree to something you are not aware of.
 
