Beware: Scammer creates AI clone of Aussie politician to swindle Bitcoin investors
In an age where technology is advancing at a breakneck pace, it's no surprise that the tools of deception are becoming more sophisticated.
The latest scam to hit the headlines is a chilling reminder that not everything is as it seems, especially when it comes to the digital realm.
Imagine receiving a message from a familiar voice, one you trust, only to discover it's a machine mimicking your confidant to lure you into a scam.
This is precisely what happened when a scammer used an AI voice clone of Queensland Premier Steven Miles to run a Bitcoin investment con.
The incident unfolded when advertising executive Dee Madigan received a private message from someone impersonating the Premier.
As a friend of Miles, Madigan immediately knew something was amiss and decided to play along.
‘It was all pretty standard stuff,’ she recalled. ‘I said I was interested in investing but pretended to have trouble setting up a trading account just to annoy the guy.’
‘When I asked him to jump on a phone call with me to walk me through it, I figured that would be the end of it,’ she continued.
But Madigan was surprised to be confronted with a voice message that sounded eerily like the Premier himself.
The scammer, using advanced AI technology, had cloned Premier Miles' voice to create a convincing impersonation, complete with awkward pauses and familiar intonations.
‘It sounded like him,’ she shared. ‘A bit stunted and awkward, but it was his voice. It was pretty creepy.’
The message said: ‘Sorry for the rush, Dee. I’m about to enter a meeting, but I just wanted to talk to you, as I haven’t been able to do that since I promised.’
‘Concerning the investment, I’ll definitely shoot you a text when I have some free time.’
Madigan expressed concern, stating: ‘Someone who doesn’t know the technology might fall for something like this.’
‘A lot of friends of mine now have a code word so if they get a call from someone in their family that seems suspicious, they have to say it so they know it’s legitimate. I think that’s a good idea,’ she added.
This unsettling event is a stark reminder for our community, particularly for those who may not be as tech-savvy.
Scammers are becoming increasingly cunning, using tools like AI to exploit our trust in familiar voices. It's a tactic that's not just limited to public figures; it could target anyone, including ourselves or our loved ones.
Consumer advocacy group CHOICE warned that a ‘tsunami’ of AI voice scams is on the horizon and that Australians should be on high alert.
These scams are not just a theoretical threat; they have already caused devastation overseas, bypassing the defences of even the most vigilant individuals.
The National Australia Bank (NAB) also chimed in, noting a rise in phone calls from supposed loved ones in distress, urgently needing money.
While these scams have been more prevalent in the United Kingdom and the United States, it's only a matter of time before they become common in Australia.
The technology behind these scams, known as deepfake audio, has existed for some time, but it once required extensive audio samples to create a convincing copy.
However, advancements in AI have dramatically reduced the amount of audio needed, with companies like OpenAI developing tools that can clone a voice with just a 15-second sample.
Premier Miles made it clear that the government will never solicit investments in Bitcoin or any other financial scheme.
‘The fake clip of what sounds like my voice is obviously terrifying—for me and for anyone who might accidentally be conned,’ the Premier said.
‘The Queensland Government will never try to get you to invest in Bitcoin. If you come across a scam, you should report it to scamwatch.gov.au.’
‘Queenslanders should know that there will be a lot of misinformation in the lead-up to the October election. Only get your news from reliable sources, and if something seems off, fact-check it with credible sources,’ he continued.
‘If you’re ever in doubt, head to qld.gov.au or contact my office.’
For our members at the Seniors Discount Club, it's crucial to stay informed and vigilant.
Consider following Madigan’s strategy of establishing a code word with family and friends to verify identities in suspicious calls.
Always be sceptical of unsolicited investment advice, especially if it comes from high-profile individuals or loved ones who wouldn't typically discuss such matters.
Remember, in the digital age, caution is your best defence against scammers' cunning tricks.
Stay safe, stay sceptical, and when in doubt, hang up the phone and reach out directly to the person who supposedly contacted you.
You can listen to the AI-generated voice message from the scammer here:
Source: @deemadigan/X (Twitter)
Key Takeaways
- Queensland Premier Steven Miles' voice was cloned using AI for a Bitcoin investment scam.
- Advertising executive Dee Madigan was approached by the scammer, who used the cloned voice in a voice message sent on Telegram.
- Experts and consumer groups warned that AI voice scams are on the rise and could pose significant risks, especially in an election year.
- OpenAI has developed technology that can clone a voice from just 15 seconds of sample audio, highlighting the advancement and potential dangers of AI capabilities.
We'd love to hear from you, members. Have you encountered similar scams, or do you have tips for spotting and avoiding them? Share your experiences and advice in the comments below!