Sophisticated digital deception hits widow—learn how to recognise warning signs
By Gian T
In an age where technology has become intertwined with our daily lives, it's heartbreaking to hear stories like that of Perth widow Maggie Ayres, who was cruelly duped out of tens of thousands of dollars by a scammer wielding AI deepfake technology.
This alarming incident reminds us that while the digital world offers incredible conveniences, it also presents sophisticated dangers that target our most vulnerable moments.
After losing her husband of 46 years, Maggie sought companionship through online dating sites.
Unfortunately, her search for love opened the door to a deceitful predator who used advanced technology to exploit her trust and generosity.
The scammer, who called himself 'Bryan,' used deepfake technology to create a convincing digital persona.
‘I found that his kindness and patience grew on me, making me feel safe, appreciated and loved,’ she said.
Deepfakes are a form of artificial intelligence that can manipulate video and audio to make it appear as though someone is saying or doing something they are not.
In this case, the scammer used the stolen identity of a US real estate agent to craft video calls that seemed genuine to the unsuspecting Maggie.
Their relationship progressed through scheduled video calls and messages, with 'Bryan' claiming to be in charge of an oil rig.
As trust was built, he eventually convinced Maggie to lend him a substantial sum of money to repair supposedly broken equipment.
The illusion shattered during a glitched video call, revealing the scammer's true face and leaving Maggie with the devastating realisation that she had been scammed.
‘While I still heard Bryan's voice, I saw a Black man sitting in a cupboard covering his head with a blanket, so I could still see his face,’ she said.
‘I shouted out loud in disbelief, 'Is this the reality? Is this guy the scammer? Am I really being scammed?'’
Maggie wasn’t alone. In recent weeks, two Western Australians have lost over $1 million combined to similar deepfake romance scams.
The sophistication of these scams is alarming, and it's clear that the technology is outpacing the awareness and regulations needed to combat them.
The federal government has begun to take action, with new laws criminalising the sharing of AI-generated pornographic material.
However, experts like Paul Litherland from Surf Online Safe argue that immediate and comprehensive action is required to regulate the use of AI in such deceptive practices.
‘You can pump as much video or photo of that person into the AI, (and) it comes back almost a seamless sewing of video that looks like it's them,’ he said.
‘We need to get onto AI now, not in another five years or five weeks.’
‘It's respond now because we need these organisations to be regulated.’
Experts say warning signs include abnormal movements, low-quality video, and mismatched lip-syncing.
Always remember: if you haven’t met someone in person, don’t send money. Ayres shared, ‘I've not only lost my companion, but also my independence, my self-confidence and my value.’
Have you encountered suspicious online behaviour? How do you stay safe while navigating the digital world? Your insights could help a fellow member avoid a similar fate.
Key Takeaways
- Perth widow Maggie Ayres was duped out of tens of thousands of dollars by a scammer using AI deepfake technology in a romance scam.
- The scammer, named 'Bryan', built trust through messages and scheduled video calls before convincing her to lend him money for purported oil rig equipment repairs.
- Deepfake technology was used to manipulate live video calls, mimicking a real person's voice and facial expressions, leading to Ayres questioning the reality during a glitched call.
- Experts are calling for immediate action on AI regulation, highlighting red flags in deepfake videos, and advising against sending money without meeting in person.