Chilling true story: how scammers used AI to stage a kidnapping
Imagine being on the phone one day and hearing the panicked cries of your daughter on the other end of the line.
That's exactly what happened to one mum when her 15-year-old daughter, Briana, called her in distress.
When the mum picked up the phone, her heart sank to the pit of her stomach. It sounded like her daughter was being held against her will: a man could be heard in the background, ordering Briana to lie down and put her head back.
The frightened mum was in her car in Arizona, picking up her 13-year-old daughter Aubrey from dance class. Briana was two hours away at a skiing competition with her father, so the mum couldn't help but fear the worst.
As the man on the phone spoke, her fears quickly turned to terror. 'You need to pay $1 million if you want to see her again,' he threatened. He then said he would pick the mum up in a white van and put a bag over her head, and that if she didn't bring the money in cash, he'd drive her to Mexico and leave her for dead.
Faced with a deluge of threats from the man, the mum put her phone on mute, rushed into the lobby of the building where she was collecting Aubrey, and quickly alerted the other parents to her ordeal. This sparked a frantic flurry of panicked calls to the police.
At this point, one of the mothers suddenly uttered words that would both ground and disturb the mum: ‘It’s AI (artificial intelligence), it’s an AI scam.’
It made sense to the mum, who remembered recent reports of similar scams.
Desperate to know the truth, yet terrified that this was more than a scam, the horrified mum kept the call on mute and asked Aubrey to contact the rest of the family.
‘But I just couldn’t accept it. I kept thinking, my baby is out there, and someone needs to save her now. It was my daughter’s voice, I would never have mistaken that,’ the mother said.
The situation was quickly resolved when the mum eventually heard from her husband, who had found Briana. On the phone, her husband and Briana confirmed that she was safe at the ski resort.
The mother confronted the scammer afterwards. ‘I unmuted the other line, where the man was still telling me all the things he’d do to us if we didn’t comply. I started calling him out, calling him a scammer, but he kept going, saying he had her. I hung up on him,’ she narrated.
However, the police refused to investigate further because ‘no one had been hurt and no money had been taken’. To them, this was considered a prank call.
But the mum was certain it wasn't a prank and was devastated to learn that something like this was possible. After all, it was her daughter's voice that the scammer had used, and it sounded far too real to be a scam.
The story shook the mum and her daughters, especially Aubrey. In one instance, she shared, Aubrey panicked when a bloke tried to give her his number, fearing he was a kidnapper.
How the scammers managed to duplicate Briana's voice also puzzled the mum. From what she understood about these scams, it can take as little as three seconds of audio to replicate a person's voice. Although Briana keeps her social media accounts private, there are public videos of her in sports events and school interviews, yet none of her sounding distressed. The emergence of advanced AI technology raises real concerns about its potential misuse to harm children.
Unfortunately, with this technology, criminals are becoming bolder with their craft, enough to make one shudder in fear.
Key Takeaways
- A mother experienced a harrowing scam involving artificial intelligence (AI) and her daughter's kidnapping.
- The scammer used an audio clip replicating the daughter's voice, causing the mother to believe her daughter was in immediate danger.
- Confirmation from her husband that their daughter was safe did not immediately convince the mother, illustrating the intense emotional effect of the scam.
- Reflecting on the incident, the mother expressed concern about how people with ill intent can use AI technology in harmful ways.
Seniors, we implore you to stay vigilant and to tell your families about this tactic so you can better protect yourselves.
If you encounter anything suspicious, document everything and report the incident to your local police station.
Stay safe, SDC Members! Let us know what you think of this story in the comments below.