Beware! ‘Sophisticated’ AI voice cloning scams are skyrocketing

We know all too well how quickly technology can change, and it can be both a blessing and a curse—depending on how you look at it.

But there are malicious people out there who exploit these same advancements for ill-gotten gains.

Case in point: The emergence of more sophisticated Artificial Intelligence (AI) voice cloning scams.


This type of scam, which has seen a sharp rise in the past year, involves cybercriminals stealing a victim’s voice from social media and using it to fool unsuspecting family members with a fearmongering phone call, all in the hope of extracting money or private information.


Scammers are becoming more sophisticated with their methods and have been using AI programs to clone their victims’ voices. Image source: cookie_studio on Freepik.


Mike Scheumack, Chief Innovation Officer at IdentityIQ, an identity theft protection and credit score monitoring firm, said: 'AI has been around for a long time, and software companies have been using it to advance technology for a while.'

'We’ve seen it start entering into this kind of cybercriminal space slowly, then all of the sudden just ramp up very quickly over the past year or so.'

'We’ve seen a lot in terms of advanced phishing scams, targeted phishing scams, we’ve seen where AI is being used to generate very specific emails, and the language is very specific as to who the target is,' he continued.

'We’ve seen AI voice cloning scams increase over the past year as well, which is a very scary topic.'


So, how exactly is AI voice cloning done?

'All they need is as little as 3 seconds; 10 seconds is even better to get a very realistic clone of your voice,' Scheumack said.

A scammer takes this short clip of someone’s voice—usually lifted from a social media post—and runs it through an AI program that replicates the voice.

They can then make the clone say whatever they want simply by typing text, and, depending on how the scam is scripted, they can even inject emotions such as fear or laughter into the voice.


To make the scam even more credible, scammers also use AI programs to scour the internet for more information about their target—such as what they do for a living.

'The scary thing is that this is not your next-door neighbour doing this…This is a sophisticated organisation, it’s not [just] one person doing it. You have people that are researching on social media and gathering data about people,' Scheumack added.

'Those are not the same people that are going to plug in your voice. You have somebody else that’s going to clone the voice. You have somebody else that’s going to actually commit the act of calling. And you have somebody come to the victim’s house and pick up money if the scam is working.'

Scheumack also recounted a recent interview the firm conducted with a woman who received what she believed was a panicked phone call from her daughter, who was away at camp. The call turned out to be an AI-generated clone of her daughter’s voice, built from an audio sample the scammers had found on social media.

The scammers had also found a social media post from her daughter about leaving for camp and used that detail to make the call more convincing.

Another telling trait of AI voice cloning scams is that the calls are usually short. The scammer will try to cut off any real conversation by saying something like, 'I can’t talk right now,' to induce panic while they exploit the victim’s emotions to gain access to money or sensitive information.

'The goal of the scammer is to get you into fight or flight and to create urgency in your mind that your loved one is in some sort of trouble. So the best way to deal with those situations is to hang up and immediately call your loved one to verify if it’s them or not,' Scheumack explained.


In another example, a couple in Texas was scammed out of $5,000 by a call they believed was from their son.

The scammer reportedly went to extreme lengths to persuade them, fabricating a story that the accident their son had been in involved a pregnant woman who suffered a miscarriage.

'I could have sworn I was talking to my son. We had a conversation,' Kathy, the mother, recalled.

You can read more about this story here.


So, with these scams on the rise, what can you do to avoid falling prey to them?

Scheumack said the first step is to be cautious about what you post publicly online. Second, if you receive a call from an unknown number and someone who sounds like a person you know says they’re in an urgent situation, hang up and call them back to verify it’s actually them.

'Generally, take caution with that—that should be a red flag to you if you’re receiving a call from an unknown number and it’s a relative or a loved one and there’s an urgent situation.'

'You should definitely take a second to think about that.'

Scheumack also suggested that families agree on a password they can use to verify a caller’s identity in an emergency.
Key Takeaways
  • Scammers are ramping up their use of Artificial Intelligence (AI) tools to clone victims’ voices and dupe their loved ones into sending money or sharing private information.
  • To clone a voice, scammers either record the person speaking or find an audio clip on social media, then run the sample through an AI program that produces a realistic voice clone.
  • Mike Scheumack, Chief Innovation Officer at identity theft protection firm IdentityIQ, said AI voice cloning scams have surged over the past year.
  • As a precaution, Scheumack suggested being wary of what you post online, treating urgent calls from unfamiliar numbers with suspicion, and creating a family password for verification in emergencies.
Members, what do you think about this recent development in scams? Have you encountered something similar? Let us know in the comments below!
 
I talk to no one over the phone unless the number is recorded in my contacts.
 
Remedy is simple - just create a secret phrase that only you and your kids would know.
 
They sure don’t give up. I had four scam emails already today: one was the McAfee one, the others the parcel one. Had to laugh, one was even in French.
I get at least four emails a day. I don’t open them, just delete them. I know the emails I actually want, so anything else is just deleted.
 
Well they must all shop at Woolies or Coles and have their photo taken.......
 
When a caller asks 'Am I speaking to (name)?', never say the word yes. Never answer YES to anybody over the phone unless you know who you are talking to. Scammers record this to use as a voice signature whenever they can to scam you. Always reply with 'this is he/she', 'speaking', 'possibly', 'who wants to know', etc. If you are not sure, tell them you’ll call back, and use the number you know is correct for the business that called. But never answer YES to any questions they ask.
 
I understand the 'Yes' bit, but surely if you speak at all the scammer may use whatever you say for a voice signature?
 
URGENT NEWS FOR MEMBERS RE SCAMS - HOW TO DEAL WITH THEM
ID Support NSW events for Scam Awareness Week

ID Support NSW want to empower you to safeguard your personal information and recognise threats. Our informative sessions provide guidance for all NSW customers on how to prevent identity misuse. These start Monday 27 November 2023, with a topic covered every day and two sessions per day at 10am and 2pm. Monday: Family & Friends Impersonation; Tuesday: Bank Impersonations; Wednesday: Government Impersonations; Thursday: Celebrity Impersonations; Friday: Business Impersonations.

WEBSITE: https://www.nsw.gov.au/id-support-nsw/events?dateFrom=2023-11-27&dateTo=2023-12-01&sort=soonest PROVIDED BY: ID Support NSW events

🎉Thank you SDC Staff 🌷 💐👍:) for supplying us with information re scams. The above sessions hopefully will give all of us even more 'ammunition' to use against scams - Belleclare
 
Things are getting tricky for actors, presenters, public orators, newsreaders and folk in similar jobs. Here’s what happened to one actor:

Stephen Fry, the 66-year-old actor, author and narrator, was at the O2 in London on 14 September 2023 to deliver a speech on Artificial Intelligence (AI) at the CogX Festival technology conference.

During his speech, Fry cautioned against the dangers of AI, playing the audience a clip of what appeared to be his voice narrating a historical documentary.

“I said not one word of that – it was a machine,” Fry revealed, according to Fortune. “Yes, it shocked me. They used my reading of the seven volumes of the Harry Potter books, and from that dataset, an AI of my voice was created and it made that new narration.”

He continued: “What you heard was not the result of a mash-up. This is from a flexible artificial voice, where the words are modulated to fit the meaning of each sentence. It could therefore have me read anything from a call to storm Parliament to hard porn, all without my knowledge and without my permission. And this, what you just heard, was done without my knowledge.”

Fry added: “So I heard about this, I sent it to my agents on both sides of the Atlantic, and they went ballistic – they had no idea such a thing was possible.”

“Tech is not a noun, it is a verb, it is always moving,” he concluded. “What we have now is not what will be. When it comes to AI models, what we have now will advance at a faster rate than any technology we have ever seen. One thing we can all agree on: It’s a f***ing weird time to be alive.”

From: Stephen Fry ‘rushed to hospital’ after falling ‘two metres’ off O2 stage during AI conference (yahoo.com) in The Independent newspaper, based in London.
 
