Dick Smith fooled by voice-manipulating ‘deepfake’ video; blames social media platforms
By VanessaC
While modern technology is incredibly impressive, its rapid advancement can also be a little frightening.
One of the biggest scares at the moment is the rise of 'deepfakes', which are fraudulent videos created by artificial intelligence (AI) to make people appear as though they are doing or saying things when, in fact, they aren’t.
In a startling revelation, renowned Australian entrepreneur Dick Smith has admitted to being momentarily deceived by a deepfake video that manipulated his own voice.
'It fooled me, I thought I was answering these questions,' he said.
In this case, the technology was used to create a video that mimicked a segment of Channel Nine’s A Current Affair featuring prominent business figures, including Gina Rinehart, Andrew Forrest, Jim Chalmers, and Dick Smith.
'It was A Current Affair, that segment that these criminal gangs have made up, and it was completely false, and that actually got my voice in a completely different text.'
This incident has led him to criticise social media giants Facebook and Instagram for their failure to effectively combat this escalating form of cyber fraud.
Despite personally contacting the platforms to have the video removed, he claims to have seen no concrete action.
'Understand, these ads have been going for over two years, and Facebook and Instagram do nothing about it at all,' he said.
'I’ve contacted Facebook, I’ve contacted the ACCC, I’ve contacted everyone I can.'
'I’ve even had emails to Gina Rinehart because they are using her name and haven’t been able to stop it.'
Smith's frustration is shared by many who believe that social media platforms should take more responsibility for the content shared on their sites.
Andrew Forrest, another business figure featured in the deepfake video, has taken legal action against Meta Platforms (formerly Facebook) over its alleged failure to prevent its systems from being used to commit crimes.
‘You’ve got to understand, Facebook and Instagram are incredibly wealthy, and they just delay every court case,’ Smith added.
The businessman then vowed to ensure 'virtually every Australian' would never believe ads on Facebook or Instagram until the problem is resolved.
‘I’m going to make sure that virtually every Australian never ever believes an ad on Facebook or Instagram, never ever again until Facebook or Instagram do something about checking their ads before they go on air,’ he shared.
Deepfake technology has recently been making headlines worldwide as it has opened up a new frontier in cyber fraud.
By manipulating facial appearances and voices, fraudsters can create convincing videos of individuals saying or doing things they never did.
This technology poses a significant threat to personal and financial security, as it can be used to impersonate individuals, spread disinformation, or trick people into revealing sensitive information or parting with their money.
While social media platforms have policies in place to prohibit deceptive or misleading content, the rapid advancement of deepfake technology presents a significant challenge.
A Meta spokesperson acknowledged the issue, stating that the company is 'constantly tackling scams through a combination of technology, such as new machine learning techniques and specially trained reviewers, to identify content and accounts that violate our policies.'
'We are currently also working across industries and with the government to identify new ways to stop scammers.'
'We encourage people to use our in-app reporting tools when they see any suspicious activity. We encourage those who have fallen victim to scams to reach out to their local law enforcement agency.'
Have you encountered a deepfake video or been a victim of a similar scam? Share your experiences and thoughts in the comments below.
Key Takeaways
- Dick Smith, a prominent entrepreneur, revealed he was fooled by a deepfake video utilising his own voice and face and criticised Facebook and Instagram for not adequately addressing the issue.
- The deepfake video, which posed as an advertisement on social media and manipulated a segment of Nine’s A Current Affair, caused Australians to lose hundreds of dollars in a scam.
- Smith revealed he has contacted Facebook, the ACCC and other entities requesting action over the fraudulent video, but has yet to see concrete results.
- Meta, the parent company of Facebook and Instagram, maintains that there are processes in place to tackle scams and deceptive content.