More celebrities are speaking out against 'deepfake' scams
Have you ever seen a celebrity in a video endorsing a product or making controversial statements, only to find out later it was all fake?
This disturbing new phenomenon is on the rise, and well-known Aussie media figures Georgie Gardner and Richard Wilkins are blowing the whistle after personally being victimised.
Deepfakes utilise advanced artificial intelligence to digitally impose someone's likeness and voice into video or audio. The results can often be eerily convincing, fooling even those closest to the victims.
Recently, Hollywood legend Tom Hanks took to social media to warn fans about a deepfake video using his image to promote a dental care plan.
'I have nothing to do with it,' Hanks insisted.
Comedian Hamish Blake was also horrified to see his face prominently displayed on an online ad for weight loss gummies, which he never endorsed.
'I don't even understand what that product is,' Blake told 2GB host Ben Fordham on his radio program.
Blake joked that the video's script was almost comically over the top, with the fake Blake claiming he'd given the dubious product to all his loved ones.
'Not even just a few—all of them. Everyone I've ever met. The gardener is getting it. The guy who works two floors below me…Guess what, Bruce, you've got a pack,' he joked.
But perhaps most disturbing was Gardner's account of her grandmother getting duped by a deepfaked ad campaign claiming Georgie endorsed anti-aging skincare creams.
'She [her grandmother] said, “Oh darling, I wasn't aware you were selling face creams. I just bought a whole box of them!”' Georgie revealed.
Gardner only learned of the ruse when visiting her grandmother; she knew she would never advertise wrinkle creams. Still, the fact that a close family member was so easily fooled shows just how convincingly realistic these AI impersonations can be.
'The frightening thing about it is twofold,' she explained.
'One is that my grandmother, who knows me super well, was caught up in it. But the other thing was how impossible it was to get it taken down.'
Fellow Channel 9 personality Richard Wilkins has also been personally stung by deepfake technology. Several videos surfaced depicting Wilkins supposedly being arrested in public places—pure fiction but looking eerily like real footage.
'Someone called me the other day and said you've now been arrested in a park in London,' Wilkins told Talking Honey. 'It looked so real, too.'
Even his son was briefly fooled until he realised Richard 'would never be seen out without wearing his skinny jeans'.
Wilkins agreed that it's hard to get these fake videos taken down, and that the best defence is learning to tell what's real from what's fake.
In 2023, Scamwatch received a total of 209,466 scam reports, and worryingly, 10.5 per cent of these resulted in financial losses for the victims.
The financial impact of these scams is already massive, with victims losing over $367 million. For perspective, Australians collectively lost a record-breaking $3.1 billion to scams the previous year.
A recent example of this issue involves David 'Kochie' Koch, the former co-host of Sunrise and a well-known business personality.
Scammers have been using his image to promote investment scams, which has led him to work with authorities to address this problem.
Key Takeaways
- Public figures such as Tom Hanks and Hamish Blake have warned fans about artificial intelligence (AI) deepfakes being used in scams, with their likenesses used without permission to promote products.
- In an episode of Talking Honey, Georgie Gardner revealed her image was used without her consent to sell face creams, confusing her family; she also described the difficulty of having the false endorsement taken down.
- Richard Wilkins shared that AI deepfakes of him being 'arrested' have been created and circulated, creating false narratives about his private life.
- David 'Kochie' Koch, former Sunrise co-host and business personality, is collaborating with authorities to combat scammers who use his image for investment scams.
Members, take note—if you stumble upon any suspicious celebrity ads on social media, consider this your signal to conduct thorough research and verify their authenticity. Your hard-earned money is at stake, and we certainly wouldn't want it to fall into scammers' hands!
Have you encountered anything similar recently? Please share your experiences with us in the comments below.