Face value: Meta's new feature aims to unmask scammers, but at what cost?
A leading social media platform has introduced new face recognition technology to enhance user experience and security.
The feature is designed to block scam advertisements that misuse the faces of public figures and to help people regain access to locked or compromised accounts.
However, the implementation of such technology also raises important questions about privacy and data protection, sparking a mix of excitement and concern among users and experts alike.
In a world where our digital footprints are as significant as our real ones, Facebook's latest announcement has certainly raised eyebrows and sparked conversations among our tech-savvy seniors.
The social media giant, now under the umbrella of Meta, is reintroducing face recognition technology, but this time with a twist aimed at enhancing security and user experience.
Facebook's reintroduction of face recognition technology comes with a noble intent: to shield users from increasingly sophisticated 'celeb-bait' scams and to streamline the account recovery process.
These scams, which have exploited the images of well-known Australian figures like TV host Karl Stefanovic and billionaire Andrew Forrest, lure unsuspecting individuals into fraudulent schemes, often leading to financial loss and compromised personal information.
Mr Forrest is presently engaged in a legal battle with Meta in the United States regarding fraudulent cryptocurrency ads on Facebook that misuse his image.
Meta's approach involves a one-time comparison of faces in advertisements with the public figures' Facebook and Instagram profile pictures. If the system detects a match and confirms the ad as a scam, it will be blocked.
This proactive step is a part of Meta's broader strategy to combat the misuse of their platform by scammers.
‘We immediately delete any facial data generated from ads for this one-time comparison regardless of whether our system finds a match, and we don’t use it for any other purpose,’ a Meta spokesperson stated in a media release.
The move comes three years after Facebook shut down its previous facial recognition system amid privacy and regulatory concerns, deleting the facial recognition data of more than a billion users in the process.
Facebook reports that early testing with a small group of celebrities and public figures has yielded promising results, enhancing the speed and efficiency of its review system.
The company plans to expand the program soon to include a larger group of public figures, who will receive in-app notifications about their enrolment.
Public figures can choose to opt out via the Accounts Centre.
Meta is also intensifying efforts against Facebook accounts that impersonate public figures, which are often used to trick people into engaging with scam content or handing over money.
These scammers promote bogus investment schemes and attempt to collect personal and sensitive information.
The detection system for these accounts is similar to the one used for identifying fake ads.
People often lose access to their Facebook and Instagram accounts due to forgotten login details, misplaced devices, or when scammers obtain their passwords.
Regaining access can be challenging, so Meta is introducing a new verification method.
Users can upload a video selfie, which Meta will compare with the account's profile pictures using facial recognition technology, similar to the biometric authentication used to unlock a phone.
Meta assures that these video selfies will be encrypted, securely stored, not visible on the user’s profile, and deleted after the comparison, regardless of the match outcome.
The Australian Government is considering new mandatory industry codes to compel social media companies and banks to combat scammers.
For more information on identifying and avoiding scams, visit Scamwatch, operated by the National Anti-Scam Centre.
Mr Forrest’s lawyer, Simon Clarke, stated that the billionaire is pursuing legal action against Meta in California due to fraudulent ads that appeared on Meta’s platform.
He said the next court hearing is scheduled for 31 October 2024 in the US District Court for the Northern District of California, San Jose Division.
‘Facebook purchased a world-leading facial recognition company 12 years ago—they could have adopted this technology then, well before the epidemic of celebrity scams started,’ Mr Clarke said.
‘Whilst genuine attempts to stop fraudulent ads online are welcome, one has to wonder why they are doing this now?’
As Facebook introduces its new face recognition technology aimed at enhancing user experience and safety, concerns about online scams continue to loom large.
Many Australians are calling on social media platforms to take decisive action against fraudulent advertisements that exploit users, emphasising the need for stronger measures to protect the community from scams.
This increasing demand for accountability reflects a broader desire for safer online spaces, highlighting the importance of both technological advancements and user protection in the digital landscape.
What are your thoughts on Facebook's reintroduction of face recognition technology? Do you see it as a necessary step towards a safer online experience, or does it raise concerns about privacy for you? Share your insights and experiences in the comments below, and let's navigate this digital age together with both caution and an open mind.
Key Takeaways
- Facebook is reintroducing face recognition technology to protect users from scams and assist with account recovery.
- The technology will be used to prevent celeb-bait scams by verifying if images in ads match the profile pictures of public figures.
- Meta will delete any facial data generated from the one-time ad comparison, ensuring it's not used for any other purposes.
- Andrew Forrest is suing Meta over scam cryptocurrency advertisements using his image, with the next court appearance scheduled for 31 October 2024.