Unsettling revelations from whistleblower Frances Haugen: Is Artificial Intelligence becoming a dangerous threat?
We've all heard the term Artificial Intelligence (or AI) floating around lately. Although it is now widely used across many industries, it may pose a bigger and more dangerous threat to civilisation than anyone ever imagined.
That is the warning from whistleblower Frances Haugen, who is calling for more robust regulation in Australia to protect people from the alarming rise of artificial intelligence.
You may know Ms Haugen as the former Facebook employee who quit her job in 2021 after disclosing thousands of company documents, which revealed that toxic and dangerous content was deliberately and knowingly being spread by the platform.
She had a successful career in Silicon Valley in California before testifying that Facebook promoted hate speech and Instagram pushed eating disorder content to teenagers.
She warned of an imminent ‘era of opacity’ as AI grows and more of the world’s economies come to rely on software running in data centres.
According to Ms Haugen, without stronger regulation, there would be a ‘repeat of what we saw with social media’ on a greater scale.
As she explained to the National Press Club, ‘When we start getting into scalable systems that run on data centres, [a] very small number of people can have civilisation-impacting levels of power.’
‘At Facebook, there’s a very small number of people who really understand how these algorithms work, and yet it impacts what everyone sees in the news,’ she added.
‘When we see what kind of scale that can impact, that can have really serious consequences,’ Ms Haugen said.
Currently, millions of people use AI programs such as ChatGPT every day, and that number is on the rise.
Recent surveys suggest nearly a quarter of all Australians have used AI software in their work, and 70 per cent of teenagers have used AI at least once.
Ms Haugen also argued that newer social media algorithms promote ‘the most divisive, polarising content’ in order to generate profit.
‘It used to be content that kept you around was the best content. Then they said if you get a reshare, a like, a comment...that’s what good content is,’ she said.
She continued: ‘Now, our most divisive, polarising, worst content gets the most distribution. Unless we pass laws that say, “Hey, we will not repeat the errors of social media, we have to have transparency...we have to protect whistleblowers”, we will see repeats of what we saw with Facebook.’
‘Any time there’s a profit motive and no feedback cycle to correct those lies, we will see those gaps get bigger and bigger,’ Ms Haugen declared.
According to one prominent free speech advocate, protections for whistleblowers and the media are failing in Australia.
Award-winning journalist Peter Greste has echoed Ms Haugen’s concerns, citing the need for stronger protections for whistleblowers and the media, as well as greater transparency from the government and the Australian Defence Force (ADF).
‘I just think that at the moment, generally speaking, as a country and particularly with our security and defence agencies, we are too obsessed with secrecy, with keeping these things closed,’ said Greste.
‘I think that’s causing a huge amount of problems for everybody,’ he added.
Key Takeaways
- Former Facebook whistleblower Frances Haugen has expressed concern about the rapid development of artificial intelligence without stronger regulation.
- Haugen warned that unchecked advances in AI would result in an 'era of opacity' and civilisation-altering impacts, increasing online misinformation.
- Haugen suggested that cutting-edge algorithms prioritise 'the most divisive, polarising content' to enhance profit.
- The whistleblower stated that without laws to ensure transparency and protection for whistleblowers, the problems witnessed with social media platforms such as Facebook would recur on a more significant scale.
What do you think of this story, members? Have you tried using any AI software? Let us know in the comments below!