Nine was slammed for ‘AI editing’ a Victorian MP’s dress. How can news media use AI responsibly?



Nine News/Georgie Purcell via X/The Conversation



Earlier this week, Channel Nine published an altered image of Victorian MP Georgie Purcell that showed her in a midriff-exposing tank top. The outfit was actually a dress.

Purcell chastised the channel for the image manipulation and accused it of being sexist. Nine apologised for the edit and blamed it on an artificial intelligence (AI) tool in Adobe Photoshop.



Generative AI has become increasingly prevalent over the past six months, as popular image editing and design tools like Photoshop and Canva have started integrating AI features into their programs.

But what are they capable of, exactly? Can they be blamed for doctored images? As these tools become more widespread, learning more about them and their dangers – alongside opportunities – is increasingly important.

What happened with the photo of Purcell?


Typically, making AI-generated or AI-augmented images involves “prompting” – using text commands to describe what you want to see or edit.



But late last year, Photoshop unveiled a new feature, generative fill. Among its options is an “expand” tool that can add content to images, even without text prompts.

For example, to expand an image beyond its original borders, a user can simply extend the canvas and Photoshop will “imagine” content that could go beyond the frame. This ability is powered by Firefly, Adobe’s own generative AI tool.
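The mechanics of "generative expand" can be illustrated in code. The sketch below (using the Pillow library) does the non-generative half of the job: it enlarges the canvas and builds a mask marking the empty regions a generative model would then have to invent. The final step, handing the expanded image and mask to a diffusion or Firefly-style model, is shown only as a hypothetical comment, since that part depends on a specific AI service.

```python
from PIL import Image

def prepare_expand(image: Image.Image, pad: int):
    """Enlarge the canvas by `pad` pixels on each side and build a mask
    where white (255) marks the regions a generative model must invent."""
    w, h = image.size
    expanded = Image.new("RGB", (w + 2 * pad, h + 2 * pad), "black")
    expanded.paste(image, (pad, pad))  # original pixels stay untouched

    mask = Image.new("L", expanded.size, 255)          # 255 = "fill this in"
    mask.paste(Image.new("L", (w, h), 0), (pad, pad))  # 0 = "keep as-is"
    return expanded, mask

# A grey rectangle stands in for the source photo in this sketch.
photo = Image.new("RGB", (400, 600), "grey")
canvas, mask = prepare_expand(photo, pad=100)
# result = some_inpainting_model(image=canvas, mask=mask)  # hypothetical call
```

Everything outside the mask's black rectangle is pure invention by the model, which is exactly why the amount of cropping in the source photo matters so much.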





Nine resized the image to better fit its television composition but, in doing so, also generated new parts of the image that weren’t there originally.



The source material – and whether it’s cropped – is of critical importance here.

In the above example where the frame of the photo stops around Purcell’s hips, Photoshop just extends the dress as might be expected. But if you use generative expand with a more tightly cropped or composed photo, Photoshop has to “imagine” more of what is going on in the image, with variable results.





Is it legal to alter someone’s image like this? It’s ultimately up to the courts to decide. It depends on the jurisdiction and, among other aspects, the risk of reputational harm. If a party can argue that publication of an altered image has caused or could cause them “serious harm”, they might have a defamation case.



How else is generative AI being used?


Generative fill is just one way news organisations are using AI. Some are also using it to make or publish images, including photorealistic ones, depicting current events. An example of this is the ongoing Israel-Hamas conflict.

Others use it in place of stock photography or to create illustrations for hard-to-visualise topics, like AI itself.



Many adhere to institutional or industry-wide codes of conduct, such as the Journalist Code of Ethics from the Media, Entertainment & Arts Alliance of Australia. This states journalists should “present pictures and sound which are true and accurate” and disclose “any manipulation likely to mislead.”

Some outlets do not use AI-generated or augmented images at all, or only when reporting on such images if they go viral.



Newsrooms can also benefit from generative AI tools. One example is uploading a spreadsheet to a service like ChatGPT and receiving suggestions on how to visualise the data. Another is using it to help create a three-dimensional model that illustrates how a process works or how an event unfolded.
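The kind of suggestion such a tool might return can be reproduced directly in a newsroom script. Here is a minimal sketch (the outlet names and figures are invented for illustration) that loads spreadsheet-style data with pandas and charts it with matplotlib:

```python
import io
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render without a display, e.g. on a server
import matplotlib.pyplot as plt

# Invented example data standing in for an uploaded spreadsheet.
csv_data = io.StringIO(
    "outlet,ai_images_published\n"
    "Outlet A,12\n"
    "Outlet B,3\n"
    "Outlet C,27\n"
)
df = pd.read_csv(csv_data)

# A bar chart is a sensible default for one categorical and one numeric column.
ax = df.plot.bar(x="outlet", y="ai_images_published", legend=False)
ax.set_ylabel("AI-generated images published")
plt.tight_layout()
plt.savefig("ai_images_by_outlet.png")
```

The point is not the chart itself but the workflow: the journalist supplies real data, and the AI (or a script like this) only handles presentation.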

What safeguards should media have for responsible generative AI use?


I’ve spent the last year interviewing photo editors and people in related roles about how they use generative AI and what policies they have in place to do so safely.



I’ve learned that some media outlets bar their staff from using AI to generate any content. Others allow it only for non-realistic illustrations, such as using AI to create a bitcoin symbol or illustrate a story about finance.

News outlets, according to editors I spoke to, want to be transparent with their audiences about the content they create and how it is edited.

In 2019, Adobe started the Content Authenticity Initiative, which now includes major media organisations, image libraries and multimedia companies. This has led to the rollout of content credentials, a digital history of what equipment was used to make an image and what edits have been done to it.

This has been touted as a way to be more transparent with AI-generated or augmented content. But content credentials are not widely used yet. Besides, audiences shouldn’t outsource their critical thinking to a third party.
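Content credentials embed a cryptographically signed edit history via the C2PA standard, which needs dedicated tooling to verify. The much older idea they build on — software writing its name and its edits into the file's own metadata — can be sketched with Pillow's PNG text chunks. The editor name and history strings below are invented for illustration; note that, unlike real content credentials, plain metadata like this is trivial to strip or forge.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Write simple provenance-style metadata into a PNG.
img = Image.new("RGB", (100, 100), "white")
meta = PngInfo()
meta.add_text("Software", "Example Editor 1.0")  # hypothetical editor name
meta.add_text("EditHistory", "canvas expanded; generative fill applied")
img.save("edited.png", pnginfo=meta)

# Read it back, as a viewer checking provenance might.
with Image.open("edited.png") as reopened:
    history = reopened.text  # dict of PNG text chunks
print(history["EditHistory"])
```

Content credentials add what this sketch lacks: a tamper-evident signature, so a forged or stripped history is detectable.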



In addition to transparency, news editors I spoke to were sensitive to AI potentially displacing human labour. Many outlets strive to use only AI generators that have been trained with proprietary content. This is because of the ongoing cases in jurisdictions around the world over AI training data and whether resulting generations breach copyright.

Lastly, news editors said they are aware of the potential for bias in AI generations, given the unrepresentative data AI models are trained on.



This year, the World Economic Forum has named AI-fuelled misinformation and disinformation as the world’s greatest short-term risk. It placed this above even disasters like extreme weather events, inflation and armed conflict.

The top ten risks as outlined in the World Economic Forum’s Global Risk Report 2024. World Economic Forum, Global Risks Perception Survey 2023–2024​
Because of this risk and the elections happening in the United States and around the world this year, engaging in healthy scepticism about what you see online is a must.

As is being thoughtful about where you get your news and information from. Doing so makes you better equipped to participate in a democracy, and less likely to fall for scams.

This article was first published on The Conversation and was written by T.J. Thomson, Senior Lecturer in Visual Communication & Digital Media, RMIT University.

 
All the changes in the photo of Ms Purcell could be replicated using a pre-AI version of Photoshop.

Furthermore, despite the initial claims of Channel Nine, neither of the changes to her bust and midriff can be made by AI alone.

It took ADMP (a Dirty Minded Pervert) to do that.
 
No human has the right to alter a photo for publication and then present it as if it were the original. This photo is definitely detrimental to the lady involved, and Channel 9 should be taken to court for misrepresentation, with a large fine imposed. This type of reporting must be nipped in the bud, or who knows where it will end.
 
Doesn't surprise me, as Nine is run by an ex-Liberal federal treasurer. Nothing will change while he is still there; he just can't help himself.
 
Think I need this technology to improve photos of me. Maybe it could imagine I look 20 or 30 years younger! Use of it is a bit of a worry, though. So much for the days when we were told that photos don't lie. Guess Photoshop put an end to that.
 
Pretty much what everyone should expect from a tabloid 'news' show, where sport makes up most of what they consider a newsworthy story. Most of the crap on their 6pm show is pure entertainment and promotion of their other rubbish shows, i.e. ACA.
 
