
Meta Starts To Label AI-Generated Content

You’re likely going to see more AI-generated content on Facebook, Threads, and Instagram, but it will be labeled.

Meta announced that starting in May, it would label videos and pictures made with AI, in the hopes of being more transparent about which content is AI-generated and how it was made across its platforms.

Currently, Meta removes AI-generated content when it’s detected, but only when it’s been “created or altered by AI to make a person appear to say something they didn’t say,” according to a blog post by Monika Bickert, VP of Content Policy at Meta.

The company says it has worked with its Oversight Board to update its policies on AI-created content, since the previous approach dates to 2020, before the current explosion of AI media.

“We agree with the Oversight Board’s recommendation that providing transparency and additional context is now the better way to address manipulated media and avoid the risk of unnecessarily restricting freedom of speech, so we’ll keep this content on our platforms so we can add labels and context,” Bickert wrote.

So, instead of seeing less AI-generated content, we may start to see more on Threads, Instagram, and Facebook, but Meta hopes to provide more transparency and “additional context” via these labels. The company says it may add “more prominent labels” if the “altered images, video, or audio create a particularly high risk of materially deceiving the public on a matter of importance.”

The company will still remove content that violates any of its other policies, such as those on voter interference, bullying, harassment, violence, and incitement. Its fact-checkers will still rate content as False or Altered, and such content will get a label and be moved lower in your feed.


Meta promises to start labeling AI-generated content in May 2024, and to stop removing such content solely on that basis in July. This, Meta says, will give people a little time to learn the self-disclosure process.
