Facebook Says Content Was Mistakenly Pulled Down Because of a Bug

What’s happening

Facebook parent company Meta revealed in a quarterly report that its media-matching technology had a bug that was later fixed.

Why it matters

The social network said the bug led to content that didn't violate its rules being mistakenly pulled down.

Facebook parent company Meta said Tuesday that a bug resulted in content being mistakenly pulled down in the first three months of this year. The social media giant said it fixed the problem and restored posts that were incorrectly flagged for violating its rules, including rules against terrorism and organized hate.

Facebook took action against 2.5 million pieces of content flagged for organized hate in the first quarter, up from 1.6 million in the fourth quarter of 2021. It also took action against 16 million pieces of terrorism content in the first quarter, more than double the 7.7 million in the fourth quarter. Meta attributed the spike to a bug in its media-matching technology. A graph in the company's quarterly standards enforcement report showed that the social network restored more than 400,000 pieces of content mistakenly flagged for terrorism.

Meta's photo-and-video service Instagram also took action against more terrorism and organized hate content because of the bug, and the error extended to other categories as well. Facebook restored 345,600 pieces of content flagged for suicide and self-injury in the first quarter, up from 95,300 in the fourth quarter, the report said. The social network also restored more than 687,800 pieces of content mistakenly flagged for sexual exploitation in the first quarter, up from 180,600 in the previous quarter.

The errors raise questions about how well Meta's automated technology works and whether other bugs or mistakes have gone uncaught. The company said it's taking more steps to prevent content moderation errors. Meta is testing new AI technology that learns from appeals and from content that's restored. It's also experimenting with giving people more advance warning before the social network penalizes them for rule violations, Meta Vice President of Integrity Guy Rosen said in a press call Tuesday.

Rosen said that when a false positive is fed into the company's media-matching technology, it will "fan out" and pull down a large amount of content that doesn't violate the platform's rules.

"We have to be very diligent about the so-called seeds that go into the system before that fan-out occurs. What we had in this case is introduction of some new technology, which introduced some false positives into the system," Rosen said, adding that the content was later restored.

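Meta hasn't published the details of its media-matching system, but the fan-out dynamic Rosen describes can be illustrated with a minimal hash-matching sketch. Everything below is hypothetical: the MediaMatcher class, the SHA-256 stand-in for a content fingerprint, and the sample uploads are for illustration only, and real systems use perceptual hashes that also catch near-duplicate media rather than exact byte matches.

```python
# Minimal sketch of seed-based media matching, illustrating how one
# false-positive "seed" fans out into many wrongful takedowns.
# Hypothetical names throughout; not Meta's actual implementation.
import hashlib


def fingerprint(media_bytes: bytes) -> str:
    """Stand-in for a content fingerprint (real systems use perceptual hashes)."""
    return hashlib.sha256(media_bytes).hexdigest()


class MediaMatcher:
    def __init__(self) -> None:
        self.banned_seeds: set[str] = set()

    def add_seed(self, media_bytes: bytes) -> None:
        # If a benign item is ever seeded here, every future copy of it
        # will be wrongly removed: the "fan-out" Rosen describes.
        self.banned_seeds.add(fingerprint(media_bytes))

    def should_remove(self, media_bytes: bytes) -> bool:
        return fingerprint(media_bytes) in self.banned_seeds


matcher = MediaMatcher()
matcher.add_seed(b"known terrorist propaganda clip")  # correct seed
matcher.add_seed(b"harmless vacation photo")          # false-positive seed

uploads = [b"harmless vacation photo"] * 3 + [b"unrelated post"]
removed = [u for u in uploads if matcher.should_remove(u)]
print(len(removed))  # 3 -- one bad seed removes every matching upload
```

The same dynamic explains the recovery path: once a bad seed is identified and pulled from the system, the matching takedowns can be reversed, which is consistent with the hundreds of thousands of restored posts reported in the quarter.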
At the same time, Facebook is facing scrutiny for not removing terrorism content before it goes viral. Over the weekend, video that officials said was livestreamed on Twitch by the white man accused of fatally shooting 10 Black people in a Buffalo grocery store spread on social networks such as Facebook and Twitter. The Washington Post reported that a link to a copy of the video surfaced on Facebook, where it was shared more than 46,000 times and received more than 500 comments. Facebook didn't remove the link for more than 10 hours.

Rosen said that once the company became aware of the shooting, employees quickly designated the event as a terrorist attack and removed copies of the video, as well as what officials have said is the shooter's 180-page, hate-filled rant.

One of the challenges, Rosen said, is that people create new versions of the video, or new links to it, to try to evade enforcement by social media platforms. As with any such incident, the company will refine its systems to detect violating content more quickly, he said. Rosen added that he didn't have more details to share about what specific steps Facebook is considering.

https://www.cnet.com/news/social-media/facebook-says-content-was-mistakenly-pulled-down-because-of-a-bug/