Surprise, surprise: Facebook doesn't always make the right decision.
Facebook's content moderation systems are clearly in need of repair.
On Thursday, Meta's Oversight Board announced that it had reversed two of Facebook's decisions to remove content from its platform. The independent group's conclusions point to significant flaws in Facebook's content moderation protocols in two key areas: the platform's use of automated systems to take down content and the removal of newsworthy content by human moderators.
The first case from the Oversight Board concerns a Facebook user in Colombia who, in September 2020, posted a cartoon depicting police brutality by Colombia's National Police. Facebook removed the user's post 16 months later, when the company's automated systems matched the cartoon against an image stored in a Media Matching Service bank.
The Oversight Board determined that Facebook was wrong to remove the user's post because the image did not violate Facebook's rules and should never have been added to the Media Matching Service bank.
And, according to the Oversight Board, this user wasn't the only one affected. In total, 215 users appealed the removal of posts that included this image, and 98 percent of those appeals to Meta were successful. The cartoon nevertheless remained in the bank, continuing to trigger automated detections and subsequent post removals. Meta only removed the image from the Media Matching Service bank once the Oversight Board decided to take up this particular case.
In the second case, the Oversight Board determined that Meta wrongly removed a news post about the Taliban. In January 2022, an India-based newspaper posted a link to an article on its website about the Taliban's announcement that it would reopen schools for women and girls. Meta determined that the post violated its Dangerous Individuals and Organizations policy, construing it as "praise" of the Taliban.
As a result, Meta removed the post and restricted the newspaper's access to certain Facebook features, such as livestreaming. The newspaper attempted to appeal the decision, but the appeal was never reviewed because Meta lacked Urdu-speaking reviewers.
Once again, Meta reversed its decision only after the Oversight Board took up the case, restoring the content and lifting the Facebook Page restrictions. Simply reporting on newsworthy events, the Oversight Board determined, is not a violation of Facebook's policies.
While the users affected in these specific cases may be small in number and reach, the Oversight Board used the opportunity to recommend broader changes to Facebook's content moderation systems, both automated and human-reviewed.
First proposed in 2018, the Oversight Board was created to serve as something of a Supreme Court for Meta's content moderation decisions. The group released decisions on its first cases in January 2021. One of those early rulings drew heavy criticism for calling for the restoration of a removed post that Muslim activist groups deemed hate speech. But the Oversight Board's most notable case to date has easily been its decision to uphold Meta's suspension of Donald Trump from Facebook. The former president was suspended from the platform following the violent riot at the U.S. Capitol on Jan. 6, 2021.
The Oversight Board's decision did, however, force Meta to set a timeframe for Trump's suspension. Shortly after that 2021 ruling, Meta announced it would consider allowing Trump back on its platforms in January 2023. That may have sounded far off back in June 2021, but it's now just a few months away. If and when Trump returns to Facebook next year, don't be surprised to see his name on an Oversight Board case or two...or twenty.