Facebook published its Content Distribution Guidelines detailing the roughly three-dozen types of posts it demotes for various reasons in the News Feed, like clickbait and posts by repeat policy offenders. That process, which relies heavily on machine learning technology to automatically detect problematic content, effectively throttles the reach of offending posts and comments without the author knowing.
The guidelines mostly confirm what various reports have revealed over the years. They don’t detail exactly how a demotion works or how much it reduces a piece of content’s reach, nor how severely one kind of post, like a link to spam, is throttled in the News Feed relative to another, such as a post containing health misinformation.
“We want to give a clearer sense of what we think is problematic but not worth removing” because it doesn’t explicitly violate platform policy, Jason Hirsch, Facebook’s head of integrity policy, told The Verge. He said the company hopes to add more information to the guidelines over time, including how demotions throttle specific kinds of content relative to others. But he said Facebook likely won’t stack rank the severity of demotions “for adversarial reasons.”
The guidelines spell out that Facebook’s policy is to suppress stories that users have disputed as inaccurate — as was the case with The Post’s dubious reporting — until its network of third-party fact-checkers completes a review. That policy became widely known only a year ago, after critics accused the company of political bias for censoring The Post.
According to the distribution guidelines, other types of content Facebook demotes include links to spam sites, “low quality” comments that are either very lengthy with copied text or contain no words at all, posts in groups from accounts that share at a “very high frequency,” and news articles without a clear byline.
Releasing these guidelines is part of a bigger effort to disclose more about how the News Feed works to the public, according to Hirsch. Media outlets and politicians are increasingly examining Facebook’s negative effects on the world, and lawmakers in the US and elsewhere are looking to regulate how social media companies police their platforms.