Internal Facebook documents paint a troubling picture.
No matter how bad conditions appear on the surface, it's even more treacherous below.
That's the disturbing takeaway from the Facebook Papers, a collection of internal Facebook documents leaked by whistleblower Frances Haugen and reviewed by 17 news organizations. Their stories paint a picture of a company broken beyond repair, one that, despite scandal after scandal, still has the power to shock.
A small taste of that reporting, summed up below, proves just how bad the situation truly is.
1. Facebook's leaders ignored their own employees' cries for reform
The people who work at Facebook are not a monolith, and the Atlantic reports that company documents show some employees calling out real-world harm caused by the platform — only to be brushed aside by higher-ups.
"How are we expected to ignore when leadership overrides research based policy decisions to better serve people like the groups inciting violence today," a Facebook staffer wrote in the fallout of the Jan. 6 attack on the U.S. Capitol. "Rank and file workers have done their part to identify changes to improve our platform but have been actively held back. "
2. Mark Zuckerberg personally approved censoring anti-government posts abroad while masquerading as a free speech advocate in the U.S.
Facebook CEO Mark Zuckerberg doesn't want to be in the business of censoring political speech, he has repeatedly insisted. And yet, according to the Washington Post, he's personally done just that when it suits his company's bottom line.
The Post highlights a particularly nasty example of the CEO's duplicity in Vietnam, where, according to people familiar with the decision, Zuckerberg himself made the call to censor anti-government posts on behalf of the ruling Communist Party in 2020.
Vietnam is an important market for Facebook. A 2018 Amnesty International estimate found Facebook earned approximately $1 billion in annual revenue from the country.
3. Facebook's own researchers were shocked by its algorithm's recommendations
That the Facebook algorithm amplifies divisive content is now a widely understood fact. Even so, the horrific nature of that content still has the power to shock even Facebook's own researchers.
"On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India," reports the New York Times. "For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook's algorithms to join groups, watch videos and explore new pages on the site."
According to internal Facebook documents, the experiment laid bare just how skewed Facebook's recommendation systems are.
"Following this test user's News Feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total," wrote the Facebook researcher.
4. Facebook puts politics front and center when enforcing its own rules
Reporting has shown that Zuckerberg feared the wrath of Facebook's conservative users, and thus would personally intervene on behalf of right-wing pundits and publishers. Leaked documents contained in the Facebook Papers and highlighted by Politico show that even Facebook's own researchers were aware of this, and repeatedly called it out internally.
"Facebook routinely makes exceptions for powerful actors when enforcing content policy," wrote a Facebook data scientist in a 2020 internal presentation titled Political Influences on Content Policy. "The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies."
Notably, as Politico points out, the Public Policy team referred to by the researcher includes Facebook lobbyists.
What's more, Facebook researchers confirmed that Zuckerberg himself often got involved in deciding whether a post should stay or go — suggesting a two-tier system of enforcement dependent on unwritten rules. As one internal document put it:

In multiple cases the final judgement about whether a prominent post violates a certain written policy is made by senior executives, sometimes Mark Zuckerberg. If our decisions are intended to be an application of a written policy then it's unclear why executives would be consulted. If instead there was an unwritten aspect to our policies, namely to protect sensitive constituencies, then it's natural that we would like executives to have final decision-making power.
5. It took a threat from Apple to prompt a Facebook full-court press against human trafficking
Human traffickers have used Facebook's tools to power their work. As CNN reports, a 2020 internal Facebook document made clear that Facebook was long aware of this fact.
"[Our] platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks," reads the internal Facebook report in part.
And yet, while human trafficking has long been explicitly banned on Facebook, it took Apple threatening to boot Facebook and Instagram from the Apple App Store in 2019 for Facebook to muster the type of response one might have expected much earlier.
"Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB," reads the document reviewed by CNN. "To mitigate against this risk, we formed part of a large working group operating around the clock to develop and implement our response strategy."
Importantly, Apple wasn't the first to bring the issue to Facebook's attention.
"Was this issue known to Facbeook [sic] before the BBC enquiry and Apple escalation?" the internal Facebook report asks. "Yes."