Source: https://www.hrw.org/
The new German law that compels social media companies to remove hate speech and other illegal content can lead to unaccountable, overbroad censorship and should be promptly reversed, Human Rights Watch said today. The law sets a dangerous precedent for other governments looking to restrict speech online by forcing companies to censor on the government’s behalf.

“Governments and the public have valid concerns about the proliferation of illegal or abusive content online, but the new German law is fundamentally flawed,” said Wenzel Michalski, Germany director at Human Rights Watch. “It is vague, overbroad, and turns private companies into overzealous censors to avoid steep fines, leaving users with no judicial oversight or right to appeal.”
Parliament approved the Network Enforcement Act, commonly known as NetzDG, on June 30, 2017, and it took full effect on January 1, 2018.
The law requires large social media platforms, such as Facebook, Instagram, Twitter, and YouTube, to promptly remove “illegal content,” as defined in 22 provisions of the criminal code, ranging widely from insult of public office to actual threats of violence. Faced with fines of up to 50 million euros, companies are already removing content to comply with the law.
At least three countries – Russia, Singapore, and the Philippines – have directly cited the German law as a positive example as they contemplate or propose legislation to remove “illegal” content online. The Russian draft law, currently before the Duma, could apply to larger social media platforms as well as online messaging services.
Two key aspects of the law violate Germany’s obligation to respect free speech, Human Rights Watch said. First, the law places the burden on companies that host third-party content to make difficult determinations of when user speech violates the law, under conditions that encourage suppression of arguably lawful speech. Even courts can find these determinations challenging, as they require a nuanced understanding of context, culture, and law. Faced with short review periods and the risk of steep fines, companies have little incentive to err on the side of free expression.
Second, the law fails to provide either judicial oversight or a judicial remedy should a cautious corporate decision violate a person’s right to speak or access information. In this way, the largest platforms for online expression become “no accountability” zones, where government pressure to censor evades judicial scrutiny.
At the same time, social media companies operating in Germany and elsewhere have human rights responsibilities toward their users, and they should act to protect them from abuse by others, Human Rights Watch said. This includes stating in user agreements what content the company will prohibit, providing a mechanism to report objectionable content, investing adequate resources to conduct reviews with relevant regional and language expertise, and offering an appeals process for users who believe their content was improperly blocked or removed. Threats of violence, invasions of privacy, and severe harassment are often directed against women and minorities and can drive people off the internet or lead to physical attacks.
Criticism of the new law has intensified over the past six weeks after content from some high-profile users was blocked or their accounts were temporarily suspended, even though some of those actions were due to violations of the companies’ user rules rather than NetzDG.
Users whose speech was censored, whether under NetzDG or for violating a company’s user agreement, include a leader of the far-right Alternative for Germany party, a satire magazine, and a political street artist. Content from many less prominent users may also have been improperly blocked or removed, either under NetzDG or for alleged violations of user rules, Human Rights Watch said.
Four of the larger political parties now oppose the law: The Left, which voted against the law; the Free Democrats and the Alternative for Germany, which were not in parliament when the law passed; and the Green Party, which abstained in parliament’s vote. A senior official of the Christian Social Union, which was part of the government that proposed the law, has also come out against it.
Chancellor Angela Merkel has defended the need to regulate the internet but said “it may be that we also have to make changes” to the law. The coalition agreement between her Christian Democratic Union, the Christian Social Union, and the Social Democratic Party for a new government, released on February 7, calls the NetzDG law a “correct and important step” but says the government will evaluate ways to “further develop” the law.
Many organizations dedicated to human rights and media freedom have opposed the law since it first appeared in draft form. The Global Network Initiative, a coalition of nongovernmental organizations, academics, investors, and companies committed to free expression and privacy online, said the law would “outsource decisions” about freedom of expression to private companies. In an open letter to eight EU commissioners, a group of six civil society and industry associations said the law would chill freedom of speech online by incentivizing companies to remove reported content. The freedom of expression organization Article 19 issued a legal critique of the law, saying it will “severely undermine freedom of expression in Germany, and is already setting a dangerous example to other countries.”
The United Nations special rapporteur on freedom of opinion and expression, David Kaye, said the draft law was at odds with international human rights standards. The government defended the law, citing changes to the draft that Kaye reviewed, such as more flexibility on deadlines to remove content and the introduction of an authorized body to review complex cases, but failed to address Kaye’s key concern that the law places responsibilities on private companies to regulate the exercise of freedom of expression.
“With the NetzDG law, Germany has undermined free speech at home and set a troubling example for other countries that want to block artistic expression, social criticism, political activism, or independent journalism online,” Michalski said. “Forcing companies to act as censors for government is problematic in a democratic state and nefarious in countries with weak rule of law.”
A Flawed Law
Under the NetzDG law, companies with more than 2 million registered users in Germany are required to establish an effective and transparent procedure to receive and review complaints of allegedly illegal content. They must block or remove “manifestly unlawful” content within 24 hours of receiving a complaint; for other unlawful content, they have up to one week, or potentially longer if further investigation is required. In especially complex cases, companies can refer the decision to an industry-funded but government-authorized body, which must make its determination within seven days. The government has not yet produced the criteria for authorizing such a body, and it can change the criteria at will.
Companies must inform users of all decisions made in response to complaints and provide justification, but the law does not provide for meaningful judicial oversight or a process of judicial appeal when users want to contest a corporate or industry body decision to block or remove a post.
Under the law, the Federal Ministry of Justice and Consumer Protection can fine a responsible individual up to 5 million euros and the company up to 50 million euros for failing to establish a compliance system or for failing to issue a public report every six months on their actions related to the law. The amount of the fine depends on the gravity of the offense and the number of users on the platform, but the ministry has not yet published the fine structure.
Company Responses
To comply with the law, social media companies have created new mechanisms to report allegedly illegal content and hired reviewers to analyze those reports. These reviewers join the teams these companies already had in place to monitor compliance with their user agreements.
Google, which owns YouTube, announced in December 2017 that, over the next year, it would bring the total number of people working to address content that might violate its policies to over 10,000. Facebook told Human Rights Watch that it employs about 10,000 content reviewers globally, either directly or via contractors, including at two centers in Germany, primarily to monitor violations of its “Community Standards” but also to review complaints under NetzDG.
Both of these companies, as well as Twitter, have reporting forms specifically for NetzDG, which help them assess potential violations of the law and collect data for the required six-month reports.
A significant difference between reporting a violation of community standards and reporting a violation of NetzDG is the right to appeal. For the former, Facebook, YouTube, and Twitter all offer users the chance to challenge a decision to block or remove content. For the latter, the law does not require companies to offer an appeals process, and none have done so.
Domino Effect
The precedent that NetzDG has set deserves special attention, as governments around the world increasingly look to restrict online speech by forcing social media companies to act as their censors, Human Rights Watch said. Some examples include:
In Singapore, a country with a record of using overly broad criminal laws to chill free speech, the government is citing the German law as a positive example as it proposes ways to tackle “fake news.”
In the Philippines, the Act Penalizing the Malicious Distribution of False News and Other Related Violations was submitted to Congress in June, referencing the German law. The bill proposes fines for social media companies that fail to remove false news or information “within a reasonable period” and imprisonment for responsible individuals. It is currently with the Committee on Public Information and Media and is among the measures being discussed in a Senate hearing on ways to tackle fake news.
In Russia, the ruling United Russia party submitted two draft laws to the State Duma in July to regulate online content. Citing the German law, one of them requires social media platforms with more than 2 million registered users and other “organizers of information dissemination” in Russia to remove, within 24 hours of receiving a complaint, certain types of illegal content, such as information that propagates war; incites national, racial, or religious hatred; defames the honor, dignity, or reputation of another person; or is disseminated in violation of administrative or criminal law. The other levies fines for failure to remove illegal content: from 3 to 5 million rubles (US$53,220 to $88,700) for individuals, and from 30 to 50 million rubles (US$532,200 to $887,000) for legal entities. The first draft law has entered the first hearing stage; the second is still under review.
In Venezuela, the pro-government Constituent Assembly on November 8 adopted the “Anti-Hate Law for Peaceful Coexistence and Tolerance.” Among other provisions that restrict free speech and association, the law imposes high fines on social media platforms that fail to delete content that “constitute[s] propaganda advocating war or national, racial, religious, political, or any other kind of hatred” within six hours of posting.
In Kenya, the Communications Authority issued guidelines in July that oblige social media platforms to close accounts used to disseminate “undesirable political contents” within 24 hours after the content is brought to the platform’s attention, though no one is known to have been punished yet. Undesirable content includes political messages that contain “offensive, abusive, insulting, misleading, confusing, obscene or profane language.”
In Europe, the European Commission has called for social media platforms to assume greater responsibility for identifying and removing illegal online content, including through a code of conduct for IT companies. The UK and French governments have been developing a joint action plan to improve the identification and deletion of online material that state authorities deem terrorist, radical, or hateful. Their proposals include pressing companies to automate the detection of illegal content, speed up its suspension or removal, and provide access to encrypted content.
In the United Kingdom, Prime Minister Theresa May recently called on large social media companies to do more to identify and remove terrorist content. One of her ministers called for tax penalties against tech companies that were slow to remove content or refused to give the government access to encrypted messages.