German Facebook ruling boosts EU push for stricter content moderation
Federal Court tells social media giant to reinstate deleted posts because the company did not properly inform its users.
Germany’s Federal Court ruled Thursday that Facebook had to reinstate racist comments because it had improperly removed them.
In its ruling, the Karlsruhe-based court said that while Facebook had the right to decide what stays on or off the platform, it needs to be more forthcoming with users on how it does so. The court found that the social media company did not inform two users that it had removed their posts. It added that the company should have also informed and given users an opportunity to respond before suspending them from its platform.
The decision sets a major precedent for how social media companies police content on their platforms, likely giving a boost to EU lawmakers in Brussels, who are pushing to add further obligations on social media companies.
It also upends a years-long status quo whereby platforms were effectively free to develop and implement content moderation policies on their own, which critics say has resulted in a lack of transparency, information, and potential recourse for the users.
The case in question is from 2018, when Facebook removed posts in which two German users attacked migrants because it said the posts violated its policy on hate speech, and then suspended the users’ accounts for several days. The users complained that the posts’ removal was a violation of free speech.
The court said while Facebook was entitled to set strict content rules banning hateful speech and to block users, the way it implemented its content moderation policy was not proper.
“Provisions in general terms and conditions are ineffective if they unreasonably disadvantage the contractual partner of the user contrary to the requirements of good faith,” said the court.
Facebook will not be allowed to delete the posts again after they are reinstated.
“We welcome the ruling of the Court, which upholds the principle that platforms like ours are allowed to remove hate speech according to company policies and block the respective user accounts,” said a Facebook spokesperson.
The company added that they will examine the “Court’s reasoning to ensure we can continue to effectively remove hate speech in Germany.”
Based on the judgement, large social media platforms will have to put more effort into balancing their right as companies to set and enforce their own rules against the rights of their users.
Ripples in Brussels
“Individuals are finding a voice against platforms here with the help of a court, this is a really important decision because it shows how individuals can exercise their rights and that nobody is above the law,” said Matthias Kettemann, a lead researcher on platform law at the Leibniz Institute for Media Research.
The court’s judgement highlighted a general trend toward empowering individuals in their dealings with powerful social media platforms like Facebook, which now effectively operate as town squares.
European lawmakers are currently working on a bill to force tech companies to better moderate their platforms. The bill, called the Digital Services Act, would also force social media platforms to be more transparent about how they police their platforms and allow users to challenge companies’ decisions on their posts.
“Users should understand why platforms make decisions and have options to appeal them and see their content reinstated in case that platforms make mistakes,” said Christoph Schmon, international policy director for the Electronic Frontier Foundation, a digital rights group.
But the German court decision went further than the Digital Services Act’s proposals, ordering Facebook to inform users and ask for their response before banning them from the platform, which experts say will push lawmakers in Brussels to amend the Digital Services Act accordingly.
“The judgement will certainly have some impact on future amendments and ideas that some (political) groups will try to push forward within the Digital Services Act,” said Eliška Pírková, Europe policy analyst at digital rights association Access Now.
Still, she cautioned against overestimating the impact of the case on the EU bill.
“We are seeing members of the European Parliament being influenced rather by the national regulations or legislative proposals, within their own member states, than by the decisions of courts,” she said.
Julian Jaursch, a platform governance researcher at the Stiftung Neue Verantwortung think tank in Berlin, believes the case dented social media companies’ power.
“The court case is an acknowledgement that rules for content moderation have so far been, in most cases, unilaterally decided and set by the tech platforms, and now it’s very clear that there are limits.”
UPDATED: This article has been updated to include further details and comments regarding the case.