Germany’s Apex Federal Court voids parts of Facebook’s Terms of Service

Lumen Database Team
Aug 23, 2021

On July 29, 2021, the Bundesgerichtshof, Germany’s Federal Supreme Court, invalidated parts of Facebook’s Terms of Service (ToS) relating to community guidelines and ruled that Online Service Providers (OSPs) are required to inform their users about the removal of posts at least ex post, and about the blocking of user accounts ex ante. In both cases, the Court noted, the user must be given an opportunity to be heard before a new decision is made on the individual case.

The facts of the cases concerned user comments on Facebook about immigrants in Germany, alleging that they could ‘murder and rape and no one would care’ and that they would ‘forever lie on the taxpayer’s pockets’. Both cases (III ZR 179/20 and III ZR 192/20) were brought against Facebook by the aggrieved users after it removed these comments on grounds of hate speech, and both were initially dismissed by the lower and regional courts. The matter was finally appealed to the German Federal Supreme Court earlier this year.

The Court voided the provisions of Facebook’s ToS governing the deletion of posts and the blocking of accounts for violations of its community standards. It held that where the rights of both parties (the OSP and its user) are engaged, as they are in an OSP’s ToS, a balancing test must be carried out: the OSP’s right to carry out its profession must be weighed against the user’s right to freedom of expression. Because Facebook’s ToS denied users an opportunity to be heard before Facebook made a final decision on the removal of content, the provisions failed to give adequate weight to users’ right to free speech and were therefore held invalid.

With this judgment, Germany has now presented intermediaries with a double-edged sword. On the one hand, the NetzDG, in force since 2018, mandates removal of content within 24 hours if it is patently illegal and within seven days for other illegal content. On the other, the German Apex Court has now mandated a four-step procedure for removing content and blocking accounts: flag (and, in the case of user posts, remove) the illegal content, inform the user, give them an opportunity to be heard, and finally make a new decision, all of which might be difficult to accomplish within 24 hours.

The NetzDG was recently amended to introduce a ‘counter-presentation’ procedure, but that amendment only obliges platforms to review their decisions upon request; it does not require them to engage in dialogue with users before taking action against their accounts and posts. And although OSPs may, in the case of user posts, inform users after removal, the three-strike policy, under which a user’s account is blocked after multiple violations, will likely need to be rethought.

This German ruling comes as the European Union is drafting the Digital Services Act, a law aimed at increasing social media transparency and allowing users to challenge OSPs’ decisions about their posts. While the Court did not rule on whether the content itself was illegal, it has certainly provided clarity on the procedure OSPs should follow when exercising their right to remove posts and block user accounts for non-compliance with community standards.

Informing the user each time their content is removed and giving them a chance to be heard may also lead to increased transparency, a goal also reflected in Germany’s 2020 amendment to the NetzDG, which introduced semi-annual transparency reporting requirements. Making removal requests available for research, on the Lumen database or otherwise, could add to this transparency and may also help improve decision making by OSPs.

Presumably, the German Court’s judgment is rooted in an acknowledgment that OSPs often err on the side of caution when removing content, thereby causing a chilling effect. However, it remains to be seen whether the takedown process, which currently relies largely on algorithms, will be able to accommodate the context needed to calibrate the ‘opportunity to be heard’.

About the Author: Shreya is an Employee Fellow at the Berkman Klein Center, where she works on the Lumen Project. She is a passionate digital rights activist and uses her research and writing to raise awareness about how digital rights are human rights. She tweets at @shreyatewari96


Lumen Database Team

Collecting and facilitating research on requests to remove online material. Visit lumendatabase.org and email us if you have questions.