Mapping Interpretations of the Law in Online Content Moderation in Germany

Ben Wagner, Matthias C. Kettemann, Anna Sophia Tiedeke, Felicitas Rachinger, M.T. Sekwenz

Research output: Working paper/Preprint › Working paper › Scientific

Abstract

Content moderation is an essential activity that online platforms must carry out, both to comply with the law and to create (adequate) online environments for their users. While new regulatory frameworks, such as the Digital Services Act in the European Union, create novel obligations for platforms, other legal dimensions, such as the law of the Member States, add a further layer of legal grounds for the moderation of content and shape the decisions taken on a day-to-day basis. These decisions are grounded either in the law or in contractual terms such as the platform's Terms of Service and Community Standards. How to measure these essential aspects of content moderation empirically, however, is still unclear. We therefore ask the following research question: How do online platforms interpret the law when they moderate online content?

To understand this complex interplay and to test platforms' content moderation claims empirically, this article develops a methodology that generates empirical evidence on the individual decisions taken for each piece of content, while highlighting the subjective element of content classification by human moderators. We then apply this methodology to a single empirical case, an anonymous medium-sized German platform that provided us with access to its content moderation decisions. By better understanding how platforms interpret the law, we can gauge how complex content moderation, its regulation, and compliance practices are, as well as to what degree moderation on legal grounds differs from moderation on contractual grounds in dimensions such as the need for context, information, or time.

Our results show a considerable divergence between the platform's interpretation of the law and our own. We believe that a significant number of the platform's legal interpretations are incorrect. These divergent interpretations of the law suggest that platforms are removing legal content they falsely believe to be illegal ('over-blocking') while simultaneously failing to moderate illegal content ('under-blocking'). In conclusion, we provide recommendations for the design of content moderation systems that take (legal) human content moderation into account, and we propose new methodological ways to test their quality and effect on speech on online platforms.
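The over-blocking and under-blocking the abstract describes can be made concrete as simple rates over paired classifications of the same items. The following is a minimal sketch, assuming each moderated item carries both the platform's and the reviewers' legality judgment; the Decision class, its field names, and the sample data are hypothetical illustrations, not the paper's actual methodology.

```python
# Hypothetical sketch: quantify divergence between a platform's
# legality classifications and researchers' classifications.
from dataclasses import dataclass

@dataclass
class Decision:
    platform_illegal: bool   # platform judged the item illegal
    reviewer_illegal: bool   # researchers judged the item illegal

def divergence_rates(decisions: list[Decision]) -> dict[str, float]:
    """Return over-blocking, under-blocking, and agreement rates.

    Over-blocking: the platform flags as illegal content that the
    reviewers consider legal. Under-blocking: the platform misses
    content that the reviewers consider illegal.
    """
    n = len(decisions)
    over = sum(d.platform_illegal and not d.reviewer_illegal for d in decisions)
    under = sum(not d.platform_illegal and d.reviewer_illegal for d in decisions)
    agree = sum(d.platform_illegal == d.reviewer_illegal for d in decisions)
    return {
        "over_blocking_rate": over / n,
        "under_blocking_rate": under / n,
        "agreement_rate": agree / n,
    }

# Example with three hypothetical moderation decisions.
sample = [
    Decision(platform_illegal=True, reviewer_illegal=False),   # over-blocked
    Decision(platform_illegal=False, reviewer_illegal=True),   # under-blocked
    Decision(platform_illegal=True, reviewer_illegal=True),    # agreement
]
print(divergence_rates(sample))
```

In practice such rates would be computed per legal ground and per moderator, since the abstract stresses the subjective element of human classification; a single aggregate figure would hide that variation.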
Original language: English
Pages: 1-48
Number of pages: 48
Publication status: Published - 2024
