Delete or not to Delete: Methodological Reflections on Content Moderation

Research output: Contribution to conference › Abstract › Scientific



Content moderation protects human rights such as freedom of speech, as well as the right to impart and seek information. Online platforms implement rules to moderate content through their Terms of Service (ToS), which provide the legal grounds for deleting content. Content moderation is an example of a socio-technical process: the architecture includes a layer that classifies content according to the ToS, followed by human moderation of selected pieces of content. New regulatory approaches, such as the Digital Services Act (DSA) and the Artificial Intelligence Act (AIA), demand more transparency and explainability for moderation systems and the decisions they produce. This article therefore answers questions about the socio-technical sphere of human moderation:
• How is certainty about content moderation decisions perceived within the moderation process?
• How does the measurement of time affect content moderators' work?
• How much context is needed to make a content moderation decision?
A sample of 1,600 pieces of content was coded according to international and national law, as well as to the Community Standards developed by Meta, mimicking a content moderation scenario that includes a lex specialis for content moderation, the German Network Enforcement Act (NetzDG).
Original language: English
Publication status: Published - 2022
Event: 2022 International Empirical Legal Studies Conference - Amsterdam, Netherlands
Duration: 1 Sept 2022 – 2 Sept 2022




Keywords:
  • Content Moderation
  • Digital Services Act
  • Artificial Intelligence Act
  • Human Rights
  • Explainability

