Rhythms of Regulation – Orchestrating Transparency under the Digital Services Act’s New Score

Activity: Talk or presentation at a conference

Description

The Digital Services Act (DSA) introduces new transparency mechanisms and harmonizes rules for content moderation and platform governance in the European Union. Under the new law, platforms must publish transparency reports that offer a bird's-eye view of the platform's content moderation quality and processes pursuant to Art 15, 24, and 42 DSA, including information about content moderation teams and their training, automated means used in the process, details about internal complaint-handling systems and out-of-court dispute settlements, and notices received from public authorities. Additionally, a Statement of Reason (SOR) must be issued for every content moderation decision taken on a platform, including details about the moderation choice and the action taken against a piece of content or user account, the legal or contractual ground, the actor that identified the content in question, and an explanation of the decision, in line with Art 17 DSA. These SORs are publicly available in the Transparency Database. By looking at these two sources of transparency reporting together, we want to contribute to a more holistic understanding of compliance with the DSA and of transparency mechanisms overall.

Transparency reports under previous laws regulating illegal content, such as the German NetzDG or the Austrian KoPlG, were criticized for their shallow statements, incomplete information, and skewed presentation of the situation on the platform. We therefore aim to answer the overarching question: How can the various DSA transparency and control mechanisms contribute to understanding the quality of content moderation systems and platforms' compliance with the DSA?

This article analyzes how different transparency mechanisms of online platforms can meaningfully be read together to cross-check platforms' statements and situate them in context. To this end, we selected a set of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), as they are subject to the most detailed transparency requirements, including biannual reporting obligations under Art 42 DSA. To better understand how SORs and transparency reports fit together and can serve as interlocking control mechanisms, we compare the second round of transparency reports, released on 25 April 2024, with the corresponding SORs for the same period.

To this end, we answer four research questions. First, which legal provisions regulate transparency and control mechanisms in the DSA? Second, what information does the SOR database provide about content moderation systems and actions? Third, what information do the transparency reports provide about content moderation systems and platform compliance? Finally, how can the various DSA transparency and control mechanisms together contribute to understanding the quality of content moderation systems and compliance with the DSA?

By answering these questions, we aim to contribute to the ongoing debate about what constitutes compliance with transparency regulation under the new law. Furthermore, we aim to create a better understanding of how researchers, regulators, platforms, Trusted Flaggers, and users can combine different means of transparency to be better informed about the quality of content moderation and transparency reporting. By analyzing strengths and weaknesses in the current reporting practices of VLOPs and VLOSEs, we want to support the early stages of creating informative, accurate, and meaningful reports under the DSA. We understand the means of reporting not as isolated processes but as a transparency-enhancing ecosystem whose different parts must interlock to flourish and grow organically.
Period: 11 Apr 2024
Event title: Regulators and Regulation in the Digital Era at the European University Institute
Event type: Conference
Location: Florence, Italy

Keywords

  • Digital Services Act
  • Transparency Reporting
  • Statement of Reason
  • Interlocking Control Mechanisms