Design Recommendations for Safer Election Campaigning Online

Research output: Contribution to conference › Abstract › Scientific


Abstract

The internet has become a place where the political opinions of voters are increasingly formed on platforms and their user-generated content worldwide. It is a sphere in which the rights to freedom of expression, to information and to free and fair elections are core human rights safeguards for our democracy. Securing this process for everyone is not an easy task, as examples such as the 2016 US election, the Brexit campaign or the events of 6 January 2021 illustrate. The European Union has taken regulatory action to secure the digital manifestations of elections by issuing legislation like the General Data Protection Regulation, the Artificial Intelligence Act, the Digital Services Act (DSA) or the Proposal for a Regulation of the European Parliament and of the Council on the transparency and targeting of political advertising. The aim is to make platforms more transparent regarding the algorithms that decide on recommendations and the price of an ad, and to standardize content moderation to a certain degree. Platforms, on the other hand, use their Terms of Service (ToS) to implement their Community Standards, a set of law-like clauses allowing for the deletion or blocking of content, and thereby set quasi-norms intended to safeguard democracy online. The ToS used by very large online platforms (VLOPs) within the meaning of Art 25 DSA, however, do not include granular clauses for European election campaigning. More recent design solutions on platforms include advertising repositories and warning labels attached to problematic content to better inform the public. However, the moderation of content addressing the heart of democracy and the democratic process per se is crucial for the status of human rights in Europe. The first decision taken on a piece of content, whether it may be uploaded to the platform or not, is usually automated and controlled by machine learning algorithms. The system then selects the pieces of content that will be decided upon by a human in the next step of the process. The moderation of political speech, however, is not solely text-based: political expression often relies on sarcasm, emojis or visual content, which is a further obstacle to moderation in an electoral context. This article therefore asks how the right to fair elections and the rights to freedom of expression and information can be better safeguarded in online campaigning and elections, in line with recent European legislation such as the GDPR, the DSA, the AIA and the proposal on the transparency and targeting of political advertising. The article answers this question by taking a closer look at the publicly available data platforms publish in their transparency reports. Furthermore, the ToS and Community Standards are analyzed and compared. The process and architecture of content moderation for the selected online platforms are described and modelled according to the publicly available information. Only by providing a more concrete look at content moderation design and practice can better solutions for the digital future of democracy be crafted.
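To make the two-stage process described in the abstract more concrete (an automated first decision followed by human review of selected items), the following minimal sketch illustrates how such a triage step might look. It is an illustrative assumption only: the classifier, labels and thresholds are hypothetical and do not reflect any platform's actual system.

```python
# Illustrative sketch of a two-stage moderation triage: an automated first
# decision, with uncertain cases escalated to a human moderator.
# All names, labels and thresholds below are hypothetical assumptions.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ContentItem:
    item_id: str
    text: str


@dataclass
class ModerationDecision:
    item_id: str
    action: str          # "publish", "remove", or "escalate_to_human"
    risk_score: float    # output of the (hypothetical) classifier


def triage(item: ContentItem,
           classifier: Callable[[str], float],
           remove_threshold: float = 0.95,
           review_threshold: float = 0.60) -> ModerationDecision:
    """Automated first decision: publish, remove, or queue for human review."""
    score = classifier(item.text)
    if score >= remove_threshold:
        action = "remove"                 # high-confidence violation
    elif score >= review_threshold:
        action = "escalate_to_human"      # uncertain cases go to a moderator
    else:
        action = "publish"                # low-risk content is published
    return ModerationDecision(item.item_id, action, score)


if __name__ == "__main__":
    # A toy classifier standing in for a platform's machine learning model.
    toy_classifier = lambda text: 0.7 if "attack" in text.lower() else 0.1
    decision = triage(ContentItem("ad-001", "Vote for us, attack the lies!"),
                      toy_classifier)
    print(decision)  # escalated to human review under the assumed thresholds
```

In this sketch the thresholds determine how much content reaches human moderators, which is exactly the kind of design choice the article argues should be made transparent and analyzable.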
Original language: English
Publication status: Published - 2022
Event: Democracy & Digital Citizenship Conference Series - Roskilde University, Roskilde, Denmark
Duration: 29 Sept 2022 - 30 Sept 2022
https://events.ruc.dk/democracyanddigitalcitizenship/conference

Conference

Conference: Democracy & Digital Citizenship Conference Series
Country/Territory: Denmark
City: Roskilde
Period: 29/09/22 - 30/09/22
Internet address: https://events.ruc.dk/democracyanddigitalcitizenship/conference
