2023.11.21

Algorithmic systems: how should DSA risk assessments be conducted?

The Digital Services Act (DSA) became applicable in August 2023 to Very Large Online Platforms and Very Large Online Search Engines (respectively VLOPs and VLOSEs). These actors must comply with their set of DSA obligations, including conducting systemic risk assessments and implementing mitigation measures. While these new obligations hold great promise, the DSA articles do not specify how the identification and assessment of these systemic risks should occur. Civil society has started investigating the topic and delivering recommendations and methodologies. In this piece, I present a selection of their work on the subject.

The DSA is a landmark EU regulation that is currently reshaping the digital landscape. It establishes a new set of accountability obligations for platforms to create a safer digital space. Notably, the DSA introduces a self-assessment and mitigation obligation for VLOPs and VLOSEs regarding the systemic risks stemming from their services. These actors must “diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.”

Systemic risks include, among others: the dissemination of illegal content; any negative effects on the exercise of fundamental rights; and the intentional manipulation of their services, including through inauthentic use or automated exploitation, with negative effects on the protection of public health and minors, on civic discourse, or on electoral processes and public security.

The DSA emphasises the need for VLOPs and VLOSEs to consider the influence of recommender systems, content moderation systems, and other algorithmic systems on these risks. However, the legislation lacks specific details on the risk assessment procedure, prompting civil society to step up and produce insights for a meaningful implementation of this obligation. A selection is presented below. 

  • Risks to media freedom and diversity

AlgorithmWatch delivered an outline of a risk assessment method for measuring the risks posed by internet services to media freedom and diversity. They focused on how to identify and assess the risks that internet services generate for freedom of speech and media pluralism. They established a four-step framework, which they then applied to the digital media sector, illustrated with various case studies.

  • Risks to fundamental rights 

ECNL (European Center for Not-for-Profit Law) and Access Now released key recommendations for conducting fundamental rights impact assessments under the DSA. Their paper aims primarily to help the European Commission (EC) in its enforcement activities, but also VLOPs and VLOSEs in their self-assessments.

  • Risks from the spread of disinformation

An independent study analysed the systemic risks caused by pro-Kremlin disinformation campaigns. The study establishes a methodological approach for civil society and the broader expert community to contribute to assessing the different types of risks that disinformation causes on online platforms. The EC can take the report into account when analysing the risk assessments submitted by VLOPs and VLOSEs.

Final reflections

These initiatives show that while the risk assessment and mitigation obligations under the DSA are powerful tools against online harms, a closer examination of the methodology is essential. The EC has announced that further studies will be conducted.

Close interdisciplinary collaboration between relevant stakeholders will be key to ensuring well-designed procedures and methodologies, especially because conducting and evaluating a risk assessment presents, as underlined by AlgorithmWatch, both normative and technological challenges. Designing risk assessments requires great care and continuous effort, because every risk model unavoidably “requires simplification and abstraction of reality in some respects”.

Author: Noémie Krack, Legal Researcher, CiTiP, KU Leuven-imec.