The Digital Services Act Package: The Digital Services Act
The European Commission proposed the Digital Services Act package, a set of rules aimed at creating a safer and more transparent digital space in the EU.
The Digital Services Act gives effect to the principle that what’s illegal offline should also be illegal online. It aims to improve the safety of digital spaces by guaranteeing users’ fundamental rights.
The Act applies to online platforms that act as intermediaries between businesses and consumers and that provide services in the EU. Once the Act passes into law, it will apply to all platforms either 15 months after its entry into force or from January 1, 2024, whichever is later.
Main issues
The Act aims to address “the trade and exchange of illegal goods, services and content online” and algorithms used for harmful purposes, such as to “amplify the spread of disinformation” (digital-strategy.ec.europa.eu, 2022). Consequently, platforms are obliged to take down illegal content and goods more swiftly and to tackle the spread of misinformation more rigorously. Furthermore, the Act may outlaw dark patterns — manipulative interfaces designed to trick users into making certain decisions — and potentially other misleading practices such as clickbait.
Transparency and accountability
The Act obliges platforms to conduct risk assessments to identify the risks they pose to users’ fundamental rights, including freedom of expression and consumer protection, and to implement measures to mitigate those risks. For instance, major platforms such as Facebook, Twitter, Google and Amazon must assess how their “algorithmic systems, advertising models and data practices contribute to systemic risks and adapt their systems and processes” to mitigate these risks (amnesty.org, 2022). Furthermore, platforms have to undergo “independent yearly audits” and grant regulators, independent researchers and civil society access to their data and the inner workings of their algorithms (amnesty.eu, 2022). For instance, large platforms such as Facebook have to make the inner workings of their recommender algorithms (e.g. those used to determine the order of content on one’s newsfeed) transparent to users, and must provide data to researchers so that they may gain insight into how online risks evolve. Hence, in the words of the European Commissioner for Competition, Margrethe Vestager, the Act ensures that “platforms are held accountable for the risks their services can pose to society and citizens.”
Proportionality
One concern about the Act is that it may be too stringent a regulation and may thus hamper innovation. However, the Act distinguishes amongst platforms of different sizes, with bigger platforms bearing greater obligations. In the words of Commissioner for the Internal Market Thierry Breton: “The DSA is setting clear, harmonised obligations for platforms – proportionate to size, impact and risk.” Accordingly, the largest platforms (those with at least 45 million users in the EU), such as Facebook, will face the closest scrutiny. Moreover, regarding the concern that the Act may hamper innovation, the other half of the Digital Services Act package, the Digital Markets Act, may provide relief. Dr. Nathalie Moreno points out that this Act promotes innovation by establishing a level playing field for smaller, newer entrants through the rules it imposes on established, large “gatekeeper” platforms.
Rights of users
Users have the right to choose how content is presented to them. They may opt for content to be presented in chronological order instead of a display (e.g. a newsfeed) based on algorithmic profiling (amnesty.org, 2022). One manifestation of this right is Instagram’s option for users to view content in chronological order. Furthermore, targeted advertising based on users’ sensitive data, such as one’s ethnicity, is prohibited, as is advertising targeted at minors. Moreover, the Act requires platforms to implement effective mechanisms that allow users to flag illegal content, to provide clear reasons as to why content was taken down, and to grant users the right to appeal such decisions. However, the Act does not define what constitutes illegal content. Rather, it leaves this matter to be decided by the individual member states. Moreno notes that this variation amongst member states may be an issue and that “good guidance” on what constitutes illegal content is needed (thestack.technology, 2022). Additionally, German Pirate Party MEP Patrick Breyer criticises the rights that the Act bestows upon users, arguing that they fail “in multiple respects to protect our fundamental rights online” (european-pirateparty.eu, 2022).
“Crisis mechanism”
Prompted by Russia’s invasion of Ukraine and the resulting spread of Russian misinformation online, the Act provides that the EU Commission can “analyse and control” the activities of large platforms in relation to the specific crisis at hand. This provision may be relied upon during crises such as pandemics and wars. Furthermore, large platforms are obliged to introduce new measures to tackle misinformation during crises.
Penalties
If a platform violates the Act’s provisions, the EU Commission may impose a fine of up to 6% of the platform’s global turnover. In the event of repeated violations, a platform may even be banned from conducting business in Europe.
By: Maiya Dario