Press freedom in the digital age: the impact of the Digital Services Act
Today, on the 25th of August, the Digital Services Act starts to apply in all EU member states to the largest digital services, including platforms such as Facebook/Meta, Twitter/X, Instagram, TikTok, LinkedIn and YouTube. Free Press Unlimited applauds this news, as it is an important step forward in protecting press freedom and the safety of journalists in the online space.
These are extraordinary times for press freedom. In the current digital age, a handful of big tech companies act as gatekeepers of access to information and freedom of expression. Online platforms have a strong impact on what people read, see and hear: their algorithms determine which information people access and consume. At the same time, online platforms serve as a breeding ground for disinformation and hate speech.
Systemic regulation of the online space is therefore crucial, something Free Press Unlimited has been advocating for a long time. The European Union's Digital Services Act, or DSA for short, aims to do exactly that: create a safer digital space in which the fundamental rights of users are protected. And we believe that it can, even though we also see room for improvement.
Obligations for press freedom
The DSA includes important obligations to protect press freedom. It should ensure that illegal content is dealt with quickly and adequately and that the spread of disinformation is mitigated. Online services and platforms need to analyze the risks of illegal content and disinformation being spread on their platforms and take measures to reduce their dissemination. In addition, clear mechanisms need to be in place to flag and address illegal content, while ensuring transparency of content moderation decisions and making it possible for users to dispute them.
In the case of extraordinary circumstances affecting public security or public health (such as COVID-19), the DSA includes important articles on the development of crisis protocols to curtail the rapid spread of illegal content and disinformation and to ensure that reliable information reaches the public. These measures against illegal content and disinformation can reduce the intimidation of journalists and (self-)censorship, and promote access to reliable information. This contributes to a more enabling environment for press freedom.
In addition, the DSA empowers users by creating more clarity on why certain information is recommended to them and by obligating platforms to allow users to opt out of recommender systems. It also sets guidelines for advertisements, prohibiting ads based on sensitive data (such as ethnicity, political opinions or sexual orientation) and forcing platforms to label all ads and inform users about who is promoting them.
Free Press Unlimited is pleased to see that the DSA takes a huge step forward in increasing accountability for internet actors and in creating a safer online space.
The DSA’s impact on the online safety of journalists
In today's world, one of the major threats to the safety of journalists is online violence, and women journalists are disproportionately affected by it. While the DSA addresses this partly by imposing measures to remove, or encourage the removal of, illegal content, Free Press Unlimited is concerned that it does not cover the full scope of online violence, because it does not extend to harmful content. This is problematic because most online violence has not (yet) been defined as illegal in national legislation, even though it is incredibly harmful and can result in offline attacks.
Doxxing, for example, where someone's personally identifiable information (such as a home address or phone number) is published publicly, is not illegal in most countries. Nonetheless, it can have serious safety implications, as this information can lead to physical attacks. In a 2020 global study of women journalists, 73 percent of respondents said they had experienced online abuse, and twenty percent reported that they had been attacked or abused offline in connection with online abuse.
Lastly, online violence has severe impacts on different levels. It has an acute impact on journalists personally, often in the form of psychological harm, but it also has a very real effect on press freedom and access to information. Online violence can be just as effective in “shutting someone up” as physical harm.
We argue that online violence needs to be further defined according to international human rights standards, including the different types of violations that are committed online. This definition should be harmonized globally, so that online violence can be effectively addressed in regulation and legislation. Through the Coalition Against Online Violence, where Free Press Unlimited leads the Regulation working group, this is exactly what we are working towards.
Human content moderation is crucial to protect journalists
Free Press Unlimited has long called for "escalation channels" that allow users to easily report online violence, and for those reports to be dealt with appropriately. To be effective, it is crucial that these are not automated systems, but real people who speak the language of the country where the violence took place and know the context. While the DSA does oblige platforms to put in place mechanisms to allow users to report illegal content, there is a lack of clarity on what these mechanisms should look like and what the minimum standards are for them to work effectively.
The DSA also stipulates that online platforms have to ensure that decisions are not taken solely on the basis of automated means, but it does not specify to what extent platforms need to ensure human involvement in content moderation. Free Press Unlimited believes that human, contextualized intervention is essential: human moderators are better equipped to take into account the nuances of language, as well as cultural and sociopolitical context. This moderation should apply not only to illegal content, but also to harmful content, in order to address the full extent of online violence.
Another crucial element of effective content moderation is timely intervention. While the DSA does obligate online platforms to "act expeditiously", it does not set any time frames for content moderation. In practice, this might mean that removing content takes so long that it becomes ineffective. One hateful comment can quickly spiral into a blaze of attacks; a notable example is Maria Ressa, who at one point received over 90 hate messages per hour. Online attacks often escalate to rape threats, death threats and, in the worst case, offline attacks. By quickly and effectively ending threats on social media platforms, further escalation can be prevented.
Free Press Unlimited therefore believes that further clarity is needed on what effective and impactful content moderation looks like, ensuring that interventions are contextualized, timely and human. This not only affects press freedom in general, but also influences what information we all have access to.