The European Commission has proposed a new Digital Services Act package to tackle illegal online content and create a ‘safer digital space’. Joseph Downing examines how the new package of legislation might affect the way we communicate online, and whether we should be concerned at the prospect of EU officials having the power to set standards concerning illegal digital content.
Big tech has been blamed for many contemporary political and social ills, including fuelling the rise of extremist politics and spreading disinformation during the Covid-19 pandemic. The EU’s response to these concerns has been to draft a new Digital Services Act package of legislation that aims to ‘create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for business’.
However, the measures included in the new package – which consists of two different acts, the Digital Services Act and the Digital Markets Act – present some significant issues. It is also unclear whether the operationalisation and implementation of the legislation will actually help solve the problems that exist with digital communications.
Content moderation
A key aim of the package is to tackle the spread of illegal content and misinformation online. This has been a significant problem for some time, but two key issues emerge here.
First, there are freedom of speech implications associated with establishing Europe-wide standards on what is ‘illegal’ content. The fact that these standards will be decided by unelected officials in Brussels arguably sets a dangerous precedent. Indeed, similar legislation proposed by Emmanuel Macron’s government in France was recently struck down by the country’s Constitutional Council on the grounds that it would pose a significant risk to freedom of expression.
Second, the rules set out a framework for platforms to work with specialised ‘trusted flaggers’ to identify and remove content. However, the legislation remains ambiguous about how individuals will become ‘trusted’ and how they will be trained or retained over time. This reproduces many of the issues that platform moderation has already been criticised for, namely that it tends to be both unaccountable and expensive.
The potential commercial burden for social media companies is enormous, and even the maximum fines for non-compliance with the legislation (up to 6% of a company’s annual worldwide turnover, though actual fines are likely to be much smaller) may simply be absorbed as a cost of doing business if compliance proves more expensive. This is not to mention the huge toll content moderation takes on human workers, something which is likely to prove extremely problematic in terms of staff training and retention, as well as staff wellbeing.
Complementing human workers with algorithms and AI would seem to be a safer and more logical alternative. However, these algorithms have been criticised in the past for their opacity, as well as for missing harmful content due to the complex nuances of the text, image, video and audio that make up the social media landscape.
The new package specifies that these processes should be made transparent. Again, this is not as straightforward as it may seem. Algorithms also sort content to generate the revenue social media outlets need to survive and thus they are extremely commercially sensitive. Platforms invest huge amounts of money in the human and machine infrastructure needed to generate these complex models and they are highly unlikely to be willing to openly offer up their trade secrets.
Platform migration and the ‘agility’ of fake news
Another important issue that could significantly limit the effectiveness of the new legislation is the remarkable agility of users themselves. Social media regulation and platform censorship aimed at taking down violent or hateful content is nothing new. However, users have shown significant resourcefulness in getting around these attempts through platform migration.
This has frequently been demonstrated by the Islamic State (IS) group and alt-right and conspiracy theory influencers, among others. These actors have simply side-stepped censorship by moving to other apps like Telegram. The fact that many conspiracy theories thrive on ideas of victimhood and persecution by ‘the elite’, as well as general paranoia about authorities trying to prevent people from discovering the truth, means that increased censorship gives further fuel to the fire. As social media platforms continue to proliferate and grow, questionable content will always be able to find a home.
It is difficult to imagine a situation in today’s digital world where social media could be left unregulated. However, establishing effective regulation is highly complex and problematic. It is easy to promise a ‘safer digital space’ and to ‘safeguard users’ rights’, but far more difficult to actually deliver on such promises or to gain compliance from large multinational companies. Whether the EU’s new package of legislation can actually make a difference in this context remains to be seen.
Europeanising social media regulation?
The Digital Services Act also demonstrates a somewhat troubling aspect of Europeanisation. This is evident in the case of France, where Emmanuel Macron’s previous attempts to legislate in this area were struck down by the French Constitutional Council. The EU legislation, which has broad similarities with the previous legislation in France, has been pushed for by Macron as part of his assertion of the French position in Europe.
In doing so, Macron has effectively used the EU’s legislative institutions to promote and adopt regulations defeated by his own country’s Constitutional Council. This is problematic as France will now be obliged to comply with the new EU legislation. Thus, France will have regulations imposed on it from above that it rejected at the national level.
Quite apart from the effectiveness of the EU’s attempts to regulate digital communications, the new legislation therefore raises some important questions about the process of Europeanisation and the EU’s democratic deficit. These questions are arguably just as important as the problems the Digital Services Act package aims to solve.
About the author
Joseph Downing is a Senior Lecturer in Politics and International Relations at Aston University and a Visiting Fellow in the European Institute at the London School of Economics. He is the author of French Muslims in Perspective: Nationalism, Post-Colonialism and Marginalisation under the Republic (Palgrave Macmillan, 2019).