Digital Services: managing harmful content while protecting freedom of expression
- Updating EU rules on “notice and action” procedures to protect users’ rights
- Stricter conditions for targeted ads and more control over what users see online
- Existing or new EU body to monitor and impose fines
The Legal Affairs Committee calls on the Commission to review existing laws applicable to online providers and put forward new legislation on online content management.
The legislative initiative, adopted with 22 votes in favour and 1 against, will contribute to the Commission’s upcoming Digital Services Act (DSA) package.
The text calls for EU-wide standards on how hosting platforms should moderate content and apply so-called “notice and action” mechanisms. These new rules on content moderation, combined with judicial redress, should provide legal clarity for platforms and guarantee users’ fundamental rights, MEPs say.
When content is flagged or taken down, users should be notified and able to seek redress through a national dispute settlement body. However, the final decision should be taken by an independent judiciary, not by private undertakings. “We do not want private companies to police the internet”, said rapporteur Tiemo Wölken (S&D, DE) after the vote.
To ensure platforms comply, the Commission should assess options for a European entity to monitor and impose fines, such as an EU body or a coordinated network of national authorities, MEPs propose.
Limiting harmful content
The Legal Affairs Committee urges the Commission to draw a clear distinction between illegal and harmful content. Future rules on content management should also address the spread of harmful content that is not illegal and is therefore protected under freedom of speech rules, such as fake news or disinformation.
When it comes to self-regulation, MEPs are firm that platforms should not employ upload filters or any form of ex-ante content control for harmful or illegal content.
“Make users less dependent on algorithms”
“We need to address the business practice of selling personalised targeted ads and the impact this has on the spread of harmful content”, said Mr Wölken at a press conference. This means giving users more control over what they see online, including the possibility to opt out of content curation altogether. “The idea is not to censor unwanted content, but to make users less dependent on the algorithms that reward attention-seeking content, thus increasing their freedom of information”, he said.
Targeted advertising must be regulated more strictly in favour of less intrusive, contextualised forms of advertising that require less data and do not depend on previous user interaction with content. Being shown personalised advertising should depend on users’ free, informed and unambiguous consent.
The upcoming DSA should introduce transparency rules on the terms under which data may be accumulated for this purpose, and provide for the right to use digital services anonymously whenever possible.
For more details on these and additional recommendations, see the compromise amendments approved by the Legal Affairs Committee on Monday.
Next steps
The plenary vote is scheduled for the 19-22 October session. This legislative initiative, along with recommendations from the Internal Market and Civil Liberties committees, will feed into the Commission’s upcoming Digital Services Act package, expected before the end of the year.