Author

Debbie Heywood

Senior professional support lawyer


18 January 2021

Radar - January 2021 – 2 of 3 Insights

EC draft Digital Services Act

  • In-depth analysis

What's the issue?

In June 2020, the European Commission launched a consultation on a proposed new Digital Services Act package. The consultation looked at a wide range of issues including online harms, online advertising, online competition, smart contracts and governance, as well as at issues around intermediary liability for user generated content.

What's the development?

Following the consultation, the draft Digital Services Act (DSA) was published by the EC in December 2020. It is part of the Commission's package of proposals to foster competition and innovation in the digital economy while preserving fundamental rights and improving online safety for users. The DSA focuses on online intermediary services and, in part, overhauls aspects of the eCommerce Directive. It is intended to improve mechanisms for removal of illegal content while protecting freedom of speech, and to create stronger oversight of very large online platforms.

What does this mean for you?

This is the initial proposal for a DSA. The European Parliament and Council will now consider it and come up with their own versions. Following that, trilogues will begin to produce a consolidated version which will become final once approved through the ordinary legislative process. This could take some time to agree. The GDPR, for example, took four years to complete and another two before it came into effect, and the ePrivacy Regulation has still to be agreed some four years after publication.

There is considerable political appetite for regulation of digital service providers which may hasten the legislation through to enactment but it is difficult, at this stage, to predict how long it will take or what it will look like in final form.

If the DSA is enacted as currently envisaged, it will place a significant compliance burden on service providers and, unlike the UK's proposed Online Safety Bill, it leaves questions around lawful but harmful content largely unanswered. On the other hand, the EC has brought a wider range of unlawful content into scope than the UK, whose proposals contain a number of notable exclusions, including in relation to intellectual property, data protection and consumer protection.

More detail

Who is caught by the DSA?

The DSA contains a range of obligations which apply to different classes of online player depending on their role, size and impact in the online ecosystem, ie whether they are:

  • intermediary services offering network infrastructure – eg internet access providers and domain name registrars – a category which also includes:
  • hosting services, which in turn include:
  • online platforms (excluding small and micro enterprises) which bring together sellers and consumers – eg online marketplaces, app stores, collaborative economy platforms and social media platforms, and
  • very large online platforms – those reaching more than 10% of the 450 million consumers in the EU (ie over 45 million users).

The DSA will have extra-territorial effect and will apply to intermediary services with a "substantial connection" to the EU.

What are the overarching new measures?

These include:

  • measures to counter illegal goods, services and content online – including a mechanism for users to flag content and for platforms to cooperate with 'trusted flaggers'
  • new obligations on traceability of business users of online marketplaces to help identify sellers of illegal goods
  • effective safeguards for users – including the possibility to challenge content moderation decisions made by platforms
  • obligations on very large platforms to prevent misuse of their systems
  • access to key data of the largest platforms by researchers
  • an oversight structure including a new European Board for Digital Services.

[Infographic reproduced from EC press release]

Hosting exemption

Providers which act as 'mere conduits' or which only cache or host information have the fewest obligations and they retain the benefit of the hosting exemption for liability for third party content which is carried over from the eCommerce Directive. The DSA makes it clear that there is no general obligation to monitor content and that carrying out voluntary or mandatory investigations to detect and remove illegal content will not void the exemption. The exemption does not apply to liability under consumer protection law where an online platform appears to be providing products or services itself.

Illegal content

Illegal content is defined broadly to cover any information which does not comply with EU or Member State law. Lawful but harmful content is not defined and is not subject to take down measures. This is a different approach to that to be taken by the UK on online harms in its upcoming Online Safety Bill.

All providers must:

  • respond to takedown orders for illegal content
  • respond to disclosure orders to provide authorities with information about individual users
  • include information in their terms and conditions on usage restrictions, policies and procedures, tools and algorithms used for content moderation and decision making, as well as about any human review.

Hosting service providers must implement prescribed processes for 'notice and action' under which potentially illegal content is reported and reports are responded to and actioned. This must include providing users with information about redress and how any take down decision was made, including automated decision making. The information must be published on a Commission database. Once subject to notice of potentially illegal content, the provider will most likely lose its liability exemption.

Online platforms must:

  • set up an internal complaint-handling system to deal with any complaints about decisions to remove or suspend content or a user account
  • comply with decisions of certified out-of-court dispute settlement bodies
  • process notices from 'trusted flaggers' as a priority. Trusted flaggers are entities which have expertise in identifying and notifying illegal content, represent collective interests and are independent from any platform
  • identify and suspend users who repeatedly upload illegal content or submit unfounded complaints about content
  • report suspicions of a possible serious criminal offence to law enforcement or judicial authorities
  • operate 'know your business customer' (KYBC) procedures and make that information available to consumers; the platform's interface must also support provision of legally required pre-contractual and product safety information.

Additional rules for very large online platforms

Very large online platforms must comply with all the obligations which apply to intermediary services, hosting service providers and online platforms but they have a number of additional obligations including:

  • identifying systemic risks, including the risk of their services being used to disseminate illegal content or to negatively affect fundamental rights, or of being manipulated to the detriment of public health, civic discourse, security and the democratic process
  • mitigating identified systemic risks – for example through improved content moderation systems, updated terms and conditions, limiting display of adverts, using trusted flaggers, and cooperation with other online platforms and adherence to codes of conduct
  • submitting to annual independent compliance audits
  • publishing the parameters and influencing factors on any recommender systems used
  • providing their Digital Services Coordinator (DSC) (see below) with electronic access to data to enable the DSC to monitor their compliance and assess systemic risk. Further delegated legislation will address data protection, trade secrecy and security concerns.

Online advertising

All online platforms must disclose in real time and on a per ad, per user basis:

  • that the information is an advert
  • on whose behalf the ad is displayed
  • "meaningful" information about the parameters used to target the ad to the user.

Very large online platforms must, in addition, make publicly available, until at least one year after an advert was last displayed, information about:

  • the content of the advert
  • on whose behalf it was displayed
  • whether it was targeted and, if so, the parameters used for targeting
  • the number of recipients targeted and reached.

In addition to the obligations set out in the DSA, very large platforms will be expected to develop voluntary standards for the interoperability of advert repositories and for the sharing of data between advertising intermediaries, and to develop and adhere to codes of conduct.

Reporting and accountability

All providers will be required to:

  • produce annual compliance reports – the scope of the report varies according to the type of provider, but the reports are detailed and will constitute a considerable administrative burden
  • appoint and notify a single point of contact to act as liaison with regulators
  • set out in their terms and conditions any restrictions on content, together with information about content moderation policies and tools, including algorithmic decision-making and human review.

Online platforms will also need to produce reports every six months detailing their average monthly recipients so it is clear when they need to be designated as very large online platforms.

Very large online platforms will need to appoint compliance officers to cooperate with the DSC and the European Commission, and to organise audits and staff training.

All non-EU providers will be required to appoint a legal representative in a Member State in which they provide services. The representative must have the power and resource to cooperate with EU regulators and can be held liable for the provider's non-compliance.

Regulatory oversight and enforcement

A combination of EU and Member State bodies will support the implementation and enforcement of the DSA:

  • Digital Services Committee – will support the Commission and monitor compliance of very large online platforms.
  • European Board for Digital Services – will have a supervisory and guidance role and will publish reports on systemic risks with very large online platforms and best practice mitigation.
  • Digital Services Coordinators – the lead competent authorities in their jurisdiction, responsible for regulation of providers whose main establishment is in their jurisdiction and related enforcement.
  • Competent authorities – each Member State must have one or more competent authorities to monitor and enforce compliance.

DSCs will have a range of enforcement powers including to:

  • require information about suspected infringements from providers and other traders
  • carry out on-site inspections and seize or take copies of information relating to a suspected infringement
  • require explanations from individual employees of a provider suspected of infringement
  • accept compliance commitments and make them binding
  • make cease and desist orders and impose remedies to end infringement
  • adopt interim measures to avoid the risk of serious harm
  • on suspicion of a serious criminal offence involving threat to life or personal safety, request the competent judicial authority to temporarily restrict access to the service for a period of up to four weeks
  • impose fines on the following bases:
    • up to 6% of annual global income or turnover (whichever is higher) for general non-compliance
    • up to 1% of annual global income or turnover (whichever is higher) for failure to cooperate with investigations and with the DSC
    • up to 5% of average daily global income or turnover (whichever is higher) for continuing non-compliance.

Find out more

For more information, listen to our webinar which also covers the Digital Markets Act and the creation of the Digital Markets Unit in the UK. We will also be publishing more about digital policy in the UK and EU on Download. If you would like to receive updates on these and other issues affecting digital services, sign up here.

In this series

Technology, media & communications – Radar, January 2021:

  • UK government sets out final approach to regulating online harms – in-depth analysis by Debbie Heywood, Alex Walton
  • EC draft Digital Services Act – in-depth analysis by Debbie Heywood
  • EC draft Digital Markets Act – in-depth analysis by Debbie Heywood
