
29 January 2021

EU and UK Digital policy

The Digital Services Act – how to start preparing

The DSA may be some way off, but here's a round-up of what you can do now.

  • Briefing

Author

Adam Rendle

Partner

For digital intermediaries with users in the EU, the European Commission's proposed Digital Services Act (DSA) will fundamentally change their responsibilities in relation to illegal content.

While they will continue to benefit from the well-known hosting and other intermediary defences in the eCommerce Directive, that privileged status will come at a price: they will have to act in a more responsible, transparent and diligent way to ensure a safe online environment.

It is likely to take two to three years before the DSA is in force, but services need to think through key questions as they start to prepare for its impact.

The gating question: is my service in scope?

The Regulation applies only to intermediary services such as online platforms, social media services, messaging services, marketplaces and ISPs. Intermediaries are those services which will be familiar from the eCommerce Directive, consisting of mere conduit, caching and hosting services.

The DSA applies even to services which aren't established in the EU – as long as they have a "substantial connection" to the EU. Having a significant number of users in, or activities targeted towards, the EU will be enough to bring a service in scope.

What category does my service fall into?

There are four categories of service:

  • All providers of intermediary services (they are the mere conduit, caching and hosting services from the eCommerce Directive).
  • Providers of hosting services.
  • Providers of online platforms (this is a new concept, being hosting services which "store and disseminate to the public information", unless that activity is a minor and purely ancillary feature of another service; social media services are the obvious example of those platforms).
  • Very large online platforms (this is a subset of online platforms being those with over 45 million monthly active users).

They form a 'pyramid' of services, with providers of intermediary services at the bottom. All the obligations discussed below apply to those services and, as you move up the pyramid, additional obligations are placed on each subset (see here for more).
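
Scope questions like this ultimately need legal analysis, but as a minimal illustrative sketch (in Python, with our own assumed parameter names and simplified tests, not the legal definitions), the categorisation logic described above might be reasoned through like this:

    # Illustrative sketch of the DSA 'pyramid' described above. The category
    # labels follow this article; the parameter names and simplified tests are
    # assumptions, not the legal tests themselves.

    VLOP_THRESHOLD = 45_000_000  # monthly active users

    def dsa_category(is_intermediary: bool,
                     is_hosting: bool,
                     disseminates_to_public: bool,
                     monthly_active_users: int) -> str:
        """Return the highest tier of the DSA pyramid a service falls into."""
        if not is_intermediary:
            return "out of scope"
        if not is_hosting:
            return "intermediary service (mere conduit or caching)"
        if not disseminates_to_public:
            return "hosting service"
        if monthly_active_users > VLOP_THRESHOLD:
            return "very large online platform"
        return "online platform"

    # Example: a social media service with 50 million monthly active users
    print(dsa_category(True, True, True, 50_000_000))  # very large online platform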

What type of content does my service have to be aware of?

Any type of "illegal content", which has a very simple but broad meaning – anything not in compliance with EU or Member State law. Importantly, it therefore includes intellectual property infringement. Harmful, but not illegal, content is not subject to removal obligations.

What proactive steps does my service have to take in relation to illegal content?

All services need to include information on usage restrictions in their terms and conditions – eg about policies, procedures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. All services also have to report each year on content moderation they engaged in, including things like activities to detect, identify and address illegal content or information in breach of the terms and conditions.

Very large online platforms have to carry out annual risk assessments covering, for example, the dissemination of illegal content through their services and negative effects on fundamental rights, and they must take measures to mitigate those risks – eg adapting content moderation or initiating or adjusting cooperation with trusted flaggers. They may also need to adapt recommender systems, limit the display of targeted advertisements or co-operate with other online platforms.

They will have to appoint one or more compliance officers to report to the highest management level and actively monitor compliance, train management and employees, arrange audits and co-operate with enforcement bodies.

Helpfully for intermediaries, the DSA clarifies that, just because a provider carries out voluntary own-initiative investigations aimed at detecting, identifying and removing illegal content, that doesn't mean the service can't rely on the immunities. Having said that, if those investigations give a provider sufficient knowledge of unlawful content and it doesn't take the necessary action in response, it will lose that protection.

What reactive steps does my service have to take in relation to illegal content?

All services will have to respond to takedown orders from Member State authorities and specify any responsive action taken without undue delay. They will also have to respond to disclosure orders from authorities regarding individual users, specifying the effect on them without undue delay.

For hosting services, there is a much more prescriptive notice and action regime for complainants submitting notices of allegedly illegal content. If complainants follow that mechanism and provide the envisaged detail, it could become difficult for providers to maintain their hosting immunity. "Action" is then needed to take down illegal content, but the DSA does not impose 'stay down' obligations.

Hosting service providers also have to report on the number and effectiveness of notices made under their "notice and action mechanism".

For online platforms, certain complainants, for example, representatives of IP rights holders, can become "trusted flaggers" and platforms have to treat their notices with priority and without delay. If they become aware of serious criminal offences involving a threat to life or individual safety, they have to inform law enforcement.

How will my decision-making have to change in relation to illegal content?

All services will have to apply and enforce their usage restrictions in a diligent, objective and proportionate manner with due regard to the rights and legitimate interests of all parties involved, including fundamental rights. That will be a difficult balancing exercise in relation to content which is alleged to be illegal, and it will be more difficult for services to take the 'easy option' of just taking down content in response to a complaint.

There are quite laborious safeguards proposed to avoid lawful content being taken down erroneously, which will mean decision-making has to be more considered and take into account more factors.

Automated take-down will be more difficult. We envisage that considerable work will need to be done on improving internal processes to ensure decision-making is supportable. One reason is that, if a hosting service provider takes content down, it will have to provide the uploader with detailed reasons for the decision, including information on any automated means used and a reference to the legal ground relied on.

As well as providing reasons, online platforms will need an internal complaint-handling system accessible to uploaders who have had their information removed or their access suspended or terminated. Those complaints can then be escalated to an out-of-court dispute settlement process, with which platforms must engage in good faith and by whose decisions they will be bound.

What information will my service have to disclose?

There is considerable and repeated focus on the obligations to disclose the use of algorithmic decision-making, whether in general terms and conditions or in fact-specific correspondence with consumers and traders.

Online platforms have to publish annual reports on the number and effectiveness of complaints made under their internal complaint-handling system, the number, outcome and lead time of any out-of-court dispute settlements and the number of suspensions imposed in relation to manifestly illegal content, manifestly unfounded complaints and manifestly unfounded takedown notices. They also have to report details of automated content moderation, including specifying the purposes it was used for, its accuracy and any safeguards employed.

In addition, online platforms must publish six-monthly reports on the average number of monthly active users in each Member State, to allow the Digital Services Coordinator to determine whether they should be designated a very large online platform.
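
As a simple arithmetic illustration of why that reporting matters (the figures below are assumed, and the EU-wide aggregation is our own simplification of the per-Member State reports), designation turns on whether the average of those monthly figures exceeds the threshold:

    # Illustrative only: assumed monthly active user figures for a six-month
    # reporting period, averaged and compared against the 45 million threshold.
    monthly_active_users = [43_000_000, 44_500_000, 46_000_000,
                            47_200_000, 45_800_000, 46_400_000]
    average_mau = sum(monthly_active_users) / len(monthly_active_users)
    print(round(average_mau))        # about 45,483,333
    print(average_mau > 45_000_000)  # True: a candidate for designation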

Very large online platforms will be required to set out in their terms and conditions the parameters they use for any content recommender systems, together with options for users to modify or influence them, including one option not based on profiling. They should also consider adapting their algorithmic recommender systems as part of the measures they take to mitigate systemic risk in relation to illegal content, freedom of expression and discrimination, and manipulation of services that impact society.

On request from a Digital Services Coordinator, or the Commission, very large online platforms will be required to provide such data as is necessary to monitor compliance with the DSA. The Recitals contemplate that this may include data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems. Not only can these authorities request access for themselves, they can also request such data to be provided to vetted academic researchers.

The DSA does allow providers to ask authorities to amend their requests where giving access to such data would create significant vulnerabilities for the security of their service or for the protection of confidential information, but if they do so they will need to suggest an alternative means of disclosure.

This is likely to be a major problem for affected platforms. Details of algorithmic systems, particularly recommender systems, are highly confidential trade secrets and we expect platforms to push back on such disclosure requirements.

Very large online platforms will be subject to annual audits by independent organisations, leading to an audit report and an audit implementation report, and will have to appoint compliance officers.

These measures represent a huge administrative burden for providers, exposing their internal content moderation to unprecedented levels of scrutiny.

What impact is there on online advertising?

Online platforms will need to disclose, in real time and on an ad-by-ad and user-by-user basis, information identifying an ad as such, on whose behalf it has been displayed, and "meaningful information" about the main parameters used to target the user.

Very large online platforms will have to maintain, for at least one year after an ad was last displayed, a publicly available repository containing, in addition to the above, the content of the ad, the period during which it was displayed, the number of recipients targeted, and the number of recipients reached.
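
Purely by way of illustration, the sketch below shows one possible shape for a single record in such a repository, combining the ad-by-ad disclosures above with the additional repository information. The field names, types and example values are our own assumptions; the DSA specifies categories of information, not a data model.

    # Illustrative only: one possible shape for an ad repository record, based
    # on the categories of information listed in this article. Field names,
    # types and the example values are assumptions.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class AdRepositoryRecord:
        ad_content: str                  # the content of the advertisement
        displayed_on_behalf_of: str      # on whose behalf the ad was displayed
        display_start: date              # period during which the ad was displayed
        display_end: date
        main_targeting_parameters: dict  # "meaningful information" on targeting
        recipients_targeted: int
        recipients_reached: int

    record = AdRepositoryRecord(
        ad_content="Example banner",
        displayed_on_behalf_of="Example Advertiser Ltd",
        display_start=date(2021, 1, 1),
        display_end=date(2021, 1, 31),
        main_targeting_parameters={"age_range": "18-34", "interests": ["sport"]},
        recipients_targeted=1_000_000,
        recipients_reached=850_000,
    )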

These proposals are unprecedented and may lead to an unmanageable level of clutter, particularly if required to be provided on an ad-by-ad level.

How do services have to deal with traders on a marketplace?

Online platforms will have to collect basic 'know your customer' (KYC) information from traders and use reasonable efforts to cross-check this against free online databases. To support consumer rights further, online platforms need to design their interfaces in a way which allows traders to provide such pre-contractual and product safety information as is required under EU and Member State consumer protection laws.

For marketplaces, there is a new limitation on the hosting immunity for liability under consumer protection law. It affects platforms which allow contracts between users and traders where the service makes it look like the relevant product or information is provided by, or under the authority or control of, the online platform itself.

Who will enforce the rules and what will the sanctions be?

Each Member State will have to identify national competent authorities and Digital Services Coordinators, who will be the primary enforcement bodies in the jurisdictions in which providers have their main establishment. Very large online platforms will be supervised by the European Commission.

All services will need to appoint a single point of contact for the relevant authorities or, if established outside the EU, a legal representative in a Member State. Services will be subject to regulation in their country of main establishment. If non-EU services don't appoint an EU-established legal representative, all Member States can have jurisdiction over them.

Individuals will have rights of complaint either to the coordinator in their country of residence or in the place of the service's establishment. Fines of up to 6% of total annual global income or turnover will be available for non-compliance.

What are the next legislative steps?

The Commission's public consultation is open until 3 March 2021. The subsequent negotiations could take more than two years as the three EU institutions consider and amend the proposals under the ordinary legislative process. Two decades of pent-up demand for reform in this area means that voices in support of regulatory intervention are unlikely to quieten.

Find out more

To discuss the issues raised in this article in greater detail, please reach out to a member of our Technology, Media & Communications team.
