19 September 2022

Digital Services Act (DSA) - an overview

Duties under the Digital Services Act

Alexander Schmalenberger looks at the main obligations on intermediaries (other than those relating to illegal content).

Author

Alexander Schmalenberger, LL.B.

Knowledge Lawyer

The Digital Services Act (DSA) contains a number of obligations on intermediaries. Obligations regarding illegal content are covered here. This article provides an overview of the most important other duties and looks at their enforcement.

Classification of duties

The DSA sets out over 20 duties, although not all of them affect every service provider and, in our view, they are of varying practical impact. The duties imposed on platforms and search engines can be clustered into three sets. One group of obligations is intended to enable effective supervision by the competent authorities. Another is intended to enable users and rights holders to exercise their rights effectively. Finally, the transparency obligations are intended to allow research into, and assessment of, the impact of platforms and search engines on public safety and order.

Duties that facilitate regulation and supervision

For a digital service provider to be regulated effectively, it must designate a legal representative based in the Union. In doing so, the service provider also determines which national authorities are responsible for enforcing the DSA against it (unless it is a "very large online platform" (VLOP) or "very large online search engine" (VLOSE) with, on average, more than 45 million monthly active users in the Union, in which case the Commission is in principle responsible).

If the service provider fails to designate a legal representative, the authorities of all Member States of the European Union in which the provider offers its services are competent. Service providers should therefore, as we discuss in more detail here, keep track of whether they have users based in the European Union and how many they have. Service providers are, however, also free to designate a representative as a precaution: doing so does not in itself mean that the DSA applies.

Duties that facilitate the exercise of rights

Digital service providers need to designate contact points through which the Commission, other public authorities and users of the service can get in touch with them easily and quickly. It is sufficient for the contact points to be reachable electronically, so they do not have to be located in the Union. However, it must ultimately be possible to reach a human being: modern technology such as chatbots may be used to deal with requests, but only until personal contact is demanded.
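
Purely by way of illustration, the sketch below shows what such an escalation rule might look like in code. The keyword list, function names and messages are all our own invention, not anything prescribed by the DSA; the point is simply that automated handling is a permissible first step, while a route to a human must open as soon as one is demanded.

```python
# Illustrative sketch of the escalation rule: automated handling is
# permissible as a first step, but a human must be reachable on demand.
# All names, keywords and messages here are invented for illustration.

HUMAN_KEYWORDS = ("human", "person", "agent", "staff")

def chatbot_reply(message: str) -> str:
    return "Automated answer to: " + message

def route_to_human(message: str) -> str:
    return "Forwarded to a member of staff: " + message

def handle_request(message: str) -> str:
    """Route a contact-point request to the chatbot or to a human."""
    if any(kw in message.lower() for kw in HUMAN_KEYWORDS):
        return route_to_human(message)  # personal contact was demanded
    return chatbot_reply(message)       # automated handling until then

print(handle_request("What are your terms of use?"))
print(handle_request("Please connect me to a human."))
```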

The contact points for the respective addressees, eg authorities and users, do not have to be identical. Likewise, the contact point does not have to be the legal representative. Merging these functions is an option, however, where there is no doubt that the DSA applies: a contact point located in the Union could already be considered an establishment and thus subject the service to the DSA for that reason alone. Since VLOPs in particular must appoint a compliance function, consideration should be given to integrating that role as well.

The terms of use of the service must be communicated to users in a transparent and comprehensible manner. If minors are the main users of the application, the terms of use must also be comprehensible to them. The terms of use must include all rules governing the use of the application, even if they are not currently set out in text form or in writing. This means that where a practice evolves and becomes established, it may need to be included in the terms of use within the meaning of the DSA. Similarly, any internal guidelines applied to all users may also have to be set out.

Although the law does not contain precise rules on this, it seems obvious that terms of use should be provided in an official language of the Union that is widely understood - eg English - and in the official language of the provider's registered office. Only VLOPs and VLOSEs have to provide their terms of use in machine-readable form and in the official languages of all Member States in which they offer their services. But it would certainly be advisable for smaller platforms to cover at least the languages of the Member States where they offer their services.
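
The DSA does not prescribe what "machine-readable" means in practice. As a purely illustrative sketch, a provider might publish its terms as a structured JSON document with one entry per language version; the schema below (all field names and URLs) is our own assumption, not a prescribed standard.

```python
import json

# Hypothetical machine-readable representation of terms of use.
# The schema is illustrative only; the DSA does not prescribe a format.
terms_of_use = {
    "service": "example-platform.eu",
    "version": "2022-09-19",
    "languages": {
        "en": "https://example-platform.eu/terms/en",
        "de": "https://example-platform.eu/terms/de",
        "fr": "https://example-platform.eu/terms/fr",
    },
    "sections": [
        {"id": "moderation", "title": "Content moderation rules"},
        {"id": "recommender", "title": "Recommender system parameters"},
    ],
}

print(json.dumps(terms_of_use, indent=2))
```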

In the event of disagreements between users and platform providers, users can now complain to the service provider and, if the matter is not resolved, initiate an out-of-court dispute resolution procedure. For this purpose, the provider must have qualified personnel to deal with complaints, and the decisions they take must be reasoned, taking into account the fundamental rights of users. The dispute resolution procedure itself is conducted before a body certified by the Digital Services Coordinator. The costs of the procedure are borne mainly by the platform operator, even where it is successful: the complainant can use the procedure free of charge or for a nominal fee and can demand reimbursement of reasonable expenses. This lack of financial risk for complainants may encourage new offerings by legal service providers, especially to consumers.

Digital service providers are also required to report criminal offences. If a provider becomes aware of facts giving rise to the suspicion that there is a threat to the safety or life of one or more persons, it must file a report. This requirement is more far-reaching than it appears at first glance: according to the explanatory memorandum to the DSA, it includes, for example, depictions of child abuse or incitement to terrorism. Providers must therefore also report conduct that may take place long before or long after any possible act of violence, which may go far beyond the reporting obligations that currently exist in the respective Member State. Where, in addition, the applicable criminal law (in Germany, for example) provides that persons who are obliged to report can in certain circumstances be punished like the perpetrators, this obligation is potentially highly problematic. Although the interpretation and application of the criminal law of the Member States is the responsibility of the local courts and largely beyond the influence of the European Union, some changes to criminal law should be expected. Finally, it is worth adding that in certain circumstances, for example under German law, such violations could also give rise to tort claims by injured parties against the platform.

Online marketplaces must ensure that traders provide correct contact details; traders who do not must be blocked. The platform is also required to make it simple for traders to fulfil these information obligations. In this context, it should be noted that, due to European requirements, the commercial and business registers in the Member States (in Germany, for example, since 1 August 2022) are publicly accessible free of charge. Marketplaces should therefore establish a system under which traders' information is regularly compared with the data from those registers, as sketched below. Furthermore, traders' offers must be regularly checked to establish whether they are selling illegal goods.
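
By way of illustration only, such a comparison system could be as simple as the following sketch. The register is modelled here as a local snapshot keyed by registration number; in practice the data would come from the relevant national register, and all names and identifiers below are invented.

```python
from dataclasses import dataclass

@dataclass
class TraderRecord:
    trader_id: str
    name: str
    registration_number: str

# Hypothetical local snapshot of public register data, keyed by
# registration number. In practice this would be retrieved from the
# relevant national register (eg the German commercial register).
REGISTER_SNAPSHOT = {
    "HRB 12345": {"name": "Example Trading GmbH"},
}

def verify_trader(trader: TraderRecord) -> bool:
    """Check the trader's self-declared details against the register."""
    entry = REGISTER_SNAPSHOT.get(trader.registration_number)
    return entry is not None and entry["name"] == trader.name

def traders_to_suspend(traders: list[TraderRecord]) -> list[str]:
    """Intended to run as a scheduled job: flag unverifiable traders."""
    return [t.trader_id for t in traders if not verify_trader(t)]

traders = [
    TraderRecord("t1", "Example Trading GmbH", "HRB 12345"),  # matches
    TraderRecord("t2", "Phantom Goods Ltd", "HRB 99999"),     # no entry
]
print(traders_to_suspend(traders))  # ['t2']
```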

Duties that facilitate research

Part of the point of the transparency reporting requirements is to provide information to regulators and governments to help inform policy decisions. Digital service providers that operate a moderation system must publish an annual summary of their moderation decisions, and providers of very large online platforms must also report, in particular, on the moderation decisions made in-house. The transparency obligations also extend to explanations of algorithmic decision-making processes, content moderation and other important decisions taken by platforms.
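
The DSA does not dictate how such a summary is compiled. As a purely illustrative sketch, the annual figures could be aggregated from logged moderation decisions; the categories and field names below are our own assumptions, not DSA terminology.

```python
from collections import Counter

# Hypothetical moderation log; categories are illustrative, not DSA terms.
decisions = [
    {"reason": "hate_speech", "action": "removal", "automated": True},
    {"reason": "counterfeit", "action": "removal", "automated": False},
    {"reason": "hate_speech", "action": "warning", "automated": True},
]

def annual_summary(entries: list[dict]) -> dict:
    """Aggregate logged decisions into the figures a report might show."""
    return {
        "total_decisions": len(entries),
        "by_reason": dict(Counter(e["reason"] for e in entries)),
        "by_action": dict(Counter(e["action"] for e in entries)),
        "fully_automated": sum(1 for e in entries if e["automated"]),
    }

print(annual_summary(decisions))
```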

Transparency must also be established in the delivery of advertising and in recommender systems. For example, the basic functioning of a recommender system must be explained transparently; in particular, it must be explained which parameters influence the system and how. VLOPs must also offer a recommender option that does not rely on profiling, ie on the use of personal data, and the profiling of minors is prohibited. See here for more.
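
To make the transparency duty concrete, the sketch below pairs a deliberately simplified recommender with a plain-language disclosure of its main parameters. The parameters and weights are invented for illustration; real systems are far more complex.

```python
# Invented parameters and weights; a real system is far more complex.
WEIGHTS = {
    "topic_match": 0.5,  # overlap with topics the user follows
    "recency": 0.3,      # newer items rank higher
    "popularity": 0.2,   # engagement by other users
}

def score(item: dict) -> float:
    """Rank an item from per-parameter signals in the range 0..1."""
    return sum(weight * item[name] for name, weight in WEIGHTS.items())

def disclosure() -> str:
    """Plain-language summary of the main parameters (wording invented)."""
    lines = [f"- {name} (weight {weight:.0%})" for name, weight in WEIGHTS.items()]
    return "Items are ranked by:\n" + "\n".join(lines)

items = [
    {"topic_match": 0.9, "recency": 0.2, "popularity": 0.5},
    {"topic_match": 0.1, "recency": 0.9, "popularity": 0.8},
]
print(max(items, key=score))  # the item the system would show first
print(disclosure())
```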

Enforcement and legal protection

Enforcement of the DSA obligations is primarily the responsibility of the competent authorities. Except in relation to VLOPs and VLOSEs, which fall under the Commission's jurisdiction, only the authorities of the Member States are competent. Legal protection against measures taken by the Commission must be sought before the courts of the European Union; legal protection against measures taken by Member State authorities is governed by the procedural law of the relevant Member State.

Users of the services can also turn to the competent courts under international civil procedure law; for consumers, this will usually be a court in their own Member State. It is also conceivable that competitors, eg competing platforms, could bring claims against one another for violations of the DSA under national unfair competition laws. This presupposes, however, that the DSA establishes market conduct rules, which is not a given.

A range of duties

While the main object of the DSA is arguably to deal with illegal online content in a harmonised manner, it is also about regulating big tech more widely. As a result, the duties under the DSA are extensive and varied, and businesses will need to understand which of them apply and how to comply.
