The Digital Services Act (DSA) entered into force on 16 November 2022 and will have general application from 17 February 2024. For Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) designated by the Commission under Article 33(4) DSA, the DSA already applies four months after notification of the designation. In addition, some provisions have been applicable since 16 November 2022 (Article 93(2) DSA). Delegated acts of the EU Commission, which are intended to make the DSA more manageable, are not yet available and are not expected until the end of 2023 at the earliest. Administrative practice is also not yet in place, and the first court decisions will not come until after the date of general application in February 2024. A number of unresolved questions are already emerging.
Scope of application for certain providers
To whom is the DSA applicable?
Given the comprehensive obligations imposed by the DSA, all potential addressees should clarify at an early stage whether the DSA applies to their own service. The challenge is twofold: the DSA has a potentially very broad scope of application covering a wide variety of offers and providers, and it can also cover only individual parts of an offer. The implications are therefore sometimes far-reaching and require extensive preparation, while at the same time legal practitioners still have hardly any interpretation aids to fall back on.
Intermediary services
The starting point for determining the personal scope of application is Article 2(1) and (2) DSA, according to which the application of the DSA always requires an “intermediary service”. Intermediary services are in turn defined in Article 3 lit. g) (i) to (iii) DSA as “mere conduit” (i), “caching” (ii) or “hosting” (iii) services. In interpreting these terms, it is possible for the time being to fall back on the demarcation between access, caching and host providers established in the E-Commerce Directive (Articles 12-14).
Not applicable to content providers
Accordingly, the DSA does not apply to providers of their own content, such as providers of video or audio content or video games. In addition to the clear wording, this also follows from Recital 18, which stipulates that the exemptions from liability (i.e. Articles 4 et seq. DSA) do not apply to content that is created by the providers themselves or under their editorial responsibility. In such cases, the provider is, as a content provider, regularly responsible for its own content anyway.
Hybrid offers - hybrid application - secondary functions
There is not always a clear “black or white”: a service or part of a service may fall under the DSA, while other offers by the same provider, or other parts of the service, may not. This follows from Recital 15. Conversely, the provisions of the DSA then also apply only insofar as the respective parts fall within its scope.
However, Article 3 lit. (i) DSA provides for an exception for online platforms. If hosting, i.e. the storage of information on behalf of a user, is only a minor and purely “ancillary function” of another service or a minor functionality of the “principal service”, then under certain further conditions the service does not, on that basis alone, qualify as an online platform. Whether this exception applies only to online platforms, or whether a general distinction between principal and minor ancillary functions can be made within the framework of the DSA, must currently be considered open.
The relationship between the DSA and other EU laws
Large number of regulatory areas affected
Since the DSA touches on a wide spectrum of regulatory areas and applies - very broadly - to the most diverse types of unlawful content, the question arises as to its demarcation from other laws. First of all, Article 2(3) DSA states that the DSA does not affect the application of the E-Commerce Directive. However, this is only partially correct: according to Article 89(1) DSA, the liability provisions of the E-Commerce Directive are repealed and de facto replaced by Articles 4 et seq. DSA. Accordingly, references in the E-Commerce Directive to its Articles 12-15 are deemed to be references to Articles 4, 5, 6 and 8 DSA (Article 89(2) DSA). In German law, the liability provisions of Sections 7-10 of the Telemedia Act (TMG) are likely to be repealed accordingly.
Demarcation
According to Article 2(4) DSA, the DSA is “without prejudice” to EU legal acts regulating “other aspects” of the provision of intermediary services in the internal market or “specifying and complementing” the DSA. Specifically mentioned are EU rules on copyright, consumer protection and product safety, and data protection law, in particular the EU General Data Protection Regulation (GDPR) and the e-Privacy Directive. The recitals also mention the Audiovisual Media Services Directive (AVMSD), the Platform-to-Business Regulation, the Regulation on addressing the dissemination of terrorist content online, the Unfair Commercial Practices Directive, the Consumer Rights Directive, the Unfair Contract Terms Directive, and, as regards copyright, the InfoSoc Directive, the Enforcement Directive and the DSM Copyright Directive, as well as the European rules of private international law and international civil procedure. Such rules will then take precedence over the provisions of the DSA - probably in the sense of a “lex specialis”.
However, the DSA does not say when such “other aspects” are regulated, i.e. when the DSA is specified or complemented by other laws. The demarcation is of no small consequence, because the sanctions of the respective EU laws also differ. Consider, as an example, the “online content-sharing service providers” (OCSSPs) under the DSM Copyright Directive. These are usually hosting providers within the meaning of Article 6 DSA; nevertheless, under Article 17 of the DSM Copyright Directive, they perform their own act of making available to the public and can “exempt” themselves from this liability under the conditions of the DSM Copyright Directive. This raises the question of whether (and if so, to what extent) the liability privilege under Article 6 DSA can still apply to the cases regulated in Article 17 DSM Copyright Directive; outside its scope, on the other hand, there may be room for the DSA. This example alone shows that a precise demarcation must be made in each individual case. The more precise the rules of the special laws for certain areas, the more likely they are to take precedence over the DSA.
Existing national laws
At the same time, certain national laws will be superseded by the DSA. It is assumed - in addition to the already mentioned provisions of the German TMG - that the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz – NetzDG), which also regulates platforms' handling of illegal content and overlaps with the DSA in important respects, will have to be repealed. According to reports, the German government plans to present a corresponding law in the first half of 2023.
Ban on “Dark Patterns”
The DSA only appears to be breaking new legal ground with the prohibition of so-called “dark patterns” in Articles 25 and 31 DSA.
Definition of Dark Patterns
The DSA itself, however, does not contain a definition of “dark patterns”; instead, the Recitals must be consulted. According to Recital 67, “dark patterns” are practices that aim to prevent users from making autonomous and informed choices or decisions. The issue is less the content of (e.g. advertising) statements and primarily the “structure, design or functionalities” of online interfaces (i.e. primarily websites or apps), for example because choices are not presented in a neutral way, in that certain choices are given more prominence through “visual, auditory or other components” - a topic recently discussed above all by data protection supervisory authorities in connection with cookie banners. Also mentioned are the practices of repeatedly asking a user to make a choice they have already made, making it more difficult to cancel a service than to sign up for it, making default settings difficult to change, and misleading users by enticing them into certain transactions. The decisive criterion is that the user's freedom of choice is distorted or impaired (see Article 25(1) DSA).
Legal mix: DSA, Unfair Commercial Practices Directive, GDPR, Consumer Rights Directive
Legally, there is a mixed situation, as “dark patterns” are not addressed by the DSA alone. It will therefore have to be determined whether the DSA or other rules apply with priority. Dark patterns can also be covered by the Unfair Commercial Practices (UCP) Directive and the GDPR. Under the UCP Directive, a breach of the general prohibition of unfair commercial practices (Article 5), of the prohibitions of misleading (Articles 6-7 UCP Directive) or aggressive practices (Articles 8-9 UCP Directive) can be considered in particular, as well as a breach of certain examples on the “black list” of the UCP Directive (Annex I, especially points 5, 6, 7, 18, 19, 20, 26 and 31). Examples include differently visible buttons, trick questions, misleading free samples and subscription traps, or “confirmshaming”. Numerous examples can be found in the Commission's Guidelines on the interpretation and application of the UCP Directive (2021/C 526/01).
Dark patterns may also constitute a breach of the GDPR, in particular of the principles of Article 5 GDPR, the requirement that consent be freely given, specific and informed (Article 4 No. 11, Article 7 GDPR), the transparency requirements, and the principle of data protection by design (Article 25 GDPR). A number of examples can be found in the EDPB's Guidelines 3/2022 on “Dark patterns in social media platform interfaces” of 14 March 2022. It is important to note that the UCP Directive and the GDPR take precedence over the DSA (Article 25(2) DSA).
When it comes to cookie consent management, dark patterns can also prevent effective consent within the meaning of Article 5(3) of the e-Privacy Directive (in Germany: Section 25 of the Act on Data Protection in Telecommunications and Telemedia - TTDSG). Statements on this can be found above all in the opinions of the German data protection supervisory authorities, which describe such practices as inadmissible “nudging” (see DSK, OH Telemedien of 1 December 2021, as of December 2022; LfD Niedersachsen, Handreichung: Datenschutzkonforme Einwilligungen auf Webseiten - Anforderungen an Consent-Layer, as of September 2022).
Default settings in the form of pre-ticked checkboxes that entail any extra payment are also inadmissible under Article 22 of the Consumer Rights Directive.
“Dark patterns" are also mentioned in the European Parliament's resolution of 18 January 2023 on consumer protection in online video games (Consumer protection in online video games: a European single market approach, 2022/2024/(INI)) as one of the areas that may require further legal regulation. Here, the member states are called upon to enforce consumer rights and consider further legislative action.
The question of which rules are relevant and which take priority will therefore be discussed intensively in the future in the case of “dark patterns”.