The EU is in the process of updating its legal framework to create a Europe “fit for the Digital Age”. The Digital Services Act (DSA) is a major pillar of this initiative, sitting alongside the Digital Markets Act. Its primary goal is to ensure that what is illegal offline is also illegal online. Accordingly, rather than imposing prohibitions, the DSA largely focuses on accountability requirements for online service providers, which are held responsible for the content on their digital platforms.
The DSA does, however, introduce new restrictions on online platforms in relation to targeted online advertising and the use of sensitive personal information, as well as a prohibition on dark patterns, and rules on recommender systems and the inner workings of the algorithms they use.
Article 2(n) DSA defines an ‘advertisement’ as:
The definition covers both commercial and non-commercial adverts, including adverts for political parties or NGOs.
The DSA divides online platforms into three groups. The category of ‘micro or small enterprises’ covers enterprises that employ fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million. Enterprises in this category are explicitly excluded from the obligations the DSA imposes on online platforms.
All other online platforms are bound by the DSA, with additional obligations on very large online platforms (VLOPs). An online platform will be considered a VLOP if it provides its service to an average of 45 million or more monthly active recipients in the Union.
The European Parliament lobbied for a full restriction on targeted advertising in the belief that presenting advertisements based on targeting techniques can have serious negative effects by, for example, contributing to disinformation campaigns or discriminating against certain groups.
The EU legislators, however, opted for an obligation on online platforms to be transparent about targeted advertising. Under Article 24 DSA, online platforms are required to be transparent about:
As a result of this last obligation, relevant advertisements will have to be accompanied by the profile characteristics used to direct the advertisement to the recipient. For example, an advertisement for a gym will have to show that the recipient was targeted on the basis of a range of criteria, such as being interested in sports, being between 25 and 35 years old, and living near the advertised gym. While the criteria may be fairly clear-cut in that example, advertisements for less popular or more controversial subjects will equally be obliged to show how the recipient was targeted. This might be because the recipient has liked specific groups on social media, or because of their search engine history. In short, recipients will need to be informed about the particulars relevant to the advertisements they are shown and about their digital label.
The rules on targeted advertising will enable users to understand and evaluate the advertisements they are shown, including, crucially, who paid for them.
The DSA also bans techniques using so-called ‘dark patterns’. Dark patterns are misleading ‘tricks’, typically built into the design of website interfaces and/or mobile applications, that manipulate users into making choices they do not intend or might not otherwise make.
Under Article 23a(1):
The DSA only covers dark patterns which are not already addressed by the Unfair Commercial Practices Directive and the GDPR. These include the non-exhaustive examples mentioned in Recital 51(b):
The way people access and share information online is significantly impacted by recommender systems. Recommender systems are defined by the DSA as:
In short, the recommender system decides which content is visible to the recipient and in which order using parameters set by the platform. These systems can have a significant impact on the way the recipients retrieve information and interact with the system and the online information.
The DSA prescribes that the way recommender systems are used has to be set out in the terms and conditions of the online platform. The main parameters have to be described, as well as the options for recipients to modify or influence those parameters. Further, the ability to modify or select the preferred options must be directly and easily accessible from a specific section of the online platform's interface.
In addition to enabling users to modify or select the parameters of recommender systems, VLOPs also have to provide them with at least one option for each of their recommender systems to suggest or prioritise search results without profiling. Where that option is selected, the system will not use or process personal data to analyse or predict aspects concerning the economic situation, health, personal preferences, interests, reliability, behaviour, location or movements of the recipient.
For many SMEs, the obligations set out in the DSA on advertising, dark patterns and recommender systems will not apply, as they will not qualify as online platforms under the DSA. However, if you reach the thresholds of 50 employees or EUR 10 million in annual turnover or balance sheet total, the DSA will have a very significant impact on your business when it comes into force.
The way targeted advertisements are shown will change, potentially to such an extent that more companies opt for non-targeted advertising. The processes on your platform or website will need to be reviewed to make sure they do not fall within the scope of dark patterns, and, if you use recommender systems, at the very least your terms and conditions or other parts of your platform or website will need to be updated.