The EU is in the process of updating its legal framework to create a Europe “fit for the Digital Age”. The Digital Services Act (DSA) is a major pillar of this initiative, sitting alongside the Digital Markets Act. Its primary goal is to ensure that what is illegal offline is also illegal online. To that end, the DSA largely focuses on accountability requirements for online service providers, which are held responsible for the content on their digital platforms, rather than on outright prohibitions.
The DSA does, however, introduce new restrictions on online platforms in relation to targeted online advertising and the use of sensitive personal information, as well as a prohibition on dark patterns, and rules on recommender systems and the inner workings of the algorithms they use.
What is an advertisement under the DSA?
Article 2(n) DSA defines an "Advertisement" as:
- “information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online platform on its online interface against remuneration specifically for promoting that information.”
The definition covers both commercial and non-commercial adverts, including adverts for political parties or NGOs.
Classification of online platforms
The DSA divides online platforms into three groups. The category of ‘micro or small enterprises’ contains enterprises that employ fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million. This category is explicitly exempted from the additional obligations that the DSA imposes on online platforms.
All other online platforms are bound by the DSA, with additional obligations for very large online platforms (VLOPs). An online platform will be considered a VLOP if it provides its service to an average of 45 million or more monthly active recipients in the Union, roughly 10% of the EU population.
Targeted advertising
The European Parliament lobbied for a full restriction on targeted advertising, in the belief that presenting advertisements based on targeting techniques can have serious negative effects, for example by contributing to disinformation campaigns or by discriminating against certain groups.
The EU legislators, however, opted for an obligation on online platforms to be transparent about targeted advertising. Under Article 24 DSA, online platforms are required to be transparent about:
- the fact that the information displayed is an advertisement
- the natural or legal person on whose behalf the advertisement is displayed, and
- the main parameters used to determine the recipient to whom the advertisement is displayed.
As a result of this last obligation, relevant advertisements will have to be accompanied by the profile characteristics used as the basis for directing the advertisement to the recipient. For example, an advertisement for a gym will have to show the criteria used to target the recipient, such as being interested in sports, being between 25 and 35 years old, and living near the advertised gym. While the criteria may be fairly clear-cut in that example, advertisements for less popular or more controversial subjects will equally have to show how the recipient is targeted, for instance because the recipient has liked specific groups on social media or because of their search engine history. In short, the recipient will need to be informed about the parameters behind the advertisements they are shown and, in effect, about their digital label.
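By way of illustration only, the sketch below models the gym example as a hypothetical transparency label that a platform could attach to each advertisement it displays, covering the three disclosure points listed above; the field names and structure are our own assumptions and are not prescribed by the DSA.

```python
from dataclasses import dataclass, field


@dataclass
class AdTransparencyLabel:
    """Hypothetical per-advertisement disclosure record (illustrative only)."""
    is_advertisement: bool                 # the information displayed is an advertisement
    on_behalf_of: str                      # natural or legal person behind the advertisement
    main_targeting_parameters: dict = field(default_factory=dict)


# The gym example from above, expressed as a transparency label.
gym_ad_label = AdTransparencyLabel(
    is_advertisement=True,
    on_behalf_of="Example Gym Ltd",
    main_targeting_parameters={
        "interest": "sports",
        "age_range": "25-35",
        "location": "near the advertised gym",
    },
)
```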
The rules on targeted advertising will enable users to understand and evaluate the advertisements they are shown, including, crucially, who paid for them.
Dark patterns
The DSA also bans techniques using so-called ‘dark patterns’. Dark patterns are misleading 'tricks', typically built into the design of website interfaces and/or mobile applications, that manipulate users into making choices they do not intend or might not otherwise make.
Under Article 23a(1):
- “Providers of online platforms shall not design, organize or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed decisions.”
Only dark patterns that are not already covered by the Unfair Commercial Practices Directive or the GDPR fall within the scope of the DSA. These include the following (non-exhaustive) examples mentioned in Recital 51(b):
- making use of exploitative design choices to direct the recipient to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests
- presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory or other components when asking the recipient of the service for a decision
- repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, or
- making the procedure of cancelling a service significantly more cumbersome than signing up to it, making certain choices more difficult or time-consuming than others, or making it unreasonably difficult to discontinue purchases or to sign out from a given online platform.
Recommender systems
The way people access and share information online is significantly impacted by recommender systems. Recommender systems are defined by the DSA as:
- “fully or partially automated systems used by online platforms to suggest or prioritise in their online interface specific information to recipients of the service, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed.”
In short, the recommender system decides which content is visible to the recipient and in which order, using parameters set by the platform. These systems can have a significant impact on the way recipients retrieve information and interact with the system and with online information.
The DSA prescribes that the way recommender systems are used has to be set out in the terms and conditions of the online platform. The main parameters have to be described, as well as the options for recipients to modify or influence those parameters. Further, the functionality to modify or select the preferred option also has to be directly and easily accessible from the specific section of the online platform’s interface where the information is prioritised.
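As a purely illustrative sketch of what ‘main parameters’ and user modification could look like in practice, the example below assumes a platform that ranks content using three disclosed weights which the recipient can override; the parameter names, weights and ranking logic are hypothetical and not taken from the DSA.

```python
# Default main parameters a platform might disclose in its terms and conditions.
DEFAULT_PARAMETERS = {
    "recency_weight": 0.5,      # how strongly newer items are favoured
    "engagement_weight": 0.3,   # how strongly predicted engagement is favoured
    "follows_weight": 0.2,      # how strongly content from followed accounts is favoured
}


def rank_items(items, parameters=None):
    """Order items by a weighted score built from the disclosed parameters."""
    params = parameters or DEFAULT_PARAMETERS

    def score(item):
        return (params["recency_weight"] * item["recency"]
                + params["engagement_weight"] * item["predicted_engagement"]
                + params["follows_weight"] * item["from_followed_account"])

    return sorted(items, key=score, reverse=True)


# A recipient-facing control could simply override the defaults, for example to
# favour content from followed accounts over predicted engagement.
user_preference = {**DEFAULT_PARAMETERS, "follows_weight": 0.6, "engagement_weight": 0.1}
```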
VLOPs
In addition to enabling users to modify or select the parameters of recommender systems, VLOPs also have to provide at least one option for each of their recommender systems that suggests or prioritises content without profiling. Where that option is selected, the system will not use or process personal data to analyse or predict aspects concerning the recipient’s economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
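Again purely as an illustration, the sketch below assumes a recommender that falls back to a reverse-chronological ordering, which uses no personal data, when the recipient selects the non-profiling option; the function and field names are hypothetical.

```python
def recommend(items, profile=None, profiling_enabled=True):
    """Return items in the order they should be shown to the recipient."""
    if not profiling_enabled or profile is None:
        # Non-profiling option: order by publication time only, using no personal data.
        return sorted(items, key=lambda item: item["published_at"], reverse=True)
    # Profiled option: personalise by overlap with the recipient's declared interests.
    return sorted(
        items,
        key=lambda item: len(set(item["topics"]) & set(profile["interests"])),
        reverse=True,
    )
```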
What does this mean for your business?
For many SMEs, the obligations set out in the DSA on advertising, dark patterns and recommender systems will not apply, as they will not qualify as online platforms under the DSA. However, if you exceed the thresholds of 50 employees or EUR 10 million of annual turnover, the DSA will have a very significant impact on your business when it comes into force.
The way targeted advertisements are shown will change to such an extent that more companies may opt for non-targeted advertising. The processes on your platform or website will need to be reviewed to make sure they do not fall within the scope of the dark pattern prohibition and, if you use recommender systems, at the very least your terms and conditions, and potentially other parts of your platform or website, will need to be updated.