9 March 2020

Data protection by design and default – 1 of 6 Insights

Beware 'dark patterns' – data protection regulators are watching

Lucie Audibert looks at the increasing regulatory scrutiny of dark patterns and nudge techniques in light of the GDPR's data protection by design and default (DPDD) requirement.


You may not have heard of dark patterns, but you've definitely seen them on websites – or even used them on your own website, whether knowingly or not. Much like 'nudges' or 'sticky techniques', they are interfaces designed to intentionally steer users of digital services towards certain choices. They can be harmless, but they are also criticised for manipulating consumers into subscribing to things they don't want, or into giving up personal data when they need not have done so.

In the realm of privacy, EU regulators are starting to use their GDPR powers to sanction the use of dark patterns that go against the principles of privacy by design and default. What may once have been considered little more than aggressive marketing is turning into an unlawful practice.

Digital interface designers need to be aware that building dark patterns into websites and apps in order to collect more personal data is likely to be problematic under the GDPR and may fall foul of its transparency, data minimisation, and data protection by design and default (DPDD) requirements.

What are dark patterns?

The term "dark patterns" was coined by UX designer Harry Brignull, a few years after being pickpocketed by a friendly stranger he was dancing with at a nightclub. Yes, you read that right. After a few internet searches he discovered that he had fallen prey to a widely known scam, the "drunk dancer technique" – and he found that putting a name to the scam empowered him to not fall victim to it again. He decided to put a name to another sort of deceptive techniques, those used on websites and apps to "make you do things that you didn't mean to, like buying or signing up for something."

Dark patterns are widely used across digital platforms and services – to see just how common they are, have a look at the Dark Patterns Twitter feed, which records instances of dark pattern usage "to spread awareness and to shame companies that use them". In fact, they have been used pretty much since digital user interfaces first existed.

Use of dark patterns is regularly reported in various contexts – in e-commerce, most recently through a study by a Princeton research group that crawled a sample of 11,000 websites to reveal the dark patterns influencing shopping behaviours. These include:

  • Bundling up products that consumers didn't choose to bundle up (for example, sneaking insurance into customers' baskets and making it hard to remove).
  • Digital subscription services which automatically charge the user at the end of their free trial.
  • Pushing software updates as "necessary" without providing a cancellation option.

There are all sorts of problems with these practices in terms of consumer protection rules, but in the realm of interfaces offering (or not offering) privacy choices, the stakes are arguably higher – we're no longer talking about enticing customers into buying things they don't need, but about pushing them to make choices that will erode their privacy online.

Dark patterns and privacy

Dark patterns used in privacy options interfaces are increasingly criticised as non-transparent practices that go against the GDPR requirement for DPDD (and other principles). A 2018 report by the Norwegian Consumer Council, Deceived by Design, helped focus regulator attention on practices used by tech companies "to discourage us from exercising our rights to privacy". The report classified dark patterns into five categories:

  • Default settings – setting privacy-intrusive default choices and hiding or obscuring preselected defaults (the good old 'opt-in vs opt-out' debate).
  • Ease – making the choice of the privacy-preserving option more cumbersome (a typical example, faced by almost everyone almost every day, is having to click countless toggles to disable all non-essential cookies on a website – the sketch after this list makes the asymmetry concrete).
  • Framing – focusing wording on the positive aspects of one choice while glossing over any potentially negative aspects to entice users to select this choice (eg claiming tracking will lead to "improved services" and omitting to mention any negative consequences).
  • Rewards and punishment – rewarding the choice that the service provider prefers with extra functionality or a better service, and punishing the other choice (eg with an ultimatum such as deleting your account). This is also known as "confirmshaming" (another term coined by Harry Brignull): "The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance."
  • Forced action and timing – forcing users to choose between actions on the spot, for example, by showing a popup that needs to be engaged with before accessing the service without a clear option to postpone the process.
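To make the "Ease" category concrete, here is a minimal sketch (in TypeScript) of the interaction cost of each path through a typical cookie banner. The banner model and the purpose and vendor counts are invented for illustration:

```typescript
// A minimal sketch of the "Ease" asymmetry, expressed as interaction cost.
// The model and the purpose/vendor counts are hypothetical.
interface ConsentPath {
  label: string;
  clicks: number; // user actions needed to complete this path
}

// Accepting everything is a single click on the prominent button.
const acceptAll: ConsentPath = { label: "Accept all", clicks: 1 };

// Rejecting typically means opening a settings screen, flipping one toggle
// per purpose and per vendor, then confirming the selection.
function rejectNonEssential(purposes: number, vendors: number): ConsentPath {
  return {
    label: "Reject non-essential",
    clicks: 1 /* open settings */ + purposes + vendors + 1 /* confirm */,
  };
}

console.log(acceptAll.clicks);                  // 1
console.log(rejectNonEssential(5, 40).clicks);  // 47 clicks to preserve privacy
```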

In April 2019, LINC, the CNIL's think tank, produced a detailed report, Shaping Choices in the Digital World, which makes similar observations on dark patterns (see our article for more). Its categorisation differs from the Norwegian report's, but it covers essentially the same range of practices and looks at them not only from a privacy perspective but also in terms of psychology and behaviour.

The report concludes that design is vital in preventing potentially harmful practices, important in helping to "positively support users in understanding the mechanics of digital services", and key to obtaining genuine consent. This is framed not only in the context of the GDPR, but also of maintaining a fair commercial and competitive environment.

After the GDPR and the most recent updates to the PECR came into force in 2018, the requirement to obtain consent to the use of tracking cookies made cookie banners and privacy notices so ubiquitous that users have become almost immune to them. Many don't read them anymore and, more worryingly, many tend to think that their only way to access the site they're looking for is to agree to everything.

The user's priority, more often than not, is to get rid of privacy notices by accepting the defaults presented, rather than taking time to make an informed choice. Exploiting that vulnerability is easy with the use of dark patterns, for example, by making the privacy-eroding option more prominent or appealing than the privacy-preserving option. We've all seen some variations of this notice:
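Something along these lines, sketched here as plain DOM code – the wording and styling are invented, but each element maps onto one of the categories above:

```typescript
// A hypothetical cookie notice. Every string and style is invented for
// illustration; the comments map each element to a dark pattern category.
const banner = document.createElement("div");
banner.innerHTML = `
  <!-- Framing: only the upside of consenting is mentioned -->
  <p>We use cookies to improve your experience and show you more relevant content.</p>

  <!-- Default settings: the privacy-intrusive option is pre-ticked -->
  <label><input type="checkbox" checked> Personalised ads (recommended)</label>

  <!-- Ease: one big, bright button to agree... -->
  <button style="background:#2e7d32;color:#fff;font-size:1.2em">Accept all</button>

  <!-- ...and a muted link that leads to pages of per-vendor toggles -->
  <a href="/cookie-settings" style="color:#999;font-size:0.8em">Manage options</a>

  <!-- Rewards and punishment / confirmshaming: declining is framed as a loss -->
  <a href="/decline" style="color:#999">No thanks, I don't want a better experience</a>
`;

// Forced action and timing: a full-screen overlay blocks the page until the
// user engages, with no option to postpone the choice.
banner.style.cssText = "position:fixed;inset:0;background:rgba(0,0,0,.6)";
document.body.append(banner);
```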

This brief notice packs in pretty much all the categories of dark pattern described in the Norwegian Consumer Council's report. There has always been a fine line between acceptable (if bullish) marketing practices and unlawful ones, but this type of notice is becoming increasingly unacceptable given the GDPR's requirements.

Regulators crack down on dark patterns

Dark patterns are no longer just an academic or consumer activist's concept – the term is now used in official institutional discourse to describe unfair practices. In April 2019, Giovanni Buttarelli, the late European Data Protection Supervisor, delivered a speech entitled Dark patterns in data protection: law, nudging, design and the role of technology. Without mincing his words, he described dark patterns as "a way for companies to circumvent [data protection] principles by ruthlessly nudging consumers to disregard their privacy and to provide more data than necessary." Highlighting the grey area between legitimate marketing and manipulation, he said "there is only a small gap between nudging and recklessly taking advantage of natural human traits."

The GDPR empowered regulators across the EU to impose substantial fines for breaches. The CNIL (the French Data Protection Authority) fined Google EUR 57m (subject to appeal) for lack of transparency, inadequate information and lack of valid consent in its ads personalisation tool. It dissected Google's Privacy Policy and Terms of Service, and the whole interface around them, and found, among other issues, breaches of Article 6 for failing to obtain valid consent to personalised advertising processing. In particular, it found that, due to the design of Google's interface:

  • Users weren't able to identify all the services, websites and apps that processed their personal data, nor the purposes of the processing, preventing them from forming "a proper perception of the nature and volume of data collected" – so consent wasn't sufficiently informed.
  • Users weren't able to reject processing activities without clicking on "More options" (instead of "Accept all"), so the processing activities were authorised, and masked, "by default" – meaning consent wasn't specific.

This meticulous dissection of Google's privacy interface by the CNIL dealt a major blow to the use of dark patterns, helping to draw the line between suspect-but-lawful practices and unlawful ones.

User interface designers should also bear in mind that the ICO's recently published Age Appropriate Design Code clearly prohibits the use of "nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections." This reflects the GDPR's high threshold for obtaining children's consent. See more on the Age Appropriate Design Code here.

What to do about it?

What all this means is that designers of digital services should review and, if necessary, revise their user interfaces to rid them of dark patterns. The European Data Protection Board published extensive draft guidelines in November 2019 to help companies implement the principles of DPDD, which should help move designers away from the use of dark patterns.
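By way of contrast, a version of the earlier notice designed along DPDD lines might look something like the sketch below – again an illustration built on assumed wording and an assumed purpose list, not a template taken from the draft guidelines:

```typescript
// A sketch of the same notice redesigned along DPDD lines: symmetric choices,
// nothing pre-ticked, granular purposes, no penalty for declining and no
// blocking overlay. Wording and the purpose list are illustrative assumptions.
const purposes = ["Analytics", "Personalised ads", "Social media embeds"];

const notice = document.createElement("div");
notice.innerHTML = `
  <p>We'd like to set optional cookies. Strictly necessary cookies are always
     on; everything below stays off unless you switch it on.</p>
  ${purposes
    .map(p => `<label><input type="checkbox"> ${p}</label>`) // unticked by default
    .join("")}
  <!-- "Accept selected" and "Reject all" carry equal visual weight -->
  <button>Accept selected</button>
  <button>Reject all</button>
  <a href="/cookie-policy">Learn more about each purpose</a>
`;
document.body.append(notice); // shown inline, not as a page-blocking overlay
```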

It is also time to give serious thought to Legal Design, an innovative way to produce legal documents "that people might actually read voluntarily, if not always enthusiastically", as our very own Jo Joyce and Tamara Mackay-Temesy put it in an article predicting the rise of Legal Design in 2020. The aim is for cookie banners, notices and other privacy tools to serve their real purpose: informing users about their rights and giving them the means to actually exercise them, rather than ticking a compliance box and relegating the boring privacy stuff to hidden corners of websites. Contact us for help with designing bright and clear patterns!
