You may not have heard of dark patterns, but you've definitely seen them on websites – and may even have used them on your own, whether knowingly or not. Much like 'nudges' or 'sticky techniques', they are interfaces designed to steer users of digital services into making certain choices. They can be harmless, but they are also criticised for manipulating consumers into subscribing to things they don't want, or into giving up personal data when they didn't need to.
In the realm of privacy, EU regulators are starting to use their GDPR powers to sanction dark patterns which go against the principles of privacy by design and default. What may once have been considered little more than aggressive marketing is turning into an unlawful practice.
Digital interface designers need to be aware that building dark patterns into websites and apps in order to collect more personal data is likely to be problematic under the GDPR, and may fall foul of its transparency, data minimisation, and data protection by design and default (DPDD) requirements.
The term "dark patterns" was coined by UX designer Harry Brignull, a few years after being pickpocketed by a friendly stranger he was dancing with at a nightclub. Yes, you read that right. After a few internet searches he discovered that he had fallen prey to a widely known scam, the "drunk dancer technique" – and he found that putting a name to the scam empowered him not to fall victim to it again. He decided to put a name to another sort of deceptive technique: the kind used on websites and apps to "make you do things that you didn't mean to, like buying or signing up for something."
Dark patterns are widely used across digital platforms and services – to see just how common they are, have a look at the Dark Patterns Twitter feed, which records instances of dark pattern usage "to spread awareness and to shame companies that use them". In fact, they have been used pretty much since digital user interfaces first existed.
Use of dark patterns is regularly reported in various contexts. In e-commerce, a recent study by a Princeton research group crawled a sample of 11,000 websites to reveal dark patterns influencing shopping behaviours. These include:
There are all sorts of problems with these practices in terms of consumer protection rules, but in the realm of interfaces offering (or not offering) privacy choices, the stakes are arguably higher – we're no longer talking about enticing customers into buying things they don't need, but pushing them to make choices that will erode their privacy online.
Dark patterns used in privacy options interfaces are increasingly criticised as non-transparent practices that go against the GDPR requirement for DPDD (and other principles). A 2018 report by the Norwegian Consumer Council, Deceived by Design, helped focus regulator attention on practices used by tech companies "to discourage us from exercising our rights to privacy". The report classified dark patterns into five categories:
In April 2019, LINC, the CNIL's think tank, produced a detailed report, Shaping Choices in the Digital World, which makes similar observations on dark patterns (see our article for more). It categorises the practices differently from the Norwegian report, but essentially covers the same range, and looks at them not only from the privacy perspective but also in terms of psychology and behaviour.
The report concludes that design is vital in preventing potentially harmful practices, is important to "positively support users in understanding the mechanics of digital services", and is key to obtaining genuine consent. This is framed not only in the context of the GDPR, but also in terms of maintaining a fair commercial and competitive environment.
After the GDPR came into force in 2018, raising the standard of consent required under the PECR for the use of tracking cookies, cookie banners and privacy notices became so ubiquitous that users have become almost immune to them. Many no longer read them but, more worryingly, many assume that their only option for accessing the site they want is to agree to everything.
The user's priority, more often than not, is to get rid of privacy notices by accepting the defaults presented, rather than taking the time to make an informed choice. Dark patterns make that vulnerability easy to exploit – for example, by making the privacy-eroding option more prominent or appealing than the privacy-preserving one. We've all seen variations of this kind of notice, which can pack in pretty much all the categories of dark patterns described in the Norwegian Consumer Council's report. There has always been a fine line between acceptable (if bullish) marketing practices and unlawful ones, but this type of notice is becoming increasingly unacceptable given the GDPR's requirements.
Dark patterns are no longer just an academic or consumer activist's concept – the term is now used in official institutional discourse to describe unfair practices. In April 2019, Giovanni Buttarelli, the late European Data Protection Supervisor, delivered a speech entitled Dark patterns in data protection: law, nudging, design and the role of technology. Without mincing his words, he described dark patterns as "a way for companies to circumvent these principles by ruthlessly nudging consumers to disregard their privacy and to provide more data than necessary." Highlighting the grey area between legitimate marketing and manipulation, he said "there is only a small gap between nudging and recklessly taking advantage of natural human traits."
The CNIL's January 2019 enforcement decision against Google – a meticulous dissection of Google's privacy interface – dealt a major blow to the use of dark patterns, further distinguishing suspect but lawful practices from unlawful ones.
User interface designers should also bear in mind that the ICO's recently published Age Appropriate Design Code clearly prohibits the use of "nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections." This reflects the GDPR's high threshold for obtaining children's consent. See more on the Age Appropriate Design Code here.
What all this means is that designers of digital services should review and, if necessary, revise their user interfaces to rid them of dark patterns. The European Data Protection Board published extensive draft guidelines in November 2019 to help companies implement the principles of DPDD, which should help move designers away from the use of dark patterns.
It is also time to give serious thought to Legal Design, an innovative way to produce legal documents "that people might actually read voluntarily, if not always enthusiastically", as our very own Jo Joyce and Tamara Mackay-Temesy put it in an article predicting the rise of Legal Design in 2020. The aim is for cookie banners, notices and other privacy tools to serve their real purpose: informing users about their rights and giving them the means to actually exercise them, rather than ticking a compliance box and relegating the boring privacy stuff to hidden corners of websites. Contact us for help with designing bright and clear patterns!