Dark Patterns in UX and how to spot them


Early in March 2021, California banned the use of ‘Dark Patterns’ in relation to the sale of consumers’ personal data in an amendment to the California Consumer Privacy Act (CCPA). This change marked a landmark shift in efforts to protect consumers from practices that aim to ‘trick or confuse’ users into opting into the sale.

What are Dark Patterns?

While the amendment to the CCPA is specifically aimed at protecting the right to opt out (or not opt in) to the sale of personal data (Privacy Zuckering), there are many other practices that fall into the same category.

If you’ve ever had to navigate a convoluted series of links to find your account details and the option to delete or unsubscribe (Roach Motel), been presented with shaming language when you attempt to do so (Confirmshaming), or had a free trial silently convert into a paid subscription at the end of the trial period without warning (Forced Continuity), then you’ve encountered ‘dark patterns’. The term was coined by UX specialist Harry Brignull, who identified and catalogued these ‘patterns’ in 2010.

Dark Patterns in UX: Unsubscribe

How are Dark Patterns used?

While I can safely say I know no one in the industry who would engage in the use of true dark patterns, there are clearly people out there who do. You only need to look at the Hall of Shame to see that some of the biggest companies in the world are engaged in these practices, and sadly that will act as a signpost for aspiring businesses to follow suit.

Most often these patterns seem to be employed by businesses solely focused on driving up key metrics such as email sign-ups, subscription uptake, and retention, with some companies centring their entire business model around preventing you from switching off from their service, aka the ‘Roach Motel’.

What's truly terrifying is that somewhere down the line these misdirections, confusions, and tricks will have been A/B tested and found to deliver the desired results. I’m of the opinion, however, that this is most likely to happen within internal UX and CRO departments, which lack the degree of separation that external agencies have and so find it harder to push back and say the practices are ethically and morally wrong.

Ultimately, the apparent gains from such practices are generally short-lived, which either results in the business taking massive hits later on or, in the case of giants like Amazon, Ryanair, and Apple to name just a few, using progressively sneakier tactics and layers of complexity to maintain ‘growth’.

Dark Patterns in UX: Countdown Timer

The dangers of misrepresenting dark patterns

Despite Brignull’s definitions, many still see shades of grey when it comes to UX and CRO practices, and this is where the lines start to blur: people across the digital spectrum tend to tag anything they don’t like as a ‘dark pattern’. These labels later get attributed to Brignull as official ‘dark patterns’ which, in the light of the landmark CCPA regulation, could spell the start of a difficult time for the digital industry if clear boundaries are not drawn and widely agreed upon.

For me the definition is very clear: a true dark pattern is designed to manipulate and trick you into doing something you wouldn't otherwise do. How this differs from ethical UX and CRO practice boils down to the intent and truth behind the practice employed at any given time.

One of the common misrepresentations I see, lost in the muddy water of true ‘dark patterns’ vs. ‘things people don’t like’, is the claim that countdown timers on sites are inherently bad because they pressure users into buying things they don’t need or making purchases they wouldn't otherwise make. On the surface, this argument seems to fall in line with our definition.

The argument holds if, for example, the timer misrepresents a limited-time offer and the same item remains available at the same price after the offer expires. In that case, the intent is to mislead and trick the user, so yes, this is a dark pattern.

If the timer represents a genuine offer and clearly signals to the user that the offer expires at a specific time, then the timer is merely an accurate and concise way of indicating this. The intent is honest and the offer is true.
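The distinction can even be sketched in code. A minimal, hypothetical example (the function and variable names here are mine, not from any real offer system): in an honest implementation, the same expiry check that drives the countdown display also gates the discounted price, so the offer genuinely ends when the timer does.

```javascript
// Hypothetical sketch of an "honest" countdown offer.
// Names (secondsRemaining, currentPrice) are illustrative only.

// Whole seconds left until the offer expires; never negative.
function secondsRemaining(expiryMs, nowMs) {
  return Math.max(0, Math.floor((expiryMs - nowMs) / 1000));
}

// The honest part: the same expiry check that drives the timer
// also decides the price, so the discount really does end on time.
function currentPrice(basePrice, salePrice, expiryMs, nowMs) {
  return nowMs < expiryMs ? salePrice : basePrice;
}
```

A dark-pattern variant would keep showing the timer but return the sale price regardless of the current time; the visible UI could be identical in both cases, which is exactly why the mechanism alone doesn't tell you whether something is a dark pattern.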

My point here is that the intent behind an action or mechanism is what makes it a dark pattern, not necessarily the mechanism itself. This is an important distinction for any wider legislation, should it come to that.

Where do we stand on Dark Patterns and is the future of UX and CRO doomed?

Overall we have a clear opinion on true dark patterns: they shouldn't be used, period. They’re cheap tactics and you’ll trash your reputation and wear out your users' patience. Any marketeer, UX designer, or CRO specialist worth their salt will employ ethical solutions based on a sound strategy that respects the user base. 

Going forward, I hope we won’t need legislation the world over to protect users (a group that includes all of us who work in the industry); emerging conversations and greater awareness can only be a good thing in helping users identify and avoid dark patterns where possible. However, I do see the advantage of legislation preventing companies from using specific dark patterns, and it’ll be interesting to see how things progress in California.

We legislate to protect people against other forms of scams and tricks, and I see this as falling very much in the same vein. But, as mentioned earlier, we need to be careful not to stamp any onsite practice as a dark pattern simply because we don't personally agree with or like it. Any such legislation has to take into account the intention behind the mechanism.

As to the future of UX and particularly CRO, it's certainly not doom and gloom. For the vast majority of us that operate ethically, there’ll be no issues in the short term, but there is certainly no immediate answer to the practice of dark patterns. It might be legislation, it might be overarching pressure from the industry to stop the practice, or it might even be an ever-increasing awareness campaign like that of Brignull’s ‘Hall of Shame’ that finally brings an end to these nefarious tactics.

Only time will tell but one thing is for sure: if you work in the industry in UX and CRO then this affects you, and you need to be involved in the conversation.