A person lost in a maze of dark iron walls.

Image: Dan Asaki via Unsplash

The Dark Side of Design uses more than cookies to mislead you

‘Move fast and break things’ – Facebook’s internal motto until 2014 – describes in a nutshell how short-sighted business goals and fast-track development with skeletal user testing impact us as individuals, as a society, and as a design industry. Dark Patterns are everywhere, and most of the time we don’t even notice them. Still, they take a toll on our physical and mental health and on societal cohesion. So what are Dark Patterns, and how do they actually work?

Have you ever been stressed out while booking a room online? Worried that you might miss out on that perfect holiday because your dream room might be snatched up by the three other strangers looking at the same room at this very moment? Or have you ever felt slightly annoyed that a cookie pop-up window only gave you one very obvious option – to accept all cookies – and seemingly nothing else?

My personal low point was signing up for a free two-week Adobe Cloud trial, which automatically turned into a subscription. I continued to use it for a couple of months until I decided there was no need anymore. Only when I tried to cancel the subscription did I find out that I had automatically entered a one-year contract with Adobe. To cancel it early, I had to pay a penalty fee of over $80. I felt despairing, powerless and positively wrathful. Indeed, I had become the victim of a roach motel.

We encounter Dark Patterns all the time while using the internet and mobile applications, and even in the analogue world – hence the saying “always read the fine print”. Dark Patterns are designs that trick and mislead us into specific actions desirable to the company behind the digital service we’re using, often against our own interests. These design traits successfully exploit our cognitive and behavioural patterns and our psychology: an unpleasant experience. From an ethical perspective? Deeply concerning.


Unethical design is more than just annoying: it causes physical, emotional and societal harm. User wellbeing and trust are sacrificed for short-term business goals. Digital tools designed to be addictive cause sleep deprivation and make us physically much less active. Personally identifiable data can be exposed, and deadly accidents happen through distraction. Online bullying and toxic body images are fertile ground for anxiety and deepening depression, particularly harmful to young people (see: Instagram linked to child suicide). On a societal level, Dark Patterns contribute to political polarisation and misinformation, driving engagement (enragement, really) and click rates exponentially. Conspiracy theories are thriving. Further, digital tools are drivers of exclusion, as many people are not thought of when products are designed by a homogeneous minority. Algorithms with in-built biases reinforce stereotypes and structural oppression, yet they are difficult to correct.

So how do Dark Patterns work?

Dark Patterns are carefully crafted and rely broadly on five strategies to deceive users, as defined by the UXP2 Lab. A digital product may incorporate several of these strategies at once.

Nagging disturbs and redirects user attention. The interruption may be a pop-up that hides the interface or audio that distracts us. We are repeatedly drawn away from our intended task towards what the company wants us to do.

With Obstruction, something we want to do is purposefully made much more difficult, in the hope that we give up along the way: the cost in time, money and nerves ends up higher than the expected benefit. We know these Dark Patterns all too well from trying to adjust cookie settings, unsubscribe from spam or take on the herculean task of truly deleting a Facebook account. Obstructive strategies are so common that a helpful, smooth cancellation process comes as a pleasant and rare surprise.

Hiding, disguising or delaying the disclosure of information important to the user falls under Sneaking. This trick adds extra costs when purchasing a product or service, hidden until the very end and easily overlooked. It can go as far as actually sneaking products into your basket without your knowledge. Another example is how you enter a premium subscription with one click, but the process of cancelling it is made much more complicated. Amazon is currently facing legal challenges for these practices.

Have you ever caught an interface trying to hide the very information you were looking for – say, an unsubscribe button rendered white on a white background? Or felt offended or even guilt-tripped (confirmshaming) into clicking an ad? Then you were a victim of Interface Interference. Users are intentionally confused and manipulated into prioritising what the company behind the interface desires. This can happen through visual deception such as disguised ads that pretend to be legitimate reviews or search results (I’m looking at you, Yelp!).

Sometimes we are suddenly blocked from what we originally intended to do, or from accessing further functions, unless we perform a specific task: a Forced Action. For example, a product is made unusable until we install the newest update. Or a game requires you to log into Facebook and then demands that you invite-spam all your contacts: a classic social pyramid scheme.

Call to action

We have grown used to being manipulated by the digital tools we use. Still, the absurdity of these design patterns becomes insulting when we imagine them in the physical world. How would you feel if an employee at the supermarket tried to sneak random products into your basket? Or if the cashier asked you repeatedly whether you would like to repurchase the five products you bought two months ago? Or kept hold of your credit card details just in case you might want to buy something in the future – for your convenience only, of course. In the virtual realm, these same behaviours are insidious, tricking even aware users.

So what can we do against Dark Patterns? How can we create alternatives? 
Many users are not consciously aware that Dark Patterns exist. Hence, a first powerful step against the practice is to call out the companies who use them. Cognitive scientist Harry Brignull – who coined the term in 2010 – started a library of patterns to name and shame deceptive user interfaces.

An important role, however, lies with designers themselves. They need to be aware of their responsibility not only towards the client, but also towards end users. By creating a community of designers who value ethics, by connecting and raising awareness, we can catalyse change. In February, we held a conference on ethical design: Ethics Matters.
Enthusiasm for the conference was overwhelming, with over 1,000 registrations and more than 500 people attending from all over the world. In a participative session, our newly formed community of ethical designers and developers developed a manifesto of ethical design, which will be published soon.

The movement has only just started. 
You can join the effort and our community via this meet-up group.
Collectively, we can demand that the design industry do better.