A Detailed Guide to Deceptive Patterns (or Dark Patterns)

Adyasha Sahoo · Law

Dark patterns, or deceptive design patterns, are tricks that websites and apps use to push consumers into taking actions they did not intend to take. Harry Brignull coined the term in 2010. For example, whenever WhatsApp updates its policies, it gives users only an “Agree and Continue” option; there is no way to express disagreement. Since users rely heavily on such apps, they readily fall for these tricks: the service providers save money by imposing a standard agreement, while the consumers lose out. Brignull also registered the website deceptive.design (formerly darkpatterns.org), an online library that introduces the concept of dark patterns to the general public.

Understanding Dark Patterns

Brignull introduced the concept of dark patterns in a 2010 blog post on 90percentofeverything.com. Deceptive design relies on a deep understanding of human psychology to trick users into actions they would not otherwise have taken. In a talk, Brignull discussed how the design community is losing good designers to the dark side, and distinguished between the ethical and unethical sides of persuasive design.

An interesting example is a donation website that shows a pre-filled donation amount. When users open the website, they see that an amount is already selected and may assume it is what most people donate, which makes them less likely to enter an amount manually. Similarly, e-commerce platforms suggest “must-buy” accessories when a user purchases a mobile phone, nudging the user into thinking these products are necessary.

Such tricks catch two types of consumers: those who never notice the mischief, and those who notice it after the purchase but cannot be bothered to put in the extra effort of returning the product. Developers and designers profit from this lack of awareness, laziness, and negligence. The following sections discuss the different dark patterns Brignull has catalogued over the years.

Types of Dark Patterns

1. Comparison Prevention

This dark pattern prevents users from comparing the price or quality of products. By making a comparative analysis difficult, platforms leave users susceptible to cognitive biases such as social proof or herd mentality. In layman’s terms, users end up buying products because a celebrity has promoted them or because of public reviews.

At times, the platform displays features and prices in such a complex manner that essential information is not visible to users. For example, when buying combo products on an e-commerce platform, there is a good chance you will not see a price break-up.

2. Confirmshaming

This pattern triggers uncomfortable emotions to influence users’ decision-making. Websites typically do this by using derogatory or guilt-inducing language in opt-out labels, targeting users’ emotions so that they give in to the desired action.

In 2018, a US-based e-commerce website selling first-aid packs and medical supplies adopted this dark pattern. While seeking permission from users to send notifications, its opt-out labels read “No, I do not want to stay alive” and “No, I prefer to bleed to death”.

3. Disguised Ads

Disguised ads blur the line between genuine content and advertising. Users believe they are clicking on native content when it is actually an advertisement in disguise. These ads are often styled like articles or matched to users’ interests, making clicks more likely, and the websites generate more advertising revenue as a result.

A real-world example is searching for a product online and then seeing sponsored ads for it while browsing the feed on our social media handles.

4. Fake Scarcity

This pattern creates a fear of missing out (FOMO) by manufacturing an artificial sense of limited availability for a product. Misleading messages suggest the product is in high demand, pushing users into buying it.

You will often see such misleading notifications by an e-commerce website or app for the products saved in your cart.
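The mechanism can be illustrated with a small sketch. The function and the cap of three are hypothetical, but they show the essence of the pattern: the “items left” figure shown to the user is decoupled from the real inventory.

```typescript
// Hypothetical sketch of a fake-scarcity banner: the displayed "items left"
// figure is clamped to a small number regardless of the real inventory count.
function scarcityBanner(actualStock: number): string {
  // An honest design would show actualStock (or no counter at all);
  // the deceptive version caps the figure to manufacture urgency.
  const displayed = Math.min(actualStock, 3);
  return `Hurry! Only ${displayed} left in stock`;
}

// The warehouse holds thousands, but the banner still says 3.
console.log(scarcityBanner(4870));
```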

5. Fake Social Proof

This pattern creates the illusion that a product is popular by using falsified or exaggerated endorsements or reviews. Social proof (i.e. herd mentality) means people conform to others’ behaviour as a shortcut to avoid the hard work of evaluating a decision themselves.

Notifications like “X person from Y city purchased this item 5 minutes ago” and “10 customers have already bought this product today” are examples of this pattern.
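A minimal sketch of how such toasts can be fabricated: the names, cities, and timestamps below are invented placeholders assembled from canned lists rather than drawn from real order data, which is precisely what makes the “proof” fake.

```typescript
// Hypothetical sketch of fake social proof: purchase notifications are
// assembled from canned lists, not from actual orders.
const names = ["Priya", "Rahul", "Ananya"];
const cities = ["Mumbai", "Delhi", "Pune"];

function fakePurchaseToast(seed: number): string {
  const name = names[seed % names.length];
  const city = cities[seed % cities.length];
  const minutesAgo = (seed % 10) + 1; // invented timestamp
  return `${name} from ${city} purchased this item ${minutesAgo} minutes ago`;
}
```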

6. Fake Urgency

The users are put under time pressure, which lessens their ability to critically evaluate the available information. As a result, they may experience anxiety or stress. Under such pressure, they give in to the seller’s interest.

The most suitable example of this can be the timers set by e-commerce apps during their sales. You will suddenly see a notification saying, “Hurry up! Sale ending in 12 minutes” to push you into buying the intended product.
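A sketch of why such deadlines are often sham: in the hypothetical timer below, the countdown simply wraps around when it reaches zero, so the “sale ending” moment never actually arrives. The function name and sale length are illustrative assumptions.

```typescript
// Hypothetical sketch of a sham countdown: when the timer hits zero, the
// "sale" quietly restarts, so the deadline was never real.
function fakeCountdown(secondsElapsed: number, saleLength = 720): number {
  // A genuine timer would end the sale at zero; this one wraps around.
  return saleLength - (secondsElapsed % saleLength);
}
```

Reloading the page after the 12-minute “deadline” would show the timer reset to 12 minutes again.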

7. Forced Action

In this pattern, sellers offer users something of interest but require them to do something in return. It is usually combined with other dark patterns, such as sneaking or trick wording: users see what looks like an optional action that is actually forced. Practical implementations of this pattern often rely on a low-contrast background.

For example, LinkedIn tricked users into registering with their email address in 2015. To most people this appeared harmless, as most websites require an email address for registration. LinkedIn created a low-contrast effect by keeping the background blue and the button text grey.

8. Sneaking

The sellers hide, disguise, or delay information relevant to the user. In 2015, a UK-based sports retailer added an unwanted magazine to users’ carts without their knowledge. It added only £1 to the bill, which many users did not mind. Users had to actively remove the magazine from their cart if they did not want to purchase it.
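The “sneak into basket” step can be sketched as follows; the item names and prices are hypothetical stand-ins for the retailer’s actual catalogue. The key point is that the extra item is appended by the checkout code, not by the user.

```typescript
// Hypothetical sketch of sneaking: checkout silently appends a low-cost
// extra that the user must notice and remove themselves.
interface Item { name: string; pricePence: number; }

function checkout(cart: Item[]): Item[] {
  // Deceptive step: an unrequested £1 magazine is pushed onto the order.
  return [...cart, { name: "Fitness magazine", pricePence: 100 }];
}

// The user added one item, but the order now contains two.
const order = checkout([{ name: "Running shoes", pricePence: 4999 }]);
```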

9. Trick Wording

This pattern uses ambiguous language to mislead and deceive users. Most users do not read detailed information word by word, and providers take advantage of this scan-reading habit, making a piece of content appear to say one thing when it in fact says something else that benefits only the provider.

For instance, a checkbox may seem to allow users to subscribe to a particular newsletter. However, due to ambiguous language, a user may end up subscribing to something entirely against their interests.

10. Visual Interference

Here, providers intentionally hide or disguise information relevant to users through the visual interface itself. This is usually done with low-contrast visuals or small text.

Tesla used this pattern in 2019 when the company added a new feature allowing Tesla car owners to buy vehicle upgrades. Many car owners mistakenly purchased this feature, and Tesla refused to refund them. The car manufacturer had hidden the purchase option on the screen by presenting it with a low-contrast effect.
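A low-contrast effect is not just a vague impression; it can be measured. The sketch below computes the WCAG 2.x contrast ratio between two colours. The grey values at the end are illustrative, not taken from any actual product’s interface.

```typescript
// Sketch: quantifying a low-contrast effect with the WCAG 2.x contrast
// formula, (L1 + 0.05) / (L2 + 0.05), where L is relative luminance.
function relativeLuminance(hex: string): number {
  const channel = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  const r = parseInt(hex.slice(1, 3), 16);
  const g = parseInt(hex.slice(3, 5), 16);
  const b = parseInt(hex.slice(5, 7), 16);
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Grey text on a similar grey background falls far below WCAG's
// 4.5:1 minimum for normal text — the hallmark of visual interference.
const lowContrast = contrastRatio("#999999", "#bbbbbb");
```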

11. Nagging

A platform keeps interrupting the user with requests to do something that is against the user’s best interest. Nagging is a kind of tax the platform imposes: it has no monetary value, but the constant interruptions make giving in seem more convenient than resisting.

For example, Instagram kept nagging its users for months in 2018 to turn on notifications. Users could not reject the request entirely, as the only option available was Not Now.

12. Pre-selection

In this pattern, the platform pre-selects a default option to influence users’ decision-making. It exploits the default-effect cognitive bias: people are more likely to stick with an already-selected option.

The donation website discussed earlier is an example of this dark pattern. Moreover, a New York Times article also mentions how the 2020 Trump Campaign used this pattern for a pre-selected checkbox for monthly recurring donations.
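The pattern reduces to how a form’s initial state is chosen. In the hypothetical sketch below, the field names and amount are invented for illustration; the point is that inaction becomes consent because the deceptive choice is the default.

```typescript
// Hypothetical sketch of pre-selection: the recurring-donation checkbox
// defaults to true, so a user who never touches it is enrolled by inaction.
interface DonationForm { amount: number; monthlyRecurring: boolean; }

function newDonationForm(): DonationForm {
  // Deceptive defaults: a pre-filled amount and pre-ticked recurring box.
  return { amount: 50, monthlyRecurring: true };
}

// A user who only fills in card details ends up with a monthly charge.
const form = newDonationForm();
```

An ethical design would default `monthlyRecurring` to `false` and leave the amount blank.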

13. Roach Motel

This dark pattern makes it hard for users to cancel a subscription they signed up for easily. The provider typically hides the cancel option and imposes a complex cancellation procedure, such as calling customer service or raising a ticket, request, or complaint. Users often give up trying to cancel and keep paying for services they no longer use.

In Consumer Financial Protection Bureau v. TransUnion, TransUnion deceived users by falsely marketing credit scores and credit-related products. It signed users up without their consent, and users did not even have a cancellation option. The court directed the company to obtain consent for recurring payments and to provide users with an easy way to cancel.

14. Hidden Costs

Using this pattern, providers omit additional fees, charges, or costs until users reach the payment page. Having already invested their time, users are likely to proceed anyway.

For example, airline websites do not mention convenience fees when you search for flights. Once you reach the final payments page, BOOM!
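The gap between what is advertised and what is charged can be sketched in a few lines. The fee names and amounts below are hypothetical, but the structure, a base fare shown up front and surcharges revealed only at checkout, is the pattern itself.

```typescript
// Hypothetical sketch of drip pricing: search results quote the base fare,
// while surcharges appear only in the final checkout total.
const baseFare = 4999;      // what the search results advertise
const convenienceFee = 350; // revealed only at payment
const serviceTax = 250;     // revealed only at payment

function advertisedPrice(): number {
  return baseFare;
}

function checkoutTotal(): number {
  return baseFare + convenienceFee + serviceTax;
}

// The difference between the two numbers is the "hidden cost".
const hiddenCost = checkoutTotal() - advertisedPrice();
```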

15. Hidden Subscriptions

In this dark pattern, providers employ techniques such as sneaking and misdirection to sign users up for recurring subscriptions. Once signed up, users receive no notification, and the payments keep going through. The pattern usually appears in combination with the roach motel pattern; a classic example is free-trial apps that do not notify users before the trial ends and the charges begin.

In addition, providers can use the nagging pattern to push users into linking their bank accounts or credit cards for payments. Users often only find out when the money is debited from their account.

16. Obstruction

This pattern deliberately makes it difficult for users to complete their actions. For instance, Ireland’s Data Protection Commission held WhatsApp liable for failing to provide non-users with the necessary information and for making it difficult to access by spreading it across several documents.


These dark patterns show how designers use psychological underpinnings to exploit human cognition and our decision-making process, with a direct impact on users’ trust. As technology-based applications have become ingrained in our lives, we often ignore such practices or pay them negligible attention. However, ask yourself: did you decide the order of the reels you saw on Instagram today? Certainly not. It is therefore imperative to promote ethical design principles so that humans continue to exercise their independence in the digital space.
