Dark patterns are tricks of the trade used by many top companies. They are designed to manipulate users into taking actions they wouldn’t have taken otherwise, anything from making a purchase to signing up for a service to sharing personal information, all purely for the business’s gain.
Unfair players use tactics ranging from interface elements that imitate commonplace layouts and buttons, such as ad banners in children’s apps, to more deceptive practices that push consumers into agreeing to things they may not fully understand: complicated legalese hidden in a labyrinth of terms and conditions, or misleading options that make users believe they have control over their privacy when they actually don’t.
The term “dark patterns” was coined in 2010 by UX designer Harry Brignull. It’s important to note that in 2022 Brignull and his team renamed the term to “deceptive patterns” to avoid language that might reinforce harmful stereotypes. Brignull defined dark patterns as “a user interface that has been carefully crafted to trick users into doing things that are not in their interest and usually at their expense.” Since then, the concept has become one of the hot topics in the digital marketing industry.
Knowing about, identifying, and avoiding dark patterns is essential for companies and marketers. These practices not only undermine customer confidence and violate GDPR and other email compliance and marketing rules, but they also risk drawing the attention of regulators like the Federal Trade Commission, which has pledged to take action against businesses that use them. So what do typical dark pattern tricks look like?
Disguised advertising is a dark pattern strategy that tricks people into clicking on ads disguised as helpful information on topics relevant to the reader, all to drive traffic and make money.
You’re browsing a website, come across a fascinating article, and click on what appears to be a useful button or link. In reality, it’s an advertisement.
The example below illustrates this pattern perfectly. See a banner offering cashback on your recent booking? It feels like a no-brainer, but there’s a hidden cost: a separate service is going to charge you 15 pounds per month, and most likely you’ll never see that cashback at all. That’s how people end up paying for products and subscriptions they never planned to buy in the first place.
Imagine clicking to learn more about a fantastic bargain you saw online, only to land on a completely different page and sign up for a trial of a product you never wanted. That’s the bait-and-switch pricing tactic at work: it draws you in with an attractive promise, then changes the terms, leaving you with often unpleasant (and definitely unexpected) results.
X user @viewfromhk shared his recent experience of ordering an Uber from the airport. The initial price suggested by the app was 52.28 pounds, but after he confirmed the order, it prompted him to accept a new price of 74.24 pounds for no apparent reason. He said he tried it three times before recording the screen, assuming it was a bug. But it was a simple dark pattern in action.
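To put that silent jump in perspective, a quick back-of-the-envelope calculation using the two fares from the post (both taken to be in pounds) works out to an increase of roughly 42%:

\[
\frac{74.24 - 52.28}{52.28} \times 100\% \approx 42\%
\]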
Did you enjoy that free e-book that was available for a short while? You may unknowingly be paying for the follow-up. That’s forced continuity in action: under this dark pattern, customers are kept on the hook by being billed without warning, usually once a free trial ends. Businesses use this cunning strategy to keep you committed and paying even if you never planned to stay a subscriber for long.
In the example below, shared by X user @code_is_, the trivial and very affordable Career.io trial could quickly grow into a next payment 745.76% higher and a subscription that was apparently a pain to cancel. (A year after this screenshot was taken, the trial now starts at $4.70. Even dark marketers have to keep up with inflation somehow.)
Just as you’re about to complete an online purchase, sure of the price, you’re hit with a barrage of unstated costs and fees at the very last step. Some companies use the common dark pattern of hidden costs to make their pricing seem lower than it really is. They bet that by postponing these extra charges until the very last minute, you will eventually cave in and pay rather than go through the hassle of starting your transaction elsewhere. In this video, Arun Maini, a YouTube tech blogger, gives a very detailed explanation of how deceptive pricing works in the corporate strategies of Uber, Amazon, and Netflix.
These are but a few of the many dark pattern techniques you may find online. As a customer, you need to be aware of these dishonest design strategies and approach online interactions with caution. As a marketer or business owner, you should know about these problems unless you want to betray the trust of your audience.
You have most likely encountered dark patterns in your everyday online life: they are real and, unfortunately, many companies use them. For example, research shows that consumers trying to cancel a subscription service encounter 6.2 dark patterns on average.
And don’t get me started on dark patterns in email marketing; we could write a separate guide about them. After you hit that “Subscribe” button, you can encounter a whole range of deceptive techniques, for instance, almost impossible-to-find unsubscribe options like the example below.
And then there are misleading subject lines fishing for opens and clicks, and emotional manipulation designed to guilt you into keeping those promotional emails coming, with all the “we’re sad to see you go…” messages.
But let’s look at a few instances of dark patterns from different industries and contexts.
The game really starts before you even get on the plane. Airlines sometimes get criticized for using dark patterns, especially in seat selection and pricing.
One often-used tactic is to instill a false sense of urgency. You’ll see notices like “Only 2 seats left at this price!” or “Three others are looking at this destination right now.” Even if the deal isn’t as good as it looks, these scarcity signals are meant to pressure you into booking right away.
Another trick? Charging extra for basic amenities and for the right to any seat at all. That’s exactly what happened to me and my fiancée on our trip from San Francisco to Austin a few weeks ago. We had to pay United Airlines $56.98 for “preferred seats” just to NOT sit together. Was it because we booked the flight last minute? Or because it was an unusually small plane? Not at all.
The company offered us just three or four seats to choose from and nudged us to pay for “preferred” seats throughout the booking process. There was no option to sit together, so we booked the two closest available seats and ended up paying for this “additional service”. And I’m afraid to even guess whether we would have gotten seats on the flight at all without paying, since it was seriously overbooked. Besides annoying customers, this tactic takes more money out of their pockets, often without clearly disclosing the fees upfront.
That is a typical dark pattern: charging more for a poor experience.
EyeEm, a photo-sharing firm, recently changed its Terms & Conditions, and this isn’t just an ordinary update. In 2023, EyeEm was acquired by Freepik, and under the updated terms your personal photos can be used as AI training material without your express permission.
The update was announced in an email with the subject line “Important update in our Terms & Conditions”. Since such changes and emails are routine, many people probably didn’t even open this one. If the subject line had read something like “Your photos will now be used to train AI” (which is the gist of this “important change”), users would have paid far more attention. On top of that, the whole scheme of opting people in by default and making it a hurdle to opt out is itself a deceptive pattern. Users have a brief thirty-day window in which to opt out, and the removal procedure may take up to 180 days.
As user concerns piled up, it became obvious that this was a dark pattern manipulating user consent and its limits. The community reacted quickly, and many started looking for another place to store their images.
Tinder was penalized for bending EU consumer protection laws and not being transparent about its personalized pricing, which was based on users’ age without their knowledge.
Under pressure from regulators, Tinder pledged to amend its business practices by mid-April 2024, including giving up age-based pricing and explaining how and why it customizes prices. To make sure Tinder keeps its word, the European Commission and several consumer watchdogs are keeping a close eye on it.
These cases show how dark patterns can penetrate many industries, damaging customer confidence and exploiting legal ambiguities. For companies that want to stay on the right side of the regulations, the lesson is obvious: dark patterns may boost short-term earnings, but at the cost of consumer trust and, potentially, legal consequences.
And lastly, building sincere relationships with customers is not only morally right but also profitable in the long run, earning you years of loyalty and goodwill. One study found that 76% of US adults think online services overcomplicate the cancellation process, and 92% are likely to switch to a competitor when that happens.
While many businesses agree that nothing called “dark” can be good, the internet is still full of such practices: social media platforms misusing personal information, email marketing gone wrong with dark pattern tricks, and more.
This leads us to the subject of consumer protection and trade commissions, specifically the federal beasts. Acts with prominent capital letters, such as GDPR, CPRA, and COPPA, show that regulators on both sides of the Atlantic oppose this widespread style of deception, and even massive corporations are not immune to prosecution. For example, the French data protection authority fined Google and Meta $170 million and $68 million, respectively, in 2022. But let’s dig deeper.
The General Data Protection Regulation (GDPR) is a set of rules that European Union member states and anyone doing business in the EU must follow to protect the privacy of personal data, including GDPR email compliance. The European Data Protection Board (EDPB) is responsible for ensuring that the GDPR is applied consistently across the EU.
The EDPB’s recent guidelines even address social media platforms, urging them to avoid deceptive design practices that could mislead users into making privacy-endangering decisions without clear consent.
The EDPB regularly issues binding decisions that result in fines for companies that are not GDPR compliant and force them to follow the guidelines. For example, in 2023 TikTok was fined €345 million (approximately $374 million) for unfairly processing the personal data of children between 13 and 17.
Across the pond, California addresses user privacy with the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act of 2020 (CPRA), which grant California residents the right to know what personal information is gathered about them, how it is used, and with whom it is shared. These laws also include rules against dark patterns.
The rules forbid practices that obstruct the opt-out process and require it to be simple for consumers who wish to use it. This includes banning ambiguous language and convoluted navigation that could lead users to give consent unintentionally.
CCPA compliance is regularly enforced, and no company, even an international giant, is immune. For instance, in 2022, Sephora agreed to $1.2 million in penalties for sneakily selling customers’ data.
The US government has shown its resolve to take action against companies that employ dark patterns by releasing a report listing many of the deceptive techniques used across several industries.
Here’s a breakdown: the Federal Trade Commission asks businesses to do three basic things: clearly disclose the terms of an offer, get consumers’ express informed consent before charging them, and keep cancellation simple.
Sounds simple, right? Apparently not for everyone. Just look at Amazon, which took the prize for non-consensual Prime subscriptions and a convoluted cancellation process. The FTC was not amused. The result? A swift legal action coming their way.
And let’s not forget about Epic Games making Fortnite fans pay for something they didn’t want. The FTC fined them $245 million.
Another example of the FTC’s swift action is the case of Vonage, the communications company that had very complicated cancellation procedures, so much so that it cost them $100 million.
Not to mention a famously huge $5 billion fine for Facebook:
The bottom line? Play by the rules, be honest with your consumers, and leave the dark patterns in the shadows where they belong. Your business (and your bank account) will thank you in the long run.
Let’s recap. Some businesses employ quite dubious strategies known as “dark patterns” to mislead customers into making choices they may come to regret. These practices are not only unethical but often unlawful, covering anything from unclear cancellation procedures to hidden costs.
Some of the typical dark pattern tactics covered above are disguised advertising, bait-and-switch pricing, forced continuity, and hidden costs.
Major regulators are levying significant fines for dark patterns under laws like the FTC Act, GDPR, and CCPA. Just ask Google and Amazon; they’ve been hit with some heavy ones.
So what’s the lesson? It’s simple: put transparency first, make it easy for customers to make informed decisions, and focus on building genuine, sincere relationships. In the long run, your company will benefit and your customers will appreciate it.