Dark pattern
Contents
- Definition and terminology
- Common types and examples
- Obstruction patterns
- Interface interference
- Forced action
- Sneaking and information hiding
- Social proof and urgency
- Mind tricks and business incentives
- Cognitive biases exploitation
- Incentives and short-term gains
- Legal and regulatory landscape
- United States framework
- European Union's approach
- Enforcement cases and penalties
- Impact on consumers and businesses
- Consumer harms
- Impact on businesses
- Detection, avoidance and mitigation
- Technical detection and tools
- Ethical design alternatives
- Consumer protection and advocacy
- Further reading
- Gallery
- Carly
- The Economist
- References
⚠️ Article status notice: This article has been marked as incomplete
This article needs additional work for its sourcing and verifiability to meet the wiki's Content Guidelines and be in line with our Mission Statement for comprehensive coverage of consumer protection issues.
This notice will be removed once sufficient documentation has been added to establish the systemic nature of these issues. Once you believe the article is ready to have its notice removed, please visit the Moderator's noticeboard or post to the #appeals channel on our Discord server.
This article has been flagged due to verification concerns. While the topic might have merit, the claims presented lack citations that live up to our standards, or rely on sources that are questionable or unverifiable by our standards. Articles must meet the Moderator Guidelines and Mission statement; factual accuracy and systemic relevance are required for inclusion here!
Articles in this wiki are required to:
- Provide verifiable & credible evidence to substantiate claims.
- Avoid relying on anecdotal, unsourced, or suspicious citations that lack legitimacy.
- Make sure that all claims are backed by reliable documentation or reporting from reputable sources.
Examples of issues that trigger this notice:
- A topic that heavily relies on forum posts, personal blogs, or other unverifiable sources.
- Unsupported claims with no evidence or citations to back them up.
- Citations to disreputable sources, like non-expert blogs or sites known for spreading misinformation.
To address verification concerns:
- Replace or supplement weak citations with credible, verifiable sources.
- Make sure that claims are backed by reputable reporting or independent documentation.
- Provide additional evidence to demonstrate systemic relevance and factual accuracy. For example:
- Avoid: Claims based entirely on personal anecdotes or hearsay without supporting documentation.
- Include: Corporate policies, internal communications, receipts, repair logs, verifiable video evidence, or credible investigative reports.
If you believe this notice has been placed in error, or once the article has been updated to address these concerns, please visit the Moderator's noticeboard, or the #appeals channel on our Discord server: Join here.
🔧 Article status notice: This article may rely heavily on AI/LLMs
This article has been marked because it may contain heavy use of LLM-generated text that affects its perceived or actual reliability and credibility.
To contact a moderator for removal of this notice once the article's issues have been resolved, or if this notice was placed in error, please use either the Moderator's noticeboard or the #appeals channel on our Discord server (Join using this link).
Common issues include text that may:
- affect the validity of the claims made (e.g. by not citing sources)
- use a tone not compliant with the wiki's editorial guidelines
- be overly extensive in areas that are not relevant to the mission statement
- come across as automatically generated, bringing the wiki's credibility into question
As a result this article needs additional work to meet the wiki's Content Guidelines and be in line with our Mission Statement for comprehensive coverage of consumer protection issues.
How You Can Improve This Article
- Replace or supplement weak or hallucinated citations with credible, verifiable sources.
- Remove content you determine to be inaccurate.
- Link the problem to modern forms of consumer protection concerns, such as privacy violations, barriers to repair, or ownership rights.
- Replace language that is non-compliant with the editorial guidelines of this wiki.
As the article may incorporate text from a large language model, it may include inaccuracies or hallucinated information. Please keep this in mind if you are using this article as a source for information.
A dark pattern is a manipulative design practice that tricks or influences users into making decisions that may not align with their true preferences or interests. These techniques exploit cognitive biases and behavioral psychology to benefit businesses, often at the expense of user autonomy. The term was coined by user experience (UX) designer Harry Brignull in 2010, and the concept has since become a significant focus of regulatory scrutiny and academic research.[1][2]
Dark patterns are remarkably widespread and a growing concern in digital interfaces. A 2019 study examining 11,000 e-commerce websites found that approximately 10% employed deceptive practices,[3] while a 2022 European Commission report indicated that 97% of popular apps used by EU consumers displayed them.[4]
Definition and terminology
The term dark patterns was originally defined by Harry Brignull as "design tricks that manipulate users into taking actions they didn't intend to". The Federal Trade Commission (FTC) describes them as "design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm".[1][2]
There is ongoing discussion regarding the most appropriate terminology. Alternative labels include deceptive design, manipulative UX, coercive design, or anti-patterns. Some advocates argue for terms like deceptive patterns to more accurately describe the intentional nature of these designs and avoid potential racial connotations. Brignull himself has transitioned to using deceptive.design.[2]
What distinguishes dark patterns from merely persuasive design is their exploitative nature – they are not about creating value for users but about benefiting the service provider through manipulation and deception.
Common types and examples
Research has identified numerous specific dark patterns, with one comprehensive study proposing a taxonomy comprising 68 distinct types. These manifest across various industries and digital contexts.[5]
Obstruction patterns
These designs make desired actions (like rejecting tracking) significantly more difficult than accepting alternatives. A classic example is the Roach Motel pattern, where signing up for a service is straightforward but cancellation is excessively difficult. The FTC highlighted this pattern in their case against ABCmouse, where cancellation was made "extremely difficult" despite promising "Easy Cancellation".[6]
Interface interference
This category includes designs that manipulate interface elements to steer user behavior. Misdirection focuses user attention on one element to obscure another critical detail. Disguised ads blend advertisements with genuine interface elements, like fake "Download" buttons on software websites.[1]
Forced action
These patterns require users to complete unnecessary actions to access desired functionality. Forced registration demands that users create an account to complete a task. Forced continuity involves automatically transitioning users from free trials to paid subscriptions without adequate notification. The FTC alleged that Adobe violated regulations by "tricking customers into enrolling in subscription plans without proper disclosure".[1][7]
Sneaking and information hiding
These practices involve concealing or obscuring material information from users. Examples include:
- Hidden costs: unexpected fees are revealed only at checkout, a practice employed by ticketing platforms.
- Drip pricing: only part of a product's total price is advertised initially, with other mandatory charges imposed later.[1]
- Pre-checked boxes: a box checked by default installs potentially unwanted software, such as an "anti-virus" that is actually spyware or a crypto-miner.
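The drip-pricing mechanic described above can be illustrated with simple arithmetic. A minimal sketch; all prices and fee names here are hypothetical, not taken from any real platform:

```python
# Drip pricing: the advertised price omits mandatory fees that are
# only revealed during checkout. All figures are hypothetical.

advertised_price = 49.99  # the only number shown on the listing page

# Mandatory charges revealed later in the checkout flow
mandatory_fees = {
    "service fee": 12.50,
    "facility charge": 4.00,
    "order processing": 3.25,
}

total_at_checkout = advertised_price + sum(mandatory_fees.values())
markup = (total_at_checkout - advertised_price) / advertised_price

print(f"Advertised: ${advertised_price:.2f}")
print(f"Due at checkout: ${total_at_checkout:.2f} (+{markup:.0%})")
```

In this made-up example the buyer commits to a $49.99 purchase and only learns at the final step that the real total is roughly 40% higher.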
Social proof and urgency
These patterns exploit social influence and time pressure to manipulate decisions. False activity messages misrepresent site activity or product popularity. False scarcity creates pressure to buy immediately by claiming limited inventory. Baseless countdown timers display fake countdown clocks that reset when expired.
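The "baseless countdown" pattern above can be sketched as a timer that silently restarts whenever it reaches zero, so the advertised deadline never actually arrives. A minimal simulation; the 15-minute window is an invented value:

```python
class BaselessCountdown:
    """A countdown that resets itself on expiry, so the 'deal'
    deadline displayed to the user never actually passes."""

    def __init__(self, window_seconds: int = 900):  # hypothetical 15-minute window
        self.window = window_seconds
        self.started_at = 0.0  # time of the current (re)start

    def seconds_left(self, now: float) -> int:
        elapsed = now - self.started_at
        if elapsed >= self.window:   # deadline reached...
            self.started_at = now    # ...quietly start over
            elapsed = 0.0
        return int(self.window - elapsed)

timer = BaselessCountdown()
print(timer.seconds_left(100.0))   # mid-window: counting down normally
print(timer.seconds_left(900.0))   # "expired": timer silently resets to full
```

The user-facing widget always shows time ticking away, but the apparent scarcity is manufactured: expiry triggers a reset rather than an actual end to the offer.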
Mind tricks and business incentives
Cognitive biases exploitation
Dark patterns exploit unconscious cognitive shortcuts. In cookie banners, for example, the "Accept All" option is often listed first and given a green background: people tend to choose the first option before considering others, and green is associated with approval in design. The alternative, typically labeled "Manage my choices", requires opting out of each data collection category or website one at a time. The net effect of these patterns is that accepting all cookies is far easier than declining them.[8][9]
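The asymmetry described above can be made concrete by counting interactions. A toy model of a hypothetical banner (category names and click counts are illustrative assumptions, not measurements of any real site):

```python
# Hypothetical cookie banner: one click accepts everything, while
# declining means opening a settings panel and toggling each category off.

categories = ["analytics", "marketing", "personalization", "social media"]

clicks_to_accept = 1  # single prominent "Accept All" button

# Decline: open "Manage my choices", one toggle per category, then "Save"
clicks_to_decline = 1 + len(categories) + 1

print(f"Accept: {clicks_to_accept} click; decline: {clicks_to_decline} clicks")
```

Even this small gap in effort reliably nudges users toward the option the site operator prefers.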
Incentives and short-term gains
Dark patterns persist because they work in the short term: they raise conversion rates and other immediate business metrics. Competitive pressure also encourages copycat behavior, with companies imitating their rivals' manipulative designs.
Research suggests these short-term gains carry long-term costs. Users who feel manipulated tend to avoid not only the offending settings but the brand itself, and that erosion of trust can have significant business consequences.
Legal and regulatory landscape
United States framework
In the United States, regulation occurs primarily through existing consumer protection statutes. The FTC Act empowers the Federal Trade Commission to take action against "unfair or deceptive acts or practices in or affecting commerce".[10]
In October 2024, the FTC amended its Negative Option Rule to include specific requirements for cancellation mechanisms, implementing a "Click-to-Cancel" provision.[11] The FTC later voted on 9 May 2025 to extend the original 14 May 2025 compliance deadline by sixty days.[12][13]
On 8 July 2025, the Eighth Circuit Court of Appeals vacated the entire 2024 change to the Negative Option Rule on procedural grounds in Custom Communications, Inc. v. Federal Trade Commission.[14][15] Despite the legal setback, the FTC pursued enforcement actions against Match.com, Chegg Inc., Cleo AI, and Amazon under the Restore Online Shoppers' Confidence Act (ROSCA) and Section 5 of the FTC Act.[16]
On 30 January 2026, the FTC indicated renewed interest in updating the Negative Option Rule by submitting a draft Advance Notice of Proposed Rulemaking (ANPRM) to the Office of Management and Budget (OMB) for review.[17][18] The ANPRM was opened for public comment on 11 Mar 2026.[19]
European Union's approach
The European approach combines general consumer protection laws with data privacy-specific regulations. While the General Data Protection Regulation (GDPR) doesn't explicitly mention dark patterns, its requirements for valid consent effectively prohibit many deceptive designs.[20]
The Digital Services Act (DSA) and Digital Markets Act (DMA) further address dark patterns by prohibiting practices that "deceive or manipulate" users.[21]
Enforcement cases and penalties
Recent years have seen significant enforcement actions:
- Epic Games paid $245 million to settle charges related to deceptive patterns in Fortnite.[22]
- Noom paid $62 million to settle charges regarding deceptive subscription practices.[23]
- TikTok received a €345 million fine for failing to protect children's data through manipulative consent practices.[24]
Impact on consumers and businesses
Consumer harms
Dark patterns harm consumers in many ways, often causing financial loss, privacy violations, and emotional distress.
Financially, dark patterns manipulate consumers by faking urgency, importance, scarcity, and sales, tricking them into purchases they might not have made had they been able to make a properly informed decision.
They can also violate consumers' privacy by hiding information about what data a service collects and how that data is used, and by hiding or omitting options to stop data collection or delete data already collected.
These harms are often combined with, or directly involve, patterns that cause emotional distress: the design aims to frustrate consumers into accepting the patterns being used against them and giving up on in-site settings or other ways around them.[1][20]
The most vulnerable consumers are those unfamiliar with computers and the internet, and those with mental or physical disabilities that prevent them from recognizing dark patterns or avoiding them where that is possible.
Impact on businesses
Dark patterns are effective enough to be used by businesses of every size and level of success. Even when consumers discover that a service they use contains dark patterns, they are unlikely to abandon it or warn other users, which in turn gives businesses little reason to stop. Because consumer backlash rarely forces change, legal pressure is more effective: many dark patterns violate consumer protection laws and can expose businesses to substantial lawsuits.[1]
Detection, avoidance and mitigation
Technical detection and tools
Efforts to automatically detect dark patterns are evolving but face significant challenges. A comprehensive study found that existing tools could identify only 31 of the 68 catalogued dark pattern types, a coverage rate of about 45.6%.[5] The study proposed a Dark Pattern Analysis Framework (DPAF) to address the gaps.
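Detection tools of the kind described above typically rely on structural heuristics applied to page markup. A minimal sketch (not the DPAF itself) using Python's standard-library HTML parser to flag one commonly scanned-for signal, pre-checked consent checkboxes; the form below is invented for illustration:

```python
from html.parser import HTMLParser

class PrecheckedConsentFinder(HTMLParser):
    """Flags checkboxes that are ticked by default - a simple
    structural signal used by some dark-pattern scanners."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # boolean attributes like 'checked' appear with value None
        if tag == "input" and a.get("type") == "checkbox" and "checked" in a:
            self.flagged.append(a.get("name", "<unnamed>"))

# Hypothetical consent form: two categories are opted in before the user acts
html = """
<form>
  <input type="checkbox" name="essential" checked>
  <input type="checkbox" name="marketing" checked>
  <input type="checkbox" name="analytics">
</form>
"""

finder = PrecheckedConsentFinder()
finder.feed(html)
print(finder.flagged)  # checkboxes ticked before the user chose anything
```

Real scanners combine many such heuristics with visual and textual analysis, which is why coverage of the full taxonomy remains partial.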
Ethical design alternatives
Companies can implement ethical alternatives that respect user autonomy. For obstruction patterns, the ethical approach is a balanced choice architecture in which users can decline as easily as they accept. Designers should also implement neutral default settings that do not assume consent.[9]
Transparency and clear communication are essential: companies should explain their data practices and costs honestly, in clear, understandable language.
Consumer protection and advocacy
Consumer education plays a crucial role. Initiatives like the Dark Patterns Tip Line allow users to report deceptive designs they encounter. Advocacy organizations provide resources to help identify and avoid dark patterns.[2]
Further reading
Gallery
Examples of dark patterns, with notes.
- An example of manipulating the user by minimizing the noticeability of the "More" option while emphasizing only the "Accept" button.
- Diving deeper shows Marketing enabled by default, using a color that matches the font text. The "Deny" option is dark text with a light-gray border that is both harder to see and generally associated with denial of action.
- Mixpanel is labeled as "essential", but hidden within the collapsed section is an explanation that it is a tracker. MyCarly may genuinely consider it necessary, but a tracker is still a tracker. Google Tag Manager is also enabled by default, with the same issue as the previous image.
- The message that appears on a user's visit to the website. Cookie management is located just above the bright Continue button.
- "Do not sell or share" is enabled by default, but comes with a disclaimer. (See file page for further notes.)
References
- ↑ 1.0 1.1 1.2 1.3 1.4 1.5 1.6 "Bringing Dark Patterns to Light". Federal Trade Commission. Sep 2022. Archived from the original on 9 Dec 2025. Retrieved 22 Mar 2026.
- ↑ 2.0 2.1 2.2 2.3 Brignull, Harry; Leiser, Mark; et al. (25 Apr 2023). "Dark Patterns: inside the interfaces designed to trick you". Deceptive.Design. Archived from the original on 22 Mar 2026. Retrieved 22 Mar 2026.
- ↑ Cimpanu, Catalin (11 Nov 2019). "Study of over 11,000 online stores finds 'dark patterns' on 1,254 sites". ZDNET. Archived from the original on 14 Nov 2025. Retrieved 8 Nov 2025.
- ↑ Lupiáñez-Villanueva, Francisco; Boluda, Alba; et al. (Apr 2022). "Behavioural study on unfair commercial practices in the digital environment". Publications Office of the EU. doi:10.2838/859030. ISBN 978-92-76-52316-1. Archived from the original on 18 Jan 2026. Retrieved 22 Mar 2026.
- ↑ 5.0 5.1 Li, Meng; Wang, Xiang; Nei, Liming; Li, Chenglin; Liu, Yang; Zhao, Yangyang; Xue, Lei; Kabir Sulaiman, Said (12 Dec 2024). "A Comprehensive Study on Dark Patterns". arXiv. doi:10.48550/arXiv.2412.09147. Archived from the original on 9 Nov 2025. Retrieved 8 Nov 2025.
- ↑ Keller and Heckman LLP (28 Sep 2020). "FTC Targets Negative Option Schemes in Two Multimillion Dollar Settlements". Lexology. Archived from the original on 14 Nov 2025. Retrieved 28 Nov 2025.
- ↑ "FTC Charges Adobe". Federal Trade Commission. 17 Jun 2024. Archived from the original on 17 Jun 2024. Retrieved 22 Mar 2026.
- ↑ Stroink-Skillrud, Donata (2 Feb 2023). "Your Cookie Consent Banner is (Probably) Not Compliant". MainWP. Archived from the original on 16 Feb 2026. Retrieved 22 Mar 2026.
- ↑ 9.0 9.1 Keyser, Robert (5 Oct 2023). "Cookie Consent Dark Patterns: How to Identify and Fix Them". Ethyca. Archived from the original on 12 Dec 2025. Retrieved 11 Aug 2025.
- ↑ "FTC Act". Federal Trade Commission. Archived from the original on 27 Jan 2026. Retrieved 22 Mar 2026.
- ↑ "Federal Trade Commission Announces Final "Click-to-Cancel" Rule Making It Easier for Consumers to End Recurring Subscriptions and Memberships". Federal Trade Commission. 16 Oct 2024. Archived from the original on 17 Oct 2024. Retrieved 22 Mar 2026.
- ↑ "FTC Votes on Negative Option Rule Deadline". Federal Trade Commission. 9 May 2025. Archived from the original on 10 May 2025. Retrieved 22 Mar 2026.
- ↑ Ferguson, Andrew N.; Holyoak, Melissa; Meador, Mark R. (9 May 2025). "Statement of the Commission Regarding the Negative Option Rule". Federal Trade Commission. Archived from the original on 10 May 2025. Retrieved 22 Mar 2026.
- ↑ "Click to Cancel Just Got Cancelled: Eighth Circuit Vacates Entirety of FTC's Negative Option Rule". Cooley. 11 Jul 2025. Archived from the original on 25 Jul 2025. Retrieved 22 Mar 2026.
- ↑ Conkle, Brooke; Cover, Jason; et al. (10 Jul 2024). "Eighth Circuit Vacates FTC's Negative Option Rule for Procedural Violations". Consumer Financial Services Law Monitor. Archived from the original on 19 Jul 2025. Retrieved 22 Mar 2026.
- ↑ Goodrich, Brian J.; Genn, Benjamin; et al. (25 Sep 2025). "FTC Steps Up Subscription Enforcement After "Click to Cancel" Rule Struck Down". Holland & Knight. Archived from the original on 26 Sep 2025. Retrieved 22 Mar 2026.
- ↑ "FTC Submits Draft ANPRM Related to Negative Option Plans to OMB for Review". Federal Trade Commission. 30 Jan 2026. Archived from the original on 31 Jan 2026. Retrieved 22 Mar 2026.
- ↑ "U.S. FTC Signals Renewed Interest in "Click-to-Cancel" Rulemaking". Sidley. 9 Feb 2026. Archived from the original on 22 Mar 2026. Retrieved 22 Mar 2026.
- ↑ "FTC Seeks Public Comment in Response to Advance Notice of Proposed Rulemaking Regarding Negative Option Marketing Practices". Federal Trade Commission. 11 Mar 2026. Archived from the original on 11 Mar 2026. Retrieved 22 Mar 2026.
- ↑ 20.0 20.1 "Guidelines on Dark Patterns in Social Media Platform Interfaces". European Data Protection Board. 14 Feb 2023. Archived from the original on 26 Feb 2023. Retrieved 22 Mar 2026.
- ↑ "Digital Services Act". European Commission. Archived from the original on 16 Feb 2026. Retrieved 22 Mar 2026.
- ↑ "Epic Games to Pay $245 Million". Federal Trade Commission. 19 Dec 2022. Archived from the original on 19 Dec 2022. Retrieved 22 Mar 2026.
- ↑ Davis, Ayumi (14 Feb 2022). "Noom to Pay $62M to Customers Forced Into Renewals They Didn't Want". Newsweek. Archived from the original on 14 Feb 2022. Retrieved 22 Mar 2026.
- ↑ "Irish Data Protection Commission announces €345 million fine of TikTok". Data Protection Commission. 15 Sep 2023. Archived from the original on 1 Feb 2026. Retrieved 22 Mar 2026.