Debevoise & Plimpton Discusses “Dark Patterns” and Regulatory Scrutiny

There has recently been significant regulatory attention to “dark patterns,” including FTC guidance, state privacy laws, and state and federal enforcement actions. Some of this activity involves new regulations, and some is based on decades-old consumer protection laws that prohibit unfair and deceptive practices.

There is no single definition of “dark patterns,” but the term generally refers to user interfaces (e.g., websites, apps) that are designed to manipulate users’ behavior and subvert their choices, causing them to engage in conduct they did not expect or desire, or impairing their ability to make an informed decision. Examples of dark patterns include the following:

  • having a process for opting out of data sharing or cancelling a subscription that involves an unnecessarily large number of steps;
  • requiring consumers to click through, or listen to, reasons why they should not opt out of data sharing or cancel their subscription;
  • using confusing language, such as double negatives, that tricks people into believing they are doing one thing, but they are actually doing the opposite; and
  • requiring a consumer to scroll through a long, complicated document to identify the mechanism to opt out of data sharing or cancel a subscription.

Legislative and Regulatory Definitions and Examples in the Privacy Context

Dark patterns are addressed by the Colorado, Connecticut, and California privacy laws, as well as by the current and draft regulations implementing the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), and the draft regulations implementing the Colorado Privacy Act (CPA). In the context of these privacy statutes, the use of dark patterns is generally prohibited when presenting consumers with opt-out rights (e.g., opting out of sale, use of sensitive data, or sharing for targeted advertising) or when obtaining required consents. These laws also provide that any otherwise required consent obtained through the use of dark patterns is not valid.

The CPRA defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice, as further defined by regulation.” Colorado and Connecticut law use the same definition, with Connecticut adding that a dark pattern “includes, but is not limited to, any practice the Federal Trade Commission refers to as a ‘dark pattern’.”

Draft regulations issued by the California Privacy Protection Agency define similar techniques as dark patterns and add further color on how businesses should present choices so that they are not considered dark patterns, specifically by:

  • drafting language that is easy for consumers to read and understand;
  • presenting choices without imposing unnecessary friction when consumers try to take an action that would deprive a business of revenue (e.g., cancelling a subscription);
  • providing consumers “symmetry in choice,” such as offering consumers “Yes” and “No” options rather than “Yes” and “Ask me later” options (a simplified sketch appears after this list);
  • avoiding manipulative user interfaces, such as unintuitive placement of inaptly sized buttons;
  • avoiding manipulative language that might guilt a consumer or subvert choice, such as “No, I don’t want to save money” when offering a financial incentive; and
  • avoiding “choice architecture” that bundles choices so that the consumer must consent to the use of personal information for reasonably expected purposes together with consent to its use for unexpected purposes (e.g., bundling consent to share location data to find nearby gas prices with consent to sell geolocation data to third parties).
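
To make “symmetry in choice” and unbundled consent concrete, the following is a minimal, hypothetical Python sketch; the option labels and purposes are illustrative only and are not drawn from the draft regulations:

    # Symmetric choice: both options carry equal weight and finality.
    symmetric_options = ("Yes", "No")
    # Asymmetric choice: a true "No" is never offered (illustrative example).
    asymmetric_options = ("Yes", "Ask me later")

    # Unbundled consent: each purpose is a separate, independent choice,
    # rather than one consent covering expected and unexpected uses alike.
    consent_requests = [
        {"purpose": "use location to find nearby gas prices", "consented": None},
        {"purpose": "sell geolocation data to third parties", "consented": None},
    ]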

Draft regulations issued by the Colorado Attorney General set out similar principles, identifying practices to avoid when designing a user interface:

  • relying on “[c]onsumer silence or failure to take an affirmative action” to obtain consent;
  • using preselected checkboxes to obtain consent; and
  • unnecessarily interrupting a consumer’s interaction with a service to seek consent, including through the use of “multiple pop-ups which cover or otherwise disrupt the content or service they are attempting to interact with because they declined” to consent.

Those draft regulations also suggest considering the “vulnerabilities or unique characteristics” of the targeted audience when presenting choice options, for example by adjusting font size and the spacing between buttons for elderly audiences.

Enforcement Trends

The FTC and state attorneys general have voiced concerns over dark patterns and have recently brought enforcement actions treating them as deceptive practices:

  • FTC v. Credit Karma. In September, the FTC settled an action with Credit Karma for allegedly deploying dark patterns that misrepresented consumers’ chances of obtaining credit cards by saying they were “pre-approved.” The FTC found that such claims were deceptive because they conveyed “false certainty” to consumers. According to the complaint, Credit Karma conducted A/B testing, which compares “two versions of a claim or design feature to determine which better drives sales or consumer action” (a simplified sketch of such a comparison appears after this list). Credit Karma allegedly used the method to compare click rates for its pre-approval claim against claims that expressed a consumer’s odds of approval (e.g., a claim that a consumer had an “Excellent” likelihood of approval). The FTC stated that “when user interfaces are designed, including with the aid of A/B testing, to trick consumers into taking actions in a company’s interest and that lead to consumer harm, such design tricks have been described as ‘dark patterns.’”
  • New York v. Fareportal Inc. Earlier this year, the New York Attorney General (the “NYAG”) settled a matter involving alleged dark patterns for $2.6 million with a company that operates several travel-related websites, such as CheapOair.com. The NYAG alleged that Fareportal used dark patterns to create a “false sense of urgency and social pressure” to prompt customers to book flights and hotels quickly. Specifically, the NYAG’s findings stated that the company added one to the number of tickets consumers were searching for (“X”) and informed consumers that only that number of tickets (X + 1) remained available at a specific price (see the arithmetic sketch after this list). Similarly, the NYAG targeted the websites’ use of countdown timers with the message “Book now before tickets run out” as creating a false sense of urgency and scarcity. The NYAG also found that the company used “misleading social nudge messages” that inaccurately represented the number of consumers who had purchased travel protection or upgraded seats.
  • Several States v. Google, LLC. In January 2022, the attorneys general of Texas, Washington, Indiana, and the District of Columbia filed separate suits over location-tracking practices that allegedly deceived consumers about the control they have over their location privacy. The suits allege that Google used dark patterns to prevent consumers from adopting privacy-protective settings, including, according to the District of Columbia attorney general’s complaint, “repeated nudging to enable location settings,” “repeated prompting to re-enable location when using a Google app,” and an “all or nothing” opt-in option.
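
For context on the A/B testing discussed in Credit Karma, the following is a minimal, hypothetical Python sketch of such a comparison; the message text and figures are invented for illustration and are not taken from the complaint:

    # Hypothetical A/B comparison: which of two messages draws more clicks?
    # All message wording and figures below are invented for illustration.
    variants = {
        "A: 'You are pre-approved'": {"shown": 10_000, "clicked": 900},
        "B: 'Excellent likelihood of approval'": {"shown": 10_000, "clicked": 700},
    }
    for label, stats in variants.items():
        rate = stats["clicked"] / stats["shown"]
        print(f"{label}: click-through rate {rate:.1%}")
    # The concern is not A/B testing itself, but using results like these to
    # select the message that most effectively misleads consumers.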
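
The ticket-count practice alleged in Fareportal reduces to simple arithmetic. Below is a hypothetical Python sketch of the scheme as the NYAG described it; the message wording is invented:

    # Hypothetical sketch of the alleged scarcity message: the "remaining"
    # count is derived from the consumer's own search, not from inventory.
    def scarcity_message(tickets_searched: int) -> str:
        displayed_remaining = tickets_searched + 1  # X + 1, per the NYAG findings
        return f"Only {displayed_remaining} tickets left at this price!"

    # A search for 2 tickets yields: "Only 3 tickets left at this price!"
    print(scarcity_message(2))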

Regulatory Guidance

In September, the FTC published a Staff Report titled “Bringing Dark Patterns to Light,” which provides insight into the agency’s concerns and enforcement priorities. The guidance addresses the FTC’s expectations for companies and its recommended best practices, and also serves as notice to companies that the FTC will continue to scrutinize and take action against the use of dark patterns.

The report draws on past enforcement actions and describes examples of common dark patterns, including the following:

  • design elements that induce false beliefs to manipulate consumer choice, such as comparison-shopping sites that purport to provide objective product information but actually offer better rankings and ratings to companies that pay for higher placement, or an advertisement formatted to appear as a news or feature article, where consumers may not recognize it as an advertisement;
  • false countdown timers and claims that a product or service is almost sold out when it is not;
  • design elements that hide or delay disclosure of material information, including those involving hidden or buried fees and “drip pricing,” in which consumers are provided with the true all-in price only at the end of a transaction;
  • design elements that lead to unauthorized charges, such as gaming applications that suddenly include in-app purchases in a location where a consumer is used to clicking quickly, or through the use of a free trial period that results in recurring charges of which the consumer was unaware;
  • design elements that obscure or subvert privacy choices, such as default user settings that maximize data collection, or user interfaces in which the website’s preferred choice is highlighted while the alternative, disfavored choice is greyed out (e.g., the “Yes” option has a bold, colored background while the “No” option is greyed out, comparable to an inactive button).

Key Takeaways

In light of the increasing regulatory focus on dark patterns, companies that market to consumers online should consider the following risk-mitigation strategies:

  • Neutrality: Presenting information in an even-handed and neutral manner. When designing user interfaces, consider how design elements may affect consumer impression and understanding.
  • Cautious Use of A/B Testing: Testing the efficacy of marketing messages and user interfaces often provides beneficial insights to businesses, but companies should exercise caution if such experimentation suggests that consumers are being unwittingly influenced into making decisions against their interests.
  • Disclosure of Material Terms: Although it is increasingly challenging given the limited screen real estate on mobile phones and other devices, businesses should strive to ensure that material terms are easy to access and are disclosed early and prominently in the user flow, rather than at the end of a transaction or buried in terms and conditions.
  • Expect Consent to Be Scrutinized: For certain practices, regulators are increasingly expecting companies to obtain affirmative and unambiguous consent from the consumer. Accordingly, there is risk in relying on notice alone for these practices, such as charging consumers for an in-app purchase, enrolling them in a service, or sharing their sensitive personal information.
  • Consider Legal Involvement in the Design Process: Particularly where companies will be subject to forthcoming state privacy laws, companies should consider including lawyers familiar with those laws in the design and user-interface development of online marketing and other product initiatives.

This post comes to us from Debevoise & Plimpton LLP. It is based on the firm’s memorandum, “Dark Patterns: What Are They and How Can Companies Avoid Regulatory Scrutiny?,” dated October 12, 2022, and available here.