The U.S. Federal Trade Commission, along with two other international consumer advocacy networks, on Thursday announced the results of a study into the use of “dark patterns” — or manipulative design techniques — that can compromise users’ privacy or lead them to purchase products or services or take other actions they otherwise wouldn’t. In an analysis of 642 websites and apps offering subscription services, the study found that the majority (about 76%) used at least one dark pattern and nearly 67% used more than one.
Dark patterns refer to a set of design techniques that can subtly encourage users to take some type of action or compromise their privacy. They are particularly popular among subscription-based websites and apps and have been a focus of the Federal Trade Commission in recent years. For example, the FTC sued dating app giant Match over fraudulent practices, including its use of dark patterns that made it difficult to unsubscribe.
The release of the new report could signal that the FTC is planning to pay increased attention to this type of consumer fraud. It also comes as the U.S. Department of Justice is suing Apple over its alleged monopoly on the App Store — a marketplace that generates billions of dollars in billings and sales for digital goods and services, including those that come through subscription apps.
The new report, released Thursday, addresses many types of dark patterns, including sneaking, obstruction, nagging, forced action, social proof, and more.
Among the most common dark patterns encountered in the study was sneaking, referring to the inability to turn off auto-renewal of subscriptions during the sign-up and purchase process. Of the sites and apps studied, 81% used this technique to ensure their subscriptions would automatically renew. In 70% of cases, subscription providers did not provide information on how to cancel a subscription, and 67% failed to provide a date by which consumers should cancel their subscription to avoid being charged again.
Obstruction, another dark pattern common among subscription apps, makes it difficult or tedious to take a certain action, such as canceling a subscription or declining a free trial offer, as when the “X” to close the offer is grayed out and somewhat hidden from view.
Nagging involves repeatedly asking the consumer to perform some type of action the company wants them to take. (Although not a subscription app, one example of nagging is how TikTok repeatedly prompts users to upload their contacts to the app, even after the user says “no.”)
Forced action means requiring a consumer to take some kind of step to access specific functionality, such as entering payment details to participate in a free trial, something 66.4% of the sites and apps in the study required.
Meanwhile, social proof uses the power of the crowd to influence a consumer, usually to complete a purchase, by displaying metrics related to some type of activity. This is particularly common in the e-commerce industry, where a company will show how many other people are browsing the same product or adding it to their cart. For subscription apps, social proof can be used to nudge users to subscribe by showing how many other people are doing the same thing.
The study found that 21.5% of the websites and apps it examined used notifications and other forms of social proof to push consumers to sign up for a subscription.
Websites can also try to create a sense of urgency to get consumers to buy. This is something we see regularly on Amazon and other e-commerce sites, where people are alerted to low stock, prompting them to quickly complete a purchase, but it may be less commonly used to sell subscriptions.
Interface interference is a broad category that refers to the ways an app or website is designed to nudge a consumer toward a decision that favors the company. This can include pre-selecting items, such as longer or more expensive subscriptions, as 22.5% of the sites and apps studied did, or using a “false hierarchy” to visually present the options more favorable to the company more prominently. The latter was used by 38.3% of the companies in the study.
Interface interference may also include what the study referred to as “confirmshaming”: the use of emotionally charged language to manipulate a consumer’s decision-making, such as “I don’t want to miss out, sign up with me!”
The study, conducted from January 29 to February 2 as part of the International Consumer Protection and Enforcement Network’s (ICPEN) annual sweep, included 642 subscription-based sites and apps. The FTC noted that it is chairing ICPEN for the 2024-2025 period. Officials from 27 authorities in 26 countries participated in the study, using dark pattern descriptions created by the OECD. However, the study’s scope was not to determine whether any practices were illegal in the participating countries; that was left to individual governments to decide.
The FTC participated in the ICPEN review, which was also coordinated with the Global Privacy Enforcement Network, a network of more than 80 privacy enforcement authorities.
This isn’t the first time the FTC has looked into the use of dark patterns. In 2022, it authored a report detailing a range of dark patterns, but that one wasn’t limited to subscription-based websites and apps. The older report focused on dark patterns across industries, including e-commerce and kids’ apps, as well as different types of dark patterns, such as those used in cookie consent banners.