If 2021 is any indication, the Federal Trade Commission (FTC) shows no signs of slowing its pursuit of enforcement actions addressing a wide variety of alleged privacy and cybersecurity issues. Under the leadership of its new chair, Lina Khan, the Commission spent the past year engaged in a range of new and expanded enforcement actions, exhibiting an increasing interest in regulating data privacy and security, as well as other consumer protection areas.

While the FTC has become the de facto regulator for entities that are not subject to other sector-specific regulations, the Commission’s assertion of authority over privacy and cybersecurity matters is limited by its statutory powers under Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices” that injure consumers. The FTC’s expansion of that authority to cover privacy and cybersecurity matters has grown more aggressive in recent years, but it has also drawn close judicial review. Notably, in 2018, the Eleventh Circuit ruled, in LabMD, Inc. v. FTC, that the FTC did not have unlimited authority to dictate the details of companies’ privacy and cybersecurity protections. Earlier this year, the Supreme Court held, in AMG Capital Mgmt., LLC v. FTC, that Section 13(b) of the FTC Act does not allow the FTC to obtain monetary relief in federal court. The FTC has asked Congress to restore this authority, describing its loss as depriving the Commission of its “best and most efficient tool for returning money to consumers who suffered losses as a result of deceptive, unfair, or anticompetitive conduct.”

The FTC has pushed for a more expansive view of its authority for several years, and that push has only intensified over the last year. Even before the AMG decision, the FTC had been urging Congress to address the gap in Section 13(b), which explicitly provides only for injunctive relief and is silent on monetary relief. While it waits for Congress to act, we expect the FTC to continue to bring enforcement actions and to order restitution and disgorgement under its Section 19 authority, which provides for these types of relief, but only after a final cease-and-desist order, which can itself be challenged and is subject to review by the appellate courts.

Key Areas of Focus in 2021

None of these challenges to its authority has slowed the FTC’s pursuit of its overarching goals of consumer protection and transparent data collection and use practices. This year, the FTC brought several enforcement actions in key areas of focus pertaining to privacy, cybersecurity, and consumer protection: children under 18, algorithmic and biometric bias, health apps, and deceptive and manipulative conduct on the internet.

Children Under 18

As we will discuss in more detail in tomorrow’s post, protecting data related to students and children has been a key focus of the FTC this year. The FTC brought several enforcement actions in 2021 against companies for targeted advertising practices directed at minors. In just one example, the FTC brought an enforcement action against KuuHuub Inc., maker of an online coloring book application, for allegedly failing to provide notice to parents or obtain parental consent before collecting personal information from children, in violation of the Children’s Online Privacy Protection Act (COPPA). Without making any admissions, KuuHuub agreed to pay a $3 million civil penalty and refund all minors who had signed up to use the application. Additionally, KuuHuub agreed to notify its users of the COPPA violation and delete any personal information collected from minors under the age of 13 without parental consent.

Algorithmic and Biometric Bias

With the advancement and increased use of machine learning, artificial intelligence (AI), and similar technologies, the FTC is taking a closer look at how to avoid the biases and inequities that can arise from the use of algorithms and biometrics. As the FTC explained in a recent blog post: “apparently ‘neutral’ technology can produce troubling outcomes—including discrimination by race or other legally protected areas.” These statements suggest the FTC is keeping a close eye on AI and machine learning technology, which may play out in enforcement actions as these technologies become increasingly integrated into daily life.

Health Apps

In 2021, the FTC targeted health apps and connected-device companies, signaling its increasing scrutiny of apps that consumers use to monitor and improve their health, and that therefore handle sensitive personal health data. In June 2021, the FTC finalized its settlement with Flo Health, Inc., its first privacy-related health application case. The FTC alleged that Flo Health provided the health data of millions of its users to third parties that provided marketing and analytics services to Flo Health (including Facebook and Google), in violation of its own promises to users to keep their information private. As part of the settlement, the FTC required Flo Health to notify users of the disclosures made in violation of those promises and to instruct third-party recipients of users’ health information to destroy the data.

Coming on the heels of the Flo Health settlement, the FTC issued a Policy Statement explaining its view that the Health Breach Notification Rule applies to mobile health applications. These applications, and other devices that consumers use to monitor their health, had previously existed in a gray area—not covered by HIPAA and not clearly within the scope of the Health Breach Notification Rule, either. Although the FTC has not enforced the Rule since its enactment, this Policy Statement, along with the recent action against Flo Health, signals that the Commission intends to bring enforcement actions under the Rule. Makers of apps and devices that use similar technology to monitor and track consumers’ health should expect increased scrutiny in the coming months.

Dark Patterns

The FTC has declared its commitment to joining other regulators worldwide, especially in the EU, in combatting dark patterns, or deceptive and manipulative conduct built into user interfaces on the web. Such dark patterns have the effect, intentional or not, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice. Dark patterns take different forms, all of which mislead consumers in some way. For example, a website may include a fake countdown timer suggesting an expiring offer to pressure consumers into making a purchase, hide additional fees from users until the company has collected data and guided the consumer to a checkout page, or make it much easier and more obvious for users to opt in to sharing additional data than to opt out of that sharing.

The FTC’s commitment to addressing businesses’ use of dark patterns is evident in its complaint against Age of Learning, which alleged that the company used dark patterns in connection with its service “ABC Mouse.” The complaint alleges that ABC Mouse deliberately designed its service to make cancellation difficult for consumers, despite promising “Easy Cancellation.”

Expectations for 2022 and Beyond

In 2022, companies should prepare for the FTC to continue taking an aggressive stance against alleged privacy violations. In its statement on 2022 regulatory priorities, the FTC said it is “particularly focused” on seeking “penalties for firms that engage in data abuses” and may seek individual as well as company accountability. The Commission has also signaled that it intends to undertake an extended rulemaking proceeding to develop unfairness privacy principles under its Section 5 authority. As long as there is no comprehensive federal privacy law, the FTC will remain the primary privacy regulator, and this rulemaking would further clarify where the Commission intends to focus its attention. We will be watching closely for developments here in the new year.