On the first Day of Data, we recap a busy year for the Federal Trade Commission (“FTC”), highlighting key enforcement decisions from 2023 and reading the tea leaves for what promises to be an equally active 2024 for the agency on topics ranging from online tracking technologies to artificial intelligence.
Online Tracking Technologies
In what was in many ways the year of the pixel, the FTC stepped forward as a leading regulator in this space, consistent with the role it has taken as the primary privacy regulator in the U.S. Based on similar factual circumstances and with many of the same allegations and remedies, the FTC brought enforcement actions against GoodRx, Premom, and BetterHelp. In each matter, the FTC alleged that through online tracking technologies, including pixels and software development kits (“SDKs”), these businesses shared individuals’ sensitive health information with third parties such as Facebook and Google and monetized that health data for targeted advertising. According to the FTC, these actions not only violated the FTC Act and the companies’ own privacy representations, but in the case of GoodRx and Premom, also constituted an unauthorized disclosure of individually identifiable health information under the Health Breach Notification Rule (“HBNR”). Marking the first time the FTC has used this authority, which applies to vendors of personal health records that fall outside the scope of HIPAA, the agency declared that unauthorized disclosures of health information constitute a breach of security requiring notice under the HBNR. Among other alleged shortcomings, the FTC further noted that GoodRx, Premom, and BetterHelp did not take steps to limit third-party vendors’ use of health information, made misrepresentations about HIPAA compliance, and failed to maintain sufficient policies and procedures to safeguard users’ sensitive information.
In addition to civil penalties of $1.5 million (GoodRx), $100,000 (Premom), and $7.8 million (BetterHelp), the FTC’s orders prohibit the companies from sharing health information for advertising and require the companies to obtain user consent before disclosing health information for other purposes, cease making misrepresentations about privacy practices, instruct third parties to delete consumers’ health data, issue notices under the Health Breach Notification Rule, limit data retention, and implement a comprehensive privacy program. While the use of tracking technologies is widespread and essential to many business functions, businesses, especially those in the health care space and other sectors involving consumers’ sensitive information, would be well advised to take stock of the tracking technologies used on their websites and mobile apps, focusing in particular on what data is disclosed, to whom, and what limitations are placed on the recipients’ use of that information.
Data Broker Industry
Following publication of an advance notice of proposed rulemaking (“ANPR”) in August 2022 highlighting the FTC’s concerns about the commercial surveillance industry, it came as no surprise when later that same month the agency took action against data broker Kochava. The FTC argued that Kochava’s collection and sale of consumers’ precise geolocation from their mobile devices, enabling tracking of their movements to sensitive locations such as reproductive health clinics, places of worship, homeless and domestic violence shelters, and addiction recovery centers, exposes consumers to injury in the form of stigma, discrimination, physical violence, emotional distress, and other harms. The FTC sought a permanent injunction in Idaho federal court on the grounds that Kochava’s sale of sensitive data constituted an “unfair” practice in violation of Section 5 of the FTC Act in that it causes or is likely to cause substantial injury to consumers that they cannot reasonably avoid and that is not outweighed by countervailing benefits to consumers or competition. However, in May 2023, the judge granted Kochava’s motion to dismiss, finding that the FTC did not sufficiently allege substantial consumer harm, but allowing the agency to file an amended complaint. The FTC did so the next month, offering additional factual allegations in an amended complaint that was only just unsealed in November 2023 over the opposition of Kochava, whose motion for Rule 11 sanctions against the FTC was denied. We expect 2024 to bring further intrigue in this ongoing saga.
The data broker industry will undoubtedly remain under the spotlight in the new year and years to come. In addition to potential next steps by the FTC in connection with the ANPR discussed above, the Consumer Financial Protection Bureau this year announced an inquiry into data brokers’ business practices, and specifically whether certain data brokers should be regulated under the Fair Credit Reporting Act. States have also been active in this space, with Texas and Oregon passing data broker statutes in 2023, and California amending its existing data broker law. The California Delete Act is particularly noteworthy for its novel requirement that by 2026 all data brokers respond to a single deletion mechanism, to be created by the California Privacy Protection Agency, through which consumers can direct data brokers to delete their personal information. Given the broad manner in which a data broker is defined under these state laws, companies should evaluate whether their practices may bring them within scope.
Children’s Privacy
Children’s privacy was another area of focus for the FTC in 2023. This year, the FTC finalized record-setting penalties against Epic Games, creator of the popular video game Fortnite. In dual complaints, the FTC secured $275 million for violations of COPPA, alleging that Epic collected children’s personal information without parental consent and set voice and text chat features to “on” by default, as well as $245 million for deploying dark patterns that the FTC says tricked players into racking up unauthorized charges. Other prominent FTC COPPA actions in 2023 included a $20 million action against Microsoft arising out of allegations that, through its Xbox product, the company failed to obtain parental consent before collecting children’s personal information and retained such information longer than was reasonably necessary, and an action concerning Amazon’s Alexa alleging that Amazon retained children’s voice recordings indefinitely, misled parents about their ability to delete their children’s voice recordings by retaining written transcripts, and misled users about their ability to delete geolocation information, in violation of COPPA and the FTC Act.
Building on its 2022 policy statement on education technology (“ed tech”) tools, the FTC also fined ed tech provider Edmodo $6 million for failing to obtain verifiable parental consent and using children’s personal information for advertising. Moreover, while COPPA permits schools to serve as intermediaries between vendors and parents to obtain parental consent or to act as parents’ agents and consent on their behalf, the FTC alleged that Edmodo did not fulfill the necessary conditions for doing so, including by failing to provide a notice of its information practices containing all required information to the schools and using children’s information for non-educational commercial purposes. The resulting order mandates that Edmodo not require children to provide more information than is reasonably necessary to participate in activities, follow COPPA requirements for obtaining schools’ authorization to collect children’s personal information, not use such information for commercial purposes such as advertising or building user profiles, cease using schools as intermediaries to obtain parental consent, maintain a data retention and deletion schedule, and delete all personal information of children that was collected without parental consent or proper school authorization, as well as any models or algorithms developed in whole or in part using such information.
COPPA was also a focal point of the FTC’s high-profile move against Meta in 2023, in which the agency alleged in part that the social networking company violated COPPA, the FTC Act, and prior FTC orders by allowing children on Meta’s Messenger Kids product to communicate with contacts who were not approved by their parents. The FTC proposed modifying its 2020 order against Meta, which carried a $5 billion penalty, in relevant part by strictly limiting how Meta can use children’s and teens’ information (only to provide the service and for security purposes) and by imposing an outright ban on Meta’s ability to monetize the information or use it for its own commercial purposes, such as advertising or enhancing its data models and algorithms, even after the users turn 18.
General Section 5 Enforcement
As seen in the complaints and consent decrees discussed above, the FTC continues to use its general Section 5 authority to crack down on privacy and cybersecurity practices that it alleges are unfair or deceptive. In addition to incorporating such claims into the complaints filed and consent decrees reached in the specialized topic areas detailed above, the FTC also brought cases focused solely on Section 5 violations for inadequate information security and privacy practices, including misrepresentations or deceptive statements made by companies regarding the same. These complaints and orders offer valuable and concrete insights into what the FTC views as “reasonable” in this space. For example, the FTC started 2023 by finalizing two orders against the online alcohol marketplace Drizly, which suffered a data breach in 2020, and ed tech provider Chegg for alleged “security failures” and “lax security,” respectively. While neither case involved monetary penalties, the FTC argued that the companies did not provide reasonable security for individuals’ personal information in that they failed to implement written information security policies, impose reasonable access controls (e.g., password requirements and multifactor authentication), prevent data loss by monitoring and logging activity on their systems, conduct regular risk assessments (e.g., vulnerability scans, penetration testing), encrypt personal information (instead storing it in plain text), train employees and third-party contractors on information security, and implement procedures to inventory and delete personal information once it is no longer necessary. The companies were also faulted for making what the FTC considered to be deceptive statements about their security practices, including that they implemented reasonable security measures.
The resulting orders require Drizly (as well as its CEO personally) and Chegg to, among other obligations, not make misrepresentations about privacy and security; implement data minimization, retention, and mandated deletion; document and abide by a written information security program with specific enumerated elements; undertake third-party information security assessments; and incorporate multi-factor authentication, both internally for employees, contractors, and affiliates and as an option for consumers.
Last but not least, in November 2023, the FTC accused prison communications service provider Global Tel*Link Corp. of lax data security, misrepresentations about its cybersecurity posture, unfairly delaying and limiting notice of a data breach to a small population of potentially affected individuals, and making false and misleading statements in the notice about the severity of and risk resulting from the incident. Following a data security incident that potentially impacted upwards of 649,500 individuals, Global Tel*Link allegedly waited nine months to issue notices and notified only approximately 45,000 people. The resulting proposed order therefore requires Global Tel*Link not only to enhance its security practices but also to notify affected individuals who did not previously receive written notice, as well as the relevant Global Tel*Link prison and jail customers; post the notice on the home page and home screen of the relevant Global Tel*Link websites and apps; offer credit monitoring and identity protection services to all affected individuals; and, in the event of future security incidents, notify affected individuals within 30 days, a shorter window than the 45- or 60-day periods some state data breach notification statutes allow.
If 2023 is any indication, the next year promises to be another significant one for the FTC. In addition to continued attention to online tracking technologies, data brokers, children’s privacy, and reasonable privacy and cybersecurity practices, we’re also watching for continued FTC action regarding biometrics. The agency issued a “warning” about biometrics and resulting privacy and data security concerns in a May 2023 policy statement. Previewing potential remedies the FTC may pursue in relation to biometrics in the future, the agency is seeking to modify its 2020 order against Meta to expand existing notice and consent requirements relating to facial recognition to apply to any future uses of such technology.
Of course, it wouldn’t be a proper retrospective or look ahead without discussing artificial intelligence. While the FTC did not undertake any enforcement actions explicitly targeting AI, it wove significant statements and policy positions about AI into its complaints and orders throughout the year. In one of its clearest declarations to date, the FTC’s press release in the action against Amazon’s Ring stated that the agency will “hold businesses accountable for how they obtain and use the consumer data that powers their algorithms,” emphasizing that where the training of machine learning algorithms involves human review of such data, businesses must obtain consumers’ affirmative express consent. In this action and others, including against Edmodo and Amazon’s Alexa, the FTC ordered deletion of not only consumer information that the agency viewed as collected without proper consent but also any models, algorithms, and other “affected work product” or “data products” that relied on or incorporated such data. These stipulations are reminiscent of the algorithmic disgorgement seen in prior FTC actions, including against WW International (formerly Weight Watchers) in 2022 as well as Everalbum in 2021 and Cambridge Analytica in 2019. We expect to see more orders involving data and algorithmic disgorgement in coming years. In 2023, the FTC also launched an investigation into ChatGPT, issued a joint statement with the heads of other federal agencies on enforcement efforts against discrimination and bias in automated systems, and most recently, quietly approved a resolution that will allow the FTC to more easily issue civil investigative demands in non-public investigations into AI-related products and services, signaling that the FTC is just getting started with its AI agenda. 2024 promises to be an eventful year, indeed.