Last holiday season, we were looking under the tree to see if President Biden and the U.S. Congress would leave the gift of a new national children’s online privacy and safety law—and whether it would turn out to be a welcome surprise or a lump of coal. It was widely reported that a group of senators were pushing to include the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and the Kids Online Safety Act (“KOSA”) in the fiscal year 2023 funding bill. However, once everything was unwrapped, the bills were pulled from the funding package.

This holiday season we may well see the ghost of kids’ privacy and safety legislation past. A few weeks ago, Senator Maria Cantwell, Chair of the Senate Committee on Commerce, Science, and Transportation (“Senate Commerce Committee”), stated that the Senate is on track to pass a package of child online-safety measures this year. The bills in discussion once again include COPPA 2.0 and KOSA, which were reported out of the Senate Commerce Committee in July, as well as the EARN IT Act, STOP CSAM Act, and the REPORT Act. It is too early to tell whether this holiday season will bring the gift of a Senate-passed national children’s online privacy and safety package, but we will continue to monitor Congress for any updates in 2024.

As we noted in our earlier post in this series, the Federal Trade Commission (“FTC”) was also in the spirit last holiday season when it announced on December 19, 2022 that Epic Games, the creator of the popular video game Fortnite, would pay $275 million to resolve alleged Children’s Online Privacy Protection Act (“COPPA”) violations, a record-setting penalty for COPPA enforcement. The settlement also included $245 million in refunds for alleged “dark patterns” that tricked players into making unwanted purchases, the FTC’s largest administrative order ever. After those penalties were finalized in early 2023, the agency continued to vigorously enforce COPPA, bringing enforcement actions against Edmodo for alleged misuse of its educational technology services, Amazon for alleged misuse of its virtual assistant Alexa, and Microsoft for alleged misuse of its Xbox gaming system. Along with its enforcement actions, the FTC proposed significant changes to its 2020 privacy order with Facebook (now Meta), including a prohibition on monetizing any data collected from users under the age of 18.

There is no indication that the FTC’s focus on COPPA enforcement will ebb in 2024, so we would not be surprised to see further significant enforcement actions in the coming year. Because 2024 will be the last year of President Biden’s first term, there will also likely be updates from the FTC on the COPPA Rule review, initiated in 2019, and on the Commercial Surveillance and Data Security rulemaking, initiated in 2022, which includes important sections on children’s online privacy and safety. The FTC may also announce findings on the comments it received regarding the use of facial recognition as a parental consent mechanism under COPPA. Beyond the FTC, 42 attorneys general brought a lawsuit earlier this year alleging that Meta’s business practices violate state consumer protection laws and COPPA. We will be watching developments in that lawsuit closely in the coming year.

In 2023, the Biden Administration’s approach to children’s online privacy and safety included actions outside the FTC’s purview. The Administration announced the creation of an interagency Task Force on Kids Online Health and Safety. The Department of Education (“ED”) initiated a rulemaking focused on student privacy under the Family Educational Rights and Privacy Act and promulgated model policies and best practices for school districts on the use of internet-enabled devices. The Department of Commerce encouraged state broadband administrators and other state digital equity leaders to pursue ways of preventing online harassment of children, while the Department of Homeland Security and the Department of Justice deepened their partnership with the National Center for Missing and Exploited Children to fight child sexual abuse material. In addition to these actions, the National Telecommunications and Information Administration solicited comments on best practices to protect minors’ mental health, safety, and privacy online.

In 2023, state lawmakers across the country also placed social media on the naughty list, especially where kids were concerned. Florida, Arkansas, Utah, Texas, and Louisiana all adopted laws that restrict children’s ability to access social media. Florida’s law prohibits online platforms likely to be predominantly accessed by children from processing any child’s personal information if the platform has actual knowledge of, or willfully disregards, the possibility that the processing may result in substantial harm or privacy risk to a child; the prohibition is not subject to parental override. Under the Arkansas, Utah, Texas, and Louisiana laws, social media platforms need express consent from a parent or guardian before creating a social media account for a child. We expect more state legislatures to attempt to pass such laws in the year ahead. The Maryland and Minnesota legislatures will be ones to watch, as each held robust debates on children’s privacy and safety bills during the 2023 legislative session.

Even so, there are significant concerns about these laws’ constitutionality. In September, the U.S. District Court for the Northern District of California granted a preliminary injunction against the most significant children’s online privacy and safety law, the California Age-Appropriate Design Code Act, and federal courts issued similar injunctions against the Arkansas and Texas laws. We expect such challenges to continue in 2024, given that the laws’ restrictions could implicate First Amendment rights and other constitutional principles.

In Europe, there have also been important legislative and enforcement developments for children’s online privacy and safety in 2023. On the legislative side, the United Kingdom passed the Online Safety Act 2023, which requires social media platforms to prevent children from accessing harmful and age-inappropriate content. On the enforcement side, both the U.K.’s Information Commissioner’s Office and Ireland’s Data Protection Commission, TikTok’s lead supervisory authority in the E.U., levied multi-million dollar fines against TikTok for its alleged misuse of children’s personal information. In 2024, enforcement actions are likely to continue, and the U.K. will develop further guidance for compliance with the Online Safety Act.