While students are about to embark on their holiday break, there is no such luck for educational technology (“EdTech”) providers. Privacy, cybersecurity, and artificial intelligence compliance obligations have proliferated over the past year, with no signs of slowing down. While it is hard to keep track of the numerous regulations and proposals at the state and federal levels, below I have highlighted a few issues for EdTech providers to monitor in the coming year.

COPPA Rulemaking

On January 11, 2024, the Federal Trade Commission (“FTC”) officially published its Notice of Proposed Rulemaking (“NPRM”) updating the Children’s Online Privacy Protection Rule (“COPPA Rule”). At that time, I published a blog providing analysis and commentary on the NPRM, which you can read here. It is unlikely that the FTC will release a final rule in the twilight of the Biden administration, but there is always the possibility. While increasing children’s privacy and safety protections has generally become a bipartisan effort, we unfortunately do not have direct insight into how Republican-appointed FTC Commissioners view the NPRM because there were none on the FTC at the time the NPRM was published. We will have to wait and see what the Trump administration’s approach will be.

One of the main topics of focus of the NPRM is the applicability of the COPPA Rule to EdTech. The FTC has been telegraphing its interest in ensuring EdTech providers follow the COPPA Rule for years. In May 2022, the FTC published a policy statement on the COPPA Rule’s applicability to EdTech, which expounded on the FTC’s enforcement focus as it relates to EdTech, specifically highlighting the COPPA Rule’s prohibitions against mandatory collection, limitations on how EdTech providers can use and retain personal information, and cybersecurity requirements. The FTC followed the policy statement with an enforcement action against EdTech provider Edmodo in May 2023. And then the FTC published its NPRM codifying longstanding FTC guidance on the interplay between the COPPA Rule and EdTech providers.

The FTC’s current COPPA Rule guidance specifies that an operator may rely on “school consent” under the Family Educational Rights and Privacy Act (“FERPA”) instead of verifiable parental consent under COPPA when it collects personal information from a child (under 13 years old), provided the operator uses the information for an educational purpose and for no other commercial purpose, including marketing and advertising. Under FERPA, an EdTech provider can rely on “school consent” for the collection of student information if it falls under the school official exception. The school official exception allows a school to disclose personal information from a student’s education record to an EdTech provider without parental or student consent if the EdTech provider is providing institutional services or functions, is under the direct control of the school with respect to the use and maintenance of education records, and complies with FERPA requirements surrounding use and redisclosure.

The NPRM codifies the existing COPPA Rule guidance allowing “school consent” where the data is used for a school-authorized education purpose. The NPRM defines “school-authorized education purpose” as “any school-authorized use related to a child’s education. Such use shall be limited to operating the specific educational service that the school has authorized, including maintaining, developing, supporting, improving, or diagnosing the service, provided such uses are directly related to the service the school authorized. School-authorized education purpose does not include commercial purposes unrelated to a child’s education, such as advertising.” While the FTC states that it is only codifying existing guidance, the NPRM makes explicit what previously was vague and signals how narrowly the FTC is interpreting the exception.

Cybersecurity Obligations

As EdTech usage proliferates, companies and schools should carefully assess how the COPPA Rule NPRM will affect their use and disclosure of student data. But that should not be the only focus. As Ropes & Gray’s PLI treatise chapter on education data underscores, there are significant cybersecurity risks for entities holding educational data and a growing number of regulatory obligations. In 2024, the education and research sector experienced the largest increase in attacks of any industry, a 53% year-over-year increase averaging 3,341 attacks per organization per week. In response to the growing risks, regulators have focused on ensuring compliance with an assortment of cybersecurity requirements for educational institutions under the COPPA Rule, FERPA, the Student Aid Internet Gateway Enrollment Agreement, the Gramm-Leach-Bliley Act, and the FTC’s Identity Theft Red Flags Rule.

The COPPA Rule NPRM is a good example of a recent proposal by regulators to require covered entities to implement certain cybersecurity controls. The NPRM includes requirements such as designating an employee to coordinate the information security program; at least annually, performing assessments to identify risks to the confidentiality, security, and integrity of personal information collected from children; designing, implementing, and maintaining safeguards to control any identified risks, as well as testing and monitoring the effectiveness of such safeguards; and, at least annually, evaluating and modifying the information security program. It also requires covered entities to obtain written (not verbal or other) assurances from third parties or other operators that they will employ reasonable measures to maintain the confidentiality, security, and integrity of the information they receive from the operator. Regulators continue to treat the above requirements as the baseline for a sufficient cyber program. For a more in-depth review of cybersecurity requirements, as well as incident reporting requirements, please review Ropes & Gray’s PLI treatise.

AI Focus

Privacy and cybersecurity are not the only areas of focus for regulators overseeing the protection of student information. For example, last month, the U.S. Department of Education’s Office for Civil Rights (“OCR”) released guidance on how covered entities can avoid the discriminatory use of artificial intelligence (“AI”). The guidance provides specific examples of uses of AI that constitute race, color, national origin, sex, and disability discrimination. These examples focus on EdTech providers offering services such as facial recognition, AI-enabled student risk assessments for disciplinary actions, predictive analytics, scheduling software, and test proctoring. EdTech providers using or planning to use generative AI in their services should carefully review these examples and configure their services to ensure they do not run afoul of OCR’s guidance. This will likely be a topic of enforcement by OCR in the coming years. In fact, on November 26, 2024, the FTC took action against a company that allegedly made false claims about the extent to which its AI-powered security screening system could detect weapons, including in school settings. This may be a precursor of future enforcement.

Conclusion

EdTech providers need to carefully review whether their privacy and cybersecurity programs comply with their obligations under the applicable regulations discussed above. And as more EdTech providers integrate generative AI into their products and services, companies should ensure they are not inadvertently violating civil rights laws. There continues to be a bipartisan focus from legislators and regulators on protecting kids’ privacy and safety. Regardless of the change in presidential administrations, protecting students’ information from cybersecurity threats, privacy harms, and discriminatory uses will remain a priority.

For more information on PLI’s new edition of its cyber law treatise, Cybersecurity: A Practical Guide to the Law of Cyber Risk, click here.