Following several unsuccessful attempts to secure federal preemption of state artificial intelligence regulations through Congress, President Trump turned to executive action, signing a sweeping executive order last Thursday night entitled “Ensuring a National Policy Framework for Artificial Intelligence.” The Executive Order directs federal agencies to challenge state laws regulating AI, with the stated goals of establishing a “minimally burdensome national standard” for AI and preempting conflicting state regulations.

In a Client Alert published Friday, Ropes & Gray partners Jamie E. Darch, Fran Faircloth, Regina Sam Penti, and Stephanie A. Webster, counsel Chetan A. Patil, and associates Joanne J. Hyun and Kate Kaplan summarize the Executive Order’s key provisions and analyze its potential impact on existing and proposed state laws.

To read the full Ropes & Gray alert, click here.

As firms face rising data volumes, competitive pressure, and regulatory scrutiny, asset managers are increasingly turning to tools driven by artificial intelligence for everything from investment research and portfolio construction to risk modeling and operational efficiency.

In a recent whitepaper, Ropes & Gray partners Melissa Bender, Amy Jane Longo, Fran Faircloth, Megan Bisk, Colleen Meyer, and associate Michaela Powers outline the principal legal and regulatory considerations for asset managers adopting AI. Providing a high-level framework for managing legal, regulatory, operational, and reputational risks associated with AI adoption, the whitepaper also offers practical steps to implement responsible, compliant use.

To read the full Ropes & Gray whitepaper, click here.

December is upon us, which means it is time for the Data, Privacy, and Cybersecurity team at Ropes & Gray to kick off the 12 Days of Data, our annual blog series looking back at 2025 and ahead to what 2026 is likely to bring in the world of data protection. As regulators, courts, and policymakers continue to reshape the data protection landscape at a rapid pace, this series will highlight the trends, inflection points, and open questions that should be on every organization’s radar heading into the new year. Each post will focus on a discrete set of legal developments or a particular regulated sector, offering practical takeaways rather than year-end lists for their own sake.

The 12 Days of Data posts will be published between now and the end of the year, so be sure to subscribe to www.RopesDataPhiles.com to receive alerts as each post in the series goes live and to stay up to date on the latest insights from our team.

An increasingly aggressive plaintiffs’ bar has brought purported class action suits based on the nearly ubiquitous use of tracking technologies for website analytics. Although any actual harm to the plaintiffs is difficult to articulate, the health care industry has been plagued by a series of these cases. Now plaintiffs may be moving on to financial services, with the potential for statutory penalties of hundreds of dollars per user where a duty of confidentiality can be credibly implicated.

Tracking tags, pixels, and similar website analytics technologies are nothing new. Rather, the technologies at issue in such complaints are widely used on websites and mobile applications across industries, including by government entities, to collect information about user behaviors and interactions with the online platform where they are embedded. That information is then sent to a third party for analytics used to enhance user experience on the platform. Many of these technologies are integral to an organization’s ability to ensure its websites and applications are functioning properly, among other things providing crash reports when users encounter issues. Additionally, many consumer-facing businesses contract with third parties to provide session replay scripts, software that monitors and records web-user activity such as keystrokes, clicks, and scrolling. Despite the pervasiveness of these technologies, plaintiffs have seized on ambiguities in the California state wiretap act, known as the California Invasion of Privacy Act, as well as federal wiretap law as the basis for exceptionally large damage demands.

Continue Reading Pixel Litigation Risk at Financial Institutions

On this episode of the R&G Tech Studio podcast, Ropes & Gray partners and co-leaders of the firm’s AI initiative, Megan Baca and Ed McNicholas, delve into the key implications of President Trump’s new AI Executive Order 14179, contrasting it with the previous Biden administration’s approach to AI regulation. They explore the nuances of AI innovation versus AI safety, the potential conflicts between federal and state regulations, and the global landscape of AI governance. Tune in for an insightful conversation on how companies can navigate the evolving regulatory environment while balancing innovation and compliance.
Click here to listen.

The Artificial Intelligence and Machine Learning (“AI/ML”) risk environment is in flux. One reason is that regulators are shifting from AI safety to AI innovation approaches, as a recent DataPhiles post examined. Another is that the privacy and cybersecurity risks such technologies pose, which this post refers to as adversarial machine learning (“AML”) risk, differ from those posed by pre-AI/ML technologies, especially considering advances in agentic AI. That newness means that courts, legislatures, and regulators are unlikely to have experience with such risk, creating the type of unknown unknowns that keep compliance departments up at night.

This post addresses that uncertainty by examining illustrative adversarial machine learning attacks from the National Institute of Standards and Technology AML taxonomy and explaining why known attacks create novel legal risk. It further explains why existing technical solutions need to be supplemented by legal risk reduction strategies. Such strategies include asking targeted questions in diligence contexts, negotiating risk-shifting contractual provisions, and ensuring that AI policies address AML. Each can help organizations clarify and reduce the legal uncertainty AML threats create.

Continue Reading Adversarial Machine Learning in Focus: Novel Risks, Straightforward Legal Approaches

On March 7, 2025, the Department of Homeland Security (“DHS,” “the agency”) disbanded the Critical Infrastructure Partnership Advisory Council (“CIPAC,” “the Council”), originally established in 2006 to facilitate communication between the public and private sectors on critical infrastructure issues. CIPAC’s termination comes against the backdrop of the upcoming expiration of the Cybersecurity Information Sharing Act of 2015 (“CISA 2015,” “the Act”) on September 30, 2025. CIPAC and CISA 2015 have jointly provided a valuable legal and operational framework for sharing information between the public and private sectors in the U.S. for the past decade. Financial services industry stakeholders and members of Congress have expressed concern in recent months over increased cyber threats to industry stakeholders should the current public-private information sharing framework deteriorate. These recent developments are poised to significantly impact the financial services industry’s cybersecurity landscape, absent steps by Congress and the Administration to provide continuity for the current framework.

Continue Reading CIPAC Disbandment and CISA 2015 Reauthorization: Recent Developments in the U.S. Cybersecurity Landscape

On this episode of the R&G Tech Studio podcast, Rohan Massey, a leader of Ropes & Gray’s data, privacy and cybersecurity practice, is joined by data, privacy and cybersecurity counsel Edward Machin to discuss the AI literacy measures of the EU AI Act and how companies can meet its requirements to ensure their teams are adequately AI literate. The conversation delves into the broad definition of AI systems under the EU AI Act, the importance of AI literacy for providers and deployers of AI systems, and the context-specific nature of AI literacy requirements. They also provide insights into the steps organizations should take to understand their roles under the AI Act, develop training modules, and implement policies and procedures to comply with AI literacy principles.

Click here to listen.

Ropes & Gray health care partner David Peloquin spoke with Bloomberg Law about the additional DOJ guidance regarding the Biden-era Executive Order 14117. DOJ has clarified the effective date for enforcement, promising to delay any enforcement efforts until July 8 for companies that show “good faith efforts to comply.” David noted that this guidance “allows compliance officers, privacy officers, those within companies working on this, to really get the resources they need.” To read the full article, click here, and to read the Ropes & Gray client alert detailing the additional guidance, click here.