As the year draws to a close, reform of the data subject access request (DSAR) regime in the EU and the UK may turn out to be a welcome gift for organisations grappling with complex access requests. Regulators in both jurisdictions are signalling a more flexible, pragmatic approach to compliance, recognising that DSARs have often been exploited for tactical or disruptive ends.

Continue Reading On the Eleventh Day of Data… Unwrapping DSARs in 2026

In 1950, reflecting on the future of machine intelligence, Alan Turing observed: “We can only see a short distance ahead, but we can see plenty there that needs to be done.” With several large language models, most notably OpenAI’s GPT-4.5, passing the Turing Test in 2025, some governments have taken steps towards stricter regulation this year, with others still working to determine what “needs to be done” for AI regulation in the year ahead.

Most notably, this year saw key provisions of the EU AI Act—the world’s first comprehensive AI-dedicated law—take effect. However, rather than producing a “Brussels effect” in AI regulation, the global approach going into 2026 appears to be leaning towards that of the UK and U.S., which have led the charge for a looser regulatory environment in recent years.

Continue Reading On the Eighth Day of Data… AI Regulation – A 2025 Recap and a Look Ahead to 2026

On 30 November 2022, OpenAI made its ChatGPT generative artificial intelligence chatbot publicly available. In the two years since, its unprecedented growth has fostered a dramatic shift in public attention to, and interest in, all forms of AI. Now, the possibilities and risks presented by the continued development of AI are also firmly top of mind for businesses and regulators across the world.

Continue Reading New Year’s Resolutions: What 2025 Holds for AI Regulation

2023 was the year of artificial intelligence — and 2024 is already shaping up to be more (much more) of the same. The European Union’s legislative bodies passed the AI Act earlier this month, and although the text of the world’s first comprehensive AI law has yet to be finalised, the hype around it already feels unstoppable. That hype will turn into hard work over the next 12 months, as organisations grapple with understanding their obligations under the Act and putting in place a governance framework that meets those obligations. Needless to say, it will not be an easy task.

Continue Reading The Three European Union Laws That Need Your Attention in 2024

Earlier this year, the UK government released an AI white paper outlining its light-touch, pro-business approach to AI regulation. Eight months on, the UK appears to be holding firm to this approach, with Jonathan Camrose (the UK’s first Minister for AI and Intellectual Property) stating in a speech on 16 November 2023 that there will be no UK law on AI ‘in the short term’.

This stance has been taken in spite of developments around the world in this area. The EU, by contrast, continues to make significant steps towards finalisation and implementation of its landmark AI Act, with policy-makers announcing on 8 December 2023 that they had reached a final agreement on the Act. Progress has also been made across the pond, with President Biden issuing the Executive Order on Safe, Secure and Trustworthy Artificial Intelligence on 30 October 2023, with the intention of cementing the US as a world leader in the field. The UK’s reluctance to regulate in this area has been criticised by some as failing to address consumer concerns – but will this approach continue into 2024?

Continue Reading AI Regulation in 2024 – Will The UK Continue to Remain The Outlier?

Introduction

Throughout 2022, cybersecurity lawyers have kept their eyes firmly fixed on two pieces of EU cybersecurity legislation: the NIS2 Directive (“NIS2”) and the Cyber Resilience Act (the “CRA”). With NIS2 having been formally enacted by the EU and the draft text of the CRA having been published by the European Commission in September 2022, businesses should take time in 2023 to digest the implications of NIS2 and the CRA for their cybersecurity compliance programmes, both in terms of organisational measures and product compliance.

Continue Reading 2023 – A Year for Reflection on EU Cybersecurity

Preeminent privacy scholar and George Washington University Law School professor Daniel Solove joined Ropes & Gray’s virtual conference on “The Future of Global Data Protection” for a wide-ranging discussion with Edward McNicholas, co-leader of the Ropes & Gray data, privacy & cybersecurity practice, in which the pair explored:

  • The state of complexity and inconsistency in the international privacy law landscape
  • The inherent flaws in the models on which privacy laws are currently based
  • The risks of moving toward a regulatory model
  • Theories of harm in data breach cases
  • The role of the courts in adjudicating privacy laws

Please see below for an overview of some of these topics, or visit our blog, RopesDataPhiles, to access a recording of the session.

Continue Reading How Data Breaches Are Shaping the Global Data Protection Debate

The Court of Justice of the European Union (CJEU) held in its July 2020 Schrems II decision that, in order for entities in other countries to import personal data from the European Economic Area (EEA), the importer must be able to provide data protections ‘essentially equivalent’ to those the EEA offers under its General Data Protection Regulation. The CJEU expressed particular concern that United States national security intelligence-gathering laws prevent U.S.-based entities from providing such protections. This decision has sharply limited the sharing of clinical research data from the EEA to the United States. After describing the pertinent aspects of the Schrems II decision, this article evaluates U.S. national security intelligence-gathering frameworks, including Section 702 of the Foreign Intelligence Surveillance Act and Executive Order 12333. The article then leverages recent draft guidance from the European Data Protection Board to explain how entities may be able to adopt widely used contractual and technical measures, such as data pseudonymisation, to provide ‘essentially equivalent’ protections in the clinical research context.
Continue Reading Demystifying Schrems II for the Cross-Border Transfer of Clinical Research Data