The past year has seen unprecedented growth and development of artificial intelligence (“AI”) tools, propelled in significant part by the rapid deployment of generative AI (“GenAI”) tools.  The health care and life sciences industries have increasingly turned to AI and GenAI tools to promote innovation, efficiency and precision in the delivery of treatment and care, as well as in the production of biologics and medical devices.  For example, AI tools may predict and analyze diagnostic test results and develop personalized treatments more accurately than traditional tools; may improve clinical trial design, eligibility screening and data analysis; may be used as a diagnostic tool in a clinical trial designed to assess the safety or efficacy of a medical device; and may be used to accelerate the drug development timeline.  While such uses raise inherent concerns regarding, among other things, the improper use and/or disclosure of personal information, the introduction and/or perpetuation of bias and discrimination, and data security, reliability, transparency and accuracy, there is currently no developed federal or cohesive state regulatory framework designed to minimize such risks.

Continue Reading The 2023 AI Boom Calls for Further Regulation of the Use of AI Tools in the Health Care and Life Sciences Industries

Decisions, decisions.  We are deluged by decisions.  What present should I buy?  Is the small cheese plate enough for my party guests, or should I go with the large?  How much of my bonus should I set aside for retirement this year, or should I up my charitable giving? 

Wouldn’t it be nice if we could all get a little technological assistance in making choices this holiday season?

Continue Reading Jingle All the Algorithms: Automated Decisionmaking Amidst a Blizzard of State Privacy Laws

Earlier this year, the UK government released an AI white paper outlining its light-touch, pro-business approach to AI regulation. Eight months on, the UK appears to be holding firm to this approach, with Jonathan Camrose (the UK’s first Minister for AI and Intellectual Property) stating in a speech on 16 November 2023 that there will be no UK law on AI ‘in the short term’.

This stance has been taken in spite of developments elsewhere in the world. The EU, by contrast, continues to take significant steps towards finalization and implementation of its landmark AI Act, with policy-makers announcing on 8 December 2023 that they had reached agreement on the Act. Progress has also been made across the pond, with President Biden issuing the executive order on Safe, Secure and Trustworthy Artificial Intelligence on 30 October 2023, with the intention of cementing the US as a world leader in the field. The UK’s reluctance to regulate in this area has been criticised by some as failing to address consumer concerns – but will this approach continue into 2024?

Continue Reading AI Regulation in 2024 – Will The UK Continue to Remain The Outlier?

Not that long ago, financial sector regulations seldom mentioned cybersecurity expressly, instead addressing the issue indirectly through restrictions focused on general system safeguards and omnibus reporting requirements. Gone are those days. Over the past few years, federal and state regulators have increased focus on information security issues impacting financial institutions, introducing a spate of cyber rules that often include stringent regulatory reporting and disclosure requirements. This year was no different.

Continue Reading Making a List and Checking it Twice: The Impact of Cybersecurity Regulations on Financial Services in 2023

Last holiday season, we were looking under the tree to see if President Biden and the U.S. Congress would leave the gift of a new national children’s online privacy and safety law—and whether it would turn out to be a welcome surprise or a lump of coal. It was widely reported that a group of senators were pushing to include the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and the Kids Online Safety Act (“KOSA”) in the fiscal year 2023 funding bill. However, once everything was unwrapped, the bills were pulled from the funding package.

Continue Reading Naughty or Nice: Children’s Online Privacy and Safety Developments and Expectations

For the second day of data, we are taking a look around the world. The most significant new international data protection law of 2023 is probably India’s long-awaited comprehensive data protection law, the Digital Personal Data Protection Act, 2023 (the “DPDP Act”). The DPDP Act was enacted and notified in the Official Gazette on 11 August 2023. The law will not come into effect until the government provides notice of an effective date, which is still forthcoming, with different effective dates expected for different provisions. Last month, Rohan Massey, co-leader of Ropes & Gray’s data, privacy & cybersecurity practice, sat down with Sajai Singh, a partner at J. Sagar Associates in Bangalore, to discuss the law.

Continue Reading Unpacking India’s Digital Personal Data Protection Act

On the first Day of Data, we recap a busy year for the Federal Trade Commission (“FTC”), highlighting key enforcement decisions from 2023 and reading the tea leaves for what promises to be an equally active 2024 for the agency on topics ranging from online tracking technologies to artificial intelligence.

Continue Reading Walking in a Data Wonderland: A Look Back at the FTC’s 2023 Privacy Enforcement Actions

As the year draws to a close, we’re excited to kick off the third annual installment of the 12 Days of Data, our favorite holiday tradition.

Join us for a festive journey over the next few weeks, as we count down twelve key areas of growth in privacy and cybersecurity in 2023 and look forward to what we can expect in 2024. We will be dashing through privacy law developments in the United States and abroad, the evolution of cybersecurity, and the latest trends in data litigation. It’s a comprehensive look at the year that was and a glimpse into the future of data protection.

We’re making our list and checking it twice, ensuring you don’t miss a single update. Subscribe to www.RopesDataPhiles.com to receive alerts about our latest posts.

On November 13, 2023, New York Governor Kathy Hochul announced the release of proposed statewide hospital cybersecurity regulations that would require state-licensed hospitals to establish cybersecurity programs, policies and procedures (the “Proposed Regulations”). The Proposed Regulations feature requirements regarding cybersecurity policies and procedures, personnel, user authentication methods, security risk assessments, incident response plans, and two-hour reporting of certain incidents.

If approved by the New York State Public Health and Health Planning Council (“PHHPC”) and subsequently finalized, the Proposed Regulations would supplement federal Health Insurance Portability and Accountability Act (“HIPAA”) Security Rule requirements but would be broader in some respects, including with regard to what information is subject to the requirements.

Click here to read Ropes & Gray’s Client Alert on the proposed requirements.

On October 30, 2023, President Biden issued an executive order (“EO”) on the safe, secure, and trustworthy development and deployment of artificial intelligence (“AI”) that has the potential to set far-reaching standards governing the use and development of AI across industries. Although the EO does not directly regulate private industry, apart from certain large-scale models or computing clusters deemed to potentially impact national security (discussed below), it requires federal agencies, including the Departments of Commerce (principally through the National Institute of Standards and Technology (“NIST”)), Energy, and Homeland Security, among others, to issue standards and guidance and to use their existing authorities, including regulatory authorities, to police the use of AI in ways that will impact business for years to come. In addition, it devotes federal resources toward AI-related education, training and research, including the further development of privacy-enhancing technologies (“PETs”) such as differential privacy and synthetic data generation.

Click here to read Ropes & Gray’s Client Alert detailing the new EO.
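For readers less familiar with the PETs mentioned above, differential privacy works by adding calibrated random noise to aggregate statistics so that no single individual’s record can be reliably inferred from the published result. The sketch below is a minimal, illustrative example of the classic Laplace mechanism in Python; it is not drawn from the EO or from any agency guidance, and the dataset, function name, threshold and epsilon value are hypothetical choices made purely for illustration.

```python
import numpy as np

def laplace_count(values, threshold, epsilon):
    """Differentially private count of values above a threshold.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the count by at most 1), so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy for this query.
    """
    true_count = sum(v > threshold for v in values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: releasing the number of glucose readings above 125
# from a small dataset, with a privacy budget of epsilon = 0.5.
readings = [98, 140, 110, 132, 127, 101, 119, 151]
print(laplace_count(readings, threshold=125, epsilon=0.5))
```

Smaller epsilon values add more noise, trading accuracy for stronger privacy protection; striking that balance in practice is exactly the kind of question the standards and guidance contemplated by the EO may eventually help answer.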