In the wake of the Supreme Court’s 2021 decision in Facebook v. Duguid, which held that most smartphones and similar modern technology do not qualify as “automatic telephone dialing systems” under the Telephone Consumer Protection Act (TCPA), there has been a spike in state legislative activity aimed at strengthening local telemarketing laws. Florida’s Telephone Solicitation Act (FTSA), which took effect on July 1, 2021, was the first state telemarketing law of its kind. Because the FTSA does not clearly define the types of automated technology it covers, it leaves room for a broader interpretation of the devices that can qualify as regulated dialing technology. Oklahoma has now become the next state to enact such legislation: the Oklahoma Telephone Solicitation Act (OTSA), which largely mimics the FTSA, took effect on November 1, 2022.

Continue Reading Oklahoma’s New Restrictive Telemarketing Law: Could Other States Be Next?

On November 9, 2022, the New York Department of Financial Services (“NYDFS”) announced proposed amendments to its Part 500 Cybersecurity Rules (“Proposed Amendments”), revising an initial set of draft amendments released in July 2022. While NYDFS may have relatively limited jurisdiction, its emphasis on rapid breach reporting and data governance has had considerable influence on other U.S. financial services regulators. The current Cybersecurity Rules impose a 72-hour reporting requirement for cybersecurity events, and the Proposed Amendments go further, creating an additional 24-hour notification obligation when a ransomware payment is made. Additionally, the Proposed Amendments create new requirements for larger “Class A” companies, including a risk assessment by an external expert every three years and an annual independent audit of their cybersecurity programs.

Continue Reading NYDFS Proposes Significant Amendments to its Cybersecurity Rules

Illinois continues to be a hotbed of privacy litigation, in large part due to Illinois’s landmark Biometric Information Privacy Act (BIPA), which was enacted in 2008. Despite the flood of cases in the wake of Rosenbach v. Six Flags Ent. Corp., 2019 IL 123186, 129 N.E.3d 1197 (Ill. 2019), only one BIPA class action has so far proceeded to trial. On October 12, 2022, in Richard Rogers v. BNSF Railway Company (Case No. 19-C-3083, N.D. Ill.), a federal jury in Chicago found in favor of a class of more than 44,000 truck drivers who alleged that BNSF Railway Company (BNSF) violated BIPA by scanning employee fingerprints for identity verification purposes without providing notice and obtaining their prior written permission. U.S. District Judge Kennelly entered a judgment against BNSF for $228M in damages. The case highlights many important considerations for organizations deploying biometric technologies in Illinois, including the potential for vicarious liability for a vendor’s actions, and provides valuable insight into how damages in BIPA cases are calculated. The verdict demonstrates that defendants can face significant civil liability in BIPA litigation, and companies that use or collect biometric information should be aware of these risks.

Continue Reading First-Ever BIPA Trial – Jury Awards Staggering $228M in Damages

At a meeting of the California Privacy Protection Agency (“CPPA”) on June 8, we learned additional information about the initial batch of proposed regulations (“Proposed Regulations”) to the California Privacy Rights Act (“CPRA”) that were published on May 27. The Proposed Regulations retain much of the pre-existing California Consumer Privacy Act (“CCPA”) regulations but modify some key provisions and add others. Because the CPRA was drafted as an amendment to the CCPA, the Proposed Regulations reference the CCPA (as amended by the CPRA). The Proposed Regulations focus on data subject rights, contractual requirements, and obligations related to disclosures, notices, and consents. Additional proposals will cover cybersecurity audits, privacy risk assessments, and automated decision-making, among other areas. While we expect significant changes as the Proposed Regulations proceed through the formal rulemaking process, which the CPPA has not yet officially started, we provide our key takeaways below:

Continue Reading Recent Activity from the California Privacy Protection Agency

On April 28, 2022, the Connecticut General Assembly passed SB 6, the Act Concerning Personal Data Privacy And Online Monitoring (the “Connecticut Privacy Act”), by a vote of 144-5, putting Connecticut on course to become the fifth state to enact a comprehensive data privacy law, following California, Virginia, Colorado, and Utah. The bill, which passed the state senate 35-0, now awaits the signature of Governor Ned Lamont. If it becomes law, the bulk of the statute is set to take effect on July 1, 2023.

The bill passed by the Connecticut legislature closely follows the structure of similar laws enacted in other states, lending support to the Colorado legislature’s claim that “states across the United States are looking to [the Colorado Privacy Act, enacted in 2021] and similar models to enact state-based data privacy requirements and to exercise the leadership that is lacking at the national level.” One of the Connecticut bill’s sponsors and its key proponent in the state senate, Sen. James Maroney, compared the legislation to Colorado’s statute, saying that both SB 6 and the Colorado law are less aggressive than the California Consumer Privacy Act (“CCPA”) but provide more privacy protections than similar bills passed by other states.

Continue Reading Connecticut Becomes the Fifth State to Pass a Comprehensive Data Privacy Law

The California Attorney General’s office (OAG) recently released its first formal written opinion on the scope of the rights granted to consumers under the California Consumer Privacy Act (CCPA), specifically, the right for a consumer to know about the personal information that a business collects from them. The opinion comes in response to a question submitted by California Assembly member Kevin Kiley as to whether a consumer’s right to know the specific pieces of personal information that a business has collected about that consumer applies to internally generated inferences the business holds about them. The OAG asserted that the right to know does apply to such inferences, albeit with certain key exceptions.

Continue Reading California Attorney General’s Office Releases First Formal CCPA Opinion

On March 24, 2022, Utah Governor Spencer Cox signed into law the Utah Consumer Privacy Act (“UCPA”), which was unanimously passed by the state legislature earlier this month. Utah is the fourth U.S. state to pass a comprehensive privacy law, following California, Virginia, and Colorado. The UCPA will go into effect on December 31, 2023.

The Utah law generally resembles the three existing state privacy models but most closely tracks the Virginia Consumer Data Protection Act (CDPA) and the Colorado Privacy Act (CPA), suggesting that states are shifting away from California’s more stringent strand of privacy regulation toward a version that balances the spirit of the EU’s General Data Protection Regulation (GDPR), in terms of purpose limitation and consumer protection, against the need to avoid overly burdening companies. In fact, the UCPA is seen by some as more business-friendly than the legislation passed in Virginia and Colorado: Utah’s law does not require businesses to conduct data protection assessments and does not compel companies to provide a mechanism for consumers to appeal denials of requests to exercise personal data rights.

Continue Reading Utah Passes Comprehensive Privacy Law

In a unanimous decision issued on February 3, 2022, the Illinois Supreme Court held in McDonald v. Symphony Bronzeville Park that the Illinois Workers’ Compensation Act (“WCA”) did not bar claims under the Illinois Biometric Information Privacy Act (“BIPA”). In doing so, the court eliminated a significant defense commonly raised in such cases: many BIPA class actions are brought in the employment context, and many of those were stayed pending the decision in McDonald. Critically, though, the decision does not preclude other potential defenses, including claims of federal preemption.

BIPA is one of the most actively litigated privacy statutes in the United States. Among other things, it requires that businesses obtain consent prior to collecting biometric information (fingerprints, facial geometry information, iris scans, and the like), issue a publicly available data retention policy, and refrain from certain data sales and disclosures. Because BIPA provides for a private right of action along with statutory damages of $1,000 to $5,000 per violation, it has proved fertile ground for the plaintiffs’ bar.

Continue Reading Illinois Supreme Court Finds Illinois Biometric Information Privacy Act Not Preempted By State Workers’ Compensation Law

Since the passage of the California Consumer Privacy Act (CCPA) in 2018, many states have proposed sweeping data protection legislation, but only two others, Colorado and Virginia, have so far succeeded in passing such laws. That may soon change. In 2021, several states came close to enacting comprehensive privacy legislation and that momentum has continued into this year, with data protection bills being carried over, introduced, and reintroduced in state legislatures across the country. As the possibility of a federal privacy law dwindles—particularly during this midterm year—state legislatures are poised to be the source of major data protection developments in 2022. Throughout the year, Ropes & Gray will monitor and analyze these developments in state privacy laws, beginning with a discussion of the latest iteration of the proposed New York Privacy Act.

Continue Reading State Privacy Law Developments: The New York Privacy Act

Artificial intelligence (AI), including machine learning and other AI-based tools, can be an effective way to sort large amounts of data and make uniform decisions. Some employers have embraced such tools as an efficient way to address increased hiring needs in the current job market. The use of AI as an aid to employers in making employment decisions (e.g., recruitment, resume screening, or promotions) has been on the radar of lawmakers and regulators in recent years, particularly out of concern that these tools may mask or entrench existing discriminatory hiring practices or create new ones. For example, some workers have filed charges with the Equal Employment Opportunity Commission (EEOC) alleging discrimination resulting from employers’ use of AI tools, leading the EEOC to establish an internal working group in October 2021 to study the use of AI in employment decisions. Elsewhere, a bill addressing the discriminatory use of AI was proposed in Washington, DC in late 2021, and Illinois enacted one of the first U.S. laws directly regulating the use of AI in employment-related video interviews in 2019. By contrast, a bill proposed in California in 2020 suggested that AI could be used in employment to help prevent bias and discrimination.

On November 10, 2021, the New York City Council passed the latest such bill, which places new restrictions on New York City employers’ use of AI and other automated tools in making hiring and promotion decisions. The measure, which takes effect on January 2, 2023, regulates the use of “automated employment decision tools” (AEDTs), which it defines as computational processes “derived from machine learning, statistical modeling, data analytics, or artificial intelligence” that issue a “simplified output” to “substantially assist or replace” decision-making on employment decisions (i.e., hiring new candidates or promoting employees). Under the new law, employers and employment agencies are barred from using AEDTs to screen candidates unless certain prerequisites are met. First, the AEDT must have been subjected to a bias audit within the past year. Second, a summary of the results of the most recent audit, as well as the distribution date of the AEDT, must be made publicly available on the employer’s or employment agency’s website. The law describes this “bias audit” as “an impartial evaluation by an independent auditor” which “shall include, but not be limited to,” assessing the AEDT’s “disparate impact on persons” based on race, ethnicity, and sex.

Continue Reading NYC Law Aims To Reduce Bias Introduced by AI in Employment Decisions