Artificial intelligence (AI) tools, including machine learning and other AI-based systems, can be an effective way to sort large amounts of data and make uniform decisions. Some employers have embraced such tools as an efficient means of meeting increased hiring needs in the current job market. The use of AI as an aid to employers in making employment decisions—e.g., recruitment, resume screening, or promotions—has been on the radar of lawmakers and regulators in recent years, particularly out of concern that these tools may mask or entrench existing discriminatory hiring practices or create new ones. For example, some workers have filed charges with the Equal Employment Opportunity Commission (EEOC) based on alleged discrimination resulting from employers’ use of AI tools, leading the EEOC to establish an internal working group in October 2021 to study the use of AI in employment decisions. Elsewhere, a bill addressing the discriminatory use of AI was proposed in Washington, DC in late 2021, and Illinois enacted one of the first U.S. laws directly regulating the use of AI in employment-related video interviews in 2019. In contrast, a bill proposed in California in 2020 suggested that AI could be used in employment to help prevent bias and discrimination.

On November 10, 2021, the New York City Council passed the latest such bill, which places new restrictions on New York City employers’ use of AI and other automated tools in making decisions on hiring and promotions. The measure—which takes effect on January 2, 2023—regulates the use of “automated employment decision tools” (AEDTs), which it defines as computational processes “derived from machine learning, statistical modeling, data analytics, or artificial intelligence” that issue a “simplified output” to “substantially assist or replace” decision-making on employment decisions (i.e., hiring new candidates or promoting employees). Under the new law, employers and employment agencies are barred from using AEDTs to screen candidates unless certain prerequisites are met. First, the AEDT must have undergone a bias audit within the past year. Second, a summary of the results of the most recent audit, as well as the distribution date of the AEDT, must be made publicly available on the employer’s or employment agency’s website. The law describes this “bias audit” as “an impartial evaluation by an independent auditor” which “shall include, but not be limited to” assessing the AEDT’s “disparate impact on persons” based on race, ethnicity, and sex.


Continue Reading NYC Law Aims To Reduce Bias Introduced by AI in Employment Decisions

As 2021 comes to a close, it is a good time to take stock of the present state of U.S. privacy law. With the relatively recent passage of comprehensive privacy legislation in California, additional countries adopting laws that closely follow the principles of the EU’s General Data Protection Regulation (GDPR), and increasing public concern over how companies manage customers’ personal data, legal practitioners entered 2021 with high hopes that comprehensive federal privacy legislation might finally be on the horizon. Nevertheless, in a trend that is likely to continue in the year ahead, it was state legislatures rather than Congress that successfully added to the ranks of privacy laws with which businesses will soon need to comply.

Continue Reading Momentum Builds for State Privacy Laws but the Possibility of a Federal Law Remains Remote

Private employers in New York will now need to notify employees and obtain their acknowledgement before engaging in any electronic monitoring, under the provisions of S2628, signed by Governor Kathy Hochul on November 8, 2021, and effective May 7, 2022. With this law, New York joins Connecticut and Delaware in mandating that employers provide employees notice of monitoring, which, in practice, can be integrated into the sort of employee privacy notice required under the California Consumer Privacy Act.

Applicability and Obligations for Businesses

S2628 applies to any private employer with a place of business in New York that electronically monitors employees’ communications and internet activity. The law’s core provisions require that, upon an employee’s hiring, the employer provide prior written notice alerting the employee that their telephone conversations, e-mails, and internet access or usage may be monitored by any electronic device or system, such as a computer, telephone, wire, radio, or electromagnetic, photoelectronic, or photo-optical system. The notice must be in written or electronic form and acknowledged by the employee in writing or electronically. Employers must also post the notice describing the electronic monitoring in a conspicuous place that is readily available for employees to view.


Continue Reading New York Law Will Require Employee Notice and Acknowledgement Prior to Electronic Monitoring by Employer

On July 22, 2020, New York’s Department of Financial Services (NYDFS) filed its first cybersecurity enforcement action against First American Title Insurance Company (First American), seeking civil monetary penalties for several violations of its cybersecurity regulation, 23 NYCRR §500. Entities subject to New York’s Financial Services Law, such as First American, may be subject to a civil penalty of up to $1,000 per violation, or up to $5,000 per intentional violation, and according to NYDFS, each instance of unauthorized disclosure of nonpublic information (NPI) constitutes a separate violation. Therefore, an enforcement action under 23 NYCRR §500 may result in a hefty fine, particularly in the event of a large-scale data breach.
Continue Reading NYDFS Brings its First Cybersecurity Enforcement Action

We reported last summer on two new legislative enactments in New York that place new demands on how companies handle the personal data of New York residents: the Identity Theft Protection and Mitigation Services Act (ITPMS Act) and the Stop Hacks and Improve Electronic Data Security Act (SHIELD Act). Both were signed into law on July 25, 2019, and as described below, both have since come gradually into full effect. This includes their most significant feature: as of March 21, 2020, “any business that owns or licenses computerized data which includes private information of a resident of New York” now faces the prospect of an enforcement action by the New York Attorney General’s (AG) Office, and the assessment of penalties, if the company fails to develop, implement, and maintain “reasonable safeguards” for the protection of that information.
Continue Reading “Reasonable Safeguards Requirement” For Personal Information of New York Residents Now Kicks In (with even broader Privacy/Security Legislation Still in the Offing)