On Nov. 27, the California Privacy Protection Agency (CPPA) unveiled draft automated decision-making technology (ADMT) regulations that would set forth new protections for employees and consumers.
The draft ADMT regulations would govern businesses’ use of such technology by requiring pre-use notice to job applicants and employees, reinforcing rights to opt out of and access information about businesses’ use of automated tools, and requiring businesses to conduct risk assessments in certain situations.
The draft regulations would apply to California residents, including consumers, employees, job applicants, and other individuals in the business-to-business or employment context. In particular, the draft regulations would require employers that use automated tools to notify job applicants and employees that an employment decision (e.g., denial of employment opportunity or lowered compensation) was based on the use of ADMT and that the employee has a right to access information about how the technology was used.
The draft regulations are part of the CPPA’s regulatory authority established in amendments to the California Consumer Privacy Act (CCPA), which were approved by California voters in 2020 as part of Proposition 24, the California Privacy Rights Act of 2020, or the so-called CCPA 2.0.
While the draft regulations would put California at the forefront of regulating the use of ADMT, including artificial intelligence, with respect to individual privacy concerns, they remain merely a draft of potential regulations; the formal rulemaking process has not yet begun.
The draft regulations were published to facilitate public comment and will be discussed at the CPPA’s board meeting on Dec. 8, along with previously released proposed regulations regarding cybersecurity audits and risk assessments. As a result, these proposed regulations are likely subject to change before they are finalized.
Draft Regulations
The regulations would apply to automated decision-making technology, which is defined as “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation” to make or facilitate a decision.
Such decisions include profiling, which the regulations define as automated processing of personal information to analyze or predict a person’s job performance, behaviors, whereabouts, and other attributes, such as reliability, economic situation, or health. Profiling would include the use of some productivity monitoring tools, including keystroke loggers, attention monitors, facial- or speech-recognition technology, and social media and web-browsing monitoring applications.
The draft regulations would require businesses that use ADMT to provide employees with advance notice that informs them about the use of the technology, the decision-making process and outputs, and their rights to opt out of and access information about the use of such technology.
The notice should include the purpose of the technology’s use, a description of a person’s rights to opt out, a description of the person’s right to access information about the use of such technology, and an easy method for a person to get additional information about the use of the technology.
Such additional information includes:
- The logic used by the tool, including factors used to generate decisions.
- The decision-making result, such as a numerical score.
- How the business intends to use the output, including whether there is human involvement in the process.
- Whether a business’s use of the tool has been evaluated for validity, reliability, and fairness.
Businesses would be required to allow employees and applicants to opt out of the use of the automated tool for “a decision that produces legal or similarly significant effects,” which includes employment opportunities and compensation.
The draft regulations would provide certain exceptions to the opt-out right in various instances, such as investigating security incidents, preventing fraudulent or illegal actions directed at the business, protecting safety, responding to a consumer's request, and processing for which there is no reasonable alternative method. There would be a rebuttable presumption that a reasonable alternative method of processing exists, and the business would bear the burden of establishing otherwise.
Additionally, the draft regulations would require businesses to provide employees, job applicants, and independent contractors with the ability to opt out of “profiling,” including the use of keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial- or speech-recognition technology, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools.
The profiling provisions would allow employees and consumers to opt out of profiling "while they are in a publicly accessible place," which would include the use of Wi-Fi or Bluetooth tracking, radio frequency identification, drones, video or audio recording or live-streaming, facial- or speech-recognition technology, automated emotion assessment, geofencing, location trackers, or license-plate recognition.
Businesses using automated tools would have to provide two or more methods for consumers to submit opt-out requests. At least one of those methods would have to reflect the manner in which the business primarily interacts with the consumer.
Businesses would be allowed to provide consumers with the option to allow specific uses of automated tools, so long as an option to opt out of all uses is offered. Businesses would be required to wait at least 12 months from the date an opt-out request was received before asking consumers whether they want to consent to the use of automated tools.
Risk Assessments
The proposed regulations would work in tandem with the previously released proposed regulations governing risk assessments, which will also be discussed at the Dec. 8 board meeting. The proposed risk assessment regulations would require businesses to conduct risk assessments where the “processing of consumers’ personal information presents significant risk to consumers’ privacy.”
Employers in California may want to consider how the draft regulations would impact their businesses and current employment practices and policies. However, the draft regulations are far from being finalized, and there will be further opportunities for discussion and comment.
Simon McMenemy, Sean Nalty, Benjamin Perry, and Zachary Zagger are attorneys with Ogletree Deakins. © 2023. All rights reserved. Reprinted with permission.