On November 27, 2023, the California Privacy Protection Agency (“CPPA”) released a first draft of rules for automated decision-making technologies under California’s privacy law. The proposed rules revolve around three consumer rights: notice of the technology’s use, the ability to opt out of that use, and access to information about how the business uses the technology.
In general, the proposed rules would require businesses using automated decision-making technology to provide consumers with a “pre-use notice” describing the purpose of the technology’s use and consumers’ rights to opt out of, and to access additional information about, such use, including instructions for how to exercise these rights. Additional information in the “pre-use notice” would include the following:
- the logic used in the technology, including key parameters that affect the output of the technology and why those parameters are key;
- the intended output (e.g., a numerical compatibility score);
- the output(s) of the automated decision-making technology with respect to the consumer;
- how the business used or plans to use such output to make a decision, including the role of human involvement; and
- whether the business’s use of the technology has been evaluated for validity, reliability, and fairness, including the outcome of any such evaluation.
With respect to describing the purpose, the proposed regulations would specifically prohibit a business from using a generic phrase such as “to improve our services,” stating that such a phrase is “insufficient for the consumer to understand the business’s proposed purpose for using the automated decision-making technology.” Instead, businesses would need to be more specific.
Regarding consumers’ opt-out rights, businesses would be required to provide consumers with the ability to opt out of the following use cases:
- decisions producing legal or similarly significant effects;
- profiling a consumer acting in their capacity as an employee, independent contractor, job applicant, or student;* and
- profiling a consumer while they are in a publicly accessible space.**
* The proposed rules provide the following profiling examples: keystroke loggers; productivity or attention monitors; video or audio recording or live-streaming; facial or speech recognition or detection; automated emotion assessment; location trackers; speed trackers; and web browsing, mobile application, or social media monitoring tools.
** The proposed rules provide the following profiling examples: Wi-Fi or Bluetooth tracking; radio frequency identification; drones; video or audio recording or live-streaming; facial or speech recognition or detection; automated emotion assessment; geofencing; location trackers; and license plate recognition.
The business would have 15 business days after receipt of the opt-out request to halt its use of the consumer’s personal information in the automated decision-making technology.
Notably, the draft regulations specify that a cookie notification or tool would not, by itself, be an acceptable method for submitting requests to opt out of the business’s use of automated decision-making technology. Moreover, in responding to opt-out requests, a business would be permitted to present the consumer with the choice to allow specific uses of the technology, so long as a single option to opt out of all uses of the automated decision-making technology is also offered.
Additionally, the proposed regulations would exempt businesses from providing consumers with opt-out or access rights when the technology’s use is necessary to achieve, and is used solely for, one of the following purposes (the proposed regulations explicitly exclude profiling a consumer for behavioral advertising from these exemptions):
- preventing, detecting, and investigating security incidents compromising the availability, authenticity, integrity, or confidentiality of stored or transmitted personal information;
- resisting malicious, deceptive, fraudulent, or illegal actions directed at the business and prosecuting those responsible for such actions;
- protecting the life and physical safety of consumers (recall that California defines “consumers” to include employees); or
- providing the good or performing the service specifically requested by the consumer, if there are no reasonable alternative methods of doing so.
Note that the above list does not explicitly mention security incidents involving commercial information or trade secrets; however, such incidents may be covered by the second bullet. The proposed regulations also include provisions specifically relating to the use of automated decision-making technology in the employment context, including rebuttable presumptions relating to the use of alternative methods of processing personal data.
Furthermore, the draft regulations include the following options for CPPA board discussion:
- profiling a consumer that the business has actual knowledge is under the age of 16; and
- processing personal information of consumers to train automated decision-making technology.
Absent from the proposed regulations, however, are provisions addressing anonymization or pseudonymization, which one might expect given the vast amounts of data at issue in automated decision-making technology.
The CPPA will discuss the proposal at its December 8th board meeting, and the formal rulemaking process is expected to begin in 2024.
Our Take
These proposed rules are only a first draft, and several provisions are designated as options for CPPA board discussion. Among them is the right to opt out of the processing of personal information to train automated decision-making technology. Should this provision be promulgated, it is unclear what such a requirement would entail given the technological limitations surrounding model training. For example, to what extent would a model already trained on a consumer’s personal information be permitted to continue operating following an opt-out request? Additionally, businesses maintaining models may have to consider ways to identify, tag, and document outputs pertaining to California consumers, given that the access right would require businesses to provide all of the outputs with respect to the consumer in a simple and easy-to-use manner. How the disclosure, opt-out, and access rights play out between businesses leveraging in-house models and businesses relying on third-party foundation models will likely spark debate.