Here We Go Again: Another Ballot Initiative for CCPA in 2020

As companies get ready for the California Consumer Privacy Act’s (CCPA) effective date of January 1, 2020, compliance is complicated by several variables that are still in motion:

– Draft regulations have been proposed but may not be final until after January 1, 2020.

– The recent amendments to CCPA include two important exceptions (business-to-business (B2B) and the “employee” exceptions) that sunset on December 31, 2020. It is anticipated that amendments to CCPA will be introduced in the California legislature during the 2020 session on these topics and others.

– A ballot initiative to amend CCPA may be presented directly to California voters. The proposed initiative was originally filed with the California Attorney General on September 25, 2019, but an amended version was received by the Attorney General on November 13, 2019. This version holds some potential surprises for companies subject to CCPA.

Background

Readers may recall that CCPA was swiftly enacted by the California legislature in 2018 in order to prevent a proposed privacy ballot initiative from being placed on the ballot. Several of the 2019 amendments to CCPA corrected errors and clarified provisions that resulted from CCPA’s rapid movement through the California legislature.

The proponents of the 2019 ballot initiative obtained the required number of signatures to have the initiative placed on the 2020 ballot. The original initiative contained many of the same changes to CCPA that were ultimately enacted by the legislature and signed by the Governor in October of 2019.

The November 13 version of the initiative still contains some provisions that were enacted by the CCPA amendments, but it also includes many other proposed changes to CCPA that could affect many companies.

Ballot Initiative – Amendments to Version 3

Highlights of the 51-page amended initiative include:

– There is still no private right of action.

– It would extend the B2B and “employee” exceptions through December 31, 2022 (proposed section 154(m) & (n)).

– Although the initiative would add many concepts from Europe’s General Data Protection Regulation (GDPR) (including provisions relating to the new term “sensitive personal information”), it does not include the GDPR’s concept of “joint controller.” It does, however, include a provision for joint ventures or partnerships where each party controls at least a forty percent interest, as part of the revised definition of “Business” (proposed section 140(d)(3)).

  • The proposed new term “sensitive personal information” would be defined as follows in proposed section 140(ae):

            “Sensitive personal information” means: (1) personal information that reveals (A) a consumer’s social security, driver’s license, state identification card, or passport number; (B) a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credential allowing access to an account; (C) a consumer’s precise geolocation; (D) a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership; (E) the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication; (F) a consumer’s genetic data; and (2)(A) the processing of biometric information for the purpose of uniquely identifying a consumer; (B) personal information collected and analyzed concerning a consumer’s health; or (C) personal information collected and analyzed concerning a consumer’s sex life or sexual orientation. Sensitive personal information that is “publicly available” pursuant to paragraph (2) of subdivision (v) of Section 1798.140 shall not be considered sensitive personal information or personal information.

– The initiative would potentially extend the 12-month “look-back” period (proposed section 130(a)(2)(B)). Depending upon new regulations, a consumer could request data from more than 12 months prior to the request, unless doing so was “impossible or would involve disproportionate effort” by the business. This change would affect information collected on or after January 1, 2022.

– The proposed initiative would also call for regulations requiring businesses “whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security” to perform an annual risk assessment and submit that assessment to the new California Privacy Protection Agency.

– The required notice (online privacy policy for many companies) would be expanded to include several new terms, including the categories of “sensitive personal information” that are collected and “shared,” and the length of time the business intends to retain each category.

  • The proposed new term “share” would, subject to some exceptions, be defined as follows in proposed section 140(ah):

            “Share,” “shared,” or “sharing” means sharing, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to a third party for cross-context behavioral advertising, whether or not for monetary or other valuable consideration, including transactions between a business and a third party for cross-context behavioral advertising for the benefit of a business in which no money is exchanged.

– The proposed initiative would also add two new consumer rights: the right to correct inaccurate personal information (proposed new section 106) and a new right to limit use and disclosure of “sensitive personal information” (proposed new section 121).

– The “Do Not Sell My Personal Information” link would change under the proposed new initiative. It would become “Do Not Sell or Share My Personal Information,” and there would be a second link, “Limit the Use of My Sensitive Personal Information.” In the alternative, pursuant to new regulations, a business could allow consumers to opt out of sales or sharing and limit the use of sensitive personal information “through an opt-out preference signal sent with the consumer’s consent by a platform, technology, or machine” (proposed section 135(a) & (c)). A sketch of how a business might honor such a signal appears after this list.
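How a business would honor such a preference signal is left to future regulations, so the following TypeScript sketch is illustrative only: it treats a hypothetical "sec-opt-out" request header (loosely modeled on the existing DNT header) as a global opt-out. The header name, the data shapes, and the assumption that one signal triggers both opt-outs are all assumptions for illustration, not anything prescribed by the initiative.

```typescript
// Hypothetical sketch: honoring a browser-based opt-out preference signal.
// The initiative leaves the signal's technical form to future regulations,
// so the header name ("sec-opt-out") and these types are invented here.

interface ConsumerPrefs {
  optedOutOfSaleOrSharing: boolean;
  limitSensitiveUse: boolean;
}

function readOptOutSignal(headers: Record<string, string | undefined>): ConsumerPrefs {
  // Treat a value of "1" as an opt-out, analogous to how the DNT header works.
  const optedOut = headers["sec-opt-out"] === "1";
  return {
    optedOutOfSaleOrSharing: optedOut,
    // A single global signal is assumed to trigger both opt-outs; regulations
    // could instead require separate signals for each right.
    limitSensitiveUse: optedOut,
  };
}

// Downstream code would consult the preferences before any "sale" or "share".
function mayShareForCrossContextAds(prefs: ConsumerPrefs): boolean {
  return !prefs.optedOutOfSaleOrSharing;
}
```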

 

The proposed ballot initiative illustrates that privacy—and not only in California—will continue to be a topic of interest in 2020 and beyond.

 

First multi-million GDPR fine in Germany: €14.5 million for not having a proper data retention schedule in place


On October 30, 2019, the Berlin Commissioner for Data Protection and Freedom of Information (Berliner Beauftragte für Datenschutz und Informationsfreiheit, the “Berlin DPA”) imposed a €14.5 million fine on a German real estate company, Deutsche Wohnen SE (Deutsche Wohnen), the highest German GDPR fine to date. The infraction related to the over-retention of personal data. For the first time, the Berlin DPA applied the new calculation method for GDPR fines recently issued by the German Datenschutzkonferenz, the conference of the German data protection authorities (see our recent post).
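Since the fine turned on the absence of a proper retention schedule, the TypeScript sketch below illustrates the kind of rule-driven deletion such a schedule implies. It is a minimal sketch under stated assumptions: the categories, field names, and retention periods are invented, real periods must come from the statutory obligations applicable to each data category, and this is in no way Deutsche Wohnen’s actual system.

```typescript
// Hypothetical sketch of a rule-driven retention schedule. Categories and
// retention periods are illustrative only.

interface RetentionRule {
  category: string;      // e.g. "tenant-financial-records"
  retentionDays: number; // how long the data may be kept after collection
}

interface StoredRecord {
  id: string;
  category: string;
  collectedAt: Date;
}

const schedule: RetentionRule[] = [
  { category: "tenant-financial-records", retentionDays: 365 * 10 },
  { category: "applicant-data", retentionDays: 180 },
];

// Returns the records that have exceeded their retention period and should be
// erased (or anonymised) by a scheduled job.
function findExpired(records: StoredRecord[], now: Date): StoredRecord[] {
  const ruleByCategory = new Map(
    schedule.map(r => [r.category, r] as [string, RetentionRule])
  );
  return records.filter(rec => {
    const rule = ruleByCategory.get(rec.category);
    if (!rule) return false; // unknown category: flag for review, don't delete
    const ageDays = (now.getTime() - rec.collectedAt.getTime()) / 86_400_000;
    return ageDays > rule.retentionDays;
  });
}
```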


Covert monitoring in the workplace – impact on an employee’s privacy


The Grand Chamber of the European Court of Human Rights (ECtHR) has held that Spanish shop workers’ right to privacy under Article 8(1) of the European Convention on Human Rights (the Convention) was not violated when their employer obtained evidence of theft from covert CCTV footage of the employees.

The case involved five employees who worked as cashiers at a supermarket chain. The employer noticed stock discrepancies and, as part of its investigation, installed CCTV cameras: visible cameras within the store and hidden cameras at the checkouts. Although customers and staff knew that CCTV cameras operated, the employees were not aware of the concealed cameras. The cameras captured evidence of staff stealing items, and those staff were dismissed. They claimed unfair dismissal on the basis that the surveillance had been unlawful.

The employees claimed that the use of covert surveillance had breached their right to privacy under Article 8 of the Convention. The High Court in Spain accepted that the surveillance had been justified by the employer’s reasonable suspicion of theft and that it had been appropriate for the legitimate aim of detecting theft. However, a Chamber of the ECtHR initially overturned the decision of the Spanish court, holding that the court had failed to strike a balance between the right of the employer to protect its property and the workers’ right to respect for their private life. The case was referred to the Grand Chamber of the ECtHR for a final ruling.

The Grand Chamber overturned the previous decision by a majority of 14 to 3 and determined that there had been no breach of the workers’ right to privacy. The Court held that states should ensure that any monitoring by employers is proportionate and is accompanied by adequate and sufficient safeguards against abuse. It referred to the criteria set out in the Barbulescu case:

  • Has the employee been notified of the possibility of surveillance measures? If so, was the notification given before implementation?
  • What was the extent of the monitoring by the employer, and what was the degree of intrusion into the employee’s privacy?
  • Has the employer provided legitimate reasons to justify monitoring?
  • Could monitoring have been established in a less intrusive manner?
  • What are the consequences of the monitoring for the employee?
  • Has the employee been provided with appropriate safeguards?

In this case the court held that the surveillance had been necessary and proportionate and took into account the following factors:

  • The employer had a legitimate reason for the surveillance: the suspicion of theft arising from the significant losses it had suffered.
  • The extent of the monitoring had been limited as regards the area that it covered and the staff being monitored. The monitoring took place in an area that was open to the public, so the employees’ expectation of privacy was lower than in places that were private in nature.
  • The duration of the surveillance had not been excessive. The employer had not set a maximum duration of the surveillance, but it had in fact only lasted ten days.
  • The surveillance had only been viewed by certain individuals before the applicants had been informed: the Manager, the employer’s legal representative and a trade union representative.
  • Although the consequences of the monitoring for the applicants had been significant in that they had been dismissed, the surveillance had not been used for any purposes other than to investigate the thefts and to take the disciplinary measures against those responsible.
  • There had been no other means by which to fulfill the legitimate aim. If the employees had been notified of the surveillance, that may well have defeated its purpose. This was significant because, in most cases of surveillance, prior knowledge is a requirement.

Our take

Employers will be pleased with this result, as it gives better cover when covert monitoring is the only option. However, it should be noted that the dissenting judgment considered that employers should not be entitled to operate covert surveillance. Indeed, the UK Information Commissioner’s Office guidance states that covert monitoring is rarely justified and should operate only in exceptional circumstances. The decision to implement covert monitoring should still remain a last resort, but employers can improve their position by including wording in monitoring policies that clarifies to employees when such surveillance could take place.

California Governor signs all 5 CCPA amendments

On Friday, October 11, 2019, the California Governor signed all five of the California Consumer Privacy Act amendments that were awaiting his signature (AB 25, 874, 1146, 1355, and 1564) as well as an amendment to California’s data breach law (AB 1130).  We had previously written about the impact on CCPA if all five amendments went into effect here.

Mic Drop: California AG releases long-awaited CCPA Rulemaking

On October 10, 2019, with just weeks to go until the law goes into effect, the California Attorney General released the long-awaited draft regulations for the California Consumer Privacy Act (CCPA).  The proposed rules shed light on how the California AG is interpreting and will be enforcing key sections of the CCPA.  In the press release announcing the proposed regulations, Attorney General Becerra described CCPA as “[providing] consumers with groundbreaking new rights on the use of their personal information” and added, “It’s time we had control over the use of our personal data.”  The proposed regulations are intended to operationalize the CCPA and provide practical guidance to consumers and businesses subject to the law.  In addition to the draft rules, the AG’s office also published a “CCPA Fact Sheet” and the “Initial Statement of Reasons,” which also provide insights into the regulatory focus and enforcement priorities.  According to the AG’s office, the draft rules, summarized below, are needed to “[mitigate] the asymmetry of knowledge and power between individuals and businesses.”  A business “must comply to the greatest extent it can” in order to give consumers greater control over their personal information: the right to know details about how their personal information is collected, used, and shared by businesses; the right to take control of their information by having businesses delete it and stop selling it; and the right to exercise these privacy rights without suffering discrimination in price or service.

The rules are not final.  The Attorney General will hold public hearings in four California cities during the first week of December to hear comments.  Written comments will be accepted by the Attorney General until 5 PM (Pacific time) on December 6, 2019.

Below is a summary of the proposed regulations that may have the most impact on organizations seeking to operationalize the CCPA requirements in time for the January 1 deadline.


No surprises in the recent Planet49 European Court of Justice judgment


On 1 October 2019, the European Court of Justice (ECJ) delivered its judgment in Case C-673/17 (the “Planet49” case), which relates to the consent and transparency requirements for the use of cookies and similar technologies. The ECJ largely followed the March 2019 Opinion of Advocate General Szpunar, and the judgment is generally consistent with the recent regulatory guidance issued by the UK and French data protection authorities in this area.

The decision:

Planet49 GmbH, a German company offering an online lottery service, used two checkboxes on its website at the login page for the online lottery. The first unticked checkbox forced users to consent to being contacted for marketing purposes by third-party companies before participating in the lottery. The second checkbox, which was pre-checked, sought consent for installing cookies on users’ browsers. The German federal consumer rights group (Bundesverband der Verbraucherzentralen) brought an action asserting that the requested declarations of consent did not satisfy the relevant requirements of the German data protection laws. In November 2017, the Federal Court of Justice referred the following questions to the ECJ, which found as follows:

  • Is a pre-checked checkbox, which the user must deselect to refuse his or her consent, valid consent for the purposes of Article 5(3) of the e-Privacy Directive (namely the “cookie consent” requirement) and have the consent requirements under the GDPR been met?

The ECJ held that a pre-checked box is not sufficient to obtain valid consent for placing cookies on users’ devices under Article 5(3) of the e-Privacy Directive, as it does not constitute an unambiguous indication of the data subject’s wishes.

The ECJ acknowledges that Article 5(3) of the e-Privacy Directive does not prescribe a specific way of obtaining consent to the storage of and access to cookies on users’ devices. However, it observes that the wording “given his or her consent” means that some action is required on the part of the user. It also notes that “consent” under the e-Privacy Directive previously had to be read as having the same meaning as consent under the Data Protection Directive (Directive 95/46/EC) and that the requirement for consent to be an “indication of the data subject’s wishes” under Directive 95/46/EC also points to “active, rather than passive, behaviour”. This, it concludes, is not the case where pre-checked boxes are used.

The court also points out that “only active behaviour on the part of the data subject with a view to giving his or her consent” fulfils the requirement under Directive 95/46/EC for consent to be “unambiguously” given. It would appear impossible to ascertain objectively whether a website user had actually granted his or her consent by merely continuing with his or her activity on the website visited (continuing browsing or scrolling) and, in doing so, failing to deselect a pre-checked box.

The ECJ considers that the GDPR has now closed off any debate on this issue, stating that the consent requirements under the GDPR are stricter, expressly requiring active consent and precluding “silence, pre-ticked boxes or inactivity” from constituting valid consent.

Separately, on the question of consent needing to be “specific”, the ECJ notes that “consent must relate specifically to the processing of the data and cannot be inferred from an indication of the data subject’s wishes for other purposes”. Therefore, the fact that a user selects the participate button for the promotional lottery is not sufficient to conclude that the user validly gave his or her consent to the storage of cookies, or to the sharing of his or her data with commercial partners.
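As a practical illustration of these two findings (consent must be active, and it must relate specifically to the cookie processing), here is a minimal TypeScript sketch of a consent flow that avoids the defects identified in Planet49. The element IDs and cookie name are hypothetical; this sketches the principle, not a compliance-vetted implementation.

```typescript
// Hypothetical sketch of a Planet49-consistent consent flow: the checkbox
// starts unticked, and the cookie is set only after an affirmative user act.
// Element IDs and the cookie name are invented for illustration.

function initCookieConsent(): void {
  const box = document.getElementById("cookie-consent") as HTMLInputElement;
  box.checked = false; // never pre-ticked: consent must be active, not passive

  const submit = document.getElementById("consent-submit") as HTMLButtonElement;
  submit.addEventListener("click", () => {
    // Cookie consent is collected separately from the lottery sign-up,
    // so it relates specifically to this processing purpose.
    if (box.checked) {
      document.cookie = "analytics_opt_in=1; max-age=31536000; path=/";
    }
    // If the box stays unticked, no non-essential cookie is written at all.
  });
}
```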

In a departure from the Opinion of Advocate General Szpunar, the ECJ did not rule on whether making participation in the lottery conditional upon the user giving his or her consent to advertising complied with the requirement for consent to be “freely given”, as this question was not referred to it. However, based on the AG’s opinion and recent guidance on the meaning of consent under the GDPR, we consider that it would often be difficult to meet the “freely given” requirement in this context.

  • Does it make a difference whether the information stored or accessed by the cookie or tracking technology constitutes personal data or not?

The ECJ points out that Article 5(3) of the e-Privacy Directive merely refers to “information without characterising that information or specifying that it must be personal data”. Therefore, it is irrelevant whether the data accessed by cookies constitutes personal data, and the consent requirement in Article 5(3) of the e-Privacy Directive applies regardless.

  • What information must the service provider give to users about the use of cookies and other tracking technology under Article 5(3) of the e-Privacy Directive? In particular: (a) does this include information about the duration of the cookies; and (b) information about third parties being given access to the cookies?

The ECJ clarified that the information provided must enable the user to determine the consequences of any consent he or she gives and, in this case, be sufficiently detailed so as to enable the user to understand the functioning of the cookies employed. It considers that this requires information about the duration of the cookies and whether or not third parties may have access to those cookies to be provided, a position it considers is supported by the transparency requirements of the GDPR.
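To show what this disclosure duty implies in practice, the short TypeScript sketch below models the per-cookie information the court requires: duration and third-party access. The field names and example values are invented; the judgment prescribes what must be disclosed, not any particular format.

```typescript
// Hypothetical sketch of the disclosure data a cookie notice would need to
// carry per the judgment. Field names and example values are invented.

interface CookieDisclosure {
  name: string;               // cookie identifier, e.g. "analytics_opt_in"
  purpose: string;            // what the cookie is used for
  durationDays: number;       // how long the cookie persists on the device
  thirdPartyAccess: string[]; // named third parties that can read the cookie
}

const exampleNotice: CookieDisclosure[] = [
  {
    name: "ad_tracker",
    purpose: "Cross-site advertising measurement",
    durationDays: 365,
    thirdPartyAccess: ["ExampleAdNetwork Inc."],
  },
];
```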

Position in Germany

In Germany, the position on cookies in national law remains unique. Section 15(3) of the German Telemedia Act (Telemediengesetz) still allows cookies to be used in certain circumstances on an opt-out basis. The question remains how this will be interpreted in the light of the GDPR and the ECJ’s decision above, as the ECJ only answered the specific questions raised and did not give further guidance on the relationship between the GDPR and the German Telemedia Act. However, the ECJ decision is certainly consistent with recent guidance issued by the German data protection authorities, which provides that in most use cases online tracking requires the user’s prior consent in the form of an opt-in solution and that reliance on other grounds for processing, such as “legitimate interests”, is not acceptable. This is especially so in cases where cookies allow a third party to track users across different sites. The ECJ’s decision may therefore be another step towards a more harmonised approach across the whole of the European Union.

Our take:

The ECJ decision should come as no surprise to those following this area of data protection law, as it confirms the position on consent already set out in the GDPR and reflects recent regulators’ guidance across Europe. It also reinforces the need for companies to revisit their cookie notices and consent mechanisms to ensure that they are compliant with the position taken by the court and the regulators.

It is unfortunate that the judgment does not provide any further clarity on the level of information required about the third parties who may access the cookie data, or on how to collect consent on behalf of those third parties. However, this is perhaps unsurprising given that the regulators have recognised this as a key challenge they are still considering, and it should not stop companies from improving their cookie notices and consent mechanisms in the meantime.

The right to be forgotten: the CJEU sides with Google in two landmark cases


On 24 September 2019 the Court of Justice of the European Union (CJEU) gave two judgments (Cases C-507/17 and C-136/17) ruling that: (i) de-referencing by Google should be limited to EU Member States’ versions of its search engine with some important qualifications; and (ii) when Google receives a request for de-referencing relating to a link to a web page on which sensitive data are published, a balance must be sought between the fundamental rights of the person requesting such de-referencing and those of internet users potentially interested in that information.

Google has already faced the right to be forgotten issue before the CJEU under Directive 95/46/EC (the Directive) in the landmark “Google Spain” case[1], where the judges ruled that a search engine operator can be obliged to remove links to information about an individual from its list of results. This decision led to a large number of requests from individuals to remove such links, and notably to four complaints to the French data protection authority (the CNIL) from individuals after Google rejected their requests for de-referencing (see point 2 below).

The CJEU has ruled in Google’s favour.

1. The right to be forgotten ends at the borders of the EU

In its decision of 10 March 2016 the CNIL had imposed a fine of €100,000 on Google Inc. because of the latter’s refusal, when granting a de-referencing request, to apply it to all its search engine’s worldwide domain name extensions.

Consequently, in its first judgment[2], the CJEU was asked to clarify the territorial scope of the right to be forgotten to determine whether a search engine operator is required to carry out that de-referencing on all its search engine’s worldwide domain name extensions or whether, on the contrary, it is required to do so only at a European or national level.

The CJEU’s decision starts by pointing out that, in the current globalised world, the access by internet users – including those located outside the EU – to the referencing of a link referring to information about an individual whose centre of interests is situated in the EU is likely to have “immediate and substantial effects on that person within the Union itself”, suggesting a worldwide de-referencing duty.

However, the Court qualifies this statement by stating that:

  • many non-EU countries may take a different approach to the right to de-referencing or may not even grant such a right; and
  • the right to the protection of personal data, not being an absolute right, must be balanced against other fundamental rights in line with the principle of proportionality.

In light of the foregoing, the Court ruled that an “operator is not required to carry out that de-referencing on all versions of its search engine, but on the versions of that search engine corresponding to all the Member States …”. The Court underlined that such a de-referencing must, if necessary, be accompanied by “measures which … effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, [through a version of that search engine outside the EU], to the links which are the subject of that request”.

The Court nevertheless pointed out that, while EU law does not currently require the de-referencing, by an operator such as Google, to apply to all the versions of its search engine, such practice is not prohibited. A balance between a data subject’s right to privacy and the protection of personal data concerning him/her, on the one hand, and the right to freedom of information, on the other, should be made by the authorities of Member States and could in certain cases still require the operator to carry out a worldwide de-referencing.

It should also be noted that Google used geo-blocking to stop a user who appeared to be in an EU Member State from accessing content that had been de-referenced on the search page of his or her Member State by switching to the search page of a non-EU country where the content was not de-referenced. When the CJEU refers to “effective prevention” and “serious discouragement”, it is clear that this type of additional technological measure will be required; what is not clear is how much further a Member State authority can require a search engine to go.
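As an illustration of the kind of measure the Court appears to have in mind, here is a minimal TypeScript sketch in which filtering is keyed to the user’s apparent location rather than to which national version of the search engine was queried. The types, the country list, and the geo-IP lookup are hypothetical stand-ins, not Google’s actual implementation.

```typescript
// Hypothetical sketch of location-based geo-blocking for de-referenced links.

interface SearchResult {
  url: string;
  deReferencedInEU: boolean; // flagged after a successful de-referencing request
}

// Stand-in for a real geo-IP lookup service.
declare function countryOf(ip: string): string;

const EU_MEMBER_STATES = new Set(["FR", "DE", "ES", "IT" /* ... */]);

function filterResults(results: SearchResult[], clientIp: string): SearchResult[] {
  // Key the filtering to the user's apparent location, not to which national
  // version of the search engine (e.g. google.fr vs google.com) was queried.
  if (!EU_MEMBER_STATES.has(countryOf(clientIp))) {
    return results;
  }
  return results.filter(r => !r.deReferencedInEU);
}
```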

2. Prohibition on processing certain categories of sensitive data: fundamental rights vs. freedom of information

In its second judgment,[3] four individuals had requested that Google de-reference various links, appearing in the lists of results displayed by the search engine following searches of their names, resolving to web pages published by third parties. The web pages included a satirical photo-montage of a politician, articles mentioning an individual as a public relations officer of the Church of Scientology, the judicial investigation of a politician and the sentencing of another person for sexual assaults on minors respectively.

Following Google’s refusal to de-reference, the four individuals brought complaints before the CNIL, seeking an order for Google to de-reference the links. The CNIL declined to take up their complaints. The parties then brought their case before the French Council of State (“Conseil d’Etat”), which referred a number of questions to the CJEU, including whether the prohibition on processing special category personal data – such as political opinions, religious or philosophical beliefs and sex life – without falling within one of a restrictive set of grounds applies to the operator of a search engine as it does to other controllers.

The Court found that it did. In this context, the judges emphasised that operators of search engines are responsible “not because personal data referred to in those provisions appear on a web page published by a third party but because of the referencing of that page and in particular the display of the link to that web page in the list of results presented to internet users following a search”.

This allowed the Court to go on to find that a search engine operator only needed to comply with the very restrictive grounds for processing special category personal data once a request for removal (and the search engine operator’s verification) had been made. This neatly allowed searches which return special category personal data in the results to remain viable.

The Court then recalled its Google Spain judgment, which held that while a data subject’s rights may, as a general rule, override the freedom of information of internet users, the balance between these rights must be assessed on a case-by-case basis taking into account:

  • the nature of the information in question and its sensitivity for the data subject’s private life; and
  • the interest of the public having that information, an interest “which may vary, in particular, according to the role played by the data subject in public life”.

Consequently, the Court concluded that, where search engines such as Google are facing a request by a data subject to exercise his/her right to be forgotten relating to a link to a web page containing special category personal data, they must consider all the relevant factors of the specific case and take into account “the seriousness of the interference with the data subject’s fundamental rights to privacy and protection of personal data” so that only the links in the list of results, displayed following a search on the basis of the data subject’s name, that are “strictly necessary for protecting the freedom of information of internet users” are retained.

The Court added that, where the processing relates to information made public by the data subject, an operator of a search engine may refuse to accede to a request for de-referencing provided that:

  • the processing meets all the other conditions of lawfulness; and
  • the data subject does not have the right to object to that processing on compelling legitimate grounds relating to his/her particular situation.

The Court also took an expansive view of the definition of criminal convictions data, holding that reporting on an investigation or trial was caught regardless of whether the data subject was subsequently convicted. It held that:

  • where the search returned criminal convictions data “which no longer reveal the current situation” then the operator, in light of all the circumstances of the case, must balance the data subject’s fundamental rights and the public’s right to freedom of information, considering “the nature and seriousness of the offence in question, the progress and the outcome of the proceedings, the time elapsed, the part played by the data subject in public life and his/her past conduct, the public’s interest at the time of the request, the content and form of the publication and the consequences of publication for the data subject”; and
  • if the results (e.g. because of the order in which links appear) suggest a criminal conviction is still current when it is not (and there is still a public interest in the older information remaining accessible), then, on request from a data subject, the search engine operator must adjust the results so that the up-to-date, accurate information is most prominent (a minimal sketch of such an adjustment follows this list).
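The second point describes a re-ranking duty rather than a removal duty. The TypeScript sketch below illustrates one way such an adjustment could work; the fields, and in particular the flag marking links that reflect the current legal situation, are hypothetical, and how a search engine would populate that flag in practice is the difficult part the judgment leaves open.

```typescript
// Hypothetical sketch: promote links reflecting the current legal situation
// without removing older coverage. Fields are invented for illustration.

interface Link {
  url: string;
  publishedAt: Date;
  reflectsCurrentSituation: boolean; // e.g. reports the acquittal, not only the charge
}

function promoteCurrent(links: Link[]): Link[] {
  // Stable partition: current-situation links first, then the older coverage,
  // preserving the engine's original relevance order within each group.
  const current = links.filter(l => l.reflectsCurrentSituation);
  const older = links.filter(l => !l.reflectsCurrentSituation);
  return [...current, ...older];
}
```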

Our take

Although the first CJEU judgment would appear to put an end to the French data protection authority’s absolutist vision of the territorial scope of the right to be forgotten, the detail is somewhat different. The door remains open in particularly serious cases for a data protection authority to find that de-listing in all EU Member States, coupled with geo-blocking access to search pages in non-EU countries, would be insufficient to protect the affected data subject’s privacy. Indeed, on the very day the judgments were published, the CNIL underlined that, since global de-referencing is not prohibited, it still has the authority to force a search engine operator to delist results on all versions of the search engine where justified in a particular case to guarantee the rights of the individuals concerned (see the CNIL press release here).

As for the second judgment, the Court confirmed that a similar balancing test between the data subject’s fundamental rights and the public right to freedom of information should apply in the context of the processing of criminal convictions data.

In our view, although these judgments go some way to curbing the extraterritorial impact of the GDPR, they still leave the door open for a worldwide application in the most egregious of cases.

They also do little to relieve search engine operators’ burdens in adjudicating right to be forgotten requests – the balancing test remains as precarious as ever – and we therefore expect to see continued complaints and CJEU references in this area.

For more information, the full text of the judgments can be found here and here.

[1] CJEU, 13 May 2014, C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González
[2] CJEU, 24 September 2019, C-507/17, Google LLC, successor in law to Google Inc. v Commission nationale de l’informatique et des libertés (CNIL)
[3] CJEU, 24 September 2019, C-136/17, GC and Others v Commission nationale de l’informatique et des libertés (CNIL)
