First multi-million GDPR fine in Germany: €14.5 million for not having a proper data retention schedule in place

On October 30, 2019, the Berlin Commissioner for Data Protection and Freedom of Information (Berliner Beauftragte für Datenschutz und Informationsfreiheit, the Berlin DPA) imposed a €14.5 million fine on a German real estate company, Deutsche Wohnen SE (Deutsche Wohnen), the highest German GDPR fine to date. The infraction related to the over-retention of personal data. For the first time, the Berlin DPA applied the new method for calculating GDPR fines recently issued by the German Datenschutzkonferenz (see our recent post).

Covert monitoring in the workplace – impact on an employee’s privacy

The Grand Chamber of the European Court of Human Rights (ECtHR) has held that Spanish shop workers’ right to privacy under Article 8(1) of the European Convention on Human Rights was not violated when their employer obtained evidence of theft from covert CCTV footage of the employees.

California Governor signs all 5 CCPA amendments

On Friday, October 11, 2019, the California Governor signed all five of the California Consumer Privacy Act (CCPA) amendments that were awaiting his signature (AB 25, 874, 1146, 1355, and 1564), as well as an amendment to California’s data breach law (AB 1130). We had previously written here about the impact on the CCPA if all five amendments went into effect.

Mic Drop: California AG releases long-awaited CCPA Rulemaking

On October 10, 2019, with just weeks to go until the law takes effect, the California Attorney General released the long-awaited draft regulations for the California Consumer Privacy Act (CCPA). The proposed rules shed light on how the California AG is interpreting, and will be enforcing, key sections of the CCPA. In the press release announcing the proposed regulations, Attorney General Becerra described the CCPA as “[providing] consumers with groundbreaking new rights on the use of their personal information” and added, “It’s time we had control over the use of our personal data.” The proposed regulations are intended to operationalize the CCPA and provide practical guidance to consumers and businesses subject to the law. In addition to the draft rules, the AG’s office also published a “CCPA Fact Sheet” and the “Initial Statement of Reasons,” which provide further insight into the regulatory focus and enforcement priorities.

According to the AG’s office, the draft rules summarized below are needed to “[mitigate] the asymmetry of knowledge and power between individuals and businesses.” Businesses “must comply to the greatest extent [they] can” to give consumers greater control over their personal information: the right to know details about how their personal information is collected, used, and shared by businesses; the right to take control of their information by having businesses delete it and stop selling it; and the right to exercise these privacy rights without suffering discrimination in price or service.

The rules are not final.  The Attorney General will hold public hearings in four California cities during the first week of December to hear comments.  Written comments will be accepted by the Attorney General until 5 PM (Pacific time) on December 6, 2019.

Below is a summary of the proposed regulations that may have the most impact on organizations seeking to operationalize the CCPA requirements in time for the January 1 deadline.

No surprises in the recent Planet49 European Court of Justice judgment

On 1 October 2019, the European Court of Justice (ECJ) delivered its judgment in Case C-673/17 (the “Planet49” case), which relates to the consent and transparency requirements for the use of cookies and similar technologies. The ECJ largely followed the March 2019 Opinion of Advocate General Szpunar, and the judgment is generally consistent with the recent regulatory guidance issued by the UK and French data protection authorities in this area.

The right to be forgotten: the CJEU sides with Google in two landmark cases

On 24 September 2019, the Court of Justice of the European Union (CJEU) gave two judgments (Cases C-507/17 and C-136/17) ruling that: (i) de-referencing by Google should be limited to EU Member States’ versions of its search engine, with some important qualifications; and (ii) when Google receives a request for de-referencing relating to a link to a web page on which sensitive data are published, a balance must be sought between the fundamental rights of the person requesting such de-referencing and those of internet users potentially interested in that information.

Google had already faced the right to be forgotten before the CJEU under Directive 95/46/EC (the Directive) in the landmark “Google Spain” case,[1] in which the court ruled that a search engine operator can be obliged to remove links to information about an individual from its list of results. That decision led to a large number of de-referencing requests from individuals, and notably to four complaints to the CNIL from individuals whose requests Google had rejected (see point 2 below).

The CJEU has ruled in Google’s favour.

1.      The right to be forgotten ends at the borders of the EU

In its decision of 10 March 2016 the CNIL had imposed a fine of €100,000 on Google Inc. because of the latter’s refusal, when granting a de-referencing request, to apply it to all its search engine’s worldwide domain name extensions.

Consequently, in its first judgment,[2] the CJEU was asked to clarify the territorial scope of the right to be forgotten to determine whether a search engine operator is required to carry out that de-referencing on all its search engine’s worldwide domain name extensions or whether, on the contrary, it is required to do so only at a European or national level.

The CJEU’s decision starts by pointing out that, in the current globalised world, the access by internet users – including those located outside the EU – to the referencing of a link referring to information about an individual whose centre of interests is situated in the EU is likely to have “immediate and substantial effects on that person within the Union itself”, suggesting a worldwide de-referencing duty.

However, the Court qualifies this statement by stating that:

  • many non-EU countries may take a different approach to the right to de-referencing or may not even grant such a right; and
  • the right to the protection of personal data, not being an absolute right, must be balanced against other fundamental rights in line with the principle of proportionality.

In light of the foregoing, the Court ruled that an “operator is not required to carry out that de-referencing on all versions of its search engine, but on the versions of that search engine corresponding to all the Member States …”. The Court underlined that such a de-referencing must, if necessary, be accompanied by “measures which … effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, [through a version of that search engine outside the EU], to the links which are the subject of that request”.

The Court nevertheless pointed out that, while EU law does not currently require the de-referencing, by an operator such as Google, to apply to all the versions of its search engine, such practice is not prohibited. A balance between a data subject’s right to privacy and the protection of personal data concerning him/her, on the one hand, and the right to freedom of information, on the other, should be struck by the authorities of the Member States and could in certain cases still require the operator to carry out a worldwide de-referencing.

It should also be noted that Google already used geo-blocking to stop a user apparently located in an EU Member State from accessing, through a search page of a non-EU country, content that had been de-referenced on the version for his/her Member State. When the CJEU refers to “effective prevention” and “serious discouragement”, it is clear that this type of additional technological measure will be required; what is not clear is how much further a Member State authority can require a search engine to go.
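For readers curious what such a measure looks like in practice, the sketch below illustrates the general shape of IP-based geo-blocking of de-referenced results. It is a toy example under stated assumptions: the prefix table, the URL store and all function names are ours, not Google’s, and production systems rely on commercial IP-geolocation databases and must contend with VPNs and proxies.

```python
# Toy sketch of IP-based geo-blocking of de-referenced search results.
# Everything here is illustrative: the prefix table stands in for a real
# IP-geolocation database, and the URL store for a de-referencing registry.

# EU Member States by ISO 3166-1 alpha-2 code (the UK was also one in 2019).
EU_MEMBER_STATES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

# Hypothetical register of links de-referenced for a data subject's name.
DEREFERENCED_URLS = {
    "example name": {"https://example.org/old-article"},
}

# Toy prefix table; real systems query a commercial geolocation database.
_GEO_PREFIXES = {"193.51.": "FR", "85.214.": "DE", "8.8.": "US"}


def lookup_country(ip_address: str) -> str:
    """Map an IP address to a country code (toy implementation)."""
    for prefix, country in _GEO_PREFIXES.items():
        if ip_address.startswith(prefix):
            return country
    return "??"  # unknown location


def filter_results(query: str, results: list[str], client_ip: str) -> list[str]:
    """Drop de-referenced links when the searcher appears to be in the EU,
    regardless of which domain version of the search engine was queried."""
    if lookup_country(client_ip) not in EU_MEMBER_STATES:
        return results  # outside the EU: no de-referencing obligation
    blocked = DEREFERENCED_URLS.get(query, set())
    return [url for url in results if url not in blocked]


# A searcher who appears to be in France is blocked even on a non-EU domain:
print(filter_results("example name",
                     ["https://example.org/old-article",
                      "https://example.org/other-page"],
                     "193.51.0.1"))  # -> ['https://example.org/other-page']
```

As the paragraph above suggests, measures of this kind discourage rather than prevent circumvention: a determined user behind a VPN will still defeat the lookup, which is precisely the gap the phrase “seriously discourage” leaves open.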

2.      Prohibition on processing certain categories of sensitive data: fundamental rights vs. freedom of information

In its second judgment,[3] four individuals had requested that Google de-reference various links, appearing in the lists of results displayed by the search engine following searches of their names, to web pages published by third parties. The web pages included, respectively, a satirical photo-montage of a politician; articles mentioning an individual as a public relations officer of the Church of Scientology; the judicial investigation of a politician; and the sentencing of another person for sexual assaults on minors.

Following Google’s refusal to de-reference, the four individuals brought complaints before the CNIL, seeking an order requiring Google to de-reference the links. The CNIL declined to take up their complaints. The parties then brought their case before the French Council of State (“Conseil d’Etat”), which referred a number of questions to the CJEU, including whether the prohibition on processing special category personal data – such as political opinions, religious or philosophical beliefs and sex life – other than on one of a restrictive set of grounds, which applies to other controllers, also applies to the operator of a search engine.

The Court found that it did. In this context, the judges emphasised that operators of search engines are responsible “not because personal data referred to in those provisions appear on a web page published by a third party but because of the referencing of that page and in particular the display of the link to that web page in the list of results presented to internet users following a search”.

This allowed the Court to go on to find that a search engine operator only needed to comply with the very restrictive grounds for processing special category personal data once a request for removal (and the search engine operator’s verification) had been made. This neatly allowed searches which return special category personal data in the results to remain viable.

The Court then recalled its Google Spain judgment, which held that while a data subject’s rights may, as a general rule, override the freedom of information of internet users, the balance between these rights must be assessed on a case-by-case basis taking into account:

  • the nature of the information in question and its sensitivity for the data subject’s private life; and
  • the interest of the public having that information, an interest “which may vary, in particular, according to the role played by the data subject in public life”.

Consequently, the Court concluded that, where search engines such as Google are facing a request by a data subject to exercise his/her right to be forgotten relating to a link to a web page containing special category personal data, they must consider all the relevant factors of the specific case and take into account “the seriousness of the interference with the data subject’s fundamental rights to privacy and protection of personal data” so that only the links in the list of results, displayed following a search on the basis of the data subject’s name, that are “strictly necessary for protecting the freedom of information of internet users” are retained.

The Court added that, where the processing relates to information made public by the data subject, an operator of a search engine may refuse to accede to a request for de-referencing provided that:

  • the processing meets all the other conditions of lawfulness; and
  • the data subject does not have the right to object to that processing on compelling legitimate grounds relating to his/her particular situation.

The Court also took an expansive view of the definition of criminal convictions data, holding that reporting on an investigation or trial was caught regardless of whether the data subject was subsequently convicted. It held that:

  • where the search returned criminal convictions data “which no longer reveal the current situation” then the operator, in light of all the circumstances of the case, must balance the data subject’s fundamental rights and the public’s right to freedom of information, considering “the nature and seriousness of the offence in question, the progress and the outcome of the proceedings, the time elapsed, the part played by the data subject in public life and his/her past conduct, the public’s interest at the time of the request, the content and form of the publication and the consequences of publication for the data subject”; and
  • if the results (e.g. because of the order in which links appear) suggest a criminal conviction is still current when it is not (and there is still a public interest in the older information remaining accessible), then, on request from a data subject, the search engine operator must adjust the results so that up-to-date, accurate information is most prominent.

Our take

Although the first CJEU judgment would appear to put an end to the French data protection authority’s absolutist vision of the territorial scope of the right to be forgotten, the detail is somewhat different. The door remains open in particularly serious cases for a data protection authority to find that de-listing in all EU Member States, coupled with geo-blocking access to search pages in non-EU countries, would be insufficient to protect the affected data subject’s privacy. Indeed, on the very day the judgments were published, the CNIL underlined that, since a global de-referencing is not prohibited, it still has the authority to force a search engine operator to delist results on all the versions of the search engine where justified in a particular case to guarantee the rights of the individuals concerned (see the CNIL press release here).

As for the second judgment, the Court confirmed that a similar balancing test between the data subject’s fundamental rights and the public right to freedom of information should apply in the context of the processing of criminal convictions data.

In our view, although these judgments go some way to curbing the extraterritorial impact of the GDPR, they still leave the door open for a worldwide application in the most egregious of cases.

They also do little to relieve search engine operators’ burdens in adjudicating right to be forgotten requests – the balancing test remains as precarious as ever – and we therefore expect to see continued complaints and CJEU references in this area.

For more information, the full text of the judgments can be found here and here.

[1] CJEU, 13 May 2014, C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González
[2] CJEU, 24 September 2019, C-507/17, Google LLC, successor in law to Google Inc. v Commission nationale de l’informatique et des libertés (CNIL)
[3] CJEU, 24 September 2019, C-136/17, GC and Others v Commission nationale de l’informatique et des libertés (CNIL)

New York’s Breach Law Amendments and New Security Requirements

Although California has recently captured the lion’s share of attention with respect to privacy and security, on October 23, 2019, New York’s amended security breach law goes into effect, and on March 1, 2020, new security safeguards go live (N.Y. S.B. 5575). Anyone with personal information about a New York resident is potentially affected by these far-reaching amendments.

Breach Law Changes

Readers may recall that New York’s security breach notification law (N.Y. Gen. Bus. Law § 899-aa) differs from most states’ laws in several ways, including (1) using separate definitions of “personal information” and “private information”; and (2) providing factors to consider in determining whether personal information has been acquired. New York was among the majority of states whose breach laws focused on acquisition of personal data (including Social Security number, driver’s license number, or credit card number and security code).

As of October 23, 2019, much of that will change:

  • New York will no longer be a purely “acquisition” state: either access to or acquisition of personal information will now suffice to constitute a breach requiring notice.
  • New York retains its very broad definition of “personal information” (“any information concerning a natural person, which, because of name . . . or other identifier, can be used to identify such natural person”), but the definition of “private information” (data elements) will expand to add two new categories (emphasis added):
    • Account number, credit card number, or debit card number, along with “personal information”—but no longer requiring a security code if the number could be used to access the individual’s financial account without additional identifying information. This change is consistent with the position the New York Attorney General has taken since 2017, after finding that many popular websites permitted purchases to be made with credit cards without requiring security codes.
    • Biometric information that is used to authenticate or ascertain the individual’s identity.
  • “Private information” is also separately defined to mean user name or e-mail address in combination with the password or security question and answer—without any need for “personal information.”
  • New York will exclude from “private information” any encrypted data elements or “combination of personal information plus data elements”—as long as the encryption key has not been acquired by the unauthorized person.
  • Although New York has left unchanged its examples to determine if information has been acquired, it has added one for “access” that we will reformat a bit to make sure you see the full impact:

In determining whether information has been accessed, or is reasonably believed to have been accessed, by an unauthorized person or a person without valid authorization, such business may consider, among other factors, indications that the information was: (1) viewed; (2) communicated with; (3) used; or (4) altered

by a person without valid authorization or by an unauthorized person.

  • New York no longer requires that the person or business conduct business in New York state, but rather requires only that the person or business simply own or license computerized data that includes private information of a New York resident.
  • New York will permit persons or businesses to use a “risk of harm” analysis and determine not to provide notice, with some unique twists that are slightly reformatted to emphasize their potential full impact (emphasis added):

If the person or business reasonably determines such exposure will not likely result in: (1) misuse of such information; (2) financial harm to the affected persons; or (3) emotional harm in the case of unknown disclosure of online credentials [user name or e-mail address in combination with the password or security question and answer].

Once that determination is made, the person or business must document it in writing and maintain the documentation for five years. If the incident affects more than 500 New York residents, the person or business must also provide the written determination to the state Attorney General within 10 days after the determination. (A simplified sketch of how these provisions fit together appears after this list.)

  • New York will expressly recognize that notices under HIPAA, Gramm-Leach-Bliley, and the New York Department of Financial Services’ cybersecurity regulations, as well as notices provided under “other data security rules and regulations of, and statutes administered by” any federal or New York agency, will suffice for this statute and will not require a second notice. With respect to HIPAA only, the law now also provides that, if a covered entity must provide notice to the U.S. Department of Health and Human Services (HHS) but not under this New York law, the covered entity must provide a copy of the notice sent to HHS to the New York Attorney General within five business days of notifying HHS.
  • The consumer notice will now be required to include the phone numbers and websites of state and federal agencies that “provide information regarding security breach response and identity theft prevention and protection information.”
  • New York amended its requirements relating to substitute notice. Although New York retains unchanged the requirements for “conspicuous posting” on the company’s website, and notification to major statewide media, a business will need to provide notice via e-mail if the business has an e-mail address for the affected individual unless the breached information includes the e-mail address plus the password/security question and answer. In that case, the business must instead offer “clear and conspicuous notice delivered to the consumer online when the consumer is connected to the online account from an internet protocol address or from an online location which the person or business knows the consumer customarily uses to access the online account.”
  • Although New York still does not provide a private right of action under this section, the amendment at least doubled the fines the Attorney General may seek for violations, from $10 to $20 for each instance of failed notification, up to a total of $250,000 (from $100,000). The time to bring an action also increased from two years to three, running from the earlier of the date the Attorney General learned of the breach or the date notice was provided.
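To pull these threads together—the access-or-acquisition trigger, the encryption carve-out, the risk-of-harm determination and its documentation, and the new fine ceiling—the sketch below captures our simplified reading of the decision sequence. The data structure and function names are our own invention for illustration; the real analysis is fact-specific, and nothing here is legal advice.

```python
# Simplified decision sketch of the amended N.Y. Gen. Bus. Law § 899-aa.
# Our own compressed reading, for illustration only; not legal advice.

from dataclasses import dataclass


@dataclass
class Incident:
    accessed_or_acquired: bool   # new trigger: access OR acquisition
    data_encrypted: bool         # encryption carve-out applies...
    key_acquired: bool           # ...unless the key was also taken
    likely_harm: bool            # misuse, financial harm, or emotional harm
    ny_residents_affected: int


def notification_duties(inc: Incident) -> list[str]:
    """Return the duties our simplified reading of the statute suggests."""
    if not inc.accessed_or_acquired:
        return []
    if inc.data_encrypted and not inc.key_acquired:
        return []  # encrypted elements fall outside "private information"
    if not inc.likely_harm:
        duties = ["document the risk-of-harm determination; retain 5 years"]
        if inc.ny_residents_affected > 500:
            duties.append("send written determination to the NY AG in 10 days")
        return duties
    return ["notify affected NY residents (and regulators as applicable)"]


def max_fine(failed_notifications: int) -> int:
    """AG fines: $20 per instance of failed notification, capped at $250,000."""
    return min(20 * failed_notifications, 250_000)


# Example: failing to notify 15,000 residents -> min($300,000, $250,000).
print(max_fine(15_000))  # 250000
```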

Data Security Protections

As of March 1, 2020, New York will require reasonable security safeguards of any person or business that owns or licenses computerized data that includes “private information” of a New York resident (N.Y. Gen. Bus. Law § 899-bb).

For most companies, the new law requires that the person or business “develop, implement, and maintain reasonable safeguards to protect the security, confidentiality and integrity of private information including, but not limited to, disposal of data.” New York does not place specific requirements on these persons or companies, but instead provides examples of the elements of a data security program. For example, for administrative safeguards, the law lists safeguards “such as”:

(1)        Designates one or more employees to coordinate the security program;

(2)        Identifies reasonably foreseeable internal and external risks;

(3)        Assesses the sufficiency of safeguards in place to control the identified risks;

(4)        Trains and manages employees in the security program practices and procedures;

(5)        Selects service providers capable of maintaining appropriate safeguards, and requires those safeguards by contract; and

(6)        Adjusts the security program in light of business changes or new circumstances.

If these appear familiar, it is because they are a slightly revised version of the FTC’s Safeguards Rule requirements (16 CFR § 314.4). New York’s new law contains similar examples of technical and physical safeguards.

As with the amended breach law, this new law also states that compliance with the data security requirements of HIPAA, Gramm-Leach-Bliley, or the New York Department of Financial Services’ cybersecurity regulations, or other similar agency requirements, will satisfy this statute. In addition, the law states that, for “small businesses,” compliance means “reasonable administrative, technical and physical safeguards” that are “appropriate for the size and complexity of the small business, the nature and scope of the small business’s activities, and the sensitivity of the personal information the small business collects from or about consumers.” The law defines a “small business” as a person or business that meets any one of three criteria:

(i)         Fewer than 50 employees;

(ii)        Less than $3 million in gross annual revenue in each of the last three fiscal years; or

(iii)       Less than $5 million in year-end total assets.
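Because the criteria are disjunctive, satisfying any one of them is enough. A minimal sketch of the test, with hypothetical parameter names of our own choosing:

```python
def is_small_business(employees: int,
                      annual_revenues_last_3_fy: list[float],
                      year_end_total_assets: float) -> bool:
    """Disjunctive "small business" test under N.Y. Gen. Bus. Law § 899-bb
    (our reading): meeting any single criterion qualifies."""
    return (
        employees < 50
        or all(rev < 3_000_000 for rev in annual_revenues_last_3_fy)
        or year_end_total_assets < 5_000_000
    )


# A 60-employee company with modest revenues still qualifies via criterion (ii):
print(is_small_business(60, [2.5e6, 2.8e6, 2.9e6], 10_000_000))  # True
```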

The new requirements can be enforced by the Attorney General, and the law specifically states that there is no private right of action. This result differs from the California Consumer Privacy Act, which provides its only private right of action for a data breach caused by a business’s failure to implement reasonable data security to protect the breached information (Cal. Civ. Code § 1798.150).

 
