First multi-million GDPR fine in Germany: €14.5 million for not having a proper data retention schedule in place

On October 30, 2019, the Berlin Commissioner for Data Protection and Freedom of Information (Berliner Beauftragte für Datenschutz und Informationsfreiheit, the Berlin DPA) imposed a €14.5 million fine on the German real estate company Deutsche Wohnen SE (Deutsche Wohnen), the highest German GDPR fine to date. The infringement related to the over-retention of personal data. For the first time, the Berlin DPA applied the new method for calculating GDPR fines recently issued by the German Datenschutzkonferenz (see our recent post).

Covert monitoring in the workplace – impact on an employee’s privacy

The Grand Chamber of the European Court of Human Rights (ECHR) has held that Spanish shop workers’ right to privacy under Article 8(1) of the European Convention on Human Rights was not violated when their employer obtained evidence of theft from covert CCTV footage of the employees.

The case involved five employees who worked as cashiers at a supermarket chain. The employer noticed stock discrepancies and, as part of its investigation, installed CCTV cameras: visible cameras within the store and hidden cameras at the checkouts. Although customers and staff were aware that CCTV cameras operated, the employees were not aware of the concealed cameras. The cameras provided evidence of the staff involved in stealing items, and those staff were dismissed. They claimed unfair dismissal on the basis that the surveillance had been unlawful.

The employees claimed that the use of covert surveillance had breached their right to privacy under Article 8 of the Convention. The High Court in Spain accepted that the surveillance had been justified by the employer's reasonable suspicion of theft and that it had been appropriate to the legitimate aim of detecting theft. However, a Chamber of the ECHR originally overturned the decision of the Spanish court, holding that the court had failed to strike a balance between the right of the employer to protect its property and the workers' right to respect for their private life. The case was referred to the Grand Chamber of the ECHR for a final ruling.

The Grand Chamber overturned the previous decision by a majority of 14 votes to 3 and determined that there had been no breach of the workers' right to privacy. The Court held that states should ensure that any monitoring by employers is proportionate and accompanied by adequate and sufficient safeguards against abuse. It referred to the criteria set out in the Barbulescu case:

  • Was the employee notified of the possibility of surveillance measures? If so, was the notification given before the monitoring was implemented?
  • What was the extent of the monitoring by the employer and the degree of intrusion into the employee's privacy?
  • Did the employer provide legitimate reasons to justify the monitoring?
  • Could the monitoring have been established in a less intrusive manner?
  • What were the consequences of the monitoring for the employee?
  • Was the employee provided with appropriate safeguards?

In this case the Court held that the surveillance had been necessary and proportionate, taking into account the following factors:

  • The employer had a legitimate reason for the surveillance: a suspicion of theft arising from the significant losses it had suffered.
  • The extent of the monitoring had been limited as regards both the area covered and the staff being monitored. The monitoring took place in an area open to the public, where the employees' expectation of privacy was lower than in places private in nature.
  • The duration of the surveillance had not been excessive. The employer had not set a maximum duration for the surveillance, but in fact it lasted only ten days.
  • The surveillance had been viewed only by certain individuals before the applicants were informed: the manager, the employer's legal representative and a trade union representative.
  • Although the consequences of the monitoring for the applicants had been significant in that they were dismissed, the surveillance had not been used for any purpose other than to investigate the thefts and to take disciplinary measures against those responsible.
  • There had been no other means by which to fulfil the legitimate aim. Notifying the employees of the surveillance may well have defeated its purpose. This was significant because, in most surveillance cases, prior notification is a requirement.

Our take

Employers will be pleased with this result, as it gives better cover where covert monitoring is the only option. However, it should be noted that the dissenting judgment considered that employers should not be entitled to operate covert surveillance. Indeed, the UK Information Commissioner's Office guidance states that covert monitoring is rarely justified and should be used only in exceptional circumstances. The decision to implement covert monitoring should still remain a last resort, but employers can improve their position by including wording in monitoring policies that clarifies to employees when such surveillance could take place.

California Governor signs all 5 CCPA amendments

On Friday, October 11, 2019, the California Governor signed all five of the California Consumer Privacy Act amendments that were awaiting his signature (AB 25, 874, 1146, 1355, and 1564), as well as an amendment to California's data breach law (AB 1130). We had previously written about the impact on the CCPA if all five amendments went into effect here.

Mic Drop: California AG releases long-awaited CCPA Rulemaking

On October 10, 2019, with just weeks to go until the law goes into effect, the California Attorney General released the long-awaited draft regulations for the California Consumer Privacy Act (CCPA). The proposed rules shed light on how the California AG is interpreting and will be enforcing key sections of the CCPA. In the press release announcing the proposed regulations, Attorney General Becerra described the CCPA as "[providing] consumers with groundbreaking new rights on the use of their personal information" and added, "It's time we had control over the use of our personal data." The proposed regulations are intended to operationalize the CCPA and provide practical guidance to consumers and businesses subject to the law. In addition to the draft rules, the AG's office also published a "CCPA Fact Sheet" and the "Initial Statement of Reasons," which also provide insights into the regulatory focus and enforcement priorities. According to the AG's office, the draft rules, summarized below, are needed to "[mitigate] the asymmetry of knowledge and power between individuals and businesses." A business "must comply to the greatest extent it can" to give consumers greater control over their personal information: to vest consumers with the right to know details about how their personal information is collected, used, and shared by businesses; the right to take control of their information by having businesses delete it and stop selling it; and the right to exercise these privacy rights without suffering discrimination in price or service.

The rules are not final. The Attorney General will hold public hearings in four California cities during the first week of December to hear comments. Written comments will be accepted by the Attorney General until 5 PM (Pacific time) on December 6, 2019.

Below is a summary of the proposed regulations that may have the most impact on organizations that are seeking to operationalize the CCPA requirements in time for the January 1, 2020 deadline.

No surprises in the recent Planet49 European Court of Justice judgment

On 1 October 2019, the European Court of Justice (ECJ) delivered its judgment in Case C-673/17 (the "Planet49" case), which relates to the consent and transparency requirements for the use of cookies and similar technologies. The ECJ largely followed the March 2019 Opinion of Advocate General Szpunar, and the judgment is generally consistent with the recent regulatory guidance issued by the UK and French data protection authorities in this area.

The decision:

Planet49 GmbH, a German company offering an online lottery service, used two checkboxes on the login page for the online lottery. The first checkbox, which was not pre-ticked but had to be selected before the user could participate, required users to consent to being contacted for marketing purposes by third-party companies. The second checkbox, which was pre-ticked, sought consent for installing cookies on users' browsers. The German federal consumer rights group (Bundesverband der Verbraucherzentralen) brought an action asserting that the requested declarations of consent did not satisfy the relevant requirements of the German data protection laws. In November 2017, the German Federal Court of Justice referred the following questions to the ECJ, which found as follows:

  • Is a pre-checked checkbox, which the user must deselect to refuse his or her consent, valid consent for the purposes of Article 5(3) of the e-Privacy Directive (namely the "cookie consent" requirement), and does it meet the consent requirements under the GDPR?

The ECJ held that a pre-checked box is not sufficient to obtain valid consent for placing cookies on users’ devices under Article 5(3) of the e-Privacy Directive, as it does not constitute an unambiguous indication of the data subject’s wishes.

The ECJ acknowledges that Article 5(3) of the e-Privacy Directive does not prescribe a specific way of obtaining consent to the storage of and access to cookies on users’ devices. However, it observes that the wording “given his or her consent” means that some action is required on the part of the user. It also notes that “consent” under the e-Privacy Directive previously had to be read as having the same meaning as consent under the Data Protection Directive (Directive 95/46/EC) and that the requirement for consent to be an “indication of the data subject’s wishes” under Directive 95/46/EC also points to “active, rather than passive, behaviour”. This, it concludes, is not the case where pre-checked boxes are used.

The court also points out that “only active behaviour on the part of the data subject with a view to giving his or her consent” fulfils the requirement under Directive 95/46/EC for consent to be “unambiguously” given. It would appear impossible to ascertain objectively whether a website user had actually granted his or her consent by merely continuing with his or her activity on the website visited (continuing browsing or scrolling) and, in doing so, failing to deselect a pre-checked box.

The ECJ considers that the GDPR has now closed off any debate on this issue, stating that the consent requirements under the GDPR are stricter, expressly requiring active consent and precluding "silence, pre-ticked boxes or inactivity" from constituting valid consent.

Separately, on the question of consent needing to be "specific", the ECJ notes that "consent must relate specifically to the processing of the data and cannot be inferred from an indication of the data subject's wishes for other purposes". Therefore, the fact that a user selects the button to participate in the promotional lottery is not sufficient to conclude that the user validly gave his or her consent to the storage of cookies, or to the sharing of his or her data with commercial partners.
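
To make the ruling concrete, here is a minimal sketch of a consent flow consistent with it: the cookie checkbox starts unticked, entering the lottery is a separate act, and no cookie is placed without an affirmative selection. The element IDs, cookie name and helper function are hypothetical illustrations, not taken from Planet49's actual site.

```typescript
// Hypothetical sketch of an ECJ-consistent cookie consent flow (browser TypeScript).
const cookieBox = document.getElementById("cookie-consent") as HTMLInputElement;
cookieBox.checked = false; // never pre-ticked: valid consent requires active behaviour

document.getElementById("enter-lottery")!.addEventListener("click", () => {
  enterLottery(); // participation is a separate act and is not treated as cookie consent

  // Place the cookie only after the user has actively ticked the box ("specific",
  // unambiguous consent); duration and third-party access belong in the notice.
  if (cookieBox.checked) {
    document.cookie = "promo_tracker=1; Max-Age=31536000; SameSite=Lax; path=/";
  }
});

// Stub so the sketch is self-contained; the real registration logic is out of scope.
function enterLottery(): void {}
```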

In a departure from the Opinion of Advocate General Szpunar, the ECJ did not rule on whether making participation in the lottery conditional upon the user giving his or her consent to advertising complied with the requirement for consent to be “freely given”, as this question was not referred to it. However, based on the AG’s opinion and recent guidance on the meaning of consent under the GDPR, we consider that it would often be difficult to meet the “freely given” requirement in this context.

  • Does it make a difference whether the information stored or accessed by the cookie or tracking technology constitutes personal data or not?

The ECJ points out that Article 5(3) of the e-Privacy Directive merely refers to "information", without characterising that information or specifying that it must be personal data. It is therefore irrelevant whether or not the data accessed by cookies constitute personal data: the consent requirement in Article 5(3) of the e-Privacy Directive applies regardless.

  • What information must the service provider give to users about the use of cookies and other tracking technology under Article 5(3) of the e-Privacy Directive, and does this include: (a) information about the duration of the cookies; and (b) information about third parties being given access to the cookies?

The ECJ clarified that the information provided must enable the user to determine the consequences of any consent he or she gives and, in this case, be sufficiently detailed so as to enable the user to understand the functioning of the cookies employed. It considers that this requires information about the duration of the cookies and whether or not third parties may have access to those cookies to be provided, a position it considers is supported by the transparency requirements of the GDPR.

Position in Germany

In Germany, the position on cookies in national law remains unique. Section 15(3) of the German Telemedia Act (Telemediengesetz) still allows cookies to be used in certain circumstances on an opt-out basis. How this will be interpreted in the light of the GDPR and the ECJ's decision above remains an open question, as the ECJ only answered the specific questions referred and did not give further guidance on the relationship between the GDPR and the German Telemedia Act. However, the ECJ's decision is certainly consistent with recent guidance issued by the German data protection authorities, which provides that in most use cases online tracking requires the user's prior consent in the form of an opt-in solution and that reliance on other grounds for processing, such as "legitimate interests", is not acceptable. This is especially so where cookies allow a third party to track users across different sites. The ECJ's decision may therefore be another step towards a more harmonised approach across the whole of the European Union.

Our take:

The ECJ decision should come as no surprise to those following this area of data protection law, as it confirms the position on consent already set out in the GDPR and reflects recent regulators' guidance across Europe. It also reinforces the need for companies to revisit their cookie notices and consent mechanisms to ensure that they are compliant with this position.

It is unfortunate that the judgment does not provide any further clarity on the level of information required about the third parties who may access the cookie data or how to collect consent on behalf of these third parties. However, this is perhaps unsurprising given that the regulators have recognised this as a key challenge that they are still considering, and it should not stop companies from improving their cookie notices and consent mechanisms in the meantime.

The right to be forgotten: the CJEU sides with Google in two landmark cases

On 24 September 2019 the Court of Justice of the European Union (CJEU) gave two judgments (Cases C-507/17 and C-136/17) ruling that: (i) de-referencing by Google should be limited to EU Member States’ versions of its search engine with some important qualifications; and (ii) when Google receives a request for de-referencing relating to a link to a web page on which sensitive data are published, a balance must be sought between the fundamental rights of the person requesting such de-referencing and those of internet users potentially interested in that information.

Google has already faced the right to be forgotten before the CJEU under Directive 95/46/EC (the Directive) in the landmark "Google Spain" case,[1] where the judges ruled that a search engine operator can be obliged to remove links to information about an individual from its list of results. This decision led to a large number of requests from individuals to remove such links, and notably to four complaints to the French data protection authority (the CNIL) from individuals whose de-referencing requests Google had rejected (see point 2 below).

The CJEU has ruled in Google’s favour.

1.      The right to be forgotten ends at the borders of the EU

In its decision of 10 March 2016 the CNIL had imposed a fine of €100,000 on Google Inc. because of the latter’s refusal, when granting a de-referencing request, to apply it to all its search engine’s worldwide domain name extensions.

Consequently, in its first judgment[2], the CJEU was asked to clarify the territorial scope of the right to be forgotten to determine whether a search engine operator is required to carry out that de-referencing on all its search engine’s worldwide domain name extensions or whether, on the contrary, it is required to do so only at a European or national level.

The CJEU’s decision starts by pointing out that, in the current globalised world, the access by internet users – including those located outside the EU – to the referencing of a link referring to information about an individual whose centre of interests is situated in the EU is likely to have “immediate and substantial effects on that person within the Union itself”, suggesting a worldwide de-referencing duty.

However, the Court qualifies this statement, noting that:

  • many non-EU countries may take a different approach to the right to de-referencing or may not even grant such a right; and
  • the right to the protection of personal data, not being an absolute right, must be balanced against other fundamental rights in line with the principle of proportionality.

In light of the foregoing, the Court ruled that an "operator is not required to carry out that de-referencing on all versions of its search engine, but on the versions of that search engine corresponding to all the Member States …". The Court underlined that such a de-referencing must, if necessary, be accompanied by "measures which … effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject's name from gaining access, via the list of results displayed following that search, [through a version of that search engine outside the EU], to the links which are the subject of that request".

The Court nevertheless pointed out that, while EU law does not currently require the de-referencing, by an operator such as Google, to apply to all the versions of its search engine, such practice is not prohibited. A balance between a data subject’s right to privacy and the protection of personal data concerning him/her, on the one hand, and the right to freedom of information, on the other, should be made by the authorities of Member States and could in certain cases still require the operator to carry out a worldwide de-referencing.

It should also be noted that Google already uses geo-blocking to prevent a user who appears to be located in an EU Member State from reaching, via a search page of a non-EU country, content that has been de-referenced on the search page of his/her own Member State. When the CJEU refers to "effective prevention" and "serious discouragement", it is clear that this type of additional technological measure will be required; what is not clear is how much further a Member State authority can require a search engine to go.
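
For illustration, the sketch below shows one way such a measure could be structured: de-referenced links are suppressed for any request geolocated to an EU Member State, whichever version of the search engine is queried. The link store, geolocation lookup and country list are hypothetical placeholders, not a description of Google's actual systems.

```typescript
// Hypothetical sketch: EU-wide de-referencing combined with geo-blocking.
interface SearchResult {
  url: string;
  title: string;
}

// Links de-referenced for searches on a given data subject's name (placeholder store).
const deReferenced = new Set<string>(["https://example.com/old-article"]);

// ISO country codes of EU Member States (truncated here for brevity).
const EU_MEMBER_STATES = new Set<string>(["FR", "DE", "ES", "IT" /* ... */]);

function filterResults(results: SearchResult[], requestIp: string): SearchResult[] {
  const country = geolocate(requestIp);
  // Suppress the links for users located in the EU, even when they query a
  // non-EU version of the search engine (the geo-blocking step).
  if (EU_MEMBER_STATES.has(country)) {
    return results.filter((r) => !deReferenced.has(r.url));
  }
  return results;
}

// Stub so the sketch compiles; a real system would call a geo-IP service.
function geolocate(ip: string): string {
  return "FR";
}
```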

2.      Prohibition on processing certain categories of sensitive data: fundamental rights vs. freedom of information

In its second judgment,[3] four individuals had requested that Google de-reference various links, appearing in the lists of results displayed by the search engine following searches of their names, resolving to web pages published by third parties. The web pages included, respectively, a satirical photo-montage of a politician, articles mentioning an individual as a public relations officer of the Church of Scientology, the judicial investigation of a politician, and the sentencing of another person for sexual assaults on minors.

Following Google's refusal to de-reference, the four individuals brought complaints before the CNIL, seeking an order requiring Google to de-reference the links. The CNIL declined to take up their complaints. The parties then brought their case before the French Council of State (Conseil d'Etat), which referred a number of questions to the CJEU, including whether the prohibition imposed on other controllers against processing special category personal data – such as political opinions, religious or philosophical beliefs and sex life – other than on a restrictive set of grounds also applies to the operator of a search engine.

The Court found that it did. In this context, the judges emphasised that operators of search engines are responsible “not because personal data referred to in those provisions appear on a web page published by a third party but because of the referencing of that page and in particular the display of the link to that web page in the list of results presented to internet users following a search”.

This allowed the Court to go on to find that a search engine operator only needed to comply with the very restrictive grounds for processing special category personal data once a request for removal (and the search engine operator’s verification) had been made. This neatly allowed searches which return special category personal data in the results to remain viable.

The Court then recalled its Google Spain judgment, which held that while a data subject's rights may, as a general rule, override the freedom of information of internet users, the balance between these rights must be assessed on a case-by-case basis taking into account:

  • the nature of the information in question and its sensitivity for the data subject’s private life; and
  • the interest of the public having that information, an interest “which may vary, in particular, according to the role played by the data subject in public life”.

Consequently, the Court concluded that, where search engines such as Google are facing a request by a data subject to exercise his/her right to be forgotten relating to a link to a web page containing special category personal data, they must consider all the relevant factors of the specific case and take into account “the seriousness of the interference with the data subject’s fundamental rights to privacy and protection of personal data” so that only the links in the list of results, displayed following a search on the basis of the data subject’s name, that are “strictly necessary for protecting the freedom of information of internet users” are retained.

The Court added that, where the processing relates to information made public by the data subject, an operator of a search engine may refuse to accede to a request for de-referencing provided that:

  • the processing meets all the other conditions of lawfulness; and
  • the data subject does not have the right to object to that processing on compelling legitimate grounds relating to his/her particular situation.

The Court also took an expansive view of the definition of criminal convictions data, holding that reporting on an investigation or trial is caught regardless of whether the data subject was subsequently convicted. It held that:

  • where the search returned criminal convictions data “which no longer reveal the current situation” then the operator, in light of all the circumstances of the case, must balance the data subject’s fundamental rights and the public’s right to freedom of information, considering “the nature and seriousness of the offence in question, the progress and the outcome of the proceedings, the time elapsed, the part played by the data subject in public life and his/her past conduct, the public’s interest at the time of the request, the content and form of the publication and the consequences of publication for the data subject”; and
  • if the results (e.g. because of the order in which links appear) suggest a criminal conviction is still current when it is not (and there is still a public interest in the older information remaining accessible), then, on request from a data subject, the search engine operator must adjust the results so that the up-to-date, accurate information is most prominent.

Our take

Although the first CJEU judgment would appear to put an end to the French data protection authority's absolutist vision of the territorial scope of the right to be forgotten, the detail is somewhat different. The door remains open, in particularly serious cases, for a data protection authority to find that de-listing in all EU Member States, coupled with geo-blocking access to search pages in non-EU countries, would be insufficient to protect the affected data subject's privacy. Indeed, on the very day the judgments were published, the CNIL underlined that, since a global de-referencing is not prohibited, it still has the authority to force a search engine operator to delist results on all versions of the search engine where justified in a particular case to guarantee the rights of the individuals concerned (see the CNIL press release here).

As for the second judgment, the Court confirmed that a similar balancing test between the data subject’s fundamental rights and the public right to freedom of information should apply in the context of the processing of criminal convictions data.

In our view, although these judgments go some way to curbing the extraterritorial impact of the GDPR, they still leave the door open for a worldwide application in the most egregious of cases.

They also do little to relieve search engine operators’ burdens in adjudicating right to be forgotten requests – the balancing test remains as precarious as ever – and we therefore expect to see continued complaints and CJEU references in this area.

For more information, the full text of the judgments can be found here and here.

[1] CJEU, 13 May 2014, C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González
[2] CJEU, 24 September 2019, C-507/17, Google LLC, successor in law to Google Inc. v Commission nationale de l'informatique et des libertés (CNIL)
[3] CJEU, 24 September 2019, C-136/17, GC and Others v Commission nationale de l'informatique et des libertés (CNIL)

New York’s Breach Law Amendments and New Security Requirements

Although California has recently captured the lion’s share of attention with respect to privacy and security, on October 23, 2019, New York’s amended security breach law goes into effect, and on March 1, 2020, new security safeguards go live (N.Y. S.B. 5575). Anyone with personal information about a New York resident is potentially affected by these far-reaching amendments.

Breach Law Changes

Readers may recall that New York's security breach notification law (N.Y. Gen. Bus. Law § 899-aa) differs from most states' laws in several ways, including by (1) using separate definitions of "personal information" and "private information"; and (2) providing factors to consider in determining whether personal information has been acquired. New York was among the majority of states whose breach law focused on acquisition of personal data (such as Social Security number, driver's license number, or credit card number and security code).

As of October 23, 2019, much of that will change:

  • New York will no longer be a purely "acquisition" state: either "access" to or "acquisition" of personal information may now constitute a breach requiring notice.
  • New York retains its very broad definition of “personal information” (“any information concerning a natural person, which, because of name . . . or other identifier, can be used to identify such natural person”), but the definition of “private information” (data elements) will expand to add two new categories (emphasis added):
    • Account number, credit card number, or debit card number, along with "personal information"—but no longer requiring a security code if the number could be used to access the individual's financial account without additional identifying information. This change is consistent with the position the New York Attorney General has taken since 2017, having found that many popular websites permitted purchases to be made with credit cards without requiring security codes.
    • Biometric information that is used to authenticate or ascertain the individual's identity.
  • “Private information” is also separately defined to mean user name or e-mail address in combination with the password or security question and answer—without any need for “personal information.”
  • New York will exclude from “private information” any encrypted data elements or “combination of personal information plus data elements”—as long as the encryption key has not been acquired by the unauthorized person.
  • Although New York has left unchanged its examples for determining whether information has been acquired, it has added one for "access" that we will reformat a bit to make sure you see the full impact:

In determining whether information has been accessed, or is reasonably believed to have been accessed, by an unauthorized person or a person without valid authorization, such business may consider, among other factors, indications that the information was: (1) viewed; (2) communicated with; (3) used; or (4) altered

by a person without valid authorization or by an unauthorized person.

  • New York no longer requires that the person or business conduct business in New York State; it requires only that the person or business own or license computerized data that includes the private information of a New York resident.
  • New York will permit persons or businesses to use a “risk of harm” analysis and determine not to provide notice, with some unique twists that are slightly reformatted to emphasize their potential full impact (emphasis added):

If the person or business reasonably determines such exposure will not likely result in: (1) misuse of such information; (2) financial harm to the affected persons; or (3) emotional harm in the case of unknown disclosure of online credentials [user name or e-mail address in combination with the password or security question and answer].

Once that determination is made, the person or business must document it in writing and maintain the documentation for five years. If the incident affects over 500 New York residents, the person or business must provide the written determination to the state Attorney General within 10 days after the determination. (A short sketch of this decision logic follows the list below.)

  • New York will expressly recognize that notices under HIPAA, Gramm-Leach-Bliley, and the New York Department of Financial Services' cybersecurity regulations, as well as notices provided under "other data security rules and regulations of, and statutes administered by" any federal or New York agency, will suffice for this statute and will not require a second notice. With respect to HIPAA only, the law now also provides that, if a covered entity must provide notice to the U.S. Department of Health and Human Services (HHS) but not under this New York law, the covered entity must provide a copy of the notice to HHS to the New York Attorney General within five business days of providing the notice to HHS.
  • The consumer notice will now be required to include phone numbers and websites of state and federal agencies that "provide information regarding security breach response and identity theft prevention and protection information."
  • New York amended its requirements relating to substitute notice. Although New York retains unchanged the requirements for "conspicuous posting" on the company's website and notification to major statewide media, a business will need to provide notice via e-mail if it has an e-mail address for the affected individual, unless the breached information includes the e-mail address plus the password/security question and answer. In that case, the business must instead offer "clear and conspicuous notice delivered to the consumer online when the consumer is connected to the online account from an internet protocol address or from an online location which the person or business knows the consumer customarily uses to access the online account."
  • Although New York still does not have a private right of action under this section, the amendment at least doubled the fines the Attorney General may seek for violations, from $10 to $20 for each instance of failed notification, up to a total of $250,000 (from $100,000). The time to bring an action also increased from two years to three, commencing on the earlier of the date the Attorney General learned of the breach or the date notice was provided.
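
As promised above, here is a minimal sketch of the risk-of-harm decision logic and the documentation duties it triggers. The interface, field names and helper functions are our own illustrative encoding of the amended statute, not an official schema, and the sketch omits any statutory conditions not quoted above.

```typescript
// Hypothetical encoding of the notice decision under amended N.Y. Gen. Bus. Law § 899-aa.
interface Determination {
  affectedNyResidents: number;
  likelyMisuse: boolean;        // (1) misuse of the information
  likelyFinancialHarm: boolean; // (2) financial harm to affected persons
  likelyEmotionalHarm: boolean; // (3) emotional harm (online credentials cases)
}

function noticeRequired(d: Determination): boolean {
  const harmLikely = d.likelyMisuse || d.likelyFinancialHarm || d.likelyEmotionalHarm;
  if (harmLikely) {
    return true; // the usual notice obligations apply
  }
  // "Risk of harm" exception: the determination must be documented in writing
  // and the documentation maintained for five years.
  documentDeterminationInWriting(d);
  if (d.affectedNyResidents > 500) {
    // Over 500 NY residents: the written determination must go to the
    // state Attorney General within 10 days of the determination.
    sendDeterminationToAttorneyGeneral(d);
  }
  return false;
}

// Placeholder record-keeping helpers (stubs for the sketch).
function documentDeterminationInWriting(d: Determination): void {}
function sendDeterminationToAttorneyGeneral(d: Determination): void {}
```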

Data Security Protections

As of March 1, 2020, New York will require reasonable security safeguards of any person or business that owns or licenses computerized data that includes "private information" of a New York resident (N.Y. Gen. Bus. Law § 899-bb).

For most companies, the new law requires that the person or business “develop, implement, and maintain reasonable safeguards to protect the security, confidentiality and integrity of private information including, but not limited to, disposal of data.” New York does not place specific requirements on these persons or companies, but instead provides examples of the elements of a data security program. For example, for administrative safeguards, the law lists safeguards “such as”:

(1)        Designates one or more employees to coordinate the security program;

(2)        Identifies reasonably foreseeable internal and external risks;

(3)        Assesses the sufficiency of safeguards in place to control the identified risks;

(4)        Trains and manages employees in the security program practices and procedures;

(5)        Selects service providers capable of maintaining appropriate safeguards, and requires those safeguards by contract; and

(6)        Adjusts the security program in light of business changes or new circumstances.

If these appear familiar, it is because they are a slightly revised version of the FTC’s Safeguards Rule requirements (16 CFR § 314.4). New York’s new law contains similar examples of technical and physical safeguards.

As with the amended breach law, this new law also states that compliance with the data security requirements of HIPAA, Gramm-Leach-Bliley, or the New York Department of Financial Services cybersecurity regulations, or other similar agency requirements, will satisfy this statute. In addition, the law states that, for "small businesses," compliance means "reasonable administrative, technical and physical safeguards" that are "appropriate for the size and complexity of the small business, the nature and scope of the small business's activities, and the sensitivity of the personal information the small business collects from or about consumers." The law defines a "small business" as a person or business that meets any one of three criteria (see the short sketch after the list):

(i)         Fewer than 50 employees;

(ii)        Less than $3 million in gross annual revenues in each of the last three fiscal years; or

(iii)       Less than $5 million in year-end total assets.
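
Because the three criteria are disjunctive, the definition reduces to a simple predicate. The sketch below is an illustrative encoding with hypothetical field names, not statutory text.

```typescript
// Hypothetical encoding of the "small business" definition in § 899-bb.
interface BusinessProfile {
  employees: number;
  grossAnnualRevenuesLastThreeYears: number[]; // USD, most recent three fiscal years
  yearEndTotalAssets: number;                  // USD
}

function isSmallBusiness(b: BusinessProfile): boolean {
  return (
    b.employees < 50 ||
    b.grossAnnualRevenuesLastThreeYears.every((r) => r < 3_000_000) ||
    b.yearEndTotalAssets < 5_000_000
  );
}
```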

The new requirements can be enforced by the Attorney General, and the law specifically states that there is no private right of action. This result differs from the California Consumer Privacy Act, which provides its only private right of action for a data breach caused by a business's failure to implement reasonable data security to protect the breached information (Cal. Civ. Code § 1798.150).

 
