California Governor signs all 5 CCPA amendments

On Friday, October 11, 2019, the California Governor signed all five of the California Consumer Privacy Act amendments that were awaiting his signature (AB 25, 874, 1146, 1355, and 1564), as well as an amendment to California’s data breach law (AB 1130). We had previously written about the impact on the CCPA if all five amendments went into effect here.

Mic Drop: California AG releases long-awaited CCPA Rulemaking

On October 10, 2019, with just weeks to go until the law takes effect, the California Attorney General released the long-awaited draft regulations for the California Consumer Privacy Act (CCPA). The proposed rules shed light on how the California AG is interpreting, and will be enforcing, key sections of the CCPA. In the press release announcing the proposed regulations, Attorney General Becerra described the CCPA as “[providing] consumers with groundbreaking new rights on the use of their personal information” and added, “It’s time we had control over the use of our personal data.” The proposed regulations are intended to operationalize the CCPA and provide practical guidance to consumers and to businesses subject to the law. In addition to the draft rules, the AG’s office also published a “CCPA Fact Sheet” and the “Initial Statement of Reasons,” which provide further insight into the regulatory focus and enforcement priorities. According to the AG’s office, the draft rules summarized below are needed to “[mitigate] the asymmetry of knowledge and power between individuals and businesses.” A business “must comply to the greatest extent it can” to give consumers greater control over their personal information: the right to know details about how their personal information is collected, used, and shared by businesses; the right to take control of their information by having businesses delete it and stop selling it; and the right to exercise these privacy rights without suffering discrimination in price or service.

The rules are not final.  The Attorney General will hold public hearings in four California cities during the first week of December to hear comments.  Written comments will be accepted by the Attorney General until 5 PM (Pacific time) on December 6, 2019.

Below is a summary of the proposed regulations that may have the most impact on organizations seeking to operationalize the CCPA requirements in time for the January 1, 2020 deadline.


No surprises in the recent Planet49 European Court of Justice judgment


On 1 October 2019, the European Court of Justice (ECJ) delivered its judgment in Case C-673/17 (the “Planet49” case), which relates to the consent and transparency requirements for the use of cookies and similar technologies. The ECJ largely followed the March 2019 Opinion of Advocate General Szpunar, and the judgment is generally consistent with the recent regulatory guidance issued by the UK and French data protection authorities in this area.

The decision:

Planet49 GmbH, a German company offering an online lottery service, used two checkboxes on the login page of its website for the online lottery. The first checkbox, which was unticked, required users to consent to being contacted for marketing purposes by third-party companies before they could participate in the lottery. The second checkbox, which was pre-checked, sought consent for installing cookies on users’ browsers. The German federal consumer rights group (Bundesverband der Verbraucherzentralen) brought an action asserting that the requested declarations of consent did not satisfy the relevant requirements of the German data protection laws. In November 2017, the German Federal Court of Justice (Bundesgerichtshof) referred the following questions to the ECJ, which found as follows:

  • Is a pre-checked checkbox, which the user must deselect to refuse his or her consent, valid consent for the purposes of Article 5(3) of the e-Privacy Directive (namely the “cookie consent” requirement) and have the consent requirements under the GDPR been met?

The ECJ held that a pre-checked box is not sufficient to obtain valid consent for placing cookies on users’ devices under Article 5(3) of the e-Privacy Directive, as it does not constitute an unambiguous indication of the data subject’s wishes.

The ECJ acknowledges that Article 5(3) of the e-Privacy Directive does not prescribe a specific way of obtaining consent to the storage of and access to cookies on users’ devices. However, it observes that the wording “given his or her consent” means that some action is required on the part of the user. It also notes that “consent” under the e-Privacy Directive previously had to be read as having the same meaning as consent under the Data Protection Directive (Directive 95/46/EC) and that the requirement for consent to be an “indication of the data subject’s wishes” under Directive 95/46/EC also points to “active, rather than passive, behaviour”. This, it concludes, is not the case where pre-checked boxes are used.

The court also points out that “only active behaviour on the part of the data subject with a view to giving his or her consent” fulfils the requirement under Directive 95/46/EC for consent to be “unambiguously” given. It would appear impossible to ascertain objectively whether a website user had actually granted his or her consent by merely continuing with his or her activity on the website visited (continuing browsing or scrolling) and, in doing so, failing to deselect a pre-checked box.

The ECJ considers that the GDPR has now closed off any debate on this issue, stating that the consent requirements under the GDPR are stricter, expressly requiring active consent and precluding “silence, pre-ticked boxes or inactivity” from constituting valid consent.

Separately, on the question of consent needing to be “specific”, the ECJ notes that “consent must relate specifically to the processing of the data and cannot be inferred from an indication of the data subject’s wishes for other purposes”. Therefore, the fact that a user clicks the button to participate in the promotional lottery is not sufficient to conclude that the user validly gave his or her consent to the storage of cookies, or to the sharing of his or her data with commercial partners.

In a departure from the Opinion of Advocate General Szpunar, the ECJ did not rule on whether making participation in the lottery conditional upon the user giving his or her consent to advertising complied with the requirement for consent to be “freely given”, as this question was not referred to it. However, based on the AG’s opinion and recent guidance on the meaning of consent under the GDPR, we consider that it would often be difficult to meet the “freely given” requirement in this context.
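To make the consent mechanics concrete, the following is a minimal sketch of a cookie consent control consistent with the Court’s reasoning: the box starts unchecked, no non-essential cookie is written until the user affirmatively acts, and the consent action is separate from participation in any promotion. The element IDs and the setAnalyticsCookie helper are invented for illustration and do not come from the judgment.

```typescript
// Minimal sketch (assumed element IDs, hypothetical helper): consent is only
// valid if it results from active, unambiguous behaviour by the user.
const consentBox = document.querySelector<HTMLInputElement>("#cookie-consent");
const saveButton = document.querySelector<HTMLButtonElement>("#save-preferences");

// Deliberately not pre-checked: a pre-ticked box is not valid consent.
if (consentBox) {
  consentBox.checked = false;
}

saveButton?.addEventListener("click", () => {
  if (consentBox?.checked) {
    // Only now, after an affirmative act by the user, is the cookie set.
    setAnalyticsCookie();
  }
  // If the box was left unchecked, no tracking cookie is written at all.
});

function setAnalyticsCookie(): void {
  // The 13-month lifetime is an arbitrary example; whatever duration is used
  // must be disclosed to the user (see the transparency point below).
  const maxAgeSeconds = 60 * 60 * 24 * 30 * 13;
  document.cookie = `analytics_id=${crypto.randomUUID()}; max-age=${maxAgeSeconds}; path=/`;
}
```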

  • Does it make a difference whether the information stored or accessed by the cookie or tracking technology constitutes personal data or not?

The ECJ points out that Article 5(3) of the e-Privacy Directive merely refers to “information”, without characterising that information or specifying that it must be personal data. It is therefore irrelevant whether the data accessed by cookies constitutes personal data: the consent requirement in Article 5(3) of the e-Privacy Directive applies regardless.

  • What information must the service provider give to users about the use of cookies and other tracking technology under Article 5(3) of the e-Privacy Directive, and does this include: (a) information about the duration of the cookies; and (b) information about third parties being given access to the cookies?

The ECJ clarified that the information provided must enable the user to determine the consequences of any consent he or she gives and, in this case, be sufficiently detailed so as to enable the user to understand the functioning of the cookies employed. It considers that this requires information about the duration of the cookies and whether or not third parties may have access to those cookies to be provided, a position it considers is supported by the transparency requirements of the GDPR.
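As a purely illustrative way of capturing the two disclosures the Court highlights, the hypothetical record type below models a per-cookie notice entry; the field names and sample values are our own assumptions, not drawn from the judgment.

```typescript
// Hypothetical shape for a cookie notice covering the two items the ECJ
// highlighted: the cookie's duration and any third-party recipients.
interface CookieDisclosure {
  name: string;                   // the cookie as set in the browser
  purpose: string;                // plain-language description of its function
  duration: number | "session";   // lifetime in days, or session-only
  thirdPartyRecipients: string[]; // empty if no third party gets access
}

const cookieNotice: CookieDisclosure[] = [
  {
    name: "analytics_id",
    purpose: "Measures aggregate site usage",
    duration: 395,
    thirdPartyRecipients: ["ExampleAnalytics GmbH"], // fictitious recipient
  },
  {
    name: "session_token",
    purpose: "Keeps you logged in",
    duration: "session",
    thirdPartyRecipients: [],
  },
];
```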

Position in Germany

In Germany, the position on cookies in national law remains unique. Section 15(3) of the German Telemedia Act (Telemediengesetz) still allows cookies to be used in certain circumstances on an opt-out basis. The question remains how this will be interpreted in light of the GDPR and the ECJ’s decision above, as the ECJ only answered the specific questions raised and did not give further guidance on the relationship between the GDPR and the German Telemedia Act. However, the ECJ decision is certainly consistent with recent guidance issued by the German data protection authorities, which provides that in most use cases online tracking requires the user’s prior consent in the form of an opt-in solution, and that reliance on other grounds for processing, such as “legitimate interests”, is not acceptable. This is especially so in cases where cookies allow a third party to track users across different sites. The ECJ’s decision may therefore be another step towards a more harmonised approach across the whole of the European Union.

Our take:

The ECJ decision should come as no surprise to those following this area of data protection law, as it confirms the position on consent already set out in the GDPR and reflects recent regulators’ guidance across Europe. It also reinforces the need for companies to revisit their cookie notices and consent mechanisms to ensure that they are compliant with the position taken.

It is unfortunate that the judgment does not provide any further clarity on the level of information required about the third parties who may access the cookie data or how to collect consent on behalf of these third parties. However, this is perhaps unsurprising given that the regulators have recognised this as a key challenge that they are still considering, and it should not stop companies from improving their cookie notices and consent mechanisms in the meantime.

The right to be forgotten: the CJEU sides with Google in two landmark cases


On 24 September 2019 the Court of Justice of the European Union (CJEU) gave two judgments (Cases C-507/17 and C-136/17) ruling that: (i) de-referencing by Google should be limited to EU Member States’ versions of its search engine with some important qualifications; and (ii) when Google receives a request for de-referencing relating to a link to a web page on which sensitive data are published, a balance must be sought between the fundamental rights of the person requesting such de-referencing and those of internet users potentially interested in that information.

Google had already faced the issue of the right to be forgotten before the CJEU under Directive 95/46/EC (Directive) in the landmark “Google Spain” case[1], where the judges ruled that a search engine operator can be obliged to remove links to information about an individual from its list of results. This decision led to a large number of requests from individuals to remove such links, and notably to four complaints to the CNIL (the French data protection authority) from individuals following Google’s rejection of their requests for de-referencing (see point 2 below).

The CJEU has ruled in Google’s favour.

1. The right to be forgotten ends at the borders of the EU

In its decision of 10 March 2016, the CNIL had imposed a fine of €100,000 on Google Inc. for its refusal, when granting a de-referencing request, to apply it to all of its search engine’s worldwide domain name extensions.

Consequently, in its first judgment[2], the CJEU was asked to clarify the territorial scope of the right to be forgotten to determine whether a search engine operator is required to carry out that de-referencing on all its search engine’s worldwide domain name extensions or whether, on the contrary, it is required to do so only at a European or national level.

The CJEU’s decision starts by pointing out that, in the current globalised world, the access by internet users – including those located outside the EU – to the referencing of a link referring to information about an individual whose centre of interests is situated in the EU is likely to have “immediate and substantial effects on that person within the Union itself”, suggesting a worldwide de-referencing duty.

However, the Court qualifies this statement by noting that:

  • many non-EU countries may take a different approach to the right to de-referencing or may not even grant such a right; and
  • the right to the protection of personal data, not being an absolute right, must be balanced against other fundamental rights in line with the principle of proportionality.

In light of the foregoing, the Court ruled that an “operator is not required to carry out that de-referencing on all versions of its search engine, but on the versions of that search engine corresponding to all the Member States …”. The Court underlined that such a de-referencing must, if necessary, be accompanied by “measures which … effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, [through a version of that search engine outside the EU], to the links which are the subject of that request”.

The Court nevertheless pointed out that, while EU law does not currently require the de-referencing carried out by an operator such as Google to apply to all the versions of its search engine, such a practice is not prohibited. A balance between a data subject’s right to privacy and the protection of personal data concerning him/her, on the one hand, and the right to freedom of information, on the other, should be struck by the authorities of the Member States and could in certain cases still require the operator to carry out a worldwide de-referencing.

It should also be noted that Google already used geo-blocking to stop a user apparently located in an EU Member State from accessing content de-referenced on the Google search page of his/her Member State via a search page of a non-EU country where the content was not de-referenced. When the CJEU refers to “effective prevention” and “serious discouragement”, it is clear that this type of additional technological measure will be required; what is not clear is how much further a Member State authority can require a search engine to go.
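Purely as an assumption about how such a measure might operate, the sketch below suppresses de-referenced links whenever a query appears to originate from an EU Member State, whichever version of the search engine served it. The lookupCountry helper and the data structures are placeholders, not Google’s actual implementation.

```typescript
// Sketch of a geo-blocking measure: when a query appears to come from an EU
// Member State, links de-referenced for a data subject are suppressed even
// if the user reached a non-EU version of the search engine.

const EU_MEMBER_STATES = new Set(["FR", "DE", "IE", "ES"]); // abbreviated list

// Links already removed following accepted de-referencing requests.
const deReferencedLinks = new Set<string>([
  "https://example.org/old-article-about-subject",
]);

function lookupCountry(ip: string): string {
  // Placeholder: a real system would consult an IP-geolocation database.
  return ip.startsWith("203.0.113.") ? "US" : "FR";
}

function filterResults(results: string[], clientIp: string): string[] {
  if (!EU_MEMBER_STATES.has(lookupCountry(clientIp))) {
    return results; // outside the EU, de-referencing is not currently required
  }
  // Apparent EU user: suppress de-referenced links regardless of which
  // national version of the search engine served the query.
  return results.filter((url) => !deReferencedLinks.has(url));
}
```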

2. Prohibition on processing certain categories of sensitive data: fundamental rights vs. freedom of information

In its second judgment,[3] four individuals had requested that Google de-reference various links, appearing in the lists of results displayed by the search engine following searches of their names, resolving to web pages published by third parties. The web pages included, respectively, a satirical photo-montage of a politician; articles mentioning an individual as a public relations officer of the Church of Scientology; coverage of the judicial investigation of a politician; and reports of the sentencing of another person for sexual assaults on minors.

Following Google’s refusal to de-reference, the four individuals brought complaints before the CNIL, seeking an order for Google to de-reference the links. The CNIL declined to take up their complaints. The parties then brought their case before the French Council of State (“Conseil d’Etat”), which referred a number of questions to the CJEU, including whether the prohibition imposed on other controllers on processing special category personal data – such as political opinions, religious or philosophical beliefs and sex life – unless one of a restrictive set of grounds applies, also extends to the operator of a search engine.

The Court found that it did. In this context, the judges emphasised that operators of search engines are responsible “not because personal data referred to in those provisions appear on a web page published by a third party but because of the referencing of that page and in particular the display of the link to that web page in the list of results presented to internet users following a search”.

This allowed the Court to go on to find that a search engine operator only needed to comply with the very restrictive grounds for processing special category personal data once a request for removal (and the search engine operator’s verification) had been made. This neatly allowed searches which return special category personal data in the results to remain viable.

The Court then recalled its Google Spain judgment, which held that, while a data subject’s rights may, as a general rule, override the freedom of information of internet users, the balance between these rights must be assessed on a case-by-case basis, taking into account:

  • the nature of the information in question and its sensitivity for the data subject’s private life; and
  • the interest of the public having that information, an interest “which may vary, in particular, according to the role played by the data subject in public life”.

Consequently, the Court concluded that, where search engines such as Google are facing a request by a data subject to exercise his/her right to be forgotten relating to a link to a web page containing special category personal data, they must consider all the relevant factors of the specific case and take into account “the seriousness of the interference with the data subject’s fundamental rights to privacy and protection of personal data” so that only the links in the list of results, displayed following a search on the basis of the data subject’s name, that are “strictly necessary for protecting the freedom of information of internet users” are retained.

The Court added that, where the processing relates to information made public by the data subject, an operator of a search engine may refuse to accede to a request for de-referencing provided that:

  • the processing meets all the other conditions of lawfulness; and
  • the data subject does not have the right to object to that processing on compelling legitimate grounds relating to his/her particular situation.

The Court also took an expansive view of the definition of criminal convictions data by holding that reporting on an investigation or trial was caught regardless of whether the data subject was subsequently convicted. It held that:

  • where the search returned criminal convictions data “which no longer reveal the current situation” then the operator, in light of all the circumstances of the case, must balance the data subject’s fundamental rights and the public’s right to freedom of information, considering “the nature and seriousness of the offence in question, the progress and the outcome of the proceedings, the time elapsed, the part played by the data subject in public life and his/her past conduct, the public’s interest at the time of the request, the content and form of the publication and the consequences of publication for the data subject”; and
  • if the results (e.g. because of the order in which links appear) suggest a criminal conviction is still current when it is not (and there is still a public interest in the older information remaining accessible), then, on request from a data subject, the search engine operator must adjust the results so that the up-to-date, accurate information is most prominent.

Our take

Although the first CJEU judgment would appear to put an end to the French data protection authority’s absolutist vision of the territorial scope of the right to be forgotten, the detail is somewhat different. The door remains open in particularly serious cases for a data protection authority to find that de-listing in all EU member states, coupled with geo-blocking access to search pages in non-EU countries, would be insufficient to protect the affected data subject’s privacy. Indeed on the very day the judgments were published, the CNIL underlined that, since a global de-referencing is not prohibited, it still has the authority to force a search engine operator to delist results on all the versions of the search engine where justified in a particular case to guarantee the rights of the individuals concerned (see the CNIL press release here).

As for the second judgment, the Court confirmed that a similar balancing test between the data subject’s fundamental rights and the public right to freedom of information should apply in the context of the processing of criminal convictions data.

In our view, although these judgments go some way to curbing the extraterritorial impact of the GDPR, they still leave the door open for a worldwide application in the most egregious of cases.

They also do little to relieve search engine operators’ burdens in adjudicating right to be forgotten requests – the balancing test remains as precarious as ever – and we therefore expect to see continued complaints and CJEU references in this area.

For more information, the full text of the judgments can be found here and here.

[1] CJEU, 13 May 2014, C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González
[2] CJEU, 24 September 2019, C-507/17, Google LLC, successor in law to Google Inc. v Commission nationale de l’informatique et des libertés (CNIL)
[3] CJEU, 24 September 2019, C-136/17, GC and Others v Commission nationale de l’informatique et des libertés (CNIL)

New York’s Breach Law Amendments and New Security Requirements

Although California has recently captured the lion’s share of attention with respect to privacy and security, on October 23, 2019, New York’s amended security breach law goes into effect, and on March 1, 2020, new security safeguards go live (N.Y. S.B. 5575). Anyone with personal information about a New York resident is potentially affected by these far-reaching amendments.

Breach Law Changes

Readers may recall that New York’s security breach notification law (N.Y. Gen. Bus. Law § 899-aa) differs from most states’ laws in several ways, including (1) using separate definitions of “personal information” and “private information”; and (2) providing factors to consider in determining whether personal information has been acquired. New York was among the majority of states whose breach law focused on acquisition of personal data (including Social Security number, driver’s license number, or credit card number and security code).

As of October 23, 2019, much of that will change:

  • New York will no longer be a purely “acquisition” state: either unauthorized access to or acquisition of personal information can now constitute a breach requiring notice.
  • New York retains its very broad definition of “personal information” (“any information concerning a natural person, which, because of name . . . or other identifier, can be used to identify such natural person”), but the definition of “private information” (data elements) will expand to add two new categories (emphasis added):
    • Account number, credit card number, debit card number, along with “personal information”—but no longer requiring security code if the number could be used to access the individual’s financial account without additional identifying information. This change is consistent with the New York Attorney General’s position since 2017, which found that many popular websites permitted purchases to be made with credit cards without requiring security codes.
    • Biometric information that is used to authenticate or ascertain the individual’s identity.
  • “Private information” is also separately defined to mean user name or e-mail address in combination with the password or security question and answer—without any need for “personal information.”
  • New York will exclude from “private information” any encrypted data elements or “combination of personal information plus data elements”—as long as the encryption key has not been acquired by the unauthorized person (a minimal illustration of this carve-out appears after this list).
  • Although New York has left unchanged its examples to determine if information has been acquired, it has added one for “access” that we will reformat a bit to make sure you see the full impact:

In determining whether information has been accessed, or is reasonably believed to have been accessed, by an unauthorized person or a person without valid authorization, such business may consider, among other factors, indications that the information was: (1) viewed; (2) communicated with; (3) used; or (4) altered

by a person without valid authorization or by an unauthorized person.

  • New York no longer requires that the person or business conduct business in New York state, but rather requires only that the person or business simply own or license computerized data that includes private information of a New York resident.
  • New York will permit persons or businesses to use a “risk of harm” analysis and determine not to provide notice, with some unique twists that are slightly reformatted to emphasize their potential full impact (emphasis added):

If the person or business reasonably determines such exposure will not likely result in: (1) misuse of such information; (2) financial harm to the affected persons; or (3) emotional harm in the case of unknown disclosure of online credentials [user name or e-mail address in combination with the password or security question and answer].

Once that determination is made, the person or business must document it in writing and maintain it for five years. If the incident affects over 500 New York residents, then the person or business must provide the written determination to the state attorney general within 10 days after the determination.

  • New York will expressly recognize that notices under HIPAA, Gramm-Leach-Bliley, and the New York Department of Financial Services’ cybersecurity regulations, as well as notices provided under “other data security rules and regulations of, and statutes administered by” any federal or New York agency, will suffice for this statute and will not require a second notice. With respect to HIPAA only, the law now also provides that, if a covered entity must provide notice to the U.S. Department of Health and Human Services (HHS) but not under this New York law, the covered entity must provide a copy of the notice sent to HHS to the New York Attorney General within five business days of notifying HHS.
  • The consumer notice will now be required to include phone numbers and websites of state and federal agencies that “provide information regarding security breach response and identity theft prevention and protection information.”
  • New York amended its requirements relating to substitute notice. Although New York retains unchanged the requirements for “conspicuous posting” on the company’s website, and notification to major statewide media, a business will need to provide notice via e-mail if the business has an e-mail address for the affected individual unless the breached information includes the e-mail address plus the password/security question and answer. In that case, the business must instead offer “clear and conspicuous notice delivered to the consumer online when the consumer is connected to the online account from an internet protocol address or from an online location which the person or business knows the consumer customarily uses to access the online account.”
  • Although New York still does not have a private right of action under this section, the amendment at least doubled the fines the Attorney General may seek for violations, from $10 to $20 for each instance of failed notification, up to a total of $250,000 (from $100,000). The time to bring an action also increased from two years to three, commencing with the earlier of the date the Attorney General learned of the breach or notice was provided.
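As a purely illustrative aside on the encryption carve-out flagged in the list above, the sketch below (using Node’s built-in crypto module; key management is deliberately out of scope) shows a data element stored only in encrypted form. Under the amended definition, such a record would fall outside “private information” so long as the key is not also acquired by the unauthorized person.

```typescript
import { randomBytes, createCipheriv } from "node:crypto";

// AES-256 key; under the amended statute the carve-out only helps if this
// key is stored separately and is NOT acquired along with the data.
const key = randomBytes(32);

function encryptElement(plaintext: string): Buffer {
  const iv = randomBytes(12); // fresh GCM nonce per element
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([
    cipher.update(plaintext, "utf8"),
    cipher.final(),
  ]);
  // Store nonce + auth tag + ciphertext together; the key lives elsewhere.
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]);
}

// A breached table holding only values like this one, without the key,
// would not contain "private information" under the amended definition.
console.log(encryptElement("4111 1111 1111 1111").toString("base64"));
```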

Data Security Protections

As of March 1, 2020, New York will require reasonable security safeguards of any person or business that owns or licenses computerized data that includes “private information” of a New York resident (N.Y. Gen. Bus. Law § 899-bb).

For most companies, the new law requires that the person or business “develop, implement, and maintain reasonable safeguards to protect the security, confidentiality and integrity of private information including, but not limited to, disposal of data.” New York does not place specific requirements on these persons or companies, but instead provides examples of the elements of a data security program. For example, for administrative safeguards, the law lists safeguards “such as”:

(1)        Designates one or more employees to coordinate the security program;

(2)        Identifies reasonably foreseeable internal and external risks;

(3)        Assesses the sufficiency of safeguards in place to control the identified risks;

(4)        Trains and manages employees in the security program practices and procedures;

(5)        Selects service providers capable of maintaining appropriate safeguards, and requires those safeguards by contract; and

(6)        Adjusts the security program in light of business changes or new circumstances.

If these appear familiar, it is because they are a slightly revised version of the FTC’s Safeguards Rule requirements (16 CFR § 314.4). New York’s new law contains similar examples of technical and physical safeguards.

As with the amended breach law, this new law also states that compliance with the data security requirements of HIPAA, Gramm-Leach-Bliley or New York Department of Financial Services cybersecurity regulations, or other similar agency requirements, will meet this statute. In addition, the law states that, for “small businesses,” compliance means “reasonable administrative, technical and physical safeguards” that are “appropriate for the size and complexity of the small business, the nature and scope of the small business’s activities, and the sensitivity of the personal information the small business collects from or about consumers.” The law defines a “small business” as a person or business that meets one of three criteria:

(i)         Fewer than 50 employees;

(ii)        Less than $3 million in gross annual revenues in each of the last three fiscal years; or

(iii)       Less than $5 million in year-end total assets.

The new requirements can be enforced by the Attorney General, and the law specifically states that there is no private right of action. This result differs from the California Consumer Privacy Act, which provides its only private right of action for a data breach caused by a business’s failure to implement reasonable data security to protect the information breached. Cal. Civ. Code § 1798.150.

 

Office of Privacy Commissioner Says It’s Status Quo on Consent Requirements for Data Processing Transfers

On September 23, the Office of the Privacy Commissioner of Canada (OPC) announced, following consultation with stakeholders, that it will maintain the position set out in its 2009 guidelines that an organization’s transfer of personal information to a third party for processing, including a transfer across the Canadian border, is a “use” of that personal information, and not a disclosure that requires separate consent.

This announcement brings at least temporary clarity to an issue that resulted in a tumultuous summer for organizations and the OPC alike as everyone grappled with the potential consequences of the OPC’s June 2019 announcement of a proposed shift in policy to treat transfers for processing as “disclosures” rather than “uses” of personal information under the Personal Information Protection and Electronic Documents Act (PIPEDA).

What’s Old is New Again

In January 2009, the OPC issued Guidelines for processing personal data across borders setting out its interpretation that a “transfer” of personal information by an organization for processing is a “use” and not a “disclosure” of that personal information. The limit on the transfer was that the personal information could only be used for the purposes for which the information was originally collected. Therefore, when an organization transferred personal information to a third party for processing, additional consent for the transfer itself was not required. Processing was broadly interpreted to include any use of the information by the third party for a purpose for which the transferring organization can use it.

The OPC did expressly state in its guidelines that organizations would need to make it plain to individuals, ideally at the time of collection, that their information may be processed in a foreign country, and may be accessible to law enforcement and national security authorities of that jurisdiction. Notably, the guidelines stated that once informed individuals have chosen to do business with a particular company, they do not have an additional right to refuse to have their information transferred for processing purposes.

Organizations duly structured their consent practices and procedures to account for this interpretation of PIPEDA. As a result, the vast majority of organizations have not been obtaining separate consent to transfers for processing.

However, in April 2019, the OPC announced it was revisiting this position. Specifically, the OPC announced its view that transfers of personal information for processing, including cross-border transfers, are disclosures that require separate consent. This change in position followed the OPC’s April 2019 investigation findings on Equifax Inc.’s and Equifax Canada Co.’s compliance with PIPEDA in light of the 2017 breach of personal information. The OPC based its findings on the principle that individuals would expect to know whether and where their personal information may be transferred or disclosed to an organization outside of Canada.

Under the OPC’s revised interpretation, organizations would be required to inform individuals of any options available to them if they did not wish to have their personal information disclosed across borders. This would allow individuals to make an informed decision about whether to consent to the disclosure and therefore do business with the organization.

The OPC initially set out to consult with stakeholders on this revised position, but then took a step back in May 2019 when the Department of Innovation, Science and Economic Development (ISED) published its Digital Charter, which contemplates the amendment of PIPEDA. That step back was short-lived, however, as the OPC reissued its request for consultation in June.

Following receipt and consideration of submissions from 87 stakeholders, most of which were critical of the proposed shift, the OPC has now reverted to its original position: a “transfer” of personal information by an organization for processing is again a “use” and not a “disclosure” of that personal information. The OPC, recognizing that more than one interpretation of the requirement for consent was possible, determined it was pragmatic to maintain its previous position until PIPEDA itself is amended.

The OPC will now focus instead on its submissions to ISED for modernizing PIPEDA, including on how to most effectively protect individuals’ privacy rights in the context of transfers for processing. This suggests that while the debate is not over, its eventual resolution will be determined by Parliament.

Challenges associated with the OPC’s changes of position

The OPC’s guidelines, while important and useful tools to interpret PIPEDA, are not legal precedent and therefore may be more freely subject to change.

At the same time, organizations do establish their organizational processes based on the guidelines issued by the OPC, which allows organizations and consumers to have confidence that their processes are compliant with privacy law obligations. PIPEDA is, after all, intended to “support and promote electronic commerce by protecting personal information that is collected, used or disclosed in certain circumstances.”

Therefore, by maintaining the status quo in an effort to preserve organizations’ confidence in their own processes, while at the same time making clear its view that those processes are deficient, the OPC has in effect created temporary clarity tempered by a persistent uncertainty about its de facto expectations and future intentions regarding transfers of personal information for processing.

What is clear from the OPC’s most recent announcement is that organizations should at the very least be transparent with individuals that their information may be processed in a foreign country and may be accessible to law enforcement and national security authorities of that jurisdiction. Best practice would be to advise individuals of the details of the transfer at the time consent is obtained.

Finally, with the loss or misuse of personal information by organizations being highlighted in news cycles, consumers are more aware of how their personal information is handled. Where consumers do not believe organizations have met their expectations for transparency or security, this can create reputational and legal risk for an organization. Organizations should be cognizant of their consumers’ expectations, and of the risks associated with the transfer of personal information to other jurisdictions, when designing consent and transfer processes. This is particularly so where significant privacy risks arise from the cross-border transfer, such as the transfer of information about Canadian individuals’ participation in activities that are legal in Canada but not in the other jurisdiction (cannabis use, for instance).

Data protection and cyber risk issues in arbitration – dealing with regulation, cyber attacks and hacked evidence

The GDPR has significantly altered the landscape of data protection. Its broad scope and potentially severe penalties have forced those who hold and process data to take note of its provisions. In certain instances, that will include many in the international arbitration community, such as arbitral institutions. In parallel, cyber attacks and instances of hacking in the arbitration context have brought cyber security issues to the fore.

As a result, data protection and cyber security are now hot topics in international arbitration. A majority of respondents in the 2018 Queen Mary International Arbitration Survey listed “security of electronic communications and information” as an issue that should be addressed in arbitration rules. This clearly demonstrates that the users of arbitration are concerned about data security. While there are signs that the market is listening, users seem to think that arbitral institutions, counsel and tribunals could do more to address cyber security.

In our article published in the latest International Arbitration Report, we examine three areas of data protection and cyber security in arbitration:

  • The EU’s GDPR and how it bears on international arbitration;
  • Data breaches in arbitral proceedings and cyberattacks on institutions, and how institutions are responding; and
  • How hacked evidence might appear in arbitration, and how tribunals have dealt with this issue.

The full article is available here.
