At the end of 2019, following a public consultation, the CNIL adopted its much-anticipated “standard” on whistleblowing systems. The “standard” is essentially a reference document which serves as guidance for those implementing whistleblowing systems.
On 21 January 2020, at the World Economic Forum annual meeting, Singapore’s Personal Data Protection Commission (PDPC) and Info-communications Media Development Authority (IMDA) released the second edition of their Model Artificial Intelligence (“AI”) Governance Framework (“Model Framework”).
Background – the Model Framework
The Model Framework, first released in January 2019, is a voluntary set of ethical principles, governance considerations and recommendations that organisations can adopt when deploying AI technologies at scale. It is not legally binding.
At the heart of the Model Framework are two high-level guiding principles:
(1) organisations using AI in decision-making should ensure that the decision-making process is explainable, transparent and fair; and
(2) AI solutions should be human-centric.
The Model Framework provides guidance on the following four key governance areas:
- Internal governance structure and measures: adapting existing internal governance structures and measures, or setting up new ones, to incorporate values, risks and responsibilities relating to algorithmic decision-making. This includes delineating clear roles and responsibilities for AI governance within an organisation, establishing processes and procedures to manage risks, and training staff.
- Determining the level of human involvement in AI-augmented decision-making: assessing the risks and identifying an appropriate level of human involvement in the decision-making process in order to minimise the risk of harm to individuals.
- Operations management: issues to be considered when developing, selecting and maintaining AI models, including data management.
- Stakeholder interaction and communication: strategies for communicating with an organisation’s stakeholders concerning the use of AI.
The Model Framework is intended to be broadly applicable – it is algorithm-agnostic, technology-agnostic, sector-agnostic and scale-and-business-model-agnostic. Consequently, it can be adopted across all industries and businesses, regardless of the specific AI solution or technology involved.
The Second Edition – Key Updates
Key updates introduced include:
(1) Inclusion of industry examples
The Model Framework now includes real-life industry examples in each of the four key governance areas, demonstrating effective implementation by organisations. These examples are drawn from a variety of industries – ranging from banking, finance and healthcare to technology and transportation – and are based on different use cases, reinforcing the neutral and flexible nature of the framework.
(2) Additional tools to enhance the usability of the Model Framework
In addition to the Model Framework, IMDA and PDPC have also concurrently released two additional documents to guide organisations in adopting the Model Framework:
- The Implementation and Self-Assessment Guide for Organisations (ISAGO); and
- The Compendium of Use Cases.
The ISAGO was jointly developed by the IMDA, the PDPC and the World Economic Forum Centre for the Fourth Industrial Revolution, with input from the industry. It is designed to be a DIY guide for organisations seeking to implement the Model Framework and identify potential gaps in their existing AI governance framework.
The Compendium of Use Cases sets out various case studies of organisations which have operationalised the principles from the Model Framework. The case studies showcase the flexibility of the Model Framework, which can be adapted according to the needs and priorities of the different organisations.
(3) The inclusion of new measures
The following new measures have been introduced into the Model Framework:
- Robustness refers to “the ability of a computer system to cope with errors during execution and erroneous input, and is assessed by the degree to which a system or component can function correctly in the presence of invalid input or stressful environmental conditions”.
- Reproducibility refers to “the ability of an independent verification team to produce the same results using the same AI method based on the documentation made by the organisation”. This is different from repeatability, which refers to “the internal repetition of results within one’s organisation”.
- Auditability refers to “the readiness of an AI system to undergo an assessment of its algorithms, data and design processes”.
These three measures are aimed at helping organisations enhance the transparency of the algorithms used in AI models. Industry examples and further elaboration on how to implement these three measures can be found under the “Operations Management” section of the Model Framework.
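To make the reproducibility measure concrete, the sketch below shows one common pattern for achieving it in practice: deriving all randomness in a training routine from a single documented seed, so that an independent verification team following the same documentation obtains identical results. The function and data are purely illustrative and are not drawn from the Model Framework itself.

```python
import random

def train_model_stub(data, seed: int):
    """Stand-in for a training routine; only the seeding pattern matters here."""
    rng = random.Random(seed)  # all randomness flows from one documented seed
    weights = [rng.gauss(0, 1) for _ in data]
    return weights

# An independent team following the same documentation (same data, same seed)
# reproduces identical results; a different seed does not.
run_a = train_model_stub([1, 2, 3], seed=42)
run_b = train_model_stub([1, 2, 3], seed=42)
assert run_a == run_b
```

Documenting the seed (and library versions) alongside the model is what turns internal repeatability into reproducibility by an outside team.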
(4) Other changes
Other useful changes introduced in the Second Edition of the Model Framework include:
- Clarifying the concept of “human-over-the-loop” by explaining the supervisory role to be performed by a human in AI-augmented decision-making.
- Clarifying that organisations can consider factors such as the nature of harm (i.e. whether the harm is physical or intangible in nature), reversibility of harm (i.e. whether recourse is easily available to the affected party) and operational feasibility in determining the level of human involvement in such AI-augmented decision-making processes.
- Providing suggestions on the level of information to be provided when interacting with various stakeholders to build trust such as including information on how AI is used in decision-making, how the organisation has mitigated risks and an appropriate channel to contest decisions made by AI.
- Providing further guidance to organisations on how to adopt a risk-based approach to implementing AI governance measures.
The PDPC encourages feedback from organisations that adopt and apply the principles in the Model Framework. It is a “living document” that will evolve alongside the fast-paced changes in technology and the digital economy, as well as feedback from adopting organisations.
This Second Edition of the Model Framework is the result of feedback from organisations that have adopted it (an impressive list including large technology companies and global financial institutions), as well as Singapore’s participation in leading international platforms such as the European Commission’s High-Level Expert Group on Artificial Intelligence and the OECD Expert Group on AI. With this valuable feedback and the real-life industry examples illustrating how companies have applied the key governance areas, the Second Edition will provide even greater practical guidance to organisations seeking to implement the Model Framework when deploying AI solutions.
Whilst the Model Framework does not impose any binding legal or regulatory obligations, organisations that intend to deploy AI solutions at scale should consider and apply it. As an accountability-based governance framework, adherence to it will assist an organisation in demonstrating that it has implemented accountability-based principles in data management and protection (e.g. the personal data protection obligations under the Singapore Personal Data Protection Act 2012).
The authors would like to thank Ji En Lee, associate at Ascendant Legal LLC, for his contribution to this article.
The CNIL has published draft recommendations on how to obtain consent when placing cookies. This follows the publication of its revised “Guidelines on the implementation of cookies or similar tracking technologies” in July 2019 (see our article here).
The objective of the recommendations is to provide stakeholders with practical guidance and illustrative examples. These recommendations are neither exhaustive nor binding and data controllers are free to consider other practical measures as long as they comply with the revised rules as provided by the CNIL in July 2019. The CNIL also provides a number of “good practices” that will enable businesses to go even further in their compliance process.
Scope of consent
Consent is required for all cookies other than those necessary for the use of the website/app, whether they are used in “logged” or “un-logged” environments, and whether they are implemented by the website/app operator or a third party.
Before collecting consent, data controllers must ensure that proper information has been provided to users.
On a first layer of information, websites/app operators are recommended to provide information about:
(i) the purposes of the cookies (a title and a short description would suffice);
(ii) the number of data controllers who have access to the cookies (and associated data);
(iii) whether a user’s consent is also valid for tracking his/her navigation throughout other websites or apps (and which ones); and
(iv) the right to withdraw consent at any time and how (the CNIL recommends using descriptive and intuitive titles, such as “Cookie Management Form” or “Manage My Cookies”).
This information must be provided in a clear, easily accessible and exhaustive manner before seeking the user’s consent.
From this first layer, the user should be able to easily access further, more detailed information such as:
(i) a detailed description of each purpose of the cookies (e.g. via a scroll button or a hypertext link, marked “find out more” or “for more information”), and
(ii) an up-to-date list of the data controllers, their roles, and a link to their privacy policies. Note that only substantial modifications to this list will require a new consent. Moreover, as good practice, the CNIL discourages the use of masking techniques that hide the identity of the entity using the cookies, such as sub-domain delegation.
The CNIL recommends that this information be accessible on all pages of the website and placed in fields of the screen that catch the attention of users, or in areas where they expect to find it. Standardised icons should be used to this end.
Obtaining valid consent
Unsurprisingly, the CNIL applies the GDPR criteria for consent: it must be freely given, specific, informed and unambiguous. The CNIL’s main recommendations are that:
- Consent should be sought purpose by purpose. Nevertheless, this does not prevent the user from giving global consent to all the purposes of the processing provided that the user has: (i) been presented with all the purposes beforehand; (ii) also been given the opportunity to give consent purpose by purpose; and (iii) can easily refuse all the purposes at the same time on the same conditions. The CNIL also recommends the use of different cookies for each distinct purpose and the use of explicit and standardised names for cookies (e.g. functionality cookies, advertising cookies, etc).
- Data subjects must be able to consent or withhold their consent with the same degree of simplicity. This implies that the acceptance and refusal mechanisms should be at the same level on the web page and be presented in the same technical manner and that no negative consequence should arise from the user’s refusal to consent to the implementation of cookies.
- Finally, the CNIL stresses the importance of a neutral design and prohibits deceptive practices that are likely to mislead the user by, for example, suggesting that user acceptance is mandatory. In this respect, it is recommended that a simple consent procedure such as tick boxes be used.
- The CNIL recommends that consent be renewed at appropriate intervals (e.g. every 6 months).
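As an illustration of the purpose-by-purpose principle, the sketch below models a consent banner backend that records one explicit choice per purpose and makes refusing all purposes exactly as simple as accepting them all. The purpose names and data structure are our own assumptions, not prescribed by the CNIL.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical purpose identifiers for illustration only.
PURPOSES = ["audience_measurement", "personalised_advertising", "social_media_sharing"]

@dataclass
class ConsentRecord:
    choices: dict  # purpose -> True (accepted) / False (refused)
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def record_consent(per_purpose_choices: dict) -> ConsentRecord:
    """Store one explicit choice per purpose; no purpose may default to acceptance."""
    missing = [p for p in PURPOSES if p not in per_purpose_choices]
    if missing:
        raise ValueError(f"no explicit choice recorded for: {missing}")
    return ConsentRecord(choices=dict(per_purpose_choices))

# "Accept all" and "refuse all" are symmetrical: one call either way.
accept_all = record_consent({p: True for p in PURPOSES})
refuse_all = record_consent({p: False for p in PURPOSES})
```

The key design point is that a global “accept all” is only syntactic sugar over per-purpose choices that were all presented to the user, and the refusal path costs the user no more effort than the acceptance path.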
The “double proof” of consent
The controller must at all times be able to prove that:
1. the individual gave their consent, e.g. via a timestamp of the consent, the context in which the consent was collected, the type of consent collection mechanism used, and the purposes to which the user has consented; and
2. the consent mechanism meets all the requirements set out above, e.g. via deposit of the source code of the website/app from which the consent is collected with a third-party escrow agent to create a dated proof that the consent mechanism exists, screenshots of the consent interface or regular audits.
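One way to evidence the first limb in practice is to store a dated, self-describing consent receipt for each consent event. The sketch below is a minimal illustration (the field names are our own, not mandated by the CNIL); the hash simply provides a cheap integrity check over the record, and the interface version ties the event back to an archived copy of the consent screen.

```python
import hashlib
import json
from datetime import datetime, timezone

def consent_receipt(user_id: str, purposes: dict, mechanism: str, ui_version: str) -> dict:
    """Build a dated, tamper-evident record of a single consent event."""
    receipt = {
        "user_id": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "purposes": purposes,              # purpose -> accepted / refused
        "mechanism": mechanism,            # e.g. "tick-box banner"
        "consent_ui_version": ui_version,  # links the event to an archived interface
    }
    # Hashing the canonical JSON gives a simple integrity check on the record.
    canonical = json.dumps(receipt, sort_keys=True)
    receipt["integrity_hash"] = hashlib.sha256(canonical.encode()).hexdigest()
    return receipt
```

For example, `consent_receipt("user-1", {"advertising": True, "analytics": False}, "tick-box banner", "v3")` captures in one record the timestamp, context, mechanism and purposes that the CNIL expects controllers to be able to produce.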
This draft is subject to public consultation until 25 February 2020, and a final version should be released soon after.
Regulators across the EU are taking an increasingly strict approach to the rules on cookies. The adtech industry is already reacting to the consequences of this reality. Google, for example, has recently announced its intention to block third-party cookies in Chrome web browsers within two years, following the example of its competitors Safari and Firefox.
In France, the CNIL has announced a transition period of 6 months from the publication of the final recommendation (following the public consultation). Website operators still have until the end of the summer of 2020 before risking any sanction. However, international organisations should be aware that, so far, other EU data protection authorities have issued substantially similar recommendations and have not granted a period of grace for the implementation of these recommendations. Therefore, it is advisable to start reviewing internal cookie practices and policies in order to comply with the CNIL’s recommendations and “good practice”.
The CNIL follows the CJEU approach in the Fashion ID case (see our article on the subject here).
Happy Data Privacy Day! Data Privacy Day represents a timely opportunity to highlight anticipated significant developments in Canadian privacy law in 2020 that we are monitoring following two major developments from the Government of Canada.
In the private sector in Canada, federally regulated businesses, works and undertakings are subject to the Personal Information Protection and Electronic Documents Act (“PIPEDA”), including with respect to their collection, use and disclosure of employee personal information. In addition, provincially and territorially regulated organizations that collect, use or disclose personal information in the course of their commercial activities are subject to PIPEDA, unless the province or territory has enacted legislation deemed “substantially similar”. Currently, only Alberta, British Columbia and Quebec have enacted substantially similar legislation, with the result that PIPEDA applies broadly to organizations across Canada.
In response to consultations across Canada and in recognition of the importance of Canada’s growing digital economy, the Government of Canada announced its Digital Charter, and launched its National Digital and Data consultations by publishing an accompanying paper entitled Strengthening Privacy for the Digital Age, which included numerous recommendations for amending PIPEDA.
In its Digital Charter, the Government of Canada articulates a principled approach to digital and data transformation, setting out ten principles to guide amendments to PIPEDA. Proposed amendments include:
- Enhancing the control and transparency that individuals have over their personal information by requiring specific standardized plain language information on its use;
- Providing data mobility opportunities to support greater individual control over data and promotion of consumer choice; and
- Strengthening enforcement mechanisms, including enhanced penalties for non-compliance.
Mandate Letter from the Prime Minister’s Office
It now appears that amendments to PIPEDA may be on the horizon. On January 17, 2020, the Prime Minister’s Office delivered a mandate letter to the Minister of Innovation, Science and Industry, outlining a number of data protection initiatives for the Ministry. Notably, some of these initiatives include:
- advancing Canada’s Digital Charter;
- enhancing the power of the Office of Privacy Commissioner of Canada, such as adding the ability to award administrative monetary penalties, creating new offences, or providing additional oversight by the Federal Court of Canada to incentivize compliance;
- establishing a new set of rights for individuals online, including:
  - data portability/privacy; and
  - the right to be forgotten;
- enhancing knowledge of how personal data is being used; and
- creating new regulations for large digital companies to protect personal data and to encourage greater competition in the digital space.
Each of these amendments, if implemented, has the potential to effect a fundamental change in the way private-sector organizations in Canada collect, use and disclose personal information. Such amendments would also more closely align Canada with the robust, rights-based data protection regime in the European Union under the General Data Protection Regulation (GDPR); an important consideration in light of the imminent sun-setting of PIPEDA’s adequacy decision by the EU, which allows for free data exchanges between the EU and Canada (with the exception of employee data) under certain conditions.
The authors would like to thank Miranda Sharpe, articling student, for her contribution to this article.
An interim proprietary injunction has been granted by the English High Court over a bitcoin ransom payment paid to a third-party wallet.
The case was brought by an English insurer (which requested anonymity) against four defendants, consisting of unknown cyber-extortionists as well as three other parties who respectively hold and/or trade Bitcoins. The claim related to a customer of the Insurer whose data and systems had been encrypted and from whom a Bitcoin ransom payment had been demanded.
After some negotiation, the Insurer agreed to pay the ransom (equal to $950,000) in return for the decryption tool. Following the payment of the ransom and the provision of the decryption tool, further investigations were undertaken on behalf of the insurer as to the destination of the ransom – with the ultimate aim of recovering the Bitcoins by way of a restitutionary or equitable remedy.
Whilst some of the Bitcoins were transferred into fiat currency, a substantial proportion of the Bitcoins (96) were transferred to a specific address; this address is linked to the exchange known as Bitfinex, operated by the third and fourth defendants. The insurer sought a proprietary injunction over those Bitcoins, as the initial step in looking to recover them via the courts.
The decision and its application
The judge was satisfied that the test for a proprietary injunction over the Bitcoins against each of the four defendants was satisfied. A fundamental element of the decision was the conclusion that crypto assets, such as Bitcoin, are “property” for the purposes of English law and therefore can be the subject-matter of a proprietary injunction.
While historically it has been very difficult to recover ransom payments, the case highlights the potential for corporations to recover these payments via the courts (or, at the very least, to obtain interim relief in respect of them). The decision should certainly be borne in mind by any corporations who become the subject of a targeted and substantial ransom demand and in circumstances where the ransom is paid and is subsequently traceable.
Given the typical speed with which ransom crypto assets are transferred or dissipated, those considering such an application would be well advised to act quickly in order to increase the potential for recovering the assets.
The discussion paper on the proposed changes to Hong Kong’s Personal Data (Privacy) Ordinance (Cap. 486) (the PDPO) was debated by the Legislative Council’s Panel on Constitutional Affairs (the Panel) on 20 January. The proposals set out in LC Paper No. CB(2) 512/19-20(03) (the Paper) are summarised in our earlier post.
Seven Panel members attended the meeting to discuss the Paper and provide their views in respect of the proposed reforms to the PDPO. Also in attendance was the Privacy Commissioner for Personal Data (the Commissioner) and the CMAB Secretariat. Out of the six reforms proposed in the Paper, those around mandatory breach notification and increased powers to curb doxxing (the disclosure of personal data online without the consent of the target individual) were the focus of the discussion. This is not surprising given the number of high profile data breaches in Hong Kong and the prevalence of doxxing incidents in the past year. The proposals in respect of data retention periods, revenue based fines and the regulation of data processors were not discussed. The Paper received some criticism for only including six proposed reforms, as did the process given that there will be no public consultations in respect of the proposed reforms.
The key takeaways from the Panel discussion are as follows:
- There will be no public consultation in respect of the proposed reforms. Some members were critical of the absence of public consultation, however, the CMAB Secretariat and the Commissioner responded that public consultation is a time consuming process, stakeholders have already provided input and due to the major recent incidents, change is needed promptly.
- There is general support for changes to the Commissioner’s sanctioning powers. The general consensus was that the Commissioner’s powers are inadequate, referred to as a “toothless tiger”, and there is a need for a strengthening of the powers. The Commissioner used the current issues relating to doxxing as an example, saying that having the ability to impose administrative fines would give a more direct route to enforcement and deter both platform users and platform operators from doxxing. There was no specific discussion regarding the proposal to increase relevant criminal level fines or link fines to an organisation’s revenue and type.
- There is general support for the introduction of a mandatory breach notification mechanism. Members were generally supportive of the proposed mandatory breach notification mechanism, but suggested that the proposed notification threshold is ambiguous and that more clarity is required as to what constitutes a “real risk of significant harm”. In addition, members commented on the reporting mechanism and suggested that notifications by way of instant messenger should be considered, given the prevalence of their usage in Hong Kong. The proposed timeframe of “not more than five business days” for submitting a notification to the Commissioner was not raised by members, but the Commissioner noted that this timeframe is in line with international practice.
- A definition is needed for “sensitive personal data” in line with international standards. Members criticized the fact that there is no mention of sensitive personal data including biometrics, facial recognition and DNA. The Commissioner was agreeable to considering such definition and proposing safeguards in line with international standards.
- Guidance in respect of cross-border transfers is expected to be released in the next six months. Members raised concerns relating to the absence of proposals in respect of the regulation of cross-border data transfers, including enactment of section 33 of the PDPO (which regulates cross-border transfer of personal data but has not been enacted for over 20 years). The Commissioner stated that the consultation process in respect of s.33 is still ongoing and there is no timetable for completion of the consultation or enactment of the section. However, templates and best practice guidelines relating to (i) cross-border transfers between organizations, and (ii) cross-border transfers between cloud processors are expected to be released in the next six months.
In terms of next steps, the Panel meeting made clear there would be no public consultation on the proposals. Therefore, we expect the next step in this process to be the preparation of a draft bill amending the PDPO and its publication in the Government Gazette in order for the bill to be introduced into the Legislative Council. No indication was given as to the timing of this draft amendment bill, but we will be closely monitoring its progress.
2019 saw continued growth and change in data protection and cyber-security across the Asia-Pacific. Following the implementation of the GDPR in May, 2018, many jurisdictions moved to review and strengthen existing data privacy and cyber-security laws. In addition, 2019 saw regulators publishing findings in respect of some of the largest data incidents of 2018. We have set out below the key highlights of the year and what to look out for in 2020.
Written by Partner Anna Gamvros and Associate Libby Ryan, both based in the Hong Kong office.
Earlier this week, the Constitutional and Mainland Affairs Bureau (the CMAB) released its discussion paper (LC Paper No. CB(2) 512/19-20(03)) (the Paper) seeking the views of the Legislative Council’s Panel on Constitutional Affairs (the Panel) on proposed changes to the Personal Data (Privacy) Ordinance (Cap. 486) (the PDPO). The Paper was released on Monday 13 January as part of the agenda for the Panel meeting to be held on Monday 20 January, and follows proposals by the Privacy Commissioner for Personal Data (the Commissioner) to the government to amend the PDPO. The Paper sets out six proposed amendments to the PDPO:
- Introduction of a mandatory breach notification mechanism. It is proposed that the mechanism should include:
- a definition of “personal data breach” along the lines of the GDPR definition, being “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”;
- a notification threshold so the mechanism will only apply to data breaches that have a “real risk of significant harm” taking into account factors such as the type and amount of data leaked and the security level of the data (encrypted or not);
- a time frame for notifying the breach to the Commissioner and individuals. An example of, “as soon as practicable and, under all circumstances, in not more than five business days” is included in the Paper; and
- details on the method of notification, as well as the content.
- Certainty around data retention periods. It is proposed that data users will be required to have clear retention policies. The Paper recognises that it is not practicable to set a uniform retention period applicable to all types of personal data held by various organisations for different purposes. As such, the Paper proposes requiring data users to have in place a clear retention policy that specifies:
- a maximum retention period for different categories of personal data collected;
- legal requirements that may affect the retention periods (for example, tax, employment and medical regulations); and
- how the retention period will be counted. For example, from the date of collection of personal data, or from expiry of a data subject’s membership with the organisation.
- Changes to the Commissioner’s sanctioning powers. In order to enhance the deterrent effect of the PDPO and strengthen the Commissioner’s powers, the following changes are proposed:
- increasing the relevant criminal level fines and potentially linking the fines to a percentage of annual turnover and a scale which would have different levels of fines depending on the turnover of the data user;
- conferring powers on the Commissioner allowing him to directly impose administrative fines for breaches of the PDPO. Such fines should take into consideration a number of factors including the types of data compromised, severity of the data breach, whether the data user intended the breach to happen and its attitude towards the handling of the breach, remedial actions taken, track record etc. Data users should have the right to appeal the fines, and be given appropriate time to do so; and
- a mechanism for the imposition of the administrative fine.
- Regulation of data processors. The purpose of this amendment is to share responsibility for data protection between data users and data processors, and to prevent data processors from neglecting the importance of preventing personal data leakage. Data processors would be held directly accountable for data retention and security, would be subject to obligations equal to those of data users, and would be required to notify the Commissioner and the data user upon becoming aware of a data breach.
- Amendment to the definition of personal data. Changes to the definition would expand the current definition to include information that relates to an “identifiable natural person”, rather than an “identified person”. This change reflects the wide use of tracking and data analytic technology being used today and is in line with definitions adopted in other jurisdictions.
- Regulation of disclosure of personal data of other data subjects. This change is proposed primarily to curb doxxing, which has seen a recent increase in Hong Kong. Since 14 June 2019, the Commissioner has received over 4,700 doxxing-related complaints and enquiry cases. Proposed measures include conferring statutory powers on the Commissioner to request the removal of doxxing content from social media platforms or websites, as well as criminal investigation and prosecution powers.
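Of the six proposals above, the retention-period requirement lends itself to a concrete sketch: a per-category policy recording the maximum period, any legal driver, and the event the period runs from. The categories, periods and counting rules below are entirely hypothetical and serve only to show the three elements the Paper asks a policy to specify.

```python
from datetime import date

# Hypothetical retention policy: each category states its maximum period,
# any legal requirement affecting it, and the event the period runs from.
RETENTION_POLICY = {
    "payroll_records": {
        "max_retention_years": 7,
        "legal_requirement": "tax and employment regulations",
        "counted_from": "end of employment",
    },
    "membership_profiles": {
        "max_retention_years": 2,
        "legal_requirement": None,
        "counted_from": "expiry of membership",
    },
}

def past_retention(category: str, trigger: date, today: date) -> bool:
    """True once the category's maximum retention period has elapsed
    from the triggering event (collection, expiry of membership, etc.)."""
    years = RETENTION_POLICY[category]["max_retention_years"]
    deadline = date(trigger.year + years, trigger.month, trigger.day)
    return today >= deadline
```

A structure like this makes the policy auditable: for any record, the organisation can point to the category, the period, the legal basis and the counting rule that justify keeping or deleting it.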
These changes are the first changes to the PDPO to be proposed in over 10 years. They are in response to recent data protection related events in Hong Kong and reflective of changes and new laws we have seen in other jurisdictions.
We will closely monitor the discussions around these proposals and will provide an update following the Panel meeting on 20 January, 2020.
On New Year’s Day, you may have received emails from numerous companies saying their privacy policies have changed, or noticed a link at the bottom of many companies’ homepages stating “Do Not Sell My Info.” These are two of the more visible requirements of the California Consumer Privacy Act (CCPA), and companies are still in the process of rolling out other requirements. For those of you in the EU, or doing business with companies that offer products or services to EU residents, this might have felt like the movie “Groundhog Day.”
To understand the various approaches to CCPA compliance, we reviewed the websites of 50 companies in the Fortune 500® and noticed a few trends:
1. Brace yourself (for export turbulence)
2020 could well be a year of data export turmoil – so brace yourself.
The Court of Justice of the European Union (CJEU) will determine the validity of the EU Standard Contractual Clauses (SCCs) (Data Protection Commissioner v Facebook Ireland Limited, Maximillian Schrems) whilst the General Court of the EU will consider the future of Privacy Shield (La Quadrature du Net v Commission).
The Advocate General (AG) delivered his non-binding opinion on the SCCs just before Christmas (see our blog post). Although the AG’s view was that the SCCs are valid, he suggested that those using them would need to examine the national security laws of the data importer’s jurisdiction to determine whether they can in fact comply with the terms of the SCCs. He also raised serious doubts over the validity of the Privacy Shield. If the CJEU shares these doubts, it could influence the outcome of La Quadrature du Net.
Data localisation issues are also set to resurface during 2020. China’s requirements are tricky, the Russian Data Localisation law now has monetary penalties and the draft Indian data protection bill also imposes localisation requirements in certain circumstances.