Data Protection Report - digital privacy, CCPA and cybersecurity

On 15 April 2019, the UK Information Commissioner’s Office (the “ICO”) opened a public consultation on a draft code of practice titled Age Appropriate Design (the “Code”). The Code will remain open for public consultation until 31 May 2019.

The consultation document is described as a “code of practice for online services likely to be accessed by children.” However, its potential impact is in fact wider: it is perhaps better described as applying to all online services that are not demonstrably unlikely to be accessed by children, whom the Code controversially defines as individuals under 18. For this reason, the Code in its current form will have implications for almost all providers and users of online services.

This blog post does not seek to serve as an overview of all of the requirements of the Code (a useful summary is provided at the beginning of the document itself, which can be found here). Rather, we have focused on the key requirements that we consider have implications from a business perspective.

Scope of application

  • The Code must be considered in relation to all Information Society Services (“ISS”), that is, “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.” This is interpreted widely, with “remuneration” including ad revenue as well as payment received directly from the end user and not-for-profit services where these are “typically provided on a commercial basis.” It includes small businesses, provided that products are sold, or a service is mainly transacted, online.
  • Irrespective of whether an ISS is aimed or targeted at children, the standards of the Code must be met (to the extent applicable) unless:
    • robust age verification mechanisms are applied (we will revisit these later on), restricting access to adult users only; or
    • there is specific, documented evidence to demonstrate that children are, in practice, not likely to access the service. The ICO recommends undertaking market research or compiling other evidence relating to user behaviour to ascertain (and help demonstrate) this.

The “Best Interests” Test

  • Many standards of the Code are subject to an exception, whereby ISS providers are not required to comply with the standard if they are able to demonstrate a “compelling reason” to deviate from it. In keeping with the ICO’s earlier guidance on processing personal data of children, the primary factor for consideration here is whether the proposed activity is in the “best interests” of the child.
  • The guidance uses two very polarised examples to illustrate what amounts to a “compelling reason” in this context: “One clear example of a compelling reason is data sharing for safeguarding purposes, or for the purposes of preventing or detecting crimes against children such as online grooming. An example that is unlikely to amount to a compelling reason for data sharing is selling on children’s personal data for commercial re-use.”
  • Whilst it is recognised that the best interests of a child have to be balanced against other interests, it appears that the bar is set very high in this regard, with the only examples of a prevailing interest relating to the conflicting interests of other children. The ICO also specifically states that “it is unlikely… that the commercial interests of an organisation will outweigh a child’s right to privacy.”

Users impacted

  • As stated above, the Code applies to all users under the age of 18, which is perhaps surprising given that the UK opted for the youngest possible “ISS age” of 13 under Article 8.1 of the GDPR. However, this blanket application is mitigated by the different standards to be applied to five defined “Age Categories.”
    • 0–5: pre-literate and early literacy
    • 6–9: core primary school years
    • 10–12: transition years
    • 13–15: early teens
    • 16–17: approaching adulthood

These age categories are supported by information relating to the capacities, needs, skills and behaviours of minors at each stage, which is annexed to the Code. The standards relating to transparency, parental controls, nudge techniques, and use of online tools are all category-specific, and they will also be relevant in respect of data protection impact assessments. Older children do not necessarily require less protection, but the measures required to meet the standards of the Code may be different (particularly given their increased level of understanding). This will create a particular challenge for ISS providers that have previously been “age agnostic” (i.e. their service is not specifically aimed at or designed for children, but could potentially be of interest to them) or those that are likely to appeal to a variety of age categories.
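For illustration only, the five Age Categories above can be expressed as a simple lookup. This is our own sketch; the function and the "adult" fallback label are not terminology from the draft Code.

```python
# Hypothetical sketch mapping a user's age (in whole years) to the
# draft Code's five "Age Categories". The category names follow the
# consultation document; the function itself is our own illustration.

def age_category(age: int) -> str:
    """Return the draft Code's Age Category for a given age in years."""
    if age < 0:
        raise ValueError("age must be non-negative")
    if age <= 5:
        return "pre-literate and early literacy"  # 0-5
    if age <= 9:
        return "core primary school years"        # 6-9
    if age <= 12:
        return "transition years"                 # 10-12
    if age <= 15:
        return "early teens"                      # 13-15
    if age <= 17:
        return "approaching adulthood"            # 16-17
    return "adult"  # 18+: outside the Code's scope
```

In practice an ISS provider would need logic of this kind to select category-specific privacy notices and settings, since the standards differ by Age Category.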

  • The Code also has implications for adult users, as they may be required to verify their age before reducing the default high-privacy settings or accessing content that is not suitable for children.
  • The Code applies retrospectively, so will affect existing user accounts as well as accounts created after it comes into force. There will be a grace period (its length is yet to be determined), after which the “high privacy” settings that the Code requires by default will need to be rolled out across all existing accounts. The requirement for individuals to verify their age before changing these privacy settings will be disruptive to the user experience.

Key requirements

  • Default settings: the Code requires that settings must be “high privacy” by default. Within this default environment, unless it is appropriate to allow the child to change the setting (see above in relation to the “best interests” test), children’s personal data:
    • must be limited to use that is essential to the provision of the service;
    • must not be visible to other users of the service; and
    • cannot be transferred to third parties.

In the event that an ISS provider considers that the best interests test is met and wishes to allow children to change the default settings, it should undertake a DPIA which should address (among other things): (1) how privacy information should be provided to each of the impacted age categories; (2) whether further measures are required, such as parental consent; and (3) whether the change should be session-specific or permanent. Where possible, privacy settings should be device-specific. Nudge techniques are encouraged to remind individuals where they have deviated from the default settings.
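A minimal sketch of what a “high privacy by default” settings object might look like, with changes to the defaults gated on age verification. The field names and the verification gate are our own assumptions for illustration; the ICO does not prescribe a schema, and a child-facing implementation would additionally need the best-interests/DPIA steps described above.

```python
# Hypothetical sketch of "high privacy by default" account settings.
# Field names are illustrative assumptions, not an ICO-specified schema.
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    # Defaults reflect the Code's "high privacy" baseline:
    data_use_essential_only: bool = True    # data limited to essential use
    profile_visible_to_others: bool = False  # not visible to other users
    share_with_third_parties: bool = False   # no third-party transfers
    geolocation_enabled: bool = False        # geolocation off by default
    profiling_enabled: bool = False          # profiling off by default


def lower_privacy(settings: PrivacySettings,
                  age_verified_adult: bool,
                  setting: str) -> PrivacySettings:
    """Allow a deviation from the defaults only for verified adults.

    (Changes by child users would instead require the best-interests
    assessment and DPIA steps described in the Code.)
    """
    if not age_verified_adult:
        raise PermissionError("age verification required to change defaults")
    if setting == "geolocation":
        settings.geolocation_enabled = True
    elif setting == "profiling":
        settings.profiling_enabled = True
    else:
        raise ValueError(f"unknown setting: {setting}")
    return settings
```

Note that under the Code some deviations (e.g. a child’s location being visible to others) would also need to revert to the default at the end of each session, which this sketch does not model.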

  • Transparency: the ICO elaborates on its previous advice relating to age appropriate children’s privacy notices, requiring “bite sized” notices to be provided at the point at which personal data is used. This raises practical challenges, as privacy notices must be tailored to the five Age Categories and the accompanying guidance is quite prescriptive as to how information is communicated. ISS providers will be required to develop an understanding of the age groups that are likely to access their service, and tailor the notifications accordingly. The notices should be “scalable” such that the user can self-select the level of detail provided to them, and should sit alongside notices aimed at parents, which must comply with Article 13 or 14 of the GDPR. Audio, video or graphical prompts are required to prevent children from changing default privacy settings, and nudges can be used for this purpose. In short, compliance with the Code will require significant expense and a greater amount of web design expertise than is required to provide compliant privacy information to adults.
  • Geolocation: geolocation must be turned off by default, and requirements under the Privacy and Electronic Communications Regulations (PECR) must be interpreted in light of the Code. Options which make a child’s location visible to others must revert to the default (“off”) setting at the end of each session, and location tracking must be made obvious to the child whilst it is active. This is complicated by the requirement to ensure that the needs of each of the impacted age categories are met.
  • Profiling: similarly to geolocation, profiling must be turned off by default – again, unless there is a compelling reason otherwise. The ICO acknowledges that some profiling is fairly benign, but notes the risk of content feeds that gradually take the child away from his/her original area of interest to other less suitable content. Perhaps for this reason, profiling is only permissible where appropriate measures are in place to protect the child from harmful effects, which would include being fed content that is detrimental to their health or wellbeing.
  • Data sharing: children’s data cannot be disclosed to third parties, unless there is a compelling reason to do so.
  • Parental controls: child users must be given age appropriate notification of any control or monitoring of their use by parents.

Age verification

The ICO makes specific references to “robust age verification mechanisms” but does not outline its expectations as to what these should look like or how they should be implemented. The ISS provider must be able to demonstrate that children cannot easily circumvent them.  Reference is made to trusted third party age verification services, with a reminder that it is necessary to undertake due diligence.  This is acknowledged as a developing area and the ICO will support development of standards and certification schemes in this regard.  There is a trade-off between deploying age verification and the requirement to process additional personal data for this purpose.  It seems likely that this will have a significant impact on user experience, unless ISS providers are driven towards centralised systems where a single set of verified credentials may be utilised for multiple services.  This in turn raises questions as to costs and interoperability.

What next?

The Code will remain open for public consultation until 31 May 2019. Once it has been finalised, it will be made a statutory code under s. 123 of the Data Protection Act 2018, meaning its provisions must be strictly complied with. Given the ambitious scope of the Code and its wide-reaching implications, it remains to be seen whether the consultation process may erode some of the stricter provisions that are currently proposed. Further guidance on what amounts to a “compelling reason” seems likely to be welcome. Finally, given the requirements to build technical solutions to (1) ensure compliance with the standards and/or (2) obtain age verification to deviate from them, a decision to extend the grace period (currently proposed in relation to retrospective application only) to the Code’s wider application may also be popular.

Our take

ISS providers will need to assess the likelihood of their content being of interest to children and either “age gate” it, with cumbersome age verification for users who are over 18 years old, or build considerably more privacy-friendly defaults into the 13–18 age bracket than are generally applied at present. This is going to make the commerce-friendly nudges and configurations for adult users all the more obvious and difficult to justify. How the guidance is implemented and enforced could well carry over into the grown-up world too, given that it will potentially impact users’ expectations as to the treatment of their personal data and it will generally be difficult to justify lower standards. We recommend that all digital businesses watch this area carefully.