NT Analyzer Blog Series: Why So Many Cookie Policies Are Broken, Part I – HTML5 LocalStorage

Cookies Are One Piece of a Larger Puzzle

For some time now there has been an odd preoccupation with cookies, to the exclusion of other forms of browser tracking, some of which are far more flexible and robust in their data collection capabilities than cookies. Despite this, these other, non-cookie tracking technologies are often not referenced in privacy policies and cookie policies, even though they are used to “store information” and/or “gain access to information stored in the terminal equipment” for the purposes of the ePrivacy Directive and will presumably qualify as personal information under the CCPA as well.
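To illustrate why such storage falls within the “store information in the terminal equipment” language, a page script can persist an identifier in HTML5 localStorage with no expiry date and no automatic transmission in HTTP headers. The sketch below is illustrative only; the Map-based shim stands in for the browser’s `window.localStorage` object so that the snippet runs outside a browser:

```javascript
// Minimal stand-in for the browser's window.localStorage, so this
// sketch runs outside a browser. In a page script you would use the
// real localStorage object directly.
const localStorage = (() => {
  const store = new Map();
  return {
    setItem: (k, v) => store.set(k, String(v)),
    getItem: (k) => (store.has(k) ? store.get(k) : null),
    removeItem: (k) => store.delete(k),
  };
})();

// A tracker can write a persistent identifier once...
if (localStorage.getItem("visitor_id") === null) {
  const id = Date.now().toString(36) + Math.random().toString(36).slice(2);
  localStorage.setItem("visitor_id", id);
}

// ...and read it back on every later visit. Unlike a cookie, this value
// has no expiry date and is never sent automatically with HTTP requests,
// so it is easy to miss when a policy review audits "cookies" alone.
const visitorId = localStorage.getItem("visitor_id");
```

The key point for policy drafting is that the identifier survives across sessions exactly as a persistent cookie would, yet a cookie-only disclosure never mentions it.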

ICO’s update report into adtech and real time bidding – a sobering read for participants in the adtech industry

Norton Rose Fulbright - Data Protection Report blog

On 20 June, the UK’s Information Commissioner (the ICO) published a report setting out its views on adtech, specifically the use of personal data in “real time bidding”, and the key privacy compliance challenges arising from it.

In its report, which is a status update rather than formal guidance, the ICO acknowledges that this is a very complex area and that online advertising is key to the economic models of many online content providers. It therefore notes that it will take a “measured and iterative approach” before undertaking a further industry review in six months’ time.

The ICO is clear that, in the meantime, it expects organisations in the adtech space to take on board the issues identified in the report and to re-evaluate their practices and improve their compliance. However, whilst the report sets out many of the well-documented deficiencies in adtech and specifically real time bidding, the ICO has not provided any practical solutions for how to deal with these issues, which may leave many advertising vendors and the publishers that rely on them questioning precisely what action they should be taking.

Brief introduction to adtech and real time bidding (RTB)

Adtech is the term used to describe the tools that analyse and manage information for online advertising and automate the processing of advertising transactions. RTB is an important example of adtech, which enables the buying and selling of advertising inventory (e.g. on a website or app) in real time.

This is generally done using an auction pricing mechanism whereby an organisation (a “publisher”) operating an online service will use first or third party cookies and similar technologies to collect information about a user when they visit a site. This information will then be incorporated into a “bid request” that is transmitted into the RTB ecosystem so that advertisers can bid for the opportunity to insert their advert into the relevant ad space. These bid requests typically contain personal data, with those bid requests containing the most information about a person and their device(s) being the most attractive, in part because they enable more accurate targeting of adverts to the person concerned. To assist in building up this accurate picture of an individual, processes referred to as “data matching” or “enrichment” commonly take place to augment the data collected from the individual.
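The shape of a bid request can be sketched as follows. This is a hypothetical, heavily simplified structure loosely modelled on the industry’s OpenRTB protocol; the field names and values are illustrative, not a complete or authoritative schema. It shows why a bid request so readily contains personal data: identifiers, location and device details travel together in a single object:

```javascript
// Hypothetical, simplified bid request loosely modelled on OpenRTB.
// Field names and values are illustrative only.
const bidRequest = {
  id: "auction-8431",
  imp: [{ id: "1", banner: { w: 300, h: 250 } }], // the ad slot up for auction
  site: {
    domain: "news.example.com",
    page: "https://news.example.com/article", // browsing context
  },
  device: {
    ua: "Mozilla/5.0 (example)",   // user agent string
    ip: "203.0.113.42",            // IP address - personal data under the GDPR
    geo: { lat: 51.5, lon: -0.1 }, // approximate location
  },
  user: {
    id: "a1b2c3d4",                // cookie-derived identifier
    // audience segments added through "data matching" / "enrichment":
    data: [{ segment: [{ name: "interest:travel" }] }],
  },
};

// Each field that identifies the person or device makes the request more
// attractive to bidders - and harder to process lawfully, since the whole
// object may be broadcast to hundreds of RTB participants.
const identifyingFields = [bidRequest.device.ip, bidRequest.user.id].filter(Boolean);
```

The enrichment step described above corresponds to the `user.data` segments: information the publisher never collected directly, matched in from other sources to sharpen targeting.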

Importantly, a single RTB bid request can result in personal data being processed by hundreds of organisations. As explained below, this factor is arguably the most challenging aspect of undertaking RTB in a compliant manner. It remains unclear how to get around this challenge, which may ultimately necessitate a change not only to the practices of individual organisations but to the industry as a whole.

The key issues identified by the ICO

The ICO identified the following as its key priority issues (although it noted that these do not represent the full extent of its concerns with RTB and adtech more generally):

1. Lack of understanding about the role of PECR

The ICO notes that there is generally a lack of understanding amongst the RTB participants about how the requirements of the Privacy and Electronic Communications Regulations (PECR) sit alongside the General Data Protection Regulation (GDPR) requirements.

Specifically, the ICO found that many participants did not appreciate how the rule in PECR requiring consent for the use of cookies impacted the requirement to have a lawful basis for processing personal data. Many participants believed that, contrary to the requirements of PECR, the “legitimate interest” lawful basis could be used both for the processing of personal data collected in connection with RTB and for the use of cookies (although this seems surprising given the prevalence of cookie pop-ups and consent banners on most websites). The ICO also criticised industry initiatives for being too focussed on GDPR compliance rather than on PECR.

2. Lawful basis for processing personal data

Data controllers processing personal data in connection with RTB need a “lawful basis” to process that personal data under the GDPR.

With the exception of the actual collection of personal data via cookies or similar technology, where consent is required under PECR, the ICO acknowledges that the subsequent processing of non-special category personal data does not require explicit consent under Article 6 of the GDPR and that this is why many organisations rely on the “legitimate interest” legal basis to process the data as part of RTB.

However, the ICO considers this reliance on legitimate interest to be incorrect and based on the mistaken perception that the legitimate interest legal basis is the “easy option”, especially when compared to consent. The report notes that, in the ICO’s view, “the only lawful basis for ‘business as usual’ RTB processing of personal data is consent”, because the extensive and unexpected nature of the data processing in RTB makes it impossible to meet the requirements of the legitimate interests lawful basis. The ICO considers this view to be supported by guidance previously issued by the European Data Protection Board and the Article 29 Working Party on data processing in the context of online advertising.

This is a very restrictive approach, especially given the challenges associated with collecting valid consent under the GDPR. If the ICO maintains this strict position following its consultation with the adtech industry, then it is impossible to see how RTB (in particular the extensive data sharing involved in it) can continue in its current form and significant changes will be needed to both the industry and the industry initiatives seeking to deal with many of the recognised challenges.

Despite the apparently robust approach taken in relation to legitimate interest, the ICO does nevertheless seem to leave open the possibility for a legitimate interest argument to be relied on, but only if a participant can show that “their use of personal data is proportionate, has minimal privacy impact and individuals would not be surprised or likely to object”. Satisfying these requirements would likely also require changes to current practices.

3. Lawful basis for processing special category data

The ICO notes in its report that special category data (i.e. data from which sensitive information can be identified or inferred about a person) is being used in the context of RTB for targeted advertising purposes, notwithstanding that organisations sometimes argue that this is not the case.

The GDPR is clear that the processing of this data (regardless of what purpose it is for) is prohibited, unless a condition within Article 9 of the GDPR applies. The ICO notes that the only applicable condition is explicit consent.

The report states that this means that consent mechanisms currently used, including the IAB Transparency and Consent Framework and Google’s Authorized Buyers Framework, are non-compliant (presumably because they do not specifically call out the processing of special category data and name the vendors doing this). The ICO notes that market participants must therefore modify existing consent mechanisms to collect explicit consent, or they should not process this data at all.

4. Lack of transparency

The ICO considers that the privacy notices and information given to individuals about the use of RTB are not detailed enough to give an accurate overview of what happens to their data.

In particular, the report notes that the widespread disclosure of personal data to different participants through the use of RTB presents a particular challenge in satisfying the transparency requirement, especially as the publisher will not always know with whom the data will be shared. Where the processing of personal data by participants is undertaken in reliance on consent obtained by the publisher (which the ICO now suggests must be the case), those participants must be named as recipients of the data, and yet the nature of RTB means that the publisher has no means of determining with which participants the data will be shared. This leads to long (and no doubt incomplete) lists of organisations with whom data “might” be shared, but this will not satisfy the relevant transparency and informed consent requirements.

In addition, the ICO criticises the creation of very detailed profiles, which are repeatedly augmented with information about actions that individuals take on the web, as being “disproportionate, intrusive and unfair” in the context of the processing of personal data for the purposes of delivering targeted advertising, especially when in many cases individuals are unaware that the process takes place and the privacy information provided does not clearly inform them what is happening.

5. The data supply chain and data leakage

The ICO also notes that the extensive sharing of personal data through the RTB process leads to the risk of data leakage, especially where data is either unintentionally shared or used in unintended ways. This is a very real risk given that bid requests are often not sent to a single entity or a defined group of entities, and because redirects may trigger the further sharing of data contained within the bid requests to a wide range of participants.

The ICO notes that the contractual controls put in place to try to guarantee a level of data protection compliance throughout the supply chain are not sufficient in themselves, and that organisations must have technical and organisational controls backing up the contractual controls (e.g. through audits and inspections). Once again, it remains unclear how in practice a single participant can satisfy these requirements where it cannot necessarily know with whom data is being shared.

6. Data protection impact assessments (DPIAs)

The ICO considers that DPIAs are mandatory in the context of RTB because, among other things, it involves the use of new technologies, profiling of individuals on a large scale and can involve the tracking of individuals’ geolocation or behaviour. The ICO notes that currently very few participants have undertaken DPIAs in relation to their practices.

7. Industry initiatives to address issues

The ICO notes that it has considered many of the industry initiatives that have been put forward to change the way in which the adtech industry operates. However, it currently considers that these initiatives are not fully mature, do not sufficiently address its concerns, or are not measures that the industry would be willing to adopt.

It leaves open the possibility that these initiatives may address some or all of its concerns in due course, but for the time being this is a clear statement that initiatives such as the IAB Transparency and Consent Framework remain deficient. As one of the ICO’s stated next steps is to continue engagement with IAB Europe and Google, the industry will hope that progress can be made in remedying some of these deficiencies.

Next steps

The ICO intends to build on this report and its understanding of this sector through: (i) targeted information gathering exercises about the data supply chain, profiling, existing controls and DPIAs undertaken; (ii) engagement with key stakeholders, including an event similar to the “fact finding session” that it ran earlier in the year; (iii) cooperation with other data protection authorities, which could produce quite interesting results in light of the complaints that have been received around Europe in relation to the adtech sector and the differing approaches taken by different Member State data protection regulators; and (iv) an industry review in six months’ time, the scope of which will depend on its findings before then.

In the meantime, it expects data controllers in the adtech industry to “re-evaluate their approach to privacy notices, use of personal data, and the lawful bases they apply within the RTB ecosystem”, meaning that this report cannot be ignored by participants in the market, and the ICO will expect some proactive steps towards better compliance to be taken.

Our take

This report is rather unhelpful for the thousands of publishers and adtech vendors involved in RTB and the online advertising industry. It sets out a number of deficiencies, but leaves open how many of these can be addressed, especially in the absence of a widespread change in the way the industry works.

However, in light of this report, we would recommend that our clients review their current online advertising practices and engage with their marketing and digital teams, so that they have a clearer picture of what activities are being undertaken and what personal data collection this involves. They should also revisit their privacy and cookie notices and, where necessary, enhance the sections on marketing and digital advertising to give a much clearer picture of what is happening to individuals’ data and the different types of processing (including profiling, matching and augmentation) that might be undertaken. Participants should consider undertaking DPIAs to assess the privacy risks associated with their particular use of RTB.

On top of this, participants should monitor enforcement action and guidance issued in other Member States and by the European Data Protection Board, which will likely emerge in the lead-up to the ICO’s second review in six months’ time. Finally, this regulatory activity will form the backdrop for the finalisation of the ePrivacy Regulation, which may bring legislative solutions on a parallel timeframe.

We will provide updates on key developments in further blog posts.

Nevada, New York and other states follow California’s CCPA

The US privacy law landscape continues to shift and evolve as state and federal privacy legislative proposals continue to be debated and enacted. While CCPA-like bills in Washington and Texas failed to pass, Nevada passed its online privacy amendment, and proposals in New York and Washington, DC appear to be gaining momentum.

CCPA: “Attorney General Amendment” Likely Dead

This is the Data Protection Report’s ninth blog post in a series of CCPA blog posts that will break down the major elements of the CCPA. Stay tuned for additional posts on the CCPA.

On May 16, 2019, the California Senate Appropriations Committee held a hearing that included S.B. 561, the “Attorney General amendment” to the California Consumer Privacy Act (“CCPA”). The bill is being held in committee and under submission, which means it has been blocked and is likely dead.

ICO’s draft Age Appropriate Design Code could seriously impact processing of under-18s’ personal data

On 15 April 2019, the ICO opened a public consultation on a draft code of practice titled Age Appropriate Design (the “Code”).  The Code will remain open for public consultation until 31 May 2019.

The consultation document is described as a “code of practice for online services likely to be accessed by children.” However, its potential impact is in fact wider: it is perhaps better described as applying to all online services that are not demonstrably unlikely to be accessed by children, which the Code controversially defines as individuals under 18. For this reason, the Code in its current form will have implications for almost all providers and users of online services.

OPC reconsiders its approach to cross-border data transfers with the Equifax decision

In a significant recent decision, the Office of the Privacy Commissioner of Canada (OPC) altered the regulatory landscape for moving personal information between affiliated companies and across Canada’s border for data processing or storage purposes.

Any organization governed by the federal Personal Information Protection and Electronic Documents Act (PIPEDA) will have to re-evaluate and likely adjust its approach to such cross-border data transfers, possibly affecting its outsourcing and cloud computing relationships with vendors and related companies. The OPC has also initiated a two-month consultation period with stakeholders concerning this important policy change.

Google and other big data companies face increased scrutiny

Norton Rose Fulbright’s US Head of Data Protection, Privacy and Cybersecurity Jeewon Serrato and Partner Vic Domen write about the increased scrutiny that big data companies like Google and Facebook are now facing.

A number of state attorneys general are preparing to meet with the US Federal Trade Commission to discuss their concerns about the use of massive amounts of personal data in the digital ad marketplace.

There is a trend among federal and state enforcers to subject these online platforms and technology markets to heightened scrutiny.

Get all the details at the full legal update, “Big data companies face increased state and federal scrutiny.”

ICO blog post on AI and solely automated decision-making

The ICO has published a blog post on the role of “meaningful” human reviews in AI systems to prevent them from being categorised as “solely automated decision-making” under Article 22 of the GDPR. That Article imposes strict conditions on making decisions with legal or similarly significant effects based on personal data where there is no human input, or where there is limited human input (e.g. a decision is merely “rubber-stamped”).

Parenting support club Bounty fined in ‘unprecedented’ data breach

On 12 April, the Information Commissioner’s Office (ICO) fined Bounty, a pregnancy and parenting support club, £400,000 for illegally sharing personal data belonging to more than 14 million people. As the contravention took place just before the General Data Protection Regulation (GDPR) came into force, the fine was issued under the Data Protection Act 1998 (DPA).
