On 11 August 2020, the Court of Appeal (CA) handed down its judgment in R (on the application of Edward Bridges) v The Chief Constable of South Wales Police. The court found that the use of automated facial recognition technology (AFT) by South Wales Police (SWP) was unlawful and did not comply with Article 8 of the European Convention on Human Rights (the right to respect for private and family life) (the Convention).

Whilst this judgment concerned the use of AFT in the public sector, the case provides interesting commentary on the legal issues surrounding this technology more generally. This blog post therefore sets out what the private sector should take away from the decision.

Brief background

This case was brought by Edward Bridges, a civil liberties campaigner living in Cardiff. The claim concerned the lawfulness of AFT that had been deployed by SWP on around 50 occasions between May 2017 and May 2019. The AFT screened images of members of the public against “watchlists” of wanted persons held in police databases. Where the AFT identified a potential match, the images were reviewed by an AFT operator to check whether a match had in fact been made. Where there was no match, the image was automatically deleted.

Mr Bridges originally brought a claim for judicial review on the basis that AFT was not compatible with Article 8 of the Convention, data protection law and the Public Sector Equality Duty (PSED) under section 149 of the Equality Act 2010. However, in September 2019, the High Court dismissed Mr Bridges’ challenge, holding that the use of AFT was necessary and proportionate, and that there was no suggestion that the software might operate in a way that was indirectly discriminatory.

Mr Bridges appealed the decision on the following five grounds:

  1. The High Court had erred in concluding that SWP’s use of AFT, and the interference with Mr Bridges’ rights under Article 8(1) of the Convention, was “in accordance with the law” for the purposes of Article 8(2) of the Convention. By way of reminder, Article 8(2) provides that: “There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”.
  2. The High Court had made an error of law in concluding that the use of AFT, and the interference with Mr Bridges’ rights, was a proportionate interference with his rights under Article 8 of the Convention.
  3. The High Court was wrong to hold that SWP’s Data Protection Impact Assessment (DPIA) complied with the requirements of section 64 of the Data Protection Act 2018.
  4. The High Court should have reached a conclusion as to whether SWP had an “appropriate policy document” in place, as required under section 35 of the Data Protection Act 2018.
  5. The High Court was wrong to hold that SWP had complied with the PSED, given that the Equality Impact Assessment it had carried out was “obviously inadequate” and failed to recognise the risk of indirect discrimination.

The appeal succeeded on grounds one, three and five, but grounds two and four were rejected.

Take-aways for the private sector

Create internal rules

The CA said that the interference resulting from the use of the AFT was not “in accordance with the law” as required by Article 8(2) of the Convention.

In other words, the court’s view was that existing legislation and SWP’s local policies did not provide clear guidance on the parameters that should be applied when using AFT. The court also said that: (i) too much discretion was afforded to individual police officers to decide who could be placed on a watchlist; and (ii) it was not clear that there were any criteria for determining where AFT could be deployed. As a result, the relevant policies did not have the “necessary quality of law”.

Those using AFT in the private sector should therefore consider creating internal rules governing its use. The rules should demonstrate that there are checks and balances in place to ensure the appropriate use of AFT, for example regular reviews, oversight mechanisms and clear (and narrow) criteria setting out when and where AFT may be used.

In its submissions, the Information Commissioner’s Office (ICO), which intervened in the case, emphasised that the processing by SWP was not “strictly necessary” for law enforcement purposes. It is likely that the ICO would apply the same strict test to private sector use of this type of technology. Organisations therefore also need to be prepared to justify the use of AFT for any given purpose and to explain why other, less privacy-intrusive, means could not be used as an alternative.

Scrutinise your DPIA – it could end up in court!

When organisations conduct DPIAs, very few envisage that the contents could be publicly scrutinised by a court. But that is exactly what happened in these proceedings: SWP’s DPIA was analysed and probed by the court in reaching its decision. For example, the court considered the description in the DPIA of which individuals were impacted by the AFT, namely: “Those individuals could be persons wanted on suspicion for an offence, wanted on warrant, vulnerable persons and other persons where intelligence is required”. The court considered this in detail, commenting that the last category is not objective and “leaves too broad a discretion vested in the individual police officer to decide who should go onto the watchlist”.

Those using AFT in the private sector should therefore:

  1. view their DPIA as a public-facing document and consider how it may be viewed by a court, regulator or complainant; and
  2. be precise in the responses in the DPIA. Some DPIAs include open-ended and ambiguous statements, possibly to preserve broad applicability; this case highlights the importance of avoiding that pitfall and of responding precisely.

Be objective when assessing proportionality

In this case, the CA agreed with the lower court that SWP’s use of AFT was in fact a proportionate interference with Article 8 rights under Article 8(2). The court held that the weighing exercise had been correctly conducted and that the potential benefits outweighed the impact on Mr Bridges, which was deemed to be minor. However, the case does provide a reminder that the proportionality test is not an easy one to pass. The judgment recalls Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700, which sets out four questions for assessing whether it is proportionate to limit a Convention right:

  1. Is the objective of the measure sufficiently important to justify the limitation of a fundamental right?
  2. Is it rationally connected to the objective?
  3. Could a less intrusive measure have been used without unacceptably compromising the objective?
  4. Having regard to these matters and to the severity of the consequences, has a fair balance been struck between the rights of the individual and the interests of the community?

Answering these questions in respect of AFT for commercial purposes is not always going to be easy.  For example, it may be difficult to argue that using AFT in retail outlets to analyse shopper behaviour and identify new and repeat customers is “sufficiently important” or that a “less intrusive measure” could not have been used.

Whilst proportionality in this case was discussed in the context of Article 8(2) of the Convention, it is still a crucial consideration for those using AFT in the private sector.  Most of the lawful bases for processing under the EU General Data Protection Regulation (GDPR) require processing to be “necessary”.  This means that the use of AFT must be targeted and proportionate.

Those using AFT in the private sector should therefore remember that the proportionality test must be conducted objectively and not solely from the business’ perspective. To assist with this, organisations should document in the DPIA their assessment of alternative measures that have been considered and explain why they were not appropriate.

Check and assess for bias in training data

The final ground of the appeal related to SWP’s compliance with the PSED. The PSED requires public authorities to have due regard to whether a policy could have a discriminatory impact. In this case, the court held that SWP had not taken reasonable steps to enquire whether the AFR Locate software risked bias on grounds of race or sex.

This aspect of the judgment is particularly interesting because of its discussion of bias in “training datasets”. According to the judgment, SWP was not aware of the dataset used to train the AFT system, so it was difficult for SWP to confirm whether the technology was inherently biased as a result of its training data. One witness, Mr Paul Roberts, an employee of a major corporation specialising in facial recognition technology, explained to the court that the precise makeup, scale and sources of the training data used are commercially sensitive and cannot be released. Whilst the court acknowledged this was “understandable”, it said: “The fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software programme in this case does not have an unacceptable bias on grounds of race or sex [given that] there is evidence … that programs for AFR can sometimes have such a bias.”

This sentiment is echoed in recent guidance produced by the ICO on AI and data protection which, amongst other things, discusses the risks of bias and discrimination caused by training data. Specifically, it points out that training data can: (i) be imbalanced, i.e. certain genders or ethnicities may be under-represented; and (ii) reflect past discrimination, e.g. if loan applications from women were historically rejected more frequently than those from men, a model trained on that data will likely reproduce the same pattern of discrimination.
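By way of illustration only, the short sketch below (written in Python with the pandas library, using entirely hypothetical data and column names) shows the kind of basic checks an organisation could run on an evaluation dataset: how much each demographic group contributes to the data, and whether the false-match rate differs between groups. It is a minimal sketch of the sort of verification the court said SWP never sought, not a substitute for proper independent testing.

```python
# Minimal sketch only: hypothetical data and column names, not taken from the
# judgment or from any real AFT deployment.
import pandas as pd

# Each row represents one verification attempt: the subject's recorded
# demographic group, whether they were genuinely on the watchlist, and
# whether the system declared a match.
attempts = pd.DataFrame({
    "group":          ["A", "A", "A", "A", "B", "B", "B", "B"],
    "on_watchlist":   [True, False, False, False, True, False, False, False],
    "declared_match": [True, True, False, False, True, False, False, False],
})

# 1. Imbalance: the share of the evaluation data contributed by each group.
print(attempts["group"].value_counts(normalize=True))

# 2. False-match rate per group: among people who were NOT on the watchlist,
#    how often did the system wrongly declare a match?
non_watchlist = attempts[~attempts["on_watchlist"]]
print(non_watchlist.groupby("group")["declared_match"].mean())
```

A material gap between groups in the second figure is the kind of evidence the court indicated should be obtained, whether directly or through independent verification.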

Those using AFT in the private sector should therefore ensure that they make appropriate enquiries with developers of this technology about potential bias in the software. Similarly, if organisations are training software using their own training data, they should also take steps to reduce the risk of bias.

Our take

Whilst the recent focus on AFT has been on its use by law enforcement and public authorities, there are indications that attention is shifting to the private sector. The ICO is currently investigating the use of the technology by a property developer in the King’s Cross area of London. Further, the Centre for Data Ethics and Innovation (CDEI) has said that AFT has “seen increasing use in the private sector, where it is being applied to identify known shoplifters or people engaged in antisocial behaviour in stores, as well as to anonymously track the movements of customers for marketing purposes”. The CDEI said that over the coming months it will explore further how this technology is being used in the private sector, and “whether the UK’s current arrangement of laws and oversight bodies is equipped to minimise the harms posed”. Organisations in the private sector should therefore expect more media and regulatory attention on their use of AFT and be prepared to justify how their use of the technology complies with the law.