16/02/2021

The Council of Europe recently published guidelines on the use of facial recognition technology (FRT) to comply with the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.

The guidelines sum up the risks that FRT presents to the ‘right to respect for private life’:

“Integrating facial recognition technologies to existing surveillance systems poses a serious risk to the rights to privacy and protection of personal data as well as to other fundamental rights since the uses of these technologies do not always require the awareness or cooperation of the individuals whose biometric data is processed, considering for instance the possibility of accessing digital images of individuals on the Internet”.

The "Do Nots Ever"

Use of FRT for the sole purpose of determining a person’s skin colour, religious belief, sex, racial or ethnic origin, age, health condition or social standing should be prohibited – or at least allowed only if there are strict, effective measures against discrimination.

More ‘advanced’ FRT attempts to assess a person’s “affect” – their personality traits, inner feelings, mental health or workplace engagement. The guidelines state that “linking recognition of affect, for instance, to hiring of staff, access to insurance, education may pose risks of great concern, both at the individual and societal levels and should be prohibited.”

The guidelines also prohibit ‘scraping’ faces from the internet:

“using digital images that were uploaded on the Internet, including social media or online photo management websites or were captured passing through the lens of video surveillance cameras cannot be considered lawful on the sole basis that the personal data were made manifestly available by data subjects.”

Consent should not, as a rule, be the sole legal ground for the use of FRT by public authorities or private businesses. Still less can merely passing through an environment where FRT is in use, even with prominent signage, be treated as explicit consent.

Public vs business use of FRT

The guidelines draw a distinction between the use of FRT in “uncontrolled environments” – places freely accessible to individuals, or through which they can pass, including public or quasi-public places such as hospitals, schools or shopping malls – and “controlled environments”, in which FRT can only be used with a person’s participation, such as a security access system that opens a door.

The guidelines state that FRT should never be used by private businesses in an uncontrolled environment (e.g. by a shop owner for marketing purposes, such as monitoring shoppers’ reactions in the aisles to products or promotions).

The use of FRT in an uncontrolled environment by public authorities should also be heavily restricted. Securing a school or other public building is not a sufficient justification for FRT if other, less intrusive measures would be effective. As we saw with the storming of the US Congress, extremists enjoy taking selfies while they riot as much as the rest of us do at tourist locations. But a ‘watch list’ of right-wing extremists scraped from their Facebook or Instagram pages to protect the US Congress or our own Parliamentary buildings would probably not be permitted.

Use for law enforcement purposes can be justified but requires strict safeguards around setting up the watch list and deploying live facial recognition. Given that many more ‘innocent’ people will be caught up in the surveillance, use of FRT in an area for law enforcement needs to be transparent – with signage at entry points setting out the purpose of the processing, the authority using the FRT and the duration of the live deployment. Covert use of live facial recognition is permissible only if strictly necessary to prevent an imminent and substantial risk to public security.

Use of FRT by businesses in a controlled environment requires explicit, informed consent from individuals, given the power imbalances between, for example, employers and employees. To ensure consent is freely given, individuals have to be offered an alternative security access technology, such as keying in a password. Those alternatives must not be so cumbersome (e.g. an excessively long password) that, in practice, the choice is not genuine.

Makers of FRT

The guidelines also set out principles that developers and manufacturers of FRT should satisfy. They are under a strict duty to test their FRT systems to identify and eliminate disparities in accuracy. This needs to be done using synthetic data sets covering men and women, different skin colours, all ages, different facial morphologies and different camera angles – and, in today’s world, face coverings. The FRT system also needs periodic renewal of the data (i.e. the faces to be recognised), both to continuously improve the algorithm and to keep recognising faces as they age.
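To make that testing duty concrete, here is a minimal sketch, in Python, of how per-subgroup verification accuracy could be compared on a labelled test set. It is purely illustrative – the model.matches verification call, the structure of test_pairs and the max_gap threshold are hypothetical assumptions, not anything specified in the guidelines.

    # Minimal sketch: compare FRT verification accuracy across demographic
    # subgroups. All names here (model.matches, test_pairs) are hypothetical.
    from collections import defaultdict

    def disparity_report(model, test_pairs, max_gap=0.01):
        """Tally verification results per demographic subgroup and flag
        any subgroup whose accuracy trails the best by more than max_gap."""
        hits = defaultdict(int)
        totals = defaultdict(int)
        for img_a, img_b, same_person, subgroup in test_pairs:
            predicted_same = model.matches(img_a, img_b)  # hypothetical API
            totals[subgroup] += 1
            if predicted_same == same_person:
                hits[subgroup] += 1
        accuracy = {g: hits[g] / totals[g] for g in totals}
        best = max(accuracy.values())
        flagged = {g: acc for g, acc in accuracy.items() if best - acc > max_gap}
        return accuracy, flagged

In practice, a developer would run something like this over the synthetic demographic data sets the guidelines describe, then retrain or rebalance until no subgroup is flagged.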

Developers of FRT also have a positive duty to work through the privacy issues with their clients. Their FRT should be flexible enough to accommodate the principles of purpose limitation, data minimisation and storage limitation, which are likely to be specific to the setting in which the client proposes to use the FRT.

It’s about more than the law

Finally, a user proposing to deploy FRT needs to ask: “I know I can, but should I?” The guidelines recommend:

“In addition to the respect of legal obligations, giving an ethical framework to the use of this technology is also crucial, in particular with regard to higher risks inherent to the uses of facial recognition technologies in certain sectors. This could take the form of independent ethics advisory boards that could be consulted before and during lengthier deployments, carry out audits and publish the results of their research to complement or endorse an entity’s accountability. Expressly ethical considerations may help strike an appropriate balance between competing interests in a demonstrably fair way.”

 

Read more: Consultative Committee of the Convention for the Protection of Individuals with regard to automatic processing of personal data

 

""

 

""