Jennifer Ross

Biometric technology, meaning any technological solution used to identify a person by his or her unique physical or behavioural characteristics, is likely to have a major impact on asset finance over the next decade.

From checking the backgrounds of investors to identifying participants and securing major investments, the technology's growing use by banks, and by the wider public on smartphones, has made its further spread into asset finance inevitable.

However, the promised benefits of increased security and convenience offered by solutions such as fingerprint and facial recognition must be balanced against cultural concerns and legal risks relating to privacy and data protection, which may make asset finance managers think twice before deploying biometric systems.

One such fear is 'function creep': data recorded for one purpose being used for another, such as law enforcement or targeted advertising.

It is not difficult to imagine a future in which decision-making in many spheres of life is automated, with outcomes being determined on the basis of the data available in relation to that individual.

These concerns could be heightened by incoming data protection regulations.

Currently, a firm seeking to implement a biometric authentication system must also grapple with complex data protection legislation.

Processing of personal data in the UK is currently regulated by the Data Protection Act 1998 ('DPA'), but this is set to change on 25 May 2018, when the EU's General Data Protection Regulation ('GDPR') takes effect, notwithstanding the UK's decision to leave the European Union.

Most people are naturally wary of sharing their biometric data, the most personal of all forms of ‘personal data’.

For an individual, being asked to provide fingerprint or facial data to an asset finance organisation feels more intrusive than a request for bank details or a national insurance number.

Under the GDPR, personal data must be processed in accordance with specific data protection principles, most of which are similar to those under the DPA.

These include lawfulness and fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation and accountability. At least one condition for the processing must also be satisfied. 

The GDPR also introduces a new requirement to perform a privacy impact assessment in relation to processing which is likely to result in a high risk to the rights and freedoms of individuals. It specifically makes privacy impact assessments mandatory for large-scale processing of special category personal data.

In certain circumstances it may also be necessary to consult the Information Commissioner (the UK data protection regulator) prior to starting any high-risk processing.

Practical considerations for asset finance managers

Along with commercial considerations such as costs, companies should consider the following issues before implementing a biometric system:

  • Purpose. For what legitimate purpose will the biometric data be collected and processed? What specific need is the new system intended to address?
  • Privacy. A privacy impact assessment must be carried out in relation to large-scale processing of biometric data in order to identify and limit the potential risks to privacy arising from the processing.
  • Proportionality. Is the biometric system necessary to meet the identified need or merely the most convenient or cost-effective solution? Is the data being processed limited to the minimum required to meet that need? Is there a less intrusive alternative available?
  • Security. Will the biometric data be adequately protected from theft, misuse and unauthorised access? Consider how and where the information will be stored and the necessary safeguards in terms of technical and organisational measures.
  • Accuracy. How frequently are identification errors expected to occur when using the new system? 

It remains to be seen how the GDPR will impact attitudes and practices in relation to biometric data in particular.

However, in the US, where biometric systems are more widely used, for example in workplace settings, their implementation by various high-profile employers is already being challenged in the courts.

For example, employee plaintiffs in Illinois have filed a class action lawsuit against InterContinental Hotels Group, alleging that their fingerprints and other biometric data were collected and used in violation of the state's Biometric Information Privacy Act.

The employees claim that their employer implemented a new system which logged attendance via fingerprint scanning without properly obtaining their consent. Nor, they allege, were they told how the data would be stored or used, whether it would be shared with third parties, or how and when it would be permanently deleted, all of which the Illinois statute requires.

As for the UK, will the use of biometric technology be the norm a few years from now? If so, will the GDPR and the new UK data protection legislation ensure that privacy is adequately protected?

The undoubted benefits that such technologies offer to individuals and to the firms involved are accompanied in each case by inevitable losses of privacy. As the boundaries between work and private life become increasingly blurred, striking a balance between these competing interests will only become more difficult.

This is a challenge which will continue to face employers and organisations in the years to come.

* Zeinab Harb and Clare Murray specialise in employment and partnership law at CM Murray LLP. This article was co-authored with Jennifer Ross (pictured above) of leading civil litigation firm, Peters & Peters LLP.