How to Stay Compliant with Biometric Data Regulations

AI-powered facial recognition technology is an increasingly popular tool for organisations looking to streamline access controls and enhance security. But as Serco Leisure found out recently, data protection regulators are singling out such rollouts for increased scrutiny. Privacy risk assessments and other best practices are fast becoming indispensable to ensure deployments stay on the right side of the law.

Face/Off

Regulators at the Information Commissioner’s Office (ICO) penalised Serco Leisure for failing to offer an alternative to having employee faces and fingerprints scanned to clock in and out of work. Less intrusive means, such as ID cards or fobs, could just as easily have been used instead, the ICO ruled.

More than 2,000 employees at 38 leisure facilities were obliged to submit biometric data for attendance checks. Serco Leisure failed to demonstrate that its use of biometric technologies was “necessary and proportionate” in recording workers’ attendance.

“Biometric data is wholly unique to a person so the risks of harm in the event of inaccuracies or a security breach are much greater – you can’t reset someone’s face or fingerprint like you can reset a password,” John Edwards, UK Information Commissioner, said in a statement. “Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy.”

The ICO further faulted Serco for failing to offer workers who were uncomfortable with biometric checks any mechanism to opt out of the system. Serco Leisure, Serco Jersey and seven associated community trusts were ordered to stop processing biometric data for monitoring employees’ attendance at work. The businesses were further ordered to destroy all biometric data that they are not legally obliged to retain.

The sanctions coincided with the publication by the ICO of new guidance on how organisations can process biometric data lawfully.

Access Control

Facial recognition technology is gaining traction in the enterprise to control access to secure locations and authenticate users through ID verification services, among other things. But biometric data, unlike passwords, is intrinsically tied to an individual, so the impact of its exposure in the event of a data breach can be more severe and long-lasting.

Recognising the heightened privacy protection it deserves, the EU’s General Data Protection Regulation (GDPR) sets out tighter rules governing the processing of biometric data, including purpose limitation as a safeguard against mission creep, as well as transparency and consent requirements.

Deployment Hurdles

Rolling out biometric technologies in the workplace can be compatible with regulatory requirements, but only provided they are introduced after a comprehensive data protection impact assessment (DPIA), according to experts.

Jon Bartley, partner and head of the Data Advisory Group at international law firm RPC, tells ISMS.online that it’s “critical to have a clear process for onboarding technologies” that involve the use of biometric data, such as fingerprint or iris scans.

“From the perspective of data protection law, companies need to be aware of the hurdles to deployment,” he explains. “For example, depending on the use case for the tech, it may be difficult to establish a lawful basis for processing the biometric data unless affected individuals are given a real choice about accepting the tech or selecting an alternative.”

Since it can be used to uniquely identify individuals, biometric data falls into a special category, and is therefore subject to stricter data processing rules.

“This data is higher risk and a further condition must be satisfied to legitimise the processing, which may be difficult due to the limited number and scope of such conditions,” Bartley says.

AI Increases Risk

Sarah Pearce, partner at law firm Hunton Andrews Kurth and an expert in data privacy, says the ICO’s Serco Leisure ruling shows why businesses should consult legal professionals on regulatory compliance. Beyond data protection law, facial recognition underpinned by AI triggers requirements to comply with a growing body of legislation governing this emerging technology space, she tells ISMS.online.

The recently ratified EU AI Act establishes a risk-based legal framework for AI governance that characterises the use of AI in conjunction with facial recognition technologies as “high risk”.

This would mean that, in order to be fully compliant with that law, businesses would need to “assign human oversight of the AI system, conduct a fundamental rights impact assessment (not dissimilar to a DPIA under the GDPR), and inform the employees and works council where appropriate,” she explains.

The high-risk designation applies to both customer-facing and employee-focused deployments of biometric technologies in conjunction with AI.

RPC’s Bartley adds: “Use of facial recognition tech to track employees may be classified as a high risk use of AI under the AI Act, which triggers various obligations such as human oversight, monitoring, record-keeping and employee consultation.”

Operational Benefits

Ashley Avery, partner at UK law firm Foot Anstey, tells ISMS.online that she has recorded a “significant increase” in the volume of queries from clients looking to deploy biometric technology. These organisations see the business benefits of it for security purposes, or as part of a customer offering, but are “wary” of the data privacy risks, Avery explains.

“The guidance issued by the ICO is useful as it details the points that need to be considered before adopting such technology and how businesses can demonstrate compliance with data privacy legislation,” she adds.

Frameworks and standards, such as ISO 27001, can help businesses position themselves to achieve compliance when rolling out facial recognition technologies. ISO 27001 offers a systematic approach to managing and protecting sensitive information, including the data such systems handle.

The ICO’s guidance places a strong focus on the requirement to carry out DPIAs.

“In our experience, DPIAs are a valuable tool for identifying privacy-related risks, meaning businesses can put in place measures to minimise such risks from the outset – this is crucial when processing such sensitive data,” Avery concludes.

Businesses looking to deploy biometric technologies should consider the following:

⦁ Consider less intrusive authentication or access control options
⦁ Follow the ICO’s new biometric guidance
⦁ Carry out a rigorous DPIA
⦁ Consult with stakeholders (clients, partners and employees) prior to deployment
⦁ Provide clear and transparent information on how biometric data will be used
⦁ Implement appropriate technical and organisational measures to ensure the security and integrity of the biometric data – using standards like ISO 27001 where applicable
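The first two checklist points, offering a less intrusive alternative and respecting an opt-out, can be made concrete in application logic. Below is a minimal Python sketch of a clock-in system that always offers non-biometric methods and only enables facial recognition on explicit, revocable consent; the `Employee` model and function names are hypothetical illustrations, not drawn from any system discussed above.

```python
from dataclasses import dataclass


@dataclass
class Employee:
    name: str
    biometrics_consented: bool  # explicit, revocable opt-in, defaulting to False


def clock_in_methods(employee: Employee) -> list[str]:
    """Return the clock-in methods available to an employee.

    Non-biometric alternatives (ID card, fob) are always offered, so
    biometric capture is never the only route -- the gap the ICO
    identified in the Serco Leisure ruling. Facial recognition is
    added only where the employee has actively opted in.
    """
    methods = ["id_card", "fob"]
    if employee.biometrics_consented:
        methods.append("facial_recognition")
    return methods
```

The key design choice is that consent gates an *additional* option rather than the only one, which is what gives employees the “real choice” Bartley describes.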
