
Are biometrics a boon or bust for refugees?


This article was originally published for Mobile World Live, an online communications hub and newsroom under industry body GSMA.

*

A leading United Nations (UN) aid organisation predicted biometrics would soon become essential for interactions with displaced people, a move likely to become increasingly important at a time when growing numbers are being forced to leave their homes.

The World Economic Forum reported 89 million people were forced to leave their homes globally in 2021, a figure that spiked to more than 100 million this year.

As part of its digital push, the United Nations High Commissioner for Refugees (UNHCR) has deployed a mobile app to verify refugees’ identities, in turn enabling aid providers to check individuals’ biometric data.

The UNHCR’s figures show such biometric processes are already widely used, with biometric data captured for around half of the 21.7 million displaced people it registered on its systems in 2021.

The technology has expedited registrations and provided ample benefits for the communities in question, with successful deployments reported in areas including financial access (mobile banking) and cash-based interventions.

But, considering the layers of structural and social barriers and unintended risks like cybercrime, could this biometric data do more harm to refugees than good?

One major global operator acting on this very issue is Vodafone Group, which founded the Instant Network Programme under the Vodafone Foundation in 2015, an initiative which looks at the role of mobile in humanitarian disasters.

Oisin Walton, programme manager of the project, told Mobile World Live technology should be paired with refugee-specific guidelines grounded in responsible policymaking.

These should be developed alongside digital training “to raise awareness on some of the risks when being online, particularly for first-time internet users”, Walton said, naming privacy and scams as the most prevalent threats.

Breaches and protections


In mid-2021, the UN accidentally exposed the biometric data of members of the Rohingya ethnic group who fled violence in Myanmar.

Following the breach, more than 100 displaced Rohingyas hid in fear of possible repatriation.

Biometric data on Afghanistan’s population became vulnerable to the Taliban-led government following the withdrawal of UK and US troops in 2021. The International Committee of the Red Cross, meanwhile, suffered a sophisticated cyberattack earlier this year involving information on more than 500,000 people.

The UNHCR has a Biometric Identity Management System and operational guidelines, and its mandatory due diligence includes risk assessments backed by years of research.

But every policy has its limits, so what can regulators and the industries involved do better?

Barnaby Willitts-King, director of research and policy at GSMA Mobile for Humanitarian Innovation, addressed the gap between institutional and ethical functions.

“There are a lot of claims made about the benefits of biometric data collection for efficiency and avoiding fraud, but it’s not always clear cut that this is actually going to save money or be more efficient or avoid duplication.”

The researcher believes ethical considerations should always sit at the forefront of technology-assisted interventions. “I also think there is a big issue around consent, and it’s very difficult given the deep power imbalance.”

Conscientious approach


As with physical identity documents, biometrics are not only used for national security or border controls: the technology is also used to confirm refugees’ rights to basic needs such as healthcare, food and a place of asylum.

Prior to the Rohingya data breach, UNHCR staff issuing Smart Cards asked refugees whether they consented to their details being shared with Myanmar, but it was not made clear people could still receive aid if they declined.

US-based NGO Human Rights Watch stated the registration receipt was available only in English, a language most beneficiaries do not speak, and that forms had been ticked “yes” even when consent had not been given.

Charity Oxfam also cited concern among refugees about biometrics, in particular the potential for the technology to discount their existence by reducing them to iris scans and fingerprints, along with worries over scanning the irises of older asylum seekers or the fingerprints of those with darker skin.

This, coupled with unequal knowledge distribution on the part of the beneficiaries, often complicates the issue of meaningful consent.

Willitts-King argued transparent, conscientious data collection starts with systemic cross-examination, in which responsible agencies reframe which information is considered valuable and explore options to opt out of sensitive databases.

“I think it’s an important policy that a growing number of organisations are starting to realise. If we cannot identify the point of collecting certain data, then we shouldn’t collect it, because we don’t know what the unintended consequences could be, especially when working with vulnerable people”, he said.

Rethinking accountability


The use of data to deliver aid raises issues related to consequence and accountability.

Since biometrics have only become widely used in recent years, the risks are just becoming apparent, and humanitarian agencies and stakeholders are working to resolve them.

“This is not some narrow ICT challenge, so the process of accountability spreads through a whole range of different parts of an ecosystem,” Willitts-King stated.

The deployment of technologies in the humanitarian space has, no doubt, produced successful outcomes in several cases: biometrics have expedited border registrations in Jordan, asylum administration in Greece and financial inclusion among displaced communities worldwide.

However, to bring refugees out of the statistical shadows, the industries involved need to keep re-evaluating the key threats until a more inclusive approach emerges, one that is genuinely mutually beneficial.

The editorial views expressed in this article are solely those of the author and do not necessarily reflect the views of the GSMA, its Members or Associate Members.