UK Home Office Deployed Face Recognition Platform with Known Racial Biases

The UK Home Office has come under fire for a passport photo checking system with significant racial biases. The platform has trouble processing images of people with very dark or very light skin, and has repeatedly mistaken the lips of darker-skinned individuals for open mouths when trying to map their faces.

The system was first deployed in June 2016 to streamline the passport application process.

While racial bias is a well-documented issue with many facial recognition algorithms, the Home Office is facing particularly heavy scrutiny because the department knew about the issue before launching the system, and then failed to notify the public about the problem. The information came to light thanks to a Freedom of Information (FOI) request submitted by MedConfidential.

“User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph,” reads a document released through the FOI request. “However; the overall performance was judged sufficient to deploy.”

The Home Office allows citizens to opt out of the automated check and continue the passport application process. However, critics have noted that many people may be hesitant to do so because the website suggests that a bad photo could hinder the application. They have also pointed out that the arrangement creates a two-tiered system in which vital public services are not equally available to all British citizens.

“A person’s race should not be a barrier to using technology for essential public services,” said a spokesperson from the UK’s Equality and Human Rights Commission.

“It clearly shows it wasn’t a priority for them that [the system] would work for people with black skin,” added the Race Equality Foundation’s Samir Jeraj.

The Home Office said that it will work to improve the system, but did not offer a timeline for doing so. Major tech companies like IBM and Microsoft have been working to reduce racial biases in facial recognition tech, but the full extent of the problem is still unclear in many cases.

Sources: New Scientist, BBC

October 11, 2019 – by Eric Weiss