FindBiometrics

Does LFR Have a Racial Bias? The UK’s National Physical Lab Weighs In

April 12, 2023

“Now, the NPL study on NEC’s technology offers further evidence that the racial bias problem can be ameliorated, and even eliminated.”


After years of intense debate over facial recognition’s shortcomings, new research from the United Kingdom’s National Physical Laboratory offers an updated perspective on equity issues in the police use of the technology.

Assessing NEC’s NeoFace solution together with the HD5 Face Detector, the study delivered multiple key findings. One of the most important directly concerns the longstanding issue of demographic bias, in which certain facial recognition systems show disparities in accuracy across different ethnicities and genders. The NPL study found that, when used at the settings maintained by the Met, there was no statistically significant difference in the facial recognition technology’s accuracy across such demographic groups; in other words, bias was effectively eliminated at those settings.

While the study did not test a broad sampling of facial recognition solutions, focusing instead on NeoFace, its findings nevertheless offer some indication of the state of the art in 2023, years after reports first came to light indicating the presence of bias in facial recognition.

A 2019 study from America’s National Institute of Standards and Technology (NIST) found particularly concerning disparities. Testing 189 facial recognition algorithms from 99 vendors, NIST researchers found that in one-to-many face searches, African American women were more likely than other groups to receive a false positive match. More generally, Asian and African American people could be up to 100 times more likely than white men to be misidentified by certain facial recognition algorithms.

Such findings caused alarm over their real-world implications. Even a small disparity, such as a one percent difference in accuracy between demographic groups, can add up to hundreds or thousands of additional misidentifications for the disadvantaged group in large-scale scenarios such as airport screening.
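To make the scale of the problem concrete, a back-of-the-envelope calculation shows how a small accuracy gap compounds with volume. The traveler count and accuracy figures below are hypothetical illustrations, not numbers from the NIST or NPL studies:

```python
# Back-of-the-envelope: how a 1-percentage-point accuracy gap scales with volume.
# All numbers below are illustrative assumptions, not study figures.

travelers_per_day = 100_000   # hypothetical daily screenings at a large airport
accuracy_group_a = 0.999      # 99.9% correct identifications for one group
accuracy_group_b = 0.989      # one percentage point lower for another group

extra_errors_per_day = travelers_per_day * (accuracy_group_a - accuracy_group_b)
print(f"Extra misidentifications per day for group B: {extra_errors_per_day:.0f}")
```

Even under these modest assumptions, the lower-accuracy group absorbs roughly a thousand additional misidentifications every day, which is why seemingly small disparities draw scrutiny.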


But at least some biometrics vendors have taken pains to reduce or eliminate such biases in recent years. Paravision, for example, demonstrated a 100 percent True Identification Rate across Asian, Black, and White racial categories, and across Male and Female groups, in testing by the Department of Homeland Security last spring. Now, the NPL study on NEC’s technology offers further evidence that the racial bias problem can be ameliorated, and even eliminated.

Of particular importance is the study’s assessment of the facial recognition system when set to a matching threshold of 0.6. This is the default setting for NeoFace, and the one applied by the Met in Live Facial Recognition, as mentioned above. The matching threshold determines which faces could be a potential match; when it is set lower, more faces will be considered, and vice versa.
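As a rough sketch of how such a threshold works in candidate filtering, consider the snippet below. The similarity scores and identity names are invented for illustration; NEC's actual scoring model is proprietary and not described in the study coverage:

```python
# Minimal sketch of threshold-based candidate filtering.
# Scores and identities are invented for illustration.

watchlist_scores = {
    "candidate_1": 0.71,
    "candidate_2": 0.61,
    "candidate_3": 0.58,
    "candidate_4": 0.44,
}

def potential_matches(scores, threshold):
    """Return identities whose similarity meets or exceeds the threshold."""
    return [name for name, score in scores.items() if score >= threshold]

print(potential_matches(watchlist_scores, 0.6))   # more faces considered
print(potential_matches(watchlist_scores, 0.64))  # fewer faces considered
```

Raising the threshold from 0.6 to 0.64 drops the candidate pool from two faces to one, which mirrors the trade-off the NPL researchers examined: stricter thresholds yield fewer potential matches, and with them fewer opportunities for false positives.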

While the 0.6 threshold setting did produce a slightly lower True Positive Identification Rate for Black female subjects, the variation in outcomes was comparable to that produced by environmental effects and was not considered statistically significant.

At the higher threshold of 0.62, there was only one false positive, and at a threshold of 0.64, there were none at all, meaning that demographic bias in false positives is eliminated entirely at the latter setting. Conversely, when the NPL researchers lowered the matching threshold to 0.58 or below, a statistically significant imbalance returned, with Black subjects more likely than Asian and White subjects to receive false positives.

All of this suggests that NeoFace, at least, still exhibits a demographic bias at lower thresholds, but that the bias disappears at more selective matching configurations.

Also important is the NPL’s finding that the chance of a false match when using facial recognition at the 0.6 threshold amounted to one in 6,000. This, together with the finding concerning bias (or lack thereof), was used by another UK police agency, the South Wales Police, to help justify restarting its use of facial recognition.
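The one-in-6,000 figure is easier to reason about when translated into expected false matches for a deployment of a given size. The crowd size below is a hypothetical figure chosen for illustration:

```python
# Expected false matches at the reported one-in-6,000 false match rate.
# The number of faces scanned is a hypothetical figure for illustration.

false_match_rate = 1 / 6000
faces_scanned = 60_000   # hypothetical faces scanned over a deployment period

expected_false_matches = false_match_rate * faces_scanned
print(f"Expected false matches: {expected_false_matches:.0f}")
```

Scanning 60,000 faces at that rate yields an expectation of about ten false matches, the kind of figure police agencies weigh when deciding whether live deployments are proportionate.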

“The study confirms that the way South Wales Police uses the technology does not discriminate on the grounds of gender, age or race and this reinforces my long-standing belief that the use of facial recognition technology is a force for good and will help us keep the public safe and assist us in identifying serious offenders in order to protect our communities from individuals who pose significant risks,” commented South Wales Police Chief Constable Jeremy Vaughan.

Sources: The Guardian, Swansea Bay News, Forbes, Metropolitan Police, NPL


April 12, 2023 – by Alex Perala

