
FindBiometrics

Global Identity Management

NIST Study Confirms Racial Bias of Many Facial Recognition Algorithms

December 24, 2019

A study by the National Institute of Standards and Technology (NIST) released last Thursday shows that many of the facial recognition algorithms in use today misidentify people of color far more often than they do middle-aged white men.

The study tested roughly 18 million photos of more than 8 million people taken from databases run by the State Department, the Department of Homeland Security, and the FBI. It was conducted on 189 algorithms from the facial recognition industry’s leading systems, voluntarily submitted by 99 companies, academic institutions and other developers. A number of major tech companies supplied the algorithms, including Intel, Microsoft, Panasonic and SenseTime.

Amazon — which develops Rekognition, its own software used by law enforcement to track criminal suspects — did not submit its algorithm for the study, saying that its cloud-based service could not be easily examined by the NIST test.

The results of the study showed that, depending on the algorithm being tested and the type of search being conducted, Asian and African American people were up to 100 times more likely to be misidentified than white men.

For the kinds of searches most often used by police investigators — ‘one-to-many’ searches, in which a single image is compared to thousands or millions of others to find a match — the faces of African American women were the most likely to receive a false positive match.

Researchers found that accuracy varied widely from algorithm to algorithm, but Native Americans suffered the highest false positive rates of any ethnicity.

The study also showed that algorithms developed in the U.S. produced high error rates in “one-to-one” searches of Asians, African Americans, Native Americans, and Pacific Islanders. These searches form the backbone of rapidly expanding services like cellphone sign-ins and airport boarding systems.
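The difference between the two kinds of searches is easy to sketch in code. Below is a minimal, illustrative Python example that stands toy random vectors in for the embeddings a real face recognition model would produce; the names `verify` and `identify` and the 0.6 threshold are hypothetical choices for illustration, not NIST’s methodology:

```python
import math
import random

random.seed(0)

def unit(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    """Cosine similarity of two unit vectors is just their dot product."""
    return sum(x * y for x, y in zip(a, b))

DIM = 64          # embedding dimension (toy value)
GALLERY_SIZE = 200
THRESHOLD = 0.6   # similarity above this counts as a "match"

# Toy "embeddings" standing in for the outputs of a face recognition model.
gallery = [unit([random.gauss(0, 1) for _ in range(DIM)])
           for _ in range(GALLERY_SIZE)]

# A noisy new photo of enrolled identity 42.
probe = unit([x + random.gauss(0, 0.05) for x in gallery[42]])

def verify(probe, claimed_id):
    """One-to-one: is the probe who it claims to be?"""
    return cosine(probe, gallery[claimed_id]) >= THRESHOLD

def identify(probe):
    """One-to-many: who in the whole gallery, if anyone, is this?"""
    best_score, best_id = max((cosine(probe, g), i)
                              for i, g in enumerate(gallery))
    return best_id if best_score >= THRESHOLD else None
```

A one-to-many search multiplies the opportunities for a false positive, since every enrolled identity is a candidate match — which is why demographic differences in error rates matter so much for the investigative searches described above.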

Lawmakers reacted to the results, saying they were alarmed and calling on the Trump administration to revisit its plans to expand the country’s use of facial recognition technology.

“[F]acial recognition systems are even more unreliable and racially biased than we feared,” said Rep. Bennie G. Thompson (D-Miss.), chairman of the Committee on Homeland Security.

In a statement released after the findings were made public, Sen. Ron Wyden (D-Ore.) said that “algorithms often carry all the biases and failures of human employees, but with even less judgment,” and added that “[a]ny company or government that deploys new technology has a responsibility to scrutinize their product for bias and discrimination at least as thoroughly as they’d look for bugs in the software.”

NIST’s study comes after multiple communities in the U.S. placed bans or restrictions on the use of facial recognition technology by law enforcement, including the state of California, which banned its use on body cameras worn by police officers.

Facial recognition is also facing vocal criticism on the international stage. In China’s northwestern Xinjiang region, its use against Uighur Muslims, along with other forms of biometric surveillance, is the subject of increasing political and public scrutiny.

Source: The Washington Post

—

December 23, 2019 – by Tony Bitzionis

Related News

  • Vision-Box Biometric Algorithm Boasts Top-15 NIST Performance
  • ‘T5-OmniMatch’ Platform Consolidates TECH5’s Biometric Solutions
  • Liberian Election Officials’ Suspicious Biometrics Dealings Draw Washington’s Attention
  • US Mulls More Severe Sanctions Against Chinese Biometric Surveillance Company
  • Innovatrics Completes NIST Testing With Third Biometric Modality
  • After Topping FRVT Rankings, Paravision Officially Launches 5th-Gen Facial Recognition Platform

Filed Under: News Tagged With: algorithm, algorithms, Amazon, Amazon Rekognition, China, facial recognition, facial recognition algorithms, government biometrics, NIST, NIST testing, Xinjiang
