The racial bias displayed by numerous facial recognition algorithms appears to be intensified when subjects are wearing face masks, according to officials from the Department of Homeland Security.
The officials’ comments were made at the 2021 Federal Identity Forum and Expo hosted this week by AFCEA, a nonprofit aimed at facilitating knowledge-sharing and collaboration between military, government, and industry professionals. (This week’s AFCEA event also brought remarks from a Defense Department official indicating that the DoD is preparing to issue a Request for Proposals concerning an upgrade to its Automated Biometric Identification System.)
The officials explained that, with the help of volunteers, the DHS sought to explore race-based differences in the performance of facial recognition systems at last year's third annual DHS Science and Technology Directorate Biometric Technology Rally.
Over 60 facial recognition systems were tested against a minimum performance standard of a 95 percent success rate in biometrically acquiring and matching subjects' faces. More than a third met that rate when scanning volunteers who identified as Black, while over half met the standard when scanning white volunteers.
While it is not clear how large the sample sizes were for these tests, that is a large disparity in the algorithms’ performance with respect to race – and one that builds on previous studies, such as a landmark investigation by the National Institute of Standards and Technology in 2019, that demonstrated the presence of racial bias in facial recognition technology.
That isn’t to say that all of the facial recognition systems tested displayed a bias. As AFCEA reports, the DHS found that the best facial recognition algorithms attained a 100 percent success rate regardless of the race of the subjects.
Unfortunately, even the best facial recognition systems began to show a bias when volunteer subjects donned face masks. AFCEA’s report does not provide hard data concerning this trend, but indicates that even as the top-performing algorithms continued to meet the 95 percent success rate threshold for white subjects, they failed to do so for Black volunteers.
The results are dispiriting, suggesting that racial bias could continue to be a serious issue as facial recognition technology is increasingly deployed in the real world, especially if the use of face masks around the world remains prevalent for some time. But the fact that DHS officials are paying attention to the issue offers cause for hope; events like the Biometric Technology Rallies – together with ongoing testing by NIST and other groups – offer compelling incentives for solutions providers to continue to refine their technologies, including by seeking to eliminate racial bias.
Until solutions providers show marked success in those efforts, the use of facial recognition is likely to continue to provoke scrutiny from civil rights experts and advocacy groups – and for good reason.
August 27, 2021 – by Alex Perala