The Electronic Frontier Foundation (EFF) has tried to articulate the difference between the public and private use of facial recognition. The privacy watchdog has consistently pushed to ban the government use of the technology, but believes that private use should still be permissible as long as strict regulations are put in place.
In that regard, the EFF noted that there are some socially beneficial uses of the technology. For example, many people use facial recognition to unlock their smartphones. Other authentication methods may be more secure, but facial recognition remains a convenient option that raises overall security, and should be available to people who might not use those other features.
By the same token, facial recognition could be used as part of a public demonstration, even if only to highlight the potential dangers of the technology. A full private ban would prevent such uses and limit protest activities and other forms of public expression.
With that in mind, the EFF believes that private facial recognition should be legal, albeit with considerable limitations. Most notably, private entities should not be allowed to collect or share any facial information without explicit written consent, and should be barred from selling faceprints in any capacity. Faceprints should be gathered only for a specific purpose, and should be discarded once that purpose has been accomplished, or after a certain amount of time. Finally, face data should be stored securely, to prevent malicious actors from gaining access to people's personal information.
The EFF also advocated for minimization, a principle that holds that facial recognition should be used only to provide people with something that has been explicitly requested. Private entities should similarly not be allowed to retaliate against anyone who chooses not to use facial recognition, as would be the case if those entities forced people to pay higher fees or accept worse service in order to preserve their privacy.
Of course, those laws would need some teeth to make sure that private companies cannot choose to ignore them. To that end, the EFF stressed the need for a “private right of action” that allows individual citizens to take legal action against private entities that violate the law. The organization cited Illinois’ Biometric Information Privacy Act (BIPA) as an example of such legislation, and suggested that it should be implemented on a national scale.
While the EFF is in favor of private use, it came down against retail outlets and other establishments that use facial recognition to identify and market to their customers. Venues that deploy the technology are doing so without obtaining written consent, and are often leveraging user information pulled from other sources (such as online activity). The EFF also spoke out against private companies like Clearview AI that market facial recognition to the government, noting that any laws would need to close those public sector loopholes.
The EFF has backed public facial recognition bans all over the country, though it did not support the private ban that was implemented in Portland, Oregon. The organization has also spoken out against the Department of Homeland Security's recent efforts to expand its collection powers.
January 21, 2021 – by Eric Weiss