A coalition of 85 advocacy and activist groups has signed open letters urging Amazon, Google, and Microsoft to pledge not to sell facial recognition technology to government authorities. Signatories include the American Civil Liberties Union, the Electronic Frontier Foundation, Human Rights Watch, and the Refugee and Immigrant Center for Education and Legal Services, among others.
The letters were tailored to each tech giant. Those directed at Google and Microsoft credited the companies' leadership for previously signalling at least some commitment to ethical standards concerning AI and facial recognition, while also outlining why a firmer stance is needed. The letter to Amazon was harsher, given that company's unapologetic approach to selling such technologies to police and other government actors, which it has defended on the basis of the technology's utility in fighting human trafficking and sexual exploitation.
Commenting on the issue to media, ACLU California’s Nicole Ozer said that “the choices made by these companies now will determine whether the next generation will have to fear being tracked by the government for attending a protest, going to their place of worship, or simply living their lives.”
By targeting three of the world's biggest tech companies, the letters could help to focus attention on the issue. But they also ignore major facial recognition specialists with a lower profile in the public sphere, such as NEC, which are likely to step into any void left by the big tech firms if they withdraw from the government market. Calling for government action on the issue, meanwhile, could help to lay the groundwork for UK-style oversight bodies, but faces the currently daunting obstacle of legislative dysfunction in the US.
January 16, 2019 – by Alex Perala

[UPDATE 01/21/2019: An earlier version of this article stated that 90 advocacy groups signed the open letters mentioned above. This article has been corrected to reflect the correct number: 85. Additional changes were made to reflect the complexity of the situation.]