Indian privacy advocates are pushing back after police in the Uttar Pradesh capital city of Lucknow announced that they will be using facial recognition to watch for women in distress. To that end, the police will be installing five-camera setups at roughly 200 locations that have been linked to a high number of complaints from women.
While the scheme is ostensibly designed to cut down on street harassment, privacy advocates have warned that the invasive technology could in fact lead to more harassment of the women it is supposed to protect. The police did not provide many details about how the system would actually work, stating only that it would send a notification to the nearest police station whenever it detects a change in a woman’s facial expression.
Unfortunately, the police did not specify what constitutes distress, which increases the likelihood of false alarms. Those false alarms could in turn lead to over-policing, exposing women to far greater surveillance without reducing the risk of harm.
“The idea that cameras are going to monitor women’s expressions to see if they are in distress is absurd,” said Internet Freedom Foundation (IFF) Associate Counsel Anushka Jain. “What is the expression of someone in distress? I could be talking to my mother on the phone and get angry and make a face – will that trigger an alert and will they send a policeman?”
As it stands, Uttar Pradesh is the most dangerous state in India for women, reporting the highest number of crimes against women in 2019. The country as a whole averages one rape every 15 minutes, a statistic that has prompted legislators to introduce legal reforms intended to make it easier to report and prosecute sexual assault.
However, local women’s rights groups do not see the police as a solution. They noted that the police often turn away women who bring complaints, or else ignore them and take no follow-up action. With that in mind, it is noteworthy that the proposed facial recognition system will monitor the victims, not the alleged perpetrators of the crimes in question.
“The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women,” said Article 19 researcher Vidushi Marda. “AI is not a silver bullet, and no amount of ‘fancy’ tech can fix societal problems.”
The Lucknow police did not disclose how the biometric data gathered with the system would be stored, nor did they reveal who would have access to that information, which raises security concerns given the country’s prior issues with data breaches. The Indian government has been aggressively expanding its use of facial recognition in the past few years, despite a 2017 Supreme Court ruling that declared privacy to be a fundamental human right. For its part, the IFF has called for a three-year ban on facial recognition technology.
Source: Al Jazeera
January 22, 2021 – by Eric Weiss