An apparent pay-for-play service has raised serious concerns about the integrity of Moscow’s facial recognition system. The network launched in January and is theoretically restricted to law enforcement officers.
However, that claim is now in doubt after a Moscow resident stumbled across an ad for a service that seems to give regular civilians access to information in the system. Anna Kuznetsova spotted the ad on the Telegram messaging app and responded to it on behalf of the digital rights group Roskomsvoboda, for which she volunteers.
The service in question offered to provide a detailed report on the movements of anyone in Moscow. Kuznetsova was asked to submit a photo and pay a modest fee of 16,000 rubles (approximately $200). She paid the fee, submitted a photo of herself, and received a report two days later that included 79 positively identified photos of her.
According to Kuznetsova, the service provider never asked about her identity, or inquired about how the photos would be used. Each photo was stamped with a time and a location, painting an accurate picture of her daily habits for the past month. The report also disclosed her address and her place of work, information that could jeopardize her safety in the wrong hands.
“Any crazy guy can stalk you using this; criminals can check when and where you go and steal from your apartment or hurt you,” Kuznetsova told Reuters. “Anything can happen.”
The network is run by Moscow’s Department of Technology (DIT), and consists of more than 105,000 cameras paired with NtechLab’s facial recognition software. When it launched, the DIT claimed that all footage would be deleted after five days unless law enforcement needed it for an investigation. Kuznetsova’s report detailed her movements for a full month, which suggests that footage of her remained in the database well past the point at which it should have been deleted.
It is not yet clear how the service provider gained access to the network — specifically, whether they hacked the database or bribed an official with legitimate access. Two police officers are now being investigated in relation to the incident, but Kuznetsova has nevertheless filed a lawsuit asking that the system be suspended until clearer rules can be put in place, with penalties for those who violate them.
The news reflects international concerns about the unregulated use of facial recognition. Clearview AI came under similar fire when it was revealed that the company had given investors personal access to its controversial facial recognition system.
November 13, 2020 – by Eric Weiss