A team of scientists at the University of Chicago’s Sand Lab has created a tool designed to thwart facial recognition algorithms. Named in honor of Guy Fawkes (whose mask appears in V for Vendetta), the Fawkes tool alters a photo to trick a facial recognition algorithm into thinking that the face belongs to a different person. The changes are virtually imperceptible, however, so the cloaked version still looks like the original to the naked eye.
“What we are doing is using the cloaked photo like a Trojan Horse to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else,” said University of Chicago computer science professor and Fawkes co-creator Ben Zhao. “Once the corruption happens, you are continuously protected no matter where you go or are seen.”
According to its creators, Fawkes is 100 percent effective against major facial recognition algorithms like Amazon’s Rekognition and Megvii’s Face++. The Sand Lab also expressed particular concern about Clearview AI, which has built a database of more than 3 billion images gathered from all corners of the web. If those algorithms are trained with cloaked images, they will not be able to identify the individual when that person appears on a surveillance camera in real life.
The challenge, of course, is getting regular people to adopt the tool. A free version of Fawkes is available for Mac and Windows, and while it has been downloaded 100,000 times, that is not nearly enough to cover the entire social media landscape.
“There are billions of unmodified photos on the internet, all on different domain names,” said Clearview CEO Hoan Ton-That. “In practice, it’s almost certainly too late to perfect a technology like Fawkes and deploy it at scale.”
The Sand Lab is ultimately hoping that tech giants like Facebook will pick up the technology and make it the default for people using their platforms. In that scenario, Facebook would still be able to gather data from images that have been uploaded to its own site, but would block third parties like Clearview that want to scrape the platform to build their own databases. More importantly, it would ensure that the number of cloaked images would eventually exceed the number of unaltered photos, limiting the utility of large-scale facial recognition systems.
In that regard, the Sand Lab is hoping that Fawkes will put companies like Clearview out of business. In the meantime, D-ID has also released a Smart Anonymization tool that removes identifying information from facial images.
August 6, 2020 – by Eric Weiss