Ubiquitous facial recognition is a serious threat to privacy. The idea that the photos we share are being collected by companies to train commercially sold algorithms is worrying. Anyone can buy these tools, snap a photo of a stranger, and find out who they are in seconds. But researchers have come up with a clever way to tackle this problem.
The solution is a tool called Fawkes, developed by scientists at the SAND Lab at the University of Chicago. Named after the Guy Fawkes masks donned by revolutionaries in the V for Vendetta comic book and film, Fawkes uses artificial intelligence to subtly, almost imperceptibly, alter your photos in order to trick facial recognition systems.
Running Fawkes on your photos is like adding an invisible mask to your selfies
The way the software works is a little complex. Running your photos through Fawkes doesn't make you invisible to facial recognition, exactly. Instead, the software makes subtle changes to your photos so that any algorithm scanning those images in the future sees you as a different person altogether. Essentially, running Fawkes on your photos is like adding an invisible mask to your selfies.
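Conceptually, this kind of cloaking can be framed as a small optimization problem: perturb the pixels just enough that the photo's embedding in a face recognition model's feature space drifts toward a different identity, while keeping the visible change negligible. The sketch below illustrates the idea in PyTorch. It's a minimal conceptual example, not the authors' implementation: `feature_extractor` is an assumed pretrained face-embedding model, and a simple per-pixel budget stands in for the perceptual-similarity constraint the real tool uses.

```python
import torch

def cloak(image, target_image, feature_extractor, budget=0.05, steps=200, lr=0.01):
    """Conceptual cloaking sketch (not the authors' implementation).

    Optimizes a small pixel perturbation so the image's embedding under
    `feature_extractor` moves toward that of `target_image`, a photo of a
    different person, while the perturbation stays within `budget`.
    """
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_features = feature_extractor(target_image)

    for _ in range(steps):
        optimizer.zero_grad()
        cloaked = (image + delta).clamp(0.0, 1.0)
        # Pull the cloaked image's embedding toward the target identity.
        loss = torch.nn.functional.mse_loss(feature_extractor(cloaked), target_features)
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            # Keep the change small enough to stay visually subtle.
            delta.clamp_(-budget, budget)

    return (image + delta).detach().clamp(0.0, 1.0)
```

To a human the output looks like the same selfie; to a model trained on it, the face now sits in someone else's region of feature space.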
The researchers call this process "cloaking," and it's designed to poison the resource facial recognition systems depend on: databases of faces scraped from social media. The facial recognition firm Clearview AI, for example, claims to have collected some three billion images of faces from sites like Facebook, YouTube, and Venmo, which it uses to identify strangers. But if the photos you share online have been run through Fawkes, the researchers say, the face the algorithms know won't actually be your own.
According to the University of Chicago team, Fawkes is 100 percent successful against state-of-the-art facial recognition services from Microsoft (Azure Face), Amazon (Rekognition), and Face++, built by Chinese tech giant Megvii.
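As a rough illustration of how a claim like this could be spot-checked against one of those services, here is a sketch using the AWS SDK for Python (boto3) and Rekognition's CompareFaces API. Note that the team's actual evaluation enrolls models on cloaked photos and then queries them with clean ones; this snippet merely compares a cloaked "reference" photo against an uncloaked probe, and the file paths and threshold are placeholder assumptions.

```python
import boto3

def match_scores(enrolled_path, probe_path, threshold=80.0):
    """Rough spot-check against Amazon Rekognition (illustrative only).

    Treats the cloaked photo as the enrolled reference and an uncloaked
    photo as the probe. An empty result means Rekognition found no face
    match above the similarity threshold.
    """
    client = boto3.client("rekognition")  # assumes AWS credentials are configured
    with open(enrolled_path, "rb") as enrolled, open(probe_path, "rb") as probe:
        response = client.compare_faces(
            SourceImage={"Bytes": enrolled.read()},
            TargetImage={"Bytes": probe.read()},
            SimilarityThreshold=threshold,
        )
    return [match["Similarity"] for match in response["FaceMatches"]]

# e.g. match_scores("me_cloaked.jpg", "me_original.jpg") -> [] if the cloak held
```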
"What we're doing is essentially using the disguised photo like a Trojan horse to corrupt unauthorized models and learn the wrong thing about what makes you look like you and not someone else," said Ben Zhao, professor for computer science at the University of Chicago, who helped create the Fawkes software, said The Verge. "As soon as corruption occurs, you're protected no matter where you go or where you're seen."
You'd hardly recognize her: photos of Queen Elizabeth II before (left) and after (right) being run through Fawkes' cloaking software.
Image: The Verge
The group behind the work – Shawn Shan, Emily Wenger, Jiayun Zhang, Huiying Li, Haitao Zheng, and Ben Y. Zhao – published a paper on the algorithm earlier this year. Late last month, though, they also released Fawkes as free software for Windows and Mac that anyone can download and use. So far it's been downloaded more than 100,000 times.
In our own tests, we found Fawkes sparse in design but easy enough to use. Processing each image takes a few minutes, and the changes it makes are mostly imperceptible. Earlier this week, The New York Times published a story on Fawkes noting that the cloaking effect was quite obvious and often made gendered changes to images, such as giving women mustaches. The Fawkes team says the updated algorithm is much subtler, though, and The Verge's own tests bear that out.
How much difference can a tool like Fawkes make?
But is Fawkes a silver bullet for privacy? That's doubtful. For a start, there's the problem of adoption. If you read this article and decide to use Fawkes to cloak any photos you upload to social media in the future, you'll certainly be in the minority. Facial recognition is worrying precisely because it's a society-wide trend, so the solution needs to be society-wide, too. If only the tech-savvy shield their selfies, it just creates inequality and discrimination.
Second, many firms that sell facial recognition algorithms built their databases of faces long ago, and that information can't be retroactively taken back. Clearview's CEO, Hoan Ton-That, made this point to The Times. "There are billions of unmodified photos on the internet, all on different domain names," said Ton-That. "In practice, it's almost certainly too late to perfect a technology like Fawkes and deploy it at scale."
Comparisons of uncloaked and cloaked faces generated by Fawkes.
Image: SAND Lab, University of Chicago
Naturally, the team behind Fawkes disagrees with this assessment. They note that while companies like Clearview claim to hold billions of photos, that doesn't mean much when you consider they're supposed to be able to identify hundreds of millions of people. "For many people, Clearview probably has a very small number of publicly available photos," says Zhao. And if people publish more cloaked photos in the future, sooner or later the cloaked images will outnumber the uncloaked ones.
On the adoption front, though, the Fawkes team acknowledges that for their software to make a real difference, it has to be released more widely. They have no plans to build a web or mobile app, citing security concerns, but they hope companies like Facebook might integrate similar technology into their own platforms in the future.
Integrating this technology would be in those companies' interest, says Zhao. After all, firms like Facebook don't want people to stop sharing photos, and they could still collect the data they need from images (for features like photo tagging) before cloaking them on the public web. While integrating the tech might have little effect on current users, it could help persuade future, more privacy-conscious generations to sign up for these platforms.
"Acquisition through larger platforms, e.g. Facebook or others could eventually paralyze Clearview by effectively rendering (their technology) so ineffective that it is no longer useful or financially viable as a service, ”says Zhao. "Leaving Clearview.ai out of business because it is no longer relevant or accurate is something we would be satisfied with as a result of our work."