Face masks are one of the best defenses against the spread of COVID-19, but their growing adoption has a second, unintended effect: breaking facial recognition algorithms.
Wearing a face mask that adequately covers the mouth and nose causes the error rate of some of the most widely used facial recognition algorithms to spike to between 5 and 50 percent, according to a study by the U.S. National Institute of Standards and Technology (NIST). Black masks were more likely to cause errors than blue masks, and the more of the nose a mask covered, the harder it was for the algorithms to identify the face.
With some algorithms, error rates rose as high as 50 percent
"With the arrival of the pandemic, we need to understand how facial recognition technology deals with masked faces," said Mei Ngan, a NIST computer scientist and an author of the report. "We began by focusing on how algorithms developed before the pandemic might be affected by subjects wearing face masks. Later this summer, we plan to test the accuracy of algorithms that were intentionally developed with masked faces in mind."
Example images used by NIST to evaluate the accuracy of various facial recognition algorithms.
Picture: B. Hayes / NIST
Facial recognition algorithms, such as those tested by NIST, work by measuring the distances between features on a target's face. Masks reduce the accuracy of these algorithms by covering many of those features, though some remain visible. This is slightly different from how facial recognition works on iPhones, for example, which use depth sensors for extra security, ensuring that the algorithms can't be fooled by showing the camera a picture (a threat that isn't relevant in the scenarios NIST is concerned with).
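To make that concrete, here is a toy sketch of feature-distance matching and how masking degrades it. The landmark names, coordinates, and scoring scheme below are invented for illustration and bear no relation to the actual algorithms NIST tested: the point is only that when the nose and mouth are hidden, far fewer distance measurements survive, so the comparison rests on much less evidence.

```python
import math

def signature(landmarks, visible):
    """Pairwise distances between the landmarks that are actually visible."""
    pts = {name: xy for name, xy in landmarks.items() if name in visible}
    keys = sorted(pts)
    return {
        (a, b): math.dist(pts[a], pts[b])
        for i, a in enumerate(keys)
        for b in keys[i + 1:]
    }

def compare(sig_a, sig_b):
    """Mean absolute difference over the distance pairs both signatures share."""
    shared = sig_a.keys() & sig_b.keys()
    if not shared:
        return float("inf")  # nothing left to compare
    return sum(abs(sig_a[k] - sig_b[k]) for k in shared) / len(shared)

# Invented landmark coordinates for an enrolled photo and a new probe photo.
enrolled = {"l_eye": (30, 40), "r_eye": (70, 40), "nose": (50, 60), "mouth": (50, 80)}
probe    = {"l_eye": (31, 41), "r_eye": (69, 40), "nose": (51, 61), "mouth": (50, 79)}

all_feats    = {"l_eye", "r_eye", "nose", "mouth"}
masked_feats = {"l_eye", "r_eye"}  # a mask hides the nose and mouth

full_score   = compare(signature(enrolled, all_feats), signature(probe, all_feats))
masked_score = compare(signature(enrolled, masked_feats), signature(probe, masked_feats))
```

With all four landmarks visible, the score averages six pairwise measurements; with a mask, only the eye-to-eye distance remains, so a single noisy measurement decides the outcome.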
Although there have been plenty of anecdotal reports of face masks thwarting facial recognition, the NIST study is particularly authoritative. NIST is the government agency tasked with assessing the accuracy of these algorithms (along with many other systems) for the federal government, and its rankings of different vendors are extremely influential.
Notably, the NIST report tested only one type of facial recognition, known as one-to-one matching. This is the procedure used in border crossing and passport control scenarios, where an algorithm checks whether the target's face matches their ID. It differs from the kind of facial recognition used for mass surveillance, where a crowd is scanned to find matches with faces in a database. That is known as a one-to-many system.
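The distinction between the two modes can be sketched with toy face "embeddings" (the vectors, names, and threshold below are invented for illustration; real systems use learned representations and calibrated thresholds):

```python
import math

THRESHOLD = 1.0  # illustrative match threshold, not a real system's value

def verify(probe, claimed_template):
    """One-to-one: does the probe match the single claimed identity?
    This is the border-control / passport-check scenario."""
    return math.dist(probe, claimed_template) <= THRESHOLD

def identify(probe, gallery):
    """One-to-many: find the closest identity in a whole gallery, if any
    is close enough. This is the surveillance-style search scenario."""
    best_name, best_dist = min(
        ((name, math.dist(probe, template)) for name, template in gallery.items()),
        key=lambda pair: pair[1],
    )
    return best_name if best_dist <= THRESHOLD else None

gallery = {"alice": [0.1, 0.9, 0.3], "bob": [0.8, 0.2, 0.7]}
probe = [0.15, 0.85, 0.35]

verify(probe, gallery["alice"])  # True: checks against one claimed identity
identify(probe, gallery)         # "alice": searches every gallery entry
```

One-to-one matching answers a yes/no question about a single claimed identity, while one-to-many must rank every gallery entry, which is one reason the latter is considered harder to get right.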
The Department of Homeland Security is concerned about face masks
Although the NIST report does not cover one-to-many systems, they are generally considered more error-prone than one-to-one algorithms. Picking out faces in a crowd is harder because you can't control the angle or lighting of the face, and the resolution is generally lower. That suggests that if face masks are breaking one-to-one systems, they are likely hampering one-to-many algorithms at least as much, and probably more.
This matches reports we have heard from within the government. An internal bulletin from the U.S. Department of Homeland Security, published by The Intercept earlier this year, said the agency was concerned about the "potential impact the widespread use of face masks could have on security operations that incorporate facial recognition systems."
Some companies say they have already developed new facial recognition algorithms that work with masks, such as the NEC system pictured above.
Image: Tomohiro Ohsumi / Getty Images
This is welcome news for privacy advocates. Many have warned about the rush by governments around the world to adopt facial recognition systems, despite the chilling effect this technology has on civil liberties and the well-documented racial and gender biases of these systems, which tend to perform worse on anyone who isn't a white man.
Meanwhile, companies that develop facial recognition technology have been quick to adapt to this new world, building algorithms that identify faces based only on the area around the eyes. Some vendors, such as the leading Russian firm NtechLab, say their new algorithms can identify individuals even when they're wearing a balaclava. Such claims aren't entirely trustworthy, however: they usually come from internal data, which can be cherry-picked to produce flattering results. That is why third-party agencies like NIST provide standardized testing.
NIST plans to test facial recognition algorithms specially tailored for mask wearers later this year, and to examine how effective one-to-many systems are as well. Despite the problems caused by masks, the agency expects the technology to persist. "With respect to accuracy with face masks, we expect the technology to continue to improve," said Ngan.