Face masks are breaking facial recognition algorithms, according to new government study

Face masks are one of the best defenses against the spread of COVID-19, but their increasing adoption is having an unintended side effect: breaking facial recognition algorithms.

Wearing face masks that adequately cover the mouth and nose causes the error rate of some of the most widely used facial recognition algorithms to rise to between 5 and 50 percent, according to a study by the US National Institute of Standards and Technology (NIST). Black masks were more likely to cause errors than blue masks, and the more a mask covered the nose, the harder it was for the algorithms to identify the face.

“With the arrival of the pandemic, we need to understand how facial recognition technology deals with masked faces,” said Mei Ngan, an author of the report and a computer scientist at NIST. “We began by focusing on how an algorithm developed before the pandemic might be affected by subjects wearing face masks. Later this summer, we plan to test the accuracy of algorithms that were intentionally developed with masked faces in mind.”

Sample images used by NIST to assess the accuracy of various facial recognition algorithms.
Image: B. Hayes / NIST

Facial recognition algorithms like those tested by NIST work by measuring the distances between features on a target’s face. Masks reduce the accuracy of these algorithms by removing most of these features, although some still remain. This is slightly different from how facial recognition works on iPhones, for example, which use depth sensors for extra security, ensuring the algorithm can’t be fooled by showing the camera a picture (a danger that isn’t present in the scenarios NIST is concerned with).
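The idea of comparing distances between facial features can be illustrated with a toy sketch. This is not NIST’s methodology or any vendor’s algorithm; the landmark coordinates, threshold, and function names below are all made up for illustration, assuming a face is represented as a list of feature coordinates:

```python
import math

# Toy landmark vectors: (x, y) positions of facial features
# (eye corners, nose tip, mouth corners). Purely illustrative values,
# not taken from any real model or dataset.
enrolled = [(30, 40), (70, 40), (50, 60), (35, 80), (65, 80)]
probe = [(31, 41), (69, 40), (50, 61), (36, 79), (64, 81)]

def feature_distance(a, b):
    """Mean Euclidean distance between corresponding landmarks."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

THRESHOLD = 5.0  # hypothetical acceptance threshold

# Full-face comparison uses all five landmarks:
print(feature_distance(enrolled, probe) < THRESHOLD)

# A mask hides the nose and mouth, leaving only the two eye landmarks,
# so the comparison rests on far less information:
print(feature_distance(enrolled[:2], probe[:2]) < THRESHOLD)
```

The point of the sketch is simply that masking removes most of the comparison points: a decision that once rested on many measurements must now be made from the eye region alone, which is why error rates climb.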

Although there has been plenty of anecdotal evidence of face masks thwarting facial recognition, the NIST study is particularly definitive. NIST is the government agency charged with evaluating the accuracy of these algorithms (along with many other systems) for the federal government, and its rankings of different vendors are extremely influential.

Notably, the NIST report tested only one type of facial recognition, known as one-to-one matching. This is the procedure used at border crossings and in passport control scenarios, where an algorithm checks whether the target’s face matches the face on their ID. This is different from the kind of facial recognition system used for mass surveillance, where a crowd is scanned to find matches against faces in a database. That is known as a one-to-many system.

Although the NIST report does not cover one-to-many systems, these are generally considered more error-prone than one-to-one algorithms. Picking out faces in a crowd is harder because you can’t control the angle or lighting of the face, and resolution is generally lower. That suggests that if face masks are breaking one-to-one systems, they are likely breaking one-to-many algorithms at least as often, and probably more.
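The difference between the two modes can be sketched in a few lines. This is a hedged illustration, not real facial recognition code: it assumes faces have already been encoded as numeric feature vectors, and the vectors, threshold, and helper names are all invented for the example:

```python
def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

THRESHOLD = 1.0  # hypothetical match threshold

def verify(probe, enrolled):
    """One-to-one: does the probe match one claimed identity (e.g. a passport)?"""
    return distance(probe, enrolled) < THRESHOLD

def identify(probe, gallery):
    """One-to-many: search a whole database for the closest enrolled face."""
    name, best = min(gallery.items(), key=lambda kv: distance(probe, kv[1]))
    return name if distance(probe, best) < THRESHOLD else None

# Invented feature vectors for two enrolled identities:
gallery = {"alice": [0.1, 0.9, 0.4], "bob": [0.9, 0.1, 0.2]}
probe = [0.15, 0.85, 0.45]

print(verify(probe, gallery["alice"]))  # one comparison
print(identify(probe, gallery))         # one comparison per enrolled face
```

A one-to-many search makes one comparison per enrolled face, so every comparison is another chance for a near-miss false match; that compounding is one reason these systems are considered more error-prone.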

This coincides with reports we have heard from inside the government. An internal bulletin from the US Department of Homeland Security earlier this year, reported by The Intercept, said the agency was concerned about “potential impacts that widespread use of protective masks could have on security operations that incorporate face recognition systems.”

Some companies say they have already developed new facial recognition algorithms that work with masks, like the NEC system above.
Image: Tomohiro Ohsumi / Getty Images

For privacy advocates, this will be good news. Many have warned of governments around the world rushing to adopt facial recognition systems, despite the chilling effects such technology has on civil liberties, and the widely recognized racial and gender biases of these systems, which tend to perform worse on anyone who is not a white man.

Meanwhile, companies creating facial recognition technology have been rapidly adapting to this new world, designing algorithms that identify faces using just the area around the eyes. Some vendors, such as Russia’s leading firm NtechLab, say their new algorithms can identify people even if they’re wearing a balaclava. Such claims are not entirely trustworthy, though. They usually come from internal data, which can be cherry-picked to produce flattering results. That is why third-party agencies like NIST provide standardized testing.

NIST says it plans to test specially tuned facial recognition algorithms for mask wearers later this year, along with testing the effectiveness of one-to-many systems. Despite the problems caused by masks, the agency expects the technology to persevere. “With respect to accuracy with face masks, we expect the technology to continue to improve,” said Ngan.