Wrongfully accused by an algorithm


Mr. Williams’ case combines faulty technology with bad police work, illustrating how facial recognition can go wrong.

The theft at Shinola occurred in October 2018. Katherine Johnston, an investigator at Mackinac Partners, a loss prevention firm, reviewed the store’s surveillance video and sent a copy to Detroit police, according to her report.

Five months later, in March 2019, Jennifer Coulson, a digital image examiner for the Michigan State Police, uploaded a “probe image,” a still from the video showing the man in the Cardinals cap, to the state’s facial recognition database. The system would have mapped the man’s face and searched for similar ones in a collection of 49 million photos.
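The article does not describe the system’s internals, but modern facial recognition searches generally follow the same pattern: map each face to a numeric “embedding” vector, then rank the entire gallery by similarity to the probe. The Python sketch below illustrates only that general pattern; the stub model, dimensions, and data are invented stand-ins, not DataWorks, NEC, or Rank One code.

```python
import numpy as np

EMBED_DIM = 512  # a common face-embedding size; the real systems' internals are not public

def embed_face(pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a proprietary face-embedding model. Pixels go in;
    a unit-length feature vector comes out. A seeded random vector
    fakes that interface here purely for illustration."""
    rng = np.random.default_rng(int(pixels.sum()) % (2**32))
    vec = rng.standard_normal(EMBED_DIM)
    return vec / np.linalg.norm(vec)

def search_gallery(probe_vec: np.ndarray, gallery: np.ndarray, top_k: int = 10):
    """Rank every gallery embedding by cosine similarity to the probe.
    With unit vectors, cosine similarity reduces to a dot product."""
    scores = gallery @ probe_vec
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in order]

# The real gallery held 49 million photos, stored as precomputed
# embeddings; 1,000 random unit vectors stand in for it here.
rng = np.random.default_rng(0)
gallery = rng.standard_normal((1000, EMBED_DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

probe_image = np.zeros((112, 112, 3), dtype=np.uint8)  # placeholder "probe image"
print(search_gallery(embed_face(probe_image), gallery, top_k=3))
```

The system returns the closest matches it can find, not a yes-or-no answer; whatever faces score highest come back as candidates, which is why a low-quality probe can surface the wrong person.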

The state’s technology is supplied for $5.5 million by a company called DataWorks Plus. Founded in South Carolina in 2000, the company first offered photo management software, said Todd Pastorini, a general manager. In 2005, the company began expanding the product, adding facial recognition tools developed by outside vendors.

When one of these subcontractors develops a facial recognition algorithm, DataWorks tries to judge its effectiveness by running searches using low-quality images of individuals it knows are present in a system. “We’ve tested a lot of garbage,” Mr. Pastorini said. Those checks, he added, are not “scientific”: DataWorks does not formally measure the systems’ accuracy or bias.
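For illustration, an informal spot check of the kind Mr. Pastorini describes could look like the sketch below: run probes of known people and count how often the top hit is the right person. Everything here is hypothetical; the point is how little such a check measures compared with a formal evaluation.

```python
import numpy as np

def spot_check(probes: np.ndarray, probe_ids: list,
               gallery: np.ndarray, gallery_ids: list) -> float:
    """Informal test: for each low-quality probe of a known person,
    see whether the top-scoring gallery entry is that person, and
    return the rank-1 hit rate. Note what this omits: false-match
    rates, confidence thresholds, and any breakdown by demographic
    group, the measurements a formal evaluation would report."""
    hits = 0
    for probe, true_id in zip(probes, probe_ids):
        best = int(np.argmax(gallery @ probe))  # highest similarity score
        hits += (gallery_ids[best] == true_id)
    return hits / len(probe_ids)
```

A single hit-rate number like this says nothing about how often the system confidently matches two different people, which is the failure mode in Mr. Williams’ case.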

“We’ve become a pseudo-expert in the technology,” Mr. Pastorini said.

In Michigan, the DataWorks software used by the state police incorporates components developed by the Japanese technology giant NEC and by Colorado-based Rank One Computing, according to Mr. Pastorini and a state police spokeswoman. In 2019, algorithms from both companies were included in a federal study of more than 100 facial recognition systems that found they were biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces.
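In testing of this kind, the disparity shows up in the false match rate: the share of comparisons between two different people that the system nonetheless scores above the match threshold, computed separately for each demographic group. The sketch below shows only that arithmetic, on synthetic score distributions; the gap is invented to mirror the study’s 10-to-100-fold finding, not taken from its data.

```python
import numpy as np

def false_match_rate(impostor_scores: np.ndarray, threshold: float) -> float:
    """Share of impostor comparisons (photos of two *different* people)
    whose similarity score still clears the match threshold."""
    return float(np.mean(impostor_scores > threshold))

# Synthetic impostor-score distributions for two demographic groups;
# the offset between them is made up purely to illustrate the disparity.
rng = np.random.default_rng(1)
group_a = rng.normal(0.20, 0.10, 100_000)
group_b = rng.normal(0.35, 0.10, 100_000)

t = 0.50  # hypothetical match threshold
fmr_a, fmr_b = false_match_rate(group_a, t), false_match_rate(group_b, t)
print(f"group A FMR: {fmr_a:.4%}, group B FMR: {fmr_b:.4%}, "
      f"ratio: {fmr_b / fmr_a:.0f}x")
```

Because both groups face the same fixed threshold, a score distribution shifted even slightly upward for one group multiplies its false matches many times over, which is how ratios as large as 100-fold arise.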

Brendan Klare, the chief executive of Rank One, said the company had developed a new algorithm for NIST to review that “adjusts for differences in accuracy between different demographic cohorts.”