ACLU files a complaint alleging the first known case of mistaken arrest based on facial recognition identification


According to the complaint, Robert Williams was pulling into the driveway of his home in Farmington Hills, Michigan, one January afternoon when a police vehicle suddenly pulled up and blocked him in. Officers got out and arrested Williams as his wife and children watched from the doorway, then took him to a detention center half an hour away in Detroit. There, he spent the next 30 hours in what his lawyers describe as a “crowded and dirty cell.”

Detroit police believed Williams was responsible for stealing multiple watches from a local Shinola store. But it turns out that the police were wrong – misled by a computer algorithm, according to the ACLU. The Detroit Police Department did not respond to multiple requests for comment.

As the country reckons with decades of police practices that have disproportionately affected black and brown communities, the incident hints at the personal cost to ordinary Americans of the rapid spread of facial recognition in policing, and at the technology’s significant potential to misidentify people of color.

Facial recognition systems generally use software to match an image of a face against those stored in a database. The technology has been used everywhere from concerts to airports, but it faces growing scrutiny from privacy and civil liberties advocates, technologists, and lawmakers over concerns about algorithmic discrimination.
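At a technical level, such systems typically reduce each face to a numeric “embedding” and rank database entries by a similarity score. The sketch below is a simplified, hypothetical illustration of that matching step only; the function names, threshold, and stand-in data are invented for this example and do not describe any particular vendor’s software.

```python
# Hypothetical sketch of the matching step in a facial recognition system:
# each face is represented as an embedding vector, and a probe image is
# compared against a database of embeddings by cosine similarity.
# All names, vectors, and the threshold here are illustrative only.
from __future__ import annotations

import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return the cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_match(probe: np.ndarray, database: dict[str, np.ndarray],
               threshold: float = 0.8) -> tuple[str | None, float]:
    """Return the database entry most similar to the probe embedding.

    If no entry clears the threshold, return (None, best_score). Even a
    high-scoring result is only a candidate, which is why this kind of
    output is treated as an investigative lead rather than a positive
    identification.
    """
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        return None, best_score
    return best_name, best_score


# Toy usage with random stand-in embeddings (a real system would produce
# these with a trained neural network applied to face images).
rng = np.random.default_rng(0)
database = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = database["person_3"] + rng.normal(scale=0.1, size=128)
print(best_match(probe, database))
```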

Williams, who is black, has now become a face of that movement. According to the ACLU complaint, Detroit police sent surveillance video of a black man stealing watches from a Shinola store to the Michigan State Police, who ran the footage through a facial recognition system that suggested a photo of Williams as a possible match. Shanon Banner, a spokeswoman for the Michigan State Police, referred CNN’s questions about Williams’ case to Detroit police.

According to Banner, the policy of the Michigan State Police is not to use facial recognition as a form of positive identification.

“It is considered an investigative lead only, requiring the investigator to continue the criminal investigation before making any final determination, including arrest,” she said. “All investigative lead reports include the following statement at the top of the report: ‘This document is not a positive identification. It is an investigative lead only and is not probable cause for arrest. Further investigation is needed to develop probable cause for arrest.’”


Banner did not explain how the software, allegedly sold to the state police by a company known as DataWorks Plus, suggested that the man in the video was Williams. However, police showed a list of head shots, including Williams’, to a Shinola security guard who had not witnessed the robbery but had seen the video. The guard, the complaint says, identified Williams as the suspect.

Williams was later arrested, although he was released after questioning during which, as the complaint states, “it was clear that his arrest was based on a facial recognition misidentification.”

In a video produced by the ACLU about his experience, Williams said that once the officers questioning him saw that the images did not match his face, “they left them on the table and looked at each other, like, ‘oops!’”

“I never thought I would have to explain to my daughters why they arrested Dad,” Williams wrote in a Washington Post op-ed on Wednesday. “How do you explain to two girls that a computer was wrong, but the police listened to it anyway?”

The ACLU’s complaint about Williams’ experience follows announcements from several large tech companies that they will not sell facial recognition to police. Amazon has announced a one-year moratorium on police use of its Rekognition software, while Microsoft has said it will not sell its facial recognition technology to police departments until federal regulation is in place. Meanwhile, IBM has announced that it will no longer offer, develop, or research “general-purpose” facial recognition technology.

It is unknown how many local police departments across the country use facial recognition systems, and few rules govern how and where the technology can be deployed. There is no federal legislation regarding its use, although several states have enacted related laws; Illinois, for example, requires companies to obtain consent before collecting biometric information. And some cities, including Boston and San Francisco, have banned their own government agencies from using the technology.

With the complaint, the ACLU said, Williams hopes that the Detroit Police Department will, among other things, publicly apologize to him and his family. The complaint also seeks all department records related to Williams’ arrest and asks Detroit police to stop using facial recognition as an investigative tool.

In surveillance footage used by police, the true suspect was wearing a St. Louis Cardinals hat, according to the complaint.

“Mr. Williams, a lifelong resident of the Detroit area, does not own that hat and is not a fan of the Cardinals,” the complaint said. “He is not even a baseball fan. However, he is black.”
