December 14, 2024

Facial Recognition Led to Wrongful Arrests. So Detroit Is Making Changes.

In January 2020, Robert Williams spent 30 hours in a Detroit jail because facial recognition technology suggested he was a criminal. The match was wrong, and Mr. Williams sued.

On Friday, as part of a legal settlement over his wrongful arrest, Mr. Williams got a commitment from the Detroit Police Department to do better. The city adopted new rules for police use of facial recognition technology that the American Civil Liberties Union, which represented Mr. Williams, says should be the new national standard.

“We hope that it moves the needle in the right direction,” Mr. Williams said.

Mr. Williams was the first person known to be wrongfully arrested based on faulty facial recognition. But he wasn’t the last. The Detroit police arrested at least two other people as a result of facial recognition searches gone awry, including a woman who was charged with carjacking when she was eight months pregnant.

Law enforcement agencies across the country use facial recognition technology to try to identify criminals whose misdeeds are caught on camera. In Michigan, the software compares an unknown face to those in a database of mug shots or driver’s license photos. In other jurisdictions, the police use tools like Clearview AI, which search through photos scraped from social media sites and the public internet.

One of the most important new rules adopted in Detroit is that the images of people identified via facial recognition technology can no longer be shown to an eyewitness in a photo lineup unless there is other evidence that links them to the crime.

“The pipeline of ‘get a picture, slap it in a lineup’ will end,” said Phil Mayor, a lawyer for the A.C.L.U. of Michigan. “This settlement moves the Detroit Police Department from being the best-documented misuser of facial recognition technology into a national leader in having guardrails in its use.”

The police say facial recognition technology is a powerful tool for helping to solve crimes, but some cities, including San Francisco; Austin, Texas; and Portland, Ore., have banned its use because of concerns about privacy and racial bias. Stephen Lamoreaux, head of informatics with Detroit’s crime intelligence unit, said the Police Department was “very keen to use technology in a meaningful way for public safety.” Detroit, he asserted, has “the strongest policy in the nation now.”

Mr. Williams was arrested in connection with a crime that happened in 2018. A man stole five watches from a boutique in downtown Detroit while being recorded by a surveillance camera. A loss prevention firm provided the footage to the Detroit Police Department.

A search of the man’s face against driver’s license pictures and mug shots produced 243 photos, ranked in order of the system’s confidence that each showed the person in the surveillance video, according to documents disclosed as part of Mr. Williams’s lawsuit. An old driver’s license photo of Mr. Williams was ninth on the list. The person running the search deemed him the best match and sent a report to a Detroit police detective.

The detective included Mr. Williams’s picture in a “six-pack photo lineup” — photos of six people in a grid — that he showed to the security contractor who had provided the store’s surveillance video. She agreed that Mr. Williams was the closest match to the man in the boutique, and this led to the warrant for his arrest. Mr. Williams, who had been at his desk at an automotive supply company when the watches were stolen, spent the night in jail and had his fingerprints and DNA collected. He was charged with retail fraud and had to hire a lawyer to defend himself. Prosecutors eventually dropped the case.

He sued Detroit in 2021, hoping to force a ban on the technology so that others would not suffer his fate. He said he was upset last year when he learned that the Detroit police had charged Porcha Woodruff with carjacking and robbery after a bad facial recognition match. The police arrested Ms. Woodruff as she was getting her children ready for school. She has also sued the city; that suit is ongoing.

“It’s so dangerous,” Mr. Williams said, referring to facial recognition technology. “I don’t see the positive benefit in it.”

The Detroit police are responsible for three of the seven known instances in which facial recognition has led to a wrongful arrest. (The others were in Louisiana, New Jersey, Maryland and Texas.) But Detroit officials said that the new controls would prevent more abuses. And they remain optimistic about the crime-solving potential of the technology, which they now use only in cases of serious crimes, including assault, murder and home invasions.

James White, Detroit’s police chief, has blamed “human error” for the wrongful arrests. His officers, he said, relied too heavily on the leads the technology produced. It was their judgment that was flawed, not the machine’s.

The new policy, which is effective as of this month, is supposed to help with that. Under the new rules, the police can no longer show a person’s face to an eyewitness based solely on a facial recognition match.

“There has to be some kind of secondary corroborating evidence that’s unrelated before there’s enough justification to go to the lineup,” said Mr. Lamoreaux of Detroit’s crime intelligence unit. The police would need something more than a physical resemblance: location information from a person’s phone, say, or DNA evidence.

The department is also changing how it conducts photo lineups. It is adopting what is called a double-blind sequential lineup, which is considered a fairer way to identify someone. Rather than presenting a “six-pack” to a witness, an officer who does not know who the primary suspect is presents the photos one at a time. And the lineup includes a different photo of the person from the one the facial recognition system surfaced.

The police will also need to disclose that a face search happened, as well as the quality of the image being searched (How grainy was the surveillance footage? How visible was the suspect’s face?), because a poor-quality image is less likely to produce reliable results. They will also have to reveal the age of the photo surfaced by the automated system, and whether there were other photos of the person in the database that did not show up as a match.

Franklin Hayes, Detroit’s deputy chief of police, said he was confident that the new practices would prevent future misidentifications.

“There’s still a few things that might slip up, for example, identical twins,” Mr. Hayes said. “We can never say never, but we feel that this is our best policy yet.”

Arun Ross, a computer science professor at Michigan State University who is an expert on facial recognition technology, said that Detroit’s policy was a great starting point and that other agencies should adopt it.

“We don’t want to trample on the rights and privacy of individuals, but we also don’t want crime to be rampant,” Mr. Ross said.

Eyewitness identification is a fraught endeavor, and the police have embraced cameras and facial recognition as more reliable tools than imperfect human memory.

Chief White told local lawmakers last year that facial recognition technology had helped “in getting 16 murderers off the street.” When asked for more information, Police Department officials did not provide details about those cases.

Instead, to demonstrate the department’s successes with the technology, police officials played a surveillance video of a man who splashed fuel inside a gas station and set it on fire. They said he had been identified with facial recognition technology and arrested that night. He later pleaded guilty.

Detroit’s Police Department is one of the few that keep tabs on facial recognition searches, submitting weekly reports about its use of the technology to an oversight board. In past years, it has averaged more than 100 searches a year, with around half of those searches surfacing potential matches.

The department keeps track only of how often it gets a lead, not whether the lead pans out. But as part of its settlement with Mr. Williams (who also received $300,000, according to a police spokesperson), it has to conduct an audit of its facial recognition searches dating back to when it first started using the technology in 2017. If it identifies other cases in which people were arrested with little or no supporting evidence beyond a face match, the department is supposed to alert the relevant prosecutor.

Molly Kleinman, the director of a technology research center at the University of Michigan, said the new protections sounded promising, but she remained skeptical.

“Detroit is an extraordinarily surveilled city. There are cameras everywhere,” she said. “If all of this surveillance technology really did what it claims to, Detroit would be one of the safest cities in the country.”

Willie Burton, a member of the Board of Police Commissioners, an oversight group that approved the new policies, described them as “a step in the right direction,” though he was still opposed to the use of facial recognition technology by the police.

“The technology is just not ready yet,” Mr. Burton said. “One false arrest is one too many, and to have three in Detroit should sound an alarm to discontinue it.”