Facial recognition software leads to Detroit man being wrongfully arrested
FOX 5 NY looks at the story of Robert Williams, his wrongful arrest and how facial recognition software can be biased against minorities and women.
MICHIGAN - Robert Williams, 42, of Farmington Hills, Michigan, says police showed up at his home on January 9, 2020, because they believed he had stolen almost $4,000 worth of watches. Their evidence: a match from facial recognition software.
The complaint by Robert Williams is a rare challenge from someone who not only experienced an erroneous face recognition hit, but was able to discover that it was responsible for his subsequent legal troubles.
The Wednesday complaint filed on Williams' behalf alleges that his Michigan driver's license photo — kept in a statewide image repository — was incorrectly flagged as a likely match to a shoplifting suspect. Investigators had scanned grainy surveillance camera footage of an alleged 2018 theft inside a Shinola watch store in midtown Detroit, police records show.
That led to what Williams describes as a humiliating January arrest in front of his wife and young daughters on their front lawn in the Detroit suburb of Farmington Hills.
"I was completely shocked and stunned to be arrested in broad daylight in front of my daughters, my wife, my neighbors,” says Williams.
The father of two was wrongfully arrested based on a false face recognition hit. Williams says he spent almost 30 hours at the Detroit detention center.
"A detective turns over a picture of a guy and says, it's not you? I looked, I said, no, that's not me. He turns another paper over and he says, I guess this is not you either? I picked that paper up and held it up against my face, and I said, this is not me, and I said, I hope you don't think all black people look alike. Then he said the computer says it's you," claims Williams.
Adam Scott Wandt, an assistant professor of public policy and vice chair for technology in the Department of Public Management at John Jay College, says facial recognition software is biased because the algorithms were originally trained largely on images of white men, so the systems are significantly better at identifying white men than people of any other race or gender.
Meanwhile, the ACLU has filed a complaint against Detroit Police. We reached out to the police department for a comment but have not heard back.
With the Associated Press.