
Wrongfully Accused by an Algorithm

The New York Times published an article in June about the wrongful arrest of Robert Julian-Borchak Williams due to a mistake in facial recognition software. Find the article here.

In January, Mr. Williams was arrested by the Detroit Police Department and taken to a detention center, accused of “stealing” from a Shinola watch store in 2018. The surveillance footage shown to Williams in the interrogation room depicted a man who was clearly not him, yet whom the software had somehow flagged as him. The case marks “the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm,” writes Kashmir Hill for The New York Times.

Facial recognition software has generally been found to be less accurate for people of color, a consequence of the lack of diversity in the image databases used to train the algorithms. False positive matches from these faulty algorithms can have severe consequences when the algorithms are applied to law enforcement, as Mr. Williams’ case shows. While many companies provide facial recognition technology to police departments, the quality of the algorithms varies widely, says Clare Garvie, a lawyer at Georgetown University’s Center on Privacy and Technology. The company that supplied the technology used in Mr. Williams’ case to the Detroit Police Department is DataWorks Plus, based in South Carolina, whose software incorporates components from the Japanese tech company NEC and Colorado-based Rank One Computing. Algorithms from NEC and Rank One Computing were tested in a 2019 federal study and found to misidentify African-American and Asian faces far more often than Caucasian faces.
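To make that statistic concrete, here is a minimal sketch in Python of how an evaluation like the 2019 federal study might tally false-positive rates per demographic group. Everything here is invented for illustration: the group names, similarity scores, and match threshold are hypothetical, not drawn from the study or from any real system.

```python
from collections import defaultdict

# Each record is (group, similarity_score, same_person) for one pair of face
# images. Pairs of DIFFERENT people that score above the threshold are the
# false positives — the kind of error that led to Mr. Williams' arrest.
# All values below are made up for demonstration.
comparisons = [
    ("group_a", 0.91, False), ("group_a", 0.42, False), ("group_a", 0.97, True),
    ("group_b", 0.88, False), ("group_b", 0.86, False), ("group_b", 0.95, True),
]

THRESHOLD = 0.85  # hypothetical score above which the system declares a match

false_positives = defaultdict(int)  # wrong matches, counted per group
impostor_pairs = defaultdict(int)   # different-person pairs, counted per group

for group, score, same_person in comparisons:
    if not same_person:
        impostor_pairs[group] += 1
        if score >= THRESHOLD:
            false_positives[group] += 1

# If these rates differ sharply between groups, the algorithm is biased:
# it falsely "recognizes" members of one group more often than another.
for group in sorted(impostor_pairs):
    rate = false_positives[group] / impostor_pairs[group]
    print(f"{group}: false-positive rate = {rate:.0%}")
```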

Another important factor, beyond the accuracy of the algorithms, is the weight placed on their output. In Mr. Williams’ case, after his driver’s license photo was matched to an image from the security video, it was simply placed in a lineup with five other photos and shown to a representative from Shinola, who selected it; no further investigation was conducted. A second witness had been present at the store during the shoplifting incident, but they were never asked to look at the photo lineup. Mr. Williams even had an alibi, an Instagram post from the time of the shoplifting, but the police department never asked for it. While tech developers and law enforcement defend facial recognition software as providing clues rather than answers, in Mr. Williams’ case it was treated as something much closer to an answer.

The Detroit Police Department eventually dismissed the case. Other changes have been made at the department as well: it announced that in the future, when a witness identifies someone from a photo lineup, it will be up to the prosecutor’s office whether or not to issue charges. It also announced that its facial recognition policy had been updated in July 2019 so that the technology is used only to investigate violent crimes. Still, there is a long way to go in eliminating the bias against people of color that is perpetuated by the use of facial recognition software.
