Another case of facial recognition gone awry in Detroit

The Detroit Free Press covered the story of Michael Oliver, who was mistakenly arrested due to the police’s use of facial recognition technology. Find the article here.

In 2019, the Detroit Police Department accused Oliver, then 25, of “reaching into a teacher’s vehicle, grabbing a cellphone, and throwing it, cracking the screen and breaking the case,” according to the article. As in the case of Robert Williams, facial recognition software used by the police flagged Oliver as a lead: the teacher had recorded a video of the incident, and the software matched the person in the video to Oliver. Despite obvious physical differences between Oliver and the person who actually committed the crime, the teacher identified Oliver in a photo lineup of possible suspects, and the police charged him with a felony without considering any evidence in his favor.

The arrest of Michael Oliver marks the second known case of a false arrest due to facial recognition software. In both cases, the individuals falsely charged were Black men. The Detroit police began using the software in investigations in 2017, and its use has faced pushback from protesters in the city. Their claim, supported by studies and exemplified in cases like Williams’s, is that the software’s rate of false positives is especially high when applied to people of color. This is likely because people of color are underrepresented in the sets of images used to train the algorithms.

Oliver’s case was soon dismissed, and the Detroit police have made some changes since then. There are now stricter rules for the use of facial recognition in investigations, namely that it can only be used in violent crime cases. Going forward, greater care will also be taken in determining whether the person identified by the software should be charged.

However, the cases of both Oliver and Williams are indicative of a larger issue of bias that comes with the use of facial recognition technology — one that continues to be debated across the country.
