Slate published an article detailing Facebook’s AI’s misidentification of Black men in a video as “primates.” Find the article here.
Facebook’s topic recommendations, which the site displays after a user watches a video, are driven by AI. The feature recently asked one user whether they would like to “[k]eep seeing videos about Primates” after they watched a year-old video of Black men in altercations with police and white civilians. The egregious mistake likely stems from poor facial recognition on Black faces, an issue often traced to a lack of representation in training data. A similar mislabelling occurred in 2015, when Google Photos tagged photos of Black people as “gorillas.” Facebook has since disabled the topic recommendation feature and apologized for the mistake, stating that it has “more progress to make” with its AI; the apology follows last year’s announcement that the company planned to examine whether its algorithms are racially biased. Mistakes like these can have ramifications well beyond degraded recommendations, since similar software is used in fields such as policing.