Facebook's AI Technology Mislabeled Video of Black Men as 'Primates'

Facebook apologized for the incident and disabled the recommendation feature: "While we have made improvements to our AI, we know it’s not perfect."

Image via Getty/Kirill Kudryavtsev/AFP

Facebook is apologizing after its artificial intelligence technology mislabeled a video of Black men as “primate” content.

According to the New York Times, the gaffe came to light earlier this week after a former Facebook designer was sent a screenshot of the mislabeled video. The post in question, published by the Daily Mail in June 2020, showed footage of a group of Black men interacting with police officers. Those who watched the video were presented with an automated prompt asking, “Keep seeing videos about Primates?” along with “yes” and “dismiss” options.

“As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make,” Facebook said in a statement to the Times. “We apologize to anyone who may have seen these offensive recommendations.”

A Facebook representative added that the company has since disabled the recommendation feature responsible for the “unacceptable error,” and that its team is now working to prevent similar incidents from happening again.

Image recognition technology has created a slew of problems for big tech over the past several years. In 2015, Google came under fire after its Photos application mistakenly tagged images of Black people as “gorillas.” According to Wired, Google subsequently censored searches for “gorilla,” “chimp,” “chimpanzee,” and “monkey” within the application.