Facebook apologises after AI feature mistakenly labels Black men as ‘primates’
Facial recognition software - used by Facebook and others - has been blasted by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white.
An Artificial Intelligence (AI) tool has landed social media giant Facebook in trouble after it labelled Black men shown in a video as "primates". Facebook has apologised, disabled the tool and launched an investigation.
The issue surfaced when some users were watching a video from a British tabloid featuring Black men. These users soon received an automated prompt from Facebook asking if they would like to "keep seeing videos about Primates".
The video, dated June 27, 2020, was from the Daily Mail. It showed an altercation between the Black men and white police officers. While humans are among the many species in the primate family, the video had nothing to do with monkeys, chimpanzees or gorillas.
A Facebook spokesperson called it a "clearly unacceptable error" and said the recommendation software involved was taken offline.
"We apologize to anyone who may have seen these offensive recommendations," Facebook said in response to a query sent by news agency AFP.
"We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again," the spokesperson further said.
The issue was brought to light by former Facebook content design manager Darci Groves, who posted screenshots of the recommendation on Twitter. "Um. This "keep seeing" prompt is unacceptable, @Facebook. And despite the video being more than a year old, a friend got this prompt yesterday. Friends at FB, please escalate. This is egregious," she said in the tweet.