
The New York Times reported that the video, posted by a tabloid in June, showed Black men in altercations with white civilians and police officers. After the video ended, an automated prompt asked users whether they wanted to “keep seeing videos about Primates.”
Facebook said it has disabled the artificial intelligence feature that generated the message, apologized for what it called “an unacceptable error,” and told the newspaper it would investigate further to prevent it from happening again, The Associated Press reported.
Facebook on Saturday did not respond to a request for comment from The Associated Press. But a company spokeswoman told the Times that even though Facebook has made improvements to its artificial intelligence, “it’s not perfect” and has “more progress to make.”
Artificial intelligence has mislabeled people of color before. In 2015, Google apologized after its Photos service labeled an image of two Black people as gorillas.