Updated: September 05, 2021

Facebook Apologises For Its AI Mislabel Of Black Men As ‘Primates’


Facebook has apologised in a statement for the "unacceptable error" of its AI, which mislabeled a video of Black men as "primate" content. The company also disabled the recommendation feature responsible for the message while it investigates the cause, in order to prevent serious errors like this from happening again.

The offensive label appeared when Facebook's AI showed users who had watched a Daily Mail video featuring Black men an automated prompt asking whether they would like to "keep seeing videos about Primates."

Company spokeswoman Dani Lever said in a statement: "As we have said, while we have made improvements to our AI, we know it's not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations."

Gender and racial bias in artificial intelligence is hardly a problem unique to the social network — facial recognition technologies are still far from perfect and misidentify people of colour and women at higher rates. Last year, false facial recognition matches led to the wrongful arrests of two Black men in Detroit. In 2015, Google Photos tagged photos of Black people as "gorillas," and Wired found a few years later that the tech giant's solution was to censor the word "gorilla" from searches and image tags.

A few months ago, Facebook shared a dataset with the AI community in an effort to combat the issue. It contained over 40,000 videos featuring 3,000 paid actors who shared their age and gender with the company.

Facebook even hired professionals to light the shoots and to label the actors' skin tones, so AI systems can learn what people of different ethnicities look like under various lighting conditions. This latest incident makes clear that Facebook has not completely solved AI bias.
