Whistle-blower fallout: Facebook spotlights tools that detect hate speech

Facebook says it prohibits hate speech and content that incites violence and has invested significantly in technology that proactively detects hate speech.
Former Facebook employee Frances Haugen charged that the social media giant ignored internal research findings that Facebook and Instagram were harmful for children and public safety (REUTERS/File Photo)
Updated on Oct 06, 2021 10:20 PM IST
By Vishal Mathur

New Delhi: As the fallout of the US Senate Commerce Subcommittee on Consumer Protection hearing following a whistle-blower’s complaints continues to be felt, social media giant Facebook has issued a statement seeking to clarify certain issues that former employee Frances Haugen raised in the complaint filed with the US Securities and Exchange Commission (SEC).

Facebook’s statement seeks to set the record straight on certain points raised in the whistle-blower’s complaint, including the handling of hate speech, the availability (or lack) of language classifiers, and how hate speech against certain communities and religions continues to be a focus of both enforcement and content policy updates.

“We prohibit hate speech and content that incites violence. Over the years, we‘ve invested significantly in technology that proactively detects hate speech, even before people report it to us,” said a Facebook spokesperson in a statement shared with HT.

According to internal documents submitted with Haugen’s complaint, only 0.2% of reported hate speech is believed to be taken down by the automated checks in place to monitor content on the social media platform.


The latest clarification follows a lengthy statement by Mark Zuckerberg, Chief Executive Officer of Facebook, earlier in the day. In this, Zuckerberg asserted that the claim that the company put profits over safety is “just not true.”

The whistle-blower’s complaints also raised the issue of language classifiers, which should be able to detect violating content across languages and translations as well.

Classifiers are automated systems and algorithms designed to detect hate speech in posts on Facebook.

The complaint said Facebook’s internal records show that the lack of Hindi and Bengali classifiers meant much of the reported content, particularly anti-Muslim narratives, was never flagged or dealt with by these systems. Facebook, however, insists that Bengali and Hindi are among the more than 40 languages globally for which it has classifiers capable of detecting content that violates its guidelines.

“We now use this technology to proactively detect violating content in Hindi and Bengali, alongside over 40 languages globally. As a result, we’ve reduced the prevalence of hate speech globally (meaning the amount of the content people actually see) on Facebook by almost 50% in the last three quarters, and it’s now down to 0.05% of all content viewed,” said the Facebook spokesperson.

The company also said it has a team of content reviewers covering 20 Indian languages, who manually monitor content on Facebook.

In the complaint filed with the US SEC, the whistle-blower also said: “RSS (Indian nationalist organization Rashtriya Swayamsevak Sangh) Users, Groups, and Pages promote fear-mongering, anti-Muslim narratives targeted pro-Hindu populations with V&I (violence and inciting) intent…”

Facebook denied any laxity in enforcing guidelines to protect any religion or marginalised groups against hate speech.

“As hate speech against marginalized groups, including Muslims, continues to be on the rise globally, we continue to make progress on enforcement and are committed to updating our policies as hate speech evolves online,” said Facebook.

HT has reached out to the BJP and RSS for their comments. The copy will be updated once they respond.

The social media network, however, has not clarified its stance on the whistle-blower complaints about Facebook’s handling of duplicate accounts, an issue internally known as SUMA, or Single User Multiple Accounts.

Facebook has also not touched upon the revelations that India was categorised alongside the US presidential elections as well as the elections in Brazil. In 2020, the “Top 3 Policy Priorities” for Facebook were referenced as “Tier 0 includes Brazil, India, United States”, while Tier 1 includes Germany, Indonesia, Iran, Israel and Italy. It is not clear what Tier 0 prioritisation means in terms of Facebook’s investment in India.
