Facebook removes 1.9 million pieces of ISIS-related content

Facebook said it removed about twice as much terrorist content in the first three months of this year as it did in the previous quarter. Some 99% of that content was flagged by the company's internal systems rather than by users.

By: Sarah Frier
| Updated on: Aug 19 2022, 22:32 IST
Facebook said that it removed ISIS-related content by actively looking for it. (REUTERS)

Facebook Inc. said it was able to remove a larger amount of content from the Islamic State and al-Qaeda in the first quarter of 2018 by actively looking for it.

The company has trained its review systems -- both humans and computer algorithms -- to seek out posts from terrorist groups. The social network took action on 1.9 million pieces of content from those groups in the first three months of the year, about twice as many as in the previous quarter. And 99% of that content wasn't reported first by users, but was flagged by the company's internal systems, Facebook said Monday.

Facebook, like Twitter Inc. and Google's YouTube, has historically put the onus on its users to flag content that its moderators need to look at. After pressure from governments to recognize its immense power over the spread of terrorist propaganda, Facebook started about a year ago to take more direct responsibility. Chief Executive Officer Mark Zuckerberg earlier this month told Congress that Facebook now believes it has a responsibility over the content on its site.

The company defines terrorists as non-governmental organizations that engage in premeditated acts of violence against people or property to intimidate and achieve a political, religious or ideological aim. That definition includes religious extremists, white supremacists and militant environmental groups. "It's about whether they use violence to pursue those goals," the company said.

The policy doesn't apply to governments, Facebook said, because "nation-states may legitimately use violence under certain circumstances."

Facebook didn't give any numbers for its takedown of content from white supremacists or other groups it considers to be linked to terrorism, in part because its systems' training has so far focused on the Islamic State and al-Qaeda.

Facebook has come under fire for being too passive about extremist content, especially in countries like Myanmar and Sri Lanka, where the company's algorithm, by boosting popular posts, has helped give rise to conspiracy theories that spark ethnic violence. People in those countries told the New York Times that even after they report content, Facebook may not take it down.


First Published Date: 24 Apr, 17:39 IST