Ensure steps to prevent spread of rumours, government warns WhatsApp
The warning to WhatsApp comes in the wake of a spate of lynchings of innocent people triggered by “fake and motivated” messages circulated on the widely used messaging app
The government has expressed its “deep disapproval” to instant messaging company WhatsApp over “irresponsible and explosive messages”, warning it to prevent the spread of rumours that have incited several instances of violence in the country in the last two months.
“While the law and order machinery is taking steps to apprehend the culprits, the abuse of platform like WhatsApp for repeated circulation of such provocative content are equally a matter of deep concern,” the Ministry of Electronics and Information Technology said in a statement.
“It has also been pointed out that such a platform cannot evade accountability and responsibility, especially when good technological inventions are abused by some miscreants who resort to provocative messages which lead to spread of violence.”
An official familiar with the matter said on condition of anonymity the implicit message is that if WhatsApp, which is owned by Facebook, does not take action, the government would be forced to act.
A spokesperson for WhatsApp said: “WhatsApp cares deeply about people’s safety and their ability to freely communicate. We don’t want our services used to spread harmful misinformation and believe this is a challenge that companies and societies should address. For example, we recently made a number of updates to our group chats and will be stepping up efforts to help people spot false news and hoaxes.”
Fake videos and rumours of alleged child-lifters have resulted in mobs targeting innocent bystanders or visitors. At least eight states, including Karnataka, Assam, Maharashtra and Gujarat, have reported incidents of mob violence motivated by WhatsApp messages about child-lifters. The most recent incident occurred in Maharashtra, where five people were lynched on Sunday, although the role of a messaging platform in this case hasn’t been established. Fear of child-lifters has led to at least 29 people being killed since May last year, of which at least 19 have died in the last two months alone.
Fact-checking should be built into the interface of WhatsApp to curb rumour-mongering, said Sunil Abraham, founder of the think tank Centre for Internet and Society. “So, for instance, if there is a database of discredited memes then each message sent or received should be checked against that database,” he said.
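The kind of check Abraham describes could, in its simplest form, compare a fingerprint of each message against a shared database of already-discredited content. The following Python sketch is purely illustrative, with all names and the sample database hypothetical; a real deployment would need fuzzy or perceptual matching (exact hashing is trivially evaded by small edits) and would have to run the check on the user's device to preserve end-to-end encryption.

```python
import hashlib

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivial edits
    # don't produce a different fingerprint
    return " ".join(text.lower().split())

def fingerprint(text: str) -> str:
    # SHA-256 digest of the normalized message text
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()

# Hypothetical database of fingerprints of discredited messages
discredited_db = {
    fingerprint("Beware of child lifters seen in your area!"),
}

def is_discredited(message: str) -> bool:
    # Flag a message if its fingerprint matches a known discredited item
    return fingerprint(message) in discredited_db

print(is_discredited("beware of  CHILD lifters seen in your area!"))  # True
print(is_discredited("Meeting moved to 5pm"))                         # False
```

On an end-to-end encrypted service the server never sees message content, so the database of fingerprints would have to be distributed to clients and the lookup performed locally, a design trade-off that connects directly to the encryption debate discussed below.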
In December, a top Facebook executive confirmed plans to bring third-party fact checking to India.
Abraham added that a hard regulatory approach won’t work in this case. “If Facebook were to ban end-to-end encryption to be able to monitor what’s happening on WhatsApp, chances are people will move to free software alternatives (that do the same).”