Worrying trends, unclear data: India's CSAM challenge
This article is authored by Ranjana Kumari, founder, Centre for Social Research, New Delhi.
The recent CyberTipline data released by the National Centre for Missing and Exploited Children (NCMEC) makes for grim reading. In 2024, India accounted for the largest number of reports of Child Sexual Abuse Material (CSAM) globally, a staggering 2.3 million. This data is not just alarming; it is an emergency.

While these numbers are alarming, they reveal only half the story. What they obscure might be even more important.
We still don’t know how many of these reports pertain to unique instances of abuse or how many relate to the same content being circulated again and again. We don’t know how many were generated by perpetrators and how many came from individuals who, in horror or ignorance, reshared the material. Crucially, we have no clear picture of how many reports are translated into timely interventions, legal action, or support for survivors.
This gap in understanding should concern all of us. Because in the absence of clarity, effective policy and accountability are impossible. And as long as we continue to work in the dark, children will continue to suffer.
The lack of transparency and disaggregated data is not just a technical issue; it is a moral failure. Each of those 2.3 million reports from India represents a potential instance of unspeakable harm. But without knowing the context, India's efforts to respond to this crisis are reactive at best.
Encouragingly, India has already taken important legal steps to address online child sexual exploitation. Under Section 67B of the Information Technology Act, the creation, transmission, and even viewing of CSAM is a criminal offence, punishable by imprisonment and fines. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 further require social media platforms and other intermediaries to remove CSAM swiftly upon knowledge or notification and to report such content to Indian law enforcement. Most recently, in 2024, the Supreme Court of India clarified that even the possession or viewing of CSAM is punishable under Indian law and recommended replacing the term child pornography with Child Sexual Exploitative and Abuse Material (CSEAM) to reflect the grave nature of the offence more accurately.
However, India urgently needs a national strategy—one that is research-driven, coordinated, and survivor-centred. At the heart of this strategy must be a few core principles:
· First, we need platform transparency. Major tech companies must be compelled to share disaggregated data with regulators, researchers, and civil society. That means not just how many reports were filed, but the nature of the content, its origin, distribution patterns, and response timelines.
· Second, we need robust legal frameworks. Our laws must evolve to reflect the complexity of the digital age. There should be clear legal distinctions between those creating or intentionally distributing CSAM, and those who may unwittingly share such material out of shock or confusion. The law must be firm, but also fair.
· Third, we need a massive public education campaign. Many people, especially younger users, do not know what to do when encountering harmful content online. Some try to flag it by posting screenshots. Others share it in outrage. We must teach people that the safest, most responsible action is to report the content immediately to platform moderators or relevant authorities, and never to redistribute it.
· Fourth, and most critically, we need a national commitment to survivor support. Children who have experienced abuse, especially when that abuse is digitised and distributed, require specialised care. From trauma-informed counselling and medical support to legal aid and safe housing, survivors need pathways to recovery that are compassionate and sustained.
· And finally, we need research. There is an urgent need for academic institutions and civil society to be empowered to study the scale, nature, and consequences of online child sexual exploitation in India. We must stop depending solely on foreign data sets. India must invest in its national data infrastructure while maintaining international collaboration.
The truth is, the numbers we are seeing may be the tip of the iceberg. And for each data point we miss, we risk failing a child.
This is not just a criminal justice issue. It is a societal one. It is about the kind of digital environment we are willing to accept—and the kind of country we want our children to grow up in.
If India is serious about building a safe digital future, then child protection must be placed at the core of our internet governance strategy. That includes robust law enforcement, yes—but also education, prevention, corporate accountability, and above all, compassion.
Even one child harmed is one too many. And 2.3 million reports are not just numbers; they are a wake-up call. The question is: Will we listen?