Draft govt policy seeks 3-tier checks for OTTs
Social media companies may need to appoint officers who will be responsible for complying with content moderation orders, and both they and streaming service providers will be brought under a three-tier regulatory framework, according to proposed new rules that will cover social media companies such as Facebook and so-called over-the-top (OTT) platforms such as Netflix. In addition, shows on these platforms will have to carry ratings such as U (universal) or A (adult) — something they are not currently required to do.
In a document running into 30 pages, reviewed by Hindustan Times, the central government has laid down the rules and the framework for regulating both sets of companies, which remain largely unregulated, although some provisions of the Information Technology Act apply to them. The rules, in the process of being finalised, will also apply to digital news media.
Countries, including India, have felt the need to regulate social media companies, which, under current rules, are not responsible for content, unlike traditional media firms; there have also been demands to regulate content on OTT platforms, with some shows on these running into trouble for offending religious sentiments.
In the document, titled Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, the government cites powers provided to it under section 87 of the Information Technology Act, 2000. This section allows the government to make rules to carry out the provisions of the law by notification in the Official Gazette and in the Electronic Gazette.
The guidelines define social media companies, suggest a three-tier mechanism for regulation of all online media, define the process for tracing the first originator, and confer blocking powers to an inter-ministerial committee that forms the third tier and which will be headed by a joint secretary level officer from the ministry of information and broadcasting.
In effect, in addition to the IT ministry, this committee can also recommend blocks or takedowns.
When asked about the impact of having two authorities to block content, Rahul Matthan, partner at Trilegal, said: “Having two (separate) authorities to regulate and having powers to block are unlikely to bring them in conflict with each other. Section 69A has been drafted in specific terms. Essentially you can give 10 people the power to block content as long as the reason is clear.”
The first tier of the regulatory mechanism is grievance redressal by the company itself; and the second level involves a Press Council of India-like regulatory body that will be headed by a retired judge of a high court or the Supreme Court.
The ministries of information and broadcasting and of electronics and information technology have been working to come up with a comprehensive framework to regulate content and intermediaries. OTT platforms such as Amazon Prime have come under fire for airing content that some claimed “hurt religious sentiments”. Social media companies such as WhatsApp and Twitter, meanwhile, have sparred with the state — over an update in usage policy in the case of the former, and non-compliance with takedown orders in the case of the latter.
According to the rules, a significant social media intermediary will be defined on the basis of its number of users, a threshold the government is yet to set. The rules also require intermediaries to appoint a chief compliance officer, who will be responsible for ensuring compliance with the law and will be held liable if the intermediary fails to observe due diligence while discharging its duties.
Another provision in the rules requires social media intermediaries that operate primarily in messaging to enable identification of the first originator of a message — an important requirement in the effort to tackle fake news.
The issue is a point of contention for social media companies, but according to Matthan, they should be able to identify the first originator using metadata related to the message. “That will technically not violate end-to-end encryption. Theoretically, it is possible. Telcos at present are obliged to do this for text messages. This essentially concerns traceability.”
The rules also say that competent authorities, through an order, may demand pertinent information for the purposes of prevention, detection, investigation, prosecution or punishment of crimes. The rules, however, exempt intermediaries from having to disclose the content of personal messages. The rules also mandate the creation of a grievance redressal portal as the central repository for receiving and processing all grievances. They require intermediaries to act on certain kinds of violations within 24 hours, and on all concerns of a complainant within 15 days.
The inter-ministerial committee that is at the apex of the regulatory framework will have representatives from the ministries of IT, information and broadcasting, home, law, external affairs, defence, and women and child development.
Matthan cautioned that the velocity and volume of complaints that the government may receive are likely to be much higher than expected.
“Scale is going to be a challenge,” he said. “There are going to be thousands of complaints. Moreover, take for instance copyright violations; one also has to understand the nuances of the law. An oversight board is not a bad move. For example, whether to take down [former US President Donald] Trump’s tweet should not have been a decision left to Twitter.”