Unduly Worried Over New Information Technology Rules

In a communication dated June 11, three UN Special Rapporteurs raised serious concerns over provisions of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. They claim that these provisions do not meet the standards of the rights to privacy and to freedom of expression under Articles 17 and 19 of the International Covenant on Civil and Political Rights (ICCPR), and that some of the due diligence obligations of intermediaries may infringe upon a "wide range of human rights".

They claim that terms such as "ethnically or racially objectionable", "harmful to child", "impersonates another person", etc. are broad, lack clear definitions, and may lead to arbitrary application. Nothing could be further from the truth. These terms have been well defined and understood in both Indian and international law and jurisprudence. Rule 3(1)(b) of the IT Rules specifies these terms clearly as part of a user agreement that the intermediaries must publish. They are aimed at bringing more transparency to how intermediaries deal with user content and are not violative of the UN's Joint Declaration on Freedom of Expression and "Fake News", Disinformation and Propaganda.

It must also be mentioned that Rule 3(1)(d) allows for removal of unlawful content relating to the sovereignty and integrity of India, security of the state, friendly relations with foreign states, public order, etc. only upon an order by a competent court or by the Appropriate Government. This is as per the due process specified by the Supreme Court in the 2015 Shreya Singhal vs Union of India case. Given the potential for immense harm that can be caused by such unlawful content being freely available online, the time limit of 36 hours for its removal after due process is reasonable. Similarly, the time limit of 72 hours for providing information for investigation in response to lawful requests in writing from government agencies is entirely reasonable. Rule 3(2) also provides for a grievance redressal mechanism to be established by the intermediaries and for resolution of user complaints within 15 days. However, content in the nature of 'revenge porn' must be removed within 24 hours. Again, given the potential for immense personal damage that such acts can cause to the dignity of women and children, this time limit is reasonable.

The liability under Rule 4(1) of the Chief Compliance Officer of a significant social media intermediary is not arbitrary. He or she can be held liable in any proceeding only after due process of law. This has been clearly specified in the rule itself.

The apprehensions about the Rules harming privacy are also misplaced. Rule 4(2) requires significant social media intermediaries to provide only the metadata about the first originator of a viral message that may be required for investigation of a serious crime relating to the sovereignty and integrity of India, public order, rape, child sexual abuse, etc., punishable with a minimum term of five years. This, again, is only after a lawful order is passed by a court or a competent authority, and where there is no other less intrusive means of obtaining such information. There is no provision to ask the intermediary to break any encryption to obtain the contents of the message. In fact, the content is provided by the law enforcement agencies to the intermediary. Lawful investigation of crimes cannot be termed as harmful to privacy. Several countries, such as the US, UK and Australia, have enacted laws that allow for far more intrusive interception of encrypted messages, including their decryption.

The concerns with regard to media freedom are also misplaced. Section 5 of the UN's Joint Declaration on Freedom of Expression and "Fake News" specifically enjoins media outlets to provide for self-regulation at the individual media outlet level and/or at the media sector level. The IT Rules provide for a three-tier system of regulation, in which the government oversight mechanism comes in at the third level only after the first two tiers of self-regulation have failed to produce a resolution. The rules clearly specify the due process for the government oversight mechanism.

India is a vibrant democracy with a long tradition of the rule of law and respect for freedom of expression and privacy. The IT Rules aim to empower users to exercise their right to freedom of expression responsibly and to prevent the misuse of these platforms for unlawful purposes. The selective interpretation of the provisions of the IT Rules by the UN Rapporteurs is, at best, disingenuous.

(The above article appeared in The Economic Times on July 11, 2021 and is available at https://economictimes.indiatimes.com/opinion/et-commentary/unduly-worried-over-new-rules/articleshow/84323812.cms?from=mdr. The views expressed by the author are personal.)

New Code for Digital Media Seeks to Strike a Balance Between Freedom and Responsibility

Countries around the world have grappled with the issue of regulating content hosted by internet intermediaries. As the internet allows anyone the freedom to host content without any moderation, intermediaries were granted protection from liability for third-party content through laws such as section 230 of the Communications Decency Act in the US and the safe harbour provisions in the EU, with certain exceptions for illegal content.

Section 79 of the IT Act in India likewise exempts intermediaries from liability for third-party content, provided they observe certain due diligence. The content could be removed only based on orders from a court or from an authorised government agency, subject to certain conditions laid down by the Supreme Court in the 2015 Shreya Singhal vs Union of India case.

This classical interpretation of the role of intermediaries worked satisfactorily for several years, as the services they provided were predominantly passive in nature. However, the enormous growth of social media during the last decade, with platforms serving hundreds of millions of users, has made the limitations of this framework starkly evident, as it has been unable to check the proliferation of fake news and other illegal and harmful content on these platforms. The proliferation of fake accounts and bots has only aggravated the problem. Several countries, such as Germany, France, Australia and Singapore, have enacted legislation to deal with unlawful and harmful content on these platforms.

The new Intermediary Guidelines and Digital Media Ethics Code must be seen in the context of the need to make these platforms more responsible and accountable. These rules specify certain due diligence and institute a mechanism for redressal of grievances. The due diligence includes informing users about the privacy policy and obtaining their agreement not to host any unlawful or harmful content. The rules envisage removal of content only in three situations: voluntary removal due to violation of the privacy policy or user agreement; removal pursuant to an order by a court or an authorised government agency; or removal based on grievances received.

The rules also specify some additional due diligence to be observed by 'significant social media intermediaries', defined based on the number of registered users in India (currently specified as 50 lakh). This includes the appointment of a Chief Compliance Officer, a nodal contact person, and a Resident Grievance Officer, all of whom should be resident in India. The intermediary should also have a physical contact address in India. The rules further require such intermediaries to provide information about the first originator in India of any unlawful message for the purposes of investigation of specified offences that are punishable with imprisonment of not less than five years. It must be noted that the intermediary is not required to disclose the contents of the message itself.

The Digital Media Ethics Code under these rules creates a largely self-regulatory framework for publishers of online news and current affairs and of online curated content on Over-the-Top (OTT) platforms. The oversight mechanism of the government comes into play only after the redressal mechanism at the first two levels has failed to address the grievance satisfactorily.

It is relevant to note that the exemptions available to intermediaries under section 79 remain intact, provided they observe the due diligence as specified.

Freedom of expression must come with adequate responsibility and accountability. John Stuart Mill, one of the most influential thinkers in classical liberalism, explicitly recognised the 'harm principle' while arguing for placing some limitations on free expression. The new rules seek to strike a fine balance between freedom and responsibility in the online world.

(The above article appeared in The Economic Times on March 19, 2021 and is available at: https://economictimes.indiatimes.com/industry/media/entertainment/media/view-new-code-for-digital-media-seeks-to-strike-a-balance-between-freedom-responsibility/articleshow/81593609.cms. The views of the author are personal.)