PDF Submission

The Indian Ministry of Electronics and Information Technology (MeitY) recently issued a call for comment on proposed amendments to intermediary guidelines under the Information Technology Act.

While we appreciate MeitY openly consulting with affected stakeholders, GNI members agree that the proposed amendments could place significant pressure on a range of information and technology companies to monitor users’ activities, remove content, and hand over data in ways that could unnecessarily and inappropriately impact users’ freedom of expression and privacy.

At a minimum, amendments should:
(i) ensure that key provisions, such as the definitions of illegal content and appropriate authorities, are refined and clarified;
(ii) allow for appropriate company review of and, where appropriate, legal challenges to content removal or user-data request orders;
(iii) eliminate, or significantly limit, situations where companies will be ordered, expected or encouraged to implement “proactive measures”; and
(iv) revise and clarify provisions under which companies will be expected to designate legal entities for 24/7 coordination with local enforcement agencies.

Since 2012, GNI has worked on intermediary liability issues in India, including commissioning a report that demonstrated how greater legal certainty for online intermediaries could help unleash the economic potential of the ICT sector.

Full Submission

The Global Network Initiative (GNI) welcomes the opportunity to provide input to the Indian Ministry of Electronics and Information Technology (MeitY) on the draft amendments to the Information Technology (Intermediaries Guidelines) Rules. We appreciate that MeitY is consulting openly with affected companies, civil society, and other experts. GNI is concerned that the amendments, as drafted, would place significant pressure on a wide range of information and communications technology (ICT) companies to monitor users’ activities, remove content, and hand over data in ways that could unnecessarily and inappropriately impact users’ freedom of expression and privacy. Given the potential significance of the concerns articulated below, which are shared across GNI’s wide membership of leading experts from civil society organizations, academia, ICT companies, and the investor community, we encourage MeitY to reconsider these amendments.

About GNI

GNI is the world’s preeminent multistakeholder collaboration in support of freedom of expression and privacy online. GNI’s members include leading academics, civil society organizations, ICT companies, and investors from across the world. All GNI members subscribe to and support the GNI Principles on Freedom of Expression and Privacy (“the Principles”), which are drawn from widely adopted international human rights instruments. The Principles, together with our corresponding Implementation Guidelines, create a set of expectations and recommendations for how companies should respond to government requests that could affect the freedom of expression and privacy rights of their users. The efforts of our member companies to implement these standards are assessed by our multistakeholder Board every other year.

GNI encourages governments to be specific, transparent and consistent in the demands, laws and regulations that impact freedom of expression or the right to privacy, including restrictions of access to content, restrictions of communications, and demands that are issued regarding privacy in communications.

GNI’s Work on Intermediary Liability in India

GNI members have been investing in, researching, engaging in, and contributing to the ICT sector in India since 2012. In March 2012, GNI co-organized a multistakeholder roundtable with the Centre for Internet & Society called “India Explores the Balance Points between Freedom of Expression, Privacy, National Security and Law Enforcement.” This event brought together representatives from government, industry, civil society, and academia and provided important insights that were captured in the subsequent report, “Digital Freedoms in International Law: Practical Steps to Protect Human Rights Online.” GNI has also previously submitted comments to the Law Commission of India’s Consultation on Media Law in August 2014.

At the behest of our membership, GNI commissioned a report, published in 2014, “Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose,” which found that online platforms supporting user-generated content could become an important part of India’s Internet economy, contributing approximately INR 2.49 lakh crore (USD 41 billion) by 2015—in addition to the contribution of other elements of the Internet economy. Additionally, the positive productivity effects of online intermediaries were found to be significant, creating an even greater impact in India in areas like e-sales and e-procurement compared to their impact in Europe or the United States. The report highlighted the cases of local companies that had suffered due to uncertainty related to legal liability in India.

A year after that report was published, it was cited in briefings in the Shreya Singhal v Union of India (2015 SCC 248) litigation, which resulted in a landmark decision by the Supreme Court of India clarifying intermediary liability under Section 79 of the IT Act. GNI appreciates that the proposed amendments to the Intermediary Guidelines may be intended, in part, to codify and clarify the implications of that ruling. However, we are concerned that the proposed amendments are so vague and potentially broad in several places that they may in fact have the opposite effect.

Arbitrary Time-Periods

The proposed amendments to Rule 3(5) of the Guidelines introduce a new 72-hour time period for providing “information or assistance” in response to requests from “any government agency,” and the newly proposed Rule 3(8) allows the “appropriate Government or its agency” to issue removal orders to companies requiring them to remove content deemed illegal under the proposed regulation within 24 hours of receipt of the order. According to the GNI Principles, members are expected to “interpret government restrictions and demands, as well as governmental authority’s jurisdiction, so as to minimize the negative effects on freedom of expression.” These arbitrary and rapid timelines will create significant challenges for appropriate review of removal orders. In addition, the potentially significant legal penalties for noncompliance will put increased pressure on companies to comply with these orders.

While we appreciate the Indian government’s interest in ensuring prompt action in response to legal orders, we would note that most large platforms already act expeditiously in response to clear orders appropriately issued from duly empowered government authorities. There are nevertheless instances when such orders may be incomplete, issued inappropriately, or are overly broad. It is important that companies are allowed to review orders and seek clarity, where appropriate, in order to avoid unnecessarily impacting user rights. This is especially important considering that, if content is removed or user data improperly shared, it may take a substantial amount of time and effort for appropriate redress to take place, if it can take place at all.

Automated Proactive Content Filtering

Rule 3(9) of the Draft Rules, by requiring intermediaries to actively monitor and filter content, transforms them from neutral providers of access to services into censoring bodies. Intermediaries are likely to err on the side of over-censoring the content shared on their platforms in order to comply with this rule. This over-censoring in fear of repercussions under the IT Act will lead to a chilling effect on the freedom of speech and expression of the users in India, who will face a contraction in their ability to share views and content online.

In particular, we are concerned about the language in Rule 3(9) that requires intermediaries to deploy “technology-based automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content.” Broad applications of automation should be carefully weighed against the risks such tools pose to freedom of expression. As GNI civil society member Center for Democracy and Technology (CDT) pointed out in a recent publication, companies and policy makers should recognize the limitations of such technological tools in deciphering nuance and context of text-based human communication.

GNI does not believe that governments should mandate the use of filters or other automated content evaluation tools in laws regulating speech. If companies decide to use automation to facilitate content moderation, they should do so in a transparent, accountable manner, while maintaining an appropriate degree of human review. The process of deciding what content is addressed using automated tools, which tools are used and how, and the extent and scope of human review, should be carefully thought through in an open, transparent, participatory manner involving relevant stakeholders, so as to minimize potential human rights impacts.

Definitional Challenges

In its amended form, the Guidelines provide very limited definitional clarity as to which government agencies are appropriately empowered to exercise the various authorities related to user-data requests and content removal. There is also little clarity as to what content might qualify for removal according to clauses (a) through (k) under Rule 3(2). Furthermore, we are concerned that some items on the list of prohibited content may fall outside of Article 19(2) of the Constitution, raising questions about the extent to which the amended Guidelines conform to the requirements in the Supreme Court’s Shreya Singhal decision.

In addition, Rule 3(8) requires intermediaries to remove or disable access to unlawful content as required by court order or by the appropriate Government or its agency. However, this provision provides no checks and balances to ensure that this power is used sparingly and in a just manner. The provision also mandates storage of such information and associated records for a minimum of 180 days and even authorizes this period to be extended. Yet the provision does not provide sufficient safeguards to ensure that the power to extend retention of data is used by government agencies in a fair, transparent, and sparing manner. For all of these reasons, Rule 3(8) may fail the constitutional requirement of due process, and should be deleted from the Draft Rules.

These definitional issues are likely to lead to legal uncertainty, as well as potentially overly aggressive interpretations by companies that could result in the removal of content, infringing on users’ freedom of expression. Furthermore, the proposed amendments to Rule 3(5) requiring intermediaries to “enable tracing out of such originator of information on its platform as may be required by government agencies” create a vague and potentially broad new obligation that could have significant impacts on user privacy. The tracing of originators without sufficient limitations and safeguards would constitute a violation of users’ right to privacy, and will affect the way that people use the Internet in India. It is also important for MeitY to evaluate the technical limitations of implementing and enforcing such an obligation on intermediaries.

Incorporation Requirement

Clauses (i) and (ii) of Rule 3(7) impose stringent requirements on companies with more than 50 lakh users to incorporate locally and maintain a permanent registered office. Additionally, companies are required to appoint legal points of contact and alternates “for 24×7 coordination with law enforcement agencies and officers to ensure compliance to their orders/requisitions made in accordance with provisions of law or rules.” This constitutes a highly onerous obligation on international companies that provide services globally but do not find it feasible to incorporate in every country of operation. It would also affect Internet users’ online experience by limiting the online services available in India.

The lack of clarity as to how MeitY will determine the number of Indian users of any given company, as well as the possibility that the Government of India could arbitrarily add companies to this list, poses particular challenges for small and medium-sized enterprises, which may not have the resources to establish a permanent office in India, or may lack the infrastructure to handle 24/7 requests and properly assess related human rights impacts. These aspects of the amendments may discourage such companies from pursuing business opportunities in India because of the cost of compliance with the Guidelines. These requirements are likely to lead to further balkanization of the Internet and have an adverse impact on the economic potential of, as well as the digital integration in, India.


Conclusion

As noted above, the proposed amendments raise significant issues that must be addressed before they are enacted into law. At a minimum, amendments should: (i) ensure that key provisions, such as the definitions of illegal content and appropriate authorities, are refined and clarified; (ii) allow for appropriate company review of and, where appropriate, legal challenges to content removal or user-data request orders; (iii) eliminate, or significantly limit, situations where companies will be ordered, expected, or encouraged to implement “proactive measures”; and (iv) revise and clarify provisions under which companies will be expected to designate legal entities for 24/7 coordination with local enforcement agencies.

GNI recognizes the importance of taking measures to prevent the dissemination of illegal content online and stands ready to continue engaging with relevant actors, including MeitY, to ensure that our collective efforts to address this challenge remain effective, efficient, and consistent with applicable human rights principles.