“Intermediary liability” describes the allocation of legal responsibility to intermediaries of all kinds, such as the platforms and services that host or transmit content created by their users, for regulated content categories. Protections against liability for user-generated content on information and communication technology services help guard against overbroad restrictions on expression and foster innovation in the design and delivery of mechanisms for digital communication.
GNI uses the term “Content Regulation” to refer to government measures that claim to address various forms of online harm related to user-generated content. These are often articulated as efforts to enhance “safety” or address “harms” and can focus on a range of issues, including disinformation, extremist content, and hate speech.
GNI seeks to preserve and expand freedom of expression protections throughout the tech value chain. Since 2009, GNI has issued numerous statements highlighting the importance of intermediary protections, participated in open government consultations, and raised concerns about problematic laws, policies, and actions in a number of countries and contexts.
Building on this work, in 2020 GNI released “Content Regulation and Human Rights: Analysis and Recommendations,” a policy brief analysing more than 20 governmental initiatives that claim to address various forms of digital harm. The brief uses the enduring framework of international human rights law to offer practical guidance for governments and other stakeholders on formulating and implementing content regulations that are effective and fit for purpose, and that enhance and protect the rights to freedom of expression and privacy.
Read our latest work on intermediary liability and content regulation below.