This is the first post of the GNI blog series, “Content Regulation and Human Rights.” Follow GNI on Medium to read more from members and close collaborators on practical guidance to those seeking to address online harms while upholding human rights.  

I. Introduction

Principles of good governance and human rights indicate that governments should seek to understand and address public and private harms within their jurisdiction. Since policymakers and regulators around the world are increasingly concerned about various forms of online content and conduct, it is no surprise that many are considering how different forms of state action may help or hinder efforts to address those concerns.

The multistakeholder Global Network Initiative (GNI) recently reviewed over a dozen new governmental initiatives that claim to address various forms of online harm related to user-generated content — a practice we refer to broadly as “content regulation.” We focused on proposals that could shift existing responsibilities and incentives related to user-generated content. GNI produced a draft policy brief on content regulation summarizing this work and has been engaging with a broad range of stakeholders in diverse jurisdictions on its findings.

This is the inaugural post in a series of blogs that will build on GNI’s initial work and consultations on content regulation. Through this series, GNI members and close collaborators will examine emerging approaches to content regulation in order to provide practical guidance to those seeking to regulate content while upholding human rights. We hope these contributions will help inform advocacy and debate around content regulation in a range of contexts across the world. They will also feed back into the finalized policy brief, which we will publish later this year.

II. The Rights Foundation

GNI’s initial analysis of global content regulation efforts illustrates how good governance and human rights principles provide time-tested guidance on how laws, regulations, and policy actions can be designed and carried out appropriately and effectively. These principles can help lawmakers find creative and appropriate ways to engage stakeholders, design fit-for-purpose efforts, and mitigate unintended consequences.

Governments that actively place human rights at the forefront of their deliberations and policy designs are not only less likely to infringe on their own human rights commitments, but can also achieve more informed and effective content regulation outcomes: balancing public and private responsibilities, designing appropriate incentives, enhancing trust, and fostering innovation.

Because content regulation is primarily focused on, and most likely to impact, digital communication and conduct, we use international human rights principles related to freedom of expression and privacy as our primary lens. These principles of legality, legitimacy, and necessity help reveal flaws in the foundation of many governmental efforts. For instance, prioritizing speed over proper deliberation and consultation often contributes to imprecise definitions, regulatory uncertainty, and distrust. Those cracks can then splinter outward into the design and execution of resulting laws, spawning unintended consequences that can hinder or even undermine legitimate policy objectives. In many instances, actions short of lawmaking or regulation may suffice, so it is important to resist acting for the sake of mere expediency or political gain.

In considering new and updated regulatory approaches, policymakers should recognize that:

  • Consensus is achievable. Many actors agree on the need to address legitimate public policy concerns around harmful content and conduct online while respecting human rights. Processes for legislative deliberation should be open and non-adversarial, drawing on broad expertise to ensure results are evidence-based. In addition, any non-elected regulatory or oversight bodies should prioritize transparency and consultation with diverse constituencies.
  • The technology sector is evolving. Services that facilitate the sharing of user-generated content differ in important ways, and the ICT sector more broadly features an ecosystem of interrelated components upon which multiple industries, initiatives, and possibilities depend. This complexity counsels careful consideration of what state actions are most appropriate and narrowly tailored to address which specific challenges. Lawmakers must be clear about the priorities that inform their efforts and open to diverse approaches to achieving them.
  • Flexibility and humility are needed from all actors. While governments can and should learn from each other, they should also recognize that there are no off-the-shelf solutions. Governments need to consider appropriate, proportionate state actions for their own jurisdictions. Furthermore, policymakers must account for the global nature of technology and the rapid pace of technological change.

III. Roles and Responsibilities

It is clear that ICT companies have a key role to play in addressing online harms. However, lawmakers should resist the temptation to shift legal liability from those generating illegal content to ICT intermediaries. Not only can this misalign companies’ incentives, encouraging invasive monitoring and over-removal of content, but it also does little to address the underlying drivers of harmful conduct. Similarly, while private companies can and should respect principles of transparency, due process, and non-discrimination, delegating traditional state functions, such as the adjudication of illegal content, to companies under arbitrarily tight deadlines and the threat of significant penalties is unlikely to foster such worthy goals.

Where such delegations do occur, it is essential that policymakers provide clear, sufficient guidance on what content is prohibited, set appropriate expectations for responsible company action, and ensure appropriate notice, due process, and remedy are available to mitigate any infringements on user rights.

When the decision is made to regulate, governments should build strong transparency, remedy, and accountability measures into their efforts. Such measures allow policymakers and other relevant stakeholders to proactively understand and measure whether content regulations are working as intended, including by assessing the activities of any non-elected oversight or enforcement body. Where experience demonstrates that regulations are not working as intended, governments must acknowledge the shortcomings and rectify them expeditiously.

IV. Conclusion

For over a decade, the Global Network Initiative has brought together leading companies, civil society organizations, academics, and investors from around the world in a shared endeavor to enhance freedom of expression and privacy across the ICT sector. Together, our members have built trust and shared expertise with each other in order to help protect rights in the digital age. We have also worked constructively with a wide range of governments to ensure that well-intentioned efforts are fit-for-purpose and proportionate.

The analysis and recommendations in this blog series and our forthcoming policy brief draw from this extensive experience and expertise. We look forward to continuing to engage with lawmakers and other stakeholders to keep human rights at the forefront of our collective efforts to reduce harms in the ICT sector.