Addressing Digital Harms and Protecting Human Rights — GNI Shares Recommendations for Policymakers

October 13, 2020  |  Policy

Washington, D.C./Amsterdam — Today the Global Network Initiative (GNI) issued a policy brief that applies a human rights framework to analyze more than 20 recent governmental initiatives that claim to address various forms of digital harm. Informed by this analysis, the brief offers practical guidance for governments and other stakeholders on how to formulate and implement content regulations that are effective, fit-for-purpose, and enhance and protect the rights to freedom of expression and privacy.

“Content Regulation and Human Rights: Analysis and Recommendations” is the result of months of multistakeholder analysis by GNI’s diverse, expert membership — which includes leading civil society organizations, tech and telecommunications companies, investors, and academics — as well as six virtual consultations with government actors and other key stakeholders in Africa, the EU, India, Pakistan, and the UK.

“There is growing consensus on the need to address legitimate concerns about digital content and conduct. It is critical, however, that governments are deliberative and flexible in their approaches; otherwise, we risk enabling more restrictive models of content regulation,” said David Kaye, former UN Special Rapporteur on Freedom of Expression and GNI’s new Independent Board Chair. “This carefully thought-out and constructive paper is a must-read for government officials and other actors working to address these issues in a rights-respecting manner.”

The brief recognizes that content regulation can help improve the digital ecosystem by providing clarity, predictability, transparency, and accountability around government and company efforts to address digital harms. In order to achieve these legitimate objectives while avoiding unintended consequences, the brief recommends that content regulation processes be open, inclusive, deliberative, and evidence-based. It also underscores that resulting definitions and obligations should be clear and targeted at the services that face the most risk and are best positioned to address the underlying concerns. Finally, the brief identifies strong transparency, remedy, and oversight measures, together with the preservation of intermediary safe harbors for user-generated content, as important characteristics of effective content regulation.

“As this brief notes, there are no off-the-shelf solutions to complex regulatory issues, and content regulation is no exception. We are eager to continue working with policymakers to promote risk-based and tailored responses to user-generated content that reflect these distinctions,” said Nicole Karlebach, Global Head of Business and Human Rights at Verizon/Verizon Media.

The brief also identifies some of the ways that well-intentioned regulation can actually hinder progress toward these goals. These include vague or broad definitions of the content and/or companies covered by the scope of regulations; deputizing private companies to act as judge, jury, and executioner of the legality of user content and conduct; overreliance on automated moderation tools; and potential privacy infringements, including by prohibiting or undermining encryption.

Individually or in combination, these characteristics can incentivize over-removal on the part of companies and self-censorship from users. In other words, where content regulations are not appropriately tailored, they risk not only failing to root out legitimately harmful content, but also imperiling the healthy and innovative aspects of our digital ecosystem.

“Whether it is citizen journalists, political opposition, or marginalized populations, there are countless examples of the most vulnerable facing the brunt of excessive government pressures to remove content, and the corresponding shift to automation,” notes Emma Llansó, Director of the Free Expression Project at the Center for Democracy and Technology. “It is critical that this wave of content regulation, responding to real challenges like disinformation and online extremism, does not become another means for repressive ends.”

GNI is grateful for the opportunities to work constructively with a wide range of governments and other stakeholders to ensure that content regulations are effective in both addressing legitimate harms and protecting and enhancing users’ rights. We hope this brief, informed by a globally diverse set of perspectives, offers a tool for ongoing dialogue to keep human rights at the forefront of our collective efforts to reduce harms in the ICT sector.

IN THIS REPORT
• Executive Summary
• Recommendations
• Index of Laws and Legislative Proposals

MORE FROM GNI ON CONTENT REGULATION
• Blog Series (Medium)
• GNI Virtual Consultation Event Reports

GNI CRP BRIEF, PDF VERSIONS:
中文
English
Español
Français
Kiswahili
Melayu
ਪੰਜਾਬੀ
Português
