On July 14th, as part of our learning series on “Emerging Issues,” the Global Network Initiative (GNI) held its fourth call on the topic of account deactivation and content removal by platforms that host user-generated content. GNI members and external participants from industry, civil society, the investor community, and other sectors took part in the discussion.
During the call, participants offered feedback on a draft report on principles and practices related to account deactivation and content removal. The report was developed by the Center for Democracy & Technology (CDT) and the Berkman Center in collaboration with other GNI participants.
Overview of the Issue
Whether intentionally or not, private entities have assumed a primary role in policing the ‘networked public sphere’: the online channels through which citizens increasingly exercise their rights to free expression and association. Platform operators often face challenging decisions about content removal and account deactivation, which, even when valid or reasonable, may have costly implications for the human rights of users, especially activists.
Notes from our last call in this series describe the issue in more detail and can be found here.
GNI Executive Director Susan Morgan began the call with a description of our work to date on the issue. Jillian York then briefly highlighted three key issues that she has encountered in her research and that are addressed in the draft report: the need for clear Terms of Service, the need for mechanisms to prevent the abuse of the community reporting tools that are used to inform decisions to take down content, and the need for transparent appeals processes.
Caroline Nolan of the Berkman Center and Erica Newland of CDT then introduced the draft report on principles and practices related to account deactivation and content removal, which offers recommendations for both companies and users engaged in the creation and policing of user-generated content. The report is an outgrowth of the GNI’s Emerging Issues learning series and builds on the work and input of participants in the series. It also complements a number of international efforts focused on guiding principles and rights-sensitive practices for companies, including the Protect, Respect, and Remedy framework created by John Ruggie, the United Nations Special Representative of the Secretary-General on transnational corporations and other business enterprises, and the OECD’s “Guidelines for Multinational Enterprises — Recommendations for responsible business conduct in a global context.”
During the open discussion, participants offered a number of suggestions for consideration and future research. These suggestions touched on a wide range of issues, including: scalability (both up and down); concerns that translation efforts might create new jurisdictional hooks for platforms; the implications of “real name” policies for activists; how to expand and improve user education programs; the benefits and risks of transparency around the number of takedown requests; and the legal constraints in different countries that may limit a platform’s ability to engage in escalation processes. These recommendations will be incorporated into the final draft of the document.