In dozens of countries around the world, lawmakers are putting forward new proposals to regulate online content, from incitement to violence to disinformation. On 26 January 2021, GNI hosted a learning forum moderated by GNI’s Independent Board Chair David Kaye, with Tom Malinowski (U.S. Congressman, Democrat, New Jersey), Shazia Marri (Member of the National Assembly, Pakistan Peoples Party Parliamentarians, Pakistan), Alexandra Geese (Member of the European Parliament, Group of the Greens/European Free Alliance, Germany), and Agustina Del Campo (Director of the Center for Studies on Freedom of Expression and Access to Information at Palermo University, Argentina). This event built on GNI’s recent policy brief on Content Regulation and Human Rights, which offered recommendations for how to put human rights at the center of content regulation. 

Throughout the discussion, panelists:

  • shared insights about how efforts to regulate content in different jurisdictions affect freedom of expression. Topics included the proposed Protecting Americans from Dangerous Algorithms Act, Pakistan’s Prevention of Electronic Crimes Act, the EU’s draft Digital Services Act (DSA), and trends in content regulation across Latin America;
  • discussed the need for greater transparency around, and increased understanding of, how algorithms are used to recommend and moderate content, in order to effectively evaluate how laws can curb the spread of harmful or illegal content; and
  • reiterated the importance of understanding local context and the actions of national companies, in order to create fit-for-purpose legislative solutions.
A recording of the GNI Learning Forum on Human Rights and Content Regulation is available online.

Tom Malinowski used the January 6th Capitol riots to illustrate how digital echo chambers, which result in part from the use of algorithms, can lead to physical violence. He also emphasized his belief that it is not the proper role of government to encourage censorship by online platforms, in the U.S. or elsewhere. To address this issue, his proposed Protecting Americans from Dangerous Algorithms Act focuses specifically on instances where a company’s algorithm is found to have actively promoted content that leads to severe real-world harm, and would remove companies’ liability protections under Section 230 of the Communications Decency Act in such scenarios.

Shazia Marri shared an overview of content regulation debates in Pakistan, including discussions about the 2016 Prevention of Electronic Crimes Act and recently introduced rules to address unlawful online content. Throughout, she noted that unclear legislative and regulatory processes and insufficient attention to fundamental rights have resulted in significant website blocking and content removal by the Pakistan Telecommunication Authority, without corresponding transparency or access to remedy. She emphasized how essential it is to hold meaningful consultations with stakeholders while developing legislation and regulation, to ensure that they do not violate fundamental rights. Relatedly, she noted the importance of addressing content moderation in local languages.

Alexandra Geese expressed her view that companies’ business models are part of the underlying problem behind the dissemination of harmful online content. She described the existing safeguards for freedom of expression in Europe, including liability protections for companies that are “mere conduits” for the transmission of information, a ban on general monitoring requirements, and a notice-and-action system that provides safe harbor to companies that remove illegal content once they become aware of it. She also discussed new measures in the proposed DSA, noting the importance of comprehensive company transparency reports, both to inform the public and regulators and to allow academics and researchers to access raw data and algorithms to verify compliance with regulatory measures.

Agustina Del Campo acknowledged how difficult it is to speak about Latin America as a whole, given the heterogeneity of the region and the lack of a unified legislative body to look to for guidance. One regional trend she identified is the absence of clear intermediary liability frameworks. Some frameworks from the EU and the U.S. have been imported through trade agreements, with a particular focus on copyright, but governments are now starting to address some of the same regulatory questions discussed in other jurisdictions. The principles of the inter-American system provide important context and are permeating national conversations, but policymakers are also looking to Europe and the U.S. to fill the gaps. She pointed to glimmers of hope, such as the DSA conversation’s open process and emphasis on human rights, as well as areas of challenge, such as a proposed “fake news” law in Brazil.

As legislation is proposed in all regions of the world and companies face increased government pressure, with potential impacts on users’ freedom of expression, GNI sees an opportunity for a multistakeholder approach grounded in international human rights law. It was our pleasure to convene such an excellent panel to discuss these issues, and we look forward to continuing to engage with governmental and non-governmental stakeholders around the world on how to ensure that content regulation protects and enhances human rights.


These presentations only reflect the personal views of the panelists and do not represent GNI’s positions.