On 28 May, the Global Network Initiative brought together David Kaye, the UN Special Rapporteur on the promotion and protection of freedom of opinion and expression, Council of Europe Commissioner for Human Rights Dunja Mijatović, and a set of experts from GNI’s multistakeholder membership to explore a human rights-based approach to content regulation in the context of the EU Digital Services Act (DSA). The event drew nearly 150 virtual participants, including representatives from EU institutions and member states, civil society, companies, academics, and experts from intergovernmental organizations, and was followed the next week by a closed-door roundtable on the DSA with EU policymakers.
The DSA, part of the European Commission’s vision for “Europe’s digital future,” will update the existing European legal framework governing intermediaries’ responsibilities for user content and conduct. Parliamentary committees have issued initial positions and the Commission is beginning formal consultations, with a proposal likely to be finalized by early 2021. The resulting changes could have significant impacts on global freedom of expression and privacy.
Kaye opened the discussion by framing the DSA deliberations within the context of recent content regulations in Europe. Robust debates among policymakers around harmful online content are driven by legitimate concerns about misuses of technology and lack of transparency from major platforms, but have insufficiently prioritized human rights, democratic principles, and rule of law. As an example, Kaye cited the legislative response to a spike in vitriol and discriminatory online speech toward the growing refugee population in Germany, which led to passage of the Network Enforcement Act in June 2017, despite the fact that the European Commission had established a Code of Conduct on Countering Illegal Hate Speech with major technology companies only a year earlier.
Mijatović walked through some of the disparate impacts of digital transformation that inform the DSA debate. Technology has been valuable for healthcare, education, and employment, but abuses can also contribute to extensive personal data collection or threaten diverse and pluralistic political information environments. Furthermore, the pandemic response has accelerated digital rights infringements in the name of public health. Mijatović shared practical recommendations for mitigating the harms from deploying artificial intelligence in particular, and noted that multistakeholder approaches like GNI are important for the way forward.
To open the “fireside chat,” Kaye and Mijatović discussed how the principles of necessity and proportionality under international human rights law apply to content regulation. Kaye explained the three-part test for restrictions on expression: they should be enshrined in law, limiting excessive discretion by executive authorities and offering clarity for users and companies (legality); they should serve legitimate purposes such as public health or the rights of others (legitimacy); and they should be fit-for-purpose and proportionate (necessity). Mijatović stressed that regulations must focus on the least intrusive means for addressing the illegal content at issue, as verified by independent oversight. Furthermore, any restrictions that are implemented must be lifted when no longer necessary, and she noted particular concerns around measures implemented during emergencies, when citizens may be less aware and more accepting.
Kaye and Mijatović then explored the pros and cons of a single, overarching content regulation framework versus regulation that addresses particular harmful content. According to Kaye, some content-specific approaches have placed too great a burden on companies to adjudicate and interpret legality, which is particularly challenging for smaller companies and can incentivize excessive content removal. The lack of meaningful transparency and disclosure requirements in these initiatives represents a missed opportunity, he noted.
Mijatović commended government efforts to prevent and combat hate speech, noting the European Commission against Racism and Intolerance recommendation in this regard, but shared concerns about the risks of approaches outlined to varying degrees in the Avia Law, NetzDG, and proposed UK Online Harms White Paper, which use broad, hard-to-interpret definitions of illegal content combined with a shift from conditional liability for intermediaries to direct responsibility for dissemination. Like the GDPR before it, the DSA is likely to have global implications, Kaye noted, so stakeholders should share examples where similar restrictions have been used to target critics, dissenters, and journalists.
Agustina Del Campo, Director of the Center for Studies on Freedom of Expression and Access to Information (CELE) at the University of Palermo, opened a discussion between GNI members by noting that bills modeled after European content regulations have flourished in Latin America. As the EU promotes the DSA framework globally, Del Campo hopes to see clarity between content-specific approaches and the existing safe harbor protections and prohibition on general monitoring in the eCommerce Directive; clear distinctions between legal and illegal content; and strict necessity and proportionality analysis. Del Campo also cautioned policymakers on the risks of COVID-19 responses usurping the DSA process.
Francois-Xavier Dussart, Senior Director of EU Public Policy at Verizon Media, shared the perspective of a company that offers a broad array of services: producing and sharing content, a search engine, and communications tools. As a founding GNI member, Verizon Media seeks to examine the human rights implications of legislation, and finds that laws applying social-media-focused regulations universally across these products, such as the Avia Law, can be disproportionate. For search engines, when companies are required to remove an illegal phrase, it is nearly impossible to do so without de-indexing legal content as well. Dussart hopes for balanced approaches that can address the legitimate concerns while also reflecting the nuances of different services.
Jens-Henrik Jeppesen, Director of European Affairs at the Center for Democracy and Technology, opened with an overview of more recent initiatives that also frame the DSA discussion: the Audiovisual Media Services Directive, the draft regulation on preventing the dissemination of terrorist content, and the Platform to Business Regulation. Still, Jeppesen noted, policymakers tend to feel a more holistic and comprehensive reform involving the eCommerce Directive is needed, with a strong focus on addressing market dynamics as well. He shared CDT’s nine recommendations for EU policymakers, reiterating earlier points about the importance of meeting the legality, necessity, and proportionality tests, and the need to retain the eCommerce Directive provisions on liability and general monitoring. He also discouraged delegating authority for state interventions on online speech to a regulator for certain forms of content, and encouraged policymakers to consider the potential cross-border impacts of global content restrictions, a point GNI previously commented on.
The session closed with a question-and-answer session with audience members, where panelists explored the suitability of social media councils for governing harmful content in Europe; what model transparency looks like; and avoiding the risk of abuse of notice-and-action procedures.
Across these topics, panelists stressed that it is not always necessary to reinvent the wheel. Social media councils, based on existing press council models, could be one part of a larger regulatory effort and help address difficult edge cases, according to Kaye. On transparency, particularly related to artificial intelligence, experts at international organizations, including Mijatović and Kaye, have already offered many recommendations. Panelists shared concern that abuse of notice-and-action frameworks needs more attention, and alarm over elements of the EU Copyright Directive that encourage rapid removals of content and even upload filters, which can be ripe for abuse. Ultimately, the balance should never lead to an assumption that notification equals illegality, Jeppesen emphasized.
GNI looks forward to continuing to engage on the DSA and to sharing more about our work on content regulation and human rights more broadly throughout the year. If you want to learn more, please reach out to [email protected].
- “Unboxing Artificial Intelligence: 10 steps to protect Human Rights” from the Council of Europe Commissioner for Human Rights
- “Opinion: Coronavirus concerns are not carte blanche to snoop” by Dunja Mijatović
- Nine principles for future policymaking on intermediary liability from the Center for Democracy and Technology
- David Kaye report on artificial intelligence, human rights, and freedom of expression to the Social, Humanitarian & Cultural Committee at the UN General Assembly
- GNI statement on domestic cases asserting global jurisdiction
Opening remarks by:
- David Kaye, UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
- Dunja Mijatović, Council of Europe Commissioner for Human Rights
Followed by a “fireside chat” with GNI Executive Director Judith Lichtenberg, and a panel discussion with:
- Agustina Del Campo, Director of the Center for Studies on Freedom of Expression and Access to Information at the University of Palermo
- Francois-Xavier Dussart, Senior Director of EU Public Policy at Verizon Media
- Jens-Henrik Jeppesen, Director of European Affairs at the Center for Democracy and Technology