GNI Submission to European Commission Consultation on the Digital Services Act


April 1, 2021  |  News, Policy

GNI shared the submission below with the European Commission on 31 March, responding to the open feedback period on the Commission’s Draft for the Digital Services Act. 

The Global Network Initiative (GNI) welcomes the European Union’s efforts on the Digital Services Act (DSA). We were pleased to see several recommendations from GNI’s prior submission addressed in the Commission’s draft. This submission draws from our recent analysis of two dozen content regulation initiatives from around the world and the resulting recommendations on how to address digital harms while upholding human rights principles. We stand ready to continue engaging constructively.

We welcome the distinctions among categories of intermediaries and the exemption of micro and small enterprises. However, the DSA should be refined to ensure that obligations align appropriately with different intermediaries’ ability to adequately address underlying risks. For instance, the notice-and-action regime applies equally to search engines, which index and display entire web pages, and cloud services, which do not have visibility of or control over content. This may create unnecessary burdens and impacts on freedom of information.

GNI also welcomes: the stated ambition to remove only illegal content; the inclusion of transparency, notice, counter-notice, and grievance requirements for all removals; the retention of core principles of the e-Commerce Directive (2000); and the creation of a “safe harbor” for content moderation. However, Article 14(3) imputes “actual knowledge” of illegal content for purposes of liability when any properly formed notification is made, creating perverse incentives likely to lead to disproportionate over-removal of content. As we have stated, “actual knowledge” that content is illegal should be imputed only when an order indicating such illegality is received from a duly authorized, independent authority, preferably a judicial order.

While we are pleased to see the clear, rights-preserving criteria applied to government demands in Articles 8 and 9, we are concerned that Article 21 would require online platforms to turn over user information upon mere “suspicion” of a serious criminal offense. Deputizing private entities with such a quasi-law enforcement function creates significant concerns and should be avoided. Furthermore, while the language in Article 8(2)(b) requiring that orders to act against illegal content not exceed the territorial scope “strictly necessary to achieve its objective” is helpful, further clarification and guidance are necessary to ensure such orders avoid creating conflicts of law.

We also express concern at the lack of checks and balances in the proposed governance structure. Certain tasks, such as the designation and regulation of trusted flaggers and out-of-court settlement bodies, may be better managed by the European Board for Digital Services (EBDS). Currently, the draft does not provide the EBDS with sufficient legal personality, staff, or resources to ensure the level of independence necessary to carry out oversight of such sensitive issues as electoral integrity and questions of speech.

The DSA’s extension of personal liability to the “legal representatives” required to be designated under Article 11 is unnecessary and risks further incentivizing overly aggressive surveillance and policing of users. Furthermore, it sets a troubling precedent at a time when non-democratic governments are inserting “hostage provisions” into their content regulations in order to increase their leverage over intermediaries.

Finally, Articles 26 and 27 on the identification and mitigation of risks by very large online platforms (VLOPs) are overly focused on the identification and removal of illegal content. This may incentivize overly aggressive, technology-enabled monitoring and policing of users, in tension with the prohibition against general monitoring obligations. These articles should be revised to clarify that the preservation of fundamental rights is the primary objective of such assessments, and to provide guidance on how VLOPs should ensure that their services, including their efforts to detect and address illegal content or conduct, do not impinge upon these rights.
