This piece was originally published on the GNI Blog. Find GNI’s full submission to the DSA consultation here (PDF).
Later this year, the European Commission is expected to finalize a draft of the Digital Services Act (DSA), a regulation which would update and expand the e-Commerce Directive (2000). The DSA is expected to transform the regulatory environment for tech companies and have wide-reaching impacts beyond the EU. On September 7, GNI submitted its comments to the EU’s open consultation on the DSA. The questionnaire was divided into six sections, and GNI addressed questions in three of the sections most relevant to its policy priorities: 1) Safety and Responsibility; 2) Liability Regime; and 3) Governance and Enforcement.
GNI welcomes the possibility that the DSA could make necessary updates to the Directive, including the effort to establish a clearer and more predictable regulatory environment for intermediaries. In order for the DSA to succeed, the EU must ensure these new rules not only do not violate fundamental rights, but actually incentivize their protection and respect by governments and intermediaries alike. Holding intermediaries accountable for illegal speech cannot come at the cost of users’ rights to freedom of expression and privacy.
GNI’s full submission to the consultation outlines our recommendations to achieve these objectives. This post summarizes some of those key points, and we encourage readers interested in understanding our position to review the full submission. GNI’s forthcoming Content Regulation Policy Brief, which analyzes dozens of government efforts to regulate content and underscores the importance of the principles of necessity, proportionality, legitimacy, and legality, also guided our response. We look forward to continuing to work with EU entities, civil society, and other key stakeholders to ensure that this legislation addresses systemic challenges on the Internet, while upholding and advancing EU member states’ international human rights obligations.
Safety and Responsibility
The consultation sought evidence, data, and experiences with regard to illegal activity online, as well as activities that are not illegal, but may be harmful. In its submission, GNI reiterated that the principles of legality and proportionality require a clear understanding of what is legal and illegal. It is essential that the DSA distinguish between ‘illegal’ and ‘legal but harmful’ content/activity, and not obligate online platforms to remove legal content or otherwise violate free expression rights.
If designed appropriately, a harmonized EU notice-and-action framework can assist platforms in removing both harmful and illegal activity online, while facilitating greater user control, visibility, and access to remedy. In order to do so, it should facilitate third-party notification of content believed to be illegal or in violation of platforms’ terms of service/community standards, provide clear guidance and liability protection to platforms, and facilitate meaningful transparency, oversight, and accountability to help ensure such decisions are made consistently, appropriately, and fairly.
On the topic of appropriate transparency and accountability measures, the DSA should make clear what information must be made public by which types of online platforms, recognizing that different information may be more or less relevant vis-a-vis different services. In line with these precautions, covered online platforms could be required to: (i) make clear what processes and tools they rely on to identify illegal content/activity; (ii) make their terms of service and procedures for identifying possibly infringing content/activity, as well as the reasons for changes thereto, clear and publicly available; and (iii) periodically report on the number of notices of illegal and otherwise improper content/activity received, as well as who those come from, what law or term they were alleged to violate, and what action, if any, was taken.
Transparency and clarity around the use of algorithms for detecting and assessing possibly infringing content, including algorithmic impact assessments, can help ensure that the use of algorithms is rights-respecting and non-discriminatory. As more platforms turn to automated tools for content moderation, the risk of erroneous removal increases, especially for those categories of content that are particularly context-dependent.
As GNI has learned from over a decade of work facilitating non-company collaboration with and oversight of company policies and practices, when designed appropriately, multistakeholder oversight can help avoid and alleviate potential violations of users’ rights. Such arrangements can help build trust, foster collaboration, and facilitate transparency in circumstances where legitimate data protection, privacy, competition or other concerns may limit the degree to which information can be made public.
The DSA should not be seen as the appropriate instrument for facilitating or requiring access by public authorities to online platform data of all sorts. In particular, where public authorities are given access to online platform data for regulatory purposes, clear limits and oversight mechanisms must be put in place to ensure that such data is not misused for law enforcement or other purposes, which could result in undue pressure on online platforms and damage public trust more broadly. In this respect, strong transparency requirements for both digital platforms and governments should be included in the DSA.
Finally, sanctions for systematic non-compliance with these requirements should be clear, specific, and predictable. In line with international human rights principles, the DSA should avoid requiring intermediaries to adjudicate the legality of content/behavior, imposing arbitrary time periods for responding to notifications, setting rigid compliance targets, or requiring reporting of allegedly illegal content to authorities. Where online platforms discover content that they believe may be illegal, they should nevertheless be encouraged, though not required, to report it.
Liability Regime

The e-Commerce Directive provides a “safe harbor” or liability exemption for three types of service providers, covering ‘mere conduits,’ ‘caching services,’ and ‘hosting services.’ GNI supports the preservation and expansion of safe harbor protections, while also encouraging the EU to use the DSA to update these categories to account for new services and business models. Specifically, the range of “hosting services” has expanded significantly, some playing a more neutral role than others. The DSA presents an opportunity to update this framework, and ensure that providers who host third-party content are operating under a clear and predictable liability regime.
Liability for intermediaries is also conditioned by the Directive on the obligation to act “expeditiously” upon “actual knowledge” of illegal content. What constitutes “actual knowledge,” however, is contested. In order to protect freedom of expression, it is critical that the DSA clarify that “actual knowledge” standards apply to both criminal and civil liability, and that such knowledge can only be established pursuant to an order from a duly authorized, independent arbiter — preferably a court.
The current Directive also prohibits Member States from imposing general monitoring obligations on intermediaries, which the DSA should preserve. Intermediaries do not have the capacity or resources to proactively monitor vast and diverse volumes of user content without jeopardizing fundamental rights. Fearing hefty fines or other sanctions, intermediaries are likely to lean towards proactively over-removing or otherwise limiting any content that could be considered unlawful, even if it may in fact be lawful or legally protected. In addition, such obligations may force large service providers who handle a high volume of content to rely more on automated tools, further obscuring the process and results of content moderation from users, policymakers, and other key stakeholders.
Governance and Enforcement

In general, the DSA should seek to minimize uncoordinated duplication and the potential for conflicts between relevant authorities tasked with oversight and enforcement, while maximizing their expertise and effectiveness. Certain activities can only be carried out by public authorities. For instance, an EU-level authority might be best equipped to offer authoritative guidance for interpretation of the DSA, and to facilitate enforcement with respect to cross-border flows of digital services and products.
Other responsibilities, such as investigating alleged breaches of the DSA, may require and/or benefit from a more federated approach. In considering how best to provide meaningful and effective oversight, the DSA can and should involve and leverage the broad expertise, legitimacy, and interest of civil society, academics, and technical experts. For instance, as noted above, the DSA could consider facilitating and incentivizing multistakeholder frameworks for information sharing and assessment, such as the one pioneered by GNI.
Just as transparency among service providers is critical, public authorities must also publicize relevant information about their activities. Specifically, EU and member state authorities should be required to regularly publish information about the legal orders and referrals they send to intermediaries, including the number of such notices broken out by the legal basis, underlying content/behavior of concern, and the requesting government agency. Such transparency should extend to civil court orders mandating that intermediaries take certain action with respect to content. The DSA should also ensure that companies do not face restrictions in publishing information about such notices, including their content where appropriate.
Finally, given that EU law and jurisprudence differ in certain ways from speech-related laws in other jurisdictions, efforts to address content that is deemed illegal in the EU should avoid affecting the availability of the same content outside the Union, creating unnecessary conflicts of law, or placing unnecessary strains on international comity.