The Global Human Rights Implications of Canada’s Online Harms Act


April 18, 2024  |  Confluence Blog

By Samaya Anjum

Canada is set to become the latest country to join a global trend in online safety regulation. As its approach proceeds through public and legislative debate, it is important to understand both how it is shaped by efforts in other countries and how it could influence and impact future efforts around the world.

On 26 February 2024, the Canadian government tabled the landmark Bill C-63, which seeks to enact the Online Harms Act. If passed, it would put in place a legislative scheme to hold a wide range of services accountable for reducing the risk of harmful content online. It also proposes a comprehensive regulatory framework, including the establishment of a Digital Safety Commission to oversee enforcement, an Ombudsperson to support victims of online harms, and a Digital Safety Office to support their operations. Beyond that, the Bill seeks amendments to the Criminal Code and the Canadian Human Rights Act to introduce new “hate crime” offenses. In this post, I’ve focused on the provisions in Part 1 of the Bill, the Online Harms Act.

The legislative approach of the Act, currently pending its second reading in the House of Commons, borrows from a collective toolbox for online safety that Canada’s counterparts in the European Union, the United Kingdom, and Australia have already largely stocked. The Global Network Initiative (GNI) has engaged in several of these law-making processes in the EU, the UK, and Australia, and has tracked how they have been reproduced in countries such as Vietnam and Sri Lanka. Given this broader trend, and the extent to which related regulatory efforts build on one another, the evolving contours of the Canadian approach are likely to have serious implications for human rights not just in Canada, but around the world.

The Problematic Potential of Strong Authorities with Ample Prosecutorial Discretion

The proposed Online Harms Act applies to social media services accessible in Canada, including services allowing for live streaming or the sharing of user-generated pornographic content, but notably excludes private messaging services from its scope. Compared to the online safety acts in the UK and Australia, the definition of regulated services under Bill C-63 is rather vague, with much of its interpretation – such as the minimum number of users that would bring a service within scope – left to implementing regulations. Additionally, under Article 2.3, the Governor in Council can “make regulations designating a particular social media service” if satisfied there is a significant risk of it hosting harmful content. This reliance on secondary legislation to expand on and implement the Online Harms Act, primarily through the Digital Safety Commission, is also noticeable in subsequent parts of the draft law.

The establishment of the Digital Safety Commission, with mandated powers to investigate, hear complaints, issue guidelines, and ensure the enforcement of the Online Harms Act, is one of the major features of the draft law. Some of its functions are modeled after the Australian eSafety Commissioner, including its discretionary powers and its power to make regulations. GNI was involved in the initial drafting stages of Australia’s Online Safety Bill, which formalized the role of the eSafety Commissioner, and expressed the need for oversight and reporting requirements to enhance accountability for the Commissioner’s role. The same arguments apply to the Canadian approach, which goes further by granting the Digital Safety Commission the power to hold hearings (Article 89) and to decide procedural and evidentiary questions (Article 86.d) without clear safeguards on those functions.

This regulatory model for the Commission – combining broad authorities with a lack of oversight – is a recipe for potential abuse and is troubling in the global context, especially for privacy and freedom of expression in authoritarian countries. Sri Lanka is a recent example of a country that has adopted an Online Safety Bill that deals a severe blow to human rights. GNI has shared its concerns about the Online Safety Commission established by that Bill, which is assigned broad and unchecked powers to regulate a wide range of content categories under vaguely defined legal offenses. Canada’s approach will inevitably contribute to a global precedent that can either protect or violate human rights, and it is therefore important that the government frames the Commission’s authorities as carefully as possible.

Test of the Necessity and Proportionality Principles

Another major consideration is whether the Online Harms Act meets the principles of necessity and proportionality, two important human rights principles established under international law. Canada has been a steadfast advocate for human rights within the international system, including through its participation in the Freedom Online Coalition and the Christchurch Call. Given this role, it is important that the Canadian government clarify how the Act aligns with its interpretations of, and commitments to, privacy and freedom of expression. The following provisions raise particular concerns:

  • Remote Government Access to Data: To verify compliance or non-compliance with the Act, the Commission can designate inspectors who are authorized under Articles 91 and 92 to enter a dwelling-house or access any place “remotely by means of telecommunication” based on “reasonable grounds”. While this is not unique – the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA) also give regulators some authority to enter the premises of platforms – the Canadian approach mandates access that is technologically mediated. The law further allows the inspector to use or reproduce any information found during their search. Remote government access to data almost always constitutes a privacy infringement and generates surveillance-related concerns. The Canadian approach takes some steps in the right direction by requiring search warrants to enter a dwelling-house. However, provisions limiting the ability of service providers to issue user or public transparency notices concerning government access to data undercut this protection. By taking service providers out of the data access loop and limiting their ability to provide notice or challenge these orders, the Act removes an important potential safeguard for user rights. Recently, GNI raised similar concerns about a remote access provision introduced to the proposed Artificial Intelligence and Data Act in Canada.
  • 24-hour Takedown Windows: The requirement for services to remove potentially harmful content within 24 hours of identifying it, or of being notified of it by the Digital Safety Commission, is arbitrary and unnecessarily short. Canada’s inclusion of a fixed time limit differs from the approaches of other democracies, such as the EU’s DSA and the UK’s OSA. This is also a recurring issue that GNI has raised in several countries, including Australia, Vietnam, Pakistan, India, and Indonesia, as well as with respect to an earlier draft Canadian online safety law. While many pieces of content will likely be easy to evaluate and remove quickly, flexibility needs to be afforded for difficult or edge cases. Otherwise, providers will likely default to taking down more identified or notified content than is necessary or appropriate under the law in order to avoid penalties.

Canada’s Online Harms Act is an effort to create safer and more inclusive online spaces, but its long-term effects require careful thought. Technology changes quickly, and new laws may have unforeseen consequences for human rights. Therefore, to ensure the draft law’s effectiveness and its protection of human rights, the Canadian government should prioritize expert and multistakeholder consultation prior to its adoption. In 2020, GNI published a policy brief on Content Regulation and Human Rights analyzing more than 20 governmental initiatives that claim to address digital harms. The brief can offer Canadian stakeholders practical guidance on how to formulate and implement content regulations that are effective and that enhance and protect the rights to freedom of expression and privacy. The government and Parliament should seek further opportunities to work with a wide range of experts in the field, including those from academia, civil society, and industry, as they work to ensure that the Online Harms Act provides a best-in-class global example of a rights-respecting, rights-protecting, and future-proof approach to content regulation.

Copyright Global Network Initiative