GNI Letter and Analysis of the Online Safety Bill
The Hon Paul Fletcher MP May 11, 2021
Minister for Communications
Department of Infrastructure, Transport, Regional Development and Communications
GPO Box 594
Canberra ACT 2601, Australia
Dear Minister,
The Global Network Initiative (GNI) welcomes the Australian Government’s efforts to update the country’s legal framework for online content and conduct and shares the Government’s strong commitment to protecting its citizens online. We recognize the importance of addressing abuses of digital communications tools and welcome recent amendments to the proposed Online Safety Bill 2021 (“the Bill”) that improve transparency and strengthen Australian citizens’ right to effective remedy through an internal review scheme. However, a number of serious concerns remain. These include: the overly broad and undifferentiated application of the Bill to companies across the spectrum of services; extensive discretionary powers afforded to the eSafety Commissioner; limited exemptions for content in the public interest; an inflexible emphasis on a 24-hour takedown window; and a lack of definitional clarity around thresholds for certain categories of content. We are concerned that, individually and cumulatively, these provisions may be disproportionate in their impact on freedom of expression and privacy in Australia. In the hope of supporting further improvements to the Bill, we elaborate on these points in the attached analysis.
GNI’s interest in the progress of this Bill derives from our position as the world’s preeminent multistakeholder collaboration in support of freedom of expression and privacy in the information and communications technology (ICT) sector. Our contribution draws from the lessons learned from our recent analysis of two dozen content regulation initiatives from around the world and the resulting recommendations on how to address digital harms while upholding human rights principles. It is with this context in mind that we write to provide our high-level analysis of the Bill and urge the government to use the international human rights principles of legality, legitimacy, and necessity as the basis for ensuring that the Bill is clear, appropriate, and proportionate, and includes robust transparency, oversight, and accountability mechanisms.
As a long-standing and established democracy, Australia has the opportunity to demonstrate a model for strong and participatory governance that can successfully address legitimate concerns around online harms in a manner that is consistent with internationally recognized rule of law and human rights principles. We encourage the Government to incorporate relevant proposed amendments and address the further concerns outlined below. We stand ready to continue engaging constructively.
Sincerely,
Judith Lichtenberg
Executive Director
Global Network Initiative
GNI Analysis of Australia’s Online Safety Bill
I. General Transparency and Accountability
The OSB confers broad discretionary powers on the eSafety Commissioner, an unelected official, to do “all things necessary or convenient to be done for or in connection with the performance of the Commissioner’s functions,” without providing sufficient oversight and accountability measures to guard against abuse of this discretion.
Although the Commissioner is subject to parliamentary scrutiny and the Bill provides for an independent review within three years of the commencement of the Act, greater transparency and reporting requirements would enhance the accountability provisions of the Bill. As we outline in our Policy Brief, “[t]o the extent that substantial rulemaking authority and discretion is delegated to independent bodies . . . . States must create robust oversight and accountability mechanisms to ensure that those bodies act pursuant to the public interest.” We welcome the Commissioner’s testimony that an annual report is already published and that the Commissioner’s Office is working to build out an “insights capability” to publish further data. These are steps towards greater transparency, but such voluntary measures are not sufficient to ensure effective oversight of the Commissioner’s Office. Instead, the Bill should be amended to require detailed and regular public reporting on the use of the Commissioner’s powers and the reasoning behind specific decisions. This would allow policymakers and other relevant stakeholders to understand whether content regulations are working as intended and whether adjustments are needed.
We welcome the provision of an Internal Review Scheme as a means to improve individuals’ right to remedy and, in turn, hold the Commissioner’s Office accountable for its decisions. It is important that the schemes currently in development have effective remedy powers. We note that the proposed amendment states that the Commissioner could “affirm, vary or revoke the decision concerned” and encourage the practical application of such remedy powers to include the option of reinstating content.
II. Overly broad application to service providers
In GNI’s Content Regulation and Human Rights Policy Brief (“Policy Brief”), which analyzed over 20 proposed or recently enacted content regulations from around the world, we emphasize the need for laws to be “tailored, effective, and fit for purpose.” The Online Safety Bill (OSB) outlines various content schemes that appear to apply the same provisions uniformly to different types of services. This includes services that provide end-to-end encrypted messaging and may not be in a position to mitigate the risks or offer an option for removal of specific content. Imposing potentially burdensome and technically difficult compliance obligations on such a wide range of platforms is likely to lead to overbroad or untargeted enforcement by service providers, which would in turn create chilling effects on individuals’ rights to free expression. Amendments that create flexibility and mechanisms for clarification between service providers and relevant authorities could mitigate these risks.
III. Public interest test
As we highlight in our Policy Brief, efforts to “carve out or provide affirmative defenses for particularly vulnerable or important groups, such as human rights documentation groups and journalists” can help ensure that online content regulation bills are narrowly tailored to their objectives. The OSB does create exemptions for certain public interest circumstances, including a carve-out for news reporting “in the public interest” by a “person working in a professional capacity as a journalist.” Nevertheless, this exemption appears to apply only to the narrow category of “Abhorrent Violent Material” and does not extend to those producing journalistic material in a nonprofessional capacity. Given the potential to limit speech in the public interest, we encourage additional public interest exemptions to ensure that no Australian is chilled from documenting, reporting on, or commenting upon protests, police activity, natural disasters, political activities, or other events in which the public may have a legitimate interest. These should include exemptions for counter-speech, content that condemns hate speech, satire, and similar expression, to ensure that those working to counter harmful content are not unintentionally limited by well-intentioned schemes. They should also apply broadly, including to the adult cyberbullying scheme.
IV. Flexibility in the 24-hour takedown
The reduced timeframe established by the OSB for the removal of content by service providers is concerning in terms of its broad applicability and potential to encourage over-removal. The quick turnaround presents practical challenges even for some large, well-established services, and may be nearly impossible for smaller companies and newer services to meet. As we recommend in our Policy Brief, the OSB should “refrain from overly stringent enforcement and penalties, so as to accommodate a diverse range of business models and capacities among covered businesses.” Greater flexibility from the Commissioner’s Office under certain circumstances could ease the impacts of strict enforcement across the spectrum of companies.
Additionally, while the Commissioner justified the reduced timeframe as a means to limit the trauma and stress suffered by victims, the rigid 24-hour takedown provision creates the possibility that content that is newsworthy, time-sensitive, or of a subjective nature may be censored without sufficient clarity or opportunities for appeal. Our Policy Brief specifically warns against the institution of “aggressive and arbitrary timelines” for content removal. While certain content in specific circumstances may merit such quick and decisive action, a blanket 24-hour takedown rule can “hinder the ability of ICT companies to prioritize resources and make nuanced, content and circumstance-specific determinations” and ultimately cause them to lean towards over-removal.
V. Lack of definitional clarity
The OSB frequently uses terms such as “reasonable grounds,” “ordinary reasonable person,” and “offensive content” to describe categories of content and establish thresholds of harm that may ultimately be applied subjectively and inconsistently, and could therefore create a low threshold for removal. Consistent with the principle of legality, our Policy Brief recommends definitional clarity to allow individuals “to regulate [their] conduct accordingly” without risking self-censorship. Such a lack of clarity can make it difficult for companies to determine the boundaries of content categories and could lead to proactive monitoring and unintentional, excessive removal of permissible content. These concerns could be mitigated by providing more detailed definitions, clarifying the difference between adult cyberbullying and defamation, articulating specific factors for consideration when making such assessments, and including more robust and explicit protections for freedom of expression and political communication.