By Hilary Ross
The Global Network Initiative (GNI) and the Action Coalition on Meaningful Transparency (ACT) hosted a virtual public event on Monday, 8 April 2024, exploring what’s needed to make tech transparency meaningful for the Global Majority.
Transparency plays a critical role in holding the tech sector accountable, through mechanisms like researcher access to data, publicly accessible data, human rights impact reporting, and transparency reporting. Yet transparency efforts are under-resourced in Global Majority countries. This is especially concerning because the most significant harms on digital services often emerge first in Global Majority contexts, before their digital and analog ramifications echo across platforms and other jurisdictions.
The event featured lightning talks by experts based in India, Brazil, and South Africa and a dynamic discussion moderated by Anupam Chander (Georgetown University & the Berkman Klein Center). The speakers highlighted the critical importance of considering transparency regulations and implementation in local contexts, and explored similarities across their national and regional jurisdictions. Below is a recording, followed by summaries of each talk and the panel; event slides are available here.
(1) Shashank Mohan (Centre for Communications Governance (CCG) at the National Law University Delhi): Global South countries considering transparency regulations should focus on systemic and process-based frameworks that embed protections for human rights
Shashank Mohan discussed his team’s research mapping the Digital Services Act’s (DSA) key transparency mechanisms for Very Large Online Platforms and Search Engines (VLOPs and VLOSEs) and evaluating the opportunities and challenges these requirements pose for the Global South. He emphasised that local contexts for regulatory implementation matter greatly, particularly a country’s digital literacy and access to digital services, the degree of people’s dependence on major platforms for key services, regulatory capacity, limitations on independent research, and the lack of territorial jurisdiction over platforms.
His team’s key findings addressed how India and other Global South countries might approach transparency regulation, including: focusing on systemic and process-based modes of transparency; engaging civil society to ensure regulations respect human rights; investing to build a diverse ecosystem of auditors, researchers, journalists, and human rights practitioners; empowering independent regulatory bodies for enforcement; ensuring state action is guided by necessity and proportionality; and developing necessary complementary legal frameworks like digital competition and data protection.
(2) Clarice Tavares: Latin American researchers face structural barriers and significant risks in accessing and studying platform data
Clarice Tavares shared her team’s work on researcher access to platform data in Latin America. They are exploring how researchers in Brazil, Chile, Argentina, and Bolivia perceive platform transparency and access to data, and the challenges these researchers face related to academic freedom. Their work shows that Latin American researchers have too few opportunities to partner or engage with platforms directly, a disparity exacerbated by structural inequalities between the Global North and South, and also across the Latin American region. For example, some researchers have less access to cloud data than others, which affects their ability to access platform data. In general, most Latin American researchers rely on platform APIs for quantitative research, rather than partnerships with platforms or other types of access. Yet they have encountered difficulties with APIs, like unexpected changes in access policies and issues with data quality and quantity. Finally, there are significant risks for researchers doing platform research in Latin America, including uncertain legal accountability because of local legal frameworks, fears for personal safety because of political dynamics and the types of content researchers are exposed to, and unclear ethical responsibilities because of the newness of the types of research and questions being addressed.
(3) Liz Orembo (Research ICT Africa): Platform data could serve the public interest in Africa, but proactive sharing with researchers and civil society is almost non-existent
Liz Orembo shared her team’s work exploring researcher access to data in the African context, prompted by the DSA’s provision for researcher access to data. She opened with key context: less data is produced by governments and other sectors in Africa because of the digital divide, and there are challenges around the capacity to use digital data. This means platform data could be particularly useful, especially for development purposes. There is recent momentum for conceptualising data as a public good while balancing protection for human rights, but there has been almost no proactive sharing of data between platforms and researchers or civil society in the African context.
Their work found significant demand- and supply-side challenges in sharing platform data, including a lack of clarity on what data might be available and useful to researchers, lack of access to platforms, unclear legal barriers, and possible risks of retaliation against researchers. Their team consulted researchers during the African Internet Governance Forum, and one particularly concerning risk was the possibility of governments demanding access to their research data, raising concerns for the digital rights of users. Given the public interest need for data access alongside these concerns, Research ICT Africa is pursuing a Pan-African alliance for researchers, so that researchers and civil society can work collectively to develop appropriate rights-protecting data governance.
(4) Ricky Gaines (Wikimedia Foundation): Engaging volunteers, researchers, and civil society underpins the Wikimedia Foundation’s human rights due diligence and public reporting
Ricky Gaines discussed the Wikimedia Foundation’s approach to protecting and upholding human rights on its projects, engaging with civil society, and reporting publicly on its progress. In 2018, the Foundation joined the Global Network Initiative as an observer, and in 2020 it completed a human rights impact assessment across all of its operations and activities. The assessment recommended adopting a human rights policy, so in 2021, building on international human rights principles, the Foundation committed to protect, respect, and uphold the human rights of everyone who interacts with Wikimedia projects through four key areas: continuous human rights due diligence; tracking and publicly recording its efforts; using its influence with partners, the private sector, and governments; and providing access to effective remedies. Since adopting the policy, the Foundation has completed three additional human rights impact assessments, including a child rights assessment, and is currently completing an assessment on AI and human rights. Ricky emphasised that engaging with volunteers, researchers, and civil society is key to the Foundation’s ability to protect human rights, as these groups often have the contextual information needed to identify risks, suggest mitigations, and engage around accountability.
(5) Nicolo Zingales: Most Brazilian transparency obligations are referenced in platform documents, but comprehensibility is limited and EU transparency gains are not spilling over
Nicolo Zingales shared his team’s research exploring whether social media platforms demonstrate a satisfactory level of commitment to transparency through their documents and interfaces in the Brazilian context. His team also examined how platforms are responding to their DSA transparency obligations in the EU, to see whether disclosures required in the EU spilled over into the Brazilian context.
To start, the team scanned Brazilian legal frameworks to map the sources and characteristics of platform transparency obligations in Brazil. Their findings show the complicated relationship between platform Terms of Service and the implementation of related transparency obligations. On the whole, they found that a large majority of transparency obligations in Brazil (72%) are at least referenced in platform documents, which they found refreshing. However, more challenges arise around the comprehensibility of the relevant disclosures, particularly in terms of language and completeness. Furthermore, they observed that meaningful transparency does not appear to be transferring from the EU to the Brazilian context through the codes of conduct and practice signed by some of the platforms in Europe. Overall, the research illustrated that, while a federal agency may be a powerful catalyst to guide the interpretation of open-ended transparency concepts, the effectiveness of such legislation may depend on the distribution of responsibilities across multiple actors, including the extent to which resources and attention from private sector actors can be leveraged to promote greater accountability.
In opening the panel, Anupam Chander noted that all the speakers had mentioned the risks of working on digital rights in many Global Majority contexts, and he emphasised the importance of digital rights research to protecting people’s freedom of expression and privacy. He focused the conversation on where transparency regulations seem most promising and most worrying, and on how the panellists’ local contexts interact with the DSA.
In terms of what is promising in the regulatory environment globally, the panellists agreed with Shashank Mohan that regulators now seem to have a clear understanding of the importance of transparency mechanisms and are realising they need to build these mechanisms into regulatory frameworks that seek to address online harms. Nicolo Zingales found parts of the DSA’s approach promising, noting that the regulatory model enables a lot of input from the private sector to show their risks and how they plan to provide accountability, which he believes is a good approach for Brazil to consider as well. He stressed that it is not in the public interest for government bodies to be overly prescriptive in designing regulations, because that creates too much of a burden on platforms and also risks homogenisation, which could have an adverse impact on rights. Instead, he thinks the right approach is to standardise basic regulatory transparency requests, particularly where there is a lack of shared context, and then create incentives for companies to innovate responsible, insightful ways of conveying transparency information. In Brazil, one approach under consideration is a framework of “regulated self-regulation,” in which codes of conduct are adopted collectively by industry and a body oversees their implementation.
Liz Orembo and Clarice Tavares also noted that the regulatory development of the DSA has sped up the conversation and momentum around researcher access to data in Africa and Brazil, especially in connection with supporting democratic processes. In the Brazilian context, Clarice Tavares raised an ongoing debate over legislation under consideration, which draws inspiration from the DSA’s Article 40 to grant researchers access to data. The main concern is who qualifies as a researcher and how that designation is determined, as Brazilian journalists and civil society groups do critical local digital rights research outside of academia. Defining which researchers will have the right to access data, in a way that also protects researcher freedom and user privacy, is therefore essential.
Ricky Gaines pointed out that when governments look to regulate the internet, they tend to have a handful of platforms and search engines in mind and forget about projects like Wikipedia, which they can then damage, often inadvertently. For example, the Wikimedia Foundation thinks the UK’s Online Safety Act is overly prescriptive, particularly in how it requires online platforms to do age verification, where compliance would perversely require Wikipedia to collect more user data. The Foundation thinks the DSA does a better job of taking into account the differences between platforms: its regulatory model lets platforms identify the risks unique to their services and then holds them accountable for mitigating those risks through transparency, periodic auditing, and other mechanisms.
One of the biggest concerns across the panel was the lack of regulator independence and capacity in many Global Majority contexts. Shashank Mohan noted that the DSA gives a lot of authority to the European Commission and the Digital Services Coordinators, relying on their ability to operate independently. To make this model work, countries need independent, well-funded regulatory bodies that can make regulation enforceable, rights-respecting, and meaningful for citizens, and both he and Liz Orembo worried that these institutional requirements are not present in India and across Africa, and likely in many other Global Majority contexts. Even the DSA uses broad and vague language, for example in defining the parameters platforms must use to assess recommender systems. This vagueness could lead to transparency-washing, where compliance becomes a complex tick-box exercise that does not actually give users more power in their interactions with platforms. Even more troubling, enforcement of the DSA has already been politicised, particularly in France, where the government threatened to shut down social media platforms after recent protest movements. These kinds of threats can perversely inspire other governments and get “copy-pasted” in less democratic contexts.
The “million dollar question”, as Nicolo Zingales put it, is how to secure the regulator independence and capacity needed for effective implementation of transparency mandates. He stressed that it is critical to have many actors involved across the regulatory ecosystem, working both collectively and independently. For example, if transparency, risk assessment, and audit reports are published, civil society can act as a check on how both platforms and regulators are engaging.
On the whole, Shashank Mohan emphasised that the DSA offers an opportunity for experts in the Global South to study the regulatory model in their local contexts and then push platforms to offer similar levels of transparency within the Global South in appropriately contextualised ways. Clarice Tavares noted how important it is to share across contexts, highlighting that this event surfaced shared needs and concerns related to transparency broadly, and to researcher access to data in particular, across India, Africa, and Latin America.
Thank you to each of the speakers for their thoughtful contributions, and to Professor Chander for the excellent moderation.