While there has been plenty of discussion around the impact of social media on American democracy, across much of the Global South its influence has been even greater. In many emerging democracies, access to traditional media is limited, and independent journalism is a relatively young phenomenon. As a result, platforms’ content moderation decisions often play a dominant role in shaping public discourse, with life-and-death consequences riding on these policies.
Although some of the biggest platforms have made efforts to globalize their operations, there is still an inevitable tension surrounding the transnational role that they play. Platforms are often left striking a difficult balance between the conflicting demands of many different international stakeholders. Journalists, human rights activists, and other vulnerable communities, particularly in the Global South, face enormous challenges getting through to the U.S.-based staff who decide what they can and cannot say. There are also tensions between the platforms’ desire to provide a uniform product, with a consistent standard for the boundaries of permitted speech, and the need to adapt to conflicting legal, cultural, and moral contexts across the many countries where they operate.
This event, which brings together global stakeholders from civil society, academia, and the private sector, will discuss the challenges and tensions that platforms face in developing and applying their content moderation policies around the world, and seek to facilitate a constructive discussion on the impact of content moderation, and how its standards and processes might be improved.
Please note that the session will be recorded, for upload to the Yale ISP YouTube channel.
For more information, contact Michael Karanicolas, Wikimedia Fellow, Yale Law School: https://law.yale.edu/michael-karanicolas
Day 1: Understanding the Global Impacts of Content Moderation Decisions
Session 1: Scoping the Issue
10 am – 11:30 am ET
Nearly all of the biggest platforms are based in the United States, which means not only that their decision makers are more accessible and receptive to their American user base than they are to frustrated netizens in Myanmar or Uganda, but also that their global policies are heavily influenced by American cultural norms. This first session attempts to scope the tension between the international nature of the platforms’ business, and the profound local impact of their moderation decisions, presenting perspectives on how content moderation decisions manifest in different global contexts, and offering an introduction to the transparency and accountability challenges that exist.
Moderator: Michael Karanicolas, Wikimedia Fellow, Yale Law School
- Tomiwa Ilori, Centre for Human Rights, University of Pretoria
- Aye Min Thant, Tech for Peace Manager, Phandeeyar: Myanmar Innovation Lab
- Farieha Aziz, MSFS Practitioner in Residence, Georgetown University
- Zahra Takhshid, Lewis Fellow for Law Teaching and Lecturer on Law, Harvard Law School
- Bruna Martins dos Santos, Advocacy Coordinator, Data Privacy Brasil Research
Session 2: Transparency and Public Reporting
1 pm – 2:15 pm ET
A key gatekeeping challenge to meaningful engagement with platforms’ content moderation structures is the lack of information about how these policies are set and implemented. Although transparency reporting continues to expand, even professionals who focus on this space find it difficult to monitor the specific details of how content moderation standards are applied. This problem is exacerbated for users in the Global South, whose attempts to track these issues are made even more complicated by regional discrepancies in enforcement.
Moderator: Daphne Keller, Stanford Cyber Policy Center
- Agustina Del Campo, Director at the Center for Studies on Freedom of Expression and Access to Information (CELE) at Universidad de Palermo
- Leighanna Mixter, Senior Legal Counsel, Wikimedia
- Emma Llansó, Director of the Free Expression Project, Center for Democracy & Technology
Session 3: Searching for Accountability
2:30 pm – 3:45 pm ET
The importance of content moderation in defining the global political discourse, and shortcomings in traditional accountability measures, have led to a number of independent civil society initiatives aimed at developing accountability structures for platforms. This session discusses the proliferation of new, civil society-led accountability mechanisms, and the challenge of developing accountability in this space.
Moderator: Pierre François Docquir, Head of Media Freedom, Article 19
- Jessica Dheere, Director, Ranking Digital Rights
- Jillian C. York, Director for International Freedom of Expression, Electronic Frontier Foundation
- Spandana Singh, Policy Analyst, Open Technology Institute
Day 2: Considering Responses
Session 4: Global Approaches to Intermediary Liability
10 am – 11:15 am ET
Questions around content moderation structures are intimately tied to the debate around intermediary liability, as platforms’ role in deciding the limits of acceptable speech naturally interacts with the legal contours of their own responsibility. In the United States, the future of Section 230 of the Communications Decency Act, which provides platforms with legal cover for their moderation decisions, is the subject of intense debate. This session considers how this debate has unfolded elsewhere, and what lessons might be drawn from the different ways that legal liability has been applied.
Moderator: Corynne McSherry, Legal Director, Electronic Frontier Foundation
- Andres Calderon, Professor of Law, Universidad del Pacífico
- Amelie Heldt, Doctoral Candidate, Leibniz-Institute for Media Research
- Mishi Choudhary, Legal Director, Software Freedom Law Center
Session 5: Emerging Accountability Structures
1 pm – 2:15 pm ET
The growing power and influence of platforms over the online discourse has led to the development of a number of accountability structures, which operate with varying levels of independence and transparency. This session will discuss the formation and operation of these structures, as emerging bodies of global content governance.
Moderator: evelyn douek, S.J.D. Candidate, Harvard Law School
- Nicholas Rasmussen, Executive Director, GIFCT
- Julie Owono, Facebook Oversight Board
- Dia Kayyali, Program Manager – Tech + Advocacy, Witness
- Jason Pielemeier, Policy Director, Global Network Initiative
Session 6: Devolving Moderation Structures
2:30 pm – 3:45 pm ET
Different platforms have adopted different strategies to respond to the challenges posed by content moderation, and to bridge the tension between conflicting stakeholder and jurisdictional demands. This session will discuss some of the industry responses, and potential avenues ahead to improve global content governance, particularly through the devolution of content moderation responsibilities to users.
Moderator: Sarah Roberts, Associate Professor of Information Studies, UCLA
- Sherwin Siy, Lead Public Policy Manager, Wikimedia Foundation
- Abby Vollmer, Director of Platform Policy and Counsel, Github
- Jessica Ashooh, Director of Policy, Reddit
About the Organizers
Michael Karanicolas, Wikimedia Fellow, Yale Law School