Time
    Friday, 13th November, 2020 (11:20 UTC) - Friday, 13th November, 2020 (12:50 UTC)
    Room
    Room 3
    About this Session
The aim of the discussion is to draw attention to how inconsistent and opaque enforcement of content moderation standards can reinforce and magnify existing power disparities. The workshop will propose steps to foster greater transparency, accountability, and more even enforcement of community standards on platforms.

    Organizer 1: Jyoti Panday, Internet Governance Project

    Speaker 1: Pratik Sinha, Civil Society, Asia-Pacific Group
    Speaker 2: Juan Carlos, Civil Society, Latin American and Caribbean Group (GRULAC)
    Speaker 3: Monika Bickert, Private Sector, Western European and Others Group (WEOG)
    Speaker 4: Tarleton Gillespie, Technical Community, Western European and Others Group (WEOG)

    Additional Speakers

    Varun Reddy, Facebook (Asia Pacific)

    Marianne Díaz, Derechos Digitales (Latin America and Caribbean)

    Amélie Pia Heldt, Hans-Bredow-Institut (Western European & Others)

    Moderator

    Jyoti Panday, Internet Governance Project

    Online Moderator

    Jyoti Panday, Internet Governance Project

    Rapporteur

    Jyoti Panday, Internet Governance Project

    Format

    Round Table - U-shape - 90 Min

    Policy Question(s)
    1. How do different political and regulatory contexts shape the ways in which content moderation decisions and enforcement of community standards take place on platforms?
    2. What kinds of formal and informal arrangements have developed between digital platforms and governments to limit the proliferation of state-backed misinformation/disinformation, hate speech, and violent or terrorist content?
    3. What are the opportunities and limitations associated with proposals for fostering greater transparency and accountability in the enforcement of platform content moderation standards?

    This workshop seeks to take a deep dive into the content moderation practices of platforms in order to highlight policy gaps that leave platforms vulnerable to manipulation through state or geopolitical pressure. A key focus of the discussion will be scenarios in which coordination between digital platforms and governments has evolved into a vector through which political power can be exerted, consolidated, and restricted in non-transparent ways. The workshop will also discuss steps platforms can take to avoid being co-opted as tools in geopolitical conflicts.

    SDGs

    GOAL 9: Industry, Innovation and Infrastructure
    GOAL 16: Peace, Justice and Strong Institutions
    GOAL 17: Partnerships for the Goals

    Description:
    Whether we consider platforms taking action against accounts allegedly backed by the Chinese government to spread disinformation about the Hong Kong protests, or Facebook's informal agreement with the Israeli government to work together to address incitement on its platform, it is increasingly apparent that platforms' content moderation standards, business practices, and relationships with nation states effectively arbitrate which narratives can reach the global public. While content moderation is an essential function of a platform's business, the policies and practices of global platforms carry with them the capacity to reshape the dynamics of public discourse and are changing the way political power can be organized and exercised across borders. In the absence of transparency and accountability, the rules and procedures for content moderation established and enforced by private platforms pose a threat to democratic culture: they can severely limit participation and harm the individual interests of a platform's users, particularly minority groups and marginalized communities at risk. This workshop will examine how platforms' content moderation standards are reconfiguring traditional allocations of responsibility, accountability, and power in societies.

    Expected Outcomes

    This workshop is a continuation of IGP's work on building cooperative solutions to the challenges associated with content moderation. The immediate goal of the workshop is to bring together regulators, academic researchers, and representatives from technology companies and civil society to discuss new ideas for introducing transparency and accountability into content moderation practices. Another aim of the workshop is to further knowledge sharing on regulatory and technical developments in order to tease out the similarities and unique challenges associated with content moderation in different jurisdictions.

    Prior to the online IGF workshop, the organizers will connect speakers and identify the substantive issues to be addressed during the discussion. The discussion will build on recent decisions on content removal and account suspension in order to highlight policy gaps in existing content moderation practices and standards. Representatives of platforms will be given an opportunity to respond to these views and highlight the steps being taken to address these issues. The agenda includes a time slot for questions so that the audience and remote participants, from Europe and other regions alike, can join the conversation and share their experiences, opinions, and suggestions on how to move the debate forward and identify action areas.

    Relevance to Internet Governance: As more and more of our social interaction moves online, and given the role that private entities play in deciding which narratives are available in public discourse, it is crucial that we examine the content moderation practices of platforms. Platforms cannot claim to be neutral arbiters while simultaneously opting to cooperate with one side of a disputed narrative without considering the broader implications of their actions. A focus on content moderation cuts through platforms' claims of neutrality and allows us to ask how platforms can uphold consistent policies in the face of competing societal expectations, experiences, cultures, and value systems. By understanding moderation as a fundamental aspect of a platform's service, we can ask new questions about platforms' power in society.

    Relevance to Theme: Uneven enforcement of content moderation standards can turn platforms into a proxy battleground in which disputing narratives and activities emerge and collide. Vague criteria for content removal and account suspension, a lack of procedural transparency, algorithmic bias, and informal relations between platforms and governments all contribute to eroding the trust of users on both sides of these opposing narratives. This workshop will focus on steps platforms can take to improve trust in their content moderation decisions, including audits of algorithms, publishing data on their internal practices, and providing robust data on their content removal and account suspension practices, particularly in contested territories.

    Online Participation

    Usage of IGF Official Tool.


    1. Key Policy Questions and related issues
    How do different political and regulatory contexts shape the ways in which content moderation decisions and enforcement of community standards take place on platforms?
    What kinds of formal and informal arrangements have developed between digital platforms and governments to limit the proliferation of state-backed misinformation/disinformation, hate speech, and violent or terrorist content?
    What are the opportunities and limitations associated with proposals for fostering greater transparency and accountability in the enforcement of platform content moderation standards?
    6. Final Speakers

    Pratik Sinha, Alt News

    Marianne Diaz, Derechos Digitales

    Amelie Pia Heldt, Hans Bredow Institute

    Varun Reddy, Facebook India

    Tarleton Gillespie, Microsoft Research, Cornell University

    Urvan Parfentyev, Russian Association of Electronic Communications