
IGF 2021 WS #17 Content Moderation BEYOND Social Media

    Time
    Wednesday, 8th December 2021, 10:15–11:45 UTC
    Room
    Conference Room 4

    Organizer 1: Jan Rydzak, Ranking Digital Rights
    Organizer 2: Jim Prendergast, The Galway Strategy Group
    Organizer 3: Elizabeth Milovidov, e-Enfance

    Speaker 1: Julie Owono, Civil Society, African Group
    Speaker 2: Jordan Carter, Technical Community, Western European and Others Group (WEOG)
    Speaker 3: Jan Rydzak, Civil Society, Eastern European Group
    Speaker 4: Zoe Darme, Private Sector, Western European and Others Group (WEOG)

    Additional Speakers

    Courtney Radsch is substituting for Jan as moderator.

    Samantha Dickinson is substituting as rapporteur.

    Moderator

    Jan Rydzak, Civil Society, Eastern European Group

    Online Moderator

    Jim Prendergast, Private Sector, Western European and Others Group (WEOG)

    Rapporteur

    Elizabeth Milovidov, Civil Society, Western European and Others Group (WEOG)

    Format

    Round Table - U-shape - 90 Min

    Policy Question(s)

    Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?
    Additional Policy Questions Information: An additional question we will consider: Are the current efforts with regard to content moderation sufficient, or do we need to develop more nuanced approaches that account for the diversity of the tech ecosystem and the global community of actors involved?

    By providing the architecture for human interaction on a global scale, technology facilitates the flow of information and ideas. Because it allows us to make discoveries, share knowledge, express our beliefs, and collaborate across borders, technology is now a tool wielded by billions of people, many of whom use it to launch movements, embark on new ventures, and harness the wisdom of the crowd. At the same time, technology has given bad actors the opportunity to share illegal content, normalize fringe ideologies, and spread toxicity at incredible speed and on an unprecedented scale.

    With internet penetration at 60% of the world’s population, more than four billion people are interacting with content online. Policymakers are grappling with how to govern the sharing, storing, and hosting of that content, and new regulations will set a high bar for online platforms. In some countries, policymakers are developing content regulations in response to specific, high-profile incidents of content moderation failure. These have often involved real-world harm stemming from content related to suicide and self-harm, terrorism and violent extremism, and child sexual exploitation and abuse. The case for regulation is clear, but the pathway towards effective, future-proofed obligations for platform responsibility is less so.

    The recent decision by the Facebook Oversight Board has once again thrust content moderation and content governance into mainstream conversations about the internet. This is NOT a workshop about that decision. Instead, this workshop will explore content moderation BEYOND social media. What does that mean? Much of the attention on content moderation has focused on dominant platforms like Facebook, Twitter and YouTube, where billions of users are generating oceans of content. But what happens when calls for content moderation move beyond this top layer of the internet ecosystem to other parts, such as hosting providers, data centers, cloud service providers, domain name registrars, or even ISPs? It quickly becomes complicated and exposes the limits of what has until now been a one-size-fits-all approach to content moderation.

    With social media platforms, there are several content moderation options beyond “leave it up” or “take it down”: content can be downranked, labeled, redirected to authoritative sources, and more. With hosting, storage, domain names and other services, the moderation tools are far more restrictive, often limited to deplatforming whole websites or platforms (a contrast sketched in the example below), and that approach causes significant problems. Our workshop will explore these and other challenges, such as calls for exempting “small platforms”, the creation of one set of rules for everyone, and the differences between content moderation on consumer-facing services and on enterprise services.

    The moderator will walk our panelists through the following questions:

    • How can we ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?

    • Are the current efforts with regard to content moderation sufficient, or do we need to develop more nuanced approaches that account for the diversity of the tech ecosystem and the global community of actors involved?

    Audience members, both in person and online, will also be asked for their feedback on these questions, and we will use online polling to increase audience interaction. Ample time will be dedicated to developing a tangible outcome: a Multistakeholder Framework for Understanding Content Moderation Beyond Social Media.
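    To make the contrast above concrete, here is a minimal, hypothetical sketch in Python of a taxonomy mapping each layer of the ecosystem to the moderation actions realistically available to it. The service types and action lists are illustrative assumptions for discussion, not an agreed classification:

```python
# Hypothetical taxonomy, for illustration only: the service types and action
# sets below are assumptions for discussion, not an agreed standard.
MODERATION_OPTIONS = {
    "social_media_platform": [
        "remove_post",
        "downrank",
        "label",
        "redirect_to_authoritative_source",
        "suspend_account",
    ],
    "hosting_provider": ["take_down_site"],           # whole site, not one post
    "cloud_service_provider": ["terminate_service"],
    "domain_registrar": ["suspend_domain"],           # effectively an on/off switch
    "isp": ["block_access"],
}

def granularity(service_type: str) -> int:
    """Return how many distinct moderation actions a service type can take."""
    return len(MODERATION_OPTIONS.get(service_type, []))

# The further down the stack a provider sits, the fewer and blunter its options.
for service_type, actions in MODERATION_OPTIONS.items():
    print(f"{service_type}: {granularity(service_type)} option(s) -> {actions}")
```

    Run as written, this prints five options for a social media platform and a single, blunt option for each infrastructure-layer service, which is precisely the gap that a one-size-fits-all rule ignores.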

    SDGs

    16. Peace, Justice and Strong Institutions

    Targets: SDG 16 states: “Promote peaceful and inclusive societies for sustainable development, provide access to justice for all and build effective, accountable and inclusive institutions at all levels.” Our discussion will touch on several facets of content moderation that have direct linkages to portions of SDG 16.

    16.6 Develop effective, accountable and transparent institutions at all levels

    16.7 Ensure responsive, inclusive, participatory and representative decision-making at all levels

    16.8 Broaden and strengthen the participation of developing countries in the institutions of global governance

    There are elements of ensuring effective, accountable and transparent practices for content moderation (16.6). We are looking to ensure that the rules are not set by the dominant social media platforms alone, but through a process that is inclusive and representative of all stakeholders (16.7). Finally, all stakeholders, regardless of where they live, will be impacted by content moderation standards. Our session will have a speaker from Africa offering her perspective on the need for developing countries to have input into the development of a Multistakeholder Framework for Understanding Content Moderation Beyond Social Media (16.8).

    Description:

    Content moderation is not a new concept, but high-profile cases such as the suspension of former US President Donald Trump’s social media channels and disinformation campaigns during the COVID-19 pandemic have increased the importance of this tool in ensuring the internet remains a valuable resource for society. To the casual observer, the decision to delete content or suspend an account is straightforward. But the tools and tactics used for content moderation become increasingly complicated once you go beyond the large social media platforms, with significant consequences both for users and for the free and open internet. This session will explore the challenges to content moderation beyond the dominant social media players and look at potential impacts in other parts of the tech ecosystem. In addition to building a better understanding of the problems with one-size-fits-all content moderation schemes, we will lay the groundwork for developing a Multistakeholder Framework for Understanding Content Moderation Beyond Social Media.

    Expected Outcomes

    At the end of the session, we hope to have broad agreement on the following:

    1) An improved understanding that a one-size-fits-all approach to content moderation does not work.

    2) Agreement that there need to be different approaches to content moderation, depending on where a platform sits in the tech ecosystem and the remedy proposed (e.g. consumer content-generating platforms, social media, enterprise services, hosting, data centers, cloud platforms, domain name registrars, or even ISPs).

    3) A path forward to developing a Multistakeholder Framework for Understanding Content Moderation Beyond Social Media that could be developed post-workshop and used for intersessional work.

    Content moderation is one of the most prominent and controversial topics facing the internet. With governments, civil society, academia and the private sector all debating what it should and should not encompass and where boundaries should be drawn, we expect a significant level of interest in this session. Even though the session is built around invited speakers, we intend to leave considerable time for in-person and remote audience members to participate. No one, it seems, is without an opinion on this issue, and our in-person and online moderators will work together to ensure that each segment of the session allows for ample audience input. We specifically chose a roundtable format, as opposed to a panel, to foster audience engagement and interaction.

    Our moderators have significant experience in managing sessions with both online and in person audiences and will ensure everyone is able to participate on equal footing. They will also ensure that no one viewpoint dominates the session and will encourage dissenting voices to help work through the issues.

    Online Participation

    Usage of IGF Official Tool. Additional Tools proposed: We anticipate using Slido (https://www.sli.do/) for online polling. This will help make the session more interactive with both in person and online audiences.

    Key Takeaways

    1. Organizers and audience participants agree that there is no one-size-fits-all approach to this very complicated topic.

    2. Potential principles: a) transparency (explore the Santa Clara Principles); b) a global taxonomy of service providers; c) emphasis on the application of rights; d) proportionality; e) acknowledgement of the complexity of platforms, content and behaviors, and jurisdictions; f) harmonization may never happen, but it is important to discuss the consequences of this.

    Session Report

    Opening perspectives:

    Liz Thomas from Microsoft:

    • Important to avoid a one-size-fits-all approach to content moderation as it doesn’t take into account the complexity of the environment, including the different layers of the Internet stack
    • Important to remember the vast size and complexity of the ecosystem that content moderation could be applied to, that the ecosystem is rapidly growing & changing
    • There's a range of often contradictory regulatory measures being enacted in different jurisdictions. Realistically, may never achieve harmonization, but important to discuss & highlight consequences of this.
    • It isn’t just about an individual’s content on a single platform, but a far more complex issue of wider patterns of behavior across platforms & that intersect with the real world.
    • It’s an interesting thought that instead of moderating content *after* it’s posted, there could be efforts to proactively set the tone for online communities (preventative action).
    • Human rights, together with the principles of necessity, proportionality and legality, should be the fundamental principles guiding content moderation.

    Jordan Carter from InternetNZ (.nz):

    • Issue of content moderation will keep being raised because we’re spending more and more of our lives online. It’s not going to go away.
    • How do you even begin to find a framework of principles for content moderation that can apply to platforms as diverse as Facebook, Telegram, eBay and Tinder?
    • For infrastructure providers like InternetNZ, there is no granularity available for content moderation; it is more of an on/off switch. It’s not moderation but cancellation (of a domain name).
    • When governments start making laws about content, they affect platforms and infrastructure well beyond that country’s borders. He gave the example of the EU’s GDPR and its effects around the world.
    • There’s a need for basic transparency from platforms about how they are responding to requests by those with regulatory authority to take down content.
    • Content moderation isn’t just about resources. It’s also about mandate and where in the chain an actor is.
    • Santa Clara Principles https://santaclaraprinciples.org
    • A “let a thousand flowers bloom” approach does risk creating some unintended side effects.
    • A mishmash of rules across jurisdictions can cause difficulties for platforms.

    Courtney Radsch (moderator):

    • Asks for a show of hands on whether domain name services should be doing content moderation.
    • Notes that news orgs are at the mercy of algorithms in terms of what visibility their news articles have on different platforms.
    • Raises issue of resources needed by platforms to moderate content. (Not easy for smaller platforms) and how this might be in tension with regulatory desire to prevent monopolies.
    • Large scale content moderation relies on algorithms. And good algorithms rely on good data. But you only get good data if you’re looking for it & recognizing differences in languages & cultural contexts.
    • Asks whether we should cooperate on a universal taxonomy or classification of intermediaries across jurisdictions, given their diversity, with the potential to help harmonise approaches where possible.

    A participant notes that some governments have a history of suppressing freedom of speech. There is a risk that a harmonized approach to content moderation could exacerbate such suppression in these countries.

    Themes raised in discussion:

    • Complexity of platforms, content and behaviours, and jurisdictions

    • A taxonomy-based approach versus a principles-based approach

    • A taxonomy of different types of service providers, not different types of content

    • What goals and objectives are we trying to achieve?

    • If moderation on one service is effective, people move elsewhere; “good enough” may be the realistic expectation

     Audience Questions

    I agree that a one-size-fits-all approach is probably not a good idea. At the same time, do you see a problem in enacting more and more regulation that might be fragmented or contradictory?

    Do you think we should cooperate on a universal taxonomy or classification across jurisdictions covering the diversity of all those intermediaries? Do you think this would lead to more accurate regional intermediary responsibility regulations?

    What are the stakes and risks of creating inadequate regulation, that is, a regime that does not take into consideration the complexity of the different types of intermediaries?
     

    Potential Principles for a Framework

    Transparency

    One size doesn’t fit all

    Global Taxonomy of different service providers

    Human rights: rights that apply in the offline world also apply online

     

    Online Participants

    Ahmed Farag

    Andreas Hautz

    Anifowose Chioma

    Elzbieta Balcerowska

    Farzaneh Badie

    Gerard Bawarcyki

    Irina Sineva

    Iveta Skujina

    Jacqueline Rowe

    Jose Michaus

    Luiz Eduardo Martelli da Silva

    Luiza Malheiro

    Manda Libor

    Maria Luisa Stasi

    Noppachat Buakant

    Olejniczak Bartosz

    Patrick Kane

    Peter Koch

    Phil Corwin

    Pingkan Audrine

    Porunkoz Mikhail

    Rowena Schoo

    Salma Abbas

    Kristophina Shilongo 

    Susan Payne

    Tim Smith

    Vahan Hovsepyan

    Violeta Gjorgjievska

    Yuanyuan Fan