Session
Organizer 1: Joanna Szymanska, ARTICLE 19
Organizer 2: Barbara Dockalova, ARTICLE 19
Speaker 1: Pavel Marozau, Civil Society, Eastern European Group
Speaker 2: Marcelo Daher, Intergovernmental Organization
Speaker 3: Yaman Akdeniz, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Nadim Nashif, Civil Society, Asia-Pacific Group
Speaker 5: Gabrielle Guillemin, Civil Society, Western European and Others Group (WEOG)
Joanna Szymanska, Civil Society, Eastern European Group
Barbara Dockalova, Civil Society, Western European and Others Group (WEOG)
Joanna Szymanska, Civil Society, Eastern European Group
Break-out Group Discussions - Flexible Seating - 90 Min
Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?
Our session will focus on how to challenge companies' removal of content that is legitimate under international human rights law. We focus on Facebook, YouTube and Twitter, given the size of their user bases, and we have three simple demands for them. Meeting these demands would make the companies more transparent and accountable to users for what they do, and would create better safeguards to protect free speech online.
Whenever companies take down user content or suspend an account, we want them to notify the user and clearly explain what content has been removed and why.
When notifying users of a takedown or account suspension, we want companies to give users the opportunity to appeal the decision, using clear and simple language to explain how to do this and offering them the chance to discuss the matter with a person.
Finally, we want these companies to proactively publish much more detailed data on the numbers of complaints, content takedowns and appeals, together with details on the types of content removed and reinstated.
SDG 16: Peace, Justice and Strong Institutions
Targets: The session focuses on access to information and fundamental freedoms online, especially freedom of expression. We believe that by strengthening the transparency and due process practices of social media platforms, society as a whole will enjoy stronger human rights protections online.
Description:
While social media platforms offer valuable spaces to connect, they also hold immense power over the information we see online. Using algorithms and human moderators, both of which are prone to mistakes and bias, they remove large amounts of content in error, silencing millions of people. This particularly affects vulnerable groups, who are already often denied a voice in society. Censorship by social media platforms reduces dialogue, shrinks public knowledge for everyone and prevents us all from holding those in power to account. At the same time, repressive national laws, with which the companies increasingly comply, pose growing risks to online speech.
For years, civil society has been advocating for more transparency on content removal and a right to appeal on social media platforms. Unfortunately, despite some progress, much work remains to be done. Our session will bring together civil society speakers from Eastern Europe, Turkey and the Middle East, where we have seen growing challenges with how social media platforms regulate content. The aim of the session is to hold a strategic discussion between civil society, companies and international organisations on how to ensure that social media platforms meet the highest transparency and due process standards and respect international human rights law.
This session aims to share effective advocacy strategies and recommendations for companies to improve their content moderation practices. Further coordination through events or joint processes will be explored as a possible outcome of the discussion.
ARTICLE 19 is a global free speech organisation. We have been involved in various regional content moderation groups that seek to challenge companies' content moderation practices and their lack of consistency across regions. We therefore hope to bring together stakeholders from these groups and moderate an interactive discussion, inviting participants to share their experiences of engaging with the platforms and then discuss a set of possible joint recommendations or best practices for how companies should moderate content in their respective regions, in line with international human rights standards.
This would be a hybrid session: the ARTICLE 19 facilitator would be on-site in Poland and would moderate the discussion with remote participants via laptop, with technical and other facilitation support from ARTICLE 19 staff on-site and/or remotely. For instance, participants' knowledge of the platforms and their community guidelines could be tested with an online/offline quiz.
Usage of IGF Official Tool.