Session
Organizer 1: Liz Woolery, Center for Democracy & Technology
Speaker 1: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Liz Woolery, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Gregory Nojeim, Civil Society, Western European and Others Group (WEOG)
Break-out Group Discussions - 60 Min
Interventions in this session will be primarily audience-driven. The purpose of the session is to give participants an opportunity to work through examples of difficult content moderation questions, develop their own questions and criteria for evaluating content, and hear from other participants about what kinds of considerations matter most to them. The interactive polling feature will help to demonstrate the consensus, or lack thereof, in the room and has proven effective at drawing participants into discussion about why they arrive at different conclusions.
Because the bulk of the interventions in this interactive session will be audience-driven, the session will be well positioned to reflect the diversity of IGF participants. The session is explicitly non-normative--there is no "right" answer about how to respond to the various examples--and the purpose of the session is to identify the broad range of considerations that individuals from different backgrounds consider relevant to content moderation determinations. The examples of flagged posts that participants will evaluate will also reflect a diversity of scenarios, including multilingual content and the crucial role of understanding local context in determining whether posts violate a platform's TOS.
We will ensure that all content of the session, including the examples and the polling, is accessible to persons with disabilities.
This interactive breakout session will engage all participants in active discussions about difficult questions in content moderation. Building on a session that CDT ran at the "Content Moderation at Scale" conference in Washington, DC in May 2018, this session will feature a variety of examples (8-10) of challenging/problematic content that reflect key contemporary issues, including hate speech, harassment, doxxing/personal information, terrorist propaganda, nudity and sexuality, impersonation, and disinformation/fake news. Participants will be presented with a "flagged" post or account and a relevant content policy/Terms of Service provision for a fictional social network, and will have 90 seconds to discuss, in small breakout groups, how they would resolve the issue. Participants will then be asked to register their decision in an online poll. After working through half of the examples, the organizers will facilitate a plenary discussion of the poll results and invite participants to explain how and why they arrived at their decision.
From experience running this sort of session, we anticipate that different participants will come to different conclusions based on a variety of rationales, and the examples will be framed to present genuinely difficult decisions--there will not be a list of "right" answers to the poll. The small group discussion and polling feature will give all participants a better understanding of how content moderation decisions actually play out in practice (including the challenge of making difficult decisions in a short time frame). And the large group discussion will generate a list of important considerations for platforms, moderators, policy makers, and users to take into account when thinking about content moderation, platform responsibility, and user empowerment.
The timeline for the session will be:
Intro to the exercise - 5 min
First set of examples, small group discussion and polling - 10 min
Plenary discussion - 15 min
Second set of examples, small group discussion and polling - 10 min
Plenary discussion - 15 min
Conclusions/Wrap-up - 5 min
The majority of this interactive session is dedicated to participant engagement. We will use the breakout format to enable participants to discuss each example of flagged content in small groups; the quick pace of examples encourages lively discussion, and the cue to vote in the online poll directs conversation toward concrete outcomes. During the plenary discussions, the moderator will quickly review each example and the poll results will be displayed; discussion will then be opened to participants to describe why they voted the way they did. (The moderator will also be prepared to raise key questions and issues that each example is designed to surface, if these topics do not come up in the plenary discussion. But it has been our experience that participants in this type of session are highly engaged and eager to explain their perspectives.)
Content moderation on social media platforms is an issue that underpins a huge range of Internet public policy discussions today. This session is designed to give participants the opportunity to discuss how content moderation on UGC platforms actually works, to identify key issues and considerations that should guide moderators' decisions, and to discuss the merits and drawbacks of different approaches to moderation.
This interactive session is designed to enable robust online participation. The session content will be available in the form of a shared slide deck and the online polling feature will be accessible to remote participants. We will encourage online participants to engage in the breakout discussions by participating in the chat function and the online moderator will help facilitate that discussion. We will also encourage online participants to weigh in during the plenary discussions as we review the results of the poll.