The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MARIANA VALENTE: Good morning everyone, or good afternoon. Thank you all for being here. My name is Mariana. I am one of the directors of InternetLab, a research center on law and technology, and we aim to foster discussion on Internet policy. At InternetLab I am currently developing research on gender and the Internet, with a focus on revenge porn and (?). So this is a debate about tech‑related gender violence versus freedom of expression. When we thought about this panel, discussing it with APC, we thought it was a big innovation inside the IGF, because nobody was talking about it. And actually, in previous panels about gender and freedom of expression, we realized this was a hot discussion, because everyone in the freedom of expression panels was bringing up the subject of gender, and everyone in the gender panels was bringing up the subject of freedom of expression. And I hope we can profit from having this specific time to discuss the relation between these two subjects here.
This morning we had at the IGF a session on gender‑based violence against women. As you may know, because of this process the UN and the Take Back the Tech campaign have been under attack, precisely on the claim that this would favor censorship. So this makes this debate even more urgent now.
I think there are several different paths that this discussion can take, and I hope we can address some of them here: legal discussions such as intermediary liability, or whether legal solutions are desirable at all; or whether we can think of a single discussion (?); or even get more concrete, so we know exactly what we are talking about. So let us go straight to our panelists. I'm going to very briefly introduce them to you, and then I'm going to ask them all to speak for five minutes each, so that we can open the floor to your questions and to remote questions as well.
So Ana de Freitas: she's a journalist from Brazil who recently went through harassment herself for being vocal about sexism. Bia Barbosa is a member of Intervozes Brazil, and she will speak about some examples in which this contradiction shows itself. Dafne Plou is an APC member (?). Erika Smith is part of Take Back the Tech at APC. Gabrielle Guillemin works for Article 19 ‑‑ sorry if I mispronounced your name ‑‑ and will speak about what violence against women means from a free speech perspective. Hibah Hussain is a public policy analyst at Google. And Paz Pena, from Derechos Digitales, will speak about the adequacy or not of legal measures. So we will start with Dafne, please.
>> DAFNE PLOU: Thank you. I want to refer directly to research we did in our project on violence against women online, tech‑related violence. We have been working on this project since 2012. And we decided to have an online map, a software map, where we are mapping cases from different parts of the world to really see what is going on and where it's going on.
Suddenly we saw that things were going on everywhere. We found this very challenging, and we saw the need to analyze what was coming out of all these cases being mapped there.
In this project we are working with seven different countries in different regions, and that's why we were also able to collect this information from different parts of the world. And in the cases on this map ‑‑ we are analyzing about 400 cases mapped there; you can go and look at the map afterwards if you want to ‑‑ we found that there were three main groups that were being targeted and were becoming victims of online violence against women.
And when we categorized them, we saw that the main group had to do with someone involved in an intimate relationship. A second group was professionals involved in public expression, and this includes activists, journalists, writers, researchers, musicians, actors ‑‑ anyone with a public profile or interested in public exchange. And then another group that we found has to do with survivors and victims of physical assault.
So, coming to this situation, we see what is at stake. With the first group, someone involved in an intimate relationship, what is at stake has to do with intimacy and trust. What happens involves the use of ICTs for private expression, the content of which is then exploited publicly by someone who was intimately involved with the woman. And the consequences can be extreme. We've seen cases of this, and probably you've read about it in the press. A widespread sense of shame may require drastic actions such as changing name, changing address, sometimes changing jobs, leaving school or university, and so on. And then, following the incident, we see a lack of trust in technology, withdrawal from using technology for intimate exchange and expression, and shame because of the original use of ICTs, so they retreat from online spaces. So we really see a problem there that has to do with freedom of expression, online and offline too.
With the second group, as I said before, it's professionals involved in public expression, the whole list of them. We see that what is at stake is freedom of expression, both personal and political. And what happens is harassment, threats, silencing through verbal abuse; and this is going on a lot. People really don't want to write anymore, don't want to speak anymore. They just want to withdraw from any media.
And the consequences: while this typically (?) extreme consequences for the victim, given the public stage a greater sense of empowerment to remedy the situation may emerge. But still, many of them lose their jobs, have to move from the place where they live, and their reputation is also harmed. Perhaps they can sort of overcome it after a while, but this situation can be really difficult.
And what happens towards technology? These women start using technology more carefully. They seek tools to remedy the rights that were violated, and they might engage with technology as a tool for public expression and for sharing what happened to them. So we see that many of them don't leave technology aside, as in the first group.
And then, what happens to survivors or victims of physical assault? Their physical safety is at stake. This involves crimes such as filming a gang rape ‑‑ and this has happened in many countries ‑‑ and can result in extreme consequences such as the suicide of the person violated.
And so they're psychologically and socially affected. And this evidence stays on the Internet, as we know. It's very difficult to accompany them so that they can take down the information there, and we know all that comes after it. And of course they don't want to see anything that has to do with technology for a long time.
So there's a lot to do in working with them so that they can tell their stories, share this harm with others, and also overcome their situation. So we see, through all these cases we mapped, that freedom of expression is really constrained for these women and that the effects can be very negative.
And as a women's rights programme we have worked a lot so that these women can come online again, you know, and feel safer, and also be able to share and look for remedies together with us. This is not always possible, because there are so many cases around the world, but the several partners we have in different countries are working hard on this, so we get results not only on the legal side but also on the personal side that has to do with these survivors. So this was it.
(Applause)
>> MARIANA VALENTE: Thank you very much, Dafne, for presenting data and research; it's very enriching. So I'll give the word to Gabrielle.
>> GABRIELLE GUILLEMIN: Thank you very much. First, I'd like to address anyone who is there in (speaking foreign language). What I'd like to do today is mainly three things. One, I'd like to talk about the framework that we use as an international free speech organization under the International Covenant on Civil and Political Rights. Second, I'd like to raise issues around definitions when we talk about online violence and different forms of speech. And thirdly, hopefully if there's enough time, talk a little bit about the responsibility of Internet intermediaries.
So, first, looking at the International Covenant on Civil and Political Rights: freedom of expression is a right, but we also know it's not absolute. It can be restricted, under three conditions. First, the restriction must have a basis in law. Second, it must pursue a legitimate aim ‑‑ usually among those aims there's the prevention of crime, which could be relevant in this context, as well as the protection of morals. And thirdly, it must be necessary and proportionate in a democratic society.
Now, it doesn't stop here. Article 20 of the International Covenant also mandates states to prohibit incitement to hostility, violence and discrimination on a number of discriminatory grounds.
Now, if I go back to what I was discussing earlier about restrictions which are permitted: they have to have a basis in law, which means the law must be sufficiently precise. So when we talk about online violence against women, a question arises about what we mean by violence online. Particularly when looking at incitement to violence, for instance, at least from a free speech perspective we've understood violence as the likelihood of physical violence occurring, rather than psychological.
Now, if we turn to different forms of speech, I think generally speaking you can say that an international free speech organization would recognize that if we are talking about harassment, and harassment is criminalized, for instance, that would probably be a legitimate restriction on freedom of expression.
Now, online, other issues arise, and understanding what harassment means can be challenging, because for instance on Twitter an individual may receive one Tweet each from a thousand individuals, and these Tweets are very offensive, and the person may feel harassed. However, that form of harassment may not fall within existing definitions. From our perspective under freedom of expression, if you consider that it's just one person making one remark, even if that remark is grossly offensive, it shouldn't be criminalized, and the state shouldn't be trying to go after those thousand different people just for one remark each. Unfortunately, the person may still feel very aggrieved.
And that leads me to an important point for free expression: it also protects speech which may be deeply distasteful and offensive to others. And I think that's where the difficulties arise, because people have different understandings of what may be permissible.
Now, I just want to briefly touch on the responsibility of intermediaries. From a free speech perspective, criminal liability would be a disproportionate restriction. In practice we don't see such extreme measures so much; however, intermediaries can be held liable if they fail to take certain actions under the laws of various countries.
And now, although we could generally accept that intermediaries may be compelled by a court order, for instance, to remove content which is unlawful, I think where we have some difficulties is when there's a push for intermediaries to do more ‑‑ to have a positive obligation, for instance, to monitor content. That we see as a danger for free expression, particularly when it's not clear what sort of language we are talking about, or when, for instance, violence online is not well‑defined.
So I'll just finish with one more remark, about anonymity. It's also a big topic in this area: the need for the protection of anonymity, not because it enables people to troll ‑‑ unfortunately, that's also a sad reality of the Internet ‑‑ but recognizing its importance for expression. And we hear from a lot of groups who tell us how important it is to them that their anonymity is protected; it's a means for them to express viewpoints that people in their own communities may not agree with. So I would like to raise this also very important issue, and I look forward to discussing it with the rest of the panel and the audience. Thank you.
>> MARIANA VALENTE: Thank you, Gabrielle. I'll pass the word now to Erika from Take Back the Tech.
>> ERIKA SMITH: Good afternoon, everyone. (No sound). I have been tasked with just telling you very briefly about what happened with the Take Back the Tech case but I'm not sure what you're seeing. No, that's not what I want you to see, although it's very interesting, I'm sure.
Anyway. What you're seeing is perhaps not a ‑‑ yeah, I want to go back one, if we can. Where do I focus this at? Somebody has got a control. Can you go back? Perfect. Thank you, very much.
Very quickly: Take Back the Tech accompanies the 16 days of action against gender‑based violence. It started in 2006, before social networks were hot all over the world. It's basically a campaign that's been developed by campaigners from all over the world, very specifically Asia and Africa. And it has led to things like documenting on the map that Dafne talked about, for example, and many other initiatives, a great deal of them offline: women experimenting, exploring, developing tech and other ways of looking at digital security, but really questioning issues of violence against women connected and related to technology.
So the campaign accompanies a very famous, long‑held campaign throughout the world against gender‑based violence, but it wants to do it in a fun and creative way, and it definitely doesn't want technology to be demonized in that process.
So we played a very important role, as part of APC, in the best practice forum on countering abuse, because we feel just a little bit passionate about the Internet and our right to play and speak and be in this space and multiple spaces, and how it's understood.
And we have been told for a long, long time that it's not our space and the way that that gets told to us varies dramatically. That's why we got involved with the best practice forum.
So we have been doing a lot of work around it, but we wanted to also invite other inputs, from people who are not necessarily aware of the best practice forum and Internet Governance ‑‑ hmm, what's this strange animal? So we got together around a Twitter chat where we invited people from all over the world to participate and talk. You can see the talk and the video transcripts; it's all about transparency. We like that.
So they decided, those people decided, and we, to use the Take Back the Tech hashtag. It's been used for a long time. It's not that we own it; it's a hashtag.
So some people who say they are part of the club ‑‑ and who are based, we suppose, mostly in the United States ‑‑ decided that they were very upset about this UN organization called APC, which obviously has enough funds to control the world and which was really calling for global censorship of men's voices. And they decided that, in effect, what we wanted was to shut down anonymity; that all feminists can't listen to any sort of criticism; and they basically attacked the conversation.
The initial dialogue and messages were amusing in their ignorance and ridiculousness about the idea of transparency. Like, oh my God, this website has everything published on it with no controls, say the guys in their chats. But what they were able to do in just under 24 hours ‑‑ a very small group of people ‑‑ was over 25,000 tweets. That makes it difficult to have a conversation with a hashtag, right?
We did. We went ahead and had that conversation. Once you set something out there, escalation can happen. It did escalate from those flooding messages and the same old same old, to much more intense misogyny, definitely generalized threats around the hashtag, and then direct threats, attacks on Muslims and Islamophobia, definitely rape threats, sexualized images, all (?).
There was also an exposé video with information from the best practices forum process. They even infiltrated the public meeting, saying they were reporting from the Guardian in South Africa ‑‑ some little white man's voice from the United States saying, I'm a reporter from the Guardian, South Africa, with a U.S. accent.
It got to the point, however, that they took the time to put together ‑‑ with other people ‑‑ a total of 98 comments on the IGF best practice forum paper. And I think anyone here at the IGF can say, wow, that was impressive feedback.
And, of course, all of those had to be taken seriously and looked at and analyzed; and we did. That's why it's such a transparent and informative process. There was also an attempted attack on the IGF site, and other fallout in other spaces that we occupy. I think this particular attack probably didn't get much traction, because some people were like, hey, wait a minute, you're saying they're not in favor of freedom of expression, but they're part of the Internet rights charter.
So it was confusing about why. So when they let us know that they were about to ruin our lives ‑‑ literally, they wrote us a letter ‑‑ for us it's different, right? The Take Back the Tech campaign has existed for years. It's made up of a whole bunch of different women's rights activists, people who are concerned about digital security, who may have barely enough of a digital security practice, but who are informed. APC has its sites behind Deflect. There are different levels of security in our network.
So what this type of attack means for us is really different than if it were aimed at one single women's rights organization, an abortion rights group, a particular woman, a journalist; for them it's a very different reality to have to deal with 25,000 tweets.
I'm not saying this was a minimal attack, and I'm not saying that there were not threats and aspects that were very violent in this, but it was different, again, because of the context and who it was against, and because of the incredible solidarity that we received from Take Back the Tech campaigners, from different parts of the Internet rights and Internet communities, but mostly from women's rights organizations.
So it's a whole different reality than what I think a lot of women who are individually facing this sort of violence have to deal with. I think also that the fact that we as campaigners responded from everywhere in the world, in many different languages ‑‑ because that's what Take Back the Tech is all about ‑‑ was very confusing for these people. Oh, they're speaking. And this was something they were not used to, which is why we assume they're based in the U.S.
But I think our biggest concern was for the people who were supposed to be in a chat talking about experiences they knew about: how safe would they be? Would their privacy configurations be up to speed? That was why we were really concerned that people would get outed and attacked.
In fact, many people who wrote to us privately in solidarity said, we can't participate because we are concerned about that, too. But interestingly, a lot of people said, we can't participate because frankly we don't have the time to deal with this and we don't really think it's worth investing the effort to deal with jerks like this ‑‑ and other aspects. Of course we had back channels, encrypted secure channels, other strategies involved. But I think it's important to say, when we saw this happen ‑‑ and it's very linked to everything that Gabrielle said, which we really echo ‑‑ we were not reporting all of these people: oh, this awful person who sent me this huge erect penis in anime and said how about a surprise butt fuck. You know, it's not a direct threat; we didn't necessarily report it. But many of us were getting direct threats and harassment, and those we were reporting.
And so the mechanisms were working in some cases. A lot of that stuff was removed by the Internet intermediary, and that affects documentation. And I think this is one of the key things that's hard to do and understand about those takedowns. Can our Internet intermediary let us know, well, how much did you have to take down because of this particular attack? These are the sorts of things that we would like to hear more about.
And I just want to say ‑‑ and this is not just a freedom of expression issue, it's about a lot of rights ‑‑ what happens is that because it happens online, it gets framed as if it's only a freedom of expression issue. That is the topic of this session, but it's important that it's not a session about violence against women online versus freedom of expression. And when you use that word "versus," you're going to get a hell of a lot of pissed‑off feminists.
(Laughter)
(Applause)
>> MARIANA VALENTE: Thank you so much, Erika. That is so helpful, especially because I think the Take Back the Tech case gave us kind of a showcase of how the whole idea of discussing violence against women online was being labeled anti‑freedom of expression. I think it adds a lot to the discussion that this is not one thing versus the other. So I will now give the microphone to Ana.
>> ANA DE FREITAS: Can you hear me? Yeah? Okay. So I would like to start by telling you my story and the reason I'm here. I've been working as a journalist, as a reporter, for a little over ten years now ‑‑ for the past three years as a free‑lance journalist pitching stories mostly about digital culture and technology. Earlier this year, specifically in February, I was working with one of my main editors, meaning I would be publishing two to three stories a week at this publisher. And I pitched to them a story, an article, about how and why online boards related to pop culture ‑‑ that being comics, movies, games, music, and Internet culture in general ‑‑ are very hostile to the presence of women and minorities in general.
I wrote from my own experience as a teenager, when I used to try to take part in these online spaces, but I also interviewed men and women who were users of these kinds of spaces and had witnessed this kind of violence against women and minorities.
I wrote the article. My editor, she praised the quality of the article, she said she liked it. Two days later she got back to me and she said they weren't going to publish it. They didn't want that kind of attention. They even tried to talk me out of publishing it somewhere else.
Then I took some time thinking about it but then I decided to go along with it, with the original idea of publishing it. So I talked to another friend of mine, an editor at another publisher, the (?) Post, which is the local Huffington Post, and they published it. The article had a huge repercussion, much bigger than I actually expected.
Most people were surprised that the kind of violence I was reporting happened to women in those spaces in that way. And a lot of them supported the fact that I decided to talk about it. But then I immediately started to get threatened. I got hundreds of rape threats and murder threats on my social network profiles and in different online spaces. There were threads on boards that were only about me, hundreds of posts with pictures of me.
They would track the events I would confirm on Facebook. They would tell me they knew where I was going, they knew where I would be on a set day and time and date, then they would discuss which one of them would go over there to murder me, to attack me.
Quickly, a few days later, the threats went beyond the online environment. They started posting my home address, my personal data, my relatives' personal data. It was all shared online. My relatives were threatened, too.
And then I started getting packages at my home that contained things meant to make me feel psychologically shaken or affected. They contained things as diverse as worms, sexual paraphernalia, T‑shirts with my picture and unflattering sentences on them, leashes with my name on them.
My editor ‑‑ the one that decided not to publish the article in the first place ‑‑ decided not to work with me anymore. And I had to leave home for a few weeks and seek the help of a professional organization to get my life back, which was Article 19, by the way. Thank you.
It's been a few months since that happened. I'm getting better. To this day I still get attempts to hack my Facebook account, my Twitter account. I guess this whole process got me closer to covering social issues, minority issues, feminism. And that's a good outcome I got from it.
But I also think it made me a bit scared, whenever I pitch a story, that the story is going to make me go through everything again. It's really hard to analyze the question from my perspective: I am a communicator, so I need to advocate for freedom of speech, because that concerns my profession and everything I believe in. But other people's freedom of speech is limiting my freedom of speech.
And I don't think any communications professional should be scared of talking, of writing about any subject. So that's what I've been trying to pursue, I guess. So that's it. Yeah.
(Applause)
>> MARIANA VALENTE: Thank you so much, Ana.
>> ANA DE FREITAS: Thanks.
>> MARIANA VALENTE: We have been following the development of Ana's case since it happened. And apart from the importance of listening to someone who has actually been through this ‑‑ which I think is essential in a discussion like this ‑‑ I think your report raises a subject that is oftentimes dismissed in this discussion, especially when it's framed within the "versus" framework: how online abuse can affect the freedom of speech of women themselves, who want to express themselves on the Internet. And I think Dafne was also addressing that. So thank you very much. I'll hand over now to Bia Barbosa.
>> BIA BARBOSA: Hi, everybody. Thank you so much.
>> MARIANA VALENTE: Can you please put Bia's slide ‑‑
>> BIA BARBOSA: Yeah, just wait a little bit. I'm going to say a few things before it. Thank you for inviting Intervozes to be with you this afternoon. Intervozes is a civil society organization that has been fighting for freedom of expression in Brazil for the last 12 years. And the subject we are dealing with is one of the major challenges for those who consider themselves Human Rights defenders, including defenders of women's rights and freedom of expression, and also for those who are for a free, open and (?) Internet.
First I would like to reinforce what Gabrielle just said: freedom of expression is not an absolute right ‑‑ no Human Rights are absolute ‑‑ so there must be a balance with other fundamental rights.
Of course, different environments get different treatments. If you're talking about broadcast media, it's one kind of treatment. If you're talking about the Internet, it's another treatment that we should address. But balance must be ensured. And that means that we should not ‑‑ as Erika said; it's difficult because everybody has said a lot of the things that I wanted to say ‑‑ address these issues from a perspective of contradiction, but from a perspective of balance.
Another point I would like to highlight is that organizations like ours, or like the organizations that are here following the sessions, have an important task in bringing together two different worlds that, in my point of view, are not in permanent dialogue.
We need to show, for example in countries like Brazil, to traditional Human Rights organizations that do not discuss the Internet in a broad perspective, that anonymity is all right; that any content removal must come only after, for example, a court decision, since anything else would be a threat to any democracy; and that there are ways to sanction those who propagate hate speech and advocate violence against women without violating the privacy of users in general.
But at the same time we need to dialogue with sectors working on the Internet that do not admit any kind of content withdrawal and that consider exceptions ‑‑ like the one we have here in the Marco Civil, the Brazilian Internet bill of rights ‑‑ a violation of online civil rights.
We need to solve this problem following an educational, technological and cultural approach, so we can have a safe and open Internet for all. But while these two universes keep working separately, not acting together to defend a more democratic practice that guarantees this balance between freedom of expression and women's rights, what will happen is that (?) that I would like to show, at least here in Brazil, become more and more recurrent.
So this is a campaign from the Human Rights Ministry here in Brazil, calling for a process of Let's Humanize the Web. And the idea was ‑‑ how can I ‑‑ can you ‑‑ okay. The idea was to post "I support respect on the Web" or other positive messages, and to be an open space to receive reports of Human Rights violations on the Web.
What happened is that ‑‑ next one, please ‑‑ a response like this, Let's Dehumanize the Web, was called for by a comedian here in Brazil who has 11 million followers on Facebook, to take down the campaign from the Human Rights Ministry in Brazil. I don't have to describe the image.
So things like this, "if sex is already good when the woman wants, imagine when she does not."
This is from a website as well. "This is not rape. If she did not want, she would say something."
And this is a Ford advertisement. "Leave your problems behind." It was not only published in Brazil but in other countries. And things like this that we have all the time, as well. Homosexuals should be used as human ‑‑ how do you say that in English? Experiment. Yeah.
>> Guinea pigs.
>> BIA BARBOSA: Guinea pigs. I thought a literal translation wouldn't work in English.
Okay. And we have, of course, many, many corrective rape pages and blogs spreading in a very quick way, where the victims were lesbians, bisexual women and transgender people, and which called for forcing them to change their sexual orientation. A hospital in Sao Paulo in Brazil specialized in violence against women of rape (?) in Sao Paulo. So, to finish, because of my time, I would like to mention something that happened yesterday here at the IGF, when members of our organization (?) were retained by UN security, preventing them from coming back to the event after a demonstration with some banners in the opening ceremony. And this morning, after negotiations, we had a meeting with UN security where we said that it was a disproportionate reaction, and we stressed the importance of preserving the democratic and participatory nature of the IGF.
So we would like to thank the Brazilian Steering Committee, which helped us to solve this, and the many, many Civil Society organizations that were with us trying to solve this problem. But we think that for the best participatory governance of the Internet, legitimate freedom of expression is necessary; otherwise we don't have a democratic environment. Thank you to everyone who supported us, and thank you for this space.
(Applause)
>> MARIANA VALENTE: Thank you, Bia. Thank you for bringing concrete examples; I think that helps a lot. We are running out of time, but I confirmed that we can have an extra ten minutes, because we also started ten minutes late. So I hope that's okay; if not, just tell us. I will also ask Paz and then Hibah to really stay within their time so we can have a little bit of discussion. So please, Paz.
>> PAZ PENA: I think we need to distinguish two things. First we need to acknowledge that online harassment is a problem for women on the Internet, especially in patriarchal countries like those in Latin America.
But a very different thing is what governments are doing with that problem and how they could use the excuse of online harassment as a tool for censorship. That doesn't mean that the women suffering harassment or working on these issues are promoting censorship. I think this has to be stated, because very often I hear that kind of confusion.
In that context, the solutions for online harassment of women are not necessarily in opposition to freedom of expression, in the same way that privacy, for example, is not in opposition to security.
Having said that, can the solutions for violence against women online be problematic for freedom of expression? Yes, of course, as all bills and laws can be.
How? Well, there are panelists here who have explained better than me some of the possible clashes: between definitions of hate speech and online harassment, or the persecution of anonymity, which is one of the key pillars of freedom of expression, or making intermediaries liable for third‑party content.
But there is also another key issue that can affect freedom of expression on the Internet, which arises when public policies don't have a proper gender approach and therefore reinforce gender stereotypes and don't challenge economic and cultural power structures, not just in legal terms but in cultural and economic terms.
Finally, another thing that is needed is to build bridges between digital rights organizations and women's and gender organizations, in order to understand the possible tensions and work on better public policies. Thank you.
(Applause)
>> MARIANA VALENTE: Thank you, so much. And thank you so much for bringing public policy into the debate, not only in the legal perspective. So Hibah, please.
>> HIBAH HUSSAIN: Thank you, so much. And thank you all for being here during your lunch hour to talk about this issue. It's obviously something that is very top of mind right now.
Basically one of the reasons that I'm here and one of the reasons that Google cares about this issue is that we want the Internet to be a place where everybody can feel safe and comfortable and every kind of community can thrive.
If the Internet is just a place for a certain subset of people, that's not the Internet we want. That's not the Internet that is good for the company either. So that's kind of why we're here today.
And what I want to do today is give a high-level view of policy approaches; go into the specifics of what is allowed, what is not allowed, how things are taken down, things like that; talk about what my co-panelists said about free expression and free speech, and how to make sure that we have balanced approaches that don't just apply to all kinds of content.
And then identify some possible solutions and talk a little bit about something that you raised: how to address some of the offline roots of these issues, because often these issues are rooted in offline power imbalances. You can get rid of all the online content, but those kinds of inequalities and the repressive offline structures are still there. So we view that as part and parcel of the solution to this issue.
So, you know, I think we are all here today because we believe that the Internet and social media can be a positive and empowering place, especially for marginalized communities. It can be a place where people can really find this sort of empowerment and find their voices, especially when they don't have that privilege in the offline world.
So our goal is to really figure out how to promote that positive community and speech while minimizing the negative stuff and protecting people's safety. Because as many of my co-panelists identified, if somebody feels that they can't speak safely online, that's also a free expression issue.
So basically, at a high level, we don't think it's appropriate for an intermediary company to determine what is or isn't offensive. That's not the role that we want to play, because what is offensive in one place or one community is very different elsewhere; it's very subjective, and it depends a lot on context. Honestly, it depends not just on context and culture but on the individuals themselves.
So what we do is we rely on users to flag content. That varies across the platforms. On YouTube you can flag a video. On Blogger you can request that things get removed. Things like that. There's a legal removals page.
So our kind of goal is to make it as easy and intuitive as possible to flag objectionable content. I will be the first to admit this is a huge challenge for us. I think some of the biggest challenges are building scalable systems because there are so many users around the world flagging so much content that the sheer scale is often incredibly tough to deal with.
Understanding context, as I mentioned, it's really difficult because we don't want to crack down on, for example, consensual sexual imagery. It's okay. And there are a lot of people on this panel who I'm sure would agree that we don't want to crack down on sexual expression, that would be harmful for a lot of communities online.
So understanding context. And then recognizing ways to work with offline partners, especially offline partners around the world ‑‑ local women's organizations, human rights organizations, who understand the local context and have been working in this space for decades ‑‑ and really making sure we have those partnerships going.
So one example of a recent policy change we have had, based actually on a lot of Civil Society feedback and partner feedback, is that we no longer allow revenge porn in search results on any of our platforms. What kind of prompted this decision is that we realized that this content doesn't really serve any purpose aside from the degradation of victims. And so unlike a lot of the right to be forgotten initiatives, unlike a lot of that stuff, this policy is narrow and it's designed to have minimal impact on free expression. It's basically an extension of our policies surrounding personally identifiable information. We don't allow people to post people's bank account numbers and things like that, so it falls into that category of very sensitive information that shouldn't be easy to find online.
So I think a lot of my co‑panelists have done an amazing job of highlighting some of the free expression issues at stake here, and I do think that framing this not as a binary but as a balance ‑‑ a balance between different types of speech and different people's right to expression ‑‑ is important. So we are keeping that in mind as we think about solutions.
In addition to making tools easier to use and more intuitive and having local partnerships with traditional human rights and women's organizations, we are also focusing on digital safety and security and digital literacy: making sure that when people post online they fully understand the implications of their posts, that they're able to control their data, and that they have the knowledge of how this information could be shared online. And especially for young girls, who may or may not be thinking long‑term about their digital footprint, making sure that people know that things could be shared outside of their control.
We have also really worked on a lot of counter speech efforts. When I was talking about this on Monday ‑‑ or it might have been earlier today ‑‑ somebody mentioned that counter speech is all fine and good, but you have to realize that not everybody is equally privileged offline and online. So we are figuring out how to empower voices while keeping existing inequalities in mind and trying to really address those. It's hard, and there isn't a one size fits all solution, but we have found it really, really effective, not just in this area but also when it comes to countering violent extremism and hate speech and some of those other areas.
So we think ‑‑ and I'm running out of time, and I would be happy to talk to any of you about more product‑specific stuff ‑‑ but at the end of the day we don't allow content that promotes or condones violence or incites hatred on the basis of sexual orientation, gender identity, gender, race, ethnic origin, age, nationality, et cetera.
So that's kind of the approach. Obviously the implementation varies a lot and that's where things get tricky, and I'm happy to talk more about that.
Finally, a lot of the solutions that my panelists identified get to the crux of the point which is you can't address this from one point of view. You can't have a purely technical solution or a purely legal solution or a purely policy decision solution. It has to be a multistakeholder and also multi‑sectoral effort to deal with this. Thank you.
(Applause)
>> MARIANA VALENTE: Thank you so much. I'm really sorry, but we are so short of time. I think we deserved like one extra hour to discuss this. So I'll ask if you have any questions. We will give space to a maximum of three people to make very short questions or comments. I'll be very strict on that. And then we can come back and have also brief answers and, sorry, maybe continue the conversation in the corridors.
So is there a microphone that we can hand to? Okay. So you must come.
>> AUDIENCE: Well, very quickly. It is a shame that there is so little time for this table. What I want to say is that rights shall not be thought of as (?) independent. They are not separated from practice, neither when they are created nor in the way they are exercised.
And what I think is that in patriarchal countries, with these minority situations in which the context shows that people are harassed and constantly assaulted as women or gay or lesbian or trans people, the way in which we think about rights and the way in which we assess rights should be different; because even, for example, in the case of violence, when the standard is thought to be high in countries such as Mexico or Argentina or others in Latin America, it is not so different to have offenses against women and incitement of violence. In those cases context shall be much more important than it is right now.
>> MARIANA VALENTE: Who else wanted to speak? I'm sorry, what is your name?
>> AUDIENCE: My name is (?), I'm from Nigeria. One question is: what is the level of awareness? And can there be differences between choices and rights?
>> MARIANA VALENTE: Could you maybe clarify your question?
>> AUDIENCE: Yes. My question is, can we differentiate between choices and rights? And then the first one is: what is the level of our awareness, especially among young girls, to avoid violence or infringement? Because in Africa we have a lot of challenges. The ladies are almost not informed. They don't know what they deserve, so they don't know how to go about it. So how do we address this? Thank you.
>> MARIANA VALENTE: Anyone else who wants to ask a short question? And then I'll give everyone 30 seconds to answer and make final remarks.
>> AUDIENCE: Thank you very much for this wonderful panel. I have two specific questions that I would love to hear comments on, especially on the revenge porn mechanism that we now have. In APC women, and Erika specifically, have placed a lot of emphasis on this: they prefer to call it non‑consensual dissemination of intimate images, but that opens the door a lot. So I was wondering what your criteria are regarding this aspect?
And the second is just a general remark about the IGF and to all the people that are here listening. When we touch these topics, I think there's a barrier of conversation that regards (?). So I just want to put that on the table and call it by its name. It's very difficult, even in multistakeholder forums like this, to overcome this and speak as equals, but we are getting there. So that's my one specific question, and the other is a remark.
>> MARIANA VALENTE: Thank you very much. So can we start with Dafne for answers and final remarks?
>> DAFNE PLOU: Well, I fully agree with the idea that freedom of expression shouldn't collide with women's rights. And, you know, we need to go on advancing our rights, and we need to go on sharing all this information and getting people aware of what is going on; that doesn't mean that feminists want censorship, like Erika said.
So I think it's very good for all of us to be here discussing this, and very good for the IGF to have opened these possibilities. So I think we should go on discussing and thinking, as we said, about public policies, thinking about intermediary liability and all the issues involved in that, and thinking also about women's freedom to express themselves.
>> MARIANA VALENTE: Thank you. Paz? 30 seconds.
>> PAZ PENA: Thank you, yes. As pointed out, online harassment is different in different countries. In that sense it would be great, for example, if Google could publish a public report about the revenge porn measures they are taking, because it would be great to have evidence to understand the differences in this problem across countries and regions around the world. Thank you.
>> ANA DE FREITAS: First of all, I'd like to thank you guys for being here discussing the issue. I guess the most important part of being here is the fact that ‑‑ you mentioned before that the IGF has been discussing gender violence online a lot more than we expected. And that's the fundamental thing, I guess, because the awareness is being properly raised. No one can ignore it anymore. So then we can go straight to proper actions. I see that as very positive.
I hope that in a future not very far away, things like what happened to me are less and less common in Brazil and elsewhere. So thank you very much.
>> MARIANA VALENTE: Thank you very much, Ana, for coming. So Hibah?
>> HIBAH HUSSAIN: Thank you so much. I just wanted to address what I can of the two questions. One is that I think the first one really addressed the need for more digital literacy training. A lot of people often come online without it. And we think of it as part of our responsibility, too, to make sure that people know how to use tools and technologies adequately instead of just having them dropped on them. So that's really something we are working on a lot.
The second part: I think we are still working on what exactly constitutes revenge porn, but the core kernel is the non‑consensual aspect. And I would love to chat with you more about defining it more clearly, indicators we should be looking at, and things like that.
>> MARIANA VALENTE: Anybody else?
>> GABRIELLE GUILLEMIN: On the first point, about how the notion of harassment has been considered and the context, I would like to point out in relation to incitement to violence (?) that there is a plan of action that specifically says that context is very important: you have to take into account who the speaker is and a range of other factors.
On revenge porn, I think generally speaking I would agree it's a breach of privacy. I think it also raises all sorts of different issues depending on the particular approach to revenge porn; in some countries it's criminalization, so what does that mean? How the offense is phrased also comes into play.
And also, I think in some places, if it's drafted too broadly, what does that mean, for example, for reporting? It's not high‑flying journalism necessarily, but what does it mean for women who may engage in a relationship with a powerful individual? So I think sometimes, although in very narrow cases, these are aspects that should be taken into account. Thank you.
>> ERIKA SMITH: Well, and as we know when women are using their agency and their bodies to express themselves as subjects, they're the first to be caught up in decency laws. And, in fact, many of the revenge porn laws have been used against women who are freely and happily expressing their sexuality. So an excellent point. And you don't have to be involved with a powerful figure to have that problem.
I think there are a lot of elements. I think the question of mobbing is a really difficult thing. Each one on its own is maybe not a violation of anyone's freedom of expression, but when you get to the swarm, when you get to the mob, there's a problem. And that's not something that needs to be criminalized, but there needs to be a way to work with Internet intermediaries to address it.
I think that's a big issue to uncrack more and talk about more because even if each and every message is not a threat, the fact of your having to individually deal with that many from that many different sources is a problem. And they're mobbing against all sorts of people; a lot of it is gender‑based and that doesn't just mean against women.
I think another really important thing, regarding what Paz was saying ‑‑ and I really want to push this because there is so much polarization around this particular issue ‑‑ is that different feminists have a lot of different viewpoints about what they feel is okay to be online and what they feel is okay in terms of threatening expressions. And many have taken stands of absolutely no tolerance for the normalization of violence.
And there's a structural reason as to why they got to that stance. So it is not helpful if our beloved Internet rights activists and others, as Paz is saying, don't also get pissed off along with the rest of us. And we really need to attack that polarization together from many different spheres.
And I guess ‑‑ I'll stop there.
(Laughter)
>> BIA BARBOSA: We need to avoid the polarization and we need to be a bridge, so I think this is the way we should deal with that. I'm from here. And unfortunately, for example, the traditional women's rights organizations that are here in Joao Pessoa are not here at IGF; and when we from Internet organizations go to the traditional human rights spaces, we are only one, two, three people. So I think this is one real goal that we should pursue for the next period, because otherwise we won't solve anything by polarizing or leaving each one working separately in our own universes. So thank you so much.
>> ERIKA SMITH: I remember what I was going to add and it was what Bishaka Datta said in another session. I think it's really important to put out here. We need to also be talking a lot more about sexual freedom of expression, and consent.
>> MARIANA VALENTE: So thank you so much, everyone.
(Applause)
>> MARIANA VALENTE: Let's give an applause.
(Applause)
>> MARIANA VALENTE: It's a really interesting mixture of people from different backgrounds and really important presentations here, so I thank you very much. Thank you for making the debate possible. One last thing: tomorrow at lunch time there's the GIS Watch launch, which this year is about gender and the Internet.
>> Sexual rights.
>> MARIANA VALENTE: Sexual rights, exactly. And we also left some material about our research over there in case you're interested, we would love to have your feedback. Thank you, so much.
(Session concluded at 13:20.)