2015 11 11 BPF Online Abuse and Gender-Based Violence Against Women Workshop Room 6

The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

 

***

 

>> JAC KEE:  Hello.  Good morning.  How are you feeling?  Good?  Awesome?  Fantastic?  Needing coffee?  I do.  Thank you very much for making it to such an early panel this morning on day two.  What we will be doing is really looking at the best practice forum on countering online abuse and gender‑based violence.  We have only an hour and a half and we also have as you can see a fantastic ‑‑ woo ‑‑ panel of speakers.  Actually, more like ‑‑ we are trying to do this like a talk show.  I'm your fabulous talk show host.  Thank you.  Thank you.  I even dressed up.  So what we will try to do is go through some of the key highlights and recommendations from the best practice forum throughout but open it up at different junctures for inputs and responses and, of course, in a talk show, the audience is very important.  The audience participation.  So please also feel free to kind of jump in and have questions or provide your insights and inputs and so on.

Because we only have an hour and a half, I will start relatively on time, quite quickly, even though I know people will be trickling in as the day goes along.  So let us start by maybe talking about the methodology.  I will introduce Anri van der Spuy, who has been critical to this process; without her, this BPF really would not have become what we have today.  It's a 162-page document, which you are more than welcome to look through on the IGF website.

>> ANRI VAN DER SPUY: Is this working?  Okay.  Great, hi, everyone.  (Feedback).

Okay.  Let's try again.  I have the lovely task of talking about methodology, which normally bores people, but I think this is important, especially for a topic like this one.  To start off with, I think we should put this in the context of what best practice forums are in the IGF, and quickly explain how this fits into what the IGF does.  These are intersessional activities; it's what we do throughout the year and not just at the IGF, like at this meeting.  So it's a lovely opportunity to work as a community throughout the year to produce something hopefully tangible that people can take back to their communities and hopefully use and improve on whatever topic we are looking at.

So this year, there are six best practice forums, and most of them, or four out of the six, cover what we normally call narrow issues in Internet governance, so more technical, and two of them are broader.  The other one is multistakeholder mechanisms, and then this one.

What makes these BPFs work is that they have methodological freedom.  These topics are so different that you can adopt whatever methodology works in each context.  We have a mandate from the MAG, which is the Multistakeholder Advisory Group, to look at the online abuse of women.  One of the first things to do is a mapping exercise of how we want to address this.  We used Etherpad, and we basically mapped what we wanted to cover in this BPF.  It's a wide topic and we thought we should prioritize certain aspects of this work.

So we refined the mandate, or the scope, given by the MAG, and we are fully allowed to do that.  Jac will talk about the recommendations, and one of the recommendations of the best practice forum is that we need to look at definitions, et cetera, more carefully in how we address this very important topic.

The primary goal, if I can move on to the methodology, was to gather stakeholder input and to get as many people from diverse parts of our community involved in addressing this from different aspects.

So we followed the normal IGF, you know, methodology or approach of being bottom up, open, transparent and inclusive so all of our meetings, et cetera, were open.  Anyone could join and we had a lot of people coming in throughout the nine months and joining late and participating when they could.

So our general approach was to have meetings every two weeks, and those were scheduled using Doodle polls to make them accessible for people all over the world.  After each meeting, we would distribute meeting summaries and meeting audio to help people who couldn't attend.  The other thing we used was the BPF's dedicated mailing list.  That was used quite often; we asked people to participate and quite a lot of people subscribed to that mailing list.  We also had an online platform, which you can still see on the website, and which we used to publish any documents we were asking feedback on, the meeting summaries, et cetera.

As I mentioned, for editing platforms we used Google Docs and Etherpads, so people could see where we were and could comment directly.  And last but not least, social media.  Whenever we were doing something, we would call for input on social media, and we had a specific campaign related to social media, which I will refer to a little bit later.

The first output was the synthesis document.  We had topics or headings and we started populating those in our meetings.  So for over two months, we had this on Google Docs, and people could go in and edit it directly, put their comments or add content, and we would extract sections from that document and post them for input, asking people for direct inputs.  We realized, however, that the group was quite limited and we needed to get input in different ways.  So we started developing a more targeted approach to getting wider stakeholder input.  And to get back to the meetings, the problem is that these meetings tend to have a core group of participants who are, you know, very enthusiastic, and that's great, but we needed to get more input.

We had basically four other approaches.  So a real mixed methodology, if you will, including a survey, case studies, direct input on our drafts, and a social media campaign.  I will quickly go into those.  The survey was drafted during one of our meetings.  We asked the community for input on the questions and basically distributed it on Facebook, et cetera.  And we received 56 responses, which, if you know IGF structures, is a lot.  And we were particularly happy with the fact that many of those responses came from developing countries, and a nice mixture of countries:  25 different countries.

In the Africa region, we had Uganda, Tunisia, Gambia; we had, you know, a really nice mixture of countries and input also from different stakeholder groups.  So that was nice.

And while it's not a representative study of the population, it does give us a snapshot of what people were thinking about this issue.  And that can be found in appendix 2 of the draft document, which is on our page.

In addition to that, we asked people for case studies, and those are both formal and informal.  We didn't have any guidelines for what a case study should look like.  You can see them in appendix 3 of the document.  Basically, we had input from quite a few different individuals and countries; you can see the list of countries on that slide.  We have Afghanistan, the Philippines, et cetera, et cetera.  Those, like the survey responses, were input directly into draft two, which I will talk about in a bit.

And then the last, you know, direct measure of getting more input was the social media campaign, which we planned for the same time that the second draft was up for review on the website.  It was around impact, because we wanted to know what people feel specifically about the consequences of online violence, and the question was:  what impact does online violence have on women and girls?  The tag was #takebackthetech.  We also had a lot of support.

Jac will talk more about the social media campaign.  As some of you know, it attracted a lot of attention, not necessarily positive attention, but over one weekend, we had 25,000 tweets.  And a lot of other media, videos and a lot of emails.  At the least, it provided us with a good case study of why this issue should be studied.

That will be in appendix 5; unfortunately, it's not in the current draft, but it will be in the next draft.  So draft two was basically populated from the sources that I just mentioned, the case studies and surveys; the social media campaign was not incorporated in that.  It was published on the IGF review platform with all the other BPFs.  It attracted 96 comments, which is also significant when you compare it to other BPFs.  36 unique commentators.  I must mention, though, that ‑‑ (Off microphone comment).

George Orwell, 1984 type of commentators.  We used those and did a proper analysis of them, and identified nine common codes.  It was interesting to see what the main concerns and inputs of the participants were; really valuable sometimes, some not so valuable, but they were all analyzed individually, and for each one of them, we gave an action.  So we said:  with this comment, we did this, and it falls under this action.  And it was incorporated into our current draft, which is draft JP, and it's on the IGF's website at the moment.

Draft JP is the one we are receiving input on, and that's why the dot, dot, dot is there.  We will update that and hopefully have a final version in December.  Thank you.

>> JAC KEE: Thank you very much, Anri.  As you can see, it's been a very comprehensive methodology.  We really did put significant effort into trying to open up the process and get as much input as possible from as many stakeholders as possible, and this included attracting the attention of potentially bad actors.

So when we were running the social media campaign, there was actually an announced, targeted attempt to hijack the hashtag with which we were trying to run the campaign.  We tried to do some level of analysis, but to be honest, it took up a lot of energy and resources to try to respond.  And in the process of trying to really facilitate an open conversation, it really raised questions around what happens when an open and participatory platform becomes targeted by bad actors who are not interested in having a constructive conversation but really interested in shutting you down or creating arguments; not interested in dialogue but more interested in stopping dialogue.  That also gave us a lot of interesting food for thought.

Like, there's nothing like being steeped in hot tea to know the flavors of it.  And it pointed out why this work is quite critical, and the complexities, and quite a lot of misunderstanding, of what the topic is about.

So before I begin going into the findings, maybe I can introduce you to our guests for today.  Let's start from the left.  We have Rebecca MacKinnon ‑‑ look, I have this wonderful list.  She works on Ranking Digital Rights and Global Voices.  And then we have Gary Fowlie, the head of the ITU liaison office to the UN, and then we have Agustina Callegari from Buenos Aires and Narelle Clark.  And then Anri, and I think we need to give Anri a round of applause.

(Applause).

Louder, louder.

(Applause)

And then we have Nighat Dad, who is the founder of the Digital Rights Foundation, and then Hibah Hussein, and then Mariana Valente, who is the director of InternetLab in Brazil, and then we have Frane Maroevic, and then Professor David Kaye, who is somehow sitting in the shadows and I'm not sure why.  Today he's feeling a little bit shy.  Oh, yes, Patrick.  Yes, come.  Yes.  And we have a late participant, but a very valuable one:  Patrick Penninckx from the Council of Europe, head of the Information Society Department.

We have so many wonderful brains that we are actually crowded into the corner, but hopefully we get to have some time to really get some quite important insights from all of this.

So let's start with maybe looking at some of the key highlights and recommendations so that I can hopefully share with you the product of this nine months of work.

Firstly, I think one of the things that we really found is that definition is complex.  What is this topic about?  It covers a whole range of things, a whole range of acts.  People refer to it differently.

Some people called it cyber violence, some called it violence against women, some called it online violence against women, and so on and so forth.  So one of the things that we found was really important was that we really needed a more comprehensive understanding of the issue.

And so we needed a comprehensive yet flexible definition, with wider and global recognition as well.  We understood that the area is getting increased recognition, which is a great thing, but there are so many things that are also changing and unraveling as we speak.  We are getting to understand the issue more:  how it's manifesting itself, what are the different ways in which it affects different people.  So the definition to some extent needed to be quite flexible to be able to incorporate this, but it needed to be quite comprehensive as well.

So what we thought is that it's important to just start with human rights.  Human rights offline apply equally online.  It's not too complicated.  It is a human rights issue, and not to get distracted or sidetracked:  it's a human rights issue, and it's important to locate it as part of gender‑based violence.  That gives you the first solid ground to work from, because gender‑based violence is very much linked to the issue of discrimination and the violation of rights, and is part of perpetuating that cycle.  But what is particular about this is then:  how do ICTs and the Internet impact on gender‑based violence, how are they used, and how can we unpack this?

And finally, what was really important is that there is a need to understand and address the underlying causes that promote the problem, so not focusing on the surface or the technology but really opening up the root causes.  What is it?  And at the end of the day, it's really an issue of discrimination; it's an issue of power and inequality.

So, I was going to ask Agustina, because you inputted into the best practice forum document a different dimension of looking at violence against women, which could help us move some of our definitions forward.  Would you like to share with us a little bit?

>> AGUSTINA CALLEGARI: No, it's okay.  Hi, everyone.  My name is Agustina, I'm from Argentina, so be patient with my English, please.  I work at the data protection authority at the ombudsman office in Buenos Aires.  It's a human rights organization.  So I totally agree with that, that human rights have to be applied to the Internet.

What we are trying to do is to address the issue not only from an empirical focus but also from a general sociological perspective.  That's why I try to highlight the importance of the different dimensions of violence against women in general, not only online.  So I base my comment on a report made by Roberto Castro from Brazil.  They identified three dimensions of violence against women.  First of all, the conceptual one, where it's necessary to differentiate emotional violence from physical and economic violence.  And then the temporal dimension, which distinguishes episodic violence, where a single episode of violence takes place, from chronic violence, which is long‑term violence; we have to distinguish between both of them.

I think in the Internet case we are focusing on chronic violence, because when you put any mention or comment online against a woman, it will be online for the long term; it's not only for the moment.  As a data protection and privacy center, we are trying to focus on this issue, because we don't believe it's a problem of the moment, because of the way information flows on the Internet.

And lastly, we have the evaluative dimension, which makes a difference between violence measured by objective standards and violence that is perceived by women and men.  I think we saw that in the comments, or the trolls:  it's not the same what people perceive in each case.

So I think we should take into account this sociological theory when we speak about violence against women, and we are trying to do that at the data protection center.  I also want to highlight that there are laws that address the violation of related rights, like privacy in this case, and there are some that address gender specifically.  So we are trying to address the issues not only focusing on data protection, but also taking into account the national law about violence against women.

>> JAC KEE: Fantastic.  That's one of the findings:  what are the existing remedies and existing laws, and how do we apply a new understanding to them?  And I think the concept of temporality was actually quite interesting to unpack first.  For example, in the case of harassment, one tweet on its own may not look like much, but when you get 25,000 tweets, it's a lot.  How do we conceptually address this issue in terms of temporality?  That's really very useful.

In terms of addressing the underlying causes, we thought it was important to address specific contexts and intersections.  So in the BPF document, we actually identified six.  We looked at girls and young women.  We looked at women in rural contexts; religion, culture and morality.  We looked at women of diverse sexualities and gender identities, and women with disabilities, which was a gap, which was unfortunate, because we thought this was a critical area that needed more interrogation; it really has an impact in terms of access to technology as well.

And then public women and women in technology fields.  You can check out the report to look into these sections a little bit more.  I won't go into them in detail, but what I would like to hear is from Mariana and Nighat and Frane about the role of specific contexts.  Let's start with Mariana, on the context in Brazil.

>> MARIANA VALENTE: Good morning, everyone.  So to present myself again, I'm Mariana, I'm the director of InternetLab, and at InternetLab we are developing research about gender and the Internet, focusing especially on revenge porn.  Jac, I'm really glad you asked me that.  The input I gave to the BPF report was related to a specific case, the "top ten" case, which we are still developing; we are actually publishing part of it tomorrow in a report.  It was a very special case for research because it was a case in which we could see how these markers were operating.

In our case, especially age, class and religion.  We were trying to look at revenge porn in Brazil, and we started looking at the cases in the media, and then we first heard of what was happening in Sao Paulo, where the breaches of girls' intimacy were operating very differently.  It became very important, if we wanted to identify how this type of violence happens in Brazil, that we took these variables into consideration.

The top ten case is a practice that's been developing amongst teenagers, 12 to 15 years old, especially in big cities in Brazil.  We are studying two neighborhoods in Sao Paulo, and religion is a strong issue.  We are speaking about girls, usually from religious families in poor areas of Sao Paulo, and what happens in the top ten is that lists are being developed by boys.  It's a sort of shaming of girls.  They are ranked according to their sexual behavior, and these boys create videos in which they put the images of the girls with phrases about their sexual behavior.  When they are spreading these videos on WhatsApp, they sometimes use nudity, but if they upload them to YouTube, it's very easy to get nudity taken down from YouTube.  So they are just using girls' Facebook profile pictures with phrases about their sexualities, and girls keep going up and down in these rankings each week.  These rankings usually correspond to areas such as schools or small parts of neighborhoods.  So we are following that case.  It's very interesting, especially because the case goes inside and outside the Internet all the time.  It's led to all types of harassment in schools and graffiti on neighborhood walls.

What seems important when considering class and religion and age is that confronting the problem in this case is very different from the other cases that we were following in the media, and sometimes the solutions suggested by the community and the activists we were speaking to were very different from the solutions that we had been seeing in the media or even in feminist groups.

For example, we have a legal approach.  So we were wishing to see if the state could confront revenge porn in Brazil.  And the first time we went to speak to these activists, we were asking, oh, do you think anything could be changed in law, or what do you do?  Do you think there are resources in law to deal with that?  And the question was actually received with fear, because we are speaking of neighborhoods in which state presence is very low in Brazil.  It's held generally through the police, which is an organization seen to maybe harass more than protect people in these neighborhoods.  So even speaking of criminalization was different in these areas.  So I think that approaching this case from this perspective was very important.

>> JAC KEE: Thank you very much.  We find that responses that complement the legislative or the policy approach are needed; that approach may not necessarily be the best one.  Nighat.

>> NIGHAT DAD: My name is Nighat, I'm from Pakistan.  I work for the Digital Rights Foundation.  Unfortunately, in Pakistan there is no sex‑disaggregated data on Internet users, but what we are witnessing is that as the number of Internet users increases, violence against women is also increasing.  But there is no mechanism for how to report these cases of violence, no legal mechanism; there is a government authority, but it is really useless at this point in time.

I just want to talk about the social context in terms of online harassment, and a recent case which is actually related to Facebook, in Peshawar.  Women are using the Internet there, but just to understand the social context:  most of the families do not know that the girls are using social media platforms or that they are sharing their data or whatever.

And when they face online harassment, they absolutely have no idea where to go and how to report the cases.  They cannot go back to their families and ask for help or support.

So recently two hackers hacked different women's profiles and disclosed their pictures.  Those are not intimate pictures; those are just profile pictures that people can easily download from Facebook.  They released their names, their college names, their addresses and their phone numbers, making these pages on Facebook.  These women have been reporting these pages to Facebook to get them taken down, because those pages and all that information are putting their lives at risk.

And some of the families got to know about it, because Peshawar is not a really big city; people mostly know each other.  The result was that some families stopped their girls from going to the colleges or the universities, and some girls were actually confronted by their fathers:  why had they shared their pictures on Facebook?

And whenever they reported the pages to Facebook, Facebook always came back saying that these pages do not violate our community guidelines.  Then I got in touch with the Facebook policy team and tried to make them understand that they need to understand the social context.  Maybe some forms of online violence are not really dangerous or risky in one society, but in some societies they are really putting women's lives at risk.  So for the platforms, I think it's very important to understand the social context, to understand the local languages and to build their capacities.  They are making money out of our data, right?

So, I mean, we are their product.  I think they need to understand that they are there for the users and respect their privacy and understand the social context.

>> JAC KEE: Thank you, Nighat.  It's reminding me of a case in another area.  So this girl, she's in school.  Somebody created a Facebook page of her, with her and her boyfriend, just like that.  And she's in a rural part of Malaysia, from a low‑income family.  The family found out about it because the teacher told them.  So the teacher got involved and told the family, and the family pulled her out of school.  We are talking about the right to education also being impacted by something as simple as that.  This unpacks a lot, not just around underlying causes, but also the impact, and what can you do about it?

Frane, what about your work around protection from online violence?

>> FRANE MAROEVIC:  My name is Frane.  And I'm the director of the office of the OSCE Representative on Freedom of the Media, which is an intergovernmental organization covering 57 countries of the northern hemisphere, so the US, Canada, Europe, and all the states of the former Soviet Union.  We have been looking very much at this issue from the context of the safety of female journalists.  Throughout this year, we have been looking into how female journalists have been affected by online abuse, because it's quite clear it affects their safety.  Safety is one of the prerequisites of freedom of the media and freedom of expression.

And so the representative has issued a number of recommendations to the governments because governments are essentially our main counterparts in this area.  We work directly with them.

They should really work to recognize that the threats of online abuse directly attack freedom of the media and freedom of expression, because essentially, just as we hear in other contexts of abused women being taken out of education, the journalists who are being abused sometimes take themselves off social media.  They may choose not to report on certain issues or certain topics because of the abuse that they have suffered.  So that's leading to censorship.  It's also important that the law enforcement agencies understand this issue and treat it with seriousness, because what we have heard in the past is that when women have come to report such abuse to law enforcement, they are not taken particularly seriously, because it's not understood that an online threat is just as real as a threat in the real world.

In some contexts, from what we have heard, it's even more painful and more real:  it's a threat that you quite often receive inside your house, a place where you should feel safe, and you are being threatened through your online activity.  It's something that governments and authorities need to do more work on, with law enforcement but also with prosecution and the judiciary, to understand the context and to deal with it.

What we do also see is that there is no real need to introduce new criminal legislation in this area.  There are existing criminal laws that can deal with this, and any new legislation can quite easily stifle freedom of expression and freedom of the media, which could be quite problematic.  So there is legislation that deals with threats, that deals with threats of violence, and quite often these threats are of an extremely graphic and explicit nature.  So there's a fairly easy link to make to the threat being a credible threat.

I'm glad to hear that there's some research and data being collected about this because we feel it's not a new issue, but it's an issue that people are coming around to and I'm glad to hear it here at the IGF.  It's now been mentioned in a number of forums.  So it's something that everybody is becoming more aware of, but we need more data and understanding of the effects of the abuse, but also the sources of the abuse.

>> JAC KEE: Thanks, Frane.  Too close to my mouth.  Thank you very much, that's very, very useful, and thanks for pointing out the impacts of threats online.  Often we hear that they are not as real as offline threats.  It's just online; if you don't like it, just shut it off.  I think it's important to break down this distinction, because it's actually not true.  The online is located within the offline.  There's no real distinction; what we need to unpack is the relationship, rather than the distinction.  Before I open it to responses, maybe I would like to bring in audience participation.  Does anyone have any comments or questions, or even cases that you would like to share?  Too early?

Okay.

>> AUDIENCE MEMBER: Good morning, I'm Beton Lashere.  One comment on the reaction of Facebook to take down a page that reveals the private data of people:  is that really not something that would be forbidden according to the terms of service?  I'm maybe not familiar enough, but I would think that disclosing private information on a page about a third person should be something that's taken into account, and if it's not, maybe it's an avenue to discuss with them.

>> JAC KEE: Are we here?  Do we have Facebook?  I noted two responses, but let me take the comments.  No?  Okay.

>> AUDIENCE MEMBER: (Awaiting English interpretation).

>> I have been reading reports.  I have been aware of the issue and we are always talking about the problems but not solutions.

We're a family:  my mother is a lawyer, I'm a psychologist, and my brother is an information specialist, and that allowed us to have a holistic approach in my country, which is Bolivia, to look at solutions, and we have been working with the government.  They have supported us.  We have been able to do national campaigns and we have done prevention exercises with kids, over 20,000 in the country.

We have worked with kids from age 8 to 15, also, 17 to 18 years old and it allowed me to think this process through and think about what we do with kids in the prevention program.

And so I think that from a civil society perspective, we need to work on education and prevention, but this should also be an aspect of the social responsibility of service providers, as well as of the state, and those are two actors which are sometimes quite distant from that.

>> JAC KEE: Thank you very much.  We are running a little bit short on time, so I will ask you to hold your comments.  We have noted two responses.  We are very, very keen to move forward.  We have a few bits more before we open up again, and then after that, we will come back.  Is that okay?

Okay.  So we also looked at ‑‑ oh, okay.  So we are here.  So one of the things that often came up and I think it's also come up in some of the inputs is about the need to balance competing rights and interests and the need to kind of consider all kinds ‑‑ all rights and all interests.  And the two issues that always came up in particular, is the apparent tension between the freedom of expression and the need to address online violence against women.  And thankfully, we have two fantastic people who can help us think through some of this and the other tension that also came up around anonymity and privacy.  Anonymity is important for safety and expression.  But at the same time anonymity is something that can be used for abuse.  So how do we kind of balance between the two?

So let's start with maybe freedom of expression with Rebecca.  How do you think we should approach this tension?

>> REBECCA MacKINNON: Thank you so much.  So I have been spending a lot of time in the past couple of years looking at the role of companies and platforms in particular, not only through the Ranking Digital Rights project, but also through a UNESCO‑sponsored study ‑‑ I think Xianhong is here, and there will be discussion of it tomorrow ‑‑ and through supporting the work of the Manila Principles project on intermediary liability that is compatible with freedom of expression.  I guess one point I want to make is that in trying to resolve problems that occur on platforms, one of the reactions of policymakers is to hold the platforms legally responsible for the bad, evil, unacceptable behavior of many of the users of the platform.

And what we have seen from studies of intermediary liability legal regimes around the world is basically that strong liability ‑‑ when the law places strong legal responsibility on platforms to police content ‑‑ always leads to over‑censorship.  I'm not aware of any case where the censorship or the restrictions are done in a very sensitive way that only deals with real harassment and doesn't end up leading to the censorship of activists, and taking down the accounts of people who have a right to be speaking, who are engaging in legitimate speech, and women who are trying to get their message out.

So putting the responsibility on platforms often leads to platforms, whenever in doubt, just taking things down, because they don't have enough staff to look at every single case with enough nuance.  So that's a problem with heavy‑handed approaches to the law, which also end up not helping women.  We also found some cases of platforms policing speech where, from the perspective of violence against women, the wrong people end up getting censored, because the platform doesn't understand enough of what is going on.  That's always a danger when the platform feels it has to do something but doesn't have the staff to really understand well enough.

So one of the things with the Ranking Digital Rights project ‑‑ and we have other sessions where we talk about the project itself ‑‑ is that we're basically measuring companies according to a set of standards for respect for freedom of expression, among other things.

And we do not specifically have a question about, you know, what does the company do to control violence against women?  However, in the way we are structuring our approach, we assume that companies are going to want to have rules, and should have them ‑‑ it's not that it should be a free‑for‑all.  Companies do have a responsibility to set rules and enforce rules, but what's important is that there be transparency.

So what we are looking at is:  okay, is the company clear about what the rules are?  Does it communicate that clearly to the users?  Does the company engage with stakeholders about how to formulate its terms of service in a way that actually serves and respects the users' rights?  Is it formulating the terms of service also through the engagement of users?  Is it conducting human rights impact assessments, which we expect companies to do?  Is part of that assessment an assessment of their terms of service enforcement, and whether the terms of service and their enforcement are actually serving users well within a human rights context?  So we are looking at that kind of proactive thing.

We are also looking at ‑‑ to what extent companies are being transparent about what they are taking down, why they are taking it down, on whose request, and whether there are grievance mechanisms.

So whether there are adequate mechanisms for users to file a grievance if they feel their rights have been infringed in connection with the company's business, and whether there's some mechanism for redress.  The grievance mechanism ends up being really important.

And one final thing that we found through our research is that, you know, of the 16 companies we looked at in Ranking Digital Rights, only half report on their handling of government requests or third‑party requests for user information in general.

And then when it comes to data about what they remove, only 6 of the 16 companies we looked at release any information about the amount of content they are removing due to government requests.  Only four of the companies released any data on content removed due to private requests ‑‑ so NGOs or private individuals asking them to take down content.  And none of the companies released any information about the volume and the nature of content they are taking down or restricting in enforcement of their terms of service.

So part of the problem is that the process is really a black box.  There's a lack of accountability, not enough stakeholder engagement and assessment of what's going on, and I guess I would argue that perhaps that's a way forward that can help find the right balance, as opposed to the sort of heavy‑handed, you know, we‑will‑throw‑everybody‑in‑jail kind of approach.

>> JAC KEE: Yes, I don't think that kind of response has been very well directed at dealing with this issue.

And transparency is also one of the things that we find to be quite important, and an important way forward.  I think a lot of the freedom of expression work ‑‑ and I'm not sure if the Ranking Digital Rights project covers this ‑‑ there's something about how human rights sit within existing contexts and existing structural inequalities, and how do we take that into account when we are doing the process of balancing rights and interests?

So David, do you have a way forward for us?

>> DAVID KAYE: Coming from the shadows.  I think the number of people on this panel is a testament to the fact that when Jac says come, you just come to the panel.

So actually, I feel inadequate to the task here, because what everybody on the panel has said already, I think, covers quite a range of both the problem and some potential solutions.  And I guess I wanted to identify three or four issues that are more along the lines of questions, although I would start by saying that it's just critically important that people share ideas of practice, because there is good practice out there, but it's often buried, and so sessions like these are really important, and projects like this are really important, so that people have ideas about what's possible.

So the first question, as we are talking about human rights and freedom of expression, maybe sort of building off of Rebecca's comments:  the first thing is definition.  I mean, I think it's really important for us to have definitions of the problem that don't over‑regulate, because very often the tools that we would want to use in order to counter harassment will be the same tools that are used to censor, as Rebecca was suggesting.

So that's one thing, I think:  focusing on definition.

A kind of sub issue around definitions is whether any of the definitions vary according to the target of the harassment, in particular, whether we are talking about children or we are talking about adults.  I think there's some room there for distinctions about the kinds of restrictions that one might find acceptable depending on how that's operating.

The second is, I think, the big question to a certain extent:  who decides?  And when we think about who decides, we are talking about a range of different kinds of actors, right?  We are talking about, you know, social change, the kind of campaigning that APC and others do, which is critical in order to shed a light on harassment and to counter it in an effective, campaign‑oriented way.

There's the question of the corporate approach.  There's legal and state sanctions, which are available, and I agree with Frane's point about there being quite a great amount of existing law that is simply often not applied in this space, because of the idea that this is only expression when, in fact, harassment is also an act, not just an expression.

And then the last thing is just a question about user tools.  So I agree ‑‑ I mean, Jac, I think you made this point ‑‑ that we don't want to put the burden on the target of harassment to deal with it.  But at the same time, there are tools out there that we can use to shut it off to a certain extent, not to the full extent that is required.  But I think there needs to be some discussion about the tools we have in order to block, the tools that technologists can offer in order to allow us some control in the face of harassment, which is really a crisis in many places, as has been shown by the panelists here.

I will have to run, shortly, Jac, as I told you, but it's not personal.

>> JAC KEE: I will try not to take it too personally.  Before you run, one tricky question.  It's around anonymity, and you have come up with a report on that.  As feminist activists, we think anonymity is absolutely key; as was said a couple of days ago, anonymity has always been a political act of resistance in the history of feminist activism.  However, is there a point at which your right to anonymity and privacy is forfeited because you have abused it, essentially?  What is that point?

>> DAVID KAYE: What is that point?  That is maybe the ‑‑ I don't know, what we used to call the million‑dollar question.  That doesn't get you as far these days.  Honestly, I don't know what the dividing line is, but it's clear that ‑‑ I mean, as with all tools, even pen and paper, all tools are subject to abuse, and that's clearly the case with respect to anonymity.  And I think that ‑‑ I mean, my main concern, and one of the reasons why, you know, I did the report last spring on anonymity, is that anonymity is under threat as a general matter by law enforcement and intelligence agencies.

And I think the absence of any tools of anonymity would be a very, very serious threat to activists and just to ordinary people who are searching for ideas and ‑‑ about their own sexuality, about their heritage, whatever it might be.  So for me, I really wanted to flip the default, so that the default is anonymity ‑‑ I mean, I didn't say anonymity is a right, but anonymity is oftentimes a critical tool for people to enjoy their freedom of expression.

If at least you start there, then you can identify what are the problems that arise and what are the abuses of anonymity.  It's extremely hard to challenge anonymity, and then we end up in places like the Delfi case, where you have intermediary liability that can actually, you know, detract from freedom of expression too.  I don't know what the line is, and maybe that should be the next report.

The first step is to make the default that anonymity is allowed and then we look at what are the problems and then the solutions to the problems.

>> JAC KEE: It makes complete sense to me.  Thank you very much.  I would like to open it very briefly for one or two inputs and then we will go back.  One there and one there.

>> AUDIENCE MEMBER: Thank you, I'm a policy specialist on ICT.  I think this is a magnificent panel.  I think it's excellent that the people on the floor are putting pressure on state actors, such as Sida.  And I think that what we have been doing at Sida is to use a clear human rights based approach and power analysis when dealing with freedom of expression issues.  Gender‑based human rights violations in an online context are something that needs to be dealt with from the freedom of expression perspective.  This is something that should be discussed even more within our fields, of course, but together with experts, such as the ones on the floor.  I have papers showing how Sida works on these matters.  I will happily share them, and my cards, to discuss this.  Thank you so much for this magnificent forum.

>> JAC KEE: Thanks.

>> AUDIENCE MEMBER: Thank you.  I'm Julie Ward and I'm a Member of the European Parliament.  I'm on the culture and education committee and also the women's rights committee.  And I would be interested for people to say a bit more about women in public space, because as soon as I was elected and I made my first speech in the culture committee ‑‑ actually defending the Erasmus program, which should not be particularly contentious ‑‑ I was subjected to Twitter hate by an extreme right party, simply because I had spoken about a progressive Europe that wants young people to have mobility.  Because I was a woman and I dared to speak up, the abuse that I got was actually sexual abuse.

>> JAC KEE: That's actually one of the things that we are also finding.  Yes, people in public face abuse, but there's something specific about what women in public positions face, which is often targeted towards your sexuality and your gender, and it's of a different flavor and of a different volume, I think.

Okay.  So moving on, and ‑‑ and so before we start talking about some of the considerations and I know if you have to go, it's okay.

So we have already started thinking through some of the considerations for responses, and this was also a big area in which the BPF was really trying to get as many responses as possible, and we divided this according to sector, actually.  So the first group that we looked at is really around public sector initiatives, and what we found, as Rebecca and also Frane were saying, is that there's a need to prioritize redress over criminalization.  Oh, we need a new law ‑‑ no, we need relief and redress now; can we focus attention on that area instead?

And the second thing:  there's a need to recognize forms of harm beyond physical violence, because often the attention ‑‑ I think it's linked to crime as well ‑‑ is put just on physical violence, but there's a range of other kinds of impact, from psychological harm, to impact on mobility, to economic impact, to education, and so on and so forth.

And then there was also a need to prioritize access to justice.  I think this is often very key.  Even if you have the laws and the initiatives, the actual access to justice itself is not so simple.  This included the need for flexible and informal measures, specialized agencies and improved access.  And then we looked at the private sector.  As we can already see from the conversations, this really needed to be unpacked more.  Should Internet intermediaries have more of a responsibility, and how can this be implemented in a way that makes sense?

So one of the things the BPF said is to look at this, and perhaps the Ruggie framework is a useful way to do it:  the UN Ruggie framework on, gosh, the three pillars, which have completely escaped my brain now.  But you can search for it.  The other thing we looked at is the need to evaluate intermediary responsibilities, looking at initiatives and the different ways we can explore this.  Work like the Ranking Digital Rights project is very important for this, but how do we apply it in relation to understanding what is the intermediary's responsibility to address this issue, which is increasingly critical?

And then finally, same as with the state, actually:  even if there are measures and complaints mechanisms, there's a need for greater ease of reporting, not making it so that you have to jump through 3 million hoops before you can report something, or not really understanding the process.  And also transparency around it, not just in terms of how the complaints mechanism works, but how many reports do you get?  How many reports of harassment have you actually received?  How many have you responded to?  What are the different ways you can respond to this?

And training for staff; there is inadequate training.  Like, what do you do?  You can't expect staff to suddenly, automatically say, oh, I can recognize this specific issue ‑‑ like Nighat was saying, oh, this specific issue has this specific impact.

Then we looked at community‑led initiatives.  We understood many different actors to be doing different things.  There were a whole host of them, but we generally collected them under community‑led initiatives, and we looked at digital safety and training, and also at campaigns and awareness raising.  I think this is in particular for dealing with things like a massive amount of attacks, when the attacks are targeted and swarming.  This is where knowing that you are not alone in responding is quite critical.

And then there's the development of apps and technical solutions, and trying to figure out what is the best way to do this; for example, HarassMap in Egypt.

And help lines.  A lot of NGOs set up help lines to get out information and advice on what to do.

We also looked at multistakeholder and intergovernmental roles in facilitating multistakeholder responses.  Either way, responses have to be developed not just with current and future users but with different stakeholder groups, including the groups that have been working on gender‑based violence for a long time.  That level of consultation is key in trying to understand the issue and bringing it forward.

So now, let's ask the difficult question.  We have been talking so much about Internet intermediaries, and we have Hibah from Google here.  What is the right approach?

>> HIBAH HUSSEIN: Absolutely.  Thank you again for having me.  I think one of the reasons why I'm really excited to be here today is because, personally and as Google, we recognize that the Internet is only going to be this wonderful, robust place if everybody can participate online without fear of harassment, without fear of threats.

So, you know, we are trying to figure out how we can make these spaces for speech really dynamic and open, and make the Internet a place where communities who are marginalized can really find a good space for speech.  That said, we have heard loud and clear from civil society groups and other actors that you don't really want an intermediary making blanket decisions about what qualifies as harassment.  As a lot of my fellow panelists have said, a lot of this stuff is incredibly subtle.  You can't have a one‑size‑fits‑all solution for all of that; it would be incredibly damaging to free speech.

What we do is rely on users to flag content for us and, as David mentioned while he was here, we work on making the tools for reporting as easy to use and as intuitive as possible for users around the world.  That said, some of the things that we are dealing with are, you know, building scalable systems, understanding context and subtlety, and also trying to craft good policies that aren't overbroad.  So just to quickly give an overview of how we are handling these issues:  we don't allow content that promotes or condones violence, or that has the primary purpose of inciting hatred based on sexual orientation, gender identity, gender, race, ethnic origin, age, or veteran status.

We are really focusing on the fact that a lot of these issues, as my fellow panelists have mentioned, aren't just online issues.  They are issues with offline roots.  You can't just play whack‑a‑mole with the content online and expect the problem to go away.  We have been building partnerships with some of the groups in the BPF draft, some of the groups that have been in this space for a really long time.  We have partnerships with help lines.  We have a partnership with the National Network to End Domestic Violence to make sure that technology can facilitate safe spaces and can empower marginalized communities.

A couple of the other areas that we have been focusing on are digital literacy and safety; that's come up in the BPF draft and several of my fellow panelists have raised it as well:  making sure that people understand how to use technology in a way that's responsible and have full control over their online presence.

We have been focusing a lot on counter‑speech and helping people really take advantage of online platforms to get their message out and really push back a little bit and that's been really, really, really helpful.

So those are kind of our high level overviews.  I'm happy to dig deeper into product‑specific policies and such, but that's how we are approaching these.

>> JAC KEE: Do we have any other kind of private sector representation in the room?  Maybe from Twitter or Facebook, who might want to also speak, or any other kind of ‑‑ I mean, these are the familiar three.  But there could be other innovative private sector responses.  Hold that thought if you do.  Next, we can go to Gary.  The ITU has looked at this issue and prioritized it.  Why now?  Why do you see it as critical?  What do you think is the link between this issue and access?

>> GARY FOWLIE: Thank you.  I'm honored to be a part of this.  First of all, in terms of the private sector, maybe that's a place to start, and first of all with Google:  I believe that you recently announced that you are going to eliminate revenge porn from your search results, which I think is a bold step, and I hope it inspires others in the ICT industry to really fully engage in the problem.  That's part of it.

In terms of the private sector and public sector, we have the UN Broadband Commission for Sustainable Development, and as part of that, we have a Working Group on gender, which recently released a report that I'm supposed to tell you a bit about.  The report talks about combatting online violence against women and girls, and it is a worldwide wake‑up call.

You know, I think the results of it we have heard a lot of here.  It reaffirms what has been obvious; what wasn't obvious to some of us was the revengeful response that this discussion generates.  But the highlights are pretty straightforward:  only 26% of law enforcement agencies in the 86 countries surveyed are taking appropriate action; women in the range of 18 to 24 are uniquely likely to experience sexual harassment; and one in five Internet users live in countries where online harassment is unlikely to be punished.  None of this is probably surprising to anyone here.

And we also discovered, you know, that obviously women are reluctant to report their victimization as a result of fear of social repercussions, and I don't think that piece is particularly new.

I think what we do have that is a bit new is a need to look at some basic recommendations, things like what we are calling the three Ss:  sensitization, preventing cyber violence; safeguards, maintaining a responsible Internet infrastructure through technical solutions and more informed customer care practices ‑‑ and I will get to the role of education, I think, in a moment.

And sanctions ‑‑ probably this is the most controversial part ‑‑ developing and upholding laws, regulations and governance mechanisms which are there and are in place, but maybe are not being upheld in the manner that they should be.

So I think that brings me back to what I really want to talk about:  we have heard this is a human rights issue.  Yes, it clearly is, and it's not just a women's issue.  It's an issue of privacy, it's an issue of gender equality, and it's an issue of Article 19 of the UN rights declaration, that we have the right to express opinions through any media.  It's a very far‑reaching guarantee of free speech that was developed in 1946 but is only recently really being tested, thanks to the Internet.  So it's a nexus of those three human rights issues, and it's a very difficult nexus to balance.

But I'm going to, I think, get personal here.  Anybody in the audience today who, like me, shares an X and a Y chromosome would agree that the vast majority of our brothers are shocked and appalled by the behaviors of online stalkers.  I'm an old guy, and it's a fact that I'm the one talking about gender issues:  my father was one of six boys, I have no brothers, and I have three sons.  But I do know that, frankly, violence against women and girls, whether it's online or offline, is only going to stop when men and boys decide to stop it.

I think it's as simple as that, frankly, and yes, there are men and boys who are being harassed online but the research indicates that 90% of the intended victims of some types of online violence, especially revenge porn are women.  That's simply a fact.  And ensuring the safety and the security of these women and girls online is something that we all have to be concerned about, because it's not just a human rights issue to me.  It's a matter of human dignity.

You know, why are we treating women ‑‑ our sisters, our mothers, our daughters, our friends ‑‑ like this?  I just don't get it, frankly.  It's a matter of dignity, and I think it's up to men and boys, and the education of boys and men, that this is not acceptable.  This is not a matter of restricting your right to freedom of expression.  This is a matter of learning to treat people, your fellow human beings, with respect.  That's a very, very hard thing to do.  On that line, I will quit.  Thank you for this time.  But I really would like to see the men and the boys in this audience, and those that are online, really rising to this issue.  This is not just a women's issue.  This is an issue for all of us.  But it's only going to stop when the X and the Y chromosomes decide to stop it.

>> JAC KEE: I get it: everyone needs to take action.  So, Patrick, the Istanbul Convention really takes a strong approach to addressing the issue of violence against women as well.

>> PATRICK PENNINCKX: It certainly does.  But so much has been said already that I am really humbled by the interventions of the public and the panel here.  I think the basic starting point is that it is indeed a human rights issue, and if it's a human rights issue, it needs to be balanced with respect to other human rights.

And as David Kaye said at the beginning, we need to look at who is acting and what we can be doing.

And that is basically what the Istanbul Convention, a convention on combatting violence against women, tries to set out.  It's not just about criminalizing acts; it's also about what we can do and who can do what.  I think we are past the stage of lynching and revenge as the ultimate legal tools we possess to deal with these issues.  I think it's very important that we rely on clear legislation and clear regulations in order to see where we are going.  Whether those regulations and laws are worked out by governments and parliaments, or whether they are self-structured through Internet intermediaries and service providers, I think does not really matter all that much.

What we have to be careful about is that we do not legislate in many different directions.  Part of freedom of expression is also the right to offend, and with the right to offend, who is going to say what is offensive and what is not?  And who is then going to start deciding?  Of course, we are all like-minded in this room, but we also have to be extremely careful that we do not end up with a multitude of judges and a multitude of courts, each deciding in their own corner what is good and bad for us.

Because then we are going the wrong way; then we are opening ourselves to legislation that goes in different directions based on morals, on history, on religion and on other aspects.  It's important that we see the hierarchy: first, we define very clearly what we want to be doing; second, who has the responsibility to do what, whether that is courts and judges or Internet intermediaries; and third, and there are plenty of examples of this in the room, we see what positive action to undertake.  That positive action can also be part of legislation, and that's also the case with the Istanbul Convention: stimulating service providers to think about their own role, stimulating ethical thinking, internal controls, training, skills training and awareness raising.  Those are the items we need to be looking into.  It's a complex of things, and that's also what the Istanbul Convention tries to provide.

>> JAC KEE: Thank you very much, Patrick.  It's really helpful to think about it in terms of clear ways of addressing different things at different levels.  And last but not least, Narelle: one of the gaps we noted in the BPF, apart from looking at the context of women with disabilities, was what the technical community can do.  We looked at the state, the community and the private sector, but what about the technical community?

>> NARELLE CLARK: Well, normally when I speak in Australia, the first thing I do is acknowledge the traditional owners of the land and their elders past and present, so I will try to do that first here in Brazil as well, although I don't yet know who they are.

Some of the projects I have been involved with in Australia are relevant here.  We had a project running for a while on technology-facilitated stalking, violence and abuse, where we worked with the Domestic Violence Resource Centre Victoria and the legal centre and developed a guide to all of the different laws in place in Australia, right across the different sectors involved with online stalking and abuse.

As part of that analysis, they found over 70 pieces of legislation across seven main legal jurisdictions, Australia being a country with states inside it.  Very few of them have any harmony between them, so there was a whole raft of different pieces of legislation that could be used to get these guys.  But for the most part, what the domestic violence workers have been finding is that when they get down to the police station with a whole swag of evidence, the police don't want to act: "Look, love, just get off Facebook.  Just don't go there."  That's not an appropriate response, because that's where a lot of that woman's support is!

She shouldn't have to get off social media.  He should have to stop, or whoever else happens to be doing it.  So, if you will excuse that little rant on my part.

So one of the things we wanted to do was to put together a guide, smartsafe.org.au, on all the applicable pieces of legislation in Australia, and to put it in the hands of the domestic violence workers so they knew what type of evidence they could collect and what technical steps they could take to help women unshackle themselves from the various pieces of stalkerware and other mechanisms that were in place.  Simple things, like changing the privacy settings on one's phone.  People were happily using the phone that their ex-partner had bought for them and was still paying for, and through the account and data retention the ex-partner had access to all of that woman's data.  There are some things we are trying to unpick and unravel across all of that.

If I move back to what I should be talking about, which is what the Internet Society is trying to do as a whole: under the realm of the Internet Engineering Task Force, which is the happy land where all the technology standards for how the Internet works are developed, we have our layers.  There is a draft in the IETF, a request for comments on anti-bullying, and that draft is gradually making its way to RFC status.  We within the technical community have to tell each other how to be nice.  People get carried away with ideas and will champion them passionately, which is great and we should, but we should not degenerate into attacks.  We should be taking apart the ideas, not the people.

And so we, like a lot of other professional organizations, have now started to adopt standards for behavior within our own technical development communities.  That runs in parallel with the other safe spaces a lot of women have developed throughout the whole developer community.  So you have forums like Systers, Systers at the IETF and Systers more broadly, where women can come together, build support for each other, and be stronger in their development and in their sense of their own technical careers.

Security, including security online, is fundamental to the Internet Society's mission.  Recently the IAB, the Internet Architecture Board, released a paper on this.  These are the sorts of things we have fundamentally done throughout the creation of Internet standards: addressing security at all the fundamental levels and, more recently, putting privacy considerations in all the way up the technology stack.

But that alone doesn't stop violence against women.  These are just some of the tools that we have to help women, and that our male colleagues should use constructively rather than for abuse.

>> JAC KEE: And actually, that's a critical layer.  We look at all the different layers, but the technical layer is another critical one to address in looking at this issue, and it maybe doesn't get as much attention as it should.

Okay.  Before coming back to the panel, let's go to the audience for about five minutes, and I would like to ask you a specific question, which is: where do we go from here?  The best practice forum process has been very, very difficult but very good in terms of gathering all the different stakeholders' thoughts, work and responses in this area.  So where do we take this work?  Clearly we are moving towards different kinds of solutions and responses.  Where should we go from here?  And I see three hands up.  One here.  One here.  And one there.

>> AUDIENCE MEMBER: My name is Jan and I'm from APC.  I'm not going to respond to Jac's question.  Firstly, while it's really great to hear that Google is reaching out to women's organizations, those organizations are mostly in the United States, and our work is showing us that the responsiveness of private sector actors and companies is really not the same when it comes to women's experiences in the global south, so the point that Nighat was making is really important.

The second point is around the recommendations on counter-speech.  That conversation needs to happen in a way that doesn't assume a level playing field, because it has to recognize women's existing inequalities and existing exclusions.  I think the counter-speech response really needs to be unpacked a lot more.  And lastly, I think there's a lot of emphasis on freedom of expression in a way that locates it outside the broader realities of existing inequalities.  Also, it's not only about speech when we are talking about violence against women; it's also about online harassment.  And it's not only about unknown attackers: our research shows that for one out of three women, the experiences are related to someone that they know.

So I think the conversation also needs to be much broader.  Yes, speech matters, but it's one of the areas we have actually moved quite far along in, and we need to look at other issues as well.

>> JAC KEE: Thank you very much.  Hopefully the next person might answer my question.  Because we are running quite short on time, I will ask you to please keep the interventions short.

>> AUDIENCE MEMBER: Yes, I'm one of the Internet Society ambassadors, and I'm an activist and a blogger.  I have a blog about women's issues, about violence against women, and I believe that as dynamic as this topic is, the solution should be dynamic too.  We cannot define it once and for all, because it doesn't happen in one original form; it happens differently.  So a multistakeholder group in a discussion like this should keep rolling on, and the evolution process should stay open, so that we get better solutions and better answers.  Right now we might have a definition here, but four years from now we won't have the same definition, because things are changing.  Technology is changing.  It's going to impact us.

And like he said, we have to step up.  The men, as a group, we have to step up.  Until we recognize it and say it's wrong, it's not going to change, because in policy the cases happen like this.  There's just one case I will highlight ‑‑

>> JAC KEE: I will have to cut you off there.  Thank you.  Thank you very much for that.

>> AUDIENCE MEMBER: Yes, we have to step up.  The men, as a group, we have to step up.

>> JAC KEE: I like the men in this group.  It's great!

>> AUDIENCE MEMBER: I'm Courtney Radsch from the Committee to Protect Journalists, and you asked what we should do next.  Definitions, that's one thing, but it seems like a lot of the issues raised in the online violence against women debate, and in figuring out solutions, are similar to those being raised in the conversation around hate speech online and in the conversation around countering violent extremism online.  I think a good next step would be to get into conversation with what's happening in those areas.  The OSCE just held an expert workshop consultation in Romania, and I wish we had heard there such strong support for not privatizing censorship through intermediaries.  What the experts said at that workshop is that removing content online is pretty ineffective, and yet we didn't see that reflected in the recommendations that came out of the OSCE.

The conversation about solutions and the roles different actors should play should be part of those debates.  I would also like to hear from Google about the algorithmic choices that you make; we haven't heard anything about the technical or algorithmic aspect of this.

>> JAC KEE: Very quick.  One sentence and I will probably cut you off.  I'm very sorry but I can be quite rude.

>> AUDIENCE MEMBER: I will answer your question.  This is about the IGF.  We have heard about CERTs and how there's a need for collaboration to do those things adequately.  Maybe we need collaboration among the hotlines, the groups and all of these different organizations dealing with this issue of harassment of women and girls online.  Maybe we need to collaborate, we need to get together, and we need to have a forum.

>> AUDIENCE MEMBER: Rebecca mentioned research and what studies have shown, and this is a room full of tech-oriented people.  If we are looking for solutions, they should be based on data, on evidence, on research.  So I would like to answer Jac's question by suggesting that we look for opportunities to do specific, targeted research on what will work and, if possible, do it collaboratively across contexts and across countries, because, as many people have said, the contexts are different.  So there will be different solutions in different cases.  There are also some circumstances in which certain responses or solutions will work across contexts and across platforms.  That's it.

>> JAC KEE: Thank you.  One at the back.  Is there anyone else?

If not, that will be the final comment, and then last words, like literally a few words each.  I'm sorry.

>> AUDIENCE MEMBER: I would say there is no one solution, but data: we need data and we need community.  And we also need to connect the dots, because when we talk about women's bodies we are talking a lot about sexuality, which gets labelled blasphemy, and sexuality is the reason why women's bodies are attacked.  We need to connect those things, and we should never forget that there are different non-normative bodies and the attacks are directed at all of them.

>> JAC KEE: Thank you very much.  So there's a need for greater focus, and particularly around facilitating conversations between different actors and different initiatives, in order to do more targeted research as well as to unpack some possible solutions to these issues.

These echo some of the findings we looked at in the BPF in terms of what is needed: greater research into and understanding of the prevalence of this issue, and greater awareness, are important components in looking into this.

Okay.  So, final words: just a little quickie from each of you.

>> I think I can second more or less everything that has been said, and there should really be a focus on education, on Internet literacy as well, so that people understand that what they do online has a real effect on the offline world of other people.  It's not just comments that they throw into the world, into the ether.

>> I would like to stress that we need data, that we need context and, especially, that we need intersectionality when we are discussing solutions for gender issues.

>> HIBAH HUSSEIN: Thank you so much.  I want to reiterate some of the calls for data and for more research.  Courtney, I would love to connect with you after this to talk about the technical aspects, and thank you all for your input.  We have been furiously taking notes.  Boosting our transparency and human rights commitments, which we have made via the Global Network Initiative and other networks, is critical for us.  We will keep improving.  Thank you.

>> NIGHAT DAD: I think we don't need new legislation.  We need to use existing legislation, with more training of law enforcement agencies and the judiciary.

>> ANRI VAN DER SPUY: Just a quick comment about the draft: I neglected to say thank you to a lot of faces in this room who contributed case studies and attended virtual meetings.  Without you, this would not have happened.

>> NARELLE CLARK: I think we need a balance between the technology solutions and the education of both the women and the domestic violence and law enforcement workers who work with them, to help women build their privacy, their strength and their abilities online.

>> AUGUSTINA CALLEGARI: Yes, I would like to highlight the importance of treating this as an international issue, because we need to strengthen multistakeholder cooperation, at least in Argentina, because this is our problem right now.

>> REBECCA MacKINNON: Yes, just a plus one on data, research and fact-based solutions: solutions based on an understanding of actual facts rather than broad assumptions.

But also, just a plea: I think we need to find more ways to bring the right communities into conversation with the right people in the companies, because companies often are not quite sure who to reach out to.  There are starting to be some mechanisms and intermediary organizations that help connect people in companies with the right sorts of NGOs and communities on the ground, in the particular places where the context can be provided and where the real conversation for problem solving can be had.  I think your organization and your initiative provide one potential conduit for that conversation, and there are a number of others as well.

It's about building those bridges so that there can be an effective problem-solving approach when companies are faced with so many different issues at once and don't quite know who to ask to figure out how to solve them.

>> PATRICK PENNINCKX: Our Council of Europe convention is called the Istanbul Convention on combatting violence against women.  It's naive to think that this will go away without effort.  It's naive to think that we are in a linear process towards eliminating violence against women.  I think it's a daily struggle.  When I was a student 30 years ago, we were speaking about more or less the same things.  So it's a daily effort and a daily struggle for all of us.  Thank you.

>> JAC KEE: Thank you.  And thank you very much to the contributors to the best practice forum; without your contributions and input, it would not have been possible.  And I encourage you to look at the document.  I know it's scary, it's 162 pages, but it's really a good resource.

And thank you very, very much for your participation and to the panelists.  Thanks.

(Applause)