The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> ANJA KOVACS: Hello. We will start in a couple of minutes. This is intended as a roundtable and we would like to have your voices as part of the conversation. If you could join us at the table, that would be fantastic. This is for the remote captioner: I'm one of the moderators and speakers. My name is Anja Kovacs.
>> BISHAKHA DATTA: This is also for the remote captioner. My name is Bishakha Datta. I'm one of the moderators. Thank you.
>> AMELIA ANDERSDOTTER: My name is Amelia Andersdotter. I'm a Presenter on the second part of the workshop.
>> JOCHAI BEN-AVIE: My name is Jochai Ben-Avie.
>> JOANA VARON: My name is Joana Varon.
>> DANILO DONEDA: My name is Danilo Doneda.
>> BISHAKHA DATTA: We are ready to start the session. Thank you for being here for this session, which Anja and I are co-moderating. My name is Bishakha Datta and I work with a non-profit in India called Point of View. Before we start, we would like to ask as many people as possible to come up to the table in front, since this is intended as a roundtable, with speakers giving trigger presentations, and we would love to have your voice in the conversation. So, please feel free to join us at the table.
And now to introduce the session. This is possibly the first time a full session has ever been dedicated to the concept of consent at the Internet Governance Forum. The reason Anja and I were keen to do a session around this concept is that consent tends to get buried, even though it is related to many other Internet rights concepts.
So, for example, privacy and consent often go hand-in-hand. Consent is related to free speech and expression, digital trust, to security, to many, many core principles of the Internet. But, it rarely finds a space where it is discussed by itself.
So we hope to excavate this buried concept, show how it relates to several concepts we talk about more typically in Internet rights and governance, and see whether it might be a useful concept for us going forward.
>> ANJA KOVACS: I'm from the Internet Democracy Project in India. Just to build on that: we do a lot of work around Internet Governance issues, and as I was having these discussions about consent with Bishakha, the concept kept coming back, even in the more technical issues, or in discussions that had nothing to do with sexual expression or even privacy. So we thought we should start to explore both those connections and the potential of this concept to move the contentious debates we are in forward, in a way that is more respectful of Human Rights.
The way we want to organize today is to divide the hour and a half we have into three subsections. We will have two trigger speakers in each subsection who will make their interventions first. But we have asked them to be fairly brief, and that is really to give you the chance to intervene as well.
So, after those initial talks, please respond and comment. We'll take about 25 minutes for each section and then move on to the next one.
The very first one we wanted to address, and there are obviously going to be overlaps, is the issue of consent and Human Rights. Our trigger speakers for that section are Joana Varon from Coding Rights in Brazil and Danilo -- I always fall over your name. I'm so sorry. Danilo Doneda, also from Brazil. Joana, why don't you go first.
>> JOANA VARON: Very quickly, I want to talk about two topics. One is related to consent and Human Rights. In Brazil we had a public consultation on a draft data protection bill, and one of the most debated topics in the public consultation was exactly what consent means in relation to privacy.
What does it mean to click "I accept"? There are also the Terms of Service, with pages and pages of tiny print. What does it mean? There is even an extension you can install in your browser that changes the accept button to whatever, because that is effectively what we are doing every time we accept those Terms of Service, no? So what does meaningful and informed consent mean in the context of protecting privacy rights? It was one of the core debates, because the bill is structured exactly around the concept of consent, and Danilo can explain this better than me. The moment you consent, you give permission for a lot of things, but you are still protected by several data protection principles, like proportionality and necessity and so on.
So this was one of the main debates in Brazil, but I think it is also the main debate beyond the country, because we have not found a way to inform people, practically, about what they are consenting to. We will explore this further in the session; I just want to put some questions. This concerns consenting when you are using a company's service. But I also want to give one example of how your consent can be exploited in relation to another person. We share a lot of information, and sometimes this information can leak. That has been happening a lot to women, in cases where the threat model for information leakage involves the person you are communicating with, not a third party like a company.
So I will give one example: we have been playing with a guide about safer nudes. One way to protect consent that was given for one conversation with another person is to use technological means that destroy the information after you share it. So there are also technical means to limit your consent, depending on the kind of technologies you use. Just some teasing points, and then we can talk further.
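The technical means mentioned here, sharing that self-destructs after a while, can be sketched as a store that refuses to return content after a time-to-live. All names below are illustrative, and this is only a sketch: real ephemeral-messaging tools also have to deal with screenshots and copies on the recipient's device, which no store like this can enforce.

```python
import time

class EphemeralStore:
    """Toy store for shared content that expires after a time-to-live."""

    def __init__(self, ttl_seconds: float, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock  # injectable clock, useful for testing
        self._items = {}

    def share(self, key: str, content: str) -> None:
        """Record the content together with the moment it was shared."""
        self._items[key] = (content, self.clock())

    def read(self, key: str):
        """Return the content, or None if it has expired (and destroy it)."""
        item = self._items.get(key)
        if item is None:
            return None
        content, shared_at = item
        if self.clock() - shared_at > self.ttl:
            del self._items[key]  # destroy expired content
            return None
        return content
```

For example, content shared with a ten-second time-to-live is readable immediately but gone after the deadline; the design choice is that expiry is enforced at read time, so nothing expired is ever handed back.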
>> BISHAKHA DATTA: I will now introduce Danilo properly. Danilo Doneda from the University of Rio de Janeiro in Brazil.
>> DANILO DONEDA: Thank you. We had some issues in the public consultation, but let me start from the Human Rights point of view. I would like to propose some questions regarding consent. We generally approach the consent issue by asking: how do we obtain consent? What is consent for? In which cases do we need to obtain consent? But beyond that, how do we think, or re-think, about what consent is for? Because consent, even if it is a very traditional instrument of data protection laws, is not an aim in itself.
Data protection was not created to be built around consent. Data protection exists to put people in control of their data, to let people make the right choices, the choices they want, about their data. And consent, as we have been observing in recent years, is an instrument which can be valuable or not depending on the circumstances.
Consent is particularly a means for connecting people's free will to the reality of personal data processing. In that sense, we now face a kind of dual model: a very traditional feature of data protection laws is that personal data can be processed when the law requires it or when the person gives consent. And this has been criticized by several players, foremost by industry, by the private sector. The private sector does not want consent to be so strong, because then it cannot use all the data it wants.
But it is interesting that Governments think consent is important. At the same time, people themselves sometimes seem not to care: in many cases they do not attend to the consent they give, and consent can even become an instrument against people's free will. Going a bit into the legal side:
When you obtain consent from an individual, it is obtained in certain circumstances, and those may change in the future. The consent obtained in one situation may not match the idea the individual has in the following months and years. In this way, consent can be used to prove something against the individual's free will. So I believe that when thinking about new regulation, or new approaches to consent, we should look much more carefully at the idea of free will that is inscribed into the regulations. Consent is one of the instruments, but information, and determining what is being done with data, also matter: constant information about what is done with their data may sometimes be more important than a formal consent obtained in the past.
>> ANJA KOVACS: Thank you, Danilo. Are there immediate responses from the floor?
>> INGO FRIESE: My name is Ingo Friese. I'm with Deutsche Telekom, a telecommunications provider in Germany, and also with an organization called the Kantara Initiative. They are working on a concept called consent receipts. The idea is that, even for a service provider, it is more secure from a legal side to hold data with consent. When you give consent, you, as a user, get a kind of receipt back, and you can manage those receipts: you can revoke your consent, or store the receipts somewhere so that you have an overview of where you have given what kind of consent.

This is a Working Group within Kantara. So, just to mention this.
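The receipt idea described above can be sketched as a small data structure: the service issues a receipt when consent is given, the user can list active consents, and can later revoke one. This is a minimal illustrative sketch under assumed names, not the working group's actual specification.

```python
import uuid
from datetime import datetime, timezone

class ConsentStore:
    """Toy registry of consent receipts: issue, overview, revoke."""

    def __init__(self):
        self._receipts = {}

    def issue(self, user: str, service: str, purposes: list) -> str:
        """Service issues a receipt when the user gives consent."""
        receipt_id = str(uuid.uuid4())
        self._receipts[receipt_id] = {
            "id": receipt_id,
            "user": user,
            "service": service,
            "purposes": list(purposes),  # what the data may be used for
            "issued_at": datetime.now(timezone.utc).isoformat(),
            "revoked": False,
        }
        return receipt_id

    def revoke(self, receipt_id: str) -> None:
        """User withdraws consent; the receipt records the revocation."""
        self._receipts[receipt_id]["revoked"] = True

    def active_consents(self, user: str) -> list:
        """Overview of where the user has given (unrevoked) consent."""
        return [r for r in self._receipts.values()
                if r["user"] == user and not r["revoked"]]
```

The point of the design is exactly what the speaker describes: the receipt gives the user a durable, reviewable record of each consent, and revocation is a first-class operation rather than a buried settings page.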
And a second remark, regarding the complexity of the different terms: there is an organization that groups these kinds of Terms and Conditions for different jurisdictions, so that you have something like a label saying this is a "green" policy, or green Terms and Conditions. This also lets a service provider say: okay, I want to be on the safe side when I collect this data, because in some legislations, like in Germany, it is a big matter if you want to deal with personally identifying data. They want to be on the safe side.

If there is an organization that says, "these Terms and Conditions, we have checked them, and we give them a kind of green label," or whatever you want to call it, then it might be easy for the user to accept them, and the provider is on the safe side. Then of course, in the details, there is work to do on what happens when there is a violation of this consent, or someone violates the certification. Just an idea to bring in here.
>> ANJA KOVACS: It is very interesting to have such practical proposals. Anyone else?
>> CATHERINE EASTON: Thank you. My name is Catherine Easton, from the Human Rights, Big Data and Technology Project, which has just been established at the University of Essex in England.
I'd like to add to Danilo's point about circumstances around consent that may change in the future. One area our project is extremely interested in, and concerned about, is the repurposing of data: you give your consent for purpose A, and that data is then used for purpose B.
You can perhaps see examples of this in health care: you give details about your health conditions, and they are then used against you, for example in health insurance, with increases in prices.
So, I'd basically just like to add our voice to this Panel and say that this is an area we are looking into and are very concerned about. Thank you.
>> ANJA KOVACS: Thank you. It is interesting to see how many of the comments are specifically about data protection vis-a-vis private companies. But of course, Joana already flagged a question about the things we have shared on, say, a social media platform: what happens with that, and how is it used by other individuals as well? Any comments on that?
>> AUDIENCE MEMBER: Valentina from -- (Indiscernible). What is interesting for me is the definition of what a public space is; because for the general user, as soon as something is out, especially on social media and platforms, it is public. And if it is public, then somehow you no longer have consent, or the ability to manage, decide or do anything. So the concept is completely empty for many, many people. It is really not something that they can even use or think about.
>> ANA JARDIMBR: Maybe just to add to that: my name is Ana, also from a British university, working on a related project on surveillance. Some of the things we did were focus groups with people in the U.K., and questionnaires, surveys and so on.
And we found that lots of people are very worried about what happens with their data, and exactly this point: I don't know with whom my data is shared, with what other institutions, and for what other uses than originally intended. There are lots of worries, and a lack of transparency in a way: not much knowledge about what I am giving my consent to, or how I can retract my consent if data is used for purposes I would not like it to be used for.
>> BISHAKHA DATTA: Just adding to that. Interestingly, conceptually speaking, and this is a question I don't have an answer to: when we download applications, et cetera, there is actually no way to download the application without ticking and accepting the conditions.
So I'm curious whether, conceptually, we would see this as meaningful consent. One bit of information I wanted to share is that last year the Office of the Privacy Commissioner of Canada ruled that the Terms of Service of most of the sites and applications we use would not be considered meaningful consent, because they were not accessible in terms of reading.
For example, a graphic artist just recently made a little comic book of the iTunes Terms of Service. And it runs 200 pages, right? Yes. It is a funny example, but it also shows that it is not really feasible for regular users to sit down and go through hundreds of pages on each and every site. So the Privacy Commissioner ruled that unless the information in the Terms of Service can be presented in a way that is accessible to an ordinary user, including language that is not so legal as to require special legal education or something of the sort, it would not, in their eyes, constitute meaningful consent.
>> ANJA KOVACS: I was wondering, Valentina, when you made the comment about public space: many people feel that once you put your data out there, there is nothing you can do. When you share pictures on Facebook and somebody misuses those pictures, a lot of people will tell you: but you put them on there.
So maybe you shouldn't have done that, or you should have set your privacy settings differently. And I think what we see, especially in gender circles, is more and more push-back against that, saying: even in these spaces, or in those instances, we need to have much more of a discussion about consent.
I obviously support that as such, but at the same time I have often been wondering: if we look, for example, at journalists, we would argue that they need to be able to take pictures of people quite freely. The boundaries of where that should stop are discussed in many countries; there isn't always agreement, and there is often a fear that those boundaries protect people in power.
I'm starting to wonder whether the great push for consent in terms of privacy, which comes from the large number of privacy violations on the Internet, might at some point actually start to interfere with freedom of expression; because we might end up pushing for norms and regulation that will then give, say, powerful people good reasons to say: but you can't use my picture in that way, or you can't take a picture in that space.
So I'm wondering whether, in terms of Human Rights, we are going to end up with a contradiction which might not be easy to resolve. Does anybody have a response to this, or to any comments that have been made?
>> AUDIENCE MEMBER: Hello. This is Gildes from Turkey. I'm a journalist, and I think I may add some comments on that issue.
As a new trend in journalism, we have started to see that people's Facebook comments, Twitter accounts, photos and Instagram posts are being used as news, especially for famous or popular people. And this may cause several problems. I work for an LGBTI-focused Internet newspaper, and when we try to make news from the data of so-called popular LGBTI people, we always have to check twice.
As a journalist, I feel it is my responsibility to reach the people who shared it and ask a simple question: would you like me to write about it? Can I write about it? But usually, as journalists, we don't want to lose time; we want to be the first to publish, and this can cause real problems. We had an issue where a gay activist wrote on Facebook that he wanted to kill himself. He wasn't out, but one of the newspapers just took it and made news about it, and his family learned that he is gay. They didn't focus on the fact that he wanted to kill himself; they focused on the fact that he is gay.

It caused real problems: he received life threats, and now he is hiding from his family. He is in a shelter. And this was just because a journalist, who was a friend of his, made news out of the fact that he is gay and wanted to kill himself.
So I think the issue is not so much freedom of the press or freedom of speech, but the responsibility of the press. If you're a press member, if you're publishing and saying something to the public, yes, you are free, but you also have a responsibility, and that responsibility is something to think about in relation to Human Rights.
>> ANJA KOVACS: A hand in the back.
>> BABURA MARYAL: Good morning. I'm Babura. I work for EACSO, the European Alliance for Child Safety Online. Just a comment from our perspective. At the beginning of the Panel, someone asked what consent means. So, what does it mean when a young girl, for example, posts a naked picture of herself online? Is it a right of expression? Or what does it mean when we look at advertising online and its impact on a young boy or young girl? How can we ensure that young people understand the implications? How can we ensure that they understand the influence of the Internet and its business model on their lives? Thank you.
>> ANJA KOVACS: Thank you. If there are no other comments, then perhaps we should move on to the next section. I'm sure these conversations will build on each other, or at least that is what we hope. Bishakha will be moderating the next section.
>> BISHAKHA DATTA: Okay. So, we move on now for the next half hour from consent and Human Rights to the section that we call consent and Internet Governance. We have three speakers in this session.
One is Amelia Andersdotter, a Swedish politician and a member of the European Parliament until 2014, elected on the Pirate Party platform in the 2009 election.
Our next speaker is Jochai Ben-Avie, from the Mozilla Corporation and you are the Senior Global Policy Manager. I'm sorry about that.
And our third speaker is Anja Kovacs, co-moderator of this session, researcher, and Director of the Internet Democracy Project in India.
>> AMELIA ANDERSDOTTER: So, I'm sorry I have to break off from the previous conversation, which I thought was interesting. I want to start with a concrete example of something I have been doing in Sweden over the last year that relates to technical concerns online: how developers approach consent when they help, particularly, the public sector develop IT solutions. A friend and I thought we would check how people are tracked online, and how Swedish municipalities participate in ensuring that people are tracked.
It turns out the vast majority, 288 out of 290 municipalities, perform some form of tracking of user behavior when people visit their websites, without getting the user's consent for such tracking. That is surprising, because municipalities don't need to serve ads.
Normally we would associate online tracking with some kind of targeted advertising, but municipalities are already 100% tax funded. They don't need to profile visitors in order to serve them better ads. What would they advertise? You're in the municipality; you need their day care services. You're already restricted to the space you're in when you live in a particular municipality.
What we found when we started addressing this with municipalities, and with the web developers that help them, is that, first of all, the municipalities don't feel strongly about tracking. It's not a conscious choice; they just take recommendations from the service providers they procure IT services from. Web developers, for their part, use tracking tools for self-improvement: they feel that if they can gather lots of statistics from users, they can make themselves better and provide better services to the municipality, which is good for them. The problem I hope we can discuss here is that this means web developers have put an implicit obligation on visitors to municipal websites: to help web developers with self-improvement. This implicit obligation to assist technical developers with improving their performance is something I question. I don't think that merely by visiting your municipality's website, in order to find out where the hospital is, how to get health care, or how to help your aging mother, you implicitly acquire a task of helping web developers with their jobs.
And I think, especially with technical development in this domain of implicit obligations that we put on others without thinking, there is a lot of work that can be done, and in particular a lot of thought that can be given inside the technical community: how do we stop ourselves from implicitly requesting that people contribute to our activities? Can we create an environment where we instead think of visitors or users of IT services as actively engaged, with the right to actively consent, or at least to choose whether they contribute to a web developer's self-improvement or not?
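The kind of survey described above can be approximated with a short script that scans a page's HTML for references to known tracker hosts. This is a rough sketch: the blocklist below is a tiny illustrative sample, and a real audit would also inspect the network requests a page actually makes, not just its source.

```python
import re

# Tiny illustrative sample; real blocklists of analytics and ad hosts
# contain thousands of entries.
TRACKER_HOSTS = [
    "google-analytics.com",
    "googletagmanager.com",
    "doubleclick.net",
    "facebook.net",
]

def find_trackers(html: str) -> list:
    """Return the known tracker hosts referenced in a page's HTML."""
    # Collect every hostname that appears in an http(s) URL.
    hosts = set(re.findall(r"https?://([\w.-]+)", html))
    found = []
    for tracker in TRACKER_HOSTS:
        if any(h == tracker or h.endswith("." + tracker) for h in hosts):
            found.append(tracker)
    return found
```

Running a function like this over the HTML of each municipal homepage is enough to flag which sites embed third-party analytics scripts without any consent dialog having been shown.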
>> JOCHAI BEN-AVIE: I'm the Senior Global Policy Manager at the Mozilla Corporation.
I'm going to try to tie the two sections together a little bit. We have already talked about consent through the lens of check boxes: I accept the Terms of Service; I'm okay with cookies being loaded on this website. That's not really meaningful consent; it's a take-it-or-leave-it approach to consent. And we at Mozilla think that users actually send a lot of signals.
If you look at how people interact with websites and with different services, users do try to signal, in a lot of different ways, what they want and what is important to them. So, from our perspective, we try to do everything we can to honour that user choice; because in order for the web to work and thrive, those relationships have to be built on trust and communication, like any good relationship. One of the ways we have done this most recently, and most visibly I think, is through private browsing.
When you enter private browsing mode within Firefox, we think you're signaling: I really care about my privacy in this moment; I want privacy. And up until about last week, if you opened a private browsing window in Firefox or an incognito window in Chrome, the web page would still load some third-party trackers. It won't save your browsing history or your cookies, but you are still unwittingly sharing information with third parties.
Starting last week, with Firefox 42, we rolled out tracking protection, on by default in private browsing mode. This does not load any third-party ads, third-party cookies, tracking analytics or social share functions, because we think you really want privacy in that moment. So we are really proud of that. It will break some parts of the Internet, though.
If you load a page, there might be some elements missing. And here too, we have enabled a one-click sort of choice: you can go into the Control Centre, now right there at the top of the window, and just say, I want that content; I want that to be loaded. We are trying to empower the user that way, and to honour that choice.
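Blocklist-based tracking protection of the sort described here can be sketched as a filter over outgoing requests: third-party requests to hosts on a tracker list are simply not loaded. The host names and logic below are illustrative, not Firefox's actual implementation.

```python
from urllib.parse import urlparse

# Illustrative tracker list; real browsers ship curated lists.
BLOCKED_TRACKERS = {"tracker.example", "ads.example"}

def should_block(request_url: str, page_url: str,
                 protection_on: bool) -> bool:
    """Block third-party requests to listed tracker hosts."""
    if not protection_on:
        return False
    req_host = urlparse(request_url).hostname or ""
    page_host = urlparse(page_url).hostname or ""
    if req_host == page_host:
        return False  # first-party content is always allowed
    # Block exact matches and subdomains of listed trackers.
    return any(req_host == t or req_host.endswith("." + t)
               for t in BLOCKED_TRACKERS)
```

The third-party check is also why blocking can break pages: a widget served from a blocklisted host simply never loads, which is what the one-click override is there to repair.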
And for us, I think that gets at the idea that instead of data being the currency of the web, trust needs to be the currency of the web. In the effort to improve monetization, lots of companies have been going after increasingly personalized experiences, but the trade-off there, trading my data for a more personalized or improved experience, isn't clear: it's not transparent what is happening, and the choices aren't really being made by users.
What we found in our research with our users, through e-mails, comments and surveys, and through going out into communities, is that people want to know what is happening with their data. They want to have control over what data are shared, and they want to understand how their data are being used, and for what purposes.
People are perfectly willing to engage in a value exchange, and to sort of give up some of their data for an improved experience, if they can trust and know what is going on. So our users are trying to tell us, or we think they are trying to tell us, that they want trust to be that currency.
That's why, if you look at our privacy principles, which are each about a sentence long, and there are only six of them, we say things like "no surprises" and "sensible settings". It's not operationally functional to consent to every single setting when you're setting up and loading a service for the first time, so we want to make sure the defaults strike the right balance between what you think you want and providing a quality user experience.
We are guided by a principle of limited data: we collect as little data as possible, we secure and anonymize, to the extent possible, the data we hold, and we get rid of it when it no longer serves the purpose for which we were providing the benefit to the user. We also try to think through trusted third parties: to the extent that we interact with third-party services, having privacy and data protection is a key indicator for us of whom we engage with. And now I'm going to tie this to Internet Governance.
This is how we think about trust as the currency in the relationship between users and companies. And that is a very special relationship, where you're entrusting a tremendous amount of private information about yourself. But we need to take this lens of trust and consent higher up the chain. We need to think about trust and the integrity of systems. When we see a vulnerability like Heartbleed, affecting two-thirds of the Internet, that's a blow to the integrity of, and trust in, the system.
When we see a data retention law, for example, where a government mandates that a telecommunications provider retain records of your private communications, the provider is retaining them for longer than you have given the company consent to do so. That supersedes the user's consent and choice, and it further breaks down trust.
When we see, particularly, the revelations of overbroad surveillance programs over the last few years, this too is a violation of that trust. It is at the heart of the negative reactions users are having, and it is changing user behavior in ways that are not always good for the Internet, the Internet community and the Information Society.
So I think it is useful to think about meaningful consent, but the other side of meaningful consent is trust, and designing for trust as the currency at all layers: trust in the companies you choose to store your data with, trust in the Governments of the world, and trust in the broader integrity of the Internet ecosystem. Thank you.
>> BISHAKHA DATTA: And Anja is our last speaker in this section.
>> ANJA KOVACS: That was really interesting. In a way I will build on it, because I think what we have mostly been discussing is the breakdown of, or the tense relationship in, trust between users and companies. But the larger point I wanted to make is that, especially with overbroad surveillance programs, there is also a real tension in the social contract that citizens traditionally have with the state. And I think we should pay much more attention to that.
If you look at political theory around modern democracy, in a very simplistic way, the balance of power is maintained by giving the state the monopoly on the use of force, while on the other side citizens have Human Rights, which need to be protected so that these two powers or forces stay in balance. What you see in the digital age is that states now have access to unprecedented surveillance, and are able to monitor what people do all the time to such an extent that something has changed on one side of the equation.
The Internet Democracy Project actually started from this question: if that happens, then we need a correction on the other side as well, and that is why we worked from the very beginning very strongly on freedom of expression issues. Increasingly I am starting to think that is not enough. A strong right to free speech is not enough, because what we are seeing is a social contract that has changed fundamentally, in a qualitative manner.
While all of our countries at some point had some kind of revolution or founding moment where there were discussions about the social contract, which in many cases is enshrined in the Constitution, we are now seeing a real recreation of the social contract without being asked for our consent in that fundamental way again.
And I think that is a really, really problematic thing. What I found really interesting and quite triggering in your presentation was the emphasis on the concept of trust. I kept thinking that with companies, perhaps, you can use concrete measures like the ones our colleague from Germany was describing to try and move things in a particular direction. But how do you signal to your government? You can't step out; you can't say, I will be citizenless from now on. And governments do use the fact that they have been democratically elected to say: that is the consent you have given; you voted us into power.
So the Internet has brought the issue of consent up to that highest, most abstract level. I think the questions are similar in a way; I'm not sure about the solutions. I'll leave it here.
>> BISHAKHA DATTA: Great. So, we have about 10 minutes for responses from the floor as well as questions and I think the session has opened up a lot of other concepts that are related to consent and how to look at consent in different layers as well as related concepts. So, any comments or questions?
>> AUDIENCE MEMBER: Hello. My name is Lucas, with the Youth IGF from the ICGR Program. What Mr. Ben-Avie said about trust connects with what we were discussing last session: when you post, for instance on Facebook, what does that mean? Is it a public space or not?
What I understand about trust is that we need to look at intention. So, we would have to analyze what the intention of the person was when the person posted. I don't think posting online is the same thing as posting your picture on a wall outside of your house. There is a problem.
Some people think that a public space online is essentially the same thing as a public space offline. And that is conceptually wrong, since when we post online, we are bound by Terms of Service and by a private company. And when we post pictures offline, we are just showing them. So, I guess we should start comparing these things a little bit differently. Thank you.
>> BISHAKHA DATTA: Two hands up here.
>> AUDIENCE MEMBER: From the first part, maybe we should also remember public interest. Together with what was being said, it is not only responsibility, it is also public interest. When journalists and media use images of people for the public interest: there are many pictures of refugees all over the world now, and probably this is a risk for their safety, but there is a public interest.
I have a problem with the market model anyway, because I do not understand why the default has to be, you track me, instead of the default being private mode. And then if I want to be tracked, I can be. Because if we need to shift the paradigm, if we enter the box with rules that have been set up without our consent, meaningful consent, then, you know, it is a poor area. I don't see what we are talking about. And, I have to say, on the issue of government, I think that when governments or politicians abuse that power, there is objection. There is protest. That is, unfortunately, the only way to retract the consent. It takes longer and we do not always win the battles.
>> AUDIENCE MEMBER: I think when we discuss public places, we miss something. I don't believe that the world is divided into private and public space. I think there are levels of publicity too.
For example, Facebook is a public place, yes. I post to some public places, but that public place consists only of my friends. This doesn't mean it is a private place. It is still a public place, but there are levels of publicity, and I think we have the right to decide on the levels of publicity.
In that sense, I don't agree with the idea that online and offline are so different. Offline, think of the streets: I may want to have my photo on my street, which is a public place, but this doesn't mean that I want my photo to be in all the streets of the city. So, the problem begins with the dichotomy we have. The world is not that simple. Publicity and being private have levels, and we need to discuss those levels much more, and we need to consent, in that sense, when we are shifting the publicity levels.
>> AUDIENCE MEMBER: So, my question relates to the idea of bringing in the element of trust. My example is SnapChat. In SnapChat, you send with the understanding that the picture you put out there will only be there for a couple of seconds and will not be retained. And, correct me if I'm wrong with respect to this policy, but I read somewhere that what actually happens is that your data is not erased. It is probably the information identifying you, with the certain amount of data and the certain date you put it up on SnapChat, which is changed.
So, the trust issue that comes here is, what are we really consenting to? And if we are consenting to say A, our understanding being A, and what the company is doing about the policy is B, how do you go about this?
And if the government does come up with a framework where they hold companies accountable and responsible, in terms of taking into account these things, how do you really implement it? Because, for example, how do you enforce this kind of framework on WhatsApp, say, in India?
Two questions. How do people really have trust in these kinds of companies, and how does the company comply with the framework the government has come up with? That's the first point. If there is an answer to that, then we can move to the second point.
>> BISHAKHA DATTA: So, maybe what we can do is go back to the Panel right now and ask for responses to the comments and questions that came up. Jochai, would you like to start?
>> JOCHAI BEN-AVIE: Thank you. In regards to SnapChat, from what I understand, in the United States it was shown that SnapChat was logging your chats and holding on to that data, when they had made a promise and users had consented to sharing information on the basis of this idea that it would disappear, right?
And, it turned out that is not what was happening, and the Federal Trade Commission, a government enforcement agency, took action against SnapChat for violating their agreements with users and for violating trust that way. And, I think that's appropriate. And, I think that trust is partly also a design principle. It's a way to try and think through how you develop products and policies and practices. It's a little bit intangible on some days, but it means constantly reminding yourself as a company to be reaffirming that relationship with the user, to make sure that every business decision has a user benefit; in some cases you might also have a business benefit.
But, if your business benefit and user benefit don't align, that is problematic. So, keeping that idea of what the user benefit is in this moment, at all times, is another way of thinking about trust and thinking about consent. To pick up on the point about trusting government: I think in some ways it's much harder to leave your government. I think, Anja, you were alluding to that. Government has an interest in freedom of expression in many different forms. Government has an interest in people not self-censoring. Government has an interest in promoting commerce.
And it also has an interest in facilitating relationships, based on trust and consent, between companies and users. These are all interests and, in some forms, obligations of government.
And so, again, this is a design principle. It helps the government to think through how to ensure these things happen. When people lose that trust, they self-censor. They might move their business to another company or country. And, they might take to the streets and protest, and they might vote people out.
We have seen a lot of popular uprisings around Internet policies, from surveillance to copyright, patents and more, where people have gone out into the streets to demand that that value exchange be reset.
>> AMELIA ANDERSDOTTER: I want to come back to the point about governments, because we have also looked at how the government engages in data processing, the way the government uses IT systems in its governance. One remarkable thing, at least about Sweden, though I suspect this is reflected in other European countries, is that the Ministry of Justice very frequently points out, when they approach rules governing data processing and other types of privacy violations, how much they do not understand the technological solutions that they are applying to facilitate governance of the people.
So, very frequently in the 1970s and the 1980s, you will find statements in government publications about how IT systems are far too complicated for individuals to understand, and often far too complicated for public administration civil servants to understand, implying they are trying to regulate something without knowing what it is about. In which case, of course, you have a rewriting of the social contract, because normally we expect that our administrators understand the administrative tools, like administrative law or whatever else they are using when they govern people.
This continues to exist in Swedish publications today: it is assumed that individuals cannot understand what a computer does, and that it is therefore of no use to give an individual rights inside these administrative systems. Because how would you anyway express any form of meaningful consent, or have any meaningful trust, in an object that by definition goes beyond your capacity of understanding? And, I think this was also reflected in the Panel here, where we have these talks about whether individuals are capable of understanding the difference between online and offline.
Is it possible for an individual or a government to make a distinction between what is public and what is not public? There is always a baseline assumption, somehow, that we have constructed systems, global systems, that are too complicated for us to understand. And, I feel that in order to meaningfully address this consent issue, maybe we first need to start moving away from the idea that it is impossible for us, in either case, to understand what is going on, and see how and why we created consumer rights.
In the 1970s, we dealt with the standard contract. A lot of the end user license agreements are talking about the same thing. There is no difference between signing a contract with SnapChat and signing a contract with your railway operator. You get a slightly different service, but if the one contract should be simple, why shouldn't the other contract also be simple? We have solved this before, and even if it's not a perfect analogy between railways and SnapChat, at least some basis for analogy where we can start discussions exists.
So, that would be my reflection here.
>> BISHAKHA DATTA: Just very briefly, I have been wondering whether the --
>> ANJA KOVACS: -- concept of consent would be useful to bring the rewriting of the social contract to the fore in a much more systematic manner. I think we have seen there are protests against particular laws, particular instances, which are often a reaction to something that is being proposed. But, that means we are not able to react against everything. There is just not enough bandwidth. Too many things are shifting all the time. If you look at people who were using the Internet 10 years ago, and you look at what the Internet looks like today, it is already a very different space. So much has changed.
There is far more corporatization. There is far more centralization on the Internet, despite the distributed nature of the network; what you use as a user is much more centralized. And, people have been talking about this throughout all this time. But, we haven't really been able to do anything to stop that at a fundamental level.
I think earlier today there was a session on zero-rating, and in a way, that is the same kind of debate, right? You can look at that in a very narrow way: how are we going to deal with this particular issue? Or, you can wonder, what direction is this going to send the Internet in, or even my society, seeing that we are so dependent on the Internet now?
In that sense, I think the proposal was to bring consent more to the fore as something that is important in all these different things.
That might be the glue to connect all of it and make it clear enough that there is something fundamental going on. I do take your point about also going back to the technology and how it works, and that people should take more ownership over it, because the combination of the two is probably what would send a really strong message to our governments.
>> BISHAKHA DATTA: I'm going to ask Joana and Danilo if they would like to comment on the questions that have come up.
>> JOANA VARON: I want to re-stress the point of Amelia. Anja already did. But, this is the importance of understanding the technology. So, if you're using SnapChat, we should be educated on how the Internet works. If you're using this kind of app to share photos, you can check.
If it has end-to-end encryption, if it blocks screenshots, if the messages, as they affirm, are not stored, not only on your device but also on the servers. If it is open source, you can check, or whether or not you check yourself, someone has checked if what they are saying in the Terms of Service is true.
So, this kind of knowledge needs to come together with the legal debate as well, the technical and the legal debate, so we can discuss what meaningful consent really is.
>> DANILO DONEDA: Just some observations on the comments made after our speech. Regarding the companies that use personal data but maintain records of consent: that is the most secure way to do it, and it is very interesting. It shows us that consent is very tightly tied to transparency, because a company, or even the government, should be transparent about what exactly is being recorded, captured and so on. And more than this, if some kind of control is given to the individual to define how their data is used, if the individual can exercise control in a sophisticated way, then maybe two concepts that are a bit apart right now, consent and control, should be put together.
Regarding consent and the question of Big Data, which was mentioned: I believe that the next generation of rules will focus less on the traditional view of consent than on the ways data is regarded and used, and for which purposes.
For instance, the U.N. is working right now on guidelines for a radical approach to Big Data used for secondary purposes, humanitarian reasons, disasters and so on. That is a very, very interesting project. But, it raises also some questions about who can use or access this data. What are the risks for an individual from this data?
And it also raises the question of what the risk is of not using the data if you have it. It is a very important, maybe cultural, question in this sense. It can be a subject which is maybe more important than the traditional notion of consent.
And finally, it was mentioned that data which is made public by an individual can maybe be used for whatever purpose fits. I don't know. We believe that even if data is made public, there is a reason why it is made public. And, it cannot be used for any purpose possible, for monetization, or for creating or causing harm to an individual. The purpose exists even for public data, in the sense that personal data is made public for a reason. That is a concept which exists in our legislation and is beginning to be implemented in Brazil.
And also, finally, whether terms of use are a good tool to obtain meaningful consent depends on the legislation, of course. In Brazil, we can say that terms of use are not meaningful consent according to consumer law, because we don't have a data protection law. But, under the data protection law in the way it is being drafted, they would not be meaningful consent either if they worked against people's will.
>> ANJA KOVACS: I think we can move on to the next.
>> BISHAKHA DATTA: So, the last part of the session is about strengthening our understanding and concept of consent --
>> ANJA KOVACS: And, the comments that were made are, I think, a really good segue into that conversation, and bring us back to the earlier part of the conversation as well, where perhaps we need to start to do that. We had invited Mr. Patrick Burton to speak on this as well, but I don't think he is in the room. And so, I just hand over the floor to Bishakha Datta, Director of Point of View in India.
>> BISHAKHA DATTA: So, I'm going to talk a little about consent as a yin-yang concept when it comes to privacy. And, what I have put up is a slide which takes us back to 1890, when Samuel Warren and Louis Brandeis in the United States wrote an epoch-making article called "The Right to Privacy", which defined privacy as the right to be left alone, or let alone.
The reason I wanted to put this up is because it is often in privacy texts that we find the most explicit mentions of consent. So, this is a direct quote from "The Right to Privacy" in 1890, where again it is important to note this is a moment of technological change. This is a moment when the photographic camera has only recently been invented; its commercial use also dates to the mid-1800s. And during that time, a performer alleged that while she was playing in a Broadway theater, in a role which required her appearance in tights, she was, by means of a flashlight, photographed without her consent from one of the boxes.
So, this is just to give us a historical perspective. It is actually always yin and yang. We tend to talk about privacy more, but in some contexts, consent is quite tightly tied to privacy. It's not so far apart as a concept.
So, Anja, can I have the pointer? I have just one more slide and then I'll talk a little bit. If we go to the next slide, this is actually a picture of our Minister in the Indian Government. This was in all the Indian newspapers this year, because she was photographed in a clothes shop when she went to try on some clothes in the changing room. She felt, of course, that it violated both her right to consent as well as her right to privacy. And, because she was a minister, there was a lot of fuss made about this. It got into the media. And, this is one of the few cases where I wanted to look a little at whether the concept of consent even makes it into legislation.
So, in India we find there is one provision under the Information Technology Act, called Section 66E, which explicitly mentions privacy and consent, and this is where the minister's complaint was recorded. It basically says that anybody who intentionally or knowingly captures, publishes or transmits the image of a private area of any person, without his or her consent, under circumstances violating the privacy of that person, shall be punished.
So, just a few additional comments to really flesh this out. I think it is interesting that this particular section focuses only on images, not on information, because I'm not sure that we actually have a specific clause related to private information. Anja, maybe you can clarify that later. But, this is only about images.
The second thing is, when it talks about a private area, I think this takes us to that very complicated domain of what is a public area of the body and what is a private area of the body.
Just to give you a provocative example, we know that often people's faces are morphed onto other images without their consent. The face is traditionally seen as an area that is not private; it is seen as an area that is traditionally public.
So for example, if my face were to be taken and put onto the body of a woman on a porn site where I may not necessarily want to feature, without my consent, how would I complain? Because the face would not traditionally be considered a private area. Or it would be subject to some sort of interpretation, right?
And then similarly, I want to say that what is very interesting is that we have a number of cases in India where, from a gender and sexuality perspective, we do find that people are complaining about their right to privacy being violated. But, these are being filed under a totally different section.
So, I'll give you very quick examples. One is, we have situations where gay men have been prosecuted themselves for putting up intimate images of themselves consensually, which then slipped outside of their control. Now, ideally we would imagine that they are the victims of these images slipping outside of their control, and that they should be seen not as the perpetrators of an offense. They should be protected and not punished.
So, we would think that it would go under this law, right? But, instead of it being put under this law, it is put under another section which is related to obscene images, which turns them into perpetrators. So, this is very common.
And similarly, we also see there are many cases in India where actual physical rapes are filmed and put online, which I always talk about as three violations, three counts of consent being violated: the rape itself, the filming and the distribution. And we need to look at all three of these separately.
There again, unfortunately, instead of the cases being filed under this section, which would at least strengthen the notion of both consent and privacy, they are often filed under the obscenity section, which takes us in a totally different direction: it becomes the circulation of an obscene image rather than a violation of the person's private parts, et cetera.
So, just to say that it is a little complicated at this point. And I think the final question in the online context is culpability. If you look at the law, it says, whoever intentionally or knowingly. But, given the virality of the Internet, who do we actually hold culpable? Do we hold the first person who puts it online as prosecutable under a statute like this? What about the second, third, 200th, two millionth person who also circulates it?
Which is why I feel that for some of these things, law may not necessarily be the solution. I think we really need to start talking about an ethics of consent: not just prosecuting after the act, but asking how we can build an environment or culture where online consent is respected in advance, where before we hit that very easy forward button, we stop and think a little about the implications of consent.
>> ANJA KOVACS: Thank you very much, Bishakha. Any responses to those questions or any additional thoughts? Was there a hand?
>> MARINA LIMEIRA: Hi, my name is Marina and I come from Mexico as part of the IGF Programme. Precisely in regard to the ethics of consent, it is not an either-or, and it is easy to frame these things as either you focus on user and corporate relationships or you focus on user and parental relationships.
But, I feel that in these debates we sometimes forget that for youth, consent doesn't necessarily matter only in terms of corporation and user. Yes, it affects them. But sometimes consent is about dealing with the relationship with their authorities, which are their parents. And so, I feel that in these discussions we tend to favor things that go against the desires and the consent of youth. Why? Because we require parental consent for certain images or certain uses of services. And, I think it is important to remember that sometimes consent for us as adults doesn't reflect consent, and actually contradicts consent, for youth.
>> ANJA KOVACS: Very interesting point.
>> AUDIENCE MEMBER: Thank you very much for raising that. Consent is tricky. It's a very, very tricky concept, especially when looking at it from a legislative point of view. It always makes me think about trying to establish consent, for example, in situations of rape, and in rape laws. And to understand that is to think about how consent is also quite intimately embedded within existing power relations, and that's why I appreciate the comment that authority matters and power matters. There are some moments where you're unable to provide consent simply because you are trying to provide consent within an existing relationship where consent is impossible. So, what do you do then? And how do you address that?
>> AUDIENCE MEMBER: Hi, Serene from Empower, and I have more of a question, especially about bloggers these days. There are so many bloggers and YouTube video bloggers. These people put out their images and videos to the public, and on Instagram there is a function where you choose public or private.
So, they have given consent to the public to use, or rather to see, their photos. So, how do you address consent in these areas, especially when their photos are being used or shared in manners which are not appealing? Thank you.
>> ANJA KOVACS: Thank you, Serene. Any responses to that, perhaps? Any other comments? Bishakha?
>> BISHAKHA DATTA: So, again, just taking off on what you were saying about parental consent and the whole power thing, I think that is a fantastic lens to think of it from. And, I think it is interesting when we talk about privacy. There was a study done by an anthropologist, danah boyd, which looked at the social lives of networked teens. She has written a book called "It's Complicated", and she talks about how, when privacy activists talk about privacy, we always talk about it vis-a-vis the state.
And, in the lives of teenagers, she says, on social media, the real challenge for teenagers is how to establish your right to privacy vis-a-vis your parents, because those are the authority figures in your life. So, it reminded me of that.
And actually, the other thing that I really wanted to say, which is sort of important in terms of consent, is that there is a really complicated relationship with morality.
So, for instance, when we look at nude images or intimate images or whatever we want to call them, from our perspective, we feel that as long as these images are consensually exchanged, there should really be no problem with it as such, right? But, when the state looks at it, or when the police look at it, it is put into a frame where it is seen as immoral to consensually exchange these kinds of images, and so it is put into a different frame from perhaps what the user intended. And, this leads us to very complicated legislation.
So, what we find in the domain of pornography, for instance, is that a lot of the policy initiatives tend to favor censorship and banning, et cetera, which is a little bit complicated when you think about freedom of sexual expression. So, I think when it comes to the domain of Internet images, it is useful to think about what is consensual and what is nonconsensual. And when we think about pornography or things like this, again, consent is reasonably useful, along with other concepts, to think this through.
>> ANJA KOVACS: Thank you Bishakha.
>> AMELIA ANDERSDOTTER: I think I have a fusion of comments from others earlier. What does consent mean in situations where you have an unequal power distribution between the parties? In particular, Danilo brought up the point of Big Data: what is the risk of not using data when you have collected lots of health data about the population?
You can make useful statistical research. You're talking about a trade-off between a collective benefit, which is the benefit to the government of being able to make analyses of the entire population, and an individual benefit, which might be to keep your medical records secret, or at least confidential, or shared just with your doctor.
And a lot of the time when we talk about consent or IT or data protection, we don't make it clear when we are talking about the collective benefits of using consent as an instrument, and when we are using consent, or any kind of regulatory mechanism, to protect individuals. What are the trade-offs between collective benefits and individual benefits that we are making? And here again, I think it is something that we need to sort out. The conflict comes up in rape law as well, because it is about when we allow a particular person to invoke the violence monopoly of the government against another individual because they have slighted them or done them wrong.
And so, I think for all of these discussions, we should always start by trying to weed out: are we talking about a collective benefit or an individual benefit? Is this an individual right we are aspiring towards? Or is it a collective benefit, in which case my government would have the right to consent on my behalf, since the government can then put me into perspective with all of the other citizens of my country or municipality or whatever? And, as long as we are not clear from the beginning what kind of benefit we are aspiring towards, talking about the means of aspiring towards that benefit, which would then be consent, may not be so useful. So, I think that is often forgotten and it should be addressed much more.
>> ANJA KOVACS: Before I give the floor to Valentina, I wanted to check if there is a comment from the remote participants. No? And we are also slowly coming towards the end of the session so if you have any points or questions, contradictions to make, please raise your hands and otherwise I would like to go back and have one more round with our Panelists to ask for their final feedback. Valentina.
>> VALENTINA PAVEL: I like the distinction between collective and personal, individual, and I would go back to individuals, and specifically to the issue of youth. I think we live in a society where our public images need to be polished and we cannot make mistakes. So I think that a youth, if they are involved in making the decision, can survive the mistake of an image that can cause certain things.
And, we should be less afraid of making mistakes, because we have all made mistakes. On the other hand, the mistake is very much commodified, because the Kardashians and all the superstars of the 24-hour image are allowed to express their nudity or whatever they want. Big Brother is all over the place. So, I think that it is really important to think of the individual and also to stop being afraid of making mistakes. This protection is killing us.
>> ANJA KOVACS: Thank you. Any other comments from the floor? Yes, please?
>> CORINA CASTRO: Hello. Corina Castro. Regarding the clauses in Terms and Conditions, there could be a choice-of-forum or arbitration clause, and mainly with the restriction of collective actions, people would be discouraged from bringing a claim, and it would be difficult or almost impossible for a single person to bring the claim.
I mean consumers, users, children, teenagers. At this point, I may not distinguish between equal relations or other relations, as you just said, but the point is to really be able to exercise your rights. Would the panel consider that there are different types of consent? This panel might only refer to privacy. But, if a consent may affect the way you exercise your rights, it might also be of interest to define such consent. Thank you.
>> ANJA KOVACS: Thank you. Any other final comments from the floor? No? Then I would like to invite all the speakers to give their final comments, and let's use the original order, so Joana, you go first.
>> JOANA VARON: This was an amazing debate. I never thought about so many layers of consent, even though I have thought a lot about consent.
So, we talked about ethics, which involves discussion about morality, about legal protection, about technological knowledge, and it is so rich that I think we need to compile all that was debated here and do something. I'm going to make a proposal to move forward. That's it.
>> ANJA KOVACS: Thank you.
>> DANILO DONEDA: I assume the discussion and its complexity showed, and I'm happy for this, that consent cannot be viewed as merely a formal concept that must be obeyed, complied with and so on. It raises questions regarding several parties, several subjects, and layers of protection and ethics and technology, which must be taken care of, in the sense of the need to include the real will of the individual in the context of personal data processing.
So consent, yes, it is an instrument that can be used in several cases, but it must not be the king of the protection of the user's personal data, because if we think of it that way, we end up with only a formal instrument that can be used even against individuals.
>> ANJA KOVACS: Thank you. Even these closing remarks are quite thought provoking. Amelia.
>> AMELIA ANDERSDOTTER: So, I would like to come back to some of the points I addressed earlier, which is that the ethics of consent is quite tightly connected to the simplistic obligations that users have been subjected to in a lot of circumstances in online environments.
I would also like to reemphasize that governments and individuals are put in a situation where it is assumed they implicitly do not understand the technologies with which they are interacting, and I don't think this is necessarily a correct description of what happens. Technology, at the end of the day, is not so mysterious, and clearly we could find ways of informing both public institutions and private citizens about the tools we are using, implied, for instance, in a consent relation.
I also want to reemphasize that it is very important to make a distinction between when you are talking about individual benefits and collective benefits. Data processing has traditionally been used by Governments, mostly in the 70s and even before that, to get collective advantages sort of at the expense of individual freedom of action. This is a conflict that is still present in our consent discussions today. Think of the power exercised by SnapChat, where the collective benefit to SnapChat users stands in relation to the benefits that individual SnapChat users could derive from the service. Or, in the case of Big Data with health.
This is a particularly emphasized conflict, where you need to be clear from the beginning: What is the benefit you are striving for? Who should be the entity deciding over that benefit? And, how do we ensure that there are mechanisms for making those decisions in a structured and universally acceptable way?
>> ANJA KOVACS: Thank you.
>> JOCHAI BEN-AVIE: Thank you for organizing this Panel. It has been really interesting and thought provoking to think about how we conceptualize consent across all layers of the stack and all sorts of parts of the value chain. And so, I think we know that consent can be hard online in terms of trying to conceptualize it. What does this look like in practice? So, thinking through how we make this real, for me a huge part of this is about shifting focus in a lot of frames to put the user first.
That comes up as a question of product development but also of development of law. And, I think that sort of shifting things so that our first prime directive is: are we providing benefit to the user, and are we providing services in the environment which the user has consented to, expressed a desire for, and made that choice, and providing transparency and education around that so we are enabling real choices.
And, I think we need to give users or individuals actionable and informed choices by informing them and educating them at the point of collection. At the point in which things happen.
That applies just as much to how Firefox operates as to the municipalities that Amelia was talking about, in that there is an obligation to do that, and education to enable that sort of informed, actionable and meaningful choice. And, I'm really excited to think more about this. Thank you.
>> ANJA KOVACS: Thank you. So, since I was one of the speakers as well, I wanted to note -- I must admit it is not often that I change my mind about something in a session like this. But, I take Danilo's point. It makes me wonder: is consent really the concept to focus on, as I was arguing earlier, in rewriting the contract with the state? What has become very clear for me here is the complexity of it, that there are limits to how it can be used and where it can be used, and also that it is important not to use it in isolation or to think that you can apply it in the same way in every context.
So, that complexity for me is the biggest takeaway from this session in a way. And as Joana said earlier, now we really need to go and think it through more in all of these different aspects, including the relation with our understanding of how technology actually works. And then, finally, Bishakha.
>> BISHAKHA DATTA: I will pretty much echo what everybody said, and say also that I think sometimes when you take a buried concept and say okay, let's do a session on it, it's a little scary; because you actually have no idea, apart from your little box that you have been working in, whether this makes any sense at all. You sort of feel instinctively that it links with many other concepts, but you don't really know how, right?
So, I really want to say that for me, the Panel has really shown again that consent is something deeply linked to many other concepts, and I guess the next step for us is that in some places it can be actualized, whether through the user experience et cetera. In other places, it may be more useful for it to stay in a somewhat buried state. So, I think that is sort of really what we want to think about going forward.
Because it has been so rich and useful, I would like to thank Nadine from APC, who has been diligently taking notes, and I too have been writing everything down because so much came up that I needed to think about. So, before closing, I just want to really thank everybody for sort of putting your trust and faith in what really is the first session ever at the IGF, a whole session, on consent. Thank you very much for being here. And let's go to lunch.
>> ANJA KOVACS: Thank you very much from my side as well. Just making a note of the fact that several people mentioned that they are interested in kind of exploring this further. And, I guess the workshop report will be the first step for us where we can do that. But, if it is something you really want to continue to work on in a more collaborative way, perhaps you should just come and say hi to us for a moment, and then we can see if we have any concrete ideas right now. But, I can see that there is lots of interest. Thank you from my side as well for making this an engaging and rich conversation.