Sixth Annual Meeting of the Internet Governance Forum
27 -30 September 2011
United Nations Office in Nairobi, Nairobi, Kenya
September 27, 2011 - 11:00am
***
The following is the output of the real-time captioning taken during the Sixth Meeting of the IGF, in Nairobi, Kenya. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
***
>> MS. MAUD de BOER‑BUQUICCHIO: Thank you very much. Good morning, ladies and gentlemen. It is my pleasure to open the workshop focusing on strengthening cross‑border protection of personal data. This workshop with a number of key experts aims at feeding the current modernization process of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, a very long title. And we actually used to refer to this convention as Convention No. 108 since it was the 108th treaty which was drafted in the framework of the Council of Europe.
And this year we will be marking the 30th anniversary of the convention. Its purpose, which is to secure to every individual the right to privacy, remains as valid today as it was 30 years ago. The increased use of the internet, the borderless nature of data flows and the development of new technology lead to new data protection challenges.
This includes potential new risks for the protection of human rights and fundamental freedoms, which we are bound to address.
This is a growing global concern calling for a global consideration of privacy issues. And I am, therefore, pleased to highlight that Convention 108 indeed provides a suitable framework for this.
The global outreach of the convention is already in its genes. 30 years ago, a number of non‑European states, Australia, Canada, Japan, and the United States, participated in the drafting process. But these genes have expressed themselves, influencing legislation and policies around the world, and Uruguay has just been invited to accede. But sometimes genes undergo transformation to allow better adaptation to a changing environment.
One of the main features of the data protection convention is its technologically neutral language, which makes its provisions still valid today.
However, the information society revolution is triggering evolution. And we are currently reviewing the data protection convention so that it becomes even more relevant and efficient.
We are, therefore, at a defining moment worldwide. Similar review exercises are currently being carried out both at European and international level, namely by the European Union and the Organisation for Economic Co-operation and Development.
An important feature of these discussions is their multi‑stakeholder nature. Privacy and data protection are everyone's business. Regulators have discussed those questions with all parties concerned.
Ladies and gentlemen, the Chinese philosopher Lao Tzu said that the reason why the universe is eternal is that it does not live for itself. It gives life to others as it transforms.
This is our vision of the evolution of our data protection convention which I trust you share. But this is your platform and time devoted to this kind of exchange is limited. So I, therefore, prefer to give this precious floor back to you now.
Let me finish by encouraging you to make use of this opportunity to put questions, raise issues and propose answers, allowing us to build a strong instrument capable of addressing all present and future challenges of data protection. I thank you for your attention, and I wish you a fruitful exchange.
(No audio)
>> Thank you for letting me join the panel. We had a very active preparation session already of one and a half hours, so we feel like we're well prepared to go into the discussion and let also others participate here.
Let me move a little bit to put what I'm going to say in context. We are looking obviously not only at the revision of Convention 108; we are also looking at the OECD guidelines that are currently being revised. We are looking at the APEC revision of data protection rules, in particular the cross‑border privacy rules, and from my perspective, being a European, we are also looking obviously at the revision of the EU legal framework for privacy. And in that context, there are two common denominators.
Our global flow of data has fundamentally changed. It is not only data that is flowing; it is the people around the data who have specific expectations. It is the people who are mobile and who want to be able to have access to their data whenever and wherever.
Also the service expectation has increased and services have adapted to that. So services are available 24 hours a day, seven days a week. And that is only possible if the data can be processed by the help desks, wherever they are.
So all these things are fundamental changes in how we treat or process data. On top of that, cloud computing is obviously really increasing the amount of data that is processed and how data has globalized.
So the three questions from our perspective are inherently combined. New principles are necessary, but the necessity of the principles, which I'll go into in detail in a second, goes along with a need for simplification of the laws that are encumbering that. There are also technical discussions around other areas which are not covered here. The principle of accountability is a good principle if it allows companies to be more efficient in how they process the data.
So adding such a principle seems to be very important. And it's in particular a principle that is currently very much discussed in the APEC discussion around cross‑border privacy rules. It's actually the foundation of these new rules. So it is welcome from our perspective. And there are a number of discussions going on to really come to terms with what it means, including the gateway project, now called the Paris Project.
The privacy by design principle is an important principle from Microsoft's perspective. Microsoft has embraced the privacy by design principle since 2002, when we started our Trustworthy Computing programme. This programme has developed a so‑called security development life cycle which enforces specific steps in the development of our software: first of all the training of the developers to make them aware of privacy principles, but also having them apply certain mechanisms while they are developing the software in order to make it privacy‑enhancing and privacy‑friendly.
So there are steps in place. This is a programme which we have designed and which is applicable to all our software, and we have made this security development life cycle programme available for other companies to apply as well.
>> Thank you. I might respond to what Cornelia said: we very much agree with your perspective, which is good news, I guess, for working out new principles. Wherever business and civil society meet, it means we can move forward.
What we see as an NGO as the main challenge here that needs to be addressed is certainly what has been said: cross‑border data flows, which are ubiquitous and indivisible from both a client and business perspective. We have business‑to‑business cloud data, sending data to India and wherever, but also the client perspective. As Cornelia mentioned, clients want their data to be available wherever they go, which means again the necessity of using the cloud.
Another problem that we have identified is the issue of consent. How do we actually get to informed consent in that setting? Can an average individual going online comprehend the complex system which is being used to process data? Can we really achieve informed consent online? And even if we have informed consent, if there is no viable alternative to a service that a customer uses, if she or he cannot go anywhere else, the concern is: do we really want consent to justify data processing?
So in that context, I think the principles that were mentioned, such as proportionality, are absolutely fundamental. This is the only way of achieving proportionality in data processing. If we have a customer agreeing to data processing in the context I mentioned, where consent is not informed or not voluntary, we get a situation where data is excessively processed. The only way to move forward, in my opinion, is to impose higher principles on companies, not only business, in order to prevent that excessive processing from happening.
And we should be clear about the fact that consent cannot validate excessive data processing. Another problem is governments that exchange data on a security basis. Looking at existing examples, in our opinion the data being exchanged for security reasons is not effectively protected; it is done excessively. There is no proportionality in the system. We need to consider how to convince governments and law enforcement agencies to apply essentially the same principles that business can or wishes to apply. That is, I would say, the biggest challenge, not only for Convention 108 but also for the big reform we expect in the EU; insiders say that business can really work on that, but nobody knows how to get governments to accept the same principles for security issues.
So I would say this is the main challenge for all of us to discuss today, and definitely data minimization and proportionality should be applied by secret service agencies the same way they should be applied by international corporations. Thank you.
>> I would like to react just a bit first on what has been said about the problem with consent. I think there's an even bigger problem with the kind of data problem which does not live so much in a single data set, like, for example, a single e‑mail envelope data set which essentially says, "I am sending an e‑mail to somebody at Microsoft."
What does this tell about me? Not so much. But when you take all my data together, then it says a lot about me, very much so. And I have never given my consent to processing this big set of data. But it could be argued that I have, every time I send an e‑mail, given my consent that this e‑mail envelope data set has been processed because without processing it, you cannot get the e‑mail to the destination.
So I think we need to add a third category of data protection issue. In addition to the processing of data within a country, subject to the data protection laws of that country, and in addition to the cross‑border transmission of data, there is this kind of data set which does not live anywhere specifically, where the privacy problem comes into existence through the fact that data exists about me in many places and no one can really prevent anyone from putting that data together. Who is responsible, who is in charge of protecting that data?
And then when we get to privacy by design, which is really the only feasible, practical approach to prevent that kind of data from eventually leaking out and creating huge problems at least for members of some minority groups, it is impossible to predict right now which will be the minority group that will be attacked next.
So how can that be done? I would like to react a bit to what my colleague from Microsoft has said. And now personally, I'm on the record of being a strong critic of Microsoft, but in this particular point, I'm not going to criticize Microsoft because they are actually pushing ahead in that area. And that's good. But I think we need to move this to a higher level out of the Microsoft world into the IGF and also the world of really creating standards that will be used by everyone.
So I would suggest to the Council of Europe and to national governments: send technical people to these standardization groups. One person who understands what is going on can make a huge difference. These groups act by consensus. So one person there can say, "We need to protect this personal data of who communicates with whom. There must be encryption when the data is transmitted."
For example, that will make a huge difference. And really, if it's not the governments protecting the citizens, who will do it?
>> ANTON BATTESTI: Yes, good morning. First I want to thank the Council of Europe for allowing me to speak on the panel. The first thing I want to say, just to really frame the subject, is that data protection, from our point of view, is above all a question of human rights, because information is power. Information you have on people, whether you are a company or a government, gives you some kind of power. Some consider this kind of regulation to be maybe a luxury, but that is not the case. The first goal is to protect the rights of the citizens. That's why all this legislation was created, now 30 years ago.
30 years ago, even before my birth, legislation was created to protect personal data. I am 29. The thing is, if you look at our French data protection act, article one says that computing must respect human dignity and human rights.
When you're talking about human rights, you're talking about empowerment, that you give something to the people to protect them against an abuse of power from the government or someone else, maybe a company, for example.
So when you're talking about principles, as I've said, we need to keep them. The thing is, we have had really good principles now for 30 years. And the beauty of those principles is that they are neutral and still applicable now, as is the case for our human rights acts, whatever they are. The drafters in 1789 in France never imagined the internet, or even radio communication or television. But those people just said, you have freedom of speech; at that time it was newspapers and books, and it is still applicable on the internet 200 years later. So we need to keep those principles and find a way to enforce them better.
But we can maybe improve them or add something. What we can add, for example, is a right to ‑‑ I don't know if this is the right word in English, but the possibility: you have your own data somewhere and you want to change your ‑‑ I'm talking about, for example, social networking. You want to change because you think Facebook is old‑fashioned now; there is a new thing. That will happen. And I just want to say, I don't want to be locked into something. It's the same thing as when you change your mobile operator: you want to keep your number because everybody knows your number. Otherwise it's very complicated.
This is a way to give maybe a little more power to people over their personal data.
The following principle is ‑‑ it's really, I don't want to launch into a debate here ‑‑ the right to oblivion, the right to be forgotten. It's a way to frame a question or a need from our citizens, who today realise that their personal data are everywhere on the internet. And it has consequences on their professional life, for example. And they're calling us, sometimes desperately, to ask us to do something because they don't want this information to have serious consequences for them.
The problem with the internet, as you all know, is that it is global, and with copy and paste, this information can duplicate itself forever.
So I don't think the right to oblivion is a magical solution. But this concept is at least a way to frame what looks to be a real need of citizens.
And my last point will be on accountability. We think that we need to find common ground with private companies. And accountability is the way to build a bridge between data protection authorities and companies, because if we agree on processes, it will be maybe safer and better for companies, because they will know that what they have done is okay.
So it is very important for business, so they can continue to develop their software and their profitability. So be sure that being a DPA is not about annoying companies; it's just to be sure that people know what they're doing, whether we're talking about consent or anything like that. That's the point. Accountability is maybe a mutual tool we can use not to sanction but to prevent. And again, it's better for citizens, consumers, and for business. Thank you.
>> AUDIENCE MEMBER: I'm from the German government, from the ministry of interior. I have a question because we are talking at an international conference. Your statements came from a European perspective, I think. We have a very important rule, Article 25 of the EU Data Protection Directive, which is about transmitting data to other countries, so‑called third countries. It is only allowed to transmit data to these countries if some special requirements are met. And there are very strong restrictions on that.
And when it comes to the internet, we have a problem, because the internet is a worldwide web. Posting something on the internet, blogging something or telling stories on your home page, is a transmission of data to the rest of the world. So the European Court of Justice suggested that this rule is not applicable to the internet.
I think that's a challenge for European law now: what are we doing if the internet is becoming more and more important, given this rule on transmitting data to the rest of the world?
So my question is: is this rule, when it comes to the internet, still state of the art and, if so, how do we deal with this challenge? Shall we build firewalls, and do we have to protect our legal framework in Europe so that it is not allowed anymore to transmit these data? That's the first question.
The second one concerns the real principles of data protection. These principles were built at a time long before we had the internet. And you were talking about envelopes with data packages and these things. So the whole law looks at each data item and each package. It will follow it. It will ask: what is the purpose of processing these data, where do they come from, am I allowed to send them to another country, and so on.
So is this philosophy ‑‑ for protecting the privacy of a person ‑‑ of looking at each single data item and asking who is the owner of the data, or for what purpose this data is being processed, still the right approach? Or are we now in a different world with the internet, where all these data more or less come together and build new packages?
That is the second challenge, and I would like to know what the Council of Europe thinks about these questions, and also the European Union. That's my question.
>> My name is Remy Caron. I'm from the Netherlands. I've been thinking about this personal data thing. Principles are applied to personal data, but what exactly is the set that you define as personal data? A second question attached to it is: who actually owns that data set? I tend to think that I own my personal data and I should be allowed to do with it whatever I feel like. So the question is, how do you deal with that?
>> Thank you. Sebastian from the Singapore Internet Research Centre. My first question is: protection of data by whom? By the owner, by the data controller, by whom? The question seems to be unclear, because in my opinion, once I put my data on the internet, it ceases to be exclusively under my control. Now I share it with the service provider.
That brings me to the next question: how then can we balance the interests, especially when it comes to responsibilities, of the state, the data controller, and myself? Supposing I'm a transnational user and my data should be protected, and the data controller shouldn't release my data, how do we balance those conflicting interests? Thank you.
>> Hello, my name is Dr. Tran Mawo. I'm from Kenya. Some of the issues that you are focusing on, I teach my students. I think an element that has been left out is awareness and education. As far as privacy is concerned, unless people know what is private and what is protected, it's not going to work.
One of the things I found out in my research, research on online security, is that people are not aware. Even people who use credit cards, people who use mobile phones, are not aware of some of the risks, of some of the information that they are transmitting through all this media. They are also not aware that data is backed up and stored, and if you do something foolish, that data can be retrieved and it will be ‑‑ so I think we need to include education, awareness, and making people responsible. That means educating them that this is private data, identifying what is private data, and also looking at other countries, especially the developing countries, where they display even national IDs in the newspapers. How do you protect that? It's a lack of accurate understanding and a lack of awareness. Thank you.
>> Thank you. My name is Avry Doria, just an independent person. In this discussion on data and the privacy of it, I think two things are getting confounded, which confuses me sometimes. Certainly, when I put my own data on the network and I'm doing some sort of broadcast of it, there's one set of conditions that might need to apply to my data. There is still perhaps a privacy interest and a privacy protection concern, but there is also an act I took in broadcasting that data.
On the other hand, there is data that has been given in confidence, either to some state agency to some private agency and such. And that data having been given with some degree of confidentiality should probably have a different set of constraints defining it. Both of them seem to be important issues to me but confounding them makes it difficult.
And I'm sure even within that particular dimension of two separate piles, there would be subcategorization. The other thing I wonder is: when I have given data privately and in confidence to one agency, is there a right I should have to be notified of where my data goes? Not only is that data supposed to be private, but what if I give it to you, and you give it to someone else, or a country actually transfers it across the border? I start to worry about that a lot when I look at some of these cloud services that people buy into with the notion that their data is private: I'm putting it in my own cloud containers.
It's not being put out there for any purpose other than my private use but then that data gets transferred from one jurisdiction to another that has a completely different take on what they may or may not do with my data.
So those are the issues that I look at that really start to confuse me when I look at how we should be treating data and privacy. Thanks.
>> My name is Dana Tomish from Germany. We actually have a pretty large debate in Germany about the end of anonymity on the internet. I have a lot of concerns and would love to hear from you: what is your position in this case, and how does this affect data protection and privacy among different countries?
>> So this was a lot of questions. Shall we stay the whole day and discuss them further? As I mentioned at the beginning already, I think that the three questions are really inherently linked to each other. And so, with these reinforced principles, and a couple more that were mentioned during the discussion, such as transborder data flow, the current frameworks need to be revised. And obviously, again, I apologize for focusing on the legal framework I know best.
There are certain rules currently existing in the European Union which have proven to be very cumbersome in terms of how to get approval: the safe harbor principle that basically allows for a sort of equivalency test of another country's data protection.
There are very few countries that are actually certified under that rule. And the other two clauses are equally cumbersome. Given that there is a revision currently under way, I have no doubt that the Commission is thinking very hard about making these mechanisms better adjusted to the paradigm shift in how data is processed across borders. Who owns the data?
I think that question is a very good one that relates really to the principle that the representative has put forward as accountability. So basically, if you use a cloud service, to take an example, and you want to shift to another one or want to retrieve the data which you have put on there, you should be able to do so. And Microsoft has invested in data portability. We do have data portability principles.
And there might be a necessity to make a distinction between the data you've put on that service and data which was produced during that service. But that is something which we welcome being part of the discussion. Together with that, I think more transparency is extremely necessary for data subjects to understand what they can do. So transparent terms of use are extremely welcome. There is obviously the problem that more information is not necessarily better information.
So our approach to informing our customers is to adjust the information in a certain layered approach, so they can get the higher‑level principles of how we treat data but then can go into details and click into layered information that will give them more and more information according to their needs.
Awareness and education go without saying. That is a very important part. I think that is something where actually all stakeholders can contribute ‑‑ civil society, governments and businesses ‑‑ in order to raise awareness. On the business side, it is certainly a huge issue amongst small and medium‑sized enterprises that are not able to employ as many data protection experts, or even do not have them at all in their own businesses. And that is something where special attention needs to be paid.
I think that's a really big, important step. That links back to the privacy by design principles, where you need to have the awareness of privacy issues right at the beginning, when the services and the software are developed, so that it can be inherent in the development process. And I mentioned our security development life cycle for that purpose.
The question around the principles of my being responsible for what I put on the web myself was discussed around the right to be forgotten just before we came to this room. And how I can eventually enforce that right to be forgotten is very much linked to how laws actually apply: if you put data about yourself on a blog, you are in a way exempted from data protection rules because you are considered to be a household, at least under the current status of the law.
So when your picture gets retrieved onto another website and you don't want to have it on that website, there is a really different set of rules that applies to that specific situation, with its own complexities and challenges, in particular when it comes to how to enforce these rights. But they are not necessarily being dealt with in the same context. And I think this is all I have noted; I am certain I missed a couple of things. Sorry.
>> There was a question ‑‑
>> It sounds great.
>> I can ask you an answer. Unfortunately, we don't have much time left. Thank you very much.
>> I will quickly try to add some comments to this. Thank you, Cornelia, for putting up the whole picture. Certainly the question of definitions, mentioned by many of the speakers in the audience, is a big challenge: how do we define personal data? What I believe is that we cannot go into very precise definitions, namely enumerating what is personal data and what is not, because that will lock us in; in the next ten years we would have to revise the whole framework. So unfortunately we have to stay with the general principles, which are not very comfortable for business, I can understand that, as well as for customers, who might not know where the borderline between personal and not personal has been set by the law.
But there is no other way. Maybe we can rely more on jurisprudence and international courts to have more say on that. So I don't see a clear and easy answer. I also think that the very basic principle should be that we own our data, and everything that follows, all the principles, are built on that one. I can understand that, looking at the current landscape, it is easy to think that we no longer own our data, that it is owned by the cloud or by the government. That is something that we should work on and change.
Who should protect data? Data protected by whom? Certainly by the authorities, by the State, and enforced by the law. But the protection applies to all entities that process data, be it individuals in a certain context or be it business. So this is a very general concept, and there is no easy answer on that one.
What is also interesting ‑‑ and I would like to get back to broadcasting versus giving data in confidentiality ‑‑ is that I do see there is a problem here. I totally agree with Cornelia that this is not so much a data protection problem, because when you broadcast yourself, you move more into the protection of your works of art and other things.
But I can imagine profiling is becoming an issue here. If I put my data on the social network and I broadcast it, I make it available to the whole network; yes, I cannot call on data protection to say, please don't quote me. But if I discover I'm being profiled, with many other individuals, on the basis of what I wrote, by certain companies or by the government, it might be a data protection issue.
So I would say yes and no. I mean, there are different rules for sure, but we should see some threats here. Thank you.
>> Yes, just to make it quick, because we have two other very passionate questions. On anonymity, it is a question where we face the balance between freedom and security. You have the freedom to express yourself on the internet. But on the other hand, sometimes law enforcement agencies need to find some people. They need to find the identity of that person.
So this is a challenge for our democratic regulation, how we balance ‑‑ I don't want to answer that question. We as a DPA apply laws; we don't decide them. My point is, if I can make a comparison: when you drive your car, there is, as you say, an ID, an identification. I can see the number of the car. But I don't know exactly who is in the car. I don't know the name. I don't know the address.
And I cannot access the file that would give me this information, because I'm not a policeman. I'm not a judge. But there are people in our society who, because the law says so, can access that information for very specific purposes. Maybe that is the path we can find. The question is not anonymity itself, but which regulation we can find so that, for some critical information, when we need to ensure the security of our citizens, that information can be used. As for all the other questions ‑‑ who owns the data? I think you own your own data. It's your property. And with your property, you can rent it, you can use it, or you can destroy it. You can do things with your property.
But the most important thing is that, when you're free to do something, you have all the information on what you're doing, so that you can trust the people on the other side. So the point is not to forbid you to do something. It's just to create a framework in which you can use your freedom, and to avoid that what you're doing will backfire.
You had to learn to drive a car, and it is something different to drive a small car and a large car. So it is about allowing citizens to really have control of all those formidable tools that the internet brings to them, because if they don't know how to use them properly, they can backfire on them. The thing about the internet is that information continues to exist on and on.
And the last thing is, you can change your mind. You can rent your property, but you can have it back afterwards, if you no longer want to rent it because it's annoying you or you just want to use it yourself. These are just images; try to apply them to personal data. It's the same thing: when you put yourself on the internet, you choose to do it and the conditions under which you do it.
It is a matter for all stakeholders: we need a framework for all personal data. But then there are the questions of how we apply it in each case, how we apply it to collection for law enforcement. We need some very common principles for all data collection. Thank you.
>> It is more of a comment ‑‑ it is a remark on this idea of owning data. I strongly disagree with what you said, because I think it's not real. Owning information: if we look at the real world, the former world before the internet, if I'm writing a letter to a person, I'm not the owner of the letter anymore, with the whole content. If I'm telling someone a secret, I'm not the owner of the secret anymore. I trust the person I'm telling the secret not to tell it to another one, creating gossip or something else, but I'm not the owner of it. It's not my property. And if I'm going out and someone is taking a picture of me, I'm not the owner of the picture, the photo.
This idea of owning data is, I think, very dangerous thinking: telling people you are the owner of everything about you. If I say you are the owner, you're the only one who can decide what to do with the information.
The decision is yours at the beginning. That's the most important.
>> My apologies for before. I guess my comment is somewhat belated, in that personal data, in many cases, is interpersonal data and involves multiple people. So if we have a meeting that is discussed or recorded, or there is a photo of both of us, who owns that data, and who has, essentially, the right for that data to be published? And then are we infringing on freedom of speech? Is that an issue?
I think it gets complicated very quickly.
>> I just think that, when we talk about owning data, the term used in most of the European treaties is data controlling, not necessarily owning: something about the rights that you have over that data. If you didn't have these sorts of rights, there are a lot of things in the realm of confidentiality which are not covered by intellectual property.
There is no legislation on trade secrets necessarily, unless a particular state has enacted it, and processors in countries like Pakistan, India and others actually process data on the basis of confidentiality agreements. If what you're saying is true, that data is not owned, then we would be out of business in developing countries, because you would not be able to guarantee the data that we are receiving. That means that data could be given to anybody. And that's a major concern.
We're not necessarily talking about ownership; we are talking about rights connected to that data. And there was another point made, that this never existed in earlier times. I don't know about that, because at least in the common law countries, the biggest owner of data was the government.
And if you disclose any information that the government had that you got as cabinet minister or an official working in a government, you could be prosecuted and put into jail. Now, whether or not those sort of rights extended to business or not is a different question but it's not true to say that this sort of right of control of data didn't exist earlier.
>> Thank you, from the Austrian government. I just want to come back briefly to what the German colleague asked and your reaction to it. I think you are right in saying it's not so much about ownership; it's a question of what your constitutional court in Germany has developed, informational self‑determination: what happens with my data. And coming back to that, I just want to ask how this applies to the situation of children, because at the Council of Europe in 2008, the 47 member states declared that, other than in the context of law enforcement, there should be no lasting or permanently accessible record of the content created by children on the internet, and the member states were invited, with relevant stakeholders, to explore the feasibility of removing or deleting such content.
So I was very much involved because I was chairing the steering committee at this time. And I'm frustrated about this because when I ask my colleagues about the evaluation and what really happened, nobody really knows an answer. So I just ask that, what advice would you like to give us in the CDMC, the steering committee that has developed this, that government and stakeholders really take an action on this declaration. Thank you.
>> Just a quick reaction on the ownership issue. I would like to defend the concept, stressing that, of course, it's not the same notion as owning a table or a chair, because information is something different. And you are very right in showing that the processing of information is a different thing and unique in itself.
But I do think we have no other analogy in the law. And it's a very good starting point. And then we have to consider exemptions: as with government information, we simply have more and different exemptions, like the right to free expression or a right to document important events. Very often data will be exempted from data protection on the grounds of something else: on the grounds of security, on the grounds of the right to information.
So I think, remembering that we do have different exemptions and different balancing here, the very concept is a good one. Otherwise we get lost thinking, well, I have no other basis for my right to claim what has been called my autonomy, or the right to determine the purposes for which my data will be processed.
>> Yes, I totally agree with that. You have to see it in that given context. But I understand the concerns about why you want to see a distinction. That is a difficulty in all of these notions and these new principles, and therefore I think it is important to remind ourselves of the context in which we would like to see them applied.
That brings me to the child safety question, which is a very interesting one, and it also relates back to the difficulties around consent and children giving consent. I think if you look around the EU Member States, the approaches, which to a large extent are found in civil code principles, already show the difficulty in defining these things. As a practical matter, there is a lot of self‑regulation going on currently within the EU to enhance privacy for children and, more generally, safety for children online. And I'm certainly willing to talk to you about this more specifically when this panel is over.
It is very difficult. We have, for example, parental controls that give parents and educators some way of helping children to control their behavior online. There is an educational part which is very important, where in particular schools come in. And I think there has been a tremendous effort by a number of NGOs, in cooperation with businesses, to really go into the schools and educate children about the risks when they are online.
>> A question: how does industry see this recommendation or this declaration that data of children should be deleted automatically after a certain time?
>> I think, in the first place, in certain circumstances there is agreement. For example, when data is collected for advertising purposes, there is agreement among the advertising industry that there is no segmentation of children. So there's no data collection in the first place.
This is a very concrete example of where the protection of children is safeguarded. Parental controls are another thing. And then many of the services have age restrictions or age ratings, which again help the parents.
It's certainly a complex one, and it goes back to identity, because very often, as a service provider, you don't necessarily know whom you're dealing with. So it is not an easy one. And on the current discussion, I would actually invite more Member States to participate in this EU discussion, in which, as far as I can see, only the UK is currently taking part.
>> NORBERT BOLLOW: Thank you. I also was planning to criticize the concept of ownership of data, even though I really like it in a way. But I think it should be more accurately phrased as, each person having dignity rights that unlike property, cannot be sold. That's the first distinction that I would draw. Otherwise, there will soon be companies claiming ownership of my personal data. And that is not something I want even if they claim they gave me money for it. No. It's a dignity right. It cannot be sold, cannot be traded. That is something that I should be allowed to have as a human being.
So this concept of ownership has its limits, even if it can inspire good ideas, but then I think we should look closely at these ideas and really make sure they make sense even if we drop the word ownership.
Just as a very quick aside, as a free software person, I'm very, very familiar with how thinking about ownership, like ownership of software, can become a mind-block, because there is this huge world that many people cannot see if they just think in the intellectual property way of thinking, which limits what they can see in some ways. So I think we need to move to a different framework of protecting this human dignity.
And I would suggest that it is more appropriate to think about a framework of responsibilities, responsibilities which, of course, must be designed with the goal in mind of protecting human dignity and human rights and which must be designed --and this is very important from my perspective-- for compatibility with citizen's freedom in every respect.
It does not help me so much if somebody designs data protection into the software that I use but I then lose the freedom to change my software. As a free software person, this is unacceptable to me, especially because I love free software precisely because I can use it to protect my privacy.
>> Thank you very much.
(Away from microphone)
>> Thank you very much. We've seen the issues are integrated, one into another. We started with new principles, and we already had those questions about data flows popping up. There is a concern about a regional focus that has been mentioned several times, and I would like to address that: I don't think it's that relevant.
The Council of Europe has organised this event, and you have European panelists at the table, modernizing a convention. But, as was said, this convention was drafted at the time with non‑European states, and it is open to non‑European states: Uruguay, now invited to accede, will become the first non‑European party. And most of all, we are modernizing it now with this global mindset, and there is no better forum than this one today to exchange together, because these are issues which, as we've heard Kenyan representatives saying, they face as well. We need to raise awareness. So we need to discuss this all together. So, precisely, about transborder data flows, linking back to the convention, because that's our working basis: the convention 30 years ago did not foresee anything in terms of transborder data flows.
It focused on the free flow of information. We didn't want our convention to be an obstacle to the free flow of information. That was the starting point. In 2000, we realized that some things needed to be done about that, and a provision was added that basically specified that you can have those transborder data flows when the legislation of the receiving party, of the other member state, is adequate, in conformity with the level of protection we're setting. So that's the basis.
Is this still relevant? Can we still speak about transborder data flows? We've heard terms like ubiquitous, instantaneous. How can we effectively protect our data, knowing that it's in the cloud, as has been mentioned, and that exchanges as they were foreseen at the beginning are not necessarily relevant anymore?
>> Maybe a first reaction from you.
>> I wanted to say, as a non‑European: in my country, the Council of Europe treaties have actually played a very important part. We may not have been part of the negotiating process, because we are not a European country, but we do take them as an important source of legislation, etcetera.
Now that you're opening this up, I'm sure there will be countries asking for accession. I want to support that from a non‑European developing country perspective.
>> Thank you very much. Our panelists, before we come to the panel, would you like to say something? Okay.
>> My name is Patrick Ryan. I'm with Google. This is a great discussion. I have a question. The APEC principles were highlighted at the outset.
We're seeing a world where there are essentially two different types of regimes. There are multiple regimes, but the two largest are the European model and the APEC model, which has been recently finalized and is entering an implementation phase. I'm wondering whether the APEC principles, which are much more self‑regulatory than European minds would be comfortable with, are complementary: are they going to be part of a complementary regime, or are we going to see in the next couple of years some tension and a war of standards for privacy in this respect?
>> So, just to start: the thing is, in terms of governance, we cannot think we have only one rule. So at this stage, it is very interesting. But in Europe we also have things called binding corporate rules, which allow international companies to exchange data across frontiers when they operate in many countries. But it may be different, because when you are building a rule saying, I am an international company, I would like to transfer data everywhere in the world, you have to be authorized by the data protection authority. So there is still supervision by the data protection authority.
But it is also a kind of walking together with the companies, because it is much easier for a company than declaring the processing in each country. So the APEC model, at this stage, could be complementary. I don't want to prejudge it, because we are still studying it. But it cannot replace, I think, the need for global standards, something like Convention 108 of the Council of Europe, something that sets principles for the world. Then afterwards we can see how we implement the principles, or maybe we can use a mechanism of self‑regulation or mixed regulation. But my point is, it cannot replace something global. And I think we'll talk about this after. Thank you.
>> I obviously know the APEC framework to a lesser extent. My understanding, however, is that the outcomes and goals are very comparable to the principles that are entering into laws. Therefore, having overarching global standards seems useful in order to achieve, eventually, some sort of mutual recognition of those different schemes.
One tool which hasn't yet been mentioned, but which we were discussing in our prep, is the standardization process in the more technical world, for example at ISO: there's a lot of activity going on currently to adapt current standards, like the ISO 27000 series, to the challenges that we are discussing in the legal framework, to get this into the standardization world, which would certainly also help in that mutual recognition process of those different ways of approaching what is really the same goal.
>> We'll come back to this issue in the third part, which I think is precisely the place to discuss the role of binding corporate rules and self‑regulation in the general framework. I think it's a great question, but maybe we move forward.
>> On the notion of transborder data flows, is there anything that you would like to add in particular? I look at my watch: it's nearly half past 12:00, and we have already started going into the global standards discussion. Should we move to that now?
>> I'm happy to move to that now.
>> I will say something. From our point of view, information starts from point A and goes to point B, and that will cross borders. The internet is, in a sense, the extension of our societies. We cannot say there are no rules across borders, because if we say that, we will lose all confidence. There is a need for rules. In any space you live in, you need common regulation. Once you have said that, the challenge is among us. Thank you.
>> I'd like to briefly remark that formerly the problem was companies having a database of some kind and transmitting the entire database from one country to another. That challenge, I believe, is mostly unchanged, and existing law can deal with it. But we now have a new situation, with new kinds of data, that doesn't fit those old assumptions; it doesn't fit the old model, and we really need a way that citizens in country A can be protected.
It is not enough just to have a good principle in the law of that country. You have to have the practical ability to actually assure yourself that the data is protected, that there is security, and that a company which violates its responsibilities actually will get punished.
>> I would have expected some reactions to that. No one?
>> I'll give an initial reaction. A lot of this discussion is good. What I heard, and maybe I'm incorrect, what I heard is that there is room for different kinds of standards as long as the European standard is accepted by everybody.
[ LAUGHTER ]
Right? And, you know, maybe that isn't the conclusion here. But I think it's worth discussing a little bit. The APEC countries represent half of the world's GDP. There is very little overlap between the Council of Europe and the APEC countries.
>> Any other ‑‑ yes.
>> Evan Muirfield, I'm an internet entrepreneur from America. Along that same line, from a most practical and immediate standpoint, a business standpoint of trying to create companies: the market out there today is highly globalized. I can create an application that is available to almost anyone around the world instantly. But it's a tremendous hindrance on the ability to actually innovate and drive that forward that you have such fragmented standards for data privacy around the world.
An approach that's at least able to take the lowest common denominator, the most basic standards, and implement some kind of harmonization across different regulatory regimes would be useful for making cloud computing more practical and driving that forward.
I'm new to the Internet Governance Forum, but, you know, a lot of the very abstract principles around ownership or human dignity, that is wonderful, but there are immediate issues that are preventing people from distributing to the market the way that they could.
>> What I would like to add to that, and something which again relates to our question, is really the legal certainty that businesses need in order to know which law actually applies in a given circumstance. That is something which has caused a lot of uncertainty in the normal course of business. You conform with the rules, and there is some sort of infringement at the other end, and there is this catch‑22 situation where you can't actually do this right.
This really goes in the same direction. For business to actually be competitive and work in that space, this is one of the key issues: getting the applicable law really clear and easy to understand in all circumstances. And I think, specifically in current revisions of frameworks, a lot of attention needs to be paid to that, because in the current situation in Europe there are clearly laws competing in that space, where you can't comply with both. That shouldn't be the situation for any business. And it doesn't help consumers either, because in the end a business can't really be transparent about what happens with the data if the law is not clear in that respect.
>> Let me just quickly jump in from the civil society perspective of being able to enforce rights. Obviously, clarity of law is fundamentally important to that, also. Civil society organisations don't have huge budgets. So what can we do if we cannot even figure out which law might apply? Please fix this problem. I completely agree with my business colleagues on this one.
>> I don't know if we are at the third question now, so I can move on to the minimum: how to ensure minimum international standards. I guess the how-to in our debate today should be the key question. We all agree that certain new principles are needed. But the hardest question remains: all the issues we've been discussing have global relevance, but there is no global authority.
And it has been mentioned that there is no legitimacy for the EU to impose their concept of privacy on the rest of the world. From what I know, there are big discrepancies between how Europeans perceive privacy, as some kind of social right and a very important public policy issue, as opposed to the U.S., where it is entirely an individual right that can be waived by the individual, with not much social relevance.
I had the pleasure of talking to Daniel Weitzner last week about the U.S. approach, and he confirmed that there is actually no chance that the U.S. will move towards having coherent data protection principles, for many reasons, one being the lack of time to work on the huge framework they have, a very complex framework.
So imagining that ‑‑ I would like to have, of course, cross‑border principles between India, the EU, and the U.S., for many reasons, but it's a fallacy. We will never get there, or at least we won't get there in the next 10 years. So it's a very serious question how we approach that and what other standards we might consider; here comes the question of binding corporate rules and self‑regulation.
I'm aware that in the U.S. this is the main tool of regulation: namely, businesses impose rules on themselves and then they get enforced by the state. Well, it might be a way forward, as long as consumers and society have this grassroots movement pushing the business in order to get the standards higher.
I'm skeptical about it. I see an interesting movement in the U.S., but I do not see the same movement, for example, in eastern Europe, where I'm from, where the companies feel much more powerful than the governments, and they are more powerful. So we really have a serious issue. I'm actually more interested in your opinions on that, because I don't have a good answer to it.
>> So we indeed switched to the last topic we wanted to address, which is need for global standards. Anton or Cornelia, your views.
>> CORNELIA KUTTERER: I think I've already made my point on the global standards. I just want to note that the European framework actually also allows for self‑regulation. Eventually, business needs to be critical of itself and say we could have used that tool more. In the past, I think, that was noted by the Article 29 Working Party.
There are tremendous, and that brings us really into what you said, tremendous cultural differences in how we approach legislation compared with the U.S. And from a civil society perspective, I wouldn't be overly critical of what comes out of the U.S. I think there are some really good improvements going on in that space. And I give a lot of credit, in particular, to the Federal Trade Commission, which is putting enormous effort into enforcing privacy and really forcing businesses to play along.
And I think the current discussion around tracking protection, both in the EU and the U.S., is really pushing companies like Mozilla or Microsoft and others towards technical solutions like tracking protection lists, for example, or Do Not Track.
So there are a lot of things going on. And sometimes my feeling is that if we focus on some very specific issues and find a solution, that eventually will really improve privacy more than only talking about high principles. So sometimes it's just going down to the root and solving the problem.
The example I just mentioned, tracking protection, is really one of those. In Internet Explorer 8 we started with InPrivate Browsing. And we have developed it further, really because of the public awareness, into another tool, a tracking protection list, which really helps. And it is actually not based on self‑regulation from business; it's really self‑regulation from society: they can decide what they want to have in there. So it's a really powerful tool. And that is what technology can actually provide in that context. So we should not only focus on the one, but really sometimes focus on practical solutions as well.
>> Norbert, I don't know if you want to add something, Norbert.
>> NORBERT BOLLOW: Thank you. Of course, I want to add something when you talk about international standards. I know that in this context, people think first about some kind of international agreement, like a treaty. But let's not forget that there's something called the International Organization for Standardization, ISO.
And this does not only cover technical standards in the narrow sense; it's quite feasible there to develop standards defining very precisely, for example, responsibilities and how to exercise them in the context of data protection, in its various aspects. And if you do it that way, first you have an international standard. It is precise. It can be certified against. And then each country has the freedom to make that standard mandatory for compliance with the data protection regulations of that country, as a condition for doing business in that concrete country.
You could have an international ISO standard defining responsibilities for good data protection for companies that hold advertising‑related data. That in itself has no legal force, but it is developed in a transparent, international manner.
For example, a country in Europe or a country in Africa, wherever, could decide to impose this rule: if you want to sell advertising to companies in my country, then you must comply with this ISO standard. And that is World Trade Organisation compatible, because it's an ISO standard. That's one way to implement international standards in a legally workable way, even without first getting everybody to harmonize their laws, which just isn't going to happen.
>> Thank you. Just one quick question: when is the International Conference of Data Protection Authorities?
>> Is it next month?
>> Just to make the point that in 2009, as you all know, DPAs from around the world, much broader than Europe, reached a consensus on standards. Maybe it's too little known, but it's a very important step in global governance and privacy because, as I say, it's an international consensus; it is not the Europeans imposing something on other people.
But it is only a first step. The second step: in this multi‑stakeholder world, the only thing governments know how to do is treaties, because it is the only tool they can be sure will be followed. If it's just a memorandum of understanding, they don't know. The thing is, if you want to be sure they will follow it, you have to create a treaty. And a treaty, as the conference said in 2010, is the objective. So we need to reach it in a few years, to create global principles for everyone: for business and for the protection of users, so that everywhere in the world there is the same law. But it's not the end of the story, because everyone will have to do his job. You cannot put everything in the treaty; you don't have everything in your constitution either, that's impossible. Then you will have to enforce it or apply it in many ways: standardization is one way, binding corporate rules are another way to do it. So it's a puzzle. My point is, if some consider that you can achieve the puzzle without the very important piece, the treaty, you cannot achieve the global standards that everybody needs now.
So that's why we encourage a dialogue, because it will lead us to this goal. And we are not here as a DPA to say the treaty is the only solution: just create a beautiful treaty and all the problems will be solved. No. But we need a treaty. And we have one, in a sense, because the convention today in the Council of Europe is a really good start.
And from a public policy point of view, we have statements made by DPAs. And we need to create that link with governments, because the DPAs cannot create a treaty; governments must create convergence from the public side. And we can also encourage the private sector to create the kind of convergence that we need.
So that's my point. Everybody has to do his job. We will have a very good thing in the end.
>> Thank you. I've been hesitant to speak up because I'm not 100 percent sure yet what my message is. But the one thing I wanted to say is that one shouldn't put too much hope in standards, because, unlike technical standards, standards on data protection are very difficult to enforce. It's easy for everybody to cheat. So I don't think we can expect too much from standards.
I've been having a conversation over breakfast this morning with somebody. And we both thought that for the private sector, this is an area of great uncertainty, because nobody knows what users of the next generation will think. Will they think that data protection is cool, or will they think it's completely uncool? And what will actually determine this development? Nobody knows.
At the same time, the business models of more and more companies depend so much on this crucial question. One potential development is an accident: like Fukushima, which in Germany caused a complete turnaround of government policies.
The same could happen in this area. I don't have a clear idea of what kind of accident that could be, but it's a possibility that users' opinions and attitudes radically change because of some black swan kind of thing.
So I think companies need to prepare for various developments in this area to be competitive. They need to take into account that new business models might be built on strong data protection, so that the market plays a major role in creating new options for users, and users change their preferences and choices.
At the same time, this might not happen. So I would say, one way forward is to give users more choice in this, to give them more transparency about what happens with their data and allow them to interact with companies about what they want.
I'm not sure this creates the degree of certainty all stakeholders would like to see in this field, but I'm pretty sure that standards or treaties won't do the trick.
>> I want to quickly react to this. As much as I agree that standards alone are not the way forward, I think thinking purely from the individual perspective is a kind of trap that we mentioned already. What I want to say is, it all depends on how you perceive privacy and data protection.
Do you perceive it as a purely individual matter, or do you see it more as a human rights issue, where you might have individuals consenting to the processing of their data, like youngsters, teenagers, having no problem with this whatsoever, but from a broader perspective you might see this as a social problem, and you might have this challenge of educating people?
So when you say transparency, I think all of us agree that it is a necessity, but I would say it's not doing the whole job. We really need, and that's my perspective, maybe not a popular perspective, this accountability, data minimization, and principles that protect individuals even if the individual in question does not feel like getting protection.
So we have the big challenge of whether privacy should be an individual thing that you can waive, or whether the State imposing the concept that privacy is important. I think Europe goes more towards the human rights perspective; the U.S. goes more toward individual rights, which may not be relevant anymore for the next generation. But I would fear that there is a danger in people choosing things without full awareness of what they are choosing, and in us not being able to educate them online.
>> Can I clarify one thing. I'm just not sure whether you can enforce the laws. I remember back in Germany, for example, for a long time we thought the big companies all complied, and our problems were with the small companies.
And then we had scandals like those with German Telecom. Nobody could believe that such a big company would violate its own data protection rules to such an extent. So how would you enforce these, particularly if they are cross‑border rules? That is what we need to take into account, I think.
>> Just a quick comment, we are only 150 people. Our job is to enforce our French data protection law and the European laws. That's why we talked about accountability: we can't stay on this stage and say let's continue as it is, because things like that will happen again. With accountability we can work before the fact, in a fully collaborative way between DPAs and companies, to be sure that they will implement processes that will better protect our citizens in the end.
The fact is that we cannot see everything everywhere. But accountability can be a solution.
>> NORBERT BOLLOW: Let me just quickly jump in from the technical perspective. When a standard is too abstract, too high level, you cannot really enforce it, because it's not possible to determine precisely what you can do with it and what you should not be allowed to do. But when a standard becomes very specific, then you can actually certify against it.
Many ISO standards are designed for certification. You can get a certification, and there are penalties when you lie in that kind of process. You can in principle do it; it just takes the will to do it. And if the governments say certain companies have to present the certification, and the governments send people into the committees developing the standards to make sure that the certification actually means something, then you have achieved something. Short of that, we have only created an illusion.
>> A lot of this discussion on standards and standardization and ISO is really healthy; it is good to hear it from other people. There is a lot of work that all of us can do, but this is a very healthy discussion. Thank you.
>> Thank you very much. Thank you for your patience. We've been very happy to share this with you today. I would like to invite you tomorrow to the open forum of the Council of Europe, which is not specifically dedicated to privacy issues but to the global internet governance strategy, tomorrow morning at 10:30. You're very welcome.
Thank you to our panelists. Thank you very much. I think we had a very good balance and mix of views. And thank you to the audience.
(End of session.)