The following are the outputs of the real-time captioning taken during the Thirteenth Annual Meeting of the Internet Governance Forum (IGF) in Paris, France, from 12 to 14 November 2018. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MODERATOR: Good morning. It is just a little test. Not much is happening. Okay. Okay. Good morning. Thanks for coming to this early session here at the IGF.
We have a few technical problems, but we will just go ahead. The transcript is not working right now. Can you hear me well? Thanks. I work a lot on privacy issues, so it is quite natural that we want to address privacy issues today. With me we have a wonderful group of experts, and I am very grateful that they agreed to come.
Just to my left, Emilie Serruga-Cau is the head of the public affairs department at the CNIL, the French data protection authority. Next to her is an adviser on democracy and digital rights at the Centre for Human Rights. Next to her, Graham Webster, fellow and coordinating editor of DigiChina at New America. Then Wafa Ben-Hassine. And finally, last but not least — and I am immensely grateful that she is here today, because we had a little hiccup and I asked her only this morning to replace someone — a director of governance at a law university in Delhi and an expert on privacy.
I wanted to start with just a few words on the work of our office on privacy issues, and then focus on one particular field of issues.
In our work on human rights online, privacy is obviously a core topic. The first wave of work in the last few years started with the Snowden revelations in 2013, which led to a report on the right to privacy in the digital age in 2014 that set quite influential standards, in particular when it comes to government surveillance. Things have moved on since.
In 2015, the Human Rights Council created, as many of you probably know, the Special Rapporteur on the right to privacy, and this year has really seen a shift of focus to new topics, both in the real world and in our work. We had the General Data Protection Regulation (GDPR) coming into effect in the European Union. Also influential: in September this year, the High Commissioner for Human Rights presented to the Human Rights Council a new comprehensive report on the right to privacy in the digital age.
In parallel, you may have heard that the Secretary-General has developed a comprehensive strategy on issues around digital and emerging technologies, in which privacy plays a core role. And also within the UN system, between many offices, agencies, et cetera, we have developed a set of quite fundamental principles that we intend to develop into policies in the future.
Just a few words about the report and then we will move on to our experts.
The report, as I mentioned before, adds a new dimension: it moves away from a purely government-surveillance-focused approach, without abandoning it, and looks more closely at new developments in data-driven technologies, i.e. big data, artificial intelligence, et cetera, and the related issues. It raises concerns connected to the ever-growing digital footprint of everyone. Based on that, the report develops fundamental standards regarding the responsibility of business enterprises, the duties of states to protect against interference by third-party actors and, of course, the state duty to respect and promote the right to privacy. On that basis we came up with a set of minimum standards, we would say, for the legal frameworks we think are necessary, both for regulating states and for business enterprises. One particular trend the report identifies is a growing reliance on biometric information of all kinds for the identification of individuals by states and by businesses, and we see this as a trend that requires close attention because of the sensitivity of biometric data.
So I want to transition now to our discussion around biometric data and ask Emilie, next to me, to set the frame: to describe what kind of data we are talking about and what kind of standards the French data protection authority applies. Thanks so much.
>> EMILIE SERRUGA-CAU: Thank you very much. It is a pleasure to take part in this event. Just a few words to begin. The subject selected for this round table demonstrates that, through technological development, biometric data have become a key element of identification and authentication processes, and this calls for particular attention from the data protection and privacy point of view. Biometrics have entered daily life: for example, biometric data can be used to unlock your mobile phone, to access your car, and for many other purposes.
In the European Union, the GDPR addresses biometric data specifically and provides this definition: biometric data is personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that person, such as facial images or dactyloscopic data. More specifically, the European regulation considers biometric data a special category of data. That is really important, because it is a special category of data together with, for example, data revealing racial or ethnic origin, political opinions, et cetera. For these special categories of data, the GDPR lays down as a general principle the prohibition of processing, subject to exemptions, and the GDPR specifies what they are. We could mention, among others, the explicit consent of the data subject, legal obligations in the field of employment, social security or social protection, and the protection of the vital interests of the data subject.
Even when the processing of special categories of personal data, such as biometric data, has been allowed under an exemption, all the other obligations derived from the GDPR still apply, in particular privacy by design and by default, the security of processing, and the data protection impact assessment. These are essential obligations to ensure the safety of biometric data processing.
Finally, member states are entitled to maintain or introduce further conditions, including limitations, in respect of biometric data. French law has regulated the processing of biometric data since 2004, relying first on a prior authorization regime, and this framework was updated in 2016, to take account of the latest developments, with two single authorizations covering this category of data.
Generally speaking, the processing of biometric data can bring benefits for people: for example, making identification and authentication procedures easy, fast and convenient, improving the user experience, ensuring safety, et cetera. But there are also risks: for example, a gradual loss of privacy if no adequate safeguards are implemented, risks linked to the misuse of biometric data, et cetera. In this context, the main principles applied by the French DPA are the individual's control over his or her biometric data, and data minimization.
There are strict requirements for domestic uses of biometric data, such as mobile phone unlocking or application access. In this case, the use of this type of data is allowed for private and individual use only. The use must be optional and at the sole decision of the data subject. The biometric template must be stored in the device, in a dedicated space not accessible remotely, so it cannot be read or disclosed externally. And the template must be protected by state-of-the-art encryption.
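The on-device storage rule just described can be illustrated with a small sketch. The following Python example is hypothetical (the function names and the software-held key are assumptions for illustration; real devices use a hardware-backed keystore and fuzzy biometric matching rather than exact byte comparison): it seals a template with an HMAC tag keyed by a device-local secret, so a tampered record is rejected, and verification happens entirely on the device.

```python
import base64
import hashlib
import hmac
import os

# Stand-in for a key held in a hardware-backed keystore (assumption for this sketch).
DEVICE_KEY = os.urandom(32)


def seal_template(template: bytes) -> dict:
    """Integrity-protect a biometric template with a device-keyed HMAC tag."""
    tag = hmac.new(DEVICE_KEY, template, hashlib.sha256).hexdigest()
    return {"template": base64.b64encode(template).decode(), "tag": tag}


def verify_template(record: dict, candidate: bytes) -> bool:
    """Check the stored record's integrity, then compare the candidate locally."""
    template = base64.b64decode(record["template"])
    expected = hmac.new(DEVICE_KEY, template, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["tag"]):
        return False  # record was altered after sealing
    # Real systems do fuzzy similarity matching; exact comparison keeps the sketch simple.
    return hmac.compare_digest(template, candidate)
```

In a real deployment the sealed record would additionally be encrypted and the comparison would be a similarity score against a freshly captured sample; the point of the sketch is only that the template never needs to leave the device.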
In the state context, the opportunity to implement such processing is strictly defined by the law: there must be a legal basis to process this type of data. The processing must be based on the state acting in the exercise of its official authority, and the data must be strictly necessary to establish or check the identity of individuals. These requirements correspond to the application of the principle of privacy by design and by default under the European regulation, and they could serve as a model when considering the development of common standards on the processing of biometric data at the international level.
In the present context, the protection of personal data and fundamental rights requires a global response. Technological developments, ranging from the aggregation of data to artificial intelligence, create unprecedented capacities for the surveillance of individuals.
A concern specific to the processing of biometric data is that biometric identifiers are unique and, contrary to passwords, which can be changed if compromised, that is not the case for a facial image. In this context, data protection authorities are on the front line, developing standards that take account of technological change and continue to protect privacy. Thank you very much.
>> MODERATOR: Thank you so much. May I ask Lola to jump in? Thank you.
>> LOLA: So I will just be presenting a case study from South Africa on how a use of biometrics that was intended to actually give effect to human rights was essentially misused and led to the violation of rights.
So over 10 years ago, the South African government established a social security agency (SASSA) to distribute social grants to what are now over 17 million beneficiaries in South Africa. Because of corruption in the system, the government sought a new way to distribute the grants. In 2012, SASSA entered into a contract with a company to help in distributing them. This company is called CPS, and it belonged to Net1. Net1 was a huge company that provided a lot of financial services in South Africa, from insurance to — it even had a bank of its own — and it also provided things like cell phone data to citizens.
The Constitutional Court then found that the process of awarding the contract to CPS was invalid and unlawful, and it was going to set the contract aside; but the declaration of invalidity was suspended so that grant payments could continue, and the contract persisted. In the process, the personal data of beneficiaries was compromised. Let me explain what happened. The deal was that beneficiaries would be registered with their biometric data and issued cards, and they would then withdraw their money at an ATM.
So take, for example, a single mother receiving a grant. The company had her information and marketed products to her: loans, or insurance policies. What happened at the end of the day is that at the end of the month there were deductions from her grant, to the point where at Christmas beneficiaries did not have money to support their families. CPS used the data it held, and the information was shared with subsidiaries of the group, which would use it for this marketing.
And because of what was happening, a lot of civil society organizations took up this issue and campaigned against the deductions. So one of the things happening now is that beneficiaries are opening accounts with the Post Office, and the state then controls how the funds come through and are paid out. There is now a new payment process being rolled out in South Africa, designed so that deductions cannot simply be taken from the beneficiary.
The remaining concern is the information CPS still has access to. It was announced that CPS had agreed to delete the information once the transition happened, but how do we verify that it has done so, or will do so? Every time a beneficiary verifies themselves, the company has access to the biometric data. So at the end of the day, the data ended up with an organization that should never have kept it. (sound cutting in and out) There is a lot more to say, but like I said, this shows the kind of harm that can come with biometrics. Thank you.
>> MODERATOR: Thank you. Graham, you have some insights from China that you can dive into. Thank you.
>> GRAHAM WEBSTER: So I come to this table as an organizer of the DigiChina project. We work to understand technology policy in China from a number of angles, with the specific notion that there is a lot of good information out there in Chinese, but not much in English. So I come with a number of things in my mind that are thanks to the work of our community of volunteer translators and analysts, and I want to acknowledge that first. The project that everybody has been building has made it possible to do a great amount of analysis that we would not have been able to do alone. Thank you to our community out there first.
But today I want to start a little broadly with the emerging personal information protection regime in China, go into a few specific areas of biometric use in China, and then offer a couple of recommendations, and we will see what we have time for. There is a trope in the international conversation about privacy in China in recent years: that companies can exploit user data for profit, and that in some narratives this provides a competitive advantage. The reality is, unsurprisingly, more complicated, and we see a fairly significant, strong-looking, but mostly not yet tested regime of laws, standards and other documents pertaining to privacy and personal data protection. The core feature of this regime, as in some other countries, is that companies and other handlers of personal information must ensure that it is protected in the way it is processed. The Cybersecurity Law of China, which went into effect last year, has significant content on this. There are rules and procedures that provide guidelines both for private sector actors and for how new systems should handle personal information. There is a personal information protection law that has long been in the works and may be moving forward in the next few years. And, important for this context, there is a non-binding but still very forceful Personal Information Security Specification that has been issued by cyberspace authorities in China. It will look very familiar to people who have studied the major international conversations about privacy and personal data protection.
So I note that it is non-binding, but it is still used for enforcement. It has two categories: one is personal information and one is sensitive personal information, and both contain the same list of biometrics that are a focus: genetic data, fingerprints, voice prints, palm prints, irises and facial recognition features. That list is helpful in many ways, but keep in mind that there are other biometrics out there. Specifying the biometrics brings a large degree of consent requirements around what is collected and what can be done with it. But there are enormous, just huge, exceptions: something like a social security authority might have significant data protection requirements, but once you get into the state security and police area, things are significantly less restricted and very opaque. Just a couple of examples of biometric technologies and how they are deployed in China. Public safety authorities have been making use of the existing national ID card database, which includes a photograph of everyone that has an ID card — most people. There have been a lot of media reports on uses of this data. At the annual Qingdao beer festival, as well as other events, facial recognition was used to identify criminal suspects; I am not sure whether this was matched against the national ID database. Facial recognition is also used, in some reported cases, at the entryway to apartment buildings, essentially for the security of the building. Facial recognition is used to identify jaywalkers in some cities, whose identities may be broadcast on a billboard in a name-and-shame effort to decrease the incidence of jaywalking, in a public safety and traffic sense. And in general, in most cities at least, surveillance cameras are everywhere, although it is generally unclear how many of them are actually working and what type of systems they are connected to; this type of information is not available.
So when it comes to government use of this data — the rules around retention of information from a large event like the beer festival, or from facial recognition at an apartment building door, and how that data would be protected — the picture is not totally clear. Let me talk about one more area: DNA collection. There are a number of different national efforts in China to collect genetic information, and they come in two major buckets. One is widely reported by both journalists and human rights organizations: police organizations are developing a national database of DNA, and as the Wall Street Journal has reported, DNA has been gathered including from people not suspected of any crime. Last year it was reported that in Xinjiang, biometrics including DNA and fingerprints were being collected from residents. There is a real, apparent human rights concern in this. Separately, less connected to law enforcement or security, there is a national effort to develop genetic databases for scientific reasons. Chinese companies are some of the global leaders in rapid DNA sequencing, and there is a goal of developing precision medicine from large databases.
At this point, I will hold off on recommendations; I think we will talk about some of that later. But one thing you see in the Chinese data protection discussion is a recognition that holding this data also creates security concerns. Different systems define what a good purpose is in different ways, but the notion of a risk in holding this data is very important. And observe that with biometric data — with facial recognition, we wear it on our face — it seems unlikely that none of these databases will ever be breached. We will be living in a world where large numbers of people have their unique identifiers out there, and I think that is something we ought to keep in mind as we develop these systems.
>> MODERATOR: Thank you. That was fascinating. (sound cutting out) do you want to jump in now?
>> So I am going to give you a brief overview of the system that we have in India and how it developed. The biometric identification system in India is called Aadhaar, and it started from a very simple idea: many individuals did not have an existing ID.
Everybody gets a number, and government agencies use this identification to verify whether or not you are meant to receive government subsidies. It is an interesting piece: a lot of social welfare schemes in India, as opposed to many countries, provide actual goods and services to the beneficiaries of the schemes. Some people will get food distribution — rice and lentils and things like that — rather than the equivalent amount of money paid into their bank accounts. Providing welfare in kind meant there was a lot of corruption and leakage in the food and the other goods that these people were meant to obtain, and having a digital, unique, biometric ID was supposed to get ahead of this and make sure that only those people who deserve the subsidies under the law receive them. This unique ID is a number, obtained by giving your name, age and address and also your biometric details, which are your fingerprints and an iris scan. The interesting thing in this case is that while the program started earlier, we only had an actual law come in to govern this project in 2016, and that law has been criticized for not really taking heed of data protection. On top of this, India does not have a comprehensive data protection law, so there was a vacuum. In the meantime, there was a case challenging the entire project as well as many of its aspects. The other big issue is the centralized database and the risks it creates, in terms of the data of individuals but also the national security concerns associated with it. At the same time, the use of this biometric ID number was expanded greatly. It was no longer restricted to welfare and subsidies; it was used for everything you can think of — every government service that was provided, but also many, many private services.
You needed the Aadhaar number to get a bank account and telecom services, and you needed it for your subsidies. And they started asking for the number in primary school, secondary school and colleges, while under the law the scheme is voluntary. So the number ended up being demanded without a clear legal basis, and this happened while the Supreme Court continuously issued orders saying you cannot make it mandatory for government services. There was a lot of discussion in the debate about what happens next and what is going on. And earlier this year, the Supreme Court, after five years, finally issued a judgment on this system. They did uphold it; they considered it valid. But they restricted its scope, essentially. So now the government can only mandate its use for services that qualify as subsidies, and private companies can no longer use it: your banks and telecom companies can no longer ask you for the Aadhaar number for whatever services they provide. But it is interesting that it continues to be mandatory in one respect, because the Supreme Court upheld linking the number to your income tax returns. So everyone in the country who files taxes needs it, irrespective of whether they receive subsidies or not; and the people who are below that range of income, who will not be filing, are the ones the scheme was supposedly for. It is hard to see the logic at this point. The thing is the process that was followed: first a massive project was built, and then it was locked into place without that much debate. And notice that in the situation now, where private actors cannot use Aadhaar, we have to sit down and figure out who has the data associated with the number, who has what records, and what they will do with it. If they are not allowed to use it for anything else, how do we ensure it is deleted?
What happens next is the big question. It would have been much simpler if proper processes had been followed from the beginning.
>> MODERATOR: Thank you. And Wafa I think has another intriguing example.
>> WAFA BEN-HASSINE: Yes. I think our session has 10 minutes left, so we do not have much time to discuss the recommendations, but I will try to dive into them. First, though, I wanted to provide one more example of the use of biometric data, this time in elections. Iraq held an election for the Council of Representatives this year, in May. The electoral commission decided to use a new biometric voter registration system and to integrate the use of biometric data into voter identification for the election. It launched a campaign called "your card today is your voice tomorrow" to push the electronic voting system, and partnered with local universities to encourage the younger generation to sign up. The rationale given for using biometric data was that doing so prevents fraud and improves the process. There are cards containing a chip, and the voter's details are stored on that chip.
Ironically, only days after the election, there were allegations of fraud — the very thing the system was supposed to prevent — and a manual recount of the process was demanded. (inaudible) Until this day, we do not know what happened to the voters' data, what was made of it, and whether all of the voters' data is protected, private and secure.
With that being said, I want to go through a few recommendations on the governance of biometric data. First, governments must use a refined and restricted scope for any program that employs biometric data, defined by law. The purpose of the government or the authorities that are deploying it must be explained, and the scope of the application made clear to the public. Second, governments should ensure that such programs are recognized as voluntary and opt-in, not a default security measure. Individuals must opt in, and there must be no conditioning of services, or of the ability to vote, on being enrolled. Third, when conceptualizing or designing these types of programs, the government must build in accountability. To that end, public authorities should keep detailed logs whenever an officer accesses the data, and public authorities must know what was accessed and what it was accessed for. Fourth, legal procedures and standards should be developed with care, protecting human rights and due process: when biometrics are used in legal proceedings, first-hand evidence should be obtained and used as the primary source of identification, and law enforcement's use of biometric data must be constrained. Biometric information collected by private parties must also be subject to safeguards, especially under the relevant domestic law. And authorities should undertake transparent consultations whenever they initiate any proposal to integrate biometric data. As was mentioned before, this is information one cannot change. And if a wrong piece of information is recorded, it can be fatal: if your record says your blood type is O and it is actually A, you might die in a hospital. This is very, very sensitive information. The process needs to integrate all the different stakeholders and not leave anything to chance.
This means consulting and allowing comments from all interested parties, providing feedback, and also just having meaningful engagement with industry; and all of these consultations must be made public. With that, I will hand it over to our moderator.
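One of the recommendations above is that public authorities keep detailed logs of every access to biometric data. As a rough illustration of how such a log can be made tamper-evident, here is a hypothetical Python sketch (the class and field names are assumptions, not any particular authority's system): each entry commits to the hash of the previous entry, so a later alteration of any entry breaks verification.

```python
import hashlib
import json
import time


class AccessLog:
    """Append-only access log; each entry chains the previous entry's hash."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the hash chain

    def record_access(self, officer_id: str, record_id: str, purpose: str) -> None:
        """Log who accessed which record and what it was accessed for."""
        entry = {
            "officer": officer_id,
            "record": record_id,
            "purpose": purpose,
            "ts": time.time(),
            "prev": self._last_hash,
        }
        entry["hash"] = self._digest(entry)
        self._last_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; any edited or reordered entry breaks it."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev"] != prev or self._digest(body) != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

    @staticmethod
    def _digest(body: dict) -> str:
        # Canonical JSON so the same entry always hashes the same way.
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
```

This only makes tampering detectable after the fact; in practice the log would also be replicated to an independent auditor so that an insider cannot quietly rewrite the whole chain.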
>> MODERATOR: I think that was helpful, and thank you all for providing so much food for thought that we could now start a full-day session. I think you brought up a very important point about anchoring this in international standards such as proportionality: the collection and storage of data must be proportionate in the circumstances, and there are difficult questions about what else can be derived from the data, et cetera. To the panel: in what cases is using biometric data really beneficial rather than just one solution among others, and what criteria would you say there should be?
>> Just an example: (sound cutting out) a system of vocal authentication to access services. Because some safeguards have been installed, it may give access to people who (inaudible). There is benefit there, but (inaudible).
>> It is very difficult to list out where it should and should not be used; I think you have to evaluate each case on its own merits, but as I always say, you need to (sound cutting in and out) — like I said, an impact assessment, for one, is essential. The South African system did do a lot of good, but something bad happened, and it is not as if there is no framework for knowing what needs to be done; obviously, though, the data is leaking.
>> I think these are basically the questions we will have to answer in the case of our system, and I agree that it has to be done case by case. I think the problem with Aadhaar was that it was such a large project; it is difficult to assess it as a whole, and we will have to look at each application of it. The other problem with biometrics, for example, is with people who engage in manual labour: their fingerprints don't really function, so authentication does not always work. There are a lot of issues like this. And the question that you mentioned earlier is whether there are alternatives. That is the big question: whether or not to use biometric data at all, whether any other identification is sufficient, and whether the only way to do it is with biometrics.
>> MODERATOR: Thank you. I'm sorry, but let me turn to Microsoft. (sound cutting in and out) Microsoft is quite outspoken on the impacts of facial recognition and on regulation in this area, which is linked to the discussion we are having. Would you be willing to elaborate on how Microsoft approaches this? Thank you.
>> Sure, Tim. My pleasure. Because of the short amount of time, I will make three points about the tech sector and tech companies. First, when it comes to sensitive uses of technologies such as facial recognition, in democratic societies governed by the rule of law it is really the government's job to decide how such technology should be used. If the police want to use facial recognition to conduct surveillance, it really is the government's job to make sure the regulations are in place; relying on private companies to regulate the police is no substitute for what is really the government's role. Secondly, we are trying to figure out the principles for providing this technology to all segments — whether governments or private enterprises — that would use it. It is still new, and we are trying to figure that out and to get to a point where we can share our thoughts in a more public way. Because it is all new and evolving, we know we should move into it carefully; it could do damage and harm.
Lastly, in order to do that, we have to engage beyond the tech sector. We are encouraging governments, so that if they do regulate, it would be informed, thoughtful and effective regulation. And two, one company has limits to what it can do, but if we collaborate with other companies we can learn from each other. I will learn from the other stakeholders here today, and that will inform my thoughts on how to solve this.
>> MODERATOR: Thank you. Thank you very much. I think it was quite great having this multi-stakeholder exchange. I know we are basically out of time, so I will give everyone 30 seconds to put in a closing statement. That was interesting.
[Laughter]
>> On the French national law: its first article is really important and makes a lot of sense here. Information technology must be at the service of every citizen, and its development must not interfere with human identity, human rights, privacy, or individual or public liberties. I think it is really important to keep this in mind in the context of developing the processing of biometric data.
>> Okay. Just to quickly say that while in South Africa there is a data protection law, there is a huge challenge across the continent in terms of frameworks for data protection, especially where constitutions do not include the right to privacy in their provisions. That needs to be the starting point. Very few countries on the continent have such laws, and where there are laws, they are not treating human rights as paramount. I think a priority should be defining and incorporating the principles that bring privacy and data protection into the legal frameworks on the continent. Thank you.
>> I have to say, coming from the starting point that governments are going to install surveillance and other entities will be buying surveillance technology, I think there is a very strong need for international standards work that is auditable or certifiable, such that those systems can be recognized for meeting at least some of the standards embedded in human rights principles — necessity, proportionality, et cetera. There should be a standard, and there should be a potential for certification, so that customers can choose to buy the more privacy-protective option rather than the less protective option.
>> I completely agree with you. Two very brief things. One, we should move deliberately; we cannot "move fast and break things" here. It is important to build it right. We are going to be seeing machine learning everywhere, and we have to do it correctly.
And two, "multi-stakeholder" is a long word and it is great, but we are still working in silos. The industry and much of the privacy community operate in completely different worlds, and we need to start integrating those viewpoints as well in all of these conversations. Thank you, all.
>> I wanted to return to transparency: the transparency and the accountability need to be built in. When we are going slow, I think it is easier to do, and it makes sure we do not have to clean up later. Yeah. That's it.
>> MODERATOR: Thank you so much, all of you. It was hugely interesting for me, and I hope for you too. Thank you for coming and staying with us. I hope you take insights and food for thought from this session, and I wish you all a wonderful day and safe travels. Bye.