FINISHED TRANSCRIPT
EIGHTH INTERNET GOVERNANCE FORUM
BALI
BUILDING BRIDGES ENHANCING MULTI‑STAKEHOLDER COOPERATION FOR GROWTH AND SUSTAINABLE DEVELOPMENT
THURSDAY, OCTOBER 24, 2013
11:00 AM
WORKSHOP 90
NO CYBER SECURITY WITHOUT
GOVERNMENT IMPOSED REGULATION
********
This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.
********
>> MODERATOR: Good morning, everybody. I'm very happy to see so many faces in the room and have this excellent panel here.
We're going to do a session on self‑regulation, yes or no, but we made it a bit provocative, so we actually said there's no cyber security without government‑imposed regulation. Last year the NL IGF did a panel on critical infrastructure incidents and the reporting and cooperation around them, and the panel we had last year actually produced eight or nine recommendations, of which we thought two were very interesting to pick up for 2013.
We're going to do another session this afternoon on breaking down silos and exchanging information, and the other recommendation was that if we talk about critical infrastructure incidents, then we have to talk about self‑regulation. And then we ask ourselves: what is self‑regulation? Is it actually happening? Who is self‑regulating, and in which way? Do other parties know enough about the self‑regulation, and are these self‑regulatory measures implemented by the people and companies that should be implementing them? And if not, is there some way that other parties could assist with these implementations?
So from there we went to organizing a panel. We tried to make that as good as possible, and unfortunately we had one cancellation, but that can happen. In the panel we have Aparna Sridhar from Google. So welcome. We have Nina Janssen from the Dutch Ministry of Security and Justice. We have Astrid Oosenbrug from the parliament of the Netherlands, Social Democratic Party. We have Liesyl Franz from the State Department in the U.S. We have Andrew Sullivan from the Internet Engineering Task Force, which he represents here today. We have Silvia Viceconte from DG Connect, the international department of the European Commission. And we have Virgilio Almeida, who is the national secretary for information technology policies of Brazil.
So very welcome. And we're going to start with one question. When we talk about cyber security, do we need regulation or do we need self‑regulation? Yes or no?
>> APARNA SRIDHAR: Strictly yes or no answer?
>> MODERATOR: Strictly yes or no answer.
>> APARNA SRIDHAR: Well, I feel you haven't framed the question properly, because it's either self‑regulation or ‑‑ so I would say yes to self‑regulation, and regulation, depending on what it looks like, could be either helpful or harmful.
>> MODERATOR: Thank you. One to zero.
>> NINA JANSSEN: Okay. So Nina Janssen from the Ministry of Security and Justice in the Netherlands. We aren't so much talking about regulation or not. There are several forms of regulation if you look at it from different perspectives. You have self‑regulation, but you also have a lot of ‑‑ you have several notification procedures which already apply off‑line and do apply online as well. There is government legislation, we have criminal legislation, and we have international law, human rights, which is a form of regulation as well. So I would definitely say yes, regulation is necessary for cyber security.
>> MODERATOR: That was a long yes. It's one to one.
>> ASTRID OOSENBRUG: Well, I think it should be both. Self‑regulation and regulation. It's not a question of or. It's and.
>> MODERATOR: One and a half, one and a half.
>> LIESYL FRANZ: I think I have to go with the both with an emphasis on self‑regulation, except when absolutely, absolutely needed, and only when well deliberated to make that determination.
>> MODERATOR: I'm going to say self‑regulation to make it easy. Two and a half to one and a half.
>> ANDREW SULLIVAN: So if I have to pick I'm going to vote the same way. I think self‑regulation is the answer.
>> MODERATOR: Three and a half to one and a half.
>> SILVIA VICECONTE: I'm going to try to balance the score. Yes to regulation. I think cyber security is like any other area of public life. It's impossible to do entirely without regulation.
>> VIRGILIO ALMEIDA: Well, my answer is two yeses. I think that there is a need for government regulation, and there is also space and room for self‑regulation. Actually, we can frame this question in a different way: it's about government and multi‑stakeholder roles in cyber security. I think both have very important roles in cyber security, and later on I can give some examples of that.
>> MODERATOR: Thank you. I think self‑regulation wins by something like a nail.
But in the room, what do you think? Should self‑regulation prevail in cyber security, or regulation? Just show hands for self‑regulation.
>> AUDIENCE MEMBER: (Off microphone.)
>> MODERATOR: That's the third option, but purely for self‑regulation. Purely self‑regulation. That's not a lot. Purely regulation.
(Laughter.)
It seems to be even. Who is for both? That seems to win. That is just about the same in the panel as we noticed. So that's at least a start for this discussion, because I think that it shows that maybe we should be discussing both and it's not a black and white issue.
Let's start first with the Brazilian government. You've been making a lot of proposals since the WCIT, we understand, and how do these proposals reflect the discussion on self‑regulation or regulation?
>> VIRGILIO ALMEIDA: Let me first make some comments to give a context for my position. So we are talking about cyber security, which is clearly an asymmetric enterprise. So we are talking about an activity where defenders are reactive and attackers are proactive. We are talking about an environment where defense is expensive and attacks are cheap. We are talking about an environment where defense is based on past attacks, but every day you have a new type of attack. So we need something else.
So I think that government and multi‑stakeholders have a very important role in working towards a safer cyber space.
The role of government is essential, because we are talking about something that needs a lot of science and technology, and that demands investments. So governments have to invest in research and development to come up with new tools, new technologies that will prevent future attacks. So that's the first point. We need investments for that.
Second point, we need economic and regulatory incentives, so that companies can help society and government to create a safer cyber space. But the role of the multi‑stakeholder model is really important for this.
Let us compare this to the off‑line world. When you think about a neighborhood, if you want to increase the safety of a certain neighborhood, what do you do? You count on the police, but you also count on the population there. You depend on the people that live in that area. So it's a clear combination of community and government.
So the same thing happens in the cyber space. We need civil society, we need private sector, and we need government.
Let me give an example of that. Government has an important role in proposing and implementing policies for cyber security. But this policy making process can also benefit from different views, from civil society, in order to construct better legislation. So that's another point of importance for the multi‑stakeholder model.
The implementation of the legislation and policy also depends on the support of the public sector companies, and we can see that.
Let me give an example. In Brazil we have the Internet Steering Committee, which is a very nice example of how things work in this multi‑stakeholder model. The Internet Steering Committee did a study to understand spam in Brazil, and Brazil was one of the top five countries in producing spam two years ago. After this study done by the Brazilian steering committee, which is a multi‑stakeholder committee, the solution proposed was to do some special management of port 25 of the systems ‑‑ it's a technical question ‑‑ in order to reduce spam emission, and Brazil is now in 20th position in the ranking of countries that produce spam. So that was achieved by the participation of the civil society members in the steering committee and the companies that adopted the suggestions. It's not regulation, but it's a best practice, and that worked well. So these are examples of the importance of both bodies, government and multi‑stakeholder participants.
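For readers unfamiliar with the measure just described: port 25 management generally means access providers block outbound TCP port 25 on residential connections, so malware‑infected home machines cannot deliver spam directly to remote mail servers, and legitimate mail goes through the provider's own submission service instead. Below is a minimal illustrative sketch in Python, not anything used in Brazil or referenced in the session, that checks whether such a block appears to be in effect on the connection it runs from; the test host is a placeholder.

    import socket

    # Placeholder host; substitute a mail server you are allowed to test against.
    TEST_HOST = "smtp.example.com"

    def outbound_port_25_open(host: str = TEST_HOST, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to port 25 on the given host succeeds."""
        try:
            with socket.create_connection((host, 25), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        if outbound_port_25_open():
            print("Outbound port 25 is open: direct-to-MX delivery (and direct spamming) is possible.")
        else:
            print("Outbound port 25 appears blocked: port 25 management is in effect on this connection.")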
>> MODERATOR: Thank you. I think that's a very clear example of how people work together that can actually make a difference and made a difference in this example of pushing spam down.
For the other government representatives, what are your concerns about cyber security in general? So what is actually keeping you awake at night as a government representative, and not as a private person, I hope? Let's start with the European Commission.
>> SILVIA VICECONTE: Sure. First of all, I think what keeps us awake at night is very much what keeps citizens up at night: whether their online operations are safe, whether their financial data is safe, whether the systems operating energy transmission are working, whether the safety measures that make trains avoid collisions are working. All of this runs through the Internet these days. So I don't see a big difference between the Internet and any other means or tool or good in general that permeates public life. I think we need to ‑‑ first of all, stop looking at the Internet as something exceptional. It's not anymore. It really goes into every aspect of public life, and as such, there's a moment where governments or different regulatory authorities have to intervene, in the remit of their responsibility and only in the remit of their responsibility, and that's why I think we end up with something that's a mix of regulation, self‑regulation and different tools.
So indeed, what keeps us up at night? As I said, that the banking sector keeps running, that the stock exchange doesn't collapse, that electronic medical devices are working in hospitals. So anything that relies on the Internet, really.
As some of you may or may not know, the way Europe is conceived and made up, the European Commission covers what is the community remit, so part of public policy is initiated at the European Commission. Much of what has to do with cyber security is actually in the remit of our member states, so in that, by definition, we work in a multi‑stakeholder way.
Just one concrete example from my side. The European Commission has put out a proposal for legislation in the cyber security area to introduce a minimum requirement that when there are cyber security incidents in critical areas of the infrastructure, these be reported. At the moment the rules in Europe only require this if telecom companies have some kind of breach of security. With this legislation, we are going to extend it to other critical areas: indeed the banking sector, energy, transport, health, and some areas of the public administration. So as you see, just minimum standards, and then we can work on self‑regulation as well.
>> MODERATOR: And one question about the remit of government. For the EU Commission, what do you think is your remit?
>> SILVIA VICECONTE: Well, things that affect the internal market, for example, would be within the Commission's remit, but there are things that relate to, for example, national security. National security remains with member states at the moment. So this is not something we deal with directly.
>> MODERATOR: Let me pass the question to Nina or Liesyl. What is the remit of the national government when cyber security is concerned?
>> LIESYL FRANZ: Well, certainly care of the citizenry of the country is a role for government, but I think that can take many, many forms. I don't think pure regulation is the only way to do it, so I think there are alternative ways to realize that role: such as a convening function, providing incentives for action, being sure to partake in consultations, like Virgilio said, in a multi‑stakeholder way, getting input from people that are either going to be impacted by or help impact the outcome. I think it also means looking at how to make sure your legal framework has the appropriate laws for criminalizing activity, which allows for redress when something has gone wrong. And then I think one of the key roles for government is as a partner in public‑private partnership efforts, and all of those functions that a government undertakes move industry and move participants in this Internet life we have to take action. I even left out helping to build awareness, not only of citizens but also of enterprises and organizations of any kind. So I would put it as an active participant and a partner in moving a large constituency and effecting change.
One thing I'd like to elaborate on a little bit in my not so quick answer to the first question: I think we also need to look at regulation, or as I prefer in this case to talk about legislation, not just as mandates for prescriptive requirements, but as ways in which you can enable public‑private partnership or enable companies to take action. One way in which that has manifested in the U.S. is that there are several impediments to the ability of government and industry to share information with each other, so is there a way to enact legislation that allows companies and government to share information more easily, so we all have better situational awareness about what's happening?
I think what keeps government up at night, as Silvia said, is pretty much anything that can go wrong. Recently, you know, my White House leadership characterized the current environment as one in which cyber security threats are increasingly broad, sophisticated, and dangerous. They include persistent intrusions, theft of business information, and degradation or denial of service to legitimate industries trying to do their business or get their message out. So that's kind of a scary environment to be looking at. But I think one key element of any of those is something that we haven't been able to detect.
>> MODERATOR: Thank you Liesyl.
Nina, your ministry is even wider. It's security and justice, so that's a lot wider, so there must be a lot more that keeps you awake at night.
>> NINA JANSSEN: Well, as a policy maker, you're always weighing interests and gathering input, so I would agree with Liesyl that you're looking for the partners, the methods and the means that are available to address the different interests at the table, so to say. So it's not something that keeps me awake at night, 'cause then I wouldn't sleep, 'cause it's my work. But I guess the dilemmas that we are faced with from the governmental viewpoint are how you weigh the interests that should be taken into account and treated equally. What extent of security ‑‑ because I don't think that there is absolute security; it's neither possible nor desirable, I think ‑‑ so what extent of security should we want, if you set it against usability or the openness of the Internet, and what consequences would a certain measure have? What extent of freedom online is desirable if you set it against the threats or crimes that actually come with certain opportunities, and how do you then react against such threats or the crime itself? Should we only be able to react defensively or should we be able to act in return or not?
So the interests that you're regarding as a government are quite broad. You look at industry. You look at the small players in a field that maybe do not have such a strong voice, individual consumers, human rights defenders, individuals' privacy rights, commercial interests. They should all be taken into account when you make policy, and if you decide to regulate, they should be taken into account in that legislative procedure.
What we experienced in the Netherlands in 2011 was a cyber security incident that actually evolved into a crisis, where a relatively small certificate provider called DigiNotar was hacked. That crisis was quite a wake-up call for politicians, for government, and for industry players as well. And parliament realized that certain services should be considered critical infrastructure or a vital sector, so they demanded security breach notification.
We've recently tabled this security breach notification proposal to parliament, and I think this is one of the regulatory measures you can take if they are set up in the correct fashion. What I mean by the correct fashion is, for example, that we would like to get the information that we need at the table, so we don't want to frighten off anyone. This notification therefore concerns only the vital infrastructure and information that actually influences our national security. And we also chose, at least at this point in time, not to sanction, because these partners share a lot of information with us already on a voluntary basis and we want them to keep sharing information, because it helps us assess trends and see emerging threats or anything that may also influence national security.
Currently this proposal is in the consultation process, where every stakeholder, from an individual to an NGO or industry player, can provide their input at this moment. It's not finished yet, but it's one of the examples, I think, where we have balanced those interests and are looking for the way forward.
>> MODERATOR: Okay. Thank you. I would have loved to have passed this question on to ETNO as the representative of the European telecommunications network operators, but unfortunately they cancelled. I think Google could be seen as something like vital, critical infrastructure, because if Google goes down, we won't know anything anymore in this world.
(Laughter.)
Let's pretend, hypothetically, that a major hack takes place at Google. We can't use Google anymore to find our information, and you're obliged to report that to an institution, in this case in the Netherlands. How would you look at that sort of obligation, and is that something you could actually work with?
>> APARNA SRIDHAR: Well I'm not familiar with the specifics of the obligations, so I probably shouldn't comment directly on that. But what I can say is, you know, we take our ‑‑ you sort of asked the initial question, what keeps you up at night, and I think in our organization, the security of our users' data and the security of our systems is something that keeps us up at night, right? We recognize that it's an arms race and that people who want to get into our systems are continually improving, and they're continually getting smarter and faster and we need to improve at the same rate or ideally a faster rate.
So I think, you know, in the context of sharing information, we can do that, but we are very wary of sharing information that will give anyone enough knowledge to perpetrate an attack on our systems. So it sort of depends on what exactly the parameters of any sharing would look like.
I will say we have been fairly transparent about the fact that attacks do occur. So ‑‑ and I'll give you two examples. One is we were one of the first companies to step out and actively acknowledge that our systems had been targeted by attacks from China, not necessarily from the Chinese government, but from somewhere in that region in the territory.
And another example is, and a lot of companies do this now, we have a program where we hold contests where we actively encourage users to hack Chrome, which is our browser, and if you achieve a certain level of hacking through the contest, then there are certain cash prizes, and then that helps us understand better the vulnerabilities of our system and how to improve it.
>> MODERATOR: That's a good example of self‑regulation, I suppose, trying to get people into your own system, inviting them to come into your system.
Is there anybody from a vital infrastructure provider from Europe in the room who can respond to the proposals at this moment? No one? So who will provide the critical infrastructure?
>> AUDIENCE MEMBER: (Off microphone.) Yeah, I'd just like to say the Internet doesn't exist without us, so the panel is perhaps not representing civil society. I think there's maybe a bit of a bias toward what the vital structures on the Internet are. And I'm also quite concerned when governments are talking with corporations about data sharing and leaving us out of the equation, leaving out oversight. I believe in the northern hemisphere there are already quite sinister data sharing programs between governments and corporations, from what I've read recently in the papers. Thank you.
>> MODERATOR: And your name?
>> AUDIENCE MEMBER: My name is Alex Comninos. I'm with the Internet.
(Laughter.)
>> MODERATOR: That sounds good.
So there's no critical infrastructure representative in the room here? You, Bastiaan?
So Bastiaan is with AMS‑IX in Amsterdam. So what about if you guys go down and you have to report? What happens?
>> AUDIENCE MEMBER: Thanks for introducing me. Indeed, I'm from the Amsterdam Internet Exchange, one of the big Internet hubs in the world, actually. Officially we're not critical infrastructure ‑‑ vital infrastructure, as we say in Holland ‑‑ but obviously, at least from our perspective, we take our operations very, very seriously when it comes to resilience. We see to the continuity of services and, along with that, security as well. Now more than 600 networks are connected to our platform and they are very keen to see how we operate and that we report on that. So if we were designated in Holland as being critical infrastructure, which we do not think is necessary, but if that were the case, of course we would abide by the law and notify security breaches.
I have to say that, maybe from a more public‑private partnership perspective, although it's not formalized, we do have informal agreements with the Ministry of Economic Affairs, in this case the ministry mainly responsible in Holland for the telecom sector and also the Internet, that if anything happens, we know where to find them. They can call our CEO 24/7. So I think we have that part covered, and I don't think it would make a real difference, also, you know, looking at what is in place for vital infrastructure in Holland, if we were critical infrastructure and we had to notify. I don't think it would improve our operations or really add something. But at the same time, we are always open to discussions, also with policy makers, and we tend to see ourselves as completely transparent with regard to how we operate, also technically. The only thing that we will not share is customer info, customer data. We can always point people: look, this is the list on the website; these are all the customers. If you have a particular question for an ISP or a content provider, I can invite you to approach them personally, but that's about the only thing. For the rest we are completely transparent. Everything else is there. That's about it.
>> MODERATOR: Thank you.
So let's go to the political side of the equation.
Sorry? Somebody else? Okay. I see you. Please introduce yourself.
>> AUDIENCE MEMBER: I'm Nasser Kettani. I'm the CEO for Microsoft in the Middle East and Africa. And to some extent I would consider my company runs critical infrastructure as well.
I'd like to give two examples of public‑private partnerships and how this can work, two examples from our own experience. One is around how we actually work with CERTs, you know, computer emergency response teams around the world, on information sharing, because there is an opportunity for sharing information with CERTs around threat intelligence, and that can help cooperation in terms of alerts and responses and, you know, obviously cleaning computers, et cetera. So that's an area of information sharing. I completely agree with what you said, because we have the same concerns as Google.
The other thing that was an interesting experience for us is working with law enforcement agencies and so forth in the area of botnets. You know, I don't want to go into the details of how botnets work, but basically we have been able to attack botnets and the technology behind them in order to take the servers down, but that has required collaboration with law enforcement agencies. So we can take orders immediately from courts and go and, you know, take down those botnets and address some of the major malware issues that we're running into in the industry.
So there are areas of cooperation and cooperation sharing where we, you know, we take action to make the Internet safer.
>> MODERATOR: Thank you. And we have Microsoft in our second panel this afternoon, where we will go into details and examples like this. Thank you.
Let's go to the political side of things. Astrid, what keeps you awake at night when you think about cyber security from a politician's point of view?
>> ASTRID OOSENBRUG: Well, keep me awake? No. I think the most important thing is that people should be aware ‑‑ aware of what happens on the Internet, what happens when you don't protect yourself. And the funny thing is that in real life, we all know how to act when we buy a car. We know we have to get a driver's license. We know everything. But when we go on the digital highway, we just go there and have fun and are not aware that there is also a dark side to the Internet. So the first thing ‑‑ I have to agree, awareness. That keeps me really, really awake, because in the Netherlands, 98 percent of the whole country can be on the Internet. So you also have a responsibility for your people, but I think people also have a responsibility for themselves. So that sometimes keeps me awake.
Somebody has a question?
>> AUDIENCE MEMBER: Good morning. My name is Yuliya Morenets. I'm with the not‑for‑profit TAC, Together against Cybercrime International, and we work on cybercrime and cyber security and specifically on awareness raising campaigns. My question would be: but how? How can we have an effective awareness raising campaign? Actually, I think the point is that even if citizens or users are aware, they don't know how to be assisted. How can we assist victims that don't report? So my question would be: can this be part of a cyber security strategy? Do we need a cyber security strategy at the national level, and how do we do this? Thank you.
>> ASTRID OOSENBRUG: Yes. Thank you for the question.
My feeling would be: start at the bottom. Start at school, with the first graders; start teaching about the Internet. Teach children well. That's the message I always have: teach your children well, 'cause there's a lot of bad stuff going on on the Internet. We tell our kids not to go out and talk to people they don't know, but on the Internet, they just chat with everybody; if they don't know someone, they just go there and talk to everybody anyway.
But there are other things. When the Internet started, way, way, way back, it was about sharing information. These days, the Internet is about selling stuff. And there's also a trap in that, because when there's money you can earn, people will go there ‑‑ also bad people. So you have to be aware. But companies also have to be aware, the government has to be aware, and people have to be aware. So that's a good thing.
What was the question again? Sorry.
'Cause it's about awareness.
>> AUDIENCE MEMBER: Yeah, mainly the second part of the question was: do we need to establish or develop a cyber security strategy at the national level, so that awareness raising and what you just mentioned, the education element, can be part of it? Because if we don't have a strategy, at least from our point of view, how can we prevent this? How can we organize this? That's the question.
>> ASTRID OOSENBRUG: Well, the Internet doesn't keep itself to borders. It's an international, global thing, so we have to work together, and you see that with the CERTs, the computer emergency response teams; in Europe they're now talking together. You see it works, 'cause then you have short lines and you can start with the cyber security, 'cause you talk to each other and then there's a big advantage. And I think it has to be global, not Europe, not America or something. Global. We have to work together. So it's not a small issue. It's a big issue. And the funny thing is we don't look at it that way. I feel with the whole NSA thing, people were all shocked and then they went on like nothing ever happened, so we have to make sure everybody understands what happens on the Internet.
We also talked about attacks, so people who hack ‑‑ we also have people who do ethical hacking. I'm not sure if everybody understands that.
In Holland the government introduced responsible disclosure guidelines to protect people who hack in ethical ways. So you help each other. And maybe that's the way we should look at each other: not always at the dangerous things but also at the good things, and we should make each other stronger instead of weakening each other. So there's a lesson to learn for everybody, I guess. We're together and ‑‑
>> MODERATOR: It's an important lesson, I think. We have three national ‑‑ two national and one European strategy in the making.
I think Silvia was first.
>> SILVIA VICECONTE: Yeah, just to answer your question directly. In Europe the answer has been yes, we thought we needed to have a cyber security strategy, and so we devised a cyber security strategy. It's a strategy that has different parts to it, because we don't believe in a purely regulatory approach, but the obligation I was mentioning earlier for the key infrastructures to report their cyber incidents would be the regulatory part of it.
We have set up a public‑private partnership with the other actors, and of course there are, let's say, softer things, like awareness raising, and of course there's self‑regulation from the industry, which will have its own incentives in making sure that the data is protected. But I think for Europe the answer is very clear. We thought we needed it. We got one.
>> MODERATOR: I have one question in the audience. Please introduce yourself.
>> AUDIENCE MEMBER: Thank you. Zahid Jamil from the developing country center on cybercrime.
I like the hacking for cash business, by the way. It's kind of interesting.
A clarification on the point about the north. By the way, you would be surprised at the data between government and citizens that's actually for sale in some countries in the south, so I want to flatten that debate a bit. My question really was about what you said about life, et cetera, that goes on the Internet. It seems a little interesting to me that whenever we were discussing where the problem is and what your concerns were, they were banking, stock exchange, health and these areas, which is sectoral. But the solution and response was general, overall regulation of life, and I think that maybe we need to balance out what we're trying to do. It makes sense to have sectoral regulation related to cyber security. So say, for instance, the banking sector has to respond or inform when there are breach notifications, with the stock exchanges, et cetera; that makes perfect sense. But it really makes me nervous when I hear that this is life now because it's on the Internet, and all of it needs to be regulated, and I hope that's not the way we're going to go. If that were the case, if there were one global or large meta cyber security strategy, policy, regulation, whatever it was, guess what? That's exactly what Aparna was saying: they don't want people out there knowing exactly how they do things, because the moment everybody knows what your cyber security strategy is or how you function, you become more vulnerable. One of the things I would sort of suggest is being more sectoral; rather than broadly general cyber security, specific might be a better way to go. Thank you.
>> MODERATOR: Liesyl?
>> LIESYL FRANZ: Sure. First I'd like to answer the question about national strategy in the context of awareness. The U.S. has had essentially two national strategies that attempted a categorization, a compilation of the issues that are encompassed by cyber security, which are fairly broad and diverse, and then, in our second time around, put a framework around that which helped guide an approach to cyber security, and an awareness campaign was certainly a part of that.
When a nation puts forward a strategy of any kind, it heightens the level of awareness nationally about that. When we put out the strategy, it was done with a lot of consultation with multi‑stakeholder groups, civil society and industry and technical community, and among the government entities as well, and so it reflects that input.
The other thing I would say is that it's not exclusive. You know, saying in a national strategy that an awareness campaign is important does help, because it focuses the mind as to what's important to that country and what efforts might be undertaken to help implement that strategy, but that's not the only thing that has to happen. It can be very grassroots and very societal in a way, so they're not mutually exclusive, but both are very important.
And I absolutely agree with the comment that it's global. That also isn't mutually exclusive. Just because you have a national strategy that gives a framework to your national approach ‑‑ in our case an international component is included in the strategy ‑‑ that's very, very important, and coordination and collaboration and partnerships and reaching out and trying to put it in a global context is very important. Thank you.
>> MODERATOR: We're going to take one last question on this topic and then move on to industry, because they should get a say as well.
Please introduce yourself.
>> AUDIENCE MEMBER: Thank you very much. My name is Alun Cairns. I'm a member of the U.K. parliament.
I have no doubt that governments of all nations will understand the issues of cyber security and the risks around it, but if I look at my colleagues, and I even include myself in this, the understanding of parliamentarians of the risks is pretty limited, and therefore the scrutiny over government cyber security policy is limited, because they're not questioning, they're not probing, and they're not prioritizing it as an issue. So in austere periods when there are budget cuts, unless the parliamentarian is driving it as a priority, it doesn't tend to get the slice of money that it may well deserve, and I think the original quote was it's cheap to attack but it's expensive to defend against. And therefore what can we do to raise the level of debate and understanding amongst legislators who don't understand the technical risks?
>> MODERATOR: That's a challenge for industry, I think. So who would like to respond?
>> ANDREW SULLIVAN: I think that's a really interesting question, because you're quite right that a lot of the policy is being made in a technical vacuum, and in fact some of this discussion has struck me as sort of funny, because we keep talking about cyber security and I'm trying to figure out what you are being secure against. We've got bad guys; we've got other random people on the Internet or on your local network; we've got foreign governments; we've got just random other users; we've got people who pick up the little key that you happen to drop on the sidewalk. What about your own government? What about your vendor? Your vendor can attack you, because they've got access to all your data. Your own employees ‑‑ you need to be secure against them too, because they can walk out the door. So there's an enormous breadth of problems here, and actually there are different kinds of problems. So when you lump them together as cyber security, it's really easy to focus on one of those and not pay attention to the distinctions.
One of the things we did as protocol designers ‑‑ and I shouldn't include myself, this was before my time ‑‑ but one of the things that happened with the Internet design is that it's essentially a lab experiment that escaped and was successful, right? So it wasn't really designed to be secure in the first place, and what we've been doing ever since then is kind of layering in extra security in various places.
So there is a thing that governments can do that is not regulation but that would be super helpful, and I strongly encourage you to think about doing this: you can use your purchasing power to insist that vendors actually make the system secure from the ground up. Instead of having these systems that are designed and then you kind of add some security later ‑‑ which is this bolt‑on component, and of course if you bolted it on, then somebody can come along and unbolt it ‑‑ you can insist that the systems be secure from the ground up, and that means secure from you too. What happens a lot of the time is that people want the system to be secure from other people, but they want their own access and they want to make sure that they've got their own back door. If there's a back door there, then somebody can walk through it, and that somebody may not be you, and that's actually a big part of the problem we have now. People are designing systems in order to respond to so‑called security initiatives that are not security; they're in fact access initiatives. And I think that we need to pay a great deal of attention to the technical details ‑‑ to finally come back to this question ‑‑ to the technical details of what people are asking for, you know, what is it they're really asking for when they say I want this system designed this way or that way.
So the number one thing that I would say, anyway, as a protocol person is, first of all, I'm sorry, we screwed up, and we didn't design it so that there was perfect security all the way along. We're doing our part. For instance, at the IETF meeting in Vancouver next week, or the week after, whatever it is, we're devoting a significant chunk of time to that, with specific proposals to improve security in various ways. But this requires users, and one of the big users and big consumers of this stuff is governments. You can do your part just by buying it.
>> MODERATOR: That's a piece of advice and a challenge, I think.
I've got somebody who wants to respond to that?
>> AUDIENCE MEMBER: Thank you. John Laprise, Northwestern University.
I'd just like to comment on the last speaker's contribution. I'm a historian of technology, and the U.S. government has been doing this for a long time, embedding contractual terms in its contracts to compel its vendors to embed certain technologies within their services. Given its market power, at least within the domestic market in the U.S., that has major impacts on policy, not just within the government sector but more broadly across the domestic sector, so this can have a real impact across a nation and more widely. Thank you.
>> MODERATOR: As a question to the audience, people from governments: who actually knows that they've negotiated security when they bought ICT, or do you just buy it off the shelf? Anybody who knows that there's been any negotiation on security? Hands, please?
>> AUDIENCE MEMBER: (Off microphone.)
>> MODERATOR: By governments first, because that was the challenge.
From industry, does anybody negotiate security by design when you buy ICT? So for industry ‑‑
>> AUDIENCE MEMBER: My name is Roelof Meijer. I'm from the Netherlands, and I know ‑‑ but maybe my compatriots from the government are not aware of it, or did not mention it ‑‑ that we have, I think like the gentleman over there suggested, a list in the Netherlands that we call (speaking non‑English language), in which the government puts certain conditions such that if it puts out a bid for tender, the supplier will have to follow those criteria, and we managed to get on that list. So I think it's a good example of a security specification that the government uses in its tenders.
>> MODERATOR: Thank you for the example.
We already mentioned the IETF just now, Andrew. Perhaps not everybody knows exactly what the IETF does, what it stands for, but it's something like self‑regulation, right?
>> ANDREW SULLIVAN: So the IETF stands for the Internet Engineering Task Force, and I need to be very careful here. I don't speak for the IETF, and if you ever went to an IETF meeting, you would know why, because herding cats barely describes what we do. It's much more chaotic than that.
But one of the things we do is we are a group of people who come together, and anybody can join. You can show up on the mailing list and that's really all it takes, and contribute to these standards. All of the stuff that you use on the Internet, all of the protocols that make things go, all of the things that allow us to send these bits around ‑‑ they're all IETF, with a couple of exceptions. They're all IETF standards, and these standards are developed in public, in the open, in a rough consensus model, so that anybody can read the documents, anybody can comment, and what we do is work towards the best technical solution as far as we understand it. That isn't to say we always get it right. For instance, we didn't think that pervasive monitoring was going to be a problem, so we didn't include that in our threat model. That was a mistake. We didn't do opportunistic encryption because we thought that was too dangerous; it turns out maybe it was a good idea. So there are cases where we get it wrong.
Nevertheless, the overall structure is a way in which different people come together and everybody is working towards this common goal of making the systems work in the technically best way that we know how to do. It is a completely open process, and so it's a little chaotic and if you want to come and join the meeting, of course I would encourage you to do so, but be prepared because it's not formal like this. So it can be a little alarming, but we're actually friendly. It's just that we're really vicious about it.
(Laughter.)
>> MODERATOR: As a response to that: we had the same session, you should know, in the Netherlands three weeks ago, and there was somebody from the Dutch government in the audience, and I knew he had been to his first IETF in Berlin, and I walked up to him and asked him: what did you think, did you feel welcome, did you understand what was going on? And he said, well, I was sort of left on my own, and nobody approached me at first, so I went out talking to people. But it is an example that somebody from government goes to the IETF and then doesn't know where to start. Is that something that could be improved over time?
>> ANDREW SULLIVAN: It is something that could be improved. One thing that I would say is there are sessions outside of the official sessions. We're not very good at this kind of stuff because we're geeks ‑‑ you know the story, right, that the extrovert geek is the one that looks at your shoes. So there's a certain amount of this that is just a cultural problem. But I will say that there are sessions, for instance, on Sunday, that endeavor to welcome people in. It's like any other kind of group that works a lot together, right? It's difficult to get in. It's hard for me, for instance, to get into this community a little bit, because I spend most of my time in geek land. So it's the same sort of problem, but some of us are welcoming, and if you have any other things that you want to ask about, or if you're coming to Vancouver or another future meeting, you know, obviously I will be there and you should feel free to tap me on the shoulder and say, hey, I heard you before and you don't seem that scary. Maybe I do, but I intend not to be.
>> MODERATOR: Have you ever met the people at the table, not necessarily personally, but their colleagues?
>> ANDREW SULLIVAN: Not before we walked in the room.
>> MODERATOR: So no Brazilian government, European commission, or U.S. government?
>> ANDREW SULLIVAN: I know people from those bodies in various places, but typically they are people from the back office, right? It is a very technical environment. We're not talking about governance. What we're trying to do is build the pieces that allow this stuff to happen. We don't do policy. The goal is to do the protocols, but of course there are choices. There are policy things that are either enabled or not enabled by the protocols that you build, and sometimes we don't think about these angles, and sometimes we do and we think it doesn't really matter, and some of the time we're enabling these various behaviors, and one could argue, well, that's not a completely neutral thing to be doing. But I think the IETF tries to take the stance that we're not in the business of telling you how to run your network. That does mean, of course, that the network has capabilities on it that many of us are personally not comfortable with, but that is part of our job. Our professional responsibility is to make the system technically strong and allow the range of policy choices that we think people might want to have. That doesn't mean that I think all of those policy choices are good ones, but that's not my job when I'm doing my protocol work.
>> MODERATOR: I'm putting you on the spot here. Have you ever heard of the IETF, yes?
>> VIRGILIO ALMEIDA: Yes. I'm a professor, so I'm a computer science professor. I have heard a lot about IETF.
I would like to make a final comment about what we have been discussing here, regarding your question about cyber security strategy. One comment that I have is that most of the strategies address past threats. They do not look into future problems.
Let me give a clear example. When we talk about the Internet of things ‑‑ nobody has mentioned that here ‑‑ it will mean a big increase in the risk for the population, because we are not talking about hacking information; we are talking about hacking real actions, on your vehicle, in your home. So I think a message that I would like to leave here, as a government, is that we should look more into the future for those threats.
>> MODERATOR: So if I can sum it up, it may be that some topics which are generating concerns should reach people from the IETF, and maybe not just people from industry but also from government. So should there be something like a GAC to the IETF? I'm saying something very sensitive here, I know, but something where you meet and discuss these sorts of topics.
>> ANDREW SULLIVAN: It isn't a GAC that you need, right? You need to show up and participate. A good example is the U.S. NIST, the National Institute of Standards and Technology ‑‑ did I explain that right? So I have colleagues there, for instance, who participate in these things, and they participate precisely because they want the standards to be good. So that's the way to do it. And one thing of course that governments can do under those circumstances is say, hey, this is enough of a priority that we will have people who will work on those things, and will actually contribute to those things and will review the documents. I've got to tell you, it is scut work building this stuff. You've got to read endless versions of the draft and then negotiate about, you know, well, no, I think it has this attack or I think it has this problem, and so on. It's a lot of work, and that requires people who are going to do that work, and it's a hard job to find those reviewers.
>> MODERATOR: I think that's an open invitation, right? So I think that's a major outcome of this session.
>> APARNA SRIDHAR: Yeah, I just wanted to add, I know the Internet Society ‑‑ and maybe Karen Mulberry can talk a little bit more about this ‑‑ has a program that essentially helps countries' regulatory officials attend the IETF and sort of provides an introduction to what can seem like a complicated process. I don't know if Karen wants to say more about that, but I think it's an important resource and one that could be more utilized.
>> MODERATOR: Would you like to respond, Karen?
>> KAREN MULBERRY: Hi. I'm Karen Mulberry and I'm with the Internet Society, and indeed we sponsor a fellowship program for policy makers, and we've got a lot of support from a variety of organizations to do this, so that government and industry policy makers can attend and participate in the IETF process. This is truly the multi‑stakeholder dialogue, where individuals show up and contribute and try to move things forward in terms of the development of initiatives on the Internet. So we bring policy makers in. We try to shepherd them through the process. We don't totally hold their hands a hundred percent of the time, because they need to experience this individually as well. But we try to provide some context and framework for how the IETF operates and what it focuses on at various sessions, so they can be exposed to this, because it's important for them to understand how the collaborative effort occurs in this open dialogue.
I do note too that the IETF has recently formed a policy discussion list, because they're very interested ‑‑ you know, the geeks and the techie people ‑‑ in trying to understand the policy perspectives on things, because, you know, as engineers, you're driven to collaborate and develop solutions to things, while the context of policy and other things may not be part of the dialogue when you look at the perspective of a collaborative solution. So they're trying to bring in a little bit more understanding from the policy side, things that they ought to be aware of as they work towards providing those collaborative solutions.
And by the way, if there are any governments in here that are interested in the fellowship program, please see me and I'll see what I can do to arrange to have you participate in an IETF meeting.
>> MODERATOR: Another invitation. Very good.
I think we have some questions from the room. We have a remote one? We'll do the remote one first, okay?
>> REMOTE MODERATOR: Thank you. We have a question from Venezuela, from Jorge Gonzalez at the public university, who asks: what can be done to prevent governments from committing excesses with the excuse of cyber security?
>> MODERATOR: Who wants to take that one? Yes? Virgilio, please.
>> VIRGILIO ALMEIDA: I think that if we take into consideration the multi‑stakeholder models, they can help to give a balance to government proposals.
>> MODERATOR: Nina.
>> NINA JANSSEN: If I may just add to that, I think it's about checks and balances in the framework that's in place. So whenever you're considering any security proposal, take it on in a multi‑stakeholder fashion, do have discussions with the several interested parties at the table, and do it in a transparent way. I think that should be at least a guiding principle when you're discussing security measures and the use of data.
>> MODERATOR: Thank you.
Please introduce yourself.
>> AUDIENCE MEMBER: Hello. My name is Athina Fragkouli from RIPE NCC. So we understand that cyber security is nothing new; it's a problem that the technical community has identified for some years now, so it has been discussed in open fora and through self‑regulation initiatives, and we heard about the IETF discussions. There have been some proposed solutions, such as intersect and so on. And we also hear from the Commission that they have come up with a strategy, which is very much appreciated. They also introduced this platform for a discussion between the public and private sector. And I was wondering whether the Commission has also thought of the possibility of cooperating with these already existing fora for standards, whether they will take the standards into account, whether they will refer to the standards of these self‑regulating bodies in the strategy. Thank you.
>> SILVIA VICECONTE: Thanks. Thanks for your question. On standards, I think the answer is yes. I mean, I don't deal with it directly ‑‑ it's my colleagues who deal with the platform. One thing I can say is that it's not part of our strategy to mandate any specific standards. So the Commission would not, you know, kind of get up there and start mandating standards. We absolutely recognize our limits as policy makers and not technicians. So I think the dialogue is open, and on standards also, we would not be imposing any type of standards.
I don't think that fully answers your question but I'm happy to put you in touch with the right people.
>> ANDREW SULLIVAN: One problem that the IETF standards have is that there are a number of governments who refuse to refer to them as standards, who accept only standards that are promulgated by particular agencies that they recognize and these are typically treaty organizations. That's a problem because those are typically not the places where the standards are actually developed.
One of the really cool things about the Internet, and what has made it so successful, is the permissionless innovation that is part of it. The innovation happens at the edges, and that's why we have all of the good things that we do on the Internet, because it's easy for somebody to pick it up. You don't have to upgrade the entire network for something to happen. But that kind of environment really only lends itself to the kinds of standards that we're able to develop in the IETF, because it's the people who are participating who are interested in it, right? The kind of heavyweight, centralized standards development that is effective for other kinds of telecommunication devices frequently is not interested in the kinds of standards that we're developing in the IETF. We collaborate with those bodies, and we're happy to, but they're different kinds of spheres and interests. So it would be very helpful for governments and for regulators to say, oh, yeah, we're going to embrace the RFC series as well, as one of our significant sources of standards, and to promulgate those and include them in purchasing decisions and so on.
>> MODERATOR: That's one of the questions we have on the list. So let's go there first, and I'll come back to you in a moment.
A lot of the standards apparently are not adopted or not implemented by companies that maybe should be doing so, or at least it's voluntary; but when we're talking about cyber security, it may be in the interest of a national government to have them implemented somehow. So when governments know that those standards are there and that they're important to create some sort of level playing field, is there a way to actually assist with having them adopted within a certain period of time? Or would that not be the right way forward? From a government point of view and from an industry point of view, please. Aparna.
>> APARNA SRIDHAR: So I think we want to be really careful about mandating standards in a legislative or regulatory process and I'll just give you two examples.
In 1996, the U.S. Congress passed the most recent iteration of a revision of the telecoms act. The prior full writing of the act had been in 1934, and most accounts will say that it took roughly ten years to get the '96 act off the ground, so people started talking about it roughly in 1984 or '85, around the time that the cable act was adopted. So that's about ten years just for the revision of an act, not the creation of a new act out of whole cloth.
Similarly, the rule making process in the States, which is somewhat faster than the legislative process, is not what you would call truly fast. Rule makings last a number of years; they get tied up in litigation. It could be four or five or six years before a rule is finally settled. What we see in the standards setting context is a world that evolves much more quickly, and more importantly, what we see in the context of cyber security attacks is attackers that are much, much, much more nimble. So if we get into a situation where we're waiting ten years to adopt or implement or require standards, the standards we require will be hopelessly outdated and will only hamper companies in being able to respond effectively to current attacks.
>> LIESYL FRANZ: Since we're talking about this in this context, I'd like to take the opportunity to refer to President Obama's executive order that he issued in February of this year. It does a couple of things, but one thing it does in this context is to direct our National Institute of Standards and Technology, which Andrew referred to earlier, to lead the development of a framework ‑‑ and that goes back to the convening role I mentioned earlier for government ‑‑ to reduce cyber risks to critical infrastructure. So, working with industry, this will work to identify existing voluntary standards, such as those we're talking about and possibly others, and industry best practices, incorporate them into the framework, and then utilize all manner of communication, incentives and convening functions to help encourage adoption of those standards and practices by all in the community.
The idea is not for the framework to dictate a one‑size‑fits‑all technological solution, precisely for the reasons Aparna raised, but instead to promote a collaborative approach to encouraging innovation, recognizing that there are different needs among all of the actors. One person's critical infrastructure entity is not another person's critical infrastructure entity. So we really need to recognize that there are those differences, and differences in needs and challenges.
So I commend to you the information about this cyber security framework. It has been a very robust multi‑stakeholder approach, soliciting input and participation by industry, but also by other players from around the world. So if there's an interest in providing input into that, we can certainly make that available too. Thank you.
>> MODERATOR: And when all else fails, you've tried everything and everyone agrees it's important, and it's just not happening, for whatever reason, what happens then?
>> ANDREW SULLIVAN: I really want to challenge this idea that everybody agrees it's important and nobody's using it. If people aren't using it, they don't think it's that important. But if you think that something is really important and you think it's not getting implemented, and you're a government, and you're buying a trillion dollars' worth of IT this year, you've got a tremendous opportunity to get that thing implemented. I'll tell you what, if somebody comes to me with this year's purchase order and says, I want to spend $3 million on this technology, will you make it happen, I've suddenly got a much more powerful argument to make to my bosses in product development, right? I've got this sale here right now. Let's build it. So that's a really, really important incentive, and it's one of the biggest levers that governments have, because every government is an enormous consumer of these things. And that's really how the Internet itself got built, right? There was an interest on the part of the Department of Defense and the Advanced Research Projects Agency and so on to have this thing and to build the experiments, so that's how it got funded, and that's the reason we've got this good thing here today. So it's always been a partnership, with governments always being involved, and it is crazy to talk about this as being either government or no government. But it's important to recognize that the way the government acts here is primarily by being an actor rather than by being a regulator. I think it's much more effective in that case.
>> MODERATOR: So leading by example, which is also the point I raised. Silvia?
>> SILVIA VICECONTE: Yeah, I would just like to agree with you on that, and there are two things I want to say. First, indeed, government can set an example when it procures its own IT systems. And actually, two years ago the European Commission put out legislation that allows public procurement to reference standards from outside the official standardization bodies. So we've been recognizing this for some time, and we've put it into practice. And we do come to the IETF meetings.
(Laughter.)
>> ANDREW SULLIVAN: Thank you.
>> VIRGILIO ALMEIDA: In our case in Brazil, government procurement is important for demanding certain types of standards. It has been used for that.
>> MODERATOR: Thank you.
>> AUDIENCE MEMBER: Thank you. My name is Norani Puna. I work for a technical Internet infrastructure organization based in Sweden. We run exchange points and we also manage one of the 13 DNS root servers, and that means we get pulled into both the work of the IETF and the work here. It means that I go to IETF meetings, I go to ICANN meetings, I go to regional Internet registry meetings, and I never feel comfortable at any of them. I don't know what that says about my relationship to shoes.
(Laughter.)
What I want to say, though, is, well, first of all, that there are people from the IETF here ‑‑ Andrew is not the only one ‑‑ and like he says, there's no king of the IETF who comes here to speak to you, and that hasn't been the idea from the IETF previously either. I really do think it's good to have an increased exchange of information between the two. It doesn't mean that governments should go and sit in the Internet standards working groups and discuss whether this flag should be on or off, but there needs to be increased communication. And while I think the IETF is doing things to try to welcome policy makers into its sphere, I also think the IGF should do more to welcome the geeks into this community, because we do need those voices.
And while it's entertaining to talk about cyber security ‑‑ good or bad, for or against, self‑regulation or no regulation or only regulation ‑‑ I think it's sometimes not very useful to talk about security as if it's just one thing. You're lumping a whole lot of different issues into one bucket. And it also kind of disregards the ecosystem we're in, with all the different stakeholders. So I think it would be more useful ‑‑ it would be interesting ‑‑ to hear specific examples of where different stakeholders have cooperated to solve specific issues within this cyber security bucket. Thank you.
>> MODERATOR: Nina, would you like to address that?
>> NINA JANSSEN: Yes. Thanks. After our first national cyber security strategy in the Netherlands about two years ago, we built a lot of structures and institutions to make it open for organizations, for multiple stakeholders, to participate. So if you need a specific example: we now have the national cyber security center, which has developed from a government CERT into a national CERT. We have a flexible pool of liaisons from the private sector, from academia, from other national organizations, public and private, and they are connected to the information sharing and analysis centers of the different critical sectors, the ISACs. The national cyber security center holds the secretariat, so they are providing this forum, this platform, for a sector to consult together on security issues and to share best practices. Some sectors have developed their ISAC much further already than others. We're still forming about two new ones, on transport, and I believe there's one on the ports, I'm not sure. So this is one of the concrete examples from our national cyber security center.
Then there are also privately driven initiatives. There is the abuse hub, or abuse information exchange. It's an initiative that was subsidized in the beginning by the ministry of economic affairs ‑‑ and there are some people in the room who may actually have much more information on that ‑‑ but it's something where we as a government are trying to build our relationship with the private sector on specific botnet information, initiatives like that. Yeah.
>> MODERATOR: We have another remote participant. After that we're going to go to the final closing because we're running out of time. If you don't mind, we're going to go a few minutes over time, but after that everybody can go to lunch. So one more remote participant.
>> REMOTE MODERATOR: So we just got in a comment from Holland, and it says: I disagree with the Dutch panelist that, following the NSA revelations, life went on as usual. I think this is possibly naive and dismissive. There is a healthy paranoia that the revelations have helped to develop. I believe this will add to greater self‑governance as well as be a component in organizations' security planning, consciously, subconsciously or otherwise.
And one more comment from Belgium that just came in, which says: I want to know why there is no discussion of vulnerability regulation. Surely software security is the biggest problem. Regulation of vulnerability disclosure is far more important than regulation of breaches. What about zero‑day regulation and forever‑day regulation? Zero days and forever days are the biggest threats to all security. If the vulnerability did not exist, the attacks could not happen. The panelists seem to be sidestepping the issue.
>> MODERATOR: A question for ‑‑ I think it was for ‑‑ I'm losing my mind here. For you, Astrid, and a question for you, Silvia, I think. So you first, on the NSA, and then the vulnerability question.
>> ASTRID OOSENBRUG: What was the question?
>> MODERATOR: The question was that it's a bit naive to say the NSA issue just went away, and that there are in fact reactions going on.
>> ASTRID OOSENBRUG: Well, that's not what I said. I said there was this whole NSA issue and then people just went on with their lives. That's what I said. And what people have to take from what happened with the NSA is to be aware of what's going on on the Internet. Everything you say on the Internet can be read by other people. If your PC is not secured, then it's not only the NSA; it's the whole world that can read along with you on your PC. Your PC can be used in a botnet. If you don't understand how your PC is working ‑‑ that was more my message. If you don't understand how things work, but you're also part of it, then you should make sure you understand how it works. So that was more my message.
And I also wanted to react to the man from England, because it's true. In government, the politicians have to make sure they make agreements, but they don't always understand how it works. So there's a lesson to learn. And I think in Holland we have a lot of organizations who talk to politicians. They try to explain. We have roundtable sessions. So that's a good thing too. We have to learn from each other. So maybe in other governments, they should try to talk to people like Bits of Freedom, who can explain things to you. So maybe that's also a good message.
>> MODERATOR: So at home you have a security system and you lock the door, and if you drive a car you know how to do that and you take some responsibility for your environment, but when we go on the Internet, we perhaps do not all do that yet.
>> ASTRID OOSENBRUG: Yeah, that was the message I was trying to send. Yes. That's what we should be aware of. That's the main message from me.
>> MODERATOR: So that's also an outcome: it's not only up to governments or industry to do something. In the end it's also the end user.
Silvia, on the comment about why the software vendor is not included in any regulation that is envisioned.
>> SILVIA VICECONTE: I just wanted to clarify that as far as the Commission's proposal, this directive, this piece of legislation, is concerned, we are actually focusing on reporting. We are not focusing so much on seeing what is wrong in the system, but on reporting.
Now, one of the areas included in our directive is the key Internet companies, including large cloud providers, social networks, e‑commerce platforms and search engines. So I think that might actually ‑‑ I don't know if that was the question being posed from Belgium, but we are covering those areas. But again, this is about reporting. I mean, no government is in a position to go and see what's wrong in the system. That's really not our job.
>> MODERATOR: Okay. Thank you.
Okay. One last one. Sorry. We're running out of time. I know we have a lot of questions and comments, but I'm going to take the last question myself and then we go into the recommendations. Otherwise you miss lunch. Unless you say we go on for half an hour and skip half an hour of lunch, but the choice is yours, or you may have other meetings.
So we have to stop. So I have one more question and then we go to recommendations.
It came from the WCIT ‑‑ I don't know what it stands for, but most people will know. What I understand is that, to paraphrase the question raised there, all these things on the Internet usually go on in the West, all these western companies that have enough money go there, and other people feel left behind. And I have a question: how can we make sure that the working groups, or the IETF, or other self‑regulatory bodies also include people who maybe have questions that are not heard properly at the moment? Are there suggestions for how to reach out better, perhaps into the developing world?
>> LIESYL FRANZ: I wasn't sure where you were going with that question, but I'm happy to wrap up with it, because what I'd really like to say is that I think there are a lot of efforts that are ongoing, and that can be enhanced, in the area of capacity building for all of these topics. And I wouldn't necessarily say that developing countries don't need to engage with the IETF, because I think they should, but there are also other ways to help build capacity in these areas: information exchange, training opportunities. So there are lots of opportunities, lots of options for that kind of programming. The U.S. is certainly making a bigger effort in this area. We've had several cyber security and cybercrime training workshops in the developing world, and we're working with many others to find ways to help do that. We're working in our region to bring capacity building into regional efforts, such as the OAS or CITEL. So I encourage those who feel they might need to participate in some of these other structures, or need to learn more, to reach out and ask for it. Thank you.
>> SILVIA VICECONTE: Yeah. I just wanted to tell the audience about something you may be aware of already: the Commission, together with other partners, most notably Brazil, is working on launching a Global Internet Policy Observatory. This is a tool whose feasibility we are just starting to look at, but it is a tool that would provide online all kinds of information and a platform for exchange on a variety of ongoing processes at the Internet governance level, and it's exactly meant to reach out to developing countries, or to countries that in any case wouldn't have all the resources to be able to participate in the process fully. We call it GIPO for now. It's something that is not at all western. I think we have some African partners in there. We have civil society in there. So it's just a tool, but something that could help.
>> MODERATOR: Andrew?
>> ANDREW SULLIVAN: On the standards development side, first of all, I want to concede something. The IETF is overwhelmingly populated by fat middle‑aged men with middle‑class engineering jobs from North America and Europe, mostly. There's a significant population from Japan. There's a very significant population from China. It's also true that we perform our work primarily in English, which is a significant disadvantage for people for whom English is a second language.
The English problem ‑‑ I'm going to separate these two ‑‑ is really a common language problem: it turns out you just need one, right? You can't do it in every language, because you're building a standard, and so you need to have exactly one language, because you need to have exactly one thing that it means, because these things have to interoperate. We picked English, I think, just by historical accident. That doesn't mean it will be English forever, but it is English right now.
However, despite the fat middle‑aged men problem, it's also true that we're working rather hard to do something about it. So we do encourage, you know, people who are not from industry. We try pretty hard. It's difficult because, you know, we have these meetings in one place and you've got to travel to the meeting ‑‑ but you don't have to travel to the meeting. One of the key things is that it is hard, but not impossible, to get work done in the IETF exclusively on the mailing list. I have a co‑chair of a working group that is winding down now who is from Mauritius, and he has not come to all the meetings. He has participated remotely, and I've been in the room, or something like that. So we do work at that. It means that if you are participating in that mode, you've got to work harder. But, you know, it's just a fact that it's harder, I think, also for remote participants here, if they're in North America right now, right? It's the opposite side of the world, so they're up in the middle of the night, and I think that's just always going to be true in a global development organization. So there are going to be some costs on all sides, but I do believe, at least in the case of the IETF, that as a community we're taking this quite seriously. And that diversity problem is not only about different parts of the world, but also different genders, different ages, different economic circumstances. There are a number of dimensions.
>> MODERATOR: Thank you very much. We're going to move to the last part of our panel. I've asked everybody to give, personally or on behalf of their affiliation, a recommendation for topics that could be discussed in other fora in the next year or two, or perhaps at the next IGF ‑‑ a topic we should delve deeper into. So we'll start with you, Virgilio.
>> VIRGILIO ALMEIDA: Well, I'm going to conclude by emphasizing the importance of multi‑stakeholder governance models for increasing cyber security. In developing countries, in Brazil, we have observed the importance of these models. For instance, in cyber security awareness programs, the presence of non‑governmental organizations, NGOs, and private companies is very important to disseminate the word and the knowledge about cyber security. So again, I want to emphasize the importance of that.
And in the examples we have had in Brazil, with almost 20 years of the multi‑stakeholder model, the results concerning security have been very positive. So I will conclude with that. Thank you.
>> MODERATOR: Thank you.
>> SILVIA VICECONTE: Maybe for the work ahead, I think we should evaluate the public‑private platforms we've discussed and see if they're doing their job properly. That's something maybe we could look at in the future.
>> ANDREW SULLIVAN: I'm loath to make predictions a year out, because the Internet moves so fast that it's difficult, but one thing I would say is that I believe the collaboration among the various stakeholders ‑‑ and technical people are only one of those ‑‑ is something that ought to be encouraged. I've noticed over the past year some additional bridge building between policy makers and technical community people, and I hope that will continue, because I think it's one of the most valuable trends that we can encourage.
>> MODERATOR: Thank you, Andrew.
Liesyl?
>> LIESYL FRANZ: I'm going to say that I would like to propose that we don't wait until the next IGF, say, to have the next discussion. You know, the preparation for the IGF is kind of a year‑long process, and it starts in February. So I would encourage anybody who may want to provide input into what might be discussed next year to start then. Start thinking now about what questions we didn't answer, what questions weren't answered for you in the course of the week, and who you might want to try to meet with, if you haven't been able to do that now, because the way the workshops and the topics are built up, that starts really early. So I encourage everyone to think about that now and provide input, through the website, through working with others, whatever way we can do that, because that's part of the robustness of this forum. Thank you.
>> ASTRID OOSENBRUG: Yeah, I keep repeating myself, but I would say: raise awareness, look at it globally, not only locally or in your own small way, and take responsibility as a user, as a company, as a government. So that would be my message.
>> NINA JANSSEN: I think we just need to continue building trust by sharing information, getting to know each other, being transparent about your interests and objectives, sharing your experiences, sharing your mistakes, and starting to talk the same language. We do that in several ways: the multi‑stakeholder models, the public‑private partnerships. We're starting to move more into public‑private participation ‑‑ acting rather than just talking. But my main message would be: keep sharing. Talk the same language. This summer our national cyber security center and our policy departments actually moved in together within the ministry, and I can tell you that never before have I actually been pleased to get a sticker in my goody bag. So I guess that's starting to connect at the policy level.
>> APARNA SRIDHAR: I think one key thing to focus on in the coming year, and something we might reflect upon a year from now, is how we do a better job of building capacity around cyber security issues in emerging markets. This is a rather northern‑heavy panel, and I think we have some work to do and some outreach to do to share best practices, answer questions, and develop a set of strategies that make sense for those markets and those communities.
>> MODERATOR: Thank you very much. I won't start apologizing for the northern hemisphere, but that's the way it wound up.
I want to give a big hand to our panelists, our remote participants, and the scribes of this session. Thank you very much.
(Applause.)
And to wind up, I also want to thank you for actively participating, and I apologize that not everybody could have all their questions answered, but I hope there's enough to discuss afterwards when we go to lunch. So thank you.
(Applause.)
********
This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.
********