The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> ANDREA CALDERARO: I can start. Anyway, this is going to be a very informal session. It's a networking session. How many of you have ever attended a networking session before? No? Good, that's how we are going to start. It's a very informal moment to discuss and share interests around -- yeah, tech and peace, whatever that means, something on which I hope we will come out with some common ground.
So I'm Andrea Calderaro. I'm Associate Professor of International Relations at Cardiff University, and also part of the PeaceTech Hub.
Today we have a series of colleagues participating in this session remotely -- Michele Giovanardi in particular, the coordinator of the PeaceTech Hub and leader of this initiative. And then online I saw Mark Nelson, the Director of the Stanford Peace Innovation Lab; Evelyne Tauchnitz, Senior Researcher at the Institute of Social Ethics; and Peter, of course, who is in the lead of -- what is the title exactly? -- the UN work at Access Now. And then other colleagues in the room, and I will give you the opportunity to introduce yourselves. Michele, do you have anything else to add?
>> MICHELE GIOVANARDI: We have a few more online participants, but I'm not sure they can activate their cameras. They don't have the permissions, but they are listening in. And if they want to interact there is a chat function and, of course, with sound I guess they can intervene and talk. As you anticipated, the idea is to really have a moment for us -- we have been running this networking session for the last four years at IGF, and it's always a space to share some ideas and to learn about each other's projects and work around technology and peace.
So maybe we cannot see the online audience from -- sorry, the in-presence audience from online. So it would be nice maybe to start with a round of introductions, or I can introduce briefly what the Global PeaceTech Hub is and maybe we can have a first reaction. If you want, we do a first round of introductions and then maybe go more into the topic.
>> ANDREA CALDERARO: Go ahead. Anyway, in the audience we are a few more people in addition to what you see on the main stage, so it's going to be a very short round of introductions. So go first on the Global PeaceTech Hub and give as much information as possible.
>> MICHELE GIOVANARDI: Great. Okay. So just a few words on what the Global PeaceTech Hub is, and what Global PeaceTech is. We have Marielza joining online, so please give her co-host rights to activate her video and participate. That's for the organizers.
So I just have three slides, nothing much, just to understand what Global PeaceTech is. We define Global PeaceTech as a field of analysis applied to all processes connecting local and global practices aimed at achieving social and political peace through the responsible use of frontier technologies. This is very kind of -- it sounds very official, but it is something that we can just start with as a framework.
So basically, what is the basic idea of the Global PeaceTech Hub? We know that emerging technologies bear great opportunities for positive change. But at the same time, they carry threats and risks that we need to mitigate. So the idea is that in the 21st century we are battling these two opposite forces. And this is a common trend that we can see with different technologies and with different functions.
We know how the hope of the internet being a force for democracy turned into a force for online misinformation, polarisation, and violence. We know about the opportunities of digital identities in terms of giving access to finance and services to people, marginalized people, and also about all the privacy, cybersecurity, and data protection issues that come with that.
We know about the great potential of telepresence to build empathy and trust, but also about the risk of hate speech and deepfakes. We know about the use of data and the potential of early warning and response systems, and the issues related to how those systems are managed and secured. This can go on and on; the list is very long.
The question is this one: How can we mitigate the risks related to these emerging technologies while at the same time investing in the initiatives that are using technology for peace? And there are many projects that we mapped -- more than 170 projects using technology in different ways to enhance different functions of peace at the global level.
One side is about mitigating the risks with regulation, capacity building, education, and good governance. The other side is about mapping the peace tech projects, assessing their impacts on peace processes, and also trying to shift public and private investment into these peace tech initiatives through partnerships we can build between different stakeholders. Hence the topic of the networking session: A Multistakeholder Approach for PeaceTech -- governing tech for peace.
And first we would like to know who is on the call, what you are doing, and what you think technology can contribute to peace. And second, after we get to know each other, we would like to know what you think of the topic of the day.
So how can different actors -- from academia, government, the public sector, think tanks, NGOs -- come together in this puzzle and create synergies to achieve these peace goals through the responsible use of technology?
So I will give the floor back to you, and maybe we start with the first round and then see where the conversation goes. And I really invite online participants -- via chat, audio, or video -- to interact as well, give us their take on this, and introduce themselves.
>> ANDREA CALDERARO: Thank you for providing a good overview of the perspective from which we are privileged to look at the relationship between tech and peace.
I would like to ask Evelyne: maybe you can introduce yourself and also provide your insight on the topic -- what is your perspective on tech and peace?
>> EVELYNE TAUCHNITZ: Thank you very much, Andrea. My name is Evelyne Tauchnitz. I'm a senior researcher at the Institute of Social Ethics at the University of Lucerne in Switzerland, focusing on human rights and peace -- of course, what we are talking about today.
So if we really are talking about what peace tech is -- I think that is a bit the guiding question here today -- we first have to consider what we understand by peace. And that is a really challenging question by itself, and it's not the first time we are discussing it. We had a conference last year where we dedicated some thoughts to it, and I don't think we reached a conclusion. It is a really difficult question.
I can tell you how I'm using this concept of peace in my own research. First of all, I think peace really could show us in which direction we would like to develop together as humanity, but also what kind of societies we would want to be living in. And although there will never be absolute peace in all corners of the world, or even in any society, we still need to know what it would look like. And I find it really helpful to first look at this positive definition of peace, meaning that it should not only mean the absence of direct violence but also of structural violence and cultural violence. So there should be no injustices. There should be equality, no poverty, access to healthcare, access to education.
So it is about trying to eliminate the causes that often give rise to direct violence. It means that even if you have no direct violence, usually any society would still have some forms of injustice and some forms of discrimination.
Which leads me also to the third form, cultural violence, which legitimizes the other forms of violence. For example, if women are not allowed to go to school. This is structural when they cannot access the school physically. But it's also a form of cultural violence: why is it not boys that cannot access the school? Why is it girls? There probably isn't going to be any society free of this violence.
So when we talk about peace tech, I think it is important to keep in mind that it is more than security only, because security is a very narrow definition of peace. For example, if we think about surveillance technologies: they might increase security, but then again they raise a different problem of ambivalence. It means our personal freedom might be reduced, because data is collected about us and we know that. Even in a conflict setting, we might not meet certain people any more. You would not have a big assembly of people, because people would be afraid that one person being there, one contact, might be suspicious: I'm meeting that person and therefore I should not meet too many people. And if it is a big crowd, it is more likely that somebody is there who doesn't have a good record, so to say.
So it really influences democracy, it influences freedom of expression, how we move -- even how we move, how we behave in public squares. And I think it is really important to look at this from an ethical perspective. When we look at ethics, you might have good purposes, for instance, but it could still have negative consequences.
As in the case of surveillance: you want to increase security, but at the same time it might impact personal freedoms or civil and political rights. So in my own research, I think we need some kind of point of reference when we talk about peace -- what kind of peace do we want, what kind of peace technologies do we want -- and I'm using human rights as a baseline there. If you have technologies that violate human rights, I find it problematic to call them peace tech just because they are increasing security.
But peace in my understanding is also really based on the values of human dignity and freedom. Human dignity and freedom are two basic values that connect both peace and human rights. And that is not by coincidence, of course; it's historically linked, because the Universal Declaration of Human Rights was adopted after the Second World War, after that devastating experience. I don't know really if I should go on, but that is really a topic of discussion that I would like to raise rather than me speaking.
But what do we understand by peace or peace tech concretely? Or like what kind of technology should we gather under this term? Which technology really merits to be called peace tech? Sometimes I see a lot of advertising or marketing efforts to like use this brand of peace tech, but if you look at it more closely it is maybe not so much peace, peace, peace, as it is supposed to be.
So yes, that's so far my input, or something I would like to discuss with you. Thank you very much.
>> ANDREA CALDERARO: Thank you. When it comes to human rights, of course, Access Now has been at the forefront of a variety of human rights battles, and also of a series of processes in the UN context. So, Peter, you can respond with your perspective.
>> PETER MICEK: Absolutely. Thank you. Well, I feel like I want to network with both of you after this. So I think this is getting at the point of the session. And yeah, I appreciate your remarks for two reasons.
I think there's a Martin Luther King quote: "Peace isn't the absence of conflict or the absence of war but the presence of justice," which I think leads to accountability. And then thank you also for recognising the need for this tech to be respectful of human rights. That's certainly where we come from at Access Now. We are a human rights organisation. We got our start during the Green Movement in Iran and have since expanded globally, working at the intersection of human rights and new technologies.
I think around 2018, we recognised that increasingly we were working in contexts characterized by conflict -- fragile and conflict-affected states -- and with the communities who were in the midst of those conflicts, fleeing them, or working in the diaspora to end them. We recognised this and built it into our work plans, most directly on our international organisations team, adding a senior humanitarian officer who brings experience from that sector. Really just the acknowledgement that the digital rights we work to protect are most at risk for conflict-affected persons and communities.
And from, you know, monitoring and prediction, to mitigation, to resolution, to accountability, a lot of digital tools are infused into and becoming essential to each of those steps, I think. So I can mention a few programmes that we have, to make it more practical.
We have tracked Internet shutdowns -- intentional disruptions of access to the Internet -- since the Egypt shutdown in 2011, and have brought this narrative, this terminology, and this charge to end shutdowns through our #KeepItOn campaign. We brought that to the UN and have seen a lot of positive resonance in a number of UN bodies and among States, recognising that Internet shutdowns go beyond the pale and that increasingly these are linked to conflict events and times of unrest.
But they really levy a number of harmful effects on human rights and, increasingly we would say, on the resolution of those conflicts themselves. And so the #KeepItOn campaign continues alongside our work on content governance.
In some ways that is the flip side. But I think what came out during the conflict in Ethiopia was how overrun a lot of social media platforms in particular were with incitement content, and the unresponsiveness or, you know, the perceived indifference or ignorance of those tech companies in dealing with what was happening on their platforms -- in Myanmar as well, of course. That led us to work with partners to create a declaration on content governance in times of crisis. We are following that up with a full report.
So, you know, when it comes to shutting down the Internet or allowing the proliferation of harmful content, there are a number of ways that freedom of expression is directly at the heart of a lot of these responses and interventions.
And then more recently, we've tracked this especially through our Digital Security Helpline, which is recognised as a tool that DPPA and others use to refer people to. It provides free-of-charge technical guidance and advice to Civil Society, which we define pretty broadly, 24 hours a day in 12 languages. And increasingly we are getting reports to that helpline of really invasive targeted surveillance and spyware. I mention this particularly because a recent report we wrote described the use of spyware in the midst of the Armenia-Azerbaijan conflict, which has flared up again recently. We were actually able to detect infections on the devices of the actual negotiators of the peace on the Armenian side. The ombudsperson who was directly involved in talks -- their phone was completely infected throughout. So spyware is now a tool of war, and of efforts to monitor peace negotiations.
And finally, yeah, we are now mapping all of the tech that we can find being used in humanitarian situations, and we will definitely look at the mapping project on peace tech that was presented just now. We are taking a human rights lens: where is the human rights due diligence, what are the procurement processes, how do we determine whether this tech has been analysed openly? Is it human-rights respecting, as you said?
So that is the kind of work that we do more directly in pursuit of our mission to help communities and people at risk. And we try to bring lessons from that frontline work to the United Nations through the OEWG cybersecurity process. I will stop talking soon, but I'm happy to talk more on that later. And I'm joined by my colleague Carolyn Tackett, who runs our campaign and rapid response work.
>> ANDREA CALDERARO: Super. Thank you. I do have a question, and I know that we should keep the roundtable going on.
Since you are closely following all of the UN processes -- and the UN is, I guess, why you are there, because we know the UN is supposed to be the organisation ensuring peace -- what do you think: is the UN moving, and is it moving in the right direction? Of course, the UN open-ended working group is a process I'm following as well, and I'm not sure whether we can feel satisfied about that.
>> PETER MICEK: Well, yeah, I mean it is undeniable that from the top down the Secretary-General has put digital transformation, tech, and cyber at the top of his agenda. You see that in the creation of the Office of the Tech Envoy. And we, for our part, have been pushing the UN Security Council to integrate monitoring of the role of digital and cyber into its mandate and into the situations that come before it.
And, you know, we haven't seen much evidence -- well, we don't know a whole lot about what happens at the Security Council, because it is so opaque by design, especially to Civil Society. But we have not seen really dedicated talks there in ways that could be helpful. Of course, they will disagree over Article 51 and the right of self-defence, and over offensive and defensive measures in cyberspace. But how about just understanding the ongoing situations that they look at on the Council -- Sudan, Yemen, Colombia? How is tech being integrated into and influencing those conflicts, and what role could it play in bringing about some resolution?
We invite others to monitor that body, as well as the First Committee's work. And then the cybersecurity and the cybercrime processes continue right now.
The Cybercrime Treaty looks like it will come imminently, and it has been really interesting to see how they scope that. How wide a lens they take on what constitutes a cybercrime is certainly going to be relevant for the people involved in peace tech. And I think the Summit of the Future finally does include the New Agenda for Peace which, you know, is supposed to be where they will speak to cybersecurity, complementing the Global Digital Compact. Those are both meant to be signed and delivered in September of 2024. So there's a couple of things to look at.
>> ANDREA CALDERARO: Super. So going back online. Michele, you might have a better overview about who is online.
>> MICHELE GIOVANARDI: I just wanted to say that, since we have just one hour and we are already partway through, maybe we should first check who is on the call and who is in the meeting, both remote and live. We would like a really easy round of who we have in the room.
And then we go through to answering the questions and go deeper into the discussion, because I'm just afraid that if we go one by one we will not get to the end. And I would also like to get a sense of who is in the online audience.
So if you could, maybe just like a couple of minutes each, just a round to see who is around. And then we go back to the more in-depth discussions. What do you think?
>> ANDREA CALDERARO: So yes. Mark, please. Let's interact also with the online audience. Mark Nelson, Director of the Stanford Peace Innovation Lab, might have some insight on this.
>> MARK NELSON: Did you want me to speak while people are doing the little intros in text or should I wait for that?
>> MICHELE GIOVANARDI: Let's see if we can do who you are and what you do in one or two minutes, and do this round first. And when we are done, we go back to you and Marielza Oliveira and we get some more in-depth perspectives.
>> MARK NELSON: Okay. So very quickly, my name is Mark Nelson. With my colleague and partner Margarita Quihuis, I co-direct the Peace Innovation Lab at Stanford and also our independent institute in The Hague, the Peace Innovation Institute, where we are working to commercialize our research in peace tech.
And I focus on the measurability that is now possible because of technological advances: peace metrics that are possible at very high resolution in real time, connecting the emerging space of peace technology to capital markets, and the potential to create peace finance as an active investment sector. So that is the quick overview.
>> MICHELE GIOVANARDI: Thank you. I see Marielza Oliveira online.
>> MARIELZA OLIVEIRA: Hello. Hi, everyone. I'm Marielza Oliveira from UNESCO, the Director for Digital Inclusion, Policies and Transformation, which is a division within the Communications and Information sector of UNESCO -- the CI sector for short.
Our task is to defend two basic human rights: freedom of expression and access to information. And, you know, to leverage those for the mission of UNESCO, which has a mandate literally of building peace in the minds of men and women, and a mandate to enable the free flow of ideas by word and image, you know, through every type of media and so on. For us the digital ecosystem is incredibly important, and we have been doing different types of interventions in it for quite a long time.
On one side, on freedom of expression, for example, we recently had the Internet for Trust Conference, which looks at the regulation of platforms in ways that respect human rights. And on the access to information side, we are working on building the capacities of different types of stakeholders so that they can intervene and participate properly in this process. Because there is quite a lot of digital exclusion and inequality happening, giving voice to one specific side only, and we want to see a more inclusive Internet in that sense. I will stop here. Thanks.
>> MICHELE GIOVANARDI: Thank you so much. I see Moses Owiny online. If you want to introduce yourself.
>> MOSES OWINY: Yes, thank you very much. I'm joining from an area where the Internet is really poor, so I may not be able to turn my video on. My name is Moses Owiny. I'm the Chief Executive of the Center in Uganda, a platform that seeks to advance the perspectives of African countries in multistakeholder conversations like this one.
I'm really glad to be in this discussion, because the overall concept of global peace tech, in my view, cannot be fully complete without the perspectives of the many stakeholders that some of the speakers have already alluded to. For example, look at the realities in the parts of Sub-Saharan Africa where I come from, but also in the Pacific and the Caribbean: the future of peace around technology and cyberspace is also shaped by those realities.
So today we are talking about, you know, cyber, technology for peaceful means. What does that mean for a country like Uganda, or Kenya, or Nigeria? It might apply differently for more developed or more advanced economies.
For me, I'm glad that this conversation is here, but I implore the stakeholders and organizers of conversations around this to draw from all of the multistakeholder perspectives and actors that you have clearly talked about. So that's one point.
My second contribution. You are all university professors and you know this. Why do we use the term global peace, when we are in a world of international relations where States relate to each other bilaterally and multilaterally? And I seek to be corrected -- I may not be right.
So is it more accurate to say we are in a global world or in an international world? For me the term global peace is a little bit ambiguous. But something like international peace, in tech, carries more meaning; it is more relatable, because it implies many different States collaborating and working together with the multistakeholder groups to achieve the kind of peace and prosperity in cyberspace, in the tech industry, that everyone desires. So that is my contribution. And thank you so much for this platform and the opportunity.
>> MICHELE GIOVANARDI: Thank you so much, Moses, for these very insightful comments, also connected to what Marielza was mentioning. We will move forward with the round of introductions.
Are there other participants who want to contribute? I see Youssef. I don't know if you want to introduce yourself. Okay. I see Yuxiao Li. And Omar Farukh. Okay, you will have chances to introduce yourselves in the chat as well. And I see Teona Nesovic online; she is the Rapporteur, so after the session she will also put together some of the reflections we are sharing now.
So I guess let's go back to the room. Okay, does somebody want to introduce themselves? Yuxiao Li, if you want to introduce yourself, please feel free to do so.
>> YUXIAO LI: No problem. I have no questions. Thank you.
>> MICHELE GIOVANARDI: Okay, thank you. Let's move forward. So I would like to go back to whoever is left in the room. Or, Mark, if you want to expand a little bit on the topic of the session.
And also, of course, Marielza: do you see any opportunities in a multistakeholder approach in this space? There are many initiatives trying to leverage technology for peace in different aspects and dimensions.
There is actually a nonprofit sector growing in this space, with many organisations and networks. What is missing, maybe, is to connect the dots: to connect this nonprofit sector with the tech companies, which also have the capacity, data, and capability to develop the technology itself, and with government.
Do you see opportunities in developing these multistakeholder approaches and networks? Maybe we will start with Marielza Oliveira and then go to Mark.
>> MARIELZA OLIVEIRA: I'm glad that you asked this question, because this is one of the approaches that we are taking at UNESCO: to bring together different groups, and particularly to bring together the tech community, the tech companies, and, you know, the multistakeholder group. And we have one good initiative on that, the AI for the Planet initiative, in which we are leveraging this technology to combat climate change. So this is one of the things that we see promise in.
Although quite a few of these -- how do you say -- hopeful approaches end up in failure. So this is one of the things we have to be very frank and sincere about. There have been quite a lot of attempts to do exactly that.
So for the companies, the question is: what is in it for me? And, you know, at the end of the day, they will be looking at what the gain is. So bringing them to the table and getting them to commit to specific initiatives, to support them and dedicate time, money, and resources, is quite difficult.
But you have to be selective with the types of initiatives that you pick, where the priorities are high enough and where the companies can make a difference without necessarily a long-term, large commitment. One example that we have that is very successful is the United Nations Technology Innovation Lab.
I don't know whether you have heard about it, but it is using technology in innovative ways to enhance peace-building processes -- such as, for example, 3D augmented reality to show the delegates in New York the conditions that displaced people were facing in a particular war zone, so they could immerse themselves in that and then have an informed discussion about it.
It is a completely different thing to actually see the environment than to just hear the statistics about it. The increase in empathy, in terms of understanding what needs to be done, is tremendous. So there are quite a few good potential things.
Another way that UNESCO is working on this: we have a partnership with the European Commission on a project called Social Media for Peace, in which we track and map the different types of conflicts that are happening online in particular pilot countries. And, you know, we bring together a multistakeholder group that then devises and deploys countermeasure tactics, essentially, to defuse the polarization and to bring back a civil dialogue online. It has been quite interesting.
We have a lot of interesting results on that, and hopefully we can draw some mechanisms for scaling it up out of the pilots. But, of course, the big thing that we are looking at is the Internet for Trust Conference. We are about to release the guidelines that we have been developing for an entire year together with quite a large group of stakeholders, who provided inputs from all sectors of society, on how we actually regulate Internet platforms for a civil discourse as well.
And to do that without affecting freedom of expression and access to information. Because quite a lot of the measures that are out there right now, to enhance one right, sacrifice another. That is one of the things we are not willing to do: sacrifice freedom of expression and access to information.
There are quite a few countries deploying measures blocking certain apps or features, et cetera, in a not necessarily well-thought-out process. There are some examples of those.
And, of course, the consequences are tremendous: for example, limiting the voices of journalists who are recording and keeping us informed about these issues. The activists lose their voice, too. So we can't have measures that are indiscriminate like that. What the Internet for Trust guidelines propose is that instead of regulating content, we regulate process -- that we enable and build together a real governance mechanism that fosters transparency and accountability. Because those are the things that are missing on the Internet. Accountability is something that is not there.
So this is one of the key things that we need to target. Accountability is what truly makes people behave in a responsible way. If you don't have it, there is no responsibility, there is no decency, and people act in ways that are counterproductive to societal development. So thanks.
>> MICHELE GIOVANARDI: Thank you very much. Andrea, if you agree, I will also move to Mark online. And then if you have feedback from the room, feel free to interrupt us.
Mark, if you want to give us your comment on the topic of the session, also connected to what Marielza was saying about building trust and building measures of impact based on data. And I know your effort is also in this direction, on the measurement side.
How do we measure these impacts? How do we find parameters to describe this space as well?
>> MARK NELSON: Thank you, Marielza. Thank you also to Moses for that excellent question. I think these things all tie together.
There is a lot of general interest in peace tech without, perhaps, a detailed understanding of what it is, what the components are that make it possible, and what puts us, as a human species, at this incredible opportunity right now.
What changed? What's different? Because it's not the wheel, it's not the lever. I mean, you could make a general case that our ability as a species to construct tools of any kind is interesting and useful if we use it right, and that's kind of true. But the thing that is really unique in the last 20 years is the vast proliferation of sensors -- sensors that can detect all sorts of changes in the environment in real time. And what really makes this powerful is when you start looking at the subset of sensors that can detect human behavior, and then again at the subset that can detect human social behavior, how we actually interact with each other. At that moment we start realizing that for the first time ever we can really measure not only what kind of interactions we have between individual people in real time; we can also measure, for the first time ever, the impacts of those interactions over time.
And so we can very quickly test theories: I thought that if I did this, it would be good for that person. But was it really good for that person or not? Suddenly we can answer those questions, and we can answer them in real time and with huge sample sizes.
What this means is that as we structure this data of human interactions, we can see where we are doing really well, what the best-practice behaviors are that humans can do for each other, how we might design even better behavior sequences for each other, and how we could customize and tailor them to each of our individual situations. That's where the huge opportunity of peace tech really lies.
And it allows us to go back to one of the original dreams of Norbert Wiener, who back in the 1950s was setting up the foundations of information technology and of cybernetics: a closed technology loop between a sensor that could detect something happening in the environment that we cared about; communications technology that could move that data to processors that could make the best sense of it; and the processor connecting to an actuator that could then respond to the environment. So that the sensor can detect: did the response cause the change I wanted? Did it move the needle at all? And if it did move the needle, did it move it in the right direction?
That's where the opportunity of peace tech is now incredibly powerful, because when you close that loop you go from a linear kind of technology to a closed-loop technology. And that closed loop changes things into fundamentally persuasive technologies.
And this allows us -- I want to underscore why this is so historic. This allows us to move on from, quote, peace technology like the Colt Peacemaker from the 1800s, if you all know your western movies really well. You know, it's a gun and they called it a peacemaker. And it was built on the theory of the Leviathan, right, that whoever has the power can enforce peace and so on.
Peace, up until this change in technology, has been coercive peace, based on a Hobbesian Leviathan theory. And what has changed is that for the first time ever we can build technology that can create persuasive peace instead of coercive peace. And that is a huge shift for our species. I will pause there and let everybody respond to that.
>> MICHELE GIOVANARDI: Thank you, Mark. Over to you, Andrea. We have eight minutes left. I just want to remind everybody that maybe we should collect some takeaways before the end of the session.
>> ANDREA CALDERARO: Yeah, there are people queuing.
>> AUDIENCE: Thank you so much. Hello, everyone. My name is Manja, I'm from China. I am a youth representative of a peace-building project run by UN DPPA.
So my question to all panelists is: how do you see young people's role in tech for peace, and how can young people be engaged in this initiative or this wider agenda? And do you have any best practices to share, or current or future plans for this? Thank you so much.
>> ANDREA CALDERARO: Thank you. Would anybody else like to add anything to the discussion from the floor? No?
>> AUDIENCE: Sure. We are networking. We can chat. Hi, everyone, my name is Carolyn. I'm the Director of Campaigns and Rapid Response with Access Now.
I think your question about the definition of peace in the first place is a really important one. From our frame, we really think about digital rights rooted in human rights, and that applies in all contexts. And that has a very clear kind of legal structure for understanding what the rules of the game are.
I would be curious to understand from the peace tech folks what the added value of this peace tech lens specifically is. It seems like it covers a lot of ground, from human rights to social justice and environmental justice. Perhaps it is a bit of an umbrella term, but I think understanding what we are trying to get at by using that lens would be helpful.
The other thing on my mind from this conversation is how this peace tech movement is thinking about the compulsion to move in the direction of techno-solutionism. I'm not saying that is what is being presented here, but there are many conversations in this space, here at IGF and across the UN system and in other spaces, where the instinct is very strong to reach for a particular digital tool to solve a very complex human problem.
And so I think, if you could all share a little bit more about how you are thinking about integrating the technical conversations back into some of the core work of peace building on all of these many different fronts, and how those things feed back into each other.
>> MICHELE GIOVANARDI: Maybe we can reply for the Global PeaceTech Hub, and someone can reply from the youth perspective.
>> ANDREA CALDERARO: Yeah, very briefly, just 30 seconds so we can wrap up the session before the social reception that we are having here.
>> MICHELE GIOVANARDI: Maybe I will start with the last question, and then anybody who wants to jump in for the youth perspective on this.
So, for the Global PeaceTech Hub: first of all, it is a growing space made up of different actors, and we are not representing them, of course.
But the Global PeaceTech Lab is an initiative based at the European University Institute in Florence, at the School of Transnational Governance, which is a school of governance for current leaders on issues that go beyond the state. That's the context. So our role in this is as a facilitator of different actors, along with, of course, some independent analysis of what is going on in the space. So we can't speak for the others.
But where we position ourselves, and where we see the added value, is in fact in connecting the different spaces that don't necessarily talk to each other. So the peace tech idea was born more in the peace-building sector, especially among NGOs that would apply digital tools to peace-building initiatives they were doing already. Global PeaceTech starts more from a tech policy perspective. And as was explained at the beginning, it is equally about how you assess impact and invest in impactful peace tech projects, but at the same time about how you govern and mitigate the risks through good governance, regulation, and capacity building.
So this challenge, and its definition, is very broad, as you say, of course. But at the same time we see the value in the interconnectedness of the different challenges.
We were talking about trust, for instance, before. How is that divorced from any discussion of digital identities? How is it not linked to some applications of blockchain? How is it not linked to discussions about data and who owns the data? How is it not connected with Internet infrastructure? And how do we provide accessibility, secure accessibility, to the Internet? Is that not connected to cybersecurity? So you see the interconnectedness of all these different challenges, and I think that is the kind of added value we are trying to bring.
And on the concrete side, we also want to foster dialogue between the different sectors, which discuss these issues in their own bubbles and don't necessarily talk to each other. I'm talking about the tech sector, for instance, which doesn't have discussions about social tech or peace tech at the top of its agenda, let's say.
At the same time, NGOs are doing things on the ground, so they know what is going on there, but this doesn't get elevated to the governance level, and governance should be involved. We are trying to connect these conversations and regulations with the definition of peace and adopt a positive peace paradigm, a bit similar to what underpins the Global Peace Index. So these positive peace pillars, which is not just the absence of --
>> ANDREA CALDERARO: Sorry, they are about to make us clear the room. They are going to kick us out of the room. I think Evelyne had a brief reaction to the point made.
>> EVELYNE TAUCHNITZ: Thank you very much for this question. I just want to take up the question of the added value of peace as compared to human rights. I think it's a really important one. As I see it, human rights are a necessary condition for peace, but they are not a sufficient condition. That means peace requires respect for human rights, but it is something more. I see it more as a vision.
And in a sense you also have, with human rights, a bit of a problem in that they come piece by piece: you have this human right and that one and that one. And you can imagine that a government can interpret human rights as it wishes, in a way.
It can say, okay, you get that right, but we don't have the resources to fulfill this kind of right, and so on. Whereas peace is more of a broad concept.
And I also see that it is, in a way, almost easier to start a public debate on peace; everybody has a perception of peace. Whereas human rights are really focused on rights, and they are legally enforceable, which is an advantage, but much narrower. And they are not about the process; for example, how you reach certain political decisions is not really incorporated in the human rights perspective. So this I think is important: human rights are necessary, but not sufficient.
If I may just make a brief comment on the youth question. I think it's a really important one as well.
I think it is also important to ask how digital change is changing our perception of peace. What we have been thinking of as peace might change in the future precisely due to the digital transformation. So it is not only that we can use digital technologies for constructing peace; through that process we might also arrive at a different understanding of peace. And there youth is really, really important, I think.
And connected to that, I think you said you are from China. I had a conversation with a colleague of mine recently who did research in China, and he told me, for example, that a lot of the social scoring system has to do with a deficit of trust in today's China as compared to the China of earlier generations.
And in order to reconstruct that trust, social scoring systems are considered useful. Of course, if we talk about peace, it is this broad condition. Do we want to build trust through digital tools, for example? Or how do we want to build this trust in society? That would be a really vital question, I think.
>> PETER MICEK: Really quickly. So thanks. On a few of the questions: for youth, first of all, youth are leading movements for human rights and for peace around the world. They are putting themselves on the line, more and more so in a digitized era where everything they do, their pictures holding protest signs at marches, is going to be indelible and has probably already been scraped for facial recognition, and they're getting attacked by spyware; we see that on the phones of youth. So from Thailand to Sudan to the USA, youth are doing the work, and it is on us to help ensure this doesn't lead to reprisals or permanent problems for them.
Just to respond to Mark: I think we were talking about euphemisms here. Peace, tech for good, digital public goods. As advocates we chafe at this because there is a bit of a trust deficit. We don't see either the tech sector or, frankly, large humanitarian agencies as deserving of the trust of our personal data, personal information about us, especially for those most at risk.
So when I hear talk of the ubiquity of sensors: sensors implies some kind of passive monitoring. That is so contrary to the vortexes creating these traps for us to deliver our sensitive personal information into black boxes that we have no control over, which characterizes most modern tech. So I think sensors is a dangerous sort of word to use in that context.
Unless, you know, you are talking about some tools that are completely divorced from the modern digital economy. Thanks.
>> ANDREA CALDERARO: Lastly: this was, of course, a networking session, which means that we are pretty well networked, at least the people on this stage. But for those of you who would like to connect to these conversations, we will of course have a follow-up. Please pass your contacts on to me and we will keep you in the loop.
Again, this was a networking session. My impression is that even within the tech and peace community there is still no common ground, no agreement on what exactly tech and peace means. For some, tech is approached as a tool to achieve peace. For others, tech is a new battleground as such, hence all the discussion about cybersecurity and so on. For others still, tech could be a threat to peace, as in, for example, the discussion of autonomous weapon systems and how AI is increasingly embedded in arms.
So yeah, I think there are still lots of conversations to be had, also within the tech community, in order to bridge these communities that work in silos. And hopefully we will have additional sessions trying to break these silos. That's all from here. Michele, if you want the last word, just 10 seconds. Just a word.
>> MICHELE GIOVANARDI: Perfect wrap-up. Enjoy the sushi dinner. It was a great networking session. Let's keep in contact and keep the conversation going. Thank you.
>> ANDREA CALDERARO: Fantastic. Thank you also to the online contributors to the discussion. Bye, all.