
IGF 2021 Town Hall #17 Emerging Technologies in conflict and maintaining peace

    Time
    Wednesday, 8th December, 2021 (09:45 UTC) - Wednesday, 8th December, 2021 (10:45 UTC)
    Room
    Ballroom C
    Issue(s)

    Digital policy and human rights frameworks: What is the relationship between digital policy and development and the established international frameworks for civil and political rights, as set out in the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, and the further interpretation of these in the online context provided by various resolutions of the Human Rights Council? How do policy makers and other stakeholders effectively connect these global instruments and interpretations to national contexts? What is the role of different local, national, regional and international stakeholders in achieving digital inclusion that meets the requirements of users in all communities?
    Promoting equitable development and preventing harm: How can we make use of digital technologies to promote more equitable and peaceful societies that are inclusive, resilient and sustainable? How can we make sure that digital technologies are not developed and used for harmful purposes? What values and norms should guide the development and use of technologies to enable this?

    Other - 60 Min
    Format description: Preferably fully online; round table (U-shape) if there is to be physical attendance

    Description

    While the main focus of emerging technologies is on industry innovation, there are societal implications as well: this gives rise to the need for additional digital policies and alignment with human rights frameworks. Many emerging technologies can unintentionally cause harm, and some have already been adapted for malicious purposes. For example, technologies such as artificial intelligence and IoT have applications in safety and security systems, autonomous weapons systems, and cybersecurity; however, ethical concerns that misuse or limitations of these technologies could negatively affect privacy or lead to wrongful conviction or targeting are raising the need for national and international regulation and governance. The session will be a round-table discussion where participants can propose solutions, raise awareness of initiatives, and register concerns related to emerging technologies in scenarios of conflict and maintaining peace, with particular relevance to the regulation of these technologies and the application of human rights to prevent harm. The session will be moderated by members of the International Federation for Information Processing (IFIP) Working Group 9.10 on ICT Uses in Peace and War.

    Session organisers will present an overview of the topics and open the floor for discussion to all participants (the intention is to have a purely online session). Each topic will be introduced and discussed before moving on to the next. Participants will be able to indicate that they wish to take the floor and will be given an opportunity to speak on a first-come, first-served basis. The plan is to focus on the IGF’s Official Online Participation Platform; however, the use of complementary technology (e.g. Twitter) can be re-assessed at a later stage.

    Organizers

    International Federation for Information Processing WG 9.10 on ICT Uses in Peace and War

    Brett van Niekerk, University of KwaZulu-Natal & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Joey Jansen van Vuuren, Tshwane University of Technology & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Louise Leenen, University of the Western Cape & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Trishana Ramluckan, Educor Holdings, UKZN & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Speakers

    The organisers will present a brief introduction:

    Brett van Niekerk, University of KwaZulu-Natal & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Joey Jansen van Vuuren, Tshwane University of Technology & International Federation for Information Processing WG 9.10, Academia & civil society, Africa

    Thereafter, the floor will be open to participants to engage.

    Onsite Moderator

    N/A

    Online Moderator

    Brett van Niekerk

    Rapporteur

    Joey Jansen van Vuuren

    SDGs

    16.3
    16.8
    16.a

    Targets: There is a move to govern emerging technologies due to their potential societal impacts, particularly in relation to conflict and peacekeeping scenarios. This requires the promotion of relevant legislation at the national level, as well as at the international level through cooperation. In addition, developing nations may require support in their efforts to implement the regulations.

    Key Takeaways
    There is a need for international regulation of AI, cyber, autonomous weapons systems, and related technologies that can affect international security, particularly regarding their impact on human rights; however, it is recognised that there are challenges in gaining national compliance.

    There is a need for a unified international forum to discuss the impacts of AI, cyber, disinformation and autonomous weapons systems on international security, specifically to consider the overlaps between these technologies while taking into account thematic nuances.

    Call to Action

    A unified forum should be established (possibly under the UN) to consider the impact of AI, cyber, autonomous weapons systems, disinformation, and related technologies on international security, and to consider how these technologies influence one another.

    Session Report

    Introduction

    With the rapid emergence of new technologies, a pressing need has arisen to govern them, given their potential societal impacts, particularly in conflict and peacekeeping scenarios. This requires the promotion of relevant legislation at the national level, as well as at the international level through cooperation. In addition, developing nations may require support in their efforts to implement the regulations. From the session at the IGF, the need for policies on social media use is apparent, as these policies often do not align with national laws within a state. This report presents the policy questions, findings and outcomes of the IGF session on “Emerging technologies in conflict and maintaining peace”.

    Policy Questions

    Three main questions were discussed as part of the forum. These were:

    • Is there a need to regulate cyber & AI use in conflict at an international level to ensure human rights are protected?
    • Should cyber, disinformation, AI and autonomous weapons systems (AWS) be considered under one ‘forum’, or under different forums as is currently done?
    • What is the role of developing nations, and the challenges they may face?

    Relation to SDG Targets

    • 16.3 Promote the rule of law at the national and international levels and ensure equal access to justice for all
    • 16.8 Broaden and strengthen the participation of developing countries in the institutions of global governance
    • 16.a Strengthen relevant national institutions, including through international cooperation, for building capacity at all levels, in particular in developing countries, to prevent violence and combat terrorism and crime

    Discussion

    From the discussion of the policy questions, it was evident that, in general, governments appear to be shutting down social media sites to prevent citizens from engaging on political issues, thus inhibiting the right to provide critical commentary and effectively deeming it illegal. Alongside the implementation of international treaties and the governance of cyber, AI and AWS, the issue of sovereignty was discussed, as some developing nations appear to hold themselves apart when it comes to vetting and agreeing to be signatories to international agreements on the governance of cyber as a whole. With further reference to the regulation of AI, AWS and cyber, there is currently an over-reliance on Article 36 of Additional Protocol I to the Geneva Conventions and the Martens Clause. While these provisions allow for open interpretation and the development of new international legislation, state leaders often avoid becoming signatories to global cyber governance, as they would need to be wary of the penalties or implications of using AI, AWS and cyber in conflict situations.

    With reference to policy question two, the majority of forums currently in place are forcing governments to fulfil their agreements. States within the European Union have adopted, or are in the process of adopting, principles for AI and, to some extent, cyber weapons. However, these initiatives are in most cases controlled and monitored by non-governmental organisations. While these NGOs normally have technology policies that address these two areas, the problem remains the nuances and implementation of laws, as cyber weapons are treated under the laws governing conventional weapon systems, and these subtle differences can create conflict between opposing perspectives.

    With reference to policy question three, there are international institutions, e.g. IFIP research-based communities, which, as in this case, must pay attention to the inclusion of developing countries. Developing countries, however, may require additional support in terms of funding for travel to and attendance at these forums. On the other hand, it is possible that developing countries value their sovereignty too highly and feel that by agreeing to international protocols they may lose part of their sovereign powers. Countries want to control what happens within their own borders, even while they invoke international security arguments.

    Outcomes

    There were two key outcomes which were:

    • Policy question one: Participants should join working groups and should force governments to act in accordance with agreements. Consideration must be given to the role of cyber in conflicts. International law must apply the law of humanity, and this must not be merely reactive when cyber-related conflicts are involved, as the Geneva Conventions allow openness to interpretation and collaboration in new areas of development. This would also include aspects of social media usage governance.
    • Policy question two: Citizens need to advocate for an international forum on the use of cyber weapons and international security, as well as on the use of the same weapons for cybercrime and cyber terrorism. Disinformation is currently a primary means of income for criminals, and global awareness around this is important. Forums can act as both a preventative and an engagement medium for mitigating the effects of disinformation while raising awareness. It was stated that a unified forum for AI, AWS and cyber would be beneficial, especially given the very low barrier to entry for the use of AI, with numerous toolkits available on all of the major vendors' platforms.

    The third outcome was a general discussion on the challenges faced by developing nations. The discussion noted that most developing nations have poor infrastructure and a general lack of awareness, which may necessitate additional support in terms of funding. The issue of sovereignty also poses a key challenge to the willingness of developing countries to participate, engage and become signatories to international governance frameworks. This is a major challenge which can only be addressed through mutual cooperation.

    Links to Additional Sources