IGF 2024-Day 2 - Village Stage - NS 169 A Rights-Respecting Approach to Emerging Tech Development-- RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR:  Good morning, everybody.  Thank you very much for joining us for this afternoon's session.  My name is Radka Sibille, and I cover digital affairs for the European Union delegation in Geneva, and I'm really happy to be moderating this fantastic panel with our great panelists today.

     So the session is focused on a rights-respecting approach to emerging tech development.  It's a session that's co-sponsored by the Freedom Online Coalition, the Kingdom of the Netherlands, which is chairing the Coalition this year, and the United Nations Human Rights Office in Geneva. 

     As was already mentioned, the topic of our discussion is how to tackle the development of new and emerging technologies in a way that is safe, that protects individual human rights, and that empowers people. 

     This also extends to technical standards setting, which is being done in several regional and international organizations and has very concrete impacts on people's everyday lives. 

     In the EU, we try to embed the human rights approach in all our EU regulations on technology, such as the AI Act, the EU Data Act, et cetera.  And you may recall that at the international level, the international community reinforced the centrality of a human rights-based approach to tech in the recently adopted Global Digital Compact, which really says that international human rights law and standards need to be firmly embedded in everything that we do with technologies. 

     So now, as we're entering the implementation phase of the GDC, the question is how we can collectively implement this human rights-based approach.  It's very timely, and I'm very happy we have here a very diverse group of speakers representing the various stakeholders from various sectors that all have to work together to make this happen. 

     So just to present: we have here Isabel Ebert, who's a human rights officer from the United Nations Human Rights Office in Geneva.  Thank you very much for joining us. 

     I also have the pleasure to present His Excellency Ambassador Brendan Dowling.  Thank you very much, sir, for joining us. 

     I also have here the Chair of the Internet Engineering Task Force, or the IETF, one of the international standards-setting organizations, Mr. Roman Danyliw.  Thank you for joining us.  And I have a representative of civil society, Mr. Ihueze Nwobilor, who's senior programmes officer for Paradigm Initiative.  Thank you very much. 

     And last but not least, I have a representative of the private sector, Alex Walden.  Thank you very much for joining. 

     And before we dive deep into the topic, I would like to first give the floor for a keynote speech to the Director General of the Department of International Organizations and Human Rights in the Ministry of Foreign Affairs of Estonia, Mr. Rasmus Lumi.  Thank you. 

>> RASMUS LUMI:  Thank you very much.  It's a great honor to be here and to be able to say a few words on the issue of standardization to lead into the discussion of the panel.  I could not agree more that a human rights-based approach is of essential importance when we talk about emerging tech development. 

     Most likely each one of us has used ChatGPT or another chat assistant at least once, or maybe you have ordered something online or rebooked your plane tickets.  You gave your information online, the things it asked for.  It may have included your name, your order number, maybe your birth date and identification number, and so on. 

     But before you did this, did you consider how this data would be processed, and whether the company you were dealing with actually had stringent privacy standards? 

     We all know the pace of technological advancements is speeding up by the year if not the month, and that is what we want. 

     We want innovation, and we want technology that helps us reach the Sustainable Development Goals more easily. 

     The question is, can standards keep up?  And more importantly, can they adapt? 

     And what can we do to help?  I'll leave that for you to answer, but I'll do my part by giving maybe some food for thought.  Firstly, it is all of our duty to uphold human rights and to make sure that the foundation we have built together is taken into account in ongoing and future processes.  Estonia is the next chair of the Freedom Online Coalition, and we'll continue to advocate and work for this. 

     As part of our priorities, we'll be focusing on tech governance, digital inclusion, and cooperative engagement.  I'm glad to see so many important stakeholders on this panel who will be key partners in our joint mission. 

     The report of the OHCHR from last year on human rights and technical standards setting processes gives a great overview of the relationship between technical standards setting processes and human rights.  Of course, a lot of work still needs to be done to address the challenges that we face.  And global issues can only be solved through collaboration. 

     Secondly, I would like to emphasize that standards-setting processes must be multistakeholder.  While states have obligations under human rights law, the private sector as well should respect human rights throughout the process, from development to adoption to implementation. 

     UN entities as well should make sure to follow the UN's guidance on human rights due diligence for digital technology use. 

     Moreover, all these entities should take measures to enable and empower the participation of stakeholders that are often sidelined yet should have a seat at the table.  We must adopt a proactive stance towards stakeholder participation, and governments especially have a big role in this. 

     This brings me to my third and final point, which is that extra care must be taken when it comes to marginalized populations.  With all technologies, the human rights risks are highest when it comes to children, persons with disabilities, indigenous peoples, and so on.  We could also talk about people in conflict zones and other vulnerable situations, where they are less able to protect themselves and their interests. 

     Therefore, standards must be assessed on the basis of their effects on marginalized communities.  All of us must make sure to protect rights in standardization processes.  Standardization is an integral part of making sure we get the technology we want, so let's make sure we get standardization right.  Thank you very much. 

>> MODERATOR:  Thank you so much for this comprehensive overview, which allows us to dive deeper into the topic.  Now to our panelists. 

     First, let me turn to Isabel from the office of the High Commissioner for Human Rights.  Could you please outline the risks that emerging technologies can bring to people and how we can mitigate them?  Thank you. 

>> ISABEL EBERT:  Thank you very much, Radka, and thanks to the hosts for the session.  We're very much looking forward to working with the incoming chair, Estonia, and we thank the Dutch for their amazing leadership throughout this year. 

     You heard a few speakers talk about the necessity of integrating human rights into implementation, and I would like to reemphasize the role of the private sector in respecting human rights when seeking to make its contributions to the implementation of the GDC.  If you look for that language, it's mostly under Objective 3, paragraphs 22 to 25, which call on companies to respect human rights across the full lifecycle, which includes development.  That is very much in line with the standard that we are promoting for responsible business conduct, the UN Guiding Principles on Business and Human Rights. 

     You will have noted that the GDC has called on the High Commissioner for Human Rights to provide further guidance to states and other relevant stakeholders through an OHCHR advisory service, and that will also include advice on the UN Guiding Principles. 

     For those of you who don't know the UN Guiding Principles on Business and Human Rights yet, they are the leading global standard on responsible business conduct, alongside the guidance from our sister organization, the OECD, which is in its conceptual approach very much in line with the Guiding Principles. 

     The core new element they brought to the table when they were adopted in 2011 is that they set out the corporate responsibility to respect human rights.  So alongside, of course, the state duty to protect and promote human rights, they bring the active role of businesses into the conversation: not negatively impacting human rights.  Some companies today actually want to go beyond just respecting human rights; they also want to promote human rights. 

     In that sense, our office tries to provide further clarity on what that actually means: how do you break human rights down to the conduct of tech companies?  Five years ago we founded the B-Tech project, where we work jointly and directly with tech companies on understanding how they can first identify, and then mitigate and prevent, human rights risks from occurring.  All with the idea that we need tech to advance humanity, but we need to discuss how we do so and make sure that we have the right safeguards in place, so that the risks do not materialize and we can really reap the benefits of tech innovation. 

     So in that sense, we have a couple of experiences to share on what types of risks can occur in the context of digital technologies.  I would just name three, to illustrate a little bit, but there's also lots of work coming out of the special procedures, for example, and also some of our reporting to the Council. 

     First, freedom of expression issues can occur when content governance fails to provide for contextual nuances.  This is an issue in particular for Arabic-language content, where platforms fail to allocate the appropriate resources to manage content in Arabic.  It would be great if that could be improved by allocating appropriate resources for majority-world countries as well, while in the global north, of course, we are very much looking forward to seeing some of the regulations that have been adopted give content governance practices another push from a rights-respecting perspective. 

     Another risk is gender discrimination in the outputs of large language models.  We've heard here from the host government the strong call for gender equality in internet governance.  Yet large language models create very stereotypical, degrading representations of women, so this is something that can be addressed by better risk management practices of platforms.  And last but not least, as we move towards interacting more and more with large language models in the form of chatbots: youth are using chatbots for mental health questions and have been confronted with false information that can be a detriment to mental health. 

     Since I'm an eternal optimist, though, I think there's a lot companies can do to mitigate these risks, and a lot states can do to help them.  We published a taxonomy of human rights risks and how they map onto generative AI, which I would point you to as a very digestible format.  The main message coming out of our work with companies is the need for contextual practices as well as human rights due diligence to better understand contextual risks, and then find solutions to help mitigate such risks.  Red teaming is very important, but technical standards setting is also an area where we see companies engage, and they can often be a helpful voice in shifting technical standards-setting discussions towards a more rights-respecting approach.  In that sense, technical standards setting is really the glue that binds the policy discussions to the actual practical implementation and operationalization.  It allows us to translate human rights into technical requirements.  And just to close, this is also where there's a strong role for Freedom Online Coalition members: to apply a smart mix of regulatory measures but also policy measures in order to require tech companies to embody the spirit of the UN Guiding Principles and to prevent and mitigate the risks with regard to their products and services in the digital space. 

>> MODERATOR:  Thank you for what your office is doing and for recalling the GDC language that enshrines human rights in the digital space.  I expect your office will be doing even more work, so good luck with that. 

     Now, let me turn to the government perspective and ask a question of Ambassador Dowling: what do you think the role of governments should be in ensuring that new and emerging technologies, such as the generative AI that was just mentioned, respect human rights, and not just respect but promote them, as Isabel was saying?  And what role do you think the FOC, with its wide membership, can play in this?  Thank you. 

>> BRENDAN DOWLING:  Yeah, thank you.  Look, I think we made a mistake in technology 20, 25 years ago, where we essentially bought into the idea that the technology industry was moving so fast that it was difficult for governments to make regulations and policies with respect to technology.  I think there was an attitude from some parts of the technology sector that governments were not benign actors when it came to interventions in the technology market.

     And so for a couple of decades, we essentially did not step into the role that democratic governments play as guardians of human rights, guardians for their citizens, in shaping how technology is developed.  I think we saw that mistake play out in a range of really damaging ways.  We saw technology products developed that did not consider safety by design, that did not consider the human rights impact, and it was only after those impacts were realized that we all took action collectively.  And to be fair, when those issues manifested, the technology companies would respond and take action. 

     For example, when we saw one platform become a dominant technology player, we saw large-scale streaming of child abuse, an issue the company did not engage on until it was demonstrably causing harm.  We saw TikTok have no trust and safety team when it initially developed and became ubiquitous.  We were concerned with OpenAI, with the growth of that company and the growth of their large language models, that they had underinvested in trust and safety.

     So we are in this cycle of developing new technology; the technology is immediately used by bad actors to abuse human rights in some of the areas that we've already talked about; and then we say:  Okay, how do we address that?  How do we fix that?  How do we introduce measures to protect and preserve human rights? 

     There's a better way to do that, and that is addressing it in the design phase.  It is about consulting with the human rights community, who are always willing and active to engage with technology companies.  It's about governments setting clear expectations that technology is not neutral: that the design choices made when technology is developed have a real impact on how it is then used, exploited and abused once deployed. 

     It's about a constructive conversation that brings in a range of actors.  It's about saying that a technology that does not recognize and prioritize human rights impacts is not ready to go to market.  It's about not rushing to market to take advantage of commercial interests.  It's about saying that a core expectation is that we will consider the human rights impacts of our new technology before we deploy. 

     Yes, technology moves fast, but when you come at it at a principles level, it is the same principles that we can apply to various classes of technology.  I think the standards piece is crucial, and we were very pleased to work with the Freedom Online Coalition last year on its work on technical standards and human rights in digital technology.  I think it lays out the expectations and the ways that human rights can be considered in technology standards. 

     So for me, it's foundational to all elements of new technology.  The materials are there.  Other resources are there.  And there is the willingness of the human rights community and governments to engage with industry to ensure human rights are embedded. 

>> MODERATOR:  Thank you, thank you so much, and also for recalling the slow start of governments in the early stages of the development of technologies, before they started to really pay more attention to the risks. 

     Now, to technical standards setting, which is being done in large organizations.  We are very glad to have IETF chair Roman here.  How do you think this process of technology standards setting can be made multistakeholder and inclusive, with all the voices at the table?  As we just heard, even the design of technologies is an important phase, and human rights need to be embedded at the very early stage.  And also, how can your organization work with a coalition of partners such as the FOC to spread the word? 

>> ROMAN DANYLIW:  Thank you for the questions.  The IETF sets standards not just for emerging technology but really for all sorts of technology.  Before I directly answer your question, I want to illustrate a little just how unique civil society's contribution can be to the standards process and how the IETF has really benefited from it.  Civil society participates in the IETF, and it helps the standards process along in three different ways.  First, when new work is started, very often one has to really conceive: what are the threat models?  What are the use cases of this kind of technology, and what are their underlying implications for privacy, security and human rights?  Civil society is uniquely positioned to help with that.  A very concrete example: last year the IETF helped standardize the detection of unwanted tracking.  Think of those Bluetooth fobs for finding lost items: the legitimate purpose is that you put them in things like luggage so you can find them, but unfortunately they're also being used for things like stalking, especially in the context of intimate partner violence.  Civil society was crucial in really educating the technical community, and frankly also the governments participating in the IETF, about what the shape of the solution has to look like as table stakes before the protocol work begins. 

     Another great place where civil society very much contributes to the process is once protocol work has begun.  Inevitably this is an engineering exercise, and there are design tradeoffs to be made.  What civil society very much helps with is helping the multistakeholder community participating in that standards process understand the consequences of each one of those tradeoffs.  That better informs which of those tradeoffs ultimately needs to be made, and we walk into them as we're engineering with eyes fully open about what the consequences might be.  At design time, mitigations can potentially be put in place to buy down some of that risk, so we're not discovering it in fielded products. 

     And the last place where civil society really helps is what I'd call the last check: the specification is essentially done, probably after a year or two of work.  Is it fit for purpose?  This is the last chance to put independent eyes on it: is this fit for purpose before being ultimately fielded?  We've seen that engagement in the IETF, and we welcome civil society and human rights experts coming in and helping us reason through that.

     Now, to directly answer your question about making sure that multistakeholder participation, especially from civil society and human rights experts, can work in the standards process. 

     I would say that there are two classes of barriers to identify.  One is around governance, and the other, for lack of a better term, I would call "operational barriers."  On the governance side, the very quick summary is that standardization is not a commodity process.  The process you choose and the standards organization you go to will directly impact who will participate, and for certain standards organizations, that means easy participation for civil society. 

     For other standards processes and other standards organizations, it's exceedingly difficult to do that.  We would observe that the best integration of civil society and human rights comes when we follow an open process, which we think is the right approach for technologies that we want to see broadly adopted.  Processes where civil society is cut out of the standardization itself and relegated to set-aside consultations are not the right model. 

     We believe that the technical community, the private sector, governments, civil society and academia should be participating equally throughout the standards process, all on an equal footing. 

     From a governance design perspective, this is pretty much why the IETF has the maxim that everyone participates as an individual, so everyone can contribute to all the different elements of the standards process.  There isn't, again, some special kind of consultation.  And in the broad community that is the IETF, we also have a research arm, with one group dedicated to human rights, providing human rights considerations for protocols.  Again, it doesn't operate any differently than any of the other working groups; it operates alongside the working groups on cryptography and other web protocols, with the same processes through the IETF.  Turning to the operational barriers, we've observed a couple of things that make it hard for civil society to participate.  First, on-site meetings as the primary modality are challenging.  Showing up is expensive: meeting costs, meeting fees, can you get a visa in time?  And, of course, the sustainability concerns.

     Then there is sufficient transparency in the process, and open access: if you want to provide feedback in a review of the underlying thing that's being standardized, can you actually get the artifact?  Do you have a vehicle to provide feedback on that artifact?  Can you see how others have commented on it?

     And then lastly, there's fluency in the standards process.  I've heard this joke made before: if you know one standards organization, you know exactly one standards organization.  They really are radically different, and being able to meaningfully contribute requires some internalization of the different cultures. 

     So from the IETF perspective, to tackle some of those, we've thought really hard about what it takes to participate, and we've recalibrated our processes, really since COVID, to ensure that we have mostly remote meetings and that all other meetings are hybrid.  There's no such thing as onsite-only, and the technology we use to facilitate the hybrid process is a dedicated app.  You have to use it if you're onsite, and you use it if you're remote, because we want to make sure that being physically onsite in some place is not a requirement to contribute in any way to the standards process. 

     We also use asynchronous tools to make sure time zones are not a barrier, like email for assessing consensus and things like issue-tracking software.  And we're collecting pretty strong evidence that it is possible to fully contribute to the standards process mostly or entirely remotely, including via leadership positions, which we also think is critical, so we can get expertise from around the world. 

     On access to artifacts: we have continued a commitment we've kept for decades, which is full open access.  All working products of standards, all final standards, all discussions, meeting agendas, meeting minutes, posted videos: everything is available on the website, no login required, and you can get at it from anywhere in the world. 

     And the place where I think the IETF, and other standards organizations, need to continue working hard is onboarding.  How do you welcome more people into the organization?  How do you better learn from them, learn their modalities, take their input, and also help integrate them into the process?  We have a number of initiatives working on that.  And to briefly pivot to the other question, the link to supporting the FOC: I would just say that member states have the ability to shepherd work, to guide work towards the venues where it might be standardized, towards standards processes that give human rights experts and civil society a seat at the table from the very beginning of the process.  And I'm not smart enough to know all the future emerging technologies that are out there; I can envision that standards organizations might need to be spun up for new technologies.  So get it right from the beginning: help make sure they bake in governance that ensures civil society and human rights can benefit. 

>> MODERATOR:  Thank you for sharing your organization's experience in increasing the diversity of voices in your work; that's important.  You mentioned there are still a lot of impediments, and the costs for civil society are still very high.  So let me now turn to Ihueze. 

     You're bringing the voices of the users, of the people on the ground.  How do you see your role, and also the challenges of being at the table as an equal participant?  And how could you maybe work with the FOC to increase participation?  Thank you. 

>> IHUEZE NWOBILOR:  Thank you so much.  I'll start by saying that when he was talking, I was nodding my head, and I will also speak about impediments, because my organization got interested in engaging with the standards-setting organizations about two years ago. 

     One of the things we noticed is that we don't have a lot of civil society organizations with an interest in that area, and this is difficult because the work is usually highly technical.  We found that a lot of civil society organizations don't have that technical background, which dissuades them from even attempting to engage. 

     And even then, the next thing, from our own personal experience, is that when you do want to engage, there's a whole lot of gatekeeping. 

     So, for example, we were part of a project that was supposed to engage at the ITU.  On our first attempt, we found out that as a civil society organization we could not engage directly; we needed to be part of my country's delegation.

     And it took me two years, two full years, to be given access to the TIES account at the ITU, and it took a whole lot of process, to the point that the minister of communication needed to approve it.

     So while we were interested and following the work, we had access only to limited documents when we wanted to make inputs and join working groups.  Most times it took our tenacity to stay on it for two years to get that access.  Civil society organizations that cannot stay that long give up along the way, and that's why you see so few of them.  When you look at the rates and see that in an organization like the ITU you have only 10% civil society participation, that's too small.  It's too small. 

     Those are part of the challenges that we have.  And the cost of participation in these organizations is very, very high.  Most civil society organizations do not have the resources; most are grant-based, and when we look at what's happening globally today, the grant funding available is shrinking.  So civil society organizations now have to channel their resources elsewhere; they don't prioritize this, and they look the other way. 

     Then one of the other things that we see, which is why we are saying it is important to do this, is the role standards-setting organizations play in emerging digital technologies. 

     Most times we as civil society are somehow reactionary: the food is already at the table, and all we can do is complain that it's salty or has too much pepper.  But, like he mentioned, if we participate in standards setting at the point of design, at the point of development, we'll be able to identify the areas where changes need to be made, which will shape the product that is made at the end.  We can catch the issues that have to do with impediments to human rights in the development of those technologies. 

     So most of these challenges are the lack of resources to enable us, the gatekeeping, and then the lack of capacity to engage technically.  For me, the very first time I joined an IETF event, everything sounded strange to me.  It took workshops and master classes, over a year and a half, to come to terms with things like Internet-Drafts, how to go through the processes, and the work of the working groups, so as to be able to make an input.

     And those processes are very rigorous, so one of the things we are looking forward to is making participation easier and helping civil society develop capacity, because we know that when we do this it will work for everyone.  But the danger, or one of the challenges we see, is that most private organizations move at the speed of the new emerging technologies, because their focus is to make profits. 

     Like he said, nobody wants to spend time and resources making consultations, calling people to make inputs, when you need to keep up with your competitors to bring out the new technology and make money.

     And then on the side of governments, what they are thinking about is security.  Most times, all the things we talk about, all the processes we suggest to help embed human rights, sound so strange to them.  Most of the time they see it as time-wasting. 

     So for us, those are areas to address, no matter how difficult they look.  As the GDC comes in and as most of these conversations go on, those are the things that we want brought to the table, to make sure these vital stakeholders are given space at the table, and not just space, but space that empowers, because when you go into a room where you are only 10%, you find that your voice might even be drowned out.

     So those are some of the challenges that we have seen over time, and with the new technologies coming in, this conversation should look at them, because if we don't address them, what we are going to be doing will look like a talk show where we come in and make noise and at the end of the day make no impact.

     So this is the time to identify these things, as we are talking about how we are going to support civil society to have access to resources.  I appreciate the IETF for what they've done so far in making their meetings hybrid, and those are some of the calls we are making to other standards-setting organizations: create an enabling environment, and also help people, like you said, with the onboarding process.  Create avenues, create workshops, to help people who are interested to engage to build the capacity that will help them come to the table and make an impact.  Thank you. 

>> MODERATOR:  Thank you very much. 

(Applause.)

>> MODERATOR:  Yeah, thanks also for your patience in waiting two years for that account at the ITU.  It's important to be at the table, and it should not be as difficult as you said; civil society should definitely be able to participate in its own right, not just as part of governments' delegations, right? 

     Which brings me to the last panelist, Alex from Google.  How do you see it from the private sector perspective: what can the private sector do to ensure that the new emerging technologies are safe for people and protect and empower individuals, and how can you work with the FOC?  Thank you. 

>> ALEX WALDEN:  Thanks for including us on the panel.  This is my favorite topic, so I'm very happy to talk about this. 

     First, I just wanted to start by saying that for everyone to realize the benefits of technology, we do believe that companies, alongside governments and other stakeholders, need to center human rights considerations; that's really the only way this will happen.  Bringing technology to everyone is part of our mission as a company, and I know lots of others in the private sector committed to human rights have similar missions, and ultimately we all have to be in this together.  So I just wanted to highlight the piece about all of us being here, part of that being about realizing the benefits of technology.  I think focusing on the risks is also important, but thinking about the benefits is an important driver, in particular for companies.  What brings a lot of the engineers and technologists to companies is really the belief that working on the technology is something that's going to benefit people everywhere.  So just as a starting point, I wanted to flag that. 

     But really, where it needs to start for companies is having a commitment to human rights.  I say it sounds simple, but in fact many companies don't have a clear policy stating that they are committed to human rights, in particular the UN Guiding Principles on Business and Human Rights, and to making sure that all of the equities and rights enumerated in the implementing treaties are ones companies are thinking about in terms of how their products might have an impact on them.

     So again, first it's the policy and having the commitment, and that should manifest in two ways for companies.  First, it has to be internal-facing: your commitment to human rights as a company needs to ensure that everyone who works there has some sort of baseline understanding.  They don't need to be experts in human rights or fundamental rights or the intricacies of international human rights law, but they should understand the top lines of what human rights are and that being at a business still means you have an obligation to respect rights.  That's more or less enough, but everyone at companies needs to understand it, and so companies need to do things like corporate training; it's really in-the-weeds things like that that companies need to do.

     And then it's embedding rights considerations into processes, so that the way we think about rights scales across a company.  There are, I think, maybe 180,000 people who work at Google now; I know Microsoft is large, Amazon is large, Meta is large.  Whatever ways we're thinking about rights, we need to be implementing them inside our company in a way that makes sense for how we are developing the technology, reviewing impacts, and deploying the technology. 

     That's in terms of process, and then across the variety of functions.  That means our engineers need to be thinking about what their role is; trust and safety is, of course, another part of an organization that needs to be focused on how rights are embedded across the potential harms we might be seeing; and, obviously, other folks like public policy need to make sure that the company is showing up to the conversations in ways that are raising rights-related issues. 

     And that transitions me to what else companies should do.  In our external engagements, we're focusing on centering rights and manifesting our human rights commitments in that work, and part of that is how we show up in a public policy setting: how we're having conversations with governments about the challenges we're facing and the ways they're thinking about rights-based issues, and making sure we're in dialogue, informing stakeholders about what our technology is and the ways in which we should be thinking about mitigating potential harms.

     And related to that, the external piece is also about, and this is part of being committed to the UNGPs, engagement with stakeholders and partnerships across industry.

     So I think consultation with stakeholders and engaging stakeholders and experts throughout the product lifecycle is something that needs to be happening across all of the technology we're developing, certainly in the AI space, but it's something that should happen for anything in the future as well. 

     Examples of where good work is being done and where Google is showing up, and we're happy to see others in industry showing up as well: the work that the B-Tech project is doing; the work that happens in the GNI, the Global Network Initiative, with civil society there; and the work of the Freedom Online Coalition, whose advisory network is a place where Google shows up and has conversations.  We're happy that the FOC is a place where we are making statements and engaging with governments on human rights in particular, engaging with the NGOs in this space, and then, of course, engaging with the standards bodies too. 

     So that's just to highlight some of the places where I think really important conversations about the focus on rights are being driven and where companies are showing up to the table. 

     I have so many things; like I said, I think it's my favorite topic.  Maybe just to touch a little bit on what FOC governments can do. 

     The FOC is made up of governments that are, and want to be seen as, the leaders in respecting rights in the technology sphere, and what they can do is embody that, be that.  Make sure laws are centering human rights when you're seeking to regulate AI or other technologies or other aspects of technology.  Make sure that when you're showing up to multilateral fora, these are issues that you are raising.

     And then also, I think, be a champion of human rights.  Be a champion of the multistakeholder model: create space for companies and civil society to show up at the table, and support civil society in particular in enabling them to do that.  Kudos for raising that already. 

     Well, I think that's my last one.  I'll stop there for now, but I look forward to the rest of the conversation. 

>> MODERATOR:  Thank you, thank you so much, and this is actually just the start of a conversation, because now we can open the floor to our audience, and I'm sure the presentations have already generated a lot of interest and perhaps a lot of questions from the people here at the IGF. 

Alex, you mentioned the necessity of constant consultation with all the stakeholders throughout the process, and I think this discussion is a nice little miniature of such a multistakeholder consultation.  I hope we are all taking note of the needs of each and every one of those sectors, so that we can then implement them. 

     So, yes, please: I see the gentleman, and then the lady over there.  We're bringing the mics; if you could please use the mics.  Thank you. 

>> AUDIENCE:  Yes, thank you very much.  I'm Alexander, from a country which is not a member of the Freedom Online Coalition, and I have a question related to what I heard. 

     I'm a member of civil society, and my government doesn't support this part of civil society, so I definitely have no chance to go to the ITU, for example, to participate, and so on, because    by some huge corporations completely is one thing.  And the approach is very good; many countries and governments are raising these questions  

(Garbled.)

>> AUDIENCE:  And everyone in your country as far as    Russian citizens is  

(Inaudible.)

>> AUDIENCE:  Human rights    governments    not looking at whether it is legal.  The GDPR doesn't touch such AI unless it enters the European Union, and either way the GDPR doesn't touch any government activities. 

     So talking about this is interesting, and it's really important.  But there are also loopholes in existence in legacy  

(Garbled.)

>> AUDIENCE:  Governments themselves and also the quartet and tech development    yes, I know    like    approach something like that but    anyway, thank you very much. 

>> MODERATOR:  Thank you.  Who would like to take this very easy question, I would say?  Maybe, you know, the United Nations, because there is something that binds us all.  Right, you don't have the EU frameworks everywhere, and every country has its own, but I think what binds us together is international human rights law, so we're super happy to have an expert on this panel, if you would like to take this one. 

>> ISABEL EBERT:  I think I already partly answered the question.  What is interesting, just to come back to the UN Guiding Principles: they say the corporate responsibility to respect human rights exists independently of whether or not there is a legally binding framework in the country where the business operates.  The UN Guiding Principles come out of the situation that emerged when globalization really kicked off and the fact that many companies globally are more powerful than some states might be, or that some states might be unwilling or unable to protect human rights.  In that sense, this session is very much about tech governance and rights-respecting conduct in developing these technologies.  The answer in terms of the human rights framework is there, and it's there in terms of the corporate responsibility to respect human rights, independently of whether legally binding standards exist in the country or not. 

>> MODERATOR:  Thank you, thank you so much, and I think, yes, we had the lady. 

>> AUDIENCE:  Yes, my name is Lena, and I'm with the Council on Tech and Social Cohesion and Search for Common Ground.  I very much appreciate Google's engagement on a number of things: multistakeholder involvement in design and respecting human rights.  One of the most interesting new technologies is generative AI and chatbots, so I'd love to ask the question: what went wrong with Character.AI, that somehow we were not able to put these frameworks and a commitment to multistakeholder prevention of harms in place before releasing these chatbots on young people, which has led to suicide and now, unfortunately, court cases against Google and others?  I'd just love to take a current news headline and ask: what's gone wrong, and what could we learn from that? 

>> MODERATOR:  Thank you, I think that question was directly for Alex.  Alex, that's the beauty of being at the IGF for this kind of conversation. 

>> I love it. 

>> ALEX WALDEN:  I think ultimately this technology is new, lots of things are being piloted, and a challenge for us as companies, and for all of the stakeholders with us, is what's needed to move at a faster pace.  That's just something that is happening.  But it also means that despite all of the work that's being done to protect against any harms, all of the evaluations that are being done, all of the guardrails that are created, sometimes things don't work out in the way that they manifested when we tested them.  This has happened a few times, where we released something, saw challenges, and had to pull it back for improvement before we rereleased it.  So it's sort of a process: there are lots of ways in which we're seeking to identify potential harms and mitigate against them, but there will be errors, and those are things that we're going to have to be sure we're quick in responding to as well. 

>> MODERATOR:  Thank you, thank you so much, and I was already reminded that we have time for one last question, and then we will go into the wrap up, so please. 

>> AUDIENCE:  Just very quickly, thank you very much for the session.  I just wanted to add, on innovation for others, as we're talking about rights    we're talking obviously about AI as well    one example  

(Garbled.)

>> AUDIENCE:  Acted on this


(Poor audio quality)

>> AUDIENCE:  That has established the baseline finding a   

(Inaudible.)

>> AUDIENCE:  On democracy and all other aspects of AI, and now it's going into implementation, and that connects with the standardization work, because they are looking into ways of embedding the guidance they are developing in the council, which has basically all the stakeholders  

(Inaudible.)

>> AUDIENCE:  Just on a positive note: this can be done. 

>> MODERATOR:  Thank you very much.  I'd just like to sum up a few takeaways before I give the panelists the chance for a last word. 

     The Global Digital Compact has particularly strong language on human rights.  It was mentioned, and this should be seen as an opportunity: an opportunity now to do even more than before on human rights-based approaches to technologies, across their whole lifecycle, including standards setting.  We'll continue to rely on the good guidance of the United Nations human rights office; when we are in doubt, as governments, as the private sector, as organizations dealing with tech, we can always call on them.  They were tasked to provide advisory services, so there will be huge help in that regard. 

     We were also reminded of the need for a multidisciplinary, all-inclusive approach when dealing with these technologies from the get-go, from the very design phase, because afterwards it is already too late to try to mend something that was broken in the first place.  For this we need to have civil society at the table, together with the technical community, the private sector and governments, and the OHCHR advisors and other human rights experts. 

     We were also reminded that civil society faces a lot of impediments: the processes for them to participate, the capacity these processes demand, the gatekeepers.  Sometimes decisions tend to be made outside the deciding room, so once civil society actually gets in, everything has already been decided, and we need to work on that.

     And there was also the note of optimism that we need to think of the benefits of these technologies, which can really help and empower civil society and bring additional voices online. 

     Now, if I can turn back to our panelists: if you would like to say, in half a minute each, the most important message you would like our audience to take back home from this panel, what would it be? 

>> Yeah, next year is a very crucial year for internet governance: to champion the multistakeholder model and to take a strong message from Riyadh to Oslo.  The international human rights framework is fit for purpose, and we should continue to find creative solutions that mitigate risks for people, and we're very happy to explore those together with all stakeholders. 

>> Human rights at the design phase: ensure your products, services and policies are developed through a human rights test, and engage the experts to assess what the impacts would be and make sure that is considered.  And to industry: as you consider things, consider the human rights.  It's not ready until you've done so. 

>> Standardization is not a commodity process.  If we want to make sure we get human rights in our standards, we need to include human rights experts, like everyone else in the multistakeholder community, at the very beginning and all the way through to the end of the process as equal participants. 

>> Thank you so much.  For civil society again, it's a way to interpret  

(Inaudible.)

>> ALEX WALDEN:  Like everything, what I was thinking was already taken.  Maybe just to reinforce: it's an important year ahead, and the GDC already does reinforce the importance of human rights, and for us to focus on the importance of rights in all the ways we're thinking about implementation going forward, and in particular for companies to show up and keep integrating that into how we engage. 

>> MODERATOR:  Thank you so much, and I think you all deserve a round of applause.  Thank you.  And thank you so much to the audience as well. 

(Applause.)