IGF 2024 - Day 2 - Workshop Room 10 - OF 22 Citizen Data to Advance Human Rights and Inclusion in the Di-- RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> PAPA SECK: Good evening, I'm Papa Seck from UN Women, wearing another hat as a member of the steering committee of the Collaborative on Citizen Data. The collaborative was established last year to foster partnerships, build capacity and promote citizen data for inclusive and sustainable development. UN Women was the chair, together with the UN (?) Division. We are really happy to organise this session today.

     Colleagues, in the digital era the participation of citizens in data driven processes is essential for fostering inclusive, equitable digital environments that can serve the diverse needs of all communities and community members.

     Since Sunday, Day 0 of this forum, we have heard repeatedly why the data that feeds AI algorithms, for example, needs to be representative but also needs to be scrutinized, and this absolutely necessitates citizens' inclusion and engagement. The newly adopted Global Digital Compact has inclusivity as a cornerstone for a fair and equitable future. Meaningfully engaging all citizens in data production and use is important to ensure digital systems and policies reflect their unique experiences and priorities, and this paves the way, of course, for more inclusive digital transformation.

     To put that another way, there will be no transformation without inclusion and diversity. This panel today really reflects that. We have distinguished speakers from Civil Society, a Human Rights institute, a National Statistical Office and the private sector who will share their experiences on the ways the citizen data movement can foster inclusion and human rights in public services and policies in the digital age.

     The session will also explore how the recently launched UN Collaborative on Citizen Data and the Copenhagen Framework can support communities in this endeavor.

     We have five distinguished speakers today. We have Line Gamruth Rasmussen at the Danish institute; Bonita Nyamwire, the research director at Pollicy; and online again we have Elizabeth Lockwood, the representative at the UN of the stakeholder group of persons with disabilities for sustainable development; Dr. Hem Raj Regmi of the national statistical office of Nepal; and last but not least Mr. Joseph Hassine of AI for Social Good at Google.org.

     Let me start with    we will start with one question for each of the panelists. What I ask is: can you briefly describe your experience on citizen data, Human Rights and digital transformation? How do these come together? It will be the same question for everyone, and I kindly ask you to stick to the allocated time of two minutes. So Line, let's start with you. Can you hear me? While we sort that out, Bonita, over to you.

>> BONITA NYAMWIRE: Thank you. At Pollicy, where I work, we are in Uganda. Our team is spread across the African continent, so our work is deeply rooted in empowering citizens through data, and we leverage citizen generated data to advocate for digital services, inclusivity and accountability. And by integrating data, we explore Human Rights and digital transformation and ensure women and girls are included in the processes. So our work mostly focuses on improving digital technology services for women and girls across the African continent.

     A significant focus of ours has been ensuring digital transformation does not exacerbate inequalities, especially gender inequalities, but rather creates opportunities for more inclusive and equitable societies. This therefore includes extensive research we have done as Pollicy in various African countries on issues like technology facilitated gender based violence and the need for safer digital systems, and other research on citizen engagement in data governance processes, as well as the application of gender data in data governance. Thank you, Papa. Back to you.

>> PAPA SECK: Great. Thank you, Bonita. Line, can we try again? Were you able to unmute? We still cannot hear you, but let me see if we can fix it with the technician in the room.

     So in the meantime, let me go to you, Elizabeth. Elizabeth, are you speaking?

>> ELIZABETH LOCKWOOD: I can unmute now. Apologies. Thank you so much, Papa. Along with Papa, I'm one of the steering committee members of the Collaborative on Citizen Data, so I will speak to that and to my other role. One of the collaborative's key outputs is the Copenhagen Framework, which promotes citizen participation across the entire data chain. It outlines principles for citizen data and the necessary enabling environment.

     A central objective of the framework is to integrate citizen perspectives into broader discussions within the national data ecosystem, including topics such as digital transformation, artificial intelligence and governance. This approach is essential to ensuring open, transparent, inclusive, participatory, ethical and other approaches.

     As the stakeholder group of persons with disabilities, we really focus on data led by organisations of persons with disabilities, citizen data in particular, to fill critical data gaps, to provide evidence to influence policies and to ensure data reflect reality. This includes data to address the increasing digital divide that disproportionately affects persons with disabilities, 80% of whom live in the global south. Of that group, 90% do not have sufficient access to the assistive technologies that they require. I'll talk about that a bit more. Thank you, Papa.

>> PAPA SECK: Great. Thank you, Elizabeth. Line, can you try again?

>> LINE GAMRUTH RASMUSSEN: Hope you can hear me. Thank you for waiting. So working in this space of technology and Human Rights, there is always this dual relationship where tech creates wonderful opportunities for promoting and protecting Human Rights but can also cause serious harm to human rights. At the Danish institute we take a human rights based approach, so we work on the responsibilities of states and businesses to use and deploy technology in a way that is human rights compliant. And we should not forget that we ourselves, as human rights professionals, have to think about how we use technology in a way that is not causing harm to human rights. That is why we are working with this human rights impact assessment methodology, where you actually try to assess the risks and the impacts that the technology will have on the users involved in your projects. That is why we made a few toolkits on assessment in the digital space, and also a tool called the Digital Rights Check, which I will talk a little more about later. Thank you.

>> PAPA SECK: Great. Over to you, Dr. Hem.

>> HEM RAJ REGMI: Thank you, Chair, distinguished delegates, ladies and gentlemen, good evening to everyone. Yes, we all know that digital data are required by almost everyone, from individuals to governments, policymakers, decisionmakers, the (?) and business people. We know these data don't come automatically. We need to invest in data.

     Some data can be produced with a little less resources, like administrative data or MIS, Management Information System, but other data systems are quite costly, particularly censuses or surveys or studies. Even governments like Nepal's are not able to find sufficient funding to produce a sufficient amount of data to monitor plans and the international (?) agreed by governments. So we are looking for alternative sources that may be a little more cost effective, and maybe more reliable and representative. There are different types. For example, Big Data, maybe supported by AI and computing capabilities. But our capacity to manage this Big Data is limited. That is why the best alternative, we assume, may be citizen data.

     Within these constraints, citizen data may be the best alternative for data production, not only for the government but for the good of society. That is why we have collaborated with the UN Collaborative on Citizen Data to focus on producing some data. Not all, maybe some data, particularly on the social side. For example, violence    household level violence and gender based violence, which are quite common    and to report that type of information.

     Maybe also disaster related data. Nepal is prone to many types of disasters. If we can report these data, and the issues you are discussing, like Human Rights violations, these data may be invaluable for governments and help provide (?) for those left behind as part of the theme of digital. That is why we are collaborating with the collaborative, trying to work with the Copenhagen framework and the statistics office of Nepal    yes, I can hold it there, Papa.

>> PAPA SECK: Doctor, thank you. We are a little limited in time, but I will come back to you with a second question. Last but not least, over to you, Joseph.

>> JOSEPH HASSINE: Thank you. I work at a philanthropy where I fund organisations and Civil Society to leverage AI and technology towards digital transformation and more positive societal outcomes. Specifically, my team funds nonprofits building tools to support diagnostics, education for teachers, or food security, for example predicting famine to inform response. All this requires data    data that is accurate, accessible and inclusive    and we encourage a more open data ecosystem as a whole, such as through our work with UN Statistics to build the UN Data Commons, using Google's Data Commons, an open source platform for understanding vast data with natural language functionality.

     These efforts are towards that goal of a more open data ecosystem that provides a clearer and more accurate understanding of the world and the challenges we are facing. So I'm looking forward to talking more about this work and how it connects to the work some others have mentioned by the Collaborative on Citizen Data as well.

>> PAPA SECK: Thank you. All of you have highlighted one area each as examples of why the work on citizen data is so rich. So I have specific questions for each of you. In five minutes, Line, can you tell us a little more    go deeper into what you have shared with us.

     You have touched upon some of the tools that the Danish institute offers to protect Human Rights and mitigate Human Rights risk. In your context, can these tools be used to assess whether data collection    whether by apps or development platforms that showcase data    is at risk of violating Human Rights? Over to you.

>> LINE GAMRUTH RASMUSSEN: Yes, indeed. We developed these tools. I shared a link to what we call the Digital Rights Check. It is something we developed together with GIZ to have a tool for digital for development projects, where the staff involved could have a risk management guide for what the Human Rights risks or impacts of a particular project will be. So the tool gives entry points. You can enter as technical development cooperation, as an investor or as others, and it is an online questionnaire that helps you identify these potential issues and some corresponding actions that can help you rectify or mitigate the risks that you identify.

     So it will help users consider technology specific risks and application specific risks    what kind of technology are you using, is it AI, is it cloud services, et cetera    and then also the context specific risks as they relate to, for example, data protection regulation in the context or country where you are deploying this technology.

     It also, as a human-rights-based tool, pays attention to vulnerable and marginalized groups and makes you consider accessibility issues, so it really takes you through the whole process of identifying the different risks, and also makes you consider stakeholder engagement and how you think about transparency and accountability.

     It also gives you case studies and further readings that you can link to. Then at the end, you get a final results page with the risks identified in a sort of Human Rights action plan that you can use to follow up on and really act on the risks you have identified.

     This is an Open Source tool that is open to everybody, so if anybody would like to adapt it to their own context or own projects, that is perfectly feasible. In the true spirit of privacy by design, the data is also not stored, so it will be deleted as soon as you finish the questionnaire.

     So I would welcome you to explore and give us feedback if there is anything you would like to see in the tool and hope it can be useful for many of your projects.

>> PAPA SECK: Thank you very much, and thank you for sharing the tool. I think the tool is really an excellent product, so I encourage all of you to explore it. Now over to you, Bonita. You have extensively researched the intersection of women, girls and technology. Could you share insights on involving women and girls in data practices? In what ways can they be meaningfully involved?

>> BONITA NYAMWIRE: Thank you, Papa. Involving women and girls is essential because their perspectives, needs and experiences are oftentimes overlooked in decision making, yet they are very important. As research we have conducted at Pollicy with women and girls shows, for instance, they are disproportionately affected by issues like online gender based violence, or technology facilitated GBV, data privacy breaches and other kinds of online injustices and discrimination. We have done this research looking at women in politics, women in the media and women Human Rights defenders across several countries on the African continent.

     So why do they need to be involved in these processes? One, involving them helps identify and address systemic biases, ensuring fair policies. Including their perspectives also ensures that their unique challenges, such as I have mentioned    TFGBV, algorithmic discrimination    are not overlooked and that safeguards are in place for them. Involving women and girls also ensures fair representation in the digital age, because as data becomes central to governance and development, excluding women and girls perpetuates these inequalities.

This is vital for achieving gender parity in leadership and decision making roles and the goals to achieve them. So how do we involve women and girls in a meaningful way? Meaningful involvement, especially for women and girls, can take various forms, ranging from involving them in participatory workshops to co-design data governance, to engaging stakeholders on better governance policies, ensuring that solutions are truly reflective of their realities. We have done this as Pollicy with women politicians and with women in the media on our different programmes.

We have a programme for women politicians called Vote Women that we have implemented in Uganda, Tanzania and Senegal, and we have seen this involvement help to    you know, protect them online but also improve their well being in digital spaces. We also have another programme, which is still running, which is Work For Women in the media, where we are building their resilience on online platforms but also ensuring their voices are amplified.

     Then the other aspect of meaningful involvement of women is the need to invest in digital literacy programmes to improve skills. Research has shown that women lack digital skills, so we involve them through capacity building on digital literacy in these programmes. The other is to make sure representation on data policy boards and in leadership and decision making bodies is also improved, which will further ensure that they are key stakeholders in decision making processes, whether at the community, national or global level. So it is important that women and girls are also involved in the decision making processes at different levels.

     Then the other is to create safe spaces where their voices can influence both national and global data governance policies. Some of these spaces are here where we are at the IGF, and several others. We have seen the involvement of women in these spaces improve their participation and amplify their voices in data governance.

     Meaningful involvement requires removing barriers to participation by applying gender and intersectionality lenses at every stage of policy development and digital transformation.

     Very important is this point on embedding an intersectional lens at every stage of the data governance process so that no one is left behind. Then lastly, it is also important to foster collaboration and accountability in the tech ecosystem, to prioritise the needs and rights of women and girls.

     So this will involve collaboration with women and girls' rights networks and organisations, as well as government departments that work on gender issues. Thank you very much.

>> PAPA SECK: Great. Thank you very much, Bonita. UN Women, together with several agencies, was recently tasked by the Statistical Commission with developing a new framework for    (?). I'm increasingly convinced citizen data has to be central to this, because it takes so many different forms. I think at this IGF we have heard several examples of that, where measurement becomes really tricky if you don't have inclusion. So thank you very much. I will definitely be looking at some of the tools    some of the work you have done in this area, because I think this, again, has to be part of our global efforts to develop meaningful measurement of this phenomenon.

     So Elizabeth, we turn to you. With your work on disability statistics, could you give us an example of how citizen data can help ensure that digital tools are inclusive?

>> ELIZABETH LOCKWOOD: Yes, thank you, Papa. I have three brief examples from partners. First, during the Covid pandemic we had little data on persons with disabilities and their experiences globally. As a result, NGOs and organisations of persons with disabilities gathered data using citizen data approaches to understand the barriers and solutions for persons with disabilities.

     The findings the stakeholder group of persons with disabilities collected indicated that persons with disabilities faced barriers in accessing digital technology in many vital areas. This was critical for their survival, in many cases actually. The barriers included lack of access to fast Internet connections, lack of financial means to purchase data packages for devices, and lack of captions and sign language interpretation for those daily news briefings that we had.

     What happened, since governments really weren't supporting this, is that organisations of persons with disabilities came in: they supported their members, they shared the information and they advocated to their governments.

     In the case of deaf organisations, captions and national sign language were added in many cases, and this still continues today in emergency settings.

     This isn't universal and there are still many hurdles to jump, but it is important to recognise. Another interesting thing is that digital platforms such as Zoom increased accessibility for persons with disabilities, but when the pandemic lessened, this actually got worse, because it had been a priority for the general population. So it is an interesting point to add.

     My second example is a large scale one. It is called the Digital Accessibility Rights Evaluation Index. It is a benchmarking tool developed by the Global Initiative for Inclusive ICTs for advocates, governments, Civil Society and others to track ICT accessibility in different countries around the world.

     The data collection is based on a set of questionnaires, done in cooperation with Disabled Peoples' International and other organisations of persons with disabilities. It has covered 137 countries, in eight regions of the world, representing 90% of the world's population. Findings were published in 2019 and, most recently, 2020. If you look at the index score    a very nice platform online    you can see global and regional rankings, development peer group rankings and implementation rankings.

     For many countries, you can compare 2018 to 2020. Most countries have improved, but not all. So that is something I really recommend you look at. Then my final example is from the European Blind Union, which has done significant advocacy around accessible voting. Feedback was collected from blind and partially sighted individuals on the barriers they face using digital voting systems and election materials, and this turned into advocacy campaigns. As a result, this advocacy has led to improved compatibility with screen readers and other enhancements in the European region. In closing, it is only by ensuring organisations of persons with disabilities are leading and co-leading these initiatives that digital tools will be inclusive, reflecting the reality and needs of the communities themselves. Thank you.

>> PAPA SECK: Great. Thank you very much, Elizabeth. Great work. I have followed this, and I think you and I have also had conversations on some of the work we are doing, because I think in the space of statistics there's still more that can be done, particularly on disability. We have had several of these conversations. I think the collaborative is definitely well placed to add to    enrich that work.

     So Dr. Hem, over to you. You are considering, as you mentioned, the use of digital tools to collect data on gender based violence, as an example. Could you please let us know how you plan to engage with communities to ensure the tools are inclusive? But also, you know, I would add that in this space there are obviously lots of ethical concerns when it comes to the collection of data on gender based violence. How do you aim to address those concerns?

>> HEM RAJ REGMI: Thank you, Papa and group. Statistics in most countries have been governed by the fundamental principles of official statistics for many years; the statistical rules and regulations are largely based on those principles, which were promoted by the UN some 20, 30 years ago. Now the situation is changing, and we need the statistical system to include other dimensions, particularly citizen data, for example. Luckily, in Nepal we promulgated a new statistics act in 2022, almost two years ago. There are a few provisions where we can use such alternative data sources    not mainstreamed, but as an alternative to what we have right now from the different censuses and surveys.

     There are provisions for data like the (?) system, which is almost similar to the framework, where different modalities are suggested to produce the (?)    by CSOs, by Civil Society in collaboration with the NSO, or led by the NSO. Primarily these frameworks are there. Something similar    not exactly, but close    was promulgated in Nepal in 2022, and we plan to use those modalities in the future. For example, data on marginalized people, or data for sectors where a data gap exists. If a data gap exists, maybe at the national level we are not able to produce data at that level for those marginalized people we are discussing.

     For those areas that are quite remote, for that particular segment of society which is marginalized, we plan to use this approach to produce data in different sectors. We also held a national level workshop to implement citizen data, and we have decided to (?) as I mentioned. One pilot is on violence, particularly gender based violence. The other one is on disasters, particularly the impact of disasters    not only the risks, but when a disaster occurs, the impact on the livelihoods and lives of the people. This is like piloting the framework in Nepal. If confidence in the reliability of the data can be built with these types of activities, we hope in the future we can take citizen data as an alternative, or maybe a major part of the official system, just as the censuses and MIS are major parts of the system right now. Citizen data may be another source for the data systems, which can help us particularly for marginalized people, for women, children, displaced people and people affected by disasters. Yes, that is the plan.

>> PAPA SECK: Great. Thank you very much. I really look forward to seeing this work. I think one    maybe one piece of advice I can also offer is that there is a lot of work on citizen data and violence that's been done in Ghana, in particular. I would definitely advise you to also, you know, talk to them and also learn about their experiences. I would be happy to make the connection.

     Joseph, again, you know, all our gratitude for the strong support to the work of the Collaborative on Citizen Data. And my question to you is: what motivated you to invest in this space? What challenges in the context of innovation do you hope to address through your investments in citizen data?

>> JOSEPH HASSINE: Thank you. We are grateful to be part of this work in a small way. Google.org's work has focused on two things: data that informs better policy or decision making, or data that feeds more accurate and inclusive AI tooling    sometimes both. The funds may be used to build a new data analysis platform, create advocacy tools to communicate with policymakers on critical issues, or collect data in different languages and contexts in order to improve the accessibility of AI models.

     Citizen generated data, in particular, is foundational to all of these efforts because, at the end of the day, it gives a more inclusive picture of the world around us, critical for any tooling built on that data. At the same time, the data is only helpful if it informs action through a new understanding of a community or problem.

     In order to make data useful, from my perspective, some amount of validation and standardization is critical to ensure that these individual, small or large scale citizen data efforts are seen as trustworthy, usable and ultimately able to spur positive change.

     I think it is not dissimilar to the Digital Rights Check, for example, that a colleague mentioned: needing kind of an expert in the space to create standards others can follow when building, so we can look at data produced by these entities and know it meets some amount of benchmarks.

     So that type of work is part of what makes us excited to support the collaborative on citizen data, because I think the collaborative is filling that critical role through the Copenhagen framework, through pilot efforts, to ensure there is reliability in the citizen data ecosystem that continues to grow.

     So we view that central leadership as critical to building capacity and creating standards in the space. We are thrilled to support the collaborative and to continue to build on that.

>> PAPA SECK: Great. Thank you very much. Again, thanks for your strong support. So we still have some time. I would like to now open up for questions, both from those in the room and those online. Yes, please. Please introduce yourself.

>> Hello, everyone. I'm Dena. I'm from Brazil. First of all, thank you for this amazing meeting and the information shared. My question is: how can these Human Rights initiatives also include children and older adults? Because women and people with disabilities were mentioned, but I would like to also know about initiatives for these communities, and it would be amazing if you could also mention inspiring examples regarding them. Thank you.

>> PAPA SECK: Thank you. Yes, please.

>> I also had a question. Thank you so much for that discussion. That was actually really helpful. So I had a slightly different question. In a sense you are talking about having high quality, good data sets for better policy making and citizens' good. But there is a lot of harm I think you can do. When you have these data sets and are sharing them with the aim of doing public good    if you are sharing them widely, or if they are anywhere that is accessible    there is nothing to stop a company or any other actor from using this data to exploit certain kinds of societal problems, or any sort of divide, to make things worse. You already have political consultancies when it comes to elections. So sharing public data, yes, it could lead to public good, but I think there is potential for public harm, especially with so many AI systems deployed. How would you prevent this and ensure this data collection is only used for good?

>> PAPA SECK: Thank you. Do we have any questions online?

>> Not at this moment.

>> PAPA SECK: Great, thanks. Does any of you want to take the    do any of you want to take the questions that were asked?

>> BONITA NYAMWIRE: I can go.

>> PAPA SECK: Bonita, go ahead.

>> BONITA NYAMWIRE: I can go on the children one. So I know that one; that is why I talked about using the intersectional lens in data governance, because when you use the intersectional lens, you are able to see who has been left out and who has been included, even within the different categories    because if you focus on children, they have different categories. This applies to women and girls and also the older adults you talked about.

     I know there are some organisations doing work on children online. I know that Plan International is doing work on that; they have research on gender    cyber bullying online for young girls. I know UNICEF is also doing a lot of work on that. I know that ChildFund too    the different offices of ChildFund globally are also doing work around Children's Rights online. Yes, thank you.

>> PAPA SECK: Thank you. Anyone else for the second question from the floor?

>> ELIZABETH LOCKWOOD: I can go ahead. This is Elizabeth.

>> PAPA SECK: Go ahead.

>> ELIZABETH LOCKWOOD: We have human rights safeguards as a core theme of the framework, so that is something to apply to address the very good question that was asked in the audience. I also think it is important that data need to be confidential and protected, but in other cases also open and accessible, so I think we need that balance. It is really important to look at that balance and to monitor and retain it. Thank you.

>> PAPA SECK: Thank you. Dena, to add to your question: as part of the collaborative, we have various organisations working on different issues. So obviously gender    women and girls    is an important dimension but not the only one. Other organisations are working on different dimensions, and that is what makes the collaborative really rich.

>> JOSEPH HASSINE: Papa, if I may.

>> PAPA SECK: Go ahead, Joseph.

>> JOSEPH HASSINE: To the second part of the question, there are organisations doing interesting work ensuring that data collection efforts are equitable and fair. In particular areas that I have seen, such as indigenous languages, for example: these are areas where indigenous communities can benefit from leveraging AI tools, but the tools are often not available in native languages. At the same time, AI could be a tool for preserving native languages, many of which are unfortunately becoming extinct in the U.S. and elsewhere. Of course, that has meaningful risks associated with ownership of that data and whether these communities want their data ingested by systems.

     So I think these questions take place at a large scale, but also at a more issue specific scale, where it is: how do we make the data appropriate and safe in this particular instance? I have spoken to organisations leading efforts on, if you are collecting indigenous data, here is a charter and constitution for how that data can be used, how communities should be compensated and included in the process, and what ownership they should have of the data moving forward. I think work like that is really critical to that second question of how we avoid some of the risks here.

>> PAPA SECK: Great, thank you very much, Joseph. We have time for just one final lightning round for all the panelists. In no more than a minute, what would you advise the collaborative to do in its work in the digital space? A piece of advice we can take into consideration. So let's start with Line.

>> LINE GAMRUTH RASMUSSEN: Yes, one piece of advice. That is hard. One thing we have to realise is that we cannot just consider the end result. We might have great citizen data that is inclusive and that really reflects the society we are in, but we also have to get it in a way where the process is human rights based as well. That means we have to think about the human rights principles of non-discrimination, humanity and legality. This is ongoing, not something you do once, sort it, and then the rest is fine for the next five years. Keep doing it, and keep assessing who might be harmed by the products or services you are using and how you are using them. Then one final thing is maybe also to consider offline alternatives, which may be the only option for some people to participate and be empowered by this. Even if we have universal access, even if it is affordable, there will be people we cannot reach online.

     So we have to think about and be serious about offline alternatives, thank you.

>> PAPA SECK: Yes, that point is quite central, I think, to all the discussions, in order to make sure we don't leave anyone behind. Bonita, can I go to you?

>> BONITA NYAMWIRE: Thank you, Papa. As for me, my one piece of advice would be that all the key stakeholders doing work in the digital ecosystem, looking at all these issues, should not work in silos. They should work together, because you find government doing one thing while civil society and the private sector are doing the same thing. Instead, they should all work together, leveraging each other's efforts and the structures and systems that each of those stakeholders has, to address some of these issues.

     For instance, take the safety issues that one of the participants asked about. It would mean looking at what government, the tech companies and civil society each have, and working together to sort out most of these issues, and also to ensure that we do not leave anyone behind. Because working in silos, we may forget some people, but working together we will not leave anyone behind, across all the categories that have been talked about: the women, the girls, the children, the adults, the persons with disabilities, thank you.

>> PAPA SECK: Great, thank you very much. Elizabeth, over to you.

>> ELIZABETH LOCKWOOD: Thank you, Papa. I think the collaborative needs to be in the conversation. We need to be part of this work, meaningfully part of this work. One way is to engage in the monitoring and implementation of the Global Digital Compact; I think that is a very good way we can really be part of this as a collaborative. Then, strengthen capacity and invest in inclusive citizen data and participation in the digital space, especially for the marginalized groups we have been talking about.

     I also echo that we should work cohesively and collaboratively instead of in silos, thank you.

>> PAPA SECK: Great, thank you very much. Dr. Hem.

>> HEM RAJ REGMI: Thank you, Papa. Yes, in my understanding, citizen data is not going to replace the (?) immediately, but in future we can think about it. So the focus should be on those areas where there is a data gap.

     Since Covid, the data collection system has already changed. We have already moved from a paper-based to a computer-based approach, and (?) has become quite common. We can align with the digital world. My idea is to start from a small area, primarily a municipality, with a small team, maybe a district or at most a province. Let's not focus at the national level; citizen-generated data can fill the gap. Let's start from municipalities, from marginalized areas, from marginalized communities, from particular societal segments, (?). Then we can integrate it into the standard system, and it will be available digitally and in different forums. That is my opinion.

>> PAPA SECK: Great, thank you very much, Dr. Hem. Joseph, over to you.

>> JOSEPH HASSINE: Thank you. I view this through the lens of how we get more resources to the collaborative. To that end, the work done already to pilot some of these efforts is critical. A critical next step, one we often miss in the data space, is capturing the impact that work is having and telling that story. We will ultimately need those through lines: what data an organisation created, what decision that led to, and what impact it ultimately had on a community of people. The better we get at capturing and telling those stories, the more we will be able to find support for this work and continue to scale it.

>> PAPA SECK: Great. Thank you very much. This wasn't part of the plan, but I will exercise my prerogative as moderator. I would like to put on the spot Ms. Chen from the UN Statistics Division, who has been at the forefront of this work, including driving this session. I don't think we can close without giving you the floor. Over to you.

>> Thank you for putting me on the spot. First of all, thank you, Papa, for being such a great moderator, and thanks to all the speakers. Great to see you over there, Bonita. Without all your help, this would not have been possible. Thanks to Francesca and her team for making it happen, including the interpretation. Of course, the collaborative is a collaborative: we all work together, so I am really grateful for all the great work we have done together. Thank you, Papa, for co-leading the collaborative with us. We look forward to continuing the conversation.

>> PAPA SECK: Thank you. Thanks to all of you for a rich conversation. There are many takeaways. I won't summarize them all, but for me some of the key points really centred on meaningful citizen participation across the data value chain, and on ensuring digital systems and policies address the diverse needs of marginalized and underrepresented communities.

     We also need to foster inclusivity and equity in the digital era. Citizen data initiatives, as we have heard, such as the Copenhagen Framework, are vital for integrating citizens' perspectives into digital transformation. We also need them for promoting transparency and safeguarding human rights in data governance processes.

     We have heard and seen how partnerships between civil society, national statistical offices, human rights institutions, academia and the private sector can really help to amplify the effectiveness of citizen data in creating innovative policies and solutions.

     And here, again, given the richness of the work in this area, I think the sky is really the limit. In terms of action points, I noted three. One: promote and encourage the adoption and implementation of the Copenhagen Framework on Citizen Data. Two: ensure the meaningful participation of marginalized communities in data governance.

     We need to develop and promote tools such as those highlighted here today, and include all the relevant stakeholders to ensure compliance with ethical and human rights standards, as well as with data standards.

     Three: strengthen capacity and increase investment in inclusive citizen participation in digital spaces, ensuring that marginalized communities and population groups are included. So with that we'll close here. Thank you very much to all of you for a great session.