The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> AHMAD BHINDER: Okay. I'll start.
>> Test test.
>> AHMAD BHINDER: Can you hear me well? Just want to check the system.
Can you hear me through your headphones well?
Perfect.
We'll start in a minute. We're just sorting out some technicalities.
It is channel 3. Channel 3 to hear the audio for the speakers. So the tech is all sorted, hopefully. My name is Ahmad Bhinder, Policy Innovation Director at the Digital Cooperation Organization.
And we are gathered here to discuss the very critical topic of protecting children online, or in digital spaces.
I see some problems with people not being able to hear me. My audio is on channel 3. May I confirm if people can hear me? Okay. Excellent.
We are here to discuss a very critical topic that is very relevant to the digital economy and the evolving digital landscape. And today we have a very senior, diverse and expert panel of speakers. Unfortunately our moderator is on his way; he is stuck somewhere, but I will try to fill in for him. His name is Phillip Nahas, and he is a partner at a technology policy firm.
We have with us today a partner for cybersecurity at PwC. May I ask him to introduce himself, and then we'll move on to the other panelists.
>> Good afternoon. I'm a partner at PwC Middle East, based out of Dubai. I do cybersecurity for a living. I've been with the firm for 20 years. I help government entities; my focus is on government entities, critical national infrastructure and the cyber agenda. And I'm glad to be here. Thank you.
>> AHMAD BHINDER: Thank you Mr. (?).
Next we have Mr. Mohamad Asab, founder and CEO of the IN4 Group, headquartered in the (?) in (?). Mohamad, if you could please introduce yourself.
>> Good afternoon. I'm Mo Esop, CEO of the (?) Group, one of the leading advanced tech training providers in the UK, working extensively across data, cyber, cloud and software. As part of our business group, we also deliver the government's CyberFirst programme, developed by (?), into schools and colleges. And I'm also a founder of Star Academies, where we have 35 schools and 25,000 young people in education across the United Kingdom as well.
>> AHMAD BHINDER: Thank you, Mohamad. Next we have with us the director and head of public policy for Meta in India. Mr. Shivanat, please introduce yourself.
>> My name is Shivanat. I lead public policy in India. Can you hear me?
>> AHMAD BHINDER: We need to change this; it is not working.
>> ...before that, and my responsibility is to make sure that Meta continues to take the right measures in a country like India, which tends to be one of the largest user bases for Meta across the world, across Facebook, Instagram and WhatsApp.
>> AHMAD BHINDER: Thank you, Mr. Shivanat. And joining us online from the U.S. is Afrooz Johnson, who works at UNICEF. Afrooz, please introduce yourself.
>> Loud and clear. Good morning from a cold New York City. I am Afrooz Kaviani Johnson, and I work at UNICEF. I'm the global lead in our work to prevent and respond to online child abuse and exploitation. UNICEF works in over 190 countries and territories, and I support our teams around the world on tackling this issue. This includes research, work on legislative and policy reform, training for frontline law enforcement and social services, educative efforts with children and their families, and various collaborations and engagement with industry.
Thank you.
>> AHMAD BHINDER: Thank you so much, Afrooz.
Let me start with a few words and then we'll listen to the expert panelists. My name is Ahmad Bhinder, from the Digital Cooperation Organization. As you can see on your screens, we are an intergovernmental organisation that covers the digital economy. We foster cooperation amongst our 16 member states, from the Middle East, Africa and the subcontinent to Europe. And we do a lot of initiatives around, for example, the sustainable growth of the digital economy, governance in the digital space, AI, digital rights, et cetera.
We have a growing list of observers, from academia, from civil society and from private companies as well. As you can see on the screen, we are 40 plus, and we are rapidly growing.
We play four roles. As I said, we are an intergovernmental organisation, so we are represented by the ministers of digital economy and ICT of our member states. And we play four roles for our member states: we advise on best practice policies, we facilitate cooperation on the digital economy, we advocate for best practice policies, and we provide information.
So we have published a few indices to measure the digital economy. The details can be found on our website.
So today I am here to talk about the safe space for children. Children are digital natives. I was having a chat with my kids, who are 13 and 10, last night, and I was telling them I'm going to speak at a session on online safety for children, so could you tell me what issues you face online?
And they said what do you mean? What do you mean online? Or what do you mean issues that we face?
And it took me a while, you know, to bring it to them or discuss with them that there is a world beyond the digital world for them.
So for them, it is all about digital. And that is why, you know, when we think about them being digital natives, they truly live in the digital world.
So the first question, when we are addressing the issues around children's online protection, is: How do we define a child? Who qualifies as a child?
Generally, a child becomes an adult when the child turns 18. But there is no magic switch that flips so that you go from being a child to an adult. So when we are thinking of online safety and of creating safe digital experiences, we need to remember that the child's needs change as they grow. And "as they grow" does not only mean as they grow from, for example, a toddler to a pre-teen, et cetera, but as their digital experiences evolve as well.
And therefore the discussion is ongoing. The discussion has to evolve, and the measures to protect children's online safety have to evolve with this in consideration.
Second thing then is how do we define a safe digital space?
The term refers to an online environment where individuals, especially children and vulnerable groups, can interact, communicate and engage in various digital activities without experiencing the risk of harm, exploitation or abuse.
Approximately five years ago there was a study I was just looking up, and almost a third of internet users were children. And this was five years ago. I did not come across a more recent figure; maybe the panelists have a later number. But I'm sure by now close to half of internet users are children.
Keeping this in consideration, last year we did some work on digital rights, and one of the streams of our work focused on online safe spaces, especially for children. Through consultation, and I will come back to how we did it, we developed a paper that is available on the DCO website. I welcome you all to have a look at that paper.
In that paper we explored the different dimensions of the threats and challenges, which you can see on your screen, that children are subjected to. What are the different categories? Who are the stakeholders, and who can play a part in protecting the safe space for children?
And then we furnished some policy recommendations based on that. The risks fall into three categories: technology, customer experiences, and risks to privacy and security.
We all know, and since we have social media representation here, we will explore what is being done to counter the effects of the addictive behaviours of social media, especially with the endless scrolling feature.
Then we have immersive technologies, for example the metaverse and a lot of virtual technologies that are coming. So what could be the impact, for example, of the amount of data that is being collected, or could be collected, on a child or anybody who is using those technologies? How could it be used, and how could it be protected? We will explore how that risk can be effectively addressed.
And then there is cyber bullying on social media, et cetera; these are all risks that children face in today's world. Hopefully with our discussion we will address some of those and see what is being done to address them.
Then, as you can see on the slide, it could be an educational institution or any organisation that is dealing with children's data, or dealing with children. Data leaks, for example from the education system, are a growing concern, because if that data lands in the wrong hands it could be used for harmful manipulation of kids, for example.
Or ineffective crisis management by the institutions or organisations can lead to reputational damage. A lack of efficient incident management could cause disruption to the education of kids, which is now highly dependent on online interfaces.
The paper then presents some policy recommendations for four kinds of stakeholders. For example, schools and educators should support awareness programmes and helplines for children, and create awareness campaigns, especially on cyber bullying.
The role of parents, you know, is moving beyond just implementing parental controls; it is about parents' involvement. I'll give you a small example towards the end of my remarks on that. But it is not just that you yell at them and say your screen time is over and then disconnect yourself. That is not working anymore.
For governments, for example, there are concerns about targeted ads, about endless scrolling, about transparency over what the kids see. Governments have a lot to do there. And for the private sector, the most important thing, and this goes for everyone, is to involve children and young people in whatever is being designed for them.
Okay.
Now, the boring stuff aside. Last night I went home and, as I said, I got my 10 year old and said, look, I'm preparing something for tomorrow, so could you give me a few points that I can bring up?
So she went back and wrote this piece of paper for me. It was not meant to be shown; she just wrote it for me to copy the points from. I just pulled it out from a notebook, and to conclude my intervention here, I would just read what she said, with whatever grammatical mistakes you find in it.
Number one, games. She categorized her online interactions after we had the conversation about how she looks at the online space. She says the problem is cyber bullying. At 10 years old she is aware of cyber bullying without looking it up, and I had no idea she was.
She says the solution is to report the person, then talk to an adult who you feel comfortable with. So this is her recommendation, right? Then she writes apps, with calling apps as a subcategory that she produced. And the problem, she says, is random people calling or texting you.
She does not have a cell phone, so I was quite surprised to see the ways of interaction she is subject to; of course she has a tab for interacting (?). So that is a problem that she identified. And the solution is: ignore and block.
So the kids, they are smart. But we really need to reinforce this.
Then she says watching apps. This is the content that is available online. And the problem is inappropriate content. So we try to protect them from any such content, but they are very well aware of it. And the solution, she says, is to report the channel and dislike the video.
And then she says, if you are a minor, turn "kids mode" on.
By the way, I'm okay. Then comes the final thing, which is social media. And the problem, she said, is addiction. And the solution is: put a screen time limit, which I normally do, and go outside regularly.
So physical activity is really important. The final one is again on social media; she says hacking. Make the system stronger so that hackers cannot break into people's accounts.
So I would finish my speech with this. Last night at around 12:30, when she gave me this paper (she is on school holiday, so she is allowed to stay up), I said I'm done; I'll just read this in front of the group today.
Now with this I pass on to our able moderator, Phillip. We went through the moderation sessions and the floor is yours now.
Thank you.
>> Thank you very much Ahmad.
(No audio).
Testing. Okay. And special thanks to...
(audio fading in and out)
My apologies for joining the session late today. I'll start with something I got from my child's nursery, which is that we overprotect children in the real world and we underprotect them in the virtual world. And today we're here to discuss, as you said, how we can provide the right protection for children across the virtual world.
And for that, I have a very distinguished panel joining me today. I'm going to take back my seat. I feel I'm disconnecting here.
Excuse me while I do this sitting down.
(audio fading in and out)
So maybe we can take some time to go over a few questions.
The last point Mr. Bhinder gave was about secure...
And I think this is your area of predilection. My question to you: what are the dos and don'ts for institutions when it comes to cybersecurity environments?
>> Thank you. Let me start by saying.
Mine is not working?
So let me start by saying it is very personal to everyone who has children, because it impacts every society on the planet. And when it becomes personal, it becomes even more of an obligation for all of us to put the right measures in place and stand up to protect the most vulnerable part of our society, which is children.
Now, to do that, there is an obligation on multiple stakeholders in the way we look at this (?). So there is a role for international communities. We're seeing, for example, Saudi Arabia taking a big step last year in arranging what we call the child protection conference, or summit, which took place, where they looked into international collaboration to address this international challenge.
If we go down to the role of government, I'm not going to talk about that, and also schools, parents, et cetera.
Now if we zoom into the private sector, where I belong, we categorize the private sector into three categories in the way we look at that framework. The first category is social media and gaming platforms. Those platforms, like mobile operators and technology manufacturers, interact directly with children. This is the first, let's say, layer of technology or private sector that interacts with children, and there is a big role that these players should play across many factors.
Take different risks, like cyber bullying. How do you minimize that? By having the right functions on social media platforms. We'll hear about that from Meta.
So how do you control comments? How do you control the ability to block certain users? Not all platforms have this. So this is a journey: all platforms should adopt that kind of behaviour. We see (?) leading players like Meta or Instagram and the like, but others should also follow.
There is inappropriate content, for example. Here comes the role of AI, and not just AI; even human moderation is needed to control that and to filter content for children.
Similarly, when you talk about privacy, it is very important that these platforms adopt the right privacy configurations to make sure they have this by default. It is no longer a feature; it should come by default on these platforms.
Two-factor authentication for users, even for children, should become easier and easier. It should become, again, a default setup.
That kind of obligation on those players is very important: activating features like parental controls and screen time limits, like what your daughter suggested.
And also we recommend that parents and children use the same platforms, because that makes it easier for parents to control what their children watch, limit screen time, or restrict the ability to download apps and so on.
The second category of private sector would be companies, service providers, technology vendors. These guys work in the back, let's say, to serve the entities that deal with children directly.
These companies need to invest more in R&D, and work with law enforcement and regulators to help with policies. Vendors have a big obligation to invest more in ethical AI and the like.
And finally, we look at corporate organisations. The private sector also has a role to play: adopting compliance and privacy laws and protecting the young generations. And let's not forget one of the challenges we talk about, an interesting one: the new generation, let's say, has this mentality of hacking and not adhering to corporate kinds of policies. With new generations like this, how do you deal with that kind of behaviour?
How do you make sure you give them the best experience, but at the same time one that is safe and does not compromise corporate policies as well?
>> Thank you for that. You spoke about corporate policies, and I think that is a very important point to touch on. Maybe I'll turn to Mo to give us his view: should these corporate policies be more about adults setting the path to safety for children, or are they also about empowering children themselves to self-govern their safety? What is your view on that, Mo?
>> You hear me okay?
I think the technologies is struggling at the moment.
, I can give you a perspective just from
>> Let's swap.
>> Sorry. Swap the mic. So I'll give a perspective from what we do in the UK. We have cyber and esports academies; 7,500 young people transition through those academies across the country. And one thing I'll say as a starting point is that we tend to just use a single definition of children and think that is the be-all and end-all in terms of policy thinking.
You have a spectrum of children, from disadvantaged backgrounds to more advantaged backgrounds. There is very different exposure to risk across socioeconomic contexts. There is also the way young people are, from neurodiversity to learning needs and learning styles, and how they interact with that world.
And unless we understand the nuance, no policy can ever work. It has got to be at that sort of level of hyperlocal to hyperchild, in the context of how that child is engaging with the wider world, where the digital world is just part of the oxygen they breathe.
So what we've found in our cyber and esports academies is you have young people who feel empowered and feel they belong. And one thing that heartens me, from all the surveys I've seen and the experience of the young people that go through my academies, is that they do know right from wrong.
In fact, to be honest with you, I think they are probably better than our generation was at understanding right from wrong. It is just that the world is more gray than it was black and white in our day. And let's be honest, you have heard already that we have limitations in our knowledge of this world. Anyone of us of a certain generation had sex education at school; I don't think anybody really learned about sex at school. They just experienced it for themselves and learned for themselves.
And I think it is the same issue. Young people are not listening in the classroom; they are just experiencing it for themselves. So give them a place where they can be empowered in a safe space. What we do with our cyber and esports academies is give them the belonging and the positive elements of this world, and show how they can be empowered both in understanding their future and in understanding that there is a physicality to that digital world. Making friends and connecting with peers is augmented by the digital world, not exclusive to the digital world. Those two things are critically important.
So from the point of view of understanding young people, and what we do to empower them, we have cyber teams in each school. And I believe we need to get to a point where young people will govern this space themselves, if you give them the tools and the ability to do so.
And just as in every school in every country young people are selected as head boy and head girl and prefects and monitors, I think there is a space now for schools to really identify digital advocates, able to govern across their peer group and give support to the educators. Because the educators are really struggling to keep up with the nuance and the complexities.
Even primary schools. I've got 16 in my trust, and if I speak to the educators in the primary schools, they are at a loss, because the sophistication of a year five or year six child is way beyond the capacity of a Key Stage 2 teacher, who really does not understand the full complexity.
So they might deliver a curriculum, but that is nowhere near where it needs to be. And that confidence needs to emanate from both educators and the empowerment of young people.
So from my point of view, what we've found successful across our cyber and esports academies is that when we create a safe space for them to be empowered, you get some really brilliant outcomes. But I would just say: the more deprived, the more disadvantaged, the more vulnerable the person is, that is the point at which we have to do more. Because a parent can have all of those tools if they are educated, if they are from a decent socioeconomic background.
But one thing I would say as a parent, and my children have grown up now, is that parents need to empathize more. I don't think a preaching mentality is ever going to work. There needs to be confidence and an openness to understanding, and parents need to want to learn rather than try to educate all the time as well.
>> Thank you very much, Mo. And I think you have touched on a good point.
>> Across the board, I feel that the first online space that children tend to experiment with today is social media. And maybe that is the perfect segue to a question for Shivanat here. In terms of that space, in terms of that environment, what is Meta's approach to ensuring the safety and well-being of children online, especially with respect to the products that you provide?
I'm going share my mic with you.
>> Thank you so much. Firstly, I want to thank all my fellow panelists and you for framing this issue in a very forward-looking manner. I can assure this room that usually when I'm on a panel, it starts in a very attacking mode.
What I'm glad to hear is that at least you all are thinking of solutions and how to make it better rather than looking to blame anyone. Everybody talked about the parental experience; I have a 10 year old son and a 7 year old daughter.
And trust me, despite me working for Meta, they know much more than I do when it comes to the world of technology. Sometimes I have to look to them and get some (?). And yet what they don't realise is there is something called the cloud. So when they are sitting back in India and clicking on the iPad, I can see it; the iCloud account appears on my phone.
That brings me to the point: what, as companies, are we doing, and what is our approach?
I can confidently state that, as a company which is one of the leading companies in the world of social communication, the approach we have taken is very pragmatic and forward-looking, and it builds solutions, some of which we've heard already and some of which already exist, through the product.
You cannot bring in a sheet of paper and tell people these are the features that could happen. These features are already there, whether it is about non-stop addiction or parental controls. That brings me to a more fundamental issue. You can do parental controls, but the question we have to ask is: do our children want to allow us to implement those parental controls? Will they like it if you tell them, give me the phone, let me put it on and now (?)? They don't even want to share their account. That is the reality. You can have parental controls, but kids don't like to share. So then what?
The more fundamental approach is to think it through at the design level. Someone rightly said, we've launched something called teen accounts, where the default for a teenager will be parental controls. It is not like they have to turn it on; the default below 16 will be parental controls. I think as a tech company, if I don't use tech to build the solutions, we have a problem, and that is where I think we have taken the right measures. But I want to share one thing with this room.
This will never be 100% (?). It cannot be. The reason? Just like in the real world, bad actors are everywhere; in the virtual world, bad actors are everywhere. But we need to think through what you read from that piece of paper: unwanted content.
On Instagram, for example, we do not allow anybody you don't know to contact or message you. And yet we see instances of cyber bullying or unwanted actors.
The question is, as parents, how vigilant are we? Like Mo said, tools or thinking of yesterday cannot confront the issues of today. So are we upping our game as parents? I find it a challenge, despite being in a tech company.
Sometimes we are not able to gauge what the bad actors are going to do. Hacking and impersonation are not just a child or youth issue. How many in this room have been hacked? We are all adults. Please tell me the truth: have you never been hacked? Do you have two-factor verification on Instagram or WhatsApp? I'm sure many of you don't. Why?
Despite being adults, we don't go and set it up. So can you imagine your child doing that? That is where we are in terms of awareness and knowledge. One fundamental approach to the safety of youth which we very strongly believe in is a framework of preventing, controlling and responding.
We take several measures, including deploying a lot of AI tools and looking at key words, to make sure that anything bad can be prevented before it happens. Once a bad incident has already happened, it's too late; we have lost that game. So prevention is the most critical thing, including taking down accounts, accounts which are usually on the (?) et cetera.
Then there is giving user controls, related to the features we have.
The third critical element is responding. How fast are we able to give you the ability to report to us, and how fast are we responding to it? Are we working with enough civil society organisations? We have a programme where we share data to make sure law enforcement agencies across the world can work with each other.
Prevent, (?) and control, and we take this very seriously. We have invested heavily in it. We work with multiple agencies across the world; inside our company we have many former law enforcement officials and officers who work with us. And we deploy a range of technology to make sure that we are able to prevent bad actors from being on the platform. It is not in our interest. I cannot ask you to be a user of my platform if I can't keep our children safe.
As a parent, it is my responsibility. And in India, trust me, I show up in front of every regulator to be asked very, very tough questions. But I can also say very proudly that we play on the front foot by stating what we do, running public affairs campaigns for awareness, et cetera. But at the end of the day my product has to do the talking, and that is what we're focused on.
>> MODERATOR: Thank you very much. Thank you.
I think you have made a very good point on, you know, the engagement with the regulators, and you're constantly on that. Would you care to elaborate a bit more about...
>> We have Afrooz online from UNICEF. Should we invite her?
>> MODERATOR: Absolutely. Next. At this point I think.
>> Okay.
>> MODERATOR: If we're talking about, you know, policy stakeholders, I think it is a good idea to talk to Afrooz, and I think to hear it from you as well, Ahmad, from the NGOs' perspective and from the policymakers' perspective. To begin with, Afrooz is with us and she can hear us. Afrooz, the question to you is: what are the most critical interventions that governments in various countries can make to protect children online, from your experience at UNICEF? What can you tell us?
>> AFROOZ JOHNSON: Thank you so much. You know, a lot of these have been touched on, so forgive me for any repetition. But the first thing, I think, when we look at this issue is the design and the operation of digital services and platforms. We've heard how there are bad actors on these platforms, but there are also design features that create more risks for children, and not just children, as we've heard.
So we see, for example, user engagement prioritised over child safeguarding. The ways in which platforms are designed facilitate this kind of rapid and wide-ranging spread of hateful and abusive content, as another example. So the ask there for governments and regulators is requiring the tech sector to undertake assessments.
And I think this came out in the DCO report as well. What UNICEF advocates for is child rights due diligence, and particularly child rights impact assessments, so that companies, rather than being reactive and trying to retrofit after the fact, can be more proactive and embed this concept of child rights by design, which is inclusive of safety by design and privacy by design.
So all of these are prioritised in the development of digital products and services. This is the first ask for governments and policymakers: really prioritising children's rights in the design and governance of technology products, and this could be through regulation or other means.
The second main challenge is of course that laws have not kept pace with the rapid development of digital technologies. And when we're talking about criminal activity, there are also challenges in investigating and prosecuting cross-border cases. So the ask then, again for governments, and what we've done around the world, is really to support governments to update laws and policies so that online violence is adequately criminalized and laws are also future-proofed against rapidly evolving technologies.
The third challenge that we see around the world is that social services and law enforcement often lack the resources and the expertise to address these new challenges arising through digital technologies.
So this makes it hard to support at-risk children, and it also makes it hard to identify perpetrators. And I really appreciated the intervention earlier about particular groups of children that are at risk; the solutions for supporting them need to happen kind of in real life, if I can put it like that.
So we need strong social services in order to identify and support children.
In many countries mental health services are often insufficient, so this can leave children without support, which again makes them vulnerable, but also means they don't get adequate support after the fact, if and when something happens online.
So I think the ask there for governments is really equipping law enforcement, educators, social services and others to identify, respond to and prevent forms of online harm.
And then finally, I would say there are challenges with respect to some harmful social norms and the limited public discussion that we have in many communities around the world. There are taboos around certain topics like sexual abuse, and these things can make it difficult for victims to speak out. And we know that there are forms of sexual abuse that are facilitated by technologies.
And when we have this limited public discussion and these harmful norms, it can really constrain the efforts to prevent and respond, as well as the efforts we've spoken about for governments. Beyond supporting children's digital literacy and online safety, we also need broader educative initiatives designed to foster healthy relationships in early adolescence, to challenge harmful gender norms, to motivate help-seeking, and to really support children so that if their peers disclose to them, they know how to react. And of course all these other educational initiatives with parents and caregivers and educators.
I'll stop there for now. Thank you.
>> MODERATOR: Thank you very much, Afrooz. I think it is very interesting that you spoke about your collaboration with various governments. Maybe we can hear it from another perspective: the DCO is all about collaboration between various governments, specifically the digital arms of these governments.
Maybe you can tell us, in your experience, Ahmad, how does that collaboration contribute to the safety of children online?
>> AHMAD BHINDER: Thank you, Phillip. I will start with this: this is one topic across policy domains and public policy practitioners where there is a consensus that things need to be done, that children need to be protected. We have come to a level of consensus here, whereas a lot of other policy debates are still about taking one approach or another.
There is a global consensus that something needs to be done to protect children, especially while online. Now the question is, how do we activate the different stakeholders? The role of an intergovernmental organisation like UNICEF is broadly policy guidance, or proposing initiatives, or proposing different measures.
The role of the technology companies is to use technology to enhance (audio... ).
...
...last year, and this is where this session and this policy paper came from.
We have a programme called the Digital Space Accelerators programme, where we pick up pressing issues in the digital economy, hold global round tables and discussions across all the regions, and bring in experts to really talk about and discuss the issues and how to collaboratively solve them.
So this is one role of intergovernmental organisations. But I think there is a need for a more concerted effort at the national level as well. Right now, of course, the policy, legal and regulatory landscapes are different across countries, and the level of maturity is quite different across countries.
So one of the things that we picked up from last year's collaborative discussion was that, at a national level, we should identify the different stakeholders, which could include educators; of course there are common stakeholders like government policymakers and technology companies that are beyond borders. But the aim is to get them together and come up with one children's online protection strategy.
That would include: what are the rules that need to be made? What are the rules that need to be tweaked? What are the initiatives?
So it has to be a concerted effort at the national level, while learning from best practices, of course; we have a whole bunch of them. And then having national champions across those nations, which could then be expanded to a regional or international level.
>> MODERATOR: Thank you very much. I think I'm all out of questions for today. I'll open it to the audience if there are a few questions that you have for any of our panelists. Please raise your hand and we'll give you the microphone.
We have got time for literally a couple of questions. So I think I'll start with the lady and then pass it on to you.
>> Thank you so much. My name is Utek, from the Digital Opportunities Foundation in Germany. I'm a digital rights advocate.
Let me note that today you have a panel with only male speakers. Several years ago we would have been talking about safety for children only among women. So it is kind of an achievement that no woman is on your stage.
I would like to refer you to the UN Convention on the Rights of the Child, because you have somehow touched upon that issue. I heard Mo speaking about esports, and that relates to children's right to leisure time and to play, in Article 31.
But I also saw that some of the measures you have been talking about would touch upon children's privacy, and that is Article 16 of the UN Convention.
So when we put up parental controls, or when you have a look at the photos your children upload to the cloud without them knowing, that touches upon their privacy. So we need to take care that we balance the privacy of children as well as their right to be protected.
Thank you for listening.
>> MODERATOR: Thank you very much for sharing.
>> Thank you so much for your presentation. I'm currently a high school student in Massachusetts, so I speak from a literal child's perspective. I want to elaborate a little bit on what you said about how children indeed know the difference between right and wrong.
I completely agree with that, after being exposed to all kinds of education about digital safety and so on. But one thing I observe in any school is that despite the fact that we have all been educated about digital safety, we still become vulnerable to it, and some people even go on to become the attacker.
So I think the reason for that is that children are evolving and technologies are evolving, but the education on digital safety has not kept pace with that evolution.
So I think it is also time for us to, like, revolutionize our education for our children.
>> MODERATOR: Thank you very much for sharing.
Maybe I'll conclude in a minute just by saying that we've seen, in short, that the digital world is actually an image of the real world, whereby there are bad actors in both.
And we need protection in both.
Certainly, the policies out there were not up to speed. We have been playing catch-up for the past few years, and we are still catching up with the very fast-paced evolution of the digital realm. One thing that becomes apparent from all our panelists here is the importance of looking at it holistically: from international organisations, to governments, to parents, to law enforcement, as well as policymakers, all the way to empowering children themselves. And as was well said, the difference between right and wrong is something that should be infused at a very early age, and it should be holistically part of what we do as we conceive new policies going forward.
Thank you very much everyone for listening.
I appreciate your time.
And hope to see you again soon.