The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> JUMANA HAJ‑AHMAD: Good afternoon, everybody. Can you hear me? Yes? Yes.
Channel 2, if you cannot hear me, I'm on Channel 2.
Okay. Good afternoon, everybody, and thank you for joining us. Just one second. We're checking that ‑‑
It's okay. You can hear me?
Can you hear me online?
>> We can hear you.
>> JUMANA HAJ‑AHMAD: Good afternoon, everybody, and thank you so much for joining us today. We're delighted to have you all with us, both in person, in the room, and online.
A very warm welcome to my co‑moderator, Afrooz Kaviani Johnson, who is joining me. Thanks for joining us.
My name is Jumana Haj‑Ahmad, Deputy Representative, UNICEF Gulf Area Office.
I am honored to help guide this discussion on a very important topic that resonates deeply with all of us: children's rights and safety in the digital space.
As we all know, the Internet and digital technologies offer a lot of opportunities for children and adolescents to learn, to develop, to grow, but, at the same time, these technologies and tools come with a lot of risks that require our attention and action.
This session focuses on embedding children's rights and safety within the digitalisation agenda, ensuring that the digital world becomes a safe space, a place where children are protected but, at the same time, they're learning, they're playing, and they're exploring.
Today, we'll be hearing from different countries. We'll be hearing from different stakeholders and different sectors to make the digital space a safe place for children and adolescents.
The fact that you are joining this discussion shows that collective action is needed to ensure that children and adolescents are safe in the digital space.
I will start by introducing our amazing and esteemed panellists: Dr. Maimoonah Al Khalil, Secretary General, Family Affairs Council, Kingdom of Saudi Arabia; Ms. (?), Representative, Global Cybersecurity Forum; Floreta Faber, Deputy Director General, National Cyber Security Authority, Albania; Mr. Paul Clark, Executive Manager of Education, Prevention and Inclusion, eSafety Commissioner, Australia; and Ms. Helen Mason, Director of Operations, Child Helpline International, Netherlands.
Sorry. I forgot Mr. Richard. Apologies for that. Mr. Richard Wingfield, Director, Technology Sectors, BSR, London office.
Thank you, all, for joining us in this exciting discussion.
As we start the discussion, I would like to really flag two key points. The first one, as many of you know: it is estimated that around one in three Internet users worldwide is a child. And this number underscores the importance of speaking about children's rights and having children's rights at the centre of any discussion that relates to digital technology, even technology not designed for children.
We need to think of those children as real lives. Those children could be your own son or daughter, your nephew or niece, or the son or daughter of a neighbour.
So, as we talk today about the importance of children's safety in the digital space, we need to keep the lives of these children in our minds and hearts and think about what these children are going through. Is it really a safe space that we are providing for them? Are they able to learn and make the best of this opportunity being offered?
I would like to start with a question to Dr. Maimoonah Al Khalil. The IGF is being held in the Kingdom of Saudi Arabia this year, which aligns with the country's strong digitalisation agenda and its focus on child online protection. Can you share with us how the Family Affairs Council is integrating child online protection into national policies and coordinating with the different stakeholders and departments?
>> MAIMOONAH AL KHALIL: Thank you very much. I hope I'm audible to everyone. I would like to start by thanking you for asking me to participate and to share what we're doing in the kingdom.
There are initiatives on child online protection taking place across government agencies, NGOs, and the private sector. To coordinate these efforts, to enable greater impact for these initiatives, and to apply a holistic approach that covers all aspects of child online safety, the Family Affairs Council in 2023 launched the National Framework for Child Safety Online.
The framework has three pillars: one being awareness, two being enablement, and three being prevention. I will only give a few examples, because I know we only have four minutes.
Under awareness, we have a track on school curricula, where we'll be working with the Ministry of Education on developing tailored, age-related material for specific groups of children, so that material that does not exist is created and material that already exists is revised to embed this information, and also on developing separate, independent material that is accessible online at any time.
Under enablement, we have capacity building efforts to prepare teachers and parents, mainly, with the set of skills they require to detect any dangers children might face online, and another track for integrated governance in order to have an integrated approach to what children are exposed to online.
Under prevention, we have measures based on research and data, looking at what we can do to prevent, from the beginning, any kind of issues that children may face online. Basically, this framework is an effort to bring attention to this issue and to ensure that the NGOs are working toward one vision.
Now, we cover a number of domains, including cybersecurity, data protection, and privacy. The psychological aspect of what children go through is very important, as is preventing any cybercrimes.
The private sector and active NGOs also play a role in this regard.
The role of the Family Affairs Council in this particular guideline and framework is to monitor these efforts, to report on their progress to the centre of government, and to fill any gaps.
Our efforts focus mainly on stakeholder coordination: partnering with educators and service providers and establishing a governance model that brings everybody together for the benefit of the child.
And our focus areas are enhancing digital literacy, protecting children's data, and building a culture of safe online behaviours through awareness campaigns and family support.
Our outcome‑oriented approach is about finding and addressing the gaps: where are the gaps that we can start working on? It is also about leveraging and advancing technology. Technology, in this sense, can be a danger, but it can also be an ally and helper. And it is about aligning efforts with global best practices.
I would like to end with a recent initiative that brought together national efforts: a nationally led Family Affairs Council campaign. In actuality, what made it succeed was every entity that came and joined this campaign, which focused on promoting child online digital safety, promoting healthy digital practices and behaviours, and advocating for parental controls, artificial intelligence, reporting tools, and the support channels that exist here in the kingdom, shedding light on them and bringing them to the floor in every way we could, whether online or through the channels of other participating entities.
The targeted segment in this particular campaign were teachers, parents, and caregivers.
So this is just a snapshot of the framework. I hope you enjoy.
>> JUMANA HAJ‑AHMAD: Thank you so much, Dr. Maimoonah, for sharing this overview.
Being here in Saudi Arabia has allowed me to also see the consolidated approach that you applied in leading the process of the development of the strategy, which, really, I think helped a lot in having the different stakeholders onboard, including children.
I remember the interactions with the children that the Family Affairs Council led, which was an amazing process.
Now, I would like to go to (?) from the Global Cybersecurity Forum. Since its launch, the Forum has made child protection in cyberspace a key initiative. Can you share more about the Forum's mission and your strategy for strengthening commitment and action to ensure children's safety within the digitalisation agenda?
>> Thank you, Jumana, for the question. Allow me to extend my appreciation for inviting us to this event. I would like to start by addressing the question that the majority of the audience might have, which is: why is the Global Cybersecurity Forum considering this a key priority and initiative within its mandate?
As you know, the unique positioning of the GCF is that it considers cybersecurity beyond the technical aspects. Our work spans all areas, from the geopolitical to the technical, behavioural, and social, under which child protection is a priority.
As you know, the GCF was entrusted to deliver the Child Protection in Cyberspace initiative, initiated by His Royal Highness. Our key targets include, first, developing child protection frameworks in more than 50 countries, upskilling 60 million people, and protecting 150 million children globally.
So, to achieve these targets, we have conducted extensive research to develop a robust strategy. This includes, first, a deep assessment of the global landscape, with validation of key insights from global experts, as well as reviews of reports, initiatives, organisations, and programmes.
Second, a global survey covering 41,000 parents and children from 24 countries across six regions.
And I invite you all to read the report about why children are unsafe in cyberspace. It's available on our website.
Now, when it comes to the delivery model of our initiative, it's structured under one fundamental principle, which is strengthening and complementing the efforts of experts as we deliver on our initiative's projects.
Finally, to maximise the impact of this initiative, the Global Cybersecurity Forum organised, with our partners, the Child Protection in Cyberspace Global Summit. It was held here in Riyadh last October.
There's no better way to conclude than with the statement from the GCF annual meeting and the Child Protection in Cyberspace Summit. It reads as follows: cyberspace is a (?) the security of individuals and the stability of nations.
Thank you.
>> JUMANA HAJ‑AHMAD: Thank you so much, and thank you for all the hard work. I know how you personally also led the Global Summit on child protection. It was an amazing summit, with a lot of amazing commitments made to make the digital space safer for children.
Now, I would like to pass it over Afrooz.
>> AFROOZ KAVIANI JOHNSON: Thank you, Jumana. Can you hear me in the room?
>> JUMANA HAJ‑AHMAD: We can hear you, yes.
>> AFROOZ KAVIANI JOHNSON: Wonderful. I'm glad we can use technology to connect. Sorry we're not there in person, but I'm really happy to now go from the incredible developments within Saudi Arabia and the global picture from the Global Cybersecurity Forum.
So I have a question for Mr. Paul Clark, who, as Jumana mentioned, is Executive Manager of Education, Prevention and Inclusion at the eSafety Commissioner, Australia.
We know that the eSafety Commissioner is initiating measures to protect children.
We're really keen to hear from you, Paul, about your educative work, knowing that the Commissioner has many fascinating workstreams, and how you're ensuring these resonate with children and families across Australia to really make a difference in children's lives.
And because the theme of our session today is also about innovation, what are some of those key innovations that you're introducing in this area?
Thanks, Paul.
>> PAUL CLARK: Thank you, Afrooz.
Thank you very much, everyone. It's a pleasure to be here today.
For some of you who might not be familiar with our work, I thought it may be good to give some background about eSafety. When we were established in 2015, we were actually the Children's eSafety Commissioner.
In addition to our focus on awareness raising, we had the world's first scheme that allowed us to have harmful cyberbullying content targeting an individual removed.
So now while our responsibility extends to all Australians, and that's reflected in our current name, children will always remain a priority for us.
In our education programmes ‑‑ and it was a challenge to fit this into four minutes, because there's quite an extensive range of programmes there ‑‑ we want to focus on the needs of children and young people in different circumstances, always keeping in mind their fundamental rights to protection, participation, and access to information.
So this now includes programmes, resources, and training to support young people, their carers, and educators, from ages zero to five right up to adults in their early 20s.
Within these age groups, we ensure we prioritise the cohorts that we know need the most support to get the most benefit from engaging online.
So, in the Australian context, young people with a disability are more likely than the national average to experience harmful treatment online and even physical threats.
They're also more likely to come across harmful graphic content.
In Australia, we know that LGBTQIA+ teens are more likely than the national average to experience hurtful online interactions and more likely to engage in risky online behaviors themselves, like sending messages that are sometimes inappropriate and risky.
First Nations young people are much more engaged than the general population in using the Internet for cultural connections and staying informed about the world around them, whether that be politics or news.
Despite these positive cultural and news interactions, sadly, First Nations young people are three times more likely to have harsh things said to them because of their race or gender.
When we talk about innovation, I wanted to call out things that are really close to our hearts.
Central to our commitment to children's rights is our eSafety Youth Council, a diverse group of young people aged 13 to 24 from every state and territory in the country, ensuring young people's voices are not just heard but acted upon when developing solutions and support, and helping us frame policy that impacts young people.
Children have the right to express their views in all matters affecting them.
Our council is key to keep the principle at front and centre of the work we do. It's not about doing work for young people. It's about doing work with them.
Our Youth Council has been involved in a broad range of work at eSafety, everything from helping review content for our youth pages and supporting teaching in our education modules, to shaping broader policy decisions that will have a direct impact on young people.
For example, they recently made a formal submission to the Australian government and took part in two events in New South Wales and South Australia.
That leads me to talk about children's access to social media online and balancing rights and protections. I'm sure many of you will have heard of the recent legislation that has just passed in Australia that's going to prevent young people under the age of 16 from accessing certain social media sites.
These provisions are going to come into effect over the next 12 months, and over that 12‑month period is when we're going to be able to specify which services and what regulatory mechanisms are going to be required to have this legislation enacted.
There's also an awful lot of work to do in this space over a short period of time, and we're building on our considerable work on these issues.
We know that the relationship for young people between mental health, social media, and online engagement is complex, but we're continuing to contribute our research and insight to assist the government and ensure that any legislation minimises the unintended consequences.
Regardless of the age at which young people can access these certain sites, for us, prevention and education will always be the foundation of eSafety's work. It's more important than ever that we continue to work with young people to build their digital literacy and critical reasoning skills so they're prepared for the environment, regardless of the age at which they choose to engage with it.
We remain committed to listening to young voices, collaborating with regulators, and evolving the approach to the challenges in this digital world.
>> AFROOZ KAVIANI JOHNSON: Thank you, Paul. It's heartening to hear how Article 12 is taken into account and embedded in all of your work.
We're going to have time for questions at the end. So, audience, online and in the room, start thinking about questions you may like to put to our panellists.
Thank you.
Thank you, Paul.
From Australia, I'm really delighted now to go to Albania.
I have a question for Ms. Floreta Faber, Deputy Director General, National Cyber Security Authority, Albania.
So we're really interested to hear how Albania is integrating child safety into its cybersecurity agenda. I hear that there is a new cybersecurity law in Albania, and this includes a dedicated chapter on children.
So could you tell us more about this and how you're ensuring that ministries and departments are equipped with the latest knowledge and skills to effectively empower and support children online?
>> FLORETA FABER: Thank you very much for giving us the opportunity to be a part of this really fascinating discussion. Albania is a small country, and we are very happy to be next to Australia now.
We have been involved ‑‑ really, the Albanian government is committed to a digital future for young people and the young generation. Of course, we're focused on strongly protecting our important and critical infrastructure, and protecting children online is also one of the main pillars of our work.
This commitment is reflected in the inclusion of child online protection in the cybersecurity strategy for 2025. We had it in the previous strategy, but it is now more developed in the new law on cybersecurity, which you mentioned, which came into force this May.
The new law on cybersecurity has tasked the National Cybersecurity Authority with coordinating the work with all the institutions that are responsible for the protection and safety of children and young people online, in order to create a safer online environment in Albania.
This means preparing a new generation capable of benefitting from all the advantages of new technologies and information technology, but also aware of the challenges of these new developments, and showing them that, in today's world, the digital personality is of high importance.
In Albania, we have seen this in different ways in order to address the issue.
First, we believe it's important to create the legal framework and the necessary mechanisms that make all our government institutions, but also civil society, work together. When I say public institutions: the National Cybersecurity Authority is coordinating the work, but the State Agency for Child Rights and Protection, the Audiovisual Media Authority, the (?) Communication Authority, and the People's Advocate all play their own role in this field.
We are trying to coordinate all the work that all the institutions do in order to make sure that we cover the whole country with the issues on protecting children online.
The National Cybersecurity Authority has a reporting platform where citizens can directly report cases of illegal online content and other issues online. Last year, we had a low number, but this year we have had over 250 cases reported to us. We send all those cases to the police.
In some cases, we have done a questionnaire with children whom we have met through awareness and in‑person events. We found that TikTok and Snapchat are the most‑used platforms among young people. Especially when we had issues with TikTok lately, we sent all our materials also to the regional authority that talks directly with TikTok in order to prevent cases of use of illegal content.
The NCSA has implemented a number of initiatives over the years to protect children online. We are developing our strategy, but in the last four or five years especially, we have been working with the ITU, with U.S. Government support, to develop a number of initiatives in the country, producing various educational materials, videos, manuals, and brochures to raise awareness.
As I mentioned, in 2024, our authority, together with the Child Protection Authority, organised ‑‑ I can give you some examples. We have been in 28 schools around the country, in 13 cities, with 500 (?) We have trained about 400 teachers in those schools and about 200 safety officers in schools, so that everyone is aware and educated, and there's a phone number to call to report such cases.
We have organised a number of workshops and roundtables with teachers and representatives from local and high‑level government, looking at what role everyone can play, what changes need to be made, and what awareness is needed.
We have had rare cases where children had issues arising from using social media, and I can say that lately ‑‑ when I say "lately," I mean in the last month or two ‑‑ we had the entire Albanian government addressing awareness for the protection of students online.
We have talked with teachers, parents, and students about how to address the issue.
We have our Prime Minister who has been talking in a number of those consultations, addressing the issue, and we are trying to find what is the best way to address the issue.
Now we are also asking: are we closing TikTok and Snapchat in Albania? Whatever is the best way to keep our kids safe online, for our (?) and nieces and nephews. The Prime Minister said there's no solution without the direct involvement of families and schools ‑‑ both parents and teachers and, of course, all the government institutions ‑‑ in order to increase and expand the safety parameters for our children.
One new item I would like to add: we believe that international collaboration for protecting children online is absolutely important. On top of the events that will continue next year, in February, on the day dedicated to children's safety online on the Internet, we (?) in order not only to share experiences but to try to share the resources we have for protecting children online.
And, in the process, I'm happy to see that, in the same way as Australia, we've seen that online protection of children alone is not enough. In the new 2030 strategy, we're including the protection of citizens online with special emphasis and a special chapter, because we understand that children, teachers, and safety officers in particular should be equipped with the necessary knowledge, but it's also very important that underrepresented groups be involved in all of our work.
We have already started doing events, but the new strategy is going to have a more organised framework on how to speak with people with disabilities in our community, LGBT communities, people over the age of 65, and all those groups.
So thank you very much for giving us the opportunity to share some of our experiences.
>> AFROOZ KAVIANI JOHNSON: Thank you so much. That was amazing. I know it was a hard task we gave everyone to try to synthesise the developments in four minutes, but it's really fantastic to hear the accelerated progress, particularly in recent years and just the whole of society effort that is being led in Albania.
Thank you.
So from Albania, I am really happy to introduce again Mr. Richard Wingfield to give us a different perspective from his organisation, which is BSR, Business for Social Responsibility. We're really interested to hear about your work with companies. There's been mention of tech companies so far. What are the gaps and trends you're seeing in relation to current industry practices when it comes to children's rights and technology?
Over to you, Richard.
>> RICHARD WINGFIELD: Great. Well, thank you so much for inviting us. And I'm really glad to be able to talk about our experience.
So, for those of you who are not familiar, BSR is a global nonprofit, and we work with companies to turn human rights principles, laws, and standards into practice, using the framework of the UN Guiding Principles. We're being asked by companies to integrate respect for children's rights into their products and services, but also into their broader business and corporate responsibility strategies, processes, and plans.
And we've worked with UNICEF and a number of actors over the years. We have over 300 of the world's largest companies as members, among others. And the last years have definitely seen an increasing interest in ensuring children's rights are protected in the digital environment.
And that may come from regulatory requirements, such as the Digital Services Act and others in the UK, that talk about risks to children and the protection of fundamental rights. It could mean conducting children's rights impact assessments. It could mean looking at how companies are using their reporting and disclosure obligations. In all these different ways, companies are thinking about the importance of protecting children and the rights of children as part of their broader human rights responsibilities when it comes to technology and the digital environment.
So I tried to put together a few of the promising trends and gaps to try to answer your question as best as possible.
The first trend is that the approaches we're seeing companies take to protect children online are definitely evolving and becoming more sophisticated.
Historically, efforts were largely focused on just parental controls, so really just giving parents the power to control their children's online experience and putting all the effort and emphasis on the parents.
This is still helpful, definitely, but we're seeing more sophisticated approaches taken.
So these can mean things like different kinds of content classifications for different age groups of children; promoting digital safety education and how to use technology and products safely; undertaking risk assessments about products and services and how they affect children; and looking at different controls and access requirements for different age groups based on children's developmental stage. So companies are becoming a lot more sophisticated in the techniques used to protect children online while also protecting their rights.
In the EU, the Digital Services Act requires large platforms and search engines to consider risks to children and to undertake risk assessments regarding changes to their products and services as well.
In the UK, the (?) Act also forces companies to look at (?) harmful to children online.
Some of the regulatory developments in Australia have been mentioned.
Finally, we're also seeing more companies doing children's rights impact assessments. These are specific, targeted assessments, done as new products and features are being developed, to really think about how they may impact children's rights and to take steps to mitigate risks to children as they're finalised.
There are still some gaps, one of which is that, at the moment, assessments of impacts on children online have very much been driven by safety considerations.
While this is incredibly important, it's also important to remember the opportunities to promote and advance children's rights that come through technology, and thinking about how you integrate considerations around protecting children's rights to freedom of expression, privacy, and other fundamental rights as part of this process is an area where we think more work could be done.
We're also seeing that assessments are generally focused on well‑known, high‑profile issues, for example, things like child sexual abuse material, bullying, and so forth. What we would like to see is a more holistic approach to the four ways that children can be affected online.
They pay less attention to lesser‑known issues, for example, the ways children might be exploited in supply chains or through forced labour, and the protection of children's privacy online as well.
We still don't have enough transparency. A lot of companies are undertaking this work, but it's not necessarily made public. We know that a number of companies have concerns around disclosing information because of legal liability risks or reputational risks, but we would like companies to talk more about the efforts they're making ‑‑ not just the changes and features they have created, but the actual process by which they engaged in undertaking those assessments, including how they engage with children and children's rights organisations ‑‑ so that the stakeholder engagement aspect is much more apparent and visible.
We would like to see a bit more data being produced by companies. We know there's an increasing emphasis by regulators on reporting around certain metrics, like harmful content. Again, it would be great to see companies produce this more proactively and with more nuance, so we could see how different age groups of children are being affected, and how different groups might be impacted specifically, for example, ethnic minorities, young girls, or children who identify as LGBTQ+.
So there's definitely room for improvement, but we're seeing promising trends as well.
>> AFROOZ KAVIANI JOHNSON: Thank you, Richard. I think BSR has quite a unique view on these issues. That was extremely insightful in a very limited time, giving us some cause for optimism but also very clear areas where more work is needed.
I'm going to hand back to Jumana and the panellists in the room.
>> JUMANA HAJ‑AHMAD: Thank you, Afrooz.
And thank you for the insightful interventions. Amazing work done in different sectors.
Now I would like to come back to Helen Mason, Director of Operations, Child Helpline International, Netherlands.
Thank you so much for being with us.
The child helplines that are operating in more than 130 countries play a crucial role in supporting children across various aspects of their lives, including their online experiences.
So can you share with us some insights from your data on children's online experience and lives but also how you've been using technology to enable a safer environment for children digitally?
>> HELEN MASON: Thank you. Good afternoon. That's better.
Well, it's been a pleasure to be on this esteemed panel with my panellists today.
Thank you very much for the question. I would like to start by talking about the characteristics of child helplines. Child helplines are a response mechanism for children and young people in all aspects of their lives, and also a preventive mechanism as well.
I also want to highlight that, while legislation is changing, by all the accounts we've heard today, we're still designing retroactively in the online space when it comes to children's rights. That's very much evident in our own work.
I should mention here that child helplines being key to the response to online harm is part of the national model for response.
Over the last years, we've been developing different methods and capacity building around the types of issues that might come through to a child helpline. That means capacity building for our members. It also means work around data frameworks, collecting data from our members on these types of issues.
So, regarding the question and the data collected by our child helpline members, it's important to say, perhaps, that it's a unique resource ‑‑ data that we collect directly from the voices of children ‑‑ in the sense that it's a by‑product of the conversations that go on between adults and children.
Being able to collect the data in a timely fashion is crucial to amplifying children's voices.
Suffice to say that technology has had a huge impact on that particular aspect of our work, in terms of data capture, data analysis, and also now looking at, for example, AI tools, to look at the chat that comes in from children's helplines.
In terms of the actual data we collect, of course, it can provide unique insight into the lives of children. We believe it should be part of a national programme and globally.
I would like to state that the most common reason for children and young people to contact a child helpline is mental health, first and foremost, and violence secondly. Those are 32% and 24% of counseling contacts, respectively.
So we can also state, based on the data we collect, that girls are more likely to contact child helplines than boys, with 52% of contacts coming from girls.
I think what I would also like to share is that in our latest report on online child exploitation and abuse, we see cases increasing since 2018, when we first started to collect data on this. We would also report substantial issues around disclosure, around taboos, and around underreporting. And, of course, it's very important for us to develop methods and strategies to deal with this together with our members and many partners, including UNICEF and the IWF, for example.
I also wanted to highlight what we can read in the data. Among our nonbinary contacts, we can see a higher incidence of suicidal ideation, and those are (?) online methods.
Child helplines typically operate 24/7, free of charge, and over the years have launched multiple channels of access.
You will be quite familiar with the kinds of online contacts that come through chat and all different types of online means, all the places where children themselves are present.
So it's incredibly important that as child helplines are responsive to children, they need to be in places where children are.
So we are working with online platforms to develop ways to integrate with those platforms and provide seamless referral to a child helpline service.
So I think, bearing in mind that child helplines have always existed at the intersection of technology and child rights, partnerships with industry are absolutely vital, and over the last years, we've been carefully developing partnerships with industry partners like Snapchat, Meta, Roblox, et cetera, so that children can find help when it's time critical.
One of the challenges we have is around raising awareness. A child may not know that a crime has been committed. They may not be able to talk about or disclose this information. So raising awareness in a preventive sense is absolutely vital. For us, that means raising awareness that you can talk to a child helpline about these issues.
I want to just close now. I want to say, as well, that what really matters to me and Child Helpline International is the countries around the world without this service. So given the role we identify for child helplines in responding to online harms, it's urgent, and it's our aim to fill that gap by 2030: to have a child helpline in every country in the world so that when a child needs to speak to someone, wherever they are, they can contact a child helpline.
Thank you.
>> JUMANA HAJ‑AHMAD: Thank you so much, Helen. We also have a child helpline here in Saudi Arabia. I had the chance to discuss with the director some of the priority issues that children face here. It's also very similar to what you were saying. Mental health is a key issue, and being subjected to violence is another.
Okay. So I think we have some time. We have seven minutes. We are able to take a few questions from the floor here and a few from colleagues joining us online.
So we have two questions here. I think you need to speak in the mic.
Try to speak from where you are. Let's see. I doubt it. I think you will need the mic.
>> VICKI: Hello. I'm from Harvard University. I work in artificial intelligence and child rights.
First of all, thank you for all the amazing work that all of you do. It's super impressive.
I think my question is mostly for Mr. Clark, from Australia. I'm very curious to hear whether you have tried to collect any data about adolescents' reactions to your government's decision to ban them from some social media platforms. I'm just curious to hear if you have any sense of this. Thank you.
>> PAUL CLARK: Thank you. As I mentioned, our youth council did provide formal submissions. As of now, the mechanisms are coming into play, and we're looking at which platforms will be excluded and the leverage we're going to have to use in how that's implemented. I think we'll be undertaking more study.
Up to this, most of our research, which is always published on our website, has looked at particular cohorts of young people to understand their experiences online, to understand the benefits they receive, how they're engaging using technology but also the specific risks that they're facing.
Part of the implementation of this legislation, for us, is that we'll be doing a full evaluation. So we're about to start a baseline research piece to really get that clear understanding and to follow it over the next two years to evaluate the legislation.
So it's not something we have available right at this moment.
>> JUMANA HAJ‑AHMAD: Let's take questions from participants online.
Back to you, Afrooz.
And we have two more questions in the room. But I see there's a hand up online.
>> AFROOZ KAVIANI JOHNSON: Yes. There's a hand up online. I'm not sure if I can give ‑‑
>> KUBI: Can you hear me? Can I share a video? While that is being worked on, I will go ahead.
Thank you so much.
I came to listen to Mr. Paul Clark, first, and I got more interested. We're currently working on projects. Actually, my intention is to get more collaboration from this organisation. We are working on developing a comprehensive online safety benchmark that will be used globally. This will serve as a valuable tool for organisations. Our focus is on child online safety and also the use of social media and websites, and how individuals can protect themselves while using these particular sites. The focus is on children.
Our targets are sub‑Saharan Africa, Southeast Asia, and Latin America. We're hoping to engage key organisations in those regions to be able to cover a wider range and also to have a comprehensive safety benchmark.
Mr. Clark and other panellists have worked on (?) I would like to get your emails and collaborate on any projects you're currently working on, and also get you involved in our project to make it a success. We're an Internet society. We're trying to make an impact on a global scale. So getting organisations involved to collaborate on this project is one of the key things we value.
So that's what I can say from my end.
But I think the chat has been disabled. So I'm unable to share my email. So I don't know but ‑‑
>> AFROOZ KAVIANI JOHNSON: Thanks. That's great. That's one of the things about IGF, making the connections.
We can talk with the technical folks and see how we can exchange contacts.
Thank you so much.
>> KUBI: Thank you.
>> JUMANA HAJ‑AHMAD: We have five minutes left. If we can take, quickly, one question from the room here.
Over to you.
>> ANDREW CAMPLING: I'm with the (?) Watch Foundation. Really interesting presentations. You may be aware that there are changes to Internet standards that will bypass filtering and parental controls, exposing children to age‑inappropriate content much more easily, and also allow tech companies to claim plausible deniability, a horrible phrase, so that they don't see some of that illegal or unsuitable content.
Is that something that you're aware of? How do we get more child protection groups involved in the development of Internet standards to prevent these things from happening in the future?
I would really appreciate your thoughts. Thank you.
>> PAUL CLARK: I'm happy to jump in.
>> JUMANA HAJ‑AHMAD: Please go ahead, Paul.
>> PAUL CLARK: I was going to jump in on that. One of the key things we push in Australia is Safety By Design. All of these add‑on parental controls and measures that are bolted on at the tail end demonstrate the value of understanding and setting up a platform or an app with safety as the primary principle to begin with.
One of the next pieces of legislation, which the Australian government is about to bring in following the age restriction piece, is an Internet duty of care: putting a legal obligation back on the platforms that they must keep the safety and security of their users paramount. So they will be held responsible for that.
>> JUMANA HAJ‑AHMAD: Thank you, Paul.
I know we have two questions in the room, but, unfortunately, we need to close this session.
I would like to thank our esteemed panellists for your participation today and for the audience who joined us.
Thank you for the rich discussion.
>> AFROOZ KAVIANI JOHNSON: Thank you.
>> PAUL CLARK: Thank you, everyone.
>> RICHARD WINGFIELD: Thank you.
>> FLORETA FABER: Thank you.