The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> SUSIE HARGREAVES: Good afternoon, everyone.
We're about to start the session, if you'd like to all take your seats.
We're going to try to do this in a different way this session, because you have had a number of panel presentations. You only have one presentation this session, and that's me.
Afterwards we're going to open up and have a discussion. We really want to be sure this is highly interactive. Christoph, my colleague, you're going to run around with a microphone or something.
Yeah, so I would like to welcome you all to the session. My name is Susie Hargreaves, I'm the chief executive of the Internet Watch Foundation in the U.K., which is the U.K. hotline for reporting child sexual abuse content.
I'm going to tell you about a project we did, now going into the third year of a piece of research which is funded by Microsoft.
And I want to kind of tell you about some of the findings so that we can share those with you and some of the learnings.
Actually what I really want to do is also share with you some of the challenges of the work, because some of the findings were quite controversial. Some of the ways we presented what we found, people didn't like. Some of the issues we uncovered were uncomfortable. I think it would be useful for us to have a discussion about this together.
I'm also really going to focus on how we're a multistakeholder organization.
I know that I just have come out of an excellent session on multistakeholder working in the field.
When it comes to child sexual abuse and removing it from the internet, you can't do anything without working with all the partners.
I want to stress that and talk about how we work as a multistakeholder organization. I also want to stress we're a third of the U.K. Safer Internet Center. We're a partnership: the hotline element, plus an awareness-raising node, Childnet International in the U.K., and a professional helpline, South West Grid for Learning. In fact they are partners of a bigger network, Insafe, and there's a stall in the foyer, so please find out about that.
In particular, Safer Internet Day, which happens at the beginning of February each year across the whole world. It's not just about protecting young people from accessing criminal content, my area of work which I'm going to talk about, but also about finding ways to help young people understand how to keep themselves safe online.
I'm now going to talk about the specific research and then we'll stop. It's only about ten minutes, I'm quite a speedy presenter, and then we can really open it up and have a discussion, if that's all right. Feel free to really challenge this, to give your opinions, to differ from mine and to disagree.
Okay, so although I work for a technology organization, invariably I can never use this, so I'm going to apologize in advance.
Why is this not working?
Okay, right.
Okay, the IWF, the Internet Watch Foundation, which I will call the IWF because that's what we call ourselves, is the largest European hotline, the second biggest in the world after the National Center for Missing & Exploited Children in the U.S.
It's our job to go after child sexual abuse material.
We're a registered charity in the U.K., an entirely independent organization. That means an NGO, not for profit, and we work incredibly closely with law enforcement and the online industry. We're funded by the online industry. The poster over here shows our members, about 120 of the biggest internet companies in the world. And they fund us, with about 80 percent of our money coming from them and 20 percent from the EU as part of the U.K. Safer Internet Center.
So what do we actually do? In 2014, and we have given you all a copy of our annual report, we assessed about 75,000 reports. Those are reports that came from the public, but we also have powers in the U.K. to proactively seek content, so we can go out and look for content.
We assessed about 75,000 reports, and of those, 31,266 were child sexual abuse content.
Now, a URL could have one image or a thousand images, so that doesn't equate to numbers of images. That was the most URLs we have ever brought down, and in fact this year we have already exceeded 50,000, so it's going to be a lot more this year.
Also as well as our core service, which is to work with the internet industry to remove content from the internet, we also conduct research because we're in a very privileged position in that we have access to these images and these trends and patterns on line, so we can actually use that access to inform our operations and share our findings.
In 2012 we were asked by Microsoft to undertake a really tiny little piece of research, a snapshot into looking at self-generated content on the internet.
In 2015 we repeated this exercise, changing the methodology slightly. And we're about to look at the next stage of this research.
I'm going to tell you quickly about the 2012 study.
We had our content analysts, actually 12 analysts, whose job it is to look at child sexual abuse content, assess it and take appropriate action. They dedicated some time, each time they found a URL that had child sexual abuse content, to assess whether it was self-generated and note that.
They had three criteria to look at. Was the content that was assessed self-generated and featuring a young person.
I think it's important to say we take action on young people aged 18 or under. That is the definition of a child, aged 18 or under.
But we rarely take action on children unless we can absolutely guarantee they are under 18.
So this means that the majority of the action we take on children is very young age groups, because once children go through puberty it's very hard to assess whether a girl is 13 or 18 or 21.
The majority of the content we take action on like last year, 80 percent of the content was of children we assessed to be ten years or under, and about four percent was of children we assessed to be under two.
So the first criterion was: is the content self-generated and of a young person, so under 18?
Has the content been assessed as a still image or a video?
Does the content being assessed appear on a parasite website? I'd like to define that for you.
Basically this is content that has been taken from its original location and uploaded onto another site. So in most cases this would mean it would be on a child sexual abuse website.
So it may have been a selfie taken by someone at home, shared at school, and somehow it made its way onto a child sexual abuse website.
The combined total time of that study was 47 hours, 47 hours of an analyst's time. And in 2012 we defined self-generated content as nude or seminude images or videos created by a young person knowingly engaging in erotic and sexual activity.
That's interesting because the definition changed by 2015 and is changing all the time, and that became one of the most controversial aspects was around the definitions that we used.
The actual full report is available on our website if anyone wants to see it.
So in 47 hours we found 12,224 images that met the criteria. And they broke down to just over 7,000 images and 5,000 videos and on 68 different websites.
And the statistic picked up most, in the media and with partners, was that 88 percent of the content had been redistributed from its original location. So basically once it's out there, it's out there; it's been shared and shared and shared.
The conclusion we made at the time came from a very limited snapshot. We took a quick look, a snapshot of what we could find. We wanted to get the message across that if your image is out there, it's hard to control the redistribution of that image, and we want to make sure people are aware it's going to be shared and it will appear on different websites.
But there were a number of limitations, because obviously it was only a snapshot. We were also very clear that we weren't doing anything to look at the motivation behind the taking of those images. So clearly there were child sexual abuse images, but we weren't looking at whether the victims had been coerced or groomed. We weren't saying this is an abuse image. That was outside the scope of the study.
Microsoft asked us to do the study again. We did it in a slightly different way. We knew more about the issue and looked at issues around definitions and how we might actually do it.
We wanted to see if the technology had changed, if the patterns were different.
We looked at a three‑month period this time. So we actually looked at three months of data based on public reports. So this is the reports that we received from the public.
This was actually prior to us having powers to proactively seek content.
It was public reports, historic data, and we did a little bit of proactively sourced content, because we had powers to go from a publicly led report into a proactive situation.
You can see we captured the content as follows. We wanted to look at the image and video category, the site type, and whether it was commercial content. There was a discussion about whether content was commercial or not in the previous session. We find around 25 percent of the content that we encounter is commercial. That means it's behind a payment barrier.
About 75 percent is freely available on the internet.
We looked at hosting location. We assessed the age and gender. And we looked at the suspected original provenance, which device it came from. We wanted to have a sense of whether patterns had changed around technology.
In this time we defined user produced sexual content as nude or seminude images or videos produced by a young person of themselves engaged in erotic activity and intentionally shared by electronic means.
During this time we encountered 3,803 images and videos which met the criteria, and 17 percent of those were of children that we assessed to be 15 or younger. The majority were in the older age group.
But looking at the younger group, and this is actually the controversial finding in terms of what we found, 85 percent of the content had been created using a webcam, and 46 percent of that was category A or B.
Now, in the U.K. child sexual abuse images are graded in categories A to C, A the worst, C the least worst. They are all pretty awful. A and B are the worst types of child sexual abuse images.
The amount of the content on parasite websites, those being child sexual abuse websites, was about 89 percent. Again, the full results of this can be found on our website.
Now, why is the issue of webcams important here? Because what that meant was a child going into the bedroom on their own, without an adult visible anywhere whatsoever, and actually doing things on camera which were being recorded.
Okay, so we did find that the amount of content on parasite websites remained static, and that 17 and a half percent was in the younger age group. So it is still mostly the older age group taking images of themselves and sharing them, or being coerced or groomed or whatever.
But the issue as we say was of the young children using webcam. It raised whole issues about parental supervision, about what children were doing in their bedrooms on their own.
The issue we tried to cover here was once again we were looking at what we found as opposed to the motivations behind it. So the truth is that a five‑year‑old or six‑year‑old or ten‑year‑old is not going to their bedroom and knowingly engaging in sexual activity on a webcam without a level of coercion and exploitation. And that's the reality of it.
But we weren't in a position, because we were looking at those images and videos on parasite websites, to understand the original source.
Okay, so what can be done about this? Because it did raise loads of questions, you know. Is it responsible for us to say that very young children are going into bedrooms on their own and doing these things? Actually, we feel it's important, because people need to understand the risk for their children. We think it's really important that people understand that children need age-appropriate access to the internet in their bedrooms, and they need to understand the risks and be protected.
We don't suggest it's one person or one organization's role to resolve this. It requires a multistakeholder approach.
So we need to do a lot to prevent these risks becoming harm. We need to work really closely with parents and schools, helplines, and awareness-raising organizations. And we need to start it much younger.
I don't know about other countries, but in the U.K. it's quite difficult to start a conversation with very young children in a school setting about how to keep themselves safe on line and the notion of sexualized behavior, that's a really tricky thing. But actually there's some really clear safety messages which need to be shared.
We need to actually work very closely with the companies that provide the services. One of the issues for us is that as it becomes easier and easier and easier to upload videos and new apps come in every single day, it's really hard to make sure that all the internet companies are totally on top of it. So you'll get the bigger companies who will totally engage in the process and look at ways in which they make their services safer, but actually it's really difficult to ensure that everybody is totally up to speed.
The researchers in academia need to share everything that they find about this, and also governments need to take this seriously, the issue of online safety needs to be taken really seriously.
So that was our research.
I'm happy to answer specific questions about it, but I wanted to raise a discussion and ask people's opinions: how do we keep people safe online, when is it appropriate to start the conversations, what do we do about self-generated content, how do we manage and deal with all that side of things?
One of the good things I can say at the end of this, and I have an information sheet about this, a number of times during the last couple of days, the issue of technology and hash lists has come up. We're developing a hash list.
Basically I'll just repeat what I said before. A hash is a digital fingerprint. Each photograph, each image can be given a digital fingerprint. So if you have a unique image, you give it a digital fingerprint. You can then search for all the copies of that photograph.
So this is going to be a game changer for us, because it's very, very rare that our analysts see new photos or images or videos. The majority of what they see are duplicates.
If we can search for a photograph which we know there are maybe hundreds of thousands of copies out on the internet, and bring those down, then it really is going to be a game changer.
If people want to understand how hash lists work, I have an information sheet on that. And that will really help with some of this content that we have been seeing, especially the younger ones. If we can identify it, we can hash it, and we're working on the video element of hashing now, and then we can actually go out and search for those images and help protect those young people.
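[Editor's note: the hash-list idea described above can be sketched in a few lines of Python. This is a hedged illustration, not the IWF's actual system: real deployments use perceptual hashes, such as Microsoft's PhotoDNA, which survive resizing and re-encoding, whereas the SHA-256 digest below only matches exact byte-for-byte duplicates. The function names and sample data are hypothetical.]

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # A cryptographic hash gives every unique byte sequence its own
    # short, fixed-length "digital fingerprint".
    return hashlib.sha256(data).hexdigest()

def match_against_hash_list(images, hash_list):
    # Return the names of images whose fingerprint already appears on
    # the list of known content, so duplicates are flagged without an
    # analyst having to view them again.
    return [name for name, data in images.items()
            if fingerprint(data) in hash_list]
```

The same matching logic carries over when the digest is replaced by a perceptual hash; the list of known fingerprints is what allows copies to be found at scale.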
Okay, so that's the end of the research, but I'd like to open it up to questions now, please.
Thank you.
(Applause).
>> AUDIENCE: Thank you for this research. I think this is what we need: more research, to have deeper understanding and to have evidence to base our programs on rather than just intuitions.
I will just disclaim this: I come from the Youth at IGF program. I work with youth in Mexico and have worked with youth at different stages. My personal bias is that I come from a context where self-produced sexual content among youth is used to perpetuate misogynist, adult-centric views. So you end up in a situation where if you sext, if you send images, you will be judged for doing so. It is not seen from a sex-positive perspective; it is seen at best from a protectionist perspective and at worst through a poor view of youth.
My concern is that sometimes in child abuse protection we can give a lot of fuel to these types of abuse, a different type of abuse that relates especially to late childhood or the teenage years. Right?
So my question is this. I think what you are saying, for example, for people under ten that go to their room, it's pretty clear there's a form of abuse going on there. I don't know if it was part of your research, or if you can answer based on your other work with child abuse: to what extent does this abuse happen because of online-based extortion? And to what extent does it happen because of the context they are physically living in, like maybe they are living in an abusive family, or they have been abducted? I mean the cases where you clearly cannot argue that it's a sex-positive stance from children.
>> MODERATOR: Okay, because it's a discussion I'm going to let some other people answer. We have some experts in the room.
So Mary Law.
>> AUDIENCE: You have raised many interesting issues, so I'll address the first part of the question, regarding teenagers being harassed in some countries.
One way to look at it: some agencies and organizations do think the best way to handle this new thing going on with self-produced sexual content, speaking of teenagers, not younger age categories, is not to criminalize this behavior, because there's an acknowledgment that adolescents are entitled to have a healthy sexual life.
Having said that, some countries from a legal perspective have accepted that. As a matter of fact, Austria just recently adopted a law, and it will come into force January 1, legalizing consensual sexting for teenagers over 14 years old.
So that's an example, I would say, in the legal field of a best practice: a country that is accepting the idea of adolescents having consensual relationships.
From a cultural perspective, we have to acknowledge that in most of the countries in the world there's still a resistance to accepting adolescents engaging in sexual relationships; from a cultural or religious perspective it's still a challenge in most of the countries in the world, in Africa, in Asia. And if from a cultural or religious perspective it's very hard to accept, then in the legal field it's going to be even harder, because the legal field follows.
Those are a couple of comments I would like to make related to your point.
>> AUDIENCE: Yeah, my name is David Cake from Electronic Frontiers Australia.
A couple of years ago we had a state law, and I contributed to the discussion around it and the inquiry that created it. In one of the states of Australia, Victoria, and hopefully it will be a model and spread, they looked at the issue of sexting, youth-created sexual material, and they distinguished it as a different issue from child sexual abuse material, and they created a separate law.
Essentially, if the exchange is between people who legally could be considered able to sexually consent, then that by itself is okay. But even people who could sexually consent would not be able to publish the material. So 16-year-olds, for example, might legally be able to have sex, and by that law they can legally have sexually explicit material of each other, but they can't publish it widely.
They removed some images from the child pornography, sexual abuse material law, but they then brought in a separate offense for nonconsensual sharing of those images.
And so you have to show there was consent for it to be shared. I think actually if you have nonconsensually shared it, you can still be prosecuted under child sexual abuse law if you can't prove you had consent.
It's also a separate thing. Part of the problem is when you are in this, I mean I know one of the groups here is actually saying no gray areas, but it is a gray area because it requires much more complex understanding, requires much more context for teenagers than it does for children younger than that.
Within that teenage context, child pornography law is a blunt instrument, where sometimes you need something more targeted that looks at who and what and the history in more detail.
That's our experience in Australia. We created a separate law and are hoping that will spread to be a model that is adapted more widely where you treat the two issues separately.
>> MODERATOR: It would be great to hear from maybe Africa or other countries, great, and also from young people as well.
If anybody wants to talk about it: I know in terms of the older age groups, in the U.K. we can only take action if we know they are under 18, so we set up a relationship with the NSPCC, the biggest child charity, where if they can confirm the child is under 18, say they're 17, we can then take action to get it removed.
We have a revenge porn helpline as well. For people whose consensual videos have been used for revenge, we have legislation now and we have a helpline as well.
I want to take a question there, then maybe some other perspectives.
Yes, sir, over there.
>> AUDIENCE: I wanted to share from Africa. My name is Alex Gakuru, currently director of IP Trust, a Kenyan nonprofit, currently implementing a youth online safety program, actually funded by Google. I'm not speaking on behalf of Google but from the experiences I have had running the program, a pilot involving a few schools.
What has been my experience is, A, this is a very, very new area. Nobody has been tackling it. It's been a vacuum. Forget about gray, it's just been a vacuum.
Parents have been afraid to discuss the issues. It's been taboo in most of society to discuss even sex. So even educating about sex in itself in the first place is another complicated issue.
Also, we are sensitive about freedom of expression, and we are aware that even developed nations have used the wonderful theme of child online protection to sneak in censorship, and we don't want to be pulled into that one.
What I want to share is also a paper that you will find interesting that questions the role of the legislation, government in terms of protecting children.
It's in the Fordham Urban Law Journal, volume 28, in the year 2000. The title: "Is COPPA a Cop Out? The Child Online Privacy Protection Act as Proof That Parents, Not Government, Should Be Protecting Children's Interests on the Internet."
What this paper argues is the problem and the challenges of the U.S. and others in approaching legislation to protect kids, and how these laws were overturned until they ended up with a law that protects children's privacy online.
It goes on to argue that, with parents failing at parenting, we then try to turn to technology and laws and everything, but that is probably proof of parents failing at parenting.
And if governments are going to raise kids and decide who the kids meet online, why isn't there a law as to who the kids are going to meet in school, on the estate, who they are going to talk to? Equally there should be another law about who the kids will talk to.
The fundamental question here: I think we need to start focusing on the people we ignore the most, the parents, and start telling them, we know we're all busy, but you must do your parenting job and start speaking to your kids. That is a principle.
As we look at the other measures, technology, legal, whatever they might be, we need to address that fundamental question.
Secondly, we need not demonize the internet, because other people are using it as a means of trying to show how terrible the internet is. It's not the internet's problem. So the third thing is we must create what is called a peer group. Peers are the most effective way of behavioral change. So the kids should be given some nice framework where they become ambassadors of good internet use. Even in the U.K., you find kids discovering the problem of the images when they get older, because the internet never forgets something sent when they were young. From the experiences of people who lost jobs because of somebody Googling social media and finding something from when they were kids, they start learning about keeping their digital footprint clean, and many of them are changing their behavior.
If you start that club, they will talk to their peers and you start to have a bigger impact. And parents also, because multistakeholders have a role as a guiding one. And the entire campaign must be led ‑‑
>> MODERATOR: Great advice, very similar to Safer Internet Day, which lots of us are working on. Fantastic points. I think people probably want to pick up the point about the parents.
We have one over here, then you sir.
>> AUDIENCE: That's a great point. Michael Kaiser from the National Cyber Security Alliance in the United States.
I think your research is great. I want to say thank you, and I wish we had an equivalent in the United States to look at and get a better sense of what our young people are doing in this regard.
I think this is the kind of thing where we have to try and understand whether this is becoming a normal behavior for kids, or whether it is somehow kids who are more at risk. We don't have all the answers, but we need to look at even more data.
Congratulations for sort of breaking the ice and bringing this issue. You all deserve a lot of credit.
I appreciate that.
We just ended Cyber Security Awareness Month. It happens in a lot of places; it started in the United States with a program we helped found. We did research with parents about rules in households about cyber security issues, which I think are directly relevant.
I think it's interesting because I think parents are a part of the legislative body when we talk about kids.
We talk about legislation, but parents legislate. Right?
>> MODERATOR: Yes.
>> AUDIENCE: In their own home. Some might be dictators and some more cooperative.
We asked about different rules, first of all having the talk, something we're all interested in. And a lot were having the talk, like 75 percent.
We asked things like do you have rules around allowing devices to be used in the bedroom after a certain time. Only 25 percent of parents had rules about that.
Right? This is directly related to this kind of issue. And you can triangulate from that kind of rule to this kind of result. And I think we haven't done enough work; I think it's a challenge for all of us in the online safety space. I think we have good ideas for what we think parents should be doing.
There was another one: 60 percent didn't even limit screen time per day. You know, didn't limit the hours during the day in which the device could be used. Didn't limit password sharing. All these rules we think are fundamental to online safety, that parents should be teaching, and they are not adopting them.
We have a disconnect between messaging and parents adopting those things.
I think that's a serious gap which, if addressed, could solve not all, but certainly a substantial part of this issue.
>> MODERATOR: Do you have ideas about how to bridge the gap?
>> I think we need to do more work. We have been promulgating these messages and somehow they are not getting through.
We can't just have a good idea about what the rules should be unless we talk more with parents about how they are struggling to put those rules in place. That's where I think there's a big gap.
>> MODERATOR: Thank you.
>> AUDIENCE: Hi, Edmond Chung.
Following up on what David was saying. First of all, great study. I congratulate you and this is very timely.
I remember a few years ago, probably '09, this was one of the topics brought up at one of the IGF settings: youth-produced sexually explicit content. Kind of a big knot, and problematic. I think Marion mentioned earlier that cultural sensitivities make it very difficult not only to decriminalize the issue but even to talk about the issue.
I wanted to ask David, because I think that's a great idea. In fact, in a training I have been running on internet governance, this is a topic that I usually bring up as a very tough one to tackle.
I think the experience from Australia is very interesting. Not only are you decriminalizing, but you are also in some ways protecting the rights of certain young people: by the time they can do it, they can also see themselves doing it, in some ways.
The more interesting thing, though: I'm curious, since the passing of the law, have there been any cases? Because once we get there, there are a lot of gray areas.
You provided consent. You might have provided consent the first time you shared it. How do you take back consent? For example, in a relationship, I may be giving you the consent, but after the relationship I probably want to take back the consent.
How does that work? Are there cases? Because we quickly run into very difficult cases in terms of defining whether consent was established or not for sharing.
I think it's a great direction, but I'm curious what the experience is with the law in actual action.
>> Okay, I don't know if there have been any cases. That's actually a state I don't live in, so I haven't been following the results that closely.
Also I want to say Australian law remains a total mess, because we have state and federal law, and they are totally different. The federal law is incredibly overbroad.
My understanding of the law they passed is that there's no way to revoke possession of an image, but no matter whether you legitimately obtained it, that gives you no right to share it. In fact, sharing it is a criminal act. I don't think there's any provision to revoke consent for possession of an image once you already have it. I think you would have to pass a whole different, and very difficult to enforce, law to make that happen.
>> AUDIENCE: To build on what David just said, the revenge porn act that Susie mentioned is a similar thing. It's very recent in the U.K. Basically, what is criminalized is the act of sharing without consent. You're not criminalizing producing with consent in a private relationship, but when it goes out without the consent, it's the dissemination that is being criminalized.
In light of the findings, I think that's the way to go because then you lose control of the material and it's copied and reproduced and ends up in collections of sexual predators.
>> DAVID: One other thing: I'm pretty sure that there's some provision to deal with it in less aggressive criminal ways when the people who have committed the act are themselves under age.
So basically they are worried about images being passed around high schools, those sorts of communities. And the best way is not necessarily to threaten 15-year-olds with jail. There's some flexibility in how it's applied.
>> MODERATOR: We have talked a lot about the difference between self‑generated content which people have consented to in some kind of way, then quite a difference between a five or six‑year‑old being groomed or exploited, coerced and going into their room and having images taken of them.
I'm wondering if anybody has points about parents, the issues we have talked about, parents' responsibility, the legislation, how we get the messaging across.
You know, so anybody has anything they want to share or want to challenge, please feel free to go for it.
I'd love to hear from some young people as well, so yes.
>> AUDIENCE: Sorry, I raised my hand before you said the word parents.
>> MODERATOR: It's okay.
>> AUDIENCE: I wanted to go to the point of commercialization of the content. Because when you talked about the methodology, you said 20 percent of the content was assumed to be commercial, and you said that means it's behind payment barrier.
My question is, wouldn't you think that there is more commercialization because we know that there is a phenomenon where on the one hand images are swapped between people, not like sharing and consent, but handing over an image and getting something in exchange. And in many many cases the images are used as a fee to get access to more images.
So you need to produce new image to show a new image, to get into a photo, then you have access to lots and lots of other images.
I would say this is also kind of a commercialization, and therefore I was questioning a little bit the 20 percent that you mentioned.
Could you explain that.
>> MODERATOR: Yeah, and you make a really good point.
I guess the reason we make a separation between commercial content and freely available content comes from the European Financial Coalition, which we're very active in. If content is behind a payment barrier, it's normally an organized crime situation; the people behind a payment barrier might be selling drugs, child sexual abuse, hit men, whatever. Whereas the sharing of content is about behavior, about collectors. It's a different kind of motivation, and that's why we separate them.
You're certainly absolutely correct that there is a bartering process, not only in terms of "I have a better collection than you" peer to peer, but certainly some of the very worst sites you can't get onto unless you submit unique content.
You're right, it's complex. We separate in terms of the payment barrier because we deal with what's on the open internet, and we don't have authority to go beyond a payment barrier, because we would then be paying for criminal content.
But yes, you're absolutely right to raise that.
Yeah.
>> AUDIENCE: Sorry for stepping away from the five‑year‑olds; I promise this is my last comment, but I want to respond to Michael, I think is your name. I'm not a parent, and I'm also not sure if I'm young; I'm well past 18.
I also don't lead a national initiative, so I don't know if this is going to be useful at all. But I do think we face the same problem, which is how to have impact with our communications and advocacy beyond setting good rules. Like how we get people to brush their teeth three times a day beyond just telling them to.
Something that worked at a micro level in my context with parents was not talking about their responsibilities but opening a space for their fears. Again, this is not like a two‑hour workshop, which is basically good for nothing in my opinion, but a process of four‑hour sessions, four times, with a group of parents. It was not framed around what they should be doing with kids online, but around addressing their fears of not understanding what kids were doing with technology.
These gatherings began because of a girl who was almost a victim of child trafficking because of Facebook activity. This brought the parents together, and it was first about them having a chance to speak about what they feared: what they feared would happen if they weren't good parents, or what their kids would get themselves into, and then understanding the basics.
After this process we can say that basically 80 percent of the participants reported three months afterwards that they had introduced new rules and new conversations in their homes.
Now this is a micro level.
And working with youth, I have less experience. But rather than framing it in the media literacy curricula, we started working with sexual and reproductive rights groups rather than the people who talk about ICT, because their interests go to other stuff; they go to YouTube and so on. When we spoke with people who have been working on how to use condoms and how to get people to ask about the pill and things like that, that was a more natural link to the discussion on sexting than the technology side. Again, that's in my context in Mexico.
>> MODERATOR: Thank you, yes.
Thank you.
>> AUDIENCE: Hi. I'm David from the Dokas Foundation.
I want to share a very interesting piece of information from a partner in China. We had research done last year by a Chinese partner, the China National Youth Palace Association's media literacy research center, covering around 20,000 children aged 3 to 14 years old and their parents.
The research looked at how parents are providing guidance and at the conduct of the children themselves.
The research found that around 61.7 percent of parents do no screening of the apps on their children's mobiles and tablets, and 67 percent of parents have no agreements with their children on how much time can be spent and what kinds of materials can be browsed online.
So we would like to show that children and parents should have this kind of agreement in place, and that guidance is needed from the parents' perspective on doing things like this.
Besides this kind of information to be shared: I'm also a youth, although I'm not representing the children aged up to 18. Children under 18 are using apps like Snapchat, or Skype or others. They can share information on Snapchat that can't be traced, that is erased in a very short period of time. If our youngest generation is using these kinds of apps for communication, it's hard to check whether they are using these platforms to share information which is not appropriate.
So how do we tackle these problems? I appreciated the lady's point about media education being included in the curriculum. Media education is one of the ways for kids to learn how to use the internet appropriately. I think it's a better approach than relying on law enforcement: having children and youth exercise self‑discipline over the main problem.
>> MODERATOR: Thank you.
One, two, three.
>> AUDIENCE: Okay. I have a question and a comment, suggestion.
Let me start with the suggestion.
We are trying to, okay.
>> MODERATOR: We're just feeding.
>> AUDIENCE: We're trying to develop a curriculum for children from KG‑1 onward that will help the child be safe online from childhood itself.
The curriculum will have three dimensions: the child, the parent and the teacher.
The parents also have to be educated along with the child: how do you parent a cyber child?
So this is one thing that we are working on. It's an ambitious project and we are taking care of it.
I have a question, basically. Normal sextortion is: okay, I have your e‑mail; if you don't do what I say, we will publish it on the internet. This is what normally happens.
We have a couple of issues happening in India. Instead of blackmail, I call it white mail or gray mail.
The person will come back and say: I have already published your images online, and here is the link, with a user name and password. If you don't do what I say, I don't have to do anything: the user name and password will be removed and the URL will become publicly available.
In this case she cannot go to the police, because he doesn't have to do anything; even if you catch him red‑handed, if you don't comply, the images will be published automatically.
So have you come across these kinds of issues, and do you have a way to handle this kind of thing?
>> MODERATOR: Audrey, do you have a response to that one? Do you want to respond and we'll take a point from here and there as well.
Perfect, thanks.
>> AUDIENCE: Thank you, Susie.
Actually I'm Arta from the Dutch hotline. We see a lot of reports happening this way; various forms of extortion.
Actually it is illegal. Even if it's behind a password or whatever, you cannot extort somebody. You can go to the police. The fact is, 90 percent of the time you will never hear from it again; ten percent do publish it online, and then you're in big trouble.
We always say you shouldn't pay, because they will always ask you for more money. Extortion is like that; it's just there in the world.
Another thing I wanted to add, about how to educate the children: I think we have a big problem here. I think we are the only ones in the room who actually realize what can be done on the internet. We, probably, I hope, are the ones who make our equipment safe with strong passwords, because we know what will happen online if we don't.
Most of the parents, most of the normal people don't use strong passwords, don't have virus scanners on their equipment. They are not safe at all.
Now, these are grown‑up people who know that A leads to B.
Children don't have any clue what might happen, and they are even more inclined to think it will never happen to them. It happens even to educated kids who think it will never happen to them.
I think the challenge is there.
>> MODERATOR: A really great point actually.
>> AUDIENCE: Yeah, during this year's internet day we conducted a very successful experiment: we decided, you know, to pretend we don't know the answers to this problem.
So we got teenagers from ten schools. We got them at a round table with parents, teachers, even government policy makers and the minister of education, and just asked a simple question: what is it about the internet that you love, that makes you want to be on it? And the kids were amazing. The sorts of things they said wowed everybody; things we thought we knew, we found the kids knew better.
Then we asked them to think: what are the bad things on the internet that you would want to be protected from?
They said a lot of amazing things.
Lastly we asked: what do you think should be done to keep you safe, so the good things you want online can continue and the bad things can be prevented?
They had a myriad of answers.
Some said parents can do nothing.
Some said if parents think they can follow us on Facebook: if they join, we quit and go to another app.
If they join that app we get out, because we don't want them where we are, which is what happens in real life.
Now, the moral of the story, to cut it short, is that I think this dialogue works best when you get all these people around the table. It keeps things current, so you discuss current issues, and you get amazing solutions to complex problems. You learn along the way, and you find even parents appreciating what's happening. It creates an environment that is not top‑down, "father knows all", with a net nanny trying to dictate. The kids feel they are heard, and it becomes a dialogue that can continue, a very light and interesting conversation.
We found that to be extremely interesting and I thought maybe I should share that experience.
>> MODERATOR: Fantastic example, thank you so much.
>> AUDIENCE: Hello, everybody, I'm Florian, I'm 19, so a little older.
I want to make a point about the parents: why we maybe do not reach them and why it's hard to communicate.
I would say my parents know what can happen if you don't use a condom when having sex. They don't know what will happen if I share my private pictures on Facebook.
I think that's a key issue. I will make a risky prediction: when my generation starts having kids, and some people already have, this issue will fix itself to a certain extent, because then there is understanding from the parents. But it has to be understanding gained by experience. It's hard to teach parents what can happen; they have to experience it.
So in a few years' time, for the young parents, it won't be an issue.
Yeah.
>> MODERATOR: You think it's just a temporary issue really.
>> AUDIENCE: Not really, because the technology is also changing. When I'm having kids, in I don't know how many years, there will be new apps and technologies that I don't understand.
But there is a slight chance that the technology won't change that much, that the logic behind it stays the same, and I will understand what can happen to my kids.
>> MODERATOR: I have big kids, 17 and 23. They just accept different modes of behavior as well. So it's not just about the older age group. As you say, it's the consensual thing. They actually think certain things are not a big deal that drive me crazy.
When my daughter was meeting one of her close friends' boyfriends, I said, have you met him before? She said, no, but I have seen his penis. Without even, you know... okay, right, fine.
You know, anyway, here we go.
Yeah.
>> AUDIENCE: I'm a little bit skeptical of relying on "you didn't know better" or "you couldn't foresee your future." I think even adults make that mistake often. For example, when you say something and then go, oh, I shouldn't have said that. I mean, is it truly that nobody took the time to train you and make you aware of what was in your future?
We all make decisions based on what the present is. When I was 15 years old and posting things of myself online, I was thinking of the current authorities, which were my parents. Now, ten years later, I still make decisions that maybe when I'm 35 I will look back on and think, I shouldn't have done that.
And I don't think I would call myself ignorant; I think I reflect a lot about these things. I understand the framing is, let's try to get people to think about their future if possible. But even if you try to foresee your future, there will be things that, when you assess them in real life in the moment, won't look like that. And then are we going to say, well, they are stupid, ignorant? Is that what we really think of people who uploaded a naked picture of themselves, that they are ignorant and don't think about the future?
>> Hi Susie, Jim Prendergass, and I'll preface by saying I'm the father of a ten‑year‑old girl and a 12‑year‑old boy.
Thinking about what you said: Susie has given the youth plenty of notice, so this question is for you guys and you guys only.
There's the parent telling the child what to do or not to do; and as you alluded to as well, you're not going to listen to us, we're your parents, you're not supposed to listen to us. That's part of the cycle of growing up. We all went through it.
The question I have for you: when it comes to issues like this, who do you listen to? Is it solely about making the mistakes on your own? Or do you actually listen to your friends, your teachers? Are there other folks who, when they talk to you about these issues, it sort of resonates, and you say, okay, maybe I'll give that a shot, even though it's definitely not my parents speaking to me?
I'll toss that out and see if anybody wants to take it up.
>> MODERATOR: Florian? Do you want to respond? Okay.
Anybody else want to respond to that? Great.
Younger people right now.
One second.
Okay.
>> AUDIENCE: Certainly a very good question. I remember when I was in secondary school, there was this love talk where we learned about condoms and sex and all that stuff, from basically random youth trainers I didn't know, who came from the big city; I'm from the countryside, and my teachers had no relationship with them before. Afterwards I could talk to them about my private fears without fearing they would tell everybody I know, because they were not in my social connection base, basically.
So I think, yeah, trainers that just go around the country and inform people are probably a part of the solution. Not the whole, but certainly a part.
>> MODERATOR: Yeah, have you got something?
Take the young person first.
Yes, after.
>> AUDIENCE: Yes. Answering your question: what our parents tell us is of course very important. But it's also what we see in our schools and what happens to other people in our environment; we see what happened to them. And if I ever sent a nude to anyone, I would know that I was taking that risk. That's how we learn the danger of posting a picture with sexual content, yeah.
>> AUDIENCE: I'm Haley, and I'm a youth as well. To respond to your question, I would say peers definitely play an important role in influencing my decisions, but at the same time my parents actually play a more important role, depending on two main conditions.
The first is how knowledgeable my parents are. My father is an engineer, and he knows a lot. If he shows me he really knows a lot about the internet and gives me some evidence, like the consequences of some wrongdoing or whatever, then he's knowledgeable and I will believe him.
The second condition is the relationship between parents and children. I think for most of those naughty kids, the reason they are not listening to their parents is that the relationship is not that close, and that's the root of some of the problems: they try to play around with other kids, or sometimes their friends are not that good an influence, and that's what causes this kind of problem.
So the role of parents is really important, and I think we have to address the two conditions I mentioned, yeah.
Thanks.
>> AUDIENCE: Just want to add to this question.
If you're interested in the behavior of teenagers online and how they deal with privacy and social media, I highly recommend the research of danah boyd. She wrote a book called It's Complicated, for which she interviewed hundreds of teenagers about their use of social media and privacy, and it's really interesting. But if you are looking for a single, simple answer to how teenagers deal with privacy, well, the book is called It's Complicated; that's a bit of a give‑away.
>> AUDIENCE: To your question, Jim, even though we know as parents that some of what we're going to say won't be listened to, it's our obligation as a parent to say it and to sort of discuss it with our kids.
I mean, this is parenting. To your point, parenting in the 21st century is maybe much harder than parenting in the 20th century, I don't know. But to your point about your father knowing: most parents know very little about technology, and the kids are wiser in terms of technological knowledge.
But it doesn't mean that parents can't guide you, because they have more experience and they know better because they're older. You can teach them how to use technology, but they can teach you how to behave, because as you said, it's not about technology, it's about human behavior.
Offline, online, it's human behavior. It's the decision you take. You're using the technology, but after all, it's just a decision you take, to click or not to click. It's in your brain, your heart. It's not about the technology. Technology evolves; crime is going to evolve.
Again, parents have to be around. Peers have to be around. Family has to be around. We all have to discuss those issues.
>> MODERATOR: Thank you, we have one, two, three.
But before we hear from people who have spoken before, is there anybody who hasn't spoken yet who really wants to speak?
Okay, some new voices then we'll pick up.
Thank you.
>> AUDIENCE: Hello, I'm Felix from NetMission.Asia. To bring in a perspective from Asia: we are usually very subtle in talking about sex education. In the sex education I received in high school, we couldn't hear anything about sexual organs at all, and I never received any sex education from my parents at all.
I think from an Asian perspective, a lot of these youth‑produced sexual materials tend to be produced in a really private setting, and we won't even discuss with our friends that we produce such content. So I think it's really an issue for us to identify such images. Because for average parents, if they knew that their kids were producing these images, they would certainly do something about it at least.
But we're talking about situations where none of us know that the kids are producing such materials. I think that is a concern we have to address as well.
>> MODERATOR: Right. That was a really interesting point.
Yeah, so we're going to go one, two, then three.
Okay, sir, thank you.
>> AUDIENCE: Adam Joy.
I thought the interchange just now was really good, and it really highlights what we hear and talk about as the multistakeholder approach. It's not so much about adults teaching kids or saying you need to do this or not do that, nor is it about kids knowing everything.
I think one of the conclusions that we really need to draw from this conversation is that all the people, including myself, who are creating training materials for kids need to remember, from the very beginning, to engage kids in the production of those materials, and in how this type of thinking can be conveyed to them.
So that's really, I think, one of the very important things.
To the gentleman over there: one of the things I always say when I do training is that you guys actually know better than some of the experts, because you lived through Facebook in grade school, right?
You're the first generation that has done that. If you think that by the time you're parents you will know everything, that's probably also wrong, because I'm hoping that at least not everyone is a victim, right?
So you have to learn from others as well. That's one of the take‑aways, really: not just in how we conduct training or education sessions, but from the beginning of designing materials we need to engage those quote/unquote experts who have lived through the process.
That's one of the take‑aways I have pretty clear.
>> AUDIENCE: Yeah, it's about behavioral change. I think we have always done better when they speak for themselves.
There's a website I have been following since 2005, and I still find it one of the most powerful things in terms of behavioral change: SADD.org.
Www.SADD.org, Students Against Destructive Decisions.
They pick issues on various aspects, and this creates clubs around the dynamics of those issues; maybe it could be drugs, tomorrow it could be an issue with technology, it could be anything involving decisions. Unless we create an environment where the kids can discuss the issues they want... Because he may say what he's saying back there right now because he hasn't got a kid yet, but by the day he has his first kid, people get transformed.
I have had somebody tell me: the moment I have a kid, I have to worry about protecting the kid; but before that, they wanted the freedom.
The idea is to push them into something of that sort so they can start making decisions, even with you out of the picture, because they will not listen to you as a parent, or will at least be rebellious; it's part of growing up. You have a chance of saving them before they go off to find their own direction.
>> MODERATOR: Fantastic.
>> AUDIENCE: A comment on what was said earlier about parents not doing the right thing in security: I think it's true. Most people actually know what they are supposed to be doing but don't do it, right?
I don't think they don't know; I think they don't implement it.
To your point, I have some confidence in the generational issue. Generations do change things. Some of us know that in the 1960s, if you were caught smoking pot, people told you it was the end of the world; by 2015 it's a different world when it comes to marijuana, at least in the United States.
There are shifts.
And a couple more points. First, we have to cut parents some slack. When we did our survey, like I said, a lot of parents felt, and this is the challenge to young people, 60 percent of the parents we asked said they knew everything their kids were doing online.
Now, we know.
>> (Chuckles).
>> AUDIENCE: You heard me. There's not a conversation happening the other way. So it's hard to parent when you think, I've got it under control, but no one is helping you understand what you don't know.
So I think that's an important challenge to young people, because you are exploring the technology. The conversation has to go two ways: it's not just about parents, it's about kids saying, here is what we're doing, this is my online life, these are the apps we're using.
It's really important for them to know, because clearly at least in the United States they don't.
The other thing I would say, we're all in this together.
The example I would give is this. As a parent, when I finally teach my child to drive, I will probably have been driving for 15 or 20 or maybe even more years, right? I have a lot of experience driving. So when I get behind the wheel to try to teach my child to drive, I have a ton of experience.
But we, parents and children, are both the first generation using this technology. We're using it together for the first time. None of us really know the rules when it comes down to it. It's not like I have 20 years of experience to teach my child how to use the technology.
We're all in this together, I think, is part of the issue here. We have to figure out how to do that a little bit better, though parents, I feel, always feel they have the responsibility to know it all.
So we have that challenge too. I'll leave it at that.
>> MODERATOR: Thank you, Florian, then a comment here.
>> AUDIENCE: First I want to comment on what the young man from Asia said about parents doing something about it if they knew their kids were sending nudes on social media and social networks.
It's a very good point, because that's actually a problem we should start tackling in society in general: what's wrong with sending nudes? It's sexy, exciting, exploring, it's fun. So why shouldn't I do it?
The problem is, as we just heard, I would never tell my parents, especially not when I'm younger, because they would try to stop me from doing it. Because it's always about the concern; this topic is never about the positive side, about exploring myself.
There's really a deeper issue we should maybe also start to tackle in the workshops and trainings people do with parents: it's also a good thing and not only a bad thing.
Maybe to come back to your example of car driving: as a parent you also have 15, 20 or even more years of experience in sexual interaction. So you should also know that it can be exciting to do uncommon stuff, and stuff that other people don't know about.
There is something special about sending nudes without parents knowing. So that's a problem we have to tackle in society, and not with education about how to prevent sending nudes. It's deeper.
>> AUDIENCE: I'd like to speak.
Good afternoon, I'm JD from Hong Kong. I don't think everything that children do online is really known by their parents. Many of my friends have a lot of IG accounts, and some of those accounts they do not allow their parents to follow, because they have a feeling that they want to leave some space; they don't want their parents to know everything. I have asked my friends why they don't let their families or parents follow their IG accounts, and they say it's nothing, they just don't want it: the photos they post are about their school life, not very related to their family life, so they want to separate them into two different accounts.
And do the teens really know what they are doing, and know the result of posting some video? There is a very typical example in east Asia: some apps, just like social networks, where you can post videos. And there are a lot of videos containing sexual meaning, posted by young people just because they want to gain a certain amount of attention in the apps.
And due to the poor management of those apps, a lot of these videos stay up and are not deleted.
I think it is a very typical example showing that not every teenager really understands or knows what the result will be. And it also raises another problem: in Hong Kong there are no lessons talking about those issues; we just have occasional talks in our schools about what you can or can't do online.
So I just wanted to ask what programs could be carried out regularly to make education about those issues more effective.
>> MODERATOR: Thank you, that was really interesting.
Thanks a lot.
We're going to take another point here, then we're going to start wrapping it up.
You have not spoken, so yes, you're next to speak.
Before we wrap it up, if anybody has a really burning and important thing they have to say, you will be given an opportunity, otherwise two more speakers.
Okay, thank you.
>> AUDIENCE: Just as you were saying, to cut some slack for parents (sorry, I don't know the phrases in English), I want to say the same for youth. I'm old enough to cringe when I read comments on Ask.fm or Secret, but I'm young enough to remember what it's like to be able to go into an online chat at age 10, before you know how to keep a secret.
So I guess I'm in a position where I'm neither old nor young, but at the same time I see older people really get frustrated and create problems out of youth behaviors that I'm not sure are problematic.
I think we're all on the same side. We all want young people to flourish and live the lives they want to live, and we don't want to see them harmed. That's why we are here. We want them to have access to technologies that will change their lives for good, and we want people at risk today not to be made even more at risk by technology.
But at the same time, when we are trying to protect children online, what harm are we creating for them? I have sat across from cyber police members who openly admit they key log their kids and take away their privacy.
Privacy is needed when growing up to develop social skills and assessment.
I know this guy has good intentions, and I doubt he wants to see his kids unable to do stuff when they are older, but do we really think, collectively, that key logging and reading everything, promoting the surveillance of kids, is the way to go about it?
Then again, we never really spoke about the five‑year‑olds being abused and kidnapped right now, suffering a lot and not getting the start in life they deserve, because we're focusing all our effort on teenagers sexting, because that's closest to us. We are not making our best effort to do right by the kids who are not doing okay today.
Just as we should cut parents some slack, we should cut some slack to youth; most of us turn out okay. We should focus on those who are not okay, because technology is not necessarily making things better for them.
>> AUDIENCE: Yeah, I wanted to follow up on the point that sending nudes is also a healthy way of exploring your sexuality. We should be building the kind of environment with an attitude that sexuality is okay and sending nudes is okay.
We can still see, even with adult celebrities and the nude leaks, that people are shaming the victim: not the one who hacked their computer or is sharing the pictures, but the victim, for a healthy sexual expression.
I think that's a big problem in our society.
>> MODERATOR: Thank you very much.
Anybody who hasn't spoken yet who wants to say something? Anybody who has a burning last thing?
Yes, I know.
Anyone else? Hang on, new voice here.
I'm going to have to give priority to a new voice.
>> AUDIENCE: Hi, I'm Juliana.
You're talking about views; unfortunately, we live in a culture of views: what can I do to be more famous, what can I do to be more sexy?
People need to love me, so I post a photo, and then I may regret it tomorrow when I'm 25 or 35.
We need to think about the parents too. It's at home that you learn this stuff. It doesn't matter what people say about your nudes, that it's safe to send, it's natural, it's okay; you need to think. Once you send some photos, you run the risk of them ending up on the internet.
In Brazil we have a very famous kid, MC Melody, whose dad posts very sexy photos of her, and she is only eight years old. She is very famous in Brazil singing songs with sexual content, and the parents don't do anything. She's very, very famous, she and her sister; she is eight years old and her sister is ten. She dresses like a woman, but she's a kid, and she has followers who treat her like a woman. She dresses like... I can't say everything. It's complicated.
So we really need to think about what we can do with these kinds of parents. It's easy to say, if I become a parent one day I will teach my kid, blah blah. But this father posted sexy photos of his kid, and everybody says, oh, she's cute, oh, she's sexy.
An eight‑year‑old girl? No.
>> MODERATOR: Right. I'm going to actually stop now, because it's a matter of personal pride to me that I have never gone overtime in a session, and I'm not about to start now.
I think this has been a great discussion.
I just can't thank you all enough for really engaging in the subject.
I know we focused on the older age group, but that was because that's what people here wanted to talk about.
Obviously we really are aware there's a difference between the very young, very vulnerable children who really do need to be protected and we need to do what we can.
Really interesting discussion. Absolutely love the range of views.
Can't thank you enough. In particular all the young people who spoke. I know for some it's hard and tough. It really takes guts to stand up and speak.
I remember when I first stood up, I actually went off and was physically sick afterwards. It's a really tough thing to do. So thank you so much. Thank you all for your participation, and have a really fabulous evening.
Thank you.
(Applause).
(End of session).