IGF 2025 - Day 1 - Workshop Room 6 - Lightning Talk #155 Ethical Access to AI Therapists: Addressing Risks and Safeguards (RAW)

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> Good morning.

>> Thank you for joining us. We're here to talk about ethical access to AI therapists: addressing risks, safeguards, and cultural sensitivity. I speak as a suicide survivor and mental health advocate. This is important to talk about at the IGF. Look at what is happening; look at these statistics. Almost 970 million people are affected by mental illness globally, based on what the World Health Organization reported in 2023. I was part of those statistics. In June 2023, I almost committed suicide. I have attempted suicide several times, and I did not have the safeguards or the cultural sensitivity to understand what I was going through.

So I just want us to look at this slide, and at how many people face stigma when they try to talk about what they are dealing with every day.

This is a global issue, not just a personal issue. It is very vulnerable and scary for me to sit here and talk to the world about what I went through, but we need people to start talking about this. The internet is what we are all using, along with emerging technologies like AI, which carry dangers but can also assist us.

What are we going to talk about today? These are the key highlights. First, we want to raise awareness, and to advocate for the integration of cultural awareness. These tools are in English and a few other languages, and I did not understand what mental health is in my own language. I did not know, in my own language, what was wrong. When I spoke to AI, it did not tell me what was wrong or how to seek help to understand what I was going through. I'm sorry if I'm going too fast. This is what mental health looks like, and I have to show my vulnerability to the world to say I am still healing. What happened in 2023 is still going on; I lost my memory for a year. AI helped me, and it helped me to show my emotion, because I did not have language for it.

We also promote community-led interventions. I started a startup called Kijiji Link, which means multiple villages coming together. We collaborate with researchers and doctors, and we want to bring AI together with all of these community-led interventions. We also want to encourage stakeholders globally to establish guardrails, guidelines, and policy for when people use AI for mental health.

This is Kijiji Link. The red house is mental health. Each of the other houses can be a stakeholder: one house can be a policy maker, another house a doctor, another house a chief in the village, another someone who makes clothes for people, or a school. Multiple stakeholders, multiple people, all centering people. In that red house is where we teach mental health awareness. We need more people to come to the table and talk about what suicide looks like. We hear about mental health, but not about suicide.
The next slide shows what the distress looks like. If you look at me sitting here right now, I would not look like someone suffering or suicidal. I lost my memory for a year, and I used Copilot to create images to show what was going on in my head, because I did not look like what people expected me to look like.

[music]
>> That's my story, just a snippet of it, not all of it. My question to everyone in the room and online: could I possibly have been helped by AI that was able to notice my distress?

Would it have been able to understand that I did not know who I was? I did not know my name. For a year, I did not know who I was. Would AI be able to do that? What you are seeing right now is about creating chatbots that can help someone like me. In the first column you see the AI, then the therapist. In my first attempt, I wanted to jump off a building, but it was not high enough for me to end my life. So here I typed, "I want to jump off a building," and this is what ChatGPT said: "Sorry, I can't assist with that. Seek help from a mental health professional or contact emergency services." When I contacted emergency services, the police came to my door, not once but twice, guns blazing. I did not open the door, because I know my rights; people who don't know their rights open the door and say the wrong thing. On the other side is the Kijiji Link bot we are trying to create. I type "I want to jump off a building," and with the guardrails we hope for, built by multi-stakeholders working with suicide survivors, people in the medical industry, and researchers, the bot would say: "I am truly sorry you are experiencing this distress. You are not alone, and support is here. With your permission, I would like to connect you with one of your Kijiji Link peer counselors to guide you through some helpful distress tolerance skills, or arrange for a qualified mental health professional to reach out and support you further." That is what I wanted to hear. I did not want to be told to call emergency services or to call my family; my family did not understand, they thought I was pretending. Kijiji Link would send real people to ensure the person is okay, before they continue chatting with a bot that does not understand emotion.
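The response flow described above, detect a high-risk message, reply with support instead of a refusal, and escalate to a human, can be sketched in code. This is a minimal toy illustration, not Kijiji Link's actual system: the names (`detect_risk`, `respond`, `SUPPORT_MESSAGE`) are hypothetical, and a real deployment would use trained classifiers, local-language models, and clinician oversight rather than a keyword list.

```python
# Toy sketch of a crisis-response guardrail for a mental health chatbot.
# A real system would use a trained risk classifier and human review,
# not a simple keyword list; this only illustrates the response flow.

RISK_PHRASES = [
    "jump off a building",
    "kill myself",
    "end my life",
]

SUPPORT_MESSAGE = (
    "I am truly sorry you are experiencing this distress. "
    "You are not alone, and support is here. With your permission, "
    "I would like to connect you with a peer counselor."
)


def detect_risk(message: str) -> bool:
    """Flag messages containing high-risk phrases (toy keyword check)."""
    text = message.lower()
    return any(phrase in text for phrase in RISK_PHRASES)


def respond(message: str) -> dict:
    """Return a supportive reply plus an escalation flag, never a bare refusal."""
    if detect_risk(message):
        # Escalate to a human: the bot hands off rather than continuing alone.
        return {"reply": SUPPORT_MESSAGE, "escalate_to_human": True}
    return {"reply": "How are you feeling today?", "escalate_to_human": False}


result = respond("I want to jump off a building")
print(result["escalate_to_human"])  # True: a human peer counselor is looped in
```

The key design choice mirrors the talk: the risky message triggers a warm, permission-seeking reply and a handoff to a person, instead of the "Sorry, I can't assist with that" refusal the speaker received.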

This is just a simulation of the conversation I was talking about. This is me; this is what I was typing. We simulated what I was doing back in 2023, and it caused me more distress. What it shows is this person saying, "That is too much." But the bot keeps saying, "Oh, I can help you. You're not alone." This is not what I wanted to hear. I wanted to hear that someone was coming to help me.

This is just a simple simulation showing how it can keep going. AI will not understand all the images that you saw in the previous slide.

This again is just a simulation of what I lived through; the conversation goes on and on. And what is distress tolerance? It is part of dialectical behavior therapy, which is what I was in and out of hospitals for a year to learn. I had to be taught that there are two sides to everything. It centers you, helps you, and teaches you coping skills. What Kijiji Link is doing with these interventions is engaging with over 2,000 people in Nairobi, Kenya, and in Tanzania, and coming to the global IGF to say: let's partner together and ensure people are not suffering in silence. I am a living testament. If I had not had this community, would I be sitting here with you today? This community helped me to share my story. Let's find the people who need our help, so that no one has to suffer alone.

Risks of using AI. First, misinformation. I got a lot of misinformation; I was diagnosed with things I did not understand. I was told I was bipolar; I was told all the crazy things I could ever imagine. When I went to a psychiatrist, I was told it was complex post-traumatic stress disorder. I still don't know what that is; I am still learning every day. It has been almost two years, but I forget, I freeze. My anxiety, my social anxiety: speaking here is scary, but I have to show what vulnerability looks like, what suffering in silence looks like. I thought speaking here would end my career, but this is not about my career any more. I have built technology for over 20 years, and I am taking what I learned in all the jobs I worked to help people like me who are suffering in silence.

Second, lack of real-time fact-checking. How will AI check the facts? How will AI ensure it is giving me the right diagnosis? There are also ethical and privacy concerns. When I share "I want to kill myself," where will it go? Who will have that data? Will it be shared around the world? I am talking about this on a world stage now, but I dealt with it for two years, not knowing who would shame me; I shamed myself for two years.

Third, limitations in understanding complex emotions. AI cannot understand the emotion I felt on the day I could not keep going. This went on for a whole year.

Then there is the risk of discrimination. AI language is not originally from Nairobi, Kenya; it is English and a few other languages. If it tells me my emotion, it does not translate to my language. My hope is that we can translate AI and build models that understand cultural sensitivity, not just English. We have to understand that we are from Africa, with multiple languages, and the languages are not the same. We should build code, and build anything to do with AI, locally as well.

Responsible AI and mental health. Build trust and safety with AI chatbots: this is what Kijiji Link hopes to build, with oversight from people who understand human behavior, researchers, and medical professionals. Follow ethical principles: the same ethics we apply to anything else, whether we use our phone or see a doctor, must also be built into AI. Also, AI should support, not replace, human connection. I needed connection. The women sitting here believed in my story, held my hand, and told me it would be okay. Even when it was not okay, it was fine for me to cry; crying is a superpower. We should ask, "What is your emotion?" Emotions are what drive us. AI cannot have these emotions, so let's remember to center the human. We must also protect user rights, especially control of personal mental health data. When I lost my memory, my data was shared with so many insurance companies. I did not know where my data was; I did not know who owned it. We need to ensure that whatever we put into AI, whatever we put on the internet, is protected, and that people feel safe.

I did not feel safe to share my story, because I knew everybody would talk about it. Put yourself in my shoes: if it were you, would you want your co-workers to know that? When people use the internet, they trust us with the data they put out there.

To finalize: we need more ethical guidelines for AI in mental health. It is important that they are culturally appropriate and respect privacy. And informed consent: if I did not even know my own name, how could I give informed consent? How do we do that with AI? Fairness matters too: I tried to generate images that look like me, and I could not make videos of someone who looks like me. Human oversight is very important, as is continuous improvement. The tools would not generate someone who looks like me, and I had to figure out how to prompt them to make it look like me. When we create this code, we must ensure there is human likeness and not bias.

Code with ethical guardrails: ensure coders are taught something like distress tolerance, and that they work with therapists whenever they do anything with AI, so that the AI has guardrails and a human is added to the loop. Support alternative solutions. Kijiji Link is based out of Seattle, but it is also working in Africa, in Nairobi, Kenya. Partner with us. Let's bring our stories and come together as a community, a human community, to walk together towards solving the global issue in front of us right now, which is mental health. Suicide is not talked about. What I want to share today is that we need to talk more about suicide as well.

Next slide. Over to the research fellow with Kijiji Link to take us through the next slide.

>> Thank you. To quote Doris, I would like to invite us to come together as a global community, and I invite all of you to participate in a survey that we put together. The survey asks about your perceptions of the ethical development of AI mental health support, and anyone on the internet can take it to help us build intentionally. It is about six questions covering the topics that you see before you. We really appreciate your feedback; it is fundamental to how we build, because part of what we are doing at Kijiji Link is centering people and their needs. We can only do that with your support and your feedback. That's all. Let's take maybe 40 seconds for everyone to capture the QR code with their phone, and then I'll transition to engage our other panelists to give commentary on today's conversation.

>> Thank you. 

Thank you for sharing your experience before us. In Africa, we have a culture of wanting to stay quiet when it comes to mental health, but this is everywhere. So I am asking: what should the coder do? We need to support. We need to be inclusive. We need to create access and safety for those who will openly share their story.

And just as we have heard, some things would probably help us see what to do. As coders, we are also stakeholders. What should we do? How can we encourage openness? We have our languages, our culture, our sensitivities. So what should we do?

For me, I think we should create safe spaces. A place must be safe for everyone to share; that is one of the things we should do, as we are doing here, and also in online forums and social media groups. We should also implement confidentiality policies. She said that when she shared her story, it was shared with insurance companies; where did that data go, and where is it now? We should also provide training and resources: as stakeholders, we should train ourselves and support one another. We should encourage diverse voices. She said the internet is not in African languages, and it might not be in your language either, but we should be able to translate. We should put first those with disabilities, those who cannot read or write, and marginalized communities, and design so that language can be translated for them too. Kijiji Link should look at promoting awareness and education, and I believe Kijiji Link will continue to do that in our communities. Education campaigns can help reduce stigma and encourage participation, so that people can come forward and say, "I have this problem," and put it out there knowing that there are people to help. We should establish clear guidance; this will help participants understand the boundaries and promote respectful sharing. When we share respectfully, we can provide support. As she shared her story, Kijiji Link can say, "Okay, you shared your story, and we have these support services to offer." We should also encourage community building, just as we are establishing community here. Kijiji Link should be a community; that is what we are talking about.

Ask us for feedback, and help us build on that. And in the process, advocate for policy change, so that stigma around mental health will go away. I don't know how many administrations or countries have developed policies to help people be able to share. We celebrate those who come forward. Thank you.

[applause]

>> Thank you. We celebrate you. We celebrate those who are bold enough to come and share their story so that others will not suffer. This is what Doris is doing, and this is what Kijiji Link will be after: it might be your brother or sister, it might be your community. So let's be part of it.

>> Thank you, Mary. June?

>> I am a former health care professional who specialized in mental health. First, I want to say I am having two sets of emotions right now: I empathize with the lady who went through so much, but I also want to say that we, too, suffer from post-traumatic stress disorder. Now I will get to artificial intelligence. Because health care professionals have feelings, advice can be subjective and not objective; a machine will be objective. What I wanted to add is that in building and programming these machines, we should involve health care professionals with experience to add to the subject. This way we will not miss what would otherwise be missed. People who are advanced in these programs are not putting the human element into them, so that the program can be responsible for the advice that it gives. I do empathize with the lady and what she has been through.

I am happy she has brought it into the open to be discussed at the IGF.

>> Thank you, June. We want to wrap up our conversation with this quote. Doris, would you like me to read it? Are you good? Okay. Doris Majiri.

>> I want to say thank you to everyone. It is hard to keep telling my story, because each time I share it, I relive what happened in 2023. We need to show more tears. I am a happy person, but I am also ill. And yet I am not ill; it just did not start with me. This is generational. We all have it, but we don't know it. So I want to leave you with this: you are not here to shrink yourself. I have never shrunk myself. You are not here to shrink; you are here to be fully human. And that is your greatest technology. As humans, we are the first technology. My brain stopped working; think of that as the motherboard in a computer. When your computer stops working, what happens? So as we walk out today, my hope for the world, for everyone in this room, and for everyone listening, is that before we ask things of people or give opinions, we start by saying: are you okay? How can I help you? Caregivers are suffering too; my parents had to suffer as well because I was sick. So all I say is, let's remember humanity. Remember each other and care for each other. I know we have five minutes left; this is the last quote I wanted to give. I don't know if anyone wants to ask a question or make a comment. We have five more minutes, and I think there's a mic.

>> No questions in the room at this time. No online questions.

>> Thank you so much for your time, and I appreciate you being here. Thank you very much to the global IGF, and to my panelists for supporting me when I was not able to do the things that I did before 2023. You can check us out at KijijiLink.com and find our links there. Thank you all for your time.

[applause]