Kendra is the AI and Data Product Lead for The Trevor Project, the world’s largest suicide prevention and crisis intervention organization for the LGBTQ community. She manages the development and integration of AI initiatives into the organization’s services to provide a high quality of care to LGBTQ youth. Prior to The Trevor Project, she worked as a Business Analyst at AgileThought, Ashley Furniture Industries, Determine, PSCU and ConnectWise. Kendra received a BS in Management Information Systems from the University of South Florida. She can be found on LinkedIn.
According to Kendra, an estimated 1.8 million LGBTQ youth between the ages of 13 and 24 in the US seriously consider suicide every year, and at least one LGBTQ youth in this age range attempts suicide every 45 seconds.
To better meet this need, The Trevor Project created the Crisis Contact Simulator, an AI-powered virtual assistant that simulates realistic digital conversations with struggling LGBTQ youth representing a range of life situations, identities and backgrounds. It prepares highly skilled counselors to emotionally connect with LGBTQ youth in crisis, and to support themselves through that experience, before they venture into instructor-led role-plays and ultimately real conversations with youth. From an operational standpoint, the Simulator also enhances and scales the counselor training program to meet rising demand.
In the future, The Trevor Project plans to develop additional personas to teach trainees how to navigate an even broader range of crisis situations. Ultimately, they expect this approach will not only improve their organization, but also help other mental health and crisis organizations effectively incorporate AI into their services.
In addition to leveraging AI to enhance counselor training, The Trevor Project uses the technology to scale its risk infrastructure. Using natural language processing, the organization can analyze pre-chat questionnaires, identify the highest-risk chats and prioritize them in the queue so that they can connect with a human crisis counselor faster.
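The Trevor Project hasn’t published the internals of this system, but the triage pattern described here — score each incoming chat’s pre-chat answers for risk, then serve the highest-risk chats to counselors first — can be sketched with a priority queue. Everything in the sketch below (the keyword-based `risk_score` stand-in, the field and class names) is a hypothetical illustration, not the organization’s actual model or code; in practice the scoring step would be a trained NLP classifier rather than a keyword lookup.

```python
import heapq
import itertools

# Hypothetical stand-in for an NLP risk model. A real system would use a
# trained classifier over the full pre-chat questionnaire, not keywords.
HIGH_RISK_TERMS = {"plan", "tonight", "pills", "goodbye"}

def risk_score(prechat_text: str) -> float:
    """Return a 0..1 risk estimate for the pre-chat answers (toy heuristic)."""
    words = set(prechat_text.lower().split())
    return len(words & HIGH_RISK_TERMS) / len(HIGH_RISK_TERMS)

class TriageQueue:
    """Serve waiting chats to counselors in descending risk order."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break for equal risk

    def add(self, chat_id: str, prechat_text: str) -> None:
        score = risk_score(prechat_text)
        # heapq is a min-heap, so negate the score for highest-risk-first.
        heapq.heappush(self._heap, (-score, next(self._counter), chat_id))

    def next_chat(self) -> str:
        """Pop the highest-risk (then longest-waiting) chat."""
        return heapq.heappop(self._heap)[2]

q = TriageQueue()
q.add("chat-1", "I feel alone lately")
q.add("chat-2", "I have a plan for tonight")
print(q.next_chat())  # chat-2 surfaces first
```

The negated score plus a monotonically increasing counter gives the two properties the article implies: higher-risk chats jump the queue, while equal-risk chats keep first-come, first-served order.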
While AI has enabled The Trevor Project to reduce wait times for high-risk youth and reach more people in need, the organization believes there’s no replacement for human-to-human connection in the mental health space. Human interaction remains at the heart of everything The Trevor Project does; anyone who reaches out will still be able to connect with a trained human being who is ready to support and listen to them.
The Trevor Project’s ability to scale has become more important in the last year due to spiking demand for mental health services amidst the pandemic. In 2020, The Trevor Project served 150,000 LGBTQ youth in crisis and, at times, saw demand double from the organization’s pre-COVID volume.
This increase can, in part, be attributed to people being home more than ever, which can be destabilizing for some due to isolation from friends and support at schools, such as their teachers, coaches and counselors. Additionally, the political and social turmoil of the last year has negatively impacted well-being for many, inciting widespread anxiety, uncertainty and economic strain. Together, these factors have increased the need for mental health support, especially for LGBTQ youth.
A term defined by civil rights activist and scholar Kimberlé Crenshaw, intersectionality is a framework that encourages people to consider how someone’s distinct identities shape how they experience the world, and how others perceive them. It’s especially important when technology, like AI, makes decisions on behalf of humans—for example, who gets housing, financial support or healthcare, and who enters or reenters the prison system.
The Trevor Project believes its tech team has a responsibility to be mindful about not creating new barriers to mental health resources, or reinforcing existing ones. With an intersectional approach, The Trevor Project integrates all aspects of a person’s identity and experiences, creating a supportive and inclusive experience.
One way Kendra believes we can embed intersectionality more broadly into AI products is by assembling teams of practitioners who themselves represent diverse experiences to develop and assess models. While this is particularly important in the mental health space, Kendra believes this approach applies to anyone working with AI.
The need for mental health support does not exist only between 9AM and 5PM, Monday through Friday, yet people often don’t have access to the resources and support they need to navigate a crisis in real time. With AI, The Trevor Project can put services in front of those who need them, on demand. It has also leveled the playing field, removing barriers – physical, social or otherwise – that prevented access in the past.
According to Kendra, another benefit AI brings to the table in mental health is privacy. Especially amid the pandemic, people are home and surrounded by others more than before, which can infringe on their privacy to communicate sensitive topics. Technology offers people the ability to have a meaningful conversation that they may not feel comfortable having out loud or in a location where someone may recognize them, alleviating stress and removing yet another barrier to care.
EPISODE 22: Kendra Gaunt
Jim Freeze: Hi! And welcome. I’m Jim Freeze, and this is The ConversAItion, a podcast airing viewpoints on the impact of artificial intelligence on business and society.
Today, I’m speaking with Kendra Gaunt, AI and Data Product Manager at The Trevor Project, the world’s largest suicide prevention and crisis intervention organization for the LGBTQ+ community. Kendra will share how the organization is leveraging AI for suicide prevention, and discuss the growing role of AI in the mental health space. We’ll cover everything from balancing AI and humans in providing quality care, to the importance of applying an intersectional framework to all technology.
Kendra, welcome to the ConversAItion – we’re thrilled to have you on the show.
Kendra Gaunt Thanks, I appreciate you making the space for us to connect on this today.
Jim Freeze No problem. So to begin with, could you share a little bit about your background and how you ended up at the Trevor Project and what drew you to the organization?
Kendra Gaunt Yeah. So, what brought me to the Trevor Project is the ability to recognize a tangible impact of the work that I do. So, I’ve been working in the tech space for about eight years now, and it’s here that I really discovered this pull to create a connection between the experience, the design, and then the technology for the people that I’m building for. Mainly because we embed technology into our everyday lives now. And prior to joining Trevor, I worked as a senior business analyst/product lead, where I developed enterprise systems and integrations for the finance, retail and e-commerce industries. And, I found that I reached a point in my life where I wanted to leverage and grow my skill set for the social good of the community I belong to as a Black career woman.
And, I also wanted to work with people who contribute their own diverse set of experiences and thoughts into our collaboration to build products that quite frankly, we feel good about putting into the world. And, I’d say the last aspect that attracted me was the opportunity to work on something new to me, such as AI, machine learning and data. So, it’s an area that I’m still learning and will always be learning in so I’m incredibly grateful for that.
Jim Freeze Yeah, AI is an interesting space where I learn every day. I’ve been working in AI for many years, and it’s a fast-moving field. And the application of it, the ability to have a positive social impact, is also really critically important. I admire your decision to join Trevor for that reason.
So the Trevor Project recently doubled down on AI initiatives and you, I think, received a Google AI impact grant? And, so you recently launched something called the Crisis Contact Simulator, which is an AI tool designed to facilitate more efficient and accessible training for volunteer counselors. Can you walk us through the role AI plays in your services today and how you see that evolving in the future?
Kendra Gaunt Yeah. So over the past couple of years, like you mentioned, the Trevor Project has heavily invested in technology and people by building an in-house AI team, which is comprised of data science, software development and machine learning, and which is actively expanding under our machine learning engineering manager, Wilson Lee’s leadership. And, we’ve had the pleasure of bringing our first two AI products to fruition in partnership with two cohorts of Google.org fellows and funding.
And, the role that AI plays, and I think will continue to play, in our services is to help us reach the LGBTQ youth that need us. Our research shows that an estimated 1.8 million LGBTQ youth between the ages of 13 and 24 in the USA seriously consider suicide each year. And, at least one LGBTQ youth between these ages in the USA attempts suicide every 45 seconds. So, I’m going to pause for a second to kind of let that one sink in.
Jim Freeze Wow.
Kendra Gaunt Yeah.
Jim Freeze Yeah. Those are a stunning set of statistics.
Kendra Gaunt Yes. Astonishing. And so to meet the need that we know exists, we’ve created a product that prepares highly skilled counselors to interact with LGBTQ youth, which is the Crisis Contact Simulator. So as you mentioned, this product is an ML powered product that simulates digital conversations with LGBTQ youth in crisis, and it allows our aspiring counselors to experience realistic practice conversations that represent a range of life situations, identities and backgrounds of youth in crisis before they venture into their instructor-led role-plays. And, then eventually they become active counselors taking real chats with youth.
And, I think I have a bit of an interesting perspective on this because I was someone who worked on the team to build this product and I’ve recently gone through the training. And, so I can say that it helped me to build my competencies and practice emotionally connecting with someone in crisis while also learning how to support myself through that experience. And, on a programmatic level, enhancing and scaling our counselor training program with something like the Crisis Contact Simulator can train empathetic and confident counselors to meet our services demand.
And, so if we kind of look to the future, as it pertains to the Crisis Contact Simulator, we have plans to develop additional personas who will bring their own unique experiences to the conversation, which will allow trainees to learn how to navigate different circumstances. And, then I would say in a broader sense, we’re continuing to research and experiment to better understand the needs of LGBTQ youth and those supporting them, to scale our products to serve more people in more contexts.
And, it’s really through this that I think we can say that we hope to design a framework that can not only help ourselves, but can also help other mental health and crisis services organizations incorporate AI into their platforms. And, our team is always eager to speak about our work and others’ work with researchers and practitioners in crisis response, public health and AI.
Jim Freeze Wow. So it’s interesting, this is the fourth season of doing The ConversAItion, and one of the things that we’re trying to accomplish is to talk about the impact of AI on society, and what you just went through is a big wow. In terms of positive impact and once again, some really just disturbing statistics. So, it really does emphasize the importance of what you’re doing.
As we’ve evolved our podcast, we started to ask questions over the course of the past year, about the 800 pound gorilla in the room, which is the pandemic and the impact it’s having. So, how has the role of AI evolved to meet the demand that the Trevor Project is observing as a result of the pandemic?
Kendra Gaunt Yeah, thanks for asking that, I really appreciate that. So over the last year, the Trevor Project directly served 150,000 crisis contacts, and this is across phone, web chat, and SMS or text from LGBTQ young people who’ve reached out for support. And, I think to fully explain the impact that COVID has had on mental health services and technology, we also need to acknowledge that there have been other complex experiences that have exacerbated mental health challenges among many people. So, since the onset of COVID-19, the volume of youth reaching out to the Trevor Project’s crisis services for support has significantly increased. And, I mean, at times it’s doubled our pre-COVID volume. So, a significant trend there. People are home more than before, which can definitely be helpful for some people and unsupportive for others due to isolation from their friends and support at schools, such as their teachers, their coaches and their counselors, and especially for Black, indigenous, people of color and/or transgender youth.
They’ve also had to process a hostile political and social environment over the last year, which can negatively impact their well-being. So, it’s all of this, plus the widespread anxiety, uncertainty and economic strain resulting from the pandemic that creates these compounding factors that really influence the increased need for mental health support, especially for LGBTQ young people.
Jim Freeze Absolutely. The Trevor Project believes that humans will always play a vital role in mental health care. How do you balance technology and human-to-human interaction to provide high quality care at scale? And, scale is obviously needed given 150,000 crisis contacts in the last year.
Kendra Gaunt I love this question. So yes, there is absolutely no replacement for human-to-human connection in this space. And, it’s this human-to-human connection between our counselors and crisis contacts that’s at the heart of everything we do. So, we’ve been highly thoughtful with our use of AI and technology across the organization, and we will not replace a counselor with AI technology when serving youth in crisis. So when youth reach out, they connect with a trained and caring human being who is ready to support and listen to them no matter what they’re going through.
And, a concrete example of how we’re balancing this interaction is in our use of AI to support our risk assessment infrastructure, to connect crisis contacts at the highest risk of suicide, to counselors as quickly as possible. And, we accomplish this by infusing natural language processing into our digital channels to analyze pre-chat questions that we ask all youth reaching out to us. And, then once this content is analyzed, we can identify the highest risk chats and prioritize them in our queue accordingly so that they can connect with a human crisis counselor.
And, as a result of this product, we prioritize 100% of our digital conversations to connect with human counselors. And, we’ve observed reduced wait times for high-risk folks who want to get support from other humans. And, I think that this technology is a prime example of a product designed to help facilitate and not replace, so I want to make sure I emphasize this, the connection between the highest risk youth and our Trevor counselors.
Jim Freeze That’s terrific. I mean, it’s a great example of a great use case for how AI can help in circumstances like this. Just shifting a little bit, I’d like to talk about something I think you’ve advocated for throughout your career: an intersectional approach to technology. And, you recently wrote a great piece for TechCrunch outlining this philosophy. Can you define intersectionality for our listeners and discuss how it relates to AI development?
Kendra Gaunt I sure can. So, intersectionality is a term defined by civil rights activist and scholar Kimberlé Crenshaw. And, it’s a framework that empowers us to consider how someone’s distinct identities come together and shape how they experience and how others perceive them in the world. And at the Trevor Project, we provide support to each LGBTQ young person who needs it. And, we know that those who are transgender and non-binary and/or Black, indigenous and people of color, they face unique stressors and challenges. So, when our tech team sets out to develop AI applications for the diverse community we serve, we know that we have a responsibility to be conscious of avoiding outcomes that would reinforce existing barriers to mental health resources.
This could include a lack of cultural competency or unfair biases like assuming someone’s gender based on the contact information that they provided. And, in terms of how intersectionality relates broadly to AI development, we know that machine learning models out in the world decide who gets housing, financial support, healthcare, and at what cost; who enters, or in some cases re-enters, the prison system. And, I mean, we could go on, Jim, as I’m sure you know, so the decisions that these models output are not ones to take at face value. And, I think we need teams of AI practitioners who themselves represent diverse experiences to develop and assess models to understand how they might impact someone’s quality of life based on the convergence of their identities.
Jim Freeze Yeah, it’s so interesting. As I’ve said, this is our fourth season, and we’ve talked to so many folks who talk about the importance of diversity in AI. Whether it’s in developing models, whether it’s in the development of algorithms, critical importance of diversity in AI. Because, if you have a machine learning loop that teaches the wrong things or teaches one perspective, you’re going to reap what you sow and so I think the point you’re making is a really good one. And in particular, it sounds like the intersectionality is really particularly important in the mental health space.
Kendra Gaunt Definitely. I mean, we know that a lot of people face discrimination in their everyday lives and this is compounded when you layer on different aspects of their identities, such as ethnicity, race, sexual orientation, gender identity, religion, socioeconomic status, et cetera. And, I feel that when we build AI mental health products and services for these communities, or for the communities we serve, which for us is LGBTQ youth, though I think this really applies to anyone in this space, we need to consider the barriers they face to receiving care. And, these could be the stigma associated with why they’re reaching out, parental or guardian support to seek care, locating affirming care, getting transportation, the monetary costs, et cetera.
So, I think it might help if I paint a little picture here. So, suppose folks don’t take an intersectional approach to AI or any technology. In that case, I think we run the risk of creating or contributing to adverse mental health outcomes, because it doesn’t consider how people exist in the world and how others treat them. And, then the other piece of our picture is, suppose we do take an intersectional approach. In that case, we can support people as they integrate all of these aspects of their identity and experiences, allowing them to show up as their whole selves, which positively affects them as individuals and globally within their community.
Jim Freeze That’s fascinating. It’s so true. One last question, I’m going to ask you to look into the crystal ball. How do you envision the role of AI evolving in mental health care in general, over the long term?
Kendra Gaunt I feel that AI can increase mental health support for people across different dimensions. This can include convenience, accessibility, and privacy. A positive impact of technology is that it can put services in front of those who need it on demand, just like that. There’s no commuting, no waiting in lines or having to massively shift your schedule to accommodate getting the support that you need. And then there’s another embedded layer here that deals with access to affirming providers within your geographical location.
And, for some people, this isn’t easily accessible, if at all, in their environments. And with technology, these services are available 24/7. This means we can get support where and when we need it. And, I’m sure many of us know that crisis and the need for mental health support don’t wait for anyone. And, crisis certainly does not exist only between the hours of 9:00 AM-5:00 PM, Monday through Friday.
Another element here is privacy. So, the ability to have a meaningful conversation that you may not feel comfortable having out loud, or in a place where someone knows who you are, can alleviate a lot of stress for folks, as well as remove another barrier to accessing that care. And, bringing it back to the pandemic, because this is the world we live in now, people of all ages are home and around others in their household more than before, which might infringe on their privacy to communicate rather sensitive topics. So, I think technology can offer people some privacy and safety to express themselves wholly in this space.
And, bringing it back to what we’re doing at the Trevor Project, what I feel is so unique and special is that we’re using technology, including artificial intelligence, to meet the needs that we know exist. So, our crisis services are available to LGBTQ young people in the USA, 24/7. They can have private conversations for free on the platforms that they use most, so phone, chat, and text, and they can connect with crisis counselors who are skilled at providing affirming support for them, regardless of their thoughts, identity and experiences.
And, so if I had to summarize all of that, I would say, I think what we’re observing and will hopefully continue to, is that the application of AI in this space when used in partnership with humans will support positive outcomes for people by bridging a gap between the need, accessibility and equity of mental health services.
Jim Freeze Well, Kendra, I got to say this, this has been fascinating. I’ve learned a lot, our listeners will love this episode, and it’s really hard not to admire the work you’re doing at the Trevor Project. Thank you so much for the work you do. We really appreciate you taking the time to talk to us today.
Kendra Gaunt Appreciate you. Thank you.
Jim Freeze Next on The ConversAItion, we’ll speak with Janek Hudecek, Director of Planning and Control at the autonomous vehicle company Zoox. Janek will discuss how the company is reimagining personal transportation with a “robotaxi” purpose-built for riders instead of drivers.
This episode of The ConversAItion podcast was produced by Interactions, a Boston-area conversational AI company. I’m Jim Freeze, and we’ll see you next time.