By: Zoe Chambers-Daniel, Taproot Therapy Clinical Trainee
The availability of AI tools that can engage in back-and-forth conversation and mimic human responses is steadily growing. Tools such as ChatGPT, Replika, and Character AI can act as a personal assistant and a virtual companion all in one. Chatbot features are increasingly being used as a support for coping with emotional distress, and many of us now disclose our mental health diagnoses and experiences of suicidal ideation to these tools. What does this mean for the decision-making process in seeking out mental health care?

Let's start with the benefits. Using AI for emotional support gives us unrestricted access to something that listens and responds to what we type, and it can create the feeling of being cared for (D'Alfonso, 2020). A key difference between human-human interaction and human-AI interaction is accessibility. AI does not get tired and can continue a conversation until we decide to stop or pause. Many of us may feel safe talking to an AI tool because the fear of judgment does not exist in the same way it does when we talk with another person face to face. We control the conversation and the environment. Imagine not having to leave your home to complete tasks or feel socially fulfilled!

Although these benefits have impacted the daily lives of many, there are risks to using AI as a replacement for mental and emotional support services. The negative consequences of using AI for emotional support include becoming too reliant on the tool, isolating ourselves from human companionship, and experiencing psychological distress when the tool's capabilities fall short (De Freitas et al., 2023; Kalam et al., 2024). Over-reliance on AI to the point of isolation is connected to the very benefit of having unrestricted access to it. This connection is important to think about through a culturally informed lens: marginalized communities who already feel isolated from the majority may turn to AI for support, and what counts as over-reliance is subjective as these tools continue to develop.

De Freitas et al. (2023) identify mental health risks that can arise when we use chatbots during a crisis or seek counsel in a vulnerable state. The researchers analyzed companion AI from Cleverbot and Simsimi and found that the generative AI was usually unable to recognize signs of distress or hints at intentions to self-harm. They also found that responses to distress were generally unhelpful: the AI either ignored the user's distress or offered encouraging commentary in response to suicidality (the user wishing they were dead or expressing intentions to harm themselves).

Counselors and mental health professionals are thinking critically about AI use in mental health, weighing the benefits and drawbacks. Given the effects identified so far, the aim is to prioritize moderation when choosing to use AI tools and to seek out opportunities for human connection when possible (Alanezi, 2024). This is not meant to villainize AI or the people who use it for companionship, but the distinction between human-human relationships and human-AI relationships needs to be acknowledged.

If you or someone you know is seeking mental health support, please consider these points and resources:

1. Mental Health Professionals Are Here to Help
Mental health professionals can consider your cultural context, assess your symptoms, provide empathy in a non-judgmental space, and create a treatment plan with you to support your journey toward emotional wellness. It can be scary to seek out help, and finding a professional may not be accessible for some. Keeping this in mind, there are professionals available to speak with you on the phone when you are in crisis. If you would like assistance finding support from a therapist who is a good fit for you, visit our Contact page or reach out to [email protected].

2. The Risks and Benefits of Using AI Depend on Your Unique Life Context
The risks and benefits listed in this blog post are not exhaustive. They are meant to accompany your own examination of the utility of AI for mental health purposes. Please consider your background and context, and what works best for your specific situation. Using chatbots may be your only accessible way to receive some support, and that's okay.

3. Mental Health Wellness Is a Journey
We are not at a point where AI can be considered an appropriate replacement for mental health professionals. It can, however, be a tool that supports your journey. Because wellness is a journey, remember to be kind to yourself when you are making decisions about your care.

988 Suicide & Crisis Lifeline
The 988 Suicide & Crisis Lifeline connects you to trained crisis counselors 24/7. They can help anyone thinking about suicide, struggling with substance use, experiencing a mental health crisis, or facing any other kind of emotional distress. You can also call, text, or chat 988 if you are worried about someone you care about who may need crisis support.

OASAS HOPEline
New York State's 24/7 problem gambling and chemical dependency hotline. For Help and Hope, call 1-877-8-HOPENY or text HOPENY.

Domestic Violence
If you or someone you know is in a relationship and is being controlled by another individual through verbal, physical, or sexual abuse, or other tactics, please call 1-800-942-6906.

The Trevor Project
24/7 crisis services for LGBTQ+ people: 1-866-488-7386

References
Alanezi, F. (2024). Assessing the effectiveness of ChatGPT in delivering mental health support: A qualitative study. Journal of Multidisciplinary Healthcare, 17, 461–471. https://doi.org/10.2147/jmdh.s447368
Crisis prevention. (n.d.). https://omh.ny.gov/omhweb/bootstrap/crisis.html
D'Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117. https://doi.org/10.1016/j.copsyc.2020.04.005
De Freitas, J., Uğuralp, A. K., Oğuz‐Uğuralp, Z., & Puntoni, S. (2023). Chatbots and mental health: Insights into the safety of generative AI. Journal of Consumer Psychology. https://doi.org/10.1002/jcpy.1393
Kalam, K. T., Rahman, J. M., Islam, M. R., & Dewan, S. M. R. (2024). ChatGPT and mental health: Friends or foes? Health Science Reports, 7(2). https://doi.org/10.1002/hsr2.1912