By: Allison Torsiglieri, Taproot Therapy Clinical Trainee
“I think it's just knowledge that everyone should have. That you have this amplifier… potentially linked to your pain, and your perceptions, or the fears, or the dangers around what might be going on in your body can contribute to that pain, or headaches, or anxieties, or probably all kinds of other things” (Tankha et al., 2023, p. 1588).

Pain Reprocessing Therapy (PRT) is a promising new approach to treating certain types of chronic pain (Pain Reprocessing Therapy Center, n.d.-c). Even after a painful injury heals, the brain can get stuck in a pattern of sensing bodily harm or danger when there is none and, in response, trigger pain unnecessarily. This remembered pain, which doesn’t have a meaningful physiological cause, is called neuroplastic pain (Pain Reprocessing Therapy Center, n.d.-b). PRT helps us better differentiate between dangerous and safe signals from the body, thereby reducing neuroplastic pain.

My Personal Experience with PRT

I first heard about PRT on a podcast and thought I’d give it a try as part of my own journey to tackle chronic back pain. While I haven’t mastered any of the techniques (described below), after reading The Way Out, a book on PRT by its developer, I’ve noticed I feel less fear surrounding my back pain and more in touch with what is really going on in my body when I do feel this pain (Gordon & Ziv, 2021). This blog post is my way of sharing what I know about PRT, in case anyone reading might benefit from this model of therapy!

How Does PRT Work?

There are two main processes PRT uses to help reduce pain (Tankha et al., 2023).
What’s Involved in PRT?

PRT uses psychological techniques to retrain the brain to interrupt neuroplastic pain. The main technique PRT uses is called somatic tracking, a practice in which we experience our pain while simultaneously experiencing a sense of safety (Pain Reprocessing Therapy Center, 2021). Somatic tracking has three main elements (Pain Reprocessing Therapy Center, 2021).
Therapists trained in PRT also work with clients to process other sources of fear and stress in their lives, which can contribute to a generalized sense of danger and exacerbate their experiences of pain by way of the pain–fear cycle. “...I never would have guessed that childhood issues could be affecting the way I feel in my physical body today” (Tankha et al., 2023, p. 1588).

How Can I Learn More About PRT?

To learn more about PRT, see the sources in the reference list below, including The Way Out (Gordon & Ziv, 2021) and the Pain Reprocessing Therapy Center’s free recovery resources (Pain Reprocessing Therapy Center, n.d.-a).
References

Ashar, Y. K., Gordon, A., Schubiner, H., Uipi, C., Knight, K., Anderson, Z., Carlisle, J., Polisky, L., Geuter, S., Flood, T. F., Kragel, P. A., Dimidjian, S., Lumley, M. A., & Wager, T. D. (2021). Effect of Pain Reprocessing Therapy vs placebo and usual care for patients with chronic back pain: A randomized clinical trial. JAMA Psychiatry, 79(1), 13–23. https://doi.org/10.1001/jamapsychiatry.2021.2669

Fishbein, J. N., Schuster, N. M., Anders, A., Portera, A. M., & Herbert, M. S. (2025). Pain Reprocessing Therapy for migraine: A case series. Headache: The Journal of Head and Face Pain, 65(9), 1660–1665. https://doi.org/10.1111/head.15043

Gordon, A., & Ziv, A. (2021). The way out: A revolutionary, scientifically proven approach to healing chronic pain. Vermilion.

Pain Reprocessing Therapy Center. (n.d.-a). Free recovery resources. https://www.painreprocessingtherapy.com/free-resources/

Pain Reprocessing Therapy Center. (n.d.-b). Neuroplastic pain. Retrieved October 9, 2025, from https://www.painreprocessingtherapy.com/neuroplastic-pain/

Pain Reprocessing Therapy Center. (n.d.-c). Pain Reprocessing Therapy. Retrieved October 9, 2025, from https://www.painreprocessingtherapy.com/

Pain Reprocessing Therapy Center. (2021). Treatment outline for Pain Reprocessing Therapy. https://www.painreprocessingtherapy.com/wp-content/uploads/2021/03/PRT-Supplementary-Materials-for-Site.pdf

Sturgeon, J., Trost, Z., Ashar, Y. K., Lumley, M. A., Schubiner, H., Clauw, D., & Hassett, A. L. (2025). Brief pain reprocessing therapy for fibromyalgia: A feasibility, acceptability, and preliminary efficacy pilot. Regional Anesthesia & Pain Medicine. Advance online publication. https://doi.org/10.1136/rapm-2025-107076

Tankha, H., Lumley, M. A., Gordon, A., Schubiner, H., Uipi, C., Harris, J., Wager, T. D., & Ashar, Y. K. (2023). “I don't have chronic back pain anymore”: Patient experiences in Pain Reprocessing Therapy for chronic back pain. The Journal of Pain, 24(9), 1582–1593. https://doi.org/10.1016/j.jpain.2023.04.006

By: Zoe Chambers-Daniel, Taproot Therapy Clinical Trainee
The availability of AI tools that can engage in back-and-forth conversation and mimic human responses is steadily growing. Tools such as ChatGPT, Replika, and Character AI can act as a personal assistant and a virtual companion all in one. Chatbot features are increasingly being used as a support for coping with emotional distress, with users disclosing their mental health diagnoses and experiences of suicidal ideation to these tools. What does this mean for the decision-making process in seeking out mental health care?

Let’s start with the benefits. Using AI for emotional support gives us unrestricted access to something that listens and responds to what we type, and can create the feeling of being cared for (D’Alfonso, 2020). A key difference between human-human interaction and human-AI interaction is accessibility. AI does not get tired, and can continue a conversation until we decide to stop or pause. Many of us may feel safe talking to an AI tool because the fear of judgment doesn’t exist in the same way it does when we are talking with another person face to face. We control the conversation and the environment. Imagine not having to leave your home to complete tasks or feel socially fulfilled!

Although these benefits have impacted the daily lives of many, there are risks to using AI as a replacement for mental and emotional support services. The negative consequences of using AI for emotional support include becoming too reliant on the tool, isolating ourselves from human companionship, and experiencing psychological distress when the tool’s capabilities fall short (De Freitas et al., 2023; Kalam et al., 2024). Becoming reliant on AI to the point of isolation is connected to the benefit of having unrestricted access to it. This connection is important to think about through a culturally informed lens: members of marginalized communities who already feel isolated from the majority may turn to AI for support, and what counts as over-reliance depends on each person’s circumstances.
De Freitas et al. (2023) identify mental health risks that can arise from using chatbots during a crisis or when seeking counsel in a vulnerable state. The researchers analyzed companion AI from Cleverbot and Simsimi and found that the generative AI was usually unable to recognize signs of distress or hints at intentions to self-harm. They also found that responses to distress were generally unhelpful: the AI either ignored the user’s distress or offered encouraging commentary in response to suicidality (the user wishing they were dead, or expressing intentions to harm themselves).

Counselors and mental health professionals are thinking critically about AI use in mental health, weighing the benefits and drawbacks. Given the various effects identified, the aim is to prioritize moderation when choosing to use AI tools and to seek out opportunities for human connection when possible (Alanezi, 2024). This does not in any way villainize AI or people who use it for companionship, but the distinction between human-human relationships and human-AI relationships needs to be acknowledged.

If you or someone you know is seeking mental health support, please consider these points and resources:

1. Mental Health Professionals Are Here To Help

Mental health professionals are able to consider your cultural context, assess your symptoms, provide empathy in a non-judgmental space, and create a treatment plan with you to support your journey towards emotional wellness. It can be scary to seek out help, and finding a professional may not be accessible for some. Keeping this in mind, there are professionals who are available to speak with you on the phone when you are in crisis. If you would like assistance finding support from a therapist who is a good fit for you, visit our Contact page, or reach out to [email protected].

2. The Risks and Benefits of Using AI Depend on Your Unique Life Context

The risks and benefits listed in this blog post are not exhaustive. They are meant to accompany your own examination of the utility of AI for mental health purposes. Please consider your background and context, and what works best for your specific situation. Using chatbots may be your only accessible way to receive some support, and that’s okay.

3. Mental Health Wellness Is a Journey

We are not at a point where AI can be considered an appropriate replacement for mental health professionals. It can be a tool, however, in supporting that journey. Because wellness is a journey, remember to be kind to yourself when you are making decisions for your care.

988 Suicide & Crisis Lifeline
The 988 Suicide & Crisis Lifeline connects you to trained crisis counselors 24/7. They can help anyone thinking about suicide, struggling with substance use, experiencing a mental health crisis, or any other kind of emotional distress. You can also call, text, or chat 988 if you are worried about someone you care about who may need crisis support.

OASAS HOPEline
New York State’s 24/7 problem gambling and chemical dependency hotline. For help and hope, call 1-877-8-HOPENY or text HOPENY.

Domestic Violence
If you or someone else is in a relationship being controlled by another individual through verbal, physical, or sexual abuse, or other tactics, please call 1-800-942-6906.

The Trevor Project
24/7 crisis services for LGBTQ+ people: 1-866-488-7386.

References

Alanezi, F. (2024). Assessing the effectiveness of ChatGPT in delivering mental health support: A qualitative study. Journal of Multidisciplinary Healthcare, 17, 461–471. https://doi.org/10.2147/jmdh.s447368

Crisis prevention. (n.d.). https://omh.ny.gov/omhweb/bootstrap/crisis.html

D’Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117. https://doi.org/10.1016/j.copsyc.2020.04.005

De Freitas, J., Uğuralp, A. K., Oğuz‐Uğuralp, Z., & Puntoni, S. (2023). Chatbots and mental health: Insights into the safety of generative AI. Journal of Consumer Psychology. https://doi.org/10.1002/jcpy.1393

Kalam, K. T., Rahman, J. M., Islam, M. R., & Dewan, S. M. R. (2024). ChatGPT and mental health: Friends or foes? Health Science Reports, 7(2). https://doi.org/10.1002/hsr2.1912