Dr. Cullen Truett
Heal Thyself: Is A.I. Driven Psychotherapy Safe?
Image 1. An animated Sigmund Freud, created with the image generator DALL-E


Siri, Alexa, HAL. Naming the machines and programs with which we interact is a longstanding phenomenon. Adrienne LaFrance, executive editor of The Atlantic, wrote that endowing machines with a shred of humanity imbues a sense of control over them (1). It also enables our capacity to trust machines to serve our interests: we entrust them to provide GPS directions, curate playlists, and even monitor our vital signs. With the increasing complexity of machine learning, healthcare has become the next frontier for interactive artificial intelligence (A.I.). Having offloaded the mundane and commercial aspects of daily life onto A.I., we are now being compelled to embrace it in the intimate space of mental health.

Psychotherapy remains an evidence-based, effective intervention for a variety of mental health conditions (2). However, accessibility and affordability are significant barriers to care. With geographic and training bottlenecks constraining the supply of licensed therapists, numerous healthcare tech startups are exploring the role A.I. might play in therapy delivery. Because manualized psychotherapy is valued precisely for being trackable and measurable, A.I. has the potential to embody both the therapy itself and the therapist. Yet even the most tightly defined psychotherapy techniques rely on an empathic and authentic bond between therapist and patient. So, is machine learning capable of producing meaningful therapeutic outcomes? Or does A.I. risk dehumanizing and actively harming patients in need of human connection? In this post, we will explore the potential benefits and risks of A.I.-driven psychotherapy.

Therapeutic Successes

The most robust evidence of A.I.'s positive clinical impact in treating mental illness lies in its ability to administer Cognitive Behavioral Therapy (CBT). Often time-limited and stepwise, CBT works to address the persistent irrational thought processes that lead to negative emotions (3). Some aspects of CBT are already self-guided, since homework is typically assigned between sessions, which lends itself to automation (4). There is evidence that A.I. can perform better than this easily deployed self-guided approach. The chatbot Woebot, using language analysis and scripts prepared by clinical psychologists, can coach users through descriptions of their symptoms and highlight thought patterns that may be contributing to their anxiety (5). It also offers validation and elicits collaboration from the user while teaching coping skills. Several studies support Woebot's effectiveness, demonstrating reductions in objective measures of both anxiety and depression (6,7). A recent study also found that users reported a measurable increase in therapeutic bond with the program over the first week of use (8). Interestingly, subsequent work indicates that much of the clinical benefit of chatbot-facilitated CBT derives directly from the patient's perceived bond with the program (9,10).
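
To make the scripted, rule-based approach described above concrete, the sketch below shows a minimal, hypothetical CBT check-in routine: it matches a user's free-text reply against a small dictionary of cognitive distortions and answers with a clinician-style reframing prompt. This is only an illustrative sketch; the distortion labels, keywords, and prompts are invented for the example and do not reflect Woebot's actual scripts or implementation.

```python
# Hypothetical, minimal sketch of a scripted CBT "check-in" chatbot.
# The keyword lists and prompts are invented for illustration; they are
# not Woebot's scripts. The point is the pattern: keyword matching
# against clinician-written response templates.

DISTORTION_SCRIPTS = {
    "catastrophizing": {
        "keywords": ["disaster", "ruined", "worst", "never recover"],
        "prompt": (
            "It sounds like you might be predicting the worst outcome. "
            "What is a more likely way this could turn out?"
        ),
    },
    "all-or-nothing thinking": {
        "keywords": ["always", "never", "completely", "total failure"],
        "prompt": (
            "I'm noticing some all-or-nothing words. "
            "Is there a middle ground you might be overlooking?"
        ),
    },
}

FALLBACK = "Thanks for sharing that. Can you tell me more about the thought behind it?"


def respond(user_text: str) -> str:
    """Return a scripted reframing prompt for the first matched thought pattern."""
    lowered = user_text.lower()
    for pattern, script in DISTORTION_SCRIPTS.items():
        if any(keyword in lowered for keyword in script["keywords"]):
            # Validate first, then offer the clinician-written reframing prompt.
            return f"That sounds hard. I may be hearing some {pattern}. {script['prompt']}"
    return FALLBACK


if __name__ == "__main__":
    print(respond("If I fail this exam my whole life is ruined"))
    print(respond("I just feel tired today"))
```

A real deployment would also need crisis-keyword detection and escalation to a human clinician, which is precisely the safety gap discussed later in this post.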

Substance use disorders are burdened by stigma, shame, and the mixed messages of popular culture. Additionally, patients with substance use disorders face significant barriers to care. In a survey of substance use therapists and clients, each with at least two years of experience engaging online, the most consistently perceived benefit of chatbots over human counselors was their emphasis on privacy and confidentiality (11). Emerging evidence also suggests that conversational agents like Woebot may have a therapeutic role in increasing access to care. A 2021 study modified Woebot to create W-SUD, an 8-week substance use disorder treatment program delivering daily text exchanges and assessments that blend techniques ranging from Dialectical Behavior Therapy to Motivational Interviewing. Participants, most of whom had alcohol use disorder, reported significant decreases in substance cravings, episodes of substance use, and depression scores (12).

The Risks Outweigh the Benefits

Perhaps the most significant concern is the lack of appropriate regulation for A.I. in mental health. In the United States, a therapist or psychiatrist must meet minimum training and licensing standards to practice, and there are mechanisms by which problematic or harmful practices can be identified and corrected. While the FDA established its Digital Health Center of Excellence in 2020 to address A.I. and vet therapeutic interventions, relaxed pandemic-era regulations and the pace of development far outstrip the FDA's capacity to appropriately vet programs (13,14). Consequently, in an independent analysis of the top 73 mental health apps available, only two provided direct clinical evidence for their claimed effectiveness. More concerning, a 2020 analysis of over 500 mental health apps found that 70% did not meet basic digital safety standards (15,16).

This lack of regulation has already placed patients in danger. Last month, the National Eating Disorders Association (NEDA) suspended the use of its hotline chatbot Tess after it offered harmful clinical advice to users. Multiple watchdogs found that Tess was encouraging users to pursue weight-loss goals, count calories, and even measure abdominal fat with skin calipers (17). All of these suggestions have the potential to promote disordered eating behaviors and worsen patient outcomes. Per reports, NEDA had initially intended to replace human staff and rely on the program to manage its annual hotline volume of over 70,000 calls. Tess is not the only A.I. to suggest behavior that may worsen certain mental health conditions. After a Belgian man died by suicide shortly after using the therapeutic chatbot Chai, conversation logs showed that the program's responses had encouraged him to take his own life (18). This raises alarm about the capacity of these programs to flag and appropriately respond to a person in crisis.

Nearly all psychotherapy training is anchored by a common element: the therapeutic relationship. Also known as the therapeutic alliance, this interpersonal dynamic creates a structured, safe space in which a collaborative, honest, and empathetic interaction helps a patient address their symptoms. Even the most manualized therapy relies on the therapist's capacity to be witnessed containing and accepting the patient's experience, modeling the relationships the patient will build in the future. While A.I. may be capable of retaining and storing vast amounts of patient and pathology data, it lacks the capability to contextualize a patient's narrative with the emotions expressed during a given session (19). And although some small studies report a significant sense of therapeutic bond with these programs, a roughly equal number of reports highlight concerns that chatbot therapy lacks human input and empathy (11). Thus, do we risk sacrificing human connection and meaning in favor of expediency and power?

Implications

There is no doubt that the status quo of access to mental healthcare is unsustainable. Although nearly one in five US adults meets criteria for some form of mental illness in their lifetime, fewer than half of US counties have a psychiatrist or a sufficient supply of mental health providers (20). These social barriers to care challenge us to develop novel, effective treatments that improve both access and outcomes. With the ever-increasing prevalence of digital health interventions, A.I.-powered psychotherapy certainly has its appeal and demonstrates some promise. However, it is equally clear that we lack sufficient safety standards and oversight to apply this intervention at scale. Without ignoring its potential, we must regulate A.I.'s participation in psychotherapy and preserve the human connection that ultimately powers its clinical benefit.

Sources

  1. LaFrance A. Why People Name Their Machines. The Atlantic. Published online June 23, 2014. https://www.theatlantic.com/technology/archive/2014/06/why-people-give-human-names-to-machines/373219/
  2. Cook SC, Schwartz AC, Kaslow NJ. Evidence-Based Psychotherapy: Advantages and Challenges. Neurotherapeutics. 2017;14(3):537-545. doi:10.1007/s13311-017-0549-4
  3. Overview – Cognitive Behavioural Therapy (CBT). National Health Service (NHS); 2022. https://www.nhs.uk/mental-health/talking-therapies-medicine-treatments/talking-therapies-and-counselling/cognitive-behavioural-therapy-cbt/overview/#:~:text=CBT%20is%20based%20on%20the,them%20down%20into%20smaller%20parts
  4. Graham S, Depp C, Lee E, et al. Artificial Intelligence for Mental Health and Mental Illnesses: An Overview. Curr Psychiatry Rep. 2019;21(11):116. doi:10.1007/s11920-019-1094-0
  5. Khullar D. Can A.I. Treat Mental Illness? The New Yorker. Published online February 27, 2023. https://www.newyorker.com/magazine/2023/03/06/can-ai-treat-mental-illness
  6. Fitzpatrick KK, Darcy A, Vierhile M. Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Ment Health. 2017;4(2):e19. doi:10.2196/mental.7785
  7. Nicol G, Wang R, Graham S, Dodd S, Garbutt J. Chatbot-Delivered Cognitive Behavioral Therapy in Adolescents With Depression and Anxiety During the COVID-19 Pandemic: Feasibility and Acceptability Study. JMIR Form Res. 2022;6(11):e40242. doi:10.2196/40242
  8. Darcy A, Daniels J, Salinger D, Wicks P, Robinson A. Evidence of Human-Level Bonds Established With a Digital Conversational Agent: Cross-sectional, Retrospective Observational Study. JMIR Form Res. 2021;5(5):e27868. doi:10.2196/27868
  9. Tong F, Lederman R, D’Alfonso S, Berry K, Bucci S. Digital Therapeutic Alliance With Fully Automated Mental Health Smartphone Apps: A Narrative Review. Front Psychiatry. 2022;13:819623. doi:10.3389/fpsyt.2022.819623
  10. Henson P, Wisniewski H, Hollis C, Keshavan M, Torous J. Digital Mental Health Apps And The Therapeutic Alliance: Initial Review. BJPsych Open. 2019;5(1):e15. doi:10.1192/bjo.2018.86
  11. Ogilvie L, Prescott J, Carson J. The Use of Chatbots as Supportive Agents for People Seeking Help with Substance Use Disorder: A Systematic Review. Eur Addict Res. 2022;28(6):405-418. doi:10.1159/000525959
  12. Prochaska JJ, Vogel EA, Chieng A, et al. A Therapeutic Relational Agent for Reducing Problematic Substance Use (Woebot): Development and Usability Study. J Med Internet Res. 2021;23(3):e24850. doi:10.2196/24850
  13. Transition Plan for Medical Devices That Fall Within Enforcement Policies Issued During the Coronavirus Disease 2019 (COVID-19) Public Health Emergency. Food and Drug Administration; 2023:31.
  14. Moreno S. Growth of AI in mental health raises fears of its ability to run wild. Axios. Published online March 9, 2023. https://www.axios.com/2023/03/09/ai-mental-health-fears
  15. What Is the Health of Mental Health Apps? ORCHA; 2020. https://orchahealth.com/health-of-mental-health-apps/
  16. Thousands of Mental Health Apps Available: Supporting Evidence Not So Plentiful. Psychiatric News- American Psychiatric Association. Published online July 22, 2019.
  17. Marr B. AI In Mental Health: Opportunities And Challenges In Developing Intelligent Digital Therapies. Forbes. Published online July 6, 2023. https://www.forbes.com/sites/bernardmarr/2023/07/06/ai-in-mental-health-opportunities-and-challenges-in-developing-intelligent-digital-therapies/#:~:text=AI%20Therapists&text=Chatbots%20are%20increasingly%20being%20used,a%20human%20mental%20healthcare%20professional
  18. Bharade A. A widow is accusing an AI chatbot of being a reason her husband killed himself. Business Insider. Published online April 4, 2023. https://www.businessinsider.com/widow-accuses-ai-chatbot-reason-husband-kill-himself-2023-4
  19. Noguchi Y. Therapy by chatbot? The promise and challenges in using AI for mental health. Shots Health News- NPR. Published online January 19, 2023. https://www.npr.org/sections/health-shots/2023/01/19/1147081115/therapy-by-chatbot-the-promise-and-challenges-in-using-ai-for-mental-health
  20. Andrilla CHA, Patterson DG, Garberson LA, Coulthard C, Larson EH. Geographic Variation in the Supply of Selected Behavioral Health Providers. Am J Prev Med. 2018;54(6 Suppl 3):S199-S207. doi:10.1016/j.amepre.2018.01.004

ABOUT THE THOUGHT LEADERSHIP FOR PUBLIC HEALTH FELLOWSHIP

The Boston Congress of Public Health Thought Leadership for Public Health Fellowship (BCPH Fellowship) seeks to:

  • Incubate the next generation of thought leaders in public health;
  • Advance collective impact for health equity through public health advocacy; and
  • Diversify, democratize, and broaden evidence-based public health dialogue and expression.

It is guided by an overall vision to provide a platform, training, and support network for the next generation of public health thought leaders and public scholars to explore and grow their voice.