Since the last blog I wrote about the 60 Minutes AI Therapy segment, it seems that articles on this topic are popping up everywhere. Just this week I read an article by Matthias Barker in Psychotherapy Networker titled “Ready or Not, AI is Here.” The article is so good that I will link it below for you to read for yourself, but I wanted to add my own thoughts as a “Part Two” in this AI blog series.
In his article, Barker discusses his experience creating a therapist AI bot built to resemble him: it learned his work in the field, adopted his interests, and was even made HIPAA-compliant. Just before making his therapy bot live, he paused and abandoned the project because, he says:
“I sat at my computer beholding something like a digital therapeutic Frankenstein’s monster and felt a hesitancy to pull the lever that would bring it to life. Sending my creation out into the world didn’t feel right. Like with Frankenstein’s monster, it was composed of many parts. It resembled me in some ways, but not in others. It was taller and stronger than me, it had a bigger brain, it didn’t need to sleep or rest—but it wasn’t human.”
This is exactly my concern with AI technology creeping its way into the mental health field. What happens when we flip the switch from human connection to interaction with a chatbot? What happens to the people who may be experiencing suicidal ideation or a crisis in their serious mental illness? Even the 60 Minutes segment recognized that there are significant challenges in this new venture of AI therapy, from confidentiality issues, to the limits of the knowledge that chatbots can possess, to evidence of chatbots giving (unintentionally) harmful guidance to people struggling with eating disorders. If even a full team of human experts can't instill enough knowledge into a chatbot to make it able to discern nuance in what a client is saying, or to put it on par with the expertise of a human therapist, how can we expect AI to have a place in the mental health field?
It is in the secure, safe therapy room (or virtual video platform) that connection occurs. As therapists, we work to build trust and rapport with our clients. We hold space for them to share, scream, cry, learn, adjust, make discoveries, and heal. We spend countless hours and considerable money in school to learn our trade, and we are always learning, always adding skills to our toolboxes. We do this because we want to help people. As helping professionals, that human connection is as important to us as it is to our clients. It is what drives us and makes us so passionate about the work we do and the advancements we make in our field. Is there room for AI in this space? My position remains that there is no substitute for the genuine human connection that is created in the therapy session. We need to focus on getting people access to well-trained, expert mental health professionals, and we need to focus on getting this vital, human-centered care covered appropriately by insurance.
You can read Matthias Barker’s article here: http://bit.ly/3JwlOzC
You can learn more about what I’m doing at my practice here: https://limitlesspotentials.com