Did you ever think we would believe that Artificial Intelligence (AI) could fill the demand created by a “shortage of therapists”?
Did you know there was a shortage?
On Sunday, April 7, 2024, CBS ran a segment on 60 Minutes reporting this shortage and implying that live mental health therapy could be replaced with Artificial Intelligence. The segment profiled an app called Woebot, a chatbot created (by a human team of medical doctors, staff psychologists, and computer technologists) to detect the words app users type and the nuances of how they are expressed.
The segment stated that the algorithms programmed into Woebot can then provide, or mimic, responses commonly used in Cognitive Behavioral Therapy (CBT) to help the user recognize and change negative and destructive patterns and behaviors. While apps like Woebot attempt to “fill the gaps” in access to mental health care, I would argue that there is no substitute for the connection, empathy, and support offered in a one-on-one, live, secure therapy session with a highly trained mental health professional.
My first concern is the assertion that there is a shortage of mental health therapists. There is not a shortage of mental health therapists; the problem is that insurance does not want to pay for mental health counseling, either at all or at a rate that fairly compensates well-trained, experienced mental health professionals. Post-graduate training and the continuing education required to maintain our licenses cost a great deal of money, and we need to be able to make a living to keep a practice open and provide services at this high level.
My next concern is the apparent gaps in privacy and safety related to these AI chatbots. What happens to all the recorded information that clients share with chatbots about their private lives? Could this be a potential HIPAA violation? If someone is suicidal or has complex trauma, what emergency procedures are in place not only to identify a person at risk but also to get them to the right help? Even the 60 Minutes segment admitted that most mental health apps are unregulated, whereas therapists are licensed in the states in which they practice. Where is the safety net for people in crisis who turn to these apps, only to receive unregulated, unlicensed help?
AI in psychology cannot possibly replace the human experience. A caring, attuned therapist in a live, private, secure office (or on a secure virtual platform) can hold eye contact, observe a client’s posture, notice tears in their eyes, and share with the client that they appear activated and upset. How can AI see or acknowledge any of this? Insurance providers, employers, and carriers make huge profits and commissions by not providing coverage, and they probably love and endorse the idea of Artificial Intelligence. Does replacing well-trained professionals with AI chatbots mean even fewer claims paid for mental health services?
As a veteran mental health provider, I am sure there are other providers out there who share my concerns about the use of AI in mental healthcare. It is wonderful that everyone can travel for miles and spend lots of money to watch an eclipse and be together in community, for an hour once every 20 years. What about coming together for the other important issues in our country, such as access to health and mental health care, or simply being a kind neighbor? I don’t believe that AI is the solution to the mental health crisis in this country. I believe greater access to and support of professional, live therapy sessions, and a return to our human community, is the answer.