Image: Chatbot with stethoscope (nadia_bormotova / Getty Images)
Rudimentary AI-powered tools have assumed the role of psychotherapist for some time now. Automated apps built on standard algorithms spit out canned responses, providing various iterations of cognitive-behavioral therapy (CBT), the so-called gold standard of talk therapies. Many of these apps employ technology similar to what drives electronic assistants like Alexa and Siri. They can be helpful to some folks, but they are limited in their ability to respond to individual circumstances or crises, and in providing more human-like, empathic responses.
But so-called “large language model” AI programs, or chatbots (think ChatGPT or Google Bard), are much more sophisticated, and they are moving rapidly into the psychotherapy realm. Not long after ChatGPT emerged in the public domain, folks began using it informally as a counseling resource. Online forums like Reddit are full of personal accounts in this regard, many of them positive. Some users assert these programs are superior to their human counselors. What’s more, large mental health providers and advocacy groups increasingly utilize chatbots to augment or even replace human therapists.
In some instances, these robot therapists are simply used to screen incoming clients, provide standardized assessments or help determine the best human resource for their concerns. In other situations, they become “therapist assistants” that reduce the administrative burdens on human shrinks by automating record keeping, administering diagnostic tests and the like. However, increasingly, they are standing in for flesh-and-blood mental health professionals. And, make no mistake, the demand is there. Some of these chatbot therapy programs have millions of downloads.
Convenient if Artificial
On the upside, experts believe these chatbots can address the supply-demand imbalance plaguing the mental health field. There simply aren’t enough shrinks to go around, and many people can’t afford the high cost of human care. As one colleague put it, “A robotic therapist is better than none at all.” What’s more, these AI programs are convenient, allowing access day or night and eliminating the discomfort of traveling to an appointment, as well as the public exposure of sitting in a waiting room. Many people who are already very comfortable with interactive technology will find them attractive.
On the downside, the artificial intelligence powering these programs, while impressive and improving rapidly, remains artificial. Their smarts come not from sensory-based experiences, the province of humans and other animals, but from information, and lots of it. Some mental health chatbots absorb voluminous studies describing evidence-based approaches to psychotherapy (which is good), taking in far more data in less time than any human could possibly assimilate. This information feeds their decision-making algorithms, which operate on facts rather than the intuitions or gut feelings that human therapists often lean on a great deal.
These newer AI programs are potential game changers. Mental Health America has created a robotic therapy program based on the ChatGPT algorithm, one that has been used by over 50,000 people. How does it work? The user types in negative thoughts and the program replies with what therapists call “reframing” responses that offer positive alternatives. This is essentially the same process used by cognitive-behavioral therapists. The difference with these more advanced programs is that the chatbot’s responses are not canned generalizations. Rather, they are specifically tailored to the user’s unique inputs and drawn from the program’s vast library of behavioral science information.
Granted, it is still an algorithm, meaning it does not reflect what techies call “artificial general intelligence,” the kind most closely approximating human cognition. It’s only a matter of time before this next iteration emerges. What’s more, installing these advanced AI capabilities in humanoid robots, already widely used in several Asian nations, will further blur the lines between human and robotic therapists.
The question remains: how will humans relate to and interact with robots, no matter how intelligent? Research offers some clues. For example, in one study, most people could distinguish between a response from a chatbot and one from a human therapist, yet over half of the participants rated the chatbot’s responses as more empathic. Another study showed that most people develop a trusting relationship with a therapist chatbot within four days of exposure. A trusting relationship is another term for rapport, previously thought to be entirely the province of interactions between living beings. Overall, while interacting with a chatbot shrink, most of us will humanize them in our minds.
As is true in many fields, mental health treatment will be revolutionized by AI. And while human-to-human therapy will remain valued and used, a new and potentially powerful ally has entered the struggle against mental distress and suffering. Properly used, and absent the toxicity of greed, this technology strikes me as mostly good.
For more, visit philipchard.com.