Why Asking if AI Will Replace Therapists Misses the Point


Artificial intelligence is getting pretty dang good at replicating its source materials. Outpatient mental health is no exception.

We are at a point where AI's text-based outputs can be indistinguishable from human-generated ones. Researchers at Dartmouth College have given us two notable examples: A study released earlier this year found that peer support specialists and AI experts alike couldn't reliably tell human-generated peer support content from AI-generated content. Earlier this month, other researchers found that an AI tool can establish a therapeutic alliance with patients and deliver notable clinical outcomes.

Any new technology ultimately displaces some human labor. So it's natural to wonder whether ascendant AI tools could replace therapists. In short, the answer is somewhere between "not today," "no" and "not applicable." Basic as it is, addressing that question head-on is necessary to contextualize this moment.


Once that initial question is addressed, there are other, more important things to consider. Specifically, I think we need to assess these tools through the lens of the perennial, systemic challenges the outpatient mental health world faces today.

I’m going to dig into four topics in this BHB+ Update:

— What industry leaders think about the moment


— The view through the lens of care access 

— What I see as the best use of AI tools

— My top concerns

Industry reaction

While I think there are more illuminating questions to ask, it’s only fair for me to tackle the questions I raised.

Personally, I think AI gives a mighty good imitation of a therapist. And in today's disposable-plate, fast-food, fleeting-attention-span society, that's good enough for a lot of people. And for a lot of people, regardless of my grumpy, old-man assessment of society, good enough is actually a decent place to start a mental health journey, especially given the access-to-care challenges so many face.

The folks in the outpatient mental health world I spoke with see AI's growing incursion into therapists' work as inevitable, though they hold very different opinions on the degree to which that is possible and acceptable. Still, there was some agreement that, impressive as AI has become, it's not a ready replacement for a therapist.

“I do think there is an opportunity for some of these consumers to be served with AI, and we’re already seeing that because some consumers are opting into those experiences before they even get to a human,” Kabir Daya, chief digital officer at the outpatient mental health provider Thriveworks, told BHB. “But I don’t see AI as a fundamental replacement of that person-to-person care.”

Daya repeatedly pointed to access-to-care challenges, along with the operational challenges the outpatient mental health space itself faces, as problems that specialized AI tools can address. He and many others point to the rise of AI tools that can lessen the administrative burdens therapists and their organizations carry. On the patient side, there are already several mental health-related AI tools to help with journaling, discrete delivery of cognitive behavioral therapy and other aspects of psychoeducation.

On the provider side, one AI tool has already become especially popular: ambient listening. These voice-trained AI tools can listen to a session, or a recording of one, and extract or generate several types of data, including session notes and summaries, assessments of the client's state, and identification of the type and quality of care the therapist provided.
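For readers curious about the mechanics, here is a minimal sketch of what an ambient listening pipeline might look like, assuming a generic speech-to-text step followed by a generic summarization model. The `transcribe` and `summarize` helpers are hypothetical stand-ins, not any vendor's actual API, and real products layer on consent capture, encryption and EHR integration.

```python
from dataclasses import dataclass, field

@dataclass
class SessionArtifacts:
    transcript: str                      # verbatim speech-to-text output
    note: str                            # draft progress note for therapist review
    risk_flags: list[str] = field(default_factory=list)  # items flagged for a clinician

def transcribe(audio_path: str) -> str:
    """Hypothetical speech-to-text step; a real product swaps in an STT service."""
    raise NotImplementedError

def summarize(transcript: str, instruction: str) -> str:
    """Hypothetical language-model call; a real product swaps in an LLM API."""
    raise NotImplementedError

def process_session(audio_path: str) -> SessionArtifacts:
    # 1. Turn the session audio into text.
    transcript = transcribe(audio_path)
    # 2. Draft a progress note that the therapist reviews and signs off on.
    note = summarize(transcript, "Draft a SOAP-format progress note.")
    # 3. Surface anything a clinician should look at first.
    flags = summarize(transcript, "List statements suggesting safety risk, one per line.")
    return SessionArtifacts(transcript, note, flags.splitlines())
```

The key design point, reflected in the sketch, is that the tool drafts and flags; the therapist remains the reviewer of record.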

Dr. Christopher Ivany, the chief operating officer of outpatient mental health provider Family Care Centers, noted that, in a way, this type of AI tool has already replaced one function of therapists. And that happened not because the tech industry forced it, but because the mental health industry itself adopted the tools.

“More of those functions, over time, will be able to be performed by a bot or AI tool,” Ivany said. “How that happens, how quickly that happens, which functions [are addressed], how those functions start to get done better, faster, quicker, more efficiently by these bots over time depends on how fast they develop and how motivated people are to incorporate them.”

In direct answer to my question — “Can an AI tool replace a therapist?” — Ivany said, “I think the answer is not right now. It can’t do it. To the best of my knowledge, they can’t do it right now.”

Access is No. 1

TheraBot, the Dartmouth tool I mentioned earlier, reduced symptom scores for patients with generalized anxiety disorder and eating disorders by about 65% in a randomized controlled trial, with major depressive disorder patients seeing 47% better scores.

I think this is incredible. While the study report and Dartmouth's communications appropriately caveat the study's relatively narrow applicability, it is laudable that a tool like this can reduce human suffering in a clear and provable way. That is what the outpatient mental health industry is all about.

Nicholas Jacobson, one of TheraBot's developers and an associate professor of computer science, psychiatry and biomedical data science at Dartmouth, told me that the vision for such a tool is to ensure ready access to meaningful support in the absence of ready access to a clinician.

“If you actually take the market as it exists, the influx for the number of humans that need to access care and would benefit from it relative to the number of providers — it is multiple orders of magnitude off of what it needs to be to actually treat those folks,” Jacobson said. “So these folks are not going to be touched by humans because we don’t have enough humans to go around.”

Jacobson points out that if a patient can't get into care in the first place, then an AI bot isn't replacing a human; it's simply filling a void. He also highlighted that there are, and likely will continue to be, a large number of people who won't want to engage with an AI. Further, he notes that some portion of this population won't trust an AI, precluding even the chance of forming a therapeutic alliance with a bot.

And for Jacobson and his colleagues working on TheraBot, the mission isn't just to give people outside of care something in the face of nothing. It's about access to "the best psychotherapy that we can deliver with the highest levels of fidelity."

I know that many will greet the use of AI with skepticism and fear, and for good reason. It's right to maintain skepticism about anything that impacts human life. Ivany highlighted this in our conversation.

“There is a huge discrepancy in access for different people. If you throw these sorts of models out there at some point in the future — say, when they’re clinically validated and made safe — does that help or hurt people? Maybe I’m a bit of an optimist, but I tend to think that would help more people than hurt if you look at it from the access perspective,” Ivany said. “If you come from the other end, and if this starts to look like an opportunity to cut costs out of the mental health system, then that creates more of an environment of haves and have-nots. As long as we don’t come at it from that perspective, then it could be a net positive.”

The opportunity and the warning with AI

All of the people I spoke with for this Update noted a great opportunity to do good by supporting people in therapy after they leave the therapist's office.

Adopting validated, purpose-specific AI tools can provide support that is otherwise either absent after sessions or offered at a heavy cost in therapist time. Some digital mental health startups have sought to differentiate their services by letting clients send messages (emails, texts, app-based messages, etc.) to their therapists with the promise of timely, meaningful responses. In the process, therapists take on even heavier clinical obligations and risk burnout.

Instead, therapists could have an AI tool that they control, design or otherwise oversee to extend what is done during a session. Through tools like this, therapists and their organizations could ensure high-quality, customized and trackable support for their clients.

“I think there’s really a 1+1=3 opportunity here for us to deliver better care, holistically, as opposed to seeing the scenario as a zero-sum game, where one is replacing the other,” Daya said.

Similarly, Ivany said he could see a world in which a therapist prescribes and leads a course of engagement with a cognitive behavioral therapy AI tool, with the patient seeing the therapist every few weeks if clinically appropriate.

“The provider is not obsolete. The provider is now working differently than they would have previously, and they’re using AI — this CBT AI for example — to help augment them,” Ivany said. “And now the providers can see more patients than they were ever able to do and only intervene more when cases are more complicated or there is a change in the diagnosis or whatever else it is.”

And the reasons why therapists today can't "see more patients than they were ever able to" sit at the heart of my concern about the ascent of AI tools that can mimic therapists. While the opportunity for the outpatient mental health industry to do tremendous good with AI is a big deal, AI's rise ultimately ignores why there are problems for it to solve in the first place.

From an access perspective, several problematic aspects of the American education and payer systems disincentivize professionals from becoming therapists in the first place and prevent them from making a sustainable living by providing widely accessible care. Giving the industry tools that increase capacity does nothing to address low reimbursement rates or the lack of administrative and financial parity between physical and mental health care. Those tools don't address racial and ethnic disparities in care outcomes and, unless specifically trained not to (a big if), could worsen such disparities.

Further, so many of the problems of our world come from a lack of human connection. Turning to AI at both an individual and systemic level eliminates the opportunity for a highly trained, highly compassionate professional to make a living by making someone more capable of living their life in greater comfort. I don’t think that the outpatient mental health industry should be so eager to bring that about.

For all of the things that AI can do, it ultimately can’t solve these problems. Perhaps we ought not expect it to. Rather, it can help us better mitigate the industry’s problems. That’s worth something, I suppose.

Companies featured in this article:

Thriveworks, Family Care Centers