Therapists of 2025 have turned to an unlikely new coach: artificial intelligence.
More behavioral health providers are using AI to train therapists and give them real-time feedback, not to replace clinicians.
Within the behavioral health space, AI is being leveraged to run avatar-based training scenarios, point out therapist blind spots, improve patient communication and summarize session insights.
However, some initial hesitancy around adopting these tools remains, along with questions about the ethics of implementing them broadly.
“We think of this as another tool in our toolbox, and we still do other types of training as well,” Laurie Lebo, the chief operating officer at Centerstone Health, told Behavioral Health Business. “This just fills in a lot of gaps that other trainings can’t.”
Nashville, Tennessee-based Centerstone Health has partnered with Lyssn.io to train its clinicians since late 2023. Even though the system already had a simulation center that provided training for therapists, the Lyssn solution had similar components and allowed Centerstone to scale up easily, delivering training to thousands of therapists more rapidly, Lebo said.
More scalable and accessible training options through technologies like these are worth the investment, Lebo and other providers told BHB, particularly as the industry is in the middle of a significant workforce shortage that is only expected to worsen by 2037.
New tools in an established field
After partnering with Lyssn, Centerstone initially had difficulty getting clinicians to use the AI-powered training. For clinicians, the issue was less hesitancy about the new tool than skepticism about the time it would take, according to Lebo.
Now, the training allows therapists to earn continuing education credits, which has vastly improved clinician uptake.
Since then, Centerstone has seen a significant increase in the number of clinicians using the Lyssn platform, which provides immediate, personalized feedback across therapeutic modalities, including cognitive behavioral therapy (CBT) and dialectical behavior therapy (DBT), letting therapists know which skills and approaches they could improve in AI-simulated sessions. Users can access the training anywhere, anytime, completing it in short bursts or taking more time to work through it.
That flexibility has also been key, Lebo said.
New York-based Fort Health and virtual therapy provider LunaJoy have implemented AI tools for their clinicians that feature AI avatars, which engage in simulated therapy sessions and present scenarios for clinicians to respond to, strengthening patient communication.
Both platforms provide similarly structured training that allows therapists to upskill on their own time, avoid awkward in-person role-playing and receive feedback summaries on how to refine soft skills like empathy and listening.
Even though Fort Health and LunaJoy now have buy-in from their therapists on working with the tools, implementation was initially met with skepticism from some.
“I think fear of the unknown about what AI means for our field, what it’s doing, and what it’s learning from you — that’s been a big hurdle, as well as helping people understand the security, the privacy and what is being done with that data,” Lindsay Henderson, founding clinical director of Fort Health, said. “You have to invest the time one-on-one in walking them through and orienting them to the tool itself. One of the biggest barriers is just the overwhelm of working with a tool that is new to them.”
LunaJoy’s AI tool, which it debuted a couple of months ago, is specifically tailored to methodically train therapists on techniques that closely fit the needs of its patient population: women with mental health conditions such as postpartum depression. Some of its initial barriers to rollout and adoption centered on ensuring the tool had the right knowledge base to properly represent the needs of its niche patient population.
“I think with anything in the mental health field or the psychiatry field, there is a reasonable amount of skepticism when something new comes to the market or when there’s a new modality of doing things,” Sipra Laddha, CEO of LunaJoy, told BHB. “We want to make sure that we’re really protective first and foremost of our patients and that the training is really good and high quality, so that we’re having clinicians working with patients in an evidence-based fashion.”
Since then, the majority of the feedback LunaJoy clinicians have shared about the AI-avatar training simulations is that they are “very interesting and highly useful.”
“It’s very hard to be trained or have domain expertise in everything,” Laddha said. “I think the opportunity to be able to practice on AI patients that represent something very close to how our actual patient encounters go is really helpful.”
While AI adoption across health care in general has been rapid in areas like radiology imaging and EHR documentation, using it to add value in the behavioral health space has been more difficult simply because of the primary modality of this type of care. Behavioral health care revolves around talk and conversation, Dennis Morrison, chief clinical officer of Eleos Health, explained.
“Using AI to analyze speech, speech patterns and understanding nuances of language is much more difficult than in the way it’s been used in general health care,” Morrison said. “Understanding the differences between two images: one which is cancer, one which is not cancer, for example, is an easier technological lift than understanding the nuances of how people talk in a psychotherapy conversation and providing value-added insights to that. It’s just a more difficult challenge, but it’s there now.”
The ethics of AI-enhanced care
Boston-based Eleos Health, a leader in AI-assisted behavioral health care that recently nabbed $60 million in a Series C funding round, asserts that bringing this type of technology into mental health care should prioritize transparency first and foremost.
“I think from a transparency perspective, organizations should include that they’re going to use tools like this in their general consent to treatment documents,” Morrison said. “I think also including transparency about the use of AI: What is AI? What is not AI? How is it being used in the organization? This is really new technology, but organizations should already be putting in place an AI policy document defining the role of artificial intelligence in their organization and what is acceptable and what is not.”
As AI algorithms are known to contain inherent biases, it is also critical to report, refine and communicate what is being done to account for them.
Centerstone’s AI partner, Lyssn, produces annual bias reports that analyze potential biases within its existing AI models so it can make adjustments regularly.
Fort Health and LunaJoy both see the data on the back end of the training models and use it to adjust how they work with the platforms.
“It’s always important to remember that a lot of this AI is trained on actual clinical information, clinical data, clinical conversations and the bias that we see is a reflection of the biases that already exist within mental health and certainly within medical care,” Laddha said. “We certainly see less, but bias does come up. I think it is important for clinicians to be able to recognize bias, both internally as well as externally, and be able to call that out.”
All four companies also recommend including human supervision and oversight at some level, so that the training and AI feedback are not siloed without a human in the loop in what remains a hands-on, human-centric field. Making sure clinicians understand that is also key.
“We’re using technology not to replace them but to enhance and upskill them,” Henderson said. “While I do feel confident that this is a private, safe space where people are really just there for their own development, I don’t want team members to feel like this is something intimidating, or that they will be judged, or that they are being put through a test. It’s not being shared or used for performance or scoring in any way.”
In the next year, Henderson anticipates seeing more clinicians using the tool. LunaJoy and Eleos Health also anticipate growth and expansion of their tools.
For its part, Centerstone is looking at how to expand the application of AI training to peer support specialists and possibly even volunteers, according to Lebo.
“We’re really opening our minds to other options for how we can use this in other areas we may not have had the resources to train in the past,” she said. “Because this is so accessible and portable, we’re now able to expand who we can deliver training to.”