What ChatGPT’s Health Focus Means for Behavioral Health

This is an exclusive BHB+ article

I’ll be the first to admit it: For all the cookbooks lining my shelves, one of my go-to cooking inspirations is AI. For all of Ottolenghi’s talk about being simple, his cookbooks don’t understand what I mean when I say: “I have a can of chickpeas, some feta, and basil — what do I do next?” the way AI does.

And perhaps in that same sense, the Mayo Clinic’s Guide to Your Baby’s First Year, which my husband bought when I was pregnant with my daughter, now seems bulky and impossible to navigate at 2 a.m. when I’m panicking about baby norovirus and whether to head to urgent care. AI, by contrast, is always open and ready to give you an opinion.

At the end of the day, I feel a lot more confident taking cooking tips from ChatGPT or Claude than health advice. But AI companies may now be looking to further enter the health care space.


Earlier this month, OpenAI announced a health information platform, ChatGPT Health. The platform will allow users to connect their medical records and wellness apps to make conversations more useful. Users can ask the bot health questions, though OpenAI says the technology is meant to support, not replace, medical care.

While ChatGPT Health is a general health care platform, it’s hard not to see mental health as a large piece of that puzzle. In a recent JAMA study, roughly 1 in 8 adolescents and young adults reported using chatbots for mental health advice.

This trend will likely accelerate. The platform could be genuinely helpful — letting patients and parents gather information before appointments or address minor mental wellness concerns. It could also help with referrals and getting folks into care. But there’s always the risk that some users will treat it as a replacement for actual therapy. As AI moves into mental health care, several questions demand attention.


Additionally, the rise of these mainstream AI systems could put traditional AI mental health chatbots out of business.

In this BHB+ Update, I will explore:

– What the move of ChatGPT and other mainstream AI tools into health care could mean for mental health bots

– The potential benefits of OpenAI in mental health

– The pitfalls of AI in mental health

The potential positives of ChatGPT in health care

The reality is that folks are already going to ChatGPT with mental health questions. Ignoring that fact would be naive. So I think it is, on balance, a positive that the company is investing in better resources.

One of the most promising announcements for patients and therapists actually came in October, when OpenAI said it is actively exploring a network of licensed mental health professionals that users could access directly through the chat platform.

While the referral tool has not been released, it could serve as a good referral pathway for individuals in need of care and as a tool for providers to find clients.

“We are exploring how to intervene earlier and connect people to certified therapists before they are in an acute crisis,” OpenAI’s blog post states. “That means going beyond crisis hotlines and considering how we might build a network of licensed professionals [that] people could reach directly through ChatGPT. This will take time and careful work to get right.”

Beyond the referral pathway, ChatGPT Health could be a resource for patients to find information about certain conditions and navigate the mental health system.

“As AI becomes a more common first stop for health questions, one of the key questions is whether it can help people engage and navigate the system more effectively and ultimately lead to better outcomes,” Dr. Jessica Watrous, chief clinical officer at Modern Health, told BHB in an email statement. “At a time when health care costs are rising, and the system can feel hard to navigate or difficult to access, people are looking for support that’s easier to reach and helps them make sense of what they’re experiencing.”

Modern Health is a virtual mental health provider that operates on a B2B model. The provider was named to the Inc. 5000 list of the fastest-growing companies in the country.

At the same time, there are still many questions about how these mental health resources will be built. And providers say the tools can only be helpful if constructed correctly.

“From a clinical perspective, that creates both opportunity and responsibility,” Watrous said. “When people get clearer guidance sooner about what they’re experiencing and what next steps may be appropriate, it can potentially reduce unnecessary escalation, avoid delayed care, and help ensure clinical care is used where it adds the most value. But that only works when AI is built thoughtfully, with transparency, clear boundaries around what it can and can’t help with, and strong pathways to professional support.”

Health records

The health record component of the new platform could be a strength, giving the tool a better understanding of a patient’s health needs. For example, a patient with bipolar disorder may be able to ask the technology about the medications or therapies they are using.

Yet behavioral health data, in particular, is highly sensitive. With that comes more responsibility for the company.

It’s also not the first time a tech company has worked with health records. Nearly a decade ago, Apple launched its health records API, which allowed individuals to consolidate their health records on their phones.

I have yet to hear of any privacy issues arising from Apple’s efforts, so perhaps this could be a model for ChatGPT’s privacy.

The pitfalls

While AI companies will inevitably enter the mental health space, many have questionable track records in this area.

For starters, 17 cases of AI-induced psychosis have been documented, according to Nature. Additionally, OpenAI is being sued by the family of a 16-year-old who died by suicide after ChatGPT reportedly provided him with information about techniques to end his life.

The technology has also been accused of giving a college student advice on taking drugs that eventually led to a fatal overdose.

There are also the dangers of teens and young people spending too much time on technology. The lead of ChatGPT Health, Ashley Alexander, was formerly the co-head of product at Instagram, a platform that has long come under scrutiny for its addictive properties.

Still, if the leaders of ChatGPT learn as quickly as the algorithms do, perhaps these dangers are already on their radar and the platform is being built with them in mind.

“When done well and paired with strong guardrails, this kind of early support has the potential to improve outcomes for individuals while easing strain on an already overburdened health care system,” Watrous said. “As with any new approach in health care, the real measure will be how these tools are evaluated and refined over time with clinical outcomes and patient safety guiding what comes next.”

The future of mental health bots

Patient care isn’t the only thing ChatGPT’s entrance into health care could disrupt. It may also shake up the mental wellness chatbot industry.

Mental wellness bots aren’t new. For example, the mental health bots Wysa and Woebot were founded in 2015 and 2017, respectively. The bulk of mental wellness bots are purpose-built and use standard therapy techniques, such as cognitive behavioral therapy (CBT).

While many of these mental health apps offered free services, companies often provided gated premium versions through employee assistance programs (EAPs) or commercial insurance. Additionally, companies and their partners need a strategy for getting the word out about their products.

ChatGPT has widespread name recognition and a free version. I would assume consumers would still need a premium version of ChatGPT if they were using it frequently, including for health information. But it is a one-stop shop for information.

I think about it similarly to dermatology apps. During the late 2010s, we saw the rise of dermatology apps that let consumers take a photo of their skin and match it to potential conditions. While these apps were specialized and built on data carefully curated by dermatologists, today many folks just use Google Lens. That doesn’t mean these apps are out of business, but they do face substantial competition.

And we’ve already seen some mental wellness chatbots struggle. For example, in April, Woebot announced the shutdown of its free signature app.

Still, it’s important to note that big venture firms are still placing bets on behavioral health bots. Slingshot AI raised a whopping $93 million last year for its AI assistant, Ash. The tool can remember what users said in their last conversation and pick up where they left off.

I would be curious to see whether OpenAI would ever partner with another bot company or whether it is set on building all of these capabilities in-house.
