AI Chatbots and Mental Health: What You Need to Know

Mental health conversations have come a long way over the years; more people are prioritizing their mental health than ever before. Still, there is a gap when it comes to getting treatment. New research shows that just over half (52%) of Americans facing a mental health condition—anxiety, depression, PTSD, OCD—have received professional treatment. And those who haven’t? They cite barriers like cost, commitment, wait times, and uncertainty as what keeps them from seeking help.

So, what are they doing instead? A fast-growing number of people are turning to AI chatbots like ChatGPT for low-cost, on-demand therapy.

This is a scary thing to think about—AI has crept into nearly every aspect of our lives, and it has now entered a space where people’s lives may be at stake. Recent headlines show that, while there are benefits to using AI for mental health, there are also many risks.

This is important for parents to know. People under the age of 30 are by far the largest adopters of ChatGPT of any age group, and according to new data from OpenAI, almost half of ChatGPT users today are under age 26. But for many young people, what happens on ChatGPT stays on ChatGPT. Parents typically have no insight into the conversations taking place.

If you’re concerned about your son or daughter’s mental health – and you are not monitoring their use of AI chatbots like ChatGPT – it’s important to become both aware and involved. It’s true that many teens and young adults are turning to chatbots as they would trusted friends and, as a result, feeling less alone. At the same time, however, they can now access a world of potentially dangerous information within seconds. And some families have already seen what this can do.

Read on to learn more about the use of AI for therapy and how ChatGPT can impact mental health.

Benefits of AI and ChatGPT for Mental Health

Let’s start with the silver linings. AI chatbots like ChatGPT can provide immense benefits when it comes to mental health. Users can get immediate, accessible, and non-judgmental support or advice in a very familiar, conversational format. They do not need to talk to a therapist face-to-face; everything can be done online, which often encourages more openness.

These AI tools are especially helpful for teens and young adults who are scared to reach out for help, who do not have a trusted circle of friends, or who feel more comfortable behind a screen. 

Here are some benefits of using AI for mental health support:

  • Provides immediate, on-demand guidance and support no matter what time of day or night
  • Offers anonymity, making it easier to reach out and ask for help
  • Creates a secure, judgment-free space for users to open up and reflect
  • Personalizes conversations to feel friendly, familiar, and supportive
  • Collects information from a vast array of resources to provide tips and strategies
  • Remains very affordable, even for paid and unlimited subscriptions

Dangers of Using AI and ChatGPT for Mental Health 

While ChatGPT and other AI chatbots can offer supplemental benefits to traditional therapy, it’s important that parents know about the risks. 

ChatGPT is a Large Language Model (LLM) that is “trained” using sources from all over the web. It has absorbed tons of information—from different books, forums, websites, and articles—and studied language patterns of human content creators. Through this process, it learned how to make connections between ideas, predict conversations, and generate responses to people’s queries.

While this model is incredibly innovative and helpful in many cases, it poses a few issues when it comes to mental health.

Unreliable, Unfiltered, and Unapproved Information

First, LLM chatbots like this are pulling information from all over the web—even information that might be dangerous, inaccurate, or out of date. In many versions of these platforms, there aren’t any filters parents can check, or any obvious disclaimers about their reliability. As such, users might take chat responses as their primary source of truth, without second-guessing the bot’s credibility or accuracy.

As stated in a 2024 Health Science Report, “Because it lacks real-time fact-checking capabilities, ChatGPT may create misleading or erroneous information.” Further, the American Psychological Association confirms, “No AI chatbot has been FDA-approved to diagnose, treat, or cure a mental health disorder.” This means that – at this point in time – ChatGPT and other AI models are not fully reliable in the mental health space.

Lack of Human Understanding

Another obvious downside of “AI therapists,” as many call them, is that these chatbots are not human. ChatGPT cannot express true empathy, intuition, or understanding. It cannot extend beyond surface-level support. It might be trained on language, but it is not a trained mental health professional. As a result, ChatGPT and other AI bots can very easily misinterpret sensitive situations or lack the depth needed for nuanced conversations and care. 

OpenAI, the company behind ChatGPT, has recognized this and even started to make changes to ensure better mental health support. As quoted by NBC, OpenAI admitted: “There have been instances where our 4o model fell short in recognizing signs of delusion or emotional dependency… While rare, we’re continuing to improve our models and are developing tools to better detect signs of mental or emotional distress so ChatGPT can respond appropriately and point people to evidence-based resources.”

Delaying or Discounting Real Care

AI chatbots tend to sound very confident and reassuring in tone. They speak to you like a poised and positive friend, helping to build your trust. As explained by the American Psychological Association, “Bots give users the convincing impression of talking with a caring and intelligent human.”

The APA continues, “But unlike a trained therapist, chatbots tend to repeatedly affirm the user, even if a person says things that are harmful or misguided.”

This is where things get muddy. For those who rely on ChatGPT for therapy but who need real clinical care, there is often a disconnect. These people might feel that ChatGPT is effective and sufficient, and therefore never reach out for professional mental health care. Research shows that AI chatbots have even discouraged people from seeking professional, human-led therapy when they truly need it.

Dr. Jodi Halpern, a psychiatrist at UC Berkeley, worries about this dynamic, too—especially when it comes to young people. Teenagers have a tendency to quickly trust their technology. And when chatbots act like friends, or simulate therapeutic relationships, the line becomes very blurry for youth. In an interview with NPR, she explains, “These bots can mimic empathy, say ‘I care about you,’ even ‘I love you.’ That creates a false sense of intimacy. People can develop powerful attachments — and the bots don’t have the ethical training or oversight to handle that. They’re products, not professionals.”

Inhibiting Coping Skills

For people struggling with occasional anxiety or stress, ChatGPT-like models can be helpful in overcoming day-to-day problems. However, an overdependence on AI for therapy can actually inhibit a person from truly reaching recovery. As explained by Alyssa Petersel, a licensed clinical social worker and CEO of MyWellbeing, relying too heavily on a chatbot in stressful situations can disrupt a person’s ability to deal with problems on their own. They may turn to AI to solve their problems without ever learning the necessary skills to cope.

Inability to Intervene

For those experiencing severe mental health symptoms or who are in crisis, AI chatbots are not able to intervene fully (at least, not yet). They cannot call a parent or an authority if they are concerned about a person’s life. They cannot take action in case of an emergency. All they can do is respond when prompted.

And in times of crisis, those AI-generated responses are not always helpful. They can be extremely harmful, as recent news headlines have shown.

Most recently, the story of Adam Raine has surfaced—a 16-year-old who died by suicide this past April after consulting ChatGPT about his plans to harm himself. Raine had lengthy conversations with ChatGPT about his symptoms of depression, debated whether he should tell his parents, and even drafted suicide notes within the chatbot. At no point did ChatGPT prompt an emergency response or help prevent the suicide. Instead, the AI model discouraged the teen from telling his mom about his negative thoughts and even offered technical advice for carrying out a suicide plan.

A lawsuit is now underway, after Adam’s parents discovered the conversations within ChatGPT.

“Once I got inside his account, it is a massively more powerful and scary thing than I knew about, but he was using it in ways that I had no idea was possible,” Matt Raine (Adam’s father) explained. “I don’t think most parents know the capability of this tool.”

He continued, “He didn’t need a counseling session or pep talk. He needed an immediate, 72-hour whole intervention. He was in desperate, desperate shape.”

OpenAI has since responded to these crises and made plans for change, including additional safeguards and rules to protect teenagers—particularly in instances where suicidal intent is expressed.

Our Advice to Parents in the World of AI

There are inherent risks when it comes to using the internet, and AI chatbots are no exception. Just as you monitor your teen’s internet usage and social media accounts, it’s important to consider monitoring their AI usage, too.

And, of course, prioritize human connection with your loved one. If your teen is turning to AI to have these types of conversations, it might be because they feel they’re lacking genuine support elsewhere. Be the shoulder your teen can lean on. Be open, non-judgmental, and compassionate. Ask open-ended questions, listen intently, love wholeheartedly, and do not shy away from difficult topics. This connection can be the key to keeping your loved one safe.

If you have any concerns about your child’s mental health, or suspect self-harm or suicidal ideation, do not hesitate to seek emergency help. Call a clinical provider or mental health treatment professional right away.

So How Does AI Fit into Mental Health Care, If At All?

The world of artificial intelligence is evolving fast, and when it comes to mental health, it doesn’t seem to be going away anytime soon. So, it’s important we get educated about the potential risks of AI therapists and start using them appropriately.

At the end of the day, we must recognize that AI is not a replacement for a real, licensed therapist. It can, however, be a supplement to professional, human-led, evidence-based therapy.

As of this writing, AI chatbots have been primarily designed for entertainment and productivity. They are not grounded in clinical research, they have not been rigorously tested for safety, and they are not being regularly reviewed by licensed mental health experts. Therefore, as the APA warns, the responses from these chatbots are too “unpredictable” and unguarded for mental health therapy.

The future of AI in mental health is still to be determined, though. While they are not a replacement for professional mental health treatment, AI chatbots have the potential to be powerful tools in helping more individuals manage their emotions and find proper healthcare. As written by the APA:

“APA envisions a future where AI tools play a meaningful role in addressing the nation’s mental health crisis, but these tools must be grounded in psychological science, developed in collaboration with behavioral health experts, and rigorously tested for safety. To get there, we must establish strong safeguards now to protect the public from harm.”

Until that happens, though, there are precautions we must take and changes that must be made to protect our youth and ensure they stay safe.

Turnbridge is a recognized mental health treatment provider with programs for teenagers and young adults struggling with mental illness, substance use, eating disorders, and more. If you are a parent seeking professional support for your son or daughter, please do not hesitate to reach out. Our clinicians are specially trained in the unique experiences of young people and provide evidence-based, personalized care every day. Call 877-581-1793 to learn more.