(504) 452-5937

Should You Trust AI with Your Mental Health? What You Need to Know

What happens when emotional support comes from a screen instead of a person?

Susan Harrington, Licensed Marriage and Family Therapist and founder of Maison Vie, joins KLFY-TV CBS to talk about the growing trend of turning to AI chatbots for comfort — and why it can be risky.

She explains how chatbots may feel supportive in the moment but can also increase anxiety, reinforce harmful thoughts, and lack the confidentiality and care that real therapy provides. Susan also shares why human connection remains the safest and most effective path to healing.

Artificial intelligence seems to be everywhere. It writes emails, curates your playlists, and somehow knows when you’re out of coffee. Ask it for a dinner recipe, and you’ll get so many options that choosing becomes its own source of anxiety. Ask it why you feel lonely, and suddenly it’s acting like your therapist. The trouble is, AI might sound supportive, but it isn’t trained to help you heal. So should we really trust it with our mental health?

How does an AI chatbot work for emotional support?

When someone turns to an AI chatbot instead of a therapist, it can feel helpful at first because the program reflects back what you share. For example, if you type, “I feel depressed,” it might reply, “I’m sorry you’re feeling depressed. That sounds really difficult. Do you feel this way often?”

That can feel validating, even soothing. But it’s only a reflection. Therapy takes you beyond that momentary comfort. A licensed therapist helps you process feelings, identify patterns, and move toward real healing. AI can’t provide that deeper work, and that’s where the risk lies if it’s used as a substitute for therapy.

Mental health professionals have raised similar concerns, warning that relying on chatbots for therapy can actually delay real help.

Why are people drawn to chatbots?

Billed as “mental health companions,” AI chatbots are attracting people who can’t afford therapy, those who’ve had discouraging experiences with it, or anyone curious about whether technology can help. Much of the draw comes from their accessibility — they don’t judge, they don’t rush, and they don’t operate on a clock. For some, the comfort is in knowing the chatbot is always available, even in the middle of the night after a bad dream or during moments when human support isn’t immediately reachable.

“For some, the draw of chatbots is simple: constant availability and comfort without time limits or pressure.”

What are the risks of using AI for therapy?

A 2025 Mount Sinai study found that people in distress who relied on chatbots often ended up more anxious, more isolated, and even more vulnerable to hopelessness. In some cases, the risk of suicidal thoughts actually increased. Some tragedies have even pushed lawmakers to propose new safety rules for chatbot systems.

Other risks include:

  • Privacy concerns: Unlike therapy, where confidentiality is legally protected, anything you share with AI may be stored, misused, or vulnerable to data breaches.
  • Emotional dependence: AI is available 24/7, which can create unhealthy reliance instead of encouraging you to practice coping skills between sessions.
  • Short-term relief: The “listening” feels real, but the comfort fades quickly because no new tools are learned.
Relying on AI for comfort can leave you feeling more isolated; nothing replaces real human connection in mental health support.

Can AI ever play a positive role in mental health?

Yes—with professional oversight. When trained and guided by licensed therapists, chatbots can help people practice coping strategies or reinforce skills between sessions. AI has also been explored as a possible support tool in healthcare, though researchers caution it must always be paired with professional oversight.

But without that training, AI can also get things wrong. Researchers call it an AI hallucination when a program fills gaps with made-up but confident-sounding responses. To make matters worse, AI often brushes off its own mistakes with a breezy “good catch,” as if a blatant error were no big deal. You may have noticed that most platforms now carry a disclaimer at the bottom reminding you: “ChatGPT can make mistakes. Check important info.”

Think of it like a fitness tracker: it can remind you to move and celebrate your steps, but it doesn’t replace a coach who understands your body. The same goes for therapy.

What can you do instead?

If you’ve been using AI for emotional support and feel uneasy, the next best step is connecting with a licensed therapist. Here are a few tips to start your search:

  • Check your insurance network and read reviews.
  • Narrow by logistics such as location, online options, and specialties.
  • Prepare three simple questions:
    • Are you currently taking new clients?
    • How would you address my specific concerns?
    • What does success usually look like for clients like me?
  • Take a “test drive.” Call and ask your questions to see if you feel comfortable.

The bottom line: chatbots may offer a quick response at midnight, but genuine human connection remains the safest and most effective path to healing.

If you or someone you love ever feels unsafe with their thoughts, please call 988, the Suicide & Crisis Lifeline, right away, or visit 988lifeline.org.

Your Next Step

Just as you’d see a doctor for physical health concerns, the safest way to care for your emotional well-being is with a licensed therapist. While AI chatbots can sound supportive, they aren’t trained to guide you toward real healing — and what you share with them may not be private. Therapy offers something no algorithm can: a safe space, confidentiality, and the expertise to help you move forward.

If you’ve been turning to AI for comfort and find yourself feeling more anxious, disconnected, or overwhelmed, you don’t have to go through it alone. Contact us at Maison Vie to take the first step toward genuine support, connection, and lasting change.

CONTACT US