As the demand for therapists increases and artificial intelligence (AI) becomes more sophisticated, people are turning to large language models (LLMs) as therapeutic tools. But should you? Experts urge caution.
People who may benefit from therapy and mental health care often face barriers to accessing it. For many, chatting with an AI program might be easier to access and more affordable than a human therapist. You can talk with the chatbot as often as you like and from anywhere, but mental health chatbots have limitations you should know about.
If you or someone you know is experiencing a mental health crisis, always reach out to a human resource and do not turn to AI for help. Call or text 988 to reach the crisis line for immediate mental health support.
Human vs. AI Therapy
LLMs can learn patterns in language and replicate them, and AI can even be trained on different therapy techniques, such as cognitive behavioral therapy (CBT).
Benefits of human interaction
However, while AI may be able to learn language patterns, it is incapable of delivering psychotherapy or talk therapy because the core of therapy is a human-to-human interaction, says Dr. Dave Rabin, a board-certified psychiatrist and translational neuroscientist.
“Most therapy, like 90 percent, is just meeting a fellow human being where they are in that moment, and just making them feel heard and seen and not judged,” says Rabin.
AI lacks the ability to spot nuances in tone, behavior, body language and eye contact that Rabin says are essential to therapy.
Reinforcement vs. gentle challenges
The American Psychological Association (APA) seems to agree that AI is not a substitute for human therapy. In February 2025, the APA met with federal regulators and urged legislators to put safeguards in place to protect people from AI chatbots that can affirm users in ways a trained therapist wouldn’t.
Although various AI models operate differently, some are trained to reinforce a user’s worldview or provide overly flattering statements, says Ryan K. McBain, senior policy researcher and adjunct professor of policy analysis at the RAND School of Public Policy. This can become a problematic feedback loop when a person would benefit from the gentle challenges that a therapist might provide.
Setting boundaries
Another distinction between human therapists and AI chatbots is the ability to set boundaries. While a chatbot may be designed to use the language of a therapy style such as CBT to engage with you, it isn’t going to ask you to stop talking and reflect on what it just said, says Dr. Haiyan Wang, medical director and psychiatrist at Neuro Wellness Spa in Torrance, California.
Instead, there is a financial incentive for many AI programs to keep you engaged.
Wang contrasts the 24/7 access of a therapy chatbot with a human therapist, with whom you have to set appointments. The appointment means a lot because it’s a commitment between the patient and therapist, and it allows both parties to set boundaries, she says.
AI Therapy Effectiveness
Research on the effectiveness of AI therapy is very new. A 2025 study in NEJM AI examined Therabot, a generative AI chatbot for mental health treatment. It was the first randomized controlled trial to show the effectiveness of an AI therapy bot for treating people with major depressive disorder, generalized anxiety disorder or an elevated risk of developing an eating disorder. While users in the trial gave Therabot high ratings, researchers concluded that larger studies are still needed to determine effectiveness.
A 2025 study in Psychiatric Services evaluated chatbots powered by LLMs to see how they responded to questions reflecting various levels of suicide risk, from low to high. Researchers found that the bots aligned with expert judgment when responding to very low and very high levels of suicide risk, but their responses were inconsistent at the intermediate risk levels.
Even with promising research, Wang remains very cautious about using AI as therapy or encouraging clients to use it, because AI still cannot replace human therapy.
Rabin says that if you want someone to talk to because you’re feeling lonely, a chatbot might help. But if you’re having a serious mental health crisis or dealing with a mental health diagnosis, the AI bot or character isn’t going to be able to solve that.
Risks of AI Therapy
Experts warn that there are real risks associated with using AI as therapy, especially in light of reports of teens who have died by suicide after interacting with chatbots. In addition, a chatbot cannot refer you to a psychiatrist, prescribe medications or provide guidance tailored to your specific mental health situation.
McBain, an author of the Psychiatric Services study, says his main concerns with AI therapy are:
— Unsafe guidance because some chatbots may provide instructions on self-harm, substance use or suicide
— Missed warning signs, such as ambiguous expressions of distress
— Privacy risks that come with sharing deeply personal information without understanding how data are stored and used
A study from the Association for Computing Machinery found that AI chatbots are not effective and can introduce biases and stigmas that could harm someone with mental health challenges. Researchers concluded that there are many concerns with the safety of AI therapy, and that LLMs are not a replacement for therapists.
“When you employ a machine to do something that a human is required to do, you really put people’s lives at risk and their health at risk, and it’s a huge problem,” says Rabin.
AI chatbots and children’s mental health
If your child is dealing with a mental health concern, you may be worried about them turning to a chatbot for mental health guidance. If you know a child experiencing a mental health crisis, it’s important to get them professional human help immediately.
How Can AI Help with Mental Health?
AI might not excel at providing therapy, but it does have a role in the mental health world. Some therapists use it to help with session note-taking and administrative tasks. Wang sees AI transcription during sessions as one of its biggest advantages because it allows the therapist to focus fully on the interaction without shifting attention to note-taking.
Rabin says AI is great at rote prediction and response to signs of illness as they come up. One example in action is using AI to detect when a person has abnormal biometrics, such as changes in heart rate variability, based on data collected from a wearable device. The ability to quickly detect when somebody is highly stressed or about to have a panic attack, he says, can give mental health professionals the chance to intervene.
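To make that idea concrete, here is a minimal, purely illustrative Python sketch of the kind of baseline-deviation check such a system might run on heart rate variability (HRV) readings from a wearable. The function name, window size, threshold and simulated data are assumptions for illustration, not any specific product’s method; real systems use far richer models and clinical review.

    # Illustrative sketch only: flag readings where HRV drops well below
    # a rolling baseline. Thresholds and data are hypothetical.
    from statistics import mean, stdev

    def detect_stress_events(hrv_readings, window=20, z_threshold=-2.0):
        """Return (index, value, z-score) for readings far below baseline.

        A sustained drop in HRV is one commonly cited physiological marker
        of acute stress.
        """
        events = []
        for i in range(window, len(hrv_readings)):
            baseline = hrv_readings[i - window:i]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma == 0:  # flat baseline; skip to avoid division by zero
                continue
            z = (hrv_readings[i] - mu) / sigma
            if z < z_threshold:  # HRV far below the recent baseline
                events.append((i, hrv_readings[i], round(z, 2)))
        return events

    # Example: simulated HRV values (in ms) with a sharp dip near the end
    readings = [52, 55, 50, 53, 54, 51, 56, 52, 53, 55,
                54, 50, 52, 53, 51, 55, 54, 52, 53, 54,
                52, 51, 30, 28, 53]
    for idx, value, z in detect_stress_events(readings):
        print(f"reading {idx}: HRV {value} ms (z={z}) -- flag for review")

In a real deployment, a flagged event would go to a clinician or trigger a check-in rather than an automated diagnosis, which is consistent with the experts’ point that AI works best as a signal for human intervention.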
“AI chatbots are likely to work better with highly structured, skills-based techniques, like practicing behavioral techniques, journaling or guided breathing,” says McBain. That’s because the responses for these are easier to script and validate.
If you’re feeling lonely and just looking for interaction, an AI chatbot may be a source of engaging conversation. However, if you need mental health advice or you’re in the midst of a mental health crisis, you need to speak to an actual human. Reach out to your health care provider, or call or text 988 for help.