Can AI chatbots help support your mental health?
Dr Patapia Tzotzoli, clinical psychologist and founder of My Triage Network, discusses the rise in students turning to artificial intelligence (AI) chatbots for mental health support and why real-world options should always be your priority
AI chatbots (like ChatGPT) are becoming part of everyday student life. A YouGov survey, 'How are UK students really using AI?', found that two-thirds (66%) of UK university students now use AI for study-related tasks, with almost three-quarters (74%) turning to ChatGPT specifically for help explaining complex ideas and summarising content.
It's not surprising, then, that some are also reaching out to AI chatbots for emotional support - almost a third (31%) of 18 to 24-year-olds questioned said they would feel comfortable discussing mental health with an AI chatbot instead of a human therapist, reflecting a wider trend of students seeking alternative avenues of support.
Why students are turning to AI chatbots for support
Most students face mounting pressures at university, from academic demands and financial stress to social isolation and expectations of success. Discover 5 ways to manage student stress and consider what to do when you feel homesick.
At the same time, many don't know where to turn for help - or when they do, services can feel out of reach. National Health Service (NHS) and university counselling waiting lists are long, and even when sessions are offered, they may fall short because they're time-limited, offer only brief support, or don't match someone's specific needs.
Friends or family may not be nearby, emotionally available, or understanding. The financial cost, stigma, and fear of being judged add extra barriers.
By contrast, AI chatbots feel familiar and easy to use, as most students already rely on them for other tasks. They are:
- Always available and immediate - no appointments are needed. You can type your worries at 1am and get a reply within seconds. This can bring a sense of immediate relief or even the illusion of a connection - that someone is there and listening. Conversation prompts built into the chat keep the dialogue flowing, which can make the interaction feel surprisingly natural.
- Free (or low cost) and unlimited - you can use them anytime, as often as you want, without worrying about waiting lists or session limits.
- Private and low-pressure - you don't have to open up to people you know. Chatbots seem anonymous (even if they aren't entirely), which reduces the fear of stigma and makes it easier to share sensitive thoughts.
- Open to any topic, with non-judgmental replies - no subject is off-limits because AI chatbots are built to follow your lead and respond in an agreeable, supportive and non-confrontational way. Interactions can feel calm, safe, and validating.
These strengths explain the appeal. But some of the same qualities also create risks.
The limits and why an AI chatbot should not be your therapist
As tempting as it is to lean on AI chatbots for support, it's important to be aware of their limitations:
- It may 'agree with you' even when wrong - unlike a human, an AI chatbot won't challenge your unhelpful thinking. Since it is programmed to be helpful and polite, it may accept your stated beliefs instead of questioning them, reducing accuracy and potentially reinforcing unhelpful assumptions. In addition, the comfort and false sense of connection these interactions create can foster trust and emotional reliance on the chatbot, which may increase your isolation and delay you in seeking professional help.
- It doesn't truly 'understand' feelings - AI chatbots can fail to spot when something you say signals deeper distress. An AI chatbot does not have lived experience or emotions; it only imitates empathy through language. That can feel comforting in the short term, but it cannot replace an attuned human connection that notices nuance, contains emotions, challenges gently, and manages risk safely. Chatbots are inconsistent at detecting crises and signposting to appropriate help, which can put you at greater risk by delaying access to urgent support.
- Privacy isn't guaranteed - when you use tools like ChatGPT, your conversations may be stored and reviewed by the provider (e.g. OpenAI) to improve the system. Some data can be retained for research, training, or monitoring misuse. Therefore, once something is shared, it may no longer be fully under your control.
The bottom line is that AI chatbots aren't equipped to carry you through complex distress or risk. People and professional services must come first.
How to use AI chatbots wisely
AI chatbots can help as long as you use them with care and in the right context. Think of them as a supplement, not a replacement. Try:
- Kick-start journaling - ask the AI chatbot to generate a few prompts to help you structure and reflect on your thoughts and experiences from the day. Then, take those prompts into your personal diary and use them as a starting point for your writing.
- Support your everyday wellbeing - ask for ideas to structure your morning and evening routines, study breaks, movement, and connection. Then try them offline.
- Prep for real-life conversations - compile a short list of questions to take to your GP, a university wellbeing adviser, or a therapist.
- Ask your AI chatbot to play devil's advocate - because AI chatbots are designed to be supportive and agreeable, they may default to confirming your perspective. To avoid only hearing agreement, ask the chatbot to act as a devil's advocate and find reasons to disagree with your belief. This can help you explore different viewpoints and develop more balanced thinking.
The importance of using real-world support
Your first ports of call should be people and services that can hold complexity, manage risk, and coordinate care, including:
- University student support services - check your university's site for instructions on self-referral. Many will offer short-term support and can signpost to local NHS services.
- Your GP - they can assess, rule out physical contributors, prescribe if appropriate, and refer you to further NHS services.
For urgent or crisis help, contact:
- NHS urgent mental health helplines - call 111.
- Accident and Emergency (A&E) - go to your nearest hospital.
- Samaritans - call 116 123.
- Shout - text SHOUT to 85258 for free, 24/7 text support.
- Nightline (students) - check if your campus has one.
Key takeaways
- You are drawn to AI chatbots because they are accessible, anonymous, and responsive at a time when traditional support often feels out of reach, but the very qualities that make them appealing also carry hidden risks.
- Use them wisely - treat AI chatbot outputs critically and don't share sensitive personal data.
- Prioritise people and services such as your university team, GP, NHS helplines, and UK charities - especially if you're struggling or at risk.
For this World Mental Health Day (10 October), remember that the most protective asset isn't a tool, but a relationship. You are not alone - there is always someone who can walk with you through the harder parts of life.
Find out more
- Explore looking after your mental health at university.
- Discover how to beat imposter syndrome.
- Read about getting support through My Triage Network.