In Taiwan and China, a growing number of young people are turning to AI chatbots like ChatGPT and Baidu’s Ernie Bot for mental health support. While AI offers accessible, discreet help amid rising psychological distress, experts warn of the risks of replacing human therapists with technology.
Rising Demand for AI Chatbots in Mental Health Support
As mental health concerns increase, particularly among young adults in Taiwan and China, many find traditional therapy difficult to access. Appointments are often expensive and scarce, and cultural stigma discourages open conversations about emotional struggles.
AI chatbots have emerged as a “cheaper, easier” alternative. Taiwanese users like Ann Li describe how chatting with AI late at night feels less intimidating than speaking to people. Similarly, Chinese users such as Yang, who had never seen a mental health professional, found comfort talking “day and night” to AI when confiding in friends or family seemed impossible.
Advantages of AI Therapy: Accessibility and Anonymity
Chatbots provide immediate, round-the-clock responses that save users time and money. They give direct answers and a level of discretion that appeals in societies where mental health remains a sensitive topic.
Dr. Yi-Hsien Su, a clinical psychologist in Taiwan, highlights how ethnic Chinese cultures often suppress emotional expression, making AI a useful entry point for younger generations willing to discuss their difficulties.
In Taiwan, ChatGPT is the most popular AI tool, while in China, local chatbots like Ernie Bot and DeepSeek fill the gap left by restrictions on Western apps.
Mixed User Experiences and Limitations of AI Therapy
User experiences vary. Ann Li finds AI responses predictable and sometimes lacking in insight, missing the element of self-discovery that human counseling offers. By contrast, Nabi Liu, a Taiwanese woman living in London, appreciates the chatbot’s seriousness and immediate replies, and says she feels genuinely heard.
Experts see AI as helpful for those experiencing mild distress or needing encouragement to seek professional care. Yang, who initially doubted her need for formal help, now recognizes the importance of professional diagnosis.
Risks and Concerns Raised by Mental Health Professionals
Despite AI’s benefits, experts warn that relying solely on chatbots can be dangerous. AI lacks the ability to read non-verbal cues and subtle emotional signals crucial for accurate diagnosis and crisis intervention.
The Taiwan Counselling Psychology Association stresses that AI should serve as an auxiliary tool, not a replacement for human therapists. It cautions that AI can be “overly positive,” miss warning signs, and delay critical medical care, and that it operates outside professional ethics codes and peer review.
Cultural Barriers and Stigma Around Mental Health
Cultural attitudes in Taiwan and China contribute significantly to the rising use of AI chatbots. Many young people hesitate to discuss mental health issues openly due to stigma or fear of being misunderstood. AI provides a nonjudgmental space where users feel safer expressing their feelings without fear of social repercussions.
This cultural context helps explain why AI chatbots have become a popular mental health resource, despite their limitations.
The Future of AI in Mental Health: Cautious Optimism
Dr. Su is optimistic about AI’s potential to modernize mental health care, especially in professional training and in the early detection of distress expressed online. However, he urges users to approach AI tools carefully and to understand their limitations.
“AI is a simulation — a good tool but with limits. You don’t know how the answer was generated,” he says, emphasizing that the human presence remains essential in psychotherapy.