On August 6, 2025, USA TODAY published an article by Amaris Encinas titled "It's not you, it's me. ChatGPT doesn't want to be your therapist or friend." The piece highlighted OpenAI's intentional stance: ChatGPT is not designed to serve as a mental health provider or to substitute for genuine human connection. Instead, it is a tool that can offer information, guidance, or even light emotional support, but it cannot replace therapy or friendship.
This raises an important question: What does this mean for people who rely on chatbots for emotional connection, especially in cultures where mental health stigma remains strong, such as in the Philippines?
The Benefits of This Boundary
Setting clear limits on what ChatGPT can and cannot do has several benefits:
- Prevents misuse. By explicitly stating “I am not your therapist,” ChatGPT reduces the risk of individuals mistaking AI support for professional help (Encinas, 2025).
- Encourages real connections. Human relationships—whether with family, friends, or faith communities—remain central to mental well-being. In collectivist cultures like the Philippines, where family and pakikipagkapwa (shared identity with others) are highly valued, emphasizing human support is especially important (Church et al., 2017).
- Promotes ethical AI design. By clarifying its role, ChatGPT aligns with responsible AI use. Scholars emphasize that transparency around limitations is critical in avoiding over-dependence (Bender & Friedman, 2018).
The Challenges and Risks
At the same time, this stance also raises some concerns:
- Emotional invalidation. For individuals who already feel isolated, hearing that ChatGPT “is not your friend” may deepen feelings of rejection. This can be significant in societies where mental health resources are scarce or stigmatized (World Health Organization, 2022).
- Loss of a safe outlet. Some people—particularly young people or those facing stigma—may find it easier to “talk” with ChatGPT than to disclose emotions to others. Discouraging this outlet too strongly may leave them with no immediate alternative.
- Overcorrection. While AI should not replace therapy, AI interactions can still play a useful role as a "first step" toward reflection or help-seeking. Eliminating this possibility could mean missing opportunities for early support (González-González et al., 2021).
What Research Suggests
Recent studies offer mixed findings about emotional reliance on chatbots:
- Risks: People who form intense emotional attachments to AI report lower well-being, particularly if they have limited human social support (Ho et al., 2024).
- Benefits: However, moderate use—such as journaling, practicing conversations, or seeking coping strategies—can contribute positively to self-reflection and short-term stress relief (Nass & Yen, 2010; Ghosh et al., 2023).
In the Philippines, where professional mental health access is often limited, AI could provide a bridge tool: a safe and stigma-free entry point to reflection, but ideally leading toward real interpersonal support.
A Balanced Perspective (A Therapist’s Stance)
As a mental health professional, here is my view:
- ChatGPT can be a helpful tool—but with limits. It can provide psychoeducation, stress management tips, or even a space to “practice” how to share feelings.
- It is not therapy. Real healing requires connection—whether with family, friends, faith communities, or licensed professionals.
- Boundaries matter. Clear messaging, such as the stance Encinas (2025) reports, helps protect users from false expectations. Discouraging AI use entirely, however, risks losing a valuable adjunct resource.
For Filipinos, who often rely on close family ties and community support, the healthiest approach may be to see ChatGPT as a “practice notebook”—something you use to rehearse thoughts, but not the final place for healing.
Conclusion
ChatGPT is not your therapist, and it is not your friend. But it can still play a meaningful role—as a companion tool for self-reflection and emotional expression. When used responsibly, it may even serve as a gentle bridge toward reaching out for genuine human support.
Ultimately, the article reminds us of something deeply human: technology may guide us, but healing comes from real relationships—our families, our communities, and our Creator.
References
- Bender, E. M., & Friedman, B. (2018). Data statements for natural language processing: Toward mitigating system bias and enabling better science. Transactions of the Association for Computational Linguistics, 6, 587–604.
- Church, A. T., Katigbak, M. S., Mazuera Arias, R., Rincon, B. C., Vargas-Flores, J. J., Ibáñez-Reyes, J., … & Ortiz, F. A. (2017). Culture and the Big Five personality traits: Comparing Indigenous and imported constructs across cultures. Journal of Personality and Social Psychology, 113(4), 645–665.
- Encinas, A. (2025, August 6). It’s not you, it’s me. ChatGPT doesn’t want to be your therapist or friend. USA TODAY.
- Ghosh, S., Mahmud, N., & Chatterjee, A. (2023). Human–AI interaction in therapy chatbots: A review. Frontiers in Psychology, 14, 1104321.
- González-González, C. S., Toledo-Delgado, P. A., Collazos-Ordoñez, C. A., & Medina-Merodio, J. A. (2021). Emotions and motivation in learning with AI chatbots. Education and Information Technologies, 26(5), 5245–5267.
- Ho, A., Hancock, J. T., & Naaman, M. (2024). AI companions and social health: How reliance on AI for emotional support impacts well-being. Journal of Computer-Mediated Communication, 29(1), 1–22.
- Nass, C., & Yen, C. (2010). The man who lied to his laptop: What machines teach us about human relationships. Current.
- World Health Organization. (2022). Mental health atlas 2020. WHO.