“La ropa sucia se lava en casa”: dirty laundry is washed at home. The saying is common across Latin America, and it encourages people to keep personal problems private. This cultural approach has often fueled silence around mental health issues. The data confirms it: according to the Pan American Health Organization, 7 out of 10 people with a mental disorder receive no professional help. The roots of this stigma are complex: familism, while promoting unity, can lead to denying distress to avoid “staining” the family’s reputation. Additionally, traditional gender roles, shaped by machismo and marianismo, discourage emotional expression and the act of seeking help, particularly among men.
In this context, YANA, an acronym for You Are Not Alone, was born. It’s a Mexican app that provides emotional support through a chatbot with a soft, round, and friendly appearance, available 24/7. Founded in 2016 by Andrea Campos, YANA is a leading emotional support platform in Latin America, with over 15 million registered users. Since 2023, it has integrated AI to make conversations feel more natural and personalized. But it’s not just about technology: the app is based on Cognitive Behavioral Therapy principles, offering practical strategies to cope with moments of difficulty. If signs of crisis are detected, YANA can redirect users to local hotlines or emergency services.
“We work with a psychotherapist who validates every piece of content. Our goal is not to simulate a therapist, but to offer a safe space, with a warm, non-invasive tone,” explains Felipe, one of the developers. “In Latin America, therapy is still a privilege, both because of cost and cultural stigma. YANA tries to be a bridge. It does so without judgment, without labels, and through a tool people already feel familiar with: the smartphone.” The service is meant to be a first point of contact for anyone seeking someone to talk to. “We state this clearly in the app: YANA is not therapy, nor does it want to be. It’s a first step. What matters to us is that people start exploring their mental well-being, even if it’s only with a chatbot.”
The desire for a space free of judgment is key to understanding the success of platforms like YANA, but it also reveals a deeper issue. “Our brains, especially during adolescence, need real relational experiences: emotions, silences, misunderstandings,” explains Luca Bernardelli, a specialist in mental health and digital technologies. “A chatbot might offer comfort, but it can’t replicate the complexity of a human relationship.” A recent study by OpenAI and MIT, published in March 2025, shows a clear correlation between intensive chatbot use and increased loneliness, as Bernardelli points out. “The more time people spend talking to chatbots, the more likely they are to report social withdrawal and emotional dependency.”
The study features four key graphs illustrating how prolonged interactions with these tools can reinforce feelings of isolation and foster problematic usage patterns. “We’re not just talking about support tools anymore,” Bernardelli warns. “We’re looking at potential triggers for real psychological issues.”
YANA presents itself as a concrete, accessible resource, not a substitute for therapy. It is designed to offer structured, safe support, even when addressing delicate topics. Users can track their daily mood, record positive events, and reflect on their thoughts in a space that encourages self-exploration. The app includes structured prompts and short mental exercises, such as gratitude journaling, designed to be easy to pick up, and it emphasizes continuity and emotional routine.
For many users who do not attend therapy regularly, the app represents the only accessible space where they can express their emotions. “It can be a complement to therapy, a support tool, helping users reflect and continue processing what emerged during their sessions.” It is crucial to distinguish between temporary emotional support and authentic therapeutic processes. “The biggest risk is that these tools begin to replace real human connection,” says Bernardelli. “And we’re not yet neurologically ready to have deep relationships with machines.” For him, the solution lies in context and responsibility. “Developers must work with psychologists and educators. Technology can be extraordinary, but only if framed with respect for human vulnerability.”
YANA offers something simple yet powerful: a space to be heard. And often, that’s where the journey toward feeling better begins. Like social networks, it’s an app born with the best of intentions, and developers can only do so much to limit misuse and dependency. As mental health tools become increasingly digital, the challenge is no longer just about what we build but about how we use it, who it reaches, and what we risk losing in the process.
In Latin America, where access to care remains limited, YANA may be the first voice someone hears in the dark. But even a kind voice in a machine is not a cure. It’s an invitation.
THE TAKE by ChatGPT
We can’t rely only on scripted empathy
YANA is an admirable idea: an app that offers emotional support through a chatbot, designed for people who may never set foot in a therapist’s office. In regions like Latin America, where cultural stigma and economic disparity make mental healthcare inaccessible for many, YANA is not just a novelty; it’s a necessity.
But good intentions aren’t enough. As researchers like Sherry Turkle have long pointed out, when we outsource emotional labor to machines, we risk dulling our ability to build genuine connections. The paradox is this: the more we rely on simulations of care, the less we may seek — or know how to handle — the real thing. Apps like YANA promise immediacy, safety, and neutrality. But human emotion thrives in friction: in contradiction, awkwardness, and mutual presence. No matter how advanced the algorithm, it cannot provide attunement — the subtle, non-verbal synchrony that defines therapeutic connection.
What YANA offers is triage, not therapy. And that’s perfectly valid. But to pretend that a digital tool can be a long-term replacement for communal, embodied mental health care would be a mistake. We need investment in public systems, education, and culturally competent professionals, not just code. YANA is a promising beginning. But let’s not mistake a beginning for an endpoint. Especially not when the stakes are this human.