Toward a Responsible Vision of AI in Mental Health Tools: Interview with Daniela Andrade, Head of Growth at Resolution
by Melissa Carleton
Content warnings: AI in mental health, suicide, abusive relationships
Amid rapid technological change, widespread job loss, and worsening socioeconomic inequality, individuals face more mental health challenges than ever. Alongside the AI boom, the market for AI in mental health is large and growing: globally, it was estimated at $1.13 billion in 2023 and is projected to reach $5.08 billion by 2030.
Access to high-quality life advice or therapy matters now more than ever. Many entrepreneurs have spotted an opportunity: creating AI for mental health. Daniela Andrade is one such individual. Daniela graduated from Harvard in 2025 and is Head of Growth at Resolution, a startup that serves primarily young women by providing them with an “AI guardian angel” called Fabio to help them navigate toxic relationships.
As a disclaimer, the views expressed in this article do not necessarily reflect The Honest Economist’s views on the ethics of AI in mental health care. Tragic events such as AI-assisted suicides lead many to rightly question the use of AI in mental health care. This interview acknowledges those ethical concerns.
Daniela powerfully articulates the vision of an AI tool developed hand-in-hand with its users and with responsible guardrails, including working directly with therapists who receive alerts when concerning user behavior occurs. This type of tool may provide an alternative to technology that is so often created without direct user feedback.
In this interview, we touch on themes such as how the power of AI and big data can help women catch concerning signals of abusive relationships before they reach a crisis state. We discuss what it’s like when AI enhances human connection rather than replaces it.
We then delve into the economics of the gender wage gap among young women, a group often left out of the conversation. We explore how an AI tool developed by young women could help close this gap in a world where AI is not designed with them in mind. Even if readers of this article walk away with lingering skepticism regarding the use of AI for emotional support, I hope they become inspired to reflect on how existing AI tools could become more socially responsible.
The Power of Data: AI for Modern Romance
Melissa: Hi Daniela, thank you so much for speaking with me today. My first question for you is, how would you characterize the market that you serve? Would you say it’s mental health technology, AI and mental health, or AI and self-help? How do you prefer to define it?
Daniela: We call it an AI emotional support system for modern romance. So, emotional support.
Melissa: I like that. AI for Modern Romance. I’d like to ask, what are your users’ pain points, and why introduce AI in particular to help mitigate these?
Daniela: Our main users are women who are in very toxic, physically, mentally, emotionally abusive relationships. They don’t usually have people who can help them, and their friends have been trying to convince them to leave the boyfriend with little success. So these are the girls who are struggling with, you know, self-confidence, self-worth, and finding ways to get out of really bad relationships.
To answer the second part of your question, why AI: what we find with a lot of these girls I was just referring to is that a lot of times they lose friends over the boyfriend, right? They often cut people off because their friends told them to leave the boyfriend and they don’t want to hear it, or the boyfriend won’t allow them to see their friends, or they’re just with the boyfriend all the time and they kind of lose touch with their friends.
We found that when you lose your friends, it only reinforces the toxic situation. You further lose touch with reality, making it more difficult to understand that this is not normal, this is not healthy, this is not safe.
And so we wanted to create a tool that can respond to you instantly, because most therapists aren’t available at 11 PM or midnight if a fight breaks out between you and your boyfriend. A personalized AI can frame its response in the way that’s most persuasive, so that you’ll actually listen, or be open to hearing things that will help you be safe and help you leave a bad relationship.
And I think that personalized element would not have been possible without LLM technology. On the backend, AI has instant access to the context and history of your relationship, and we wanted to bring these elements directly to the consumer.
Melissa: And so, for people who find themselves in this toxic and abusive situation, AI provides a safe confidant, you could say, in contrast to a friend who might try to talk you out of it, tell you that you’re overreacting, or tell you that you’re underreacting.
Do you think that having access to this AI actually increases the probability that a woman in this situation would get help?
Daniela: That’s the goal, and we’re still experimenting and testing with our initial users. But that’s ultimately the goal.
We have a strong belief that if our tool works, it will help women, you know, find someone they can trust and leave their toxic relationship. And really interestingly, we spoke to this girl who was in an abusive relationship similar to the one I was in. And when she left that relationship, it was because her friends hung out with her every day immediately after the breakup.
They made, you know, they kind of did a friendship calendar, an events calendar of we’re gonna do these activities: after class we’re gonna take you here, we’re gonna go bowling, we’re gonna do fun stuff. When we heard that success story (not involving AI), we were like, there are so many ways to integrate that dynamic, because a lot of times your friends don’t know how much you’re hurting.
One feature that we’re working on with Fabio is a group chat feature where he can make a group chat with your close friends that you trust and that you love and can tell them, with your consent, that you’re struggling with this relationship and would love to see them, you know?
And your friends would realize they didn’t understand the gravity of the situation because it takes a village, right? And we do not believe that Fabio can do it alone either. We definitely want to leverage your existing human relationships to help you.
AI as a Complement, Not a Substitute for Human Relationships
Melissa: Yes, that human relationship part is key. So that launches into my next question: how are you challenging the misconception that AI tools in mental health replace human connection?
Daniela: Yeah, well, like I was sharing before, we’re not in the business of replacing friends. We’re not an AI friend. We are an AI support system. We do not want to replace your friends, family, or therapists. We are working with clinical partners.
If there were some emergency language around suicide, we would definitely work with therapists immediately and connect users to therapists in our network.
I think there’s this broader conversation of humans versus AI, but in reality, it’s humans who use AI versus humans who don’t use AI. AI, like anything in life, is a tool. It can sound like a human being, but it doesn’t replace the human nature of relationships, at least at this current point.
We are in the business of bettering your relationships with the people around you. We’re starting with romantic relationships, but then we plan to expand to friendships, because we know from a lot of the conversations we’ve had with girls on our waitlist that there’s a lot of drama with college roommates. We ultimately want to make your relationships with people better, because technology has done a lot to erode those relationships, but we believe that the right technology can actually improve them.
AI in Mental Health: Mitigating Economic Inequality
Melissa: Yeah, that’s incredible. My next question relates to the economics of your product. What you are describing is an understudied area in economics, but let’s say that a woman is in a toxic and abusive relationship. You touched a lot on the psychological impact and what that can do to a young woman. What about the economic impact? What about, say, reducing the woman’s earnings?
Individuals could get distracted in college by having to deal with these issues, versus spending time on classes, applying for jobs, or seeking opportunities. Could this tool help increase women’s economic power, especially as they experience these difficult situations and recover from them?
Daniela: Great question, because I took a class at Harvard Law School on gender violence, and we had a reading about the economic impacts of domestic violence on women: not only women’s earning potential but also society’s economic potential.
If you’re in a really poor romantic relationship, that significantly impacts your mental health. Which also affects your earnings and just your ability to interact with others and do positive things for society, and for yourself, ultimately. And so, I definitely think women in abusive relationships are often actively thinking about that relationship or worrying about safety. These effects on your health affect you economically, right? Women seek out mental health services more than men. But those mental health services also tend to cost money and aren’t always covered by insurance.
Women who are in violent relationships might need to go to the doctor because they’ve suffered physical assaults. There are definitely effects on their health, and our goal with Fabio is to use a consumer model where individuals can pay less for active support versus paying hundreds of thousands of dollars for therapy.
Melissa: Right, as you’re touching on, it can be very expensive to recover from incidents like these or seek help. And it’s unfair. A woman might think, okay, well, if I seek help, then I might get slapped with this huge medical bill. But you’re saying that you’re introducing a tool to actually prevent these incidents from occurring or mitigate their effects. That could be incredibly helpful. In a way, it’s no different from some AI app that reminds people to take their medication, for instance.
Daniela: Not to mention, women have all these additional health fees. And we make less money than men per hour for the same work. So there is also that element of, like, we make less, but we also have to spend more.
Melissa: Yes, that is so true. I would be interested in looking at the main drivers of the gender wage gap by age group. Traditionally, unequal caregiving responsibilities are cited as one of the main drivers. But we never really talk about it for women at the college age or right after graduating college.
Combating Assumptions with Positive Representation
Melissa: Another question that popped up is that a lot of the negative press coverage around AI and mental health focuses on tragic incidents where AI assisted somebody in taking their own life, or something like that. A lot of these AI tools have actually been developed by groups of people that are not very diverse. And so, inevitably, AI is going to be something people use, and the developers are not always taking responsibility when these incidents occur.
Given this context, why do you think it’s important to have, say, a group of young women developing these tools for the people that they’re serving and who have been through it themselves? How does that counter some of the negative perceptions about AI and mental health?
Daniela: That’s a great point. We often forget that all the technology that we’re currently experiencing the negative impacts of were created by white men, and that we’re not considering women and our concerns, or at least not broadly considering them. We’re a strong believer of understanding the pain points of our users personally and listening to the users.
You can be a man and build a product for women in a way that effectively speaks to their concerns if you really work hand-in-hand with your users. But we’re not really seeing that, right? A lot of times, we’re seeing quite the opposite: the gaslighting of female users, or the victim blaming.
What distinguishes us is that we’re really working with our users. It’s not to say that just because I’m a girl, I know the experience of every single girl in the world. But we work very directly with our users and try to build relationships with them, because we ultimately want to gain their trust and build for them. We care about challenging our own assumptions by building for the people who have invested their time and trust in us, and supporting them in return.
And so, we don’t, like I said, we don’t have all the answers, though we do have a different philosophy behind what’s driving us and culture around what’s helping us build, which I don’t think has been effectively done before by incumbent companies for this demographic of users that we’re targeting.
The Path Forward
Melissa: Yes, if you look at it, it’s predominantly the same demographic building these tools. This brings me to my last question. Is there anything else you’d like to add, whether about Fabio, Resolution, or these broader economic issues that we are discussing?
Daniela: The last thing I’d like to add is that I just want people to be more open-minded. I think we’ve been so traumatized by the effects of technology. A lot of people, especially people who have children or who are a lot older and have experienced these different waves of technology, oppose it more aggressively.
But I do think that innovation has led us to this point and I think that innovation to help others succeed is needed, because we are the quality of our relationships, right? When we think about who we are, it’s like, who’s gonna be at our funeral, right? Like who are these people we’ve cared about? None of the money you made matters. It all comes down to, who are the people in your life that care about you and care about your legacy?
And we ultimately believe that technology doesn’t have to replace your relationships. It can actually support them and make them better.
There will definitely be growing pains (with our startup), and we’re not going to get everything right. We may make mistakes, but I also think that our overall intention and goal is to improve human relationships.
Melissa: That’s extraordinary.
Daniela: Well, thanks so much for bringing me on today. This was a good conversation that helped me understand what other people have been observing, as well.
Sources:
Barron, Jesse. “A Teen in Love With a Chatbot Killed Himself. Can the Chatbot Be Held Responsible?” The New York Times Magazine, October 24, 2025. https://www.nytimes.com/2025/10/24/magazine/character-ai-chatbot-lawsuit-teen-suicide-free-speech.html
Goldin, Claudia. “A Grand Gender Convergence: Its Last Chapter.” American Economic Review, vol. 104, no. 4 (April 2014): 1091–1119. https://www.aeaweb.org/articles?id=10.1257%2Faer.104.4.1091
Hill, Kashmir. “A Teen Was Suicidal. ChatGPT Was the Friend He Confided In.” The New York Times, August 26, 2025. https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html
InsightAce Analytic. “AI in Mental Health Market Size, Share & Trends Analysis Report by Application (Conversational Interfaces, Patient Behavioral Pattern Recognition), by Technology (Machine Learning, Deep Learning, Natural Language Processing (NLP), and Others), by Component, by Region, and by Segment Forecasts, 2025–2034.” InsightAce Analytic, March 6, 2025. https://www.insightaceanalytic.com/report/global-ai-in-mental-health-market-/1272
Resolution. “Meet Fabio – Your Guardian Angel.” Fabio (webpage). Accessed December 3, 2025. https://meetfabio.com/about
Wells, Sarah. “Exploring the Dangers of AI in Mental Health Care.” Stanford HAI, June 11, 2025. https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care

