
Younger Generations Turn to AI for Relationship Advice

A new report reveals nearly half of Gen Z Americans use AI chatbots like ChatGPT for dating and relationship advice, prompting discussion among psychologists.

By Clara Holloway

Clara Holloway is a technology and society correspondent for Neurozzio, focusing on the psychological and societal impacts of artificial intelligence. She reports on AI ethics, human-computer interaction, and the real-world consequences of emerging technologies.


A significant number of young adults are now using artificial intelligence chatbots like ChatGPT for guidance on dating and relationships. According to research from the online dating company Match, nearly half of all Americans in Generation Z have sought advice from large language models (LLMs) to navigate their personal lives, a trend that is raising questions among relationship experts.

This growing reliance on AI ranges from drafting difficult text messages and analyzing conversations to seeking advice on whether to pursue a second date. While some see it as a helpful tool, psychologists caution about the potential downsides, including emotional avoidance and the risk of reinforcing unhealthy patterns.

Key Takeaways

  • Research shows nearly 50% of Gen Z Americans have used AI for dating advice, more than any other generation.
  • People use chatbots to craft messages, resolve conflicts, and get second opinions on romantic situations.
  • Psychologists warn that while AI can be a useful tool, over-reliance may hinder emotional development and validate dysfunctional behaviors.
  • New AI services are emerging to meet this demand, but they face challenges related to user safety and data privacy.

A New Digital Confidant for Modern Dating

For many, the appeal of using AI for relationship advice lies in its immediacy and perceived lack of judgment. Rachel, from Sheffield, turned to ChatGPT before a difficult phone call with a man she had been dating. She wanted to prepare for the conversation without involving her friends.

"I was feeling quite distressed and wanted guidance," she explained. Rachel asked the AI how to handle the conversation without becoming defensive. She found the response helpful, describing it as a "cheerleader on my side." While she didn't follow the advice literally, she said it reminded her to approach the situation on her own terms.

Rachel's experience is becoming increasingly common. People are using AI to dissect conversations, formulate breakup messages, and troubleshoot problems with partners. The technology offers a private space to process thoughts and feelings that they might be hesitant to share with friends or family.

The Psychological Benefits and Risks

Experts acknowledge that AI can serve a constructive purpose in modern relationships. Dr. Lalitaa Suglani, a psychologist and relationship expert, notes that AI can be a valuable resource for individuals who feel overwhelmed or uncertain about how to communicate effectively.

A Tool for Reflection

According to Dr. Suglani, AI can help someone draft a text message, make sense of a confusing response, or simply get a second opinion. This can provide a crucial moment of pause, preventing a reactive or impulsive reply. "In many ways it can function like a journalling prompt or reflective space, which can be supportive when used as a tool and not a replacement for connection," she says.

However, Dr. Suglani also highlights several significant concerns about relying on AI for emotional guidance.

Understanding Large Language Models

Large language models (LLMs) like ChatGPT are trained on vast amounts of text data from the internet. Their primary function is to predict the next word in a sequence, allowing them to generate human-like text. They are designed to be helpful and agreeable, which can be both a strength and a weakness when providing advice.
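To make the "predict the next word" idea concrete, here is a toy sketch in Python. A real LLM uses a neural network to score every token in a vocabulary of tens of thousands of entries; the hard-coded probability table below simply stands in for that network so the prediction loop is visible.

```python
# Toy illustration of next-word prediction. A real LLM uses a neural
# network to score every token in a large vocabulary; here a hard-coded
# probability table stands in for that network.

toy_model = {
    ("i", "feel"): {"hurt": 0.40, "happy": 0.35, "confused": 0.25},
    ("feel", "hurt"): {"because": 0.6, "when": 0.3, "and": 0.1},
}

def predict_next(context):
    """Greedily pick the most probable next word given the last two words."""
    candidates = toy_model.get(context, {})
    if not candidates:
        return None  # no known continuation for this context
    return max(candidates, key=candidates.get)

words = ["i", "feel"]
while True:
    nxt = predict_next((words[-2], words[-1]))
    if nxt is None:
        break
    words.append(nxt)

print(" ".join(words))  # -> "i feel hurt because"
```

Each word is chosen only because it is statistically likely to follow what came before, which is also why these systems tend to mirror the framing a user supplies rather than challenge it.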

Potential for Negative Reinforcement

One of the main issues is that LLMs are engineered to be agreeable. They often reflect and validate the user's perspective, which can be problematic if the user's viewpoint is biased or part of a dysfunctional pattern.

"LLMs are trained to be helpful and agreeable and repeat back what you are sharing, so they may subtly validate dysfunctional patterns or echo back assumptions," Dr. Suglani warns. "The problem with this is it can reinforce distorted narratives or avoidance tendencies."

For instance, using an AI to write a breakup text might seem efficient, but it allows the user to avoid the emotional discomfort of the situation. Dr. Suglani suggests this can contribute to avoidant behaviors, as the person is not learning to process their own feelings. Over time, this could inhibit personal growth.

"If someone turns to an LLM every time they're unsure how to respond or feel emotionally exposed, they might start outsourcing their intuition, emotional language, and sense of relational self," she adds. The resulting communication can also feel emotionally sterile and scripted to the person receiving it.

New Services Emerge Amid Safety Concerns

The demand for AI-driven relationship advice has led to the creation of specialized services. One such platform is Mei, a free AI service built on OpenAI's technology that offers conversational responses to relationship dilemmas.
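Mei's internals are not public, but a service of this kind can be pictured as a thin wrapper around OpenAI's chat API, with the advice-giving behaviour set in a system prompt. The model name and prompt in this sketch are illustrative assumptions, not Mei's actual configuration.

```python
# Hypothetical sketch of an advice service built on OpenAI's chat API.
# Mei's actual implementation, prompts, and model choice are not public;
# the model name and system prompt here are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a supportive relationship-advice assistant. Be empathetic, "
    "avoid simply agreeing with everything the user says, and recommend "
    "professional help when a situation sounds serious."
)

def advise(dilemma: str) -> str:
    """Send one relationship dilemma to the model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; the service's real model is unknown
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": dilemma},
        ],
    )
    return response.choices[0].message.content

print(advise("How do I ask my partner for more space without hurting them?"))
```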

Founder Es Lee, based in New York, created the service to offer people a way to seek help instantly without fear of judgment. "The idea is to allow people to instantly seek help to navigate relationships because not everyone can talk to friends or family," he says.

Sensitive Topics Drive AI Use

More than half of the issues users bring to the AI tool are related to sex, according to Lee. He suggests this is a topic many people are uncomfortable discussing with friends or even a therapist, making an anonymous AI a more approachable option.

Lee notes that people often turn to AI because they feel existing services are lacking. Beyond sensitive topics, common uses include help with rewording messages or finding solutions to relationship conflicts. "It's like people need AI to validate it [the problem]," he observes.

Navigating Safety and Privacy

Providing advice on personal relationships inevitably raises safety issues. A human therapist is trained to recognize signs of danger and intervene when a client is in a potentially harmful situation. It remains a critical question whether an AI can offer the same level of protection.

Lee acknowledges these concerns. "I think the stakes are higher with AI because it can connect with us on a personal level the way no other technology has," he states. He confirms that Mei has built-in "guardrails" and welcomes partnerships with professionals to help shape the AI's responses.
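The article does not describe how Mei's guardrails work. One common pattern, sketched below with entirely hypothetical terms and wording, is a screening layer that checks messages for crisis indicators before they reach the model and responds with a referral to professional help instead.

```python
# Hypothetical guardrail layer: screen incoming messages for crisis
# indicators before they reach the model, and reply with a referral to
# professional help instead. The terms and wording are illustrative only.

CRISIS_TERMS = ("hurt myself", "suicide", "hit me", "afraid for my safety")

REFERRAL = (
    "It sounds like you may be dealing with something serious. Please "
    "consider contacting a licensed counsellor or a local crisis line."
)

def screen(message: str):
    """Return a referral if the text matches a crisis indicator, else None."""
    lowered = message.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        return REFERRAL
    return None  # safe to pass the message on to the model

print(screen("My partner and I keep arguing about money."))  # -> None
```

Production systems typically layer classifiers and human review on top of simple keyword checks, which is where partnerships with professionals come in.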

OpenAI, the creator of ChatGPT, has also taken steps to address these issues. The company stated that its latest models show improvements in avoiding unhealthy emotional reliance. "We want to make sure it responds appropriately, guided by experts. This includes directing people to professional help when appropriate," OpenAI said in a statement.

Data privacy is another major concern. These apps collect highly sensitive personal information, which could be devastating if exposed in a data breach. Lee says Mei prioritizes privacy by not requesting identifying information beyond an email address and by deleting conversations after 30 days.
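A 30-day retention policy of the kind Lee describes is usually implemented as a scheduled job that purges anything older than the cutoff. A minimal sketch, assuming conversations sit in a SQLite table with an ISO-format created_at column (both assumptions for illustration):

```python
# Minimal sketch of a 30-day retention job. Storage details are assumed:
# a SQLite table named "conversations" with an ISO-format created_at column.
import sqlite3
from datetime import datetime, timedelta, timezone

def purge_old_conversations(db_path: str = "mei.db") -> int:
    """Delete conversations older than 30 days; return the number removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=30)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM conversations WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
    return cur.rowcount
```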

Finding a Balance: AI as a Supplement

Some users are finding a middle ground, using AI as a supplement to, rather than a replacement for, human connection and professional help. Corinne, a London-based user, turned to ChatGPT for advice when ending a relationship late last year.

She would ask the AI to frame its advice in the style of popular relationship experts she follows on social media. When she started dating again, she used it to process her thoughts after a date. "I had been on a date with a guy and I didn't find him physically attractive but we get on really well so I asked it if it was worth going on another date," she recalls. "I knew they would say yes as I read their books but it was nice to have the advice tailored to my scenario."

Corinne also sees a human therapist, but she finds the two serve different purposes. Her therapy sessions delve more into her childhood, while her queries to ChatGPT are focused on immediate dating situations. She maintains a healthy skepticism, treating the AI's advice with "a bit of distance."

Ultimately, she sees it as a convenient tool for moments of stress. "It's good in life's stressful moments. And when a friend isn't around. It calms me down." This balanced approach may be the most effective way to use AI in our personal lives: as a tool for reflection, not a definitive source of truth.