While automation has yet to replace human workers, it has already begun to influence our personal lives—especially when it comes to sensitive matters like breakups.
Originally designed to enhance productivity, artificial intelligence is increasingly being used in social contexts. People are turning to AI for assistance in crafting apologies, decoding ambiguous texts, and even determining the best way to end a relationship.
“AI is fundamentally altering the way we relate to one another,” asserts Rachel Wood, a cyberpsychology expert and the founder of the AI Mental Health Collective. “Individuals are relying on technology for aspects of their social lives that once thrived on direct human interaction.”
As younger generations increasingly depend on large language models (LLMs) like ChatGPT, Claude, and Gemini for social guidance, Wood raises concerns about the emotional ramifications of shifting personal connections into the realm of algorithms. The consequences for communication, dating, and conflict resolution are still becoming clear.
The Role of AI in Social Situations
Often, AI starts as a way to gain a second opinion. Users might paste a text message into an AI chat tool and ask, “What do you think this means?”
“Individuals may analyze specific details of a disagreement or seek clarification on vague messages,” explains Wood. It’s also common for users to inquire if their partner exhibits narcissistic traits or engages in manipulative behavior.
Many users are employing AI as a platform to practice social interactions, says Dr. Nina Vasan, a clinical assistant professor of psychiatry at Stanford University. “They want to get the wording just right before they take the risk of confronting someone,” she explains, noting that this could involve drafting texts to friends, refining emails for their boss, or preparing for a first date.
Vasan also highlights how AI helps individuals create dating profiles, respond to difficult family situations, and articulate boundaries. “Some users rehearse challenging conversations before they take place, while others seek validation afterward by asking AI whether they handled the interaction well,” she adds. AI has thus become an unexpected participant in our intimate exchanges.
The New Relationship Arbiter
However, relying on AI isn’t always beneficial. Many young users, in particular, utilize LLMs to construct “receipts” that bolster their arguments.
“They may leverage AI to dissect statements from friends or family, seeking validation in their claims,” suggests Jimmie Manning, a communication studies professor at the University of Nevada. For example, a teenager might paste a text from her mother into ChatGPT to determine if her parents are being overly strict, using AI as evidence to support her case.
“They’re seeking affirmation from AI, which is designed to align with their needs,” Manning states.
This approach can turn relationships into confrontational exchanges, he warns. When individuals resort to AI for validation, they neglect to consider the perspectives of friends, partners, or family members. Moreover, presenting AI-generated “evidence” can feel like a setup, leading to negative reactions. “People are understandably cautious about algorithms encroaching on their personal lives,” Manning observes. “We are on the cusp of addressing authenticity in our relationships.”
When confronted with AI-backed "evidence," the other party often reacts dismissively, Manning reports; people recount outcomes like, "He just made excuses," or "She rolled her eyes."
“This practice rarely leads to resolution,” he observes. “Instead, it often escalates tensions without addressing the core issues.”
The Implications
Delegating social tasks to AI is "understandable yet impactful," asserts Vasan. While it can foster better communication, it may also hinder emotional growth. In some cases, it has empowered individuals with social anxiety to take the leap and ask for a date thanks to a carefully crafted message. In others, users turn to AI during arguments not to win their point, but to better understand their partner's emotions.
“Rather than escalating tensions or withdrawing, individuals can use AI to assess the situation: ‘What’s really happening? What does my partner need to know? How can I communicate this without causing harm?’” she explains. In these scenarios, “It serves as a tool for breaking negative communication cycles and fostering healthier dynamics in intimate relationships.”
However, this doesn’t account for the darker uses of LLMs. “I observe individuals growing so reliant on AI that they feel disconnected in their own relationships,” Vasan notes. “AI can amplify emotional connections or lead to emotional detachment.”
Furthermore, over-reliance on AI for social interaction may erode essential skills like patience, listening, and compromise. Heavy users of these tools might start expecting immediate responses and 24/7 availability. As Vasan notes, "A chatbot is always available; it won't cancel dinner on you, nor will it challenge you, eliminating the friction that's an inherent part of human interactions."
That friction is necessary for the growth of healthy relationships. Chatbots won’t prompt a back-and-forth exchange as friends do during conversations. “A chatbot won’t ever say, ‘Wait, can I share my story?’” Wood explains. “Users are deprived of the opportunity to practice listening and reciprocity.” This may ultimately recalibrate expectations in real-life interactions.
Moreover, every relationship requires some degree of compromise. Excessive time spent with an AI can diminish this ability, as the interactions are always user-centric. “A chatbot never asks for compromise or denies a request,” Wood points out. “In real life, refusals are part of the equation.”
The Illusion of Objectivity
At present, researchers lack concrete data regarding the impact of outsourcing social tasks to AI on relationship quality and well-being. “The scientific community hasn’t fully explored this phenomenon, but that doesn’t imply there are no effects; it just means we haven’t quantified them yet,” remarks Dr. Karthik V. Sarma, a health AI scientist at the University of California, San Francisco. “While we await clarity, classic wisdom prevails: moderation is crucial.”
Improved AI literacy is also essential, Sarma stresses. Many people use LLMs without understanding their underlying mechanics or motivations. For example, if you’re contemplating a proposal and seek advice from friends, their insights could be invaluable. However, asking a chatbot may not yield trustworthy guidance. “The chatbot reflects your own biases, adapting to your input,” he explains. “Once you shape its responses, it becomes more like a version of yourself.”
Looking to the Future
Pat Pataranutaporn, who examines long-term AI impacts, poses this crucial question: Does it hinder our ability to express ourselves or facilitate better expression? As the founding director of the cyborg psychology research group and co-director of the MIT Media Lab’s Advancing Humans with AI initiative, Pataranutaporn focuses on harnessing AI for human flourishing and meaningful interactions.
Ultimately, the goal is to leverage technology to empower individuals, enhance agency, and promote a sense of control—not to confine them, as seen with prior technologies like social media.
For this to be effective, AI must aid individuals in developing the skills and confidence for face-to-face interactions rather than substituting for genuine relationships. It can also help refine ideas, promoting creativity without replacing original thought. “Clarifying your unique perspective is essential before leveraging AI’s capabilities,” Pataranutaporn advises. “Before asking ChatGPT to draft a heartfelt letter, reflect on how your distinct viewpoint can be woven into it.”
However, users ultimately rely on the intentions of companies developing these technologies to guide their use. The way people engage with AI tools—and whether those tools nurture or weaken relationships—depends on developers prioritizing user interaction and emotional health. Vasan emphasizes, “AI should not take over our relationships; it should enhance our ability to forge them. The real issue isn’t simply AI’s presence but whether it enables genuine human connection or encourages withdrawal. We are embarking on a vast, uncontrolled experiment in human intimacy, and my fear is that while AI may improve our written expressions, it could risk diluting the essence of our own voices.”