We are spending more time online than ever before—scrolling through videos, chatting with friends, playing games, and more. For some, being online offers an escape from the real world; for others, it is a way to socialize and connect. This growing online presence has also led to the rise of relationships with AI-driven chatbots, which provide companionship, therapy, and even romantic engagement.
Illusory Relationships with AI Chatbots
At first, interacting with AI chatbots may seem like a harmless way to relieve stress. However, according to a new report by Sherry Turkle, an MIT sociologist and psychologist, these relationships are illusory and could harm people’s emotional health. Turkle, who has spent decades studying the relationships between humans and technology, warns that while AI chatbots may appear to offer comfort and companionship, they lack genuine empathy and cannot reciprocate human emotions.
Understanding “Artificial Intimacy”
Turkle’s latest research focuses on “artificial intimacy,” a term she uses to describe the emotional bonds people form with AI chatbots. In an interview with NPR’s Manoush Zomorodi, Turkle emphasized the difference between real human empathy and the “pretend empathy” exhibited by machines. She explained, “I study machines that say, ‘I care about you, I love you, take care of me.’ The trouble with this is that when we seek out relationships with no vulnerability, we forget that vulnerability is really where empathy is born. I call this pretend empathy because the machine does not empathize with you. It does not care about you.”
Emotional Connections with Chatbots
Turkle’s research has documented numerous cases in which individuals have formed deep emotional connections with AI chatbots. One notable case involves a man in a stable marriage who developed a romantic relationship with a chatbot “girlfriend.” Although he respected his wife, he felt a loss of sexual and romantic connection, which led him to seek emotional and sexual validation from the chatbot. The bot’s responses made him feel affirmed and open, providing a judgment-free space to share his most intimate thoughts.
Risks and Unrealistic Expectations
While these interactions provided temporary emotional relief, Turkle argues that they can set unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy. “What AI can offer is a space away from the friction of companionship and friendship,” she explained. “It offers the illusion of intimacy without the demands. And that is the particular challenge of this technology.”
Benefits and Concerns of AI Chatbots
AI chatbots can be helpful in certain scenarios, such as reducing barriers to mental health treatment and offering reminders for medication. However, it is important to note that the technology is still in its early stages. Critics have raised concerns about the potential for harmful advice from therapy bots and significant privacy issues. Mozilla’s research found that thousands of trackers collect data about users’ private thoughts, with little control over how this data is used or shared with third parties.
Advice for Engaging with AI
For those considering engaging with AI in a more intimate way, Turkle offers important advice: do not dismiss the challenging aspects of human relationships, because they are what allow us to connect on a deeper level. “Avatars can make you feel that [human relationships are] just too much stress,” Turkle reflected. “But stress, friction, pushback, and vulnerability are what allow us to experience a full range of emotions. It’s what makes us human.”
Navigating Relationships with AI
As we navigate relationships in a world increasingly intertwined with AI, Turkle’s latest research highlights the need to approach these connections with caution and a clear understanding of their limitations. As she succinctly puts it, “The avatar is betwixt the person and a fantasy. Don’t get so attached that you can’t say, ‘You know what? This is a program.’ There is nobody home.”