Researcher studies married men with AI girlfriends


A sociologist at the Massachusetts Institute of Technology is studying the artificial intimacy offered to humans by AI chatbots — including for people who are married in real life.

In an interview with NPR, MIT researcher Sherry Turkle said she is interested in “machines that say, ‘I care about you, I love you, take care of me.'”

Humans have long formed intimate bonds with inanimate objects, and Turkle has been examining similar phenomena since the 1990s with interactive toys like Tamagotchis and Furbies. But recent advances have put intimate relationships with AI into overdrive.

For Turkle, the feelings people have for their AI companions present a curious psychosocial conundrum.

“What AI can offer is a space away from the frictions of camaraderie and friendship,” the researcher and author told NPR. “It offers the illusion of privacy without the demands. And that’s the particular challenge of this technology.”

Take, for example, a married man at the center of one of Turkle’s case studies.

Although the unnamed man said he respected his wife, she had turned away from him to focus on their children, which made him feel like their relationship had lost its romantic and sexual spark. After he began chatting with an AI about his thoughts and anxieties, the man said he felt validated, particularly in the way the chatbot seemed to express sexual interest in him. He felt affirmed and not judged in these interactions, suggesting he didn’t feel the same way with his wife.

It’s unclear whether or to what extent the man’s wife or children are aware of his AI “girlfriend,” but it’s clear from the information shared that he has expressed some level of vulnerability with the chatbot — a vulnerability that occurs, Turkle suggests, under false pretenses.

“The problem is that when we seek relationships without vulnerability, we forget that it is in vulnerability that empathy is born,” she said. “I call it faking empathy, because the machine doesn’t empathize with you. It doesn’t care about you.”

Rather than judge people who turn to technology for their human needs, Turkle instead offers a few words of caution to those who take the AI-assisted path: to remember that chatbots are not people, and while they may produce less stress than human relationships, they also can’t truly fulfill the roles that humans can play in our lives.

“The avatar is somewhere between a person and a fantasy,” the researcher reflected. “Don’t get so attached that you can’t say, ‘You know what? It’s a program.’ There’s no one home.”

Learn more about AI relationships: Tech exec predicts billion-dollar AI girlfriend industry
