September 27, 2023 - 1:30pm

This week, OpenAI announced that ChatGPT can now “see, hear, and speak,” broadening its interactive range. “Settl[ing] a dinner table debate” and “request[ing] a bedtime story for your family” are among the suggested use cases. OpenAI seems to be positioning ChatGPT as more than just a tool: perhaps an AI companion.

Predictions of robot and AI companions have circulated in the tech news world for decades, going back at least to the mid-1960s and Joseph Weizenbaum’s ELIZA, arguably the first and best-known example of a human-computer “chatbot”-style interaction. The idea has taken on renewed significance with the rise of AI tools like ChatGPT, Midjourney, Replika, and character.ai.

These developments could well mean that we are heading towards an ever more atomised future in which people feel more comfortable with robots (or chatbots, as it were) than with other people. Indeed, we may end up in a situation where human-human relationships become a luxury for those with “reality privilege”.

People are surprisingly skilled at nurturing one-sided affection, and numerous studies show that feelings need not be reciprocated (even by computers) for love to develop. At its most extreme, this tendency can manifest in frightening behaviours, such as the erotomania of Ruth Steinhagen or John Hinckley Jr. But there are, of course, also the parasocial relationships between celebrities or social media influencers and their fans; the myriad stories of people becoming attached to and anthropomorphising household objects and robotic pets; and the strange affection some people feel towards historical, religious, or even fictional figures and characters.

The question is how these dynamics would change if the objects of affection could interact back. What if there were a tool that allowed people to have conversations with that pet or object they’ve been weirdly attached to for years? What about a celebrity?

In studies of a phenomenon called “fictosexuality,” a rare sexual orientation in which individuals primarily feel attraction to fictional characters, researchers discovered an interesting paradox. People who experienced fictosexuality weren’t suffering from erotomania or any other type of delusion: intellectually, they were aware they were projecting onto someone (or something) who couldn’t reciprocate their feelings. Yet the affection felt just as real as human-to-human affection. This has been well documented in Japan, in books like The Moe Manifesto, which describes people regularly becoming attached to pop idols and anime characters. Often it is a symptom of extreme loneliness, and these people never return to “normal” lives.

Two other interesting quirks might strengthen the case for the rise of AI companions. The first is that, according to a post-Covid study of Zoom interactions, in-game and computer-mediated “eye contact” has the same psychological impact on people as in-person eye contact: it creates a feeling of connection. If we already have an incredible capacity for one-sided relationships, including with chatbots, what happens when we add eye contact to the mix?

The second is that it’s well documented that something being obviously “fake” (in this case, non-human) won’t prevent people from getting attached. In the infamous case of Miranda Grosvenor, and in other well-known stories of telephone chat-line deception, people maintained relationships with their deceivers even after discovering the truth. More mundanely, this is something we see daily on programmes like Dr. Phil and MTV’s Catfish. The human will to hang onto connection is strong.

AI companionship might not be such a far-off prediction after all. But it could well end up being dystopian, as people give up on human-to-human relationships and opt for more accessible chatbots.


Katherine Dee is a writer. To read more of her work, visit defaultfriend.substack.com.
