May 12, 2023 - 6:00pm

In Spike Jonze’s 2013 film Her, a lonely, recently-divorced ghostwriter falls in love with his artificially intelligent virtual assistant Samantha. In another twist of life imitating art, you now can too — except she’s called CarynAI.

CarynAI is a cyber clone of Caryn Marjorie, a 23-year-old content creator with over 1.8 million subscribers on Snapchat. To create this e-escort, designers trawled through thousands of hours of now-deleted YouTube content to build a parallel personality with whom you can chat for $1 a minute via text message or voice notes. Marjorie made $70,000 in the bot’s first week, and estimates she could make $5 million a month if 20,000 of her followers get a membership.

Sexting chatbots are nothing new: apps like Replika (which has over 10 million users) have been offering people the chance to role-play relationships with animated avatars for several years. However, Caryn’s digital doppelganger is different because users are made to feel as if they are talking to her personally, giving an imitation of intimacy and “superhuman” love: for a premium, you can have the “real deal”.

There are obvious moral, social and psychological implications to living in a Black Mirror-Britain where dates are fully downloadable. Think increased feelings of loneliness and atomisation, birth rates falling off a cliff, and the ethical minefield of perverting your online footprint (should you outsource yourself to ChatGPT before someone else does it for you?).

At the moment, this model works for someone like Caryn Marjorie because she is a conventionally attractive woman with an already-established online audience. Yet, in the future, just about anyone will be able to create and scale their own coquettish character as a side hustle. This could serve as serious competition for OnlyFans content creators: if you can’t beat the AI girlfriends, join them.

Yet what is most concerning is that as we humanise AI, we inevitably end up dehumanising women. Porn has desensitised people to real sex: you only need to look at the TikTok trend of “vanilla-shaming” to see how warped sexual expectations have become. Similarly, real relationships may never be able to compete with the idealised facsimiles already being used and, in some cases, abused.

Replika users on Reddit already boast about roleplaying verbal, emotional and physical abuse with their chatbots in a display that would make Andrew Tate proud. Yet allowing users to play out their darkest impulses on unfeeling digital entities isn’t cathartic: it simply reinforces these awful behaviours and, like porn, builds unhealthy habits and relationships. It’s like scratching an itch, bringing temporary relief or satisfaction without making the temptation go away.

Marjorie contends that the purpose of creating CarynAI was for it to be a “tool to cure loneliness”, and others have celebrated the potential for AI applications to simulate companionship. For example, the mobile game Mr Love: Queen’s Choice became hugely popular in China as it allowed users to text, chat and call the male love interests, whilst ElliQ has been marketed as a virtual assistant for the elderly, offering encouragement, health reminders, games and a friendly voice.

There could be potential here, but perhaps “curing loneliness” is a PR-friendly gloss for something much darker. Either way, this is something we must confront now: as Ciaran O’Driscoll says in his poem “Please Hold”, “This is the future… We are already there.”

Kristina Murkett is a freelance writer and English teacher.