April 2, 2019

Sex robots are always good for a bit of clickbait (though UnHerd has featured some thoughtful considerations of the subject by Kate Devlin and Rowan Pelling).

But how close are we to producing a convincing android as opposed to a glorified sex toy? Films like Ex Machina and TV shows like Channel 4’s Humans depict robots that look like people, in a near-future setting. In reality, though, the necessary technology is a long way from being developed. If you’re worried about the tech-driven disruption of human sexuality, then the ubiquity and extremity of internet pornography is a more immediate concern.

Beyond the porn issue, the near-term danger isn’t one of robots displacing real people in our beds, but in our hearts.

Consider the case of HitchBOT, a relatively simple robot created by a team at Ryerson University, Toronto – and designed for no other purpose than hitching rides with passing motorists. Writing about this technological and social experiment for the BBC, Jane Wakefield describes the little robot as “cartoon-like” and “non-threatening”:

“In order to qualify as a robot, it had to have some basic electronics – including a Global Positioning System (GPS) receiver to track its journey, movements in its arms, and software to allow it to communicate when asked questions. It could also smile and wink.

“And, of course, it could move its thumb into a hitch position.”

After the experiment went public, HitchBOT became a social media sensation. There was no shortage of fans willing to help out when it was sent on a mission to hitchhike across Canada. Setting off from Halifax, Nova Scotia on 27 July 2014, HitchBOT made it to Victoria, British Columbia by 21 August. Not bad going.

The following year, there was a new mission – to hitchhike across the United States from Boston to San Francisco. However, HitchBOT only got as far as Philadelphia, where it was discovered on the side of the road – dismembered and decapitated by some unknown assailant.

Jane Wakefield’s article is headlined “can you murder a robot?” – a question presumably posed by an unknown editor. The answer is “obviously not”, but there’s no doubt that the senseless destruction of HitchBOT upset a lot of people.

Frauke Zeller, who led the team behind the project, was careful not to inflame the situation. Adam Gabbatt of the Guardian reported her as saying “I really want to emphasise I don’t think it has anything to do with the States nor with Philadelphia.” She also said “robots can trust humans but there’s always some people anywhere that might have issues for any reason.”

While “some people” certainly have “issues”, robots cannot “trust” (or distrust) humans. They don’t have any feelings at all, yet we so easily slip into speaking and writing about them as if they do. Furthermore, we reciprocate these imagined feelings with real ones of our own.

Jane Wakefield cites a number of other experiments demonstrating just how readily we treat robots as if they were people:

“Prof Rosalind Picard, who heads up the Affective Computing Lab… based at the Massachusetts Institute of Technology, thinks it comes down to human nature.

“‘We are made for relationships, even us engineers, and that is such a powerful thing that we fit machines into that,’ she said.

“But while it is important that robots understand human emotions because it will be their job to serve us, it might not be a good idea to anthropomorphise the machines.

“‘We are at a pivotal point where we can choose as a society that we are not going to mislead people into thinking these machines are more human than they are,’…”

But, of course, when these things become mass consumer products they will be anthropomorphised. Silicon Valley is full of people who know exactly how to manipulate our minds. Given how successfully they’ve addicted us to our mobile phones, do you think they’d pass up on the chance to hijack our capacity for love and compassion?

We need to prepare ourselves, because unlike the overhyped sex robots, there’ll be no need to achieve a convincing simulacrum of a living, breathing human being. If people can become devoted to something as crudely mechanical as HitchBOT, then mass consumer technology you can literally fall in love with can’t be very far away.

A good thing too, some might say. ‘Emotional support robots’ could provide comfort and companionship for anxious and lonely people.

In the New York Times, Nellie Bowles writes about a company called Care.Coach, which provides its customers with an animated onscreen cat (or some other winsome avatar) that offers conversation and reassurance on demand:

“The technology… is quite simple: a Samsung Galaxy Tab E tablet with an ultrawide-angle fisheye lens attached to the front. None of the people operating the avatars are in the United States; they mostly work in the Philippines and Latin America…”

“Early results have been positive… patients with avatars needed fewer nursing visits, went to the emergency room less often and felt less lonely.”

It’s significant that even though there are real people on the other end of the line, the interaction with the customers is filtered through a clearly non-human animated character. How long before the kind of AI that powers digital assistants like Alexa and Siri is used to fully automate emotional support services too?

Of course, people already use radio, TV, and the internet for company. A more fully interactive electronic companion, whether onscreen or downloaded into a robot, would seem to represent an advance – especially if it can be programmed to recognise and respond appropriately to shifts in mood, or even to emergency situations.

Still, I wonder about the wider effect on human society. Will an AI that’s good enough to replace the connections that ought to exist between people, but don’t, also displace those that currently do?

Our new friends might not look or behave like real people, but I’ve got a horrible feeling we might prefer it that way.


Peter Franklin is Associate Editor of UnHerd. He was previously a policy advisor and speechwriter on environmental and social issues.
