December 16, 2021

In the winter after the financial crash, two Swedish conceptual artists sent me to Hong Kong to interview the head of an offshore finance management company. Their project, Headless, played with the idea that finance has become an anonymous, self-propagating system of shell companies and proxies. To underline the sense of a faceless, agency-less force at work in the world, the artists used proxies — including me — for every part of the work.

What would happen, though, if instead of creating the impression of headlessness, we actually took artists out of the creative process? Well, now we can find out. Researchers are experimenting with applying machine learning to visual imagery, and the result is the unnerving phenomenon of ‘GAN art’.

A ‘GAN’ is a ‘generative adversarial network’, which is to say two machine learning networks trained in competition with one another. The first is fed a set of input data, such as a type of image, and told to generate new, believable versions of that data. The second is tasked with assessing whether the new, AI-generated data are real or fake.

The aim is for the first network to succeed in fooling the second network more than half the time. As the two models adjust in relation to one another, they become increasingly skilled at predicting and manipulating patterns in the input imagery.
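That adversarial loop can be sketched in miniature. What follows is a toy illustration, not a real image GAN: the ‘data’ here are just numbers drawn from a bell curve, the ‘generator’ is a single learnable shift, and the ‘discriminator’ is a one-variable logistic classifier. All names and parameters are invented for the sketch — but the structure is the same: the discriminator learns to tell real from fake, and the generator adjusts itself to fool the discriminator.

```python
import math
import random

random.seed(0)

REAL_MEAN = 4.0  # "real" data: samples drawn from a Gaussian centred at 4

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Generator: fake = noise + theta (a single learnable shift parameter).
# Discriminator: D(x) = sigmoid(w*x + b), a logistic real-vs-fake classifier.
theta, w, b = 0.0, 0.1, 0.0
lr, batch = 0.05, 32

for step in range(3000):
    real = [random.gauss(REAL_MEAN, 1.0) for _ in range(batch)]
    fake = [random.gauss(0.0, 1.0) + theta for _ in range(batch)]

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    gw = gb = 0.0
    for x in real:
        d = sigmoid(w * x + b)
        gw += -(1 - d) * x
        gb += -(1 - d)
    for x in fake:
        d = sigmoid(w * x + b)
        gw += d * x
        gb += d
    w -= lr * gw / (2 * batch)
    b -= lr * gb / (2 * batch)

    # Generator step (non-saturating loss): push D(fake) toward 1,
    # i.e. shift theta so the fakes fool the discriminator.
    gt = 0.0
    for x in fake:
        d = sigmoid(w * x + b)
        gt += -(1 - d) * w
    theta -= lr * gt / batch

print(f"learned shift: {theta:.2f} (real mean: {REAL_MEAN})")
```

After a few thousand rounds the generator’s shift settles near the real mean: neither side can gain further advantage, which is the equilibrium — the discriminator guessing at roughly chance level — that GAN training aims for.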

One playful result of this is web or smartphone apps that invite the user to ‘create’ — that is, prompt the AI to create — images based on text. Wombo is a popular one: you can choose from several different ‘styles’, enter your text prompt and watch the machine create an image.

When you do so, two things become apparent. First, the more abstract the prompt, the more difficult it is to distinguish the outputs from human-created art. But secondly, there’s something indefinably off about them. That comes more clearly into focus when you give the machine a more specific prompt — as I discovered when I asked Wombo to draw me a cat.

Cat images are so popular and widespread online that one 2013 cat food ad claimed that they make up 15% of all internet traffic. With such an immense dataset, you’d think an AI would find it easy to generate a cat image.

But when I prompted Wombo with ‘Kitten’ I got what looks like the disfigured victim of some deranged scientist’s experimental flesh-sculpting, as viewed through a haze of LSD. Here it is; judge for yourself.

A kitten, apparently. (Wombo)

Looking at this abomination, the off-ness becomes easier to grasp.

Everyone — at least, everyone human who’s ever spent time online — knows what a kitten looks like. A skilled artist can convey a kitten’s shape and movement with a few lines of a pencil. But that knowledge is acquired not just by mechanically digesting and attempting to synthesise the aggregate of two-dimensional kitten images, but also by understanding what kittens are, and what they mean in our cultural context.

Without that capacity to filter and actively re-interpret in terms of the wider context, replacing meaning with pattern recognition creates something that’s at once eerily almost-intelligible and deeply disturbing. It is, as Macbeth says, “a tale told by an idiot”.

So what, you might say. GAN ‘art’ isn’t art. It’s a game, to amuse us for a few moments online when we should be doing something else. But absurd as it is, the idea of AI ‘art’ only takes a pervasive contemporary dream to a slightly more absurd conclusion: that of eliminating individuals from human culture.

The positive version of this dream imagines that we could promote equality by minimising the role of specific, individual contributions to human society. You see versions of this in the widespread academic effort, for some decades now, to downplay the study of ‘great’ individuals in favour of mass movements or widespread sociocultural phenomena.

Well-established in disciplines such as history or literature, this has percolated even as far as the quintessential study of great individuals, the ‘Grand Strategy’ course at Yale. Here, the course director Professor Beverly Gage prompted debate last year when she shifted the course curriculum away from individual political actors toward grassroots activism and civil rights.

We see the same effort to de-emphasise leadership in favour of the collective in the rise of technocratic forms of government. This is often well-meant: this study shows populations support technocracy in proportion to how incompetent their elected government is perceived to be. In other words, the more disillusioned we grow with the ability of our elected leaders to govern, the more longingly we look to ‘neutral’, depoliticised ways of running our affairs.

And confidence in human leadership is waning across the board. Angry populists condemn the ruling class as out of touch; critics decry the current government as incompetent; others dismiss the electorate as ignorant or racist. The problem, across the board, is: us.

Faced with our manifestly disappointing powers of human self-governance, the dream of eliminating leadership can’t but seem appealing. And it’s in this light that we should view Tuesday’s vote by our MPs, to entrench vaccine passes in British domestic life.

This has been presented as a measure to control Covid — or, as George Osborne put it, a matter of “citizenship”. But it’s worth bearing in mind that emergency powers have a way of sticking around. Anti-terror measures enacted in America in the wake of 9/11 massively expanded US surveillance and are still in force. Once the infrastructure and social norms are in place for digital ID, they are unlikely to go away again — especially given the Government has already signalled a desire to introduce such a scheme.

From some perspectives, digital ID is an obvious and sensible move. In a 2020 video, the aerospace and weapons manufacturer Thales offers an enticing picture of how much easier many ordinary interactions could become with a ‘Digital ID Wallet’. Thales imagines this wallet, controlled from a smartphone by each individual user, centralising a user’s personal data, eliminating the messy business of individual assessment and human error from ID checks and record-keeping across all areas of life.

The power of such tools is evident, as is their appeal from a governance perspective. But once they’re in place, the scope is infinite to extend them beyond simply logging driving licence records.

Once identities are held digitally, our behaviour can be tracked — and ‘nudged’ — digitally as well. And our elites, habituated by university courses that highlight the importance of large-scale social and cultural patterns over the specifics of individual experience, already look eagerly toward the potential offered by AI pattern recognition. But machine learning systems are only as good as their inputs. And pattern recognition isn’t the same thing as interpretation.

In an AI ‘art’ generator, this has little impact beyond creating creepy kitten-images. But in contexts that affect people’s lives more directly, machine learning has more serious implications. Imagine, for example, that we embraced a ‘prevention’-oriented public health system, using citizen data collected around a Digital ID Wallet. This isn’t so far-fetched: a data-driven pilot scheme aimed at tackling obesity via measures including tracking scheme members’ shopping history was launched earlier this year.

And while the existing pilot scheme focuses on rewards, a unified ID would make it easy to implement sanctions too. The Health Pass now required for entry into bars, clubs or venues to combat the spread of Covid-19 could without great difficulty be extended to other behaviours or linked with purchase history. What this change means is the arrival, likely for good, of a mode of digital-era governance where, for the first time, participation in public life can be made conditional on ‘healthy’ behaviours.

The mutant kitten is a vivid illustration of what happens when you task a computer with doing something human, without the capacity to understand what humans actually care about. Applied to real lives, the blind brilliant-idiot quality of AI pattern recognition is already causing miscarriages of justice.

Now imagine extending that deep into your life: having your Health Pass suspended because you’ve bought too many units of alcohol this week, perhaps, meaning you can’t access essential services. Never mind that you were shopping for an event at your local social club. The computer sees a pattern. And unlike for a human, for machine intelligence there’s no difference between pattern recognition and meaning.

The slow collapse of social meaning into computer-legible pattern is reshaping us into a post-political, post-individual human order — one resisted most vigorously, to date, by conservatives. It was chiefly conservative politicians and voters who framed Brexit as pushback against technocracy. At Yale, conservative donors including Henry Kissinger forced Professor Gage to resign earlier this year. And on Tuesday 98 Tories — nearly a third of elected Conservative MPs — rebelled on Covid passes. Boris owes his victory to Labour Party support, and is probably a dead man walking within his own party now.

But Boris’s loss of face in his own party is one fraction of a far more deadly bleeding of authority: our downward spiral of disintegrating trust in human government full stop. As long as this continues, the temptation will persist to solve the crisis of authority with ever deeper rule by AI: the only compromise option left when we can’t agree on which human or humans should be in charge.

Fittingly, while writing this, I stumbled on another, current digital art project titled Headless. This one doesn’t depict the impersonal nature of finance — it’s trying to harness decentralised digital technology to human ends, by embracing it as an engine of creativity.

I hope its creators are right to think this can be done. For we’ve just taken another step closer to a world ruled by machines; machines that are brilliant at detecting patterns, but idiots when it comes to understanding what those patterns mean. And unless we’re able to humanise those technologies, this dawning age of digitised governance will digest the patterns of our social order and regurgitate a world of idiotic, inhuman cruelty.

Mary Harrington is a contributing editor at UnHerd.