We in the West are fascinated by how technology will give us the edge in war. Possibly because it always has. From massed infantry phalanxes, to railways, to the atomic bomb and autonomous drones – the West and its antecedents have almost always brought greater levels of technology to the fight.
These days, barely a week goes by without some politician pronouncing on cyber warfare, new space systems or next-generation warplanes – not to mention voicing concerns about Chinese laser systems or drone swarms that could render those investments nugatory at a fraction of the cost.
This obsession with 'kit' almost eclipses the fact that war is a human endeavour. One which involves beating hearts, and blood, and guts. And, most important of all, brains. Wars are won when the opposition decides it no longer wants to fight; they are won when you have a better strategy than your opponent. And strategy, at its core, is and always has been about deception and chutzpah, advance and retreat; above all, it is about surprise. But that may be about to change.
How so? At its simplest, strategy is two human brains trying to outcompete each other. How do I (or we, if you are the leader of a group) get him/her/them to do what I want? In warfare, the means of executing this strategy are often violent and lethal. Though not always. Russian information activities around recent plebiscites in the West demonstrate that you can achieve the desired effects non-violently. The key thing, though, is that strategy is about influencing the way an opponent thinks.
Another feature of strategy is that it scales: the dynamics of the competition are the same whether it is a bar brawl, two platoons fighting hunter-seeker style, two divisions clashing, or two multi-national coalitions eyeing each other across a global ideological divide. All of these levels of conflict share the same dynamics of retreat, advance, bluff, feint, counter, lure, deception and charge, and, of course, the same goal: surrender.
Strategy is also timeless. The dances on ancient battlefields are studied for strategic insights; military officers go on battlefield tours; and business executives study Sun Tzu. Ancient strategists have something to teach us about the present, and about the future.
Strategy scales across time and space because the competing brains have barely changed in the period of recorded history. The Sapiens brain evolved on the African savannah around 250,000 years ago. It had a specific evolutionary environment: the brains of other humans, in social groups of around thirty hunter-gatherers.
Because our brains have evolved in social competition with other humans, we all have the same emotional responses (more or less). We are all — at times — jealous, angry, proud, and sad. These emotions, and these ways of psychologically relating to others, are the basis for strategy. They form the timeless essence of warfare. We know how our opponent thinks and can play him at his own game. What is a ruse, if it is not playing on the opposing commander’s pride?
The mode in which we fight is less timeless, however. A mode of warfare is the method and tools with which we fight. Thus, we have gone from spear-throwing, to phalanxes, to legions, to infantry companies, to submarines, to drone swarms. The actual physical fighting has changed beyond our wildest dreams, while the dynamics of the fight have remained the same. But artificial intelligence could be the first technology in the Anthropocene to change that.
Neural networks, a specific aspect of AI, are the closest virtual equivalent of the human brain. They learn in analogous ways: when exposed to stimuli, certain pathways are reinforced and others degrade.
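For the curious, here is a minimal sketch of that 'pathway' idea. Everything in it – the single neuron, the logical-AND task, the learning rate – is invented purely for illustration; it is not how DeepMind or any military system actually works. Each time the neuron sees a stimulus and the expected response, its connection weights are nudged up or down.

```python
# Illustrative only: a single artificial neuron (a perceptron) whose
# connection weights are reinforced or degraded as it is exposed to stimuli.
import random

def train_neuron(examples, learning_rate=0.1, epochs=100):
    """examples: list of ((x1, x2), target) pairs, with targets 0 or 1."""
    w1, w2, bias = random.random(), random.random(), random.random()
    for _ in range(epochs):
        for (x1, x2), target in examples:
            output = 1.0 if (w1 * x1 + w2 * x2 + bias) > 0 else 0.0
            error = target - output
            # Strengthen or weaken each 'pathway' in proportion to its input.
            w1 += learning_rate * error * x1
            w2 += learning_rate * error * x2
            bias += learning_rate * error
    return w1, w2, bias

# Learn the logical AND function from four stimulus/response pairs.
print(train_neuron([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]))
```

Real networks stack millions of such weights in layers, but the principle – connections strengthened or weakened by experience – is the same.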
With neural networks this happens in two different ways. Either they are trained on huge data sets, such as NHS patient records, in which they can spot patterns of, say, cancer occurrence and help prioritise treatment. Or they are trained against each other, much as the human brain evolved in social competition with other human brains.
One of the leading lights of AI is a company called DeepMind. A British start-up, it was acquired by Google in 2014 for £400m. Among its achievements, it designed AlphaGo, the first AI system to beat the world champion at Go, a game of abstract strategy invented in China more than 2,500 years ago. It is fiendishly complex, far more so than chess.
Essentially, DeepMind taught AlphaGo the rules of Go, then set it off to learn from itself. AlphaGo played itself tens of thousands of times over a period of three days. And then it beat Lee Sedol, the Go world champion.
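To give a flavour of what 'learning from itself' means, here is a toy self-play loop. It is emphatically not DeepMind's code: the game (Nim, where players take one to three sticks and whoever takes the last one loses), the look-up table and the training parameters are all stand-ins chosen to keep the sketch short. Two copies of the same learner play each other tens of thousands of times, and a strategy emerges with no human examples at all.

```python
# A toy self-play learner for misere Nim: take 1-3 sticks each turn,
# whoever takes the last stick loses. Both 'players' share one value table.
import random
from collections import defaultdict

Q = defaultdict(float)        # (sticks_remaining, move) -> estimated value
ALPHA, EPSILON = 0.3, 0.1     # learning rate and exploration rate

def choose(sticks):
    """Pick a move: usually the best-valued one, occasionally a random one."""
    moves = [m for m in (1, 2, 3) if m <= sticks]
    if random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(sticks, m)])

def self_play_game(start=21):
    sticks, history = start, []   # history of (state, move), players alternating
    while sticks > 0:
        move = choose(sticks)
        history.append((sticks, move))
        sticks -= move
    reward = -1.0                 # the player who took the last stick lost
    for state, move in reversed(history):
        Q[(state, move)] += ALPHA * (reward - Q[(state, move)])
        reward = -reward          # the other player's moves get the opposite credit

for _ in range(50_000):           # tens of thousands of games against itself
    self_play_game()

# The greedy policy it tends to settle on: leave the opponent one more
# than a multiple of four sticks wherever possible.
print({s: max([m for m in (1, 2, 3) if m <= s], key=lambda m: Q[(s, m)])
       for s in range(2, 10)})
```

AlphaGo works on the same principle, but replaces the look-up table with deep neural networks and a tree search, which is what lets it cope with a game as vast as Go. The interesting point for strategy is not the code but the fact that nobody shows the system how to play.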
The match was then studied by other Go experts and champions. They said that, retrospectively, they could see how AlphaGo had won, but that, as humans, they would never have worked out its strategy. One of them, Ke Jie, said:
“After humanity spent thousands of years improving our tactics, computers tell us that humans are completely wrong ... I would go as far as to say not a single human has touched the edge of the truth of Go.”

The AI system simply thought in a different way. Human brains that evolved through competition with other human brains create the psychology that underpins strategy, which creates the essence of warfare. But these human brains evolved in order to maximise survival and reproduction: to find food and water and sexual partners, to form coalitions, and to avoid being killed. AlphaGo has none of these ultimate goals, and so its strategy has an entirely different essence. That's why, if applied to the battlefield, AI could change the essence of warfare in a way that is beyond human speculation.

Of course, my argument relies on AI being in the decision-making loop, and we are unlikely to see this at the strategic level for some time. But at lower levels, like the individual and the squad, we are already there: advanced militaries are prototyping how AI can power autonomous 'individual' weapons (that is, killer robots). How these robots interact when they fight each other is the preview, if you like, of how the essence of warfare could change in the future. We can't predict how it will be different, but we can be certain it will change the rules in a way that will leave us, like the Go grandmasters, scratching our heads.