Elon Musk, whose Tesla electric cars, SpaceX reusable rockets, and Hyperloop high-speed transport concept will likely qualify him as the Thomas Edison of our day, recently weighed in (again) on the dark side of artificial intelligence.
Together with a hundred other signatories from the high-tech world, he has called for a ban on "autonomous" weapons – weapons that can operate without a human making the decisions.
According to The Guardian, Musk and the other signatories say these “morally wrong” weapons should be “added to the list of weapons banned under the UN’s convention on certain conventional weapons (CCW), brought into force in 1983, which includes chemical and intentionally blinding laser weapons.”
Earlier, Musk had pointed out that AI presents a unique problem:
"Normally the way regulations are set up is when a bunch of bad things happen, there's a public outcry, and after many years a regulatory agency is set up to regulate that industry. It takes forever. That, in the past, has been bad but not something which represented a fundamental risk to the existence of civilisation."

The campaign to stop killer robots

Meanwhile, a new global effort based in the UK has been launched to bring together robotics and AI experts and mainstream campaigners. They too are looking to the United Nations for action. Here is part of their manifesto:
"Giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology. Human control of any combat robot is essential to ensuring both humanitarian protection and effective legal control. A comprehensive, pre-emptive prohibition on fully autonomous weapons is urgently needed….through an international treaty, as well as through national laws and other measures."

[Image: Elon Musk speaks on the final day of the 68th International Astronautical Congress. Credit: Xu Haijing/Xinhua News Agency/PA Images]

A committee room in Washington, nearly a decade ago

On 19 June 2008, I was sitting in a committee room in the United States House of Representatives. It was a hearing of the Foreign Affairs Committee – specifically, its Sub-Committee on Terrorism, Nonproliferation, and Trade. The topic was a little out of the ordinary: "Does the technological enhancement of humans represent a new arms race?"

As the hearing opened, Congressman Brad Sherman, a Democrat representing California, laid out a clear case that the "weapons of mass destruction" of the 20th century – nuclear, chemical, and biological weapons – are being joined in the 21st by a whole new generation of weapons focused on enhancing human performance, leading us into a "new kind of arms race". He quoted futurist Christine Peterson:
"[in 25 years] we are going to live in a sci-fi movie, we just don't know which one."

The prime focus of the hearing would be genetics, though a variety of emerging technologies are granting us increasing super-powers. Countries around the world would use these powers for military purposes. What's more, said Chairman Sherman: "There is one issue that I think is more explosive than even the spread of nuclear weapons: engineered intelligence." The Ranking Member (the term for the senior committee member of the minority party, in this case the Republicans), Edward R. Royce, endorsed Sherman's approach:
"We are dealing with the early stages of a very grave proliferation issue."

In my own evidence to the Committee, I focused on the inter-related character of 21st-century "emerging technologies". A key reason genetics is moving ahead by leaps and bounds lies in the use of information technology to power research. And the "engineered intelligence" referred to by Chairman Sherman (artificial intelligence, as we more usually call it) is increasingly driving everything.

That was true enough in 2008. Nine years later, as Alexa and Siri and Cortana have become a key part of most of our lives, it's no exaggeration to say that artificial intelligence is all around us. And whatever is true of us as individual citizens and consumers will be yet more true of those working in the defence and security industries.

Sadly, Sherman's hearing did not lead to action at the United Nations or on the part of the United States government, but it was a prescient effort to track the momentous implications of the steady advance of still-new technologies for global security – while the entire focus of the international community remained on the need to contain the "new" weapons of the 20th century: nuclear, chemical, and biological. The testimony and full transcript of the hearing are available here.

[Image: Campaign to Stop Killer Robots]

A UN view

In response to recent efforts, a senior United Nations official, Angela Kane, High Representative for Disarmament Affairs, has had this to say:
"A new type of arms race is underway and its outcome will shape the future of our planet. This race is not one between two countries. It is between the 'tortoise' of our slowly changing legal and institutional norms and the 'hare' of rapid technological change in the arms industry."

The case of lethal autonomous weaponry, what many now call killer robots, offers a classic example of this larger challenge. The stakes could hardly be higher, which may be why the major powers are not rushing to embrace some kind of global ban. As President Putin has stated:
“Whoever becomes the leader in this sphere will become the ruler of the world.”
***
FURTHER READING/RESEARCH

Transcript of the US House of Representatives hearing on human enhancement technologies.

Professor Sharkey of the Campaign to Stop Killer Robots makes his case at TEDx Sheffield: https://www.youtube.com/embed/kjRV9FzdQNk

DISCLOSURE

Noel Sharkey and Christine Peterson are professional friends of the author, although neither was interviewed for the purpose of this article. As noted, the author was invited to testify at the US House hearing in 2008.