Robot wars: lethal machines coming of age
The era of drone wars is already upon us. The era of robot wars could be fast approaching.
Already there are unmanned aircraft demonstrators like the arrow-head-shaped X-47B that can pretty well
fly a mission by themselves, with no involvement from a ground-based "pilot".
There are missile systems like the Patriot that can identify and engage targets automatically.
And from here it is not such a jump to a fully-fledged armed robot warrior, a development with huge
implications for the way we conduct and even conceive of war-fighting.
On a carpet in a laboratory at the Georgia Institute of Technology in Atlanta, Professor Henrik
Christensen's robots are hunting for insurgents. They look like cake-stands on wheels as they scuttle about.
Christensen and his team at Georgia Tech are working on a project funded by the defence company BAE.
Their aim is to create unmanned vehicles programmed to map an enemy hideout, allowing human soldiers
to get vital information about a building from a safe distance.
"These robots will basically spread out," says Christensen, "they'll go through the environment and map
out what it looks like, so that by the time you have humans entering the building you have a lot of
intelligence about what's happening there."
The emphasis in this project is reconnaissance and intelligence gathering. But the scientific literature has
raised the possibility of armed robots, programmed to behave like locusts or other insects that will swarm
together in clouds as enemy targets appear on the battlefield. Each member of the robotic swarm could
carry a small warhead or use its kinetic energy to attack a target.
Peter W Singer, an expert in the future of warfare at the Brookings Institution in Washington DC, says
that the arrival on the battlefield of the robot warrior raises profound questions.
"Every so often in history, you get a technology that comes along that's a game changer," he says.
"They're things like gunpowder, they're things like the machine gun, the atomic bomb, the computer…
and robotics is one of those."
"When we say it can be a game changer," he says, "it means that it affects everything from the tactics that
people use on the ground, to the doctrine, how we organise our forces, to bigger questions of politics, law,
ethics, when and where we go to war."
Jody Williams, the American who won the Nobel Peace Prize in 1997 for her work leading the campaign
to ban anti-personnel landmines, insists that the autonomous systems currently under development will,
in due course, be able to unleash lethal force.
Williams stresses that value-free terms such as "autonomous weapons systems" should be abandoned.
"We prefer to call them killer robots," she says, defining them as "weapons that are lethal, weapons that
on their own can kill, and there would be no human being involved in the decision-making process. When
I first learnt about this," she says, "I was honestly horrified — the mere thought that human beings would
set about creating machines that they can set loose to kill other human beings, I find repulsive."
It is an emotive topic.
But Professor Ronald Arkin from the Georgia Institute of Technology takes a different view.
He has put forward the concept of a weapons system controlled by a so-called "ethical governor".
It would have no human being physically pulling the trigger but would be programmed to comply with
the international laws of war and rules of engagement.
"Everyone raises their arms and says, 'Oh, evil robots, oh, killer robots'," he notes, "but we have killer
soldiers out there. Atrocities continue and they have continued since the beginning of warfare."
His answer is simple: "We need to put technology to use to address the issues of reducing non-
combatant casualties in the battle-space".
He believes that "the judicious application of ethical robotic systems can indeed accomplish that, if we
are foolish enough as a nation, as a world, to persist in warfare."
Arkin is no arms lobbyist and he has clearly thought about the issues.
There is also another aspect to this debate that should perhaps be a powerful encouragement to caution. At
present, the US is one of the technological leaders in this field, but, as Singer says, this situation will not last.
"The reality is that besides the United States there are 76 countries with military robotics programmes
right now," he says.
"This is a rapidly proliferating technology with relatively low barriers to entry.
"You can, for a couple of hundred dollars, purchase a small drone that a couple of years ago was limited
to militaries. This can't be a situation that you interpret through an American lens. It's of global concern."
Just as drone technology is spreading fast, making the debates about targeted killings of much wider
relevance — so too robotics technology will spread, raising questions about how these weapons may be
used or should be controlled.
The prospect of totally autonomous weapons technology - so called "human-out-of-the-loop" systems - is
still some way off. But Nobel Prize winner Jody Williams is not waiting for them to arrive.
She plans to launch an international campaign to outlaw further research on robotic weapons, aiming for
"a complete prohibition of robots that have the ability to kill".
"If they are allowed to continue to research, develop and ultimately use them, the entire face of warfare
will be changed forever in an absolutely terrifying fashion."
Arkin takes a different view of the ethical arguments.
He says that to ban such robots outright, without doing the research to understand whether they can
lower non-combatant casualties, is to do "a disservice to those who are, unfortunately, slaughtered in
warfare by human soldiers".
1. What is the important feature shared by the X-47B and the Patriot missile system?
2. What are the robots under development at the Georgia Institute of Technology meant to do?
3. What arguments can be used for and against robots programmed to find and kill enemy soldiers?
Give meanings for the following words as they are used in the passage: