The military-industrial complex is never idle. War is big business, and the US is developing a new and more lethal generation of weapons. The most widely known of these is the drone, or ‘unmanned aerial vehicle’ (UAV). Drones enable targeted killings across national borders, and the New America Foundation estimates that a third of the casualties are civilians. The US deploys armed drones in six countries – Afghanistan, Pakistan, Yemen, Iraq, Libya and Somalia – and these theatres of war afford opportunities to practise the targeting and execution of individuals. The ‘pilots’ of the drones sit safely some 7,000 miles away from their victims. Philip Alston, the United Nations Special Rapporteur, describes how “young military personnel raised on a diet of video games now kill people remotely using joysticks – far removed from the human consequences of their actions” and asks: “how will this generation of fighters value the right to life?” This is a valid question, because pressing a joystick to kill someone thousands of miles away risks an irreversible de-sensitisation of the operator to the violent death of the victim. If there were a danger of being killed oneself – or of one’s children being killed – one’s actions would be more considered and less casual. That is in a sense a more honourable way of fighting, where the risks are equal on either side. There is something obscene about picking off targets from the safety of another continent. Nevertheless, that is the direction in which modern warfare is going.
Drone design and production is a global activity, with manufacturers all over the world. The United States and Israel pioneered the technology, and US manufacturers held a market share of over 60% as of 2006, expected to increase by a further 5–10% up to 2016. Northrop Grumman and General Atomics are the dominant manufacturers in the industry. Israeli and European manufacturers form a second tier, with ambitions to acquire the more capable US systems. A new and more sophisticated series of drones is in the pipeline: Northrop Grumman, together with Lockheed Martin and Boeing, is developing an Unmanned Aerial Combat Vehicle (UACV) that is capable of making its own decisions about manoeuvring and targeting in battle.
In the UK, the MoD has been using armed US ‘Reaper’ drones in Afghanistan since 2007, and it has recently announced plans to double the number of these drones at a cost of £135 million. Currently they are flown by RAF pilots based in the United States, but a new Reaper squadron is being formed to fly them from RAF Waddington in Lincolnshire*.
The MoD has acknowledged the serious legal and moral issues arising from the use of armed drones. On 30 March this year, the MoD issued Joint Doctrine Note 2/11: The UK Approach to Unmanned Aircraft, in which the technological, legal, moral and ethical aspects of drone use are examined. One of the points raised was that if the risk of loss is removed from the decision-makers, will they resort to war far sooner than they might have done previously? “One of the contributory factors in controlling and limiting aggressive policy is the risk to one’s own forces. It is essential that, before unmanned systems become ubiquitous (if it is not already too late), we consider the issue and ensure that, by removing some of the horror, we do not risk losing our controlling humanity and make war more likely”.
The Joint Doctrine Note quotes General Robert E. Lee after the Battle of Fredericksburg in 1862, who said: “It is well that war is so terrible – otherwise we might grow too fond of it”. This point goes straight to the heart of the debate. Revelations about UK and US forces illegally torturing and mistreating prisoners in Abu Ghraib, Afghanistan and Guantanamo Bay show how quickly their “controlling humanity” can become eroded under war conditions. As Professor Christopher Coker of the London School of Economics says: “We enter a new century knowing all too well that our ethical imagination is still failing to catch up with the fast expanding realm of our ethical responsibilities. Robots are taking us even further away from the responsibilities we owe our fellow human beings.”
The MoD’s Joint Doctrine Note echoes this worry in places. Putting itself into the mind of a robotic drone, it observes that “to a robotic system, a school bus and a tank are the same – merely algorithms in a programme”. “It doesn’t have to know why it is engaging a target. There is no recourse to human judgement in an engagement, no sense of higher purpose on which to make decisions, and no ability to imagine (and therefore take responsibility for) repercussions of action taken.” Yet the MoD goes on to say that “The use of unmanned aircraft prevents the potential loss of aircrew lives and is thus in itself morally justified.” Maybe – but this ignores the question of honour, and the slaughter of innocent civilians when a drone is badly targeted or out of control. At present, the MoD does not publish figures for civilians killed in this way in Pakistan, Afghanistan and elsewhere. This unaccountability is illegal under international law.
The Joint Doctrine Note (JDN) is an unusually frank, official exposé of the ethical and moral dilemmas associated with the use of drones – and it is high time they were discussed in a public forum. We need to know whether we are approaching the time when an autonomous flying machine armed with a nuclear device will fly over us making its own decision as to what is a legitimate target. This nightmare scenario is not science fiction. Robots cannot be emotive, cannot hate, cannot care. The robot does not care whether the target is human or inanimate, terrorist or freedom fighter, savage or barbarian. Can it be held accountable for war crimes? The pace of technological development is accelerating, and the UK must establish a clear policy on what will constitute “acceptable machine behaviour” in future. Time is running out: it is far from certain that there will be time for a debate or for a policy to be developed, because the technological genie may already be out of the ethical bottle.
Will the Geneva Convention, the Laws of Armed Conflict (LOAC) and the Rules of Engagement (ROE) have to be re-written – and if so, by whom?
*It is time that we – the public – made our views known about the use of drones and UACVs. We can do this by visiting and supporting the new peace camp that has been set up at RAF Waddington – call Helen John for details on 07971 675776.
Acknowledgements to Dave Webb (CND Chair) and Campaign Magazine and to The Joint Doctrine Note (2/11) (DEP2011-1514)
Puts me in mind of the 1984 movie 'Terminator', in which an artificial intelligence network called Skynet becomes self-aware and initiates a nuclear holocaust of mankind.
Truth stranger than fiction? The movie is set in 2029, but it appears the reality is drawing in much sooner – and we (the people) are standing by and allowing it to happen.
WHEN will we unite in our protest against these actions, taken in our name by faceless officials claiming to represent us?