Rise of the robot soldiers
Yvonne Gill charts the deployment of Chinese killer robots on the Indo-Chinese border, and considers the ethics of such lethal autonomous weapons systems
In an alarming first, China’s People’s Liberation Army (PLA) has deployed ground-based robotic weapon systems in battlefield conditions. Among a range of Unmanned Ground Vehicles (UGVs) pressed into service in the Tibet Autonomous Region and along the Indo-Chinese Line of Actual Control (LAC) are vehicles capable of being armed with machine guns and mortars. These UGVs can also carry troops, arms, ammunition and other cargo, autonomously negotiating the treacherous terrain of Aksai Chin, an area occupied by China since 1962. This is part of Ladakh, where the two armies have stood eye-to-eye since the Galwan Valley clashes that caused casualties on both sides last year.
Military drones, or Unmanned Aerial Vehicles (UAVs), have been used extensively by the US in Afghanistan and Iraq to lethal effect, often wreaking havoc on unintended targets. Robotic systems have also been used for mine-clearing and other remotely controlled operations. But China’s deployment of UGVs armed with deadly weapons is a matter of serious concern. It has once again triggered the debate over whether the use of killer robots – Lethal Autonomous Weapons Systems (LAWS) – should be banned, or at least severely restricted. Otherwise we could witness hapless humans being mowed down by mechanised killers that can never die, but must be smashed to stop them wreaking destruction.
Of the 88 Sharp Claw-2 all-terrain UGVs deployed by the PLA, 38 have been moved to the LAC, where the situation remains tense amid a massive troop build-up on both sides. According to its Chinese manufacturer, NORINCO, the autonomous vehicle can be armed with a remote-controlled gun carriage that accommodates an assault rifle or machine gun. It can be used for reconnaissance, patrolling, transporting weapons and other logistical work on difficult terrain. The smaller Sharp Claw-1 is a compact tracked version, armed with a light machine gun and fitted with sensors for reconnaissance and patrolling missions.
The Mule-200 is another dual-use robotic vehicle, able to carry about 200kg of supplies; when fitted with a weapon, it can perform combat tasks too. Besides more than 100 Mules, the PLA has been using remote-controlled drones for reconnaissance, which can also drop explosive payloads on enemy targets.
China is not alone. Israel, France and Turkey have been developing autonomous weapons too. But the US leads the field with its advanced UAVs and combat-support UGVs, and it has been criticised for inflicting repeated ‘collateral damage’ – a euphemism for killing innocent people and hitting civilian targets – during drone attacks in Afghanistan, Pakistan and Iraq. While the Russians have made considerable progress in robotic technology, the Chinese have been investing heavily in robotised weapons. Chinese military doctrine calls for the ‘intelligentisation’ and ‘informatisation’ of integrated combat capabilities through the development of LAWS, including swarm-drone technology and autonomous tanks such as its Pathbreaker, equipped for reconnaissance, assault, patrol, and search-and-destroy operations. Chinese, Russian and American scientists are also developing humanoid robot-warriors and mind-controlled weapons.
The future of warfare is frightening indeed. LAWS take us to a new level of destructive technology, in which wars to come could be directed by artificial intelligence, with sensors, software and machine processes substituting for human decision-making about life and death.
Decisions made by machine intelligence about whom to kill and what to destroy would be the ultimate form of digital dehumanisation, because machines cannot recognise people as ‘people’. To a robot, it makes no difference whether its victim is a child, woman or man, friend or foe. And the algorithms fed to the machine are likely to carry their developers’ own biases, especially against marginalised communities: to a robot, every bearded individual may appear a potential terrorist because Muslims have beards – even though many non-Muslims have beards too.
Even when such a weapon is remotely operated by a human being, the operator loses meaningful control, being dependent on the software’s machine interpretation of the target – a dependency that has already led to many civilian casualties in US drone attacks. Robots cannot make complex ethical choices or understand the value of human life; nor can they be held accountable for their actions. An errant soldier or officer can be punished for recklessness, but how does one discipline a machine?
Forms of artificial intelligence and machine learning can make such weapons unpredictable and dangerous to handle, and even simple autonomous systems present challenges. Under the laws of war, military commanders must be able to judge the necessity and proportionality of an attack, and to distinguish between civilians and legitimate military targets. How can we expect a robot to abide by such rules and take measured decisions once it has been let loose? Killer drones already in use have caused civilian casualties, yet these war crimes have gone unpunished because they were written off as ‘mistakes’ made by flying robots.
Worse still, cheap low-tech versions of LAWS are already being used by terrorists and non-state actors. Not only is the world on the threshold of a LAWS arms race; these ‘intelligent’ weapons would heighten tensions and fuel dozens of conflicts around the world, killing more people and devastating property on a greater scale than ever witnessed. With the robots’ controllers sitting safely far away, their actions would be akin to playing a video game – though the people they so ruthlessly kill would be real.
‘Technology can and should be developed to promote peace, justice, human rights and equality,’ says Stop Killer Robots, an international campaign. ‘Rejecting digital dehumanisation and ensuring meaningful human control over the use of force are key steps to building a more empowering relationship with technology for all people, now and in the future.’
In December last year, most of the 125 nations that are signatories to the Convention on Certain Conventional Weapons (CCW) met in Geneva to discuss internationally mandated curbs on killer robots. The CCW bans or restricts weapons considered to cause unnecessary, unjustifiable and indiscriminate suffering – such as incendiary explosives, blinding lasers and booby traps that don’t distinguish between fighters and civilians. The convention, however, contains no provisions on killer robots.
In the end, the conference came to naught, as powerful members such as the United States and Russia opposed a complete ban on such weapons. It concluded with a vague statement about considering ‘possible measures acceptable to all’, and the matter was deferred to the next meeting.
‘Fundamentally, autonomous weapons systems raise ethical concerns for society about substituting human decisions about life and death with sensor, software and machine processes,’ said Peter Maurer, president of the International Committee of the Red Cross and an outspoken opponent of killer robots. Maurer has been among those demanding a legally binding agreement that requires human control of weapons at all times.
‘Robots lack the compassion, empathy, mercy, and judgment necessary to treat humans humanely, and they cannot understand the inherent worth of human life,’ says a briefing paper released by Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC).
It is telling that China’s deployment of machine-gun-wielding robotic vehicles in Ladakh happened even as the CCW meeting on curbing killer robots was taking place in Geneva. The campaign against these devices should therefore unambiguously condemn their deployment on India’s northern border – the first deployment of LAWS in a conflict zone.
Yvonne Gill is a freelance journalist based in London