Who could object to a project that seeks to stop killer robots? The UK government, apparently:
The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.
Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing lethal autonomous weapons systems (Laws). There is precedent for pre-emptive bans: blinding laser weapons were outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs.
Some states already deploy defence systems – such as Israel’s Iron Dome and the US Phalanx and C-Ram – that are programmed to respond automatically to threats from incoming munitions. Work is also progressing on what is known as “automatic target recognition”.
The Foreign Office told the Guardian: “At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area.
“The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems.”
While the idea of autonomous weapons systems immediately summons up the prospect of something akin to a flash crash that destroys much more than fictitious capital, it seems far from obvious to me that prohibiting as-yet-unrealised technologies is necessarily the best way to ameliorate a putative future problem.