By Owen Bowcott | The Guardian
Fully autonomous weapons, already denounced as “killer robots”, should be banned by international treaty before they can be developed, a new report urges the United Nations.
Under existing laws, computer programmers, manufacturers and military commanders would all escape liability for deaths caused by such machines, according to the study published on Thursday by Human Rights Watch and Harvard Law School.
Nor is there likely to be any clear legal framework in future that would establish the responsibility of those involved in producing or operating advanced weapons systems, say the authors of Mind the Gap: The Lack of Accountability for Killer Robots.
The report is released ahead of an international meeting on lethal autonomous weapons systems at the UN in Geneva starting on 13 April. The session will discuss additions to the convention on certain conventional weapons.
Also known as the inhumane weapons convention, the treaty has been regularly reinforced by new protocols on emerging military technology. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2006 combatant nations have been required to clear unexploded cluster bombs.
Military deployment of the current generation of drones is defended by the Ministry of Defence and other governments on the grounds that there is always a man or woman “in the loop” who ultimately decides whether or not to fire a missile.
Rapid technical progress towards the next stage of automation, in which weapons may select their own targets, has alarmed scientists and human rights campaigners.
Read the rest of the article at The Guardian…