Minister refuses to rule out fully autonomous lethal weapons

A government minister has refused to rule out the future use of lethal autonomous weapons amid calls for the regulation of new technologies in the area.

During a debate on autonomous weapons systems in the House of Lords yesterday (1 November), Lord Lancaster of Kimbolton asked whether the UK government has agreed not to develop autonomous weapons. He also asked whether there would be human intervention in the chain from targeting to operating such weapons.

Ministry of Defence (MoD) minister Baroness Goldie said the UK Armed Forces “do not use lethal force without context-appropriate human involvement”, and that the UK will continue to engage internationally with experts on the matter, but did not provide details on the government’s current vision for autonomous weapons.

Pressing further on the issue, shadow defence spokesperson Lord Coaker noted that the increasing use of artificial intelligence (AI) in modern warfare brings “enormous moral challenges”. He asked whether the minister could say “unequivocally, and as a matter of principle” that humans would always be involved, whether in the oversight of AI in defence or whenever a decision was to be made about the lethal use of force.

Without answering the question directly, the minister said that “human responsibility for the use of a system to achieve an effect cannot be removed, irrespective of the level of autonomy in that system or the use of enabling technologies such as AI”.

A third question along the same lines came from Liberal Democrat defence spokesperson Baroness Smith of Newnham, who asked about the government’s stance on whether lethal autonomous drones or AI systems could kill without human involvement. The minister repeated her earlier answer: that the UK does not use lethal weapons without human oversight.

Arguing that “nations are sleepwalking into disaster”, Lord West of Spithead said hand-sized lethal autonomous drones with facial recognition are already being produced, and thousands could be employed in conflict. “I find this quite horrifying,” he said, asking whether the government agrees that a human should make the ultimate decisions in a “kill-chain”, rather than a robot. “Also, these things are AI: they learn; therefore, they will learn how to kill even more than they have been programmed to. This is extremely dangerous.”

Responding to the arguments, the MoD minister noted that all weapon systems, whether or not they are autonomous, should comply with international humanitarian law. “A robust application of that framework, I would suggest, is the best way of ensuring the lawful and ethical use of force in all circumstances,” she said, adding that the same applies to all states that might be developing autonomous weapons.

Pointing to the principles of lawfulness, responsibility and accountability outlined in NATO’s AI strategy, released in October 2021, Lord Browne of Ladyton said “it is time for the UK to show global leadership on lethal autonomous weapons” and reaffirm its commitment to ethical AI.

The minister agreed with the points made, and noted that the MoD had committed to publishing a defence AI strategy by autumn 2021. Even though the process has been delayed, Goldie said “significant work has been done on the strategy and we can expect publication in early course”.

“[The defence AI strategy] will set out our vision to be the most effective, efficient, trusted and influential defence organisation of our size, and have principled components to it,” she said.

Liberal Democrat digital spokesperson Lord Clement-Jones asked for the government’s stance on a recent discussion by the Group of Governmental Experts on Lethal Autonomous Weapons Systems at the United Nations (UN) Convention on Certain Conventional Weapons.

Calls for regulation

At the UN debate, calls were made for a legally binding instrument, including both prohibitions and positive obligations, to regulate autonomous weapons systems, Clement-Jones noted.

Responding to the question, Goldie said the UK is actively participating in the UN’s discussions on autonomous systems and on how to build norms to ensure the safe and responsible use of autonomy. However, she said there is no consensus on regulation.

“The UK and our partners are unconvinced by the calls for a further binding instrument,” she said. “International humanitarian law provides a robust principle-based framework for the regulation of weapons deployment and use.”

Expressing disappointment with the minister’s answer, Clement-Jones noted this stance puts the UK “at odds with nearly 70 countries and thousands of scientists in its unwillingness to rule out lethal autonomous weapons”.

He then asked whether the government would rethink its policy and give UK representatives a mandate to negotiate a legally binding instrument at the next UN meeting on autonomous weapons in December.

The minister replied that international agreement on regulation on that front “has so far proved impossible” and that the UK is more interested in understanding the characteristics of autonomous systems in conflict and focusing on effects.
