CathNews New Zealand

Killer robots will make war more inhumane

Developing and using killer robots – fully automated lethal weapons systems – would make warfare even more inhumane, Archbishop Ivan Jurkovic says.

Jurkovic is the Vatican observer to the United Nations (UN) agencies in Geneva.

He told a UN session for the “Group of Governmental Experts” on Lethal Autonomous Weapons Systems (LAWS) that autonomous systems would undermine efforts to achieve peace through dialogue.

These systems “would lead us imperceptibly to dehumanisation and to a weakening of the bonds of a true and lasting fraternity of the human family,” he said.

LAWS technologies are also referred to as robotic weapons or “killer robots.”

Jurkovic said developing LAWS “will provide the capacity of altering irreversibly the nature of warfare, becoming even more inhumane, putting in question the humanity of our societies.

“Any armed intervention must be carefully weighed and must at all times verify its legitimacy, legality and conformity with its purposes, which must also be both ethically and legally legitimate.”

Jurkovic said these tasks are becoming too complex and nuanced to be entrusted to a machine.

He pointed out machines “would be ineffective when facing moral dilemmas or questions raised by the application of the so-called principle of ‘double effect’”.

He explained the double effect: “The Catholic principle teaches it is morally acceptable to pursue a good goal that could have an unintended evil effect if and when there is a proportionate or adequate reason for allowing the evil.”

Jurkovic told the UN that robotisation and dehumanisation of warfare present several serious ethical and legal problems.

As an example, he said increased automation will blur or erase accountability and the “traceability of the use of force with an accurate identification of those responsible.

“Such loss or dilution of responsibility induces a total lack of accountability for violations of both international humanitarian law and international human rights law and could progressively incite to war.”

Furthermore, he said autonomous weapons systems don’t have “unique human capacity for moral judgment and ethical decision-making,” which involves input much more complex than a “collection of algorithms.”

They cannot understand a situation or context and apply the appropriate rule or principles, since such discernment or judgment “entails going well beyond the potentialities of algorithms.

“The idea of a war waged by non-conscious and non-responsible autonomous weapons systems appears to hide a lure for dominance that conceals desperation and a dangerous lack of confidence in the human person.

“International security and peace are best achieved through the promotion of a culture of dialogue and cooperation, not through an arms race.”
