
Vatican calls for ban on killer robots


The Vatican has renewed its call for international restrictions on autonomous weapons systems, often referred to as “killer robots”, as their use in conflicts such as those in Ukraine and Gaza increases.

Archbishop Ettore Balestrero, the Vatican’s Permanent Observer to the United Nations in Geneva, pushed for the move, highlighting the moral implications of allowing machines to make life-and-death decisions without human oversight.

In an address on 26th August, Balestrero emphasised the Vatican’s concern over the ethical dimensions of using AI-driven “lethal autonomous weapons” (LAWs) in warfare.

“It is profoundly distressing,” Balestrero said, “that, adding to the suffering caused by armed conflicts, the battlefields are also becoming testing grounds for more and more sophisticated weapons.”

In particular, he insisted that autonomous weapons systems can never be considered “morally responsible entities”.

“The human person, endowed with reason, possesses a unique capacity for moral judgement and ethical decision-making that cannot be replicated by any set of algorithms, no matter how complex,” Balestrero said.

Weapons restrictions

The Vatican’s push comes as Josep Borrell, the EU’s top foreign policy official, advocated for fewer restrictions on Ukraine’s use of weapons supplied by EU nations.

“The weaponry that we are providing to Ukraine has to have full use, and the restrictions have to be lifted in order for the Ukrainians to be able to target the places where Russia is bombing them. Otherwise, the weaponry is useless,” Borrell told reporters.

While traditional weaponry remains predominant in the Ukraine conflict, there is a growing emphasis on AI-driven systems.

Reports suggest Ukraine has become a testing ground for new technologies including autonomous drones. These systems require human intervention to lock onto targets, but experts warn that future advancements could diminish this role, raising significant ethical and safety concerns.

Israel has likewise employed autonomous systems in its ongoing conflict in Gaza, but its technologies are reportedly capable of identifying and suggesting targets without human input.

Israel reportedly uses an AI system called “Habsora” to identify bombing targets inside Gaza. It is said to be capable of doing so at a rate much higher than manual detection.

In that context, Balestrero distinguished between a “choice” and a “decision”, arguing that the latter is a human act that involves weighing ethical considerations such as human dignity.

“No machine should ever make the decision to take a human life,” Balestrero declared.

Sources

Crux Now

CathNews New Zealand