It recently came to light that on March 27, 2020, a Turkish drone autonomously decided to attack enemy troops in the Libyan war. The drone attacked on the basis of its built-in algorithm, without awaiting any human intervention. This news highlighted the possibility of new scenarios of war and of fundamental changes in the nature of conflicts.
By programming a military drone, one can decide what kind of control the human agent will exert over its piloting and learning, even going so far as to make it highly autonomous. Based on the type of control, a distinction can be made between military attack robots that can act in autonomous mode (LAWS: Lethal Autonomous Weapon Systems) and other robotized military systems (RMS).
It is important to distinguish between an automated system and an autonomous system. Both perform actions without the direct and timely intervention of a human agent, but the actions of the former are predictable and programmed, while those of the autonomous system can be unpredictable and could eventually exceed the goals and actions initially set by the human programmer.
There are also computerized robotic systems that operate on computer networks with virus-like techniques and could cause cyberwars with serious and unpredictable consequences. Allegations of Russian interference in the 2016 US presidential election are just one example.
RMS are remotely controlled. The human operator is the only one who can make the final decision to shoot. A second category of RMS, closer to LAWS, are semi-autonomous weapons systems, capable of identifying targets on their own, but the human agent constantly monitors them and can interrupt their actions at any time. It is now accepted that an ethically valid use of these semi-autonomous weapons must be limited to defensive tasks such as missile defence (Otto, 101).
The third category would be LAWS, which are autonomous systems in their operation and learning. That is why they are also called “innovative LAWS”. This raises the question of who will be directly responsible for their actions and “collateral effects”. Given the autonomy with which they act once activated, their use should be limited to attacking material targets, such as systems that block communications. However, the line between human control and the complete absence of that control is becoming increasingly blurred.
1.1. The difficult task of programming these drones
It is very difficult to program these war robots. Equipping them with effective ethical rules currently seems impossible, because real circumstances are always complex and ethics cannot be reduced to logic or automated application of standards. In addition, the resulting algorithms presuppose a prior choice about the underlying principles and the type of ethics. Some authors suggest that utilitarian ethics would be the easiest to implement in an algorithm, as it would allow the software to choose between options whose effects would be quantifiable.
War situations are becoming more and more complex. Parties to armed conflict seek to make it difficult to distinguish between combatants and civilians. It is also not easy to determine, especially for a robot, whether enemy soldiers are already out of action or if they have surrendered, as well as to assess the proportionality of the response.
Being robotic does not make war more respectful of human dignity, because armed robots act with the coldness of binary code, without mercy or forgiveness. Some authors argue that the use of robots would significantly reduce the human cost of war, thus avoiding the protest movements that condition government action.
Western countries are already steering the development of their weapons in this direction. This could lead to an arms race that is even more difficult to control than the previous Cold War. Pope Francis has made it clear that simply possessing atomic weapons is already immoral. This statement can also be applied to LAWS.
Armed conflicts could also intensify, as the benefits that aggressor countries seek would far outweigh the reduced cost they would have to pay to use armed force. With less spending and less social protest, the decision to go to war would be easier to make and responsibilities more difficult to determine.
It is easy to eliminate thousands of people when they have already been reduced to figures on a computer screen. By avoiding any form of face-to-face relationship, the other is not perceived as an “alter ego” but as an anonymous being, alien to myself and for whom I do not feel responsible.
LAWS are justified by the fact that they would behave in a more rational and balanced way than humans. Some authors claim that these combat robots would reduce the number of innocent lives and commit fewer war crimes, not being subject to human passions such as fear, revenge or hatred.
The arguments used to justify the use of LAWS are based on a negative anthropological conception (homo homini lupus). We need to adopt a more positive anthropological view. Indeed, the above arguments for preferring LAWS to humans can also be used in the opposite direction.
From the Christian perspective, the human being is inherently social, capable of forgiving and showing compassion, for he is endowed with a deep moral sense and a natural inclination not to kill. Instead of delegating matters of life and death to machines, we must assume them responsibly, always promoting openness to forgiveness and self-giving. Lasting peace is built on reconciliation, dialogue, forgiveness, not on violence and forgetfulness.
Only man can build a community of values, fraternal and authentically human; he cannot delegate this task to machines. The cold calculation of LAWS, based on the principle of maximum utility, could lead them to massacres. Moreover, the human being is able to go beyond the logic of equivalence that governs justice to embrace gratuitousness and forgiveness, thus opening a future of hope and fraternity.
The negative anthropological conception has led to considering war as inevitable and thus to justifying the arms race. This doctrine is also used to justify and regularize LAWS.
The encyclical Pacem in Terris marked the end point of the Church’s official references to the doctrine of just war, since the destructive power of modern weapons prevents war from being a proportionate response to redress injustice. This is even more evident in robotic warfare, as LAWS are unable to properly apply the principle of proportionality, which requires consideration of many elements. Moreover, war today is not a lesser evil, but the greatest evil (FT 258).
The Church teaches that war “is always a failure of humanity” (FT 261), “the negation of all rights and a dramatic aggression against the environment” (FT 257). Robotic warfare accentuates this dehumanization and therefore cannot be promoted as a solution. To break the spiral of violence, we must overcome the attitudes that have engendered it and the injustices that fuel it.
There are already alternative instruments to restore peace and justice without resorting to war. It can therefore be said that at present the conditions that could justify the application of this doctrine no longer exist. A similar argument was used to reject the death penalty. Pope Francis teaches that both (war and the death penalty) “are false answers, which do not solve the problems posed, and that in the end they only add new factors of destruction” (FT 255).
When autonomous machines manage important aspects of social life, there is a risk of dehumanizing relationships and weakening universal brotherhood. Any war is inhumane, but it will be even more so if those who carry it out are autonomous machines and no one takes direct responsibility for their actions.
Any weapon system with lethal capacity must remain under the absolute supervision of man. Therefore, the development of LAWS that are completely autonomous in their actions and learning must be prohibited.
Violence provokes more violence. Only gratuitousness and forgiveness can give birth to an authentic universal fraternity.
Fr. Martín Carbajo Nuñez, OFM
(the original is in French)
 Full and detailed text with abundant bibliographical references: Carbajo Núñez M., “The War of Autonomous Drones”, in Truth and Life 280 (2022). In Italian: Laurentianum 63/1 (2022).
 Apps P., «New era of robot war may be underway unnoticed», in Reuters (10.06.2021), in Internet: https://www.reuters.com/article/apps-drones-idUSL5N2NS2E8.
 Cf. Otto P. – Gräf E. (ed.), 3TH1CS. A reinvention of ethics in the digital age?, [Otto], iRights Media, Berlin 2017.
 Francis, «Meeting for Peace, Hiroshima» (24.11.2019), in OR 269 (2019) 8.
 Cf. Lorenzetti L., «Pacem in terris, turning point for moral theology», in RTM 179/3 (2013) 347-355. «Alienum est a ratione» (“alien to reason”, PT 67).