Law Professor Fr. Afonso Seixas-Nunes, SJ, Contemplates AI, War and Human Dignity

April 8, 2026

By Therese Fink Meyerhoff

Artificial intelligence is an issue for our time. The technology is already prevalent in daily life, impacting your online searches and shopping, streaming services, perhaps even your medical care. In international relations, it is used in the planning and execution of military action – as the world has witnessed in the current conflicts in the Middle East and Ukraine. Father Afonso Seixas-Nunes, SJ, a professor at Saint Louis University School of Law (SLU Law), studies the legal and ethical considerations that come with the use of AI, specifically in the use of autonomous weapon systems (AWS).

Father Seixas-Nunes teaches Public International Law, Humanitarian Law (or the laws of war) and Criminal Law – a particularly “broad enterprise” because it required this Portuguese Jesuit to learn American criminal law. But, he says, teaching about AI is not limited to his classroom. He regards education as part of the Society of Jesus’ wider call to help people make sense of the world.

“The general public is not being well informed about what AI is,” Fr. Seixas-Nunes said. “People get scared, because they don’t understand. I think one of my missions as a Jesuit is to have these kinds of conversations, to explain what AI is – the dangers and advantages, how it is applied in different fields. I think it is our mission as educators to provide information to others.”

Developing Expertise

Father Seixas-Nunes’ interest in AI and AWS began at the London School of Economics, where he wrote his master’s dissertation on the use of drones in warfare. When he began his doctorate, he shifted his focus to autonomous weapon systems, which use algorithms to “optimize outcomes,” making it seem as if the weapon – a machine – is making decisions about targets and strikes.

“The more I read, the more I began to understand the complexity of the issue and how serious the problem is,” he said. His subsequent research resulted in a book: The Legality and Accountability of Autonomous Weapon Systems: A Humanitarian Law Perspective (Cambridge University Press, 2022).

Fr. Afonso Seixas-Nunes, SJ, leads a classroom discussion at the Saint Louis University School of Law.

AI and Weaponry

AI systems depend on huge amounts of information, especially when used in surveillance and warfare. “Every single system that we use for surveillance, for war, depends on massive volumes of data,” Fr. Seixas-Nunes said. And, he added, “that data comes from us.”

He suggests many people underestimate how much information they share in daily life. “People may not be aware that we are digital citizens,” he said. “Everything we do is tracked.” He used supermarket loyalty cards as an example. They can offer benefits, but they also reveal your address, phone number, spending habits, location and more. AI can use that information to make predictions about your behavior.

In war – or terrorism – prediction can become targeting. When AI is linked to satellites and other surveillance tools, it can identify where an individual is likely to be, what he looks like, even what he’s wearing. That can enable “very precise strikes. Today, with AI, you can target a person when he’s walking down the street,” Fr. Seixas-Nunes said.

Precision targeting can reduce the need for broader attacks, such as the bombing of buildings and cities, and can potentially limit the harm to innocents – a good thing, certainly, but serious issues remain.

The Risks: Trust and Accountability

Father Seixas-Nunes outlined several risks of an overdependence on AI systems – especially when weapons are involved:

1) We trust computers more than we realize. “The first problem is the trust that we are developing in computers,” he said. That trust often starts with tools that work well, like GPS. Over time, it becomes natural to accept the machine’s output.

2) Weapons systems can be unpredictable. Autonomous weapon systems use algorithms based on neural networks, which attempt to replicate how the human brain operates. That makes them inherently unpredictable: a neural network does not simply follow a command, the way a conventional algorithm does. Instead, it combines the original instruction with data collected from the environment to arrive at an outcome. That outcome is not controlled by a person; it includes no human qualitative judgment.

3) Algorithms don’t “decide”; they optimize. AI functions in probabilities. A computer determines which possible outcome has the highest probability and chooses that option. But that doesn’t mean that it’s the best option. An algorithm can’t be “ethical,” because ethics require human understanding. The danger comes when we allow ourselves to trust in the outcomes of algorithms devoid of human ethics.

4) Accountability is blurry. Father Seixas-Nunes notes that AI obscures “the chains of command.” If something goes wrong – a school full of children is targeted, for instance – who is responsible? Do we blame military commanders, AWS operators, designers or programmers? If accountability cannot be traced, it becomes harder to enforce.

Today’s Conflicts and Corporate Involvement

Father Seixas-Nunes said AI is already being used in wars to identify targets and recommend weapons and timing, as seen in the precise targeting of Iranian leaders by Israeli and American missiles.

In addition, warfare is being redefined in the conflict between Russia and Ukraine, where privately owned satellites are being used to provide military intelligence – which makes those satellites legitimate military objectives under international law. “For the first time, we have opened the door to a possible conflict in outer space,” Fr. Seixas-Nunes said.

A related concern is the growing influence of private companies in areas once controlled by governments. This “shift in sovereignty” involves more than weapons; it is also about data, infrastructure and access to advanced AI systems.

“The social responsibility of a corporation is not the same as that of a nation,” he said. “Corporations act based on profit.”

At the same time, some companies are setting limits on the use of their products. “It is very interesting to see corporations basically telling the states that they should be more ethical than the states are,” he said.

Is Regulation Possible?

Is there a way to regulate the use of autonomous weapons? “Since 2014, countries have been gathering under the Convention on Certain Conventional Weapons to decide the future of AI applied to weaponry,” Fr. Seixas-Nunes said. Some states, including the Holy See, want a ban on autonomous systems. A smaller group of the most powerful countries, including the United States, is in favor of autonomous weapon systems. Still others want to “wait and see.” But time has run out for that approach, as today’s conflicts demonstrate.

Father Seixas-Nunes believes the use of AWS is inevitable. He notes that a ban is unrealistic, because countries that want the technology will simply disregard the law.

The Catholic Perspective

The Church has a long history of speaking out about warfare and weaponry. Today, it is part of the global conversation about the ethical use of AI. Both Pope Leo XIV and Pope Francis have commented on AI and autonomous weapons. In a World Day of Peace message in 2024, Pope Francis voiced concerns about AWS: “Autonomous weapon systems can never be morally responsible subjects,” he said. “The unique human capacity for moral judgment and ethical decision-making is more than a complex collection of algorithms, and that capacity cannot be reduced to programming a machine, which as ‘intelligent’ as it may be, remains a machine. For this reason, it is imperative to ensure adequate, meaningful and consistent human oversight of weapon systems.”

Pope Leo has gone a step further and called for total disarmament. He has warned against letting algorithms make choices in “fundamental areas of human dignity.”

As a professor of law, Fr. Seixas‑Nunes wants to prepare students to understand emerging technologies from a perspective based in faith and humanity. In a world increasingly shaped by algorithms, decisions about life, death and power must remain accountable to human reason and ethical reflection.

Father Afonso Seixas-Nunes, SJ, was recently interviewed on the Saint Louis University School of Law podcast on “Presidential Powers, Executive Authority, and the Conflict with Iran.”
