
Since the advent of Artificial Intelligence (AI), a pressing question has arisen: Is Clausewitz still relevant?1 The game-changing potential of AI and the idea of human-machine teaming2 (centaur systems) have led many to question whether the nature of war truly remains unchanged. This, in turn, has fed the belief that AI-powered systems will replace humans (generals) in the command loop. However, this view is detached from the complex nature of warfare, which fundamentally remains a human endeavour shaped by violence, chance and friction.3

Like other social institutions, war is best understood through an interpretivist paradigm rooted in complex human nature. It is a non-linear phenomenon4 whose conduct and outcomes cannot be determined by analytical prediction or algorithmic pattern-matching. In other words, war does not proceed according to pre-determined rules of engagement, prescriptive manuals, established patterns or predictive modelling. Instead, it is fought on judgement, adaptation to changing realities, the commander's intuition, and attentiveness to the unfolding of the unknown.

At the pinnacle of algorithmic dominance, it is widely assumed that AI is poised to drive humans out of the loop and alter war's Clausewitzian nature. This optimism, however, is misplaced. Contemporary AI relies on inductive reasoning, drawing insights from existing data.5 It employs the methodology of 'deep learning', which establishes relationships between variables through statistical and probabilistic inference. This enhances AI's capability for target selection and information processing at speeds beyond human ability. On these grounds, AI optimists argue that it can process information, select targets and cycle through the Observe-Orient-Decide-Act (OODA) loop faster than humans.

However, war is not a bounded enterprise that can be resolved with immense computing power and datasets alone. It is, in essence, a Zweikampf: a duel between two opponents with competing political wills.6 Being inherently abstract and non-linear, it cannot be reduced to oversimplified machine logic. Rather, it relies on abductive logic, which grapples with unknown circumstances, unintended chaos and adaptation to sudden changes in the battlefield landscape.

Here, the commander's role in interpreting and visualising changing dynamics is decisive. Amidst the fog of war, a military leader's competence is central to making sense of events once the first shot is fired, which AI cannot do. This is especially true for those operating in GPS-denied environments or submerged at depth, who rely mainly on intuitive judgement, professional competence and interpretation of events to guide formations, or sometimes entire bureaucracies, in ways that go beyond the Crisis Action Planning (CAP) and Deliberate Planning (DELPLAN) frameworks.

Moreover, AI algorithms typically treat reality as stable. This assumption is well suited to linear systems such as automated weapons, radars and missile-guidance systems. War, however, is not a hermetically sealed phenomenon narrowly confined to operating weapon systems, firing bullets and executing minor tactical engagements. It is the employment of violence in a political context, with unpredictable outcomes.

According to James Clerk Maxwell, the renowned nineteenth-century physicist, real-world structures are inherently unstable,7 which renders pattern-driven algorithmic logic unfit to predict the course of future events. Political objectives, moreover, rest on human subjectivity: the same objective can elicit different responses from different people, and even from the same people at different times.

Between two states, small perturbations can produce unanticipated outcomes, making the routes to victory infinite and fundamentally unclear. This was evident in the recent India-Pakistan military crisis,8 in which both forces deviated from their established rules of engagement. Pakistan's reported downing of several top-of-the-line Rafale jets, along with Sukhoi and Mirage aircraft, triggered a dramatic climb up the escalation ladder. For the first time, both states fired surface-to-surface missiles (SSMs) and launched kamikaze drones at each other's military infrastructure.

India's sudden pre-emptive strikes marked a visible departure from punitive retaliation towards counterforce posturing, disrupting existing linear models of the escalation ladder. In response, Pakistan's multi-domain reply moved beyond traditional retaliation mechanics to immediate deep-strike engagements.9 This rendered the strategic calculus and risk tolerance of both actors ambiguous and unpredictable.

This uncertainty, coupled with organised violence, makes war a true chameleon that presents a different face in every instance. Unbound by linear logic, the nature of war exposes the pitfalls of relying on AI and other technologies. When digital screens flicker and communications go dead, the commander's intuition and the troops' morale determine battlefield outcomes. The non-linear reality of war confounds intelligent systems that offer analytically simple solutions to inherently complex problems. Relying solely on algorithmic logic therefore runs against the fundamental contours of warfare.

Footnotes

1 Wing Commander Alison Morton, "Artificial Intelligence Changing the Nature of War: Strategic Security Implications for the United Kingdom?" Air and Space Power Review 25, no. 1 (2023): 164-189.

2 Dorottya Zsiboracs, "Human-Machine Teaming in Modern Warfare: Evolving Collaboration at Edge on the Battlefield," Karve, March 25, 2025, https://www.karveinternational.com/insights/human-machine-teaming-in-modern-warfare

3 Carl von Clausewitz, On War, ed. and trans. Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1989), 75-99.

4 Alan Beyerchen, "Clausewitz, Nonlinearity, and the Unpredictability of War," International Security 17, no. 3 (1992): 59-90, https://doi.org/10.2307/2539130

5 Cameron Hunter and Bleddyn E. Bowen, "We'll Never Have a Model of an AI Major-General: Artificial Intelligence, Command Decisions, and Kitsch Visions of War," Journal of Strategic Studies 47, no. 1 (2024): 116-146, https://doi.org/10.1080/01402390.2023.2241648

6 Beyerchen, "Clausewitz, Nonlinearity, and the Unpredictability of War," 59-90.

7 Lewis Campbell and William Garnett, The Life of James Clerk Maxwell (Cambridge: Cambridge University Press, 2011), https://doi.org/10.1017/CBO9780511709050

8 Shaheer Ahmad, "India-Pakistan Military Crisis: A Testing Ground for Chinese Military Hardware," The Diplomat, May 13, 2025, https://thediplomat.com/2025/05/india-pakistan-military-crisis-a-testing-ground-for-chinese-military-hardware/

9 Raja Zarkullah Khan, "From Ladder to Elevator: India, Pakistan and the End of Predictable Escalation," Strategic Vision Institute, October 9, 2025, https://thesvi.org/from-ladder-to-elevator-india-pakistan-and-the-end-of-predictable-escalation/

Shaheer Ahmad, "How the Nature of Warfare Affects the AI Optimism", The Forge, Published: April 21, 2026, https://theforge.defence.gov.au/article/how-nature-warfare-affects-ai-optimism. (accessed April 22, 2026).

Disclaimer

The views expressed in this article are those of the author and do not necessarily reflect the position of the Department of Defence or the Australian Government.


 
