Thoughts, reflections and experiences




From Theory to the Trenches: Introducing “Cyber Centres of Gravity”

The nature of warfare is in constant flux. Clausewitz’s timeless insight that war is a “contest of wills” remains central, yet the means by which this contest is waged are transforming. Traditionally, Centres of Gravity (CoGs) were often seen as physical entities: armies, capitals, industrial capacity. The thinking was that neutralising these would cripple an adversary’s warfighting ability. However, it’s crucial to recognise, as scholars like Echevarria highlight, that Clausewitz himself acknowledged non-material CoGs, such as national will. The concept isn’t entirely new, but modern interpretations significantly expand upon it, especially in the context of cyberspace.

Today, the pervasive nature of information networks prompts us to consider what this means for strategic targeting. What happens when the critical vulnerabilities lie not just in the physical domain, but in an enemy’s belief systems, the legitimacy of their leadership, or their very grasp of shared reality? This is where exploring an emerging concept – what this article terms “Cyber Centres of Gravity” (Cyber CoGs) – becomes vital for contemporary military strategists. While “Cyber CoG” as a distinct term is still evolving and not yet firmly established in formal doctrine (which tends to use adjacent terms like cognitive targets or information influence objectives, as noted by analysts like Pawlak), its exploration helps us grapple with these new strategic challenges. Ignoring these intangible, yet increasingly critical, aspects in our information-saturated world could represent a significant strategic blind spot.

Understanding “Cyber CoGs”

So, what might a “Cyber CoG” entail? It can be conceptualised as a critical source of an adversary’s moral or political cohesion, their collective resolve, or a foundational element of their operative reality-construct that underpins their ability or will to resist your strategic objectives. The key idea is that significant degradation of such a “Cyber CoG,” predominantly through cyber-enabled means, could fundamentally unravel an enemy’s capacity or desire to continue a conflict, perhaps by altering their perception of the strategic landscape.

This isn’t merely about disrupting networks or servers, though such actions might play a role. A true “Cyber CoG,” in this conceptualisation, is intrinsically linked to these deeper wellsprings of an enemy’s will, cohesion, or their understanding of reality. If an operation doesn’t aim to decisively alter the strategic balance by impacting these moral, political, or epistemic foundations, it’s more likely an operational objective than an attack on a strategic “Cyber CoG”.

Clausewitz identified the CoG as “the hub of all power and movement, on which everything depends”. In an age increasingly defined by information, this hub can often be found in the cognitive and informational realms. When societal “passion” can be manipulated through digital narratives, when a military’s operating environment is shaped by perception as much as by physical friction, and when governmental “reason” is threatened by the decay of a shared factual basis, cyberspace becomes an increasingly central domain in shaping strategic outcomes. While kinetic, economic, and geopolitical power still hold immense, often primary, sway in high-stakes confrontations (a point Gartzke’s work on the “Myth of Cyberwar” reminds us to consider), the cyber domain offers potent avenues to contest the very “reality” upon which an adversary’s will is constructed. Here, strategic success may rely less on physical destruction and more on the ability to influence or disrupt an adversary’s cognitive and narrative environments.

Identifying Potential “Cyber CoGs”: A Framework for Analysis

Pinpointing these potential “Cyber CoGs” requires a nuanced analytical approach, considering factors such as:

  1. Strategic Relevance: Does the potential target truly sustain the enemy’s will to fight or their core strategic calculus? This involves looking at national cohesion, public legitimacy, dominant narratives, key alliances, or shared assumptions underpinning their strategy. Its degradation should aim to undermine their strategic purpose or resolve.
  2. Cyber Primacy in Effect: Can cyber-enabled operations offer a uniquely effective, or significantly complementary, method for impacting this CoG, especially when compared or combined with kinetic, economic, or diplomatic levers? Some intangible CoGs may be less susceptible to physical attack but highly vulnerable to informational strategies.
  3. Potential for Decisive Influence: Is the intended effect of targeting the “Cyber CoG” likely to be decisive, whether through an irreversible loss of trust (e.g., in institutions or information), a critical breakdown in a foundational narrative, or a fundamental, lasting shift in the adversary’s perception of their strategic environment? It could also be a cumulative effect, eroding coherence and resolve over time.
  4. Linkage to Moral and Political Dimensions (Clausewitzian Character): Is the “Cyber CoG” intrinsically tied to the enemy’s unity, cohesion, will to resist, or the shared narratives defining their interests and threats? It’s not just a system or infrastructure but is linked to the collective spirit or governing principles.
  5. Strategic Viability and Responsibility: Can the proposed operation be conducted with a rigorous assessment of attribution risks, potential for unintended escalation, and broader second-order societal effects? This includes careful consideration of evolving international norms and legal frameworks.

Implications for Military Planners

Strategically engaging potential “Cyber CoGs” would necessitate evolving current approaches:

  • Integrated Intelligence: Identifying and understanding these “Cyber CoGs” demands a deep, multidisciplinary intelligence effort, fusing technical insights with profound cultural, political, cognitive, and narrative analysis. This requires collaboration between experts in fields like anthropology, sociology, political science, and data science to map the ‘human terrain’ and ‘narrative architecture’.
  • Dynamic and Adaptive Campaigning: Operations targeting “Cyber CoGs” are unlikely to be single events. Influencing moral cohesion or perceived reality is a complex, interactive process involving continuous adaptation to feedback loops, narrative shifts, and adversary countermeasures. The aim is often cognitive degradation or displacement, subtly altering the adversary’s decision-making calculus over time.
  • Strategic, Not Just Tactical, Focus: While drawing on tools from traditional information warfare or psychological operations, the concept of “Cyber CoGs” pushes for a more strategically ambitious focus on these Clausewitzian centres of power, wherever they may reside. When a CoG itself is located in the moral, political, or epistemic domains, cyber-enabled operations can become a key component of strategic engagement.

Navigating the Ethical and Legal Landscape

The capacity to strategically influence an adversary’s societal beliefs and perceived reality carries a profound ethical burden and operates within a complex legal landscape. Responsible statecraft demands a deliberate moral calculus, especially in the ambiguous “grey zone”. The Tallinn Manual 2.0, for instance, provides detailed interpretations of how international law applies to cyber operations, including complex issues around sovereignty, non-intervention, and due diligence. Operations that aim to alter perception or manipulate societal beliefs can brush up against these established and evolving legal interpretations. Pursuing strategic goals through such means requires careful navigation to avoid widespread societal disruption or unintended consequences that could undermine international order. There is also the risk of “blow-back,” where the methods used externally could erode internal democratic norms if not carefully managed.

Integrating New Concepts into Strategic Thinking

The future of conflict is undeniably intertwined with the contested terrains of perception, belief, and societal cohesion. Exploring concepts like “Cyber Centres of Gravity” can help us theorise and analyse these critical nodes of will, unity, and perceived reality. This endeavour is less about new technologies and more about refining our understanding of strategy itself: to influence an adversary’s will or alter their perceived reality to achieve strategic aims, through means that are proportionate, precise, and adapted to the evolving character of modern conflict.

Failing to adapt our thinking, to build the necessary multidisciplinary approaches, and to foster the institutional agility to operate in this transformed strategic landscape would put our future strategic effectiveness at risk.

Selected Bibliography

  • Brittain-Hale, Angela. “Clausewitzian Theory of War in the Age of Cognitive Warfare.” The Defense Horizon Journal (2023): 1–19.
  • Clausewitz, Carl von. On War. Edited and translated by Michael Howard and Peter Paret. Princeton, NJ: Princeton University Press, 1976.
  • Echevarria, Antulio J. II. Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine—Again! Carlisle, PA: Strategic Studies Institute, 2002.
  • Gartzke, Erik. “The Myth of Cyberwar: Bringing War in Cyberspace Back Down to Earth.” International Security 38, no. 2 (2013): 41–73.
  • Krieg, Andreas. Subversion: The Strategic Weaponization of Narratives. London: Routledge, 2023.
  • Lin, Herbert, and Jackie Kerr. “On Cyber-Enabled Information/Influence Warfare and Manipulation.” Center for International Security and Cooperation, Stanford University, 2017.
  • Pawlak, Patryk. “Cognitive Warfare: Between Psychological Operations and Narrative Control.” EUISS Brief, 2022.
  • Schmitt, Michael N., ed. Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations. Cambridge: Cambridge University Press, 2017.


The Unfolding Strategic Environment: Reconciling Enduring Principles with Revolutionary Change

The contemporary strategic environment presents a paradox. On one hand, the fundamental nature of war as a political instrument, driven by human factors and subject to friction and uncertainty, appears timeless. Carl von Clausewitz’s assertion that war serves political objectives remains a crucial anchor, forcing strategists to connect means with ends, even amidst technological fascination. Similarly, Sun Tzu’s principles regarding deception, intelligence, and achieving advantage with minimal direct confrontation resonate strongly in an era increasingly defined by non-traditional operations and persistent competition below the threshold of open warfare.

Yet, the character of conflict is undergoing a profound transformation. Technological disruption, particularly in the digital domain, is eroding traditional military advantages, intensifying “grey zone” activities, empowering non-state actors, and blurring the very definitions of war and peace. This necessitates a critical re-examination of established strategic paradigms and a forward-looking approach to national security. The challenge for policymakers and strategists lies in reconciling the enduring nature of war with its rapidly evolving character.

From Deterrence by Punishment to Deterrence by Resilience?

The Cold War’s strategic stability, largely built upon the concept of Mutually Assured Destruction (MAD), faces fundamental challenges in the digital age. While nuclear deterrence created a precarious balance, its logic struggles to adapt to threats operating outside its established framework. Cyberspace and information warfare lack the clear attribution mechanisms and proportional response options that underpin traditional deterrence by punishment. As Thomas Rid notes, establishing credibility and effective retaliation in these domains is problematic. Jeffrey Knopf’s work on “Fourth Wave” deterrence highlights how emerging threats disrupt existing models.

Furthermore, the strategic landscape is no longer solely dominated by states. Powerful technology firms, transnational terrorist organisations, and ideologically driven groups operate with increasing autonomy and influence, complicating deterrence calculations built on state-centric assumptions. The conflict in Ukraine provides stark examples, where companies like SpaceX have deployed capabilities, such as Starlink, that significantly impact battlefield communications and information warfare dynamics, challenging the state’s traditional monopoly on such strategic assets. This diffusion of power necessitates a broader conception of deterrence, moving beyond punishment towards denial, resilience, deception, and proactive information operations. Security may increasingly depend on the ability to withstand, adapt, and operate effectively within a contested information environment, rather than solely on the threat of overwhelming retaliation.

The Digital Revolution and the Transformation of Conflict Logic

The digital revolution represents more than just the introduction of new tools; it signifies a potential “change of consciousness” in warfare, as Christopher Coker suggests. Conflict becomes less geographically bounded and more psychological, abstract, and continuous, eroding distinctions between wartime and peacetime. Cyber operations, AI-enabled decision-making, and sophisticated disinformation campaigns are not merely adjuncts to traditional military power; they are becoming central components of strategic competition. China’s “Three Warfares” doctrine, integrating psychological operations, public opinion manipulation, and legal manoeuvring, exemplifies how state actors are weaponising the information domain to achieve strategic aims.

This shift challenges classical strategic concepts. How is escalation controlled when cyberattacks lack clear attribution? How is victory defined when conflict plays out continuously in the non-physical domain? The Ukraine conflict serves as a real-world laboratory, demonstrating the strategic significance of cyber defenses, AI-driven targeting, and narrative warfare alongside conventional operations. It highlights how eroding conventional advantages forces a rethink of the very currency of power. Non-state actors, like ISIS, have also adeptly exploited the digital realm for recruitment, propaganda, and operational coordination, demonstrating the asymmetric advantages offered by this environment.

Systemic Fragility, Strategic Agility, and Redefined Victory

The deep integration of technology across society creates unprecedented efficiencies but also introduces systemic fragility. Interconnectedness means that disruptions, whether from cyberattacks, pandemics, financial crises, or supply chain breakdowns, can cascade rapidly with significant security implications. Consequently, building national resilience – encompassing robust cybersecurity, hardened infrastructure, diversified supply chains, and societal preparedness – becomes a core strategic imperative.

Alongside resilience, strategic agility is paramount. The accelerating pace of technological and geopolitical change means that strategies and institutions must be capable of rapid adaptation. The failure of European powers to adapt their doctrines to the realities of industrialised warfare before World War I, as chronicled by Barbara Tuchman, serves as a potent warning against strategic rigidity. Fostering agility requires institutional cultures that embrace learning and experimentation, empower decentralised action, and anticipate change.

This evolving landscape also forces a re-evaluation of “victory”. As warfare expands beyond purely military considerations to encompass cyber, economic, and informational domains, success becomes more ambiguous. Robert Mandel’s distinction between “war-winning” (tactical success) and “peace-winning” (achieving sustainable political outcomes) is increasingly pertinent. Future conflicts, likely to be protracted and involve multiple actors with divergent goals, may necessitate strategies focused on achieving iterative, adaptable political objectives rather than decisive military triumphs.

Adapting Strategy for an Unfolding Future

While some argue that classical, state-centric models of war are obsolete, discarding the foundational insights of strategists like Clausewitz and Sun Tzu would be premature. As Lawrence Freedman emphasises, war remains shaped by human agency and political motives, regardless of technology. The core task is not replacement but adaptation: applying enduring principles to navigate the complexities of the contemporary environment.

Successfully navigating the future strategic environment requires a conceptual shift. Technological foresight, AI-driven analysis, and robust cyber capabilities are necessary but insufficient. The decisive factor may be institutional and cultural: the capacity for continuous learning, adaptation, and innovation. Strategy must become truly multidimensional, integrating all instruments of national power – diplomatic, informational, military, and economic – within a coherent framework that acknowledges both the timeless nature and the transforming character of conflict. The future belongs to those who can master this complex, dynamic interplay.


Bibliography

  • Awan, Imran. “Cyber-Extremism: Isis and the Power of Social Media.” Society 54, no. 2 (April 2017): 138–49. https://www.proquest.com/scholarly-journals/cyber-extremism-isis-power-social-media/docview/1881683052/se-2.
  • Coker, Christopher. Future War. Polity Press, 2015.
  • Freedman, Lawrence. The Evolution of Nuclear Strategy. New York: Palgrave Macmillan, 2003.
  • Freedman, Lawrence. The Future of War: A History. New York: PublicAffairs, 2017.
  • Gray, Colin S. The Strategy Bridge: Theory for Practice. Oxford: Oxford University Press, 2010. Online edn, Oxford Academic, September 1, 2010.
  • Greggs, David. “Violent Limitation: Cyber Effects Reveal Gaps in Clausewitzian Theory.” The Cyber Defense Review 9, no. 1 (2024): 73–86.
  • Jervis, Robert. The Meaning of the Nuclear Revolution: Statecraft and the Prospect of Armageddon. Ithaca: Cornell University Press, 1989.
  • Kaldor, Mary. New and Old Wars: Organized Violence in a Global Era. 3rd ed. Cambridge: Polity Press, 2012.
  • Kania, Elsa B. “The PLA’s Latest Strategic Thinking on the Three Warfares.” The Jamestown Foundation, August 22, 2016. https://jamestown.org/program/the-plas-latest-strategic-thinking-on-the-three-warfares/.
  • Knopf, Jeffrey W. “The Fourth Wave in Deterrence Research.” Contemporary Security Policy 31, no. 1 (2010): 1–33.
  • Layton, Peter. Fighting Artificial Intelligence Battles: Operational Concepts for Future AI-Enabled Wars. Joint Studies Paper Series No. 4. Canberra: Department of Defence, 2021.
  • Mandel, Robert. The Meaning of Military Victory. Boulder, CO: Lynne Rienner, 2006.
  • Morozov, Evgeny. To Save Everything, Click Here: The Folly of Technological Solutionism. New York: PublicAffairs, 2013.
  • Rid, Thomas. Cyber War Will Not Take Place. London: Hurst, 2013.
  • Skove, Sam. “How Elon Musk’s Starlink Is Still Helping Ukraine’s Defenders.” Defense One, March 1, 2023. https://www.defenseone.com.
  • Sun Tzu. The Art of War. Newburyport: Dover Publications, Incorporated, 2002.
  • Tiwari, Sachin. “Cyber Operations in the Grey Zone.” The Digital Humanities Journal, November 14, 2023. https://tdhj.org/blog/post/cyber-operations-grey-zone/.
  • Tuchman, Barbara W. The Guns of August. New York: Macmillan, 1962.
  • Van Creveld, Martin. Transformation of War. New York: Free Press, 1991.
  • Von Clausewitz, Carl. On War. Edited and translated by Michael Howard and Peter Paret. Princeton: Princeton University Press, 1984.

The Grey Mirage: Navigating Strategic Uncertainty and the Elusive Victory in Grey Zone Conflicts

Imagine a world where war is waged not with bombs and bullets, but with lines of code and viral misinformation. This is the reality of grey zone conflicts, a persistent feature of modern geopolitics characterised by cyber operations, economic coercion, and disinformation. While many initially hailed these tactics as a revolutionary new form of strategic competition, a critical examination reveals that they not only fundamentally fail to achieve strategic victory in a traditional Clausewitzian sense but also introduce profound strategic uncertainty and volatility into the international system. Extending Thomas Rid’s compelling argument that “cyber war will not take place” due to the inherent lack of decisive physical destruction, this critique applies even more broadly to the entire spectrum of grey zone conflicts.

To understand the inherent limitations of these operations, we must return to the foundational strategic thought of Carl von Clausewitz. His framework remains a lodestar: tactical successes must always serve political objectives, and the very essence of war is to impose one’s will upon the enemy. As Michael Handel succinctly summarises, Clausewitzian war aims at the destruction of enemy forces, control of vital resources, and the sway of public opinion. Grey zone tactics, however, are structurally incapable of achieving these aims in the decisive manner Clausewitz envisioned. They may sow disruption and discord, but they rarely deliver battlefield outcomes, nor can they compel political compliance in the way traditional military campaigns do. Consider, for instance, the persistent cyberattacks between nations; while disruptive and costly, they have yet to force a nation to fundamentally alter its core strategic direction.

The very nature of grey zone strategies – their calculated avoidance of outright force and immediately recognisable acts of aggression – means they cannot truly compel an adversary to accept a fundamentally new strategic order. Cyber operations, as Rid convincingly argues, rarely inflict the kind of lasting, tangible damage comparable to conventional military strikes. Disinformation campaigns, while capable of eroding trust in institutions and even mobilising populations, as seen in the Arab Spring uprisings, cannot on their own force political capitulation. Economic sanctions, though often painful and strategically useful in shaping behaviour, are notoriously slow and far from guaranteed to change a determined state’s core strategic calculations.

This inherent strategic limitation is further underscored by Colin Gray’s assertion that strategy is fundamentally about the application of force to achieve political objectives. For Gray, war is fundamentally about contesting and achieving control, and without the capacity to impose a decisive order, grey zone tactics fall drastically short of true strategic efficacy. He cautions that the absence of decisive engagement in contemporary conflicts leads not to resolution, but to a debilitating strategic paralysis. This resonates deeply with Clausewitz’s core tenet that successful war must culminate in the decisive defeat of the enemy. Grey zone conflicts, by their very nature, do not and cannot fulfil this criterion. At best, they generate protracted stalemates; at worst, they risk unintended escalation into open, conventional warfare.

Countering the Cumulative Argument and Embracing Ambiguity: Incrementalism vs. Decisiveness

It is important to acknowledge a key counterargument: that grey zone tactics, while rarely decisive alone, gain strategic effect cumulatively over time. Proponents argue that persistent cyber intrusions, disinformation, and economic pressure can erode an adversary’s strength and will. This view sees grey zone warfare as long-term shaping, not a knockout blow, exemplified by China’s “Three Warfares” doctrine.

Furthermore, the ambiguity of grey zone conflicts can itself be strategically useful, much as ambiguity once served nuclear deterrence. Bernard Brodie argued that nuclear war’s prohibitive cost shifted strategy towards prevention, redefining “victory” as the avoidance of war. Similarly, grey zone tactics might deter adversaries and manage competition below the threshold of open conflict. Incremental disruption, such as the cyberattacks on Iran’s nuclear programme, can also shift power balances.

Hurting Without Winning and the Zero-Sum Nature of Grey Zone Competition

Thomas Schelling noted, “Victory is no longer a prerequisite for hurting the enemy.” This is key to grey zone tactics, which can aim to inflict pain and signal resolve without overt war. Even non-military gains – diplomatic wins, sanctions, legal advantages achieved through disinformation and cyber influence – become strategic victories in this zero-sum competition. This is particularly relevant as tech-savvy strategists recognise the advantages of ambiguity in these operations.

However, pursuing overwhelming military victory can backfire, escalating conflict. Grey zone tactics offer a way to avoid this, operating below the threshold of conventional war. Yet, this ambiguity breeds volatility, with miscalculation and escalation always looming.

Strategic Victory as Peace-Winning and the Challenge of Subjectivity

Rethinking “strategic victory” beyond military terms is crucial. Robert Mandel distinguishes “war-winning” from “peace-winning,” arguing true strategic victory is “peace-winning” – a multi-dimensional achievement across information, politics, economics, and diplomacy. Grey zone tactics align with this broader view, especially as public mobilisation and decentralised networks shape geopolitics.

Yet, “victory” in the grey zone remains subjective and hard to measure. Ethan Kapstein highlights the difficulty of defining metrics, gaining consensus, and obtaining reliable data in grey zone operations. Progress in one area may undermine another, increasing strategic uncertainty. Whether grey zone tactics are a “strategic win” depends on perspective and chosen metrics.

Taiwan: Strategic Uncertainty in Action

Taiwan exemplifies the inherent volatility of grey zone warfare: hybrid strategies can pressure an opponent, but they provide no clear pathway to a controlled, predictable outcome, and the lack of definitive thresholds makes them as much a risk as an opportunity for the aggressor. Consider China’s grey zone pressure on Taiwan – cyberattacks, disinformation, and economic coercion. While this might weaken the island, it is unlikely to force capitulation without risking wider conflict. Taiwan’s reactions, U.S. responses, and the ever-present risk of miscalculation create a strategic dilemma.

While Russia has shown resilience to external grey zone pressures by tightly controlling its information space, societal resilience mitigates strategic uncertainty rather than eliminating it. Because grey zone conflicts operate in ambiguity, even the most carefully calibrated campaigns can escalate unexpectedly, making control and predictability elusive.

Policy Implications: Actively Shaping the Grey Zone for Advantage

The inherent strategic uncertainty of grey zone conflicts demands proactive policies:

  1. Sharpen Intelligence and Active Disruption: Enhance intelligence to understand adversary intentions and develop capabilities to actively disrupt their grey zone operations.
  2. Develop Flexible and Escalatory Response Options: Create a wider range of responses, including calibrated counter-grey zone tactics and clear signalling for de-escalation and conflict management. As artificial intelligence and automation continue to reshape information warfare, states must anticipate how AI-driven disinformation, deepfake technology, and autonomous cyber operations will further complicate grey zone conflicts. Developing countermeasures that integrate AI-based detection and rapid-response systems will be critical for maintaining strategic advantage.
  3. Promote Transparency to Force Predictability: Actively expose adversary actions to force them into a more predictable strategic posture, enhancing transparency and accountability in the grey zone.
  4. Focus on Proactive Crisis Management: Develop proactive crisis management to prevent crises, including clear communication, de-escalation protocols, and persistent low-intensity engagement for stability.
  5. Re-evaluate “Victory” and Embrace Persistent Engagement: Shift from traditional victory metrics to measures of resilience, deterrence, and long-term shaping, embracing persistent engagement as the norm in grey zone competition.

Conclusion: Embracing Uncertainty, Seeking Control Through Persistent Engagement

Russia’s pre-2022 hybrid warfare campaign in Ukraine – combining cyber operations, disinformation, and economic pressure – demonstrated the limitations of grey zone tactics. Rather than coercing Ukraine into submission, these operations reinforced Ukrainian national resistance and galvanised Western military support, ultimately leading to Russia’s full-scale invasion. This case underscores the strategic volatility of grey zone competition: while these tactics can create disruption, they provide no guarantee of controlled, predictable outcomes.

This highlights how grey zone tactics, while seemingly flexible, are unlikely to deliver traditional strategic victory and introduce significant strategic uncertainty. While ambiguity and “peace-winning” are modern adaptations, they don’t guarantee predictable outcomes or escalation control. The grey zone is a volatile battlespace defined by miscalculation and instability. Navigating the grey zone requires embracing uncertainty, prioritising crisis management, and actively shaping the battlespace. In this new era of perpetual contestation, mastering the grey zone is not about winning – it is about ensuring that one’s adversaries never can.


References

  1. Brodie, Bernard. “The Absolute Weapon: Atomic Power and World Order.” The Yale Review 35, no. 3 (Spring 1946): 456-472.
  2. Gray, Colin S. The Strategy Bridge: Theory for Practice. Oxford: Oxford University Press, 2010.
  3. Handel, Michael I. Masters of War: Classical Strategic Thought. London: Frank Cass, 2001.
  4. Kania, Elsa B. “The PLA’s Latest Strategic Thinking on the Three Warfares.” The Jamestown Foundation, August 22, 2016. https://jamestown.org/program/the-plas-latest-strategic-thinking-on-the-three-warfares/.
  5. Kapstein, Ethan B. “Measuring Success in Complex Operations.” The Journal of Strategic Studies 34, no. 2 (April 2011): 267-285.
  6. Mandel, Robert. “Thinking about Victory in Strategy.” The Journal of Strategic Studies 34, no. 2 (April 2011): 199-200.
  7. Monaghan, Sean. “Twitter Revolutions? Social Media and the Arab Spring.” Whitehall Papers 69, no. 1 (2011): 21-22.
  8. Rid, Thomas. Cyber War Will Not Take Place. London: Hurst, 2013.
  9. Sanger, David E., and William J. Broad. “Obama Order Sped Up Wave of Cyberattacks Against Iran.” The New York Times, June 1, 2012. https://www.nytimes.com/2012/06/01/world/middleeast/obama-ordered-wave-of-cyberattacks-against-iran.html.
  10. Schelling, Thomas C. Arms and Influence. New Haven: Yale University Press, 1966.
  11. Simons, Greg. “Russia and information confrontation: perceptions, strategies and responses.” Journal of strategic studies 42, no. 1 (2019): 139-140.

Rethinking Warfare: Clausewitz in the Age of Cyber and Hybrid Conflict

Carl von Clausewitz’s claim that war is “a continuation of politics by other means” has survived railways, radio and nuclear weapons.  Today the “other means” range from data-wiping malware that bricks ventilators to viral deep-fakes that never fired a shot.  The central puzzle is whether these novelties merely change the character of war (the tools, tempo and terrain) or whether they erode its immutable nature of violence, chance and political purpose (Echevarria 2002; Strachan 2013). 

International lawyers behind the Tallinn Manual 2.0 accept that a non-international armed conflict may now consist solely of cyber operations if the effects rival kinetic force (Schmitt 2017). Thomas Rid counters that almost all cyber activity is better classed as espionage, sabotage or subversion: potent, perhaps, but not war in the Clausewitzian sense (Rid 2017). The Russian “AcidRain” attack of February 2022 sits precisely on that fault-line: a single wiper disabled tens of thousands of Ukrainian satellite modems and, as collateral damage, cut remote monitoring to some 5,800 German wind turbines, yet no bombs fell (SentinelLabs 2022; Greenberg 2023). If violence is judged by effect on human life rather than by the immediate mechanics of injury, Clausewitz still works; if it is judged by physical harm alone, he wobbles.

The 2022 US National Defense Strategy elevates “integrated deterrence”, urging day-to-day campaigning below the armed-attack threshold (US DoD 2022).  US Cyber Command’s doctrine of persistent engagement pushes the same logic into practice, contesting adversaries continually rather than waiting for crises (USCYBERCOM 2022).  Fischerkeller and Harknett argue that such calibrated friction stabilises the domain; Lynch casts it as a new “power-sinew contest” in which outright war is the exception, not the rule (Fischerkeller & Harknett 2019; Lynch 2024).  The danger is conceptual inflation: call every malicious packet “war” and escalation thresholds blur, yet forcing every new tactic into Clausewitz’s vocabulary risks missing genuine novelty. 

Frank Hoffman’s once-handy term “hybrid warfare” now covers almost any sub-threshold activity.  NATO’s recent work on cognitive warfare goes further, treating perception itself as decisive terrain and calling for a fresh taxonomy of “acts of cognitive war” (NATO Innovation Hub 2023).  Clausewitz, writing in an age of limited literacy, rarely considered the deliberate collapse of an adversary’s shared reality as a line of operation.  The gap is undeniable – but it need not be fatal if his categories can stretch. 

| Clausewitzian element | Digital-age inflection | Illustrative case |
| Violence | Physical harm or systemic disruption that produces downstream human suffering | AcidRain modem wipe, 2022 |
| Chance | Amplified by tightly coupled networks where small code changes trigger cascading failures | Log4j exploit cascade, 2021 |
| Political purpose | Territorial control plus cognitive or behavioural manipulation | 2016 US election interference |

The table shows how old categories bend.  Violence migrates into infrastructure; chance spikes in opaque systems; political purpose colonises the infosphere.  None of these shifts removes politics from the centre – precisely why the trinity still maps the ground.

There are three key areas where Clausewitz’s wisdom holds strongly:

  1. Politics first.  Colin Gray insists that strategy is the orchestration of means to political ends; replacing artillery with algorithms does not move that lodestar (Gray 1999).
  2. Escalation logic.  Even in cyberspace, deterrence depends on adversaries reading tacit red lines.  Clausewitz’s emphasis on uncertainty and friction remains apt.
  3. Human cost.  Cyber operations hurt indirectly – frozen hospital wards, confused electorates – but the harm is felt by bodies in time and space, not by circuits.

There are, however, a number of places where the strain shows, namely where:

  • Systemic cyber harm approaches “force” while sidestepping bodily violence.
  • Persistent, below-threshold campaigning blurs the war–peace boundary Clausewitz assumed.
  • The trinity was never meant to classify acts aimed at belief rather than battalions.

For now, Rid’s scepticism still holds – most cyber operations do not meet Clausewitz’s threshold of war. Yet as societies entangle their critical functions ever more tightly with code, the line between systemic disruption and physical violence narrows. Clausewitz’s trinity – violence, chance and political purpose – still offers the clearest compass, because politics, not technology, remains the centre of gravity of strategy. The compass, however, is being asked to steer across novel terrain. Should a future campaign achieve political aims through cyber-enabled systemic coercion alone, the Prussian might finally need more than a tune-up. Until then, his core logic, though in need of adaptation, endures; it has not been eclipsed.

Bibliography

Clausewitz, C. v. (1832) On War.  Berlin: Ferdinand Dümmler.

Echevarria, A. J. (2002) ‘Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine – Again!’.  Carlisle, PA: US Army Strategic Studies Institute. 

Fischerkeller, M. P. and Harknett, R. J. (2019) ‘Persistent Engagement, Agreed Competition, and Cyberspace Interaction Dynamics’. The Cyber Defense Review.

Gray, C. S. (1999) Modern Strategy.  Oxford: Oxford University Press. 

Greenberg, A. (2023) ‘Ukraine Suffered More Wiper Malware in 2022 Than Anywhere, Ever’. WIRED, 22 February. 

Lynch, T. F. III (2024) ‘Forward Persistence in Great Power Cyber Competition’.  Washington, DC: National Defense University. 

NATO Innovation Hub (2023) The Cognitive Warfare Concept.  Norfolk, VA: NATO ACT. 

Rid, T. (2017) Cyber War Will Not Take Place.  Oxford: Oxford University Press. 

Schmitt, M. N. (ed.) (2017) Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations.  Cambridge: Cambridge University Press. 

SentinelLabs (2022) ‘AcidRain: A Modem Wiper Rains Down on Europe’.  SentinelOne Labs Blog, 31 March. 

US Cyber Command (2022) ‘CYBER 101 – Defend Forward and Persistent Engagement’.  Press release, 25 October. 

US Department of Defense (2022) National Defense Strategy of the United States of America.  Washington, DC. 

The Future of War: AI and Strategy

When Carl von Clausewitz wrote that war is “a continuation of politics by other means,” he centred conflict on purpose rather than technology. Colin Gray later warned that strategic constants outlive every gadget. Artificial intelligence now accelerates observation-decision loops from minutes to milliseconds, but whether that shift dethrones human strategy is still contested.

Speed Meets Friction

Ukrainian drone teams run machine-vision updates at the front line every fortnight, turning quadcopters into near-autonomous kamikaze platforms (Bondar 2025). Yet the same coders struggle with false positives – such as bears flagged as enemy sentries – and with mesh-network bottlenecks once EW jammers blanket the spectrum. AI compresses time, but it also multiplies friction, the very element Clausewitz thought ineradicable. We have to be conscious that false positives do not just waste munitions; when the same image-detection stack mis-tags an ambulance as a supply truck, the result is shrapnel in a paediatric ward, not an algorithmic hiccup. In 2025 the World Health Organization reported that strike-related loss of hospital services in Ukraine had been linked to 205 deaths (WHO 2025).

Open-source models still give insurgents propaganda reach, but the sharper edge of algorithmic warfare sits with states. Israel’s Lavender system, revealed in 2024, generated a list of roughly 37,000 potential Hamas targets and was used even when commanders expected up to twenty civilian deaths per strike – a machine-driven tempo that unsettled some of the intelligence officers involved (McKernan & Davies 2024). Cutting-edge autonomy, however, still demands high-end GPUs, abundant power and proprietary data. That keeps strategic dominance gated by infrastructure, mirroring geopolitical power. Yet, as Kuner (2024) notes, Brussels carved a national-security escape hatch into the AI Act precisely to preserve state leverage over the biggest models.

Washington’s Replicator initiative aims to field “thousands of attritable autonomous systems” within two years (DoD 2024). Beijing answers through civil-military fusion; Moscow improvises with AI-augmented loitering munitions. These programmes underpin an operating concept of continuous, sub-threshold contest, paralleling U.S. Cyber Command’s “persistent engagement”. Strategic deterrence thus rests on the hope that algorithmic agents still read tacit red lines the way humans do. In Stanford’s 2024 crisis simulations, LLM agents recommended first-strike escalation in seventy-nine per cent of runs, providing evidence that algorithmic ‘rationality’ may be anything but.

If LLM advisers escalate crises in simulation roughly four times out of five, the locus of judgement drifts from commander to code.  The next question is whether that drift merely speeds execution or begins to automate strategy itself.

Promoters once claimed AI would dissolve uncertainty; real battlefields say different. Sensor glut, spoofed tracks and synthetic “ghost columns” now drown analysts in contradictory feeds (Collazzo 2025). AI redistributes fog rather than lifting it – accelerating some judgements while blinding others through overload or deception (Askew and Salinas 2025). 

The Pentagon updated Directive 3000.09 on autonomous weapons in late 2024, tightening human-in-the-loop requirements. At the multilateral level, UN talks in May 2025 once again failed to agree binding rules, though Secretary-General Guterres set a 2026 deadline (Le Poidevin 2025). Norms lag well behind code, keeping accountability – and escalation liability – firmly in human hands. 

Strategic implications

The transformative impact of AI on strategic paradigms can be distilled into a few key considerations:

  • Advantage remains political. AI is a means; objectives still emanate from human intent. Strategy therefore keeps its Clausewitzian anchor in politics.
  • Automation magnifies misperception. Faster loops leave less time for reflection, and black-box models hide their own failure modes. Bias and data poisoning risk strategic self-harm.
  • Deterrence becomes brittle. Autonomous systems may over-react to spoofed inputs; adversaries may test thresholds in micro-seconds rather than hours, shortening the ladder of de-escalation.

Conclusions

AI does not automate strategy; it amplifies both its promise and its pathologies. Machines accelerate tactics, generate options and even draft operational plans, but they do not choose political ends – and they continue to manifest friction, chance and uncertainty. Thomas Rid remains broadly right that most cyber and AI activity falls short of war (Rid, 2017), yet as energy grids, logistics chains and battlefield kill cycles digitise, the gap between systemic disruption and physical violence narrows. For the moment, Clausewitz’s compass still points true – but the ground beneath it is starting to slide.

Select bibliography

Askew, M. and Salinas, A. (2025) ‘AI Will Make the Mind Games of War More Risky’, Business Insider, 18 Apr.

Bondar, K. (2025) Ukraine’s Future Vision and Current Capabilities for Waging AI-Enabled Autonomous Warfare.  CSIS.

Collazzo, A. (2025) ‘Warfare at the Speed of Thought’, Modern War Institute, 21 Feb.

Department of Defense (2024) ‘Replicator Initiative Progress Update’.

Kuner, C. (2024) ‘The AI Act National Security Exception’, Verfassungsblog, 15 Dec.

Le Poidevin, O. (2025) ‘Nations Meet at UN for “Killer Robot” Talks’, Reuters, 12 May.

McKernan, B. and Davies, H. (2024) ‘The Machine Did It Coldly’. The Guardian, 3 April. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

Rid, T. (2017) Cyber War Will Not Take Place.  Oxford: Oxford University Press.

Stanford HAI (2024) Escalation Risks from LLMs in Military and Diplomatic Contexts.

World Health Organization (2025) WHO’s Health Emergency Appeal 2025. Geneva: World Health Organization.
