From Theory to the Trenches: Introducing “Cyber Centres of Gravity”

The nature of warfare is in constant flux. Clausewitz’s timeless insight that war is a “contest of wills” remains central, yet the means by which this contest is waged are transforming. Traditionally, Centres of Gravity (CoGs) were often seen as physical entities: armies, capitals, industrial capacity. The thinking was that neutralising these would cripple an adversary’s warfighting ability. However, it’s crucial to recognise, as scholars like Echevarria highlight, that Clausewitz himself acknowledged non-material CoGs, such as national will. The concept isn’t entirely new, but modern interpretations significantly expand upon it, especially in the context of cyberspace.

Today, the pervasive nature of information networks prompts us to consider what this means for strategic targeting. What happens when the critical vulnerabilities lie not just in the physical domain, but in an enemy’s belief systems, the legitimacy of their leadership, or their very grasp of shared reality? This is where exploring an emerging concept – what this article terms “Cyber Centres of Gravity” (Cyber CoGs) – becomes vital for contemporary military strategists. While “Cyber CoG” as a distinct term is still evolving and not yet firmly established in formal doctrine (which tends to use adjacent terms like cognitive targets or information influence objectives, as noted by analysts like Pawlak), its exploration helps us grapple with these new strategic challenges. Ignoring these intangible, yet increasingly critical, aspects in our information-saturated world could represent a significant strategic blind spot.

Understanding “Cyber CoGs”

So, what might a “Cyber CoG” entail? It can be conceptualised as a critical source of an adversary’s moral or political cohesion, their collective resolve, or a foundational element of their operative reality-construct that underpins their ability or will to resist your strategic objectives. The key idea is that significant degradation of such a “Cyber CoG,” predominantly through cyber-enabled means, could fundamentally unravel an enemy’s capacity or desire to continue a conflict, perhaps by altering their perception of the strategic landscape.

This isn’t merely about disrupting networks or servers, though such actions might play a role. A true “Cyber CoG”, in this conceptualisation, is intrinsically linked to these deeper wellsprings of an enemy’s will, cohesion, or understanding of reality. If an operation doesn’t aim to decisively alter the strategic balance by impacting these moral, political, or epistemic foundations, it is more likely an operational objective than an attack on a strategic “Cyber CoG”.

Clausewitz identified the CoG as “the hub of all power and movement, on which everything depends”. In an age increasingly defined by information, this hub can often be found in the cognitive and informational realms. When societal “passion” can be manipulated through digital narratives, when a military’s operating environment is shaped by perception as much as by physical friction, and when governmental “reason” is threatened by the decay of a shared factual basis, cyberspace becomes an increasingly central domain in shaping strategic outcomes. While kinetic, economic, and geopolitical power still hold immense, often primary, sway in high-stakes confrontations (a point Gartzke’s work on the “Myth of Cyberwar” reminds us to consider), the cyber domain offers potent avenues to contest the very “reality” upon which an adversary’s will is constructed. Here, strategic success may rely less on physical destruction and more on the ability to influence or disrupt an adversary’s cognitive and narrative environments.

Identifying Potential “Cyber CoGs”: A Framework for Analysis

Pinpointing these potential “Cyber CoGs” requires a nuanced analytical approach, considering factors such as:

  1. Strategic Relevance: Does the potential target truly sustain the enemy’s will to fight or their core strategic calculus? This involves looking at national cohesion, public legitimacy, dominant narratives, key alliances, or shared assumptions underpinning their strategy. Its degradation should aim to undermine their strategic purpose or resolve.
  2. Cyber Primacy in Effect: Can cyber-enabled operations offer a uniquely effective, or significantly complementary, method for impacting this CoG, especially when compared or combined with kinetic, economic, or diplomatic levers? Some intangible CoGs may be less susceptible to physical attack but highly vulnerable to informational strategies.
  3. Potential for Decisive Influence: Is the intended effect of targeting the “Cyber CoG” likely to be decisive, whether through an irreversible loss of trust (e.g., in institutions or information), a critical breakdown in a foundational narrative, or a fundamental, lasting shift in the adversary’s perception of their strategic environment? It could also be a cumulative effect, eroding coherence and resolve over time.
  4. Linkage to Moral and Political Dimensions (Clausewitzian Character): Is the “Cyber CoG” intrinsically tied to the enemy’s unity, cohesion, will to resist, or the shared narratives defining their interests and threats? It’s not just a system or infrastructure but is linked to the collective spirit or governing principles.
  5. Strategic Viability and Responsibility: Can the proposed operation be conducted with a rigorous assessment of attribution risks, potential for unintended escalation, and broader second-order societal effects? This includes careful consideration of evolving international norms and legal frameworks.

Implications for Military Planners

Strategically engaging potential “Cyber CoGs” would necessitate evolving current approaches:

  • Integrated Intelligence: Identifying and understanding these “Cyber CoGs” demands a deep, multidisciplinary intelligence effort, fusing technical insights with profound cultural, political, cognitive, and narrative analysis. This requires collaboration between experts in fields like anthropology, sociology, political science, and data science to map the ‘human terrain’ and ‘narrative architecture’.
  • Dynamic and Adaptive Campaigning: Operations targeting “Cyber CoGs” are unlikely to be single events. Influencing moral cohesion or perceived reality is a complex, interactive process involving continuous adaptation to feedback loops, narrative shifts, and adversary countermeasures. The aim is often cognitive degradation or displacement, subtly altering the adversary’s decision-making calculus over time.
  • Strategic, Not Just Tactical, Focus: While drawing on tools from traditional information warfare or psychological operations, the concept of “Cyber CoGs” pushes for a more strategically ambitious focus on these Clausewitzian centres of power, wherever they may reside. When a CoG itself is located in the moral, political, or epistemic domains, cyber-enabled operations can become a key component of strategic engagement.

Navigating the Ethical and Legal Landscape

The capacity to strategically influence an adversary’s societal beliefs and perceived reality carries a profound ethical burden and operates within a complex legal landscape. Responsible statecraft demands a deliberate moral calculus, especially in the ambiguous “grey zone”. The Tallinn Manual 2.0, for instance, provides detailed interpretations of how international law applies to cyber operations, including complex issues around sovereignty, non-intervention, and due diligence. Operations that aim to alter perception or manipulate societal beliefs can brush up against these established and evolving legal interpretations. Pursuing strategic goals through such means requires careful navigation to avoid widespread societal disruption or unintended consequences that could undermine international order. There is also the risk of “blow-back,” where the methods used externally could erode internal democratic norms if not carefully managed.

Integrating New Concepts into Strategic Thinking

The future of conflict is undeniably intertwined with the contested terrains of perception, belief, and societal cohesion. Exploring concepts like “Cyber Centres of Gravity” can help us theorise and analyse these critical nodes of will, unity, and perceived reality. This endeavour is less about new technologies and more about refining our understanding of strategy itself: to influence an adversary’s will or alter their perceived reality to achieve strategic aims, through means that are proportionate, precise, and adapted to the evolving character of modern conflict.

Failing to adapt our thinking, to build the necessary multidisciplinary approaches, and to foster the institutional agility to operate in this transformed strategic landscape would put our future strategic effectiveness at risk.

Selected Bibliography

  • Brittain-Hale, A. (2023) ‘Clausewitzian Theory of War in the Age of Cognitive Warfare’. The Defense Horizon Journal, pp. 1–19.
  • Clausewitz, C. von (1976) On War. Edited and translated by M. Howard and P. Paret. Princeton, NJ: Princeton University Press.
  • Echevarria, A. J. (2002) ‘Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine – Again!’. Carlisle, PA: US Army Strategic Studies Institute.
  • Gartzke, E. (2013) ‘The Myth of Cyberwar: Bringing War in Cyberspace Back Down to Earth’. International Security, 38(2), pp. 41–73.
  • Krieg, A. (2023) Subversion: The Strategic Weaponization of Narratives. London: Routledge.
  • Lin, H. and Kerr, J. (2017) ‘On Cyber-Enabled Information/Influence Warfare and Manipulation’. Stanford, CA: Center for International Security and Cooperation, Stanford University.
  • Pawlak, P. (2022) ‘Cognitive Warfare: Between Psychological Operations and Narrative Control’. EUISS Brief.
  • Schmitt, M. N. (ed.) (2017) Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations. Cambridge: Cambridge University Press.

[Video] UK and EU AI Influence

Artificial intelligence isn’t just reshaping industries—it’s reshaping reality. While the UK and EU focus on regulating AI and combating misinformation, adversarial states like Russia and China are weaponizing it for influence warfare. The AI-driven disinformation battle isn’t coming; it’s already here.

In my latest article, “Why the UK and EU Are Losing the AI Influence War”, I explore how Europe’s slow response, defensive posture, and reliance on outdated regulatory approaches are leaving it vulnerable to AI-enhanced propaganda campaigns.

To bring these ideas to life, I’ve created a video that visualises the scale of the challenge and why urgent action is needed. Watch it below:

The AI influence war is no longer hypothetical; it is unfolding in real time. Europe’s current strategies are reactive and insufficient, while adversaries leverage AI to manipulate narratives at unprecedented speed. Without a cognitive security unit, AI-powered countermeasures, and a national security-driven approach, the UK and EU risk losing control of their own information space.

The question isn’t whether AI will reshape public perception; it’s who will control that perception. Will Europe rise to the challenge, or will it remain a passive battleground for AI-driven narratives?

What do you think? Should the UK and EU take a more aggressive stance in countering AI-enhanced disinformation? Feel free to discuss in the comments.

Past meets Future

Rethinking Warfare: Clausewitz in the Age of Cyber and Hybrid Conflict

Carl von Clausewitz’s claim that war is “a continuation of politics by other means” has survived railways, radio and nuclear weapons.  Today the “other means” range from data-wiping malware that bricks ventilators to viral deep-fakes that never fired a shot.  The central puzzle is whether these novelties merely change the character of war (the tools, tempo and terrain) or whether they erode its immutable nature of violence, chance and political purpose (Echevarria 2002; Strachan 2013). 

International lawyers behind the Tallinn Manual 2.0 accept that a non-international armed conflict may now consist solely of cyber operations if the effects rival kinetic force (Schmitt 2017). Thomas Rid counters that almost all cyber activity is better classed as espionage, sabotage or subversion: potent, perhaps, but not war in the Clausewitzian sense (Rid 2017). The Russian “AcidRain” attack of February 2022 sits precisely on that fault-line: a single wiper disabled thousands of Ukrainian satellite modems and knocked out remote monitoring of some 5,800 German wind turbines, yet no bombs fell (SentinelLabs 2022; Greenberg 2023). If violence is judged by its effect on human life rather than by the immediate mechanics of injury, Clausewitz still works; if it is judged by physical harm alone, he wobbles.

The 2022 US National Defense Strategy elevates “integrated deterrence”, urging day-to-day campaigning below the armed-attack threshold (US DoD 2022).  US Cyber Command’s doctrine of persistent engagement pushes the same logic into practice, contesting adversaries continually rather than waiting for crises (USCYBERCOM 2022).  Fischerkeller and Harknett argue that such calibrated friction stabilises the domain; Lynch casts it as a new “power-sinew contest” in which outright war is the exception, not the rule (Fischerkeller & Harknett 2019; Lynch 2024).  The danger is conceptual inflation: call every malicious packet “war” and escalation thresholds blur, yet forcing every new tactic into Clausewitz’s vocabulary risks missing genuine novelty. 

Frank Hoffman’s once-handy term “hybrid warfare” now covers almost any sub-threshold activity.  NATO’s recent work on cognitive warfare goes further, treating perception itself as decisive terrain and calling for a fresh taxonomy of “acts of cognitive war” (NATO Innovation Hub 2023).  Clausewitz, writing in an age of limited literacy, rarely considered the deliberate collapse of an adversary’s shared reality as a line of operation.  The gap is undeniable – but it need not be fatal if his categories can stretch. 

Clausewitzian element | Digital-age inflection | Illustrative case
Violence | Physical harm or systemic disruption that produces downstream human suffering | AcidRain modem wipe, 2022
Chance | Amplified by tightly coupled networks where small code changes trigger cascading failures | Log4j exploit cascade, 2021
Political purpose | Territorial control plus cognitive or behavioural manipulation | 2016 US election interference

The table shows how old categories bend.  Violence migrates into infrastructure; chance spikes in opaque systems; political purpose colonises the infosphere.  None of these shifts removes politics from the centre – precisely why the trinity still maps the ground.

There are three key areas where Clausewitz’s wisdom holds strongly:

  1. Politics first.  Colin Gray insists that strategy is the orchestration of means to political ends; replacing artillery with algorithms does not move that lodestar (Gray 1999).
  2. Escalation logic.  Even in cyberspace, deterrence depends on adversaries reading tacit red lines.  Clausewitz’s emphasis on uncertainty and friction remains apt.
  3. Human cost.  Cyber operations hurt indirectly – frozen hospital wards, confused electorates – but the harm is felt by bodies in time and space, not by circuits.

There are, however, a number of places where the strain shows, namely where:

  • Systemic cyber harm approaches “force” while sidestepping bodily violence.
  • Persistent, below-threshold campaigning blurs the war–peace boundary Clausewitz assumed.
  • The trinity was never meant to classify acts aimed at belief rather than battalions.

For now, Rid’s scepticism still holds: most cyber operations do not meet Clausewitz’s threshold of war.  Yet as societies entangle their critical functions ever more tightly with code, the line between systemic disruption and physical violence narrows.  Clausewitz’s trinity of violence, chance and political purpose still offers the clearest compass, because politics, not technology, remains the centre of gravity of strategy.  The compass, however, is being asked to steer across novel terrain.  Should a future campaign achieve political aims through cyber-enabled systemic coercion alone, the Prussian might finally need more than a tune-up.  Until then, his core logic endures: it needs adaptation, but it has not been eclipsed.

Bibliography

Clausewitz, C. v. (1832) On War.  Berlin: Ferdinand Dümmler.

Echevarria, A. J. (2002) ‘Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine – Again!’.  Carlisle, PA: US Army Strategic Studies Institute. 

Fischerkeller, M. P. and Harknett, R. J. (2019) ‘Persistent Engagement, Agreed Competition, and Cyberspace Interaction Dynamics’.  The Cyber Defense Review.

Gray, C. S. (1999) Modern Strategy.  Oxford: Oxford University Press. 

Greenberg, A. (2023) ‘Ukraine Suffered More Wiper Malware in 2022 Than Anywhere, Ever’. WIRED, 22 February. 

Lynch, T. F. III (2024) ‘Forward Persistence in Great Power Cyber Competition’.  Washington, DC: National Defense University. 

NATO Innovation Hub (2023) The Cognitive Warfare Concept.  Norfolk, VA: NATO ACT. 

Rid, T. (2017) Cyber War Will Not Take Place.  Oxford: Oxford University Press. 

Schmitt, M. N. (ed.) (2017) Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations.  Cambridge: Cambridge University Press. 

SentinelLabs (2022) ‘AcidRain: A Modem Wiper Rains Down on Europe’.  SentinelOne Labs Blog, 31 March. 

US Cyber Command (2022) ‘CYBER 101 – Defend Forward and Persistent Engagement’.  Press release, 25 October. 

US Department of Defense (2022) National Defense Strategy of the United States of America.  Washington, DC. 
