
The Missing Adversary: What the UK’s Cyber Security Strategy Leaves Out

The UK Government Cyber Security Strategy 2022–2030 is 84 pages long. It mentions offensive cyber capability once, in a subordinate clause, in a focus box, on page 49. The remaining 83 pages describe risk management, asset discovery, vulnerability reporting, supply chain assurance, incident response, workforce development, and the adoption of the Cyber Assessment Framework across the public sector. It is thorough, competent, and well-structured. It is also not a strategy.

This is not a failure of drafting. The document cannot contain the thing that would make it genuinely strategic, because that thing — the reciprocal logic of state cyber competition, including the UK’s own offensive posture and the adversary intent it provokes — is precisely what politics dictates a public document must leave out. If you are a practitioner building your cyber defences from this document, that structural gap has consequences you need to understand.

Strategy requires an adversary

A strategy, in any serious usage, is a theory of how applying your means produces a desired effect on an adversary pursuing their own ends. It demands that you understand not just what you are defending, but who is attacking, what they want, and why your organisation matters to them. The UK’s cyber security strategy does none of this. Its five objectives (manage risk, protect, detect, respond, develop skills) are the components of a security operations programme. While they do describe how to administer resilience, they do not link it to the strategic intentions of the actors generating the threat.

The document’s aim is that all government organisations should be resilient to known vulnerabilities and attack methods by 2030. That is a maturity target, and a reasonable one. But a maturity target is to strategy what vehicle maintenance is to a campaign plan. It ensures your equipment works. It does not tell you where to concentrate your forces or why.

In practice, if you are a CISO implementing the Cyber Assessment Framework, you are sizing your defences against a threat profile. But the threat profile describes adversary capabilities and methods, not adversary objectives. It tells you what is being done to you. It cannot tell you why, because answering that question honestly would require the UK government to describe a two-sided strategic contest in which it is an active participant, not a passive recipient. And that is not politically possible.

The adversaries have strategies. You don’t.

The actors referenced obliquely throughout the document — never named, often gestured at — do operate with genuine strategic intent, even if their cyber operations vary enormously in sophistication and purpose. North Korea’s cyber theft programme serves regime survival by generating hard currency that sanctions deny through conventional channels. China’s operations span industrial espionage, domestic surveillance, and the suppression of external dissent, each serving distinct but related political objectives. Russia employs cyber operations as one component of a broader approach to degrading institutional coherence in adversary states. These are necessarily simplified descriptions of complex state behaviours, but they share a common feature: in each case, cyber operations are means directed toward stable, rational and identifiable political ends.

The UK’s published strategy, by contrast, presents the threat environment as something to be endured and managed rather than understood as a competitive interaction. The result is a document that describes the defensive half of a strategic relationship as though it were the whole picture.

Your threat model is incomplete, and the framework cannot fix it

The Cyber Assessment Framework (CAF) provides tiered profiles matched to threat levels, and organisations assess against the profile that corresponds to their function’s criticality. This is sound in principle. In practice, however, it produces a uniform defensive posture calibrated to adversary capability rather than adversary intent. And the difference between being scanned by an automated botnet and being targeted by a state actor pursuing a specific intelligence requirement is not a difference of degree. It is a difference of kind. The botnet will move on when it encounters adequate defences. The state actor will adapt, persist, and find another way in, because the objective driving the operation has not changed. Achieving your CAF profile outcomes addresses the first situation. It does not reliably address the second, because the second is shaped by a strategic logic your threat profile does not and cannot capture.

The question you need to answer, which the strategy will not answer for you, is: what does my adversary want from me specifically? Not what tools they use. Not what vulnerabilities they exploit. What political or economic objective my organisation’s compromise would serve.

Identify your centre of gravity

This is ultimately a question about your own centre of gravity — the asset, function, or data set whose compromise would cause disproportionate harm and which a strategically motivated adversary would therefore prioritise.

If you are a government department managing classified defence procurement, your centre of gravity is not your email server. It is the data that reveals UK capability trajectories to a competitor. If you manage health infrastructure, your centre of gravity shifts depending on context: during a crisis, it is service continuity; in normal operations, it is patient data at population scale. If you regulate energy, it is the dependency mapping that would allow an adversary to identify cascading failure points across the national grid.

Once you identify what a strategically motivated adversary would actually want from you, your defensive posture should concentrate around that. Not distribute itself uniformly to meet a standardised profile. Compliance with the CAF is necessary. But the gap between compliance and genuine strategic defence is precisely the space the document’s unwritten section would have occupied — the section that explains who is coming for you, what they want, and what that means for where you put your resources.
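To make the contrast concrete, here is a deliberately toy sketch of the arithmetic involved. Everything in it is invented for illustration (the asset names, the weights, the budget figure); it is not drawn from the CAF or any real department. It only shows how a fixed defensive budget looks when spread uniformly versus concentrated around a presumed centre of gravity:

```python
# Toy illustration only: asset names, weights, and the budget are hypothetical.

assets = ["email", "hr_records", "procurement_data", "public_website"]
budget = 100.0

# Uniform posture: every asset treated as equally critical,
# as a standardised profile implicitly encourages.
uniform = {a: budget / len(assets) for a in assets}

# CoG-weighted posture: concentration around the asset a strategically
# motivated adversary would actually prioritise (here, procurement data).
cog_weights = {"email": 1, "hr_records": 1, "procurement_data": 5, "public_website": 1}
total_weight = sum(cog_weights.values())
concentrated = {a: budget * cog_weights[a] / total_weight for a in assets}

print(uniform["procurement_data"])       # 25.0
print(concentrated["procurement_data"])  # 62.5
```

The numbers are trivial; the point is the shape of the distribution. A compliance-driven posture divides attention evenly, while an intent-driven posture deliberately accepts thinner coverage elsewhere to harden the one thing the adversary came for.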

Reading the silence

The Government Cyber Security Strategy is not a bad document produced by people who don’t understand strategy. It is a public document produced under conditions that make honest strategic disclosure impossible. You cannot publish a candid account of reciprocal cyber competition without exposing capabilities, revealing intelligence sources, and acknowledging that the UK’s own operations shape the threat environment its departments face. The genre of ‘published national cyber strategy’ is therefore structurally evasive: it must present administrable resilience in place of the adversarial logic that would make it genuinely strategic, because that logic is classified, diplomatically sensitive, or both.

For practitioners, the implication is not that you should disregard the strategy. Implement the CAF. Build shared capabilities. Invest in your workforce and your detection capacity. But do not mistake the document for a complete account of the strategic environment you operate in. It is the publicly sayable portion of a larger contest whose most important dynamics, the ones that determine why you are being targeted and what your adversary considers worth taking, cannot appear in a document with an ISBN.

The unwritten section is the one that matters most. Plan as though it exists.


From Theory to the Trenches: Introducing “Cyber Centres of Gravity”

The character of warfare is in constant flux. Clausewitz’s timeless insight that war is a “contest of wills” remains central, yet the means by which this contest is waged are transforming. Traditionally, Centres of Gravity (CoGs) were often seen as physical entities: armies, capitals, industrial capacity. The thinking was that neutralising these would cripple an adversary’s warfighting ability. However, it’s crucial to recognise, as scholars like Echevarria highlight, that Clausewitz himself acknowledged non-material CoGs, such as national will. The concept isn’t entirely new, but modern interpretations significantly expand upon it, especially in the context of cyberspace.

Today, the pervasive nature of information networks prompts us to consider what this means for strategic targeting. What happens when the critical vulnerabilities lie not just in the physical domain, but in an enemy’s belief systems, the legitimacy of their leadership, or their very grasp of shared reality? This is where exploring an emerging concept – what this article terms “Cyber Centres of Gravity” (Cyber CoGs) – becomes vital for contemporary military strategists. While “Cyber CoG” as a distinct term is still evolving and not yet firmly established in formal doctrine (which tends to use adjacent terms like cognitive targets or information influence objectives, as noted by analysts like Pawlak), its exploration helps us grapple with these new strategic challenges. Ignoring these intangible, yet increasingly critical, aspects in our information-saturated world could represent a significant strategic blind spot.

Understanding “Cyber CoGs”

So, what might a “Cyber CoG” entail? It can be conceptualised as a critical source of an adversary’s moral or political cohesion, their collective resolve, or a foundational element of their operative reality-construct that underpins their ability or will to resist your strategic objectives. The key idea is that significant degradation of such a “Cyber CoG,” predominantly through cyber-enabled means, could fundamentally unravel an enemy’s capacity or desire to continue a conflict, perhaps by altering their perception of the strategic landscape.

This isn’t merely about disrupting networks or servers, though such actions might play a role. A true “Cyber CoG,” in this conceptualisation, is intrinsically linked to these deeper wellsprings of an enemy’s will, cohesion, or their understanding of reality. If an operation doesn’t aim to decisively alter the strategic balance by impacting these moral, political, or epistemic foundations, it’s more likely an operational objective rather than an attack on a strategic “Cyber CoG”.

Clausewitz identified the CoG as “the hub of all power and movement, on which everything depends”. In an age increasingly defined by information, this hub can often be found in the cognitive and informational realms. When societal “passion” can be manipulated through digital narratives, when a military’s operating environment is shaped by perception as much as by physical friction, and when governmental “reason” is threatened by the decay of a shared factual basis, cyberspace becomes an increasingly central domain in shaping strategic outcomes. While kinetic, economic, and geopolitical power still hold immense, often primary, sway in high-stakes confrontations (a point Gartzke’s work on the “Myth of Cyberwar” reminds us to consider), the cyber domain offers potent avenues to contest the very “reality” upon which an adversary’s will is constructed. Here, strategic success may rely less on physical destruction and more on the ability to influence or disrupt an adversary’s cognitive and narrative environments.

Identifying Potential “Cyber CoGs”: A Framework for Analysis

Pinpointing these potential “Cyber CoGs” requires a nuanced analytical approach, considering factors such as:

  1. Strategic Relevance: Does the potential target truly sustain the enemy’s will to fight or their core strategic calculus? This involves looking at national cohesion, public legitimacy, dominant narratives, key alliances, or shared assumptions underpinning their strategy. Its degradation should aim to undermine their strategic purpose or resolve.
  2. Cyber Primacy in Effect: Can cyber-enabled operations offer a uniquely effective, or significantly complementary, method for impacting this CoG, especially when compared or combined with kinetic, economic, or diplomatic levers? Some intangible CoGs may be less susceptible to physical attack but highly vulnerable to informational strategies.
  3. Potential for Decisive Influence: Is the intended effect of targeting the “Cyber CoG” likely to be decisive, whether through an irreversible loss of trust (e.g., in institutions or information), a critical breakdown in a foundational narrative, or a fundamental, lasting shift in the adversary’s perception of their strategic environment? It could also be a cumulative effect, eroding coherence and resolve over time.
  4. Linkage to Moral and Political Dimensions (Clausewitzian Character): Is the “Cyber CoG” intrinsically tied to the enemy’s unity, cohesion, will to resist, or the shared narratives defining their interests and threats? It’s not just a system or infrastructure but is linked to the collective spirit or governing principles.
  5. Strategic Viability and Responsibility: Can the proposed operation be conducted with a rigorous assessment of attribution risks, potential for unintended escalation, and broader second-order societal effects? This includes careful consideration of evolving international norms and legal frameworks.
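As a purely illustrative exercise (this rubric appears in no doctrine, and every candidate name and score below is invented), the five criteria above can be read as a non-compensatory screen: a candidate “Cyber CoG” must clear a minimum bar on every criterion, because a high score on strategic relevance cannot offset a failure on, say, the moral-political linkage or the responsibility test:

```python
from dataclasses import dataclass

# The five criteria from the framework above, used as a rough screening
# rubric. Scores are hypothetical analyst judgements on a 0-5 scale.
CRITERIA = [
    "strategic_relevance",
    "cyber_primacy",
    "decisive_influence",
    "moral_political_linkage",
    "viability_responsibility",
]

@dataclass
class Candidate:
    name: str
    scores: dict  # criterion -> 0-5 judgement

    def screen(self, threshold: int = 3) -> bool:
        # Non-compensatory: the candidate must clear the bar on EVERY
        # criterion to warrant deeper analysis as a potential Cyber CoG.
        return all(self.scores.get(c, 0) >= threshold for c in CRITERIA)

candidates = [
    Candidate("foundational narrative X", {
        "strategic_relevance": 5, "cyber_primacy": 4,
        "decisive_influence": 4, "moral_political_linkage": 5,
        "viability_responsibility": 3,
    }),
    # A network target with no link to will or cohesion: an operational
    # objective, not a strategic Cyber CoG (criterion 4 fails).
    Candidate("logistics network Y", {
        "strategic_relevance": 4, "cyber_primacy": 5,
        "decisive_influence": 3, "moral_political_linkage": 1,
        "viability_responsibility": 4,
    }),
]

shortlist = [c.name for c in candidates if c.screen()]
print(shortlist)  # ['foundational narrative X']
```

The use of `all()` rather than a weighted sum is the design point: it encodes the argument made under criterion 4 that a system or infrastructure with no linkage to collective will or cohesion is an operational target, however attractive its other scores.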

Implications for Military Planners

Strategically engaging potential “Cyber CoGs” would necessitate evolving current approaches:

  • Integrated Intelligence: Identifying and understanding these “Cyber CoGs” demands a deep, multidisciplinary intelligence effort, fusing technical insights with profound cultural, political, cognitive, and narrative analysis. This requires collaboration between experts in fields like anthropology, sociology, political science, and data science to map the ‘human terrain’ and ‘narrative architecture’.
  • Dynamic and Adaptive Campaigning: Operations targeting “Cyber CoGs” are unlikely to be single events. Influencing moral cohesion or perceived reality is a complex, interactive process involving continuous adaptation to feedback loops, narrative shifts, and adversary countermeasures. The aim is often cognitive degradation or displacement, subtly altering the adversary’s decision-making calculus over time.
  • Strategic, Not Just Tactical, Focus: While drawing on tools from traditional information warfare or psychological operations, the concept of “Cyber CoGs” pushes for a more strategically ambitious focus on these Clausewitzian centres of power, wherever they may reside. When a CoG itself is located in the moral, political, or epistemic domains, cyber-enabled operations can become a key component of strategic engagement.

Navigating the Ethical and Legal Landscape

The capacity to strategically influence an adversary’s societal beliefs and perceived reality carries a profound ethical burden and operates within a complex legal landscape. Responsible statecraft demands a deliberate moral calculus, especially in the ambiguous “grey zone”. The Tallinn Manual 2.0, for instance, provides detailed interpretations of how international law applies to cyber operations, including complex issues around sovereignty, non-intervention, and due diligence. Operations that aim to alter perception or manipulate societal beliefs can brush up against these established and evolving legal interpretations. Pursuing strategic goals through such means requires careful navigation to avoid widespread societal disruption or unintended consequences that could undermine international order. There is also the risk of “blow-back,” where the methods used externally could erode internal democratic norms if not carefully managed.

Integrating New Concepts into Strategic Thinking

The future of conflict is undeniably intertwined with the contested terrains of perception, belief, and societal cohesion. Exploring concepts like “Cyber Centres of Gravity” can help us theorise and analyse these critical nodes of will, unity, and perceived reality. This endeavour is less about new technologies and more about refining our understanding of strategy itself: to influence an adversary’s will or alter their perceived reality to achieve strategic aims, through means that are proportionate, precise, and adapted to the evolving character of modern conflict.

Failing to adapt our thinking, to build the necessary multidisciplinary approaches, and to foster the institutional agility to operate in this transformed strategic landscape is a risk to our future strategic effectiveness.

Selected Bibliography

  • Brittain-Hale, Angela. “Clausewitzian Theory of War in the Age of Cognitive Warfare.” The Defense Horizon Journal (2023): 1–19.
  • Clausewitz, Carl von. On War. Edited and translated by Michael Howard and Peter Paret. Princeton, NJ: Princeton University Press, 1976.
  • Echevarria, Antulio J. “Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine—Again!” Carlisle, PA: Strategic Studies Institute, 2002.
  • Gartzke, Erik. “The Myth of Cyberwar: Bringing War in Cyberspace Back Down to Earth.” International Security 38, no. 2 (2013): 41–73.
  • Krieg, Andreas. Subversion: The Strategic Weaponization of Narratives. London: Routledge, 2023.
  • Lin, Herbert, and Jackie Kerr. “On Cyber-Enabled Information/Influence Warfare and Manipulation.” Stanford, CA: Center for International Security and Cooperation, Stanford University, 2017.
  • Pawlak, Patryk. “Cognitive Warfare: Between Psychological Operations and Narrative Control.” EUISS Brief, 2022.
  • Schmitt, Michael N., ed. Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations. Cambridge: Cambridge University Press, 2017.
