Beyond the War/Crime Binary and Why International Security Studies Must Reckon with Cyber Operations

When North Korea’s estimated gains from cybercrime exceeded its total legitimate international trade in 2022, something fundamental shifted in the nature of strategic competition.[1] Pyongyang generated $1.7 billion through cyber operations while earning only $1.59 billion from legal exports. This is not supplementary revenue or criminal opportunism; it is a primary mechanism of state finance, funding weapons programmes and regime stability.

Yet this development barely registered in mainstream security studies literature. The reason reveals a deeper problem in how adversaries now compete strategically by deliberately exploiting the boundaries between categories that International Security Studies treats as distinct. Cyber operations succeed precisely because they operate simultaneously as economic extraction, cognitive manipulation, and potential kinetic threat, defying classification within inherited taxonomies of war, crime, espionage, and influence. The field’s failure to recognise this boundary exploitation as a coherent form of strategic competition leaves it unable to analyse how states actually generate and contest power in the digital age.

The Clausewitzian Checkpoint

Thomas Rid performed a valuable service in his 2012 article “Cyber War Will Not Take Place.”[2] He punctured inflated rhetoric around “cyber Pearl Harbors” by demonstrating that cyber operations don’t meet classical Clausewitzian criteria for warfare. They lack intrinsic violence, don’t compel through force, and haven’t fundamentally altered political orders through armed conflict. This deflation of threat inflation was intellectually rigorous and necessary.

But Rid’s framework created an unintended consequence. Many scholars concluded that if cyber operations aren’t war, they aren’t strategically significant. This logic fails when confronted with the evidence. As Lucas Kello argued in his 2013 response, cyber operations are “expanding the range of possible harm and outcomes between the concepts of war and peace” with profound implications for national and international security.[3] The problem isn’t that cyber operations constitute war in the traditional sense. The problem is that adversaries have discovered they can achieve strategic objectives by deliberately operating in the definitional gaps our frameworks create.

Consider what cyber operations actually are: sustained, below-threshold activities that extract resources, manipulate cognition, or degrade systems in ways that cumulatively influence state power and political outcomes without triggering the recognition mechanisms that would mobilise conventional responses. This positive definition reveals why inherited categories so often fail: that cyber operations fall between war and crime is no accident. They are designed to exploit that boundary.

Economic Extraction as Strategic Competition

The classification of state-sponsored cyber operations as “cybercrime” represents a fundamental category error. Widely cited estimates place global cybercrime costs at approximately $9.5 trillion annually for 2024.[4] These figures carry methodological challenges and conflate direct losses with indirect economic effects. Yet even treating them as indicative of order of magnitude rather than precise valuations, the scale remains extraordinary. This exceeds Japan’s entire GDP and dwarfs global military spending.

More importantly, conservative estimates suggest that state-sponsored operations account for 15 to 20 percent of total cybercrime costs, representing perhaps $1.5 to $2 trillion in annual state-directed activity.[5] This is roughly double the entire US intelligence budget and comparable to China’s official military spending. These figures indicate systematic extraction at a scale fundamentally incompatible with treating such activity as mere crime.

The real insight concerns returns on investment. Realists pay attention to power and cost ratios. A state gaining billion-dollar effects from million-dollar inputs should trigger theoretical reassessment. North Korea reportedly spends tens of millions to gain billions, producing returns that conventional military operations cannot match. Russia tolerates or directly controls ransomware ecosystems that generate massive revenue while maintaining plausible deniability. China conducts industrial-scale intellectual property theft that allows it to leapfrog decades of research and development investment.
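The return-on-investment argument above can be made concrete with rough arithmetic. The sketch below uses the figures cited in this post; the $50 million cost figure is an illustrative assumption standing in for "tens of millions", not a sourced number:

```python
# Back-of-envelope return-on-investment comparison for state cyber extraction,
# using the public estimates cited in the text. These are rough orders of
# magnitude, not precise accounting.

def roi_multiple(gain: float, cost: float) -> float:
    """Gain-to-cost multiple for an operation or programme."""
    return gain / cost

# North Korea: reported ~$1.7bn in annual cyber gains against a programme
# cost in the "tens of millions" (here $50m, an assumed mid-range figure).
nk_multiple = roi_multiple(gain=1.7e9, cost=50e6)

# State-sponsored share of global cybercrime costs: 15-20% of ~$9.5tn.
state_share_low = 0.15 * 9.5e12
state_share_high = 0.20 * 9.5e12

print(f"DPRK return multiple: ~{nk_multiple:.0f}x")
print(f"State-directed extraction: "
      f"${state_share_low / 1e12:.2f}-{state_share_high / 1e12:.2f} trillion/year")
```

Even with the cost assumption varied by an order of magnitude, the multiple remains far beyond what conventional military spending returns, which is the point the realist framing turns on.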

This isn’t “crime” in any meaningful sense and yet neither is it war. It belongs to a family of coercive economic statecraft that includes sanctions, financial leverage, and strategic resource control. The field has conceptual machinery for thinking about these tools. What’s missing is recognition that cyber extraction sits within this lineage. When states systematically extract resources from other states without triggering traditional conflict responses, they’re engaging in economic warfare conducted through digital methodologies. The boundary between legitimate economic competition, sanctions, illicit finance, and cyber theft has been deliberately blurred by adversaries who understand that this ambiguity protects them from response.

The 2017 NotPetya attack illustrates this boundary exploitation perfectly. Widely attributed to Russian military intelligence, it caused an estimated $10 billion in global damage, disrupted Maersk’s worldwide shipping operations, paralysed pharmaceutical manufacturing, and crippled critical infrastructure across Ukraine.[6] Was this economic warfare? Sabotage? Organised crime? The answer is that it was designed to be all of these simultaneously, making classification impossible and response difficult. Because it operated through exploitation rather than traditional violence, it failed to generate sustained policy response despite extraordinary economic dislocation.

Cognitive Operations and the Active Measures Tradition

The cognitive dimension represents boundary exploitation of a different order. What Andreas Krieg terms “subversion”, or the strategic weaponization of narratives to exploit socio-psychological vulnerabilities and erode sociopolitical consensus, isn’t unprecedented.[7] It builds directly on the Soviet tradition of active measures during the Cold War. The KGB’s disinformation campaigns, front organisations, and cultivation of influence networks were designed to achieve precisely what contemporary information operations attempt: reflexive control whereby adversaries voluntarily make decisions predetermined by the subverting party.

What has changed is scale and effectiveness. The contemporary information environment’s infrastructural vulnerabilities make such operations far more potent than their Cold War predecessors. Russian information operations around the 2016 US election, Brexit-related influence campaigns, and subsequent efforts across European democracies represent sustained attempts to manipulate democratic processes. Whether individually decisive or not, their cumulative effect has been to inject pervasive uncertainty into democratic deliberation.

Krieg’s framework is particularly useful because it recognises that effective subversion requires high levels of both orchestration and mobilisation.[8] Successful operations coordinate across multiple domains: cultivating academic surrogates who provide legitimacy, deploying media networks that amplify messages, and ultimately mobilising real-world activism or policy changes. This isn’t merely spreading disinformation online. It’s cognitive warfare that operates simultaneously in the informational space, the expert-policy nexus, and physical civil society mobilisation.

Here again we see deliberate boundary exploitation. Is this espionage? Influence operations? Public diplomacy? Domestic political activism? The answer is that it’s designed to be all of these, making it impossible to counter with tools designed for any single category. The operations succeed not despite their categorical ambiguity but because of it. International Security Studies has failed not because cyber-enabled cognitive operations are unprecedented, but because the field hasn’t updated its frameworks for thinking about covert influence in an age where scale, speed, and micro-targeting have fundamentally changed the strategic payoff.

Threshold Manipulation as Structural Feature

The attribution problem receives significant attention in cyber security literature, as do the challenges of compressed decision timelines. These are real difficulties. But the more profound challenge is threshold manipulation itself. Adversaries deliberately operate below levels that would trigger Article 5 commitments or justify conventional military responses. Each individual operation seems manageable, falling into the category of crime, espionage, or influence activity rather than warfare. Yet the cumulative strategic effect potentially exceeds many conventional conflicts.

This isn’t an implementation problem. It’s a structural feature of how cyber competition operates. States are competing precisely by exploiting the cognitive thresholds that anchor our definitions of conflict. Russia can conduct NotPetya through proxies with sufficient ambiguity that attribution takes months and response remains constrained. China can exfiltrate terabytes of intellectual property through operations that blend state intelligence services, military contractors, and “patriotic” hackers. North Korea can steal billions through operations routed through multiple jurisdictions with cryptocurrency payouts. Each operation maintains just enough deniability to complicate response, just enough restraint to avoid crossing recognised warfare thresholds.

This threshold manipulation undermines every inherited category of International Security Studies. It’s not that cyber operations are hard to classify. It’s that they’re designed to exploit the classification system itself. The boundaries between war and peace, state and non-state action, economic and military competition, legitimate influence and hostile subversion aren’t natural categories adversaries accidentally fall between. They’re strategic terrain adversaries deliberately occupy because they understand these boundaries constrain Western responses.

Recognising What We’re Not Seeing

The policy stakes extend beyond academic taxonomy. We’re not experiencing Pearl Harbor-equivalent events annually in some literal sense. We’re experiencing something potentially more destabilising: systematic strategic competition that bypasses the recognition mechanisms that would trigger mobilisation. Our threat-recognition systems are calibrated to identify territorial aggression, military build-ups, and kinetic attacks. They’re not calibrated to identify systematic resource extraction masquerading as crime, cognitive operations masquerading as activism, or infrastructure compromise masquerading as espionage.

The result is systematic under-response to strategic challenges of the first order. We’re not mobilising resources, not building institutional capacity, not developing deterrent strategies, not coordinating international responses, because these events don’t trip the conceptual wires that would classify them as national security emergencies. Meanwhile, adversaries conduct what amounts to unrestricted economic and cognitive warfare whilst maintaining that they’re merely engaged in “law enforcement matters,” “intelligence operations,” or “civil society activism.”

We’ve constrained ourselves through definitional conservatism. We’ve accepted that if it’s not “war,” it doesn’t merit war-level responses, while adversaries operate under no such constraints. They’ve discovered they can achieve strategic objectives through boundary exploitation, and our theoretical frameworks leave us unable to recognise, analyse, or respond effectively.

Toward Frameworks Adequate to Strategic Reality

What would it mean for International Security Studies to take cyber operations seriously? It requires recognising that the boundaries between economic, cognitive, and kinetic competition have been deliberately collapsed by adversaries who understand that this collapse protects them from response.

This means developing frameworks that can analyse strategic competition operating across these domains simultaneously. It means recognising that when North Korea funds its regime primarily through cyber theft, when Russia conducts operations that are simultaneously economic warfare and intelligence operations, and when China’s influence campaigns blend state direction with autonomous proxy action, these are not anomalies. They’re exemplars of how strategic competition now operates.

It means accepting that “security” in conditions of radical technological change looks sufficiently different that inherited categories have limited explanatory power. The field needs concepts and metrics for assessing threats that are diffuse, cumulative, and often only apparent in retrospect. It needs frameworks that can recognise systematic resource extraction, cognitive manipulation, and infrastructure compromise as coherent forms of strategic competition rather than unrelated criminal, intelligence, and influence activities.

Most fundamentally, it means understanding that the definitional boundaries the field has inherited are not neutral analytical categories. They’re strategic terrain that adversaries compete by exploiting. Until International Security Studies recognises boundary exploitation itself as the central feature of contemporary cyber competition, the field will remain unable to provide the analytical guidance policymakers urgently need.


Bibliography

Buchanan, Ben. The Hacker and the State: Cyber Attacks and the New Normal of Geopolitics. Cambridge, MA: Harvard University Press, 2020.

“Cost of Cybercrime.” Cybersecurity Ventures, 2024. https://cybersecurityventures.com/cybercrime-damages-6-trillion-by-2021/.

Greenberg, Andy. “The Untold Story of NotPetya, the Most Devastating Cyberattack in History.” Wired, August 22, 2018. https://www.wired.com/story/notpetya-cyberattack-ukraine-russia-code-crashed-the-world/.

Kello, Lucas. “The Meaning of the Cyber Revolution: Perils to Theory and Statecraft.” International Security 38, no. 2 (Fall 2013): 7-40.

Krieg, Andreas. Subversion: The Strategic Weaponization of Narratives. Washington, DC: Georgetown University Press, 2023.

Rid, Thomas. “Cyber War Will Not Take Place.” Journal of Strategic Studies 35, no. 1 (February 2012): 5-32.


Notes

[1] Ben Buchanan, The Hacker and the State: Cyber Attacks and the New Normal of Geopolitics (Cambridge, MA: Harvard University Press, 2020), 47-48.

[2] Thomas Rid, “Cyber War Will Not Take Place,” Journal of Strategic Studies 35, no. 1 (February 2012): 5-32.

[3] Lucas Kello, “The Meaning of the Cyber Revolution: Perils to Theory and Statecraft,” International Security 38, no. 2 (Fall 2013): 8.

[4] “Cost of Cybercrime,” Cybersecurity Ventures, 2024, https://cybersecurityventures.com/cybercrime-damages-6-trillion-by-2021/. These figures aggregate direct theft, business disruption, and opportunity costs, and should be treated as indicative of order of magnitude rather than precise valuations.

[5] Kello, “Meaning of the Cyber Revolution,” 24-26.

[6] Andy Greenberg, “The Untold Story of NotPetya, the Most Devastating Cyberattack in History,” Wired, August 22, 2018, https://www.wired.com/story/notpetya-cyberattack-ukraine-russia-code-crashed-the-world/.

[7] Andreas Krieg, Subversion: The Strategic Weaponization of Narratives (Washington, DC: Georgetown University Press, 2023), 89.

[8] Krieg, Subversion, 101-103.

From Theory to the Trenches: Introducing “Cyber Centres of Gravity”

The nature of warfare is in constant flux. Clausewitz’s timeless insight that war is a “contest of wills” remains central, yet the means by which this contest is waged are transforming. Traditionally, Centres of Gravity (CoGs) were often seen as physical entities: armies, capitals, industrial capacity. The thinking was that neutralising these would cripple an adversary’s warfighting ability. However, it’s crucial to recognise, as scholars like Echevarria highlight, that Clausewitz himself acknowledged non-material CoGs, such as national will. The concept isn’t entirely new, but modern interpretations significantly expand upon it, especially in the context of cyberspace.

Today, the pervasive nature of information networks prompts us to consider what this means for strategic targeting. What happens when the critical vulnerabilities lie not just in the physical domain, but in an enemy’s belief systems, the legitimacy of their leadership, or their very grasp of shared reality? This is where exploring an emerging concept – what this article terms “Cyber Centres of Gravity” (Cyber CoGs) – becomes vital for contemporary military strategists. While “Cyber CoG” as a distinct term is still evolving and not yet firmly established in formal doctrine (which tends to use adjacent terms like cognitive targets or information influence objectives, as noted by analysts like Pawlak), its exploration helps us grapple with these new strategic challenges. Ignoring these intangible, yet increasingly critical, aspects in our information-saturated world could represent a significant strategic blind spot.

Understanding “Cyber CoGs”

So, what might a “Cyber CoG” entail? It can be conceptualised as a critical source of an adversary’s moral or political cohesion, their collective resolve, or a foundational element of their operative reality-construct that underpins their ability or will to resist your strategic objectives. The key idea is that significant degradation of such a “Cyber CoG,” predominantly through cyber-enabled means, could fundamentally unravel an enemy’s capacity or desire to continue a conflict, perhaps by altering their perception of the strategic landscape.

This isn’t merely about disrupting networks or servers, though such actions might play a role. A true “Cyber CoG,” in this conceptualisation, is intrinsically linked to these deeper wellsprings of an enemy’s will, cohesion, or their understanding of reality. If an operation doesn’t aim to decisively alter the strategic balance by impacting these moral, political, or epistemic foundations, it’s more likely an operational objective rather than an attack on a strategic “Cyber CoG”.

Clausewitz identified the CoG as “the hub of all power and movement, on which everything depends”. In an age increasingly defined by information, this hub can often be found in the cognitive and informational realms. When societal “passion” can be manipulated through digital narratives, when a military’s operating environment is shaped by perception as much as by physical friction, and when governmental “reason” is threatened by the decay of a shared factual basis, cyberspace becomes an increasingly central domain in shaping strategic outcomes. While kinetic, economic, and geopolitical power still hold immense, often primary, sway in high-stakes confrontations (a point Gartzke’s work on the “Myth of Cyberwar” reminds us to consider), the cyber domain offers potent avenues to contest the very “reality” upon which an adversary’s will is constructed. Here, strategic success may rely less on physical destruction and more on the ability to influence or disrupt an adversary’s cognitive and narrative environments.

Identifying Potential “Cyber CoGs”: A Framework for Analysis

Pinpointing these potential “Cyber CoGs” requires a nuanced analytical approach, considering factors such as:

  1. Strategic Relevance: Does the potential target truly sustain the enemy’s will to fight or their core strategic calculus? This involves looking at national cohesion, public legitimacy, dominant narratives, key alliances, or shared assumptions underpinning their strategy. Its degradation should aim to undermine their strategic purpose or resolve.
  2. Cyber Primacy in Effect: Can cyber-enabled operations offer a uniquely effective, or significantly complementary, method for impacting this CoG, especially when compared or combined with kinetic, economic, or diplomatic levers? Some intangible CoGs may be less susceptible to physical attack but highly vulnerable to informational strategies.
  3. Potential for Decisive Influence: Is the intended effect of targeting the “Cyber CoG” likely to be decisive, whether through an irreversible loss of trust (e.g., in institutions or information), a critical breakdown in a foundational narrative, or a fundamental, lasting shift in the adversary’s perception of their strategic environment? It could also be a cumulative effect, eroding coherence and resolve over time.
  4. Linkage to Moral and Political Dimensions (Clausewitzian Character): Is the “Cyber CoG” intrinsically tied to the enemy’s unity, cohesion, will to resist, or the shared narratives defining their interests and threats? It’s not just a system or infrastructure but is linked to the collective spirit or governing principles.
  5. Strategic Viability and Responsibility: Can the proposed operation be conducted with a rigorous assessment of attribution risks, potential for unintended escalation, and broader second-order societal effects? This includes careful consideration of evolving international norms and legal frameworks.
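The five-criterion screen above can be sketched as a structured checklist. The criterion names come from the framework in this post; the 0–3 rating scale, the all-criteria gate, and the example candidate are purely illustrative assumptions, not doctrine:

```python
# Minimal sketch of the five-criterion "Cyber CoG" screen as a checklist.
# Scoring scale and example assessment are illustrative assumptions only.
from dataclasses import dataclass

CRITERIA = [
    "strategic_relevance",
    "cyber_primacy_in_effect",
    "potential_for_decisive_influence",
    "moral_political_linkage",
    "strategic_viability",
]

@dataclass
class CandidateCoG:
    name: str
    scores: dict  # criterion -> analyst rating on an assumed 0-3 scale

    def qualifies(self, threshold: int = 2) -> bool:
        # Gate on EVERY criterion rather than averaging: on one reading of
        # the framework, each factor is a necessary condition, so a single
        # weak criterion screens the candidate out.
        return all(self.scores.get(c, 0) >= threshold for c in CRITERIA)

# Hypothetical candidate, scored by a notional analyst.
candidate = CandidateCoG(
    name="adversary shared-narrative cohesion",
    scores={c: 3 for c in CRITERIA},
)
print(candidate.qualifies())
```

The all-criteria gate (rather than a weighted sum) is a deliberate design choice here: a target that scores highly on decisive influence but fails the viability-and-responsibility test should not qualify.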

Implications for Military Planners

Strategically engaging potential “Cyber CoGs” would necessitate evolving current approaches:

  • Integrated Intelligence: Identifying and understanding these “Cyber CoGs” demands a deep, multidisciplinary intelligence effort, fusing technical insights with profound cultural, political, cognitive, and narrative analysis. This requires collaboration between experts in fields like anthropology, sociology, political science, and data science to map the ‘human terrain’ and ‘narrative architecture’.
  • Dynamic and Adaptive Campaigning: Operations targeting “Cyber CoGs” are unlikely to be single events. Influencing moral cohesion or perceived reality is a complex, interactive process involving continuous adaptation to feedback loops, narrative shifts, and adversary countermeasures. The aim is often cognitive degradation or displacement, subtly altering the adversary’s decision-making calculus over time.
  • Strategic, Not Just Tactical, Focus: While drawing on tools from traditional information warfare or psychological operations, the concept of “Cyber CoGs” pushes for a more strategically ambitious focus on these Clausewitzian centres of power, wherever they may reside. When a CoG itself is located in the moral, political, or epistemic domains, cyber-enabled operations can become a key component of strategic engagement.

Navigating the Ethical and Legal Landscape

The capacity to strategically influence an adversary’s societal beliefs and perceived reality carries a profound ethical burden and operates within a complex legal landscape. Responsible statecraft demands a deliberate moral calculus, especially in the ambiguous “grey zone”. The Tallinn Manual 2.0, for instance, provides detailed interpretations of how international law applies to cyber operations, including complex issues around sovereignty, non-intervention, and due diligence. Operations that aim to alter perception or manipulate societal beliefs can brush up against these established and evolving legal interpretations. Pursuing strategic goals through such means requires careful navigation to avoid widespread societal disruption or unintended consequences that could undermine international order. There is also the risk of “blow-back,” where the methods used externally could erode internal democratic norms if not carefully managed.

Integrating New Concepts into Strategic Thinking

The future of conflict is undeniably intertwined with the contested terrains of perception, belief, and societal cohesion. Exploring concepts like “Cyber Centres of Gravity” can help us theorise and analyse these critical nodes of will, unity, and perceived reality. This endeavour is less about new technologies and more about refining our understanding of strategy itself: how to influence an adversary’s will or alter their perceived reality through means that are proportionate, precise, and adapted to the evolving character of modern conflict.

Failing to adapt our thinking, to build the necessary multidisciplinary approaches, and to foster the institutional agility to operate in this transformed strategic landscape is a risk to our future strategic effectiveness.

Selected Bibliography

  • Brittain-Hale, Angela. “Clausewitzian Theory of War in the Age of Cognitive Warfare.” The Defense Horizon Journal (2023): 1–19.
  • Clausewitz, Carl von. On War. Edited and translated by Michael Howard and Peter Paret. Princeton, NJ: Princeton University Press, 1976.
  • Echevarria, Antulio J. II. “Clausewitz’s Center of Gravity: Changing Our Warfighting Doctrine—Again!” Strategic Studies Institute, 2002.
  • Gartzke, Erik. “The Myth of Cyberwar: Bringing War in Cyberspace Back Down to Earth.” International Security 38, no. 2 (2013): 41–73.
  • Krieg, Andreas. Subversion: The Strategic Weaponization of Narratives. London: Routledge, 2023.
  • Lin, Herbert, and Jackie Kerr. “On Cyber-Enabled Information/Influence Warfare and Manipulation.” Center for International Security and Cooperation, Stanford University, 2017.
  • Pawlak, Patryk. “Cognitive Warfare: Between Psychological Operations and Narrative Control.” EUISS Brief, 2022.
  • Schmitt, Michael N., ed. Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations. Cambridge: Cambridge University Press, 2017.