Since its founding in 2005, Ontonix has focused its QCM (Quantitative Complexity Management) technology on protection, problem prevention, ‘robustification’, early warnings and the identification of crisis precursors across diverse fields and industries (manufacturing, engineering design, economics, finance, traffic control, medicine, etc.).
The present blog post introduces a new and rather exotic application of QCM technology: the use of complexity as a systemic offensive tool. The goal here is not to prevent crises or systemic collapses, which is the mission of Ontonix; the objective is to cause them.
Complexity is a systemic characteristic of networks and processes. Since 2005 it has been possible to actually measure it: complexity is expressed in cbits (complexity bits) and quantifies the amount of structured information ‘contained’ within a system. Every system (network, process) possesses, at a given time, a certain amount of complexity as well as a so-called critical complexity. In the proximity of its critical complexity, the dynamics of a system tends to be dominated by uncertainty, becoming chaotic and difficult to control. This reduces its structural stability, rendering it less resilient and hence more vulnerable. Systemic collapses happen in the presence of high fragility and a high density of interconnections, i.e. in the proximity of critical complexity. Well-managed systems and processes function at a safe distance from their respective critical complexities. However, this requires that one be able to measure both complexity and critical complexity, which is the business of Ontonix. Managing very large systems and networks without explicit knowledge and monitoring of complexity and critical complexity is risky, to say the least.
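To give a flavour of what ‘structured information’ means, here is a deliberately simplified sketch in Python. It is not the QCM metric: the actual cbit measure is proprietary and is not disclosed here. The sketch merely estimates how strongly the monitored channels of a system are coupled, using plain pairwise mutual information; the function names, bin counts and synthetic data are illustrative assumptions only.

```python
# Toy proxy for 'structured information' in a multivariate data stream.
# NOT the QCM/cbit metric; it only shows that coupling between a system's
# observable channels is something one can estimate from data.
import numpy as np

def pairwise_mutual_information(x, y, bins=8):
    """Estimate mutual information (in bits) between two observed channels."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

def toy_complexity(data, bins=8):
    """Sum of pairwise mutual information over all channel pairs.

    'data' is an (n_samples, n_channels) array of monitored variables.
    Higher values mean more structured coupling between channels.
    """
    n_channels = data.shape[1]
    total = 0.0
    for i in range(n_channels):
        for j in range(i + 1, n_channels):
            total += pairwise_mutual_information(data[:, i], data[:, j], bins)
    return total

# Example: coupled channels score higher than independent noise.
rng = np.random.default_rng(0)
noise = rng.normal(size=(2000, 5))
coupled = noise.copy()
coupled[:, 1] = 0.9 * coupled[:, 0] + 0.1 * coupled[:, 1]   # inject structure
print("independent channels:", round(toy_complexity(noise), 2))
print("coupled channels    :", round(toy_complexity(coupled), 2))
```

The point is simply that coupling between channels is measurable; the real metric, and the notion of critical complexity, go well beyond this toy proxy.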
The objective of ‘complexity as a weapon’ is to reduce or neutralize the overall resilience of an adversary by deliberately introducing harmful, targeted and structured information (complexity) into its networks, so as to induce fragilities and structural instabilities leading, potentially, to systemic or even catastrophic collapse.
The goal is to ‘inject’ complexity into an adversary’s computers and networks in a surgical manner, damaging or debilitating systems such as PLC, DCS and SCADA installations, and in particular the hubs of those systems, which can quickly propagate the effects of an attack on a large scale. The aim is to raise network/process complexity to levels in the vicinity of critical complexity, so as to induce fragility, vulnerability and cascading failures. In essence, we are looking at a targeted alteration of specific, sensitive network functions.
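Purely as an illustration of what ‘identifying the hubs’ of a control network means in graph terms, the fragment below ranks the nodes of a small, invented plant topology by degree and betweenness centrality using the open-source networkx library. The topology and node names are hypothetical, and this is textbook network analysis, not our QCM tooling.

```python
# Illustrative only: ranking the 'hubs' of a hypothetical control network
# with standard centrality measures. Topology and node names are invented.
import networkx as nx

# Hypothetical plant network: a SCADA server bridging several PLC clusters.
edges = [
    ("scada", "historian"), ("scada", "hmi"),
    ("scada", "plc1"), ("scada", "plc2"), ("scada", "plc3"),
    ("plc1", "pump_a"), ("plc1", "pump_b"),
    ("plc2", "valve_a"), ("plc2", "valve_b"),
    ("plc3", "sensor_a"), ("plc3", "sensor_b"),
]
g = nx.Graph(edges)

degree = dict(g.degree())
betweenness = nx.betweenness_centrality(g)

# Nodes that are highly connected and lie on many shortest paths are the
# hubs through which a perturbation would propagate most widely.
hubs = sorted(g.nodes, key=lambda n: (betweenness[n], degree[n]), reverse=True)
for node in hubs[:3]:
    print(node, "degree:", degree[node], "betweenness:", round(betweenness[node], 3))
```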
Inducing near-critical complexity levels in strategic networks can be an effective preemptive measure, softening an adversary’s critical infrastructures and networks prior to a more conventional attack (cyber or otherwise).
Complexity-based aggression, when implemented on a large scale (i.e. when targeted at large networks or interconnected systems of networks), can offer a ‘subtle’, low-intensity and low-visibility intervention by virtue of its highly distributed nature. In other words, instead of a highly concentrated attack, a more diluted action may prove difficult to trace and counter while, at the same time, leading to devastating systemic consequences.
The technical details of ‘complexity as a weapon’ will not be explained in this post, for obvious reasons. However, the rationale is based, in part, on certain observations one can make when studying very large-scale, highly complex systems, such as the following:
- The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
- The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
- A complex system can fail in a very large number of ways. Higher complexity means a system possesses more failure modes.
- The larger the system, the greater the probability of unexpected failure (see the back-of-the-envelope sketch after this list).
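A back-of-the-envelope illustration of the last two observations, with invented numbers: if a system has many largely independent failure modes, each with only a small chance of firing in a given period, the probability that at least one of them fires grows rapidly with the number of modes.

```python
# Assumed per-mode failure probability per period (invented number).
p = 0.001
for n_modes in (10, 100, 1000, 10000):
    p_any_failure = 1 - (1 - p) ** n_modes   # chance that at least one mode fires
    print(f"{n_modes:>6} failure modes -> P(at least one fires) = {p_any_failure:.3f}")

# Output:
#     10 failure modes -> P(at least one fires) = 0.010
#    100 failure modes -> P(at least one fires) = 0.095
#   1000 failure modes -> P(at least one fires) = 0.632
#  10000 failure modes -> P(at least one fires) = 1.000
```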
Our Quantitative Complexity Theory has allowed us to verify the above statements on an empirical and numerical basis (science, not opinions, has always been our motto).
When it comes to complex systems (by the way, before you claim something really is complex, you should actually measure its complexity), failure isn’t always something obvious and may even be difficult to design deliberately. In fact, there are many ways, or modes, in which such systems can fail (or be made to fail). In reality, failure is a combination (superposition) of various failure modes. Some of these modes can be quite likely, some require a high expenditure of energy to trigger, and some can be triggered with little effort but require an unlikely set of circumstances.
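As a toy illustration of this ‘superposition’ of failure modes (all numbers invented), one can attach to each hypothetical mode a likelihood, a trigger effort and an impact; which modes matter most depends on whether one is defending the system or attacking it.

```python
# Toy model of a 'superposition' of failure modes, with made-up numbers.
# Each mode has a natural likelihood, a cost (effort) to trigger it
# deliberately, and an impact if it fires.
failure_modes = [
    # (name,                 P(occurs), trigger_effort, impact)
    ("sensor drift",          0.20,      1.0,            0.1),
    ("controller overload",   0.05,      3.0,            0.6),
    ("hub saturation",        0.01,      2.0,            0.9),
]

def attacker_priority(mode):
    """Damage obtained per unit of triggering effort."""
    _, _, effort, impact = mode
    return impact / effort

for name, p, effort, impact in sorted(failure_modes, key=attacker_priority, reverse=True):
    print(f"{name:<20} P={p:.2f} effort={effort} impact={impact}")
```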
This means that it may be possible to provoke the collapse of large networks and systems by first identifying their failure modes and then, within each mode, pinpointing the key variables (nodes) that can set off a cascading failure. Once these nodes have been identified, that is where the attack should be concentrated. The way this is accomplished is not intuitive: it is not sufficient to ‘invert’ the conventional QCM-based process of system ‘robustification’ in order to arrive at the complexity-as-a-weapon logic that induces systemic fragility. What is certainly needed, though, is plenty of supercomputer firepower.
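We will not show how the key nodes are actually found with QCM. The generic, textbook version of the argument, however, is easy to illustrate: on a scale-free network, removing a small fraction of well-chosen nodes fragments the system far more than removing the same number of nodes at random. The sketch below (using networkx, with arbitrary parameters) reproduces only that classic ‘error versus attack tolerance’ experiment, not our method.

```python
# Generic illustration (not the QCM procedure): targeted removal of central
# nodes fragments a scale-free network far more than random removal.
import random
import networkx as nx

def largest_component_fraction(g):
    """Fraction of remaining nodes that sit in the largest connected component."""
    if g.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(g), key=len)) / g.number_of_nodes()

def remove_and_measure(g, nodes_to_remove):
    h = g.copy()
    h.remove_nodes_from(nodes_to_remove)
    return largest_component_fraction(h)

random.seed(1)
g = nx.barabasi_albert_graph(n=1000, m=2, seed=1)
k = 50  # remove 5% of the nodes in both scenarios

random_nodes = random.sample(list(g.nodes), k)
centrality = nx.betweenness_centrality(g)
targeted_nodes = sorted(g.nodes, key=centrality.get, reverse=True)[:k]

print("random removal  :", round(remove_and_measure(g, random_nodes), 2))
print("targeted removal:", round(remove_and_measure(g, targeted_nodes), 2))
# On scale-free topologies, targeted removal of the most central nodes
# typically shrinks the giant component far more than random removal.
```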
Who would be the target of a large-scale, systemic ‘complexity attack’? Rogue states that threaten global peace and support terrorism.