A fundamental property of the complexity metric developed by Ontonix is that it is bounded – it has a lower and an upper bound. This means it cannot assume infinite values, whether positive or negative. If it could, it would not be a good metric.
All good metrics satisfy the laws of physics
For example, a temperature cannot go to infinity, nor can the density of a material be negative. The same applies to complexity. For every given system it can vary between a lower and an upper bound, both of which are positive. These bounds have a very nice physical meaning. Close to the lower bound, structure dominates – systems are more predictable and easier to fathom. In proximity of the upper bound – also called critical complexity – entropy dominates and structure is weak. It is never a good idea to function close to one's critical complexity.
A system cannot be more complex than its critical complexity
Unless one adds more structure. For example, an office building can accommodate only so many employees unless you add new floors. If you don't add new floors but keep adding employees, the company will suffocate itself to a stop. The same can be said of our Earth. They don't manufacture new land, but the population keeps growing. And so our society is becoming more complex, but it is also getting more fragile. This is the key reason one should stay away from critical complexity:
In proximity of critical complexity systems become very fragile
This means they can break suddenly, in addition to being inefficient and chaotic. But the really bad news is that at high levels of complexity, systems can often behave in a non-intuitive manner, or suddenly jump from one mode of behaviour to another. For all these and many other reasons it is better, all things being equal, to be less complex. This is why some managers prefer a lean business. In highly complex organizations it is easier to hide incompetence or fraud. Experienced engineers prefer simple solutions, not just because they may be elegant and aesthetically pleasing, but because they are generally less problematic. Excessive complexity may be seen as a form of bad cholesterol or obesity – a sort of ballast. Complexity-heavy systems are in general:
- Difficult to understand
- Difficult to manage and govern – some just run on ‘autopilot’
- Fragile – can collapse suddenly
- Able to collapse in many different ways
- Able to generate surprises
- Difficult to fix
- Difficult to predict in terms of behaviour or performance
- Difficult to reform
- Easier to attack and destroy
This is why simpler solutions are preferable, and why a well-governed system is run at a safe distance from its own critical complexity. The good news is that today it is possible to measure complexity and critical complexity for almost any natural or man-made system.
Because of the existence of critical complexity – a sort of physiological limit on complexity that each system possesses – it is possible to define a relative measure of complexity, ranging from 0% to 100%. When a system's relative complexity is close to 0%, it functions close to the lower complexity bound and its behaviour is dominated by structure (like a watch movement). When, on the other hand, relative complexity approaches 100%, the system is almost critically complex: its structure is weak and its behaviour is dominated by chaos. All things in life are relative, and complexity is no exception. A large company can be relatively less complex than a small one. Size doesn't matter. What makes a system complex is its proximity to the critical complexity threshold.
Relative complexity is what determines how resilient a system is. Relative complexity close to 100% implies a very low resilience – high fragility – while values close to 0% point to systems that are well equipped to resist shocks (both internal and external) or other destabilizing events.
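The idea of a relative scale between the two bounds can be sketched in a few lines of code. This is only an illustration: the linear mapping, the resilience complement, and all function names below are assumptions for clarity – Ontonix's actual metric is not specified in this text.

```python
def relative_complexity(c, c_lower, c_critical):
    """Map a system's complexity c onto a 0-100% scale between its bounds.

    Hypothetical linear mapping for illustration only; the real metric
    used by Ontonix may differ.
    """
    if not (c_lower <= c <= c_critical):
        raise ValueError("complexity must lie between its lower and critical bounds")
    return 100.0 * (c - c_lower) / (c_critical - c_lower)


def resilience(rel_c):
    """Illustrative complement: high relative complexity -> low resilience."""
    return 100.0 - rel_c


# A system at c = 7 with bounds [2, 12] sits exactly halfway to
# its critical complexity, so it scores 50% on both scales.
rc = relative_complexity(7.0, 2.0, 12.0)
print(rc)              # 50.0
print(resilience(rc))  # 50.0
```

On this toy scale, a watch-movement-like system would score near 0% (and near 100% resilience), while a near-chaotic one would score near 100% (and near 0% resilience).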
Featured image courtesy of Flight Radar.