Complexity Economics Society

On Theories, Complexity and Decision Making

In highly complex contexts, or when one runs out of arguments in a debate, it is customary to resort to elaborate conspiracy theories. Theories are constructed based on a set of data points. When many data points are available, one can construct many theories. The problem is that all these theories will agree on the very set of data points used to construct them. The figure below explains the concept.

Ockham’s razor helps scientists choose among competing theories: the simplest one that explains a given experiment is probably the theory to go for. In the above figure, the circle and the parabola pass through exactly the same data. Both theories yield very similar results in the vicinity of those data points. But theories are used to make extrapolations and predictions, not to verify existing data. So, in the case on the left, is it going to be a circle or a parabola? Once one moves away from the data used to construct a theory, theories diverge. The more sophisticated they are, the more they diverge. In the case on the right, there are even more ways to join the points, and many more ways to agree and disagree on a given subject.
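The circle-versus-parabola point can be made concrete with a minimal sketch. The three data points below are my own choice (not taken from the figure): both curves pass exactly through them, yet disagree everywhere in between, and one of them cannot extrapolate at all.

```python
import math

# Three shared data points: (-1, 1), (0, 0), (1, 1).
# Both "theories" below fit all three exactly.

def parabola(x):
    """Theory A: y = x^2."""
    return x * x

def circle(x):
    """Theory B: lower arc of the unit circle centered at (0, 1),
    i.e. x^2 + (y - 1)^2 = 1. Only defined for |x| <= 1."""
    return 1.0 - math.sqrt(1.0 - x * x)

# Both theories agree perfectly on the data...
for x in (-1.0, 0.0, 1.0):
    assert abs(parabola(x) - circle(x)) < 1e-12

# ...but diverge between the points,
for x in (0.3, 0.6, 0.9):
    print(f"x={x:.1f}  parabola={parabola(x):.4f}  circle={circle(x):.4f}")

# and the circle cannot predict anything past |x| = 1,
# while the parabola extrapolates without limit.
```

No finite set of data points can discriminate between the two on its own; only predictions away from the data, verified by new experiments, can.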

The point, however, is not to debate theories. The more complex a theory is, the more difficult it will be to prove it valid, because theories can only be verified by means of repeatable experiments performed by independent teams. The point is how to make decisions, or pick strategies, in highly complex situations. Things become complicated when many people debate many highly complex theories built from very many data points (or charts). Many individuals prefer long debates over having to live with decision risk.

In the past, decisions were made based on assessments of probability. In the future, decisions will be made based on complexity: highly complex scenarios are riskier because high complexity induces fragility, so the strategy is to favor low-complexity alternatives. Low-complexity solutions tend to be more resilient. Ask any engineer.

However, the most dangerous practice is to resort to complex, unverifiable theories (conjectures, really) to support and justify not making any decision at all.

Established originally in 2005 in the USA, Ontonix is a technology company headquartered in Como, Italy. The unusual technology and solutions developed by Ontonix focus on countering what most threatens safety, advanced products, critical infrastructures, and IT network security: the rapid growth of complexity. In 2007 the company was recognized as a Gartner Cool Vendor. What makes Ontonix different from all those companies and research centers that claim to manage complexity is that we have a complexity metric. This means that we MEASURE complexity. We detect anomalies in complex defense systems without using Machine Learning, for one very good reason: our clients don’t have the luxury of the multiple examples of failure needed to teach software to recognize them. We identify anomalies without having seen them before. Sometimes, you must get it right the first and only time!
