
On Optimization, Determinism, Fragmentation and Models.

The clash between Eastern and Western cultures is also evident in their languages. Oriental languages like Japanese and Chinese are very expressive and lend themselves to ambiguity of expression. Western alphabetical languages, on the other hand, are more schematic and clear. The reason lies, of course, in the way the alphabet is architected. Once you lay down a set of rules and respect them all along, you eventually run into trouble. Our Western languages are well suited to our black-and-white logic, to determinism. The books of Democritus (who conceived the atom) were destroyed because they were offensive to Aristotle and Plato, the inquisitors of their day. Like Plato, who didn't care much for studying nature and preferred to discuss politics, so today those who engage in the optimization of complex numerical entities (models) tend to forget reality and concentrate their efforts on "ethics" and other generally accepted forms of "morality". There is much more room, more freedom, in the imaginary world, more independence from the severe and unforgiving laws of physics. Aristotle eventually invented the fragmentation of science into separate (and, for centuries, hermetic) fields and compartments. Much of his philosophy was based on circular reasoning. Once you got caught in the circle, you were part of the club!

"The beginning of wisdom," as the Chinese say, "is calling things by their right names." Contemporary engineering language is corroded with ambiguities and many forms of lexical abuse. I always get the same comment everywhere I lecture: "in reality we all do the same, we all optimize…". No, no and no! Nomen est numen: to name is to know. We should name things correctly. According to E.O. Wilson, "In all cultures, taxonomic classification means survival." Optimization, i.e. seeking the optimum, is, by definition, a process whereby something is minimized or maximized. Therefore, in the practice of optimization, the declared goal is to be the best. But optimization needs a model of some sort, and model economy truncates diversification: the more simplistic the model, the higher the danger of missing something. Along these lines, it is incredible to see how humanity fiddles with the ecosystem based on abstract models, often built on incomplete data. And then we rush to optimize the utilization of natural resources, without knowledge of the consequences. Wilson claims that "we do not know to the nearest order of magnitude how many species exist on the earth in the first place". I think this fact reflects a bit on all of humanity. At present, some 1.4 million organisms have been discovered (and given scientific names), but the total number may be between 10 and 100 million. Of the 1.4 million species that have a name, fewer than 10% have been studied at levels beyond gross anatomy! There is still so much to learn before attempting manipulation! And yet we get obsessed with details while we miss the whole picture. Local, short-term tactics prevail (highest possible profit, lowest possible risk): everything is optimized and squeezed, but the global vision is lost. Our optimized economy (a bit fragile, though, isn't it?) and our profit-driven existence lead to a loss of 27,000 species per year, which means 74 per day, or 3 every hour! The most endangered species are those trapped by specialization and in shrinking habitats. Determinism is like the underutilization of biodiversity (few crops, few animals, just fast profit). Homogeneity means vulnerability. Uncertainty creates redundancies that are useful as emergency exits.
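The danger of optimizing a too-simple model can be sketched numerically. The following is a toy illustration only (the functions and numbers are invented for the example, not taken from any real model): an optimum found on a truncated model can land exactly where the fuller model reveals a heavy penalty the truncated model left out.

```python
# Toy illustration: "model economy truncates diversification".
# We optimize a simplified cost model, then evaluate the result
# against a fuller model that includes a constraint the simple
# model omitted. All functions here are hypothetical.

def grid_argmin(f, lo, hi, steps=1000):
    # Brute-force minimization over a uniform grid.
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(xs, key=f)

def simple_model(x):
    # Truncated model: a smooth bowl centred at x = 0.5.
    return (x - 0.5) ** 2

def true_cost(x):
    # Fuller model: the same bowl plus a heavy penalty for x < 1
    # (say, a resource constraint the simple model ignored).
    return simple_model(x) + (100.0 if x < 1.0 else 0.0)

x_simple = grid_argmin(simple_model, 0.0, 3.0)  # near 0.5
x_true = grid_argmin(true_cost, 0.0, 3.0)       # near 1.0

print(x_simple, true_cost(x_simple))  # the "optimum" is terrible in reality
print(x_true, true_cost(x_true))
```

The simpler model's optimum looks perfect by its own lights and disastrous under the fuller one; nothing in the optimization procedure itself signals that anything is missing.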

Faithful to our atomistic and reductionist credo, we understand evolution mostly from the genetic point of view, while ignoring its global, ecological (systemic) aspects. But, according to Wilson, "The best of science doesn't consist of mathematical models and experiments, as textbooks make it seem. Those come later. It springs fresh from a more primitive mode of thought, wherein the hunter's mind weaves ideas from old facts and fresh metaphors and the scrambled crazy images of things recently seen. To move forward is to concoct new patterns of thought, which in turn dictate the design of new models and experiments." Easy to say, difficult to achieve. He further states: "First rule of the history of science: when a big, new, persuasive idea is proposed, an army of critics soon gathers and tries to tear it down. Such a reaction is unavoidable because, aggressive yet abiding by the rules of civil discourse, this is simply how scientists work. It is further true that, faced with adversity, proponents will harden their resolve and struggle to make the case more convincing. Being human, most scientists conform to the psychological Principle of Certainty, which says that when there is evidence both for and against a belief, the result is not a lessening but a heightening of conviction on both sides… Rule number two: the new idea will, like mother earth, take some serious hits. If good, it will survive, probably in modified form. If bad, it will die, usually at the time of death or retirement of the last original proponent." Wilson cites Paul Samuelson, who said of the science of economics: "funeral by funeral, theory advances". "What we understand best about evolution is mostly genetic, and what we understand least is mostly ecological." In engineering we also need to look more at systems, not at components. Interaction must be understood. To model, or not to model: that is the question.

Established originally in 2005 in the USA, Ontonix is a technology company headquartered in Como, Italy. The unusual technology and solutions developed by Ontonix focus on countering what most threatens safety, advanced products, critical infrastructures and IT network security: the rapid growth of complexity. In 2007 the company was selected as a Gartner Cool Vendor. What makes Ontonix different from all the companies and research centers that claim to manage complexity is that we have a complexity metric. This means that we MEASURE complexity. We detect anomalies in complex defense systems without using Machine Learning, for one very good reason: our clients don't have the luxury of multiple examples of failures with which to teach software to recognize them. We identify anomalies without having seen them before. Sometimes you must get it right the first and only time!
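The idea of flagging anomalies without any prior failure examples can be illustrated with a minimal sketch. Ontonix's actual complexity metric is proprietary and not shown here; what follows is only a generic novelty detector, with invented channel names and thresholds, that characterizes normal operation statistically and flags departures from it — no failure examples required.

```python
# Hedged sketch (NOT Ontonix's method): detect never-before-seen
# anomalies by modelling only what "normal" looks like.
from statistics import mean, stdev

def fit_baseline(samples):
    # Learn per-channel mean and spread from healthy operation only.
    channels = list(zip(*samples))
    return [(mean(c), stdev(c)) for c in channels]

def is_anomalous(reading, baseline, threshold=4.0):
    # Flag a reading if any channel drifts far outside its normal
    # band; no labelled failure examples are ever needed.
    return any(abs(x - m) / s > threshold
               for x, (m, s) in zip(reading, baseline))

# Hypothetical telemetry: 30 healthy two-channel readings.
normal = [(10 + 0.1 * (i % 5), 50 - 0.2 * (i % 3)) for i in range(30)]
base = fit_baseline(normal)

print(is_anomalous((10.2, 49.8), base))  # False: within the normal band
print(is_anomalous((25.0, 49.8), base))  # True: a deviation never seen before
```

The point of the sketch is the asymmetry: the detector was trained only on normality, yet it still flags a failure mode it has never encountered, which is exactly the constraint faced when failures cannot be rehearsed.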
