
Sensitivity Analysis and How Nature Fights Back

Sensitivity analysis is still very popular, especially in engineering design. The idea is simple. Suppose you have a function of three variables, f(x, y, z). This is how it works:

1) you freeze all your design variables except one;

2) you perturb the variable which is loose;

3) you measure how the performance of the system varies as the loose design variable is perturbed.

What could be wrong here? Well, a bunch of things.
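Before getting to them, here is the recipe in its barest form: a minimal sketch with a made-up performance function and made-up nominal values (none of it comes from a real design problem).

```python
import numpy as np

# Hypothetical performance function of three design variables
# (purely illustrative; not from any real design problem).
def f(x, y, z):
    return x**2 + 10.0 * np.sin(y) + 0.1 * x * z

# Assumed nominal values at which the variables are "frozen".
nominal = {"x": 1.0, "y": 2.0, "z": 3.0}

def one_at_a_time_sensitivity(func, nominal, var, delta=1e-3):
    """Perturb a single variable while all the others stay frozen
    at their nominal values; return the change in performance per
    unit of perturbation."""
    base = func(**nominal)
    perturbed = dict(nominal)
    perturbed[var] += delta
    return (func(**perturbed) - base) / delta

for var in nominal:
    s = one_at_a_time_sensitivity(f, nominal, var)
    print(f"sensitivity of f with respect to {var}: {s:+.4f}")
```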

First of all, this entire process reflects an old and outdated way of thinking – the desire to superimpose effects in an innocently linear fashion, and the conviction (I honestly don’t know where people get this from) that small perturbations cause only small effects. Secondly – and this is the most surprising part – you freeze certain variables while letting others ripple away. In Nature this doesn’t happen. In reality, because of the stochastic nature of things, everything “fluctuates” around its so-called nominal value. Young’s modulus of steel is not 21000.00 – it is close to this number, but there is no exact value. Therefore, “blocking” a variable is, de facto, an attempt to violate physics. It is like claiming that the universe is deterministic when it is not. Of course, physics cannot be violated, and one way Nature fights back is by yielding over-designed systems. In fact, sensitivity analysis, being an unnatural process, leads to overkill. If you can live with excessive weight, or with excessive safety factors, then you’re OK.

The partial derivative – which reflects the strength of the relationship between f and x while y and z are held fixed – does not exist in nature. It is a useful numerical artifice, but it is not anything that grows on trees. It is precisely when one writes x=a and y=b that one is saying: the universe is deterministic. This, of course, is not true.
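Spelled out, the object in question is (keeping the a and b of the text and adding c as the assumed nominal value of z):

```latex
\left.\frac{\partial f}{\partial x}\right|_{(a,\,b,\,c)}
  = \lim_{h \to 0}\frac{f(a+h,\;b,\;c) - f(a,\;b,\;c)}{h}
```

Note that y and z never move in this definition: they are pinned at b and c, and that pinning is precisely the deterministic assumption criticized above.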

In Monte Carlo Simulation you let all the variables in a problem ripple away at the same time, just as happens in reality, and then, only then, do you see what happens to your system.
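A sketch of the same toy problem treated the Monte Carlo way; the Gaussian distributions and the scatter levels are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Same hypothetical performance function as in the sketch above.
def f(x, y, z):
    return x**2 + 10.0 * np.sin(y) + 0.1 * x * z

N = 100_000  # number of Monte Carlo samples

# All three variables fluctuate around their nominal values at the same time.
# The distributions and the amount of scatter are illustrative assumptions.
x = rng.normal(loc=1.0, scale=0.02, size=N)
y = rng.normal(loc=2.0, scale=0.04, size=N)
z = rng.normal(loc=3.0, scale=0.06, size=N)

response = f(x, y, z)

print(f"mean response   : {response.mean():.4f}")
print(f"std of response : {response.std():.4f}")
print(f"99th percentile : {np.percentile(response, 99):.4f}")
```

Only after letting everything fluctuate together do you look at the statistics of the response, rather than at one frozen slice of the design space at a time.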

It is difficult to resist, at this point, citing Frank Lloyd Wright: “I’m all in favour of keeping dangerous weapons out of the hands of fools. Let’s start with typewriters.” I’d also add computers. But, at the same time, just to be fair, I should also cite Friedrich Nietzsche: “We have art to save ourselves from the truth.” I just can’t help thinking that sometimes Computer-Aided Engineering is becoming a form of modern art.

www.ontonix.com

Established originally in 2005 in the USA, Ontonix is a technology company headquartered in Como, Italy. The unusual technology and solutions developed by Ontonix focus on countering what most threatens safety, advanced products, critical infrastructures and IT network security: the rapid growth of complexity. In 2007 the company was recognized as a Gartner Cool Vendor. What makes Ontonix different from all the companies and research centers that claim to manage complexity is that we have a complexity metric. This means that we MEASURE complexity. We detect anomalies in complex defense systems without using Machine Learning, for one very good reason: our clients don’t have the luxury of the multiple examples of failures needed to teach software to recognize them. We identify anomalies without having seen them before. Sometimes, you must get it right the first and only time!
