Image Complexity as Proxy of Traffic Density

In a previous blog we showed how complexity may be used in conjunction with streaming video. In this short note we illustrate how the same technology can be applied to infer traffic density. The case is that of traffic on a motorway, filmed by a camera located above the road; the video in question is shown at the top of this blog. It has a duration of 22 seconds and is split into 110 frames.
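As a rough illustration of how such a per-frame complexity signal can be obtained, here is a minimal Python sketch. Ontonix’s actual QCM metric is proprietary, so the Shannon entropy of each frame’s grayscale histogram stands in for it here, and the file name motorway.mp4 is a hypothetical placeholder.

```python
import cv2
import numpy as np

def frame_entropy(gray: np.ndarray) -> float:
    """Shannon entropy (in bits) of the grayscale histogram of one frame."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def video_complexity(path: str) -> list[float]:
    """Return one complexity value per frame of the video at `path`."""
    cap = cv2.VideoCapture(path)
    values = []
    while True:
        ok, frame = cap.read()
        if not ok:  # end of the clip
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        values.append(frame_entropy(gray))
    cap.release()
    return values

# For the clip described above this yields 110 values, one per frame.
complexity = video_complexity("motorway.mp4")  # hypothetical file name
```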

The evolution of complexity – traffic intensity – over time is illustrated below:
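A curve of this kind can be reproduced from the per-frame values computed in the sketch above; a 22-second clip split into 110 frames corresponds to 5 frames per second.

```python
import matplotlib.pyplot as plt

# 110 frames over 22 seconds -> 5 frames per second
t = [i / 5.0 for i in range(len(complexity))]

plt.plot(t, complexity)
plt.xlabel("time (s)")
plt.ylabel("complexity (proxy of traffic intensity)")
plt.title("Evolution of image complexity over time")
plt.show()
```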

A few characteristic points are highlighted below.

The simplicity of this approach to “measuring” traffic intensity stems from the fact that it requires no sensors to be installed and no actual vehicles to be counted. With QCM (Quantitative Complexity Management), we just “look” at the situation and “feel” the state of the traffic without any effort. This is why QCM is Artificial Intuition.

A final comment: traffic in big cities is highly complex and therefore, by virtue of the Principle of Incompatibility, lacks precision. What the principle asserts is that the more complex something is, the less precisely it can be described. In other words, one cannot make precise statements about highly complex systems simply because Nature won’t allow it. This, in turn, dictates a more “fuzzy” approach as complexity rises. In the case of traffic it is sufficient to say, for example, that it is “normal” or “critical” instead of counting cars and trucks and coming up with precise (but irrelevant) numbers, as sketched below.
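To make the idea concrete, here is a minimal sketch of such a coarse classification, reusing the per-frame complexity values computed earlier. The thresholds are hypothetical; in practice they would be calibrated against historical readings from the same camera.

```python
def traffic_state(c: float, low: float, high: float) -> str:
    """Map a complexity reading to a coarse, 'fuzzy' traffic label."""
    if c < low:
        return "light"
    if c < high:
        return "normal"
    return "critical"

# Hypothetical thresholds: the entropy of an 8-bit image lies between 0 and 8 bits.
labels = [traffic_state(c, low=5.0, high=6.5) for c in complexity]
```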

Established originally in 2005 in the USA, Ontonix is a technology company headquartered in Como, Italy. The unusual technology and solutions developed by Ontonix focus on countering what most threatens safety, advanced products, critical infrastructures, and IT network security: the rapid growth of complexity. In 2007 the company received recognition by being selected as a Gartner Cool Vendor. What makes Ontonix different from all those companies and research centers that claim to manage complexity is that we have a complexity metric. This means that we MEASURE complexity. We detect anomalies in complex defense systems without using Machine Learning, for one very good reason: our clients don’t have the luxury of multiple examples of failures necessary to teach software to recognize them. We identify anomalies without having seen them before. Sometimes, you must get it right the first and only time!
