The classical complexity metric at the core of the QCM is scale-free: its value is independent of data amplitude. While this presents numerous advantages, there are contexts in which it is useful to link the value of complexity to actual data amplitudes. This is the key characteristic of the QCM2, a recently introduced next-generation Quantitative Complexity Management technology.
While the conventional QCM blends Structure and Entropy via the following equation:

C = f(S; E)

the QCM2 establishes a link between physics and Information Theory.
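The post does not specify the form of f, which is proprietary. Purely as an illustration of how a structure measure S and an entropy measure E might be blended into a single complexity value, here is a minimal toy sketch (the choice of S as the fraction of strongly correlated variable pairs, of E as mean histogram entropy, and of f as their product are all assumptions, not the actual QCM metric):

```python
import numpy as np

def shannon_entropy(x, bins=16):
    """Shannon entropy (in bits) of a signal's amplitude histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def structure(data, threshold=0.5):
    """Toy structure measure S: fraction of variable pairs whose
    absolute correlation exceeds a threshold (an assumption)."""
    corr = np.corrcoef(data, rowvar=False)
    iu = np.triu_indices(corr.shape[0], k=1)
    return np.mean(np.abs(corr[iu]) > threshold)

def complexity(data):
    """Illustrative C = f(S; E): here simply S times the mean
    per-variable entropy. NOT the proprietary QCM formula."""
    e = np.mean([shannon_entropy(col) for col in data.T])
    return structure(data) * e
```

Note that because the histogram bins adapt to each variable's range, the entropy term is insensitive to amplitude, mirroring the scale-free character of the classical metric described above.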
An example of how the QCM2 better captures anomalies is illustrated below. The conventional QCM indicates a drop in complexity, hinting at a potential anomaly; the QCM2, on the other hand, shows a much more pronounced signal.
[Figure: complexity signal computed with the QCM]

[Figure: complexity signal computed with the QCM2]
In both cases, the anomaly occurs after step 800.
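How such a shift in a complexity time series might be flagged automatically can be sketched with a simple trailing-window detector. This is a generic toy change detector under assumed parameters (window size, threshold), not the QCM or QCM2 algorithm:

```python
import numpy as np

def detect_shift(series, window=100, k=6.0):
    """Return the first index where the signal deviates from the
    trailing-window mean by more than k standard deviations,
    or None if no such point exists. A toy change detector,
    NOT the proprietary QCM/QCM2 anomaly logic."""
    s = np.asarray(series, dtype=float)
    for i in range(window, len(s)):
        ref = s[i - window:i]
        mu, sd = ref.mean(), ref.std()
        if sd > 0 and abs(s[i] - mu) > k * sd:
            return i
    return None
```

A more pronounced signal, as the QCM2 produces in the example above, separates further from the trailing band and is therefore easier to flag reliably.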
While the QCM is based on a single complexity metric, the QCM2 provides several alternatives.
OntoNet™, the company’s flagship QCM software solution, has been upgraded to include new QCM2-based complexity metrics. Development of the QCM2 has required over two years and has been inspired by work in the financial industry.
All future CAHMS installations will incorporate the QCM2 technology.
Ontonix invests an unusually high portion of its revenue in R&D. This not only protects the company's Intellectual Property and unique technology from emulation, but also pushes the frontiers of complexity science further.