
We reproduce verbatim a 1998 article, A Survey of “Complexity Measures”, by David P. Feldman and Jim Crutchfield.
A more fitting title for this evasive article would be “A Survey of How Science Avoids Measuring Complexity”. Once you’ve read it, ask yourself this: can I use any of these measures to actually measure the complexity of, say,
- a piece of equipment, e.g. an electromechanical system
- an investment portfolio
- a corporation, or a system of corporations
- a piece of software (but beyond a trivial count of the number of lines or instructions)
- a country, or the whole world
- a patient in Intensive Care
- an electrocardiogram
- a computer chip
- a stock market index
- a piece of terrain
- a trading floor
- an energy distribution network
- a battle strategy
- a road or railway network
The answer is probably no. This was 1998. Twenty-two years later, so-called “complexity science” still clings to the “definitions” provided below, which introduce nothing but fragmentation (it seems that T. S. Kuhn is not a popular author in certain circles!). A definition becomes a “definition” when it provides, or even hints at, no metric. Science becomes “science” when it provides no means of measuring things. Astrology versus astrophysics, alchemy versus chemistry, shamanism versus medicine.
The authors go even further by claiming that:
There is no “universal” complexity-entropy relationship!
Well, there is: C = f(S; E), where S is Structure and E stands for Entropy. It is not surprising that, with all the mess and confusion surrounding metrics and concepts, it is impossible to identify a “universal” (by the way, what is that?) complexity-entropy relationship. How could one?! In essence, one reads this article and finds no measure of complexity. It is as if the Inquisition were avoiding certain vexatious and irritating topics not in line with the doctrine (or should I say dogma). I have even heard, on more than one occasion, people proudly proclaiming: complexity cannot be measured!
In 2005, Ontonix introduced a complexity measure which “includes” almost all of the metrics listed in the article and makes it possible to measure the complexity of the things enumerated above. Our complexity quantification algorithm runs on a chip and allows us to measure in real time not just complexity but also critical complexity, the upper bound on complexity that every system possesses. Any metric which is unbounded is not rooted in physics.
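To make the idea C = f(S; E) concrete: the Ontonix algorithm itself is proprietary, so the following Python sketch is only a toy illustration, not the actual method. It extracts a structure map S from significant pairwise correlations, weights it by normalized Shannon entropies E, and returns both a complexity value and its upper bound. The threshold, bin count, and weighting scheme are all illustrative assumptions.

```python
# A toy sketch of a bounded structure-entropy complexity C = f(S; E).
# NOT the Ontonix algorithm; it only illustrates the general idea.
import numpy as np

def toy_complexity(data, corr_threshold=0.3, bins=10):
    """data: (n_samples, n_vars) array. Returns (C, C_critical)."""
    n_vars = data.shape[1]
    # S: structure map -- adjacency of significantly correlated variable pairs.
    corr = np.corrcoef(data, rowvar=False)
    S = (np.abs(corr) > corr_threshold) & ~np.eye(n_vars, dtype=bool)
    # E: normalized Shannon entropy of each variable, in [0, 1].
    E = np.empty(n_vars)
    for i in range(n_vars):
        p, _ = np.histogram(data[:, i], bins=bins)
        p = p[p > 0] / p.sum()
        E[i] = -np.sum(p * np.log2(p)) / np.log2(bins)
    # Complexity: entropy-weighted count of structural links.
    W = np.outer(E, E) * S
    C = W[np.triu_indices(n_vars, k=1)].sum()
    # Critical complexity: the value reached if every pair were linked at
    # maximum entropy -- so the measure is bounded, unlike K(x) or h_mu * L.
    C_crit = n_vars * (n_vars - 1) / 2
    return C, C_crit

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 3))
x = np.column_stack([x[:, 0], x[:, 0] + 0.1 * x[:, 1], x[:, 2]])
print(toy_complexity(x))
```

Because each entropy term is normalized to [0, 1], C can never exceed C_crit: the measure is bounded, which is precisely the point of the critical-complexity remark above.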
The authors are, of course, not culpable; they merely performed a survey.
A Survey of “Complexity Measures”
David P. Feldman
University of California, Davis
dave@santafe.edu, http://leopard.ucdavis.edu/dave/
and
Jim Crutchfield
The Santa Fe Institute
chaos@santafe.edu, http://santafe.edu/~jpc/
Complexity?!?
The term complexity has many different meanings. At least one adjective is needed to help distinguish between different uses of the word:
- Kolmogorov-Chaitin Complexity
- Computational Complexity
- Stochastic Complexity
- Statistical Complexity
- Structural Complexity
Deterministic Complexity
The Kolmogorov-Chaitin complexity $K(x)$ of an object $x$ is the length, in bits, of the smallest program that, when run on a Universal Turing Machine (UTM), outputs $x$ and then halts.
References:
Kolmogorov, Problems of Information Transmission, 1:4-7, 1965.
Kolmogorov, IEEE Trans. Inform. Theory, IT-14:662-664, 1968.
Solomonoff, Inform. Contr., 7:1-22 and 224-254, 1964.
Chaitin, J. Assoc. Comp. Mach., 13:547-569, 1966.
Martin-Löf, Inform. Contr., 9:602-619, 1966.
Books:
- Chpt. 7 of: Cover and Thomas, “Elements of Information Theory,” Wiley, 1991.
- Chaitin, “Information, Randomness and Incompleteness,” World Scientific, 1987.
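$K(x)$ itself is uncomputable (as the survey notes in its final section), so practitioners often fall back on the length of a losslessly compressed encoding of $x$, a computable upper bound on $K(x)$ up to a constant. A minimal sketch, with zlib as just one convenient compressor choice:

```python
# Compressed length as a crude, computable upper-bound proxy for K(x).
import os
import zlib

def kolmogorov_proxy(x: bytes) -> int:
    """Compressed length of x in bits: an upper bound on K(x) + O(1)."""
    return 8 * len(zlib.compress(x, 9))

ordered = b"01" * 5_000     # highly regular: a short description exists
noise = os.urandom(10_000)  # incompressible with overwhelming probability
print(kolmogorov_proxy(ordered))  # small: the repetition compresses away
print(kolmogorov_proxy(noise))    # close to the raw 80,000 bits
```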
Measures of Randomness
The entropy rate $h_\mu$ of a symbolic sequence measures the unpredictability (in bits per symbol) of the sequence.
The entropy rate is also known as the entropy density or the metric entropy.
References:
Boltzmann, 1866.
Shannon, Bell Sys. Tech. J., 27:379-423, 1948.
Kolmogorov, Dokl. Akad. Nauk SSSR, 119:861-864, 1958.
Books:
- Shannon and Weaver, “The Mathematical Theory of Communication,” Univ. of Illinois Press, 1963.
- Rényi, “Probability Theory,” North Holland, 1970.
- Chpt. 7 of: Cover and Thomas, “Elements of Information Theory,” Wiley, 1991.
- Beck and Schlögl, “Thermodynamics of Chaotic Systems,” Cambridge, 1993.
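Unlike $K(x)$, the entropy rate can be estimated directly from data. A minimal sketch using block entropies, $h_\mu(L) = H(L+1) - H(L)$, where $H(L)$ is the Shannon entropy of length-$L$ words; the block length $L = 8$ and the test sequences are arbitrary choices, and the estimate is biased for short data:

```python
# Finite-L estimate of the entropy rate h_mu from block entropies.
import random
from collections import Counter
from math import log2

def block_entropy(seq: str, L: int) -> float:
    """Shannon entropy (bits) of the empirical length-L word distribution."""
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def entropy_rate(seq: str, L: int = 8) -> float:
    """h_mu(L) = H(L+1) - H(L); converges to h_mu as L grows (data permitting)."""
    return block_entropy(seq, L + 1) - block_entropy(seq, L)

random.seed(0)
coin = "".join(random.choice("01") for _ in range(100_000))
print(entropy_rate("01" * 50_000))  # periodic sequence: ~0 bits/symbol
print(entropy_rate(coin))           # fair coin flips:  ~1 bit/symbol
```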
Kolmogorov Complexity Measures Randomness!!
The Kolmogorov complexity is maximized for random strings.
The average growth rate of $K(x^L)$, the complexity of the first $L$ symbols of $x$, is equal to the entropy rate $h_\mu$: $\lim_{L \to \infty} \langle K(x^L) \rangle / L = h_\mu$.
If $x$ is a trajectory of a chaotic dynamical system, then $K(x^L)/L \to h_\mu$, the Kolmogorov-Sinai entropy of the system, for almost all trajectories.
(Brudno, Trans. Moscow Math. Soc., 44:127, 1983.)
If a string $x$ is random, then it possesses no regularities. Thus, $K(x) \approx |x|$.
That is, the shortest program to get a UTM to produce $x$ is to just hand the computer a copy of $x$ and say “print this.”
Measures of “Complexity” that Capture a Property Distinct from Randomness
The entropy rate and the Kolmogorov Complexity do not measure pattern or structure or correlation or organization.
[Two schematic plots: deterministic complexity, which grows monotonically with randomness $h_\mu$, and structural complexity, which vanishes at both low and high randomness $h_\mu$ and peaks in between.]
Structure or pattern is maximized for neither high nor low randomness.
Note: The structural complexity vs. randomness relation above is just one of many possible behaviors. Different systems have different structural complexity vs. randomness plots. There is no “universal” complexity-entropy relationship! (E.g., Feldman and Crutchfield, Phys. Lett. A, 238:244-252, (1998), and references therein.)
Information Theoretic Approaches to Structural Complexity
Entropy Density Convergence and/or Mutual Information:
Crutchfield and Packard, Intl. J. Theo. Phys., 21:433-466, 1982; Physica D, 7:201-223, 1983.
Shaw, “The Dripping Faucet …,” Aerial Press, 1984.
Grassberger, Intl. J. Theo. Phys., 25:907-938, 1986.
Szépfalusy and Györgyi, Phys. Rev. A, 33:2852-2855, 1986.
Lindgren and Nordahl, Complex Systems, 2:409-440, 1988.
Csordás and Szépfalusy, Phys. Rev. A, 39:4767-4777, 1989.
Li, Complex Systems, 5:381-399, 1991.
Freund, Ebeling, and Rateitschak, Phys. Rev. E, 54:5561-5566, 1996.
Feldman and Crutchfield, J. Stat. Phys. (submitted), SFI:98-04-026, 1998.
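These quantities are also estimable from data. A minimal sketch of a finite-$L$ excess-entropy estimate, using the mutual information between two adjacent length-$L$ blocks, $E(L) = 2H(L) - H(2L)$, which approaches the excess entropy as $L$ grows; the block length and test sequences are illustrative choices, and block_entropy() is as in the entropy-rate sketch above:

```python
# Finite-L excess entropy: mutual information between adjacent L-blocks.
import random
from collections import Counter
from math import log2

def block_entropy(seq: str, L: int) -> float:
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def excess_entropy(seq: str, L: int = 6) -> float:
    """I(past L symbols; next L symbols) = 2*H(L) - H(2L), in bits."""
    return 2 * block_entropy(seq, L) - block_entropy(seq, 2 * L)

random.seed(0)
coin = "".join(random.choice("01") for _ in range(100_000))
print(excess_entropy("01" * 50_000))  # period-2: -> log2(2) = 1 bit of structure
print(excess_entropy(coin))           # i.i.d. coin: -> 0 bits (no structure)
```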
Entropy Density Convergence and/or Mutual Information: Notes on Terminology
All of the following terms refer to (essentially) the same quantity.
Excess Entropy: Crutchfield, Packard, Feldman
Stored Information: Shaw
Effective Measure Complexity: Grassberger, Lindgren, Nordahl
Reduced (Rényi) Information: Szépfalusy, Györgyi, Csordás
Complexity: Li, Arnold
Early Uses of Mutual Information
Rothstein, in The Maximum Entropy Formalism, MIT Press, 1979.
Chaitin, in Information, Randomness, and Incompleteness, World Scientific, 1987.
Watanabe, Knowing and Guessing: A Quantitative Study of Inference and Information, Wiley, 1969.
Computational Mechanics
Discover and Quantify Structure by Using a Combination of Computation Theory, Statistical Inference, and Information Theory
Computational Mechanics seeks to Detect the Intrinsic Computation being Performed by the System
Crutchfield and Young, Phys. Rev. Lett, 63:105-108, 1989
Crutchfield and Young, in Complexity, Entropy and the Physics of Information, Addison-Wesley, 1990.
Crutchfield, Physica D, 75:11-54, 1994.
Hanson, PhD Thesis, University of California, Berkeley, 1993.
Hanson and Crutchfield, Physica D, 103:169-189, 1997.
Upper, PhD Thesis, University of California, Berkeley, 1997.
Delgado and Solé, Phys. Rev. E, 55:2338-2344, 1997.
Witt, Neiman, and Kurths, Phys. Rev. E, 55:5050-5059, 1997.
Gonçalves, et al., Physica A, (in press), 1998.
Feldman and Crutchfield, J. Stat. Phys (submitted) SFI:98-04-026, 1998.
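True ε-machine reconstruction is beyond a few lines of code, but a crude finite-history caricature conveys the idea: lump together all length-$k$ pasts that predict statistically indistinguishable futures (“causal states”), and take the Shannon entropy of the resulting state distribution as a stand-in for the statistical complexity $C_\mu$. The history length $k$ and merging tolerance below are arbitrary assumptions, and the greedy merge is a simplification of the real procedure:

```python
# A crude finite-history caricature of causal-state reconstruction.
import random
from collections import defaultdict
from math import log2

def statistical_complexity(seq: str, k: int = 3, tol: float = 0.05) -> float:
    # Empirical next-symbol counts conditioned on each length-k past.
    nxt_counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - k):
        nxt_counts[seq[i:i + k]][seq[i + k]] += 1
    # Normalize to conditional distributions; remember each past's frequency.
    dists, weight = {}, {}
    for past, nxt in nxt_counts.items():
        n = sum(nxt.values())
        dists[past] = {s: c / n for s, c in nxt.items()}
        weight[past] = n
    # Greedily merge pasts with (approximately) equal predictive distributions.
    states = []  # each entry: [representative distribution, accumulated weight]
    for past, d in dists.items():
        for state in states:
            rep = state[0]
            if all(abs(d.get(s, 0.0) - rep.get(s, 0.0)) < tol
                   for s in set(d) | set(rep)):
                state[1] += weight[past]
                break
        else:
            states.append([d, weight[past]])
    total = sum(w for _, w in states)
    return -sum(w / total * log2(w / total) for _, w in states if w > 0)

random.seed(0)
coin = "".join(random.choice("01") for _ in range(100_000))
print(statistical_complexity("01" * 50_000))  # two causal states: C_mu ~ 1 bit
print(statistical_complexity(coin))           # one causal state:  C_mu ~ 0 bits
```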
Other Approaches to Complexity Distinct from Randomness
Logical Depth:
The logical depth of a string $x$ is the run time of the shortest program that will cause a UTM to produce $x$ and then halt.
Logical depth is not a measure of randomness; it is small for both trivially ordered and random strings.
References:
Bennett, Found. Phys., 16:585-592, 1986.
Bennett, in Complexity, Entropy and the Physics of Information, Addison-Wesley, 1990.
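For concreteness, the definition can be written as follows. This is a simplified form; Bennett's actual definition introduces a significance parameter $s$ that admits programs within $s$ bits of the minimal one:

```latex
\[
  D(x) \;=\; \min \bigl\{\, T(p) \;:\; U(p) = x,\; |p| = K(x) \,\bigr\}
\]
% U: a universal Turing machine; T(p): the running time of program p;
% K(x): the Kolmogorov-Chaitin complexity of x.
```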
Thermodynamic Depth:
Proposed as a measure of structural complexity (Lloyd and Pagels, Annals of Physics, 188:186-213, 1988). However, thermodynamic depth depends crucially on the choice of state, and Lloyd and Pagels give no general prescription for how states should be chosen. Once states are chosen, thermodynamic depth is equivalent to the reverse-time entropy rate. (Shalizi and Crutchfield, in preparation, 1998.)
Other Approaches to Complexity Distinct from Randomness
Sophistication:
Reference:
Koppel, Complex Systems, 1:1087-91, 1987.
Effective Complexity:
Reference:
Gell-Mann and Lloyd, Complexity, 2:44-52, 1996.
Other Approaches to Complexity Distinct from Randomness
Non-Linear Modeling
References:
Wallace and Boulton, Comp. J., 11:185-194, 1968.
Crutchfield and McNamara, Complex Systems 1: 417-452, 1987.
Rissanen, Stochastic Complexity in Statistical Inquiry, World Scientific, 1989.
Crutchfield and Young, in Complexity, Entropy and the Physics of Information, Addison-Wesley, 1990.
Model Convergence and Hierarchical Grammatical Complexities:
References:
Badii and Politi, Complexity: Hierarchical Structures and Scaling in Physics, Cambridge, 1997.
Badii and Politi, Phys. Rev. Lett., 78:444-447, 1997.
Note: Badii and Politi’s book contains a solid discussion of many different structural complexity measures.
Other Approaches to Complexity Distinct from Randomness
Miscellaneous References:
Kolmogorov, Russ. Math. Surveys, 38:29, 1983.
Wolfram, Comm. Math. Phys., 96:15-57, 1984.
Wolfram, Physica D, 10:1-35, 1984.
Huberman and Hogg, Physica D, 22:376-384, 1986.
Bachas and Huberman, Phys. Rev. Lett., 57:1965, 1986.
Peliti and Vulpiani, eds., Measures of Complexity, Springer-Verlag, 1988.
Wackerbauer, et al., Chaos, Solitons & Fractals, 4:133-173, 1994.
Bar-Yam, Dynamics of Complex Systems, Addison-Wesley, 1997.
Non-constructive Complexity Measures: the Road Untakable
All Universal-Turing-Machine-based complexity measures suffer from several drawbacks:
- They are uncomputable.
- By adopting a UTM, the most powerful discrete computation model, one loses the ability to distinguish between systems that can be described by computational models less powerful than a UTM.
UTM-based “complexity” measures include:
Logical Depth: Bennett, Found. Phys., 16:585-592, 1986.
Sophistication: Koppel, Complex Systems, 1:1087-91, 1987.
Effective Complexity: Gell-Mann and Lloyd, Complexity, 2:44-52, 1996.
Falling off the “Edge of Chaos”
Packard, “Adaptation to the Edge of Chaos,” in Dynamic Patterns in Complex Systems, Kelso et al., eds., World Scientific, 1988.
Mitchell, Hraber, and Crutchfield, “Revisiting the ‘Edge of Chaos’,” Complex Systems, 7:89-130, 1993. (A rebuttal to Packard, 1988.)
Crutchfield and Young, “Inferring Statistical Complexity,” Phys. Rev. Lett., 63:105-108, 1989. (First plot of “complexity” vs. entropy.)
Langton, “Computation at the Edge of Chaos,” Physica D, 42:12-37, 1990.
Li, Packard, and Langton, “Transition Phenomena in Cellular Automata Rule Space,” Physica D, 45:77, 1990.
Wootters and Langton, “Is There a Sharp Phase Transition for Deterministic Cellular Automata?,” Physica D, 45:95, 1990.
Crutchfield, “Unreconstructible at Any Radius,” Phys. Lett. A, 171:52-60, 1992.