## Saturday, 12 October 2013

### The Not-That-Useful Definitions of Complexity

"Every few months seems to produce another paper proposing yet another measure of complexity, generally a quantity which can't be computed for anything you'd actually care to know about, if at all. These quantities are almost never related to any other variable, so they form no part of any theory telling us when or how things get complex, and are usually just quantification for quantification's own sweet sake." Read more at http://cscs.umich.edu/~crshalizi/notebooks/complexity-measures.html. The above-mentioned abundance of candidate complexity measures - a clear reflection of the rampant fragmentation in the field - is summarized at http://en.wikipedia.org/wiki/Complexity as follows: in several scientific fields, "complexity" has a specific meaning:

In computational complexity theory, the time complexity of a problem is the number of steps it takes to solve an instance of the problem, as a function of the size of the input (usually measured in bits), using the most efficient algorithm. This allows problems to be classified into complexity classes (such as P and NP). An analogous analysis exists for space complexity, i.e. the memory used by the algorithm.
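As a toy illustration (not part of the original post), the following Python sketch counts the steps taken by two hypothetical procedures, showing why time complexity is expressed as a function of the input size n: one count grows linearly, the other quadratically.

```python
def linear_search_steps(items, target):
    """Count the comparisons a linear search performs: O(n) time."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            break
    return steps

def pairwise_steps(items):
    """Count the comparisons when every pair is examined: O(n^2) time."""
    steps = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            steps += 1
    return steps

n = 100
data = list(range(n))
print(linear_search_steps(data, n - 1))  # 100 steps: grows linearly with n
print(pairwise_steps(data))              # 4950 steps: grows as n*(n-1)/2
```

Doubling n roughly doubles the first count but quadruples the second - the essence of classifying problems by how cost scales with input size.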

In algorithmic information theory, the Kolmogorov complexity (also called descriptive complexity or algorithmic entropy) of a string is the length of the shortest binary program which outputs that string.
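Kolmogorov complexity is uncomputable in general, but a common practical stand-in is the length of a compressed encoding, which upper-bounds it. A minimal Python sketch using zlib (the variable names and the choice of proxy are illustrative, not a definition from the original text):

```python
import random
import zlib

def compressed_length(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a computable
    upper-bound proxy for the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

regular = "ab" * 500  # highly regular: a very short program prints it
random.seed(0)
irregular = "".join(random.choice("ab") for _ in range(1000))  # no obvious pattern

print(compressed_length(regular))    # small: the repetition compresses away
print(compressed_length(irregular))  # larger: less structure to exploit
```

Both strings are 1000 characters long, yet the regular one compresses to a fraction of the size - a rough, computable echo of the "shortest program" idea.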

In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.

In physical systems, complexity is a measure of the probability of the state vector of the system. This is often confused with entropy, but it is a distinct mathematical analysis of the probability of the state of the system, in which two distinct states are never conflated and considered equal, as they are in statistical mechanics.

In mathematics, Krohn-Rhodes complexity is an important topic in the study of finite semigroups and automata.

In the sense of how complicated a problem is from the perspective of the person trying to solve it, limits of complexity are measured using a term from cognitive psychology, namely the hrair limit.

Specified complexity is a term used in intelligent design theory, first coined by William Dembski.

Irreducible complexity is a term used in arguments against the generally accepted theory of biological evolution, being a concept popularized by the biochemist Michael Behe.

Unruly complexity denotes situations that do not have clearly defined boundaries, coherent internal dynamics, or simply mediated relations with their external context, as coined by Peter Taylor.

And now, ask yourself this: can I use any of these measures to study the evolution of a corporation, of air traffic, of a market? Can any of these 'measures' help identify a complex system and distinguish it from a 'simple system'?

www.ontonix.com

### Isn't Everything a 'Complex' System?

What distinguishes a theory from a conjecture? A characteristic constant (G, c, h, k, etc.), for example, or a fundamental equation. The so-called 'complexity theory' has neither. Most importantly, it lacks a measure of its most fundamental quantity - complexity. Worse still, it lacks a definition of complexity too! Increasing complexity is, by far, the most evident characteristic of most aspects of our lives. It is, therefore, quite correct to talk about complexity. It would be great to be able to manage it before it becomes a problem. But if you can't measure it, you can't manage it. Right?

If we accept the current 'definition' of a complex system, we can claim that all systems are complex. This 'definition' states that a system is complex if it is an aggregate of autonomous agents which spontaneously interact and self-organize, leading to more elaborate systems, etc., etc. You know, the usual 'the whole is greater than the sum of the parts' stuff. It is also stated, quite correctly, that it is impossible to infer the behaviour of the system from the properties of the agents that compose it. True. Analyzing a single human in depth reveals little about the dynamics of a society. Nothing new under the sun.

According to the above logic, all systems that surround us are 'complex':
• Atoms spontaneously form molecules
• Molecules spontaneously form crystals, proteins, etc.
• Proteins combine to form cells, which, in turn, form organs
• Humans form societies
• Grains of sand form dunes and landslides
• Flakes of snow combine to form avalanches
• Animals and plants form ecosystems
• Matter in the universe forms stars, which organize into galaxies
• Corporations form markets
• Molecules of water form drops, which, in turn, form waves in the ocean
• Electrical impulses in networks of neurons form thoughts, sensations, emotions, conscience, etc.
None of the above requires the outside orchestration of a Master Choreographer.

A closer look at life reveals that everything we see and experience is a 'complex system'. At this point, then, one may ask the following question: what benefit (for science and philosophy) stems from establishing a new name for a set of objects which already contains all objects?

## Tuesday, 8 October 2013

### EU Commission: Italy Has Highest Long-Term Sustainability in the EU

We've been saying it for a long time: Italy's economy is one of the most resilient in the EU. It may not have the best performance, but it has high robustness. Performance is one thing; robustness and sustainability are another.

Today the EU Commission confirms that, in the long run, Italy has the best Sustainability Index (see the figure above, taken from the EU Commission's Fiscal Sustainability Report 2012).

This seems paradoxical, to say the least. Italy, a G8 economy with a manufacturing industry second only to Germany's, has been bombarded by rating agencies, attacked by speculators and often indicted as the weakest link of the Eurozone. Why?

www.ontonix.com

### Nasdaq CFO Says Complexity is the Biggest Challenge to Market Success

In a recent article, the CFO of NASDAQ states that "complexity is the biggest challenge to market success". He also speaks of the complexity of financial products and of a complexity-reduction initiative. All of this can be put in place if and only if you measure complexity. Talking about it will not reduce it. Hope is not a strategy.

Today, the technology to measure complexity exists:

Assetdyne - www.assetdyne.com - to measure the complexity of stocks and financial products

You can only manage it if you can measure it. Resistance is futile.

## Monday, 7 October 2013

### Probability of Default Versus the Principle of Incompatibility

According to the Millennium Project, the biggest global challenges facing humanity are those illustrated in the image above. The image conveys a holistic message which some of us already appreciate: everything is connected with everything else. The economy isn't indicated explicitly in the image but, evidently, it's there, as are industry, commerce, finance, religion, etc. Indeed, a very complex scenario. The point is not to list everything but merely to point out that we live in a highly interconnected and dynamic world. We of course agree with the above picture.

As we have repeatedly pointed out in our previous articles, under similar circumstances:
• it is impossible to make predictions - in fact, even the current economic crisis (of planetary proportions) has not been forecast
• only very rough estimates can be attempted
• there is no such thing as precision
• it is impossible to isolate "cause-effect" statements as everything is linked
• optimization is unjustified - one should seek acceptable solutions, not pursue perfection
The well-known Principle of Incompatibility in fact states that "high precision is incompatible with high complexity". However, this fundamental principle, which applies to all facets of human existence as well as to Nature, goes unnoticed. Neglecting the Principle of Incompatibility constitutes a tacit and embarrassing admission of ignorance. One such example is that of ratings. While the concept of a rating lies at the very heart of our economy and, as a matter of principle, is a necessary concept and tool, something is terribly wrong. A rating, as we know, measures the Probability of Default (PoD). Ratings are stratified according to classes. One example of such classes is shown below:

| Class | PoD          |
|-------|--------------|
| 1     | ≤ 0.05%      |
| 2     | 0.05% - 0.1% |
| 3     | 0.1% - 0.2%  |
| 4     | 0.2% - 0.4%  |
| 5     | 0.4% - 0.7%  |
| 6     | 0.7% - 1.0%  |
| etc.  |              |

A rating affects the way the stocks of a given company are traded - this is precisely its function. What is shocking in the above numbers, however, is the precision (resolution). A PoD of 0.11% puts a company in class 3, while a PoD of 0.099% puts it in class 2. How can this be so? Isn't the world supposed to be a highly complex system? Clearly, if even a crisis of planetary proportions cannot be forecast, this not only points to high complexity (see the Principle of Incompatibility) but also says a lot about all the Business Intelligence technology used in economics, finance, management and decision-making. So, where does all this precision in ratings come from? From a parallel virtual universe of equations and numbers in which everything is possible but which, unfortunately, does not map well onto reality. The understanding of the real universe cannot be based on a parallel virtual universe which is incorrect.
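The boundary sensitivity can be made concrete with a minimal Python sketch, assuming the hypothetical class bounds from the stratification above (the function and mapping are illustrative, not any agency's actual method):

```python
# Upper bounds of the hypothetical rating classes, in percent.
BOUNDS = [0.05, 0.1, 0.2, 0.4, 0.7, 1.0]

def rating_class(pod_percent: float) -> int:
    """Map a Probability of Default (in %) to its rating class."""
    for cls, upper in enumerate(BOUNDS, start=1):
        if pod_percent <= upper:
            return cls
    return len(BOUNDS) + 1  # beyond the listed classes

print(rating_class(0.099))  # 2
print(rating_class(0.11))   # 3 -- a 0.011-point change in PoD moves the class
```

A difference of roughly one hundredth of a percentage point in an estimated probability flips the class - precision that the Principle of Incompatibility says a highly complex system cannot support.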

The above example of PoD stratification reflects very little understanding of Nature and of its mechanisms. In fact, economic crises of global proportions happen suddenly. As Aristotle wrote in his Nicomachean Ethics, an educated mind is distinguished by the fact that it is content with the degree of accuracy which the nature of things permits, and by the fact that it does not seek exactness where only approximation is possible.

www.ontonix.com

## Sunday, 6 October 2013

### Entropy, Structure and Critical Complexity

When a system grows and evolves, its complexity increases. Take a look at evolution in our biosphere to realize that this is true. If you want to accomplish more, you must become more complex. This involves two things: structure and entropy. Structure is what defines functionality; entropy is what allows a system to react in a creative and novel way to a changing and possibly harsh environment. In biology this is adaptation. When you get too much of either structure or entropy you're in trouble - you reach the so-called critical complexity and your fragility increases. You become exposed and vulnerable. Your ability to absorb more uncertainty (and still function) diminishes, as does your capacity to face extreme events.
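The post does not define entropy quantitatively; one standard choice is Shannon entropy, which measures the uncertainty in a distribution of symbols. A minimal Python sketch under that assumption (illustrative only, not Ontonix's actual metric):

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 -- pure structure, zero uncertainty
print(shannon_entropy("abab"))  # 1.0 -- two equally likely symbols
print(shannon_entropy("abcd"))  # 2.0 -- maximum uncertainty for four symbols
```

The two extremes mirror the trade-off in the text: an all-structure system (zero entropy) is rigid, while a maximum-entropy system has no structure left to define its functionality.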

Images from Purestform.

www.ontonix.com

## Saturday, 5 October 2013

### Monitoring Bank Process Complexity. In Real-Time

Check out this video on how to monitor bank process complexity using the DDD DataPicker and OntoNet.

www.ontonix.com