Sunday, 20 October 2013

Rating the Resilience and Complexity of Stock Portfolios



Assetdyne offers a unique service which enables users to measure the complexity and resilience of stock portfolios in real-time. The Assetdyne portal is connected to stock markets and makes portfolio building very simple: it is sufficient to type the ticker symbol of a security and click "Add to Portfolio". Once a portfolio has been built and stored, one may evaluate its complexity and resilience by simply clicking on the "Analyze" icon, as illustrated in the image below.




By clicking on the "analyze" icon as indicated by the blue arrow, the system performs an analysis of the Dow Jones Industrial Average Index based on the "Close" values of the securities which compose the index. These are listed in the "Ticker Symbols" column above. Once the analysis has been completed, the system pops-up a window with an interactive Business Structure Map (or Complexity Map) of the portfolio. This is shown below.



The Assetdyne system may be used not only to analyze actual stock portfolios but also industry sectors. The above list shows sectors such as Oil & Gas, IT, Automotive, Banks, etc. Based on how the corresponding stocks evolve, the system provides a reflection of an entire industry sector in terms of its complexity and resilience.

But why are complexity and resilience so important? This is why:

High complexity means difficulty in understanding the dynamics of a system, difficulty in making forecasts, more fragility, a higher likelihood of surprising behaviour, and more turbulence.

Resilience, the opposite of fragility, is the most important feature a system (portfolio) needs in order to survive the turbulence and high complexity of markets and of the economy in general. And today's markets and economy are extremely turbulent and dominated by uncertainty.

Some examples.

         




www.ontonix.com                www.assetdyne.com




Saturday, 12 October 2013

Complexity, Criticality and the Drake Equation



Frank Drake devised an equation to express the hypothetical number of observable civilizations in our galaxy: N = Rs nh fl fi fc L, where N is the number of civilizations in our galaxy, expressed as the product of six factors: Rs is the rate of star formation, nh is the number of habitable worlds per star, fl is the fraction of habitable worlds on which life arises, fi is the fraction of inhabited worlds with intelligent life, fc is the fraction of intelligent life forms that produce civilizations, and L is the average lifetime of such civilizations. But there is an evident paradox. According to the Drake equation, our galaxy should be populated by thousands of civilizations similar to ours. The number of stars that appear to be orbited by Earth-like planets increases on an almost daily basis. But if that is the case, where is everybody? Why are there no signs of their existence? Why does SETI fail to produce evidence that would support the Drake equation?
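The six-factor form of the equation quoted above can be evaluated directly. A minimal sketch, with parameter values that are purely illustrative guesses, not figures from the article:

```python
# Sketch of the six-factor Drake equation quoted above:
# N = Rs * nh * fl * fi * fc * L.
# All parameter values below are illustrative assumptions only.

def drake(Rs, nh, fl, fi, fc, L):
    """Hypothetical number of observable civilizations in the galaxy."""
    return Rs * nh * fl * fi * fc * L

# Example with commonly cited (but highly uncertain) guesses:
# 7 stars/year, 0.4 habitable worlds per star, optimistic fractions.
N = drake(Rs=7, nh=0.4, fl=0.5, fi=0.1, fc=0.1, L=10_000)
print(round(N, 6))  # -> 140.0
```

Every factor after Rs is a dimensionless fraction (or a count per star), so the whole estimate is dominated by the two most uncertain terms: fl and L, which is precisely where the paradox lives.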

In 1981, cosmologist Edward Harrison suggested a powerful self-regulating mechanism that would neatly resolve the paradox. Any civilization bent on the intensive colonization of other worlds would be driven by an expansive territorial impulse. But such an aggressive nature would be unstable in combination with the immense technological powers required for interstellar travel. Such a civilization would self-destruct long before it could reach for the stars. The unrestrained territorial drive that served biological evolution so well for millions of years becomes a severe liability for a species once it acquires powers more than sufficient for its self-destruction. The Milky Way may well contain civilizations more advanced than ours, but they must have passed through a filter of natural selection that eliminates, by war or other self-inflicted environmental catastrophes, those civilizations driven by aggressive expansion.

We propose an alternative explanation of the paradox. In the past, the Earth was populated by numerous disjoint civilizations that thrived almost in isolation: the Sumerians, the Mayas, the Incas, the Greeks, the Romans, and so on. If one or more of these civilizations happened to disappear, many more remained. The temporal and spatial correlation between civilizations was very limited. The Earth today, however, is populated by one single globalized civilization. If this one fails, that's it. As we know, the evolution and growth of a civilization manifests itself in an increase in complexity. The Egyptians, for example, deliberately chose not to evolve, and for many centuries they did not advance an inch. Such a static civilization is only possible in the presence of an extremely structured and rigid society. But any form of progress is accompanied by an increase in complexity (a mix of structure and entropy), until critical complexity is reached. Close to criticality, a system becomes fragile and therefore vulnerable. In order to continue evolving beyond critical complexity, a civilization must find ways of overcoming the delicate phase of vulnerability in which self-inflicted destruction is the most probable form of demise. It appears - see our previous articles - that our globalized society is now arguably headed for collapse and shall reach criticality around 2040-2045. What does this mean? If we fail to move past criticality, there will be no second chance; no other civilization will take over, at least not for millennia. Clearly, the biological lifetime of our species is likely to be several million years, even if we do our worst, but as far as technological progress is concerned, that will essentially be it. Based on our complexity metric and on the Second Law of Thermodynamics we can conclude that any world populated by multiple disjoint civilizations will always tend towards a single globalized society.
It appears that globalization is inevitable and this, in turn, accelerates the increase of complexity until criticality is reached.

We argue that the self-regulating mechanism that Harrison suggests ultimately stems from critical complexity. Only a civilization which is capable of evolving beyond criticality, in the presence of overwhelmingly powerful technology, can ever hope to reach for the stars. In other words, critical complexity is the hurdle that prevents evolution beyond self-inflicted extinction. Since none of the ancient (and not so ancient) civilizations ever evolved beyond critical complexity - in fact, they're all gone - they were all pre-critical civilizations. There has never been a post-critical civilization on Earth. The only one left that has a chance of becoming post-critical is ours. But what conditions must a civilization meet in order to transition beyond criticality? Essentially two. First, it must lay its hands on technology to actively manage complexity. Second, it must have enough time to employ it. The technology exists. Since 2005.



www.ontonix.com



 

The Not-That-Useful Definitions of Complexity



"Every few months seems to produce another paper proposing yet another measure of complexity, generally a quantity which can't be computed for anything you'd actually care to know about, if at all. These quantities are almost never related to any other variable, so they form no part of any theory telling us when or how things get complex, and are usually just quantification for quantification's own sweet sake". Read more in: http://cscs.umich.edu/~crshalizi/notebooks/complexity-measures.html. The above mentioned abundance of candidate complexity measures - a clear reflection of the rampant fragmentation in the field - is summarized in: http://en.wikipedia.org/wiki/Complexity as follows: In several scientific fields, "complexity" has a specific meaning:

In computational complexity theory, the time complexity of a problem is the number of steps that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm. This allows problems to be classified into complexity classes (such as P and NP); an analogous analysis exists for space, that is, the memory used by the algorithm.

In algorithmic information theory, the Kolmogorov complexity (also called descriptive complexity or algorithmic entropy) of a string is the length of the shortest binary program which outputs that string.

In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.

In physical systems, complexity is a measure of the probability of the state vector of the system. This is often confused with entropy, but it is a distinct mathematical analysis of the probability of the state of the system, in which two distinct states are never conflated and considered equal, as they are in statistical mechanics.

In mathematics, Krohn-Rhodes complexity is an important topic in the study of finite semigroups and automata.

In the sense of how complicated a problem is from the perspective of the person trying to solve it, limits of complexity are measured using a term from cognitive psychology, namely the hrair limit.
Specified complexity is a term used in intelligent design theory, first coined by William Dembski.

Irreducible complexity is a term used in arguments against the generally accepted theory of biological evolution, being a concept popularized by the biochemist Michael Behe.

Unruly complexity denotes situations that do not have clearly defined boundaries, coherent internal dynamics, or simply mediated relations with their external context, as coined by Peter Taylor.
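Of the definitions listed above, Kolmogorov complexity is uncomputable in general, but the length of a compressed encoding gives a computable upper-bound proxy, which at least allows a quick experiment: regular strings compress well, random-looking ones do not. A sketch using Python's standard zlib (the strings are made up for illustration):

```python
# Compressed length as a computable upper-bound proxy for Kolmogorov
# complexity: a highly regular string compresses far better than a
# pseudo-random one of the same length.
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes of the zlib-compressed encoding of `data`."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500                                    # 1000 bytes, very regular
rng = random.Random(0)                                   # seeded for reproducibility
noisy = bytes(rng.randrange(256) for _ in range(1000))   # 1000 pseudo-random bytes

print(compressed_size(regular) < compressed_size(noisy))  # -> True
```

This proxy shares the weakness Shalizi complains about: it bounds descriptive complexity from above but says nothing about when or why a system becomes complex.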

And now, ask yourself this: can I use any of these measures to study the evolution of a corporation, of air traffic, of a market? Can any of these 'measures' help identify a complex system and distinguish it from a "simple" system?



www.ontonix.com


 

Isn't Everything a 'Complex' System?



What distinguishes a theory from a conjecture? For example, a characteristic constant (G, c, h, K, etc.) or a fundamental equation. The so-called 'complexity theory' has none. Most importantly, it lacks a measure of its most fundamental quantity - complexity. Worse than that, it lacks a definition of complexity too! Increasing complexity is, by far, the most evident characteristic of most aspects of our lives. It is, therefore, quite correct to talk about complexity. It would be great to be able to manage it before it becomes a problem. But if you can't measure it, you can't manage it. Right?

If we accept the current 'definition' of a complex system, we can claim that all systems are complex. This 'definition' states that a system is complex if it is an aggregate of autonomous agents which spontaneously interact and self-organize, leading to more elaborate systems, etc. You know, the usual 'the whole is greater than the sum of the parts' stuff. It is also stated, quite correctly, that it is impossible to infer the behaviour of the system from the properties of the agents that compose it. True. Analyzing a single human in depth will hint little at the dynamics of a society. Nothing new under the sun.

According to the above logic, all systems that surround us are 'complex':
  • Atoms spontaneously form molecules
  • Molecules spontaneously form crystals, proteins, etc.
  • Proteins combine to form cells, which, in turn, form organs
  • Humans form societies
  • Grains of sand form dunes and landslides
  • Flakes of snow combine to form avalanches
  • Animals and plants form ecosystems
  • Matter in the universe forms stars, which organize into galaxies
  • Corporations form markets
  • Molecules of water form drops, which, in turn, form waves in the ocean
  • Electrical impulses in networks of neurons form thoughts, sensations, emotions, conscience, etc. 
None of the above require outside orchestration of a Master Choreographer.

A closer look at life reveals that everything we see and experience is a 'complex system'. At this point, then, one may ask the following question: what benefit (for science and philosophy) stems from establishing a new name for a set of objects which already contains all objects?




 

Tuesday, 8 October 2013

EU Commission: Italy Has Highest Long-Term Sustainability in the EU


We've been saying it for a long time: Italy's economy is one of the most resilient in the EU. It may not have the best performance but it has high robustness. Performance is one thing; robustness and sustainability are another.

Today, it is the EU Commission that confirms that, in the long run, Italy has the best Sustainability Index (see the above figure) - see the EU Commission's Fiscal Sustainability Report 2012, from which the above graph is taken.

This seems paradoxical, to say the least. Italy, a G8 economy with a manufacturing industry that is second only to that of Germany, has been bombarded by rating agencies, attacked by speculators and often indicted as the weakest link of the Eurozone. Why?


www.ontonix.com



Nasdaq CFO Says Complexity is the Biggest Challenge to Market Success


In a recent article, the CFO of NASDAQ states that "Complexity is the Biggest Challenge to Market Success". He also speaks of the complexity of financial products and of a complexity reduction initiative. All this can be put in place if and only if you measure complexity. Talking about it will not reduce it. Hope is not a strategy.

Today, the technology to measure complexity exists:


Assetdyne - www.assetdyne.com - to measure the complexity of stocks and financial products

RateABusiness - www.rate-a-business.com - to measure the complexity of a business


You can only manage it if you can measure it. Resistance is futile.

Monday, 7 October 2013

Probability of Default Versus the Principle of Incompatibility







According to the Millennium Project, the biggest global challenges facing humanity are those illustrated in the image above. The image conveys a holistic message which some of us already appreciate: everything is connected with everything else. The economy isn't indicated explicitly in the above image but, evidently, it's there, just as industry, commerce, finance, religions, etc. are. Indeed, a very complex scenario. The point is not to list everything but merely to point out that we live in a highly interconnected and dynamic world. We of course agree with the above picture.

As we have repeatedly pointed out in our previous articles, under similar circumstances:
  • it is impossible to make predictions - in fact, even the current economic crisis (of planetary proportions) has not been forecast
  • only very rough estimates can be attempted
  • there is no such thing as precision
  • it is impossible to isolate "cause-effect" statements as everything is linked
  • optimization is unjustified - one should seek acceptable solutions, not pursue perfection
The well-known Principle of Incompatibility states, in fact, that "high precision is incompatible with high complexity". However, this fundamental principle, which applies to all facets of human existence, as well as to Nature, goes unnoticed. Neglecting the Principle of Incompatibility constitutes a tacit and embarrassing admission of ignorance. One such example is that of ratings. While the concept of a rating lies at the very heart of our economy and, from the point of view of principle, is a necessary concept and tool, something is terribly wrong. A rating, as we know, measures the Probability of Default (PoD). Ratings are stratified according to classes. One example of such classes is shown below:

Class    PoD
1        ≤ 0.05%
2        0.05% - 0.1%
3        0.1% - 0.2%
4        0.2% - 0.4%
5        0.4% - 0.7%
6        0.7% - 1.0%
etc.

A rating affects the way the stocks of a given company are traded - this is precisely its function. What is shocking in the above numbers, however, is the precision (resolution). A PoD of 0.11% puts a company in class 3, while a PoD of 0.099% puts it in class 2. How can this be so? Isn't the world supposed to be a highly complex system? Clearly, if even a crisis of planetary proportions cannot be forecast, this not only points to high complexity (see the Principle of Incompatibility) but it also says a lot about all the Business Intelligence technology that is used in economics, finance, management and decision making. So, where does all this precision in ratings come from? From a parallel virtual universe of equations and numbers in which everything is possible but which, unfortunately, does not map well onto reality. But the understanding of the real universe cannot be based on a parallel virtual universe which is incorrect.
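The boundary effect produced by the stratification above can be sketched in a few lines. The class boundaries follow the table; the classifier itself is a hypothetical illustration, not an actual rating-agency procedure:

```python
# Hypothetical sketch of the PoD stratification shown in the table above.
# Upper class boundaries are in percent; classes are numbered 1, 2, 3, ...
from bisect import bisect_left

BOUNDS = [0.05, 0.1, 0.2, 0.4, 0.7, 1.0]  # upper bound of each class, in %

def rating_class(pod_percent: float) -> int:
    """Map a Probability of Default (in %) to its rating class."""
    return bisect_left(BOUNDS, pod_percent) + 1

# A difference of ~0.01 percentage points moves a company across classes:
print(rating_class(0.099))  # -> 2
print(rating_class(0.11))   # -> 3
```

The point of the sketch is exactly the fragility of the scheme: a change in PoD far below any plausible estimation error flips the class, and with it the way the stock trades.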

The above example of PoD stratification reflects very little understanding of Nature and of its mechanisms. In fact, economic crises of global proportions happen suddenly. As Aristotle wrote in his Nicomachean Ethics: an educated mind is distinguished by the fact that it is content with that degree of accuracy which the nature of things permits, and by the fact that it does not seek exactness where only approximation is possible.


www.ontonix.com


Sunday, 6 October 2013

Entropy, Structure and Critical Complexity





When a system grows and evolves, its complexity increases. Take a look at evolution in our biosphere to realize that this is true. If you want to accomplish more you must become more complex. This means two things: structure and entropy. Structure is what defines functionality; entropy is what allows a system to react in a creative and novel way to a changing and possibly harsh environment. In biology this is adaptation. When you get too much of either structure or entropy you're in trouble - you reach the so-called critical complexity and your fragility increases. You become exposed and vulnerable. Your ability to absorb more uncertainty (and still function) diminishes, as does your capacity to face extreme events.
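The entropy ingredient mentioned above can be illustrated with the textbook Shannon entropy of a discrete distribution. Note that this is a generic information-theoretic measure, not Ontonix's proprietary complexity metric; the two distributions below are made up:

```python
# Shannon entropy (in bits) of a discrete probability distribution:
# zero for a fully ordered (rigid) system, maximal for a system that
# spreads its probability evenly across all states.
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits; zero-probability states contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

rigid    = [1.0, 0.0, 0.0, 0.0]      # all structure, no entropy
flexible = [0.25, 0.25, 0.25, 0.25]  # maximum entropy for 4 states

print(shannon_entropy(rigid))     # -> 0.0
print(shannon_entropy(flexible))  # -> 2.0
```

In the post's terms: the rigid distribution is pure structure (perfectly predictable, no room to adapt), while the flexible one carries the maximum entropy that four states allow.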

Images from Purestform.


www.ontonix.com



Saturday, 5 October 2013

Monitoring Bank Process Complexity. In Real-Time



Check out this video on how to monitor bank process complexity using the DDD DataPicker and OntoNet.


www.ontonix.com



Sunday, 29 September 2013

Crisis Anticipation



Complexity technology establishes a radically innovative means of anticipating crises. Systems under severe stress, or on a path to collapse, either undergo rapid complexity fluctuations or exhibit a consistent growth of complexity. If complexity is not measured, these precious crisis precursors go unnoticed. Conventional methods are unable to identify such precursors.

How does complexity-based crisis anticipation work? You simply measure and track business complexity (your own or your clients'), and look out for any sudden changes or even slow but consistent drifts. This technique provides the basis for a rational and holistic crisis-anticipation system for decision-makers, investors, managers, and policy-makers. Essentially, the system buys you time, the most precious resource you have.
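The tracking idea described above - flag sudden changes as well as slow, consistent drifts in a measured complexity series - can be sketched as follows. The jump threshold, drift window and data are illustrative assumptions, not Ontonix parameters:

```python
# Minimal sketch of complexity-based precursor detection: scan a series of
# complexity measurements and flag (a) sudden jumps and (b) stretches where
# complexity has risen monotonically over a window.

def precursors(series, jump=5.0, drift_window=4):
    """Return (index, kind) alerts for jumps and consistent upward drifts."""
    alerts = []
    for i in range(1, len(series)):
        if abs(series[i] - series[i - 1]) >= jump:
            alerts.append((i, "sudden change"))
        elif i >= drift_window and all(
            series[j] > series[j - 1] for j in range(i - drift_window + 1, i + 1)
        ):
            alerts.append((i, "consistent drift"))
    return alerts

complexity = [20, 21, 20, 22, 23, 24, 25, 33]  # slow drift, then a jump
print(precursors(complexity))  # -> [(6, 'consistent drift'), (7, 'sudden change')]
```

A real deployment would of course tune the threshold and window to the system being monitored; the value of the approach lies in the fact that both kinds of precursor are invisible unless complexity is measured in the first place.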

Our complexity-based crisis anticipation functions in real-time and may be applied to:
  • Corporations
  • Banks (in this case we indicate clients who may be defaulting)
  • Asset portfolios
  • Customer-retention
  • Process plants
  • Traffic systems
  • IT systems

Be warned of problems before it is too late.

Read article.


Contact us at info@ontonix.com for more information.


www.ontonix.com


 

Saturday, 28 September 2013

Measuring Processes in Banks Using the DDD DataPicker and OntoNet


Ontonix and PRB have integrated OntoNet™, the World's first real-time Quantitative Complexity Management engine, into PRB's DDD DataPicker™ system. The DDD DataPicker™ system is an advanced and configurable platform for document, process and workflow management which is used mainly in banks to monitor a multitude of processes. Integration of OntoNet™ with the DDD system allows its users to measure the complexity of various processes in real-time and to quickly identify those that are excessively complex and therefore reduce process efficiency. Moreover, the system allows users to identify which phases of a particular process are responsible for high complexity, indicating quickly where to intervene.

The following slide illustrates the dashboard showing the "Credit Management" process and its various phases. Without going into the details, the various dials on the dashboard indicate process simplicity (the complement of complexity) from a process-management standpoint (0% - low simplicity, hard to manage; 100% - high simplicity, easy to manage). The color of the dials, on the other hand, indicates process robustness (green = robust, red = fragile).



Clicking on any of the above dials opens a window which illustrates the three highest contributors to the complexity of a particular phase of a given process, and produces the so-called Complexity Profile (i.e. a breakdown into components).



Finally, each curve may be navigated interactively, enabling users to identify quickly periods of high complexity and/or low process robustness and to identify the causes.


The objective, of course, is to make processes more robust (stable and repeatable) as well as more efficient. The final goal is to cut costs without sacrificing efficiency and customer satisfaction. More soon.



www.ontonix.com




Crowdrating Systems of Banks Using Stockmarkets


Assetdyne, the London-based company which has introduced for the first time the concepts of complexity and resilience to stock and stock portfolio analysis and design, has recently analyzed systems of banks, namely those of Brazil, Singapore, Australia and Israel, as well as top European banks. The way this is done is to assemble portfolios of the said banks and to treat them as systems (which, in reality, they are!). The results are provided with comments.

Brazil



Singapore



Australia



Israel


 European banks


Similar analyses may be run free of charge at Assetdyne's website. As the analyses are performed on the daily Close values of the corresponding stocks, the above indicated values of complexity and resilience may also change on a daily basis.


www.assetdyne.com



Monday, 23 September 2013


Is Risk Management a Source of Risk?





The deployment of risk management within a business can be a source of false assurance.

Over recent years, businesses have become more and more reliant on increasingly complex modelling processes to predict outcomes, to the point that, in many cases, businesses have lost sight of what risk management is all about - and at the same time, risk management lost sight of what the business was all about. Increasingly, I have seen risk management services being deployed in large institutions by the 'big four' consultancy firms, and to keep their huge costs down, they end up with newly qualified consultants - mid-twenties, bright young things, but I'm sorry, they often don't have the faintest idea what your business does. They have insufficient real-world experience to permit effective dissemination of risk knowledge.

I worked with one lovely young lady recently in a banking environment. Very intelligent - but she did not have the first clue of what the business was about. She made assumptions, and those assumptions led the business down some long, dark alleys.

If you have a risk function, however, that fully understands the business model, the deployment of its operational strategy, the sector the business operates in and the macro-economic and socio-political environment in which it operates, then it will be able to provide risk information that is relevant to the business, and can be understood by the business.

My hope going into this recession was that businesses would learn from this period in time, and take a more realistic, holistic view of the world. Worryingly, what I see is "more of the same".

I see financial institutions that have - on the face of it - bolstered their risk functions, but in doing so have allowed them to become ever more 'siloed' and fractured in their approach. This can only lead to disaster, in my view. The left hand will not know what the right hand is doing - no one owns anything, no one is responsible, no one is accountable.

So, I think the deployment of risk management has been a source of risk, but I don't think the dramas are over yet. There is a second wave of failure yet to hit, unless businesses can swallow the pill and take the right approach.

Posted by Andrew Bird, Managing Director at Nile Blue and freelance business consultant.




Saturday, 21 September 2013


Measuring the Magnitude of a Crisis




How can you measure the magnitude of an economic crisis? By the number of lost jobs, foreclosures, GDP drop, number of defaulting banks and corporations, deflation? Or by the drop in stock-market indices? All these parameters do indeed reflect the severity of a crisis. But how about a single holistic index which takes them all into account? This index is complexity and, in particular, its variation. Let us examine, for example, the US sub-prime crisis. The complexity of the US housing market in the period 2004-2009 is illustrated in the above plot. A total of fifty market-specific parameters have been used to perform the analysis, in addition to fifteen macroeconomic indicators such as the ones mentioned above. The "bursting bubble" manifests itself via a complexity increase from a value of approximately 19 to around 32. With respect to the initial value this means an increase of approximately 68%. The arrow in the above plot indicates this jump in complexity, and this number represents a systemic measure of how profound the US housing-market crisis is.

In summary, the magnitude of a crisis can be measured as follows:

M = | C_i - C_f | / C_i

where C_i is the value of complexity before the crisis and C_f the value during the crisis. The intensity of a crisis can be measured as the rate of change of complexity.
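The magnitude formula translates directly into code. A minimal sketch, applied to the housing-market figures quoted above (C_i ≈ 19, C_f ≈ 32):

```python
# Direct implementation of the crisis-magnitude formula above:
# M = |C_i - C_f| / C_i, the relative change in measured complexity.

def crisis_magnitude(c_initial: float, c_final: float) -> float:
    """Relative complexity change between pre-crisis and in-crisis values."""
    return abs(c_initial - c_final) / c_initial

M = crisis_magnitude(19, 32)
print(f"{M:.0%}")  # -> 68%
```

The intensity mentioned in the text would be the time derivative of the same complexity series rather than a single before/after ratio.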

Serious science starts when you begin to measure.




Friday, 20 September 2013

Complexity: The Fifth Dimension





When complexity is defined as a function of structure, entropy and granularity, examining its dynamics reveals its fantastic depth and phenomenal properties. The process of complexity computation materializes in a particular mapping of a state vector onto a scalar. What is surprising is how a simple process can enshroud such an astonishingly rich spectrum of features and characteristics. Complexity does not possess the properties of an energy, and yet it expresses the "life potential" of a system in terms of the modes of behaviour it can deploy. In a sense, complexity, the way we measure it, reflects the fitness of an autonomous dynamical system that operates in a given Fitness Landscape. This statement by no means implies that higher complexity leads to higher fitness. In fact, our research shows the existence of an upper bound on the complexity a given system may attain. We call this limit critical complexity. We know that in the proximity of this limit the system in question becomes delicate and fragile, and operation close to this limit is dangerous. There surely exists a "good value" of complexity - corresponding to a fraction, β, of the upper limit - that maximizes fitness:

C_max_fitness = β C_critical

We don't know what the value of β is for a given system and we are not sure how it may be computed. However, we think that the fittest systems are able to operate around a good value of β. Fit systems can potentially deploy a sufficient variety of modes of behaviour so as to respond better to a non-stationary environment (ecosystem). The dimension of the modal space of a system ultimately equates to its degree of adaptability. Close to critical complexity the number of modes, as we observe, increases rapidly but, at the same time, the probability of spontaneous (and unwanted) mode transitions also increases quickly. This means the system can suddenly undertake unexpected and potentially self-compromising actions (just like adolescent humans).





Wednesday, 18 September 2013

Stocks, Crowdrating and the Democratization of Ratings


We have always claimed that the process of rating a business and its state of health should be more transparent, objective and affordable, even for the smallest of companies. With this goal in mind Ontonix has launched the World's first do-it-yourself rating system - Rate-A-Business - which allows any individual to upload the financials of a company and to obtain, in a matter of seconds, a measure of its state of health. The system works, of course, for both publicly listed as well as private companies. Essentially, the tool shifts rating from a duopoly of huge Credit Rating Agencies to the Internet, the World's central nervous system. Rating becomes, de facto, a commodity. In order to make our global economy healthier, and to reduce the impact of future crises, it is paramount to transform rating from a luxury into a commodity. Today, it is possible to know one's cholesterol levels, for example, for just a few dollars. The information is not reserved for the rich. Similarly, the rating of a business - its state of health, or resilience - is something that every company should know, even the tiniest SME. This is the philosophy that has driven Ontonix to develop Rate-A-Business.

Assetdyne takes things even further. The company also provides a real-time rating system. Even though the system developed by Assetdyne focuses on publicly listed companies and portfolios of their stocks, it too introduces a fundamentally new element into the process of rating - the so-called crowd-rating. The value of the stock of a company is the result of a complex interplay of millions of traders, analysts, investors, trading robots, etc. Ultimately, it is a reflection of the reputation and perceived value of a particular company and is the result of a democratic process. Clearly, the value of a stock is also driven by market trends, sector analyses, rumors, insider trading and other illicit practices and, evidently, by the Credit Rating Agencies themselves. However, undeniably, it is the millions of traders who ultimately drive the price and dynamics of stocks according to the basic principles of supply and demand. In practice, we're talking of a planet-wide democratic process of crowd-rating - it is the crowd of traders and investors that decides how much you pay for a particular stock.

What Assetdyne does is to use the information on the value and dynamics of the price of stocks to actually compute a rating. The rating that is computed does not reflect the Probability-of-Default (PoD) of a particular company - this is the popular "AAA" kind of rating - it reflects the "resilience" of a given stock (and hence of the company behind it). Resilience is the capacity to resist shocks, a frequent phenomenon in our turbulent economy. Resilience, besides being a very useful measure of the state of health of any kind of system, not just of a corporation, possesses one very important characteristic - its computation is based on the measure of complexity. It so happens that complexity is the hallmark of our economy, of our times. The rating system developed by Assetdyne delivers, therefore, the following additional information:

Stock Complexity - this measures how "chaotic" the evolution of a stock is. In other words, it is an advanced measure of volatility. Complexity is measured in bits, so the complexity values of different stocks may be compared directly.

Stock Resilience - this measures how well the stock price reacts to shocks and extreme events. Values range from 0% to 100%.

As the computation of the complexity and resilience of a stock is based on closing values at the end of each trading day, the corresponding values also change on a daily basis.
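The precise complexity metric is Assetdyne's proprietary technology, but the idea of a bit-valued measure of how "chaotic" a price series is can be sketched with a simple stand-in: the Shannon entropy of discretized daily returns. This is an illustrative proxy only, not the actual Stock Complexity computation; the bin count and the synthetic series below are assumptions.

```python
import numpy as np

def return_entropy_bits(closes, n_bins=10):
    """Shannon entropy (in bits) of discretized daily returns.

    A simple proxy for how "chaotic" a price series is: returns that
    spread evenly across many bins score more bits than returns that
    cluster in a narrow band.
    """
    closes = np.asarray(closes, dtype=float)
    returns = np.diff(closes) / closes[:-1]      # daily simple returns
    counts, _ = np.histogram(returns, bins=n_bins)
    p = counts[counts > 0] / counts.sum()        # empirical probabilities
    return float(-(p * np.log2(p)).sum())

flat = [100.0] * 250                             # constant price: zero bits
rng = np.random.default_rng(0)
noisy = 100 * np.cumprod(1 + rng.normal(0.0, 0.02, 250))
print(return_entropy_bits(flat))                 # 0.0
print(return_entropy_bits(noisy))                # positive, at most log2(10) bits
```

Because the measure is in bits, the values of two different stocks can be placed on the same scale and compared, as the text notes.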

An example is illustrated below.




Assetdyne's rating system is also applicable to portfolios of stocks. The example below illustrates a small portfolio of oil & gas companies.




An important aspect of this particular rating technique is that it is not based on financial reports (Balance Sheets, Income Statements, Cash Flow statements, etc.), which are of a highly subjective nature. Companies tend to construct their statements so as to paint a more optimistic picture, and conventional PoD-type ratings are inevitably influenced by this. While financial statements and the resulting PoD ratings are subjective (recall the multitude of triple-A-rated companies that defaulted all of a sudden, triggering the current crisis) to the point that governments have sued Credit Rating Agencies, stocks represent a considerably more objective reflection of the real state of affairs. Most importantly, the information is known to everyone. Of course, markets are not always right and the price may be wrong, but the process of converging to a given price is as objective and democratic as things in this world can get.

One could conclude that the World's stock markets constitute one huge social network which plays a global game called trading. As the game is played, one of its outcomes is the price of stocks. The price may be "wrong", it may be manipulated, but it is what it is. It is the result of the mentioned crowd-rating, and Assetdyne uses it to provide important new information: the complexity and resilience rating of stocks and portfolios. Innovation in finance is possible.


www.assetdyne.com




Tuesday, 17 September 2013

Complexity Introduced to Stock and Portfolio Analysis and Design


Modern Portfolio Theory (MPT) was introduced in 1952 by Markowitz. As described in Wikipedia, "MPT is a theory of finance that attempts to maximize portfolio expected return for a given amount of portfolio risk, or equivalently minimize risk for a given level of expected return, by carefully choosing the proportions of various assets. Although MPT is widely used in practice in the financial industry and several of its creators won a Nobel memorial prize for the theory, in recent years the basic assumptions of MPT have been widely challenged by fields such as behavioral economics.

MPT is a mathematical formulation of the concept of diversification in investing, with the aim of selecting a collection of investment assets that has collectively lower risk than any individual asset. This is possible, intuitively speaking, because different types of assets often change in value in opposite ways. For example, to the extent prices in the stock market move differently from prices in the bond market, a collection of both types of assets can in theory face lower overall risk than either individually. But diversification lowers risk even if assets' returns are not negatively correlated—indeed, even if they are positively correlated.

More technically, MPT models an asset's return as a normally distributed function (or more generally as an elliptically distributed random variable), defines risk as the standard deviation of return, and models a portfolio as a weighted combination of assets, so that the return of a portfolio is the weighted combination of the assets' returns. By combining different assets whose returns are not perfectly positively correlated, MPT seeks to reduce the total variance of the portfolio return. MPT also assumes that investors are rational and markets are efficient."
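To make the variance minimization that MPT prescribes concrete, here is a minimal textbook sketch (this illustrates standard MPT, not Assetdyne's complexity-based method): the unconstrained minimum-variance portfolio has the closed-form weights w = C⁻¹1 / (1ᵀC⁻¹1), where C is the covariance matrix of asset returns. The covariance figures below are invented for illustration.

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form minimum-variance portfolio: w = C^-1 1 / (1' C^-1 1).

    Weights sum to 1; short positions (negative weights) are allowed,
    as in the unconstrained textbook formulation.
    """
    cov = np.asarray(cov, dtype=float)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # solve C x = 1, then normalize
    return w / w.sum()

# Three assets: the low-volatility, weakly correlated asset dominates.
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
print(w)                 # largest weight on the first (calmest) asset
print(w @ cov @ w)       # portfolio variance, below any single asset's
```

The resulting portfolio variance is never larger than that of the least volatile single asset, which is the diversification effect the quoted passage describes.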

Since 1952 the world has changed. It has changed even more in the past decade, when complexity and turbulence have made their permanent entry on the scene. Turbulence and complexity are not only the hallmarks of our times, they can be measured, managed and used in the design of systems, in decision-making and, of course, in asset portfolio analysis and design.

Assetdyne is the first company to have incorporated complexity into portfolio analysis and design. In fact, the company develops a system which computes the Resilience Rating of stocks and stock portfolios based on complexity measures, not on variance or other traditional approaches. While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios based on the minimization of portfolio complexity. The approach rests on the fact that excessively complex systems are inherently fragile. Recently concluded research confirms that this is also the case for asset portfolios.


Two examples of the Resilience Rating of a single stock are illustrated below:




An example of a Resilience Rating of a portfolio of stocks is shown below (Top European banks are illustrated as an interacting system):


while the rating and complexity measures are the following:


The interactive map of the EU banks may be navigated on-line here.


For more information, visit Assetdyne's website.



Sunday, 15 September 2013

If You Really Need to Optimize


Optimal solutions are fragile and should generally be avoided. This unpopular statement enjoys substantial practical and philosophical argumentation and now, thanks to complexity, we can be even more persuasive. However, this short note is about making optimisation a bit easier. If you really insist on pursuing optimality, there is an important point to keep in mind.

Let us examine the case illustrated below: designing a composite conical structure in which the goal is to keep mass under control, as well as the fluxes and axial and lateral frequencies. The System Map shown below reflects which (red) design variables (ply thicknesses) influence the performance of the structure in the nominal (initial) configuration, prior to optimisation. In addition, the map also illustrates how the various outputs (blue nodes) relate to each other. 




In fact, one may conclude that, for example, the following relationships exist:
  • t_20013 - Weight
  • Weight - Axial frequency
  • Min Flux - Max Flux
  • t_20013 controls the Lateral Frequency
  • t_20013 also controls the Axial Frequency
  • Lateral Frequency and Axial Frequency are related to each other
  • etc.
As one may conclude, the outputs are tightly coupled: if you change one you cannot avoid changing the others. Let's first see how optimisation is handled when one faces multiple - often conflicting - objectives:

minimise y = COST(y_1, y_2, ..., y_n), where y_k stands for the k-th performance descriptor (e.g. mass, stiffness, etc.). In many cases weights are introduced as follows:

minimise y = COST(w_1*y_1, w_2*y_2, ..., w_n*y_n).

The fundamental problem with such a formulation (and all similar MDO-type formulations) is that the various performance descriptors are often dependent (just as the example above indicates) and the analyst doesn't know it. The cost function indicated above is a mathematical statement of a conflict, in which the y's compete for dominance. This competition is driven by an optimisation algorithm which knows nothing of the structure of the corresponding System Map or of the relationships contained therein. Imagine, for example, that you are trying to reduce one variable (e.g. mass) and increase, at the same time, another (e.g. frequency). Suppose also that you don't know that these two variables are strongly related to each other: the relationship typically looks like f = SQRT(k/m). Here f and m, both outputs of the problem, are related - changing one modifies the other. This is inevitable. In a more intricate situation, in which hundreds of design variables are involved, along with tens or hundreds of performance descriptors, the problem becomes numerically tough and the optimisation algorithm has a very hard time. What is the solution?
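The hidden coupling is easy to demonstrate. In this minimal sketch (the stiffness value K and the target frequency are illustrative assumptions), a weighted-sum cost asks the optimiser to lower mass and hit a target frequency, yet f = sqrt(k/m) means the two descriptors are the same degree of freedom:

```python
import math

K = 1.0e6   # stiffness (illustrative value, not from a real structure)

def axial_frequency(mass):
    """The outputs are coupled: f = sqrt(k/m) ties frequency to mass."""
    return math.sqrt(K / mass)

def cost(mass, w_mass=1.0, w_freq=1.0, f_target=120.0):
    """A weighted-sum cost of the kind criticised above: it asks the
    optimiser to lower mass AND push frequency toward f_target,
    unaware that the two descriptors move together."""
    f = axial_frequency(mass)
    return w_mass * mass + w_freq * (f_target - f) ** 2

# Halving the mass automatically raises the frequency by sqrt(2):
print(axial_frequency(100.0))   # 100.0
print(axial_frequency(50.0))    # ~141.42
```

Whatever weights are chosen, the algorithm is fighting a physical identity, not trading off two independent quantities, which is exactly why such problems become numerically tough.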

If you cannot avoid optimisation, then we suggest the following approach:
  • Define your baseline design.
  • Run a Monte Carlo Simulation, in which you randomly perturb the design variables (inputs).
  • Process the results using OntoSpace, obtaining the System Map.
  • Find the INDEPENDENT outputs (performance descriptors) or, in case there aren't any, those outputs which have the lowest degree in the System Map. There are tools in OntoSpace that help to do this.
  • Build your cost function using only those variables, leaving the others out.
This approach "softens" the problem from a numerical point of view and reduces the mentioned conflicts between output variables. Attempting to formulate a multi-disciplinary problem without knowing a priori how the various disciplines interact (i.e. without the System Map) is risky, to say the least.
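The steps above can be sketched as follows. OntoSpace and its System Map are proprietary, so this stand-in uses a plain correlation graph over Monte Carlo output samples to rank outputs by how many others they couple to; the variable names, threshold and toy physics are all illustrative assumptions.

```python
import numpy as np

def low_coupling_outputs(outputs, threshold=0.5):
    """Rank outputs by their 'degree' in a correlation graph -- a crude
    stand-in for the System Map built by OntoSpace (whose actual,
    complexity-based structure is not reproduced here).

    outputs: dict name -> (n_samples,) array of Monte Carlo output samples
    Returns the output names sorted so the least-coupled come first.
    """
    names = list(outputs)
    Y = np.column_stack([outputs[n] for n in names])
    corr = np.corrcoef(Y, rowvar=False)
    # degree = number of OTHER outputs this one is strongly tied to
    degree = (np.abs(corr) > threshold).sum(axis=0) - 1
    order = np.argsort(degree, kind="stable")
    return [names[i] for i in order]

# Toy example: mass and frequency are coupled (f = sqrt(k/m)), flux is not.
rng = np.random.default_rng(1)
t = rng.uniform(1.0, 2.0, 500)          # randomly perturbed ply thickness
mass = 10.0 * t
freq = np.sqrt(1.0e6 / mass)
flux = rng.normal(0.0, 1.0, 500)        # an output independent of the rest
ranked = low_coupling_outputs({"mass": mass, "freq": freq, "flux": flux})
print(ranked[0])                        # the least-coupled output
```

The least-coupled outputs are the ones to keep in the cost function; the others ride along through the couplings anyway.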






www.design4resilience.com


Saturday, 14 September 2013

Optimality: The Recipe for Disaster



There still seems to be a rush towards optimal design. But there is no better way to fragility and vulnerability than the pursuit of peak performance and perfection - optimality, in other words. Let's take a look at the logic behind this risky and outdated practice:
  •  Based on a series of assumptions, a math model of a system/problem is built.
  •  Hence, we already have the first departure from reality: a model is just a model.
  •  If you're good, really good, the model will "miss" 10% of reality.
  •  You then squeeze peak performance out of this model according to some objective function.
  •  You then manufacture the real thing based on what the model says.
It is known - obviously not to all - that optimal designs may be well-behaved with respect to random variations in the design parameters but, at the same time, hyper-sensitive to small variations in the variables that have been left out in the process of building the model. This is precisely what happens - you design for or against something, but you miss that something else. By sweeping seemingly innocent variables under the carpet, you're deciding a priori what the physics will be like. And that you cannot do. Now, if your model isn't forced to be optimal - to go to the limit - it might stand a better chance, in that it will have room for manoeuvre. When you're optimal, you can only get worse! If you're standing on the peak of a mountain, the only way is down! Why, then, even attempt to design and build systems that are optimal and can only get worse? Is this so difficult to see? Sounds like the Emperor's new clothes, doesn't it?



www.design4resilience.com


www.ontonix.com


Friday, 13 September 2013

Robustness and Rating of System of Europe's Top Banks


Based on Close values of the stocks of Europe's top banks, we have rated them as a system of interacting systems. The Business Structure Map is represented above, while the resilience rating and complexity measures are indicated below:



Finally, the Complexity Profile, which ranks each bank in terms of its complexity footprint (relevance) on the system as a whole.




www.ontonix.com



Thursday, 12 September 2013

Is a Global Post-Critical Society Possible?



(written in 2005)
Every dynamical system possesses a characteristic value of complexity which reflects how information is organized and how it flows within its structure. Like most things in life, complexity is limited. In fact, there is an upper bound on the complexity that a given system may attain and sustain with a given structure. This ‘physiological’ limit is known as critical complexity. In the proximity of its critical complexity, every system becomes fragile and therefore vulnerable. This fragility is a consequence of a very simple fact: critically complex systems possess a multitude of modes of behaviour and can suddenly jump from one mode to another. Very often, minute amounts of energy are sufficient to accomplish such mode transitions. Consequently, highly complex systems may easily develop surprising behaviour and are inherently difficult to understand and govern. For this very reason, humans prefer to stay away from situations that are perceived to be highly complex. In the vicinity of critical complexity, life becomes riskier precisely because of the inherent element of surprise.

In the past few years modern complexity science has developed comprehensive metrics and means of measuring not only the complexity of generic systems but also the corresponding critical complexity. This has made it possible to turn the above intuitive rules into rational general principles which govern the dynamics and interplay of everything that surrounds us. The interaction of entropy and structure is the fundamental mechanism behind co-evolution and behind the creation of organized complexity in Nature. Higher complexity implies greater functionality and therefore higher ‘fitness’. However, extreme specialization – fruit of ‘evolutionary opportunism’ – comes at a high cost. Robust yet fragile is the hallmark of highly complex systems. Think of how creative the human species is and yet how fragile human nature is. Under highly uncertain and stressful conditions this fragility emerges forcefully. And since human beings are the basic building blocks of societies, economies and nations, it is not difficult to understand why the complexity of our globalized and turbulent world assumes almost cosmological proportions. Fragility and volatility are the words which best reflect the state of health not only of the global economy but also of society in all of its aspects.

Our global society is ultimately a huge and dynamic network, composed of nodes and links. The connections between the nodes (individuals, corporations, markets, nations) are rapidly increasing in number, just as the number of nodes is. A fundamental feature of this network is entropy, a measure of uncertainty. Because the nodes do not always act in a rational and predictable fashion, the connections are “noisy”. The amount of entropy can only increase – this is the Second Law of Thermodynamics – and while new connections are created every day, many others are destroyed. This process, too, is inevitable. The measure of complexity is a blend of the topology of the network and the amount of noise – entropy – contained within its structure. Consequently, there are two ways of increasing complexity: adding more structure (connections, nodes or both) or, for a given network structure, increasing the amount of noise.
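The two routes to higher complexity - more structure, or more noise on existing links - can be illustrated with a deliberately toy formula. This is not Ontonix's actual metric (which blends topology and entropy in its own, unpublished way); it is only meant to show that both routes raise the number:

```python
def network_complexity(links, link_entropy):
    """Toy complexity proxy: structure plus noise.

    links:        list of (i, j) connections in the network
    link_entropy: dict (i, j) -> entropy (bits) of that connection
    Each link contributes 1 (structure) plus its entropy (noise),
    so complexity rises either by adding links or by making
    existing links noisier. Illustrative formula only.
    """
    return sum(1.0 + link_entropy.get(e, 0.0) for e in links)

edges = [(0, 1), (1, 2)]
quiet = {(0, 1): 0.1, (1, 2): 0.1}
noisy = {(0, 1): 0.9, (1, 2): 0.9}
print(network_complexity(edges, quiet))             # 2.2
print(network_complexity(edges, noisy))             # 3.8: same links, more noise
print(network_complexity(edges + [(0, 2)], quiet))  # 3.2: more structure
```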

In the past, the Earth was populated by numerous and disjoint civilizations that thrived almost in isolation. The Sumerians, the Incas and the Romans are just a few prominent examples. Because the temporal and spatial correlation between those civilizations was very limited, if one happened to disappear, many more remained. The Earth today, however, is populated by one single globalized society. If this one fails, that's it. Any form of progress is accompanied by an inevitable increase in complexity, but this is true only until critical complexity is reached. In order to continue evolving beyond critical complexity, a civilization must find ways of overcoming the delicate phase of vulnerability in which self-inflicted destruction is the most probable form of demise.
When a society approaches critical complexity, it has the following alternatives in order to survive:
1.    Reduce its complexity. This is done by dumping entropy or by simplifying its structure. In practice this translates to:
  • Stricter laws.
  • Fewer laws.
  • Reduction of personal freedom.
2.    Learn to live in proximity of critical complexity. This is risky because the system is:
  • Extremely turbulent (stochastic). Terrorism, crime and fraudulent behaviour thrive close to criticality.
  • Very difficult to govern – impossible to reach goals.
  • Unexpected behaviour may emerge.
  • On the verge of widespread violence.
3.    Increase its critical complexity. This may be accomplished in essentially two ways:
  • Creating more links (making a denser Process Map). However, this makes governing even more difficult.
  • Adding structure. Certainly the preferred option. One example? “Create” more nations – this not only increases structure, it may also help ease tensions. 
Option 2 is the most risky. Living in proximity of critical complexity cannot be accomplished in the framework of a conventional western-type democracy. The extreme turbulence which characterizes critically complex systems is most likely better dealt with in a technocratic and police-state setting, which limits severely personal freedom. Only a government which understands how to actively manage complexity on a vast scale may venture into similar territory. To our knowledge, solution 2 is today not viable. A better approach, therefore, is to adopt a mix of 1 and 3.

Terrorism surely constitutes one of the major concerns of modern democracies. The number of terrorist attacks has more than tripled in recent years. Contrary to popular belief, religion is not the main motivating factor. In terms of location, most instances of politically-fuelled violence and terrorism may be found in Asia, not in the Middle East. In fact, our research shows that Asia has a far greater complexity growth rate than the Middle East. Approximately one fourth of trans-national politically-motivated terrorist acts are inspired by religion. A similar proportion is accounted for by leftist militant organizations. Nearly 40% of terror acts are perpetrated by nationalist and separatist groups. As expected, there is no single clear cause. A mix of factors which ultimately lead to some form of social injustice, poverty, failing states or dysfunctional politics is what fuels terrorism. This suggests that the problem is indeed one of very high complexity. We are also painfully aware of the fact that modern democracies naturally lack efficient tools to deal effectively with highly complex socio-political, ethnic and religious problems, without neglecting the fundamental economic and ecological dimensions.

Where can terrorism develop with greater ease? Terrorists need to hide. For this reason they thrive in high-entropy environments, such as failing or rogue states, where there is little social structure. It is in highly complex societies (doesn’t mean developed) that terror groups find geo-political sanctuaries. High complexity, as mentioned, comes in many forms:
  • Little structure but high entropy (Third World countries)
  • Much structure, low entropy (Western democracies)
  • Much structure, high entropy  (the future global society)
Terror groups generally prefer entropy-dominated complexity because of the Principle of Incompatibility: high complexity implies low precision. This means that hunting them down - essentially an intelligence-driven exercise - is difficult because of the lack of precise information, laws on privacy, etc. Because global complexity is increasing quickly, it will become more and more difficult to identify terror groups, especially in ambiguous countries, i.e. those which harbour terrorists but are willing to close an eye. The problem with Western countries is that they are becoming more permissive and tolerant, leading to an overall erosion of social structure in favour of entropy. In underdeveloped countries it is almost impossible to create new social structure, hence it is entropy that causes the increase of complexity. In the West, the more intricate social structure is being eroded by a loss of moral values and by relativism. The result? In both cases, an increase in complexity. Following the above logic, we can state that:
  • High complexity is necessary (but not sufficient) to lead to terrorism.
  • Terrorism is an almost “obvious” consequence of a highly complex world.
  • The Principle of Incompatibility and terrorism are intimately linked.
Can complexity be used to anticipate conflicts, crises and failing states? The answer is affirmative. It is evident that a society or country in the proximity of its critical complexity is far more likely to enter a state of conflict, such as civil war, or simply to declare war on a neighbouring country. The conditions that a society must satisfy in order to switch to a conflict mode are multiple. As history teaches, there is no established pattern; many factors concur. But it is clear that it is more difficult to take a well-functioning and prosperous society to war than one which is fragile and dominated by entropy. In a society in which the entropy-saturated structure is eroded, the distance that separates a “peace mode” from a “conflict mode” is much smaller and switching is considerably easier. The idea, therefore, is to measure and track complexity region by region, country by country, and to keep an eye on those countries and regions where high complexity gradients are observed. Regions where complexity increases quickly are certainly candidates for social unrest or armed conflict. How can this be accomplished? What kind of data should be used? Good candidates are:

•    Birth-rate
•    Death-rate
•    Debt-external
•    Electricity-consumption
•    Electricity-production
•    Exports
•    GDP
•    GDP-per capita
•    GDP-real growth
•    Highways
•    Imports
•    Infant Mortality
•    Inflation rate
•    Internet users
•    Labour force
•    Life expectancy
•    Military expenses
•    Oil-consumption
•    Oil-production
•    Population
•    Telephones mobiles
•    Telephones-main lines
•    Total fertility rate
•    Unemployment rate 

The list is of course incomplete, as there are tens of other indicators which must be taken into account. Based on historical data such as that listed above, Ontonix has conducted comprehensive analyses of the World's complexity and its rate of growth. It has emerged that, if the current trend is maintained, our global society will reach criticality around 2045-2050. What does this mean? The high amount of complexity will make it extremely difficult to govern societies or to make decisions of a political nature. Under similar conditions, self-inflicted extinction becomes highly likely. Although, from a global perspective, the World is still almost half a century away from its critical state, there are numerous regions of the World in which societies are nearly critical and extremely difficult to grow and govern. Many parts of Africa, the Middle East or South East Asia are just a few examples. But Western democracies are in danger too: highly sophisticated and peaceful societies are also increasingly fragile because of a rapid increase of rights, freedom, tolerance or relativism.
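The tracking scheme described above - measure complexity country by country and flag steep gradients - can be sketched as follows. The complexity figures below are invented for illustration; in reality they would be computed from indicator data like that listed above, using Ontonix's own metric.

```python
def flag_fast_complexity_growth(series, threshold=0.5):
    """Flag regions whose complexity is rising fastest.

    series: dict region -> list of yearly complexity values
    Returns the regions whose average year-on-year increase exceeds
    `threshold`. Threshold and data are illustrative assumptions.
    """
    flagged = []
    for region, values in series.items():
        steps = [b - a for a, b in zip(values, values[1:])]
        if steps and sum(steps) / len(steps) > threshold:
            flagged.append(region)
    return flagged

series = {
    "region A": [5.0, 5.1, 5.2, 5.3],   # slow growth: no alarm
    "region B": [5.0, 6.0, 7.2, 8.5],   # high gradient: unrest candidate
}
print(flag_fast_complexity_growth(series))  # ['region B']
```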
It is interesting to note that the global robustness of the world dropped from 77% in 2003 to 68% in 2004. Similarly, in the same period complexity increased from 6.3 to 8.1, while the corresponding critical complexity rose from 8.1 to 9.6. Critical complexity increases because, globally speaking, the world's economy is growing. This is of course positive. However, this growth is slower than the growth of complexity. The two values will cross around 2045-2050.

All ancient civilizations have collapsed. This is because, for a variety of reasons, they reached their critical complexity and were unable to cope with the resulting fragility. Critical complexity becomes a severe liability for a species especially once it acquires powers more than sufficient for its self-destruction. Fragile civilizations are vulnerable, and their most likely fate is self-inflicted destruction.

If we fail to cope with and, ultimately, move safely past criticality, there will be no second chance; no other civilization will take over. Clearly, the biological lifetime of our species is likely to be several million years, even if we do our worst, but as far as technological and social progress is concerned, that will essentially be it. Globalization of course accelerates the increase of complexity until criticality is reached. Critical complexity, on the other hand, is the hurdle that prevents evolution beyond self-inflicted extinction. Since none of the ancient (and not so ancient) civilizations have ever evolved beyond critical complexity - in fact, they're all gone - they were all pre-critical civilizations. There has never been a post-critical civilization on Earth. The only one left with a chance of becoming post-critical is, of course, ours. But what conditions must a civilization meet in order to transition beyond criticality? Essentially two. First, it must lay its hands on technology to actively manage complexity. Second, it must have enough time to employ it on a vast and global scale. Complexity management technology was introduced by Ontonix in 2005. This leaves us with about 40-45 years.

Sunday, 8 September 2013

How resilient are the big US IT companies?


We've analysed America's big IT players as a system. The analysis was performed using stock market data, in particular the stock values. The result is a three-star resilience rating. Not bad, but nothing to celebrate. Below is the Complexity Map.





The analysis has been performed using our Resilience Rating system. Try it for FREE here.



www.ontonix.com



Wednesday, 28 August 2013

A Different Look at Air Traffic





As we have seen in our previous blogs, holistic benchmarking can be applied to a wide class of problems, including images - images originating from astronomical observations, medicine, weather radar, etc. In this short blog we illustrate the case of air traffic. In particular, we examine and compare a "critical" (very high volume) traffic situation to a "standard" one, the intent being to actually measure the difference between the two. For this purpose we use two traffic density maps. Both images are illustrated below, together with the corresponding Complexity Maps as obtained using OntoBench, our holistic benchmarking system.




Based on a comparison of the topologies of the two Complexity Maps (the reference image has complexity C = 275.49, while the second has C = 302.15) - which is more significant than a simple comparison of image complexities (or entropies) - one obtains a degree of image similarity of 59.56%.

Consequently, the overall difference between the two scenarios is 100 - 59.56 = 40.44%. In other words, the "critical" situation is roughly 40% more "severe" (or complex) than the baseline.
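OntoBench's topology-based comparison of Complexity Maps cannot be reproduced here, but the simpler per-image measure the text contrasts it with - image entropy - can be sketched: a denser, more varied traffic map spreads its pixel values over more grey levels and therefore scores more bits per pixel. The synthetic images below stand in for real density maps; all parameters are illustrative.

```python
import numpy as np

def image_entropy_bits(img, levels=256):
    """Shannon entropy (bits per pixel) of a grayscale image -- the
    simple per-image measure the text distinguishes from OntoBench's
    richer map-topology comparison (not reproduced here)."""
    img = np.asarray(img).ravel()
    counts = np.bincount(img, minlength=levels)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
standard = rng.integers(0, 64, (64, 64))    # calmer traffic: fewer grey levels
critical = rng.integers(0, 256, (64, 64))   # denser traffic: wider spread
print(image_entropy_bits(standard))         # just under 6 bits/pixel
print(image_entropy_bits(critical))         # just under 8 bits/pixel
```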



www.ontonix.com


Monday, 26 August 2013

Is This Really an Economic Crisis?



The economy is in a state of crisis, but is this a crisis of the economy? We think not. First and foremost, this is a crisis of society. A crisis of the people. A crisis of values and lifestyles - see our blog on the matter. The economy, while being a very important reflection of our society, is only one of its facets. One could, ultimately, risk saying that a "healthy society" leads to a healthy economy, and risk a bit more by saying that the inverse is also true. In effect, it is difficult to imagine a decadent society producing a thriving economy. The point, however, is this:


  • The economy is a system which is subjected to a set of non-negotiable laws which are always there and which always function, regardless of the fact that we find them to be "correct", just, or not. Ultimately it all comes down to the laws of physics and, in particular, to laws of thermodynamics.

  • Just like in the case of any other system of laws - take the mentioned laws of physics - if you attempt to violate them, Nature will tax you in proportion to the magnitude of the intended violation.

Consider the most basic law of the economy: what you spend must be less than what you earn. Every reasonable person and family knows this law. Clearly, if one's lifestyle exceeds one's means, one cannot "blame the economy" when things suddenly go wrong. If you hurt yourself falling from a tree, you don't say it's because of a force-of-gravity crisis! If you drink and then crash your car, you don't blame Newton's laws for the damage or alcohol for the injuries: you are the only one to blame. The same happens with the economy. If you attempt to violate one of its laws, it will inevitably respond with a set of mechanisms that kick in regardless of the consequences.

As the Chinese say, wisdom starts by calling things by their right names. The substance may not change much, but recognizing a problem for what it really is may help find new ways of approaching it. Consequently, instead of saying "economic crisis" or "economic meltdown" we should more correctly say "crisis of society" and "meltdown of values and lifestyles". Nature offers no free lunch, and neither does the economy. To start fixing the economy we must start by fixing society. And that means one thing: values. If you replace books with smartphones, or hard work with speculation and financial engineering, what can you expect?


www.ontonix.com


 

Sunday, 25 August 2013

Moody's warns US banks



On Friday 23rd August 2013, Moody's warned that it could downgrade the ratings of six of the biggest US banks. We too have issued a rating. However, ours is a different, more modern rating - a Resilience Rating - because it measures the capacity of a business to resist shocks, extreme events and turbulence. Our globalized economy is, evidently, dominated by extreme events and shocks, and exposed to contagion by virtue of its extreme interdependency and complexity. A Resilience Rating is based precisely on complexity. It doesn't speak of performance; it reflects the hidden fragility of a business and its structure.

The banks in question are: Bank of America, JP Morgan, Wells Fargo, Morgan Stanley, Goldman Sachs and Bank of NY Mellon. Moody's claims that Citi is also under review.

To view our Resilience Ratings of these banks click on the icons below.



Resilience = 58.7%, Resilience Rating: B


Resilience = 68.2%, Resilience Rating: BB+


Resilience = 76.0%, Resilience Rating: BBB+


Resilience = 61.7%, Resilience Rating: B+


Resilience = 80.5%, Resilience Rating: A


Resilience = 62.4%, Resilience Rating: BB-


Resilience = 64.8%, Resilience Rating: BB

Today, in a turbulent economy, the "Too Big To Fail" logic no longer holds - now it is "Too Complex To Survive".










www.ontonix.com                    www.rate-a-business.com