Tuesday 8 October 2013

Nasdaq CFO Says Complexity is the Biggest Challenge to Market Success


In a recent article, the CFO of NASDAQ states that "complexity is the biggest challenge to market success". He also speaks of the complexity of financial products and of a complexity-reduction initiative. All of this can be put in place only if you measure complexity. Talking about it will not reduce it. Hope is not a strategy.

Today, the technology to measure complexity exists:


Assetdyne - www.assetdyne.com - to measure the complexity of stocks and financial products

RateABusiness - www.rate-a-business.com - to measure the complexity of a business


You can only manage it if you can measure it. Resistance is futile.

Monday 7 October 2013

Probability of Default Versus the Principle of Incompatibility







According to the Millennium Project, the biggest global challenges facing humanity are those illustrated in the image above. The image conveys a holistic message which some of us already appreciate: everything is connected to everything else. The economy is not indicated explicitly in the image but it is evidently there, just like industry, commerce, finance, religion, etc. A very complex scenario indeed. The point is not to list everything but merely to note that we live in a highly interconnected and dynamic world. We of course agree with the picture.

As we have repeatedly pointed out in our previous articles, under similar circumstances:
  • it is impossible to make predictions - in fact, even the current economic crisis (of planetary proportions) was not forecast
  • only very rough estimates can be attempted
  • there is no such thing as precision
  • it is impossible to isolate "cause-effect" statements as everything is linked
  • optimization is unjustified - one should seek acceptable solutions, not pursue perfection
The well-known Principle of Incompatibility states in fact that "high precision is incompatible with high complexity". However, this fundamental principle, which applies to all facets of human existence as well as to Nature, goes largely unnoticed. Neglecting the Principle of Incompatibility constitutes a tacit and embarrassing admission of ignorance. One such example is that of ratings. While the concept of a rating lies at the very heart of our economy and is, in principle, a necessary concept and tool, something is terribly wrong. A rating, as we know, measures the Probability of Default (PoD). Ratings are stratified into classes. One example of such a stratification is shown below:

Class     PoD
1         ≤ 0.05%
2         0.05% - 0.1%
3         0.1% - 0.2%
4         0.2% - 0.4%
5         0.4% - 0.7%
6         0.7% - 1.0%
etc.

A rating affects the way the stocks of a given company are traded - this is precisely its function. What is shocking in the above numbers, however, is the precision (resolution). A PoD of 0.11% puts a company in class 3, while a PoD of 0.099% puts it in class 2. How can this be so? Isn't the world supposed to be a highly complex system? If even a crisis of planetary proportions cannot be forecast, this not only points to high complexity (see the Principle of Incompatibility) but also says a lot about the Business Intelligence technology used in economics, finance, management and decision making. So where does all this precision in ratings come from? From a parallel virtual universe of equations and numbers in which everything is possible but which, unfortunately, does not map well onto reality. An understanding of the real universe cannot be based on a parallel virtual universe which is incorrect.
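The arbitrariness of such fine-grained stratification is easy to demonstrate. The sketch below, a hypothetical classifier using the boundaries from the sample table above (not any agency's actual method), shows how a difference of roughly 0.01 percentage points in PoD moves a company a full rating class:

```python
# Hypothetical rating-class lookup built from the sample table above.
# Boundaries are illustrative; no rating agency's actual method is implied.
import bisect

# Upper PoD bound (in %) for classes 1..6
UPPER_BOUNDS = [0.05, 0.1, 0.2, 0.4, 0.7, 1.0]

def rating_class(pod_percent):
    """Map a Probability of Default (in %) to a rating class."""
    return bisect.bisect_left(UPPER_BOUNDS, pod_percent) + 1

# A difference of ~0.01 percentage points moves a company one full class:
print(rating_class(0.099))  # 2
print(rating_class(0.11))   # 3
```

The hard class boundary is the point: a model claiming to resolve PoD to three decimal places, in a system too complex to forecast, is precision without accuracy.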

The above example of PoD stratification reflects very little understanding of Nature and of its mechanisms. In fact, economic crises of global proportions happen suddenly. As Aristotle wrote in his Nicomachean Ethics: an educated mind is distinguished by the fact that it is content with that degree of accuracy which the nature of things permits, and by the fact that it does not seek exactness where only approximation is possible.


www.ontonix.com


Sunday 6 October 2013

Entropy, Structure and Critical Complexity





When a system grows and evolves, its complexity increases. A look at the evolution of our biosphere confirms this. If you want to accomplish more, you must become more complex. This involves two things: structure and entropy. Structure is what defines functionality; entropy is what allows a system to react in a creative and novel way to a changing and possibly harsh environment. In biology this is called adaptation. When you get too much of either structure or entropy you are in trouble - you reach the so-called critical complexity and your fragility increases. You become exposed and vulnerable. Your ability to absorb more uncertainty (and still function) diminishes, as does your capacity to face extreme events.

Images from Purestform.


www.ontonix.com



Saturday 5 October 2013

Sunday 29 September 2013

Crisis Anticipation



Complexity technology establishes a radically innovative means of anticipating crises. Systems under severe stress, or on a path to collapse, either undergo rapid complexity fluctuations or exhibit consistent growth of complexity. If complexity is not measured, these precious crisis precursors go unnoticed. Conventional methods are unable to identify them.

How does complexity-based crisis anticipation work? You simply measure and track business complexity (your own or your clients'), looking out for sudden changes or even slow but consistent drifts. This technique provides the basis for a rational and holistic crisis-anticipation system for decision-makers, investors, managers and policy-makers. Essentially, the system buys you time, the most precious resource you have.
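In pseudocode terms, the two kinds of precursor described above - sudden fluctuations and slow consistent drift - can be sketched as follows. The window size and thresholds are illustrative assumptions, not the parameters of any actual Ontonix product:

```python
# Sketch of complexity-trend monitoring: flag sudden jumps or slow,
# consistent drifts in a measured complexity time series.
# Window size and thresholds are illustrative assumptions only.
from statistics import mean, stdev

def detect_precursors(series, window=10, jump_sigma=3.0, drift_per_step=0.05):
    """Return a list of (index, kind) warnings for a complexity time series."""
    warnings = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        # Sudden fluctuation: new value far outside the recent band
        if sigma > 0 and abs(series[i] - mu) > jump_sigma * sigma:
            warnings.append((i, "sudden jump"))
        # Consistent drift: average step over the window exceeds a threshold
        drift = (series[i] - series[i - window]) / window
        if drift > drift_per_step:
            warnings.append((i, "sustained growth"))
    return warnings
```

Fed with a stream of periodic complexity measurements, a monitor of this kind raises a warning at the first reading that breaks the recent pattern - which is precisely the "buying time" function described above.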

Our complexity-based crisis anticipation functions in real-time and may be applied to:
  • Corporations
  • Banks (in this case we indicate clients who may be defaulting)
  • Asset portfolios
  • Customer retention
  • Process plants
  • Traffic systems
  • IT systems

Be warned of problems before it is too late.

Read article.


Contact us at info@ontonix.com for more information.


www.ontonix.com


 

Saturday 28 September 2013

Measuring Processes in Banks Using the DDD DataPicker and OntoNet


Ontonix and PRB have integrated OntoNet™, the World's first real-time Quantitative Complexity Management engine, into PRB's DDD DataPicker™ system. The DDD DataPicker™ system is an advanced and configurable platform for document, process and workflow management, used mainly in banks to monitor a multitude of processes. Integration of OntoNet™ with the DDD system allows its users to measure in real time the complexity of the various processes and to quickly identify those that are excessively complex and therefore reduce process efficiency. Moreover, the system allows users to identify which phases of a particular process are responsible for high complexity, indicating quickly where to intervene.

The following slide illustrates the dashboard showing the "Credit Management" process and its various phases. Without going into the details, the various dials on the dashboard indicate process simplicity (the complement of complexity) from a process-management standpoint (0% = low simplicity, hard to manage; 100% = high simplicity, easy to manage). The color of the dials, on the other hand, indicates process robustness (green = robust, red = fragile).
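The dial semantics just described can be summarized in a few lines. This is a minimal sketch under stated assumptions: the 70% robustness cutoff for the green/red color is an illustrative choice, not a documented OntoNet™ parameter:

```python
# Minimal sketch of the dashboard dial logic described above:
# simplicity is the complement of (normalized) complexity, and the
# dial color encodes robustness. The 70% cutoff is an assumption.
def dial(complexity_pct, robustness_pct, robust_cutoff=70.0):
    """Return (simplicity %, color) for one process phase."""
    simplicity = 100.0 - complexity_pct   # 0% = hard to manage, 100% = easy
    color = "green" if robustness_pct >= robust_cutoff else "red"
    return simplicity, color

print(dial(25.0, 85.0))  # (75.0, 'green'): simple and robust
print(dial(80.0, 40.0))  # (20.0, 'red'): complex and fragile
```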



Clicking on any of the above dials opens a window which illustrates the three highest contributors to the complexity of a particular phase of a given process, and produces the so-called Complexity Profile (i.e. the breakdown into components).



Finally, each curve may be navigated interactively, enabling users to identify quickly periods of high complexity and/or low process robustness and to identify the causes.


The objective, of course, is to make processes more robust (stable and repeatable) as well as more efficient. The final goal is to cut costs without sacrificing efficiency and customer satisfaction. More soon.



www.ontonix.com



Crowdrating Systems of Banks Using Stockmarkets

Crowdrating Systems of Banks Using Stockmarkets: Assetdyne, the London-based company which has introduced for the first time the concepts of complexity and resilience to stock and stoc...