Wednesday 28 August 2013

A Different Look at Air Traffic





As we have seen in our previous blogs, holistic benchmarking can be applied to a wide class of problems, including images originating from astronomical observations, medicine, weather radar, etc. In this short blog we illustrate the case of air traffic. In particular, we examine and compare a "critical" (very high volume) traffic situation with a "standard" one, the intent being to actually measure the difference between the two. For this purpose we use two traffic density maps. Both images are illustrated below, together with the corresponding Complexity Maps obtained using OntoBench, our holistic benchmarking system.




Based on the comparison of the topologies of the two Complexity Maps (the reference image has complexity C = 275.49, while the second has C = 302.15), which is more significant than a simple comparison of image complexities (or entropies), the degree of image similarity is 59.56%. Consequently, the difference between the second and the first image is, globally speaking, 100 - 59.56 = 40.44%.

In other words, the "critical" situation is approximately 40% more "severe" (or complex) than the baseline.
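
As a side note, the 59.56% similarity above comes from comparing the topologies of the two Complexity Maps, a computation performed by OntoBench. As a much cruder, purely illustrative proxy, one can compare the Shannon entropies of the two density maps. The file names and the naive similarity formula below are assumptions made for the sketch and will not reproduce the figures above.

```python
# Illustrative sketch only -- NOT OntoBench. Uses histogram Shannon
# entropy as a crude image-complexity proxy; file names are hypothetical.
import numpy as np
from PIL import Image

def image_entropy_bits(path, bins=256):
    """Shannon entropy (in bits) of an image's grayscale histogram."""
    pixels = np.asarray(Image.open(path).convert("L")).ravel()
    counts, _ = np.histogram(pixels, bins=bins, range=(0, 256))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

h_std = image_entropy_bits("traffic_standard.png")   # hypothetical file
h_crit = image_entropy_bits("traffic_critical.png")  # hypothetical file

# Naive similarity: ratio of the two entropies (an assumption, not the
# topology-based comparison performed on the Complexity Maps).
similarity = 100.0 * min(h_std, h_crit) / max(h_std, h_crit)
print(f"entropy-based difference: {100.0 - similarity:.2f}%")
```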



www.ontonix.com


Monday 26 August 2013

Is This Really an Economic Crisis?



The economy is in a state of crisis, but is this a crisis of the economy? We think not. First and foremost, this is a crisis of society. A crisis of the people. A crisis of values and lifestyles - see our blog on the matter. The economy, while being a very important reflection of our society, is only one of its facets. One could, ultimately, risk saying that a "healthy society" leads to a healthy economy, and risk a bit more by saying that the inverse is also true. In effect, it is difficult to imagine a decadent society producing a thriving economy. The point, however, is this:


  • The economy is a system subjected to a set of non-negotiable laws which are always there and which always function, regardless of whether we find them "correct", just, or not. Ultimately it all comes down to the laws of physics and, in particular, to the laws of thermodynamics.

  • Just like any other system of laws - take the aforementioned laws of physics - if you attempt to violate them, Nature will tax you in proportion to the magnitude of the attempted violation.

Consider the most basic law of the economy: what you spend must be less than what you earn. Every reasonable person and family knows this law. Clearly, if one's lifestyle exceeds one's means, one cannot "blame the economy" when things suddenly go wrong. If you hurt yourself falling from a tree, you don't say it's because of a force-of-gravity crisis! If you drink and then crash your car, you don't blame Newton's laws for the damage or alcohol for the injuries. You are the only one to blame. The same happens with the economy. If you attempt to violate one of its laws, it will inevitably respond with a set of mechanisms that kick in regardless of the consequences.

As the Chinese say, wisdom begins by calling things by their right names. The substance may not change much, but recognizing a problem for what it really is may help find new ways of approaching it. Consequently, instead of saying "economic crisis" or "economic meltdown" we should more correctly speak of a "crisis of society" and a "meltdown of values and lifestyles". Nature offers no free lunch, and neither does the economy. To start fixing the economy we must start by fixing society. And that means one thing: values. If you replace books with smartphones, or hard work with speculation and financial engineering, what can you expect?


www.ontonix.com


 

Sunday 25 August 2013

Moody's warns US banks



On Friday, 23 August 2013, Moody's warned that it could downgrade the ratings of six of the biggest US banks. We too have issued a rating. However, ours is a different, more modern rating - a Resilience Rating - because it measures the capacity of a business to resist shocks, extreme events and turbulence. Our globalized economy is evidently dominated by extreme events and shocks, and is exposed to contagion by virtue of its extreme interdependency and complexity. A Resilience Rating is based precisely on complexity. It does not speak of performance; it reflects the hidden fragility of a business and of its structure.

The banks in question are: Bank of America, JP Morgan, Wells Fargo, Morgan Stanley, Goldman Sachs and Bank of NY Mellon. Moody's claims that Citi is also under review.

To view our Resilience Ratings of these banks, click on the icons below.



Resilience = 58.7%, Resilience Rating: B


Resilience = 68.2%, Resilience Rating: BB+


Resilience = 76.0%, Resilience Rating: BBB+


Resilience = 61.7%, Resilience Rating: B+


Resilience = 80.5%, Resilience Rating: A


Resilience = 62.4%, Resilience Rating: BB-


Resilience = 64.8%, Resilience Rating: BB
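
The (resilience, grade) pairs above are consistent with a monotonic banding of the percentage scale. As a minimal sketch only - the actual Ontonix rating scale is not published here - hypothetical thresholds that reproduce the pairs listed above could look like this:

```python
# Hypothetical rating bands, inferred only from the pairs listed above;
# NOT the official Ontonix scale.
BANDS = [
    (80.0, "A"),
    (75.0, "BBB+"),
    (66.0, "BB+"),
    (64.0, "BB"),
    (62.0, "BB-"),
    (60.0, "B+"),
    (0.0,  "B"),
]

def resilience_rating(resilience_pct: float) -> str:
    """Map a resilience percentage to a rating grade."""
    for threshold, grade in BANDS:
        if resilience_pct >= threshold:
            return grade
    return "C"  # fallback for out-of-range input

# Sanity checks against the ratings listed above.
assert resilience_rating(58.7) == "B"
assert resilience_rating(76.0) == "BBB+"
assert resilience_rating(80.5) == "A"
```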

Today, in a turbulent economy, the "Too Big To Fail" logic no longer holds - now it is "Too Complex To Survive".










www.ontonix.com                    www.rate-a-business.com



Complexity Monitoring - A Formidable Early-Warning Tool


Complexity technology establishes a radically innovative means of anticipating crises. Systems under severe stress, or on a path to collapse, undergo either rapid complexity fluctuations or a consistent growth of complexity. If complexity is not measured, these precious crisis precursors go unnoticed. Conventional methods are unable to identify such precursors. The current global economic meltdown is eloquent proof.

A system enters a state of pre-crisis as it approaches its critical complexity. Tracking the evolution of a system's distance from its critical complexity yields a measure of the system's vulnerability. As increasingly high thresholds of complexity are crossed, warnings of increasing exposure may be issued. Systems that are kept at a safe distance from criticality are robust and therefore enjoy lower risk exposure than near-critical systems. This may be said of corporations, markets, societies, or the world as a whole. The enormous value of this approach stems from a fundamental issue: sufficiently complex systems often collapse due to endogenous, i.e. internal, causes. Traumas induced from the outside are not necessary to destroy a very complex system. The sheer complexity of certain systems makes them vulnerable from within. History is full of examples; the US sub-prime bubble is one. Before the market collapsed, complexity suddenly started to grow and rose steeply for over 18 months prior to the August 2007 implosion.

How does complexity-based crisis anticipation work? You simply measure and track complexity (your own or that of your clients) and look out for any sudden changes, or even slow but consistent drifts (much as doctors do when analyzing blood test results). In both cases, these point to the accumulation of entropy and/or the emergence of new structures within the system. Since entropy cannot grow indefinitely without being dumped by the system, one can be assured of an approaching crisis. The gradients of complexity give an idea of how intense the crisis will be and, most importantly, when it will hit. Coupled with past experience and the knowledge of previous crises, this technique provides the basis for a rational and holistic crisis-anticipation system for decision-makers, investors, managers and policy-makers.
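
A minimal monitoring sketch, assuming the complexity values themselves are already measured (e.g. by OntoNet™): it flags sudden fluctuations with a rolling z-score and slow but consistent drifts with a trend slope. The window length and both thresholds are illustrative assumptions.

```python
# Sketch of complexity-based early warning; thresholds are assumptions.
import numpy as np

def complexity_alerts(series, window=12, z_limit=3.0, drift_frac=0.05):
    """Return (index, kind, value) alerts for jumps and drifts."""
    c = np.asarray(series, dtype=float)
    alerts = []
    for t in range(window, len(c)):
        past = c[t - window:t]
        sigma = past.std() or 1e-9            # avoid division by zero
        z = (c[t] - past.mean()) / sigma      # sudden fluctuation
        slope = np.polyfit(np.arange(window), past, 1)[0]  # slow drift
        if abs(z) > z_limit:
            alerts.append((t, "jump", float(z)))
        elif slope > drift_frac * past.mean():
            alerts.append((t, "drift", float(slope)))
    return alerts
```

A "jump" corresponds to the rapid complexity fluctuations described above; a persistent "drift" corresponds to consistent complexity growth toward criticality.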

Complexity-based crisis anticipation functions in real time and may be applied to:


  •     Corporations
  •     Banks (in this case we indicate clients who may be defaulting)
  •     Asset portfolios
  •     Customer retention
  •     Process plants
  •     Traffic systems
  •     IT systems

Crisis anticipation in a turbulent economy is not just a strategic tool for decision-makers. It means survival.



www.ontonix.com                                            www.rate-a-business.com



 

Complexity Profiling and Causality


A Complexity Profile is probably the most important result of a complexity analysis, and it may be helpful when it comes to shedding some light on the issue of causality. Its interpretation, therefore, is of paramount importance. Before that, it is important to consolidate a few basic concepts. There are two types of variables in a system:
  • Inputs
  • Outputs

These can be classified in two other categories:
  • Controllable
  • Uncontrollable

There are different situations that one can be confronted with:
  • Variables are only inputs (e.g. accelerator pedal angle)
  • Variables are only outputs (e.g. stock values, survey results)
  • Both inputs and outputs are present

But first of all, what is complexity? Complexity is a measure of how much information a system "contains" and of how much this information is structured. One could simply sum the Shannon entropies of each variable and conclude that this is the total amount of information in the system. However, because variables can be correlated, they give rise to structure. Structure means the system can "do more" and, potentially, perform new functions. Structure is present everywhere in Nature. More structured information means more correlations within the system. Critical complexity measures how much information a system can contain before it starts to lose this structure (i.e. before this information becomes meaningless). Since information is measured in bits, complexity is also measured in bits.
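
To make the distinction concrete, here is a minimal sketch (not Ontonix's proprietary measure): the total information is the sum of the per-variable Shannon entropies, while the "structure" can be estimated as total correlation, i.e. the amount by which that sum exceeds the joint entropy. Discretization into bins is an illustrative assumption, and the joint histogram is practical only for a handful of variables.

```python
# Sketch: entropy sums and "structure" as total correlation (bits).
# NOT Ontonix's complexity measure; binning is an assumption.
import numpy as np

def entropy_bits(counts):
    """Shannon entropy (bits) of a histogram given as counts."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def marginal_entropy_sum(data, bins=8):
    """Sum of each variable's Shannon entropy (columns = variables)."""
    return sum(entropy_bits(np.histogram(col, bins=bins)[0]) for col in data.T)

def total_correlation(data, bins=8):
    """Bits of structure: zero if and only if variables are independent."""
    joint = entropy_bits(np.histogramdd(data, bins=bins)[0].ravel())
    return marginal_entropy_sum(data, bins) - joint
```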

The importance of structure is paramount. An analogy: the mass of an atom's nucleus is less than the sum of the masses of its components. This is because the energy that goes into the various bindings has an equivalent in terms of mass (m = E/c²). It is this amount that is "lost" when measuring the mass of the nucleus as a whole. The same holds for complexity: it measures the information within a system based not only on the sum of the Shannon entropies of each variable, but also taking into account the "bindings" between the variables. This means that structure also carries information, not just each variable. This structure is reflected in the so-called Complexity Map.

Complexity is like energy. The more energy one has, the more can be turned into work in order to accomplish something. More complexity means more information, and more information means that more can be done.

What does the Complexity Map show? It shows which groups of variables vary together. It does NOT indicate whether A is causing a variation in B or vice versa; it simply shows how variables are grouped when they change. In other words, "when variable A varies, B also varies" - this is all that can be said, unless one knows specifically that a certain variable is an independent, controllable input and that its variations are intended.

A Complexity Profile (or Complexity Spectrum) shows how much information is "lost" from a system (a multi-dimensional data array) if a particular variable is removed. The measurement is provided in percentage terms, and the contributions are ranked in descending order. When a variable is at the top of the CP it does not necessarily mean that it is the most important one, or that it dominates/controls the system in question. This is ONLY true if the variable is an input.
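
A minimal leave-one-out sketch of such a profile, reusing the total_correlation() proxy from the previous snippet in place of the actual complexity measure: it reports the percentage of information lost when each variable is removed, ranked in descending order.

```python
# Sketch of a Complexity Profile via leave-one-out information loss.
import numpy as np

def complexity_profile(data, names, bins=8):
    """Return (name, % information lost if removed), largest first."""
    c_full = max(total_correlation(data, bins), 1e-12)  # guard against 0
    losses = []
    for j, name in enumerate(names):
        reduced = np.delete(data, j, axis=1)            # drop one variable
        loss = 100.0 * (c_full - total_correlation(reduced, bins)) / c_full
        losses.append((name, loss))
    return sorted(losses, key=lambda item: -item[1])
```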

When the first variable in a CP is removed, all one can say for sure is that the data set without that variable experiences the largest possible loss of information. The fact that a variable lies at the top of the CP does not automatically mean that it drives the business. Why? The first important step in a complexity analysis of any system is the synthesis of a meaningful data set. If you put in garbage, the results will be in proportion to the amount of garbage relative to meaningful data. It is up to the user to collect meaningful data that correctly captures a given problem, not data gathered indiscriminately. Therefore, if you are completely sure that your data is correct and meaningful (i.e. of high quality), then the CP indeed provides a correct ranking of the variables in terms of how much information each one contributes to the whole picture. But what does that physically mean? It means that the variable in question varies a lot AND does so in unison (i.e. with structure) with numerous other variables. This means it is important; it is a driver.

The CP, therefore, is an objective way of ranking (weighing) variables, as it ranks them based on how much information they carry. If a variable lies in the upper part of the CP and it is a controllable input to your system, then it is indeed an important business driver. And what about outputs? What if you have, say, N stocks, and therefore N observable outputs from a system (a stock exchange)? How is the CP to be interpreted then? The comment above still holds. But can anything else be said in such a case? Probably yes.

A common question people formulate (even though we think it is not a good question to ask) is that of causality. If A and B vary together, is it A that causes the variation in B, or vice versa? This question is very difficult to answer (unless one has "insider" information). It is one of those questions that have no answer and are useless to ask (is pizza better than spaghetti?). However, the Complexity Profile can help.

Let us look at an example: the DJIA Index. The Complexity Map is illustrated below (click the image to navigate the map).


The corresponding CP is this:




This is a case in which it is impossible, for example, to say whether it is the price of Home Depot stock that drives the price of Citigroup stock, or vice versa. And what does it mean "to drive"? The relationship in question is shown below:


What really drives both stocks is the market, but that cannot be measured easily. So what we can do is assume that, if two variables co-vary (vary together), the one with the higher CP contribution drives the other. In this case we could say that Citigroup "dominates" Home Depot. It is very difficult to disprove such a statement (unless one has privileged information, or the data has been manipulated).
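
That heuristic can be stated in a few lines. A sketch, under the same assumptions as the snippets above (the correlation threshold is chosen only for illustration): for every co-varying pair, the variable with the higher Complexity Profile contribution is taken as the driver.

```python
# Sketch of the co-variation "dominance" heuristic described above.
def dominance_pairs(profile, corr, names, threshold=0.5):
    """profile: (name, loss) pairs; corr: correlation matrix."""
    rank = dict(profile)
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if abs(corr[i][j]) >= threshold:       # the pair co-varies
                a, b = names[i], names[j]
                driver, driven = (a, b) if rank[a] >= rank[b] else (b, a)
                pairs.append((driver, driven))
    return pairs  # (assumed driver, assumed driven) pairs
```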

In the case in question we could say that Citigroup dominates the DJIA Index, even though market capitalization or stock value might hint at something different. In summary, a Complexity Profile may help address the eternal issue of causality (which seems to trouble humanity so much).


www.ontonix.com


Our Complexity Group on LinkedIn - the largest one there is.



Join our Quantitative Complexity Management group on LinkedIn - it is the largest complexity management group, with nearly 1500 members. It is also the only group where you will see examples of complexity measurement and management for a wide variety of systems, such as corporations, banks, financial products, countries, ecosystems, traffic systems, or the human body.

Complexity may be measured and managed. See how by joining our group. Click here to join.



www.ontonix.com



Saturday 24 August 2013

A Dynamic Look at the Eurozone



Every quarter Ontonix processes EUROSTAT's macro-economic data and publishes Resilience Ratings for the entire EU. Below you will find the ratings for Q2 2013.

Just click on any country and move the mouse.

More next quarter.


www.ontonix.com



Beyond Pre-Crisis Analytics



Conventional data mining technology has the objective of establishing patterns and rules from large amounts of data by combining techniques such as statistics, artificial intelligence and database management. Data mining and analytics techniques are supposed to provide managers with an extra edge and to transform data into business intelligence. Have they succeeded? To find the answer, take a look at the state of the global economy.

BEYOND PRE-CRISIS TECHNOLOGY
Conventional pre-crisis data mining and data analysis techniques display information by means of curves, 2D or 3D plots, pie charts, bar charts or fancy surfaces. When the dimensionality of the data is high, these methods become impractical, in that one has to cope with hundreds if not thousands of such plots. It is necessary to resort to methods that truly synthesize data, not just transform one problem into another. Our complexity-based Analytic Engine OntoNet™ does something completely different.

SEEING THE BIG PICTURE
Every time you do something to data you destroy some of the information it contains. But data is expensive. We have developed innovative model-free technology that doesn’t destroy information. In fact, our approach emulates the way the brain works when you actually look at data. By transforming raw data into structure we also achieve an unprecedented degree of synthesis. And it all comes in one single Business Structure Map. This means you get to appreciate the nature and dimensionality of all your data to the fullest possible extent.
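
For flavor only, here is a minimal sketch of the underlying idea of mapping raw data onto structure, not OntoNet™ itself: a graph whose edges connect variables with a significant pairwise dependency. The use of linear correlation and the threshold value are illustrative assumptions.

```python
# Sketch: a crude "structure map" from pairwise correlations.
# NOT OntoNet; the measure and the threshold are assumptions.
import numpy as np

def structure_map(data, names, threshold=0.6):
    """Return edges (var_a, var_b, corr) linking co-varying variables."""
    corr = np.corrcoef(data, rowvar=False)        # columns = variables
    return [(names[i], names[j], round(float(corr[i, j]), 2))
            for i in range(len(names))
            for j in range(i + 1, len(names))
            if abs(corr[i, j]) >= threshold]
```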

PUTTING YOUR DATA TO WORK
Extracting knowledge from data is not just about putting together pieces of information. This is precisely where traditional technology has failed. We drown in data but we are thirsty for knowledge, not information. Capturing knowledge, be it from field data or data that emerges from computer simulation, means transforming it into structure. Structure means relationships, degrees of freedom, constraints. Structure means gaining understanding and knowledge. Precisely what OntoNet™ is about.

UNSEEN INFORMATION HIDDEN IN YOUR DATA
The moment you map multi-dimensional data onto structure you get to appreciate a fundamental and new aspect of a business - its complexity. OntoNet™ not only provides a unique and modern representation of a business, it also measures its complexity. Why is this so important? Because the rapid increase of business complexity, an inevitable consequence of turbulence and globalization, is one of the biggest enemies of growth, stability and resilience. With OntoNet™, conventional risk management transitions into its more advanced and natural form: complexity management.

SUPERIOR BUSINESS INTELLIGENCE = SURVIVAL
In a globalized and increasingly turbulent economy, the survival of a business hinges on its ability to react quickly to unexpected, unique and extreme events. The economy is not linear, it is not stationary, it is not in a state of equilibrium, and not everything follows a Gaussian distribution. However, many of the conventional BI and analytics techniques rest on assumptions that violate these basic facts. Building a sustainable and resilient economy also means going beyond regressions, neural nets and statistics.

By the way, have you ever seen the DJIA like this?


See it in motion here.





 

Is France THE time-bomb for the Euro?


In an article published last year, The Economist writes about the country that could pose the largest threat to the euro: France. A section of the article states:

"Even as other EU countries have curbed the reach of the state, it has grown in France to consume almost 57% of GDP, the highest share in the euro zone. Because of the failure to balance a single budget since 1981, public debt has risen from 22% of GDP then to over 90% now.

The business climate in France has also worsened. French firms are burdened by overly rigid labour- and product-market regulation, exceptionally high taxes and the euro zone’s heaviest social charges on payrolls. Not surprisingly, new companies are rare. France has fewer small and medium-sized enterprises, today’s engines of job growth, than Germany, Italy or Britain. The economy is stagnant, may tip into recession this quarter and will barely grow next year. Over 10% of the workforce, and over 25% of the young, are jobless."

France's Resilience Rating - which reflects the "stability" of its situation, not the performance of its economy - has only recently risen above 70%. Click below to see France's Business Structure Map and rating.

 




For some reason, according to the press, it appears that only Southern European economies are in trouble. This article tells us that we are all, essentially, in the same boat. Let's not forget: the crisis is global.


www.ontonix.com