Tuesday 17 September 2013

Complexity Introduced to Stock and Portfolio Analysis and Design


Modern Portfolio Theory (MPT) was introduced in 1952 by Markowitz. As described in Wikipedia, "MPT is a theory of finance that attempts to maximize portfolio expected return for a given amount of portfolio risk, or equivalently minimize risk for a given level of expected return, by carefully choosing the proportions of various assets. Although MPT is widely used in practice in the financial industry and several of its creators won a Nobel memorial prize for the theory, in recent years the basic assumptions of MPT have been widely challenged by fields such as behavioral economics.

MPT is a mathematical formulation of the concept of diversification in investing, with the aim of selecting a collection of investment assets that has collectively lower risk than any individual asset. This is possible, intuitively speaking, because different types of assets often change in value in opposite ways. For example, to the extent prices in the stock market move differently from prices in the bond market, a collection of both types of assets can in theory face lower overall risk than either individually. But diversification lowers risk even if assets' returns are not negatively correlated—indeed, even if they are positively correlated.

More technically, MPT models an asset's return as a normally distributed function (or more generally as an elliptically distributed random variable), defines risk as the standard deviation of return, and models a portfolio as a weighted combination of assets, so that the return of a portfolio is the weighted combination of the assets' returns. By combining different assets whose returns are not perfectly positively correlated, MPT seeks to reduce the total variance of the portfolio return. MPT also assumes that investors are rational and markets are efficient."
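
To make the weighted-combination model concrete, here is a minimal Python sketch with invented figures for three assets; it simply evaluates the standard MPT quantities, the portfolio's expected return and its standard deviation sqrt(w' Σ w).

import numpy as np

# Illustrative (invented) figures for three assets
weights = np.array([0.5, 0.3, 0.2])          # portfolio weights, sum to 1
exp_returns = np.array([0.06, 0.04, 0.08])   # expected annual returns
cov = np.array([[0.040, 0.006, 0.010],       # covariance matrix of returns
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.090]])

# Portfolio return is the weighted combination of asset returns;
# portfolio risk is the standard deviation of that combination.
port_return = weights @ exp_returns
port_risk = np.sqrt(weights @ cov @ weights)

print(f"Expected return: {port_return:.2%}")
print(f"Risk (std dev):  {port_risk:.2%}")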

Since 1952 the world has changed. It has changed even more in the past decade, when complexity and turbulence made their permanent entry onto the scene. Turbulence and complexity are not only the hallmarks of our times; they can also be measured, managed and used in the design of systems, in decision-making and, of course, in asset portfolio analysis and design.

Assetdyne is the first company to have incorporated complexity into portfolio analysis and design. The company has developed a system which computes the Resilience Rating of stocks and stock portfolios based on complexity measures rather than on variance or other traditional approaches. While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios via minimization of portfolio complexity. The approach rests on the fact that excessively complex systems are inherently fragile; recently concluded research confirms that this is also the case for asset portfolios.
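
Assetdyne's complexity measure itself is proprietary, but the contrast with variance minimization can be sketched with a toy proxy: below, portfolio "complexity" is approximated by the weighted density of the absolute-correlation network of the holdings, and the weights are chosen to minimize that proxy instead of the variance. This is only an illustrative stand-in, not the company's actual metric, and the return data are randomly generated.

import numpy as np
from scipy.optimize import minimize

# Toy daily returns for four assets; assets 0 and 1 share a common factor
rng = np.random.default_rng(0)
common = rng.normal(0.0005, 0.01, size=(250, 1))
returns = rng.normal(0.0, 0.01, size=(250, 4))
returns[:, :2] += common

R = np.abs(np.corrcoef(returns, rowvar=False))   # absolute correlation matrix
np.fill_diagonal(R, 0.0)                         # ignore self-correlation

def complexity_proxy(w):
    # Weighted density of the absolute-correlation network: portfolios
    # concentrated in tightly coupled holdings score higher.
    return w @ R @ w

n = returns.shape[1]
constraints = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n
res = minimize(complexity_proxy, np.full(n, 1.0 / n),
               bounds=bounds, constraints=constraints)
print("Minimum-'complexity' weights:", np.round(res.x, 3))

With these invented data the optimiser should shift weight away from the correlated pair, which is the qualitative behaviour one would expect from a criterion that penalises tight coupling.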


Two examples of the Resilience Rating of a single stock are illustrated below:




An example of the Resilience Rating of a portfolio of stocks is shown below (the top European banks are treated as an interacting system):


while the rating and complexity measures are the following:


The interactive map of the EU banks may be navigated on-line here.


For more information, visit Assetdyne's website.



Sunday 15 September 2013

If You Really Need to Optimize


Optimal solutions are fragile and should generally be avoided. This unpopular statement enjoys substantial practical and philosophical argumentation and now, thanks to complexity, we can be even more persuasive. However, this short note is about making optimisation a bit easier. If you really insist on pursuing optimality, there is an important point to keep in mind.

Let us examine the case illustrated below: the design of a composite conical structure in which the goal is to keep the mass, the fluxes and the axial and lateral frequencies under control. The System Map shown below reflects which (red) design variables (ply thicknesses) influence the performance of the structure in the nominal (initial) configuration, prior to optimisation. In addition, the map illustrates how the various outputs (blue nodes) relate to each other.




In fact, one may conclude that, for example, the following relationships exist:
  • t_20013 - Weight
  • Weight - Axial frequency
  • Min Flux - Max Flux
  • t_20013 controls the Lateral Frequency
  • t_20013 also controls the Axial Frequency
  • Lateral Frequency and Axial Frequency are related to each other
  • etc.
As one may see, the outputs are tightly coupled: if you change one you cannot avoid changing the others. Let's first see how optimisation is typically handled when one faces multiple, often conflicting, objectives:

minimise y = COST (y_1, y_2, ..., y_n) where y_k stands for the k-th performance descriptor (e.g. mass, stiffness, etc.). In many cases weights are introduced as follows:

minimise y = COST (w_1  *  y_1,  w_2  * y_2, ..., w_n  * y_n).

The fundamental problem with such a formulation (and all similar MDO-type formulations) is that the various performance descriptors are often dependent (just as the example above indicates) and the analyst doesn't know it. The cost function indicated above is a mathematical statement of a conflict, whereby the y's compete for dominance. This competition is driven by an optimisation algorithm which knows nothing of the structure of the corresponding System Map or of the existence of the relationships contained therein. Imagine, for example, that you are trying to reduce one variable (e.g. mass) and, at the same time, increase another (e.g. frequency). Suppose also that you don't know that these two variables are strongly related to each other: the relationship typically looks like this: f = SQRT(k/m). Here f and m, both outputs of the problem, are related - changing one modifies the other. This is inevitable. In a more intricate situation, in which hundreds of design variables are involved, along with tens or hundreds of performance descriptors, the problem becomes numerically tough and the optimisation algorithm has a very hard time. What is the solution?
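
Before turning to the solution, the coupling just described is easy to see numerically. The short Python sketch below (with made-up stiffness and mass values) evaluates a weighted cost that rewards low mass and high frequency; because f = SQRT(k/m), the frequency is entirely determined by the mass for a given stiffness, so the two "objectives" are really one degree of freedom in disguise.

import numpy as np

k = 2.0e6                           # stiffness [N/m], made-up value
masses = np.linspace(50, 200, 4)    # candidate masses [kg]
w_mass, w_freq = 1.0, 1.0           # weights of the two "objectives"

for m in masses:
    f = np.sqrt(k / m) / (2 * np.pi)    # natural frequency [Hz]
    cost = w_mass * m - w_freq * f      # minimise mass, maximise frequency
    print(f"m = {m:6.1f} kg  ->  f = {f:6.2f} Hz,  cost = {cost:8.2f}")

# For a fixed k, f follows directly from m: the weights do not resolve a
# genuine trade-off, they merely decide which way the single variable m
# is pushed.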

If you cannot avoid optimisation, then we suggest the following approach:
  • Define your baseline design.
  • Run a Monte Carlo Simulation, in which you randomly perturb the design variables (inputs).
  • Process the results using OntoSpace, obtaining the System Map.
  • Find the INDEPENDENT outputs (performance descriptors) or, in case there aren't any, those outputs which have the lowest degree in the System Map. There are tools in OntoSpace that help to do this.
  • Build your cost function using only those variables, leaving the others out.
This approach "softens" the problem from a numerical point of view and reduces the mentioned conflicts between output variables. Attempting to formulate a multi-disciplinary problem without knowing a-priori how the various disciplines interact (i.e. without the System Map) is risky, to say the least.






www.design4resilience.com


Saturday 14 September 2013

Optimality: The Recipe for Disaster



There still seems to be a rush towards optimal design. Yet there is no better way to achieve fragility and vulnerability than the pursuit of peak performance and perfection - optimality, in other words. Let's take a look at the logic behind this risky and outdated practice:
  •  Based on a series of assumptions, a math model of a system/problem is built.
  •  Hence, we already have the first departure from reality: a model is just a model.
  •  If you're good, really good, the model will "miss" 10% of reality.
  •  You then squeeze peak performance out of this model according to some objective function.
  •  You then manufacture the real thing based on what the model says.
It is known - obviously not to all - that optimal designs may be well-behaved with respect to random variations in the design parameters but, at the same time, hyper-sensitive to small variations in the variables that have been left out in the process of building the model. This is precisely what happens: you design for or against something, but you miss that something else. By sweeping seemingly innocent variables under the carpet, you are deciding a-priori what the physics will be like. And this you cannot do. Now, if your model isn't forced to be optimal - to go to the limit - it might stand a better chance in that it will have room for manoeuvre. When you're optimal, you can only get worse! If you're standing on the peak of a mountain, the only way is down! Why, then, even attempt to design and build systems that are optimal and that can only get worse? Is this so difficult to see? It sounds like the Emperor's new clothes, doesn't it?



www.design4resilience.com


www.ontonix.com


Friday 13 September 2013

Robustness and Rating of System of Europe's Top Banks


Based on the closing values of the stocks of Europe's top banks, we have rated them as a system of interacting systems. The Business Structure Map is shown above, while the resilience rating and complexity measures are indicated below:



Finally, the Complexity Profile ranks each bank in terms of its complexity footprint (relevance) within the system as a whole.




www.ontonix.com



Thursday 12 September 2013

Is a Global Post-Critical Society Possible?



(written in 2005)
Every dynamical system possesses a characteristic value of complexity which reflects how information is organized and how it flows within its structure. Like most things in life, complexity is limited. In fact, there is an upper bound on complexity that a given system may attain and sustain with a given structure. This 'physiological' limit is known as critical complexity. In the proximity of its corresponding critical complexity every system becomes fragile and therefore vulnerable. This fragility is a consequence of a very simple fact: critically complex systems possess a multitude of modes of behaviour and can suddenly jump from one mode to another. Very often, minute amounts of energy are sufficient to accomplish such mode transitions. Consequently, highly complex systems may easily develop surprising behaviour and are inherently difficult to understand and govern. For this very reason, humans prefer to stay away from situations that are perceived to be highly complex. In the vicinity of critical complexity, life becomes more risky precisely because of the inherent element of surprise.

In the past few years modern complexity science has developed comprehensive metrics and means of measuring not only the complexity of generic systems but also the corresponding critical complexity. This has made it possible to turn the above intuitive rules into rational general principles which govern the dynamics and interplay of everything that surrounds us. The interaction of entropy and structure is the fundamental mechanism behind co-evolution and behind the creation of organized complexity in Nature. Higher complexity implies greater functionality and therefore higher 'fitness'. However, extreme specialization – the fruit of 'evolutionary opportunism' – comes at a high cost. Robust yet fragile is the hallmark of highly complex systems. Think of how creative the human species is and yet how fragile human nature is. Under highly uncertain and stressful conditions this fragility emerges forcefully. But since human beings are the basic building blocks of societies, economies and nations, it is not difficult to understand why the complexity of our globalized and turbulent world assumes almost cosmological proportions. Fragility and volatility are the words which best reflect the state of health not only of the global economy but also of society in all of its aspects.

Our global society is ultimately a huge and dynamic network, composed of nodes and links. The connections between the nodes (individuals, corporations, markets, nations) are rapidly increasing in number, just as the number of nodes is. A fundamental feature of this network is entropy, which is a measure of uncertainty. Because the nodes do not always act in a rational and predictable fashion, the connections are "noisy". The amount of entropy can only increase – this is the Second Law of Thermodynamics – and while new connections are being created every day, many others are destroyed. This process, too, is inevitable. The measure of complexity is a blend of the topology of the network and the amount of noise – entropy – contained within its structure. Consequently, there are two means of increasing complexity: adding more structure (connections, nodes or both) or, for a given network structure, increasing the amount of noise.
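
Ontonix's actual metric is proprietary, but the blend of structure and noise can be illustrated with a deliberately crude toy measure in Python: treat any strongly correlated pair of node signals as a link and sum the Shannon entropies of the linked nodes, so that both extra links and extra noise raise the score. Everything below (the signals, the threshold and the measure itself) is an illustrative assumption, not the real complexity metric.

import numpy as np

def shannon_entropy(x, bins=10):
    # Shannon entropy (in bits) of a discretised signal.
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

def toy_complexity(signals, link_threshold=0.3):
    # signals: (n_samples, n_nodes) array of node time series.
    # A link exists where |correlation| exceeds the threshold; the score
    # sums the endpoint entropies over all links, so both more structure
    # (more links) and more noise (higher entropy) increase it.
    corr = np.abs(np.corrcoef(signals, rowvar=False))
    entropies = np.array([shannon_entropy(col) for col in signals.T])
    n = corr.shape[0]
    score = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] > link_threshold:
                score += 0.5 * (entropies[i] + entropies[j])
    return score

rng = np.random.default_rng(2)
base = np.sign(np.sin(np.linspace(0, 20 * np.pi, 500)))[:, None]
quiet = base + 0.05 * rng.normal(size=(500, 6))   # same structure, little noise
noisy = base + 0.80 * rng.normal(size=(500, 6))   # same structure, more noise
print("low-noise network :", round(toy_complexity(quiet), 2))
print("high-noise network:", round(toy_complexity(noisy), 2))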

In the past, the Earth was populated by numerous, disjoint civilizations that thrived almost in isolation. The Sumerians, the Incas and the Romans are just a few prominent examples. Because the temporal and spatial correlation between those civilizations was very limited, if one happened to disappear, many more remained. However, the Earth today is populated by one single globalized society. If this one fails, that's it. But any form of progress is accompanied by an inevitable increase in complexity. This is true only until critical complexity is reached. In order to continue evolving beyond critical complexity, a civilization must find ways of overcoming the delicate phase of vulnerability in which self-inflicted destruction is the most probable form of demise.
When a society approaches critical complexity, it has the following alternatives in order to survive:
1.    Reduce its complexity. This is done by dumping entropy or by simplifying its structure. In practice this translates to:
  • Stricter laws.
  • Fewer laws.
  • Reduction of personal freedom.
2.    Learn to live in the proximity of critical complexity. This is risky because the system is:
  • Extremely turbulent (stochastic). Terrorism, crime and fraudulent behaviour thrive close to criticality.
  • Very difficult to govern – impossible to reach goals.
  • Prone to unexpected, emergent behaviour.
  • On the verge of widespread violence.
3.    Increase its critical complexity. This may be accomplished in essentially two ways:
  • Creating more links (making a denser Process Map). However, this makes governing even more difficult.
  • Adding structure. Certainly the preferred option. One example? “Create” more nations – this not only increases structure, it may also help ease tensions. 
Option 2 is the most risky. Living in the proximity of critical complexity cannot be accomplished in the framework of a conventional western-type democracy. The extreme turbulence which characterizes critically complex systems is most likely better dealt with in a technocratic and police-state setting, which severely limits personal freedom. Only a government which understands how to actively manage complexity on a vast scale may venture into similar territory. To our knowledge, solution 2 is not viable today. A better approach, therefore, is to adopt a mix of 1 and 3.

Terrorism is surely one of the major concerns of modern democracies. The number of terrorist attacks has more than tripled in recent years. Contrary to popular belief, religion is not the main motivating factor. In terms of location, most instances of politically-fuelled violence and terrorism are found in Asia, not in the Middle East. In fact, our research shows that Asia has a far greater complexity growth rate than the Middle East. Approximately one fourth of trans-national politically-motivated terrorist acts are inspired by religion. A similar share is accounted for by leftist militant organizations. Nearly 40% of terror acts are perpetrated by nationalist and separatist groups. As expected, there is no single clear cause. A mix of factors which ultimately lead to some form of social injustice, poverty, failing states or dysfunctional politics is what fuels terrorism. This suggests that the problem is indeed one of very high complexity. We are also painfully aware of the fact that modern democracies lack efficient tools to deal effectively with highly complex socio-political, ethnic and religious problems without neglecting the fundamental economic and ecological dimensions.

Where can terrorism develop with greater ease? Terrorists need to hide. For this reason they thrive in high-entropy environments, such as failing or rogue states, where there is little social structure. It is in highly complex societies (which does not necessarily mean developed ones) that terror groups find geo-political sanctuaries. High complexity, as mentioned, comes in many forms:
  • Little structure but high entropy (Third World countries)
  • Much structure, low entropy (Western democracies)
  • Much structure, high entropy  (the future global society)
Terror groups generally prefer high, entropy-dominated complexity because of the Principle of Incompatibility: high complexity implies low precision. This means that hunting them down - essentially an intelligence-driven exercise - is difficult because of the lack of precise information, laws on privacy, etc. Because complexity is increasing quickly on a global scale, it will become increasingly difficult to identify terror groups, especially in ambiguous countries, i.e. those which harbour terrorists but are willing to turn a blind eye. The problem with Western countries is that they are becoming more permissive and tolerant, leading to an overall erosion of social structure in favour of entropy. In underdeveloped countries it is almost impossible to create new social structure, hence it is entropy that drives the increase in complexity. In the West, the more intricate social structure is being eroded by a loss of moral values and by relativism. The result? In both cases, an increase in complexity. Following the above logic, we can state that:
  • High complexity is necessary (but not sufficient) to lead to terrorism.
  • Terrorism is an almost "obvious" consequence of a highly complex world.
  • The Principle of Incompatibility and terrorism are intimately linked.
Can complexity be used to anticipate conflicts, crises and failing states? The answer is affirmative. It is evident that a society/country in the proximity of its critical complexity is far more likely to enter a state of conflict, such as a civil war, or simply to declare war on a neighbouring country. The conditions that a society must satisfy in order to switch to a conflict mode are multiple. As history teaches, there is no established pattern; many factors concur. But it is clear that it is more difficult to take a well-functioning and prosperous society to war than one which is fragile and dominated by entropy. In a society in which the entropy-saturated structure is eroded, the distance that separates a "peace mode" from a "conflict mode" is much smaller and switching is considerably easier. The idea, therefore, is to measure and track complexity region by region, country by country, and to keep an eye on those countries and regions where high complexity gradients are observed. Regions where complexity increases quickly are certainly candidates for social unrest or armed conflict. How can this be accomplished? What kind of data should be used? Good candidates are:

•    Birth-rate
•    Death-rate
•    Debt-external
•    Electricity-consumption
•    Electricity-production
•    Exports
•    GDP
•    GDP-per capita
•    GDP-real growth
•    Highways
•    Imports
•    Infant Mortality
•    Inflation rate
•    Internet users
•    Labour force
•    Life expectancy
•    Military expenses
•    Oil-consumption
•    Oil-production
•    Population
•    Telephones mobiles
•    Telephones-main lines
•    Total fertility rate
•    Unemployment rate 

The list is of course incomplete, as there are tens of other indicators which must be taken into account. Based on historical data such as that listed above, Ontonix has conducted comprehensive analyses of the World's complexity and its rate of growth. It has emerged that if the current trend is maintained, our global society will reach criticality around 2045-2050. What does this mean? The high amount of complexity will make it extremely difficult to govern societies or to make decisions of a political nature. Under similar conditions, self-inflicted extinction becomes highly likely. Although, from a global perspective, the World is still almost half a century away from its critical state, there are numerous regions of the World in which societies are nearly critical and extremely difficult to grow and govern. Many parts of Africa, the Middle East or South East Asia are just a few examples. But Western democracies are also in danger. Highly sophisticated and peaceful societies, too, are increasingly fragile because of a rapid increase in rights, freedom, tolerance or relativism.
It is interesting to note how the global robustness of the world dropped from 77% in 2003 to 68% in 2004. Similarly, in the same period complexity increased from 6.3 to 8.1, while the corresponding critical complexity rose from 8.1 to 9.6. Critical complexity increases because, globally speaking, the world's economy is growing. This is of course positive. However, this growth is lower than the growth of complexity. The two values will cross around 2045-2050.

All ancient civilizations have collapsed. This is because, for a variety of reasons, they reached their critical complexity and were unable to cope with the resulting fragility. Critical complexity becomes a severe liability for a species especially once it acquires powers more than sufficient for its self-destruction. Fragile civilizations are vulnerable, and their most likely fate is self-inflicted demise.

If we fail to cope with and, ultimately, move safely past criticality, there will be no second chance; no other civilization will take over. Clearly, the biological lifetime of our species is likely to be several million years, even if we do our worst, but as far as technological and social progress is concerned, that will essentially be it. Globalization of course accelerates the increase of complexity until criticality is reached. Critical complexity, on the other hand, is the hurdle that prevents evolution beyond self-inflicted extinction. Since none of the ancient (and not so ancient) civilizations ever evolved beyond critical complexity - in fact, they are all gone - they were all pre-critical civilizations. There has never been a post-critical civilization on Earth. The only one left that has a chance of becoming post-critical is, of course, ours. But what conditions must a civilization meet in order to transition beyond criticality? Essentially two. First, it must lay its hands on technology to actively manage complexity. Second, it must have enough time to employ it on a vast and global scale. Complexity management technology was introduced by Ontonix in 2005. This leaves us with about 40-45 years.

Sunday 8 September 2013

How resilient are the big US IT companies?


We've analysed America's big IT players as a system. The analysis has been performed using stock market data, in particular the stock values. The result is a three-star resilience rating. Not bad, but nothing to celebrate. Below is the Complexity Map.





The analysis has been performed using our Resilience Rating system. Try it for FREE here.



www.ontonix.com



Wednesday 28 August 2013

A Different Look at Air Traffic





As we have seen in our previous blogs, holistic benchmarking can be applied to a wide class of problems, including images originating from astronomical observations, medicine, weather radar, etc. In this short blog we illustrate the case of air traffic. In particular, we examine and compare a "critical" (very high volume) traffic situation to a "standard" one, the intent being to actually measure the difference between the two. For this purpose we use two traffic-density maps. Both images are illustrated below, together with the corresponding Complexity Maps as obtained using OntoBench, our holistic benchmarking system.




Based on a comparison of the topologies of the two Complexity Maps (the reference image has complexity C = 275.49, while the second has C = 302.15), which is more significant than a simple comparison of image complexities (or entropies), the degree of image similarity is found to be 59.56%. Consequently, the difference between the second and first image is, globally speaking, 100 - 59.56 = 40.44%.

In other words, the overall difference between the two scenarios is roughly 40%: the "critical" situation is about 40% more "severe" (or complex) than the baseline.
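
OntoBench's map-based comparison is proprietary; as a purely illustrative stand-in, the Python sketch below compares two images using only the Shannon entropy of their grayscale histograms and reports a relative difference. The file names are placeholders, and the entropy proxy is an assumption, far cruder than a topological comparison of Complexity Maps.

import numpy as np
from PIL import Image

def image_entropy(path, bins=256):
    # Shannon entropy of the grayscale histogram, in bits per pixel.
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    counts, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

# Placeholder file names for the two traffic-density maps
h_ref = image_entropy("traffic_standard.png")
h_crit = image_entropy("traffic_critical.png")

difference = 100.0 * abs(h_crit - h_ref) / h_ref
print(f"reference entropy:   {h_ref:.3f} bits/pixel")
print(f"critical entropy:    {h_crit:.3f} bits/pixel")
print(f"relative difference: {difference:.1f}%")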



www.ontonix.com