Sunday 29 September 2013

Crisis Anticipation



Complexity technology provides a radically new means of anticipating crises. Systems under severe stress, or on a path to collapse, either undergo rapid complexity fluctuations or exhibit a consistent growth of complexity. If complexity is not measured, these precious crisis precursors go unnoticed; conventional methods are unable to identify them.

How does complexity-based crisis anticipation work? You simply measure and track business complexity (yours or your clients'), and look out for any sudden changes or even slow but consistent drifts. This technique provides the basis for a rational and holistic crisis-anticipation system for decision-makers, investors, managers and policy-makers. Essentially, the system buys you time, the most precious resource you have.
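For readers who like to see the idea in code, here is a minimal sketch of such a tracking loop, assuming a pre-computed time series of complexity readings (say, one per day). The thresholds and window length are illustrative choices, not Ontonix parameters.

```python
import numpy as np

def crisis_precursors(complexity, jump_sigma=3.0, drift_window=30, drift_slope=0.05):
    """Flag sudden complexity fluctuations and slow but consistent drifts."""
    c = np.asarray(complexity, dtype=float)
    alerts = []

    # Sudden change: the latest day-to-day variation is far outside its usual range.
    dc = np.diff(c)
    if dc.size > 2 and abs(dc[-1]) > jump_sigma * dc[:-1].std():
        alerts.append("sudden complexity fluctuation")

    # Slow drift: a consistent upward trend over the most recent window.
    if c.size >= drift_window:
        slope = np.polyfit(np.arange(drift_window), c[-drift_window:], 1)[0]
        if slope > drift_slope:
            alerts.append("consistent complexity growth")

    return alerts
```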

Our complexity-based crisis anticipation functions in real-time and may be applied to:
  • Corporations
  • Banks (in this case we flag clients who may default)
  • Asset portfolios
  • Customer-retention
  • Process plants
  • Traffic systems
  • IT systems

Be warned of problems before it is too late.

Read article.


Contact us at info@ontonix.com for more information.


www.ontonix.com


 

Saturday 28 September 2013

Measuring Processes in Banks Using the DDD DataPicker and OntoNet


Ontonix and PRB have integrated OntoNet™, the World's first real-time Quantitative Complexity Management engine, into PRB's DDD DataPicker™ system. The DDD DataPicker™ system is an advanced and configurable platform for document, process and workflow management, used mainly in banks to monitor a multitude of processes. Integration of OntoNet™ with the DDD system allows its users to measure in real time the complexity of various processes and to quickly identify those that are excessively complex and therefore less efficient. Moreover, the system allows users to identify which phases of a particular process are responsible for high complexity, quickly indicating where to intervene.

The following slide illustrates the dashboard showing the "Credit Management" process and its various phases. Without going into the details, the dials on the dashboard indicate process simplicity (the complement of complexity) from a process-management standpoint (0% = low simplicity, hard to manage; 100% = high simplicity, easy to manage). The color of the dials, on the other hand, indicates process robustness (green = robust, red = fragile).



Clicking on any of the above dials opens a window which illustrates the three highest contributors to the complexity of a particular phase of a given process, and produces the so-called Complexity Profile (i.e. a breakdown into components).
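As an illustration of the idea (not of the actual DDD/OntoNet interface), ranking per-component contributions and picking the three largest might look like this; the phase components and their values in bits are hypothetical.

```python
# Hypothetical per-component complexity contributions (in bits) for one phase
# of the "Credit Management" process.
contributions = {
    "document upload": 4.2,
    "credit scoring": 7.9,
    "approval routing": 6.1,
    "archiving": 1.3,
    "compliance check": 5.4,
}

# Complexity Profile: breakdown into components, ranked by contribution.
profile = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)

# The three highest contributors, i.e. where to intervene first.
print(profile[:3])
```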



Finally, each curve may be navigated interactively, enabling users to quickly identify periods of high complexity and/or low process robustness, and their causes.


The objective, of course, is to make processes more robust (stable and repeatable) as well as more efficient. The final goal is to cut costs without sacrificing efficiency and customer satisfaction. More soon.



www.ontonix.com.



Crowdrating Systems of Banks Using Stockmarkets


Assetdyne, the London-based company which has introduced for the first time the concepts of complexity and resilience to stock and stock-portfolio analysis and design, has recently analysed systems of banks, namely those of Brazil, Singapore, Australia and Israel, as well as the top European banks. The way this is done is to assemble portfolios of the said banks and to treat them as a system (which, in reality, they are!). The results are provided below.

Brazil



Singapore



Australia



Israel


European banks


Similar analyses may be run free of charge at Assetdyne's website. As the analyses are performed on the daily close values of the corresponding stocks, the values of complexity and resilience indicated above may also change on a daily basis.


www.assetdyne.com



Monday 23 September 2013


Is Risk Management a Source of Risk?





The deployment of risk management within a business can be a source of false assurance.

Over recent years, businesses have become more and more reliant on increasingly complex modelling processes to predict outcomes, to the point that in many cases, businesses have lost sight of what risk management is all about - and at the same time, risk management lost sight of what the business was all about. Increasingly, I have seen risk management services being deployed in large institutions by the 'big four' consultancy firms, and to keep their huge costs down, they end up with the newly qualified consultants - mid twenties, bright young things, but I'm sorry, they often don't have the faintest idea what your business does. They have insufficient real world experience to permit effective dissemination of risk knowledge.

I worked with one lovely young lady recently in a banking environment. Very intelligent - but she did not have the first clue of what the business was about. She made assumptions, and those assumptions led the business down some long, dark alleys.

If, however, you have a risk function that fully understands the business model, the deployment of its operational strategy, the sector the business operates in, and the macro-economic and socio-political environment in which it operates, then it will be able to provide risk information that is relevant to the business and can be understood by the business.

My hope going into this recession was that businesses would learn from this period in time, and take a more realistic, holistic view of the world. Worryingly, what I see is "more of the same".

I see financial institutions that have - on the face of it - bolstered their risk functions, but in doing so have allowed them to become ever more 'siloed' and fractured in their approach. This can only lead to disaster, in my view. The left hand will not know what the right hand is doing - no one owns anything, no one is responsible, no one is accountable.

So, I think the deployment of risk management has been a source of risk, but I don't think the dramas are over yet. There is a second wave of failure yet to hit, unless businesses can swallow the pill and take the right approach.

Posted by Andrew Bird, Managing Director at Nile Blue and freelance business consultant.




Saturday 21 September 2013


Measuring the magnitude of a crisis




How can you measure the magnitude of an economic crisis? By the number of lost jobs, foreclosures, the drop in GDP, the number of defaulting banks and corporations, deflation? Or by the drop in stock-market indices? All these parameters do indeed reflect the severity of a crisis. But how about a single holistic index which takes them all into account? This index is complexity and, in particular, its variation. Let us examine, for example, the US sub-prime crisis. The complexity of the US housing market in the period 2004-2009 is illustrated in the above plot. A total of fifty market-specific parameters have been used to perform the analysis, in addition to fifteen macroeconomic indicators such as the ones mentioned above. The "bursting bubble" manifests itself via a complexity increase from a value of approximately 19 to around 32. With respect to the initial value this means an increase of roughly 68%. The arrow in the above plot indicates this jump in complexity, and this number represents a systemic measure of how profound the US housing market crisis was.

In summary, the magnitude of a crisis can be measured as follows:

M = | C_i - C_f | / C_i

where C_i is the value of complexity before the crisis and C_f the value during the crisis. The intensity of a crisis can be measured as the rate of change of complexity over time.
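In code, using the sub-prime figures quoted above (roughly 19 before the crisis and 32 during it), the two measures are straightforward; this is a minimal sketch, not the OntoNet implementation.

```python
import numpy as np

def crisis_magnitude(c_initial, c_final):
    """M = |C_i - C_f| / C_i : the relative jump in complexity."""
    return abs(c_initial - c_final) / c_initial

def crisis_intensity(complexity, dt=1.0):
    """Rate of change of complexity over time (one value per sample)."""
    return np.gradient(np.asarray(complexity, dtype=float), dt)

print(crisis_magnitude(19.0, 32.0))  # roughly 0.68 for the US housing example
```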

Serious science starts when you begin to measure.




Friday 20 September 2013

Complexity: The Fifth Dimension





When complexity is defined as a function of structure, entropy and granularity, examining its dynamics reveals its fantastic depth and phenomenal properties. The process of complexity computation materializes in a particular mapping of a state vector onto a scalar. What is surprising is how such a simple process can enshroud such an astonishingly rich spectrum of features and characteristics. Complexity does not possess the properties of an energy, and yet it expresses the "life potential" of a system in terms of the modes of behaviour it can deploy. In a sense, complexity, the way we measure it, reflects the fitness of an autonomous dynamical system that operates in a given Fitness Landscape. This statement by no means implies that higher complexity leads to higher fitness. In fact, our research shows the existence of an upper bound on the complexity a given system may attain. We call this limit critical complexity. We know that in the proximity of this limit the system in question becomes delicate and fragile, and that operation close to this limit is dangerous. There surely exists a "good value" of complexity - which corresponds to a fraction, β, of the upper limit - that maximizes fitness:

C_max fitness = β · C_critical

We don't know what the value of β is for a given system and we are not sure how it may be computed. However, we think that the fittest systems are able to operate around a good value of β. Fit systems can potentially deploy a sufficient variety of modes of behaviour so as to respond better to a non-stationary environment (ecosystem). The dimension of the modal space of a system ultimately equates to its degree of adaptability. Close to critical complexity the number of modes, as we observe, increases rapidly but, at the same time, the probability of spontaneous (and unwanted) mode transitions also increases quickly. This means the system can suddenly undertake unexpected and potentially self-compromising actions (just like adolescent humans).





Wednesday 18 September 2013

Stocks, Crowdrating and the Democratization of Ratings


We have always claimed that the process of rating a business and its state of health should be more transparent, objective and affordable, even for the smallest of companies. With this goal in mind Ontonix has launched the World's first do-it-yourself rating system - Rate-A-Business - which allows any individual to upload the financials of a company and to obtain, in a matter of seconds, a measure of its state of health. The system works, of course, for both publicly listed and private companies. Essentially, the tool shifts rating from a duopoly of huge Credit Rating Agencies to the Internet, the World's central nervous system. Rating becomes, de facto, a commodity. In order to make our global economy healthier, and to reduce the impact of future crises, it is paramount to transform rating from a luxury into a commodity. Today it is possible to know one's cholesterol levels, for example, for just a few dollars. The information is not reserved for the rich. Similarly, the rating of a business - its state of health, or resilience - is something that every company should know, even the tiniest SME. This is the philosophy that has driven Ontonix to develop Rate-A-Business.

Assetdyne takes things even further. The company also provides a real-time rating system. Even though the system developed by Assetdyne focuses on publicly listed companies and portfolios of their stocks, it too introduces a fundamentally new element into the process of rating - so-called crowd-rating. The value of the stock of a company is the result of a complex interplay of millions of traders, analysts, investors, trading robots, etc. Ultimately, it is a reflection of the reputation and perceived value of a particular company and is the result of a democratic process. Clearly, the value of a stock is also driven by market trends, sector analyses, rumors, insider trading and other illicit practices and, evidently, by the Credit Rating Agencies themselves. However, it is undeniably the millions of traders who ultimately drive the price and dynamics of stocks according to the basic principles of supply and demand. In practice, we're talking of a planet-wide democratic process of crowd-rating - it is the crowd of traders and investors that decides how much you pay for a particular stock.

What Assetdyne does is to use the information on the value and dynamics of stock prices to actually compute a rating. The rating that is computed does not reflect the Probability-of-Default (PoD) of a particular company - this is the popular "AAA" kind of rating - it reflects the "resilience" of a given stock (and hence of the company behind it). Resilience is the capacity to resist shocks, a frequent phenomenon in our turbulent economy. Resilience, besides being a very useful measure of the state of health of any kind of system, not just of a corporation, possesses one very important characteristic - its computation is based on the measure of complexity. It so happens that complexity is the hallmark of our economy, of our times. The rating system developed by Assetdyne therefore delivers the following additional information:

Stock Complexity - this measures how "chaotic" the evolution of a stock is. In other words, we're talking of an advanced measure of volatility. Complexity is measured in bits. The complexity values of different stocks may readily be compared.

Stock Resilience - this measures how well the stock price reacts to shocks and extreme events. Values range from 0% to 100%.

As the computation of the complexity and resilience of a stock are based on closing values at the end of each trading day, the corresponding values also change on a daily basis.
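To make the two quantities concrete, here is a minimal sketch computed from daily closing prices. The formulas below are simple stand-ins (an entropy of returns in bits, and a shock-recovery score from 0% to 100%); they are not Assetdyne's proprietary complexity and resilience metrics.

```python
import numpy as np

def stock_complexity_bits(closes, bins=16):
    """Shannon entropy of daily log-returns, in bits, as a volatility-like proxy."""
    returns = np.diff(np.log(np.asarray(closes, dtype=float)))
    hist, _ = np.histogram(returns, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def stock_resilience_pct(closes, shock_quantile=0.05):
    """Share of large daily drops recovered within five trading days (0-100%)."""
    closes = np.asarray(closes, dtype=float)
    returns = np.diff(np.log(closes))
    threshold = np.quantile(returns, shock_quantile)
    shocks = np.where(returns <= threshold)[0]
    recovered = sum(
        1 for i in shocks
        if i + 6 < closes.size and closes[i + 1: i + 6].max() >= closes[i]
    )
    return 100.0 * recovered / max(len(shocks), 1)
```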

An example is illustrated below.




Assetdyne's rating system is applicable also to portfolios of stocks. The example below illustrates a small portfolio of oil&gas companies.




An important aspect of this particular rating technique is that it is not based on financial reports (Balance Sheets, Income Statements, Cash Flow, etc.), which are of a highly subjective nature. Companies construct their balance statements so as to provide a more optimistic picture, and conventional PoD-type ratings are therefore inevitably influenced. While financial statements and the resulting PoD ratings are subjective (recall the multitude of triple-A-rated companies that defaulted all of a sudden, triggering the current crisis) to the point that governments have sued Credit Rating Agencies, stocks represent a considerably more objective reflection of the real state of affairs. Most importantly, the information is known to everyone. Of course, markets are not always right and the price may be wrong, but the process of converging to a given price is as objective and democratic as things in this world can get.

One could conclude that the World's stock markets constitute one huge social network which plays a global game called trading. As the game is played, one of its outcomes is the price of stocks. The price may be "wrong", it may be manipulated but it is what it is. It is the result of the mentioned crowd-rating and Assetdyne uses it to provide new important information on complexity and resilience rating of stocks and portfolios. Innovation in finance is possible.


www.assetdyne.com




Tuesday 17 September 2013

Complexity Introduced to Stock and Portfolio Analysis and Design


Modern Portfolio Theory (MPT) was introduced in 1952 by Markowitz. As described in Wikipedia, "MPT is a theory of finance that attempts to maximize portfolio expected return for a given amount of portfolio risk, or equivalently minimize risk for a given level of expected return, by carefully choosing the proportions of various assets. Although MPT is widely used in practice in the financial industry and several of its creators won a Nobel memorial prize for the theory, in recent years the basic assumptions of MPT have been widely challenged by fields such as behavioral economics.

MPT is a mathematical formulation of the concept of diversification in investing, with the aim of selecting a collection of investment assets that has collectively lower risk than any individual asset. This is possible, intuitively speaking, because different types of assets often change in value in opposite ways. For example, to the extent prices in the stock market move differently from prices in the bond market, a collection of both types of assets can in theory face lower overall risk than either individually. But diversification lowers risk even if assets' returns are not negatively correlated—indeed, even if they are positively correlated.

More technically, MPT models an asset's return as a normally distributed function (or more generally as an elliptically distributed random variable), defines risk as the standard deviation of return, and models a portfolio as a weighted combination of assets, so that the return of a portfolio is the weighted combination of the assets' returns. By combining different assets whose returns are not perfectly positively correlated, MPT seeks to reduce the total variance of the portfolio return. MPT also assumes that investors are rational and markets are efficient."
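As a concrete illustration of the classical recipe (and nothing more), the fully-invested, unconstrained minimum-variance portfolio has the closed-form weights w = Σ^-1 1 / (1' Σ^-1 1); a minimal sketch follows. Note that, being unconstrained, the weights may be negative (short positions).

```python
import numpy as np

def min_variance_weights(returns):
    """Classical MPT minimum-variance weights from a (days x assets) return matrix."""
    cov = np.cov(np.asarray(returns, dtype=float), rowvar=False)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # solves cov @ w = 1
    return w / w.sum()               # normalise so the weights sum to one
```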

Since 1952 the world has changed. It has changed even more in the past decade, when complexity and turbulence have made their permanent entry on the scene. Turbulence and complexity are not only the hallmarks of our times, they can be measured, managed and used in the design of systems, in decision-making and, of course, in asset portfolio analysis and design.

Assetdyne is the first company to have incorporated complexity into portfolio analysis and design. In fact, the company has developed a system which computes the Resilience Rating of stocks and stock portfolios based on complexity measures rather than on variance or other traditional approaches. While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios by minimizing portfolio complexity. The approach is based on the fact that excessively complex systems are inherently fragile. Recently concluded research confirms that this is also the case for asset portfolios.
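To show how the objective changes when complexity replaces variance, here is a hedged sketch: the spectral-entropy proxy below is an illustrative stand-in, not the complexity measure computed by Assetdyne or OntoNet, and the return constraint simply mirrors the MPT formulation.

```python
import numpy as np
from scipy.optimize import minimize

def complexity_proxy(w, cov):
    """Spectral entropy (in bits) of the weighted covariance structure -- a stand-in."""
    lam = np.linalg.eigvalsh(np.outer(w, w) * cov)
    p = lam[lam > 1e-12]
    if p.size == 0:
        return 0.0
    p = p / p.sum()
    return float(-(p * np.log2(p)).sum())

def min_complexity_weights(returns, target_return):
    """Minimise the complexity proxy subject to full investment and a return target."""
    r = np.asarray(returns, dtype=float)
    cov, mean = np.cov(r, rowvar=False), r.mean(axis=0)
    n = cov.shape[0]
    constraints = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: w @ mean - target_return},
    ]
    result = minimize(complexity_proxy, np.full(n, 1.0 / n), args=(cov,),
                      method="SLSQP", bounds=[(0.0, 1.0)] * n, constraints=constraints)
    return result.x
```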


Two examples of the Resilience Rating of a single stock are illustrated below:




An example of a Resilience Rating of a portfolio of stocks is shown below (the top European banks are illustrated as an interacting system):


while the rating and complexity measures are the following:


The interactive map of the EU banks may be navigated on-line here.


For more information, visit Assetdyne's website.



Sunday 15 September 2013

If You Really Need to Optimize


Optimal solutions are fragile and should generally be avoided. This unpopular statement enjoys substantial practical and philosophical argumentation and now, thanks to complexity, we can be even more persuasive. However, this short note is about making optimisation a bit easier. If you really insist on pursuing optimality, there is an important point to keep in mind.

Let us examine the case illustrated below: designing a composite conical structure in which the goal is to keep the mass under control, as well as the fluxes and the axial and lateral frequencies. The System Map shown below reflects which design variables (ply thicknesses, red nodes) influence the performance of the structure in the nominal (initial) configuration, prior to optimisation. In addition, the map also illustrates how the various outputs (blue nodes) relate to each other.




In fact, one may conclude that, for example, the following relationships exist:
  • t_20013 - Weight
  • Weight - Axial frequency
  • Min Flux - Max Flux
  • t_20013 controls the Lateral Frequency
  • t_20013 also controls the Axial Frequency
  • Lateral Frequency and Axial Frequency are related to each other
  • etc.
As one may conclude, the outputs are tightly coupled: if you change one you cannot avoid changing the others. Let's first see how optimisation is handled when one faces multiple - often conflicting - objectives:

minimise y = COST (y_1, y_2, ..., y_n) where y_k stands for the k-th performance descriptor (e.g. mass, stiffness, etc.). In many cases weights are introduced as follows:

minimise y = COST(w_1*y_1, w_2*y_2, ..., w_n*y_n).

The fundamental problem with such a formulation (and all similar MDO-type formulations) is that the various performance descriptors are often dependent (just as the example above indicates) and the analyst doesn't know it. The cost function indicated above is a mathematical statement of a conflict, whereby the y's compete for prominence. This competition is driven by an optimisation algorithm which knows nothing of the structure of the corresponding System Map or of the existence of the relationships contained therein. Imagine, for example, that you are trying to reduce one variable (e.g. mass) and, at the same time, increase another (e.g. frequency). Suppose also that you don't know that these two variables are strongly related to each other: the relationship typically looks like this: f = SQRT(k/m). Here, f and m, outputs of the problem, are related - changing one modifies the other. This is inevitable. In a more intricate situation, in which hundreds of design variables are involved, along with tens or hundreds of performance descriptors, the problem becomes numerically tough. The optimisation algorithm has a very hard time. What is the solution?

If you cannot avoid optimisation, then we suggest the following approach:
  • Define your baseline design.
  • Run a Monte Carlo Simulation, in which you randomly perturb the design variables (inputs).
  • Process the results using OntoSpace, obtaining the System Map.
  • Find the INDEPENDENT outputs (performance descriptors) or, in case there aren't any, those outputs which have the lowest degree in the System Map. There are tools in OntoSpace that actually help to do this.
  • Build your cost function using only those variables, leaving the others out.
This approach "softens" the problem from a numerical point of view and reduces the mentioned conflicts between output variables. Attempting to formulate a multi-disciplinary problem without knowing a-priori how the various disciplines interact (i.e. without the System Map) is risky, to say the least.






www.design4resilience.com


Saturday 14 September 2013

Optimality: The Recipe for Disaster



There still seems to be a rush towards optimal design, yet there is no better way to fragility and vulnerability than the pursuit of peak performance and perfection - optimality, in other words. Let's take a look at the logic behind this risky and outdated practice:
  •  Based on a series of assumptions, a math model of a system/problem is built.
  •  Hence, we already have the first departure from reality: a model is just a model.
  •  If you're good, really good, the model will "miss" 10% of reality.
  •  You then squeeze peak performance out of this model according to some objective function.
  •  You then manufacture the real thing based on what the model says.
It is known - obviously not to all - that optimal designs may be well-behaved with respect to random variations in the design parameters but, at the same time, hyper-sensitive to small variations in the variables that have been left out in the process of building the model. This is precisely what happens - you design for or against something, but you miss that something else. By sweeping seemingly innocent variables under the carpet, you're deciding a-priori what the physics will be like. And this you cannot do. Now, if your model isn't forced to be optimal - to go to the limit - it might stand a better chance in that it will have room for manoeuvre. When you're optimal, you can only get worse! If you're standing on the peak of a mountain, the only way is down! Why, then, even attempt to design and build systems that are optimal and that can only get worse? Is this so difficult to see? Sounds like the Emperor's new clothes, doesn't it?



www.design4resilience.com


www.ontonix.com


Friday 13 September 2013

Robustness and Rating of System of Europe's Top Banks


Based on the close values of the stocks of Europe's top banks, we have rated them as a system of interacting systems. The Business Structure Map is represented above, while the resilience rating and complexity measures are indicated below:



Finally, the Complexity Profile, which ranks each bank in terms of its complexity footprint (relevance) on the system as a whole, is shown below.




www.ontonix.com



Thursday 12 September 2013

Is a Global Post-Critical Society Possible?



(written in 2005)
Every dynamical system possesses a characteristic value of complexity which reflects how information is organized and how it flows within its structure. Like most things in life, complexity is limited. In fact, there is an upper bound on the complexity that a given system may attain and sustain with a given structure. This ‘physiological’ limit is known as critical complexity. In the proximity of its corresponding critical complexity every system becomes fragile and therefore vulnerable. This fragility is a consequence of a very simple fact: critically complex systems possess a multitude of modes of behaviour and can suddenly jump from one mode to another. Very often, minute amounts of energy are sufficient to accomplish such mode transitions. Consequently, highly complex systems may easily develop surprising behaviour and are inherently difficult to understand and govern. For this very reason, humans prefer to stay away from situations that are perceived to be highly complex. In the vicinity of critical complexity, life becomes more risky precisely because of the inherent element of surprise.

In the past few years modern complexity science has developed comprehensive metrics and means of measuring not only the complexity of generic systems but also the corresponding critical complexity. This has made it possible to turn the above intuitive rules into rational general principles which govern the dynamics and interplay of everything that surrounds us. The interaction of entropy and structure is the fundamental mechanism behind co-evolution and behind the creation of organized complexity in Nature. Higher complexity implies greater functionality and therefore higher ‘fitness’. However, extreme specialization – the fruit of ‘evolutionary opportunism’ – comes at a high cost. Robust yet fragile is the hallmark of highly complex systems. Think of how creative the human species is and yet how fragile human nature is. Under highly uncertain and stressful conditions this fragility emerges with strength. And since human beings are the basic building blocks of societies, economies and nations, it is not difficult to understand why the complexity of our globalized and turbulent world assumes almost cosmological proportions. Fragility and volatility are the words which best reflect the state of health not only of the global economy but also of society in all of its aspects.

Our global society is ultimately a huge and dynamic network, composed of nodes and links. The connections between the nodes (individuals, corporations, markets, nations) are rapidly increasing in number, just as is the number of nodes. A fundamental feature of this network is entropy, which is a measure of uncertainty. Because the nodes do not always act in a rational and predictable fashion, the connections are "noisy". The amount of entropy can only increase – this is due to the Second Law of Thermodynamics – and while new connections are being created every day, many others are destroyed. This process is also inevitable. The measure of complexity is a blend of the topology of the network and the amount of noise – entropy – contained within its structure. Consequently, there are two means of increasing complexity: adding more structure (connections, nodes or both) or, for a given network structure, increasing the amount of noise.

In the past, the Earth was populated by numerous and disjoint civilizations that thrived almost in isolation. The Sumerians, the Incas, or the Romans are just a few prominent examples. Because the temporal and spatial correlation between those civilizations was very limited, if one happened to disappear, many more remained. However, the Earth today is populated by one single globalized society. If this one fails, that’s it. But any form of progress is accompanied by an inevitable increase in complexity. This is true only until critical complexity is reached. In order to continue evolving beyond critical complexity, a civilization must find ways of overcoming the delicate phase of vulnerability in which self-inflicted destruction is the most probable form of demise.
When a society approaches critical complexity, it has the following alternatives in order to survive:
1.    Reduce its complexity. This is done by dumping entropy or by simplifying its structure. In practice this translates to:
  • Stricter laws.
  • Fewer laws.
  • Reduction of personal freedom.
2.    Learn to live in proximity of critical complexity. This is risky because the system is:
  • Extremely turbulent (stochastic). Terrorism, crime and fraudulent behaviour thrive close to criticality.
  • Very difficult to govern – impossible to reach goals.
  • Unexpected behaviour may emerge.
  • On the verge of widespread violence.
3.    Increase its critical complexity. This may be accomplished in essentially two ways:
  • Creating more links (making a denser Process Map). However, this makes governing even more difficult.
  • Adding structure. Certainly the preferred option. One example? “Create” more nations – this not only increases structure, it may also help ease tensions. 
Option 2 is the most risky. Living in proximity of critical complexity cannot be accomplished in the framework of a conventional western-type democracy. The extreme turbulence which characterizes critically complex systems is most likely better dealt with in a technocratic, police-state setting, which severely limits personal freedom. Only a government which understands how to actively manage complexity on a vast scale may venture into similar territory. To our knowledge, solution 2 is not viable today. A better approach, therefore, is to adopt a mix of 1 and 3.

Terrorism surely constitutes one of the major concerns of modern democracies. The number of terrorist attacks has more than tripled in recent years. Contrary to popular belief, religion is not the main motivating factor. In terms of location, most instances of politically-fuelled violence and terrorism may be found in Asia, not in the Middle East. In fact, our research shows that Asia has a far greater complexity growth rate than the Middle East. Approximately one fourth of trans-national politically-motivated terrorist acts are inspired by religion. A similar share is accounted for by leftist militant organizations. Nearly 40% of terror acts are perpetrated by nationalist and separatist groups. As expected, there is no single clear cause. A mix of factors which ultimately lead to some form of social injustice, poverty, failing states or dysfunctional politics is what fuels terrorism. This suggests that the problem is indeed due to very high complexity. We are also painfully aware of the fact that modern democracies naturally lack efficient tools to effectively deal with highly complex socio-political, ethnic and religious problems, without neglecting the fundamental economic and ecological dimensions.

Where can terrorism develop with greater ease? Terrorists need to hide. For this reason they thrive in high-entropy environments, such as failing or rogue states, where there is little social structure. It is in highly complex societies (which does not mean developed ones) that terror groups find geo-political sanctuaries. High complexity, as mentioned, comes in many forms:
  • Little structure but high entropy (Third World countries)
  • Much structure, low entropy (Western democracies)
  • Much structure, high entropy  (the future global society)
Terror groups generally prefer high, entropy-dominated complexity because of the Principle of Incompatibility: high complexity implies low precision. This means that hunting them down - essentially an intelligence-driven exercise - is difficult because of the lack of precise information, laws on privacy, etc. Because complexity is quickly increasing globally, it will become increasingly difficult to identify terror groups, especially in ambiguous countries, i.e. those which harbour terrorists but are willing to turn a blind eye. The problem with Western countries is that they are becoming more permissive and tolerant, leading to an overall erosion of social structure in favour of entropy. In underdeveloped countries it is almost impossible to create new social structure, hence it is entropy that causes the increase of complexity. In the West, the more intricate social structure is being eroded by loss of moral values and relativism. The result? In both cases, an increase in complexity. Following the above logic, we can state that:
  • High complexity is necessary (but not sufficient) to lead to terrorism.
  • Terrorism is an almost “obvious” consequence of a highly complex world.
  • The Principle of Incompatibility and terrorism are intimately linked.
Can complexity be used to anticipate conflicts, crises and failing states? The answer is affirmative. It is evident that a society or country in the proximity of its critical complexity is far more likely to enter a state of conflict, such as civil war, or simply to declare war on a neighbouring country. The conditions that a society must satisfy in order to switch to a conflict mode are multiple. As history teaches, there is no established pattern. Many factors concur. But it is clear that it is more difficult to take a well-functioning and prosperous society to war than one which is fragile and dominated by entropy. In a society in which the entropy-saturated structure is eroded, the distance that separates a “peace mode” from a “conflict mode” is much smaller and switching is considerably easier. The idea, therefore, is to measure and track complexity region by region, country by country, and to keep an eye on those countries and regions where high complexity gradients are observed. Regions where complexity increases quickly are certainly candidates for social unrest or armed conflict. How can this be accomplished? What kind of data should be used? Good candidates are:

•    Birth-rate
•    Death-rate
•    Debt-external
•    Electricity-consumption
•    Electricity-production
•    Exports
•    GDP
•    GDP-per capita
•    GDP-real growth
•    Highways
•    Imports
•    Infant Mortality
•    Inflation rate
•    Internet users
•    Labour force
•    Life expectancy
•    Military expenses
•    Oil-consumption
•    Oil-production
•    Population
•    Telephones mobiles
•    Telephones-main lines
•    Total fertility rate
•    Unemployment rate 

The list is of course incomplete, as there are tens of other indicators which must be taken into account. Based on historical data such as that listed above, Ontonix has conducted comprehensive analyses of the World’s complexity and its rate of growth. It has emerged that if the current trend is maintained, our global society shall reach criticality around 2045-2050. What does this mean? The high amount of complexity will make it extremely difficult to govern societies or to make decisions of a political nature. Under similar conditions, self-inflicted extinction will be highly likely. Although, from a global perspective, the World is still almost half a century away from its critical state, there are numerous regions of the World in which societies are nearly critical and extremely difficult to grow and govern. Many parts of Africa, the Middle East or South East Asia are just a few examples. But Western democracies are also in danger. Highly sophisticated and peaceful societies, too, are increasingly fragile because of a rapid increase of rights, freedom, tolerance or relativism.
It is interesting to note how the global robustness of the world has dropped from 77% in 2003, to 68% in 2004. Similarly, in the same period complexity has increased from 6.3 to 8.1, while the corresponding critical complexity has risen from 8.1 to 9.6. Critical complexity increases because globally speaking the world’s economy is growing. This is of course positive. However, this growth is lower than the growth of complexity. The two values will cross around 2045-2050.

All ancient civilizations have collapsed. This is because, for a variety of reasons, they reached their critical complexity and were unable to cope with the resulting fragility. Critical complexity becomes a severe liability for a species, especially once it acquires powers more than sufficient for its own self-destruction. Fragile civilizations are vulnerable, and their most likely fate is self-inflicted destruction.

If we fail to cope with and, ultimately, move safely past criticality, there will be no second chance; no other civilization will take over. Clearly, the biological lifetime of our species is likely to be several million years, even if we do our worst, but as far as technological and social progress is concerned, that will essentially be it. Globalization of course accelerates the increase of complexity until criticality is reached. Critical complexity, on the other hand, is the hurdle that prevents evolution beyond self-inflicted extinction. Since none of the ancient (and not so ancient) civilizations have ever evolved beyond critical complexity - in fact, they’re all gone - they were all pre-critical civilizations. There has never been a post-critical civilization on Earth. The only one left that has a chance of becoming a post-critical one is, of course, ours. But what conditions must a civilization meet in order to transition beyond criticality? Essentially two. First, it must lay its hands on technology to actively manage complexity. Second, it must have enough time to employ it on a vast and global scale. Complexity management technology was introduced by Ontonix in 2005. This leaves us with about 40-45 years.

Sunday 8 September 2013

How resilient are the big US IT companies?


We've analysed America's big IT players as a system. The analysis has been performed using stock market data, in particular the stock values. The result is a three-star resilience rating. Not bad, but nothing to celebrate. Below is the Complexity Map.





The analysis has been performed using our Resilience Rating system. Try it for FREE here.



www.ontonix.com