Saturday 24 August 2013

The 18 Truths About Complexity




A paper copyrighted in 1998, How Complex Systems Fail, written by the physician Dr. Richard Cook, describes 18 truths about the underlying reasons complex systems break down. On the surface the list appears surprisingly simple, but it carries a deeper meaning. Some of the points are obvious, while others may surprise you.

We reproduce the paper's eighteen points below, together with the accompanying commentary.


THE EIGHTEEN TRUTHS
 
"The first few items explain that catastrophic failure only occurs when multiple components break down simultaneously:
1. Complex systems are intrinsically hazardous systems. The frequency of hazard exposure can sometimes be changed but the processes involved in the system are themselves intrinsically and irreducibly hazardous. It is the presence of these hazards that drives the creation of defenses against hazard that characterize these systems.
2. Complex systems are heavily and successfully defended against failure. The high consequences of failure lead over time to the construction of multiple layers of defense against failure. The effect of these measures is to provide a series of shields that normally divert operations away from accidents.
3. Catastrophe requires multiple failures - single point failures are not enough. Overt catastrophic failure occurs when small, apparently innocuous failures join to create opportunity for a systemic accident. Each of these small failures is necessary to cause catastrophe but only the combination is sufficient to permit failure.
4. Complex systems contain changing mixtures of failures latent within them. The complexity of these systems makes it impossible for them to run without multiple flaws being present. Because these are individually insufficient to cause failure they are regarded as minor factors during operations.
5. Complex systems run in degraded mode. A corollary to the preceding point is that complex systems run as broken systems. The system continues to function because it contains so many redundancies and because people can make it function, despite the presence of many flaws.
Point six is important because it clearly states that the potential for failure is inherent in complex systems. For large-scale enterprise systems, the profound implications mean that system planners must accept the potential for failure and build in safeguards. Sounds obvious, but too often we ignore this reality:
6. Catastrophe is always just around the corner. The potential for catastrophic outcome is a hallmark of complex systems. It is impossible to eliminate the potential for such catastrophic failure; the potential for such failure is always present by the system's own nature.
Given the inherent potential for failure, the next point describes the difficulty in assigning simple blame when something goes wrong. For analytic convenience (or laziness), we may prefer to distill narrow causes for failure, but that can lead to incorrect conclusions:
7. Post-accident attribution of accidents to a 'root cause' is fundamentally wrong. Because overt failure requires multiple faults, there is no isolated 'cause' of an accident. There are multiple contributors to accidents. Each of these is necessarily insufficient in itself to create an accident. Only jointly are these causes sufficient to create an accident.
The next group goes beyond the nature of complex systems and discusses the all-important human element in causing failure:
8. Hindsight biases post-accident assessments of human performance. Knowledge of the outcome makes it seem that events leading to the outcome should have appeared more salient to practitioners at the time than was actually the case. Hindsight bias remains the primary obstacle to accident investigation, especially when expert human performance is involved.
9. Human operators have dual roles: as producers & as defenders against failure. The system practitioners operate the system in order to produce its desired product and also work to forestall accidents. This dynamic quality of system operation, the balancing of demands for production against the possibility of incipient failure is unavoidable.
10. All practitioner actions are gambles. After accidents, the overt failure often appears to have been inevitable and the practitioner's actions as blunders or deliberate wilful disregard of certain impending failure. But all practitioner actions are actually gambles, that is, acts that take place in the face of uncertain outcomes. That practitioner actions are gambles appears clear after accidents; in general, post hoc analysis regards these gambles as poor ones. But the converse, that successful outcomes are also the result of gambles, is not widely appreciated.

11. Actions at the sharp end resolve all ambiguity. Organizations are ambiguous, often intentionally, about the relationship between production targets, efficient use of resources, economy and costs of operations, and acceptable risks of low and high consequence accidents. All ambiguity is resolved by actions of practitioners at the sharp end of the system. After an accident, practitioner actions may be regarded as 'errors' or 'violations' but these evaluations are heavily biased by hindsight and ignore the other driving forces, especially production pressure.
Starting with the nature of complex systems and then discussing the human element, the paper argues that sensitivity to preventing failure must be built in ongoing operations. In my experience, this is true and has substantial implications for the organizational culture of project teams:
12. Human practitioners are the adaptable element of complex systems. Practitioners and first line management actively adapt the system to maximize production and minimize accidents. These adaptations often occur on a moment by moment basis.
13. Human expertise in complex systems is constantly changing. Complex systems require substantial human expertise in their operation and management. Critical issues related to expertise arise from (1) the need to use scarce expertise as a resource for the most difficult or demanding production needs and (2) the need to develop expertise for future use.
14. Change introduces new forms of failure. The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures. Because these new, high consequence accidents occur at a low rate, multiple system changes may occur before an accident, making it hard to see the contribution of technology to the failure.
15. Views of 'cause' limit the effectiveness of defenses against future events. Post-accident remedies for "human error" are usually predicated on obstructing activities that can "cause" accidents. These end-of-the-chain measures do little to reduce the likelihood of further accidents.
16. Safety is a characteristic of systems and not of their components. Safety is an emergent property of systems; it does not reside in a person, device or department of an organization or system. Safety cannot be purchased or manufactured; it is not a feature that is separate from the other components of the system. The state of safety in any system is always dynamic; continuous systemic change insures that hazard and its management are constantly changing.
17. People continuously create safety. Failure free operations are the result of activities of people who work to keep the system within the boundaries of tolerable performance. These activities are, for the most part, part of normal operations and superficially straightforward. But because system operations are never trouble free, human practitioner adaptations to changing conditions actually create safety from moment to moment.
The paper concludes with a ray of hope for those who have been through the wars:
18. Failure free operations require experience with failure. Recognizing hazard and successfully manipulating system operations to remain inside the tolerable performance boundaries requires intimate contact with failure. More robust system performance is likely to arise in systems where operators can discern the "edge of the envelope". It also depends on providing calibration about how their actions move system performance towards or away from the edge of the envelope."


www.ontonix.com


 

Friday 23 August 2013

The Principle of Fragility



The following equation, which we call the Principle of Fragility, was coined by Ontonix in early 2005 and indicates why complexity management is a form of risk management:


Complexity × Uncertainty = Fragility


In order to understand the Principle of Fragility, let us borrow Fourier's idea of separation of variables and create a useful parallel. Let us assume, without loss of generality, that the term "Complexity" is specific to a certain system, e.g. a corporation, while the term "Uncertainty" captures the degree of turbulence (entropy) in the environment in which the system operates, e.g. a market. The equation then assumes the following form:

C_system × U_environment = Fragility

or, in the case of a business,

C_business model × U_market = Fragility

What the equation states is that in a market of given turbulence a more complex business model will be more fragile (more exposed). In practical terms, the equation may be seen as a mathematical version of Ockham's razor: all things being equal, the less complex solution is preferable.
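For readers who like numbers, here is a minimal sketch of the principle in Python. The scores below are hypothetical and normalised to [0, 1]; they are not the complexity and uncertainty measures computed by OntoSpace, but they illustrate the arithmetic.

# Toy illustration of the Principle of Fragility: F = C x U.
# The scores below are hypothetical, normalised to [0, 1]; they are not
# the proprietary complexity and uncertainty measures computed by OntoSpace.

def fragility(complexity, uncertainty):
    """Fragility as the product of system complexity and environmental uncertainty."""
    return complexity * uncertainty

market_turbulence = 0.6                       # U_market, the same for both candidates

candidates = [
    {"name": "lean business model",      "complexity": 0.3},
    {"name": "intricate business model", "complexity": 0.8},
]

for model in candidates:
    f = fragility(model["complexity"], market_turbulence)
    print(f'{model["name"]}: fragility = {f:.2f}')

# In the same market the less complex model is the less fragile one -
# a numerical restatement of the Ockham's razor argument above.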




Thursday 22 August 2013

Complexity Maps Get a Facelift.



Business Structure Maps (also known as Complexity Maps) now have a new look and feel. The size of each node (variable) is now a function of its importance, or footprint, on the system as a whole. This makes maps much easier to read, as it is immediately clear where the important things are and where to start solving problems. The larger nodes are where there is more leverage, and this is where one should concentrate.
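For those who wish to mimic the idea on their own data, here is a minimal sketch (using the networkx and matplotlib libraries) in which node size is scaled by a hypothetical importance score; the real footprint values in a Business Structure Map come from the complexity profile computed by OntoSpace and are not reproduced here.

# Minimal sketch: draw a small dependency map in which node size is scaled by
# a hypothetical "footprint" (importance) score, in the spirit of the new
# Business Structure Map rendering. Requires the networkx and matplotlib libraries.
import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical variables and their footprint on the system (0 to 1).
footprint = {"Revenue": 0.9, "Costs": 0.7, "Headcount": 0.3,
             "Inventory": 0.4, "Cash": 0.6}

G = nx.Graph()
G.add_edges_from([("Revenue", "Costs"), ("Revenue", "Cash"),
                  ("Costs", "Headcount"), ("Costs", "Inventory"),
                  ("Inventory", "Cash")])

sizes = [3000 * footprint[n] for n in G.nodes()]   # node area proportional to importance
nx.draw_networkx(G, node_size=sizes, node_color="lightsteelblue")
plt.axis("off")
plt.show()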

Numerous interactive examples of maps may be seen here.



Get your own maps here.


www.ontonix.com

Friday 16 August 2013

Complexity Negatively Impacts Portfolio Returns.




In a recent blog post we pointed to research conducted at the EPFL in Lausanne, Switzerland, which confirms that complexity negatively impacts portfolio returns. The research has now been concluded and the full report is available here.

The research was conducted using Ontonix's on-line system for measuring the complexity and resilience of businesses and portfolios.



www.ontonix.com




Thursday 15 August 2013

How Healthy Are the US Markets? A Look at a System of Systems.

The US stock market indices have been enjoying upward trends for a few months now. When analysed one by one, the situation appears to be very positive. Based on the last 60 days of trading and on the values of "Open", "High", "Low", "Volume", "Close" and "Adjusted Close", we have analysed the DJIA, S&P 500 and NASDAQ Composite markets separately and then as a single interacting system. Here are the results (analysis performed on August 15, 2013).

The DJIA. Resilience: 83%















The S&P 500. Resilience: 97%





  








The NASDAQ Composite. Resilience: 95%

















Because of the inter-dependencies that globalization has created, no system acts in isolation and no system should be analysed in isolation. All systems interact, forming a huge system of systems. To show how this can impact the big picture we have analysed the three markets simultaneously. This is the picture:

DJIA + S&P + NASDAQ. Resilience: 72%




















In the above map the first six red nodes correspond to the DJIA, the next six blue nodes to the S&P 500, and the remaining six to the NASDAQ Composite.

The combined markets have a resilience of 72% even though the three markets boast values of 83%, 97% and 95%, with an average of 92%. Surprised? We put together three components, each of which has a resilience of at least 83%, and the resulting system has a resilience of 72%! This is a great example of "the whole being actually less than the sum of the parts". So much for linear thinking.
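A rough way to reproduce the data assembly, though not our resilience measure, is to stack sixty days of the six quoted variables for each index into a single 18-variable dataset and look at the coupling across indices. The file names and the correlation-based proxy in the sketch below are assumptions for illustration only.

# Sketch: assemble the three indices into one "system of systems" dataset.
# Assumes 60 trading days of data exported to CSV files with the columns
# Open, High, Low, Close, Volume, Adj Close (the file names are hypothetical).
# The cross-correlation figure is a crude stand-in, not the resilience measure above.
import pandas as pd

files = {"DJIA": "djia_60d.csv", "SP500": "sp500_60d.csv", "NASDAQ": "nasdaq_60d.csv"}

frames = []
for name, path in files.items():
    df = pd.read_csv(path, index_col=0, parse_dates=True)
    df.columns = [f"{name}_{c}" for c in df.columns]   # e.g. DJIA_Close
    frames.append(df)

system = pd.concat(frames, axis=1)                     # 18 interacting variables
print(system.shape)                                    # (60, 18) if each file has 60 rows

# Mean absolute correlation between variables belonging to *different* indices:
corr = system.corr().abs()
cross = [corr.loc[i, j] for i in corr.index for j in corr.columns
         if i.split("_")[0] != j.split("_")[0]]
print("mean cross-index coupling:", sum(cross) / len(cross))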




www.ontonix.com


www.rate-a-business.com




Wednesday 14 August 2013

Visual Analytics and Cognitive Computing - Ontonix Beats IBM




IBM has recently announced a new "way to program computers" - the so-called "cognitive computing" - see here - which uses "visual analytics" techniques to process (and display) data.

The method of "Visual Analytics" was pioneered by Ontonix over a decade ago, when we introduced and patented (in 2005) our model-free technique of data processing, which actually mimics the human brain. The method doesn't use conventional mathematics or math models - it just "sees data" and extracts workable conclusions and knowledge from it. As data gets richer the system "learns" and accumulates the new information in the form of experience and rules. See our recent blog post on "Computing Knowledge" here, where we speak of extracting "Cognitive Maps" from raw data. More on the subject can be found in this other recent article. On model-free methods, read here.

But there is more. We also measure the complexity of data as well as that of the resulting Cognitive Maps and, consequently, of the knowledge they contain. Knowledge and complexity are inextricably linked because complexity - which is measured in bits - actually quantifies the amount of structured information contained within a piece of data. But structured information is knowledge. More soon.
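As a loose, publicly known stand-in for "structured information measured in bits" (emphatically not our patented measure), one can estimate the mutual information between pairs of discretised variables: strongly coupled data shares many bits of information, random data almost none. The data in the sketch below is synthetic.

# Loose illustration: quantify "structure" in data as the mutual information
# (in bits) between pairs of discretised variables. This is NOT the patented
# Ontonix complexity measure, only a familiar information-theoretic proxy.
import numpy as np

def mutual_information_bits(x, y, bins=10):
    """Mutual information between two samples, estimated from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
structured = 2.0 * x + rng.normal(scale=0.1, size=5000)   # strongly coupled with x
noise = rng.normal(size=5000)                              # unrelated to x

print("structured pair:", round(mutual_information_bits(x, structured), 2), "bits")
print("random pair:    ", round(mutual_information_bits(x, noise), 2), "bits")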





www.ontonix.com



Tuesday 13 August 2013

How Complex is the World? We've Measured It.



Using annual data from the World Bank, spanning the last five decades, we have computed the complexity of the World seen as a system of interacting systems (countries). There are currently 196 countries (the US recognises only 195), each of which is monitored via a series of over 1,250 indicators, covering the economy, energy, transportation, education, health care, infrastructure, agriculture, the environment, telecommunications, finance, crime, military expenditure and so on. Doing the arithmetic (196 × 1,250) leads to approximately 250,000 parameters which describe the entire system, i.e. the World (List of indicators).

The results are quite astonishing - see the plot below, illustrating the evolution of complexity (middle curve) as well as of its upper (critical) and lower bounds.



As a system evolves and develops new functionality it becomes more complex. This is natural. The direction of evolution in our biosphere is a good example: from single-cell organisms to mammals. However, each system in nature also possesses a so-called critical complexity (green curve in the above plot). This too increases as the system evolves. The same may be said of the lower complexity bound (blue curve). When a system functions close to the blue curve, its behaviour is deterministic and predictable. The problem is to stay away from the critical complexity curve, because there things become chaotic and uncontrollable. The amount of chaos (uncertainty) in the system is reflected by its entropy, see the plot below.





From the 1960s through the 1980s, complexity as well as entropy grew steadily. In the 1990s things slowed down, and in the last decade both complexity and entropy have plateaued. This means that our rate of development and growth is slowing down. Keep in mind that we are not talking about the economy; we are looking at the entire system. The economy is only one way to "measure" a system, but there are many other facets too.

While it is not the goal of this article to provide an in-depth analysis of the reasons for and implications of the above, we would like to draw attention to a few points. First of all, the ratio between Complexity and Critical Complexity has almost doubled over the period 1960-2010 - see the plot below.


This means that, compared to the 1960s, it is almost twice as difficult to understand the world and to govern it: almost twice as difficult to make forecasts, to do business, to get things done. The world is becoming a more intricate place. However, the system seems to have settled into a sort of equilibrium. The fact that we are still far from critical complexity is good news; the bad news is that the system seems to be stuck. The mild evolution of all of the above curves tells us that the current economic crisis seems to be a symptom, not the cause, of this state of affairs.


The analysis was run at the CINECA Supercomputer Center in Bologna. The collaboration of QBT is also acknowledged.







Friday 9 August 2013

Italy Beats Moody's.


Moody's is the largest of the three major rating agencies. It employs 7,000 people worldwide and posted sales of 2.7 billion in 2012. Since rating agencies have been under heavy fire since the beginning of the financial crisis - in January 2011 the U.S. Financial Crisis Inquiry Commission stated that "The three rating agencies have been instrumental in triggering the financial collapse" - we decided to calculate their ratings. In particular, we chose Moody's because it is the largest credit rating agency and also because it is perhaps the one that has downgraded Italian debt most aggressively. Obviously, Moody's is publicly traded and therefore subject to the dynamics of the markets, like all listed companies.

In rating Moody's we did not assess the company's financial performance, its ability to meet its financial obligations, or its probability of default. In other words, we have not performed a calculation of the conventional rating, but have instead focused on another key feature for those who live in turbulent times: resilience, i.e. the ability of a company to resist and survive sudden and extreme events (natural disasters, failures of large companies or banks, financial contagion, etc.). Since the global economy is constantly exposed to extreme events and turbulence, which will become not only more intense but also more frequent, resilience becomes a feature of an economy or a business that can make the difference between survival and collapse.

For the analysis we used the quarterly information that Moody's publishes on its website. In particular, we used the following items:


  1. Net income
  2. Depreciation and amortization
  3. Weighted average shares outstanding Basic
  4. Provision for income taxes
  5. Total expenses
  6. Operating
  7. Income before provision for income taxes
  8. Revenue
  9. Selling general and administrative
  10. Operating income
  11. Earnings per share Basic
  12. Diluted
  13. Diluted
  14. Non-operating income (expense) net
  15. Interest income (expense) net
  16. Restructuring
  17. Other non-operating income (expense) net
  18. Net income attributable to Moody's
  19. Net income attributable to non-controlling interests
  20. Gain on sale of building



The Resilience Rating is as follows:



(see interactive Moody's Business Structure Map here).


The Resilience Rating of 72% is, on the scale of conventional ratings, equivalent to BBB-, one step above BB+, the first of the speculative ratings.

Given that Italy has often been targeted by Moody's, we wanted to compare the resilience of the two. Using macroeconomic data published by Eurostat, we obtained the following Resilience Rating:






(see interactive Business Structure Map of Italy here).




A Resilience Rating of over 75% places Italy two steps above Moody's, i.e. at the level of BBB+.

The result immediately raises the question: shouldn't he who has the power to judge others be the first to set a good example? Would you trust a coughing cardiologist who smokes while recording your electrocardiogram?

 

Thursday 8 August 2013

How is the Eurozone Doing? Still Extremely Fragile.

As EUROSTAT publishes new data, we update our quarterly analyses of the Complexity and Resilience of the Eurozone. The situation as of Q4 2012 is as follows:

Complexity. A growing economy necessarily becomes more complex (see the black curve below). At the same time, however, it is important to stay away from the so-called critical complexity (red curve). Before the crisis crippled the global economy things were proceeding relatively smoothly, although the two curves were already quite close. Since complexity peaked in early 2008 there has been a persistent reduction in complexity, equivalent to the loss and destruction of what had been created in the past. In mid-2011 the situation stabilised, but it remains dangerously close to critical complexity. In other words, the situation is one of extreme fragility. This means that the system is not in a condition to absorb shocks or contagion without major consequences. Moreover, there is no clear signal of recovery, apart from the mild growth of complexity in the second half of 2012.



It is interesting to look at complexity for the core 15 EU member states and for the 12 which joined later (for Croatia there is insufficient data to include it in the analysis). It appears that the group of 15 (red curve below) is indeed on the road to a mild and sustained recovery. The remaining 12 nations are still on a downward path, with indications of stabilisation.




However, what counts is the system as a whole. The resilience (robustness) of the EU27 system is indicated below. There is a mild upward trend, but the value of resilience is below 50%, which reflects extreme fragility. The system certainly does not contain triple-A components, as the rating agencies claim.


Based on the above plots one can infer that the crisis has so far destroyed approximately ten years of growth. And it is not over yet. The oscillatory character of the curves over the past 12-18 months suggests a state of prolonged stagnation. The next 3-4 quarters will tell for sure.


www.ontonix.com


You can run the above analysis yourself here: www.rate-a-business.com




Wednesday 7 August 2013

In a Globally Crippled Economy, Can There Be AAA-rated Countries?



According to S&P, the following countries have been rated AAA (see complete list of country ratings here):

United Kingdom
Australia
Canada
Denmark
Finland
Germany
Hong Kong
Liechtenstein
Luxembourg
Netherlands
Norway
Singapore
Sweden
Switzerland




Ten of the above countries are from Europe, the area of the globe that has been hit hardest by recession and public debt issues.

Because of globalisation we are all in the same boat - every economy is connected to (almost) every other economy. This is what is meant by interdependency. The global economy forms a densely connected network through which information travels at the speed of the Internet. So, if the economy is a global mess - we are actually talking of a meltdown, which sounds pretty dramatic - can there exist triple-A rated economies? In theory, yes. In theory anything is possible and nothing is impossible. But that of course depends on the theory.

How can a system that is severely crippled, impregnated with trillions upon trillions of derivatives and toxic financial products - of which nobody knows the total amount in circulation (some say 10 times, some say 15 times the World's GDP; the uncertainty alone reflects the severity of the problem) - contain so many large triple-A economies? Does that really make sense? In a state of metastasizing economic crisis, how can this be explained?

The problem is quite simple, really. Credit Rating Agencies are rating the wrong thing. They rate the Probability of Default (PoD) of a country (or a corporation). Instead, they should be rating other, more relevant characteristics of an economy, such as its resilience and complexity. Resilience (fragility) has nothing in common with performance. You can perform extremely well, and think you're like this:




but in reality you're like this:




Wouldn't you want to know? Isn't survival a nice reflection of success? More soon.


www.rate-a-business.com


www.ontonix.com



Monday 5 August 2013

Complexity Maps Get a Facelift.

Ontonix today announced the release of version 6.0 of its flagship software system, OntoSpace. The full press release is available here.

One of the salient new features is the new display of Business Structure Maps, illustrated below. Notice that each node of the map now has its own size, computed from the complexity profile, i.e. size is a function of the node's importance (footprint) on the entire system. This allows one to focus immediately on the important issues.


This is what the above map looked like in the previous version released in 2010.




But there is more. In very large cases, things become difficult to grasp (this happens not only with OntoSpace but in life in general). Consider, for example, the Business Structure Map of the EU (each group of nodes, depicted either in blue or red - alternating colours are used to enable users to distinguish the various groups - corresponds to a country).


Not very clear, is it? In fact, the map has 632 nodes which are interconnected by 41,671 rules! How do you go about analysing that? Well, you can't. For this reason OntoSpace v6.0 supports so-called Meta-maps, which are obtained from the above map by grouping variables into "meta-nodes" and by condensing all the interactions between any two meta-nodes into a single link. The result looks like this:


This is of course much clearer. A Meta-map is a nice way to represent a system of systems, whereby each node is itself a system with its own nodes (variables). More on OntoSpace v6.0 soon.
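Conceptually, a Meta-map can be thought of as a quotient graph: each country's variables collapse into one meta-node, and however many links exist between two groups collapse into a single link. Here is a minimal sketch with the networkx library; the node names and the country-prefix grouping are hypothetical.

# Sketch of a meta-map: collapse groups of variables into meta-nodes and
# condense all links between two groups into a single link.
# Node names and groupings are hypothetical (variables prefixed by country code).
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("IT_gdp", "IT_debt"), ("IT_gdp", "DE_exports"),
    ("DE_exports", "DE_gdp"), ("FR_gdp", "DE_gdp"),
    ("FR_gdp", "IT_debt"),
])

group_of = lambda var: var.split("_")[0]       # group variables by their country prefix

meta = nx.Graph()
for u, v in G.edges():
    gu, gv = group_of(u), group_of(v)
    if gu != gv:                               # interactions inside a group are absorbed
        meta.add_edge(gu, gv)                  # repeated cross-links collapse to one edge

print(meta.nodes(), meta.edges())              # three meta-nodes, linked pairwise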


www.ontonix.com


www.rate-a-business.com



Sunday 4 August 2013

Rating the Rating Agencies. And those who control them.


Who rates the Rating Agencies? Who rates those that award triple-A ratings to companies that fail the day after, or to junk bonds and toxic financial products that lead to global economic meltdown? The answer: nobody. Who controls them? Huge investment funds, such as BlackRock, for example. If you control a rating agency and you also control publicly listed companies, the circle is closed. An excellent book on the subject is "The Lords of Ratings" by P. Gila and M. Miscali.

Ontonix provides quarterly ratings of the resilience of corporations, banks, national economies and systems thereof. We also rate rating agencies. One in particular: Moody's. Here is their latest resilience rating:


This is equivalent to BBB-.

And here is the rating of one of the investment funds that controls Moody's, BlackRock:




They get a resilience rating of 83%, which corresponds to an AA. Surprising? Not really.


www.ontonix.com


www.rate-a-business.com



Saturday 3 August 2013

A Structured Look at Cellular Automatons




From Wikipedia: A cellular automaton is a discrete model studied in computability theory, mathematics, physics, complexity science, theoretical biology and microstructure modelling. It consists of a regular grid of cells, each in one of a finite number of states, such as "On" and "Off" (in contrast to a coupled map lattice). The grid can be in any finite number of dimensions. For each cell, a set of cells called its neighbourhood (usually including the cell itself) is defined relative to the specified cell. For example, the neighbourhood of a cell might be defined as the set of cells a distance of 2 or less from the cell. An initial state (time t=0) is selected by assigning a state for each cell. A new generation is created (advancing t by 1), according to some fixed rule (generally, a mathematical function) that determines the new state of each cell in terms of the current state of the cell and the states of the cells in its neighbourhood. For example, the rule might be that the cell is "On" in the next generation if exactly two of the cells in the neighbourhood are "On" in the current generation, otherwise the cell is "Off" in the next generation. Typically, the rule for updating the state of cells is the same for each cell and does not change over time, and is applied to the whole grid simultaneously, though exceptions are known.
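For readers who want to generate such patterns themselves, here is a minimal sketch of an elementary (one-dimensional, two-state, radius-one) cellular automaton; the Wolfram rule number encodes the update table in its eight binary digits.

# Elementary cellular automaton: a 1-D grid of two-state cells with a
# neighbourhood of radius one and periodic boundaries.
# The Wolfram rule number (0-255) gives the next state for each of the
# eight possible (left, centre, right) neighbourhoods.
import numpy as np

def run_eca(rule, width=101, steps=50):
    table = [(rule >> i) & 1 for i in range(8)]        # the rule's 8 output bits
    grid = np.zeros((steps, width), dtype=int)
    grid[0, width // 2] = 1                            # a single "on" cell in the middle
    for t in range(1, steps):
        left = np.roll(grid[t - 1], 1)
        centre = grid[t - 1]
        right = np.roll(grid[t - 1], -1)
        idx = 4 * left + 2 * centre + right            # neighbourhood as a 3-bit index
        grid[t] = np.take(table, idx)
    return grid

pattern = run_eca(30)                                  # the famous Rule 30
for row in pattern[:10]:
    print("".join("#" if cell else "." for cell in row))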

We have measured the complexity of, and extracted the complexity maps for, a few cellular automatons; these may be found here and are illustrated in the image below:




While humans are good at recognising patterns and structure, rapid classification of patterns in terms of their complexity is not easy. For example, which is more complex in the above figure, Rule 250 or Rule 190? The answer is below.


Rule 30



Rule 54




Rule 62




Rule 90




Rule 190




Rule 250




It appears that the Rule 250 automaton is the most complex of all (C = 186.25), while the one with the lowest complexity is Rule 90 (C = 64.31). Not very intuitive, is it? Intuition is given only to him who has undergone long preparation to receive it (L. Pasteur).
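A quick, admittedly crude way to build intuition for such rankings is to compare the compressed size of each automaton's space-time pattern, reusing the run_eca sketch given earlier in this post. This Kolmogorov-style proxy tends to rank chaotic patterns such as Rule 30 highest and need not agree with the structure-based measure reported above; different notions of complexity can order the same patterns quite differently.

# Crude complexity proxy: the length of the zlib-compressed space-time pattern.
# Reuses run_eca from the sketch at the top of this post; the ranking produced
# here is a Kolmogorov-style estimate, not the OntoSpace measure quoted above.
import zlib

for rule in (30, 54, 62, 90, 190, 250):
    pattern = run_eca(rule, width=101, steps=100)
    raw = pattern.astype("uint8").tobytes()
    print(f"Rule {rule}: compressed size = {len(zlib.compress(raw))} bytes")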






www.ontonix.com