Saturday 21 September 2013

Ontonix S.r.l.: Measuring the magnitude of a crisis






How can you measure the magnitude of an economic crisis? By the number of lost jobs, foreclosures, the drop in GDP, the number of defaulting banks and corporations, deflation? Or by the drop in stock-market indices? All of these parameters do indeed reflect the severity of a crisis. But how about a single, holistic index which takes them all into account? That index is complexity and, in particular, its variation. Consider, for example, the US sub-prime crisis. The complexity of the US housing market in the period 2004-2009 is illustrated in the above plot. A total of fifty market-specific parameters were used to perform the analysis, in addition to fifteen macroeconomic indicators such as the ones mentioned above. The "bursting bubble" manifests itself as a complexity increase from a value of approximately 19 to around 32. With respect to the initial value this is an increase of roughly 68%. The arrow in the above plot indicates this jump in complexity, and this number represents a systemic measure of how profound the US housing market crisis is.

In summary, the magnitude of a crisis can be measured as follows:

M = | C_i - C_f | / C_i

where C_i is the value of complexity before the crisis and C_f its value during the crisis. The intensity of a crisis can be measured as the rate of change of complexity, dC/dt.
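In code, the magnitude formula is a one-liner. A minimal Python sketch (the function name is illustrative; the complexity values come from the housing-market example above):

```python
def crisis_magnitude(c_initial, c_final):
    """Magnitude of a crisis: M = |C_i - C_f| / C_i."""
    return abs(c_initial - c_final) / c_initial

# US housing market example: complexity rose from roughly 19 to roughly 32
m = crisis_magnitude(19.0, 32.0)
print(m)  # about 0.68, i.e. a ~68% jump relative to the pre-crisis value
```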

Serious science starts when you begin to measure.




Friday 20 September 2013

Complexity: The Fifth Dimension





When complexity is defined as a function of structure, entropy and granularity, examining its dynamics reveals its fantastic depth and phenomenal properties. The process of complexity computation materializes in a particular mapping of a state vector onto a scalar. What is surprising is how such a simple process can enshroud such an astonishingly rich spectrum of features and characteristics. Complexity does not possess the properties of an energy and yet it expresses the "life potential" of a system in terms of the modes of behaviour it can deploy. In a sense, complexity, the way we measure it, reflects the degree of fitness of an autonomous dynamical system that operates in a given Fitness Landscape. This statement by no means implies that higher complexity leads to higher fitness. In fact, our research shows the existence of an upper bound on the complexity a given system may attain. We call this limit critical complexity. We know that in the proximity of this limit the system in question becomes delicate and fragile, and that operation close to this limit is dangerous. There surely exists a "good value" of complexity - one which corresponds to a fraction, ß, of the upper limit - that maximizes fitness:

C_max fitness = ß · C_critical

We don't know what the value of ß is for a given system and we are not sure how it may be computed. However, we think that the fittest systems are able to operate around a good value of ß. Fit systems can potentially deploy a sufficient variety of modes of behaviour so as to respond better to a non-stationary environment (ecosystem). The dimension of the modal space of a system ultimately equates to its degree of adaptability. Close to critical complexity the number of modes, as we observe, increases rapidly but, at the same time, the probability of spontaneous (and unwanted) mode transitions also increases quickly. This means the system can suddenly undertake unexpected and potentially self-compromising actions (just like adolescent humans).





Wednesday 18 September 2013

Stocks, Crowdrating and the Democratization of Ratings


We have always claimed that the process of rating a business and its state of health should be more transparent, objective and affordable, even for the smallest of companies. With this goal in mind Ontonix has launched the World's first do-it-yourself rating system - Rate-A-Business - which allows any individual to upload the financials of a company and to obtain, in a matter of seconds, a measure of its state of health. The system works, of course, for both publicly listed and private companies. Essentially, the tool shifts rating from a duopoly of huge Credit Rating Agencies to the Internet, the World's central nervous system. Rating becomes, de facto, a commodity. In order to make our global economy healthier, and to reduce the impact of future crises, it is paramount to transform rating from a luxury into a commodity. Today, it is possible to know one's cholesterol levels, for example, for just a few dollars. The information is not reserved for the rich. Similarly, the rating of a business - its state of health, or resilience - is something that every company should know, even the tiniest SME. This is the philosophy that has driven Ontonix to develop Rate-A-Business.

Assetdyne takes things even further. The company also provides a real-time rating system. Even though the system developed by Assetdyne focuses on publicly listed companies and portfolios of their stocks, it too introduces a fundamentally new element into the process of rating - so-called crowd-rating. The value of the stock of a company is the result of a complex interplay of millions of traders, analysts, investors, trading robots, etc. Ultimately, it is a reflection of the reputation and perceived value of a particular company and is the result of a democratic process. Clearly, the value of a stock is also driven by market trends, sector analyses, rumors, insider trading and other illicit practices and, evidently, by the Credit Rating Agencies themselves. However, it is undeniably the millions of traders who ultimately drive the price and dynamics of stocks according to the basic principles of supply and demand. In practice, we're talking of a planet-wide democratic process of crowd-rating - it is the crowd of traders and investors that decides how much you pay for a particular stock.

What Assetdyne does is use the information on the value and dynamics of the price of stocks to actually compute a rating. The rating that is computed does not reflect the Probability-of-Default (PoD) of a particular company - this is the popular "AAA" kind of rating - it reflects the "resilience" of a given stock (and hence of the company behind it). Resilience is the capacity to resist shocks, a frequent phenomenon in our turbulent economy. Resilience, besides being a very useful measure of the state of health of any kind of system, not just of a corporation, possesses one very important characteristic - its computation is based on the measure of complexity. It so happens that complexity is the hallmark of our economy, of our times. The rating system developed by Assetdyne therefore delivers the following additional information:

Stock Complexity - this measures how "chaotic" the evolution of a stock is. In other words, we're talking of an advanced measure of volatility. Complexity is measured in bits. The complexity values of different stocks may therefore be directly compared.

Stock Resilience - this measures how well the stock price reacts to shocks and extreme events. Values range from 0% to 100%.

As the computation of the complexity and resilience of a stock is based on closing values at the end of each trading day, the corresponding values also change on a daily basis.
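Ontonix's complexity metric itself is proprietary, but the idea of expressing the "chaoticness" of a price series in bits can be illustrated with a crude stand-in: the Shannon entropy of the binned daily returns. Everything below (function name, bin count, toy price series) is an assumption for illustration only, not Assetdyne's algorithm:

```python
import math
from collections import Counter

def entropy_bits(closes, n_bins=8):
    """Shannon entropy (in bits) of the binned daily returns of a price
    series -- a crude, illustrative proxy for 'chaoticness'."""
    returns = [(b - a) / a for a, b in zip(closes, closes[1:])]
    lo, hi = min(returns), max(returns)
    width = (hi - lo) / n_bins or 1.0   # avoid zero bin width for a flat series
    counts = Counter(min(int((r - lo) / width), n_bins - 1) for r in returns)
    n = len(returns)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

steady = [100.0] * 30        # perfectly predictable price: 0 bits
swing = [100.0, 105.0] * 15  # price whipsaws between two levels: ~1 bit
print(entropy_bits(steady), entropy_bits(swing))
```

A perfectly flat series carries zero bits, while a series that keeps jumping between levels carries more; in this sense a "complexity in bits" behaves like an information-theoretic measure of volatility.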

An example is illustrated below.




Assetdyne's rating system is also applicable to portfolios of stocks. The example below illustrates a small portfolio of oil & gas companies.




An important aspect of this particular rating technique is that it is not based on financial reports (Balance Sheets, Income Statements, Cash Flow, etc.), which are of a highly subjective nature. Companies construct their balance sheets so as to paint a more optimistic picture, and conventional PoD-type ratings are therefore inevitably influenced. While financial statements and the resulting PoD ratings are subjective (recall the multitude of triple-A-rated companies that defaulted all of a sudden, triggering the current crisis) to the point that governments have sued Credit Rating Agencies, stocks represent a considerably more objective reflection of the real state of affairs. Most importantly, the information is known to everyone. Of course, markets are not always right and the price may be wrong, but the process of converging to a given price is as objective and democratic as things in this world can get.

One could conclude that the World's stock markets constitute one huge social network which plays a global game called trading. As the game is played, one of its outcomes is the price of stocks. The price may be "wrong", it may be manipulated but it is what it is. It is the result of the mentioned crowd-rating and Assetdyne uses it to provide new important information on complexity and resilience rating of stocks and portfolios. Innovation in finance is possible.


www.assetdyne.com




Tuesday 17 September 2013

Complexity Introduced to Stock and Portfolio Analysis and Design


Modern Portfolio Theory (MPT) was introduced in 1952 by Markowitz. As described in Wikipedia, "MPT is a theory of finance that attempts to maximize portfolio expected return for a given amount of portfolio risk, or equivalently minimize risk for a given level of expected return, by carefully choosing the proportions of various assets. Although MPT is widely used in practice in the financial industry and several of its creators won a Nobel memorial prize for the theory, in recent years the basic assumptions of MPT have been widely challenged by fields such as behavioral economics.

MPT is a mathematical formulation of the concept of diversification in investing, with the aim of selecting a collection of investment assets that has collectively lower risk than any individual asset. This is possible, intuitively speaking, because different types of assets often change in value in opposite ways. For example, to the extent prices in the stock market move differently from prices in the bond market, a collection of both types of assets can in theory face lower overall risk than either individually. But diversification lowers risk even if assets' returns are not negatively correlated—indeed, even if they are positively correlated.

More technically, MPT models an asset's return as a normally distributed function (or more generally as an elliptically distributed random variable), defines risk as the standard deviation of return, and models a portfolio as a weighted combination of assets, so that the return of a portfolio is the weighted combination of the assets' returns. By combining different assets whose returns are not perfectly positively correlated, MPT seeks to reduce the total variance of the portfolio return. MPT also assumes that investors are rational and markets are efficient."
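To make the quoted formulation concrete: for a covariance matrix Σ of asset returns, the fully invested minimum-variance portfolio has the closed-form weights w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A small numpy sketch, with an invented covariance matrix for three assets:

```python
import numpy as np

# Invented annualised covariance matrix for three hypothetical assets
sigma = np.array([[0.040, 0.006, 0.002],
                  [0.006, 0.090, 0.010],
                  [0.002, 0.010, 0.160]])

ones = np.ones(3)
inv = np.linalg.inv(sigma)
w = inv @ ones / (ones @ inv @ ones)  # minimum-variance weights, summing to 1
port_var = float(w @ sigma @ w)       # equals 1 / (1' Σ^-1 1)
print(w, port_var)
```

The resulting portfolio variance comes out below the variance of the safest single asset (0.040) even though every covariance here is positive - exactly the diversification effect the quote describes.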

Since 1952 the world has changed. It has changed even more in the past decade, when complexity and turbulence have made their permanent entry on the scene. Turbulence and complexity are not only the hallmarks of our times, they can be measured, managed and used in the design of systems, in decision-making and, of course, in asset portfolio analysis and design.

Assetdyne is the first company to have incorporated complexity into portfolio analysis and design. In fact, the company develops a system which computes the Resilience Rating of stocks and stock portfolios based on complexity measures, not on variance or other traditional approaches. While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios based on the minimization of portfolio complexity. The approach is based on the fact that excessively complex systems are inherently fragile. Recently concluded research confirms that this is also the case for asset portfolios.


Two examples of Resilience Rating of a single stock are illustrated below:




An example of a Resilience Rating of a portfolio of stocks is shown below (Top European banks are illustrated as an interacting system):


while the rating and complexity measures are the following:


The interactive map of the EU banks may be navigated on-line here.


For more information, visit Assetdyne's website.



Sunday 15 September 2013

If You Really Need to Optimize


Optimal solutions are fragile and should generally be avoided. This unpopular statement enjoys substantial practical and philosophical argumentation and now, thanks to complexity, we can be even more persuasive. However, this short note is about making optimisation a bit easier. If you really insist on pursuing optimality, there is an important point to keep in mind.

Let us examine the case illustrated below: designing a composite conical structure in which the goal is to keep the mass under control, as well as the fluxes and the axial and lateral frequencies. The System Map shown below reflects which design variables (ply thicknesses, shown in red) influence the performance of the structure in the nominal (initial) configuration, prior to optimisation. In addition, the map also illustrates how the various outputs (blue nodes) relate to each other.




In fact, one may conclude that, for example, the following relationships exist:
  • t_20013 - Weight
  • Weight - Axial frequency
  • Min Flux - Max Flux
  • t_20013 controls the Lateral Frequency
  • t_20013 also controls the Axial Frequency
  • Lateral Frequency and Axial Frequency are related to each other
  • etc.
As one may conclude, the outputs are tightly coupled: if you change one you cannot avoid changing the others. Let's first see how optimisation is handled when one faces multiple - often conflicting - objectives:

minimise y = COST (y_1, y_2, ..., y_n) where y_k stands for the k-th performance descriptor (e.g. mass, stiffness, etc.). In many cases weights are introduced as follows:

minimise y = COST (w_1  *  y_1,  w_2  * y_2, ..., w_n  * y_n).

The fundamental problem with such a formulation (and all similar MDO-type formulations) is that the various performance descriptors are often dependent (just as the example above indicates) and the analyst doesn't know it. The cost function indicated above is a mathematical statement of a conflict, whereby the y's compete for protagonism. This competition is driven by an optimisation algorithm which knows nothing of the structure of the corresponding System Map or of the existence of the relationships contained therein. Imagine, for example, that you are trying to reduce one variable (e.g. mass) and, at the same time, increase another (e.g. frequency). Suppose also that you don't know that these two variables are strongly related to each other: the relationship typically looks like this: f = SQRT(k/m). Here, f and m, outputs of the problem, are related - changing one modifies the other. This is inevitable. In a more intricate situation, in which hundreds of design variables are involved, along with tens or hundreds of performance descriptors, the problem becomes numerically tough and the optimisation algorithm has a very hard time. What is the solution?
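The f = SQRT(k/m) coupling is easy to see numerically: shaving mass to save weight raises the frequency whether the optimiser "wants" it or not. A toy one-degree-of-freedom example (the stiffness and mass values are invented):

```python
import math

def natural_frequency(k, m):
    """1-DOF spring-mass natural frequency: f = sqrt(k/m) / (2*pi), in Hz."""
    return math.sqrt(k / m) / (2.0 * math.pi)

k = 2.0e6                          # stiffness [N/m], invented
m_before, m_after = 100.0, 80.0    # a 20% mass saving [kg]

f_before = natural_frequency(k, m_before)
f_after = natural_frequency(k, m_after)
print(f_before, f_after)  # the frequency rises along with the mass cut
```

An optimiser asked to reduce m and, independently, to hold f fixed is being asked to fight this relationship - the conflict is built into the physics, not into the algorithm.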

If you cannot avoid optimisation, then we suggest the following approach:
  • Define your baseline design.
  • Run a Monte Carlo Simulation, in which you randomly perturb the design variables (inputs).
  • Process the results using OntoSpace, obtaining the System Map.
  • Find the INDEPENDENT outputs (performance descriptors) or, in case there aren't any, those outputs which have the lowest degree in the System Map. There are tools in OntoSpace that help to do this.
  • Build your cost function using only those variables, leaving the others out.
This approach "softens" the problem from a numerical point of view and reduces the mentioned conflicts between output variables. Attempting to formulate a multi-disciplinary problem without knowing a-priori how the various disciplines interact (i.e. without the System Map) is risky, to say the least.
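The System Map itself is computed by OntoSpace, which is proprietary; purely as a sketch of the recipe above, one can approximate steps 2-4 with a Monte Carlo run and a correlation graph, treating two outputs as linked when their |correlation| exceeds a threshold and selecting the output with the lowest degree. All variable definitions and the 0.3 threshold below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Step 2: randomly perturb the design variables (inputs) around a baseline
x1 = rng.normal(1.0, 0.1, n)   # e.g. a ply thickness
x2 = rng.normal(2.0, 0.1, n)   # e.g. another design variable

# Toy performance descriptors: y0 and y1 share inputs, y2 does not
y0 = x1 * x2                  # "mass"-like output
y1 = np.sqrt(x2 / x1)         # "frequency"-like output, f = sqrt(k/m) in spirit
y2 = rng.normal(0.0, 1.0, n)  # driven by something else entirely

# Steps 3-4: correlation graph as a stand-in for the System Map
corr = np.corrcoef(np.vstack([y0, y1, y2]))
links = (np.abs(corr) > 0.3) & ~np.eye(3, dtype=bool)
degree = links.sum(axis=1)     # number of strong couplings per output
best = int(np.argmin(degree))  # least-coupled output for the cost function
print(degree, best)
```

In this toy run the "mass" and "frequency" outputs show up as strongly coupled, while the third output has degree zero and is the natural candidate for the cost function (step 5).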






www.design4resilience.com


Saturday 14 September 2013

Optimality: The Recipe for Disaster



There still seems to be a rush towards optimal design. But there is no better way to fragility and vulnerability than the pursuit of peak performance and perfection - optimality, in other words. Let's take a look at the logic behind this risky and outdated practice:
  •  Based on a series of assumptions, a math model of a system/problem is built.
  •  Hence, we already have the first departure from reality: a model is just a model.
  •  If you're good, really good, the model will "miss" 10% of reality.
  •  You then squeeze peak performance out of this model according to some objective function.
  •  You then manufacture the real thing based on what the model says.
It is known - obviously not to all - that optimal designs may be well-behaved with respect to random variations in the design parameters but, at the same time, hyper-sensitive to small variations in the variables that have been left out in the process of building the model. This is precisely what happens: you design for or against something, but you miss something else. By sweeping seemingly innocent variables under the carpet, you are deciding a-priori what the physics will be like. And that you cannot do. Now, if your model isn't forced to be optimal - to go to the limit - it may stand a better chance, in that it will have room for manoeuvre. When you're optimal, you can only get worse! If you're standing on the peak of a mountain, the only way is down! Why, then, even attempt to design and build systems that are optimal and can only get worse? Is this so difficult to see? Sounds like the Emperor's new clothes, doesn't it?
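A toy numerical illustration of the argument (all numbers are invented): compare an "optimal" design sitting on a razor-thin performance peak with a slightly suboptimal design sitting on a broad plateau, then let reality shift the operating point by a small amount the model ignored.

```python
def performance(x):
    """Nominal performance model: a tall, razor-thin peak at x = 0
    plus a slightly lower but much flatter plateau around x = 3."""
    peak = 1.00 / (1.0 + (x / 0.05) ** 2)
    plateau = 0.90 / (1.0 + (x - 3.0) ** 2)
    return peak + plateau

x_optimal, x_robust = 0.0, 3.0
print(performance(x_optimal), performance(x_robust))  # "optimal" wins on paper

# An unmodelled effect shifts the real operating point by just 0.2:
shift = 0.2
print(performance(x_optimal + shift), performance(x_robust + shift))
```

Under the small shift the nominally optimal design loses most of its performance, while the "plateau" design barely moves: when you're optimal, the only way is down.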



www.design4resilience.com


www.ontonix.com