Tuesday, 9 July 2013

Traditional Ratings, Traditional Risk Management: Is it Over?

[Image: "… the effective management of risk is one of the core strengths that has made Lehman Brothers so successful" (dated 28 October 2008)]


Ever since the financial meltdown, rating agencies have been under fire. And so has risk management. Setting aside considerations of a political nature, it is evident that risk ratings and risk management must undergo a very substantial overhaul. Times have changed and so must the techniques and methods that help decision-makers do their job.

Let's take a closer look at a central concept in Risk Management, Risk Rating and VaR analysis - that of probability of failure. Imagine a computer chip manufacturer who produces, say, 1 million chips a month. Suppose also that on average 5 chips per month have a defect and don't make it through quality control. Evidently, the probability of manufacturing a defective chip is 5/1,000,000. However, even in highly controlled environments such as silicon chip factories it is impossible to state a priori which particular chip will have a defect. All you can do is spot the defects during quality control. In other words, you can verify a failure after the fact but you cannot predict it. If chip number X is going to be faulty, it will be faulty no matter what probability you dream up. For all practical purposes, when it comes to the fate of a single chip, that probability is totally irrelevant.
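To make this concrete, here is a minimal simulation of the chip example (the defect rate and production volume are the illustrative figures above; this is a sketch, not a model of any real factory):

```python
# A minimal sketch of the chip example: a 5-in-a-million defect rate
# predicts the monthly defect COUNT quite well, but says nothing about
# WHICH chip will be defective. All numbers are the illustrative ones above.
import random

P_DEFECT = 5 / 1_000_000   # per-chip defect probability
N_CHIPS = 1_000_000        # monthly production volume

for run in range(3):
    random.seed(run)
    defective = [i for i in range(N_CHIPS) if random.random() < P_DEFECT]
    print(f"run {run}: {len(defective)} defects at chip indices {defective}")

# The count hovers around 5 every run, but the indices change completely:
# the aggregate probability is useless for predicting the fate of one chip.
```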

Take now a slightly different case, that of a single public company. Investors want to know whether buying its stock is a good or a bad idea. They look at its rating and they decide. Ratings, as we know, measure the Probability of Default - the PoD. And here comes the problem:

If, based on a production run of millions of chips, it is impossible to say which particular chip will be faulty, how can anyone issue a PoD-type statement for ONE corporation?

What's more, chips are produced under highly controlled conditions, while corporations compete in highly turbulent, non-stationary and chaotic markets, driven by subjective and irrational decisions and sentiments. What, then, is the relevance of a PoD? Days before Lehman Brothers defaulted it carried a solid investment-grade rating, which corresponded to a very low PoD. Had there been millions of Lehman Brothers banks (like chips), that PoD might have been pretty close to the true Probability of Default (like the 5/1,000,000 in the example above). But there has been only ONE such bank!

A market is a dynamic, non-stationary conglomerate of thousands of corporations which compete - which means that they are NOT independent - and in which every company is unique. While silicon chips are statistically independent, corporations are not. The problem, therefore, is a very nasty one. Ultimately, this means that a PoD makes little sense, even though some people pay for it while others invest their money based on its value. There must be an alternative.
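To see how much dependence matters, here is a minimal sketch (all parameters are hypothetical) comparing yearly default counts for independent firms against firms coupled through a shared market factor. The marginal PoD is identical in both cases; the behaviour is not:

```python
# A minimal sketch of why dependence wrecks frequency-based PoDs.
# One-factor model (hypothetical parameters): a firm defaults when
# sqrt(rho)*M + sqrt(1-rho)*Z_i falls below a threshold, where M is a
# shared market shock and Z_i is firm-specific noise.
import random
import statistics

N_FIRMS, N_YEARS, THRESHOLD = 1000, 1000, -2.9   # ~0.2% marginal PoD per firm

def yearly_defaults(rho: float) -> int:
    m = random.gauss(0, 1)                       # shared market factor
    return sum(
        (rho ** 0.5) * m + ((1 - rho) ** 0.5) * random.gauss(0, 1) < THRESHOLD
        for _ in range(N_FIRMS)
    )

random.seed(0)
for rho in (0.0, 0.5):                           # independent vs coupled firms
    counts = [yearly_defaults(rho) for _ in range(N_YEARS)]
    print(f"rho={rho}: mean={statistics.mean(counts):.1f}, "
          f"max={max(counts)}, years with zero defaults={counts.count(0)}")
```

Same average default rate, radically different tail behaviour: with dependence, most years see no defaults at all while a few see dozens. A frequency estimated from such a history says very little about one unique, coupled firm in any given year.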

When faced with an uncertain and turbulent future, risk managers must change their approach pretty radically. The facts are as follows:
  • Complexity (of the business model) × Uncertainty (of the markets) = Fragility (of the outcome) - see the toy calculation after this list
  • Because the uncertainty of the global economy is on the rise (certain laws of physics suggest why this must be so), a simpler business is, whenever possible, better than a complex one
  • Simpler systems cope better with extreme and sudden events than highly complex ones - they can adapt faster and are less fragile
  • Our increasingly turbulent and uncertain economy will trigger more sudden and extreme events.
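Taken at face value, the first bullet is just a multiplication. The toy calculation below (all names and numbers are hypothetical proxies of our own choosing, not measured quantities) shows its arithmetic consequence: at the same level of market uncertainty, five times the complexity means five times the fragility.

```python
# Toy illustration of Complexity x Uncertainty = Fragility.
# The inputs are hypothetical proxies, not measured quantities:
# complexity as a dimensionless score of the business model,
# uncertainty as the volatility of the firm's markets.
def fragility(complexity: float, uncertainty: float) -> float:
    return complexity * uncertainty

print(fragility(complexity=12, uncertainty=0.8))  # lean firm    -> 9.6
print(fragility(complexity=60, uncertainty=0.8))  # complex firm -> 48.0
```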

This means that:

  • Risk management becomes complexity management - you keep an eye on your complexity (on a monthly or quarterly basis) and you try to keep it low (just like cholesterol or sugar levels in your blood).
  • Complexity management becomes part of the corporate strategy. Unlike risk management - some companies today don't even have a risk management department - complexity management (when operated using our technology) is simple, natural and easily extends across an entire organization.

Quantitative complexity management is a modern and natural alternative to traditional risk management - an innovative tool for the challenges of the 21st century. Basically:

1. Instead of building stochastic models (which will always be surrogates of reality) and running them - using, for example, Monte Carlo simulation - to produce PoDs with six decimals,

2. you simply keep a constant eye on your complexity - it is easy to measure using data which reflects the performance of your business - and watch out for any fluctuations. Such fluctuations are phenomenal pre-crisis alarms (a minimal sketch follows).
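As an illustration of point 2, here is a minimal monitoring sketch. The complexity proxy below (mean absolute correlation between business KPIs over a sliding window) is a deliberately simple stand-in, not the full complexity measure our technology computes; all data, window lengths and alarm thresholds are synthetic and hypothetical:

```python
# A minimal complexity-monitoring sketch on synthetic data.
# The proxy (mean absolute correlation between KPIs over a sliding window)
# is a simple stand-in; the point is the monitoring loop and the alarm rule.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic monthly KPIs (revenue, margin, inventory, ...): 60 months x 6 series.
kpis = rng.normal(size=(60, 6)).cumsum(axis=0)
kpis[45:] += rng.normal(scale=4.0, size=(15, 6))  # regime shift near month 45

def complexity_proxy(window: np.ndarray) -> float:
    """Mean absolute off-diagonal correlation: a crude coupling measure."""
    c = np.corrcoef(window, rowvar=False)
    off_diagonal = c[~np.eye(c.shape[0], dtype=bool)]
    return float(np.abs(off_diagonal).mean())

WINDOW, ALARM_JUMP = 12, 0.10  # 12-month window; hypothetical alarm threshold
scores = [complexity_proxy(kpis[t - WINDOW:t])
          for t in range(WINDOW, len(kpis) + 1)]

for month, (prev, cur) in enumerate(zip(scores, scores[1:]), start=WINDOW + 1):
    flag = "  <-- fluctuation alarm" if abs(cur - prev) > ALARM_JUMP else ""
    print(f"month {month}: complexity {cur:.2f}{flag}")
```

The particular proxy is not the point: any consistent complexity measure, tracked monthly or quarterly, turns "watch out for fluctuations" into a concrete alarm rule.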

So, the new approach to uncertainty management is NOT measuring Probabilities of Default (or failure) but acting quickly on pre-crisis alarms and keeping the complexity of your business low (just as you would with your cholesterol). In order to better face extreme and unexpected events you need resilience, not A's or B's. Remember, we're no longer on a lake with a breeze but in the middle of a stormy ocean.



Read more about it here.

Get our ebook "A New Theory of Risk and Rating" on Amazon.



www.ontonix.com


