Friday, 7 March 2014

Innovation in Finance is Possible




The rating, a parameter that reflects the state of health of a company, occupies a central position in the world economy. Users of ratings include private and institutional investors, brokers, traders, analysts and, lately, even politicians. Ratings, which are used essentially to decide which companies to invest in, are free and are published in newspapers and on the web. The process leading to the rating of a company is long and complex, and it has long been the focus of discussion and controversy since it is often paid for by the very companies being rated. Although the conflict of interest is obvious, the largest international agencies - Moody's, Standard & Poor's and Fitch - continue to hold a rating monopoly. Whoever controls the rating agencies - largely companies that manage huge investment funds - has immense power. Everybody knows that. Everybody continues to use ratings.

We have always maintained that the rating process of a firm should be more transparent, objective and, above all, accessible even to the smallest of businesses. With this goal in mind we have launched the world's first 'self-rating' system - Rate-A-Business - which allows anyone to upload data from the financial statements of a company and to obtain, in a few seconds, a measure of its state of health. The system works for listed companies as well as for those not present on stock exchanges. In essence, the tool moves the rating process from the agencies to the Internet, making it more "democratic", fast and easily accessible to investors and even to smaller companies. In other words, the rating is transformed from a luxury into a commodity. More than that, it becomes a useful tool for managing a business. This is the philosophy that inspires the Rate-A-Business platform.

How does Rate-A-Business work?

To obtain a rating with Rate-A-Business, whether for a listed company or not, one must have quarterly data from the company's income statement, cash flow statement, balance sheet or ratios. It is advisable to use data for the last 12 quarters (three years). A small example is shown below (the data is fictitious).


 
Once data has been uploaded the system processes the numbers, establishes the inter-dependencies between the various entries and measures the overall amount of “chaos” (uncertainty) contained within the data. Data entries with highly random or chaotic evolution are a reflection of a business that is not predictable and therefore difficult to manage, as in the example below.
 

Clearly, the more entries behave chaotically, the more vulnerable the company in question is. Additionally, if these entries are related to each other, the company is heavily exposed, given that a problem with any of them can spread rapidly throughout the system. The degree of chaos that a company is able to withstand is called "critical complexity". Near this threshold the trends of the various data entries are so uncertain and unpredictable that the company is virtually uncontrollable. Clearly, if, for example, the balance sheet entries have evolved chaotically, it is easy to imagine similar behaviour in terms of sales, production and, eventually, across the entire enterprise. So, the farther a company operates from its "critical complexity" threshold, the healthier it is. This, in summary, is the spirit of the new rating system available on the Rate-A-Business platform: measure the distance that separates a company from a state of "total chaos".
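For readers who want to experiment with the idea, the sketch below shows one possible way to turn a table of quarterly entries into a single "distance from chaos" figure. It is only a minimal illustration in Python: the dependency and chaos measures used here (rank correlations and the variability of quarter-on-quarter changes) are assumptions made for the example, not the proprietary algorithm behind Rate-A-Business.

```python
# Illustrative sketch only: the actual QCM algorithm is proprietary.
import numpy as np
import pandas as pd

def entry_roughness(series: pd.Series) -> float:
    """Proxy for 'chaos' in one entry: variability of quarter-on-quarter changes."""
    changes = series.pct_change().dropna()
    return float(changes.std()) if len(changes) else 0.0

def statement_complexity(df: pd.DataFrame) -> dict:
    """df: one column per statement entry, one row per quarter (e.g. 12 quarters)."""
    corr = df.corr(method="spearman").abs().to_numpy()    # inter-dependencies between entries
    np.fill_diagonal(corr, 0.0)
    roughness = df.apply(entry_roughness)                 # per-entry uncertainty
    complexity = float(corr.sum() / 2 * (1 + roughness.mean()))
    n = len(df.columns)                                   # assumed "total chaos" bound:
    critical = n * (n - 1) / 2 * (1 + roughness.max())    # full coupling, maximum roughness
    return {"complexity": round(complexity, 2),
            "critical_complexity": round(critical, 2),
            "distance_from_chaos": round(1 - complexity / critical, 2)}

# Fictitious quarterly data for the last 12 quarters (three years)
rng = np.random.default_rng(0)
quarters = pd.period_range("2011Q1", periods=12, freq="Q")
data = pd.DataFrame({"revenue":   rng.normal(100, 10, 12),
                     "ebitda":    rng.normal(20, 5, 12),
                     "net_debt":  rng.normal(50, 8, 12),
                     "cash_flow": rng.normal(15, 6, 12)}, index=quarters)
print(statement_complexity(data))
```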

How does Assetdyne work?
In the case of listed companies, the Assetdyne platform is connected in real time to different stock markets and collects the closing values of different securities. Using a technique similar to auto-correlation, the system assembles a table of those values going back a number of days. The system then measures the distance of these data from a state of "total chaos" (i.e. critical complexity). The more complex the data in question, the more fragile and unpredictable the evolution of the price of those securities. Operating the system is very easy - one enters a ticker symbol and in a matter of seconds the system provides a measure of the complexity and resilience of its daily evolution.

As mentioned, high complexity implies chaos. As complexity approaches "critical complexity", the dynamics of the price per share becomes uncertain and therefore less resilient. To give an example, think of cholesterol, whose value should be kept at a certain distance from a maximum established by our physician. Near this threshold, our health is at risk. The situation with stocks and their dynamics is similar. The more complex (chaotic) the trend, the greater the possibility of surprises and, therefore, the level of exposure. Complexity, therefore, provides a new measure of volatility (variability) which is not based on the conventional concepts of normality or linear correlation, both of which are questionable in a highly turbulent regime.
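The sketch below mimics the auto-correlation-like table described above for a single ticker, using synthetic closing prices in place of a live market feed. Whether the table is built from raw closes or from their daily changes is not specified in the description, so the sketch uses daily log-changes; the way complexity is scored and mapped to a resilience percentage is likewise an assumption for illustration only, not Assetdyne's actual engine.

```python
import numpy as np

def lag_table(values: np.ndarray, lags: int = 10) -> np.ndarray:
    """Rows = days, columns = the series lagged by 0..lags-1 days."""
    n = len(values) - lags + 1
    return np.column_stack([values[i:i + n] for i in range(lags)])

def stock_rating(closes: np.ndarray, lags: int = 10) -> dict:
    returns = np.diff(np.log(closes))          # daily log-changes of the closing price
    table = lag_table(returns, lags)           # auto-correlation-like table
    corr = np.abs(np.corrcoef(table, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    complexity = corr.sum() / 2                # total coupling between lagged copies
    critical = lags * (lags - 1) / 2           # assumed "total chaos" bound (full coupling)
    resilience = 100 * (1 - complexity / critical)
    return {"stock_complexity": round(float(complexity), 2),
            "stock_resilience_pct": round(float(resilience), 2)}

# Synthetic closing prices standing in for a live market feed (about six months of data)
rng = np.random.default_rng(1)
closes = 30 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 130)))
print(stock_rating(closes))
```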


Resilience, on the other hand, which measures the ability to absorb shocks and extreme events, ranges from 0% to 100%. The closer you get to 100%, the more predictable and stable the situation is. Low values of resilience point to situations which are chaotic and hence difficult to predict. Two examples of the complexity and resilience rating of systemic European banks are illustrated below:


Intesa Sanpaolo (ISPM.MI)
  previous close: 1.83
  stock complexity: 12.48
  stock resilience: 85.12%

Credit Suisse Group (CS)
  previous close: 31.05
  stock complexity: 28.91
  stock resilience: 73.78%

Portfolio ratings
Assetdyne’s rating platform is also applicable to portfolios of securities. Given that the computation of portfolio complexity is based on the closing prices at the end of the day, its value changes on a daily basis. It is, ultimately, a high-frequency portfolio rating system. It should be remembered that traditional ratings are generally issued once a year, when companies publish their consolidated financial statements. Given the speed of the economy, this may be inadequate.
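As an illustration of how such a high-frequency rating could be produced, the sketch below recomputes a simple portfolio complexity proxy (total coupling between the constituents' recent daily returns) after every close. The proxy, the window length and the ticker names are all assumptions for the example; the real platform uses Assetdyne's own complexity measure.

```python
import numpy as np
import pandas as pd

def portfolio_complexity(returns_window: pd.DataFrame) -> float:
    """Sum of absolute pairwise correlations over a trailing window of daily returns."""
    corr = returns_window.corr().abs().to_numpy()
    np.fill_diagonal(corr, 0.0)
    return float(corr.sum() / 2)

def daily_complexity_series(closes: pd.DataFrame, window: int = 30) -> pd.Series:
    """One complexity value per trading day, recomputed after every close."""
    returns = closes.pct_change().dropna()
    return pd.Series(
        {day: portfolio_complexity(returns.loc[:day].tail(window))
         for day in returns.index[window:]},
        name="portfolio_complexity")

# Synthetic closes for a 4-stock portfolio; a live system would use end-of-day market data
rng = np.random.default_rng(1)
days = pd.bdate_range("2013-10-01", periods=90)
closes = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0, 0.01, (90, 4)), axis=0)),
    index=days, columns=["AAA", "BBB", "CCC", "DDD"])
print(daily_complexity_series(closes).tail())
```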

One of the objectives of Assetdyne’s rating system is to indicate which are the securities or products which make a portfolio highly complex and which should be avoided by less experienced investors. What is surprising is that we all know that highly complex products are risky and yet no one has ever measured their actual complexity. Assetdyne does just that - for the first time we measure the complexity of stocks, portfolios of shares or other financial products.

As a final comment, one might conclude that stock markets constitute a huge social network in which millions of people participate in a global "game" called trading. One of the results of this game is the real-time share price of all listed companies, which immediately affects the world's economy. It is important, therefore, that every participant be aware of those financial products which hide high complexity, the most formidable source of fragility and risk.

*Assetdyne has developed a new rating system for listed companies, seeking to make ratings as objective as possible by addressing the problem of potentially unreliable financial statements. This system is based on the concept of 'crowd-rating' and has its roots in the stock market.
The value of the shares of a company is the result of a complex interaction of millions of traders, analysts, investors, trading robots, etc. Ultimately, it is a reflection of the reputation and the perceived value of a company and is the result of a collective and 'democratic' process. Clearly, the value of a security is also driven by market trends, industry analysis, rumors, insider trading and other illegal practices and, of course, by the rating agencies. However, undeniably, it is the millions of investors who ultimately drive the price and the dynamics of the securities in accordance with the fundamental principles of supply and demand.

Assetdyne uses information about the daily value and the dynamics of the stock price of a company to actually calculate its rating. The rating that is calculated in this manner does not reflect the probability of default (i.e. the probability of bankruptcy) of a particular company - this is what a traditional rating produces, an AAA, BBB or CCC for instance - it reflects the complexity (degree of chaos) of the dynamics of its stock. This is very important for a number of reasons. Stocks with very complex dynamics are far more unpredictable than those with simpler dynamics. Highly complex and volatile dynamics are able to surprise investors, very often at the worst moment in time. It so happens that our economy and stock markets are not only very turbulent and chaotic, but also extremely complex. A rating based on the complexity of the stocks of listed companies reflects, therefore, the hallmark of our times - complexity.

Sunday, 23 February 2014

WEF 2013 Report on Global Risks: A Different View


The World Economic Forum 2013 report (available here) discusses a wide variety of global risks, their likelihood and potential impact. Risk, however, is a problematic concept. It is not related to any physical quantity and does not follow any laws of physics. It is a highly subjective entity based on another, even more slippery idea: probability. The most popular definition of risk is this:

Risk = Probability of an event X Consequences

The problem with this definition is twofold:

  • Probability is evaluated either based on ensembles of past events or simulations.
  • The consequences of an event are extremely difficult to estimate.
But even if we have a "perfect" value of probability, its meaning is still difficult to grasp. Imagine two events, A and B. Imagine that the probability of A occurring is 80% and that of B is 70%. What does that mean? What does it really mean? Does it mean that A will occur before B? Does it mean that the consequences of A will be more severe than those of B? Absolutely not. In actual fact, nobody knows what it means. A probability doesn't give any clue as to when, why and with what consequences an event will happen. Bertrand Russell said in 1927:

"Probability is amongst the most important science, not least because no one understands it"

As to the consequences of adverse events, the situation is similar. Suppose there will be flooding in a certain country next autumn due to heavy rain. Suppose we know it will happen, so the probability is 100%. What will the consequences be? How many homes will be lost? For how long will the territory be without electricity? How many families will need to be relocated? Ten thousand, fifty thousand, half a million? It depends. It depends on so many factors that any guess is as good as any other. So, what is the risk? A billion, two billion? How do you make contingency plans for risks which have unknown consequences and which occur with a probability that is, fundamentally, a concept nobody understands? How well these contingency plans work is obvious. Every time we witness, for example, natural disasters or humanitarian crises, the keywords are impotence, inefficiency, slow response, angered populations, etc. So much for modern Risk Management.

The WEF produces interesting maps of potential risks, such as this one:


"Probability is amongst the most important science, not least because no one understands it".

Read more: http://www.physicsforums.com
As the WEF report says, "The Global Risks Report 2013 analyses 50 global risks in terms of impact, likelihood and interconnections, based on a survey of over 1000 experts from industry, government and academia". In other words, the maps are based on the subjective views of individuals who are experts in their respective fields and who use their own established models of analysis, simulation, etc. Clearly, subjective opinions lead to subjective results and conclusions.

A different approach is to adopt an objective model-free Quantitative Complexity Analysis, using real, objective data from sources such as CIA World Fact Book or the World Bank. Processing such data provides something like this:


The above is a Complexity Map, which relates the various parameters (risks/events) to each other in a format that is easy to grasp and analyze. In fact, the map is also interactive and may be navigated dynamically. Understanding the various relationships and dependencies between parameters is key to understanding how the underlying system really works. This is what structure is all about. Knowledge. No structure, no knowledge, just sparse conclusions.
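To make the idea concrete, the sketch below assembles a rudimentary "complexity map" from raw indicator data: it links two parameters whenever their rank correlation exceeds a threshold. The dependency criterion, the threshold and the indicator names are assumptions chosen for the example; they are not the method used by OntoNet.

```python
import numpy as np
import pandas as pd

def complexity_map(indicators: pd.DataFrame, threshold: float = 0.5) -> list:
    """Return the significant links (edges) between indicators as (a, b, strength)."""
    corr = indicators.corr(method="spearman").abs()
    cols = list(indicators.columns)
    edges = []
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            strength = float(corr.loc[a, b])
            if strength >= threshold:
                edges.append((a, b, round(strength, 2)))
    return edges

# Fictitious yearly observations of a few "risk" indicators (20 years)
rng = np.random.default_rng(2)
base = rng.normal(size=20)
data = pd.DataFrame({
    "fiscal_imbalance": base + 0.3 * rng.normal(size=20),
    "unemployment":     base + 0.4 * rng.normal(size=20),   # coupled to the same driver
    "energy_price":     rng.normal(size=20),
    "water_stress":     rng.normal(size=20),
})
for a, b, s in complexity_map(data):
    print(f"{a} <-> {b}  (strength {s})")
```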

However, the most important result is the actual measure of resilience/robustness of the system (as well as its complexity). In the above case we're talking of just over 50%, a mere two stars. The curious thing is that this measure is very much in line with the resilience of the economy, which today is between 50 and 60% - in other words, very fragile.

An equally important result of a Quantitative Complexity Analysis is the ranking of each of the parameters/risks in terms of their footprint (i.e. objective weight) on the system as a whole. In the case in question it looks like this:


In other words, the ranking of parameters is not based on subjective opinions and surveys, nor on statistical or mathematical models; it is based on real, raw data.
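A rough illustration of such a footprint ranking is sketched below: each parameter is scored by the total strength of its dependencies on the other parameters, using nothing but the raw data. The scoring rule is an assumption chosen for simplicity, not Ontonix's actual algorithm.

```python
import numpy as np
import pandas as pd

def footprint_ranking(indicators: pd.DataFrame) -> pd.Series:
    """Score each parameter by the total strength of its links to all the others."""
    corr = indicators.corr(method="spearman").abs()
    footprint = corr.sum(axis=1) - 1.0            # drop the self-correlation of 1
    return footprint.sort_values(ascending=False)

# Same kind of raw indicator table as above (values are invented)
rng = np.random.default_rng(3)
base = rng.normal(size=20)
data = pd.DataFrame({
    "fiscal_imbalance": base + 0.3 * rng.normal(size=20),
    "unemployment":     base + 0.4 * rng.normal(size=20),
    "energy_price":     rng.normal(size=20),
    "water_stress":     rng.normal(size=20),
})
print(footprint_ranking(data))    # parameters ranked by their weight on the system
```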

Our objective is not to analyze Global Risks in detail. What we wish to point out is that when things get really complex, a thousand experts can deliver a thousand opinions, all of which may seem credible and fit the real picture.

The words "resilience", "complexity", "systemic risks", "systems thinking" are increasingly popular. There are numerous studies and publications on these subjects. This is good. See, for example, the WEF's page on national resilience. However, what these studies have in common is lack of a quantitative perspective. Complex? How complex? Resilient? How resilient? 10%, 30%. If we don't incorporate a quantitative dimension into these analyses, which are unquestionably valuable, they will inevitably remain in the sphere of subjectivity.

Let us recall the Principle of Fragility, coined by Ontonix in 2005:

Complexity X Uncertainty = Fragility

While we have often applied this principle to businesses and business processes, it can also be applied to the analysis of Global Risks. Clearly, the problem at hand is highly complex. We also agree that every expert has his own opinion. As we have said, a highly complex scenario may be interpreted in a plethora of ways: depending on which expert we talk to, the answer will be different, so the choice of experts is crucial. Combining, therefore, the complexity of the underlying problem with the uncertainty originating from a multitude of different and subjective opinions, what we ultimately obtain is a fragile result. Handle with care.



www.ontonix.com




"Probability is amongst the most important science, not least because no one understands it"

Read more: http://www.physicsforums.com

Monday, 10 February 2014

Solving Extreme Problems.


Extreme problems are very large-scale, multi-disciplinary problems, involving thousands of variables, which cannot be solved using conventional technology. In such situations it is impossible to determine the cause of the problem, not only because of its sheer size but, most importantly, because it is frequently perceived through conventional eyes and distorted by narrow, linear thinking. It is not a matter of computing power or sophisticated math modelling - some things just cannot be modelled.

Examples of extreme problems:

  • Unexpected collapses of critical systems or infrastructures (markets, transportation systems, IT networks, large corporations, etc.)
  • Prolonged states of crisis, inefficiency or frequent system failures (process plants, transportation systems, economies, telephone networks, etc.)
  • Sudden catastrophic collapse (spacecraft, aircraft, software systems, ecosystems, etc.)


Clearly, extreme problems cause extreme consequences and losses.

When it comes to man-made systems, bad design is often the cause. The inability of conventional science to embrace a systems perspective on the one hand, and the neglect of complexity on the other, form an effective barrier to solving extreme problems.

Because in the majority of cases it is excessive and uncontrolled complexity that leads to severe consequences and extreme problems, Ontonix attacks them with its patented model-free Quantitative Complexity Management technology. In collaboration with supercomputer centers, Ontonix provides radically innovative means of formulating and solving extreme problems. We actually measure complexity, identify its sources and perform multi-level Complexity Profiling of systems and sub-systems until a solution is found. In huge and complex systems things often go wrong not because some parameters have the wrong value but because of the countless interactions that may develop. The higher the complexity, the more such interactions may emerge. In other words, high complexity gives a system a marked capacity for producing surprises.
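The sketch below illustrates the multi-level profiling idea on invented data: each subsystem's channels are scored with a simple coupling-based proxy, the subsystems are ranked, and the analyst drills into the worst one. The proxy and the subsystem names are hypothetical; the real work is done by the QCM engine.

```python
import numpy as np
import pandas as pd

def block_complexity(df: pd.DataFrame) -> float:
    """Stand-in complexity score: total coupling between a subsystem's data channels."""
    corr = df.corr().abs().to_numpy()
    np.fill_diagonal(corr, 0.0)
    return float(corr.sum() / 2)

def profile(system_data: dict) -> pd.Series:
    """system_data: subsystem name -> DataFrame of that subsystem's channels."""
    scores = {name: block_complexity(df) for name, df in system_data.items()}
    return pd.Series(scores).sort_values(ascending=False)

# Invented multi-channel measurements for three hypothetical subsystems
rng = np.random.default_rng(3)
system = {
    "hydraulics": pd.DataFrame(rng.normal(size=(200, 5))),
    "avionics":   pd.DataFrame(rng.normal(size=(200, 8))),
    "propulsion": pd.DataFrame(rng.normal(size=(200, 6))),
}
ranking = profile(system)
print(ranking)                               # subsystems ranked by complexity
print("drill down into:", ranking.index[0])  # next level of the profiling
```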

Extreme problems not only pose new challenges. They also stimulate innovative business models. In fact, when Ontonix takes on an extreme problem, the following scheme is followed:

  • The client and Ontonix analyze the problem together.
  • The consequences and losses incurred by the client are quantified.
  • As much data about the problem is gathered as possible.
  • Ontonix employs its best efforts to solve the problem.
  • In case of loss reduction/elimination, a percentage of the client's gains is paid to Ontonix.




For more information e-mail us.


www.ontonix.com


 

Friday, 24 January 2014

Health of the EU Economy - Stagnation or Recovery?


Our quarterly analysis of Eurostat's macroeconomic data of the Eurozone for Q3 2013 has now been completed and published. The interactive Business Structure Maps of each member state may be navigated here.

Italy, together with the UK, Sweden and France have the highest resilience (approximately 80%), while Belgium, Ireland, and Spain score a low 60%.

It is interesting to note that, when it comes to the entire region, there are clear indications of a slow but consistent recovery. The evolution of complexity (its increase) in the plot below shows that clearly.




However, complexity remains dangerously close to critical complexity, denoting alarmingly high fragility. This means that the system is very exposed and incapable of dealing with intense shocks, financial contagion or extreme events. Nevertheless, it is also evident that, based on the available data, we hit bottom around Q4 2011, i.e. approximately two years ago. As of today, the situation in terms of complexity is comparable to that of Q3 2010. In essence, the overall situation of the Eurozone has not evolved over the last three years. This is in line with an evident lack of reforms and lack of leadership at both EU and country level.

The evolution of resilience follows a similar trend, although it remains alarmingly low.




Finally, it is interesting to notice how, in terms of recovery, the 15 core Eurozone states are outpacing the 13 new member states.




While the crisis peaked in Western Europe in Q4 2007, it climaxed approximately one year later in Central and Eastern Europe. In terms of recovery things are different. While the EU15 group touched the bottom in Q1 2011, the EU13 did so in Q3 2012, i.e. 18 months later. What is also clear is that the complexity gradient (higher complexity means a more lively economy) in the case of the EU13 group is substantially lower than that of the EU15. This means that, based on the currently available data, recovery in Central and Eastern Europe will be significantly slower.



www.ontonix.com                                       www.rate-a-business.com






Thursday, 23 January 2014

Creating Fragile Monsters and When Failure Is An Option


Imagine the World as one big corporation, offering all sorts of products and services. If one observes the World and recognises that most (if not all) things tend toward states of greater chaos and fragmentation, it may be difficult to reconcile the two images. For example, the number of countries is increasing. Look at what happened to Yugoslavia, Czechoslovakia or the Soviet Union. Belgium and Spain will probably be next. Even the European Union itself is being questioned by growing numbers of its disillusioned citizens. The bottom line is that the number of players is increasing, and their demands and conflicting interests are going to be very difficult to deal with. There are centrifugal forces everywhere. Well, almost everywhere. In fact, as countries and societies tend to break up, there is an equally clear trend in the opposite direction in the corporate world. Consolidations are creating super-huge conglomerates of corporations and super-banks.

Let's look at consolidation in the US over the last two decades. First the media industry.






And the banking industry.





Such super-huge companies were named "Too Big To Fail" by Stewart McKinney when he served on the Banking, Finance and Urban Affairs Committee in 1984. He was wrong. Super-huge companies and banks have failed, and without early warning. Size, in this case, doesn't matter. The problem, in fact, is not so much size as complexity. Excessive complexity, to be precise. The new paradigm is "Too Complex To Survive". The enemy is excessive complexity. And why?


  • Highly complex systems are intrinsically hazardous systems.

  • Highly complex systems run in degraded mode.

  • Catastrophe is always just around the corner.


If we allow this mega-monster corporation to emerge, we need to be well aware of the three keywords appearing above: hazardous, degraded, catastrophe. Is this the world we want?


Today, complexity can be measured and managed. It is a fundamental and strategic Key Performance Indicator of any modern business. Ontonix is the first and only company to measure and manage complexity. Rationally. Serious science starts when you begin to measure.



www.ontonix.com







Tuesday, 21 January 2014

Just How Good Are Minimum-Complexity Portfolios?


In order to showcase the performance of minimum-complexity portfolios versus highly complex ones, two such portfolios were built with stocks from the Dow Jones Index. A nasty period, which included the Internet Bubble, was chosen: 2000-2004. We compared the performance of both portfolios with that of the index itself over four distinct periods. Here are the results.




In terms of numbers we have the following results:




While the Dow has reported a total loss of 4.2% during the entire 4-year period, the high-complexity portfolio produced gains of 6.3% and the low-complexity one an impressive 24.1%. In turbulence, simpler is better.

Minimum-complexity portfolios may be obtained at www.assetdynex.com







Sunday, 12 January 2014

Just How Healthy is the US Economy?


We know that 2013 has been a great year for stock markets, US markets in particular. People are openly talking of "stock market recovery". We also know that the FED has been pumping paper into the system. But what has this really done to the economy, apart from increasing the values of market  indices?

Since Nature offers no free lunch (the economy probably doesn't either), printing money must have its consequences. If you make markets rally on steroids, you inevitably end up paying for it somewhere. We claim that such policies create fragility. Hidden fragility. Well, hidden to conventional pre-crisis analytics technology and to those who are concerned with numbers and numbers alone.

Assetdyne analyzes the major US markets every two weeks and publishes the results here. The focus of the analyses is resilience - the capacity to resist impacts, shocks, contagion, extreme events and, ultimately, sustained turbulence. The results are far from exciting, revealing mediocre levels of resilience. Here they are:


NASDAQ 100 - NDX

S&P 100 - OEX

Dow Jones Composite Average - DJA

Dow Jones Industrial Average - DJI

PHLX Semiconductor - SOX


A two to three-star resilience rating. Nothing to celebrate. The S&P 100, in particular, has an alarming two-star (64%) rating. We leave the comments to the readers.

Navigate Interactive Complexity Maps of the indices here. Just click on an index and move the mouse. More soon.



www.assetdyne.com






Monday, 6 January 2014

Complexity Science Helps in Early-detection of Fibrillations and Tachyarrhythmia


The main goal of ONTONET-CRT™ is to reduce detection times as well as unnecessary ICD shocks. ONTONET-CRT™ adopts new model-free technology which does not rely on traditional math models. Instead of conventional analysis, ONTONET-CRT™ processes the EGM and computes its complexity; sudden fluctuations of this complexity generally anticipate events such as fibrillations or tachycardias.

ONTONET-CRT™ processing of EGM data indicates that fluctuations of complexity generally precede tachycardias or fibrillations. This means that it is possible to gain precious time in properly detecting and classifying the event and even preventing it altogether.

Analysis of EGMs shows that in over 80% of cases ONTONET-CRT™ is able to anticipate the onset of tachycardias and fibrillations by a significant number of heart beats. This opens new avenues in terms of dealing with these events even before they commence.
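A minimal sketch of the early-warning logic is shown below: a per-beat complexity series is monitored and a warning is raised when it jumps well above its recent baseline. The baseline window, the threshold and the synthetic series are illustrative assumptions, not the detection criteria used by ONTONET-CRT™.

```python
import numpy as np

def early_warning(complexity: np.ndarray, window: int = 20, k: float = 3.0) -> int:
    """Return the first index where complexity exceeds baseline mean + k*std, else -1."""
    for i in range(window, len(complexity)):
        baseline = complexity[i - window:i]
        if complexity[i] > baseline.mean() + k * baseline.std():
            return i
    return -1

# Synthetic complexity-per-beat series with a sudden rise before an "event"
rng = np.random.default_rng(4)
series = np.concatenate([rng.normal(5.0, 0.2, 300),     # stable rhythm
                         np.linspace(5.0, 9.0, 40)])    # complexity ramps up
print("warning raised at beat", early_warning(series))
```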

Below is an example of how a sudden increase in complexity precedes a Ventricular Tachycardia.


Read more here.









Sunday, 5 January 2014

Casino Capitalism: Legitimizing the Derivatives Soup.




In 2000, the Commodity Futures Modernization Act (CFMA) was passed, legitimizing swap agreements and other hybrid instruments - a massive move towards deregulation that ended regulatory oversight of derivatives and leveraging and turned Wall Street more than ever into a casino. At the same time the first Internet-based commodities transaction system was created to let companies trade energy and other commodity futures unregulated, effectively licensing pillage and fraud. (Enron took full advantage of this until it all imploded.) Further, it launched a menu of options, binary options, forwards, swaps, warrants, leaps, baskets, swaptions, and unregulated credit derivatives like the now infamous credit default swaps, facilitating out-of-control speculation.

This deregulatory madness caused unprecedented fraud, insider trading, misrepresentation, Ponzi schemes, false accounting, obscenely high salaries and bonuses, bilking investors, customers and homeowners, as well as embezzling and other forms of theft, including loans designed to fail, clear conflicts of interest, lax enforcement of remaining regulatory measures, market manipulation and fraudulent financial products and massive public deception.

This slicing and dicing of risk-reducing derivative securities is still going on, creating a time bomb waiting to explode with catastrophic consequences.  According to the latest BIS statistics on OTC derivatives markets there was a whopping $693 trillion outstanding at the end of June 2013. That is more than 10 times the GDP of the entire world and equivalent to $100,000 for each of the 7 billion inhabitants of our planet.

The complexity and high potential risk associated with derivatives require innovative risk assessment procedures and strong technical knowledge. There are tools to measure and monitor the complexity of these financial products. One can be found here.

With this innovative tool you can classify, rank and rate the complexity and resilience of derivatives, and establish maximum allowable levels of complexity and minimum allowable levels of resilience. Products with low resilience contribute to making the system (economy) more fragile. Once the most complex (dangerous) derivatives have been identified, they should be withdrawn progressively from circulation.


Submitted by Hans van Hoek



www.assetdyne.com
 

 
 
 

Saturday, 4 January 2014

NASDAQ 100 Resilience Rating Analysis - January 2014, (1/2)

The first of two fortnightly NASDAQ 100 Resilience Rating Analysis reports in January 2014 is now available for downloading here.

The second January 2014 report shall be available after January 15th.

The NASDAQ 100 Resilience Rating Analysis provides a ranking of the 100 stocks composing the index based on stock complexity and resilience. The report is offered free of charge.

Reports can be generated on a daily basis or in real-time. For more information contact us.








Which is the Most Reliable and Trustworthy Rating Agency?


One can never really trust a third party 100%. A lot has been written about the unreliability, lack of transparency and conflicts of interest of the Big Three Credit Rating Agencies. And yet, the entire economy uses and depends on ratings. It's a bit like those who smoke knowing that smoking causes cancer.

Even though the rating agencies have been called the "key enablers of the financial meltdown", ratings are necessary. But they must be reliable and trustworthy. Because nobody is really 100% transparent and 100% independent, the term "reliable rating agency" sounds like an oxymoron. A radically new approach is needed.

The only person you trust 100% is yourself. So, if you want a 100% reliable rating, you must do it yourself. This is why we have built the "Rate-A-Business" platform, so that you can rate any business yourself. This is how it works:

1. If you want to rate a publicly listed company, you download its financials from its website and you process them at www.rate-a-business.com

2. If you want to rate your own business, you already have the financials. You use data you trust. You trust the result.


In the first case we still have the problem of trusting the financials that public companies post on their Investor Relations pages. But, at least, the mechanism used by Rate-A-Business for rating those numbers remains the same. For everyone. All the time.

Ratings must be democratised. This means they must be in the hands of those who use them. They must become a commodity. Not a means of deception.



www.ontonix.com




Thursday, 2 January 2014

Manipulation



Wall Street claims markets move randomly, reflecting the collective wisdom of investors. The truth is quite the opposite. The government's visible hand and insiders control them, manipulating them up or down for profit - all of them, including stocks, bonds, commodities and currencies. The public is none the wiser.

It's brazen financial fraud, like the pump and dump practice, defined as "artificially inflating the price of a stock or other security through promotion, in order to sell at the inflated price, then profit more on the downside by short-selling". This practice is illegal under securities law, yet it is particularly common, and in today's volatile markets it occurs daily to one degree or another. My career on Wall Street started out like this, in the proverbial "boiler room."

A company's stock price and true worth can be highly divergent. In other words, healthy or sick firms may be way over- or undervalued depending on market and economic conditions and on how manipulative traders wish to price them, short or longer term. During a trading frenzy a stock price increases, and so the capitalization of a company is suddenly greater than it was just a few minutes or hours before? What nonsense that is!

The idea that equity prices reflect true value or that markets move randomly (up or down) is nonsense. They never have and, more than ever, don't now. It is therefore crucial to step around the usual analysis hype, look at a company and measure its risk and complexity as a primary analysis tool. There is no manipulation here: the data gives the company, stock or portfolio a face, and it is not a poker face. The system developed by Assetdyne allows users to compute the Resilience Rating and Complexity of single stocks, stock portfolios, derivatives and other financial products.

Hans van Hoek
Partner at Assetdyne



www.assetdyne.com



Sunday, 29 December 2013

NASDAQ 100 Resilience Rating Analysis


During 2014 Assetdyne shall be performing a Resilience Rating analysis of all of the NASDAQ 100 stocks. The report shall be offered free of charge and will be produced twice a  month.

In addition, Portfolio Complexity Maps shall be produced and made available for interactive navigation.

The first report is available here.

For more information on Assetdyne visit website.



www.assetdyne.com



Saturday, 28 December 2013

Complexity and Battle Management


Modern battle scenarios involve a huge amount of data and information. According to Wikipedia:

"Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war pioneered by the United States Department of Defense in the 1990s.
It seeks to translate an information advantage, enabled in part by information technology, into a competitive advantage through the robust networking of well informed geographically dispersed forces. This networking—combined with changes in technology, organization, processes, and people—may allow new forms of organizational behavior.
Specifically, the theory contains the following four tenets in its hypotheses:
  • A robustly networked force improves information sharing;
  • Information sharing enhances the quality of information and shared situational awareness;
  • Shared situational awareness enables collaboration and self-synchronization, and enhances sustainability and speed of command; and
  • These, in turn, dramatically increase mission effectiveness."
Now that complexity can be measured in real-time using the QCM engine OntoNet, we can take things to the next level: Complexity-Centric Warfare. The first step is to map the entire information flow obtained from a multitude of sensors onto a Complexity Map (before the enemy can trigger an EMP!). The map evidently changes in time as the battle evolves. The concept is illustrated below.



Clearly, sensors gather data about all the forces involved in a particular scenario. The combined map, showing two opposing forces, is illustrated below (clearly, an extremely simple example is shown). Experiments in Air Traffic Control conducted by Ontonix show that it is possible to track hundreds of airborne objects using radar in real-time. A Massively Parallel Processing version of OntoNet (currently under development) will allow thousands of objects to be processed.


Once the maps are established, two issues of tactical character become evident:

  • Concentrate firepower on enemy hubs. 
  • Protect your own hubs.
Hubs are easily identified once a Complexity Map is available. A more sophisticated target-ranking approach is based on battle Complexity Profiling, which ranks the various actors based on their footprint on the entire scenario. Clearly, just as a Complexity Map changes with time, so will the Complexity Profile.

And now to strategic issues. How does one manage a battle using complexity? Simple. Fast scenario simulation technology provides numerous options to choose from. And how do you choose between, say, two very similar options? You take the one with lower complexity. In other words, you try to steer the conflict in a way that reduces its complexity. The concept is illustrated below.



A less complex battle scenario is easier to manage. It is easier to comprehend. It allows for faster decision-making. It is easier to be efficient in a less complex situation than in a highly complex one. Finally, highly complex situations have the nasty habit of suddenly delivering surprising behavior, and at the worst possible moment. It sounds like one of Murphy's laws.
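The decision rule described above can be sketched in a few lines: score each simulated course of action with a complexity proxy and prefer the least complex one. The scenario names, sensor data and scoring proxy below are all invented for illustration.

```python
import numpy as np

def scenario_complexity(sensor_matrix: np.ndarray) -> float:
    """Proxy score: total coupling between sensor channels in a simulated scenario."""
    corr = np.abs(np.corrcoef(sensor_matrix, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    return float(corr.sum() / 2)

# Two hypothetical courses of action, each described by simulated sensor data
rng = np.random.default_rng(5)
options = {
    "flank_left":  rng.normal(size=(500, 12)),
    "flank_right": rng.normal(size=(500, 12)) + 0.5 * rng.normal(size=(500, 1)),  # shared driver -> more coupling
}
scores = {name: scenario_complexity(data) for name, data in options.items()}
print(scores)
print("preferred option:", min(scores, key=scores.get))  # steer towards lower complexity
```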




www.ontonix.com




Tuesday, 24 December 2013

Amazing what Complexity Technology Can do for Medicine.




Even though so-called "complexity science" has been around for a few decades, it has failed to produce workable definitions and metrics of complexity. In fact, complexity is still today seen as a series of phenomena of un-orchestrated self-organization (e.g. swarms of starlings) and emergence, in which complexity itself is never measured. In early 2005 the first Quantitative Complexity Theory (QCT) was established by J. Marczyk. According to this theory, complexity is no longer seen as a process but as a new physical property of systems. Complexity, therefore, just like energy for example, is an attribute of every system. In nearly a decade, QCT has found numerous applications in diverse fields. One of them is medicine.

Because modern science lacks a holistic perspective, favouring super-specialization, a patient is rarely seen and treated as a multi-organ dynamic system of systems. Due to this cultural limitation, and because of the overwhelming complexity of the human body, only on rare occasions is medical science quantitative...



Read full White Paper.


Click here for an Interactive Complexity Map of an EEG.


www.ontomeds.com




Thursday, 19 December 2013

How to Dismantle the Derivatives Time Bomb?



From an article  on financial modeling:


"Modeling derivatives is of particular importance due to the relative size of the derivative market when compared to the real economy. If we examine the Bank of International Settlements (BIS) estimate of “Amounts outstanding of over-the-counter (OTC) derivatives” in December 2010, this amounted to US$601,046 billion. To the World Bank estimate of World Gross Domestic Product (GDP) for 2010, the volume of financial transactions in the global economy is 73.5 times higher than nominal world GDP.




In 1990, this ratio amounted to “only” 15.3. Transactions of stocks, bonds, and foreign exchange have expanded roughly in tandem with nominal world GDP. Hence, the overall increase in financial trading is exclusively due to the spectacular boom of the derivatives markets. In the final analysis, the mathematical modeling of this system of obligations is an imperative for the world economy’s well-being."

Basically, what this means is that derivatives have engulfed our economy and our very existence relies on trading robots, stochastic integrals and Brownian motion. A very fragile situation which nobody today is able to grasp or control. The system is running on autopilot and nobody knows how the autopilot works.

Now, we all know that one salient characteristic of derivatives is their high complexity. This is because they have been deliberately designed to be complex. There are numerous reasons why one would want to design a very complex financial product. One of them is to fool investors. However, there are very important implications deriving from the introduction of highly complex financial products into the global economy:

1. Highly complex products have highly complex dynamics which are difficult to capture with conventional math methods. Monte Carlo, VaR, etc. are all techniques that not only belong to the past, they have contributed significantly to the crisis. This means they cannot be used to find a cure. If smoking causes cancer, smoking more will not make it go away.

2. A product may be said to be complex, at a given moment in time, but, precisely because of the complex dynamics of derivatives, this complexity is never constant. It changes.

3. If a product is said to be complex, it means that someone must have measured its complexity. Otherwise, how can such a claim be sustained? Serious science starts when you begin to measure.

4. The biggest problem with derivatives is their rating. Since their real dynamics is essentially unknown (or deliberately masked), attempting to rate them is futile. This is where the Credit Rating Agencies failed when they triggered the financial meltdown. On the one hand they assigned investment-grade ratings to products which were known to be toxic; on the other, their outdated methods of rating were simply not applicable to super-complex financial products.

This brings us to the main point of this article. Our economy looks more or less like this:



and we need to fix it before the system collapses. As you read this short article, every minute billions are being traded in hyperspace and the pile in the picture is growing. What can be done? There is no simple recipe. However, what must be done is this:

1. Start to measure and monitor the real complexity of the financial products that are out there. There exist tools today to do this. One is here.

2. Classify, rank and rate the complexity and resilience of derivatives. Establish maximum allowable levels of complexity and minimum allowable levels of resilience for financial products. Products with low resilience contribute to making the system (economy) more fragile. A minimal sketch of this step is shown after this list.

3. Once the most complex (dangerous) derivatives have been identified, they should be withdrawn progressively from circulation. 
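Below is a minimal sketch of the ranking and thresholding step, using invented complexity and resilience figures and assumed policy limits; how those figures are actually measured is the subject of step 1.

```python
# Hedged sketch: product names, numbers and limits are invented for illustration.
MAX_COMPLEXITY = 25.0      # assumed policy limit
MIN_RESILIENCE = 70.0      # assumed policy limit, in percent

products = [
    {"name": "swap_A",       "complexity": 12.4, "resilience": 86.0},
    {"name": "cds_basket_B", "complexity": 31.2, "resilience": 61.5},
    {"name": "swaption_C",   "complexity": 22.8, "resilience": 72.3},
]

# Rank by complexity and flag anything breaching either limit for progressive withdrawal
ranked = sorted(products, key=lambda p: p["complexity"], reverse=True)
for p in ranked:
    breach = p["complexity"] > MAX_COMPLEXITY or p["resilience"] < MIN_RESILIENCE
    flag = "WITHDRAW" if breach else "ok"
    print(f'{p["name"]:<14} complexity={p["complexity"]:>5}  '
          f'resilience={p["resilience"]:>5}%  {flag}')
```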


More soon.



www.assetdyne.com














Tuesday, 17 December 2013

Superior-Performance Portfolio Design via Complexity Theory


Assetdyne LLC is a privately held company founded in 2013. Assetdyne has developed the Complexity Portfolio Theory (CPT) and offers an exclusive and sophisticated system which measures the complexity and resilience of stocks and stock portfolios and which introduces the concept of complexity to portfolio theory and design.

While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios based on the minimization of portfolio complexity. The approach is based on the fact that excessively complex systems are inherently fragile.
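The contrast between the two approaches can be sketched as follows: the same optimizer is run twice, once minimizing portfolio variance (the MPT objective) and once minimizing a simple complexity proxy (the weighted absolute coupling between stocks). The proxy is an assumption made for illustration; the actual CPT objective has not been published.

```python
import numpy as np
from scipy.optimize import minimize

def optimal_weights(returns: np.ndarray, objective: str) -> np.ndarray:
    """Long-only, fully invested weights minimizing either variance or a complexity proxy."""
    cov = np.cov(returns, rowvar=False)
    corr = np.abs(np.corrcoef(returns, rowvar=False))
    n = returns.shape[1]

    def variance(w):   return w @ cov @ w      # MPT objective
    def complexity(w): return w @ corr @ w     # assumed CPT-style proxy: absolute coupling

    fun = variance if objective == "variance" else complexity
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]
    res = minimize(fun, np.full(n, 1.0 / n), bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x

# Synthetic daily returns for 5 stocks over roughly one trading year
rng = np.random.default_rng(6)
daily_returns = rng.normal(0.0005, 0.01, size=(250, 5))
print("min-variance weights:  ", np.round(optimal_weights(daily_returns, "variance"), 3))
print("min-complexity weights:", np.round(optimal_weights(daily_returns, "complexity"), 3))
```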


See how low-complexity portfolios perform better.

Download the full presentation here.



www.assetdyne.com



Friday, 13 December 2013

First Complexity-based Portfolio Design System



Assetdyne introduces Quantitative Complexity Science to portfolio analysis and design. Disruptive innovation in finance in its purest form. Check out the new website.


See interactive examples of stock portfolios.




Monday, 9 December 2013

Some Financial Products Are Said to be Complex. But How Complex is That?





Assetdyne offers an on-line tool which allows users to measure the complexity and resilience of a single security or a portfolio. The tool is connected in real-time to US markets and allows users to monitor any security listed there.

The tool allows investors to answer the following questions:

  • How complex is a portfolio? 
  • How complex is a financial product, such as a derivative?
  • What is the maximum complexity a portfolio can reach?
  • How resilient is it? How well can it resist the market turbulence?
  • What does the portfolio structure look like?
  • How interdependent are the stocks composing the portfolio?
  • Which stocks actually dominate portfolio dynamics?
  • How well can the dynamics of the portfolio be predicted?

See examples of portfolios (and their complexities) - click on a portfolio to open an interactive Portfolio Complexity Map:

Automotive

Steel

Gold Mining

Oil & gas

Pharmaceutical

IT Industry

EU Banks

US banks

Dow Jones



www.assetdynex.com