Sunday, 29 December 2013

NASDAQ 100 Resilience Rating Analysis

During 2014 Assetdyne will perform a Resilience Rating analysis of all the NASDAQ 100 stocks. The report will be offered free of charge and produced twice a month.

In addition, Portfolio Complexity Maps will be produced and made available for interactive navigation.

The first report is available here.

For more information on Assetdyne, visit the website.

Saturday, 28 December 2013

Complexity and Battle Management

Modern battle scenarios involve a huge amount of data and information. According to Wikipedia:

"Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war pioneered by the United States Department of Defense in the 1990s.
It seeks to translate an information advantage, enabled in part by information technology, into a competitive advantage through the robust networking of well informed geographically dispersed forces. This networking—combined with changes in technology, organization, processes, and people—may allow new forms of organizational behavior.
Specifically, the theory contains the following four tenets in its hypotheses:
  • A robustly networked force improves information sharing;
  • Information sharing enhances the quality of information and shared situational awareness;
  • Shared situational awareness enables collaboration and self-synchronization, and enhances sustainability and speed of command; and
  • These, in turn, dramatically increase mission effectiveness."
Now that complexity can be measured in real time using the QCM engine OntoNet, we can take things to the next level: Complexity-Centric Warfare. The first step is to map the entire information flow obtained from a multitude of sensors onto a Complexity Map (before the enemy can trigger an EMP!). The map naturally changes over time as the battle evolves. The concept is illustrated below.

Clearly, sensors gather data about all the forces involved in a particular scenario. The combined map, showing two opposing forces, is illustrated below (an extremely simple example is shown). Experiments in Air Traffic Control conducted by Ontonix show that it is possible to track hundreds of airborne objects using radar, in real time. A Massively Parallel Processing version of OntoNet (currently under development) will make it possible to process thousands of objects.

Once the maps are established, two issues of tactical character become evident:

  • Concentrate firepower on enemy hubs. 
  • Protect your own hubs.
Hubs are easily identified once a Complexity Map is available. A more sophisticated target-ranking approach is based on battle Complexity Profiling, which ranks the various actors based on their footprint on the entire scenario. Clearly, just as a Complexity Map changes with time, so will the Complexity Profile.
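The actual QCM metrics computed by OntoNet are proprietary, but the hub-identification idea above can be sketched with a simple proxy: treat the Complexity Map as a weighted dependency graph and rank each node by its aggregate dependency strength. All node names and weights below are invented for illustration.

```python
# Hypothetical sketch: ranking "hubs" in a complexity map.
# A node's footprint is approximated here by its weighted degree,
# where edge weights stand in for measured inter-node dependencies.

from collections import defaultdict

def rank_hubs(edges):
    """edges: list of (node_a, node_b, dependency_strength)."""
    footprint = defaultdict(float)
    for a, b, w in edges:
        footprint[a] += w
        footprint[b] += w
    # Highest aggregate dependency first: these are the hubs.
    return sorted(footprint.items(), key=lambda kv: -kv[1])

# Toy battle map: a command node couples to many sensors and units.
edges = [
    ("C2", "radar1", 0.9), ("C2", "radar2", 0.8),
    ("C2", "unitA", 0.7), ("unitA", "unitB", 0.3),
]
print(rank_hubs(edges))  # C2 emerges as the dominant hub
```

In a real map the rankings would shift as the battle evolves, which is why the profiling must be recomputed continuously.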

And now to strategic issues. How does one manage a battle using complexity? Simple. Fast scenario-simulation technology provides numerous options to choose from. And how do you choose between, say, two very similar options? You take the one with lower complexity. In other words, you try to steer the conflict in a way that reduces its complexity. The concept is illustrated below.

A less complex battle scenario is easier to manage. It is easier to comprehend. It allows for faster decision-making. It is easier to be efficient in a less complex situation than in a highly complex one. Finally, highly complex situations have the nasty habit of suddenly delivering surprising behavior, and at the worst possible moment. Sounds like one of Murphy's laws.
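The selection rule described above reduces to a one-line decision once each simulated course of action carries a complexity score. The option names and scores below are invented; in practice the scores would come from a quantitative complexity engine such as OntoNet.

```python
# Minimal sketch of the rule: among candidate courses of action with
# similar expected outcomes, prefer the one whose simulated scenario
# has the lowest measured complexity.

def pick_course_of_action(options):
    """options: dict mapping option name -> simulated scenario complexity."""
    return min(options, key=options.get)

simulated = {"flank_left": 41.2, "flank_right": 38.7, "hold": 44.9}
print(pick_course_of_action(simulated))  # the lowest-complexity option
```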

Tuesday, 24 December 2013

Amazing What Complexity Technology Can Do for Medicine

Even though so-called “complexity science” has been around for a few decades, it has failed to produce workable definitions and metrics of complexity. In fact, complexity is still seen today as a series of phenomena of un-orchestrated self-organization (e.g. swarms of starlings) and emergence, in which complexity itself is never measured. In early 2005 the first Quantitative Complexity Theory (QCT) was established by J. Marczyk. According to this theory, complexity is no longer seen as a process but as a new physical property of systems. Complexity, therefore, just like energy for example, is an attribute of every system. In nearly a decade, QCT has found numerous applications in diverse fields. One of them is medicine.

Because modern science lacks a holistic perspective, favouring super-specialization, a patient is rarely seen and treated as a multi-organ dynamic system of systems. Due to this cultural limitation, and because of the overwhelming complexity of the human body, only on rare occasions is medical science quantitative…

Read full White Paper.

Click here for an Interactive Complexity Map of an EEG.

Thursday, 19 December 2013

How to Dismantle the Derivatives Time Bomb?

From an article on financial modeling:

"Modeling derivatives is of particular importance due to the relative size of the derivative market when compared to the real economy. If we examine the Bank for International Settlements (BIS) estimate of “amounts outstanding of over-the-counter (OTC) derivatives” in December 2010, this amounted to US$601,046 billion. Compared to the World Bank estimate of World Gross Domestic Product (GDP) for 2010, the volume of financial transactions in the global economy is 73.5 times higher than nominal world GDP.

In 1990, this ratio amounted to “only” 15.3. Transactions of stocks, bonds, and foreign exchange have expanded roughly in tandem with nominal world GDP. Hence, the overall increase in financial trading is exclusively due to the spectacular boom of the derivatives markets. In the final analysis, the mathematical modeling of this system of obligations is an imperative for the world economy’s well-being."

Basically, what this means is that derivatives have engulfed our economy and our very existence relies on trading robots, stochastic integrals and Brownian motion. It is a very fragile situation which nobody today is able to grasp or control. The system is running on autopilot and nobody knows how the autopilot works.

Now, we all know that one salient characteristic of derivatives is their high complexity. This is because they have deliberately been designed to be complex. There are numerous reasons why one would want to design a very complex financial product. One of them is to fool investors. However, there are very important implications deriving from the injection of highly complex financial products into the global economy:

1. Highly complex products have highly complex dynamics which are difficult to capture with conventional mathematical methods. Monte Carlo, VaR, etc. are all techniques that not only belong to the past but have also contributed significantly to the crisis. This means they cannot be used to find a cure. If smoking causes cancer, smoking more will not make it go away.

2. A product may be said to be complex, at a given moment in time, but, precisely because of the complex dynamics of derivatives, this complexity is never constant. It changes.

3. If a product is said to be complex, it means that someone must have measured its complexity. Otherwise, how can such a claim be sustained? Serious science starts when you begin to measure.

4. The biggest problem with derivatives is that of their rating. Since their real dynamics are essentially unknown (or deliberately masked), attempting to rate them is futile. This is where the Credit Rating Agencies failed when they triggered the financial meltdown. On the one hand they assigned investment-grade ratings to products which were known to be toxic; on the other, their outdated rating methods were simply not applicable to super-complex financial products.

This brings us to the main point of this article. Our economy looks more or less like this:

and we need to fix it before the system collapses. As you read this short article, every minute billions are being traded in hyperspace and the pile in the picture is growing. What can be done? There is no simple recipe. However, what must be done is this:

1. Start to measure and monitor the real complexity of the financial products that are out there. There exist tools today to do this. One is here.

2. Classify, rank and rate the complexity and resilience of derivatives. Establish maximum allowable levels of complexity and minimum allowable resilience of financial products. Products with low resilience contribute to making the system (economy) more fragile.

3. Once the most complex (dangerous) derivatives have been identified, they should be withdrawn progressively from circulation. 
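The three steps above amount to a screening rule: measure each product, compare it against allowable limits, and flag the violators for withdrawal. The product names, figures and thresholds below are invented for illustration; the actual complexity and resilience measures would come from a tool such as the one linked in step 1.

```python
# Hypothetical screening sketch for the three steps above.

MAX_COMPLEXITY = 80.0   # assumed maximum allowable complexity
MIN_RESILIENCE = 0.40   # assumed minimum allowable resilience (0..1)

def flag_for_withdrawal(products):
    """products: list of (name, complexity, resilience) tuples.

    Returns the names of products that breach either limit and are
    therefore candidates for progressive withdrawal from circulation."""
    return [name for name, c, r in products
            if c > MAX_COMPLEXITY or r < MIN_RESILIENCE]

universe = [
    ("plain_vanilla_swap", 22.0, 0.85),
    ("synthetic_cdo_squared", 97.5, 0.12),
    ("barrier_option", 55.0, 0.55),
]
print(flag_for_withdrawal(universe))  # → ['synthetic_cdo_squared']
```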

More soon.

Tuesday, 17 December 2013

Superior-Performance Portfolio Design via Complexity Theory

Assetdyne LLC is a privately held company founded in 2013. Assetdyne has developed the Complexity Portfolio Theory (CPT) and offers an exclusive and sophisticated system which measures the complexity and resilience of stocks and stock portfolios and which introduces the concept of complexity to portfolio theory and design.

While conventional portfolio design often follows Modern Portfolio Theory (MPT), which identifies optimal portfolios via minimization of the total portfolio variance, the technique developed by Assetdyne designs portfolios based on the minimization of portfolio complexity. The approach is based on the fact that excessively complex systems are inherently fragile.
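The contrast between the two objectives can be illustrated with a toy example. Assetdyne's actual complexity measure is proprietary; as a stand-in, this sketch scores a weight vector both by MPT-style portfolio variance and by a crude "complexity" proxy, the weighted density of inter-asset couplings. The correlation matrix, volatilities and random-search optimizer are all invented for illustration.

```python
# Toy comparison: minimum-variance (MPT-style) vs. minimum-"complexity"
# portfolio weights over three assets, using naive random search.

import random

CORR = [  # invented correlation matrix: assets 0 and 1 strongly coupled
    [1.0, 0.8, 0.1],
    [0.8, 1.0, 0.2],
    [0.1, 0.2, 1.0],
]
VOL = [0.20, 0.25, 0.30]  # invented volatilities

def variance(w):
    """Classic portfolio variance w' C w."""
    return sum(w[i] * w[j] * CORR[i][j] * VOL[i] * VOL[j]
               for i in range(3) for j in range(3))

def complexity_proxy(w):
    """Stand-in complexity score: weighted strength of cross-couplings."""
    return sum(w[i] * w[j] * abs(CORR[i][j])
               for i in range(3) for j in range(3) if i != j)

def best_weights(score, trials=20000, seed=0):
    """Crude random search over the simplex of long-only weights."""
    rng = random.Random(seed)
    best, best_s = None, float("inf")
    for _ in range(trials):
        raw = [rng.random() for _ in range(3)]
        total = sum(raw)
        w = [x / total for x in raw]
        s = score(w)
        if s < best_s:
            best, best_s = w, s
    return best

w_mpt = best_weights(variance)
w_cpt = best_weights(complexity_proxy)
print("min-variance weights:  ", [round(x, 2) for x in w_mpt])
print("min-complexity weights:", [round(x, 2) for x in w_cpt])
```

Note how the complexity objective penalizes the tightly coupled pair of assets directly, independently of their volatilities, which is the intuition behind preferring less complex, and hence less fragile, portfolios.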

See how low-complexity portfolios perform better.

Download the full presentation here.

Friday, 13 December 2013

First Complexity-based Portfolio Design System

Assetdyne introduces Quantitative Complexity Science to portfolio analysis and design. Disruptive innovation in finance in its purest form. Check out the new website.

See interactive examples of stock portfolios.

Monday, 9 December 2013

Some Financial Products Are Said to be Complex. But How Complex is That?

Assetdyne offers an on-line tool which allows users to measure the complexity and resilience of a single security or a portfolio. The tool is connected in real time to US markets and makes it possible to monitor any security listed there.

The tool allows investors to answer the following questions:

  • How complex is a portfolio? 
  • How complex is a financial product, such as a derivative?
  • What is the maximum complexity a portfolio can reach?
  • How resilient is it? How well can it resist market turbulence?
  • What does the portfolio structure look like?
  • How interdependent are the stocks composing the portfolio?
  • Which stocks actually dominate portfolio dynamics?
  • How well can the dynamics of the portfolio be predicted?

See examples of portfolios (and their complexities) - click on a portfolio to open an interactive Portfolio Complexity Map:



  • Gold Mining
  • Oil & Gas
  • IT Industry
  • EU Banks
  • US Banks
  • Dow Jones

Tuesday, 3 December 2013

OntoMed Launches OntoNet-CARDIO for Real-Time Anticipation of Tachycardias and Fibrillations

OntoMed, a privately owned company developing leading-edge complexity-based technology and solutions for applications in medicine, releases OntoNet-Cardio, an advanced algorithm which processes EGMs (electrograms), providing early warnings of imminent tachycardia and/or fibrillation. "The algorithm does not detect tachycardias or fibrillations; it anticipates them by identifying pre-event conditions as early as 15-20 seconds before they actually happen," said Dr. J. Marczyk, the CEO of OntoMed. "The principle on which OntoNet-Cardio functions has been verified in a multitude of fields and applications and is based on the sudden variations of complexity which precede traumatic events," he added. "We are open to partnerships and collaborations with ICD/pacemaker manufacturers who are interested in incorporating our technology in their products," he concluded.

The image below illustrates how the system functions.
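The general principle quoted above, that traumatic events are preceded by sudden variations of complexity, can be sketched as a sliding-window monitor. OntoNet-Cardio's real algorithm and its EGM complexity measure are proprietary; the signal values, window size and threshold below are invented for illustration.

```python
# Sketch of an early-warning rule: raise an alarm when the complexity
# of a signal jumps sharply relative to its recent baseline.

def early_warnings(complexity_series, window=5, jump_threshold=2.0):
    """Return indices where complexity deviates sharply from the
    mean of the preceding `window` samples."""
    alarms = []
    for t in range(window, len(complexity_series)):
        recent = complexity_series[t - window:t]
        baseline = sum(recent) / window
        if abs(complexity_series[t] - baseline) > jump_threshold:
            alarms.append(t)
    return alarms

# Flat complexity, then a sudden jump at t=10 (a pre-event signature).
series = [10.0] * 10 + [14.0, 14.5, 15.0]
print(early_warnings(series))  # indices where the alarm fires
```

The point of such a monitor is that the alarm fires on the complexity jump itself, seconds before the event it foreshadows, rather than on the event.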

Read full Press Release here.