Sunday 5 January 2014

Casino Capitalism: Legitimizing the Derivatives Soup.




In 2000 the Commodity Futures Modernization Act (CFMA) was passed, legitimizing swap agreements and other hybrid instruments. It was a massive move towards deregulation, ending regulatory oversight of derivatives and leverage and turning Wall Street more than ever into a casino. At the same time the first Internet-based commodities transaction system was created to let companies trade energy and other commodity futures unregulated, effectively licensing pillage and fraud. (Enron took full advantage of this until it all imploded.) Further, the act launched a menu of options, binary options, forwards, swaps, warrants, LEAPS, baskets, swaptions and unregulated credit derivatives such as the now infamous credit default swaps, facilitating out-of-control speculation.

This deregulatory madness caused unprecedented fraud, insider trading, misrepresentation, Ponzi schemes, false accounting, obscenely high salaries and bonuses, the bilking of investors, customers and homeowners, embezzlement and other forms of theft, loans designed to fail, blatant conflicts of interest, lax enforcement of the remaining regulatory measures, market manipulation, fraudulent financial products and massive public deception.

This slicing and dicing of risk into supposedly risk-reducing derivative securities is still going on, creating a time bomb waiting to explode with catastrophic consequences. According to the latest BIS statistics on OTC derivatives markets, a whopping $693 trillion in notional value was outstanding at the end of June 2013. That is roughly ten times the GDP of the entire world and equivalent to about $100,000 for each of the 7 billion inhabitants of our planet.
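A quick back-of-the-envelope check of these figures (using an assumed world GDP of roughly USD 75 trillion for 2013, a number not given above):

# Back-of-the-envelope check of the figures cited above. World GDP of ~USD 75
# trillion for 2013 is an assumed round number, not a BIS figure.
notional_outstanding = 693e12   # BIS OTC derivatives notional, end-June 2013 (USD)
world_gdp = 75e12               # approximate 2013 world GDP (USD), assumption
population = 7e9                # approximate world population

print(notional_outstanding / world_gdp)    # ~9.2, i.e. roughly ten times world GDP
print(notional_outstanding / population)   # ~99,000, i.e. roughly USD 100,000 per person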

The complexity and high potential risk associated with derivatives require innovative risk-assessment procedures and strong technical knowledge. Tools to measure and monitor the complexity of these financial products do exist. One can be found here.

With this innovative tool you can classify, rank and rate the complexity and resilience of derivatives, and establish maximum allowable levels of complexity and minimum allowable levels of resilience. Products with low resilience make the system (the economy) more fragile. Once the most complex (most dangerous) derivatives have been identified, they should be progressively withdrawn from circulation.
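A minimal sketch of the screening idea just described, assuming each product already comes with a complexity score and a resilience score (the product names, numbers and thresholds below are invented for illustration; the actual metrics come from the rating engine):

MAX_COMPLEXITY = 70.0   # assumed maximum allowable complexity
MIN_RESILIENCE = 40.0   # assumed minimum allowable resilience (%)

products = [
    {"name": "CDS basket A",          "complexity": 85.2, "resilience": 22.5},
    {"name": "Plain-vanilla swap",    "complexity": 31.0, "resilience": 78.0},
    {"name": "Synthetic CDO tranche", "complexity": 92.7, "resilience": 15.3},
]

# Flag products exceeding the complexity ceiling or falling below the resilience floor.
to_withdraw = [p for p in products
               if p["complexity"] > MAX_COMPLEXITY or p["resilience"] < MIN_RESILIENCE]

# The most complex offenders would be withdrawn from circulation first.
for p in sorted(to_withdraw, key=lambda p: p["complexity"], reverse=True):
    print(f'{p["name"]}: complexity={p["complexity"]}, resilience={p["resilience"]}%')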


Submitted by Hans van Hoek



www.assetdyne.com
 

 
 
 

Saturday 4 January 2014

NASDAQ 100 Resilience Rating Analysis - January 2014 (1/2)

The first of the two fortnightly NASDAQ 100 Resilience Rating Analysis reports for January 2014 is now available for download here.

The second January 2014 report will be available after January 15th.

The NASDAQ 100 Resilience Rating Analysis provides a ranking of the 100 stocks composing the index based on stock complexity and resilience. The report is offered free of charge.
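As an illustration of what such a ranking boils down to, here is a minimal sketch that orders a handful of stocks by resilience, breaking ties with complexity (the tickers and scores are invented placeholders, not figures from the report):

stocks = {
    "AAPL": {"complexity": 12.4, "resilience": 81.0},
    "MSFT": {"complexity": 15.1, "resilience": 76.5},
    "TSLA": {"complexity": 27.8, "resilience": 58.2},
}

# Rank by resilience (higher is better); ties go to the less complex stock.
ranking = sorted(stocks.items(),
                 key=lambda kv: (-kv[1]["resilience"], kv[1]["complexity"]))

for rank, (ticker, scores) in enumerate(ranking, start=1):
    print(f'{rank}. {ticker}  resilience={scores["resilience"]}%  complexity={scores["complexity"]}')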

Reports can be generated on a daily basis or in real-time. For more information contact us.








Which is the Most Reliable and Trustworthy Rating Agency?


One can never really trust a third party 100%. A lot has been written about the unreliability, lack of transparency and conflicts of interest of the Big Three Credit Rating Agencies. And yet the entire economy uses and depends on ratings. It's a bit like smokers who keep smoking even though they know smoking causes cancer.

Even though the rating agencies have been called the "key enablers of the financial meltdown", ratings are still necessary. But they must be reliable and trustworthy. Because nobody is really 100% transparent and 100% independent, the term "reliable rating agency" sounds like an oxymoron. A radically new approach is needed.

The only person you trust 100% is yourself. So, if you want a 100% reliable rating, you must do it yourself. This is why we have built the "Rate-A-Business" platform, so that you can rate any business yourself. This is how it works:

1. If you want to rate a publicly listed company, you download its financials from its website and you process them at www.rate-a-business.com

2. If you want to rate your own business, you already have the financials. You use data you trust. You trust the result.


In the first case we still have the problem of trusting the financials that public companies post on their Investor Relations pages. But at least the mechanism Rate-A-Business uses to rate those numbers remains the same. For everyone. All the time.

Ratings must be democratised. This means they must be in the hands of those who use them. They must become a commodity. Not a means of deception.



www.ontonix.com




Thursday 2 January 2014

Manipulation



Wall Street claims markets move randomly, reflecting the collective wisdom of investors. The truth is quite the opposite. The government's visible hand and insiders control them, manipulating them up or down for profit: all of them, including stocks, bonds, commodities and currencies. The public is none the wiser.

It's brazen financial fraud, like the pump-and-dump practice, defined as "artificially inflating the price of a stock or other security through promotion, in order to sell at the inflated price, then profit more on the downside by short-selling". This practice is illegal under securities law, yet it is particularly common, and in today's volatile markets it occurs daily to one degree or another. My career on Wall Street started out like this, in the proverbial "boiler room".

A company's stock price and true worth can be highly divergent. In other words, healthy or sick firms may be way over- or undervalued depending on market and economic conditions and on how manipulative traders wish to price them, short or longer term. During a trading frenzy a stock price increases, and so the capitalization of the company is suddenly higher than it was just a few minutes or hours before? What nonsense that is!

The idea that equity prices reflect true value or that markets move randomly (up or down) is nonsense. They never have, and now more than ever they don't. It is therefore crucial to bypass the usual analyst hype, look at a company directly and determine its risk and complexity as a primary analysis tool. There is no manipulation here: the data give the company, stock or portfolio a face, and it is not a poker face. The system developed by Assetdyne allows users to compute the Resilience Rating and complexity of single stocks, stock portfolios, derivatives and other financial products.
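Assetdyne's actual QCM-based metrics are proprietary, but the general flavour of a data-driven, manipulation-free measure can be sketched with a simple correlation-based proxy; everything below, from the proxy itself to the random sample data, is an assumption made purely for illustration:

import numpy as np

def complexity_proxy(returns, threshold=0.3):
    """Toy complexity measure: strength of the portfolio's inter-dependency
    structure, taken as the sum of |correlation| over strongly correlated stock
    pairs, divided by the number of pairs. Not Assetdyne's actual metric."""
    corr = np.corrcoef(returns)          # rows = stocks, columns = daily returns
    iu = np.triu_indices(corr.shape[0], k=1)
    pair_corr = np.abs(corr[iu])
    return float(pair_corr[pair_corr > threshold].sum() / len(pair_corr))

rng = np.random.default_rng(0)
returns = rng.normal(size=(5, 250))      # 5 hypothetical stocks, 250 trading days

c = complexity_proxy(returns)            # near zero for uncorrelated random data
print(f"complexity proxy: {c:.3f}, resilience proxy: {1 - c:.3f}")

In this toy version a portfolio whose constituents are densely and strongly inter-correlated scores as more complex, and therefore less resilient, than one built from loosely coupled stocks.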

Hans van Hoek
Partner at Assetdyne



www.assetdyne.com



Sunday 29 December 2013

NASDAQ 100 Resilience Rating Analysis


During 2014 Assetdyne will be performing a Resilience Rating analysis of all of the NASDAQ 100 stocks. The report will be offered free of charge and produced twice a month.

In addition, Portfolio Complexity Maps will be produced and made available for interactive navigation.

The first report is available here.

For more information on Assetdyne, visit the website.



www.assetdyne.com



Saturday 28 December 2013

Complexity and Battle Management


Modern battle scenarios involve a huge amount of data and information. According to Wikipedia:

"Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war pioneered by the United States Department of Defense in the 1990s.
It seeks to translate an information advantage, enabled in part by information technology, into a competitive advantage through the robust networking of well informed geographically dispersed forces. This networking—combined with changes in technology, organization, processes, and people—may allow new forms of organizational behavior.
Specifically, the theory contains the following four tenets in its hypotheses:
  • A robustly networked force improves information sharing;
  • Information sharing enhances the quality of information and shared situational awareness;
  • Shared situational awareness enables collaboration and self-synchronization, and enhances sustainability and speed of command; and
  • These, in turn, dramatically increase mission effectiveness."
Now that complexity can be measured in real-time using the QCM engine OntoNet, we can take things to the next level: Complexity-Centric Warfare. The first step is to map the entire information flow obtained from a multitude of sensors onto a Complexity Map (before the enemy can trigger an EMP!). The map of course changes over time as the battle evolves. The concept is illustrated below.



Sensors gather data about all the forces involved in a particular scenario. The combined map, showing two opposing forces, is illustrated below (an extremely simple example). Experiments in Air Traffic Control conducted by Ontonix show that it is possible to track hundreds of airborne objects using radar and in real-time. A Massively Parallel Processing version of OntoNet (currently under development) will make it possible to process thousands of objects.


Once the maps are established, two tactical issues become evident:

  • Concentrate firepower on enemy hubs. 
  • Protect your own hubs.
Hubs are easily identified once a Complexity Map is available. A more sophisticated target-ranking approach is based on battle Complexity Profiling, which ranks the various actors based on their footprint on the entire scenario. Just as a Complexity Map changes with time, so does the Complexity Profile.
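As a rough illustration of hub spotting (not OntoNet's actual Complexity Profiling), a Complexity Map can be treated as a plain graph of inter-dependencies and its most connected nodes picked out with degree centrality; the graph below is entirely hypothetical:

import networkx as nx

# Hypothetical map: nodes are assets, edges are significant inter-dependencies.
G = nx.Graph()
G.add_edges_from([
    ("command_post", "radar_1"), ("command_post", "radar_2"),
    ("command_post", "artillery"), ("command_post", "supply_depot"),
    ("radar_1", "sam_battery"), ("supply_depot", "artillery"),
])

centrality = nx.degree_centrality(G)

# The most connected nodes are the hubs: prime targets, or the assets to protect.
hubs = sorted(centrality, key=centrality.get, reverse=True)[:3]
print(hubs)   # "command_post" comes out on top in this toy map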

And now to strategic issues. How to manage a battle using complexity? Simple. Fast scenario-simulation technology provides numerous options to choose from. And how do you choose between, say, two very similar options? You take the one with lower complexity. In other words, you try to steer the conflict in a way that reduces its complexity. The concept is illustrated below.



A less complex battle scenario is easier to manage. It is easier to comprehend. It allows for faster decision-making. It is easier to be efficient in a less complex situation than in a highly complex one. Finally, highly complex situations have the nasty habit of suddenly delivering surprising behavior, and at the worst possible moment. Sounds like one of Murphy's laws.
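The selection rule described above, pick the simulated course of action whose resulting scenario has the lowest complexity, fits in a few lines; the option names and complexity scores below are hypothetical:

# Simulated courses of action and the complexity of the resulting scenario
# (hypothetical values produced, say, by fast scenario simulation).
options = {
    "flanking_maneuver": 64.2,
    "frontal_assault":   71.8,
    "hold_and_resupply": 52.9,
}

best = min(options, key=options.get)   # option with the lowest scenario complexity
print(best)                            # -> hold_and_resupply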




www.ontonix.com




Tuesday 24 December 2013

Amazing What Complexity Technology Can Do for Medicine.




Even though the so-called "complexity science" has been around for a few decades, it has failed to produce workable definitions and metrics of complexity. In fact, complexity is still seen today as a series of phenomena of un-orchestrated self-organization (e.g. swarms of starlings) and emergence, in which complexity itself is never actually measured. In early 2005 the first Quantitative Complexity Theory (QCT) was established by J. Marczyk. According to this theory, complexity is no longer seen as a process but as a new physical property of systems. Complexity, therefore, just like energy for example, is an attribute of every system. In nearly a decade QCT has found numerous applications in diverse fields. One of them is medicine.

Because modern science lacks a holistic perspective, favouring super-specialization, a patient is rarely seen and treated as a multi-organ dynamic system of systems. Due to this cultural limitation, and because of the overwhelming complexity of the human body, only on rare occasions is medical science quantitative...



Read full White Paper.


Click here for an Interactive Complexity Map of an EEG.


www.ontomeds.com