Saturday, 12 October 2013
Complexity, Criticality and the Drake Equation
Frank Drake devised an equation to express the hypothetical number of observable civilizations in our galaxy: N = Rs · nh · fl · fi · fc · L, where N, the number of civilizations in our galaxy, is expressed as the product of six factors: Rs is the rate of star formation, nh is the number of habitable worlds per star, fl is the fraction of habitable worlds on which life arises, fi is the fraction of inhabited worlds with intelligent life, fc is the fraction of intelligent life forms that produce civilizations, and L is the average lifetime of such civilizations. But there is an evident paradox. According to the Drake equation, our galaxy should be populated by thousands of civilizations similar to ours. The number of stars that appear to be orbited by Earth-like planets increases on an almost daily basis. But if that is the case, where is everybody? Why are there no signs of their existence? Why does SETI fail to produce evidence that would support the Drake equation?
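As a back-of-the-envelope illustration of how the factors combine, here is a minimal sketch that evaluates the equation exactly as stated above; the parameter values are purely illustrative assumptions, not estimates endorsed in this article.

```python
def drake(Rs, nh, fl, fi, fc, L):
    """Number of observable civilizations in the galaxy,
    as the product of the six factors defined in the text."""
    return Rs * nh * fl * fi * fc * L

# Purely illustrative inputs (assumptions, not estimates from the article):
# 7 new stars per year, 0.5 habitable worlds per star, optimistic fractions,
# and an average civilization lifetime of 10,000 years.
N = drake(Rs=7, nh=0.5, fl=0.5, fi=0.1, fc=0.1, L=10_000)
print(f"N = {N:.0f} civilizations")  # 175 with these assumed inputs
```

Even with deliberately cautious fractions, the product easily runs into the hundreds, which is precisely what makes the silence of SETI paradoxical.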
In 1981, cosmologist Edward Harrison suggested a powerful self-regulating mechanism that would neatly resolve the paradox. Any civilization bent on the intensive colonization of other worlds would be driven by an expansive territorial impulse. But such an aggressive nature would be unstable in combination with the immense technological powers required for interstellar travel. Such a civilization would self-destruct long before it could reach for the stars. The unrestrained territorial drive that served biological evolution so well for millions of years becomes a severe liability for a species once it acquires powers more than sufficient for its self-destruction. The Milky Way may well contain civilizations more advanced than ours, but they must have passed through a filter of natural selection that eliminates, by war or other self-inflicted environmental catastrophes, those civilizations driven by aggressive expansion.
We propose an alternative explanation of the paradox. In the past, the Earth was populated by numerous, disjoint civilizations that thrived almost in isolation: the Sumerians, the Maya, the Incas, the Greeks, the Romans, etc., etc. If one or more of them happened to disappear, many others remained. The temporal and spatial correlation between civilizations was very limited. Today, however, the Earth is populated by a single, globalized civilization. If this one fails, that's it.

As we know, the evolution and growth of a civilization manifests itself as an increase in complexity. The Egyptians, for example, deliberately chose not to evolve and for many centuries hardly advanced an inch. Such a static civilization is only possible in an extremely structured and rigid society. Any form of progress, by contrast, is accompanied by an increase in complexity (a mix of structure and entropy), until critical complexity is reached. Close to criticality, a system becomes fragile and therefore vulnerable. In order to continue evolving beyond critical complexity, a civilization must find ways of surviving the delicate phase of vulnerability in which self-inflicted destruction is the most probable form of demise. It appears - see our previous articles - that our globalized society is headed for collapse and will reach criticality around 2040-2045. What does this mean? If we fail to move past criticality, there will be no second chance: no other civilization will take over, at least not for millennia. The biological lifetime of our species is likely to be several million years even if we do our worst, but as far as technological progress is concerned, that will essentially be it.

Based on our complexity metric and on the Second Law of Thermodynamics, we conclude that any world populated by multiple, disjoint civilizations will always tend towards a single globalized society. Globalization, it appears, is inevitable, and this in turn accelerates the increase of complexity until criticality is reached.
We argue that the self-regulating mechanism Harrison suggests ultimately stems from critical complexity. Only a civilization capable of evolving beyond criticality, in the presence of overwhelmingly powerful technology, can ever hope to reach for the stars. In other words, critical complexity is the hurdle that prevents evolution beyond self-inflicted extinction. Since none of the ancient (and not so ancient) civilizations ever evolved beyond critical complexity - in fact, they're all gone - they were all pre-critical civilizations. There has never been a post-critical civilization on Earth. The only one left with a chance of becoming post-critical is ours. But what conditions must a civilization meet in order to transition beyond criticality? Essentially two. First, it must lay its hands on technology to actively manage complexity. Second, it must have enough time to employ it. The technology exists. Since 2005.
www.ontonix.com
The not-that-useful Definitions of Complexity
"Every few months seems to produce another paper proposing yet another measure of complexity, generally a quantity which can't be computed for anything you'd actually care to know about, if at all. These quantities are almost never related to any other variable, so they form no part of any theory telling us when or how things get complex, and are usually just quantification for quantification's own sweet sake". Read more in: http://cscs.umich.edu/~crshalizi/notebooks/complexity-measures.html. The above mentioned abundance of candidate complexity measures - a clear reflection of the rampant fragmentation in the field - is summarized in: http://en.wikipedia.org/wiki/Complexity as follows: In several scientific fields, "complexity" has a specific meaning:
In computational complexity theory, the time complexity of a problem is the number of steps it takes to solve an instance of the problem, as a function of the size of the input (usually measured in bits), using the most efficient algorithm. This allows problems to be classified into complexity classes (such as P and NP); an analogous analysis exists for space, that is, the memory used by the algorithm.
In algorithmic information theory, the Kolmogorov complexity (also called descriptive complexity or algorithmic entropy) of a string is the length of the shortest binary program which outputs that string.
In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.
In physical systems, complexity is a measure of the probability of the state vector of the system. This is often confused with entropy, but it is a distinct mathematical analysis of the probability of the system's state, in which two distinct states are never conflated and considered equal, as they are in statistical mechanics.
In mathematics, Krohn-Rhodes complexity is an important topic in the study of finite semigroups and automata.
In the sense of how complicated a problem is from the perspective of the person trying to solve it, limits of complexity are measured using a term from cognitive psychology, namely the hrair limit.
Specified complexity is a term used in intelligent design theory, first coined by William Dembski.
Irreducible complexity is a term used in arguments against the generally accepted theory of biological evolution, being a concept popularized by the biochemist Michael Behe.
Unruly complexity denotes situations that do not have clearly defined boundaries, coherent internal dynamics, or simply mediated relations with their external context, as coined by Peter Taylor.
And now, ask yourself this: can I use any of these measures to study the evolution of a corporation, of air-traffic, of a market? Can any of these 'measures' help identify a complex system and distinguish it from a "simple system"?
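To make the criticism concrete, here is a minimal sketch of one of the few measures above that can actually be computed in practice: a crude compression-based proxy for Kolmogorov complexity (the length of a compressed description). The two synthetic series are illustrative assumptions; the point is simply that such a score, on its own, tells us nothing useful about a corporation, air traffic or a market.

```python
import random
import zlib

def compressed_length(values):
    """Crude proxy for Kolmogorov complexity: length, in bytes, of the
    zlib-compressed textual description of a sequence."""
    description = ",".join(f"{v:.4f}" for v in values).encode()
    return len(zlib.compress(description, 9))

random.seed(1)
periodic = [1.0 if i % 2 == 0 else -1.0 for i in range(1000)]   # highly regular
noisy    = [random.gauss(0.0, 1.0) for _ in range(1000)]        # highly irregular

print("periodic series:", compressed_length(periodic), "bytes")
print("noisy series:   ", compressed_length(noisy), "bytes")
# The noisy series compresses far less, so it scores as 'more complex' -
# yet the number alone says nothing about the structure or health of a real system.
```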
www.ontonix.com
Isn't Everything a 'Complex' System?
What distinguishes a theory from a conjecture? A characteristic constant, for example (G, c, h, K, etc.), or a fundamental equation. The so-called 'complexity theory' has none. Most importantly, it lacks a measure of its most fundamental quantity - complexity. Worse than that, it lacks a definition of complexity too! Increasing complexity is, by far, the most evident characteristic of most aspects of our lives. It is, therefore, quite correct to talk about complexity, and it would be great to be able to manage it before it becomes a problem. But if you can't measure it, you can't manage it. Right?
If we accept the current 'definition' of a complex system, we can claim that all systems are complex. This 'definition' states that a system is complex if it is an aggregate of autonomous agents which spontaneously interact and self-organize, leading to more elaborate systems, etc., etc. You know, the usual 'the whole is greater than the sum of the parts' stuff. It is also stated, quite correctly, that it is impossible to infer the behaviour of the system from the properties of the agents that compose it. True: analyzing a single human in depth will reveal little about the dynamics of a society. Nothing new under the sun.
According to the above logic, all systems that surround us are 'complex':
- Atoms spontaneously form molecules
- Molecules spontaneously form crystals, proteins, etc.
- Proteins combine to form cells, which, in turn, form organs
- Humans form societies
- Grains of sand form dunes and landslides
- Flakes of snow combine to form avalanches
- Animals and plants form ecosystems
- Matter in the universe forms stars, which organize into galaxies
- Corporations form markets
- Molecules of water form drops, which, in turn, form waves in the ocean
- Electrical impulses in networks of neurons form thoughts, sensations, emotions, conscience, etc.
None of the above requires orchestration by a Master Choreographer.
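As a toy illustration of local rules producing global order without a choreographer, here is a minimal sketch (the rule and parameters are arbitrary assumptions, chosen only for brevity): a one-dimensional majority-rule automaton in which each 'agent' looks only at its two neighbours, yet ordered domains emerge from a random start.

```python
import random

def step(cells):
    """One synchronous update: each cell adopts the majority state of its
    3-cell neighbourhood (itself plus its two neighbours, wrapping around)."""
    n = len(cells)
    return [1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]

random.seed(0)
cells = [random.randint(0, 1) for _ in range(60)]
for generation in range(5):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

No agent 'knows' anything about the global pattern; the structure is a property of the ensemble, which is precisely why inspecting a single agent in isolation reveals so little.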
A closer look at life reveals that everything we see and experience is a 'complex system'. At this point, then, one may ask the following question: what benefit (for science and philosophy) stems from establishing a new name for a set of objects which already contains all objects?
Tuesday, 8 October 2013
EU Commission: Italy Has Highest Long-Term Sustainability in the EU
We've been saying it for a long time: Italy's economy is one of the most resilient in the EU. It may not have the best performance, but it has high robustness. Performance is one thing; robustness and sustainability are another.
Today the EU Commission confirms that, in the long run, Italy has the best Sustainability Index in the EU (see the figure above, taken from the EU Commission's Fiscal Sustainability Report 2012).
This seems paradoxical, to say the least. Italy, a G8 economy with a manufacturing industry second only to Germany's, has been bombarded by rating agencies, attacked by speculators and often singled out as the weakest link of the Eurozone. Why?
www.ontonix.com
Nasdaq CFO Says Complexity is the Biggest Challenge to Market Success
In a recent article, the CFO of NASDAQ states that "Complexity is the Biggest Challenge to Market Success". He also speaks of the complexity of financial products and of a complexity reduction initiative. All this can be put in place if and only if you measure complexity. Talking about it will not reduce it. Hope is not a strategy.
Today, the technology to measure complexity exists:
Assetdyne - www.assetdyne.com - to measure the complexity of stocks and financial products
RateABusiness - www.rate-a-business.com - to measure the complexity of a business
You can only manage it if you can measure it. Resistance is futile.
Monday, 7 October 2013
Probability of Default Versus the Principle of Incompatibility
According to the Millennium Project, the biggest global challenges facing humanity are those illustrated in the image above. The image conveys a holistic message which some of us already appreciate: everything is connected with everything else. The economy is not indicated explicitly in the image but, evidently, it is there, just as industry, commerce, finance, religions, etc. are. Indeed a very complex scenario. The point is not to list everything but merely to point out that we live in a highly interconnected and dynamic world. We of course agree with the above picture.
As we have repeatedly pointed out in our previous articles, in such circumstances:
- it is impossible to make predictions - in fact, even the current economic crisis (of planetary proportions) has not been forecast
- only very rough estimates can be attempted
- there is no such thing as precision
- it is impossible to isolate "cause-effect" statements as everything is linked
- optimization is unjustified - one should seek acceptable solutions, not pursue perfection
The well-known Principle of Incompatibility (formulated by Lotfi Zadeh) states, in fact, that "high precision is incompatible with high complexity". Yet this fundamental principle, which applies to all facets of human existence as well as to Nature, goes unnoticed. Neglecting the Principle of Incompatibility constitutes a tacit and embarrassing admission of ignorance. One such example is that of ratings. While the concept of a rating lies at the very heart of our economy and is, in principle, a necessary concept and tool, something is terribly wrong. A rating, as we know, measures the Probability of Default (PoD). Ratings are stratified into classes; one example of such a stratification is shown below:
Class   PoD
1       ≤ 0.05%
2       0.05% - 0.1%
3       0.1% - 0.2%
4       0.2% - 0.4%
5       0.4% - 0.7%
6       0.7% - 1.0%
etc.
A rating affects the way the stocks of a given company are traded - this is precisely its function. What is shocking in the above numbers, however, is the precision (resolution). A PoD of 0.11% puts a company in class 3, while a PoD of 0.099% puts it in class 2. How can this be so? Isn't the world supposed to be a highly complex system? Clearly, if even a crisis of planetary proportions cannot be forecast, this not only points to high complexity (see the Principle of Incompatibility) but also says a lot about all the Business Intelligence technology used in economics, finance, management and decision making. So where does all this precision in ratings come from? From a parallel virtual universe of equations and numbers in which everything is possible but which, unfortunately, does not map well onto reality. And an understanding of the real universe cannot be based on a parallel virtual universe which is incorrect.
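To see how brittle this resolution is, here is a minimal sketch of the class boundaries in the table above as a lookup function; the class labels and cut-offs simply transcribe the table, and the two sample PoDs are the ones discussed in the text.

```python
# Upper PoD bound (as a fraction) for each rating class in the table above.
CLASS_BOUNDS = [
    (1, 0.0005),   # class 1: PoD <= 0.05%
    (2, 0.0010),   # class 2: 0.05% - 0.1%
    (3, 0.0020),   # class 3: 0.1% - 0.2%
    (4, 0.0040),   # class 4: 0.2% - 0.4%
    (5, 0.0070),   # class 5: 0.4% - 0.7%
    (6, 0.0100),   # class 6: 0.7% - 1.0%
]

def rating_class(pod):
    """Map a Probability of Default (a fraction, e.g. 0.0011 = 0.11%) to a class."""
    for cls, upper in CLASS_BOUNDS:
        if pod <= upper:
            return cls
    return None  # beyond class 6 (the 'etc.' in the table)

# A difference of about one hundredth of a percentage point flips the class:
print(rating_class(0.00099))  # 0.099% -> class 2
print(rating_class(0.0011))   # 0.11%  -> class 3
```

The point is not the lookup itself, of course, but the implied claim that a PoD can be known to within a few hundredths of a percentage point in the first place.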
The above example of PoD stratification reflects very little understanding of Nature and of its mechanisms. In fact, economic crises of global proportions happen suddenly. As Aristotle wrote in his Nicomachean Ethics, an educated mind is distinguished by the fact that it is content with the degree of accuracy which the nature of things permits, and by the fact that it does not seek exactness where only approximation is possible.
www.ontonix.com