There still seems to be a rush towards optimal design. Yet there is
no surer route to fragility and vulnerability than the pursuit of peak
performance and perfection - optimality, in other words. Let's take a
look at the logic behind this risky and outdated practice:
- Based on a series of assumptions, a mathematical model of a system or problem is built.
- This is already the first departure from reality: a model is just a model.
- If you're good - really good - the model will still "miss" around 10% of reality.
- You then squeeze peak performance out of this model according to some objective function.
- Finally, you manufacture the real thing based on what the model says.
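The pitfall in these steps can be sketched numerically. Below is a minimal, hypothetical illustration (all functions and numbers are invented for the example): a model objective omits a variable `eps`, and the model-optimal design turns out to be the design most punished by that omission.

```python
def model_perf(x):
    # The model's view of performance: peaks at x = 5.
    return 10 * x - x**2

def true_perf(x, eps):
    # Reality includes an omitted term, scaled by eps, that
    # penalises large x - the very region the optimizer favours.
    return 10 * x - x**2 - eps * x**3

x_opt = 5.0     # peak of the model
x_backed = 4.0  # deliberately sub-optimal, with margin in hand

for eps in (0.0, 0.05, 0.1):
    print(f"eps={eps}: optimal design -> {true_perf(x_opt, eps):.2f}, "
          f"backed-off design -> {true_perf(x_backed, eps):.2f}")
```

With `eps = 0` the model-optimal design wins, as expected. But as soon as the omitted variable is even slightly non-zero, the "inferior" backed-off design outperforms it - the optimum was tuned to a world that doesn't quite exist.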
It is known - though obviously not to everyone - that optimal designs
may be well-behaved with respect to random variations in the design
parameters while, at the same time, being hyper-sensitive to small
variations in the variables that were left out when the model was
built. This is precisely what happens: you design for or against
something, but you miss something else. By sweeping seemingly innocent
variables under the carpet, you are deciding a priori what the physics
will be like. And that you cannot do. If, on the other hand, your
design isn't forced to be optimal - to go to the limit - it stands a
better chance, because it retains room for manoeuvre. When you're
optimal, you can only get worse! If you're standing on the peak of a
mountain, the only way is down! Why, then, even attempt to design and
build systems that are optimal and can only get worse? Is this so
difficult to see? Sounds like the Emperor's new clothes, doesn't it?
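The "only way is down" point has a simple numerical counterpart. In this small sketch (a toy smooth objective, invented for illustration), random perturbations around the exact optimum never improve performance, whereas around a non-optimal point roughly half of them do:

```python
import random

random.seed(0)

def f(x):
    # A smooth performance function with its peak at x = 2.
    return -(x - 2.0) ** 2

def frac_improving(x0, trials=1000, step=0.1):
    # Fraction of small random perturbations that IMPROVE on f(x0).
    wins = 0
    for _ in range(trials):
        dx = random.uniform(-step, step)
        if f(x0 + dx) > f(x0):
            wins += 1
    return wins / trials

print(frac_improving(2.0))  # at the peak: no perturbation improves
print(frac_improving(1.0))  # on the slope: roughly half improve
```

At the peak every direction is downhill, so the design can only degrade; away from the peak there is slack, and disturbances are as likely to help as to hurt.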
www.design4resilience.com
www.ontonix.com