Why is nature simple and yet complicated?

Complex or complicated - that is the question

After repeatedly encountering incorrect uses of the term "complexity", we would like to provide an easily understandable explanation of what it means and why we so often fail with many established and proven methods.

When, in the movie "Avatar", the military destroys the tree of life of the Na'vi on Pandora, the life of the inhabitants is thrown out of joint. The tree is sacred to the Na'vi because it reflects the complex, delicate, nature-loving life of the humanoid inhabitants. Their life and thinking are characterized by complexity and networking - in essence transferable to our earthly existence, thinking and acting today.

But in contrast to the world of the Na'vi, we still remain too much in a world of linearity. What does it take? New approaches to thinking and networking. Accordingly, the subject of "networking" plays a central role in the present systemic analysis. The background is that linear patterns of thought and action have proven their worth in the past and contributed to the success of Western society. But with changing framework conditions - especially in view of digitization and networking - our previous problem-solving skills have increasingly reached their limits in recent years. Even more, this "changing world" is shaking our previous model of success.

The crux: linear thinking is based primarily on simple cause-effect relationships and largely avoids dealing with the complex networks and interactions that arise from increasing technical and social networking. Current (complex) challenges are often greatly simplified and broken down into individual topics so that they can be analyzed and processed with the previous methods. Numerous major projects that got out of hand - the new Berlin Brandenburg Airport being an extreme example - bear witness to this.

A challenge in the following considerations is that text can only be presented linearly. Therefore there are always cross-references [→ ...] to related text passages.

The transformation to a network society

What many of us are hardly aware of: we are in the middle of a fundamental social transformation. It began around the 1950s with the development of ever more powerful computers. Alongside the agricultural and industrial society, the network society is emerging. In the literature, other terms are used as well, such as the information or knowledge society, the third industrial revolution or the second modernity; the term "digitization" also describes this development. Networks play a central role in this transformation, which is why the term "network society" is used in this article. At the same time, the currently predominant tertiary economic sector (services) will be largely automated, as the primary sector (agriculture) and the secondary sector (industry) were before it. In parallel, cutthroat competition is increasing in a wide variety of areas.

The increased networking and automation naturally also have their downsides. Our life in developed industrial societies is already completely dependent on the functioning of vital, critical infrastructures (such as electricity and telecommunications). A failure can very quickly lead to devastating consequences, even if we can hardly imagine that so far. This is also because the technical networking to date has often grown chaotically, and systemic aspects have only been insufficiently taken into account.

What is a system

The term "system" plays a central role in this context (see Figure 1). In general, a system is a whole made up of several individual elements. Viewed abstractly, systems consist of different system elements that are networked with one another and form an effective structure. Without these networks (= relationships) there is only a collection of elements, not a system - as with a pile of sand. A system boundary delimits the system from its environment and gives it a certain "identity" and a certain purpose. A system description is, of course, only a model of reality.

Figure 1: Simplified representation of a system [source: own illustration]

These boundaries do not always have to be physically present or deterministically fixed; they often depend on the observer or on the concrete observation. They can be of a material nature (for example our skin) or of an immaterial nature (for example the delimitation of a social group). Such boundaries can act as a range limitation, which is important for system security because it limits, for example, the spread of disturbances or errors in a system (e.g. diseases). Systems generally have an optimal size or number of system elements.

They are adaptable and have a certain elasticity. However, if a critical value - which is not clearly recognizable in advance - is reached and appropriate subsystems cannot be created in good time, the system collapses. Each system element can potentially enter into relationships with every other element of the overall system, and the number of possible interactions increases exponentially with the number of system elements [→ exponential developments, → dynamics]. External relationships to other systems or environmental elements are also possible; with natural systems this is the rule, which is why they are also known as open systems.
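To make this combinatorial explosion concrete, here is a minimal sketch in Python (used purely for illustration; the article itself contains no code): the number of possible pairwise relationships grows quadratically with the number of elements, while the number of possible network configurations - each relationship either present or absent - grows exponentially.

```python
from math import comb

def possible_links(n: int) -> int:
    """Number of undirected pairwise relationships between n system elements."""
    return comb(n, 2)  # n * (n - 1) / 2

def possible_networks(n: int) -> int:
    """Each pairwise link can exist or not, so the number of distinct
    effective structures grows as 2 ** links."""
    return 2 ** possible_links(n)

for n in (3, 5, 10, 20):
    print(f"{n} elements: {possible_links(n)} links, "
          f"{possible_networks(n):.2e} possible networks")
```

Even 20 elements already allow 190 relationships and roughly 1.6 × 10^57 possible network configurations - far beyond anything that can be analyzed case by case.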

It is very important to note that a system is more than the sum of its elements [→ emergence]. This is easy to understand using the human example: even if all the chemical elements of the human body are available in the necessary type and quantity, this does not yet result in a human being. The "invisible threads" between the elements are decisive. Humans consist of different subsystems, such as molecules, cells and organs.

Networking and viability

The size of a system depends on the number of system elements and the networking between them. With increasing networking (redundancies), the stability of a system also increases - but not indefinitely (see Figure 2).

Mankind has also developed according to this pattern. The core system is the family, which can be integrated as a subsystem into a clan or community, or may have only loose connections to other systems. "The family" is not necessarily to be equated with our present-day conception and organization of the family, as it has taken different forms over the course of history. Only with mobility and technical means of communication were these boundaries shifted and the clear structures dissolved. They were replaced more and more by a wild network (see, for example, social media). Orientation and observation, but also control and steering, become more difficult.

Figure 2: From unstable individual elements to a stable system. In the event of overstretching, the formation of subsystems is necessary, as otherwise a system collapse threatens [Source: own illustration based on Vester]

Until widespread motorization began, the range of the individual was still quite limited - often only a few kilometers that could be covered on foot or with the help of animals. There were exceptions, of course, such as traveling merchants or explorers, but only to a very limited extent or with long travel times. This has changed massively in the last century and especially in the 21st century, first through the increase in mobility and then through the spread and possibilities of information and communication technology. Today, many people can get in contact and communicate with someone almost anywhere in the world within seconds, even in otherwise very disadvantaged regions. Through this networking - and the accompanying information and stimulus overload - we perceive "disturbances" and risks from far-away regions much more strongly than the dangers that actually threaten us. This can lead to very paradoxical behavior: after the reactor disaster in Fukushima, for example, numerous people in Central Europe bought a Geiger counter in order to be able to assess their "personal" exposure.

With the increasing size of a system, there is also the risk that disturbances can spread more easily, faster and further within the system and beyond its boundaries. The fragility of the system therefore increases. Today, for example, diseases can spread much more easily, more widely and faster than just a few decades ago due to massive travel flows. The financial crisis of 2007, emanating from the USA, triggered far-reaching worldwide subsequent crises, and not only in the financial sector.

The viability (survivability) of a system depends on its size and on the density of its networking with its environment. In evolutionary development, "small is beautiful" has prevailed, since smaller systems can adapt to disturbances more easily. Nature does not limit the interaction between beings, but their size. This automatically limits the range of disturbances.

In addition, systems are generally strengthened by the fact that individual system elements or subsystems can and may fail. Diversity or variety also plays an important role here, as it ensures adaptability and further development [→ Missing or decreasing diversity]. What has proven itself prevails - not through a master plan, but through trial and error [→ Does damage alone make you wise?]. Here, however, the silent evidence is often overlooked: when looking at history, we do not see everything, only the success stories ("history is written by the winners"). These are therefore usually overrated, while things that did not work disappear or are quickly forgotten. In the case of human undesirable developments there are therefore constant repetitions, as history shows.

Too big to fail

"Too big to fail" also contradicts these principles. When many small businesses fail, it hardly attracts any attention. However, if a large construction company or a bank threatens to fail, all levers are set in motion to prevent it [→ Short-term action versus long-term goals]. This makes the overall system more fragile, and the probability of a time-delayed major disruption increases [→ Time-delayed effects]. The same danger threatens current flood protection projects, in which a great deal of effort is made to prevent flooding. If a protective structure breaks, however, the damage is enormous - probably also because the potentially affected people, trusting in the protective structure, prepare much more poorly for such an event, since they have to cope with fewer and fewer disturbances [→ visualization]. One speaks here of a security or vulnerability paradox: the more a system is protected against a certain danger and a specific severity of disturbance, the more vulnerable it becomes to larger or different types of disturbance.

Foresters in American national parks had a similar experience. For a while, every small fire was immediately extinguished or prevented. This led to more and more dead (combustible) material accumulating, so that when a fire did break out, it developed much more quickly into a major, no longer controllable blaze. Small disturbances (burns) strengthen systems and reduce fragility [→ Small cause, big effect]. If they are prevented (by humans), only the time of occurrence is delayed, and the effects accumulate, that is, they worsen. Such observations are not limited to nature; they can be made in many other areas. People often overestimate their ability to manipulate systems or disregard longer time horizons. The agricultural industry, for example, invests a lot of money in genetic research to design pest-resistant plants - with dubious success, as the adaptability of the corn rootworm demonstrates. The dose of pesticide now has to be increased at ever shorter intervals, which in turn triggers negative effects in the rest of the environment [→ feedback]. The corn rootworm adapts evolutionarily and gets stronger and stronger, because only the strongest survive and develop further. The same happens in other areas. The increasing concern about so-called "killer bacteria" is therefore more than just hype; experts assume that climate change in particular favors the development of multi-resistant bacteria.

Diversity

The increasing concentration among seed producers poses a further threat to food security and thus to security of supply. The currently dominant paradigm of growth and efficiency increase means that ever fewer, ever larger corporations and ever fewer varieties remain - which contradicts the diversity that is essential for survival [→ growth paradigm] (see Figure 3).

Figure 3: Consequences of a lack of diversity [source: Taleb et al. 2014]

Such concentrations of power and monopolies can also be observed in many other areas, particularly acutely in IT hardware production, global ship logistics and the pharmaceutical industry: an ever smaller number of ever larger companies remains. In addition, research budgets - for drug research, for example - have been reduced significantly in recent years [→ short-term action versus long-term goals]. Fewer and fewer new antibiotics are being developed, while at the same time the number of antibiotic resistances is increasing; moreover, there are only a handful of production facilities for antibiotics in the world. All of this happens largely unnoticed by the public. There is a lack of perspective and of a feeling for the fragility and vulnerability of our way of life [→ complex systems]. The risk of strategic shocks increases [→ strategic shocks].

Complexity and complex systems

A major reason for the increase in incalculable risks is increasing complexity. The term "complexity" is used frequently and in very different contexts; whether in the technical or the political arena, it seems applicable almost everywhere ("boundary object"). It is often used to describe opaque, difficult-to-grasp, dynamic and therefore hardly predictable and controllable situations or systems, and a certain overstrain and helplessness is associated with it. The term itself is derived from the Latin complexus, which means something like "interwoven" or "entwined". With a little distance, a connection with technical networking quickly becomes apparent.

Networking increases complexity and dynamics in systems, since there is constant feedback [→ feedback]. Complex open systems arise that interact with their environment. The boundary of a complex system cannot be precisely defined, and central control, as with machines (closed systems), is not possible. Control (regulation) is based on simple, decentralized feedback processes and control loops [→ feedback]. Human interventions that do not take these mechanisms into account fail - albeit often with a delay - or lead to unintended results and side effects.

Complex systems have a number of properties that we hardly know from our previous technical solutions (closed systems).

The complexity researcher John Casti coined the term "complexity gap". It describes the difference in complexity between systems of different types (see Figure 4). Complexity gaps tend to even out. If this is not achieved through "controlling" interventions, the system purges itself [→ networking and viability, → s-shaped growth]: the non-adaptable system collapses, and the "thread" breaks abruptly.

Figure 4: Complexity gaps arise between the promises of marketing and the technical / physical possibilities and limits [source: own illustration]

People generally tend to overestimate the elasticity of systems. It certainly exists, but it reaches its limits relatively quickly. Whether in the personal sphere ("burnout"), in the financial market, in technical solutions, in resource consumption or in the power supply system - there is no example of our being able to extend these limits indefinitely [→ networking and viability].

The number and size of such complexity gaps have increased significantly with the growing networking density - for example, between what politics, the market or marketing promises and specifies on the one hand, and what is technically and physically possible, sensible and still manageable on the other. Due to time-delayed effects, we have so far mainly got to know the positive sides of networking [→ time-delayed effects].

The financial crisis of 2007/2008 can once again serve as an example. It turned out that even insiders had lost track of which objects, products or elements were being sold and bought. A similar opacity can be observed in many technical areas today. The authors of the insurance study "Beyond Data Breaches: Global Interconnections of Cyber Risk", for example, conclude:

"The way in which the complexity of interconnected risks is assessed is painfully similar to how financial risks were assessed prior to the 2008 crash ... in the end, it was this very complexity which helped bring the system down."

The sometimes chaotic and non-systemic technical networking of recent years has led to a massive increase in systemic risks in our society and in critical infrastructures, not least due to the growing complexity gaps. Systemic risks are characterized by:

  • a high degree of networking (dynamics, complexity, interactions);
  • the risk of domino effects;
  • a non-linearity in the effects (no simple cause-and-effect chains, as captured, for example, by classic risk management) and
  • a systematic underestimation of trigger events and effects by those responsible.

As a result, the probability of strategic shock events - that is, events that are able to change our coexistence permanently and significantly ("game changers") - has increased massively.

Figure 5: Domino effects can trigger a large number of subsequent events [source: Fotolia.com / © arudolf]

Strategic shocks
 
There are now a number of possible and realistic scenarios (Figure 6) that can be triggered by such systemic risks. Crucially, our critical infrastructure would almost always be affected by such a strategic shock event, since there are corresponding interactions and dependencies [→ complex systems]. Strategic shocks are events that are extremely rare but have enormous effects. Nassim Taleb coined the term "black swans" for them, adding that such events are always easy to explain in retrospect, but are not recognized - or are ignored - beforehand.

Figure 6: Possible strategic shock events that can already be estimated [source: own illustration]

A financial collapse would have a significant impact on the numerous infrastructure construction sites of the energy transition. Conversely, blackouts - supra-regional and prolonged power and infrastructure failures - could lead to far-reaching economic and financial crises. Due to the multi-layered networking, strategic shocks can often spread to other systems (domino effects): to society, to companies, to other areas of infrastructure or to almost everything connected with them. A number of pitfalls favor this spread.

One example is just-in-time / just-in-process logistics, which requires a very high degree of synchronization and is therefore very prone to failure if important links in the chain fail at the same time. In everyday life this is hardly noticeable, as small disturbances are controlled very well. The effects of a possible blackout on this area, however, can hardly be estimated.
The economic pressure to optimize and increase efficiency has meanwhile reduced the robustness of systems, since important redundancies, reserves and buffers as well as personnel ("dead capital") are cut more and more frequently [→ growth paradigm]. As a result, the skills and competencies required to deal with major disruptions decrease.

The developments in the area of critical infrastructure, on which our community depends to a great extent, are particularly explosive. With increasingly complex and opaque technical solutions and increasing networking, we are creating ever greater vulnerabilities without being aware of them or having a plan B. Technical safeguards against disasters often only postpone the critical point at which a system tips over. The forest fire example given above fits here as well.

Another pitfall is that we in our Central European society have experienced very stable and constant conditions over the past few decades. There is therefore hardly any awareness that the entire history of mankind - and, even today, most of the world - was and is characterized by volatility and cyclical developments. In recent years we have dismantled important safety nets in many areas, which in turn makes us more susceptible to major disruptions. Whether it concerns the financial sector, the supply of energy and raw materials, the European electricity supply system, a pandemic or the possible effects of climate change, there is a multitude of potential events that make us vulnerable to strategic shocks. In view of the possible consequences and the society-changing effects, probabilities are almost irrelevant.

The less often an event occurs, the more serious its consequences - and the more difficult an analytical assessment becomes. It is therefore not decisive that someone "predicts" an event, but that this "prediction" has consequences. Accordingly, systems and their fragility must be analyzed, not individual events or individual elements of a system - an approach that is still largely uncommon today.
A well-known example to illustrate the misjudgment of risks is the so-called turkey illusion (Figure 7): a turkey that is fed by its owner every day has no idea what will happen on day X. Based on its positive experience, it has to assume that the probability of something seriously negative happening decreases from day to day. On the day before Thanksgiving, however, a decisive turning point occurs, with correspondingly fatal consequences for the turkey. The turkey illusion also stands for the belief that anything can be calculated, although this is not possible.
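The turkey's growing - and fatally misplaced - confidence can be expressed numerically. A minimal sketch in Python using Laplace's rule of succession (an assumed, illustrative model, not taken from the original article):

```python
def estimated_p_fed_tomorrow(days_fed: int) -> float:
    """Laplace's rule of succession: (successes + 1) / (trials + 2).
    Every uneventful day makes the turkey feel safer."""
    return (days_fed + 1) / (days_fed + 2)

for day in (1, 10, 100, 1000):
    print(f"day {day}: estimated safety {estimated_p_fed_tomorrow(day):.3f}")
# The estimate approaches 1.0 -- and is highest on the day before Thanksgiving.
```

The point is not the specific formula: any purely past-oriented estimate behaves this way when the decisive event has simply never occurred in the observed data.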

It also plays a role that the absence of evidence is confused with evidence of absence. It can be assumed that ostensibly stable systems are more fragile than systems in which disturbances occur more frequently [→ networking and viability]. In addition, it is overlooked that the so-called worst case, when it occurred, was worse than the "worst case" expected until then.

Figure 7: The turkey illusion as a result of linear, past-oriented thinking [source: own illustration]

Exponential developments

Among other things, exponential developments, which are difficult to grasp with linear thinking, play a decisive role in the underestimation of effects (see Figure 8), as the following legend expresses: the inventor of the game of chess was granted a wish. He asked his king for the following, ostensibly very modest, reward: one grain for the first square of the chessboard, two grains for the second, four grains for the third, and for each additional square twice as many as on the previous one. This wish could not be fulfilled: 2^64 − 1 corresponds to around 18 quintillion (1.8 × 10^19) grains of wheat, or around 100 billion truckloads of grain, which could not be covered by all the world's harvests since the beginning of grain cultivation. In networked systems, the possibility of interactions increases exponentially, which means that controllability drops drastically [→ Small cause, big effect].
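The legend is easy to verify. A short calculation in Python (the grain weight of roughly 25 mg is an assumed rule-of-thumb value for wheat):

```python
# Wheat-and-chessboard legend: the grains double on each of the 64 squares.
total_grains = sum(2 ** square for square in range(64))  # = 2**64 - 1
print(f"{total_grains:,} grains")  # 18,446,744,073,709,551,615

grain_weight_g = 0.025                   # assumed ~25 mg per wheat grain
tonnes = total_grains * grain_weight_g / 1e6
print(f"~{tonnes:.2e} tonnes of wheat")  # ~4.6e+11 tonnes
```

At roughly 4.6 × 10^11 tonnes, this is several hundred times a modern annual world wheat harvest - a result that intuition, trained on linear growth, consistently underestimates.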

Dynamics

Networking also increases the dynamics of a system, that is, the change of all system states over time. A dynamic system never stands still; it lives. Analyses are therefore always only a snapshot at time X, which usually has a negative effect when planning or carrying out system interventions, since the system is constantly changing.

Figure 8: Exponential developments, here using the example of 2^x, come into play in information technology, for example [source: own illustration]

Emergence

A further role is played by the increasing emergence in complex systems: the spontaneous development of new system properties or structures through the interaction of the system elements and their feedback. The properties of the system elements do not allow any conclusions to be drawn about the emergent properties of the system. Hydrogen, for example, is flammable and oxygen feeds combustion; combined in the water molecule, however, they can be used to extinguish a fire. In addition, there is spontaneous self-organization, so controllability as we know it from linear systems ("machines") is no longer possible. Complex problems can therefore not be broken down into sub-problems that are analyzed separately and then put back together to form an overall solution.

Feedback

In complex systems there are positive and negative feedbacks. Positive feedback is self-reinforcing (more leads to more); although it is important for starting or braking processes, it is harmful in the long run. Negative feedback, on the other hand, has a stabilizing effect (more leads to less). Both types are necessary for the self-regulation of complex systems.
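A minimal numerical sketch of the two feedback types (Python; the gain values are arbitrary illustrative assumptions):

```python
def simulate(gain: float, steps: int = 8, x: float = 1.0) -> list[float]:
    """Feed the state back into itself each step.
    gain > 1: self-reinforcing (positive) feedback -- runaway growth.
    gain < 1: balancing (negative) feedback -- decay toward equilibrium."""
    trajectory = [round(x, 3)]
    for _ in range(steps):
        x *= gain
        trajectory.append(round(x, 3))
    return trajectory

print(simulate(1.5))  # positive: 1, 1.5, 2.25, ... grows without bound
print(simulate(0.5))  # negative: 1, 0.5, 0.25, ... settles toward zero
```

Real systems always combine both loop types; which one dominates at a given moment decides between stability, oscillation and collapse.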

In the financial market in particular, positive feedback leads to the formation of bubbles and crashes: the price rises, and with it the interest in the securities, whereupon the price rises further - but not infinitely [→ networking and viability]. And this point in time can never be predicted [→ complex systems]. Even though mathematical models are used again and again to predict negative developments, all attempts have so far been unsuccessful. People generally tend to attribute success to their own ability and failure to external influences and bad luck; that it could have anything to do with chance and luck is usually ruled out. Those who know these mechanisms can use them - but that has nothing to do with an alleged predictability.

Figure 9: Control versus regulation [source: own illustration]

Correlation versus causality

Another mistake arises when correlations are equated with causalities [→ short-term action versus long-term goals]. A correlation describes a relationship between different events, states or functions; there does not have to be a causal connection. Causality, on the other hand, denotes a law-like, reproducible connection between cause and effect: a causal event has a fixed temporal direction that always starts from the cause, which is then followed by the effect. Equating the two therefore leads to false conclusions in complex systems with ongoing feedback [→ complex systems].

So there can be a correlation between the decline in the number of births and the decline in storks in a region - but certainly no causality.
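How easily a common trend produces such a spurious correlation can be shown in a few lines of Python (standard library only; the numbers are invented for illustration):

```python
import random
from statistics import correlation  # available since Python 3.10

random.seed(1)
years = range(30)
# Both series follow the same downward time trend plus independent noise --
# there is no causal link between them.
births = [1000 - 12 * t + random.gauss(0, 25) for t in years]
storks = [200 - 3 * t + random.gauss(0, 10) for t in years]

print(round(correlation(births, storks), 2))  # close to +1.0
```

The hidden third variable here is simply time; in the real-world example it may be urbanization, which reduces both birth rates and stork habitats.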

Short-term action versus long-term goals

Our economic system and the pursuit of the growth paradigm mean that in many areas planning and action are only very short-term and short-sighted [→ growth paradigm]. A view increasingly driven by key figures leads to actionism: the horizon is limited to short-term successes, and long-term viability is left out. Actionism now takes place almost everywhere because it has become socially acceptable. Instead of looking for the cause of a problem and starting there, often only symptoms are treated, as this can be done quickly and delivers a quick ("marketable") result. Fundamental solutions, on the other hand, often bring short-term disadvantages and only deliver positive benefits or added value in the long term [→ evolutionary imprints].

Whether it is a constitutional or administrative reform that has not taken place, an educational reform, a healthcare reform or a pension reform - the list could go on for a long time. Flood protection dams and avalanche barriers, too, are mainly a treatment of symptoms. Here as well, the complexity gaps increase.

Growth paradigm

Our apparently irrefutable growth paradigm, to which virtually everything else has to subordinate itself, plays an essential role in many undesirable developments. Man has so far always succeeded in pushing boundaries: whether in population growth or in the discovery of resources, expectations have repeatedly been clearly exceeded. Another way to generate short-term growth, or to deliver positive numbers, is to reduce reserves and redundancies or to stretch maintenance intervals. Whether this is sustainable will only become apparent in the future. Until then, the previous knowledge applies: a system that absolutely needs permanent growth cannot exist sustainably and therefore brings about its own downfall.

In nature there is no unlimited growth - only cyclical or S-shaped growth - since unlimited growth is self-destructive. Tumors have so far been the unsuccessful counter-attempt.

S-shaped growth

S-shaped growth begins slowly, then increases almost exponentially for a period and finally flattens out again (see Figure 10). Whether through supply and demand, scarcity of resources or predator-prey relationships, the decisive factor is always self-regulating control loops [→ feedback]. Further growth is only possible through a new cycle (e.g. through a new technology), which must be initiated in good time. The artificial prolongation of exponential growth has always led to a system collapse.
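The described pattern is the classic logistic curve. A compact sketch in Python (growth rate and capacity are arbitrary illustrative values):

```python
def logistic_growth(x0: float, rate: float, capacity: float, steps: int) -> list[float]:
    """Discrete logistic growth: the (1 - x / capacity) term is the
    self-regulating feedback loop that flattens the initially
    near-exponential rise."""
    values = [x0]
    for _ in range(steps):
        x = values[-1]
        values.append(x + rate * x * (1 - x / capacity))
    return values

curve = logistic_growth(x0=1.0, rate=0.5, capacity=100.0, steps=25)
print([round(v, 1) for v in curve])  # slow start, steep middle, saturation near 100
```

Removing the capacity term turns the same equation into pure exponential growth - exactly the artificial prolongation that ends in collapse.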

Figure 10: S-shaped growth [source: Vester 2011]

People tend to ignore this mechanism. That can go well for a while, since system boundaries are flexible and systems are generally elastic [→ systems]. However, this success leads to an overestimation of one's own abilities, mostly with negative long-term consequences [→ time-delayed effects]. Long-term market leaders such as Kodak in analog photography or Nokia in mobile phones underestimated such developments (digital photography and smartphones): Kodak is history, and Nokia plays only a shadow of its earlier role. Intensive land use brings more growth and output in the short term, but leads to an overuse of resources and thus to a decline or complete destruction of usability.

Fundamental social developments, however, usually only arise when the old has perished ("creative destruction"). Innovation also generally means that previous solutions become obsolete or can be run with far fewer resources. This is often overlooked and then leads to unpleasant surprises, even though the principles are always the same: either a timely new beginning and transition succeeds, or there is an abrupt end.

Time-delayed effects

Time-delayed effects also play a role. Things that are far away in time are difficult for us to assess [→ evolutionary imprints]. Cardiovascular diseases, obesity and many other diseases of affluence are based on years, if not decades, of misconduct. But it is not only in the personal sphere that we have difficulties with this: climate change, too, unfolds over many decades, at first slowly and then faster and faster. There is an exponential increase in effects, which are irreversible [→ exponential developments, → s-shaped growth].

Another example comes from the world of information technology, where there has been an exponential increase in the quality and quantity of cyber incidents over the past few years. And the end does not yet seem to have been reached - on the contrary. What might still lie ahead would overshadow everything that has gone before: an infrastructural system collapse is not a utopia.

The European power supply system, too, is increasingly being pushed to its limits, largely unnoticed by the public. The unsystemic interventions of the energy transition in Germany contribute significantly to this destabilization [→ short-term action versus long-term goals, → complex systems]. A major European disruption ("blackout") due to a system failure now seems to be only a matter of time.

Small cause, big effect

In highly networked systems, small causes can have devastating effects. The aftermath of the American real estate crisis of 2007, for example, can still be felt. As long as the causes have not been remedied, it must be assumed that the subsequent crises will be even more severe [→ networking and viability]. The interactions of the highly networked financial system have been, and still are, frequently underestimated.

All major blackouts of recent years (outside of Europe) were triggered by the accumulation of several small events at the wrong time. Europe has so far been spared, but unfortunately this is no guarantee for the future [→ turkey illusion, → networking and viability].

Avalanches are also a very vivid example: they are triggered by small disturbances, and self-reinforcing feedback creates the devastating and at the same time irreversible effect.

Of course there are also positive examples of "small cause, big effect", such as the accidental discovery of penicillin, which had a massive impact on mortality. The Pareto principle also belongs here: 80 percent of the results are achieved with 20 percent of the total effort, while the remaining 20 percent of the results require most of the effort (80 percent).

Evolutionary imprints

Evolutionary patterns always play a role in human action. For example, we tend to prefer short-term success over long-term added value. In psychology, the term "delayed gratification" describes the opposite: an immediate (effortless) reward is foregone in favor of a larger reward in the future, which can only be attained by waiting or through prior effort. This phenomenon can be observed today in many areas, for example in political decisions [→ short-term action versus long-term goals]. What made sense evolutionarily is often a long-term disadvantage today.

Another characteristic is that the death of a large group at once causes much more dismay than the same number of people dying in a distributed manner. This, too, is historically understandable: the death of a larger part of a clan threatened the survivability of the entire clan. That no longer applies today, yet we still react according to this pattern. For example, around 1,500 more people died on American roads in 2002 than in previous years, because many people feared flying after 9/11 and preferred to travel by car - a fatal mistake. Since then, a great deal has also been invested in (aviation) security. At the same time, however, we have allowed our society to become many times more vulnerable in the infrastructural area.

As a result of climate change, longer periods of heat are to be expected in our part of the world. Previous evaluations have shown a massive increase in mortality during such heat waves: in the past 50 years, Europe has suffered more deaths from heat waves than from all other catastrophic events combined. At the same time, there is hardly any public awareness of this - probably also because no single event causes mass dismay; it is a "creeping" process, in which a death cannot always be clearly attributed to one cause [→ simple cause-effect thinking].

Average values, on the other hand, lead to false expectations, for example regarding the increase in global warming as a result of climate change. Much more important are the variances to be expected (e.g. extreme weather events), which have an impact particularly at the local level and also determine the necessary coping capacities. Average values generally give a wrong impression and lead to wrong conclusions.
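A small illustration of why averages mislead (Python; the temperature figures are invented): two climates with the same mean can differ enormously in how often they exceed a critical heat threshold.

```python
from statistics import NormalDist

mean_temp = 20.0   # same average temperature in both climates (deg C)
threshold = 35.0   # illustrative heat-wave threshold

for stdev in (4.0, 8.0):  # low vs. high variability
    p_heat = 1 - NormalDist(mean_temp, stdev).cdf(threshold)
    print(f"stdev {stdev}: P(T > {threshold}) = {p_heat:.4%}")
# stdev 4.0: ~0.0088%   stdev 8.0: ~3.04%  -- same mean, ~300x more heat days
```

The expected damage is driven almost entirely by the variance, which is exactly why reporting only averages creates false security.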

A general problem is also that we concentrate too much on what we already know and too little on prevention, as the author Nassim Taleb put it: "We tend not to learn the general but rather the precise. We do not learn rules, only facts. Everyone knows that we need more prevention than treatment, but hardly anyone rewards preventive measures. We glorify those whose names have gone down in the history books at the expense of those about whom our books are silent."

Insufficient consideration of the system

What all scenarios have in common is that these developments can be traced back to a lack of systemic consideration. In many areas we still act as if there were no networking and as if the individual areas could be looked at in isolation ("silo thinking").

This error was admitted around 2013 within the framework of the European Program for Critical Infrastructure Protection (EPCIP): "The review process of the current European Program for Critical Infrastructure Protection (EPCIP), conducted in close cooperation with the Member States and other stakeholders, revealed that there has not been enough consideration of the links between critical infrastructures in different sectors, nor indeed across national boundaries. The studies indicate that risk assessment methodologies for critical infrastructure protection follow either: 1) a sectoral approach, where each sector is treated separately with its own risk methodologies and risk ranking; or 2) a systems approach, where critical infrastructures are treated as an interconnected network. Most work has been sectoral, but these methodologies show their limits when cross-sectoral issues need to be addressed, so a systems approach will be encouraged by the Commission from now on."

But rethinking and doing things differently take time - time that we do not have in many areas, as developments with potentially negative effects advance unchecked. Still, we have to start.

Systemic thinking

This closes the circle to linear thinking. Albert Einstein is often quoted with the sentence: "Problems can never be solved with the same way of thinking that caused them." Current and future challenges - such as climate change, technical disasters ("man-made disasters"), financial crises, food crises, antibiotic resistance, terrorism, famines, natural disasters, pandemics or scarcity of resources - therefore cannot be solved with purely past- and experience-based thinking.

The "control" of networked systems also requires networked thinking and acting, that is, systemic thinking. We have to say goodbye to the idea of controllability as it is possible with machines; that does not work with complex, open systems. Only if we accept this can we learn to deal with the new challenges and the associated risks.

Figure 11: A paradigm shift in the safety assessment is necessary [source: own illustration]

Systemic thinking helps to recognize the essentials of a system: its structure of effects and interactions. It is no longer a matter of concentrating on individual parts, but of capturing the entire pattern (see Figure 11). In addition, attention must be paid to developments rather than to states, because states change frequently in dynamic systems.

Systemic thinking and adaptability also mean recognizing which knowledge is no longer relevant and should therefore be forgotten. Especially in times of major upheavals and system changes, as we are currently experiencing in the transformation to a network society, unlearning is almost more important than learning, because only this frees up the resources needed to take new paths. See, once more, Kodak and Nokia as warning examples.
The term resilience means less "resistance", as it is often translated, and more the ability to learn and adapt, which in turn requires openness to new things. What we currently experience far more often, however, is the attempt to preserve the old and well-tried with all means and possibilities. But if the framework conditions no longer fit, shipwreck is inevitable. If we want to keep up with developments, we have to dare to think things completely new and radically different.

Efficiency - doing things right - as we practice it today contradicts robustness and resilience, since supposedly unnecessary redundancies and reserves are undesirable [→ strategic shocks]. We should therefore rather ask the question of effectiveness: are we doing the right things and pursuing the right goals [→ growth paradigm]?

Both ... and

Hence, "both-and thinking" is required. Technical networking has brought mankind many positive achievements. Unfortunately, we tend to overestimate this side and to ignore the possible downsides until we run into them. Our occidental "either-or thinking" is binary: good or bad, warm or cold, dry or wet, healthy or sick, poor or rich, and so on - the emphasis is on "or". This often gets in our way and restricts a great deal of room for maneuver. With "both-and thinking", reality can be represented more faithfully: it is not just black and white, there are many shades of gray in between, although the poles play an important role and are mutually dependent. It should therefore go without saying that every sunny side also has a dark side.

This also makes it easier to deal with the contradictions and ambivalences that always exist in everyday life. They often cannot be resolved, and attempting to do so usually only leads to pseudo-solutions.

System design

Nature, too, makes no attempt to eliminate disturbances; rather, they are integrated into the process (see Figure 12). In addition to reducing energy and resource requirements and to decentralization and decentralized control through feedback loops, error-friendliness / fault tolerance is an essential aspect of designing a robust, viable system. This can significantly reduce dependencies and increase the robustness ("resilience") of the system. No single fault may be allowed to affect the entire system.

Figure 12: Nature builds disturbances into the process and does not prevent them

What is needed here are cellular structures and control loops, such as those already used in automation technology. Many current concepts, such as the massive increase in centralized networking (keyword: "smart"), contradict this approach and lead to an incalculable vulnerability.

Concentrating (power) on fewer and fewer large actors [→ Too big to fail] also increases social vulnerability and the potential for abuse. A Facebook replicated a million times over would not be quite as efficient, but it would drastically reduce the potential for abuse and surveillance while hardly limiting the benefits, since most circles of friends are located in a very regional context. Nor would interaction with other platforms be prevented.

Nevertheless, we will continue to need linear, logical, rational, analytical and specialist thinking for suitable processes and technologies. But we also need more and more people who can see the whole system and recognize possible undesirable developments. The previously very successful way of thinking is only suitable for solving problems in complicated systems ("machines") [→ Cynefin model]. If it is used to control complex systems, it leads to undesirable or often even painful side effects and consequential effects.

The larger the system, the more difficult this is to implement. Therefore, the aspects listed in this article (and they are not all of them) must be incorporated into the system design from the start.

Of course, it would be utopian to assume that this could be implemented simply and by top-down decree; that would even contradict the nature of complex systems. Rather, it is a matter of spreading this knowledge as widely as possible so that it can flow into as many areas as possible, and of mobilizing the self-organization ability of complex systems such as our society. This also includes taking on greater personal responsibility again - a skill we will urgently need in turbulent times. We know from nature that evolutionary changes develop from many small pieces of the puzzle, which come together at the right, and unpredictable, point in time without central control. There is no big plan. All attempts at central control have so far failed, sometimes brutally in the truest sense of the word.

Fundamental changes are usually triggered by major ruptures or crises. It is no coincidence that crises offer opportunities to leave the beaten path and take new ways. In the past, such crises were often associated with wars. Today we have the knowledge and skills to advance evolutionary development without painful destruction.

We can already prepare for turbulent times today by designing as many pieces of the puzzle as possible - starting, for example, with rethinking the system design of our critical infrastructure, up to increasing the resilience of society as a whole by once again seeing the population as an active system element and strengthening its capacity for self-help and self-organization. Many small activities - the demand for regional products and value creation, decentralized energy supply from renewable sources, urban gardening ("the garden in the city") or complementary currencies - are signs that changes have already begun bottom-up. Bottom-up means that people take action on their own initiative and start a process of change that follows no master plan and can therefore only be controlled to a limited extent. The inevitable energy transition in particular is leading to a massive shift in power, and it goes without saying that such developments will not go smoothly. The path currently being taken is particularly dangerous, as the fundamentally decentralized system of renewable energy supply is being integrated into the previously centralized system of energy supply. The energy transition means much more than just the decentralized generation of electricity, as it is currently predominantly pursued; it requires a cultural change in order to minimize the existing complexity gap. Therefore a plan B - what do we do if the system can no longer tolerate the interventions and a temporary system collapse occurs? - is indispensable in this area in particular. At present, however, no such plan exists, which in turn can be traced back to our linear thinking and to the turkey illusion (see Figure 7).

Cognitive Limits

Studies have shown that our brain can grasp the interactions of at most three to four interlinked factors before reaching its cognitive limits. This is also related to the exponentially increasing number of possible interactions. At the same time, we continue to try to control the complexity that results from networking with linear ways of thinking and linear solution approaches.

Visualization

One way to learn to deal better with these cognitive limits is visualization, or cause-and-effect modeling. This is primarily a communication tool that makes the connections and possible interactions (causal relationships) more clearly visible to all those involved. A model becomes better the more different, and also contradictory, perspectives are included. One must be aware that a model does not represent reality but only a simplified image of it - similar to a map, which is only a simplified model of the terrain and is used for orientation. In addition, complete predictability is never possible in complex situations, only an approximation.

In addition, people generally look for patterns or past experiences that should help them cope with the current situation. This works very well in familiar situations, but it reaches its limits whenever a new type of situation arises, such as the effects of increasing complexity.

When modeling, too, one will fall back on gut feeling in the case of unknowns or uncertainties. Thanks to the transparent presentation and discussion, however, weak points and strengths can be recognized better than through the pure "feeling" of a person or group. In addition, the model can be further developed and adapted at any time as new findings emerge. Overall, it is less about ready-made explanations and more about new ways of arriving at meaningful explanations.

Such a procedure is supported, for example, by the iModeler software, which was specially developed to promote networked thinking and is easy and intuitive to use.

Figure 13 shows such an impact diagram, depicting the relationship between technical security measures and decreasing information (awareness) about possible risks. The connections marked in red (−: more leads to less; +: more leads to more) form two self-reinforcing loops: technical safety measures → information about the (threat) situation → affected people and goods → technical safety measures, and technical safety measures → small disasters → affected people and goods → technical safety measures. The loop technical safety measures → small disasters → technical safety measures, by contrast, is compensatory (balancing).
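Such loop polarities can also be checked programmatically. A sketch in Python (the factor names and signs are my reading of the loops described above): a feedback loop is self-reinforcing if it contains an even number of negative links, otherwise it is balancing.

```python
from math import prod

# Links of the impact diagram: +1 = "more leads to more", -1 = "more leads to less".
edges = {
    ("safety_measures", "risk_information"): -1,  # more protection -> less awareness
    ("risk_information", "exposed_assets"): -1,   # less awareness -> more exposure
    ("exposed_assets", "safety_measures"): +1,    # more exposure -> more protection
    ("safety_measures", "small_disasters"): -1,   # protection suppresses small events
    ("small_disasters", "exposed_assets"): -1,    # fewer small events -> more exposure
    ("small_disasters", "safety_measures"): +1,   # small events trigger more protection
}

def loop_polarity(loop: list[str]) -> str:
    """Multiply the link signs around the loop: a positive product
    (an even number of negative links) means self-reinforcing."""
    signs = [edges[(a, b)] for a, b in zip(loop, loop[1:] + loop[:1])]
    return "reinforcing" if prod(signs) > 0 else "balancing"

print(loop_polarity(["safety_measures", "risk_information", "exposed_assets"]))  # reinforcing
print(loop_polarity(["safety_measures", "small_disasters", "exposed_assets"]))   # reinforcing
print(loop_polarity(["safety_measures", "small_disasters"]))                     # balancing
```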

Figure 13: Example of an impact diagram [Source: Own illustration based on Ossimitz et al. 2006]

Here, both the genuinely reduced risk of small disasters (such as floods) and the apparent security (the supposed absence of danger due to a lack of information) have a counterproductive effect. The perception of risk decreases, and the willingness to take risks - for example, to build in potential flood areas - increases. If a major event does occur, the damage rises disproportionately, also because people are not prepared for it. This "vicious circle" can only be counteracted by being aware of it and by repeatedly practicing the necessary behavior.

A further visualization was created with the factor "system security" and the factors presented in this article (see Figure 14).

Does damage alone make you wise?

Throughout the previous history of mankind, it was quite common to become wiser through "trial and error". This was possible insofar as the associated damage could be limited locally or regionally, or harm to individuals was accepted. In an increasingly networked world and with new technologies (such as genetic engineering or nanotechnology), damage can spread much more quickly and widely. In addition, human losses are no longer morally justifiable in a society fixated on the highest possible level of security. The "trial and error" model therefore appears to have only limited future potential under today's framework conditions.

Figure 14 - Visualization of the system security factor [source: own illustration]

Communication

Communication is therefore a key factor in dealing with complex situations. Only the merging of different perspectives and competencies enables a comprehensive picture to be formed. This communication must take place at eye level, where traditional hierarchical structures and linear thinking can easily become a stumbling block. Such flat and agile structures are also what make startups successful. But they cannot be ordered; they have to grow and become a lived culture. Here, too, it is often about a both-and, not about the one universal solution that fits every situation.

Cynefin model

Finally, we would like to refer to a good and simple way of differentiating between complicated and complex systems: the Cynefin model. In short, everything that is alive is complex; everything that is dead is complicated and can be analyzed.

Figure 15 - Cynefin model [source: www.wandelweb.de]

Figure 16 - Cynefin model [source: www.wandelweb.de]

And that brings us back to the beginning of the article. The Na'vi on Pandora embraced the complexity of their natural habitat, acted in and with it, and knew how to adapt to changes - not linearly, but flexibly, thanks also to their pronounced capacity for self-help and self-organization. We can learn from this. Let's do it.

List of sources and further references:

  • Beck, Ulrich: World Risk Society: In Search of Lost Security. Frankfurt am Main: Suhrkamp, 2008
  • Casti, John: The Sudden Collapse of Everything: How Extreme Events Can Destroy Our Future. Munich: Piper Verlag GmbH, 2012
  • Dörner, Dietrich: The Logic of Failure: Strategic Thinking in Complex Situations. Reinbek near Hamburg: Rowohlt Verlag, 2011 (10th ed.)
  • Dueck, Gunter: Schwarmdumm: We Are Only This Stupid Together. Frankfurt am Main: Campus, 2015
  • Dueck, Gunter: Understanding in the Tower of Babel: On Multi-Channel Communication and Proactive Listening. Frankfurt/New York: Campus Verlag, 2013
  • Dueck, Gunter: Break Up! Why We Have to Become a Society of Excellence. Frankfurt am Main: Eichborn, 2010
  • German Bundestag (Ed.): Information from the Federal Government: Report on Risk Analysis in Civil Protection 2014. Under URL: dip21.bundestag.de/dip21/btd/18/036/1803682.pdf [14.09.17]
  • European Commission: Commission Staff Working Document on a new approach to the European Program for Critical Infrastructure Protection. Brussels, 2013. Under URL: ec.europa.eu/energy/infrastructure/doc/critical/20130828_epcip_commission_staff_working_document.pdf [14.09.17]
  • Frey, Ulrich / Frey, Johannes: Pitfalls: The Most Common Mistakes in Everyday Life and Science. Munich: C.H. Beck, 2011 (3rd ed.)
  • Giebel, Daniela: Integrated Security Communication: On the Development of Uncertainty Management Skills Through and in Security Communication. Münster: LIT Verlag, 2012
  • Gigerenzer, Gerd: Risk: How to Make the Right Decisions. Munich: C. Bertelsmann, 2013
  • Halford, Graeme S. / Baker, Rosemary / McCredden, Julie E. / Bain, John D.: How Many Variables Can Humans Process? Under URL: pss.sagepub.com/content/16/1/70.abstract [14.09.17]
  • Grüter, Thomas: Offline!: The Inevitable End of the Internet and the Demise of the Information Society. Heidelberg: Springer-Verlag, 2013
  • Klingholz, Reiner: Slaves of Growth - The Story of a Liberation. Frankfurt: Campus Verlag, 2014
  • Krizanits, Joana: Introduction to the Methods of Systemic Organizational Consulting. Heidelberg: Carl-Auer Verlag, 2013
  • Kruse, Peter: next practice: Successful Management of Instability. Offenbach: Gabal Verlag GmbH, 2011 (6th ed.)
  • Langner, Ralph: Robust Control System Networks: How to Achieve Reliable Control After Stuxnet. New York: Momentum Press, 2012
  • Malik, Fredmund: Strategy: Navigating the Complexity of the New World. Frankfurt am Main: Campus Verlag GmbH, 2011
  • Neumann, Kai: KNOW-WHY: Success Through Understanding. Norderstedt: Books on Demand, 2013 (2nd ed.)
  • Ossimitz, Günther / Lapp, Christian: Systems: Thinking and Acting. The Metanoia Principle: An Introduction to Systemic Thinking and Action. Berlin: Franzbecker, 2006
  • Renn, Ortwin: The Risk Paradox: Why We Fear the Wrong Things. Frankfurt am Main: Fischer Verlag, 2014
  • Romeike, Frank / Spitzner, Jan: From Scenario Analysis to Wargaming - Business Simulations in Practice. Weinheim: Wiley Verlag, 2013
  • Romeike, Frank / Hager, Peter: Success Factor Risk Management 3.0: Lessons Learned, Methods, Checklists and Implementation. 3rd completely revised edition. Wiesbaden: Springer Verlag, 2013
  • Saurugg, Herbert: The Network Society and Crisis Management 2.0. Vienna-Budapest, 2012a. Under URL: www.saurugg.net/wp/wp-content/uploads/2014/10/die_netzwerkgesellschaft_und_krisenmanagement_2.0.pdf [14.09.17]
  • Saurugg, Herbert: Blackout - A National Challenge Even Before the Crisis. 2012b. Under URL: www.saurugg.net/wp/wp-content/uploads/2014/10/Blackout-Eine-nationale-Herausgabe-bereits-vor-der-Krise.pdf [14.09.17]
  • Scheer, Hermann: 100% Now: The Energy-Ethical Imperative: How to Implement the Complete Switch to Renewable Energies. Munich: Verlag Antje Kunstmann, 2012
  • Taleb, Nassim Nicholas / Read, Rupert / Douady, Raphael / Norman, Joseph / Bar-Yam, Yaneer: The Precautionary Principle (with Application to the Genetic Modification of Organisms). Under URL: arxiv.org/abs/1410.5787 [14.09.17]
  • Taleb, Nassim Nicholas: Antifragility: Instructions for a World We Don't Understand. Munich: Albrecht Knaus Verlag, 2013a
  • Taleb, Nassim Nicholas: The Black Swan: The Power of Highly Unlikely Events. Munich: dtv, 2013b (5th ed.)
  • Taleb, Nassim Nicholas: The Black Swan: Consequences of the Crisis. Munich: dtv, 2012
  • Vester, Frederic: The Art of Networked Thinking: Ideas and Tools for a New Way of Dealing with Complexity. A Report to the Club of Rome. Munich: Deutscher Taschenbuch Verlag, 2011 (8th ed.)
  • Völkl, Kurt / Wallner, Heinz Peter: The Inner Game: How Decision and Change Succeed Playfully. Göttingen: Business Village GmbH, 2013
  • WHO: WHO's first global report on antibiotic resistance reveals serious, worldwide threat to public health. Under URL: www.who.int/mediacentre/news/releases/2014/amr-report/en/ [14.09.17]
  • Zurich Insurance Company Ltd and Atlantic Council of the United States: Beyond Data Breaches: Global Interconnections of Cyber Risk. 2014. Under URL: www.atlanticcouncil.org/images/publications/Zurich_Cyber_Risk_April_2014.pdf [14.09.17]


Authors:

Herbert Saurugg is a recognized expert in preparing for the failure of vital infrastructures.

He was a career officer in the Austrian Armed Forces for 15 years, most recently in the area of ICT / cyber security. Since 2012 he has been dealing with the possible consequences of increasing networking and, in particular, with the European power supply system and a Europe-wide power and infrastructure failure ("blackout").

Frank Romeike is the founder, managing director and owner of the competence center RiskNET GmbH - The Risk Management Network. Internationally, he is one of the most renowned and leading experts in risk and opportunity management. In his professional past he was Chief Risk Officer at IBM Central Europe, where he was involved, among other things, in the introduction of IBM's global risk management process and led several international projects. He completed, inter alia, an economics degree (with a focus on actuarial mathematics) in Cologne and Norwich/UK, then studied political science, psychology and philosophy, and holds an executive master's degree in risk and compliance management.

 

[Image source: © ustas - Fotolia.com]