Our global networks have generated many benefits and new opportunities. However, they have also established highways for failure propagation, which can ultimately result in human-made disasters. For example, the rapid spread of emerging epidemics today is largely a result of global air traffic, with serious impacts on global health, social welfare, and economic systems. Helbing's publication illustrates how cascade effects and complex dynamics amplify the vulnerability of networked systems. For example, just a few long-distance connections can drastically reduce our ability to mitigate the threats posed by global pandemics.
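The claim that a handful of long-distance connections changes the picture so strongly is the classic "small-world" effect: a few shortcuts collapse the distances a contagion has to travel. A minimal sketch of that effect, using only the standard ring-lattice toy model (the function names and parameter values are illustrative assumptions, not taken from Helbing's paper):

```python
import random
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all node pairs, computed via BFS."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours on each side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

random.seed(0)
n = 200
g = ring_lattice(n, 2)
before = avg_path_length(g)

# Add just 10 random long-distance "air routes".
for _ in range(10):
    a, b = random.sample(range(n), 2)
    g[a].add(b)
    g[b].add(a)
after = avg_path_length(g)

print(f"average path length: {before:.1f} -> {after:.1f}")
```

Even ten shortcuts among two hundred nodes sharply shrink the average distance between nodes, which is why a few intercontinental flight routes can undo containment strategies that assume purely local spreading.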
Initially beneficial trends, such as globalization, increasing network densities, higher complexity, and an acceleration of institutional decision processes may ultimately push human-made or human-influenced systems towards systemic instability, Helbing finds. Systemic instability refers to a system that will get out of control sooner or later, even if everybody involved is highly skilled, highly motivated, and behaving properly. Crowd disasters are shocking examples illustrating that many deaths may occur even when everybody tries hard not to hurt anyone.
Networking system components that are well-behaved in isolation may create counter-intuitive emergent system behaviors, which are not well-behaved at all. For example, cooperative behavior might unexpectedly break down as the connectivity of interaction partners grows. "Applying this to the global network of banks, this might actually have caused the financial meltdown in 2008," believes Helbing.
Globally networked risks are difficult to identify, map, and understand, since there are often no evident, unique cause-effect relationships. Failure rates may change depending on the random path taken by the system, with the consequence that risks increase as cascade failures progress, thereby decreasing the system's capacity to recover. "In certain cases, cascade effects might reach any size, and the damage might be practically unbounded," says Helbing. "This is quite disturbing and hard to imagine." All of these features make strongly coupled, complex systems difficult to predict and control, such that our attempts to manage them go astray.
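The cascade dynamics described here can be caricatured with a simple threshold-failure model in the spirit of Watts's cascade model (the graph construction, thresholds, and sizes below are illustrative assumptions, not taken from Helbing's analysis): each node fails once a large enough fraction of its neighbours has failed, and a modest change in that threshold separates contained incidents from near-total collapse.

```python
import random

def random_graph(n, k, rng):
    """Each node wires itself to at least k random partners (undirected)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        while len(adj[i]) < k:
            j = rng.randrange(n)
            if j != i:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def cascade_size(adj, threshold, rng):
    """Fail one random node, then let failures spread: a node fails once
    the failed fraction of its neighbours reaches `threshold`."""
    failed = {rng.randrange(len(adj))}
    spreading = True
    while spreading:
        spreading = False
        for node in set(adj) - failed:
            if len(adj[node] & failed) / len(adj[node]) >= threshold:
                failed.add(node)
                spreading = True
    return len(failed)

rng = random.Random(42)
graph = random_graph(200, 4, rng)
robust_max = max(cascade_size(graph, 0.5, rng) for _ in range(20))
fragile_max = max(cascade_size(graph, 0.1, rng) for _ in range(20))
print("largest cascade at threshold 0.5:", robust_max)   # stays local
print("largest cascade at threshold 0.1:", fragile_max)  # may engulf the network
```

With a high failure threshold the initial failure stays isolated; with a low one, the same graph and the same single trigger can take down essentially every node, illustrating why cause and effect in such systems are so hard to trace back to any one component.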
"Take the financial system," says Helbing. "The financial crisis took regulators by surprise." But back in 2003, the legendary investor Warren Buffett warned of mega-catastrophic risks created by large-scale investments in financial derivatives. It took five years until the "investment time bomb" exploded, causing losses of trillions of dollars to our economy. "The financial architecture is not properly designed," concludes Helbing. "The system lacks breaking points, as we have them in our electrical system." This allows local problems to spread globally, thereby reaching catastrophic dimensions.
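The "breaking points" Helbing finds missing in finance work like the circuit breakers in an electrical grid, and software engineers use the same idea to keep one failing service from dragging down everything that depends on it. A toy sketch of that pattern (the class, its parameters, and the demo service are hypothetical, not from the article):

```python
import time

class CircuitBreaker:
    """Toy breaking point: after `max_failures` consecutive errors the
    breaker opens and rejects calls for `cooldown` seconds, so a failing
    dependency is cut off instead of propagating its failure upstream."""

    def __init__(self, max_failures=3, cooldown=30.0):
        self.max_failures = max_failures
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None   # cooldown over: give the dependency a chance
            self.failures = 0
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result

breaker = CircuitBreaker(max_failures=2, cooldown=60.0)

def unreliable_service():
    raise IOError("downstream outage")

outcomes = []
for _ in range(4):
    try:
        breaker.call(unreliable_service)
    except IOError:
        outcomes.append("real failure")
    except RuntimeError:
        outcomes.append("failed fast")
print(outcomes)
```

After two real failures the breaker opens and later calls fail fast without ever touching the broken dependency; the local problem is contained rather than allowed to cascade, which is exactly the property Helbing argues the financial architecture lacks.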