Understanding cyclic vulnerability to reduce the risk of global collapse

Colin D Butler
Australian National University

Population vulnerability is cyclic, analogous to immunity. Following epidemics, surviving populations have sufficient antibodies to inhibit repeat infection until a sufficient number of immunologically vulnerable people accrue, due to waning immunity and the maturing of a new generation. Other forms of cyclic risk exist, driven by the waxing and waning of collective memory and behaviour and amplified by the rise and fall of social mechanisms. Three examples are global conflict, inequality and economic history.

In the first, strong global social forces following World War II (WWII) led to a sufficiently vigorous social contract to inhibit very large-scale state violence, fortified by numerous institutions including the United Nations. Almost 70 years later, the “social immunity” generated by the two World Wars is still fairly powerful, though some of the institutions are weakening. The second example concerns inequality. Following the Depression and WWII, sufficient social forces were liberated to reduce inequality of several forms: in the US, memory of the “gilded age” faded; in the UK, the National Health Service was born; and the global wave of decolonisation appeared unstoppable. Gradually, however, many forms of inequality have reappeared, including in most formerly Communist nations. Economic history comprises the third example. Economic booms and busts have occurred since at least the Great Tulip frenzy (1634–37), and the cycle continues, not least because mainstream opinion in each new generation asserts that the problem has been solved – and a new generation of naive speculators and investors is seduced.

Today, global civilisation itself is threatened. This risk may be “emergent”, as defined by this meeting, but is also ancient and recurrent. Numerous civilisations have collapsed in the past; what differs today is the global scale of the risk. This is plausible due not only to globalisation but also to the convergence of several forms of risk “immuno-naïveté”. This vulnerability has also been described as arising from the “Cornucopian Enchantment”, a period since roughly 1980 in which most economists, decision makers and even the academy reached quasi-consensus that the problem of scarcity had been permanently solved. This hubris seemed rational to a new generation, trained and rewarded to think that economics and ingenuity would of themselves solve all major problems; such pride was fortified (for a time) by data regarding cheap food, cheap energy and declining global hunger. In the last decade, however, data have accumulated showing not just diminishing reserves (e.g. oil), but also less contestable evidence such as rising prices (oil, food), rising unemployment and increased social resentment. Nevertheless, most policy makers remain wedded to the “old-world thinking” that has helped create these developing, interacting crises.

What can be more important than to reduce the emergent risk of global civilisation collapse? Failure to lower this risk may lead to a dramatic change in global consciousness, following a period likely to make the Dark Ages seem desirable. Instead, it is vital to “immunise” a sufficient number of people who can then demand, develop and support the requisite radical new policies. These include acceptance that resources are limited, development of green economic systems that will price negative externalities, and revival of fairness of opportunity.

A Unified Concept of Risk and Uncertainty?

John Doyle, Caltech

Abstract: Engineers, physicians, scientists, statisticians, economists, social scientists, neuroscientists, etc. all have notions of risk and uncertainty, which can differ mildly in terminology, emphases, and details, or be frankly incompatible. I'll focus here on a subset of disciplines that may superficially differ, but have the potential for a deeper unification in theory and methods, including systems, network, and resilience engineering; evolutionary, social, and cognitive psychology and neuroscience; evolutionary biology; emergency medicine and intensive care; engineering for natural and technological disasters; and statistics and math. While large differences remain within each of these areas (e.g. engineers who do consumer and entertainment systems vs. mission-critical infrastructure, evolutionary vs. social psychologists, microbial ecosystem evolution and immunology vs. population genetics, etc.), each has some exciting new developments that create the opportunity for a more unified framework. Particularly exciting from my theoretical perspective is a more unified mathematical foundation for computing, communications, and control that appears relevant to these domains. There is an essential role for tradeoffs involving dynamics, far-from-equilibrium behaviour, feedback (autocatalytic and control), robustness, and efficiency. I'll briefly describe the math progress but focus on illustrating the ideas with hopefully accessible and familiar case studies.

Emergent Risk: Financial Markets as Large Technical Systems

Donald MacKenzie

This paper will discuss the way in which automated trading has turned US stock markets and related ‘derivatives’ markets (especially the market in stock-index futures) into a large technical system, the ‘spinal cord’ of which is the fibre-optic links between the data centres in New Jersey in which stocks are traded and the data centre of the Chicago Mercantile Exchange, in the western suburbs of Chicago, where index futures are traded.
The paper will begin by briefly discussing the first full-scale crisis of automated trading, the so-called ‘flash crash’ that took place on the afternoon of May 6, 2010, focussing in particular on how that afternoon’s trading disruptions moved between Chicago and New Jersey.
The next section of the paper will begin by discussing risks to individual firms (e.g. the $440 million loss incurred by Knight Capital in the first 45 minutes of trading on August 1, 2012) and then move on to emergent risk, ‘the threat to the individual parts produced by their participation in and interaction with the system itself’ (Centeno and Tham). Even ‘bug-free’ programs can interact in unexpected, even bizarre ways, as will be shown by a brief discussion of the interaction of pricing algorithms on Amazon.
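The interaction of ‘bug-free’ pricing algorithms can be illustrated with a minimal sketch: two repricing rules, each individually sensible, that together form a positive feedback loop. The multipliers below are illustrative assumptions loosely echoing the widely reported Amazon book-repricing incident, not figures from the paper.

```python
# Two individually reasonable repricing rules whose combined effect is
# runaway price growth: seller A slightly undercuts seller B, while
# seller B (perhaps relying on a better reputation) prices above A.
# Multipliers are illustrative assumptions.

def reprice(price_a, price_b, days=10):
    """Run `days` rounds of mutual repricing; return the price history."""
    history = [(price_a, price_b)]
    for _ in range(days):
        price_a = 0.9983 * price_b   # A: undercut the rival slightly
        price_b = 1.2706 * price_a   # B: price above the rival
        history.append((price_a, price_b))
    return history

if __name__ == "__main__":
    for day, (a, b) in enumerate(reprice(10.0, 12.0)):
        print(f"day {day}: A=${a:,.2f}  B=${b:,.2f}")
```

Each full round multiplies both prices by roughly 0.9983 × 1.2706 ≈ 1.27, so neither rule contains a bug, yet the pair diverges exponentially – an emergent property of the interaction, not of either program alone.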
The paper will then move on to discuss large technical system dynamics in the financial system, and how those dynamics differ from traditional views of the system (e.g. the efficient market hypothesis of financial economics). The current ‘delegitimation’ of the US stock markets will be highlighted, as will the possibility that the large technical system of US stock and stock-derivative markets has drifted into Perrow’s dangerous quadrant of tight coupling and high complexity (was the flash crash a ‘normal accident’?). The paper will end, nevertheless, by listing a variety of factors that imply that these issues are not the most serious risk currently faced by global finance: for example, delegitimation has the welcome side effect that Vaughan’s ‘normalization of deviance’ is not taking place with respect to automated trading.

The Domestication of Uncertainty: Private and Public Uses of Credit Ratings

Bruce G. Carruthers

Abstract: Credit ratings based on an ordinal category system have become a canonical market-based measure of risk. Higher ratings denote lower credit risk. Invented in the middle of the 19th century, ratings now have a global reach and affect investment and resource allocation decisions worldwide. This paper reviews their spread, first from trade credit to railway bonds, and then from railway bonds to other kinds of issuers and debt securities. It identifies three drivers of expansion which, roughly in temporal order, were: the use of ratings by investors, use by public regulators, and incorporation into private financial contracts. Remarkably, it wasn’t until the 1940s that systematic evidence was marshaled to document their predictive accuracy, and so much of the diffusion of ratings occurred before anyone knew their true predictive value. Something other than “objective” informational value drove the diffusion process and resulted in the surreptitious ubiquity of credit ratings.

Complex systems and crises of energy

John Urry

This paper will explore the usefulness of systems thinking for making sense of what seems centrally significant in the contemporary world: the double problem of rising GHG emissions and likely shortages of the critically important energy resource of oil. Both derive from the high carbon systems that became locked in during the last century, and they have left this new century with troublesome and interdependent emergent risks. Some possible future scenarios will be analysed. This paper draws on the forthcoming Societies beyond Oil (Zed, 2012/3).

Organizational Coupling as a Principle to Explore Problem Framing and Solving

Andrea Prencipe

Drawing on the literature on organizations as problem-solving institutions, we argue that the nature and difficulty of the problems organizations must frame and solve to carry out their strategic and operational tasks are related to the occurrence of ambiguity, complexity, and risk. Ambiguity, complexity, and risk define different degrees of problem difficulty, and they impose different knowledge and organizational requirements. This work aims to develop and discuss a framework within which to analyze organizational approaches to framing and solving problems. The framework revolves around the concept of organizational coupling and its determinants: responsiveness – i.e. the property of organizational components to maintain some degree of consistency among themselves – and distinctiveness – i.e. the property of organizational components to retain their own identity. The interplay of distinctiveness and responsiveness leads to different types of coupling – i.e. tight, loose, or decoupled – among organizational components. Our argument is that risk, complexity, and ambiguity pose different degrees of difficulty in terms of knowledge and organizational requirements, and can be framed and solved through different types of organizational coupling. Ambiguity is a situation in which it is unclear what the problem to be solved is, to the extent that it forces organizations to engage in sense-making, interpreting, and framing of changing internal or external environments. We propose that ambiguous problems require organizations characterized by close and tight interactions among units that enable the direct exchange of information – i.e. tightly coupled systems. Complexity concerns the number, variety, and unpredictable relationships among the components that constitute the problem to be solved. Complex problems are made up of multiple, interdependent parts.
We argue that the solution of complex problems requires organizational systems that simultaneously achieve and consistently manage multiplicity, interdependency, and unpredictability – i.e. loosely coupled systems. Risk is about finding the information necessary to solve a given problem. The key challenge that risk poses to organizations is one of finding information, as resolving uncertain situations requires the acquisition of larger and newer information sets. We submit that risk leads to organizational systems in which each individual unit becomes an independent sensor acquiring new data from multiple sources – i.e. decoupled systems.

Stability of the WTW over Time

Scott Pauls

The World Trade Web (WTW) is a weighted network whose nodes correspond to countries, with edge strengths reflecting the value of imports and/or exports between countries. In this paper we introduce to this macroeconomic system the notion of extinction analysis, a technique often used in the analysis of ecosystems, for the purposes of investigating the robustness of this network. In particular, we subject the WTW to a principled set of in silico ‘knockout experiments’, akin to those carried out in the investigation of food webs, but suitably adapted to this macroeconomic network. Informed by results in network theory as well as studies of contagion in economic networks, we seek to understand the role of connectance in the robustness of the system. We interpret increasing connectance as one aspect of a move towards globalization and liberalized trade policy. Broadly, our experiments confirm two conjectures. First, that the WTWs are ‘robust yet fragile’ networks – robust to random failures but fragile under targeted attack. Second, that growing connectance has both positive and negative impacts on robustness. More specifically, we find that increasing connectance corresponds to increasing robustness for small shocks but to decreasing robustness in the face of large, cascading shocks to the system. This yields evidence in support of the view that globalization, as witnessed by increasing connectance, increases the ability of a system to absorb shocks up to a certain size, whereupon the shock overwhelms the system and sparks a broader contagion. We anticipate that experiments and measures like these can play an important role in the evaluation of the stability of economic systems.
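The knockout experiment described above can be sketched in miniature. The toy trade web below, its weights, and the 50% survival threshold for secondary extinctions are all invented for illustration; the paper's actual data and calibration differ.

```python
# Toy in silico 'knockout experiment' on a miniature weighted trade web,
# in the spirit of food-web extinction analysis. All numbers are invented.

def strength_in(web, node, alive):
    """Total trade (exports + imports) of `node` within the surviving set."""
    out_w = sum(w for v, w in web[node].items() if v in alive)
    in_w = sum(nbrs.get(node, 0.0) for u, nbrs in web.items() if u in alive)
    return out_w + in_w

def knockout(web, order, threshold=0.5):
    """Remove countries in `order`; a survivor goes 'extinct' (secondary
    extinction) once its remaining trade drops below `threshold` times
    its original strength. Returns total surviving trade after each
    primary removal, so a cascade shows up as a collapse in the series."""
    base = {n: strength_in(web, n, set(web)) for n in web}
    alive = set(web)
    series = []
    for target in order:
        alive.discard(target)
        changed = True
        while changed:  # propagate secondary extinctions to a fixed point
            changed = False
            for n in list(alive):
                if strength_in(web, n, alive) < threshold * base[n]:
                    alive.remove(n)
                    changed = True
        series.append(sum(w for u in alive
                          for v, w in web[u].items() if v in alive))
    return series

# Exports as nested dicts: web[u][v] = value of u's exports to v.
web = {
    "USA": {"CHN": 8, "DEU": 5, "MEX": 6},
    "CHN": {"USA": 10, "DEU": 4},
    "DEU": {"USA": 3, "CHN": 2, "FRA": 4},
    "FRA": {"DEU": 3},
    "MEX": {"USA": 5},
}

if __name__ == "__main__":
    hub = max(web, key=lambda n: strength_in(web, n, set(web)))
    print("targeted (hub first):", knockout(web, [hub]))
    print("weakest first:       ", knockout(web, ["FRA"]))
```

On this toy web, deleting the hub triggers a cascade of secondary extinctions that wipes out all remaining trade, while deleting the weakest node barely dents the total – the ‘robust yet fragile’ signature the abstract describes.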