Surprised by the Expected: Moral dimensions of resilience in integrated urban systems

Paul R. Brown and R. Shane Trussell

The following paper was presented at Singapore International Water Week on 10 July 2018 at the Theme 3.1 Technical Session on Sustainable Development Goals. It was subsequently accepted for publication in IWA’s H2Open Journal. Citation: Paul R. Brown, R. Shane Trussell; Moral dimensions of resilience in integrated urban systems: surprised by the expected. H2Open Journal 1 December 2018; 1 (2): 169–178. doi: https://doi.org/10.2166/h2oj.2018.011


INTRODUCTION

This paper is a qualitative inquiry into ethical questions raised by two factors affecting water management and engineering: (1) the limitations of traditional probability-based decision tools given hydrological non-stationarity and the consequent unpredictable frequency and severity of extreme events (Milly et al. 2008); and (2) the potential for management and operational failures during extreme events as a result of increasingly complex, integrated, and interdependent infrastructure systems-of-systems (SoS), both physical and institutional. In this context, the pursuit of greater urban resilience must reckon with the limitations of traditional engineering reliability analysis in an increasingly unstable environment. The paper concludes with a discussion of its application to implementation measures designed to achieve the Sustainable Development Goals (SDGs).

The paper maintains that planners and engineers have an ethical responsibility to (1) explicitly consider and address, during the planning and design process, the increased aleatory risks of both climate change and system-of-systems complexity, (2) consider the likely consequences, recovery capacity, and recovery time of communities in the event of failures, and (3) explore approaches to enhance community resilience during and following degraded or failed performance. It asks that engineers and planners go beyond the traditional criteria of reliability, efficiency, and costs and benefits to address possible unintended consequences, social justice, and the resilience of the communities served by these solutions – especially the capacity of communities to survive and recover from unexpected failures. Engineering and planning professionals have an ethical obligation to ask these questions, especially in regard to the goals and standards set for developing economies.

MURPHY, GARDONI, AND HARRIS GUIDELINES

The authors are indebted to a paper by Murphy et al. (2010) that sets out to define the ethical dimensions of engineering decision-making under uncertainty. Entitled ‘Classification and moral evaluation of uncertainties in engineering modeling,’ the paper explores the sources of uncertainty in engineering and planning, and presents nine guidelines ‘for the treatment of uncertainty in engineering modeling.’ In their paper, ‘engineering modeling’ covers the wide use of mathematical models to predict the performance of engineering solutions and provide the basis for rational decision-making by engineers and non-engineers alike.

Murphy et al. also focus on differences between the ‘epistemic’ uncertainties (reducible) and ‘aleatory’ uncertainties (irreducible) that are inherent considerations in engineering modeling and decision-making.

‘The difference between the two types of uncertainties is that aleatory uncertainties are irreducible, whereas epistemic uncertainties are reducible, e.g., by the use of improved models, the acquisition of more accurate measurements and the collection of larger samples. Aleatory uncertainties arise from the inherently uncertain or random character of nature’ (Murphy et al. 2010).

For a comprehensive and lively critique of the history, the various approaches, and the limitations of categorizing uncertainty, a 2017 paper in Science and Engineering Ethics by Dominic Roser, entitled ‘The irrelevance of the risk-uncertainty distinction,’ offers definitions as follows:

‘Objective probabilities and related concepts also go under labels such as physical probabilities (Mellor 2005, p. 3), chances (Mellor 2005, p. 3), or aleatory probabilities (Hacking 2001, p. 133). Epistemic probabilities – or, relatedly, logical probabilities – in contrast denote the degree of support given by the evidence for a hypothesis. When we say that the information at hand strongly supports the claim that there will be rain tomorrow or that the evidence makes it likely that the gardener killed the baroness, we focus on a relation between evidence and a hypothesis’ (Roser 2017, p. 1392).

For the purposes of this paper, the focus is on increased aleatory uncertainty related to the probabilities of future extreme physical disruptions and events directly and indirectly affecting the expected performance of engineered systems.
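To make the distinction concrete, the short sketch below (an illustration of the concept only, not a method drawn from Murphy et al. or Roser; all peak-flow numbers are hypothetical) estimates the mean of a simulated record of annual peak flows. The uncertainty about the mean (epistemic) shrinks as the record lengthens, while the year-to-year spread of the flows themselves (aleatory) does not.

```python
# Hedged illustration: epistemic uncertainty is reducible with more data,
# aleatory uncertainty is not. All parameter values are hypothetical.
import random
import statistics

random.seed(42)
TRUE_MEAN, TRUE_SD = 100.0, 30.0  # hypothetical annual peak-flow distribution

for n in (10, 100, 1000):
    flows = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(n)]
    sd = statistics.stdev(flows)
    epistemic = sd / n ** 0.5  # standard error of the mean: shrinks with n
    aleatory = sd              # year-to-year variability: does not shrink
    print(f"n={n:4d}  epistemic (SE of mean)={epistemic:6.2f}  "
          f"aleatory (SD of flows)={aleatory:6.2f}")
```

Under non-stationarity, even the aleatory term loses its anchor: the ‘true’ distribution itself drifts, so a longer record no longer guarantees a better estimate.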

Murphy et al. conclude with nine guidelines for the consideration of risk in engineering modeling and decision-making. In summary, those guidelines (presented below with additional commentary) suggest that engineers and planners should:

  1. Explicitly acknowledge uncertainty as inherent in engineering decisions. ‘Acknowledging uncertainty is a precondition for making principled and well-educated decisions about how to treat uncertainties and about the acceptability of risks’ (Murphy et al. 2010).

  2. Evaluate the need for innovations and new technologies, given that new uncertainties will inevitably accompany them. This guideline should inform decisions regarding the implementation of SDGs, especially when introducing large, centralized technologies that may leave communities less resilient to future system failures.

  3. Differentiate between the epistemic and aleatory risks in an engineering solution. This may become more difficult as assumptions regarding randomness in natural events change.

  4. Consider the acceptability of aleatory uncertainties. In the context of urban resilience, evaluating ‘acceptability’ also requires consideration of a community's capacity to recover from system failures in terms of time, resources, and cost.

  5. Analyze the costs and benefits of epistemic risks. Generally, this analysis is part of most current planning and engineering decision-making.

  6. Prioritize the reduction of epistemic risks judged most important to success. An analysis of the immediate consequences of failure and the longer-term institutional capacity, costs, and duration of recovery should be included in the prioritization process.

  7. Categorize the level of confidence in alternative engineering models of the same engineering problem (so-called metadoxastic uncertainty). Given the wide suite of climate change models and the imprecision of downscaling, this guideline becomes an important consideration for planners and engineers. It cautions against the selection (‘cherry-picking’) of models that produce results consistent with the expert's existing biases.

  8. Fully disclose to the public and policy makers the uncertainties surrounding the work. Under increasing uncertainty and complexity, full disclosure becomes a critical component of ethical advice and recommendations.

  9. Monitor and learn from the success of solutions as well as from unforeseen problems and consequences. This concluding guideline suggests that systems that have failed should not simply be replaced, in kind, with the same systems that failed.

While this methodology may seem extravagant and unnecessary to some, this approach, or others like it, is essential to avoiding serious ethical lapses based on unintended biases and self-deception. Rules-of-thumb and standardized technologies are increasingly vulnerable to the growing randomness of weather-related events and the increasing complexity of system-of-systems decision-making.

RELIABILITY VERSUS RESILIENCE

The authors wish to draw a clear distinction between ‘reliability’ (which describes a predicted risk of failure) and ‘resilience’ (which addresses performance during and after random extreme events – some resulting in system failures). Reliability predicts performance using well-established theories and probabilities to describe a distribution of future outcomes and their frequencies. It focuses on the performance of the system and its components. Resilience, on the other hand, addresses the performance and consequences of outcomes outside of those boundaries (defined here as extreme events). Resilience focuses on responses to conditions in the so-called ‘tail’ of possible future occurrences.

In engineering, the mathematics of reliability and risk criteria is a well-developed discipline – largely grounded in assumptions of stationarity. It addresses

‘the solution of problems in predicting, estimating, or optimizing the probability of survival, mean life, or, more generally, life distribution of components or systems; other problems considered … are those involving the probability of proper functioning of the system at either a specified or an arbitrary time, or the proportion of the time the system is functioning properly’ (Barlow & Proschan 1996, p. xi).

The recovery capacity, costs, duration, and quality of life for the survivors are externalities in most of these decisions (optimistically assuming that life goes on).
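In symbols, the standard machinery behind Barlow & Proschan's formulation (a textbook statement, not one this paper derives) treats the lifetime $T$ of a component or system as a random variable with a known distribution $F$:

$$R(t) = \Pr(T > t) = 1 - F(t), \qquad \mathrm{MTTF} = \int_0^\infty R(t)\,dt,$$

so that, for example, a constant failure rate $\lambda$ gives $R(t) = e^{-\lambda t}$ and a mean time to failure of $1/\lambda$. Every quantity in this formulation presupposes a stable, knowable $F$ – precisely the stationarity assumption that extreme events now call into question.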

Assessing the need for, and the resources required to ensure, resilience under extreme conditions should consider the capacity of communities to endure and recover from those conditions. That capacity to recover varies greatly among communities and is often related to the availability of institutional and economic resources. For example, as surface water supplies disappear in Cape Town, South Africa, wealthy communities are able to drill new wells, while the poor have no means to do the same. The traditional surface water storage and distribution system is failing for everyone, but the consequences of failure and the capacity to recover vary according to economic inequalities among communities (Sieff 2018).

The following example provides anecdotal evidence both of the potential inadequacy of traditional engineering modeling tools under extreme and compounding events, and of the potential for these tools to inadequately inform the collaborative decision-making of independent organizations and infrastructure integrated into system-of-systems frameworks.

SURPRISED BY THE EXPECTED

The Montecito case illustrates the potential vulnerability of relying on hydrological models that were developed and calibrated under assumptions of climate stationarity and acceptable aleatory risks. When two extreme events occur one after another, the uncertainties of compounding hazards suddenly introduce externalities (massive erosion combined with sediment and debris flows) that are not incorporated into the engineering modeling tools employed by decision-makers, public safety officials, and first responders. The case also illustrates the collaboration among agencies and institutions that share predictive tools and outputs during crisis situations. The integrated response illustrated in this case worked effectively to save lives during the wildfire but was caught off guard by the consequences of the subsequent storm.

Montecito, California, is an affluent coastal community to the southeast of the city of Santa Barbara. During December of 2017, the massive Thomas wildfire burned through the Los Padres National Forest and stripped the watersheds draining to the coast of almost all of their vegetation. Fuelled by years of drought, the fire grew to become the largest wildfire on record in California's history. Fortunately, widespread mandatory evacuations during the fire prevented anyone in Montecito from dying, in spite of the loss of dozens of homes. Sadly, it was an intense rain event in January that killed over 20 people and injured many more. In advance of the storm, public officials issued warnings and encouraged voluntary evacuations. What transpired in the early morning hours of Tuesday, January 9, were mudslides and debris flows described as ‘apocalyptic.’

In a front-page story on the mudslides published in the Los Angeles Times on January 13, 2018, reporters Matt Hamilton and Joseph Serna quoted Montecito Fire Protection District Battalion Chief Scott Chapman, who had reviewed planning maps prepared from data analyzed by the County of Santa Barbara prior to the storm. Those maps depicted 100-year and 500-year storm events and the flooding that could result. The Times reported:

‘Chapman said the flooding and flows foretold by the map are mostly accurate, with the exception of a small patch of homes by the 101 Freeway and Montecito Creek, which were not as flooded as the map would have predicted’ (Hamilton & Serna 2018, p. 7).

The article then reported the following statement by Battalion Chief Chapman: ‘Even expecting the worst and planning for the worst, no one expected this.’ This statement describes clearly and succinctly the fundamental problem faced by professionals in many dimensions of water management and technology.

The authors characterize this phenomenon as being ‘surprised by the expected,’ and Chief Chapman is not the only person who has experienced it – that is, reliance on engineering analyses based on historical records that fail to predict the combined physical consequences of compounding extreme events; in this instance, an unprecedented wildfire followed quickly by an intense rain event. It is likely that surprises resulting from expected events that dramatically exceed expectations will continue to occur.
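A standard return-period calculation (textbook hydrological arithmetic, not the County of Santa Barbara's actual analysis) shows how little comfort even a correct ‘100-year’ label provides. A $T$-year event has an annual exceedance probability of $p = 1/T$, so under stationary conditions the chance of seeing at least one such event in $n$ years is

$$\Pr(\text{at least one exceedance in } n \text{ years}) = 1 - \left(1 - \tfrac{1}{T}\right)^{n},$$

which for a 100-year storm over a 30-year horizon is $1 - 0.99^{30} \approx 26\%$. Non-stationarity raises $p$ itself in unknown ways, and compounding hazards – a freshly burned watershed meeting an intense storm – lie entirely outside the calculation.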

Addressing this issue following the unprecedented rainfall in Houston resulting from Hurricane Harvey, an editorial by Stanford earth system science professor Noah Diffenbaugh (2017), published in the New York Times, implored policy makers to face reality and prepare for the unknown unknowns (aleatory uncertainties).

‘Refusing to acknowledge the changing odds of extremes means that we will be unprepared for events that fall outside of our experience. Denying climate science is not just a political statement. It also puts American lives and property at risk’ (Diffenbaugh 2017).

Is it realistic to prepare for ‘events that fall outside of our experience’ – is there a technological solution for that? In the engineering planning and design process, the answer is likely ‘no, not yet.’ The development of design criteria that protect the public from all possible events (including ones that have never happened in the past) is both wishful thinking and unaffordable – even if all of those contingent events could be dreamed up.

What is achievable is an explicit recognition of the likely consequences of catastrophic failure irrespective of the future cause, asking what happens after infrastructure fails and how long recovery will take. The focus on protecting a community during a disaster and minimizing the time needed to recover from it is the real meaning of ‘resilience.’ Eliminating, to acceptable levels of risk, the possibility of experiencing that disaster is the meaning of ‘reliability.’

SYSTEM-OF-SYSTEMS RISKS

In addition to the uncertainties stemming from climate instability, the second area of vulnerability relates to the system-of-systems created through the integration of agencies and technologies in the context of what some call ‘One Water’ solutions. Generally, these solutions are accomplished through the collaboration of utilities that have traditionally functioned as single-purpose institutions focused on either potable water, used water, stormwater, or flood control.

In addition to organizational independence, these utility functions are often regulated under overlapping but independent regulatory regimes designed in the context of autonomous legacy systems. Each entity maintains its own operational mission, performance criteria, and regulatory requirements – while voluntarily collaborating in the management of the larger closed-loop system. This is the textbook definition of a ‘collaborative system-of-systems,’ and it is vulnerable to the weaknesses of that structure – particularly during disruptive events and extreme conditions (Ireland 2016).

Southern California offers many examples of collaborative system-of-systems water management solutions that rely upon the coordinated management of used-water agencies, new-water suppliers, groundwater basin managers, groundwater pumpers, and stormwater and flood control agencies. Each of these entities is governed under independent authorities, powers, rights, and regulatory regimes. Typically, these agencies manage the components of the integrated system through voluntary agreements and contracts that describe their performance obligations under normal conditions.

For example, a used-water facility, owned and operated by a sanitation district, may provide source water to an advanced treatment facility, owned and operated by a municipal water district. The sanitation district's primary mission is to satisfy the regulatory requirements associated with the disposal of treated water to the environment. While providing water to the advanced treatment facility is a contractual responsibility, it is a lower priority than meeting the legal requirements of the governing regulations under which the used-water facility was established.

At the same time, the municipal water district is focused on water reuse and must prioritize treatment with an end point of drinking water quality and supply in mind. While the water district is greatly impacted by the source water (e.g., treated used water), it has little control over this source and must focus its attention on (1) achieving drinking water quality objectives and (2) maintaining production levels to meet demands and control costs.

Lastly, where new water is stored in groundwater basins, there can be additional agencies receiving the purified water and managing its recharge, storage, pumping, and distribution. In southern California, these entities are focused on the management of the groundwater basins as their primary objective. And while the municipal water district would like to control costs by maximizing production, the groundwater management agency's primary objective is the proper maintenance of the facilities necessary for the groundwater basins' operation (e.g., injection wells and spreading basins). Groundwater managers may not want the water at the same time or in the locations best suited to the advanced treatment facility.
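A deliberately simplified sketch of this chain appears below (all agency names, capacities, and priority rules are hypothetical, invented to illustrate the structure rather than to model any actual system). Each agency protects its own primary mission first; under stress, the flow available to each downstream partner shrinks, and the end-to-end system degrades faster than any single agency's own performance would suggest.

```python
# Hypothetical sketch of a collaborative system-of-systems: three agencies,
# each honoring its own mission before its contracts. All values invented.
from dataclasses import dataclass

@dataclass
class Agency:
    name: str
    capacity: float  # flow the agency can pass on under normal conditions (MGD)

def deliverable(agency: Agency, stress: float) -> float:
    """Flow an agency passes downstream. Under stress (0 = normal,
    1 = extreme), it withholds capacity to protect its primary mission."""
    return max(agency.capacity * (1.0 - stress), 0.0)

sanitation = Agency("Sanitation District (discharge permit first)", 40.0)
water = Agency("Municipal Water District (drinking water quality first)", 30.0)
basin = Agency("Groundwater Manager (basin facilities first)", 30.0)

for stress in (0.0, 0.5, 0.9):
    # Each downstream stage is capped by what the upstream stage delivers.
    source = deliverable(sanitation, stress)
    purified = min(deliverable(water, stress), source)
    recharged = min(deliverable(basin, stress), purified)
    print(f"stress={stress:.1f}: source={source:4.1f}  "
          f"purified={purified:4.1f}  recharged={recharged:4.1f} MGD")
```

Even this toy version shows the pattern: no single agency fails outright, yet recharge collapses because independent priority rules compound along the chain.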

When functioning properly, the contractual obligations among all the parties can usually resolve most of the known conflicts. Under extremely severe conditions, the primary mission of each agency may drive the integrated system-of-systems into unexpected shutdowns or failures – especially if the communications infrastructure of the internet breaks down during the incident. There is no agency explicitly responsible for defining and/or balancing the overall system goals and priorities; the autonomous goals of the participants conflict; and management resources remain independent and frequently over-taxed during crisis events. These are several of the characteristics of known failure modes in a system-of-systems setting (Alexander et al. 2004, pp. 504–505). Systems thinker John L. Casti, in the opening notes of his book, X-Events: The Collapse of Everything, describes the situation in terms of ‘a theory of surprise’:

‘How do we characterize risk in situations where probability theory and statistics cannot be employed? X-events of the human – rather than nature-caused – variety are the result of too little understanding chasing too much complexity in our human systems. … tied up with the exponentially increasing levels of complexity necessary to preserve the critical infrastructure of modern life’ (Casti 2013).

This ‘surprised by the expected’ pattern – complex systems-of-systems behaving in counter-intuitive ways, and flawed conclusions drawn from traditional forecasting tools – can lead to loss of life and property. Furthermore, the best practices of rational risk assessment and mitigation will be hard pressed to deliver reliable decision-making in the context of non-stationarity and system-of-systems interdependencies. This poses a serious ethical question. Are flawed tools unwittingly endangering public process and decision-making? Should disclaimers regarding the likely outcomes forecasted by probabilistic analytical tools accompany recommendations that rely upon historical hydrological data to forecast future events?

CHANGING PROFESSIONAL PRACTICES

The speed with which both uncertainty and complexity are increasing is in conflict with the slow process of developing new approaches to professional and institutional governance, decision-making, and practice. In fact, the current process of revising professional practice, codes, and regulations is intentionally slowed by consultations, communication, and conflict resolution that restrain rapid changes in direction or content (Ben-Joseph 2005). Consequently, learning will inevitably be driven by unforeseen mistakes – a humbling and sometimes humiliating experience.

That learning process will happen either through the repeated and unnecessary replacement of failed technology with the same technology that failed, or through the reflective questioning of assumptions and beliefs to develop a more resilient response. The latter approach appeals to the belief that planning and engineering constitute an ethical endeavor with moral consequences, rather than the enforced application of standardized responses based on prior practice alone.

Ethical hazards are not the result of intentionally trying to do harm, but of failing to heed the signals and warnings that harm may result from currently accepted practices – that is, always following tried-and-tested models, codes, and standards without questioning their consequences during and following plausible events. The susceptibility to self-deception can be high in instances where hazards are both outside of normal experience and uncomfortable to contemplate. The emotional consequences of distressing conclusions may make them even harder to accept.

The trade-offs made in a cost–benefit analysis depend on predictable benefits and predictable costs, as the investments depend on regulatory certainty. Regulatory rigidity, enforced by legal demands on institutions, has the same invisible stranglehold on water as building codes have on structures. Regulations are generally static and unyielding, do not (cannot) reflect a holistic view, and are not designed to deal with failure following extreme events and their aftermath.

ROLE OF SELF-DECEPTION AND STATISTICAL INTUITION

What is the appropriate response to repeated expressions of chastened surprise from experts forced to admit that what has happened was not predicted and, worse yet, not predictable? Today it is likely that, after such an admission, many experts retreat to their established professional methods and tools to reinforce their confidence about what will happen next. There is an element of self-deception and flawed statistical ‘intuition’ that Nobel laureate Daniel Kahneman and his longtime collaborator Amos Tversky identified in their research, and that Kahneman described in his book, Thinking, Fast and Slow:

‘It is wrong to blame anyone for failing to forecast accurately in an unpredictable world. However, it seems fair to blame professionals for believing they can succeed in an impossible task. Claims for correct intuitions in an unpredictable situation are self-delusional at best, sometimes worse. In the absence of valid cues, intuitive “hits” are due either to luck or to lies. If you find this conclusion surprising, you still have a lingering belief that intuition is magic. Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment’ (Kahneman 2011, p. 241).

The difficulty of accepting this rule in the water industry is exacerbated by centuries of environmental stability and stationarity. Who would have suspected that careers would span two global epochs – a transition from the stable Holocene to the more volatile Anthropocene? Ignoring the signals of change may lead to ethical failures and moral indifference.

A paper in Social Justice Research by Tenbrunsel & Messick (2004), entitled ‘Ethical Fading: The Role of Self-Deception in Unethical Behavior,’ identifies four enablers of self-deception at the root of unethical decisions. They define self-deception as being ‘unaware of the processes that lead us to form opinions and judgments … Such deception involves avoidance of the truth, the lies that we tell to, and secrets we keep from ourselves.’ They discuss the ‘slippery slope’ of decision-making and describe the ethical risks associated with the routinization (that is, practices that have become routine) of decision-making over time. They describe it this way:

‘If what we were doing in the past is OK and our current practice is almost identical, then it too must be OK. This mechanism uses the past practices of an organization as a benchmark for evaluating new practices. If the past practices were ethical and acceptable, then practices that are similar and not too different are also acceptable … Routinization means that when a practice has become routine, it is ordinary, mundane, and acceptable. Any ethical coloration is lost’ (Tenbrunsel & Messick 2004, p. 228).

Martin and Schinzinger, in their classic book Ethics in Engineering, warn against the same ethical risks posed by self-deception, which they define as:

‘Fooling oneself, either (a) motivated irrationality in which one allows biases to distort judgment, or (b) purposeful (though not fully conscious) evasion of unpleasant realities such as data that goes against what one wants to believe’ (Martin & Schinzinger 2014, p. 216).

When professional best practices may no longer perform as reliably as they have in the past, that fact alone is an ‘unpleasant reality.’ It demands breaking out of the routine and acknowledging new realities – explicitly disclosing and struggling with the ethical dilemmas they create.

OBJECTIONS TO CHANGE

One of the strongest objections to the ethical arguments presented here stems from (1) the lack of accepted alternative methodologies for modeling engineering design and planning decisions that do not rely on historic hydrology or on climate-adjusted extrapolations based on those hydrologies, and (2) the difficulty of internalizing the external systems behavior that accompanies more integrated and interdependent water management systems and infrastructure. The thesis of this paper does not attempt to select an alternative methodology or to simplify the challenges of system-of-systems evaluations. It trusts that repeated reflection on the vulnerabilities cited here, and on the failures experienced in practice, will eventually result in new approaches.

Unfortunately, a failure to embrace these realities may influence the future of emerging economies as detrimentally as developed ones. There may be an underlying assumption that infrastructure investments in water and sanitation can attain a sustainable end-point, which can always be made more efficient and less wasteful, but is essentially complete – providing static, fortified barriers designed to protect the health and safety of people, property, and natural ecosystems. It is unlikely in a world of increasing non-stationarity that fortification alone will provide sustainable protection. Building the capacity of communities to be flexible, adaptive, resourceful, and resilient in the face of dynamic change may be an equally important investment for the future.

SUSTAINABLE DEVELOPMENT GOALS

Those engineers and planners working on the implementation of SDGs focused on ‘making cities and human settlements inclusive, safe, resilient and sustainable’ (Goal 11) should explicitly address the cross-cutting vulnerabilities of climate instability and infrastructure complexity.

By design, the SDGs establish global metrics and timetables. They largely focus on addressing the chronic stressors that weaken the capacity and fabric of communities over the long term. The careful selection of the ‘means’ by which those goals are achieved, and a better understanding of their interdependencies, are just as important as the stated ends described by performance metrics. A snapshot in time is an important indicator of cumulative progress, but it does not reflect the capacity of communities to maintain progress during dynamic changes in the environment or to recover from acute shocks and extreme events. The example of Puerto Rico following Hurricane Maria, which devastated the U.S. territory in September 2017, is a case of 40 years of development nearly wiped out in a 24-hour period. Six months after the event, many remained without basic services, and full recovery is not expected for many years (Russell 2018).

The SDGs represent an exhaustive global effort to reduce risk through the identification of specific variables and targets that, collectively, are expected to increase sustainability. Along the way, additional consideration will inevitably be given to irreducible risks resulting from non-stationarity and to hazards created by natural and human events beyond recent experience. The process of establishing global metrics requires a heroic assumption that the accomplishment of the targets will mean the world is more sustainable. If all the goals are achieved, the chances of a sustainable future may or may not increase. The aspirational policy dimensions of the SDGs are based on broadly accepted values and norms. In the end, however, the performance measures and timetables are an exercise in the analysis and management of epistemic risk – a deterministic planning and engineering exercise. Their implementation should be accompanied by the ethical review described in the guidelines summarized above.

CONCLUSION

To continue promoting the engineering models of the past and present (and establishing global metrics for assessing progress according to those models) may be to ignore the long-term outcomes that achievement of those metrics could produce. It is known that engineered systems are becoming increasingly complex and integrated, with design criteria for extremes that may well be exceeded. And yet, recovery capacity and recovery duration following failure are generally not explicitly addressed.

Unfortunately, it is impossible to respond to this dilemma with a prepared tool-box of alternative solutions and standards. Once, industry codes, regulations, and standards were offered as globally accepted templates for infrastructure development. Today, given the diversity of geographically, culturally, and economically unique settings, deep questioning is more appropriate than the immediate application of obvious solutions. Because the speed of change is increasing, there is no time to pause and rethink all of our current assumptions and practices. And yet, the asymmetrical pace of development in (1) technology, information management, and artificial intelligence, compared to (2) institutional governance and policy development, introduces profound vulnerabilities. Explicit consideration of the limitations of technical decision-making in highly complex systems-of-systems, the increased likelihood of extreme events, and the resources and time needed to recover from failure should increasingly inform the adaptation of our current methodologies and practices for the future.

REFERENCES

Alexander R., Hall-May M. & Kelly T. 2004 Characterisation of system of systems failures. In: Proceedings of the 22nd International System Safety Conference – 2004, pp. 499–508.

Barlow R. E. & Proschan F. 1996 Mathematical Theory of Reliability. Society for Industrial and Applied Mathematics, Philadelphia, PA, USA.

Ben-Joseph E. 2005 The Code of the City: Standards and the Hidden Language of Place Making. The MIT Press, Cambridge, MA, USA.

Casti J. L. 2013 X-Events: The Collapse of Everything. William Morrow, New York, USA.

Diffenbaugh N. S. 2017 Hurricane Harvey was no surprise. New York Times. Retrieved August 29, 2017, from https://www.nytimes.com/2017/08/28/opinion/hurricane-harvey-global-warming.html?smprod=nytcore-ipad&smid=nytcore-ipad-share.

Hacking I. 2001 An Introduction to Probability and Inductive Logic. Cambridge University Press, Cambridge, UK.

Hamilton M. & Serna J. 2018 Braced for a fire, blindsided by mud. Los Angeles Times. Retrieved January 13, 2018, from http://enewspaper.latimes.com/infinity/latimes/default.aspx?pubid=50435180-e58e-48b5-8e0c-236bf740270e&edid=e7766e30-a63a-4df5-99aa-21951066f595&pnum=1.

Ireland V. 2016 Governance of collaborative system of systems. International Journal of System of Systems Engineering 7 (1–3), 159–188.

Kahneman D. 2011 Thinking, Fast and Slow. Farrar, Straus and Giroux, New York, USA.

Martin M. W. & Schinzinger R. 2014 Ethics in Engineering, 4th edn. McGraw-Hill Education, New York, USA.

Mellor D. 2005 Probability: A Philosophical Introduction. Routledge, Abingdon, UK.

Milly P. C. D., Betancourt J., Falkenmark M., Hirsch R. M., Kundzewicz Z. W., Lettenmaier D. P. & Stouffer R. J. 2008 Stationarity is dead: whither water management? Science 319, 573–574.

Murphy C., Gardoni P. & Harris C. 2010 Classification and moral evaluation of uncertainties in engineering modeling. Science and Engineering Ethics 17 (3), 553–570.

Roser D. 2017 The irrelevance of the risk-uncertainty distinction. Science and Engineering Ethics 23 (5), 1387–1407.

Russell P. R. 2018 Six months later, Puerto Rico recovery inches on. Engineering News-Record. Retrieved March 30, 2018, from https://www.enr.com/articles/44243-six-months-later-puerto-rico-recovery-inches-on?id=44243-six-months-later-puerto-rico-recovery-inches-on.

Sieff K. 2018 As Cape Town's water runs out, the rich drill wells. The poor worry about eating. The Washington Post. Retrieved February 24, 2018, from https://www.washingtonpost.com/classic-apps/as-cape-towns-water-runs-out-the-rich-drill-wells-the-poor-worry-about-eating/2018/02/23/986a0522-10e5-11e8-a68c-e9374188170e_story.html.

Tenbrunsel A. & Messick D. 2004 Ethical fading: the role of self-deception in unethical behavior. Social Justice Research 17 (2), 223–236.

© 2018 The Authors