Part I
Risk governance perspectives
1
Complexity, uncertainty and ambiguity in inclusive risk governance
Ortwin Renn and Andreas Klinke
Deciding the location of hazardous facilities, setting standards for chemicals, making decisions about clean-ups of contaminated land, regulating food and drugs, as well as designing and enforcing safety limits all have one element in common: these activities are collective endeavours to understand, assess and handle risks to human health and the environment. These attempts are based on two requirements. On the one hand, risk managers need sufficient knowledge about the potential impacts of the risk sources under investigation and the likely consequences of the different decision options to control these risks. On the other hand, they need criteria to judge the desirability or undesirability of these consequences for the people affected and the public at large (Horlick-Jones, Rowe and Walls 2007; Renn and Schweizer 2009; Rowe and Frewer 2000). Criteria on desirability are reflections of social values such as good health, equity or efficient use of scarce resources. Both components, knowledge and values, are necessary for any decision-making process independent of the issue and the problem context.
Anticipating the consequences of human actions or events (knowledge) and evaluating the desirability and moral quality of these consequences (values) pose particular problems if the consequences that need to be considered are complex and uncertain and the values contested and controversial. Dealing with complex, uncertain and (value) ambiguous outcomes often leads to the emergence of social conflict. Questions of how to deal with complex, uncertain and controversial risks demand procedures for dealing with risks that go beyond the conventional risk management routines. Numerous strategies to cope with this challenge have evolved over time. They include technocratic decision-making through the explicit involvement of expert committees, muddling through in a pluralist society, negotiated rule-making via stakeholder involvement, deliberative democracy or ignoring probabilistic information altogether (see reviews in Brooks 1984; Nelkin and Pollak 1979, 1980; Renn 2008: 290ff.). The main thesis of this chapter is that risk management institutions need more adequate governance structures and procedures that enable them to integrate professional assessments (systematic knowledge), adequate institutional process (political legitimacy), responsible handling of public resources (efficiency) and public knowledge and perceptions (reflection on public values and preferences). These various inputs require the involvement of several actors in the risk assessment and risk management process. The structures that evolve from the cooperation of various actors in all phases of the risk handling process are subsumed under the term risk governance (IRGC 2005; Renn 2008: 8). Hutter (2006: 215) characterizes the move from governmental regulation to governance in the following manner:
This decentring of the state involves a move from the public ownership and centralized control to privatized institutions and the encouragement of market competition. It also involves a move to a state reliance on new forms of fragmented regulation, involving the existing specialist regulatory agencies of state but increasingly self-regulating organizations, regimes of enforced self-regulation … and American-style independent regulatory agencies.
Governing choices in modern societies are seen as an interplay between governmental institutions, economic forces and civil society actors, such as nongovernmental organizations (NGOs). Risk governance involves the 'translation' of the substance and core principles of governance to the context of risk and risk-related decision-making (Hutter 2006). It includes, but also extends beyond, the three conventionally recognized elements of risk analysis (risk assessment, risk management and risk communication). It requires consideration of the legal, institutional, social and economic contexts in which a risk is evaluated, and involvement of the actors and stakeholders who represent them. Risk governance looks at the complex web of actors, rules, conventions, processes and mechanisms concerned with how relevant risk information is collected, analysed and communicated, and how management decisions are taken.
Based on our previous work on risk governance and risk evaluation (Klinke, Dreyer, Renn, Stirling and van Zwanenberg 2006; Klinke and Renn 2001, 2002, 2010, 2012; Renn 2008; Renn and Klinke 2013; Renn, Klinke and van Asselt 2011), we will expand in this chapter on a normative-analytical model of a risk governance process that interlinks diverse actors and their claims, elaborates the institutional means to process diverse input and discusses the prospects and implications for adaptive capacity. The focus will be on collectively binding risk management decisions rather than on private risk management decisions. Such collective decisions are not only a product of governmental policy making but are joint products by a wide variety of actors including scientists, the private sector, civil society and governmental agencies.
In this chapter we will first analyze the major characteristics of risk knowledge and then address major functions of the risk governance process: pre-estimation, interdisciplinary risk estimation (including scientific risk assessment and concern assessment), risk characterization and risk evaluation as well as risk management including decision-making and implementation. Furthermore, we will explicate the design of an effective and fair institutional arrangement including four different forms of public and stakeholder involvement in order to cope with the challenges raised by the three risk characteristics. Finally, the chapter will conclude with some general lessons for risk governance.
Three characteristics of risk knowledge
Integrative risk governance is expected to address challenges raised by three risk characteristics that result from a lack of knowledge and/or competing knowledge claims about the risk problem. Transboundary and collectively relevant risk problems such as global environmental threats (climate change, loss of biological diversity, chemical pollution etc.), new and/or large-scale technologies (nanotechnology, biotechnology, offshore oil production etc.), food security or pandemics are all characterized by limited and sometimes controversial knowledge with respect to their risk properties and their implications (Horlick-Jones and Sime 2004). They need to be handled by transnational entities such as the European Union or the United Nations. The three characteristics are complexity, scientific uncertainty and socio-political ambiguity (Klinke et al. 2006; Klinke and Renn 2002, 2010; Renn 2008).
Complexity
Complexity refers to the difficulty of identifying and quantifying causal links between a multitude of potential candidates and specific adverse effects (cf. Lewin 1992; Underdal 2009). A crucial aspect in this respect concerns the applicability of probabilistic risk assessment techniques. If the chain of events between a cause and an effect follows a linear relationship (as for example in car accidents or in an overdose of pharmaceutical products), simple statistical models are sufficient to calculate the probabilities of harm. Such simple relationships may still be associated with high uncertainty, for example, if only few data are available or the effect is stochastic by its own nature. Sophisticated models of probabilistic inference are required if the relationship between cause and effect becomes more complex (Renn and Walker 2008). The nature of this difficulty may be traced back to interactive effects among these candidates (synergisms and antagonisms, positive and negative feedback loops), long delay periods between cause and effect, inter-individual variation, intervening variables and others. It is precisely these complexities that make sophisticated scientific investigations necessary, since the cause-effect relationship is neither obvious nor directly observable. Nonlinear response functions may also result from feedback loops that constitute a complex web of intervening variables. Complexity therefore requires sensitivity to nonlinear transitions as well as to scale (on different levels). It also needs to take into account a multitude of exposure pathways and the composite effects of other agents that are present in the exposure situation. Examples of highly complex risks include sophisticated chemical facilities, synergistic effects of potentially eco-toxic substances on the environment, failure risk of large interconnected infrastructures and risks of critical loads to sensitive ecosystems.
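The contrast between a linear cause-effect chain, where simple statistics suffice, and a complex one with synergistic interactions can be made concrete with a small simulation. The following Python sketch is illustrative only and not drawn from the chapter: the dose-response slopes, doses, interaction parameter and the lognormal variability term are all hypothetical. It estimates the probability of harm under each model via Monte Carlo sampling, together with a 95% confidence interval:

```python
import math
import random

random.seed(42)

def linear_risk(dose, slope=0.02):
    # Linear dose-response: base probability of harm grows
    # proportionally with dose (hypothetical slope).
    return slope * dose

def synergistic_risk(dose_a, dose_b, slope=0.02, interaction=0.015):
    # Synergism: the interaction term makes the combined effect of two
    # agents exceed the sum of their individual effects.
    return slope * dose_a + slope * dose_b + interaction * dose_a * dose_b

def monte_carlo(base_prob, n=50_000):
    # Estimate probability of harm over n simulated exposures.
    # Inter-individual variability is modeled as a lognormal
    # susceptibility multiplier; the 95% confidence interval reflects
    # the statistically quantifiable (reducible) part of uncertainty.
    hits = 0
    for _ in range(n):
        susceptibility = random.lognormvariate(0, 0.4)
        p = min(1.0, susceptibility * base_prob)
        if random.random() < p:
            hits += 1
    p_hat = hits / n
    half = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, (p_hat - half, p_hat + half)

p_lin, ci_lin = monte_carlo(linear_risk(dose=5))
p_syn, ci_syn = monte_carlo(synergistic_risk(dose_a=5, dose_b=5))
```

Under these made-up parameters the synergistic model yields a markedly higher harm probability than either linear exposure alone would suggest, which is precisely why interactive effects defeat simple statistical extrapolation. Enlarging `n` narrows the confidence interval, but it cannot remove the uncertainty contributed by the variability and model assumptions themselves.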
Scientific uncertainty
Scientific uncertainty relates to the limitedness or even absence of scientific knowledge (data, information) that makes it difficult to exactly assess the probability and possible outcomes of undesired effects (cf. Aven and Renn 2009; Filar and Haurie 2010; Rosa 1997). It most often results from an incomplete or inadequate reduction of complexity in modeling cause-effect chains (cf. Marti, Ermoliev and Makowski 2010). Whether the world is inherently uncertain is a philosophical question that is not pursued here. It is essential to acknowledge in the context of risk assessment that human knowledge is always incomplete and selective and, thus, contingent upon uncertain assumptions, assertions and predictions (Funtowicz and Ravetz 1992; Laudan 1996; Renn 2008: 75). It is obvious that the modeled probability distributions within a numerical relational system can only represent an approximation of the empirical relational system that helps elucidate and predict uncertain events. It therefore seems prudent to include additional aspects of uncertainty (van Asselt 2000: 93–138). Although there is no consensus in the literature on the best means of disaggregating uncertainties, the following categories appear to be an appropriate means of distinguishing between the key components of uncertainty:
- Variability refers to the different vulnerability of targets: the divergence of individual responses to identical stimuli within a relevant population such as humans, animals, plants, landscapes etc.
- Inferential effects relate to systematic and random errors in modeling including problems of extrapolating or deducing inferences from small statistical samples, from animal data or experimental data onto humans or from large doses to small doses etc. All of these are usually expressed through statistical confidence intervals.
- Indeterminacy results from a genuinely stochastic relationship between cause and effect, apparently non-causal or non-cyclical random events, or badly understood nonlinear, chaotic relationships.
- System boundaries allude to uncertainties stemming from restricted models and the need for focusing on a limited amount of variables and parameters.
- Ignorance means the lack of knowledge about the probability of occurrence of a damaging event and about its possible consequences.
The first two components of uncertainty qualify as statistically quantifiable uncertainty and, therefore, can be reduced by improving existing knowledge, applying standard statistical instruments such as Monte Carlo simulation and estimating random errors within an empirically proven distribution. The last three components represent genuine uncertainty components and can be characterized to some extent by using scientific approaches, but cannot be completely resolved. The validity of the end results is questionable and, for risk management purposes, additional information is needed, such as a subjective confidence level in risk estimates, potential alternative pathways of cause-effect relationships, ranges of reasonable estimates, maximum loss scenarios and others. Examples of high uncertainty include many natural disasters, such as earthquakes, possible health effects of mass pollutants below the threshold of statistical significance, regional imp...