Roger Strand
University of Bergen, NOR
On the one hand, modern societies have achieved an impressive level of organization and welfare with their great number of highly differentiated institutions and expertise. On the other hand, we live in a world of global inequity and unfairness as well as massive human impact on the environment, including pollution, the destruction of natural habitats, and the excessive consumption of natural resources. Furthermore, even inside the apparently most successful countries, there are symptoms of distrust in the political system as well as in expertise (De Marchi & Ravetz, 1999), an extreme example of which would be the issue of BSE (“mad cow disease”) in the UK.
The point of departure of this article is Ravetz's (1971) distinction between practical problems, defined in terms of ultimate purposes such as human welfare, and technical problems, defined in terms of specifications such as growth in GNP (gross national product). Clearly, modern societies are characterized by a belief in the strategy of reducing practical problems to a set of technical problems to be handled by the appropriate institutions and expertise. This belief, however, has been accused of implying a nonchalant attitude toward uncertainty and complexity (Funtowicz & Ravetz, 1993, 1994b). Indeed, it seems that the current popularity of the concept of governance within international governmental policy discourse reflects the desire to pay more attention to the broader perspective of the practical problems (Carlsson & Ramphal, 1995; Dahle, 1998), as natural and cultural complexity is seen to render certain strategies of modernity (technological intervention in particular) inadequate or even harmful because of unforeseen adverse effects.
For instance, the Commission on Global Governance defined governance as:
the sum of the many ways individuals and institutions, public and private, manage their common affairs. It is a continuing process through which conflicting or diverse interests may be accommodated and co-operative action may be taken. It includes formal institutions and regimes empowered to enforce compliance, as well as informal arrangements that people and institutions either have agreed to or perceive to be in their interest. (Carlsson & Ramphal, 1995)
Beliefs and discussions at this level of generality—that is, the philosophical justification of our choice of societal organization—are profoundly inexact matters. Actual historical evidence, as well as its digestion into theory in the historical and political sciences, is of course highly important, but hardly sufficient to dictate conclusions on, say, the design of sustainable governance. Our experience is too scarce; moreover, the questions are in part normative. On the other hand, to reject the discussion for being inexact is to run away from political choices that have to be made in any case.
In conclusion, we cannot avoid our decisions being informed by speculative beliefs about the general workings of the world (in my definition, ideology; thus, contrary to some usages, it will not be assumed that ideology necessarily amounts to false beliefs or false consciousness). This is not to say that the decisions or beliefs may be made in isolation from scientific knowledge or subtle philosophical thinking. On the contrary, a striking characteristic of modern societies is the deep influence of science and academic philosophy on ideology.
Of special interest to this article is to describe how the growing understanding of natural and cultural complexity affects and should affect our ideological basis for governance. This is no easy task, because definitions, notions, and understandings of “complexity” abound. Within some discourses and practices complexity is a well-defined property, simulated by computers, managed by experts, and sometimes even quantitatively measured. At the other extreme, there are discourses in which the word complexity would stand for the quality of allowing neither adequate scientific description nor technological control. The strategy of this article is not to search for one all-encompassing definition, but rather to see the different notions of complexity as different and individually important departures from notions of simplicity.
Most, if not all, usages of the word “complex” imply some contrast to “simple.” Indeed, the belief that practical problems may successfully be reduced to a set of technical problems is already a claim of simplicity, in the sense that it is assumed that the practical problem only has a limited number of relevant aspects, that these may be sufficiently understood and controlled, and that the whole (practical problem) is “nothing more than the sum of the (technical) parts,” so to speak. This kind of reductionism has deep roots in western intellectual traditions, going back to Ancient Greece, through philosophical thinking related to and inspired by the scientific revolution in the seventeenth century and all the way up to our time. A comprehensive treatment of these traditions requires years of study; for our purpose it will suffice to sketch a stereotype of the worldview of simplicity that emerged from them. I call it the simple view.
Imagine a person who believes firmly in the excellence of modernity, Enlightenment, natural science, and western traditions of (secular) thinking in general. We might picture him as male; jokingly, we may say that this is a person who believes that there is a rational and objective answer to most questions, and that he knows quite a few of these answers himself. I suggest the following fiction as a possible worldview of this person:
I should stress that this description is a sketch of a stereotype and not a summary of any particular philosopher's position. In fact, the history of philosophy documents the immense difficulties and contradictions when working out the details of this worldview, such as the tension between rationalism and empiricism. We may even note a pragmatic inconsistency in the entire tradition of thinking that produced the simple view, concluding with precepts of epistemological and methodological soberness while itself unfolding within a discourse of massive speculation.
The celebrated example is the grand finale of David Hume's (1748) Enquiry Concerning Human Understanding, in which he invites us to consider any book and ask (apparently unaware that his own book would certainly not withstand these criteria):
Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning concerning matter of fact and existence? No. Commit it then to the flames: for it can contain nothing but sophistry and illusion.
The positive conclusion to be drawn is that there are ways out of the simple view even from within its own philosophical tradition, and of course there are alternative philosophical traditions that deal more directly with complexity. A full review of these sources is impossible within the scope of this article. Instead, I will briefly describe a handful of scientific, scholarly, and philosophical developments that cast light on and challenge the simple view. We shall see that although all these developments in some sense add to our understanding of complexity, they cannot easily be integrated into a single concept or theory. However, I will try to show, in a possibly eclectic style, that quite a few of these insights may have implications for governance.
The present use of expressions such as “the sciences of complexity” generally refers to practices such as the study of self-organized critical behavior, cellular automata, agent-based modeling of all sorts, artificial life, and sometimes the study of chaos or fractal geometries. Clearly, these practices have produced some powerful insights into the limitations of the simple view. Naïve belief in the strong law of causality has been replaced by an understanding that many systems are sensitive to change in initial conditions and more generally display dynamics unsuitable for description through equilibrium considerations or perturbation techniques.
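To make the latter point concrete, here is a minimal sketch (my illustration, not part of the original article) of sensitive dependence on initial conditions, using the logistic map, a stock example from the chaos literature:

```python
# Sensitivity to initial conditions in the logistic map x' = r*x*(1-x);
# r = 4.0 lies in the chaotic regime. (Illustrative example, not the
# article's own; parameters chosen for demonstration.)

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two initial conditions differing by one part in a billion.
a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}:  |a - b| = {abs(a[n] - b[n]):.3e}")
# After roughly 30 steps the difference is of order 1: the initial
# discrepancy has been amplified until the two futures are unrelated.
```

Two starting points differing by one part in a billion yield effectively unrelated trajectories within a few dozen steps, which is why point prediction fails even for this trivially simple, fully deterministic system.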
Apart from this, however, not much of the simple view has been challenged from within the sciences of complexity. Individual researchers such as Robert Rosen (1991) have questioned aspects of the (above defined) simple physics and metaphysics, notably the idea that a limited set of state functions and laws of efficient causality can provide an exhaustive description of the external world. Such views are hardly mainstream within the field; rather, the internal discussion often takes reductionism for granted (in the sense that macro-phenomena emerge out of “nothing but” mechanistic interactions at the lowest level). Instead, the argument is about the methodological prospects of top-down modeling: that is, the extent to which nonlinear systems display emergent simplicity (Stewart & Cohen, 1994), or the general prospects of predictability and control of nonlinear systems (Casti, 1990, 1997).
Furthermore, as far as mainstream practices of “complex systems” are concerned, they definitely operate within a mechanist frame. In the case of agent-based models, for example, they are sometimes even unnecessarily mechanist and reductionist, with fixed, nonadaptive, bottom-level rules of behavior (reviewed in Gross, 2001). What emerges out of the sciences of complexity is thus little more than thin complexity, a notion of complexity that basically is compatible with the simple view if the latter revises some of its methodological prescriptions for science. Nature has seams, but they are finer, more intertwined, and not in straight lines.
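The kind of practice just described can be illustrated with a deliberately crude sketch (mine, with an arbitrarily chosen rule, not a model from the literature cited): each cell follows one fixed, nonadaptive bottom-level rule, yet coherent macro-patterns emerge from random initial conditions.

```python
import random

# A deliberately crude illustration of the mechanist mold described above:
# each cell obeys one fixed, nonadaptive bottom-level rule (adopt the
# majority state of its neighborhood), yet coherent macro-domains emerge.

random.seed(1)
N, STEPS = 60, 30
cells = [random.choice(".#") for _ in range(N)]
print("initial:", "".join(cells))

def majority(left, mid, right):
    """Fixed local rule: take the majority state of the three-cell neighborhood."""
    return "#" if (left, mid, right).count("#") >= 2 else "."

for _ in range(STEPS):
    cells = [majority(cells[i - 1], cells[i], cells[(i + 1) % N])
             for i in range(N)]

print("final:  ", "".join(cells))  # contiguous domains, not the initial mix
```

The emergent domains may be striking, but everything about the model remains mechanistic and bottom-up: thin complexity in the sense just defined.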
Decades before the advent of the sciences of complexity, quantum mechanics (QM) had already challenged aspects of the simple view, since the metaphysics of the latter was more or less directly imported from the worldview of seventeenth- and eighteenth-century physics (classical mechanics). Thus, when the physics changed, the old philosophical theories got into trouble.
For an adequate presentation of this issue, the reader should consult the designated literature, for instance the philosophical writings of the Danish physicist Niels Bohr (1958, 1963). However, at the risk of oversimplification, we may say that QM as interpreted by Bohr (the so-called Copenhagen interpretation) challenges notions of simple correspondence between physical theory and an objective, observation-independent, microscopic universe of elementary particles. In Bohr's words, there "is no quantum universe": QM is not about the world as such, but about what we can say about observations of it. The measurement design (and hence the observer) must be included in the physical description. Bohr even suggested a universal "principle of complementarity," that is, that any phenomenon eludes complete description because any act of observation destroys the prospects of information from certain other perspectives.
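One quantitative trace of complementarity (my illustrative gloss, not a formulation from Bohr's essays) is Heisenberg's uncertainty relation for the complementary pair of position $x$ and momentum $p$, where $\hbar$ is the reduced Planck constant:

\[
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
\]

Any experimental arrangement that sharpens the one necessarily blurs the other; no single setup delivers the complete, observer-independent description that the simple view presupposes.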
The debates on QM are perhaps not settled yet. What is clear, though, is that after the entry of QM, certain naïve beliefs about the nature of the universe and our knowledge of it cannot any longer claim to be a direct and obvious consequence of physical science. Indeed, our best physics seems to imply that the external world as well as its relationship to the knowing and observing subject might be anything but simple. Part of Bohr's effort has been to introduce a distinct notion of complexity: complementarity, or the presence of incompatible perspectives.
Thin complexity is a property of a (mechanical) system, while the complexity suggested by QM in part resides in the relationship between system and observer. To understand the relationship between system and observer, or, almost equivalently, object and subject, matter and mind, thing and idea, has been recognized as a difficult task ever since Plato. The simple view disregards the problem by assuming a sharp dualism, which, however, leaves thoughts and intentions with a double, unsettled status. Within the knowing subject, they indeed play the primary role, forming the space in which knowledge and claims of knowledge exist and arguments unfold. As properties of the external, studied world, though, they are epiphenomena that reflect cerebral, mechanical processes. The working hypothesis of academic institutions in the modern world has been that we may treat these two aspects in turn, letting the natural sciences explain nonintentional objects and processes, while the humanities deal with intentions.
This division of labor is to some extent defensible. For instance, the pioneer physiologist Claude Bernard believed that every human disease could in principle be explained in terms of a physiological imbalance, but warned against the "false application of physiology" (Bernard, 1865: 199) in its present imperfect state. Thus, if Bernard had lived to see Thomas and Thomas's theorem, he would probably not have denied it: "If men define situations as real, they are real in their consequences" (Thomas & Thomas, 1928). He would merely insist that the cognitive act of defining the situation can itself be explained as a complicated set of cerebral phenomena in the objectively existing world.
We will of course not live to see the completion of a full explanation of any such set of cerebral phenomena. Indeed, the successes of the natural sciences and the humanities have been due to their skillful choice (or construction) of "pure" objects of investigation. Whenever mind and body, subject and object, intentions and matter, nature and culture mix more profoundly, as in psychology and the social sciences, research becomes vastly more difficult.
The trouble is that most practical problems in the realm of governance do indeed involve such mixtures. This is particularly true of anthropogenic environmental problems. For instance, no comprehensive explanation of the isotopic composition of the atmosphere after 1950 can ignore the role of the highly cultural "balance of terror" in the production and detonation of hydrogen bombs. Indeed, a profound characteristic of our time is that the human impact on nature is larger than ever, and prediction by means of, say, biological knowledge has become more conditional on underlying assumptions about human behavior. Applying Thomas and Thomas's theorem, human intentions and values have become very real to the entire biosphere. Thus, Funtowicz and Ravetz proposed a distinction between ordinary complex systems, displaying thin complexity, and emergent complex systems, in which intentionality, bounded as well as unbounded, exists as an objective feature (Funtowicz & Ravetz, 1994a).
One can see many attempts to cope with emergent complexity, either by combining methods from the natural and social sciences (environmental sociology, ecological economics, etc.) or by developing entirely new methodologies, such as the so-called post-normal science (Funtowicz & Ravetz, 1993) and actor-network theory. The latter is interesting in this respect since it tries to overcome the shortcomings of dualism by insisting on analytical symmetry between intentional and nonintentional "actants." In this way, Bruno Latour and others have studied the relationship between the intellectual separation of the natural and cultural domains and the simultaneous, ever stronger intertwining of the two by means of technology, forming "hybrids" in the words of Latour (1988, 1993, 1998). Needless to say, however, an analytical framework that does not distinguish between intentional and nonintentional entities is not without its own problems.
I shall now briefly reflect on the role of science in the creation of emergent complexity in modern society. First, a great deal of scientific practice is heavily involved with the development of technology for the isolation, control, measurement, and even production of natural phenomena (Hacking, 1983). Technology cannot be de-invented afterwards, and as such scientific research is de facto an intervention in the world. Secondly, scientific investigation entails methodological choices of perspective: What is to be studied, by what methods, for what purpose? The (historical, philosophical, and sociological) science studies of recent decades have shown how such choices are neither trivial nor made in a cultural or societal vacuum (see, e.g., Longino & Keller, 1996; Pickering, 1992). Indeed, many theorists have called attention to the complexity of scientific research. Already Bachelard (1934) saw how the world and our knowledge of it move together through the construction of what he called "phenomenotechnique"; Polanyi (1964) described and Ravetz (1971) elaborated how craftsmanship and tacit quality judgments play an integral part in the transformation of a scientific finding into a fact. Pickering (1995) showed how the influences between the scientific and the extra-scientific domains go in both directions in an open-ended fashion. Thus, he insisted on the temporal and indeterministic character of history, including the history of science.
These insights depart from the simple view toward a notion of complexity in the relationship between knowledge and action: Knowledge, however true or objective, cannot be thought of as something entirely outside the realm of action. We have to choose what we want to know, imposing a context; research per se often irreversibly changes the world through its invention of technology; and the course of this development is not inevitable, but has a historical character. In that case, the passion for knowledge cannot be excused from moral consideration. It is not even safe to distinguish sharply between facts and values, since the methodological setup for the production of a fact may embody disguised value commitments. This ought to be clear from the debates on, say, the scientific studies of racial differences among humans.
These problems are difficult, and simple solutions invariably fail, such as the attempt of Stalin and Lysenko to replace Mendelian genetics with Lamarckism as a politically correct biology in the Soviet state, or the standpoint epistemologies of the Marxists and feminists of the 1968 generation. Indeed, in the academic disciplines that have been most sensitive to the aspect of contextuality, one has seen the growth of reflexivity as a methodological virtue, typically manifested in an introductory chapter where the author discusses their own background as a possible source of bias and reason for distrust. Needless to say, the practice of reflexivity is still very infrequent in the natural sciences, although, for example, the British Medical Journal now requests that scientific authors declare their own vested interests.
Finally, the theoretical developments dubbed poststructuralism show aspects of complexity that depart from the semantics as well as the anthropology of the simple view. Thus, from Derrida's writings we may learn that the meanings of signs not only depend on other signs and the structure of the entire language, but that this body is itself subject to change through use. The meanings of words are never entirely fixed; and you cannot define yourself out of that situation without defining yourself out of the language community.
Other things than language may be understood in terms of interdependence and delocalization. Neural networks are one example (Cilliers, 1998). Transferred to philosophical anthropology, this type of perspective is not even necessarily controversial. Although Descartes could commit himself to prolonged meditations in which he doubted whether he was awake or dreaming (adopting what has been called the persona of the idiot; Deleuze & Guattari, 1994), such a situation is rather atypical of the human condition, in which the anything but simple relationships to others (emotions, bonds, responsibilities, care) play a constitutive part. By common sense it seems that a substantial part of the agency of any human individual is delocalized in this sense. The effort of the poststructuralist move has been to show how this may also be the case for natural language. Thus, what is suggested is a notion of complexity that entails types of interdependence and delocalization that render analysis by parts a risky venture. We may use the analytical knife to obtain absolute distinctions between knowledge and action, facts and values, my will and that of others, but this invariably happens at some cost.
Above I have argued that the various elements of the simple view are being called into question or even denied by cutting-edge knowledge from a host of disciplines, ranging from physics to literary theory and philosophy. Taking together the whole range of complexity notions—that is, thin complexity, complementarity, emergent complexity (sensu Funtowicz & Ravetz), the intertwining of nature and culture, contextuality, and delocalization—could one not unify them into a definition of thick complexity?
One could, of course, but I doubt that the idea makes much sense. The list in this article is by no means exhaustive. We could also have discussed other facets of complexity visible in other disciplines, such as biology or psychology, or indeed in ordinary life-world experience. It seems that thick complexity would amount to the negation of the simple view in its absolute form; we are then fairly close to merely making noise, saying that there are no absolute truths or structures or principles. Thus, there may be a perfectly true statement that affirms the existence of thick complexity (in which case it is opened up to a sly discussion of self-referential problems), but perhaps no productive concept.
What we are left with, then, are the different shades of gray from the simple to the ungraspable, all with their potential utilities and disadvantages for governance, the evaluation of which is in itself a judgmental and contextual matter, though possibly also one with some systematic features. We shall turn to these now.
If we imagine the simple view to be true, governance becomes simple: it simply becomes the modern blend of representative democracy and technical expertise. The public expresses certain general value preferences by electing politicians who represent these values. The experts present objective information on the present state of affairs and delineate the technical means to achieve the desired values. Uncertainty in background information and in the outcome of chosen policies can be expressed as quantitative risk and managed in a rational way by risk-cost-benefit analysis. After the politicians have decided on the issues of value (and some meta-issues such as the level of risk aversion/risk acceptance), the required expertise may realize the technical plans.
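Under that assumption the decision calculus is straightforward. The following sketch (an illustration of textbook risk-cost-benefit reasoning with invented policies and numbers, not a method from this article) simply picks the policy with the highest expected net benefit:

```python
# Textbook risk-cost-benefit analysis, as the simple view imagines it:
# every outcome has a known probability and a known value, so the
# "rational" policy is the one with the highest expected net benefit.
# (Policies, probabilities, and values below are invented for illustration.)

policies = {
    "build plant":  [(0.95, 100.0), (0.05, -500.0)],  # (probability, net benefit)
    "retrofit old": [(0.99,  40.0), (0.01, -100.0)],
    "do nothing":   [(1.00,   0.0)],
}

def expected_benefit(outcomes):
    """Expected value: sum of probability-weighted payoffs."""
    return sum(p * value for p, value in outcomes)

best = max(policies, key=lambda name: expected_benefit(policies[name]))
for name, outcomes in policies.items():
    print(f"{name:13s} expected net benefit: {expected_benefit(outcomes):7.2f}")
print("chosen policy:", best)
# The whole calculus collapses once the probabilities themselves are
# unknowable, which is the situation under thin complexity, discussed next.
```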
Let us assume that our practical problem involves a system that displays thin complexity, that is, richness in connections and nonlinearity far from equilibrium. Prediction and control of the system are then sometimes possible, and at other times not. In particular, if there is unprecedented, large-scale human impact on the system, transitions to hitherto unseen system behavior may occur, rendering prediction and control a particularly difficult task, even if accurate information about the present state of the system is available (Gross & Strand, 2000). Rational governance under thin complexity accordingly requires a critical attitude toward the quantification of uncertainty in terms of risk (Wynne, 1992). In particular, in cases where the strong law of causality cannot be expected to hold for the predicted outcomes of the policy alternatives, the rational justification for risk-cost-benefit analyses is no longer valid. The question then becomes: What is the rational strategy in the presence of thin complexity?
There are at least three answers to this question. The first is that we do not know yet, but it might be a good idea to consider some principle of precaution (European Community, 1997). Another possibility is to retain the vision of governance of the simple view and try to develop generalized forms of prediction and control. For instance, climate modelers know that the weather is chaotic, but work under the assumption that climate parameters are not. More generally, models of complex systems, notably agent-based or system dynamics models, are developed and sold as policy/management devices with the ambition to provide understanding and some kind of prediction or control of the qualitative dynamics of the modeled system. Belonging to the same category are the various attempts to provide "management on the edge of chaos" and "chaos pilots" (see http://www.kaospilot.dk).
The third option is to take thin complexity to have truly radical implications. I have described the simple view as a list of separate ideological components, but these are of course not independent. Indeed, the insight that the behavior of many systems eludes prediction beyond qualitative behavior shakes the foundations not only of risk-cost-benefit analysis but also of mainstream decision theory, the idea of the rational agent, most of academic economics, and utilitarian moral philosophy. One could argue that modernity's priority of reason over passion is justified only by the possibility of obtaining hard facts, of replacing ignorance and uncertainty with facts about risks and certainties. When hard facts are in principle unavailable, we are back to square one.
Thus, there is a natural (though not unarguable) line of thought from reflections on thin complexity to other aspects of complexity such as value-ladenness and contextuality, to be recognized in the writings of Funtowicz and Ravetz (1993, 1994a, 1997), Stacey (2000), Stengers (1997; Prigogine & Stengers, 1984), and others, leading us to the question of governance under thick complexity.
An initial observation is that we should not expect to find a unique, optimal theoretical solution to the problem of governance under thick complexity. The idea of unique and universal principles of guidance could be justified by the simple view; without that ideology, we cannot a priori expect more than piecemeal, pragmatic, and imperfect solutions, possibly with the exception of Stoic or Zen-inspired attempts to extinguish our desire for control or indeed governance of our own destiny. In more ordinary language, we cannot expect to succeed by thinking ourselves away from the fact that the world is a mess.
A conservative option would be to adopt the vision of business management under thick complexity expounded by Ralph Stacey, who clearly sees that the worldview of thin complexity has to be enriched with an understanding of complementarity and contextuality. Thus, for him, the implication faithful to complex systems theory is not the question "How can I govern the system into a new attractor (desired by me)?" but rather "What is my role in this system, and how do my actions and those of others affect the system?" (Stacey, 2000). However, the frame of his discussion is essentially one of enterprise success, which remains a technical problem in so far as the activity of the enterprise is defined. This is different from the typical governance situation, which is constituted not only by the substantive inclusion of the actors, but also by the myriad of legitimate opinions on what counts as success and indeed on what the issue really is.
A somewhat cruel interpretation of the political life of many countries is that the simple view still apparently serves as the ideological justification for a number of institutions and practices, although the individual citizen, politician, and expert to a large degree recognize the prevalence of complexity. How can a false view serve as a justification?
My clue to an explanation lies in the question of what it means not to feel like an idealist. Imagine that you confront a natural scientist working in an applied field with the uncertainties inherent in his practice. At first, he might utter something like the simple view: We are heading for the truth, we just need more data and better models, and so on. However, when you remind him of such factors as the irreducible uncertainties in his system and the contextuality of his methodology, the defense typically changes: What else should we do? We do not know any other way. We know this is wrong, but there is no better alternative.
To me, this phenomenon (which I have often encountered) seems to hit an essence of late modernity: The simple view is retained exactly because of an awareness of complexity. In the political arena, many former radicals have turned into low-key voices, possibly because they saw how the great revolutions went: They did not have an awareness of complexity and went horribly wrong when things went contrary to the plan. Ravetz (2001) has dubbed this kind of phenomenon “safety paradoxes.” We know that pesticides are unsafe; still, to stop them and upset the economy might be even more unsafe. We know that technical risk assessments cannot represent the uncertainties and complexity of the issues; but a general admittance of that fact might generate fear, instability, and accordingly more danger. This is how the simple view can be operative even if nobody believes in it: It is judged unsafe to stop pretending (Blasco & Strand, 2001).
The pretending game of desperate modernity is intellectually dishonest and hypocritical. However, under thick complexity probably every solution to the problem of governance will be imperfect as judged by some cognitive or moral standard, and so its real test will be whether it is pragmatically acceptable. I can sometimes accept the ill-founded risk assessments made by my medical doctor if I believe them to provide part of an acceptable personal healthcare regime (see also Rortveit & Strand, 2001). Also, the use of technocratic forms of governance can be justified as a pragmatic way out of political conflicts and stalemates (Porter, 1995). Thus, under thick complexity, criticisms of desperate modernity will have to be based on experience of harm and injustice to living people, future generations, or other moral objects. More specifically, they will have to be based on harm and injustice for which the strategies of desperate modernity can be blamed.
This insight immediately shows why the discussions about government and governance are so difficult. Not only will there be a thousand voices with their conflicting expressions of harm and benefit. There will also be complex cognitive issues about the necessity of inflicted harm. For instance, some will see the “green revolution” as a technocratic disaster that ruined agricultural traditions and infrastructure and thus the long-term potential for food production and survival, while others consider it a great technological success that saved millions from hunger. Their prospects of arriving at a consensus are small: Typically, they disagree about values, contexts, and methodology for the evaluation of each other's claims. The present discussion about genetically modified plants experiences a similar stalemate of cognitive incommensurability (Strand, 2001).
The institutions of modernity are not designed to tackle cognitive incommensurability. In fact, we see here two possible justifications for the abandonment of desperate modernity: in substantive claims that the strategies of desperate modernity inflict unnecessary harm or injustice, for instance between rich and poor countries, or toward the natural environment; and in the mere observation that desperate modernity leads to unsolvable struggles, incommensurability, and distrust. I shall now leave the question about the strength of these justifications and instead characterize possible visions of governance to which they may lead.
A central tenet of the simple view was that the straightforwardness of the external world and our relation to it allowed informal and subjective judgment to be replaced with the universal rules and principles of modern science and society. In light of complexity, this tenet no longer holds. Instead, old and new principles (such as those below) will have to be decided on with judgment and with respect to the practical problems at stake.
Most visions of governance include new forms of direct, deliberative, or participatory democracy, including ordinary citizens to a larger extent than in representative democracy. This can be seen as a method for increasing public trust in the institutions, but also as a way to improve the quality of decisions. For instance, in the governance of biotechnology, the relative emphasis on an ecological perspective or a molecular-biological one may be quite important (Strand, 2001) and is hardly a purely "scientific" matter. Rather, it is a choice with normative aspects, and as such it belongs to the public domain.
Thus, the extension of the peer community to include nonexpert (“lay”) participants has been proposed (Funtowicz & Ravetz, 1993). I have met a few academics who were horrified by this proposal, arguing that non-academics cannot be trusted to make the right decisions. I interpret the feeling of horror as a partial understanding of thick complexity; what remains to discover is that academics cannot be trusted to know, and that part of the issue at stake is the choice of criteria for what counts as proper knowledge and a right decision (and for whom).
Quite a few (if not all) of our present, large-scale practical problems are manmade in a strong sense: They are byproducts of the rapid development of technological sophistication and the concurrent population growth. Some of these problems may have no “solution” other than learning how to live with them. For instance, many animal and plant species are already extinct or their natural habitats are all destroyed. An important aspect of governance is thus to impede the creation or escalation of practical problems.
Stories of the creation of problems display elements of thick complexity: unpredictability, open-endedness, novelty, and interpretive/value incommensurability. For instance, the Earth is now a place where the deliberate, rapid mass destruction of society and the biosphere can be achieved. For the sake of argument, let us take this as an example of a development toward being worse off (bracketing the possible but controversial benefits of civil nuclear power). It seems reasonable to expect that there may be more instances of nuclear warfare in the long run, with massive damage and tragedy. Indeed, the luck of the twentieth century appears to be the immense technical difficulty of making nuclear bombs in a world of terrorism. However, our concepts of (and institutions for) guilt or blame, which are historically and philosophically related to simple, linear notions of causality, appear largely irrelevant and powerless with respect to this unfortunate development as such (although they may be highly relevant to various acts, such as the bombing of Nagasaki). The scientific discoveries of Einstein, Bohr, Szilard, and others "simply" increased our physical knowledge of the world. The development of nuclear arms technology happened inside a tragic dynamic, and it was by no means obviously wrong for the US to initiate the Manhattan Project on the suspicion of German (Nazi) efforts to do the same.
How can we impede the creation of new problems, such as new, powerful but cheap weapons (for instance biotechnological terrorism)? What institutions of governance need to be developed, and what insights or ideologies will they have to rest on? I see three alternatives, none of which excludes the others. First, one may enforce and scale up the present strategies, that is, legislation and legal action against “misuse” of technology. The problems with this are well known. Above all, the inflicted harm can sometimes be so large that there can be no legal compensation for the tragedy.
Secondly, it seems that many stories of problems begin with discoveries of simple principles in the physical world that can be reified into powerful technologies. In the tradition of thought flowing from Francis Bacon (1620) it has been taken for granted that the marginal utility of scientific discoveries is always positive; under thick complexity we know that it may be otherwise (Strand & Schei, 2001). Developing the knowledge that allows the construction of transgenic organisms should probably be judged neither immoral nor illegal; nevertheless, it leads onto a trajectory that might be horribly unwise from the viewpoint of ecological complexity. We will have to give up unconditional love for and trust in science, replacing it with continuous distrust and doubt about the desirability of research efforts. In light of the above discussion on lay participation, scientific research projects are not justified per se as a production of truth; they should also be accepted by the public as having quality.
Thirdly, and more generally, reflexivity seems to be required as a continuous exercise all along processes of governance, since we now are aware of how easily the ultimate good slips out of sight or becomes perverted by terrible byproducts of our interventions. This means that under thick complexity, we should make even more use of the highly modern virtue of criticism, encouraging and giving more attention to “trouble makers.” We may recall how Ivan Illich's (1975) famous question surprised the medical doctors: He knew about the progress of medical knowledge and technology, but were the patients getting any better?
The primary mode of action under conditions of simplicity is finding the efficient alternative and performing it. Under thick complexity, we know that quite a few apparently rational actions are going to have tragic consequences. The rational strategy under such conditions is to search for qualitative principles rather than to make calculations that will in any case be inadequate.
This kind of situation is not new at all. Indeed, in any sane person's development, he or she learns to set aside impulses toward personally rewarding acts for a variety of reasons. Morality and law do not exhaust this domain of cultural convention: There are rules of etiquette, habits, taboos, and also cultural patterns of attention. For instance, in some cultures there was apparently much less attention to finding ways to control nature than was the case in Europe.
Taboos might be useful devices for the cultural learning of complexity (Giner & Tàbara, 1999). Indeed, one might speculate whether this was their origin in the prehistoric past. One somehow acquired an imprecise suspicion that it was unhealthy to eat certain foods, or that there were medical and/or social problems related to sexual relationships between brother and sister. Then, over time, some of these vague, precautionary conventions may have developed into rules of etiquette and finally into a matter of moral disgust. In fact, there seems to be some public disgust for, say, human cloning or nuclear power. Such emotions might play a legitimate and important role in governance.
Everything that has been proposed here in reality implies more talk about the practical problems and less action in the sense of technical intervention. Indeed, when we are aware of the complex relationships between meaning, language, and the world, for instance in the unequal distribution of the powers of language, we will even have to talk about how we talk about things in processes of governance. The processes will be slower and less efficient. Also, even the efforts of intervention will be less efficient, since the exercises of reflexivity will lead to more doubt and possibly less devotion. Even the exercise of doubt itself will often feel idle since the future may be unpredictable anyway.
The insights of complexity suggest a transformation toward a society that collectively chooses hesitant and inefficient behavior, following passions to say "stop" even when there seems to be no explicit reason for it, declining proposals of novel food technology even if that might have stopped the next famine in Sudan. It is no easy way.
I do not know how the transformation to the governmental practices of thick complexity is to come about. Not only will their mere nature make them “uncompetitive” from the perspective of the endorsed values in present political discourse. It may even be that, say, environmental catastrophe really cannot be stopped without rapid and massive development of novel technology, implemented in a technocratic and unprecautionary manner. In that case, I think we are doomed.
The author wants to thank Hall Bjørnstad, Gay Bradshaw, Dominique Gross, Rune Nydal, Helge Olsen, J. David Tàbara, and the anonymous reviewers for valuable contributions.
Bachelard, Gaston (1934) Le nouvel esprit scientifique, Paris: Les Presses Universitaires de France.
Bacon, Francis (1620; 1994) Novum Organum; with Other Parts of the Great Instauration, Chicago: Open Court Publishing.
Bernard, Claude (1865; 1957) An Introduction to the Study of Experimental Medicine, New York: Dover Publications.
Blasco, Jaume & Strand, Roger (2001) "La incertidumbre completamente normal," Ecología Política, 22: 7-16.
Bohr, Niels (1958; 1987) Essays 1932-1957 on Atomic Physics and Human Knowledge, Woodbridge, CT: Ox Bow Press.
Bohr, Niels (1963) Essays 1958-1962 on Atomic Physics and Human Knowledge, New York: Interscience Publishers.
Carlsson, Ingvar & Ramphal, Shridath (1995) Our Global Neighbourhood: The Report of the Commission on Global Governance, Oxford, UK: Oxford University Press.
Casti, John L. (1990) Searching for Certainty: What Scientists Can Know about the Future, New York: William Morrow.
Casti, John L. (1997) Would-be Worlds: How Simulation Is Changing the Frontiers of Science, New York: Wiley.
Cilliers, Paul (1998) Complexity and Postmodernism: Understanding Complex Systems, London: Routledge.
Dahle, Kjell (1998) “Toward governance for future generations: How do we change the course?,” Futures, 30: 277-92.
Deleuze, Gilles & Guattari, Félix (1994) What Is Philosophy?, London: Verso.
De Marchi, Bruna & Ravetz, Jerome R. (1999) "Risk management and governance: A post-normal science approach," Futures, 31: 743-57.
Descartes, R. (1641; 1960) Meditations on First Philosophy, New York: Bobbs-Merrill.
European Community (1997) “Consolidated version of the Treaty establishing the European Community,” EC Official Journal, C340(10 Nov.): 173-308.
Funtowicz, Silvio & Ravetz, Jerome R. (1993) “Science for the post-normal age,” Futures, 25: 739-55.
Funtowicz, Silvio & Ravetz, Jerome R. (1994a) “Emergent complex systems,” Futures, 26: 566-82.
Funtowicz, Silvio & Ravetz, Jerome R. (1994b) “Uncertainty, complexity and post-normal science,” Environmental Toxicology and Chemistry, 13: 1881-5.
Funtowicz, Silvio & Ravetz, Jerome R. (1997) “The poetry of thermodynamics,” Futures, 29: 791-810.
Giner, Salvador & Tàbara, J. David (1999) "Cosmic piety and ecological rationality," International Sociology, 14: 59-82.
Gross, Dominique (2001) Natur in Silico, Berlin, Germany: Mensch-und-Buch-Verlag.
Gross, Dominique & Strand, Roger (2000) “Can agent-based models assist decisions on large-scale practical problems? A philosophical analysis,” Complexity, 5: 26-33.
Hacking, Ian (1983) Representing and Intervening: Introductory Topics in the Philosophy of Natural Science, Cambridge, UK: Cambridge University Press.
Hume, David (1748; 1910) An Enquiry Concerning Human Understanding, New York: P. F. Collier & Son.
Illich, Ivan (1975) Medical Nemesis: The Expropriation of Health, London: Calder & Boyars.
Latour, Bruno (1988) The Pasteurization of France, Cambridge, MA: Harvard University Press.
Latour, Bruno (1993) We Have Never Been Modern, New York: Harvester Wheatsheaf.
Latour, Bruno (1998) “To modernise or ecologise? That is the question,” in Bruce Braun & Noel Castree (eds), Remaking Reality: Nature at the Millennium, London: Routledge: 221-42.
Longino, Helen E. & Keller, Evelyn F. (1996) Feminism and Science, Oxford, UK: Oxford University Press.
Pickering, Andrew (1992) Science as Practice and Culture, Chicago: University of Chicago Press.
Pickering, Andrew (1995) The Mangle of Practice: Time, Agency and Science, Chicago: University of Chicago Press.
Plato (1996) Parmenides, trans. M. L. Gill & P. Ryan, Indianapolis, IN: Hackett.
Polanyi, Michael (1964) Science, Faith and Society, Chicago: University of Chicago Press.
Porter, Theodore M. (1995) Trust in Numbers: The Pursuit of Objectivity in Science and Public Life, Princeton, NJ: Princeton University Press.
Prigogine, Ilya & Stengers, Isabelle (1984) Order out of Chaos: Man's New Dialogue with Nature, Toronto, Canada: Bantam Books.
Ravetz, Jerome R. (1971) Scientific Knowledge and its Social Problems, Oxford, UK: Clarendon Press.
Ravetz, Jerome R. (2001) “Safety paradoxes,” Journal of Hazardous Materials, 86: 1-16.
Rortveit, Guri & Strand, Roger (2001) “Risk, uncertainty and ignorance in medicine” (in Norwegian), Tidsskrift for Den Norske Laegeforening, 121: 1382-6.
Rosen, Robert (1991) Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life, New York: Columbia University Press.
Stacey, Ralph D. (2000) Strategic Management and Organizational Dynamics: The Challenge of Complexity, Harlow, UK: Financial Times/Prentice Hall.
Stengers, Isabelle (1997) Power and Invention: Situating Science, Minneapolis, MN: University of Minnesota Press.
Stewart, Ian & Cohen, Jack (1994) “Why are there simple rules in a complicated universe?,” Futures, 26: 648-64.
Strand, Roger (2001) “The role of risk assessments in the governance of genetically modified organisms in agriculture,” Journal of Hazardous Materials, 86: 187-204.
Strand, Roger & Schei, Edvin (2001) “Does knowledge hurt?” (in Norwegian), Tidsskrift for Den Norske Laegeforening, 121: 1502-6.
Thomas, William I. & Thomas, Dorothy S. (1928) The Child in America: Behavior Problems and Programs, New York: Knopf.
Wynne, Brian (1992) “Uncertainty and environmental learning: Reconceiving science and policy in the preventive paradigm,” Global Environmental Change: Human and Policy Dimensions, 2: 111-27.