
Phenomeno-semantic complexity: A proposal for an alternative notion of complexity as a foundation for the management of complexity in human affairs*


Abstract



This paper proposes a novel notion of complexity, derived from the process of semantic and syntactic transformation of messages as communicated between sense-making actors. It distinguishes between complexity driven by syntactic transformation and that driven by semantic transformation. The latter is further divided into first- and second-order transformation, the former relating to the observing system and the latter to the observed system of an inquiry. The proposed notion of complexity is juxtaposed with the mainstream notion of complex systems, typically understood as the interaction of adaptive agents governed by local rules, yet giving rise to novel, unpredictable global behavior. The proposed notion of phenomeno-semantic complexity is illustrated with some examples taken from real-life human affairs. The proposed and conventional notions of complexity are regarded as complementary, yet they require different research strategies; some suggestions for further research are therefore put forward here.

Introduction

This paper presents intermediate findings from ongoing research into the management of complexity in human affairs. A distinction between ontological syntactic complexity and phenomenological semantic complexity is introduced, where the former refers to conventional approaches to complexity studies, such as the interaction of adaptive agents governed by local rules giving rise to unpredictable global behavior. The latter, on the other hand, refers to complexity that emerges in the process of communication between sense-making actors, resulting in syntactic and/or semantic transformations, and leading to unpredicted complex behavior in human affairs. This distinction gives rise to promising implications for the practical management of complexity in human affairs in general, and so-called knowledge-intensive enterprises in particular. One such central implication of the novel notion of complexity introduced here is the switch of attention from algorithmic simulation resources to linguistic and communicative models and systems supporting human communication processes.

The text continues with a brief characterization of the contemporary mainstream approach to studies of complexity, leading to identification of some of its key characteristics. The proposed model of phenomeno-semantic complexity is then introduced. This model serves as a foundation for distinguishing the complexity arising from syntactic and semantic transformations in the communication process between sense-making actors. The latter is further divided into first-order and second-order semantic complexity, referring to the observing system and the observed system respectively. Some illustrations from real-life human affairs are then presented, while suggestions for further research and conclusions end the paper.

Conventional notion of complexity: Onto-syntactic

An overview of the conventional notions of complexity is presented in this section, ending with an identification of some of their key characteristics. These will be used to contrast with the notion of complexity introduced here, and will thus contribute to the articulation of its novelty and value.

Outlines of the contemporary theory of complexity

In his classical paper “Science and complexity,” Weaver (1948) distinguishes between three classes of phenomena: simple phenomena addressed by science by means of classical mechanics, those of disorganized complexity that are addressed with thermodynamics, and finally those of organized complexity, which could not yet be successfully addressed by science.

Since that time, a large number of proposals have been put forward regarding the notion of complexity, its measurement, and approaches to its management. Early examples include attempts by the founders of cybernetics and the systems sciences, such as Ashby's (1962, 1964, 1965) "law of requisite variety," where complexity is understood in terms of variety, and Boulding's (1956) layered model of nine classes of systems, where each successive class is more complex, with emergent properties, yet without any explicit definition of complexity being provided. The Nobel Laureate Simon (1969) proposed a general notion of the hierarchical structure of complexity. Other notions include computational complexity, addressing the amount of computational resource (time or memory) needed to solve a class of problems (cf. Hinegardner & Engelberg, 1983), and Kolmogorov complexity, which focuses on the minimum length of a Turing machine program needed to generate a pattern (Kolmogorov, 1965). Kauffman introduced a working definition of complexity for the formal models of self-organization he was investigating, where complexity is the number of conflicting constraints (Kauffman, 1993).
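To make the flavor of such formal measures concrete, the following minimal sketch (ours, not drawn from the cited works) approximates the Kolmogorov-style idea by using compressed length as a computable stand-in; the choice of zlib as compressor and the strings compared are assumptions made purely for illustration.

```python
import random
import zlib

def compressed_length(s: str) -> int:
    """Length of the zlib-compressed byte string: a rough, computable
    proxy for Kolmogorov complexity (which itself is uncomputable)."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A highly regular pattern admits a short description...
regular = "ab" * 500

# ...while a (pseudo-)random string of the same length does not.
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(1000))

print(compressed_length(regular))  # small: "ab repeated 500 times" is a short program
print(compressed_length(noisy))    # much larger: no substantially shorter description is found
```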

More recently, attempts have been made to formulate general principles that guide the behavior of complex phenomena, such as Bak's "self-organized criticality" (Bak, 1997; Bak et al., 1987; Bak & Sneppen, 1993). In this view, many out-of-equilibrium systems naturally organize themselves, without external tuning or prodding, into a state that is at the threshold between complete disorder and complete order. The system can thus be said to organize itself into a critical state. This approach has, however, been challenged by critics (Sneppen & Newman, 1996; Newman, 1996). Today's most promising approach to the conception of complex systems seems to be the notion of so-called complex adaptive systems (Holland, 1994).1
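As an illustration of self-organized criticality, a minimal sketch of the Bak-Tang-Wiesenfeld sandpile is given below. It is our own simplified rendering, with grid size, threshold, and number of grains chosen arbitrarily; it is intended only to show how avalanches of widely varying size arise from a fixed local toppling rule, without any external tuning.

```python
import random

def sandpile(size=20, grains=5000, threshold=4, seed=1):
    """Minimal Bak-Tang-Wiesenfeld sandpile: grains are dropped at random
    sites; any site reaching the threshold topples, passing one grain to
    each neighbour (grains falling off the edge are lost). The avalanche
    size per dropped grain is recorded."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    avalanches = []
    for _ in range(grains):
        x, y = random.randrange(size), random.randrange(size)
        grid[x][y] += 1
        unstable, toppled = [(x, y)], 0
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < threshold:
                continue
            grid[i][j] -= threshold
            toppled += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= threshold:
                        unstable.append((ni, nj))
        avalanches.append(toppled)
    return avalanches

sizes = sandpile()
# Most drops cause no avalanche at all, yet occasional avalanches span much of the grid.
print("largest avalanche:", max(sizes))
```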

The behavior of a complex adaptive system (CAS) is regarded as dynamic, out of equilibrium, and thus highly nonlinear. Such systems manifest the emergence of laws and patterns of order through cooperative effects of the subunits, the agents of a complex system; that is, local coordination. CAS employ feedback mechanisms, parallel processing, and simple local rules directing adaptive agents, which in turn can give rise to collective or global behavior of great complexity and variety, where deduction of the emergent behavior is very difficult. CAS are typically characterized by internal inhomogeneity of the system (i.e., it consists of a number of different classes of autonomous agents), adaptation of agents in the system, nonlinear interaction between parts of the system, and a net-like causal structure of the system (high connectivity); see Holland (1995, 1998) and Casti (1997). They are therefore modeled as a set of agents, with local rules for behavior providing local coordination, giving rise to emergent global behavior and self-organization.

An example of a CAS is an ant colony that organizes itself without a leader; that is, without any central and top-down directed coordination. Each ant appears to go about its own business, following a few simple rules determining its interaction with its environment or its fellow ants. An incredibly complex and organized society emerges from this interaction, displaying adaptation to changing circumstances.
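The colony-level coordination described above can be caricatured in a few lines of code. The sketch below is a hypothetical rendering of the classic "double-bridge" setting from ant-colony research, not a model taken from the works cited here: each simulated ant follows one local rule (prefer the more strongly marked path and reinforce it), and the colony as a whole typically converges on a single shared path.

```python
import random

def double_bridge(ants=1000, evaporation=0.01, exponent=2, seed=2):
    """Minimal double-bridge sketch: two equally good paths, no leader.
    Each ant picks a path with probability proportional to pheromone**exponent,
    then deposits pheromone on it; all pheromone slowly evaporates."""
    rng = random.Random(seed)
    pheromone = [1.0, 1.0]                      # both paths start equally attractive
    for _ in range(ants):
        weights = [p ** exponent for p in pheromone]
        choice = 0 if rng.random() < weights[0] / sum(weights) else 1
        pheromone[choice] += 1.0                # local rule: reinforce the chosen path
        pheromone = [p * (1 - evaporation) for p in pheromone]
    return pheromone

# Typically one path ends up far more strongly marked than the other: a shared
# trail emerges although no ant follows anything but a local rule.
print(double_bridge())
```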

Key characteristics of the contemporary theory of complexity

The contemporary notions of complexity as outlined above may be understood in terms of some of their key characteristics. First, naturalism: the empirical observations used as a basis for inducing theoretical propositions are most frequently drawn from physical, chemical, and biological phenomena. The peculiar characteristics of mental systems, both psychological and socio-psychological, such as sense making or power structures, are not accounted for. Secondly, syntactic orientation: understanding of the complex behavior of the studied phenomena emerges from numerical simulation models using a set of rules guiding the interaction of the constituting components. Understanding is thus produced through syntactic manipulation, that is, through representation, rather than through attention to the pragmatic and semantic aspects of the phenomena studied. Thirdly, objectification: studies of complexity focus mainly on the observed system and do not seriously account for the observing system; in this endeavor, such studies typically assume a kind of ontological representation, that is, a direct correspondence between the model and the modeled phenomena. These three characteristics allow us to label the contemporary notions of complexity as "onto-syntactic." We shall soon see that although this notion of complexity is helpful in making various complex phenomena intelligible, it does not do full justice to human affairs and their management, where sense making is a key feature.

Alternative notion of complexity: Phenomeno-semantic

In this section, we present the proposed alternative and complementary notion of complexity, labeled the model of "phenomeno-semantic complexity." The argument is built on the conception of complexity and emergence presented by Le Moigne (1990: Chap. 5). However, while Le Moigne's model of "information that transforms organization" focuses on the syntactic part of emergence and complexity, the approach proposed here takes a further step toward the semantic component of information, giving rise to a novel argument. A semantic approach to complexity has also been assumed, although in a very different manner, by other investigators, such as Warfield (2004).

In short, the construction of the argument starts by recalling Ogden and Richards's triangle of meaning, proceeds with Naess's notion of empirical semantics, followed by Shannon's model of communication, and ends with support from Quastler's model of transition transformation.

The inaccessibility of meaning

Ogden and Richards’s (1985) triangle of meaning (originally published 1923) proposes a conception of meaning as a structure of three components. First, the very meaning, or idea, or mental concept held by a human being in his or her mind. Secondly, a thing, object, or referent, which is supposed to be represented by the mental concept mentioned. Thirdly, the symbol, term, or signal that is supposed to represent the mental concept, and thus also the object; see Figure 1 for an illustration. This notion conforms to contemporary positions within semiotics (e.g., Deeley, 1990; Nöth, 1990) and its recent advances (e.g., Stonier, 1997; Brier, 1998).

Although Ogden's triangle of meaning is well known and accepted as such, it is the very nature of the relationship between its constituting components that creates discussion. We will borrow the proposals of empirical semantics, initially pioneered by the Norwegian philosopher Arne Naess (1953) as a kind of reaction against the teachings of the Vienna Circle and their logical positivism, which placed all focus on formalism; that is, formal semantics.

Figure 1

Ogden's triangle of meaning, consisting of three related components: (i) the meaning, or mental concept, or Idea; (ii) the object, or referent, or Thing, which the mental concept is about; (iii) the symbol, Term, or signal, representing the mental concept. The Idea is private to the individual, as only he or she may access it directly, while the Thing and the Term are public, as they may be accessed by other individuals.

Naess states that there is no one-to-one, direct, and exclusive relationship between our expressions, clauses, sentences, and symbols, and the meaning that we assign to them. For example, the sentence "Putin is ill" and the sentence "Putin is bad" may both mean that [Putin is not healthy]. Yet, the sentence "Putin is bad" may also mean that [Putin is a poor politician]. Further, Naess states that there is no one-to-one, direct, and exclusive relationship between our mental concepts, ideas, or meanings, on the one hand, and the actual situation, or objects, or facts in the world, on the other. For example, the expression "morning star" typically means [the brightest star in the morning, in the east], while in fact it is {the second planet from the sun, in the Milky Way}. A second expression, "evening star," typically means [the brightest star in the evening, in the west], while it is in fact {the second planet from the sun, in the Milky Way}. To make the example even more compelling, a third expression, "Venus," typically means {the second planet from the sun, in the Milky Way} and in fact is {the second planet from the sun, in the Milky Way}.
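These many-to-many relations between terms, meanings, and referents can be expressed compactly. The following sketch is our own hypothetical encoding of the examples above; the particular dictionaries and strings are assumptions made only to illustrate that one symbol may carry several meanings, while several meanings may pick out one and the same thing in the world.

```python
# Term -> candidate meanings: one symbol, possibly several meanings.
meanings = {
    "Putin is ill": {"Putin is not healthy"},
    "Putin is bad": {"Putin is not healthy", "Putin is a poor politician"},
    "morning star": {"brightest star in the morning, in the east"},
    "evening star": {"brightest star in the evening, in the west"},
    "Venus":        {"second planet from the sun"},
}

# Meaning -> actual referent: several meanings, possibly one and the same thing.
referent = {
    "brightest star in the morning, in the east": "second planet from the sun",
    "brightest star in the evening, in the west": "second planet from the sun",
    "second planet from the sun":                 "second planet from the sun",
}

print(len(meanings["Putin is bad"]))   # 2: one term carries two meanings
print(len({referent[m]                 # 1: three terms, three meanings, one referent
           for term in ("morning star", "evening star", "Venus")
           for m in meanings[term]}))
```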

The lack of a direct and exclusive relationship between the three components of the meaning triangle gives rise to difficulties in communication, and hence understanding, between people and in human affairs. An example is the vagueness of terms, such as in the expression "being bald"; a more careful investigation gives rise to the question: How much—or little—hair does a person need to be described as "bald"?

If we accept the propositions presented above, we must conclude that there is a clear risk of misunderstanding the meaning of things in the world, and the terms we use to denote these things, as well as our ideas about these things.

Communication as unintended transformation of meaning

An instance of Ogden's triangle of meaning, as presented above, is valid for one individual at a time. Thus, if two individuals wish to communicate, we may conceive this in terms of two triangles of meaning attempting to match each other. More specifically: Idea 1 in the mind of Man 1 represents Thing 1 and is represented by Term 1. This term is then communicated to Man 2, who is supposed to interpret it as Idea 1 and refer it to Thing 1. In this, the ideas of Man 1 and Man 2 are private—that is, only the conceiving man has direct mental access to his idea—while the Term as well as the Thing are public, as several individuals may perceive them. This situation is valid when the communication is about things existing in the world, for example those that have a physical existence. When the communication refers to a design—that is, a thing that is designed and yet not constructed—then the only public referent is the Term, until such time as the construction of the design is realized. Figure 2 illustrates these relationships and directs us to Shannon and Weaver's (1949) classic model of communication. We recognize that it was formulated for the conception of machine communication rather than human communication. When adapting it to the latter, we shall superimpose it onto the above-mentioned two communicating meaning triangles and Naess's notion of empirical semantics.

In Shannon and Weaver’s model of communication, the sender, here Man 1, has an Idea 1, which he encodes, resulting in Term 1, a symbol or signal. This is then physically communicated to the receiver, here Man 2, who decodes the symbols received, resulting in Idea 2. Typically, the intention of the sender, here Man 1, is that the receiver, here Man 2, shall have the same idea, hence Idea 1 = Idea 2. Figure 3 illustrates this concept.

The usefulness of Shannon’s model of communication is that it facilitates the identification of two types of sources of noise, or transformation of the message, in the process of communication. These are (i) the semantic transformation; and (ii) the syntactic transformation. This is illustrated in Figure 3.

Figure 2

The relationship between two triangles of meaning attempting to communicate. The two individuals communicate about the Thing with the help of the Term, where the latter two are public while the individuals' Ideas are private.

Figure 3

The Shannon and Weaver model of communication. In this, a sender, Individual 1, has an Idea 1 that is formalized into a symbol by means of encoding, and then physically transferred to the receiver, here Individual 2. The latter interprets the received symbol by decoding it, which gives rise to an Idea 2. The purpose of this communication is to ensure that the Idea 1 of the sender is the same as the Idea 2 of the receiver. This goal is challenged by two types of transformation: the semantic transformation that occurs during encoding and decoding, and the syntactic transformation that occurs during the physical transfer of the symbol.

Syntactic transformation emerges when the actual syntax, symbol, or signal, here Term 1, is physically transferred from the sender to the receiver. In electronic communication this often occurs as a result of various interferences with other electronic signals. In spoken human communication, it may occur due to interference from other sound waves, for example when a construction machine generates noise outside the open window of a room where two people are talking. For instance, while the sender, Man 1, says to the receiver, Man 2, "I am not ill," the sound interference may make Man 2 hear "I am ill."

Semantic transformation may emerge in two places in the communication model presented. First, when an idea of a sender, here Idea 1 of Man 1, is encoded—that is, formalized into symbols, here Term 1—and in the process of encoding or formalization some of the meaning is missed. We denote this here as formalization noise, or encoding noise. An intuitive example of this is a textbook on how to ski or swim: Anyone who skis or swims knows that even though such a book may be instructive in many respects, it is not enough merely to read the textbook to acquire sufficient knowledge and skills to be able to ski or swim.

The second instance of semantic transformation is when a receiver decodes a received symbol, hence interpretation noise or decoding noise. Here, Man 2 receives Term 1, which is supposed to give rise to Idea 1. In the process of assigning meaning to a symbol some meaning may be lost, or an alternative meaning may be assigned.
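A toy rendering of this communication pipeline may make the two noise sources tangible. The sketch below is entirely hypothetical: the code books, messages, and corruption rule are our own assumptions, chosen only to show how encoding can drop part of an idea (formalization noise), how the physical channel can alter the symbol (syntactic transformation), and how the receiver's own code book can yield a different, even opposite, idea (interpretation noise).

```python
import random

# Man 1's private idea, and the code books of the two actors. The code books
# differ: part of Man 1's meaning has no symbol at all (formalization noise),
# and Man 2 reads whatever symbol arrives through his own glossary
# (interpretation noise, applied to a possibly corrupted symbol).
idea_1 = {"not ill", "fit to work", "in good spirits"}

encode_1 = {"not ill": "I am not ill"}        # only part of the idea is expressible
decode_2 = {"I am not ill": "healthy",
            "I am ill": "unwell"}             # Man 2's reading of each symbol

def channel(symbol: str, noise: float, rng: random.Random) -> str:
    """Syntactic transformation: with some probability the physical signal is
    corrupted in transit (here, the negation is simply lost)."""
    return symbol.replace("not ", "") if rng.random() < noise else symbol

rng = random.Random(3)
term_1 = [encode_1[m] for m in sorted(idea_1) if m in encode_1]   # encoding: lossy
received = [channel(t, noise=0.5, rng=rng) for t in term_1]       # transfer: noisy
idea_2 = {decode_2[t] for t in received if t in decode_2}         # decoding: reinterpreted

print("idea sent:    ", sorted(idea_1))
print("idea received:", sorted(idea_2))   # may be 'unwell' although Man 1 meant the opposite
```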

These two processes of formalization and interpretation bring us to phenomenological and constructivist notions of knowledge, as well as to interpretive sociology and to hermeneutics, when we reason about the psychological and social conditions and circumstances of formalization and interpretation. While this is important, it lies outside the scope of this paper; for the purpose of our argument here, it is enough to articulate the intentional, hence psychic quality of the processes mentioned, where syntactic transformation and semantic transformation are understood as unintended transformations of meaning.

Furthermore, to support this notion of unintended transformation of meaning, we would like to recall Quastler's (1964) model of transition transformation. This model formalizes, within Shannon's framework, the observation that the receiver does not hear everything expressed by the sender, while the receiver may hear things that the sender does not express; see Figure 4 for an illustration.

While Quastler's model of transition transformation articulates the syntactic transformation (i.e., transformation of the code, symbol, signal, or term), Naess's empirical semantics articulates the semantic transformation (i.e., transformation of the meaning, idea, or concept).
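Quastler's asymmetry between what is sent and what is received can be stated as simple set relations. The message fragments below are invented for illustration only; the point is merely that the received message is neither a subset nor a superset of the sent one.

```python
# What the sender expressed, and what the receiver ended up with
# (hypothetical message fragments, illustrative only).
sent     = {"delivery is late", "quality is fine", "price unchanged"}
received = {"delivery is late", "price increased"}

faithfully_transmitted = sent & received      # heard as expressed
lost_in_transit        = sent - received      # expressed but not heard
heard_but_never_sent   = received - sent      # heard although never expressed

print(faithfully_transmitted)
print(lost_in_transit)
print(heard_but_never_sent)
```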

To summarize, we have now advanced a model of transformation of both the syntax and the semantics of a message. If a message communicated between two sense-making individuals, for example two employees in an organization, is utilized to stimulate the design and execution of actions, hence behavior, the transformation of such a message may be understood as a foundation or driver of the emergence of unpredicted behavior in psychological and social phenomena.

Figure 4

A graphical illustration of Quastler's (1964) model of transition transformation. It shows the sender sending a message that the receiver receives only in part, while the receiver also receives a message that the sender did not send.

The syntactic transformation provides a foundation for advancing various original notions of emergence, complexity, and self-organization. These include von Foerster’s (1959) order from noise model; Atlan’s (1972, 1979) complexity from noise model; Varela’s (1979) self-in-formation model; and Le Moigne’s (1990) model of self-organization. For this reason, we would now like to focus attention on semantic transformation as a source of complexity or emergence of novel and unpredicted behavior.

First- and second-order semantic complexity

It follows from the above that semantic transformation may occur only in the case of communication between two or more sense-making actors with the ability to communicate linguistically. We will thus limit our further inquiry to human activity systems (see Checkland, 1981, for further discussion of the peculiar characteristics of such systems).

Another distinction that we wish to introduce is that between an object system and a subject system. When the behavior of a phenomenon such as an enterprise is investigated, the investigated enterprise is the object system, while those investigating it, for example business analysts, are the subject system. This serves as a foundation for defining first-order and second-order semantic complexification, or emergence of complexity, respectively.

Table 1

An overview of the key concepts of the notion of phenomeno-semantic complexity

Key Concept | In Summary
Phenomeno-semantic complexity | When communicated meaning is unintentionally transformed—syntactically and/or semantically—in the process of transmission between two sense-making actors, which may lead to an action on the part of the receiver that cannot be predicted by the sender, hence the emergence of complex behavior.
Triangle of meaning of sense-making actor | The indirect relationship between an idea conceived by a sense-making actor, the object that this idea refers to, and the symbol or signal that aims to represent that idea and hence also the object of reference.
Syntactic transformation | Occurs in the process of transferring a message between two sense-making actors, and refers to the experience when the symbols or signals representing the meaning to be transferred are transformed or changed unintentionally.
Semantic transformation | Occurs in the process of transferring a message between two sense-making actors, and refers to the experience when the meaning received by the receiving sense-making actor is unintentionally different from the one sent by the sender; it is caused by either formalization noise or interpretation noise.
Formalization noise | Occurs when a sender assigns a symbol or signal to an idea to be communicated to a receiver, thus formalizing the idea, while the selected formalism cannot represent the intended idea, as perceived by the receiver.
Interpretation noise | Occurs when a symbol or signal is received by a sense-making actor, who then assigns to it a meaning that differs from the one intended by the sender of the signal or symbol.
First-order semantic complexification | The process of semantic complexification, as described in the first row above, which emerges in the subject system in the inquiry of an object system; i.e., in the process of communication between the sense-making actors that investigate a phenomenon, such as a team of business analysts.
Second-order semantic complexification | The process of semantic complexification, as mentioned above, which emerges in the object system in the inquiry of an object system; i.e., in the process of communication between the sense-making actors constituting the investigated phenomenon, such as an enterprise.

First-order semantic complexification may emerge in any subject system made up of two or more sense-making actors, as they communicate with regard to the investigation of the object system. The unintended semantic transformation induced there may render the conception of the object system complex. In other words, it is the system modelers and analysts—that is, the subject system—that complexify the model of an object system.

Second-order semantic complexification may emerge, in contrast, in any object system constituted by two or more sense-making actors, when they communicate as required by their operations. This implies that only psychological, socio-psychological, and social systems may manifest second-order semantic complexification; it may not be manifested by, for example, a gas, an immune system, or the interaction of planets in a solar system.
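The distinction can be restated operationally: the same kind of semantic transformation occurs, but in different places of the inquiry. The sketch below is purely hypothetical; the actors, terms, and glossaries are invented, and it only marks where a mismatch of meaning arises: between the investigating analysts (first order) or between the actors of the investigated enterprise (second order).

```python
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    glossary: dict               # the actor's private reading of each term

def shared_meaning(term: str, sender: Actor, receiver: Actor) -> bool:
    """True only if sender and receiver attach the same meaning to the term."""
    return sender.glossary.get(term) == receiver.glossary.get(term)

# Subject system: the analysts investigating the enterprise (first order).
analyst_a = Actor("analyst A", {"process": "value-adding chain of activities"})
analyst_b = Actor("analyst B", {"process": "any repeated procedure"})

# Object system: the investigated enterprise itself (second order).
clerk    = Actor("clerk",    {"road": "paved public road"})
engineer = Actor("engineer", {"road": "any drivable surface, including gravel"})

print(shared_meaning("process", analyst_a, analyst_b))  # False: first-order complexification
print(shared_meaning("road", clerk, engineer))          # False: second-order complexification
```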

Table 1 presents an overview of the key concepts of the phenomeno-semantic notion of complexity as introduced here.

Illustrations

Due to limited space we can only briefly describe three empirical illustrations of complexity in human affairs, emerging from the transformation of the semantics of the message.

The first example was observed in the Swedish Road Administration, during a redesign of its organization from functionally organized to process organized. In the midst of the reorganization, it became evident that the key term employed in the offices, "road," was difficult to define. Further investigation revealed that the term was assigned more than ten distinctly different meanings within the organization, plus a set of variants of those ten meanings, depending on the department and the context in which the term was used. For instance, when a unit in the new organization made a request to another unit for approval of the reconstruction of a road in the country, this request was not approved, as the receiver interpreted it as referring to another type of road than that intended by the sender. This ambiguity of the key term gave rise to misinterpretations when various organizational functions were expected to communicate in a new manner, due to the proposed organizational redesign. These misinterpretations brought about unintended activities, and thus changed behavior. The proposed solution to this situation was to exclude the term "road" from all operations of the Swedish Road Administration, and to introduce new terms that represent the intended phenomena specifically and unambiguously.

A second example occurred within the Swedish National Defence (SND) organization, which had also undergone a reorganization. This took place during the 1990s, after the collapse of the Soviet bloc. Prior to the reorganization, the SND was organized strictly functionally into three distinct forces: the Navy, the Army, and the Air Force. These forces were instructed and trained to defend Swedish territory against a Soviet bloc invasion, which represented a static enemy. This made it possible for the three forces to develop their operations semi-independently, with only a low degree of coordination. The new organization, on the other hand, was designed for unspecified enemies, and for operations outside Swedish territory. This in turn called for closer collaboration between the three forces. In the initial attempts at closer collaboration, it became evident that the term and the concept of an "order" meant slightly different things within the respective forces. Within the Air Force the meaning of an order was strictly and explicitly defined, obliging the actors involved to define several variables before an order could be issued. Within the Navy, however, an order could be reduced to any signal given by a superior officer. This meant that when a Navy actor attempted to send an order to an Air Force actor, the latter did not always interpret it as an order, which gave rise to unpredicted behavior in the operations. The solution to this dilemma was to redefine the term "order" within the SND so as to give it a commonly shared meaning across all three forces.

A third example comes from a marketing and sales organization selling a specific pharmaceutical product. This organization had 15 salespeople detailing the product to the targeted physicians. This detailing implied that a salesperson typically conducted a 20-30-minute face-to-face meeting with a particular physician, to inform him or her about the product and its pharmaceutical and therapy-related features. As is common in today's competitive marketing and sales operations, a product is positioned against targeted customers and key competing products. Briefly, this positioning implies that specific key characteristics of a product are explicitly articulated and communicated to the targeted customers. These characteristics are regarded as the product characteristics most relevant for the current market conditions and aim to manifest the attractiveness of the product, thus promoting its sales. In the case of the pharmaceutical product studied here, the marketing officers did define the product's position, which included a side-effect profile that was milder than the competitors'. However, as the projected sales figures were not obtained, the marketers began an investigation to track down the cause, including customer research into the perception of the product. This showed that the customers did not perceive the product as intended; that is, with a focus on its low side-effect profile. This, in turn, prompted the marketers to investigate, by means of a test, the message the salespeople were giving the customers. Each salesperson had to answer a few questions regarding the product's key characteristics. The results showed that 15 different messages were being communicated to the market, one per salesperson, rather than the one intended message of a low side-effect profile. Hence, the salespeople were unintentionally transforming the intended message, due to a lack of training and discipline. The marketers issued a new, strict training program for the salespeople and implemented monitoring procedures for continuous evaluation of the message sent to the market. Some three months later the sales figures had increased and were closer to those projected.

Summary and further research

Since conventional approaches to science—that is, classical mechanics and statistics—cannot help us to understand certain types of behavior of phenomena, such as that of so-called organized complexity, various promising approaches to the theory of complexity have been advanced in order to make such behavior intelligible. The dominant approach is currently the notion of complex adaptive systems. In this, the constituting agents of a system interact in accordance with local rules, with no top-down master plan, leading to the emergence of a typically counterintuitive complex behavior of the system as a whole. Investigations and studies of such systems focus on the construction of numerical representations for simulation purposes. While this approach has succeeded in making complex systems intelligible, it does have some inherent limitations. One such limitation considered here is the disregard for the peculiar characteristic of systems whose agents are human beings; namely, the ability of sense making.

To remedy this limitation, and to make the complexity emerging in human affairs intelligible, an alternative and complementary notion of complexity is introduced here: the phenomeno-semantic notion. This focuses on the process of communication between sense-making actors, where complexity, or unpredictable behavior, emerges as a result of unintended transformation of the communication between these actors. Complexity may emerge as a result of transformation of both the communicated syntax and the communicated semantics. Furthermore, the proposed approach to complexity also makes a distinction between the complexity that emerges in the communication of the investigated object system, and that which emerges in the communication of the investigating subject system, in the very interaction between the subject system and the object system.

A key implication of the notion of complexity introduced here is that making complexity intelligible and manageable in human affairs does not necessarily require advanced symbol processors and mathematical formalism, with their striving for explanation, prediction, and control of phenomena. Rather, effort should also be put into the development of tools for the explicit conceptualization and management of conceptual models, which articulate and specify meanings, in order to strive for a plausible intelligibility of the phenomena, of their inherent communication and sense making, and of the learning in their evolution. We have so far observed only sporadic yet promising contributions in this regard.

In terms of methodologies, we would like to mention Checkland's (1981) soft systems methodology, which offers a set of conceptual tools for the conception of any human activity system in its effort to produce change. A second contribution is the systemic enterprise modeling language, developed to support an unambiguous conception of any enterprise and its operations (Eriksson, 2004; Eriksson & Ståhl, 2005). Each in a somewhat different manner, both contributions focus attention on the semantic and syntactic content of communication between sense-making actors.

In terms of technologies, we would like to mention the so-called spider management information system, based on theories of cognitive modeling (Eden, 1988) and explicitly providing support for groups of people in their articulation and sharing of meanings (Boland et al., 1994; Boland & Tenkasi, 1995). A second important contribution is "the coordinator," also a computerized information system. It is based on the linguistic theory of so-called speech acts (Searle, 1969; Searle & Vanderveken, 1985), with the purpose of explicitly facilitating unambiguous communication and communicative acts between human agents (Flores et al., 1988). In line with these two examples, Simon (1992), in his critical review of the contemporary utilization of information and communication technology, calls for a switch of focus from information processing and its syntactic qualities to decision making and the semantic qualities in the life of human affairs. In this regard, much more research and development is necessary to further develop our understanding and management of the complexity in human affairs that emerges from unintended transformation of communication, in terms of theories, methodologies, and technologies; this paper aspires to contribute to the first.

One area of theoretical insight that may merit further research into complexity in human affairs is the multifaceted relationship between the syntactic, the semantic, and the pragmatic aspects of communication and the actions it causes. The syntactic and the semantic aspects alone are addressed here.

Notes

1 There have been various reports of successful application of findings from complexity research. Researchers, for example, have discovered that heart fibrillation can be modeled with equations based on chaos theory (Garfinkel et al., 1992). Other interesting applications include Holland’s (1994) ECHO, artificial stock markets (Arthur et al., 1997), and Sugarscape, a simulation of social systems (Epstein & Axtell, 1998).

References

Arthur, W., LeBaron, B., Palmer, B. and Taylor, R. (1997). "Asset pricing under endogenous expectations in an artificial stock market," in W. Arthur, S. Durlaf and D. Lane (eds.), The Economy as an Evolving Complex System II, vol. XXVII of Studies in the Sciences of Complexity, ISBN 9780201959888.

Ashby, W. R. (1962). "Principles of self-organizing systems," in H. von Foerster and G. W. Zopf (eds.), Principles of Self-Organization: Transactions of the University of Illinois Symposium, London, England: Pergamon Press, 255-278.

Ashby, W. R. (1964). An Introduction to Cybernetics, ISBN 9780416683004.

Ashby, W. R. (1965). Design for a Brain, ISBN 9780412200908.

Atlan, H. (1972). L'Organisation Biologique et la Théorie de l'Information, ISBN 9782705613518.

Atlan, H. (1979). Entre le Cristal et la Fumée: Essai sur l'Organisation du Vivant, ISBN 9782020052771.

Bak, P. (1997). How Nature Works: The Science of Self-Organized Criticality, ISBN 9780387947914.

Bak, P., Tang, C. and Wiesenfeld, K. (1987). "Self-organized criticality," Physical Review A, ISSN 1050-2947, 38: 367-374.

Bak, P. and Sneppen, K. (1993). "Punctuated equilibrium and criticality in a simple model of evolution," Physical Review Letters, ISSN 0031-9007, 71: 4083-4086.

Boland, R. J., Tenkasi, R. V. and Te'eni, D. (1994). "Designing information technology to support distributed cognition," Organization Science, ISSN 1047-7039, 5(3): 456-475.

Boland, R. J. and Tenkasi, R. V. (1995). "Perspective making and perspective taking in communities of knowing," Organization Science, ISSN 1047-7039, 6(4): 350-372.

Boulding, K. (1956). "General systems theory: The skeleton of science," Management Science, ISSN 0025-1909, 2: 197-208.

Brier, S. (1998). "Cybersemiotics: A transdisciplinary framework for information studies," BioSystems, ISSN 0303-2647, 46: 185-191.

Casti, J. (1997). Would-Be Worlds: How Simulation Is Changing the Frontiers of Science, ISBN 9780471196938 (1998).

Checkland, P. B. (1981). Systems Thinking, Systems Practice, ISBN 9780471279112.

Deeley, J. (1990). Basics of Semiotics, ISBN 9780253205681.

Eden, C. (1988). "Cognitive mapping," European Journal of Operational Research, ISSN 0377-2217, 13: 1-13.

Epstein, J. and Axtell, R. (1998). Growing Artificial Societies, ISBN 9780262050531.

Eriksson, D. M. (2004). Four Proposals for Enterprise Modeling, doctoral thesis, Chalmers University of Technology.

Eriksson, D. M. and Ståhl, P. (2005). "Proposal for a systemic enterprise modeling language," Proceedings of the 38th Hawaii International Conference on Systems Sciences, ISSN 1530-1605.

Flores, F., Graves, M., Hartfield, B. and Winograd, T. (1988). "Computer systems and the design of organizational interaction," ACM Transactions on Information Systems, ISSN 1046-8188, 6(2): 153-172.

Foerster, H. von (1959). "On the self-organizing systems and on their environment," reproduced in H. von Foerster (1984), Observing Systems, ISBN 9780914105190.

Garfinkel, A., Spano, M. L., Ditto, W. L. and Weiss, J. N. (1992). "Controlling cardiac chaos," Science, ISSN 0193-4511, 257: 1230-1235.

Hinegardner, R. and Engelberg, J. (1983). "Biological complexity," Journal of Theoretical Biology, ISSN 0022-5193, 104: 7-20.

Holland, J. (1994). "Echoing emergence: Objectives, rough definitions, and speculations for ECHO-class models," in G. Cowan, D. Pines and D. Meltzer (eds.), Complexity: Metaphors, Models, and Reality, ISBN 9780201626063, pp. 310-334.

Holland, J. (1995). Hidden Order: How Adaptation Builds Complexity, ISBN 9780201407938.

Holland, J. (1998). Emergence: From Chaos to Order, ISBN 9780201149432.

Kauffman, S. A. (1993). The Origins of Order: Self-Organization and Selection in Evolution, ISBN 9780195058116.

Kolmogorov, A. (1965). "Three approaches to the quantitative definition of information," Problems of Information Transmission, ISSN 0032-9460, 1(1): 3-11.

Le Moigne, J. L. (1990). La Modélisation des Systèmes Complexes, ISBN 9782100043828.

Naess, A. (1953). Interpretation and Preciseness: A Contribution to the Theory of Communication, Oslo: Det norske videnskaps-akademi i Oslo.

Newman, M. (1996). "Self-organized criticality, evolution and the fossil record," Proceedings of the Royal Society of London B: Biological Sciences, ISSN 0962-8452, 263: 1605-1610.

Nöth, W. (1990). "Meaning, sense, and reference," in Handbook of Semiotics, ISBN 9780253209597 (2006), pp. 92-102.

Ogden, C. K. and Richards, I. A. (1985). The Meaning of Meaning: A Study of the Influence of Language upon Thought and the Science of Symbolism, ISBN 9780744800333.

Quastler, H. (1964). The Emergence of Biological Organization, New Haven, CT: Yale University Press.

Searle, J. R. (1969). Speech Acts: An Essay in the Philosophy of Language, ISBN 9780521096263 (1970).

Searle, J. and Vanderveken, D. (1985). Foundations of Illocutionary Logic, ISBN 9780521263245.

Shannon, C. E. and Weaver, W. (1949). The Mathematical Theory of Communication, Urbana, IL: University of Illinois Press.

Simon, H. A. (1969). The Sciences of the Artificial, ISBN 9780262193740 (1996).

Simon, H. A. (1992). "On designing information for companies and managements in an electronic age," Proceedings of the 1992 CEMIT Conference, Tokyo, September.

Sneppen, K. and Newman, M. (1996). "Avalanches, scaling, and coherent noise," Physical Review E: Statistical, Nonlinear and Soft Matter Physics, ISSN 1539-3755, 54(6): 6226-6231.

Stonier, T. (1997). Information and Meaning: An Evolutionary Perspective, ISBN 9783540761396.

Varela, F. J. (1979). Principles of Biological Autonomy, ISBN 9780444003218.

Warfield, J. N. (2004). "Linguistic adjustments: Precursors to understanding complexity," Systems Research and Behavioral Science, ISSN 1092-7026, 21: 123-145.

Weaver, W. (1948). "Science and complexity," American Scientist, ISSN 0003-0996, 36: 536-544.

