The call of complexity

Today's storm-tossed markets call on managers to take a stand on how to cope with rising complexity. Companies constantly face a crossroads (the complexity dilemma): should they accept and nurture complexity, or decline and reduce it? The origins of the first option can be traced back to Ashby's Law of Requisite Variety1, which states that "only variety can destroy variety", so that only complexity can cope with complexity3. The amount of internal complexity is optimal when it matches an equivalent amount of external complexity4. The second option comes from Luhmann's theory of Complexity Reduction2, by which each system has to reduce its environmental complexity through selection or differentiation.

In recent years, following Ashby's law, managers have been urged to complicate themselves as well as their organizations3. This complication moves companies away from Occam's razor, which tells us not to add more when less will do. Yet over-simplification can sometimes uncover a system's fragilities, as shown, for example, by Ericsson's supply chain crash5. Ericsson had simplified its supplier portfolio for some components. In March 2000 a fire struck a semiconductor plant in New Mexico, leaving Ericsson short of millions of chips. As a result, Ericsson was ultimately driven from the market. This example shows how complexity brings fragility into the system, which becomes dependent, through non-linear connections, on a multitude of variables. Each variable has to be considered and managed – as Ashby's law foresees – because simplifying could make some variables invisible to the system, and it is these variables that can disclose the system's fragilities (Bonabeau, 2007). At the same time, simplification generates a sense of "risky" certainty, which can lead to a careless view of changing market conditions6. Sull7 uses the term active inertia to describe a similar sort of simplification: responding to market shifts by accelerating activities that succeeded in the past. Both Ashby's and Luhmann's approaches seem right, but which one should be followed?

The present paper aims to identify ways in which companies can solve Ashby-Luhmann trade-offs (complexity dilemmas), finding ways to increase the complexity of behaviours and outputs through simple solutions. In accordance with Jost8 and Pina and Rego3, this paper suggests that complex organizing may be – paradoxically – facilitated by a simple infrastructure, and that the theory of organizations may be viewed as resulting from the dialectical interplay between simplicity and complexity. We suggest three tips for dealing with complexity: (i) modularity, (ii) simple rules, and (iii) organisational capabilities.

Consequently, the paper begins with an analysis of the existing literature on complexity and performance, in order to analyse the pros and cons of both the Ashby and the Luhmann approach. Afterwards, the paper illustrates the research question and finally focuses on how companies can solve complexity dilemmas.

Theoretical background: Complexity defined

Herbert Simon defined a complex system as "one made up of a large number of parts that interact in a non-simple way. In such systems, the whole is more than the sum of the parts, at least in the important pragmatic sense that, given the properties of the parts and the laws of their interaction, it is not a trivial matter to infer the properties of the whole"9. After Simon's definition, many works on complexity were carried out, and the literature distinguishes different typologies of complexity. The main distinction is between external complexity and internal complexity. Duncan10, for example, distinguishes between the internal dimension, composed of personnel, organisational units and organisational levels, and the external dimension, composed of customers, suppliers, competitors, technologies and the socio-political context. Vicari11 distinguishes between the complexity that depends on the company, its products, its technology, structures and personnel, and the complexity generated by the external environment. Similarly, Daft12 and Jost8 identify internal complexity with the sub-elements inside the organisation and external complexity with the external elements to which the company must relate. Glenn and Malott13, Collinson14 and Collinson & Jay15 differentiate between component complexity (number of elements), hierarchical complexity (number of levels) and environmental complexity (number of external variables). More recently, Chung16 states that it is possible to capture structural aspects of complexity in both static and dynamic forms. He proposes that complexity can be evaluated in terms of the number of components within a socio-technical organization and the degree of interrelatedness between these components. Given these variables, it is then possible to characterize complexity in terms of simple, complicated, relatively complex and complex profiles.

In synthesis, external complexity can be defined as the amount of complexity derived from the environment in which the organisation operates, such as the country, the markets, suppliers, customers and stakeholders, while internal complexity is the amount of complexity internal to the organisation itself, i.e. products, technologies, human resources, processes and organisational structure. Different aspects thus compose internal and external complexity. From the literature17,18,19,20,15,21,4,22 we can distinguish four dimensions of complexity:

  • interdependence (connectivity, interconnectedness): degree of interaction and connection among the elements of the system; within complexity theory, in fact, emergence has been linked with the formation of entities including networks and social entrepreneurship23;

  • diversity (multiplicity, variety, heterogeneity): number, heterogeneity and variety of the elements of the system;

  • uncertainty (ambiguity, non-transparency): degree of unpredictability and ambiguity of the system; for example, Mihm et al.24 suggest that complexity is connected to uncertainty: organizations engage in search whenever they perform non-routine tasks, such as the definition and validation of a new strategy, the acquisition of new capabilities, or new product development;

  • dynamicity (dynamism, fast flux, pace, variability): speed of flux, rate of change and coevolution of the system; for example, Hladík25 identifies dynamicity as a key characteristic of complexity because organizations have to react to the accelerating changes and complexity of their environment to survive in the future.

Complexity and performance: the complexity curve

Most of the time, the literature describes complexity as destroying value by adding costs, lengthening cycle times and frustrating both customers and employees, so that many people in companies exist only to "handle" complexity26.

Of course, complexity has an actual impact on firm performance, but it cannot be reduced to a "value-destroying" label. In fact, performance has been shown to depend on the amount of complexity via an inverted U-shaped function27,28,29,15,22, called the complexity curve: for a fixed amount of market complexity, performance increases as internal complexity increases, until a tipping point is reached. After that point, an overburden of complexity starts to sink performance, as shown in Fig. 1.


Fig. 1: Relation between complexity and performance (the complexity curve)

Anderson27 refers to the inverted U-shaped curve in describing the effects of customisation on performance. Schwandt28 hypothesizes that firm performance is linked to complexity via an inverted U-shaped curve. Davis et al.29 discovered, through a mathematical simulation, that the ability to catch market opportunities (matching simulated random numbers) depends, via an inverted U-shaped relation, on the number of rules (constraints) regulating the catching actions (the generation of partially random numbers). Collinson & Jay15 refer to the complexity curve in describing the effects of complexity on EBITDA for 200 companies of the Global Fortune 500. Finally, Braun & Hadwich22 found the same behaviour in service complexity: "The overall effect is nonlinear in the shape of an inverted U, indicating the existence of an optimum, moderate level of internal service complexity maximizing internal customer satisfaction".

Changes in external complexity also affect the shape of the complexity curve. Davis et al.29 recognise a shift of the complexity curve as opportunities coming from the environment become more difficult to catch. Thus, when environmental complexity increases, it is possible to suppose that the tipping point of the complexity curve occurs at a higher level of internal complexity, as per Ashby's law of requisite variety. If environmental complexity decreases, the tipping point of the complexity curve occurs at a lower level of internal complexity.
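This shifting-optimum logic can be sketched in a few lines of Python. The inverted-U form below is purely illustrative – a quadratic chosen for simplicity, not the empirical curve estimated in the cited studies – and the assumption that the optimum tracks external complexity one-for-one is our own reading of Ashby's law:

```python
def performance(internal_c, external_c):
    # Illustrative inverted-U (a modelling assumption, not the empirical
    # curve): performance peaks where internal complexity matches the
    # optimum set by the environment, then declines on either side.
    optimum = external_c  # Ashby: requisite internal variety tracks external variety
    return -(internal_c - optimum) ** 2

# Internal complexity levels from 0 to 10 in steps of 0.1
levels = [i / 10 for i in range(101)]

def tipping_point(external_c):
    # The internal-complexity level that maximises performance
    return max(levels, key=lambda c: performance(c, external_c))

# As external complexity rises, the tipping point shifts right:
# a higher level of internal complexity is required before
# further complication starts to sink performance.
print(tipping_point(4.0), tipping_point(7.0))
```

Under these assumptions, raising external complexity from 4 to 7 moves the tipping point from 4 to 7, reproducing the rightward shift of the curve described above.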

The research question: who to follow?

So, should managers follow Ashby or Luhmann? It depends on where their companies, units or departments are positioned on the complexity curve. Managers are challenged to reach the tipping point of the complexity curve, namely the "optimal complexity level". To achieve this objective they can follow Ashby's law of requisite variety and reconfigure their products, structure and processes if the current level of internal complexity is lower than the optimal one (i.e., their companies, units or departments are located on the left side of the optimal complexity level). Or they can apply Luhmann's complexity reduction – e.g. by selecting market niches – on the other side of the curve, when complexity is too high to be managed effectively with the current configuration. The complexity dilemma thus becomes the following research question:

Which principles of complexity can managers use to solve complexity dilemmas and converge to the tipping point of the complexity curve?

Through a review of the literature on complexity theory, and from empirical results derived from prior case studies30,31, we propose three tips for facing this question and moving surefooted into complexity: modularity, simple rules, and organisational capabilities.

Three tips for moving into complexity

Tip n° 1: Modularity reduces the costs of complexity

The biologist Allen H. Orr32 describes a phenomenon in the evolution of organisms labelled the "cost of complexity": the more "complex" – meaning multi-trait – an organism is, the more likely a mutation of a given size is to be deleterious. A favourable mutation in one phenotypic trait can be deleterious for another trait influenced by the first; the more traits are involved, the greater the risk of washout. Nature solved Orr's cost of complexity through modularity33. Modularity refers to a group of physically or functionally linked molecules or genes (nodes) that work together to achieve a (relatively) distinct function34,35. A module is "a component part of a larger system and yet possessed of its own structural and/or functional identity"36. Modules are thus the critical level of biological organization37,35. They have discrete functions that arise from interactions among their components (proteins, DNA, RNA and small molecules), but these functions cannot easily be predicted by studying the properties of the isolated components37. Wagner38 defines modules as units of the phenotype that collectively serve a primary role, that are tightly integrated by strong pleiotropic effects of genetic variation, and that are relatively independent from other such units. They can be obtained by parcellation or by integration of pleiotropic effects38; parcellation solves Orr's cost of complexity by separating pleiotropic effects. Examples of modularity are the CaM gene expression that regulates bill length and the BMP4 gene expression that regulates the bill depth/thickness ratio in Darwin's finches, which are uncoupled35, or the modular patterns of butterfly wings39,40.

Moreover, modular structures facilitate evolvability and adaptability. Embedding particular functions in discrete modules allows the core function of a module to be robust to change, while still allowing the properties and functions of a cell (its phenotype) to change by altering the connections between different modules37. In other words, recalling Simon's (1962) parable of Hora and Tempus, modularity is an important property in biology because it helps a system "save its work" while allowing further evolution41.

These principles can also be applied to products, processes and organisations, permitting a local reduction of complexity – within the single module – while keeping the complexity and evolvability of the whole system high. Campagnolo and Camuffo42 define modularity as "an attribute of a complex system that advocates designing structures based on minimizing interdependence between modules and maximizing interdependence within them that can be mixed and matched in order to obtain new configurations without loss of the system's functionality or performance". Let us give an example of the advantage of product/process modularity: two competing aircraft engine architectures are employed in the industry, namely two-shaft and three-shaft. According to the literature, the three-shaft architecture launched by Rolls-Royce in the early 1970s has turned out to be more effective in accommodating evolving customer requirements in terms of engine power, thanks to its modularity. This embedded modularity enabled Rolls-Royce to exploit the same architecture to cater for a broader range of power requirements. Moreover, Rolls-Royce acquired a competitive advantage in the speed of development of new engines by mixing and matching components, introducing incremental changes in the original architecture to meet a wider variety of aircraft makers' needs than its competitors43.
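The Campagnolo-Camuffo prescription – minimize interdependence between modules, maximize it within them – can be made concrete with a toy metric. The sketch below is our own illustration, not a measure proposed in the paper, and the component and module names are invented for the example; it scores a partition of components by the fraction of dependency weight that stays inside modules:

```python
def modularity_score(deps, modules):
    # Fraction of total dependency weight that stays within modules.
    # A score near 1 means interdependence is concentrated within
    # modules and minimized between them. (Toy metric for illustration.)
    total = sum(deps.values())
    within = sum(w for (a, b), w in deps.items() if modules[a] == modules[b])
    return within / total if total else 1.0

# Hypothetical engine components: strong ties inside each module,
# weak ties across modules (names invented for the example).
deps = {
    ("fan", "lp_shaft"): 3, ("compressor", "ip_shaft"): 3, ("turbine", "hp_shaft"): 3,
    ("fan", "compressor"): 1, ("compressor", "turbine"): 1,
}
modules = {"fan": "lp", "lp_shaft": "lp", "compressor": "ip",
           "ip_shaft": "ip", "turbine": "hp", "hp_shaft": "hp"}
print(modularity_score(deps, modules))  # 9 of 11 units of weight are intra-module
```

A high score means modules can be mixed and matched with little risk of breaking cross-module couplings, which is the practical sense in which modularity reduces the costs of complexity.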

At the organisational level, modularity matches the self-organizational structure called the "cellular organization"44. Miles et al. represent organizations as a set of self-managing teams or autonomous business units with an entrepreneurial responsibility to the larger organization. The customers of a particular cell can be outside clients or internal cells. The cell size can vary from 10 to 150 people (the Dunbar number). Each cell may or may not nominate a project manager to act as its coordinator. To work in concert and behave like a higher-order organism, cells have to share a common culture, common values and common assets. Miles et al.44 provide the examples of Technical and Computer Graphics (TCG) of Sydney, Australia, and of the Acer Group. TCG develops a wide variety of IT products and services and consists of 13 individual small firms built on cellularity. Each cell has its own purpose and autonomy, and shares a common purpose with the other cells. Some firms specialize in product categories, others in hardware or software. TCG keeps growing through a process called triangulation, a three-cornered partnership among one or more of TCG's cells, an external partner and a principal customer, to develop new products or businesses. The second example is Acer, where co-founder Stan Shih calls for a federation of self-managing firms held together by mutual interest. Each firm is either a client or a server of other firms in the federation.

Another example comes from the Italian hi-tech firm Loccioni Group, which was able to develop an ecosystem of more than eighty spin-offs (cells) that are clients and/or suppliers of Loccioni.

Ashkenas45 likewise reports the example of ConAgra. When Gary Rodkin came on board in 2002, he realized that the supply chain directors supporting each of the customer operating groups did not have enough direct access to the people working in other departments, who had all been centralized in the enterprise unit. So, when there was an issue, it took too long for these directors to pull together a response team. That led Rodkin's team to further tweak the design by creating a small supply chain support team dedicated to each customer group.

Finally, in strategy the concept comes from Skinner's focused factory46. Skinner's focalisation permits, following Luhmann's complexity reduction, the selection of the external complexity to cope with, thus focusing operating units (internal complexity) on the selected external complexity. In this way, organizations can cope with a high level of complexity (the sum of the complexity handled by each focalized unit) while keeping local complexity (the complexity managed by each single unit) low. Examples are Zanussi-Electrolux, which subdivided its production processes into five focused units focalized on different products, and McDonald's, which replicated its units and focalized them on the relevant market47.

According to the theories and literature examples mentioned above, we can conclude that, in order to solve the complexity dilemma when complexity becomes too high to be managed effectively (the right side of the complexity curve), the system can be modularized by focalizing business units, teams, products, processes and tasks, following Luhmann's complexity reduction. Modules reduce local complexity while increasing the global (system-level) complexity.

Managers can converge to the tipping point of the complexity curve by modularizing the system into focalized units.

Tip n° 2: Simple rules let complexity emerge

Simple rules are defined as "few straightforward, hard and fast rules that define direction without confining it"48. Although standardization, procedures and checklists facilitate the reproducibility of tasks and try to maximize the margins gained from reproducible knowledge, fast-changing environments require a search for creativity and space for action. Simple rules help managers in this work, codifying guidelines and schemes while leaving people space to create value themselves in products, processes and businesses. In this way it is possible to let complex behaviours3, self-organization and creativity emerge. Recently, Morieux and Tollman49 derived from practice six simple rules for managing complexity without being complicated, striving for cooperation. In nature, too, the complex behaviours of flocks emerge from a few rules of interaction among single individuals.

Examples of simple rules for developing new products come from Lego48: (1) Does the proposed product have the Lego look? (2) Will children learn while having fun? (3) Will parents approve? (4) Does the product maintain high quality standards? (5) Does it stimulate creativity? Another example is Yahoo!'s rules for designing new websites: (1) know the priority rank of each product in development, (2) ensure that every engineer can work on every project, (3) maintain the Yahoo! look in the user interface, and (4) launch products quietly.
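Lego's five screening questions can be read as a simple-rules filter: each rule is a yes/no predicate on a proposal, and together they define direction without prescribing the product itself. The sketch below is our illustration of that reading; the proposal's field names are invented for the example:

```python
# Lego's five screening questions from the text, expressed as a
# simple-rules filter. Field names are hypothetical illustrations.
LEGO_RULES = [
    ("has the Lego look", lambda p: p["lego_look"]),
    ("children learn while having fun", lambda p: p["fun_learning"]),
    ("parents approve", lambda p: p["parents_approve"]),
    ("maintains high quality standards", lambda p: p["high_quality"]),
    ("stimulates creativity", lambda p: p["stimulates_creativity"]),
]

def screen(proposal, rules=LEGO_RULES):
    # Return the names of the rules the proposal fails; an empty list is a pass.
    return [name for name, passes in rules if not passes(proposal)]

proposal = {"lego_look": True, "fun_learning": True, "parents_approve": False,
            "high_quality": True, "stimulates_creativity": True}
print(screen(proposal))  # ['parents approve']
```

Note that the rules say nothing about what the product should be: everything that passes the five checks is admissible, which is precisely how a simple infrastructure leaves room for complex outputs to emerge.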

Thus, when facing high levels of complexity, finding a few key simple rules to drive creativity and innovation keeps infrastructure and processes simple, while permitting complex outputs and behaviours.

Managers can converge to the tipping point of the complexity curve by simplifying structures and processes by means of simple rules.

Tip n° 3: Organisational capabilities make complexity manageable

Organisational capabilities are collective key intangible assets50 embedded in each organisation51. Garengo & Bernardi52 first developed a framework specifying that, when complexity increases, companies should build on their capabilities in order to manage that surplus of complexity, which would otherwise lead to the "Chaos Zones". Companies can, moreover, develop a surplus of capability in order to create the boost needed to manage more complexity and enter new markets or develop new products. Teece53 likewise observes that capabilities are necessary in turbulent and fast-changing environments.

In a recent work30 we clustered a list of more than 200 micro-capabilities – drawn from 46 frameworks in the literature – into four main capabilities, namely redundancy, interconnection, sharing and reconfiguration.

  • Interconnection: the ability to create open networks that exploit the small-world effect, in order to promote cooperation and integration with internal and external parties, and to develop brand and reputation based on a dialectical approach.

  • Redundancy: a surplus of intangible informational, relational, cognitive and functional resources, built through continuous learning.

  • Sharing: the ability to share values, vision, strategy, organizational processes and knowledge, through the development of trust and the incorporation and promotion of leaders at all levels.

  • Reconfiguration: the ability to read the context, capture weak signals and trends, recognise opportunities and threats early, innovate with strategic and operational flexibility in co-evolution with the environment, and continuously recombine knowledge, thanks to an entrepreneurial culture.

From multiple case studies30,31 we discovered that, in order to achieve higher levels of firm performance, companies have to develop a level of organisational capabilities coherent with the complexity they have to manage. Both undersized and oversized capabilities negatively influence performance: the former limit effectiveness and success due to lack of knowledge, resources, links and abilities, while the latter hurt efficiency due to excessive costs. An example of these effects comes from one of the biggest Italian retail companies31. In our prior works30,31 we carried out a nested case study of seven groups of large-scale retail distribution (cooperatives), thirty-two shops of one group, and five departments of one single shop, measuring through a questionnaire (Likert scale) the levels of internal complexity (interdependence, diversity, uncertainty, dynamicity), organisational capabilities (interconnection, redundancy, sharing, reconfiguration) and firm performance (time, quality, cost, flexibility, product differentiation, service differentiation, price). We found that the high-performing units were those that had developed a level of organisational capabilities coherent with the complexity to be managed, staying on top of the complexity curve (see Fig. 2). In the figure, the relation between internal complexity and organisational capabilities is represented by the ratio n between the two dimensions: when n is lower than 1, the complexity level is higher than the capabilities level; when n is higher than 1, the level of capabilities is higher than the level of complexity measured.


Fig. 2: Relation between the ratio n (level of organisational capabilities over level of complexity measured in the case studies) and performance
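As an illustration, the ratio n can be computed directly from questionnaire scores. The sketch below assumes a simple mean over the four dimensions on each side – the aggregation formula is our assumption, not detailed in the text – and the scores shown are hypothetical:

```python
def ratio_n(complexity_scores, capability_scores):
    # Ratio n of organisational capabilities over internal complexity.
    # Aggregating each side by a simple mean over its four dimensions
    # is an assumption; the questionnaire's scoring formula is not
    # detailed in the text.
    c = sum(complexity_scores.values()) / len(complexity_scores)
    k = sum(capability_scores.values()) / len(capability_scores)
    return k / c

# Hypothetical unit scored 1-5 (Likert) on the four dimensions of each side
complexity = {"interdependence": 4, "diversity": 3, "uncertainty": 4, "dynamicity": 3}
capabilities = {"interconnection": 4, "redundancy": 3, "sharing": 4, "reconfiguration": 4}

n = ratio_n(complexity, capabilities)
print(round(n, 3))  # n > 1: capabilities slightly exceed measured complexity
```

In this hypothetical case n is slightly above 1, i.e. the unit holds a small surplus of capability over its measured complexity, the configuration associated with high performance in Fig. 2.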

Thus, we suggest that companies facing high levels of complexity should invest in developing a coherent level of organisational capabilities (interconnection, redundancy, sharing and reconfiguration). A surplus of capability is needed to manage complexity, which confirms the necessity of capability redundancy. Investment in developing capabilities (i.e. training, sharing moments, process re-engineering, market analysis, etc.) is a requisite for achieving high performance. On the other side, an excess of capability produces the opposite effect on performance, due to ineffectiveness and the costs of building and maintaining excess capabilities.

Managers can converge to the tipping point of the complexity curve by developing a coherent level of organisational capabilities.


Conclusions

The paper introduced the concept of the complexity dilemma, based on the trade-off between growing and reducing complexity: Ashby versus Luhmann. To solve the dilemma we resort to the inverted U-shaped relation between complexity and performance. If, after a coherent assessment of the complexity level, companies are located on the left-hand side of the complexity curve, they should choose Ashby's strategy and nurture complexity (e.g. through integration, investments in capabilities, etc.). If they are located on the right-hand side of the complexity curve, they should follow Luhmann's complexity reduction and simplify the system (e.g. through modularization, simple rules). For moving in these scenarios we suggest three tips for dealing with complexity by reducing and managing it at the local scale (as per Luhmann's complexity reduction) while keeping it at the global scale (as per Ashby's law of requisite variety). First, modularity permits hiding complexity inside a module, segregating it and managing it through focalized processes, structures and capabilities, without reducing the global complexity level of the system. Second, simple rules reduce the managerial laws and useless procedures that paralyze organisations, fixing a few straightforward rules that serve as guidelines for defining processes while leaving room for creativity and complexity to emerge. Finally, organisational capabilities (interconnection, redundancy, sharing and reconfiguration) give companies the ability to properly and effectively manage a surplus of complexity, and act as boosters that grow a company's potential to enter new markets and embrace more external complexity.

Thus, we can summarize the work in the following main managerial lessons. To cope with the complexity dilemma, organisations and managers should:

  • Embrace the paradox of "increasing complexity through its local reduction", as modularity permits.

  • Let complex behaviours emerge by means of a few simple rules.

  • Develop and deploy a coherent level of organizational capabilities in order to manage complexity.