Desouza and Hensgen “call attention to the inadequacies plaguing current methods of information management… [and] wish to shed some light on how organizations might better manage…available information and transform such information into knowledge that is useful, valuable, and actionable” (p. x). With these goals in mind, it seemed very appropriate that the authors called upon systems science principles and the role of systems’ emergence, in this instance, the materialization of information into knowledge. “Information shapes an organization,” (p. xi) the authors stress, and it can be said that, to a great extent, an organization reflects the state of its information management. By understanding the dynamics of information flow within an organization we can, indeed, judge the quality of the organization. In spite of its title, however, this book has a stronger focus on the technology of communication. Within its almost 250 pages, the book exhibits a strong bias toward the mechanics of information technology and will thus be a useful resource primarily for managers and executives of various organizations.
Within the text, the authors attempt to contrast the quality of information processing within an organization with the way a crisis impacts the organization. From this position, they then present a rather complicated model for error and crisis detection and aversion, in spite of their own statement that “simplicity is the best way to solve any problem” (p. 148). For most models and theories, the ‘law of parsimony’ should be the gold standard, as it is the simplicity and adaptability of a model that increase its usefulness. Practical features of this book are the examples of crises from the recent past that follow the authors’ more general statements; in addition, each chapter concludes with a summary of the covered topic, highlighting the lessons presented. The book also adds valuable insight into how to build virtual crisis centers.
It is a reality, and the authors do acknowledge it, that it is the “human reaction and the ability to process new information [that] compounds…complexity” (p. 3). It is well known that, throughout history, most problems could be traced to ‘human error.’ According to Bilson (2004: 21), “Problems arise in a context of perceived inequalities of power.”
In general, human reaction to a stimulus can range from a simple reflex, occurring even without our cognition, to a multi-level mosaic of biology, culture, linguistics, and emotion. All these processes take place within the broader category of our biology and almost irrespective of the available technology. Our various senses have an uneven ability to sort out signals from the environment. The key to our understanding of what is actually happening outside of us, and even within us, is how we ‘understand’ what our senses have allowed to pass through. Those signals eventually meet our cognitive threshold which, even among the brightest of us, seems to be set at a low capacity, about 25 bits per second, as compared with the totality of the available information.
Specifically, our senses impose significant limitations on the information available for cognition: their inherent filters select and/or compress the vast amounts of information coming our way before the stimuli ever face our cognitive threshold. In addition, the meaning derived from information is based on its interpretation, which varies for each of us according to the predominant receptors we use (is our sensory orientation primarily visual or auditory?), our cognitive abilities (logic and pattern recognition), and our emotional makeup and cultural background, all of which operate within ongoing temporal change. In the final analysis, it is more likely the possession of the right paradigm by the observer, not the information itself, that allows for the transformation of information into true knowledge.
The study of the physiology of vision has been very illuminating and clearly shows the role our senses play in what actually enters our cognitive domain as opposed to what hits our receptors. Most of the information that we perceive comes from our vision, about 70% of it, with roughly 20% supplied by our hearing and the remainder by the other senses (e.g., touch, proprioception, etc.). For example, as Nolte (2001) points out, visible light is composed of electromagnetic waves with wavelengths of about half a micron. By comparison, ultraviolet light has wavelengths shorter than that, and there are electromagnetic waves with wavelengths a thousand times smaller still, yet we are capable of perceiving light only in the visible range of the electromagnetic spectrum. This is because the wavelengths that excite the retina in human eyes vary from about 0.4 microns (violet) to 0.7 microns (red). The retina not only responds to a narrow spectrum of the available electromagnetic waves but also has only a limited number of sensors and nerve endings that can be activated. Nolte points out that “The human retina contains about 120 million rods and about 6 million cones, but only about 1 million ganglion cells…[which means that] the light information falling on 126 photoreceptors produces only a single signal in one ganglion cell axon” (p. 72). When the signal-transmitting neurons from the retina are activated, it is also only within a certain range, as the neurons do not respond below or above specific levels. Nolte (p. 73) calls this ‘information convergence,’ which compresses the signals from 126 million receptors into only 1 million axons in the optic nerve. This 126:1 compression in the retina is reminiscent of lossy digital compression techniques such as the MP3 format, though considerably more aggressive.
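Nolte’s retinal figures can be checked with simple arithmetic; the short Python sketch below (using only the numbers quoted above, not new measurements) reproduces the 126:1 convergence ratio.

```python
# Retinal "information convergence" per Nolte (2001): ~120 million rods and
# ~6 million cones feed only ~1 million ganglion-cell axons in the optic nerve.
rods = 120_000_000
cones = 6_000_000
ganglion_axons = 1_000_000

photoreceptors = rods + cones
convergence_ratio = photoreceptors / ganglion_axons

print(f"{photoreceptors} photoreceptors -> {ganglion_axons} axons")
print(f"convergence ratio: {convergence_ratio:.0f}:1")  # prints 126:1
```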
David Nolte (2001) further points out that “spoken language proceeds at approximately 25 bits per second… [and that] data rate for non-verbal information is virtually the same as the 25-bit-per-second rate for language comprehension [our cognitive threshold] (pp. 106-108)… The entire brain, if flattened out, would measure about a third of a meter. [First,] neural impulses travel at a velocity between 1 to 100 meters per second… [then] signals…jump the 20-nanometer synaptic gap… The time it would take a neural impulse to travel a third of a meter is between 3 and 300 milliseconds… [which is the] range of time [that] is involved in many aspects of cognition… [For example] the response time while driving a car is around 300 milliseconds” (p. 47).
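Nolte’s timing estimate is easy to verify. The back-of-the-envelope sketch below, using only the figures quoted above, computes the travel-time range for a neural impulse crossing a third of a meter at 1 to 100 meters per second.

```python
# Travel time for a neural impulse across the "flattened" brain (~1/3 m),
# at conduction velocities from 1 m/s (slow fibers) to 100 m/s (fast fibers).
distance_m = 1 / 3
slow_velocity_mps = 1.0
fast_velocity_mps = 100.0

slowest_ms = distance_m / slow_velocity_mps * 1000   # ~333 ms
fastest_ms = distance_m / fast_velocity_mps * 1000   # ~3.3 ms

print(f"impulse travel time: {fastest_ms:.1f} ms to {slowest_ms:.0f} ms")
# This matches Nolte's 3-300 ms range, the timescale of many cognitive acts,
# such as the ~300 ms response time while driving a car.
```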
The authors also state that “Information may be thought of as the nutrients that…have the potential to benefit the entire organization” (p. 23), and that “information deprivation has the opposite effect” (p. 24). Sometimes, however, the absence of information may be just as important as its presence. A simple analogy can be drawn with the digital programming world, where the presence and absence of information, epitomized as the “1s” and “0s,” carry equal weight; they are not opposites.
Information, like energy, can be seen flowing through an open system. There is an intake and, much as with our diet, we need to know what the data/information is and where it comes from, its reliability, reproducibility, originality, etc. Then, in an open system, there is the throughput, or metabolism, which processes the information; here, we need to decide what we keep and what we discard, and appreciate how the information modifies our system. Finally, there is the output, which materializes what we do with the information: are we able to successfully translate it into knowledge, make appropriate decisions, or merely file it for future use?
The reviewed book emphasizes that “It is essential to understand that while in transit, information and data are to be considered neutral, that is, no interpretation should be ascribed until the information is reviewed using semantics” (p. 59). This ‘neutrality’ of information needs to be questioned because, for a signal to become data that leads to information, it must first be selected by a ‘collector,’ be it a human or a programmed instrument. This means that even at this point of ‘collection,’ data and information are not neutral and are all subject to Heisenberg’s uncertainty principle and the role of the observer. It is the selection and connection of the dots/data that gives us information. What it means, however, with the implications for the entire system under consideration, is in the province of knowledge creation. In the Epilogue, the authors appropriately mention Gödel’s Incompleteness Theorem: “all logical systems of any complexity are, by definition, incomplete” (p. 192). The filtering of data by our senses seems to be a protective biologic process, as otherwise we would be totally overwhelmed by the massive deluge of information coming our way. Also worth considering in this regard is the fact that data or information may independently trigger a sudden protective reflex (pain, fear, etc.), well before the signal even reaches our conscious mind.
“A central feature of semiotics is the ability, through analysis and reconstruction of signs, to generate innovation,” (p. 49) according to the authors. Real innovation flourishes at the inner edge of chaos and diminishes toward the outer edges of chaos, where randomness, disorganized complexity, and volatility prevail. What determines the real proclivity for translating information into innovation is the timing of the information’s occurrence and where the system is in its cycle. Similarly, errors and crises assume different roles depending upon where they are encountered within the organizational cycle: errors occurring within the inner edge of chaos will likely be corrected or extinguished; if they occur at the outer edge of chaos, they will probably have an exponential and negative impact on the system, similar to the biology of cancer (Janecka, 2007).
We can read on page 90 that “Emergence refers to components of a system that are so basic they are called ‘building blocks’ that appear, or emerge, at the most fundamental levels in organization or systems.” It is hard, from a systems science point of view, to conceptualize emergence as ‘building blocks’; instead, emergence represents a unique end product of the relationships among all of a system’s components. How male and female information-encoded DNA combine to create a new life, an emergent phenotype, may serve as a good example.
Communication is primarily a biologic, cultural, linguistic, and emotional enterprise taking place in what Maturana (Maturana, 1988; Russell and Ison, 2004) calls ‘conversation.’ It is unlikely that a primary focus on technology will improve the creation of knowledge from information. The Iraq war in the early 21st century seems to epitomize the limitations of dependence on technology without due regard for biology and culture, the domain where the real ‘conversation’ takes place. What would help improve worldwide communication is to incorporate advancing technology and systems science in a biologically friendly manner. It is quite important always to consider the informational context, even at the early origins of information, during the sprouting of data. It is the context that begins to define the boundary of a pertinent system; this is primarily a cognitive function and should not be delegated to any mechanical instrumentation. This point may present a seeming dilemma of how to handle massive amounts of data, e.g., electronic communications. A systems science approach, however, even to a very complex issue such as national security, would suggest that the emphasis should be on the evolving contextuality, present and historical, and not primarily on the overwhelming technical processing of massive amounts of data. It is all about gestalt. Context forms the pattern, with boundaries, against which we reflect a specific data set in order to get information, or specific information to gain knowledge. A progressive increase in boundary and pattern complexity can be seen in the ascending process of our expanding paradigm.
At first glance, the application of semiotics to information seems a very appropriate step, as semiotics is about symbols and information contains symbols. The mechanical division of semiotics into syntactic, semantic, and pragmatic branches, however, opens the door to fragmenting something which should always be processed as a whole, even from the very beginning, from the accumulation of data to the creation of meaning. Information processing would greatly benefit from systems science, which provides excellent guidelines. A very useful metaphor is to consider information processing as a biologic phenomenon, from the signal’s encounter with the sensory receptors, through its translation into electrical impulses, to the cognitive and emotional filters. This is how humans basically deal with informational input. There are many analogies between terrorism and cancer; for this reason, looking at terrorism with the knowledge of biology, rather than through pure information processing, can be quite helpful. The biologic perspective highlights not only early detection and treatment but also the most valued step, prevention (Janecka, 2002).
Information is inseparable from systems because, wherever it comes from, it is generated within a system. All systems are related: a system is simultaneously a component/subsystem of a larger system, and so forth. Viewing data and information from this perspective allows systems science to be of the greatest help, as it integrates information into the larger whole. As living entities evolve in cycles, and not in an endless longitudinal progression, information’s role is also determined by the cycles’ temporal and spatial ‘coordinates.’ Even the most minuscule and seemingly insignificant information can have the most profound influence on a system’s cycle if it arrives at a critical moment within the outer edge of chaos. To evaluate data and information, one must first ascertain where the organization is within its own cycle, and within at least one larger cycle, as this forms the framework within which the eventual knowledge will be most meaningful. During any crisis, it is good to remember (Martin, 2004: 74) that “Fear transforms us in physiological ways, making us less able to take in and learn new information.”
- Bilson, A. (2004). “Escaping from intrinsically unstable and untrustful relations: Implications of a constitutive ontology for responding to issues of power,” Cybernetics & Human Knowing, ISSN 0907-0877, 11(2): 21-35.
- Janecka, I.P. (2002). “Terrorism and cancer: Striking similarities,” Plastic and Reconstructive Surgery, ISSN 0032-1052, 110: 697-698.
- Janecka, I.P. (2007). “Cancer control through principles of systems science, complexity, and chaos theory: A model,” International Journal of Medical Sciences, ISSN 1449-1907, 4: 164-173, http://www.medsci.org/v04p0164.htm.
- Martin, R.J. (2004). “The once and future: Thoughts and notes,” Cybernetics & Human Knowing, ISSN 0907-0877, 11(2): 71-76.
- Nolte, D.D. (2001). Mind at Light Speed: A New Kind of Intelligence, ISBN 9780743205016.
- Russell, D. and Ison, R. (2004). “Maturana’s intellectual contribution: A choreography of conversation and action,” Cybernetics & Human Knowing, ISSN 0907-0877, 11(2): 36-48.