W. S. McCulloch
Einstein once defined truth as an agreement obtained by taking into account observations, their relations, and the relations of the observers. In his case, the observations were the coincidences of signals at points in frames of reference; their relations were matters of space and time in those frames; his observers were reduced to what Helmholtz called a locus observandi, devoid of prejudices and imagination; and the only relations he had to consider among them were their relative positions, motions, and accelerations. The truth he had in mind is a picture of the world upon which all observers can agree, because it is expressed in a manner invariant under the transformations required to represent the relations of the observers. It is a paradigm for what “scientific agreement” may mean.
Unfortunately for us, our data could not be so simply defined. It has been gathered by extremely dissimilar methods, by observers biased by disparate endowment and training, and related to one another only through a babel of laboratory slangs and technical jargons. Our most notable agreement is that we have learned to know one another a bit better, and to fight fair in our shirt sleeves. That sounds democratic, or better, anarchistic, as you have twice reminded me. Aside from the tautologies of theory, and the authority of unique access by personal observation of a fact in question, our consensus has never been unanimous. Even had it been so, I see no reason why God should have agreed with us. For we have been very ambitious in seeking those notions which pervade all purposive behavior and all understanding of our world: I mean the mechanistic basis of teleology and the flow of information through machines and men. In our own eyes we stand convicted of gross ignorance and worse, of theoretical incompetence.
Our meetings began chiefly because Norbert Wiener and his friends in mathematics, communication engineering, and physiology, had shown the applicability of the notions of inverse feedback to all problems of regulation, homeostasis, and goal-directed activity from steam engines to human societies. Our early sessions were largely devoted to getting these notions clear in our heads, and to discovering how to employ them in our dissimilar fields. Between sessions many of us made observations and experiments inspired by them, but we generally found it difficult to collect sufficient appropriate data in the 6 months between meetings. At the end of the first five sessions, of which there are no published transactions, we elected to meet but once a year, keeping our group together as nearly as possible, replacing a few who were lost to us, and inviting a few speakers to help us where help was needed most.
By the time we made this change, we had already discovered that what was crucial in all problems of negative feedback in any servo system was not the energy returned but the information about the outcome of the action to date. Our theme shifted slowly and inevitably to a field where Norbert Wiener and his friends still were the presiding genii. It became clear that every signal had two aspects; one physical, the other mental, formal, or logical. This turned our attention to computing machinery, to the storage of information as negative entropy. Here belong questions of coding, of languages and their structures, of how they are learned and how they are understood, including the theme of this, our last meeting, in which we expect to range from the most formal aspects of semantics, to its most concrete contact with the world about us. For all our sakes I wish Wiener were still with us, but I understand that he is at present happily immersed in the clear and serene domain of relativity.
To refresh our memories and inform our guests, let me recapitulate, in logical rather than chronological order, the topics we have considered, and on which I believe the majority of us have been of one mind, to the limit of our ability to understand the evidence or the theory. You may find the consensus more frequently in my statement than in our published transactions. I am compelled to watch your faces, and to guess, before I let you have the floor, whether you will speak to the point or not, and from which side of the fence. With malice aforethought I have given the malcontent the floor, because he disagreed, or doubted, however unreasonably. Before I knew you so well this happened by accident, but as time went on, and we learned one another’s languages, I learned that it was the best way to keep our wits on their toes. Our guests have been remarkably good sports, but the transcriptions of our discussions inevitably preserve the misunderstandings and altercations rather than the agreement. Of those who understood and agreed, the transactions reveal nary a trace.
Feedback was defined as an alteration of input by output; gain was defined as ratio of output to input; feedback was said to be negative or inverse if the return decreased the output, say by subtracting from the input. The same term, inverse or negative feedback, was used for a similar effect but dissimilar mechanism, wherein the return decreased the gain. The transmission of signals requires time, and gain depends on frequency; consequently, circuits inverse for some frequencies may be regenerative for others. All become regenerative when gain exceeds one. Regeneration tends to extreme deviation or to schizogenic oscillation, unless gain decreases as the amplitude of the signal increases. Inverse feedback determines some state to be sought by the system, for it returns the system to that state by an amount which increases with the deviation from that state. Servomechanisms are devices in which the state to be sought by the system is determined by signals sent to that system from some other source. These notions were applied to machines, including the steam engine and its governor, to the steering engines of ships, to well-regulated power packs, telephonic repeaters, self-tuning radios, automatic gun-pointing machinery, etc., and thereafter to living systems. Homeostasis was first considered in terms of reflexive mechanisms, in which change initiated in some part of the body caused disturbances, including nervous impulses, which were reflected eventually to that part of the body where they arose, and there stopped or reversed the processes that had given rise to them. Similar regulatory circuits entirely within the central nervous system were found to resemble the automatic volume control of commercial radios. Appetitive behavior was described as inverse feedback over a loop, part of which lay within the organism, part in the environment. 
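These definitions can be sketched in a toy discrete-time model; the loop gains and step counts below are my own illustrative assumptions, not figures from the conferences:

```python
def track_deviation(loop_gain, steps=20, deviation=1.0):
    """Follow the deviation from the sought state when the return subtracts
    loop_gain times the deviation at each step (a toy linear feedback)."""
    history = [deviation]
    for _ in range(steps):
        deviation = deviation - loop_gain * deviation  # inverse feedback
        history.append(deviation)
    return history

damped = track_deviation(0.5)   # gain below one: the deviation dies away
runaway = track_deviation(2.5)  # gain above one: regenerative oscillation
```

With the gain below one, the deviation decays toward the sought state; above one, the corrective return overshoots, and the deviation alternates in sign and grows, the regenerative oscillation described above.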
When a target or a goal could be indicated, a description of appetitive behavior was found to be couched in the same terms as that for self-steering torpedoes and self-training guns, whether these devices emitted signals reflected by their targets, or merely depended upon signals emitted by the target to readjust subsequent behavior to the outcome of previous behavior so as to minimize its error. Wiener drew a most illuminating comparison between the cerebellum and the control devices of gun turrets, modern winches, and cranes. The function of the cerebellum and of the controls of those machines is, in each case, to precompute the orders necessary for servomechanisms, and to bring to rest, at a preassigned position, a mass that has been put in motion which otherwise, for inertial reasons, would fall short of, or overshoot, the mark. These notions have served to guide subsequent neurophysiological research in the functional organization of the nervous system for the control of position and motion, some carried out in my laboratory in Chicago, and others by Wiener, Pitts, and Rosenblueth in the Institute of Cardiology in Mexico City, as well as by our friends in other laboratories. The general organization was found to consist of multiple closed loops of control, but the circuit action was extremely nonlinear, and consequently not amenable to any general simple mathematical analysis in terms of the Fourier Theory. Generally, multiple loops, severally stable by inverse feedback, may be unstable in conjunction, but the system can be stabilized by adding a portion of each of the returns and subtracting the sum from one or more of the servos. Such a system was found in the central nervous system by Setchenow in 1865, and rediscovered by Magoun. 
A group of us is studying the detail of its multiple afferents and its mode of affecting all reflexive activity; we shall use destructive lesions, and shall stimulate and map sources and sinks in various parts of the nervous system by methods presented at the last meeting. With failure of inhibitory signals or increased gain, the stretch reflex becomes regenerative, producing a rise in tone and a series of contractions known as clonus. This has been elegantly analyzed, quantitatively, by Rosenblueth, Pitts, and Wiener, as described at our conference. Moreover, they were able to demonstrate that the pool of relays of the so-called monosynaptic arc showed two numerous groups of relays, and a third less numerous, as judged by the random distribution of thresholds around three maxima. It will be years before we have fully exploited these notions.
Closed loops within the central nervous system — first suggested by Kubie as a substitute for undiscoverable motor activity proposed by the behaviorists to explain thinking in terms of reflexes, and by Ranson to account for homeostatic processes within the central nervous system, and independently discovered and demonstrated in the case of nystagmus by Dr. Rafael Lorente de Nó — were mentioned as possibly accounting for transitory memories by McCulloch and Pitts, who indicated that they were logically sufficient, but physiologically improbable, as an explanation for all forms of memory. Livingston has suggested that such mechanism might account for causalgic symptoms after blocking or removal of perverted peripheral circuits which had been rendered regenerative by some trauma resulting in streams of impulses over small afferent neurons appreciated as burning pain. Kubie had proposed that the core of every neurosis was a reiterative process in some closed loop.
I have summarized and presented to the Royal Society of Medicine evidence along all of these lines, with much more obtained from many varieties of intervention in causalgia. It is clear that the notions of feedback are the appropriate ones for the understanding of the normal function and diseases of the structures in question. Since that time, Dr. Galarvardin of Lyons, studying patients with auditory hallucinosis accompanied by muscular activity of mouth, tongue, and larynx, has had removed, bilaterally, the post-central somesthetic area for the face. The consequent disappearance of the hallucinosis had lasted 18 months when I last heard from him. This brings one symptom of a clearly organic psychosis into line with the findings on those obsessive compulsives who have been at least temporarily helped by frontal lobotomy, in that the central pathways of some reverberative process within the brain have been partially interrupted.
Again in terms of these notions, we have been able to make sense out of some aspects of what the psychologists have called goal-directed activity, and our attention has been duly called to the asymmetry of advance and escape, for in the former, the object sought is kept near the center of the receptive field of the sense organs, and behavior duly modified to approach it, whereas, in escape, learning along these lines cannot occur, and the behavior may easily become stereotyped. The most complex situations we have heard discussed are the stabilities engendered by inverse feedback in social structures of isolated communities reported principally by social anthropologists. Their devices have been extremely elaborate, depending, in some cases, on many interwoven loops. They seem to have utilized elaborate forms of distinctions and rules with respect to kinship, forms of address, hazing, bullying, praise, blame, and even rituals with respect to eating. Examples from ecology and from the behavior of anthills have extended these notions of inverse feedback.
Our members interested in economics and the polling of public opinion made use of these notions to explain fluctuations of the market, the banter leading to fight in roosters and boys, and the armament races initiating wars. In such circular systems, it becomes difficult to detect the causal relations. Wiener handled this by pointing out that it was possible to detect causality in the statistical sense by auto- and intercorrelations with lag, in those situations in which correlation was not perfect between the time series of the related component events, and explained how, with such devices, optimum predictions could be obtained. He doubted the applicability of this method to social problems, because of the shortness of our runs of the time series of information concerning human behavior. In these terms we discussed how a fielder catches a ball and a cat a mouse.
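Wiener’s device of detecting causality in the statistical sense by correlation with lag can be sketched as follows; the toy time series, the causal delay of three steps, and the coefficients are all my own assumptions:

```python
import random

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(500)]
# y is driven by x three steps earlier, plus a little noise: x "causes" y.
y = [0.0] * 3 + [0.8 * x[t - 3] + 0.2 * random.gauss(0, 1) for t in range(3, 500)]
# The correlation peaks at the causal lag.
best = max(range(10), key=lambda k: lagged_corr(x, y, k))
```

The lag at which the correlation peaks recovers the delay built into the series; on short runs, as Wiener warned, the peak would be far less trustworthy.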
The question of conflict between motives was then raised by the psychiatrists, who, like psychologists, would like to have some common measure of value among human desires, comparable to what economists believe they have in the doctrine of marginal utilities and price in an open market. Kubie raised the question of the urgency of dissimilar ends, beginning with the need for moderate temperature, air, drink, food, sleep, and sex, the most urgent need resulting in the simplest response, and the least urgent allowing elaborate play. I indicated that an organism endowed with six neurons, constituting three chains of inverse feedback, and interrelated either by the requirement of summation or by inhibitory links, was sufficiently complicated to exhibit the value anomaly, and, if organization were left to chance, would do so half the time. That is, given A and B it would prefer A; given B and C it would prefer B; but given C and A it would prefer C. A similar question was raised concerning dominance in the pecking order of chickens, but there were no adequate data as to the number of circles in coops of given sizes to settle the question. By this time we had become so weary of far-flung uses of the notion of feedback that we agreed to try to drop the subject for the rest of the conference.
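The value anomaly just described, preferring A to B, B to C, and C to A, can be checked mechanically: no ranking on a single scale of value reproduces such a cycle. A minimal sketch:

```python
from itertools import permutations

# The choices from the text: A over B, B over C, and C over A.
prefers = {("A", "B"): "A", ("B", "C"): "B", ("C", "A"): "C"}

def consistent(order):
    """True if ranking `order` (best first) agrees with every pairwise choice."""
    rank = {item: i for i, item in enumerate(order)}
    for (a, b), winner in prefers.items():
        loser = b if winner == a else a
        if rank[winner] > rank[loser]:
            return False
    return True

# Every one of the six possible rankings violates some pairwise choice.
rankings = [p for p in permutations("ABC") if consistent(p)]
```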
Two interesting digressions appeared at this point: The first concerned cardiac flutter, which appears as a propagated disturbance running around the periphery of an area it cannot cross, and which therefore cannot stop itself, whereas fibrillation appears as a disturbance which wanders over changing paths determined from moment to moment by shifts of threshold produced by previous activity at those points. Its mathematical analysis was indicated but not presented to the group. Second, Pitts presented a theory of disturbances in random nets, such perhaps as the cerebral cortex, in which it was possible to find a value around which to perturb the activity; namely, that at which the probability of a signal in a neuron is equal to the probability of a signal in the neurons afferent to it.
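Pitts’ fixed point can be illustrated with a threshold model of a random net; the fan-in of ten afferents and the threshold of five coincident impulses are my own illustrative assumptions, not his parameters:

```python
from math import comb

def next_prob(p, k=10, theta=5):
    """Chance that a neuron fires when each of its k afferents fires
    independently with probability p and at least theta coincident
    impulses are required (k and theta are illustrative assumptions)."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(theta, k + 1))

# Bisect for the level at which a neuron's firing probability equals that
# of its afferents: next_prob(0.1) < 0.1 while next_prob(0.5) > 0.5.
lo, hi = 0.1, 0.5
for _ in range(60):
    mid = (lo + hi) / 2
    if next_prob(mid) > mid:
        hi = mid
    else:
        lo = mid
p_star = (lo + hi) / 2
```

At p_star a neuron fires with the same probability as its afferents, the value around which to perturb the activity; in this model the fixed point is unstable, activity above it regenerating toward saturation and activity below it dying out.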
Moreover, we had all come to realize that for problems of feedback, energy was the wrong thing to consider. The crucial variable was clearly information.
We began by considering computers as “analogue,” if the magnitude of some continuous variable like voltage, pressure, or length were made proportional to a number entering into a computation; and as “digital” if their components had a set of stable values (at least two) separated by regions of instability, and the number was represented by the configuration of the stable states of one or more components. Analogue devices showed tendencies for errors to appear in the least significant place, but were limited by precision of manufacture and could not be combined to secure additional places. Digital devices might show errors in any place (a limitation inherent in all positional nomenclatures), never required extreme accuracy, and could always be combined to secure another place, at the same price per place as previously. When components are relays, the digital devices sharpen the signal at every repetition. We considered Turing’s universal machine as a “model” for brains, employing Pitts’ and McCulloch’s calculus for activity in nervous nets. It uses the calculus of propositions of the Principia Mathematica, subscripted for the time of occurrence of an impulse of a given neuron. We demonstrated the equivalence of all general Turing machines, and how they could be designed to answer any non-paradoxical question which could be put to them in an unambiguous manner. We considered the far-flung conclusions that followed here from Goedel’s arithmetizing of logic. It became clear that having ideas required circuits capable of computing invariants under the necessary groups of transformation, that is, reverberant activity preserving the form of its afferent, initiated at one time, or inverse feedback leading some figure of input by some path to a canonical presentation out of its many legitimate ones.
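The remark that digital relays sharpen the signal at every repetition can be shown with a toy chain of repeaters; the noise level, stage count, and threshold are assumptions of mine:

```python
import random

def relay_chain(bit, stages, digital, noise=0.1, seed=1):
    """Send a 0-or-1 signal down a chain of noisy repeaters. A digital relay
    snaps back to the nearer stable value at every stage; an analogue one
    passes the accumulated noise along (all parameters are toy assumptions)."""
    rng = random.Random(seed)
    level = float(bit)
    for _ in range(stages):
        level += rng.gauss(0, noise)               # each stage adds its own error
        if digital:
            level = 1.0 if level >= 0.5 else 0.0   # the signal is sharpened
    return level

sharp = relay_chain(1, stages=50, digital=True)     # ends on a stable value
smeared = relay_chain(1, stages=50, digital=False)  # errors have accumulated
```

The digital chain ends on one of its two stable values however long it runs, whereas the analogue chain carries the sum of every stage’s error, which is why analogue places could not be combined to secure additional precision.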
Gestalt notions led only to multiplications of particulars with distortions attributed to “cortical fields” in which currents are conserved, though in nature there are sources and sinks, and although the areas of cortex they are said to pervade are anatomically discontinuous. The discrete action of nervous components was considered the only way in which they would normally function to handle the amount of information transmitted through them. Gross disturbances of function (epilepsy, etc.) were seen to be accompanied by gross fluctuations affecting most of the neurons in a given area in much the same way, thus producing a loss of information. Emotions were considered as expressions of some overrunning of parts of the computer, producing somewhat fixed response to diffuse and variable inputs, as if in a Turing machine the computed value of an operand ceased to affect subsequent operations. Wiener proposed that by glandular means emotions might broadcast a “to whom it may concern” message, causing items to be locked in, or remembered. It was suggested that the best way to find out what an unknown machine did was to feed it a random input; clearly it had to be random in terms of the aspects of the input that the machine could discriminate. This was likened to the Rorschach Test, and its auditory equivalent, and it was noted that the gibberish produced by free association was apt to cause the psychiatrist to project his own difficulties on his patient.
Three kinds of storage called memory were discussed at length: The first, active reverberation, such as in the acoustic tank, was recognized as responsible for nystagmus and the only storage left in presbyophrenia. J. Z. Young made use of the same notion to describe the residual memory after the destruction of the main memory organ of the octopus. The second type of storage has been found in a separate structure, with well defined and separate access and egress, only in the octopus. The organ itself is composed of a host of small cells; the nature of its synapses is not yet well known. This is the storage that has excited theoretical physicists because of the immense number of bits retained by it. Von Foerster computed its size from access time times access channels against the mean half-life of the trace, and Stroud from the number of snapshots, one every tenth of a second, at, for example, a thousand bits per frame. The figures are in rough agreement that it lies between 10¹³ and 10¹⁵ bits. Instead of declining asymptotically to zero, a few per cent of the items are retained forever. Von Foerster has proposed mechanisms to account for this, requiring circa 0.02 watt; the brain is a 24-watt organ. Access to this store is probably not by simple addresses sought seriatim. Recall seems to rest on a process locating items by their contents. This was discussed by Von Neumann concerning similarities, and by Kluver concerning stimulus equivalences, but both had apparently noted retention of eidetic fragments, a topic which should be gone into much more thoroughly hereafter. Peculiarities of this kind of storage in man seem to be that the contents are a series of snapshots, each devoid of motion; they are accessible in the order of filing, not in the reverse order; there is a delay of about a minute between the making of the trace and the time when it is first accessible; and, finally, a snapshot too similar to the one before it may upset the process.
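Stroud’s estimate can be checked by back-of-envelope arithmetic; the frame rate and the bits per frame are from the text, while the span of sixty years is my own assumption for a lifetime of snapshots:

```python
# The frame rate and bits per frame are from the text; the sixty-year span
# is an assumed lifetime over which the snapshots accumulate.
bits_per_frame = 1_000
frames_per_second = 10                      # one snapshot each tenth of a second
seconds_in_sixty_years = 60 * 365.25 * 24 * 3600
total_bits = bits_per_frame * frames_per_second * seconds_in_sixty_years
# total_bits comes to roughly 1.9e13, inside the quoted range of 10^13 to 10^15
```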
These traces cannot be simply localized; each bit is an alteration of synapses effective somewhere in a net, and the alteration is not confined to some one junction. The third type of storage seems to behave more like the growth-with-use characteristic of muscle, and shows fatigue on too frequent testing. Shurrager apparently has evidence that it can occur at a monosynaptic reflex level, and changes with use have been seen where the vagus contacts ganglionic cells in the frog’s auricle, but there is no evidence that such a change persists anywhere in the central nervous system, and there is no anatomical evidence that it ever happens there. Some change of organization with use does occur perhaps as suggested by Ashby. The organization of the visual cortex with use is a case in point. When congenital cataracts were removed, the difficulty with vision was in part found to be attributable to antithetical organization of these mechanisms by impulses from elsewhere in the central nervous system.
“Traffic jams” of brains become increasingly probable with increase in volume, for the number of long distance connections cannot be expanded to keep pace with the number of relays to be connected, except by increasing cable-space disproportionately. It was suggested that potentiation, described by Lloyd, may serve to lock in lines temporarily on a basis of their previous use; this would resemble a scheme proposed in Holland for more efficient use of limited telephonic facilities. Repetitive firing of cerebrospinal neurons leaving the cell body and dendrites largely depolarized when axons were hyperpolarized (P2 after-potential) would thus account for that component of facilitation marked by surface negativity and depth positivity of the cerebral cortex, as a matter of lowered threshold, with increased voltage of volley delivered to cord. The same mechanism would account for the stiffening up of the Parkinsonian patient.
Considerable time was spent discussing the way in which the actual flow of information determined the structure of groups, and the way in which command moved from moment to moment to that place in the net where most information necessary for action was concentrated. In parallel computing machines, including brains, when one part is busy or damaged, another will serve for the same computation. This requires that the whole machine be tended by some part of itself which can switch problems to whichever parts are free. Such a machine might give correct answers when most of it was out of commission. What appears on one side as the problem of redundancy of neurons and channels composed of them appears on the other as the problem of securing infallible performance from fallible components. Von Neumann’s last work on this score, delivered at a conference on the West Coast, is titled “Probabilistic Logic.”
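Securing infallible performance from fallible components can be sketched in the spirit of von Neumann’s multiplexing: run many copies of an unreliable gate and restore the line by majority vote. The gate, its error rate, and the number of copies below are illustrative assumptions of mine:

```python
import random

def noisy_nand(a, b, flip, rng):
    """A NAND gate that errs with probability `flip` (a fallible component)."""
    out = not (a and b)
    return (not out) if rng.random() < flip else out

# Run many copies of the fallible gate on the same inputs and restore the
# line by majority vote; the assembly errs only if most copies err at once.
rng = random.Random(42)
votes = [noisy_nand(True, True, flip=0.05, rng=rng) for _ in range(101)]
restored = sum(votes) > len(votes) // 2   # the majority's verdict on NAND(1, 1)
```

A single gate is wrong one time in twenty; for the majority of a hundred and one copies to be wrong, more than fifty must fail on the same trial, which is fantastically improbable.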
With respect to language, second only to vision as a source of information to brains, and all important in human communication, not to mention psychoanalysis, it was generally admitted that we almost never say anything unless we wish someone to do something about it. Apart from this general hortatory aspect, language contains a few signs, such as “hum,” “um hum,” “unh unh,” and “huh,” that are specifically so, but otherwise contentless. The question arises whether the logical particles come from such signs, used by dogs and small children. It was generally agreed, as stated by Dr. Mead, that when all definitions must be ostensive, as in learning a language where no possibility of translation exists, e.g., as a child, or as a newcomer to an island of an alien tongue, it is best to learn from children, because they will repeat indefinitely. Learning was first defined as an alteration of transition probabilities. Speech, broken into phonemes, distinguished, according to Jakobson, by a few decisions between opposites, poorly represented at best in the conventional spelling of English, and studied by Licklider’s method of chopping and distorting it to an incredible extent, retains its intelligibility when little is left beyond an indication of when the pressure waves cross the axis, and even this is enormously redundant. One point, recalling the ten snapshots per second, is the peculiarity of speech that it remains intelligible when each tenth of a second is half speech and half noise many decibels louder. The total amount of information conveyed by speech is probably not more than ten bits per second, though it takes a thousand bits per second to produce a sound indistinguishable from it. Shannon’s work on the redundancy of English in reducing the amount of information conveyed per symbol was studied from the position he shares with Wiener, that information is negative entropy.
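Shannon’s point about redundancy can be illustrated on any short sample of English; the sentence below is my own, and single-letter frequencies only scratch the surface of the redundancy he measured:

```python
from collections import Counter
from math import log2

# The sample sentence is my own. An alphabet of 26 signs used without
# preference would carry log2(26), about 4.7 bits per letter; the unequal
# letter frequencies of English already carry less.
text = "the redundancy of english reduces the information conveyed per symbol"
letters = [c for c in text if c.isalpha()]
counts = Counter(letters)
n = len(letters)
entropy = -sum((k / n) * log2(k / n) for k in counts.values())  # bits per letter
```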
The recipient has a set of entities to match the signals he is intended to receive, and the signal causes him to make the selection. This selective information was found to be comparable to MacKay’s logon information but not to his metron information, the point being that the entropic cost of a metron of information goes up as the square of the number of metrons, rather than as their number.
We have considered Zipf’s law — that the number of kinds of any given rarity is proportional to the square of the rarity — but I do not think we are satisfied either as to the validity of the law, the basis of the exceptions, or the universe it presupposes. Finally, we have proposed to look into the amount of information conferred upon us by our genes, and have tried to straighten out for ourselves those difficulties that have arisen because of confusion of levels of discourse. It is my hope that by the time this session is over, we shall have agreed to use very sparingly the terms “quantity of information” and “negentropy.”
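Zipf’s law in the form quoted can be illustrated with a toy corpus whose frequencies follow the rank form of the law (the constants are arbitrary assumptions); counting the kinds that share each frequency shows the stated fall-off, in the discrete case 1/(n(n+1)), which goes as the square of the rarity for rare kinds:

```python
from collections import Counter

C, N = 10_000, 10_000
freq_of_rank = [C // r for r in range(1, N + 1)]  # rank form: frequency ~ C/rank
kinds = Counter(freq_of_rank)                     # kinds[n]: word-kinds seen n times
# Kinds of frequency 1 versus frequency 2; the 1/(n(n+1)) law predicts
# (1/(1*2)) / (1/(2*3)) = 3.
ratio = kinds[1] / kinds[2]
```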