
Mechanism, Organism, and Society: Some Models in Natural and Social Science

Published online by Cambridge University Press: 14 March 2022

Karl W. Deutsch*
Affiliation: Massachusetts Institute of Technology

Extract

Men think in terms of models. Their sense organs abstract the events which touch them; their memories store traces of these events as coded symbols; and they may recall them according to patterns which they learned earlier, or recombine them in patterns that are new. In all this, we may think of our thought as consisting of symbols which are put in relations or sequences according to operating rules. Both symbols and operating rules are acquired, in part directly from interaction with the outside world, and in part from elaboration of this material through internal recombination. Together, a set of symbols and a set of rules may constitute what we may call a calculus, a logic, a game or a model. Whatever we call it, it will have some structure, i.e., some pattern of distribution of relative discontinuities, and some “laws” of operation.
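
As a purely illustrative aside, not drawn from Deutsch's text: the idea that a set of symbols together with a set of operating rules constitutes a calculus or model can be sketched in a few lines of Python. The particular symbols, the reversal rule, and the remembered pattern below are hypothetical choices made only for this sketch.

    # A "model" as a set of symbols plus an operating rule that recombines
    # remembered patterns of those symbols into new ones.
    symbols = {"A", "B", "C"}              # coded traces of past events
    remembered = ("A", "B", "C")           # a pattern learned earlier

    def operating_rule(pattern):
        """One arbitrary operating rule: recombine a pattern by reversing it."""
        return tuple(reversed(pattern))

    assert set(remembered) <= symbols      # recall draws only on stored symbols
    print(operating_rule(remembered))      # ('C', 'B', 'A'): a new pattern from old symbols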

Type: Research Article
Copyright: © 1951, The Williams & Wilkins Company


Footnotes

The substance of this paper was presented to the joint meeting of the Philosophy of Science Association and the American Association for the Advancement of Science at New York on December 30, 1949, and some passages have appeared in Goals for American Education, New York, Harper Brothers, 1950.

References

1 Philipp Frank, Relativity—A Richer Truth, Boston, Beacon Press, 1950.

2 A. Rosenblueth and N. Wiener, “The Role of Models in Science”, Philosophy of Science, 12, pp. 317–318 (1945).

3 H. T. Pledge, Science Since 1500, New York, Philosophical Library, 1947, p. 29.

4 Cf. N. Wiener, Cybernetics, New York, John Wiley, 1948, pp. 40–56.

5 In Goals for American Education, New York, Harper Brothers, 1950, p. 131; italics mine.

6 Ibid., pp. 131–132.

7 “Idea of a Universal History on a Cosmo-Political Plan” by Immanuel Kant, translated by Thomas De Quincey, Speculations Literary and Philosophic, Edinburgh, Adam and Charles Black, 1862, pp. 133–152.

8 Cf. E. Heimann, History of Economic Doctrines, New York, Oxford University Press, 1945, pp. 132, 177; E. Roll, A History of Economic Thought, 2nd ed., New York, Prentice Hall, 1942, pp. 232–247.

9 R. G. Collingwood, The Idea of History, Oxford, Clarendon Press, 1946, pp. 46–52.

10 Migne, Patrologia Latina, 50, 667; cited in E. Rosenstock-Huessy, The Christian Future, (London, S.C.M. Press, Ltd., 1947), V. 75, n. 1.

11 A. J. Toynbee, A Study of History, 2nd ed., London, Oxford University Press, 1945; E. Rosenstock-Huessy, Out of Revolution: Autobiography of Western Man, New York, Morrow, 1938.

12 Simon Kuznets, “Measurement of Economic Growth”, in Economic Growth, A Symposium, A Reprint of The Tasks of Economic History, Supplement VII, 1947, The Journal of Economic History, New York University Press, 1947.

13 On this whole subject, see also A. Rosenblueth, N. Wiener and J. Bigelow, “Behavior, Purpose, and Teleology,” Philosophy of Science, X, January, 1943, pp. 18–24; W. S. McCulloch and W. Pitts, “A Logical Calculus of the Ideas Immanent in Nervous Activity,” Bulletin of Mathematical Biophysics, V, 1943, pp. 115–133; F. C. S. Northrop, “The Neurological and Behavioristic Basis of the Ordering of Society by Means and Ideas,” Science, 107, No. 2782, April 23, 1948.

14 “Higher Education and the Unity of Knowledge,” Goals for American Education (New York, Harper Brothers, 1950), pp. 110–111.

15 Somewhat differently phrased, a communications network is “a system of physical objects interacting with each other in such a manner that a change in the state of some elements is followed by a determinate pattern of changes in other related elements, in such a manner that the changes remain more or less localized and independent of the system from other sources” (Walter Pitts); a communications channel is a “physical system within which a pattern of change can be transmitted so that the properties of that pattern (or message) are more or less isolated from other changes in the system” (Norbert Wiener); “A state description is a specification of which of its possible states each element of the network is in. A message is any change in the state description of a network or part of it” (Pitts), or again somewhat differently stated, “a message is a reproducible pattern regularly followed by determinate processes depending on that pattern” (Wiener). Oral communications, Massachusetts Institute of Technology, Spring, 1949; cf. also Claude E. Shannon & Warren Weaver, The Mathematical Theory of Communication, Urbana, Ill.: University of Illinois Press, 1949, pp. 99–106 (“Information”).
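
As a purely illustrative aside, not part of the original footnote: the definitions of a state description and a message quoted above can be sketched in a few lines of Python. The element names and the particular states used below are assumptions made only for this sketch.

    # A "state description" records which of its possible states each element
    # of the network is in; a "message" is any change in that description.
    state_description = {"e1": 0, "e2": 1, "e3": 0}

    def apply_message(state_description, message):
        """Return the new state description after a message, i.e. a pattern of change."""
        updated = dict(state_description)
        updated.update(message)
        return updated

    message = {"e2": 0, "e3": 1}           # the pattern of change being transmitted
    print(apply_message(state_description, message))   # {'e1': 0, 'e2': 0, 'e3': 1}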

16 For a discussion of this entire subject see N. Wiener, Cybernetics; Shannon-Weaver, op. cit.; for a much simplified account cf. E. C. Berkeley, Giant Brains, New York, Wiley, 1949.

16a For a discussion of this subject, see Quincy Wright, ed., The World Community (Chicago, University of Chicago Press, 1948); K. W. Deutsch, Nationalism and Social Communication (Cambridge, Technology Press, publication scheduled 1951–52).

17 Peter B. Neiman, The Operational Significance of Recognition, B. S. Thesis, M. I. T., 1949 (unpublished).

18 Rosenblueth-Wiener-Bigelow, op. cit., p. 19. A more refined definition of feedback would put “output information” in place of “output energy,” in accordance with the distinction between “communications engineering” and “power engineering.” Cf. Wiener, Cybernetics, p. 50.

19 Rosenblueth-Wiener-Bigelow, op. cit., p. 18. There is also another kind of feedback, different from the negative feedback discussed in the text: “The feedback is … positive (if) the fraction of the output which re-enters the object has the same sign as the original input signal. Positive feedback adds to the input signals, it does not correct them….” Ibid., p. 19; see also Wiener, op. cit., pp. 113–136. Only self-correcting, i.e., negative, feedback is discussed in the present paper.
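
As a purely illustrative aside, not part of the original notes: the self-correcting character of negative feedback described in notes 18 and 19 can be sketched in a few lines of Python. The target, the gain (the fraction of the error fed back), and the number of steps are arbitrary assumptions made only for this sketch.

    # Negative feedback: part of the output re-enters as a signal opposed to the
    # error, so each step reduces the deviation from the target.
    target = 10.0
    output = 0.0
    gain = 0.5                     # fraction of the error fed back each step

    for _ in range(10):
        error = target - output    # the feedback signal
        output += gain * error     # the correction drives the output toward the target

    print(round(output, 3))        # 9.99: the output has nearly reached the target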

20 Wiener, loc. cit.

21 Cf. John Dollard, “The Acquisition of New Social Habits,” in Linton, The Science of Man in the World Crisis, New York, Columbia, 1945, p. 442, with further references. “Drives … are ‘rewarded’, that is … they are reduced in intensity….” A. Irving Hallowell, “Sociopsychological Aspects of Acculturation,” in Linton, op. cit., p. 183; cf. in the same volume, Kluckhohn and Kelly, “The Concept of Culture,” pp. 84–86.

22 Rosenblueth-Wiener-Bigelow, op. cit., p. 18. “By behavior is meant any change of an entity with respect to its surroundings. … Accordingly, any modification of an object, detectable externally, may be denoted as behavior.” Ibid.

23 Analytical understanding of a process need not diminish its sublimity, that is, its emotional impact on us in our experience of recognition. Faust becomes no more trivial by our knowledge of goal changing feedbacks than a sunrise becomes trivial by our knowledge of the laws of refraction.

24 On the importance of flow patterns of information and decision in economic or political organization, see K. W. Deutsch, “Innovation, Entrepreneurship, and the Learning Process,” in Change and the Entrepreneur, Cambridge, Mass., Harvard University Press, 1949, p. 29; and “A Note on the History of Entrepreneurship, Innovation, and Decision-Making,” Explorations in Entrepreneurial History, Vol. I, No. 5, May 1949, pp. 12–16.

25 See K. W. Deutsch, “Some Notes on Research on the Role of Models in the Natural and Social Sciences”, Synthese, Vol. VII (1948–49), No. 6-B, F. G. Kroonder, Bussum, Netherlands; reprinted for the Communications of the Institute for the Unity of Science, Boston, Massachusetts.

26 For a more extended discussion of “will” in these terms, see K. W. Deutsch, op. cit., pp. 525–531; Rosenblueth-Wiener-Bigelow, op. cit., p. 19; and Warren S. McCulloch, Finality and Form in Nervous Activity, Fifteenth James Arthur Lecture, American Museum of Natural History, New York, May 2, 1946, p. 4 (multigraphed).

27 Another discussion of such differences, stressing considerations other than the ones in this paper, will be found in Rosenblueth-Wiener-Bigelow, op. cit., pp. 22–23.

28 Concise Oxford Dictionary, Clarendon Press, Oxford, 1934, pp. 688, 804; my italics.

29 Communication from Norbert Wiener, Massachusetts Institute of Technology, June 1948. See also Cybernetics, pp. 155, 160.

30 In some calculating machines, and perhaps in the cells of the human brain, there is some degree of reassignment of general elements to specific tasks or temporary subassemblies serving as “task forces.” In some societies, such as that of the Eciton army ants, there seems to be such a high degree of permanency of specialized function for each ant or class of ants, and so few degrees of freedom for an individual's choice of path, that the entire column of ants may trap itself in a circular “suicide mill” where the path of each ant becomes determined by “the vector of the individual ant's centrifugal impulse to resume the march and the centripetal force of trophallaxis (food-exchange) which binds it to its group,” so that the ants continue circling until most of them are dead. T. C. Schneirla and Gerard Piel, “The Army Ant,” Scientific American, June, 1948, p. 22.

31 For an example of an “ecological viewpoint”, see Laura Thompson, “The Relations of Men, Animals, and Plants in an Island Community (Fiji)”, American Anthropologist, Vol. 51, No. 2, April–June 1949, pp. 253–267, with further references.