
‘Time's Arrow’: An Informational Theory Treatment
The Earl of Halsbury

Published online by Cambridge University Press: 05 November 2010

Extract

Time's arrow is not a property of time but of events: the way in which they succeed one another in our experience and, as many believe, in a reality independent of our experience. I hope to throw a little light on one aspect of this difficult matter by treating it from the standpoint of logic, topology and information theory. If I succeed in my hope I shall still be leaving many other matters unresolved. Let me state briefly what these other matters are in order not to exaggerate the scope of any contribution I hope to make.

Type: Papers
Copyright © The Royal Institute of Philosophy and the contributors 1968


References

page 117 note 1 ‘Time itself is not a process in time.’ Whitrow, G. J., The Natural Philosophy of Time (London, 1961), p. 292. A long and distinguished list of views to the contrary indicates how widespread the mistake is: (i) ‘Time is a moving image of Eternity’, Plato, The Timaeus of Plato (trans. Archer-Hind, R. D.) (London, 1888), p. 119; (ii) ‘Time travels in divers paces with divers persons’, Shakespeare, As You Like It, iii. ii. 336; (iii) ‘We evidently must regard Time as passing with a steady flow’, Barrow, I., Lectiones Geometricae (trans. Stone, E.) (London, 1735), lect. 1, p. 35; (iv) ‘Absolute, true and mathematical time of itself, and from its own nature, flows equably without relation to anything external’, Newton, I., Mathematical Principles (trans. Motte, A., ed. Cajori, F.) (Berkeley, 1934), p. 6; (v) ‘Time like an ever-rolling stream bears all its sons away’, Hymnist.

page 121 note 1 ‘Information’ in communications engineering retains approximately its everyday meaning, subject to being measured in the following way. Suppose a situation is known to us before and after some kind of disclosure, signal, observation or whatever it may be that is relevant to the situation. Before receipt of the signal, etc., our a priori knowledge was, let us suppose, measured by PB. After receipt, our a posteriori knowledge is measured by PA. PB and PA are probabilities ranging from 0 to 1, 0 standing for impossibility and 1 for certainty. The information in the signal is related to the ratio PA/PB. For convenience the logarithm of this ratio is taken, and in particular its logarithm to the base 2, though this is not essential. Thus if PA = PB, then PA/PB = 1, and since log2 1 = 0, no information is received. The information measure accordingly relates to the rise in our knowledge of something following an observation. Thus if a coin is covered, our a priori knowledge of its being a head or a tail is ½ in either case. On uncovering the coin and observing that it is, say, a head, our a posteriori knowledge, regarded as certain, has risen to the value 1.0. The information content of the observation is therefore log2 (1/½) = log2 2 = 1. The unit of information, or BIT, is therefore taken as measuring the rise in our knowledge of a situation from an even money chance of it being so to certainty that it is so. Information may be linked to a general physical event, or to a linguistic symbol whose meaning is purely conventional. The former class is called ‘intrinsic’, ‘implicit’ or ‘bound’ by different writers; the latter ‘semantic’, ‘explicit’ or ‘free’. These definitions imply nothing as to the importance of what is conveyed by a signal of the second kind. The telegraphic answers ‘Well’ and ‘Dead’ in reply to an enquiry after someone's health contain the same information as measured above. Their significance to the recipient could, however, be greatly different. ‘Noise’ is the component of reception that contains no information. Thus our a posteriori knowledge of events at a transmitter following reception in a noisy receiver is less than certain, i.e. is measured by a figure less than 1.0. Noise thus ranks as negative information. Entropy in thermodynamics is also defined as the logarithm of a probability. It is therefore very similar to information as defined by communication engineers.
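The measure described in this note can be sketched numerically. The following Python fragment is illustrative only; the function name and the particular probabilities used are chosen here for the example and are not part of the original text.

```python
from math import log2

def information_bits(p_before: float, p_after: float) -> float:
    # Information (in bits) gained when our probability for a proposition
    # rises from p_before (a priori) to p_after (a posteriori),
    # measured as log2(p_after / p_before) as in the footnote.
    return log2(p_after / p_before)

# Coin example from the footnote: a priori P(head) = 1/2; after uncovering
# the coin and seeing a head, P(head) = 1, giving log2(1 / 0.5) = 1 bit.
print(information_bits(0.5, 1.0))   # 1.0

# A noisy receiver leaves the a posteriori probability short of certainty,
# so the information received falls short of one bit (hypothetical figure).
print(information_bits(0.5, 0.9))   # roughly 0.85
```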

page 124 note 1 The ground rules are given above in the form in which they are generally set out in works on topology and are assigned by analogy with Euclidean spaces and other spaces with a positive definite metric. One can, however, formulate alternatives, e.g. Minkowski space-time.

page 124 note 2 Side-stepping means what it says. One does not answer a question by agreeing not to discuss it! One way of restricting a discussion is to restrict one's vocabulary in such a way that a particular question cannot be asked. It may still need answering, however.

page 125 note 1 The word ‘precede’ must not be interpreted in a spatial or temporal sense. Cf. Hausdorff, Set Theory (Chelsea Publishing Co., New York, 1957), p. 49: ‘Of course the space-like or time-like characteristics which seem to attach to this explanation because of the use of the prepositions “before” and “after” are of no consequence, and we will convince ourselves by the use of other definitions that what we have here is nothing but an application of the concept of function.’

page 132 note 1 Cf. Whitrow, op. cit.: ‘The direction of time in our personal experience is that of increasing knowledge of events’, p. 270, and also ‘The order of our awareness [is] of the growth in our information of what occurs’, p. 271.