Algorithmic thermodynamics

Published online by Cambridge University Press: 06 September 2012

JOHN BAEZ
Affiliation:
Department of Mathematics, University of California, Riverside, California 92521, U.S.A.
MIKE STAY
Affiliation:
Computer Science Department, University of Auckland, and Google, 1600 Amphitheatre Pkwy, Mountain View, California 94043, U.S.A.

Abstract

Algorithmic entropy can be viewed as a special case of the entropy studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefix-free Turing machine and let X be the set of programs that halt for this machine. Then we can regard X as a set of ‘microstates’, and treat any function on X as an ‘observable’. For any collection of observables, we can study the Gibbs ensemble that maximises entropy subject to constraints on the expected values of these observables. We illustrate this by taking the log runtime, length and output of a program as observables analogous to the energy E, volume V and number of molecules N in a container of gas. The conjugate variables of these observables allow us to define quantities we call the ‘algorithmic temperature’ T, ‘algorithmic pressure’ P and ‘algorithmic potential’ μ, since they are analogous to the temperature, pressure and chemical potential. We derive an analogue of the fundamental thermodynamic relation dE = TdS − PdV + μdN, and use it to study thermodynamic cycles analogous to those for heat engines. We also investigate the values of T, P and μ for which the partition function converges. At some points on the boundary of this domain of convergence, the partition function becomes uncomputable – indeed, at these points the partition function itself has non-trivial algorithmic entropy.
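To make the analogy concrete, here is a minimal sketch of the ensemble the abstract describes, written in the standard Gibbs form. The symbols E, V, N, T, P and μ are the abstract's own; the exact placement of the conjugate variables is our assumption, modelled on the usual thermodynamic partition function.

% Gibbs ensemble over the set X of halting programs, with observables
% E(x) = log runtime, V(x) = length, N(x) = output, as in the abstract.
% Maximising entropy subject to fixed expected values of E, V and N
% yields the standard Gibbs distribution and partition function:
\[
  p(x) = \frac{1}{Z}\, e^{-\left(E(x) + P\,V(x) - \mu\,N(x)\right)/T},
  \qquad
  Z(T, P, \mu) = \sum_{x \in X} e^{-\left(E(x) + P\,V(x) - \mu\,N(x)\right)/T}.
\]
% Writing S = -\sum_{x \in X} p(x)\,\ln p(x) for the entropy of this
% ensemble and varying the constrained expectations recovers the
% fundamental relation dE = T\,dS - P\,dV + \mu\,dN.

As a sanity check on the final claim: switching off the runtime and output terms and setting P/T = ln 2 turns each weight into 2^{−V(x)}, so Z reduces to Σ_{x∈X} 2^{−V(x)}, which is Chaitin's halting probability Ω — an algorithmically random, and hence uncomputable, real number.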

Type: Paper
Copyright: © Cambridge University Press 2012
