A Definition of “Degree of Confirmation”
Published online by Cambridge University Press: 14 March 2022
Extract
1. The problem. The concept of confirmation of an hypothesis by empirical evidence is of fundamental importance in the methodology of empirical science. For, first of all, a sentence cannot even be considered as expressing an empirical hypothesis at all unless it is theoretically capable of confirmation or disconfirmation, i.e. unless the kind of evidence can be characterized whose occurrence would confirm, or disconfirm, the sentence in question. And secondly, the acceptance or rejection of a sentence which does represent an empirical hypothesis is determined, in scientific procedure, by the degree to which it is confirmed by relevant evidence.
- Type
- Research Article
- Copyright
- Copyright © The Philosophy of Science Association 1945
References
Notes
1 A detailed technical exposition of the theory will be given by Olaf Helmer and Paul Oppenheim in a forthcoming article, in vol. 10 of The Journal of Symbolic Logic.
The present issue of Philosophy of Science contains an article by Professor Rudolf Carnap which likewise sets forth a definition and theory of confirmation. The approach to the problem which is to be developed in the present paper is independent of Professor Carnap's and differs from it in various respects. Some of the points of difference will be exhibited subsequently as the occasion arises. We wish to express our thanks to Professor Carnap for valuable comments he made in the course of an exchange of ideas on the two different studies of confirmation.
We also wish to thank Dr. Kurt Gödel for his stimulating remarks.
2 For a definition and theory of the classificatory concept of confirmation, see the following two articles by Carl G. Hempel: A purely syntactical definition of confirmation; The Journal of Symbolic Logic, vol. 8(1943), pp. 122–143; Studies in the logic of confirmation; Mind, n.s. vol. 54 (1945).
The technical term “confirmation” should not be construed in the sense of “verification”—an interpretation which would preclude, for example, its application to a hypothesis about an event which is temporally posterior to the data included in the evidence. Rather, as is suggested by the root “firm,” the confirmation of a hypothesis should be understood as a strengthening of the confidence that can rationally be placed in the hypothesis.
3 Illustrations: ‘(P1a1 ∨ P2a1) ⊃ (P3a1 · ∼P4a1)’ stands for “If a1 has at least one of the properties P1, P2, then it has the property P3, but not the property P4”; ‘∼(x)P1x ⊃ (Ex)∼P1x’ stands for “If it is not the case that all objects have the property P1, then there is at least one object which does not have the property P1”.
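The second illustration can be checked mechanically on a finite domain. The sketch below is our own addition, not part of the original notes: it enumerates every possible extension of the property P1 over a three-element domain and confirms that the conditional ‘∼(x)P1x ⊃ (Ex)∼P1x’ holds in each case.

```python
from itertools import product

# Our own finite-domain check (not from the paper): for every way of
# assigning the property P1 over a 3-element domain, the conditional
# ~(x)P1x -> (Ex)~P1x holds.
domain = range(3)
for extension in product([False, True], repeat=len(domain)):
    P1 = dict(zip(domain, extension))
    antecedent = not all(P1[x] for x in domain)   # ~(x)P1x
    consequent = any(not P1[x] for x in domain)   # (Ex)~P1x
    assert (not antecedent) or consequent         # the conditional holds
print("conditional holds for every extension of P1")
```

On any finite domain the antecedent and consequent are in fact equivalent, which is why the check never fails.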
4 Here and at some later places we use statement connective symbols autonomously, i.e., roughly speaking, as designations of the same symbols in the “object language” L.
5 This argument was suggested by Professor Carnap.
6 On this point, see also sections 10 and 16 in Professor Carnap's article.
7 Cf. Hans Reichenbach, Wahrscheinlichkeitslehre, Leiden 1935, especially §§75–80, and Experience and Prediction, Chicago 1938, Chapter V.
8 Note that distributions cannot be characterized in L and that, therefore, they cannot form the content of any hypothesis that may be formulated in L; we speak about them in a suitable meta-language for L. In our case, this meta-language is English, supplemented by a number of symbols, such as ‘H’, ‘E’, ‘q1’, ‘q2’, …, ‘Δ’, etc. It might be well to emphasize at this point that the definition and the entire theory of dc for L is formulated in that meta-language, not in L itself: In the meta-language, we speak about the sentences of L and about the degrees to which certain sentences confirm others.
9 In the case of a finite total population, the application of the simple product rule presupposes that the objects constituting a sample are taken from the urn one at a time, and that each of them is replaced into the urn before the next one is drawn. In order to avoid complications of this sort, we assume the population to be infinite.
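The point of this footnote can be illustrated numerically; the urn composition below is our own choice, not the paper's. With replacement the simple product rule applies (each draw has the same probability), whereas without replacement the factors change after each draw; the two agree only in the limit of an infinite population.

```python
from fractions import Fraction

# Our own toy urn (not from the paper): 6 white and 4 black balls;
# probability of drawing three whites in succession.
white, total, draws = 6, 10, 3

# With replacement the simple product rule applies: p * p * p.
p = Fraction(white, total)
with_replacement = p ** draws          # (6/10)^3 = 27/125

# Without replacement each draw alters the composition of the urn.
without_replacement = Fraction(1)
w, t = white, total
for _ in range(draws):
    without_replacement *= Fraction(w, t)
    w -= 1
    t -= 1                              # 6/10 * 5/9 * 4/8 = 1/6

print(with_replacement)                 # 27/125
print(without_replacement)              # 1/6
```

As the population grows with the proportion of white balls held fixed, the without-replacement product approaches the simple product rule—which is why the footnote assumes an infinite population.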
10 These are stated in section 10 of the present article.
11 This probability concept was developed by Olaf Helmer; a detailed exposition of the theory of this concept is included in the article by Helmer and Oppenheim mentioned in footnote 1.
12 An alternative to this approach would be to determine, by means of Bayes' theorem, that distribution upon which E confers the greatest probability (in contradistinction to our search for that distribution which confers upon E the maximum probability); but this approach presupposes—to state it first by reference to the urn analogue—an infinity of urns, each with a different frequency distribution; and to each urn U, there would have to be assigned a definite a priori probability for the sample to be taken from U. Applied to our problem, this method would involve reference to an infinity of possible states of the world, to each of which there would have to be attached a certain a priori probability of being realized; and for such a “lottery of states of the world,” as it were, it seems very difficult to find an empiricist interpretation.
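The contrast drawn in this footnote can be sketched with a toy computation; all numbers and the prior below are our own illustrative assumptions, not the paper's. The maximum-likelihood approach selects the distribution that confers the greatest probability on the evidence E and needs no a priori weights, while the Bayesian alternative must assign an a priori probability to every candidate distribution (every “urn”) and may thereby select a different one.

```python
from math import comb

# Our own toy evidence (not from the paper): E = 7 successes in 10 trials,
# candidate distributions q = 0.01, 0.02, ..., 0.99.
n, k = 10, 7
candidates = [q / 100 for q in range(1, 100)]

def likelihood(q):
    """Probability pr(E | q) that q confers on the evidence E."""
    return comb(n, k) * q**k * (1 - q)**(n - k)

# Maximum likelihood (the paper's approach): no prior needed.
ml = max(candidates, key=likelihood)    # 0.7, i.e. the sample frequency

# Bayesian alternative: requires an a priori weight for every candidate.
# Here an assumed prior that favors candidates near q = 0.5.
prior = {q: (2.0 if abs(q - 0.5) < 0.045 else 1.0) for q in candidates}
map_q = max(candidates, key=lambda q: prior[q] * likelihood(q))

print(ml)                               # 0.7
print(map_q)                            # differs from ml under this prior
```

The empiricist difficulty the footnote raises is visible in the code: the `prior` dictionary is an arbitrary stipulation, and a different stipulation would yield a different selected distribution, whereas `ml` depends on the evidence alone.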
13 Cf. R. A. Fisher: The mathematical foundations of theoretical statistics, Phil. Trans. Roy. Soc. London 222 (1922), pp. 309–368. Also see M. G. Kendall: On the method of maximum likelihood, Journal Roy. Stat. Soc. 103 (1940), pp. 388–399, and the same author's work, Advanced Theory of Statistics (London 1943).
14 The symbol ‘dc(H, E)’ is therefore used here in a similar manner as, say, ‘√x’ in mathematics; both represent functions which are not generally single-valued. An alternative would be to stipulate that dc(H, E) is to equal pr(H, E, ΔE) in those cases where the latter function is single-valued, and that in all other cases, dc(H, E) is to remain undefined. A third possibility would be to define dc(H, E) as the smallest value of pr(H, E, ΔE); for of two hypotheses tested by means of the same evidence, that one will be considered more reliable for which that smallest value is greater. This definition, however, has a certain disadvantage, which is explained in footnote 17.
15 On this point, cf. also section 3 of Professor Carnap's article.
16 The Journal of Symbolic Logic, vol. 5 (1940), pp. 133–148.
17 In footnote 14, two alternatives to our definition of dc were mentioned. It can be shown that the concept determined by the first of these satisfies without exception the requirements 10.1, 10.2, 10.3′, and 10.4, whereas the concept introduced by the second alternative does not. Thus, e.g., if H = ‘P1a1’, E = ‘P2a2’, then the values of pr(H, E, ΔE) are all the real numbers from 0 to 1 inclusive, so that the smallest value is 0. The same is true of pr(∼H, E, ΔE); hence these two smallest values violate the principle 5.1 and thus indirectly the postulates 10.1 and 10.2, of which 5.1 can be shown to be a consequence.
17a The alternative term “likelihood,” which suggests itself, is inexpedient also, as it has already been introduced into theoretical statistics with a different meaning (cf. section 8 above). If a term customarily associated with “probability” should be desired, then “expectancy” might be taken into consideration.
18 The method characterized above is illustrated by a definition of probability which F. Waismann (Logische Analyse des Wahrscheinlichkeitsbegriffs, Erkenntnis, vol. 1, pp. 228–248) has outlined following a suggestion made in L. Wittgenstein's Tractatus Logico-Philosophicus (New York and London 1922). Also, the regular c-functions introduced in Professor Carnap's article on inductive logic exemplify this way of defining dc. In that article, some special choices for the measure function m are presented and examined as to their suitability for the establishment of an adequate definition of the concept of degree of confirmation.
19 Cf. Ernest Nagel, Principles of the Theory of Probability; Internat. Encycl. of Unified Science, vol. 1, No. 6, Chicago 1939; pp. 68–71. Also, see section 15 of Professor Carnap's paper, which contains a discussion of this point.