
Formalization of the Concept “About”

Published online by Cambridge University Press:  14 March 2022

Hilary Putnam*
Affiliation:
Princeton University

Extract

The question, what a given statement is “about,” often occurs in philosophic discussion. I shall use this question in the hope of illustrating how a relatively simple application of symbolic logic can clarify a problem which might otherwise turn into a maze of complications.

Type
Research Article
Copyright
Copyright © Philosophy of Science Association 1958


References

1 See, for example, the symposium: “Logical truth,” The Journal of Philosophy, Vol. 53, No. 22, Oct. 25, 1956, pp. 671–696.

2 “Subject term” is used in the present paper only when categorical propositions (“All S are P,” “No S are P,” “Some S are P,” etc.) are under discussion: for arbitrary molecular propositions, tradition provides no clear criterion for distinguishing a “subject term.”

3 Cf. Carnap, Logical Foundations of Probability, Univ. of Chicago (1950), Ch. III.

4 This concept was developed independently by John Kemeny (“A Logical Measure Function,” Journal of Symbolic Logic 18, 1953, pp. 289–308) and by Carnap and Bar-Hillel (“Semantic Information,” Br. Jour. Phil. Sci. 4, 1953, pp. 147–157). “Strength” is the term used by Kemeny; “amount of information” is the term used by Carnap and Bar-Hillel.

5 Logical formulas are used in the present paper as names of themselves, and not in their object-language use.

6 “Inf (S)” is used for “the amount of information S contains.”
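For a finite propositional language, the Kemeny/Carnap–Bar-Hillel measure sketched in footnotes 4 and 6 can be illustrated concretely: take inf (S) to be the negative base-2 logarithm of the fraction of state descriptions in which S holds. The following is a minimal sketch, not the paper's own formalism; the representation of sentences as Boolean functions over truth assignments is an assumption made for illustration.

```python
import itertools
import math

def models(formula, atoms):
    """Enumerate the state descriptions (truth assignments to the
    atomic sentences) in which `formula` holds. A sentence is
    represented, for illustration, as a function from an assignment
    dict to bool."""
    return [dict(zip(atoms, vals))
            for vals in itertools.product([True, False], repeat=len(atoms))
            if formula(dict(zip(atoms, vals)))]

def inf(formula, atoms):
    """Amount of information in the Kemeny/Carnap-Bar-Hillel sense
    (finite case): log2 of (total state descriptions / satisfying
    state descriptions). A tautology gives 0; a logically false
    sentence gives infinity (cf. footnote 18)."""
    total = 2 ** len(atoms)
    sat = len(models(formula, atoms))
    if sat == 0:
        return math.inf
    return math.log2(total / sat)

atoms = ["p", "q"]
# "p and q" holds in 1 of 4 state descriptions: 2 bits of information.
print(inf(lambda a: a["p"] and a["q"], atoms))      # 2.0
# A tautology rules nothing out: 0 bits.
print(inf(lambda a: a["p"] or not a["p"], atoms))   # 0.0
```

The stronger the sentence (the fewer state descriptions it admits), the larger inf (S), which matches Kemeny's use of “strength” for the same concept.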

7 If the number of individuals is infinite, we can use the convention that inf (S) = the limit, as n → ∞, of the amount of information of S when the number of individuals is n (the “limit convention” proposed by Carnap); or we can use a somewhat more convenient convention proposed by Kemeny. (For a discussion of these conventions see my paper “A Definition of Degree of Confirmation for Very Rich Languages,” Philosophy of Science, Jan. 1956, pp. 58–62. They are discussed there in connection with measures of “degree of confirmation,” but the relevant considerations are the same as in the present case.) These details will not be discussed here.

8 “Inf % (S, C)” will be the notation for “the percentage information S gives about C.”

9 Mathematical Logic, Harvard (1951), p. 135.

10 “Unity of Science as a Working Hypothesis,” to appear in Minnesota Studies in the Philosophy of Science, Vol. II, Univ. of Minnesota (forthcoming).

11 Ibid., sec. 2.2.

12 Ibid., sec. 7.

13 The “levels” L1, …, L5 are classes, so that “inf (S, Li)” is perfectly meaningful, provided we assume these classes to be nameable in the scientific language (as they surely would be).

14 Cf. the paper by Kemeny cited in footnote 4.

15 Details may be found in my paper “A Definition of Degree of Confirmation for Very Rich Languages,” Philosophy of Science, Jan. 1956, pp. 58–62.

16 Under the natural extension, the amount of information a sentence S gives about a relation R turns out to be equal to inf (S, C), where C is the field of R (the union of the domain and the converse domain of R).
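The field of a relation mentioned in footnote 16 is straightforward to compute for a finite relation given as a set of ordered pairs; the following sketch is purely illustrative and not part of the paper's apparatus.

```python
def field(R):
    """Field of a binary relation R (a set of ordered pairs):
    the union of its domain (first coordinates) and its converse
    domain (second coordinates)."""
    return {a for a, _ in R} | {b for _, b in R}

# The relation {(1,2), (2,3)} has domain {1,2} and converse
# domain {2,3}, so its field is {1, 2, 3}.
print(field({(1, 2), (2, 3)}))  # {1, 2, 3}
```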

17 Quantifications relativized to a predicate P are quantifications of the form “(x)(P(x) ⊃ …)” and “(∃x)(P(x) · …).”
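Over a finite domain, the two relativized forms in footnote 17 amount to “every P is φ” and “some P is φ,” which can be sketched directly; the function names here are invented for illustration, not drawn from the paper.

```python
def forall_rel(domain, P, phi):
    """(x)(P(x) ⊃ phi(x)): phi holds of every element satisfying P.
    Vacuously true if nothing in the domain satisfies P."""
    return all(phi(x) for x in domain if P(x))

def exists_rel(domain, P, phi):
    """(Ex)(P(x) . phi(x)): some element satisfies both P and phi."""
    return any(P(x) and phi(x) for x in domain)

domain = range(10)
even = lambda x: x % 2 == 0
print(forall_rel(domain, even, lambda x: x < 9))  # True: all evens below 10 are < 9
print(exists_rel(domain, even, lambda x: x > 7))  # True: 8 is even and > 7
```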

18 At the referee's suggestion, I should like to mention the paradoxical character of logically false sentences. These give the maximum amount of information according to (1) (we might say they give too much information!), but they do not give information about any class, since the corresponding sentence Tc mentioned in definition (7) does not exist.