Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- Part I Introduction
- Part II Representativeness
- Part III Causality and attribution
- Part IV Availability
- Part V Covariation and control
- Part VI Overconfidence
- Part VII Multistage evaluation
- 24 Evaluation of compound probabilities in sequential choice
- 25 Conservatism in human information processing
- 26 The best-guess hypothesis in multistage inference
- 27 Inferences of personal characteristics on the basis of information retrieved from one's memory
- Part VIII Corrective procedures
- Part IX Risk perception
- Part X Postscript
- References
- Index
26 - The best-guess hypothesis in multistage inference
Published online by Cambridge University Press: 05 May 2013
Summary
Multistage inference consists of a series of single-stage inferences where the output of each previous stage becomes the input to the next stage. In a single-stage inference men reason from data or unambiguously observed evidence to a set of hypotheses. Multistage inference starts with the same unambiguous data or evidence in the first stage; however, the input for the next stage is the output of the previous stage. The next stage of inference is therefore based on the probabilities of events, rather than upon definite knowledge that a particular event is true (Gettys & Willke, 1969).
For example, suppose you wanted to predict the success or failure of a large garden party. Assume that the party is less likely to be successful if it is crowded indoors because of rain. Your datum is the presence of a dark cloud on the horizon. The first stage of inference would relate the dark cloud to the presence or absence of rain during the party. Suppose you estimated that the probability of rain was .70. This estimate would become the input to the next stage of inference. If you knew with certainty that it would rain, then you could infer the probability that the party would be a success. But you are not entirely sure that it will rain; the datum you have indicates rain only with a probability of .70, so how should you proceed?
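One normative way to proceed is to weight each possible first-stage outcome by its probability: P(success) = P(success | rain) × P(rain) + P(success | no rain) × P(no rain). The minimal Python sketch below works that arithmetic through for the garden-party example and contrasts it with a best-guess shortcut that treats the more likely first-stage outcome (rain, at .70) as if it were certain. Only the .70 comes from the text; the conditional success probabilities (.30 and .80) and the variable names are illustrative assumptions.

```python
# Garden-party example: normative multistage inference vs. a best-guess shortcut.
# P(rain) = .70 is from the text; the conditional success probabilities are assumed.

p_rain = 0.70                 # stage-1 output, inferred from the dark-cloud datum (given in the text)
p_success_given_rain = 0.30   # assumed: crowded indoors, party less likely to succeed
p_success_given_dry = 0.80    # assumed: party more likely to succeed if it stays dry

# Normative stage-2 inference: weight each stage-1 outcome by its probability.
p_success_normative = (p_success_given_rain * p_rain
                       + p_success_given_dry * (1 - p_rain))

# Best-guess shortcut: treat the more likely stage-1 outcome (rain) as certain
# and condition only on it, ignoring the .30 chance of dry weather.
best_guess_is_rain = p_rain >= 0.5
p_success_best_guess = (p_success_given_rain if best_guess_is_rain
                        else p_success_given_dry)

print(f"Weighted (normative) estimate: {p_success_normative:.2f}")   # 0.45 with these numbers
print(f"Best-guess estimate:           {p_success_best_guess:.2f}")  # 0.30
```

With these assumed values the weighted estimate is .45, while conditioning only on the best guess gives .30; the gap illustrates how much information is discarded when an uncertain earlier stage is treated as if it were settled.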
- Type: Chapter
- Information: Judgment under Uncertainty: Heuristics and Biases, pp. 370-377. Publisher: Cambridge University Press. Print publication year: 1982