Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- Part I Introduction
- Part II Representativeness
- Part III Causality and attribution
- Part IV Availability
- Part V Covariation and control
- Part VI Overconfidence
- Part VII Multistage evaluation
- 24 Evaluation of compound probabilities in sequential choice
- 25 Conservatism in human information processing
- 26 The best-guess hypothesis in multistage inference
- 27 Inferences of personal characteristics on the basis of information retrieved from one's memory
- Part VIII Corrective procedures
- Part IX Risk perception
- Part X Postscript
- References
- Index
25 - Conservatism in human information processing
Published online by Cambridge University Press: 05 May 2013
Summary
… An abundance of research has shown that human beings are conservative processors of fallible information. Such experiments compare human behavior with the outputs of Bayes's theorem, the formally optimal rule about how opinions (that is, probabilities) should be revised on the basis of new information. It turns out that opinion change is very orderly, and usually proportional to numbers calculated from Bayes's theorem – but it is insufficient in amount. A convenient first approximation to the data would say that it takes anywhere from two to five observations to do one observation's worth of work in inducing a subject to change his opinions. A number of experiments have been aimed at an explanation for this phenomenon. They show that a major, probably the major, cause of conservatism is human misaggregation of the data. That is, men perceive each datum accurately and are well aware of its individual diagnostic meaning, but are unable to combine its diagnostic meaning well with the diagnostic meaning of other data when revising their opinions. …
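As a rough illustration of the comparison described above, the sketch below is not from the chapter; the two-hypothesis task, the likelihood ratios, and the discount factor are illustrative assumptions. It contrasts revision by Bayes's theorem with a damped, "conservative" revision in which several observations do only one observation's worth of work.

```python
# A minimal sketch (illustrative, not the chapter's procedure) contrasting
# Bayesian revision with a "conservative" revision in a two-hypothesis task.
# Assumptions: observations are independent given the hypothesis, and
# conservatism is modeled by a discount factor c, so that roughly c
# observations are needed to do one observation's worth of work.

def posterior_prob_h1(prior_h1, likelihood_ratios, discount=1.0):
    """Revise P(H1) given a list of likelihood ratios P(d | H1) / P(d | H2).

    discount = 1.0 reproduces Bayes's theorem; a discount in the range 2-5
    mimics the degree of conservatism described in the summary.
    """
    odds = prior_h1 / (1.0 - prior_h1)      # prior odds for H1 over H2
    for lr in likelihood_ratios:
        odds *= lr ** (1.0 / discount)      # damped multiplication of evidence
    return odds / (1.0 + odds)              # convert odds back to a probability

# Example: five observations, each twice as likely under H1 as under H2.
data = [2.0] * 5
print(posterior_prob_h1(0.5, data))                 # Bayesian answer: 32/33, about 0.97
print(posterior_prob_h1(0.5, data, discount=4.0))   # conservative answer, about 0.70
```

With a discount of 4, the five observations move the probability only to about 0.70 rather than the Bayesian 0.97, roughly the two-to-five ratio described in the summary.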
Probabilities quantify uncertainty. A probability, according to Bayesians like ourselves, is simply a number between zero and one that represents the extent to which a somewhat idealized person believes a statement to be true. The reason the person is somewhat idealized is that the sum of his probabilities for two mutually exclusive events must equal his probability that either of the events will occur.
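In symbols, the coherence requirement just described (a standard statement, not specific to this chapter) is that probabilities lie between zero and one and add over mutually exclusive events:

$$
0 \le P(A) \le 1, \qquad P(A \cup B) = P(A) + P(B) \quad \text{whenever } A \cap B = \varnothing .
$$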
- Type: Chapter
- Information: Judgment under Uncertainty: Heuristics and Biases, pp. 359-369
- Publisher: Cambridge University Press
- Print publication year: 1982