The manipulations and basic results of stochastic decision theory are introduced. The manipulations of idempotence, transposition, and repetition, introduced for deterministic decision trees, can also be used to manipulate stochastic decision trees. However, there are two major differences from the deterministic case. First, in order to obtain a complete set of manipulations, it is necessary to introduce an additional rule called indifference. Second, these identities must be treated as rules of inference rather than two-way equations: not all of the rules can be soundly applied in both directions; in particular, idempotence is a one-way rule.
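As a purely illustrative sketch (the data structure and the form of the rule below are assumptions, not the formalism of this paper), idempotence can be pictured as collapsing the branches of a chance node that carry identical subtrees; the collapse sums their probabilities, and since the original split of probability mass is forgotten, the step cannot be reversed uniquely.

from dataclasses import dataclass
from typing import Dict, Tuple, Union

@dataclass(frozen=True)
class Leaf:
    outcome: str

@dataclass(frozen=True)
class Chance:
    # branches: tuple of (probability, subtree) pairs
    branches: Tuple

Tree = Union[Leaf, Chance]

def idempotence(node: Chance) -> Chance:
    """Merge branches whose subtrees are equal, summing their probabilities."""
    merged: Dict[Tree, float] = {}
    for p, sub in node.branches:
        merged[sub] = merged.get(sub, 0.0) + p
    return Chance(tuple((p, sub) for sub, p in merged.items()))

t = Chance(((0.3, Leaf("a")), (0.2, Leaf("a")), (0.5, Leaf("b"))))
print(idempotence(t))
# Chance(branches=((0.5, Leaf(outcome='a')), (0.5, Leaf(outcome='b'))))
# Reversing the step would require the forgotten 0.3/0.2 split,
# which is why the rule is only soundly applied in one direction.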
A manipulation of a stochastic decision tree alters not only the structure of the tree but also the probability distributions associated with it. This allows probability calculation to be viewed as structural manipulation. In particular, a retrieval corresponds to a conditional probability calculation. The algorithm for performing this calculation therefore has many applications. For example, the solution to the classical state-estimation problem and the retrieval of information from probabilistic or uncertain knowledge bases may both be viewed as applications of this algorithm.
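For illustration only (the distribution, the names, and the dictionary representation below are hypothetical, and this is not the paper's tree-manipulation algorithm), the kind of conditional probability calculation that a retrieval corresponds to can be sketched as conditioning a small joint distribution over (state, observation) pairs on an observed value, as in a one-step discrete state-estimation problem.

# P(state, observation) for a two-state, two-observation example (numbers are illustrative)
joint = {
    ("working", "green"): 0.56,
    ("working", "red"):   0.14,
    ("broken",  "green"): 0.03,
    ("broken",  "red"):   0.27,
}

def retrieve(joint, observation):
    """Return the conditional distribution P(state | observation)."""
    matching = {s: p for (s, o), p in joint.items() if o == observation}
    evidence = sum(matching.values())          # P(observation)
    return {s: p / evidence for s, p in matching.items()}

print(retrieve(joint, "red"))
# roughly {'working': 0.34, 'broken': 0.66}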
The main result of this paper is that these manipulations are complete and sound. In order to prove this result, it is necessary to have a semantic setting for these theories. The setting chosen is the category of description spaces, which is a generalization of the category of bounded measure spaces with measure-non-increasing maps. The proof exploits the retrieval properties of stochastic terms and their relationship to conditional probability calculations in the models.
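As a point of reference, and only as an assumption about terminology (the paper's own definition of description spaces may differ), "maps which do not increase measure" can be read as measurable maps whose pushforward is dominated by the target measure:

    f : (X, \Sigma_X, \mu) \to (Y, \Sigma_Y, \nu), \qquad \mu\bigl(f^{-1}(B)\bigr) \le \nu(B) \quad \text{for all } B \in \Sigma_Y.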