We describe a principled way of imposing a metric representing dissimilarities on any discrete set of stimuli (symbols, handwriting samples, consumer products, X-ray films, etc.), given the probabilities with which they are discriminated from each other by a perceiving system, such as an organism, person, group of experts, neuronal structure, technical device, or even an abstract computational algorithm. In this procedure one need not assume that discrimination probabilities are monotonically related to distances, or that the distances belong to a predefined class of metrics, such as the Minkowski metrics. Discrimination probabilities need not be symmetric, the probability of discriminating an object from itself need not be constant across objects, and discrimination probabilities are allowed to be 0’s and 1’s. The only requirement is Regular Minimality, a principle we consider the defining property of discrimination: for ordered stimulus pairs (a, b), b is least frequently discriminated from a if and only if a is least frequently discriminated from b. Regular Minimality generalizes a weak consequence of the assumption that discrimination probabilities increase monotonically with distance: the probability of discriminating a from a should be less than that of discriminating a from any other object. This special form of Regular Minimality also underlies such traditional analyses of discrimination probabilities as Multidimensional Scaling and Cluster Analysis.
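The Regular Minimality condition stated above can be checked directly on a matrix of empirical discrimination probabilities. The following sketch is illustrative rather than part of the procedure described in the paper: it assumes the stimuli are indexed 0..n-1 and that P[a][b] is the probability with which stimulus b is discriminated from stimulus a; the function name and the uniqueness-of-minima requirement are our own choices for this illustration.

```python
def satisfies_regular_minimality(P):
    """Check Regular Minimality for an n-by-n matrix P, where P[a][b]
    is the probability with which stimulus b is discriminated from
    stimulus a.  Requires each row and each column to attain its
    minimum at a unique entry, and the two minimizers to agree:
    b minimizes row a if and only if a minimizes column b."""
    n = len(P)
    row_min = []  # for each row a, the column at which the row is minimal
    for a in range(n):
        m = min(P[a])
        if P[a].count(m) != 1:      # row minimum must be unique
            return False
        row_min.append(P[a].index(m))
    col_min = []  # for each column b, the row at which the column is minimal
    for b in range(n):
        col = [P[a][b] for a in range(n)]
        m = min(col)
        if col.count(m) != 1:       # column minimum must be unique
            return False
        col_min.append(col.index(m))
    # Regular Minimality: the row and column minimizers are mutually inverse
    return all(col_min[row_min[a]] == a for a in range(n))


# A matrix satisfying Regular Minimality: note it is not symmetric,
# and the row minima need not fall on the diagonal (the probability of
# discriminating an object from itself need not be the row minimum).
P = [[0.1, 0.8, 0.6],
     [0.7, 0.9, 0.2],
     [0.5, 0.3, 0.9]]
```

Here row 1 is minimized at column 2 and column 2 at row 1 (and similarly for the other rows), so the condition holds even though the minima are off-diagonal and the matrix is asymmetric.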