The universal law of generalization is a theory of cognition stating that the probability of a response to one stimulus being generalized to another is a function of the “distance” between the two stimuli in a psychological space. It was introduced in a 1987 Science paper by Roger N. Shepard,[1][2] who began researching mechanisms of generalization while he was still a graduate student at Yale:
"I was now convinced that the problem of generalization was the most fundamental problem confronting learning theory. Because we never encounter exactly the same total situation twice, no theory of learning can be complete without a law governing how what is learned in one situation generalizes to another"[3]
Shepard’s 1987 paper illustrates generalization with the example of a bird that has eaten one earthworm and is then presented with a slightly different-looking earthworm.[2] Explaining the concept of "psychological space" in the abstract of his 1987 paper, Shepard wrote:
"A psychological space is established for any set of stimuli by determining metric distances between the stimuli such that the probability that a response learned to any stimulus will generalize to any other is an invariant [monotonic function] of the distance between them"[2]
Using experimental evidence from both human and non-human subjects, Shepard hypothesized, more specifically, that the probability of generalization falls off exponentially with distance in psychological space, measured by one of two particular metrics: the Euclidean metric, for stimuli varying along perceptually integral dimensions, or the city-block metric, for stimuli varying along separable dimensions. His analysis goes on to argue that, because the rule reflects an evolutionary internalization of regularities in the world, it should hold universally for all sentient organisms.
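Stated more formally (a minimal notational sketch: the symbols g for the probability of generalization, d for psychological distance, and k for the decay rate are illustrative choices, not Shepard's own notation):

g(x, y) = e^{-k \, d(x, y)}

where d(x, y) is the metric distance between stimuli x and y in psychological space and k > 0 determines how rapidly generalization falls off.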
Now regarded as a canonical law of cognitive science, the universal law of generalization has accumulated substantial empirical support: the exponential decay of generalization with decreasing similarity has been confirmed in hundreds of experiments across species, including humans, pigeons, and honeybees.
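As an illustration of how the exponential form can be checked against data, the following sketch fits the decay constant of the law to a set of generalization probabilities. The distances, probabilities, and variable names here are hypothetical, chosen purely for illustration, not taken from any actual experiment:

```python
import numpy as np

# Hypothetical psychological distances d(x, y) between stimulus pairs
# (arbitrary units) and the observed probabilities that a response
# learned to one stimulus generalizes to the other.
distances = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0])
gen_probs = np.array([0.80, 0.58, 0.35, 0.21, 0.13, 0.05])

# If g = exp(-k * d), then -log(g) = k * d, so a least-squares fit of
# -log(g) against d through the origin estimates the decay constant k.
k = np.sum(distances * -np.log(gen_probs)) / np.sum(distances ** 2)
predicted = np.exp(-k * distances)

print(f"estimated decay constant k = {k:.3f}")
print("observed :", gen_probs)
print("predicted:", np.round(predicted, 3))
```

Because the law implies that -log g is proportional to d, a straight-line fit through the origin is enough to estimate the decay constant and to compare the predicted gradient with the observed one.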