Exponentially decreasing bounds on tail distributions of random variables
In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms the Chernoff or Chernoff–Cramér bound, which may decay faster than exponentially (e.g. sub-Gaussian).[1][2] It is especially useful for sums of independent random variables, such as sums of Bernoulli random variables.[3][4]
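In its generic form, the bound follows from applying Markov's inequality to the exponentiated variable: for any $t > 0$,

$$\Pr(X \ge a) = \Pr\left(e^{tX} \ge e^{ta}\right) \le \frac{\operatorname{E}\left[e^{tX}\right]}{e^{ta}},$$

and taking the best choice of $t$ yields the Chernoff–Cramér bound

$$\Pr(X \ge a) \le \inf_{t > 0} \operatorname{E}\left[e^{tX}\right] e^{-ta}.$$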
The bound is commonly named after Herman Chernoff, who described the method in a 1952 paper,[5] though Chernoff himself attributed it to Herman Rubin.[6] In 1938, Harald Cramér had published an almost identical concept, now known as Cramér's theorem.
The Chernoff bound is sharper than first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which yield only power-law bounds on tail decay. However, when applied to sums, the Chernoff bound requires the summands to be independent, a condition that neither Markov's inequality nor Chebyshev's inequality requires. A numerical comparison follows below.
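The following sketch illustrates the gap between the three bounds on the upper tail of a binomial sum. The parameters and the grid search over $t$ are illustrative choices, not part of any standard statement of the bounds.

```python
import math

# Illustrative parameters (not from the article): bound P(X >= 75)
# for X a sum of 100 independent Bernoulli(0.5) variables.
n, p, a = 100, 0.5, 75
mean, var = n * p, n * p * (1 - p)

markov = mean / a                      # uses only the first moment
chebyshev = var / (a - mean) ** 2      # uses the first two moments

# Chernoff: minimize E[e^{tX}] * e^{-ta} over t > 0. For a Binomial(n, p)
# sum, the moment generating function is M(t) = (1 - p + p * e^t)^n.
# A coarse grid search stands in for exact minimization here.
chernoff = min(
    (1 - p + p * math.exp(t)) ** n * math.exp(-t * a)
    for t in (i / 1000 for i in range(1, 5000))
)

print(f"Markov    <= {markov:.4f}")    # ~0.667: power-law in a
print(f"Chebyshev <= {chebyshev:.4f}") # ~0.040: power-law in (a - mean)
print(f"Chernoff  <= {chernoff:.2e}")  # ~2e-06: exponentially small
```

For these parameters the Markov and Chebyshev bounds remain polynomially small in the deviation, while the Chernoff bound is smaller by several orders of magnitude, reflecting its exponential decay.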