In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let X1, ..., Xn be independent Bernoulli random variables taking values +1 and −1 with probability 1/2 (this distribution is also known as the Rademacher distribution). Then for every positive \varepsilon,

P\left(\left|\frac{1}{n}\sum_{i=1}^{n} X_i\right| > \varepsilon\right) \le 2\exp\left(-\frac{n\varepsilon^2}{2(1+\varepsilon/3)}\right).
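The bound above can be checked numerically. The following sketch (not part of the article; the function names and parameters are illustrative) estimates the tail probability for Rademacher variables by Monte Carlo simulation and compares it with the Bernstein bound 2·exp(−nε²/(2(1 + ε/3))):

```python
import math
import random

def bernstein_bound(n, eps):
    """Bernstein upper bound on P(|(1/n) * sum X_i| > eps) for Rademacher X_i."""
    return 2.0 * math.exp(-n * eps**2 / (2.0 * (1.0 + eps / 3.0)))

def empirical_tail(n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of P(|(1/n) * sum X_i| > eps)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s / n) > eps:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    n, eps = 200, 0.2
    # The empirical tail probability should never exceed the Bernstein bound.
    print(f"empirical: {empirical_tail(n, eps):.4f}")
    print(f"bound:     {bernstein_bound(n, eps):.4f}")
```

For n = 200 and ε = 0.2 the bound evaluates to roughly 0.047; the simulated frequency is well below it, illustrating that the inequality is valid but not tight in this regime.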
Bernstein inequalities were proven and published by Sergei Bernstein in the 1920s and 1930s.[1][2][3][4] Later, these inequalities were rediscovered several times in various forms. Thus, special cases of the Bernstein inequalities are also known as the Chernoff bound, Hoeffding's inequality and Azuma's inequality.
The martingale case of the Bernstein inequality is known as Freedman's inequality[5] and its refinement is known as Hoeffding's inequality.[6]
^ Bernstein, S. N. "On a modification of Chebyshev's inequality and of the error formula of Laplace". Vol. 4, No. 5 (original publication: Ann. Sci. Inst. Sav. Ukraine, Sect. Math. 1, 1924).
^ Bernstein, S. N. (1937). "Об определенных модификациях неравенства Чебышева" [On certain modifications of Chebyshev's inequality]. Doklady Akademii Nauk SSSR. 17 (6): 275–277.
^ Bernstein, S. N. Theory of Probability (in Russian). Moscow, 1927.
^ Uspensky, J. V. Introduction to Mathematical Probability. McGraw-Hill Book Company, 1937.
^ Freedman, D. A. (1975). "On tail probabilities for martingales". Ann. Probab. 3: 100–118.