Ray Solomonoff

Born: July 25, 1926
Died: December 7, 2009 (aged 83)
Alma mater: University of Chicago (M.S. in Physics, 1951)
Known for: Algorithmic probability, General Theory of Inductive Inference, Solomonoff induction, Kolmogorov complexity
Notable work: "A Formal Theory of Inductive Inference" (1964), the concept of algorithmic probability, foundational work on machine learning
Awards: Kolmogorov Award (2003)
Fields: Mathematics, artificial intelligence, algorithmic information theory
Institutions: Oxbridge Research, MIT, University of Saarland, Dalle Molle Institute for Artificial Intelligence

Ray Solomonoff (July 25, 1926 – December 7, 2009)[1][2] was an American mathematician who invented algorithmic probability[3] and his General Theory of Inductive Inference (also known as Universal Inductive Inference),[4] and who was a founder of algorithmic information theory.[5] He was an originator of the branch of artificial intelligence based on machine learning, prediction and probability, and he circulated the first report on non-semantic machine learning in 1956.[6]

Solomonoff first described algorithmic probability in 1960, publishing the theorem that launched Kolmogorov complexity and algorithmic information theory. He presented these results at a conference at Caltech in February 1960[7] and in a report of the same month, "A Preliminary Report on a General Theory of Inductive Inference."[8] He developed these ideas more fully in his 1964 publications, "A Formal Theory of Inductive Inference," Part I[9] and Part II.[10]

Algorithmic probability is a mathematically formalized combination of Occam's razor[11][12][13][14] and the Principle of Multiple Explanations.[15] It is a machine-independent method of assigning a probability value to each hypothesis (algorithm or program) that explains a given observation, with the simplest hypothesis (the shortest program) receiving the highest probability and increasingly complex hypotheses receiving correspondingly smaller probabilities.
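In symbols (a standard modern formulation rather than a quotation from the 1964 papers), one can fix a universal prefix Turing machine U and write the algorithmic probability of a finite binary string x as

    M(x) = \sum_{p \,:\, U(p) = x*} 2^{-\ell(p)},

where the sum runs over all programs p whose output begins with x and \ell(p) is the length of p in bits, so each extra bit of program length halves a hypothesis's weight.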

Solomonoff founded the theory of universal inductive inference, which rests on solid philosophical foundations[4] and has its roots in Kolmogorov complexity and algorithmic information theory. The theory uses algorithmic probability in a Bayesian framework: the universal prior is taken over the class of all computable measures, so no hypothesis receives zero probability. This enables Bayes' rule (of causation) to be used to predict the most likely next event in a series of events, and how likely it will be.[10]
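The weighting and the Bayesian update can be made concrete with a toy sketch. The following Python snippet is only illustrative and is not Solomonoff's construction: it replaces the incomputable sum over all programs with a small hand-picked hypothesis class, assigns each hypothesis an assumed description length, and applies Bayes' rule to predict the next bit of a binary sequence.

```python
# Toy, purely illustrative sketch of simplicity-weighted Bayesian prediction.
# The hypothesis class and its description lengths are assumptions made for
# this example; true Solomonoff induction mixes over all programs of a
# universal Turing machine and is incomputable.
from fractions import Fraction

# Each hypothesis: (name, description length in bits, P(next bit = 1 | bits seen so far)).
hypotheses = [
    ("always 0",       2, lambda seen: Fraction(0)),
    ("always 1",       2, lambda seen: Fraction(1)),
    ("alternate 0101", 4, lambda seen: Fraction(len(seen) % 2)),
    ("fair coin",      6, lambda seen: Fraction(1, 2)),
]

def predict_next(observed: str) -> Fraction:
    """Posterior probability that the next bit is 1, given the observed bits."""
    total = Fraction(0)
    total_one = Fraction(0)
    for _, length, p_one in hypotheses:
        prior = Fraction(1, 2 ** length)      # 2^-length: shorter description, larger prior weight
        likelihood = Fraction(1)
        for i, bit in enumerate(observed):    # P(observed bits | hypothesis)
            p = p_one(observed[:i])
            likelihood *= p if bit == "1" else 1 - p
        total += prior * likelihood
        total_one += prior * likelihood * p_one(observed)
    return total_one / total                  # Bayes' rule, mixed over the whole class

print(predict_next("0101"))  # ~0.008: the short "alternate" hypothesis dominates, so a 0 is expected next
```

On the observed data 0101 the short "alternate" hypothesis explains everything seen so far, so the mixture predicts the next bit is 0 with probability near 1; what this toy (and any practical approximation) gives up is the full, incomputable mixture over every possible program.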

Although he is best known for algorithmic probability and his general theory of inductive inference, he made many other important discoveries throughout his life, most of them directed toward his goal in artificial intelligence: to develop a machine that could solve hard problems using probabilistic methods.

  1. ^ "Ray Solomonoff, 1926–2009 « The Third Conference on Artificial General Intelligence". Archived from the original on 2011-08-07. Retrieved 2009-12-12.
  2. ^ Markoff, John (January 9, 2010). "Ray Solomonoff, Pioneer in Artificial Intelligence, Dies at 83". The New York Times. Retrieved January 11, 2010.
  3. ^ Vitanyi, Paul; Legg, Shane; Hutter, Marcus (2007). "Algorithmic probability". Scholarpedia. 2 (8): 2572. Bibcode:2007SchpJ...2.2572H. doi:10.4249/scholarpedia.2572. hdl:1885/15013.
  4. ^ a b Samuel Rathmanner and Marcus Hutter. A philosophical treatise of universal induction. Entropy, 13(6):1076–1136, 2011.
  5. ^ Vitanyi, P., "Obituary: Ray Solomonoff, Founding Father of Algorithmic Information Theory".
  6. ^ "An Inductive Inference Machine", Dartmouth College, N.H., version of Aug. 14, 1956. (pdf scanned copy of the original)
  7. ^ Paper from conference on "Cerebral Systems and Computers", California Institute of Technology, Feb 8–11, 1960, cited in "A Formal Theory of Inductive Inference", Part I, 1964, p. 1.
  8. ^ Solomonoff, R., "A Preliminary Report on a General Theory of Inductive Inference", Report V-131, Zator Co., Cambridge, Ma. Feb 4, 1960, revision, Nov., 1960.
  9. ^ Solomonoff, R., "A Formal Theory of Inductive Inference, Part I", Information and Control, Vol. 7, No. 1, pp. 1–22, March 1964.
  10. ^ a b Solomonoff, R., "A Formal Theory of Inductive Inference, Part II", Information and Control, Vol. 7, No. 2, pp. 224–254, June 1964.
  11. ^ McCall, J. J., "Induction: From Kolmogorov and Solomonoff to De Finetti and Back to Kolmogorov", Metroeconomica, 2004, Wiley Online Library.
  12. ^ Stork, D., "Foundations of Occam's razor and parsimony in learning", NIPS 2001 Workshop, 2001.
  13. ^ Soklakov, A. N., "Occam's razor as a formal basis for a physical theory", Foundations of Physics Letters, Springer, 2002.
  14. ^ Hernández-Orallo, J., "Beyond the Turing Test", Journal of Logic, Language, and …, 2000.
  15. ^ Li, Ming; Vitanyi, Paul, An Introduction to Kolmogorov Complexity and Its Applications, Springer-Verlag, N.Y., 2008, p. 339 ff.