Infomax

Infomax, or the principle of maximum information preservation, is an optimization principle for artificial neural networks and other information processing systems. It prescribes that a function that maps a set of input values to a set of output values should be chosen or learned so as to maximize the average Shannon mutual information between the input and the output, subject to a set of specified constraints and/or noise processes. Infomax algorithms are learning algorithms that perform this optimization process. The principle was described by Linsker in 1988.[1] The objective function is called the InfoMax objective.
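In simple Gaussian settings the InfoMax objective can be computed in closed form, which makes the role of the constraint concrete. The sketch below (illustrative only; the channel, variances, and names such as `var_x` and `w_max` are assumptions, not from the source) uses a scalar Gaussian channel y = wx + n, for which I(X; Y) = ½ log(1 + w²·Var(x)/Var(n)). Because this is monotone in |w|, the infomax choice of w under a gain constraint simply saturates the constraint:

```python
import numpy as np

def mutual_information(w, var_x=1.0, var_n=0.1):
    """I(X; Y) for the scalar Gaussian channel y = w*x + n,
    with x ~ N(0, var_x) and independent noise n ~ N(0, var_n)."""
    return 0.5 * np.log(1.0 + w**2 * var_x / var_n)

# Infomax: pick w to maximize I(X; Y) subject to the gain constraint
# |w| <= w_max.  Since I grows with |w|, the optimum lies on the boundary.
w_max = 2.0
candidates = np.linspace(-w_max, w_max, 401)
best = candidates[np.argmax(mutual_information(candidates))]
```

Without a constraint (or a noise process) the objective would be unbounded, which is why the principle is always stated relative to constraints or noise.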

As the InfoMax objective is difficult to compute exactly, a related notion uses two models of the same input, each producing its own output, and maximizes the mutual information between these two outputs. This contrastive InfoMax objective is a lower bound on the InfoMax objective.[2]
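When the two outputs are modeled as jointly Gaussian with correlation ρ, their mutual information is −½ log(1 − ρ²), so maximizing it amounts to maximizing the correlation between the two outputs. The sketch below (an illustration under that Gaussian assumption; the two "views", the shared signal, and all names are hypothetical) shows that a pair of linear models projecting onto a shared signal scores far higher on this objective than a pair projecting onto independent noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "views": each model sees a noisy copy of a shared signal (channel 0)
# plus an independent noise channel (channel 1).
n = 5000
s = rng.normal(size=n)                                 # shared underlying signal
view_a = np.stack([s, rng.normal(size=n)])             # shape (2, n)
view_b = np.stack([s + 0.1 * rng.normal(size=n), rng.normal(size=n)])

def gaussian_mi(wa, wb):
    """Mutual information between the two linear outputs, assuming they
    are jointly Gaussian: I(y_a; y_b) = -0.5 * log(1 - rho**2)."""
    ya, yb = wa @ view_a, wb @ view_b
    rho = np.corrcoef(ya, yb)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Projections onto the shared channel vs. onto the independent noise channels.
mi_signal = gaussian_mi(np.array([1.0, 0.0]), np.array([1.0, 0.0]))
mi_noise = gaussian_mi(np.array([0.0, 1.0]), np.array([0.0, 1.0]))
```

Gradient ascent on such an objective drives the two models toward representations that agree about the common structure in their inputs, which is the intuition behind the contrastive formulation.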

Infomax, in its zero-noise limit, is related to the principle of redundancy reduction proposed for biological sensory processing by Horace Barlow in 1961,[3] and applied quantitatively to retinal processing by Atick and Redlich.[4]

  1. ^ Linsker R (1988). "Self-organization in a perceptual network". IEEE Computer. 21 (3): 105–17. doi:10.1109/2.36. S2CID 1527671.
  2. ^ Becker, Suzanna; Hinton, Geoffrey E. (January 1992). "Self-organizing neural network that discovers surfaces in random-dot stereograms". Nature. 355 (6356): 161–163. doi:10.1038/355161a0. ISSN 1476-4687.
  3. ^ Barlow, H. (1961). "Possible principles underlying the transformations of sensory messages". In Rosenblith, W. (ed.). Sensory Communication. Cambridge MA: MIT Press. pp. 217–234.
  4. ^ Atick JJ, Redlich AN (1992). "What does the retina know about natural scenes?". Neural Computation. 4 (2): 196–210. doi:10.1162/neco.1992.4.2.196. S2CID 17515861.