Infomax is an optimization principle for artificial neural networks and other information-processing systems. It prescribes that a function mapping a set of input values I to a set of output values O should be chosen or learned so as to maximize the average Shannon mutual information between I and O, subject to a set of specified constraints and/or noise processes. Infomax algorithms are learning algorithms that perform this optimization. The principle was described by Linsker in 1988.[1]
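The role of the constraints can be illustrated on the simplest case, a scalar linear-Gaussian channel O = wI + n. This is a hedged sketch, not from the source: the closed-form mutual information used here, MI(w) = ½ ln(1 + w²σ²_I/σ²_n), is the standard Gaussian-channel formula, and the output-power constraint is an assumed example of the "specified constraints" mentioned above.

```python
import numpy as np

# Illustrative sketch (an assumption, not Linsker's original setup):
# scalar linear-Gaussian channel O = w*I + n, with input variance s2_in
# and additive noise variance s2_n. The Shannon mutual information is
#   MI(w) = 0.5 * ln(1 + w**2 * s2_in / s2_n)   (in nats).

def mutual_information(w, s2_in=1.0, s2_n=0.1):
    """Mutual information between input and output of the channel."""
    return 0.5 * np.log(1.0 + (w ** 2) * s2_in / s2_n)

def optimal_gain(P, s2_in=1.0, s2_n=0.1):
    """Infomax solution under the output-power constraint E[O^2] <= P.

    Unconstrained, MI grows without bound as |w| increases, so infomax
    only becomes well posed once a constraint is imposed; here MI is
    maximized by the largest admissible gain.
    """
    return np.sqrt(max(P - s2_n, 0.0) / s2_in)

w_star = optimal_gain(P=1.0)
print(w_star, mutual_information(w_star))
```

The sketch makes the structure of the principle concrete: the objective (mutual information) pushes the gain up, while the constraint (output power) bounds it, and the infomax solution sits at that boundary.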
Infomax, in its zero-noise limit, is related to the principle of redundancy reduction proposed for biological sensory processing by Horace Barlow in 1961,[2] and applied quantitatively to retinal processing by Atick and Redlich.[3]
^Linsker R (1988). "Self-organization in a perceptual network". IEEE Computer. 21 (3): 105–17. doi:10.1109/2.36. S2CID 1527671.
^Barlow, H. (1961). "Possible principles underlying the transformations of sensory messages". In Rosenblith, W. (ed.). Sensory Communication. Cambridge MA: MIT Press. pp. 217–234.
^Nadal J.P., Parga N. (1999). "Sensory coding: information maximization and redundancy reduction". In Burdet, G.; Combe, P.; Parodi, O. (eds.). Neural Information Processing. World Scientific Series in Mathematical Biology and Medicine. Vol. 7. Singapore: World Scientific. pp. 164–171.