ADALINE

[Figure: Learning inside a single-layer ADALINE]
[Photo: An ADALINE machine, with hand-adjustable weights implemented by rheostats]
[Schematic: A single ADALINE unit, from Figure 2 of Widrow (1960)]

ADALINE (Adaptive Linear Neuron, later Adaptive Linear Element) is an early single-layer artificial neural network and the name of the physical device that implemented it.[1][2][3][4][5] It was developed by Professor Bernard Widrow and his doctoral student Ted Hoff at Stanford University in 1960, and it is based on the perceptron. The unit consists of weights, a bias, and a summation function. The weights and bias were implemented by rheostats (as seen in the "knobby ADALINE") and, later, by memistors.
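
Concretely, the unit computes a weighted sum of its inputs plus the bias, and a Heaviside step turns that sum into a binary output. A minimal sketch in NumPy (the function names and the ±1 output convention are illustrative, not taken from the original hardware):

```python
import numpy as np

def adaline_sum(x, w, b):
    """Weighted sum of inputs plus bias: the quantity ADALINE's learning rule targets."""
    return np.dot(w, x) + b

def heaviside(s):
    """Threshold the weighted sum into a binary (+1 / -1) classification."""
    return 1.0 if s >= 0.0 else -1.0
```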

The difference between ADALINE and the standard (McCulloch–Pitts) perceptron lies in how they learn. ADALINE adjusts its weights to match a teacher signal using the weighted sum before the Heaviside function is applied (see figure), whereas the standard perceptron adjusts its weights to match the correct output after the Heaviside function is applied.
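
The contrast can be sketched with the two update rules side by side. Here the ADALINE update is the least mean squares (LMS, or delta) rule; the learning rate eta and the ±1 teacher signal are illustrative choices:

```python
import numpy as np

def lms_update(w, b, x, target, eta=0.01):
    """ADALINE / LMS rule: the error is measured before the Heaviside step."""
    s = np.dot(w, x) + b        # raw weighted sum (pre-threshold)
    error = target - s          # teacher signal minus linear output
    return w + eta * error * x, b + eta * error

def perceptron_update(w, b, x, target, eta=0.01):
    """Standard perceptron rule: the error is measured after the Heaviside step."""
    y = 1.0 if np.dot(w, x) + b >= 0.0 else -1.0   # thresholded output
    error = target - y          # zero when the example is already classified correctly
    return w + eta * error * x, b + eta * error
```

One consequence of this difference: because the LMS error compares the teacher signal with the linear output, ADALINE keeps refining its weights even on correctly classified examples, while the perceptron update vanishes as soon as the thresholded output matches the target.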

A multilayer network of ADALINE units is a MADALINE.
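
As a rough sketch only (the original MADALINE combined its ADALINE units through a fixed logic element, majority vote being one such choice; later MADALINE rules differ), a forward pass might look like:

```python
import numpy as np

def madaline_output(x, W, b):
    """One layer of ADALINE units combined by a fixed majority-vote output (illustrative)."""
    sums = W @ x + b                            # weighted sum for each ADALINE unit
    votes = np.where(sums >= 0.0, 1.0, -1.0)    # each unit's Heaviside output
    return 1.0 if votes.sum() >= 0.0 else -1.0  # fixed majority-vote combiner
```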

  1. ^ Anderson, James A.; Rosenfeld, Edward (2000). Talking Nets: An Oral History of Neural Networks. MIT Press. ISBN 9780262511117.
  2. ^ "Science in Action" (YouTube video). widrowlms.
  3. ^ Widrow, Bernard (1960). An Adaptive "ADALINE" Neuron Using Chemical "Memistors". Stanford Electronics Laboratories, Technical Report 1553-2.
  4. ^ "The LMS algorithm and ADALINE. Part I - The LMS algorithm" (YouTube video). widrowlms.
  5. ^ "The LMS algorithm and ADALINE. Part II - ADALINE and memistor ADALINE" (YouTube video). widrowlms.