The LSE approach to econometrics, named for the London School of Economics, views econometric models as reductions of an unknown data generating process (DGP). Modelling starts from a deliberately complex, general specification, whose flexibility lets it capture information in the real-world data that the theory omits; the econometrician then reduces this complexity through a series of restrictions, each of which is tested.
One particular functional form, the error-correction model, is often arrived at when modelling time series. Denis Sargan and David Forbes Hendry (with his general-to-specific modelling) were key figures in the development of the approach, which has since been extended by the work of Robert F. Engle, Clive Granger, and Søren Johansen on integrated and cointegrated systems. Another commonly used functional form is the distributed lag or autoregressive distributed lag (ADL) model.
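A standard illustration of the link between the two forms, written here with generic scalar series y_t and x_t: the first-order autoregressive distributed lag model, ADL(1,1),

    y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \gamma y_{t-1} + \varepsilon_t ,

can be reparameterised exactly, with no loss of information, as the error-correction model

    \Delta y_t = \alpha + \beta_0 \Delta x_t - (1 - \gamma)\left( y_{t-1} - \frac{\beta_0 + \beta_1}{1 - \gamma}\, x_{t-1} \right) + \varepsilon_t ,

in which the term in parentheses is last period's deviation from the long-run relation y = \frac{\beta_0 + \beta_1}{1 - \gamma}\, x, and (1 - \gamma) is the speed of adjustment back towards it (assuming |\gamma| < 1).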
David F. Hendry is considered the principal architect of the LSE approach. The methodology is often referred to as general-to-specific modelling, "Gets" modelling, or "Hendry's methodology".
The software package OxMetrics implements this process via Autometrics, the automatic model-selection algorithm in its PcGive module.
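To make the reduction sequence concrete, the following is a minimal single-path sketch in Python using statsmodels. It illustrates the general-to-specific idea only and is not the Autometrics algorithm, which searches multiple reduction paths and applies a battery of diagnostic tests at each step; the simulated data, variable names, and 5% significance level are assumptions made for the example.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def gets_reduction(y, X, alpha=0.05):
        """Single-path general-to-specific reduction (illustration only).

        Start from the general model containing every column of X, then
        repeatedly drop the least significant regressor until all
        survivors are significant at level `alpha`.  Autometrics, by
        contrast, explores many deletion paths and re-checks diagnostic
        tests after each reduction.
        """
        kept = list(X.columns)
        while kept:
            fit = sm.OLS(y, sm.add_constant(X[kept])).fit()
            pvals = fit.pvalues.drop("const")   # p-values of candidate regressors
            worst = pvals.idxmax()              # least significant regressor
            if pvals[worst] <= alpha:           # all survivors significant: stop
                return fit
            kept.remove(worst)                  # reduce the model and re-estimate
        # Degenerate case: every regressor was dropped; keep the intercept.
        return sm.OLS(y, np.ones((len(y), 1))).fit()

    # Hypothetical usage on simulated data (column names are illustrative):
    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.normal(size=(200, 5)),
                     columns=[f"x{i}" for i in range(5)])
    y = 0.8 * X["x0"] - 0.5 * X["x3"] + rng.normal(size=200)
    print(gets_reduction(y, X).params)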
In the 1970s, when the LSE approach was in its infancy, Edward E. Leamer was an early critic of model discovery methodologies.
The approach has evolved to include multiple reduction-path searches, indicator saturation (sketched below), COMFAC (common-factor) testing, and cointegrated vector autoregressive structures.
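As a sketch of one of these extensions: impulse-indicator saturation creates an impulse dummy for every observation and lets the data decide which ones to retain; since all T dummies cannot enter a single regression alongside the regressors, they are entered in blocks. The two-block (split-half) scheme and 1% significance level below are simplifying assumptions; practical implementations use more blocks and a tighter level, often around 1/T.

    import numpy as np
    import statsmodels.api as sm

    def impulse_indicator_saturation(y, X, alpha=0.01):
        """Split-half impulse-indicator saturation (illustrative sketch).

        An impulse dummy is created for every observation t.  The dummies
        enter the regression in two halves; those significant at level
        `alpha` within their half are retained and entered jointly in a
        final regression.
        """
        y = np.asarray(y)
        Xc = sm.add_constant(np.asarray(X))
        T, k = len(y), Xc.shape[1]
        eye = np.eye(T)                          # column t = impulse dummy for obs t
        retained = []
        for block in (range(0, T // 2), range(T // 2, T)):
            dummies = eye[:, list(block)]
            fit = sm.OLS(y, np.column_stack([Xc, dummies])).fit()
            pvals = fit.pvalues[k:]              # p-values of this half's dummies
            retained += [t for t, p in zip(block, pvals) if p < alpha]
        final = sm.OLS(y, np.column_stack([Xc, eye[:, sorted(retained)]])).fit()
        return final, sorted(retained)

Retained indicators mark observations the model cannot explain, guarding the reduction against unmodelled outliers and location shifts.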
Economists often associated with "Hendry's methodology" include Clive Granger, Robert F. Engle, Søren Johansen, Grayham Mizon, Jennifer Castle, Hans-Martin Krolzig, Neil Ericsson, and Jurgen Doornik.