Double descent

An example of the double descent phenomenon in a two-layer neural network: as the ratio of parameters to data points increases, the test error first falls, then rises, then falls again.[1] The vertical line marks the "interpolation threshold", the boundary between the underparameterized regime (more data points than parameters) and the overparameterized regime (more parameters than data points).

In statistics and machine learning, double descent is the phenomenon in which a model with a small number of parameters and a model with an extremely large number of parameters both achieve low test error, whereas a model whose parameter count is roughly equal to the number of data points used to train it has high test error.[2] This phenomenon has been considered surprising, as it contradicts classical assumptions about overfitting, under which test error should only grow once a model becomes flexible enough to interpolate the training data.[1]
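The phenomenon can be reproduced in a small experiment. The following sketch is illustrative only and is not taken from the cited sources; the task, sample sizes, and feature counts are all hypothetical. It fits a minimum-norm least-squares readout on top of fixed random ReLU features, a common stand-in for a two-layer network with a frozen hidden layer. As the number of features p sweeps past the number of training points n, the test error typically rises toward the interpolation threshold p = n and descends again beyond it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: noisy samples of a smooth 1-D function.
n_train, n_test = 40, 200
x_train = rng.uniform(-1.0, 1.0, n_train)
x_test = rng.uniform(-1.0, 1.0, n_test)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(n_train)
y_test = np.sin(2 * np.pi * x_test)

def relu_features(x, w, b):
    # Fixed random hidden layer; only the linear readout is fitted,
    # a standard proxy for a two-layer network.
    return np.maximum(0.0, np.outer(x, w) + b)

for p in [5, 10, 20, 40, 80, 160, 640]:
    w = rng.standard_normal(p)
    b = rng.uniform(-1.0, 1.0, p)
    phi_train = relu_features(x_train, w, b)
    phi_test = relu_features(x_test, w, b)
    # pinv returns the ordinary least-squares fit when p < n_train
    # and the minimum-norm interpolator when p > n_train.
    coef = np.linalg.pinv(phi_train) @ y_train
    mse = np.mean((phi_test @ coef - y_test) ** 2)
    print(f"p/n = {p / n_train:5.2f}   test MSE = {mse:.3f}")
```

The choice of np.linalg.pinv rather than an explicit matrix inverse matters here: below the threshold it yields the usual least-squares solution, while above the threshold, where infinitely many interpolating solutions exist, it selects the one of minimum norm, which is what allows the test error to fall a second time in the overparameterized regime.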

  1. Schaeffer, Rylan; Khona, Mikail; Robertson, Zachary; Boopathy, Akhilan; Pistunova, Kateryna; Rocks, Jason W.; Fiete, Ila Rani; Koyejo, Oluwasanmi (2023-03-24). "Double Descent Demystified: Identifying, Interpreting & Ablating the Sources of a Deep Learning Puzzle". arXiv:2303.14151v1 [cs.LG].
  2. "Deep Double Descent". OpenAI. 2019-12-05. Retrieved 2022-08-12.