Non-negative least squares

In mathematical optimization, the problem of non-negative least squares (NNLS) is a type of constrained least squares problem where the coefficients are not allowed to become negative. That is, given a matrix A and a (column) vector of response variables y, the goal is to find[1]

    arg min_x ‖Ax − y‖2

subject to x ≥ 0.

Here x ≥ 0 means that each component of the vector x should be non-negative, and ‖·‖2 denotes the Euclidean norm.
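As a concrete illustration (not part of the original text), the problem can be solved numerically with SciPy's scipy.optimize.nnls routine, which uses an active-set strategy in the spirit of Lawson and Hanson's classical algorithm. The example below is a minimal sketch; the matrix and vector are made up for demonstration.

```python
import numpy as np
from scipy.optimize import nnls

# Overdetermined system whose unconstrained least-squares solution
# would assign a negative coefficient to the second variable.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
y = np.array([2.0, -1.0, 0.0])

# nnls returns the non-negative minimizer and the residual norm ||Ax - y||_2.
x, rnorm = nnls(A, y)
print(x)      # the negative component is clamped to the boundary: [2. 0.]
print(rnorm)  # residual 1.0, contributed entirely by the unmatched y[1] = -1
```

Here the unconstrained solution would be x = (2, −1); the non-negativity constraint is active on the second component, which is set to zero.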

Non-negative least squares problems turn up as subproblems in matrix decomposition, e.g. in algorithms for PARAFAC[2] and non-negative matrix/tensor factorization.[3][4] The latter can be considered a generalization of NNLS.[1]

Another generalization of NNLS is bounded-variable least squares (BVLS), with simultaneous upper and lower bounds αᵢ ≤ xᵢ ≤ βᵢ.[5]: 291 [6]
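BVLS is likewise available in common numerical libraries. As a hedged sketch (API names as in recent SciPy releases, data invented for illustration), scipy.optimize.lsq_linear accepts box constraints and offers a 'bvls' method; NNLS is recovered as the special case with bounds (0, +∞).

```python
import numpy as np
from scipy.optimize import lsq_linear

A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([2.0, -1.0])

# Box constraints alpha_i <= x_i <= beta_i applied to every component.
res = lsq_linear(A, y, bounds=(0.0, 1.5), method='bvls')
print(res.x)  # each component is pushed to its nearest bound: [1.5 0. ]
```

Because A is the identity here, the problem separates per coordinate: the unconstrained solution (2, −1) is simply clipped to the box [0, 1.5].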

  1. ^ a b Chen, Donghui; Plemmons, Robert J. (2009). "Nonnegativity constraints in numerical analysis". In Bultheel, Adhemar; Cools, Ronald (eds.). Symbolic-Numeric Computation and the Birth of Numerical Analysis. World Scientific.
  2. ^ Bro, Rasmus; De Jong, Sijmen (1997). "A fast non-negativity-constrained least squares algorithm". Journal of Chemometrics. 11 (5): 393–401.
  3. ^ Lin, Chih-Jen (2007). "Projected Gradient Methods for Nonnegative Matrix Factorization" (PDF). Neural Computation. 19 (10): 2756–2779. CiteSeerX 10.1.1.308.9135. doi:10.1162/neco.2007.19.10.2756. PMID 17716011.
  4. ^ Boutsidis, Christos; Drineas, Petros (2009). "Random projections for the nonnegative least-squares problem". Linear Algebra and Its Applications. 431 (5–7): 760–771. arXiv:0812.4547. doi:10.1016/j.laa.2009.03.026.
  5. ^ Lawson, Charles L.; Hanson, Richard J. (1995). Solving Least Squares Problems. Classics in Applied Mathematics. Philadelphia: SIAM.
  6. ^ Stark, Philip B.; Parker, Robert L. (1995). "Bounded-variable least-squares: an algorithm and applications" (PDF). Computational Statistics. 10: 129.