Rate of convergence

In mathematical analysis, particularly numerical analysis, the rate of convergence and order of convergence of a sequence that converges to a limit are any of several characterizations of how quickly that sequence approaches its limit. These are broadly divided into rates and orders of convergence that describe how quickly a sequence further approaches its limit once it is already close to it, called asymptotic rates and orders of convergence, and those that describe how quickly sequences approach their limits from starting points that are not necessarily close to their limits, called non-asymptotic rates and orders of convergence.

Asymptotic rates and orders of convergence have particular importance both in practical numerics and in formal proof, and they are the primary focus of this article. In practical numerics, asymptotic rates and orders of convergence follow two common conventions for two types of sequences: the first for sequences of iterations of an iterative numerical method and the second for sequences of successively more accurate numerical discretizations of a target. In formal mathematics, rates of convergence and orders of convergence are often described using asymptotic notation commonly called "big O notation," which can be used to encompass both of the prior conventions.

For iterative methods, a sequence $(x_k)$ that converges to a limit $L$ is said to have asymptotic order of convergence $q \geq 1$ and asymptotic rate of convergence $\mu$ if

$$\lim_{k \to \infty} \frac{|x_{k+1} - L|}{|x_k - L|^q} = \mu$$

for some positive constant $\mu$, with $\mu \in (0, 1)$ required when $q = 1$.[1][2]

Where greater methodological precision is required, these rates and orders of convergence are known specifically as the rates and orders of Q-convergence, short for quotient-convergence, since the limit in question is a quotient of error terms.[2] Sequences with larger orders converge more quickly than those with smaller orders, and, for a given order, sequences with smaller rates converge more quickly than those with larger rates.
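As an illustrative sketch (the choice of Newton's method for $f(x) = x^2 - 2$ is an example, not taken from the cited sources), the order $q$ and rate $\mu$ can be estimated numerically from successive errors of an iterative method:

```python
# Illustrative sketch: estimating the asymptotic order q and rate mu of
# Q-convergence from successive errors of an iterative method. Newton's
# method for f(x) = x^2 - 2 is used as an example; its iterates converge
# quadratically (q = 2) to sqrt(2) with mu = 1/(2*sqrt(2)) ~ 0.354.
import math

L = math.sqrt(2.0)

# Newton iteration: x_{k+1} = x_k - f(x_k)/f'(x_k) = (x_k + 2/x_k) / 2
x = 1.0
errors = []
for _ in range(6):
    errors.append(abs(x - L))
    x = 0.5 * (x + 2.0 / x)

# q ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1}),  mu ~ e_{k+1} / e_k**q
for k in range(1, len(errors) - 1):
    e_prev, e_k, e_next = errors[k - 1], errors[k], errors[k + 1]
    if e_next == 0.0:  # stop once the error reaches machine precision
        break
    q = math.log(e_next / e_k) / math.log(e_k / e_prev)
    mu = e_next / e_k ** q
    print(f"k = {k}: estimated order q ~ {q:.2f}, rate mu ~ {mu:.2f}")
```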

This "smaller rates converge more quickly" behavior among sequences of the same order is standard but it can be counterintuitive. It is therefore also common to define as the rate; this is the "number of extra decimals of precision per iterate" for sequences that converge with order 1.[2] The rate of convergence may also be called the asymptotic error constant, and some authors will use rate where this article uses order (e.g., [3]).

Similar concepts are used for sequences of discretizations. For instance, ideally the solution of a differential equation discretized via a regular grid will converge to the solution of the continuous differential equation as the grid spacing goes to zero, and if so the asymptotic rate and order of that convergence are an important characterization of the efficiency of the grid discretization method. A sequence of approximate grid solutions $(y_n)$ of some problem that converges to a true solution $S$, with a corresponding sequence of regular grid spacings $(h_n)$ that converge to 0, is said to have asymptotic order of convergence $q$ and asymptotic rate of convergence $\mu$ if

$$\lim_{n \to \infty} \frac{|y_n - S|}{|h_n|^q} = \mu,$$
where the absolute value symbols stand for a function metric for the space of solutions such as the uniform norm. Similar definitions also apply for non-grid discretization schemes such as the polygon meshes of a finite element method or the basis sets in computational chemistry: in general, the appropriate definition of the asymptotic rate will involve the asymptotic limit of the ratio of some approximation error term above to an asymptotic order power of a discretization scale parameter below.
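As an illustrative sketch (the central-difference scheme and step sizes are example choices, not taken from the article), the asymptotic order of a discretization can be estimated by comparing errors at successively halved grid spacings:

```python
# Illustrative sketch: estimating the asymptotic order of a discretization.
# The central-difference approximation of a derivative has error roughly
# proportional to h^2, so halving h should recover q ~ 2.
import math

def central_difference(f, x, h):
    """Approximate f'(x) by (f(x + h) - f(x - h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)  # exact derivative of sin at x = 1
spacings = [0.1 / 2 ** k for k in range(5)]
errors = [abs(central_difference(math.sin, 1.0, h) - exact) for h in spacings]

# q ~ log(error(h_{n-1}) / error(h_n)) / log(h_{n-1} / h_n)
for n in range(1, len(errors)):
    q = math.log(errors[n - 1] / errors[n]) / math.log(spacings[n - 1] / spacings[n])
    print(f"h = {spacings[n]:.5f}: estimated order q ~ {q:.2f}")
```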

In practice, the asymptotic rate and order of convergence of a sequence of iterates or of approximations provide useful insights when using iterative methods and discretization methods for calculating numerical approximations. Strictly speaking, however, the asymptotic behavior of a sequence does not give conclusive information about any finite part of the sequence.

Series acceleration methods are techniques for improving the rate of convergence of the sequence of partial sums of a series, and possibly also its order of convergence. These accelerations are commonly accomplished with sequence transformations.
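As an illustrative sketch (the Leibniz series and Aitken's delta-squared process are standard examples, not drawn from the article), one such sequence transformation applied to slowly converging partial sums looks like this:

```python
# Illustrative sketch: accelerating the partial sums of the Leibniz series
# pi/4 = 1 - 1/3 + 1/5 - ... with Aitken's delta-squared transformation.
import math

def aitken(s):
    """Apply Aitken's delta-squared transformation to a list of partial sums."""
    out = []
    for n in range(len(s) - 2):
        denom = s[n + 2] - 2.0 * s[n + 1] + s[n]
        out.append(s[n] - (s[n + 1] - s[n]) ** 2 / denom)
    return out

# Partial sums of the Leibniz series, which converge to pi/4 only slowly.
partial_sums = []
total = 0.0
for n in range(10):
    total += (-1) ** n / (2 * n + 1)
    partial_sums.append(total)

accelerated = aitken(partial_sums)
target = math.pi / 4
print(f"error of last partial sum:        {abs(partial_sums[-1] - target):.2e}")
print(f"error of last accelerated value:  {abs(accelerated[-1] - target):.2e}")
```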

  1. ^ Wang, Ruye (2015-02-12). "Order and rate of convergence". hmc.edu. Retrieved 2020-07-31.
  2. ^ a b c Nocedal, Jorge; Wright, Stephen J. (1999). Numerical Optimization (1st ed.). New York, NY: Springer. pp. 28–29. ISBN 978-0-387-98793-4.
  3. ^ Senning, Jonathan R. "Computing and Estimating the Rate of Convergence" (PDF). gordon.edu. Retrieved 2020-08-07.