The history of calculus is fraught with philosophical debates about the meaning and logical validity of fluxions or infinitesimal numbers. The standard way to resolve these debates is to define the operations of calculus using limits rather than infinitesimals. Nonstandard analysis[1][2][3] instead reformulates the calculus using a logically rigorous notion of infinitesimal numbers.
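To illustrate the contrast: in the limit-based formulation, the derivative of a function $f$ at a point $x$ is defined as

$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},$$

whereas in nonstandard analysis one evaluates the difference quotient at an actual nonzero infinitesimal $\varepsilon$ and applies the standard part function $\operatorname{st}$, which maps each finite hyperreal to the unique real number infinitely close to it:

$$f'(x) = \operatorname{st}\!\left(\frac{f(x + \varepsilon) - f(x)}{\varepsilon}\right).$$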
Nonstandard analysis was introduced in the early 1960s by the mathematician Abraham Robinson.[4][5] He wrote:
... the idea of infinitely small or infinitesimal quantities seems to appeal naturally to our intuition. At any rate, the use of infinitesimals was widespread during the formative stages of the Differential and Integral Calculus. As for the objection ... that the distance between two distinct real numbers cannot be infinitely small, Gottfried Wilhelm Leibniz argued that the theory of infinitesimals implies the introduction of ideal numbers which might be infinitely small or infinitely large compared with the real numbers but which were to possess the same properties as the latter.
Robinson argued that Leibniz's law of continuity is a precursor of the transfer principle, an example of which is given below. Robinson continued:
However, neither he nor his disciples and successors were able to give a rational development leading up to a system of this sort. As a result, the theory of infinitesimals gradually fell into disrepute and was replaced eventually by the classical theory of limits.[6]
Robinson continued:
... Leibniz's ideas can be fully vindicated and ... they lead to a novel and fruitful approach to classical Analysis and to many other branches of mathematics. The key to our method is provided by the detailed analysis of the relation between mathematical languages and mathematical structures which lies at the bottom of contemporary model theory.
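The transfer principle makes Leibniz's heuristic precise: every first-order statement true of the real numbers is also true of the hyperreals. For example, the true statement

$$\forall x \in \mathbb{R}\; \bigl(x > 0 \to \exists y \in \mathbb{R}\; y^2 = x\bigr)$$

transfers to

$$\forall x \in {}^*\mathbb{R}\; \bigl(x > 0 \to \exists y \in {}^*\mathbb{R}\; y^2 = x\bigr),$$

so every positive hyperreal, including every positive infinitesimal, has a hyperreal square root. In this sense the ideal numbers Leibniz envisaged "possess the same properties" as the reals while remaining distinct from them.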
In 1973, intuitionist Arend Heyting praised nonstandard analysis as "a standard model of important mathematical research".[7]