In mathematics and machine learning, the softplus function is
$$\operatorname{softplus}(x) = \ln\!\left(1 + e^{x}\right).$$
It is a smooth approximation (in fact, an analytic function) to the ramp function, which is known as the rectifier or ReLU (rectified linear unit) in machine learning. For large negative $x$ it is $\ln(1 + e^{x}) \approx e^{x}$, so just above 0, while for large positive $x$ it is $\ln(1 + e^{x}) \approx x$, so just above $x$.
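As an illustration, the following is a minimal NumPy sketch of softplus and its limiting behavior; the function name and the use of numpy.logaddexp for numerical stability are choices made here, not part of the original text.

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x), computed as logaddexp(0, x)
    # to avoid overflow of e^x for large positive x.
    return np.logaddexp(0.0, x)

# For large negative x the value approaches 0 (roughly e^x);
# for large positive x it approaches x itself.
print(softplus(-20.0))  # ~2.1e-09, just above 0
print(softplus(20.0))   # ~20.000000002, just above 20
```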
The names softplus[1][2] and SmoothReLU[3] are used in machine learning. The name "softplus" (2000), by analogy with the earlier softmax (1989), is presumably because it is a smooth (soft) approximation of the positive part of $x$, which is sometimes denoted with a superscript plus, $x^{+} := \max(0, x)$.
Since the logistic sigmoid function $\sigma(x) = \frac{1}{1 + e^{-x}}$ has a positive first derivative, its primitive, which we call softplus, is convex.
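To make this relationship explicit, differentiating the definition above recovers the logistic sigmoid (a short worked check, using the notation introduced here):
$$\frac{d}{dx}\operatorname{softplus}(x) = \frac{d}{dx}\ln\!\left(1 + e^{x}\right) = \frac{e^{x}}{1 + e^{x}} = \frac{1}{1 + e^{-x}} = \sigma(x),$$
so softplus is an antiderivative of $\sigma$, and because $\sigma(x) > 0$ for all $x$, softplus is strictly increasing and convex.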
Rectifier and softplus activation functions. The softplus is a smooth version of the rectifier.