In optics, vergence is the angle formed by rays of light that are not perfectly parallel to one another. Rays that move closer to the optical axis as they propagate are said to be converging, while rays that move away from the axis are diverging. Because these imaginary rays are always perpendicular to the wavefront of the light, the vergence of the light is directly related to the radii of curvature of the wavefronts. A convex lens or concave mirror will cause parallel rays to focus, converging toward a point. Beyond that focal point, the rays diverge. Conversely, a concave lens or convex mirror will cause parallel rays to diverge.
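Vergence is commonly quantified in dioptres. As an illustrative statement of the usual convention, writing $n$ for the refractive index of the medium and $r$ for the signed distance, in metres, from the wavefront to the point where the rays intersect:

\[ V = \frac{n}{r} \]

Under this convention, converging light has positive vergence, diverging light has negative vergence, and perfectly parallel rays, whose wavefronts are flat, have a vergence of zero.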
Light does not actually consist of imaginary rays, and light sources are not single-point sources, so vergence is typically limited to simple ray modeling of optical systems. In a real system, the vergence is determined by the diameter of the light source, its distance from the optics, and the curvature of the optical surfaces. An increase in curvature increases the vergence and decreases the focal length, making the image or spot size (waist diameter) smaller. Likewise, a decrease in curvature decreases the vergence, resulting in a longer focal length and a larger image or spot diameter. This reciprocal relationship between vergence, focal length, and waist diameter is constant throughout an optical system and is referred to as the optical invariant. A beam that is expanded to a larger diameter will have a lower degree of divergence, but if condensed to a smaller diameter the divergence will be greater.
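The invariant can be stated compactly for the beam case just described. Writing $d$ for the beam diameter and $\theta$ for its divergence angle (symbols introduced here for illustration), an ideal lossless system preserves their product:

\[ d_1\,\theta_1 = d_2\,\theta_2 \]

so expanding a beam to twice its diameter halves its divergence, while condensing it to half its diameter doubles the divergence.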
The simple ray model fails in some situations, such as for laser light, where Gaussian beam analysis must be used instead.
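For example, a fundamental-mode Gaussian beam of wavelength $\lambda$ focused to a waist radius $w_0$ diverges in the far field with half-angle

\[ \theta = \frac{\lambda}{\pi w_0} \]

a diffraction limit the ray model cannot reproduce, since rays alone would permit a perfectly collimated beam of zero divergence.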