A photometer is an instrument that measures the strength of electromagnetic radiation in the range from ultraviolet to infrared and including the visible spectrum. Most photometers convert light into an electric current using a photoresistor, photodiode, or photomultiplier.
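As a simple illustration of this light-to-current conversion, the sketch below estimates irradiance from a photodiode's photocurrent using an assumed responsivity and active area; both values are hypothetical placeholders, not the specifications of any particular device.

```python
# Sketch: estimating irradiance from a photodiode's photocurrent.
# The responsivity and active area below are illustrative assumptions,
# not specifications of a real photodiode.

def irradiance_from_photocurrent(photocurrent_a, responsivity_a_per_w, area_m2):
    """Convert measured photocurrent (A) to irradiance (W/m^2).

    Optical power P = I / R, where R is the responsivity in A/W;
    irradiance E = P / A for a detector of active area A in m^2.
    """
    optical_power_w = photocurrent_a / responsivity_a_per_w
    return optical_power_w / area_m2

# Example: 1.2 uA of photocurrent, assumed responsivity of 0.5 A/W
# near 550 nm, assumed active area of 1 mm^2.
print(irradiance_from_photocurrent(1.2e-6, 0.5, 1e-6))  # ~2.4 W/m^2
```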
Photometers measure quantities such as illuminance, irradiance, light absorption, scattering, reflection, fluorescence, phosphorescence, and luminescence.
Historically, photometry was done by estimation, comparing the luminous flux of a source with a standard source. By the 19th century, common photometers included Rumford's photometer, which compared the depths of shadows cast by different light sources, and Ritchie's photometer, which relied on equal illumination of surfaces. Another type was based on the extinction of shadows.
Modern photometers use photoresistors, photodiodes, or photomultipliers to detect light. Some models employ photon counting, measuring light by registering individual photons; these are especially useful where the irradiance is very low. Photometers have wide-ranging applications, including photography, where they determine the correct exposure, and science, where they are used in absorption spectroscopy to calculate the concentration of substances in solution, in infrared spectroscopy to study the structure of substances, and in atomic absorption spectroscopy to determine the concentration of metals in solution.
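As a brief illustration of how a photometer reading yields a concentration in absorption spectroscopy, the sketch below applies the Beer-Lambert law, A = ε·l·c; the transmittance, molar absorptivity, and path length are placeholder values chosen for the example, not data for any particular substance.

```python
import math

# Sketch: Beer-Lambert law, A = epsilon * l * c, rearranged for concentration.
# The intensities, molar absorptivity, and path length below are
# illustrative assumptions only.

def absorbance(incident_intensity, transmitted_intensity):
    """Absorbance A = -log10(I / I0)."""
    return -math.log10(transmitted_intensity / incident_intensity)

def concentration(absorbance_value, molar_absorptivity_l_per_mol_cm, path_length_cm):
    """Concentration c = A / (epsilon * l), in mol/L."""
    return absorbance_value / (molar_absorptivity_l_per_mol_cm * path_length_cm)

# Example: the photometer measures 25% transmittance through a 1 cm cuvette
# for a species with an assumed molar absorptivity of 1.5e4 L/(mol*cm).
A = absorbance(incident_intensity=1.0, transmitted_intensity=0.25)  # ~0.602
print(concentration(A, molar_absorptivity_l_per_mol_cm=1.5e4, path_length_cm=1.0))
# ~4.0e-5 mol/L
```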