In chemical analysis, the internal standard method involves adding the same amount of a chemical substance to each sample and calibration solution. The internal standard responds proportionally to changes in the analyte and provides a similar, but not identical, measurement signal. It must also be absent from the sample matrix, so that the added amount is its only source. Plotting the ratio of the analyte signal to the internal standard signal against the analyte concentrations of the calibration solutions yields a calibration curve, which can then be used to calculate the analyte concentration in an unknown sample.[1]
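A minimal sketch of this workflow is shown below, assuming a linear calibration and using entirely illustrative signal and concentration values (none of the numbers come from the text): the signal ratio is fitted against the standard concentrations, and the fit is inverted for an unknown sample.

```python
import numpy as np

# Hypothetical calibration data (illustrative values only):
# analyte concentrations of the calibration solutions and the measured
# signals of the analyte and the internal standard.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])            # analyte concentration
analyte_signal = np.array([0.21, 0.40, 1.02, 2.05, 4.10])
is_signal = np.array([1.00, 0.98, 1.03, 1.01, 0.99])     # internal standard signal

# Calibration curve: signal ratio versus analyte concentration,
# fitted with a straight line.
ratio = analyte_signal / is_signal
slope, intercept = np.polyfit(conc, ratio, 1)

# Unknown sample: form the same ratio and invert the calibration line.
unknown_ratio = 1.55 / 1.02
unknown_conc = (unknown_ratio - intercept) / slope
print(f"Estimated analyte concentration: {unknown_conc:.2f}")
```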
Selecting an appropriate internal standard compensates for random and systematic sources of uncertainty that arise during sample preparation or from instrument fluctuations, because the ratio of the analyte signal to the internal standard signal is independent of these variations. If the measured value of the analyte is erroneously shifted above or below its actual value, the internal standard measurement shifts in the same direction.[1]
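The cancellation can be illustrated with a short sketch, again using purely hypothetical numbers: a proportional loss (for example, from an inconsistent injection volume) scales both signals by the same factor, leaving the ratio unchanged.

```python
# Illustrative only: a hypothetical 20 % proportional loss during sample
# preparation scales both signals by the same factor, so the
# analyte / internal-standard ratio is unaffected.
true_analyte_signal = 2.05
true_is_signal = 1.01
loss_factor = 0.8   # assumed common scaling applied to both signals

measured_analyte = true_analyte_signal * loss_factor
measured_is = true_is_signal * loss_factor

print(true_analyte_signal / true_is_signal)   # ratio without the loss
print(measured_analyte / measured_is)         # same ratio despite the loss
```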
The ratio plot compensates well for variations in detector sensitivity, but it can be biased. When the response variability is instead caused by differing masses of the analysed sample and the conventional (non-internal-standard) calibration curve of an analyte is not linear through the origin, relative concentration/relative calibration calculations should be used instead.[2]