Quantum limit

A quantum limit in physics is a limit on measurement accuracy at quantum scales.[1] Depending on the context, the limit may be absolute (such as the Heisenberg limit), or it may only apply when the experiment is conducted with naturally occurring quantum states (e.g. the standard quantum limit in interferometry) and can be circumvented with advanced state preparation and measurement schemes.

The usage of the term standard quantum limit (SQL) is, however, broader than interferometry alone. In principle, any linear measurement of a quantum mechanical observable of a system under study that does not commute with itself at different times leads to such limits. In short, the underlying cause is the Heisenberg uncertainty principle.
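For instance (a standard textbook illustration, stated here as an addition rather than drawn from the cited references), the position of a free mass $m$ does not commute with itself at different times:

$$\left[\hat{x}(t_1),\, \hat{x}(t_2)\right] = \frac{i\hbar}{m}\,(t_2 - t_1) \neq 0\,,$$

which follows from the free evolution $\hat{x}(t) = \hat{x}(0) + \hat{p}(0)\,t/m$ and $[\hat{x}(0), \hat{p}(0)] = i\hbar$; a precise position measurement at $t_1$ therefore disturbs the conjugate momentum and degrades a position measurement at $t_2$.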

[Figure: a schematic description of how a physical measurement process is described in quantum mechanics]

In more detail, any measurement in quantum mechanics involves at least two parties, an Object and a Meter. The former is the system whose observable, say $\hat{x}$, we want to measure. The latter is the system we couple to the Object in order to infer the value of $\hat{x}$ of the Object by recording some chosen observable, $\hat{\mathcal{O}}$, of this system, e.g. the position of a pointer on a scale of the Meter. This, in a nutshell, is a model of most of the measurements happening in physics, known as indirect measurements (see pp. 38–42 of [1]). Any measurement is thus the result of an interaction, and an interaction acts in both ways. Therefore, the Meter acts on the Object during each measurement, usually via the quantity $\hat{\mathcal{F}}$ conjugate to the readout observable $\hat{\mathcal{O}}$, perturbing the value of the measured observable $\hat{x}$ and modifying the results of subsequent measurements. This is known as the quantum back action of the Meter on the system under measurement.
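As a toy illustration of this Object–Meter picture (a sketch with assumed numbers, not a model taken from the references), the snippet below draws a readout carrying the Object's value plus imprecision noise, while the conjugate back-action kick is made to grow as the imprecision shrinks:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

hbar = 1.0          # work in units where hbar = 1
x_free = 1.0        # Object's value absent any coupling (assumed)
imprecision = 0.05  # standard deviation of the readout noise (assumed)

# Heisenberg-bound back action: the kick's spread scales as 1/imprecision
back_action_std = hbar / (2.0 * imprecision)

n = 100_000
readout = x_free + rng.normal(0.0, imprecision, n)  # x_free + delta_O
kicks = rng.normal(0.0, back_action_std, n)         # force F acting back on the Object

print(f"mean readout        : {readout.mean():.4f} (estimates x_free = {x_free})")
print(f"readout noise       : {readout.std():.4f} (the imprecision)")
print(f"back-action kick std: {kicks.std():.2f} (perturbs later measurements)")
```

Halving the assumed imprecision doubles the spread of the kicks, which is the trade-off the following paragraphs quantify.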

At the same time, quantum mechanics prescribes that the readout observable of the Meter should have an inherent uncertainty, $\delta\hat{\mathcal{O}}$, additive to and independent of the value of the measured quantity $\hat{x}$. This is known as measurement imprecision or measurement noise. Because of the Heisenberg uncertainty principle, this imprecision cannot be arbitrarily small and is linked to the back-action perturbation by the uncertainty relation:

$$\Delta\hat{\mathcal{O}} \cdot \Delta\hat{\mathcal{F}} \geqslant \hbar/2\,,$$

where $\Delta\hat{a} = \sqrt{\langle\hat{a}^2\rangle - \langle\hat{a}\rangle^2}$ is the standard deviation of the observable $\hat{a}$, and $\langle\hat{a}\rangle$ stands for the expectation value of $\hat{a}$ in whatever quantum state the system is in. The equality is reached if the system is in a minimum-uncertainty state. The consequence for our case is that the more precise the measurement, i.e. the smaller $\Delta\hat{\mathcal{O}}$, the larger the perturbation the Meter exerts on the measured observable $\hat{x}$.
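As a quick numerical check of the equality case (an illustration using the canonical position–momentum pair, not an example from the cited texts), a Gaussian wave packet is a minimum-uncertainty state, so its uncertainty product should come out at $\hbar/2$:

```python
import numpy as np

hbar = 1.0                  # units where hbar = 1
N = 4096
x = np.linspace(-50.0, 50.0, N)
dx = x[1] - x[0]
sigma = 2.0                 # packet width (assumed)

# Gaussian wave packet, normalized so that sum |psi|^2 dx = 1
psi = np.exp(-x**2 / (4.0 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# position spread
prob_x = np.abs(psi)**2
mean_x = np.sum(x * prob_x) * dx
std_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x) * dx)

# momentum spread via the discrete Fourier transform
p = 2.0 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
dp = 2.0 * np.pi * hbar / (N * dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2.0 * np.pi * hbar)
prob_p = np.abs(phi)**2
mean_p = np.sum(p * prob_p) * dp
std_p = np.sqrt(np.sum((p - mean_p)**2 * prob_p) * dp)

print(f"std_x * std_p = {std_x * std_p:.6f} (expect hbar/2 = {hbar / 2})")
```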

The readout of the Meter will therefore, in general, consist of three terms:

$$\hat{\mathcal{O}} = \hat{x}_{\mathrm{free}} + \delta\hat{\mathcal{O}} + \delta\hat{x}_{BA}[\hat{\mathcal{F}}]\,,$$

where $\hat{x}_{\mathrm{free}}$ is the value of $\hat{x}$ that the Object would have had were it not coupled to the Meter, and $\delta\hat{x}_{BA}[\hat{\mathcal{F}}]$ is the perturbation to the value of $\hat{x}$ caused by the back-action force $\hat{\mathcal{F}}$. The uncertainty of the latter is proportional to $\Delta\hat{\mathcal{F}} \propto \Delta\hat{\mathcal{O}}^{-1}$. Thus, there is a minimal value, or limit, to the precision one can reach in such a measurement, provided that $\delta\hat{\mathcal{O}}$ and $\hat{\mathcal{F}}$ are uncorrelated.[2][3]
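The appearance of this minimum can be made concrete with a toy calculation (the response factor $\kappa$ converting the back-action force into readout displacement is an assumption for illustration, not a quantity from the references): sweeping the imprecision while the back action sits at the Heisenberg bound yields a total error with a floor of $\sqrt{\kappa\hbar}$:

```python
import numpy as np

hbar = 1.0
kappa = 1.0  # assumed response factor: back-action force -> readout displacement

delta_O = np.logspace(-2, 2, 2001)  # imprecision, swept over four decades
delta_F = hbar / (2.0 * delta_O)    # back action at the Heisenberg bound
total = np.sqrt(delta_O**2 + (kappa * delta_F)**2)

i = np.argmin(total)
print(f"optimal imprecision : {delta_O[i]:.4f} (expect sqrt(kappa*hbar/2) ~ 0.7071)")
print(f"minimum total error : {total[i]:.4f} (the toy model's floor, sqrt(kappa*hbar) = 1)")
```

Dropping the assumption that $\delta\hat{\mathcal{O}}$ and $\hat{\mathcal{F}}$ are uncorrelated is precisely what allows the advanced state preparation and measurement schemes mentioned in the opening paragraph to circumvent this floor.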

The terms "quantum limit" and "standard quantum limit" are sometimes used interchangeably. Usually, "quantum limit" is a general term which refers to any restriction on measurement due to quantum effects, while the "standard quantum limit" in any given context refers to a quantum limit which is ubiquitous in that context.

  1. Braginsky, V. B.; Khalili, F. Ya. (1992). Quantum Measurement. Cambridge University Press. ISBN 978-0521484138.
  2. Danilishin, S. L.; Khalili, F. Ya. (2012). "Quantum Measurement Theory in Gravitational-Wave Detectors". Living Reviews in Relativity. 15 (5): 60. arXiv:1203.1706. Bibcode:2012LRR....15....5D. doi:10.12942/lrr-2012-5. PMC 5256003. PMID 28179836.
  3. Chen, Yanbei (2013). "Macroscopic quantum mechanics: theory and experimental concepts of optomechanics". J. Phys. B: At. Mol. Opt. Phys. 46 (10): 104001. arXiv:1302.1924. Bibcode:2013JPhB...46j4001C. doi:10.1088/0953-4075/46/10/104001. S2CID 118570800.