In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X, and its maximum possible value log(|A_X|), where A_X is the alphabet of X.[1][2] Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.
The redundancy measures the fractional difference between H(X) and its maximum possible value log(|A_X|):

R = 1 − H(X) / log(|A_X|).
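As a minimal sketch of this definition, the snippet below computes the entropy of a discrete distribution and its redundancy relative to the maximum entropy log₂(|A_X|) of a uniform distribution over the same alphabet (the function names `entropy` and `redundancy` are illustrative, not from any particular library):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def redundancy(p):
    """Redundancy R = 1 - H(X)/log2(|A_X|), the fraction of 'wasted'
    capacity compared with a uniform source over the same alphabet."""
    h_max = math.log2(len(p))  # maximum possible entropy for this alphabet size
    return 1 - entropy(p) / h_max

# A fair coin achieves maximum entropy, so its redundancy is 0;
# a biased coin (H(X) ≈ 0.811 bits out of a possible 1 bit) is redundant.
fair = redundancy([0.5, 0.5])
biased = redundancy([0.75, 0.25])
```

A uniform source gives R = 0, and R approaches 1 as the distribution concentrates on a single symbol, matching the informal reading of redundancy as wasted transmission capacity.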