Shannon's theorem

Shannon's theorem, which concerns information entropy, was proved in 1948 by Claude Shannon. It gives the theoretical maximum rate at which data can be transmitted over a noisy channel with an arbitrarily small probability of error. That any such nonzero rate could exist was considered quite surprising at the time, since no scheme was known that could achieve such reliable communication; with this result, information theory as we know it today was born.

The most famous special case is the bandwidth-limited, power-constrained channel in the presence of Gaussian noise, for which the capacity is usually expressed in the form C = W log2(1 + S/N), where C is the channel capacity in bits per second, W is the bandwidth in hertz, and S/N is the ratio of signal power to noise power.
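
As a minimal sketch, the formula translates directly into a few lines of Python; the function name channel_capacity and its argument names are illustrative only, not part of any standard library.

    import math

    def channel_capacity(bandwidth_hz, snr_linear):
        # C = W * log2(1 + S/N), in bits per second
        return bandwidth_hz * math.log2(1 + snr_linear)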

It should be emphasized that the signal-to-noise ratio in this equation is a plain power ratio, not a value in decibels: a figure quoted as x dB must first be converted to 10^(x/10), and it is the power ratio, not the amplitude ratio, that belongs in the formula.
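
A short sketch of that conversion in Python; the helper name snr_db_to_power_ratio is illustrative only.

    # Convert a signal-to-noise figure quoted in decibels to the linear
    # power ratio that the capacity formula expects.
    def snr_db_to_power_ratio(snr_db):
        return 10 ** (snr_db / 10)

    print(snr_db_to_power_ratio(30))  # 1000.0 -- the power ratio
    print(10 ** (30 / 20))            # ~31.6  -- the amplitude ratio, not what the formula wants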

As an example, consider the operation of a modem on an ordinary telephone line. The ratio of signal power to noise power is about 1000 (30 dB). The bandwidth is limited to at most 3400 Hz. Therefore:

C = 3400 log2(1 + 1000) ≈ (3400)(9.97) ≈ 34,000 bps
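
The same arithmetic can be checked in Python; this is a standalone calculation, not tied to any modem software.

    import math

    # Telephone-line example: W = 3400 Hz, S/N = 1000 (30 dB)
    capacity = 3400 * math.log2(1 + 1000)
    print(round(capacity))  # 33889, i.e. roughly 34,000 bits per second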

The V.34 modem standard advertises a rate of 33.6 kbps, and V.90 claims a rate of 56 kbps, apparently in excess of the Shannon capacity computed above. In fact, neither standard actually reaches the Shannon limit: both use compression, and the advertised bit rates are computed from the size of the uncompressed data rather than from what is actually sent over the line.
