Channel capacity

Channel capacity, often shown as "C" in communication formulas, is the maximum number of discrete bits of information that a defined segment of a communications medium can carry. A telephone wire, for example, may be considered a channel in this sense. Breaking the frequency bandwidth up into smaller sub-segments and using each of them to carry separate communications reduces the number of bits of information that each sub-segment can carry; the total number of bits that the entire wire may carry is not expanded by breaking it into smaller sub-segments.
In practice, this sub-segmentation actually reduces the total amount of information that the wire can carry, because additional overhead information is required to distinguish the sub-segments from each other.
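As a hypothetical worked example, under the Shannon-Hartley formula discussed below, a 4,000 Hz channel with a signal-to-noise power ratio of 1,023 (about 30 dB) has a capacity of 4,000 x log2(1 + 1,023) = 40,000 bits per second. Splitting it into four ideal 1,000 Hz sub-segments with the same signal-to-noise ratio gives four channels of 10,000 bits per second each, for the same 40,000-bit-per-second total, and that is before any overhead needed to separate the sub-segments is subtracted.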
Information theory, developed by Claude E. Shannon in 1948, provides a mathematical model for computing the maximum amount of information that a given channel can carry.
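For the common case of a channel limited by random (Gaussian) noise, Shannon's model takes the well-known Shannon-Hartley form C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N is the signal-to-noise power ratio. The following short Python sketch illustrates the computation; the function name and the telephone-line figures are illustrative assumptions, not part of the original definition.

    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        # Shannon-Hartley capacity in bits per second:
        # C = B * log2(1 + S/N), with S/N as a linear power ratio.
        return bandwidth_hz * math.log2(1.0 + snr_linear)

    # Illustrative telephone-line figures: roughly 3,100 Hz of usable
    # bandwidth and a 30 dB signal-to-noise ratio (S/N = 1,000).
    snr = 10 ** (30 / 10)
    print(round(shannon_capacity(3100, snr)))  # about 30,898 bits per second

A capacity in this range is why analog telephone-line modems topped out near 30 to 35 kilobits per second: they were approaching the theoretical limit of the voice channel.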
Two additional considerations apply. First, a channel can be a light beam, radio waves, a specified bandwidth, a book, or even elementary particles from whose state information may be gleaned.
Second, an analog signal, such as the human voice, can be expressed in terms of the smallest differences that can be detected, and these differences can be counted and treated as "bits" of information.
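Standard digital telephony illustrates this: the voice signal is sampled 8,000 times per second and each sample is represented with 8 bits, so one conversation becomes a stream of 8,000 x 8 = 64,000 bits of information per second.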
See also: information theory, redundancy, Shannon capacity