Claude Shannon

Computer science · American · 1916–2001

Father of information theory

Quotes by Claude Shannon

The capacity of a noisy channel is the maximum rate at which information can be transmitted over it with arbitrarily small error probability.

A Mathematical Theory of Communication 1948

It is not necessary to understand the meaning of the message in order to define its information content.

A Mathematical Theory of Communication 1948

The significant aspect is not the individual message as such, but the statistical characteristics of the ensemble of messages.

A Mathematical Theory of Communication 1948

The amount of information is measured by the number of binary digits necessary to specify the message.

A Mathematical Theory of Communication 1948

Redundancy is the fraction of the message which is determined by the rules of the language or by statistical probabilities.

A Mathematical Theory of Communication 1948
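Redundancy in this sense can be computed as one minus the ratio of a source's actual entropy to its maximum possible entropy. A small sketch in Python; the one-bit-per-letter figure for printed English is an approximation in the spirit of Shannon's later estimates, used here only for illustration:

```python
import math

def redundancy(entropy_bits_per_symbol, alphabet_size):
    """Fraction of the message fixed by statistics: 1 - H / log2(alphabet size)."""
    return 1 - entropy_bits_per_symbol / math.log2(alphabet_size)

# If English carried roughly 1 bit per letter, against log2(26) ≈ 4.7 bits
# for equiprobable, independent letters, most of each message is redundant.
print(redundancy(1.0, 26))  # ≈ 0.79
```

A source already at maximum entropy has zero redundancy: every symbol is a free choice, and nothing can be compressed away.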

Noise is any unwanted addition to the signal.

A Mathematical Theory of Communication 1948

The entropy of a discrete set of probabilities is a measure of the average uncertainty of the outcome.

A Mathematical Theory of Communication 1948
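The entropy referred to above is H = -Σ p·log2(p), taken over the probabilities of the possible outcomes. A minimal sketch in Python (the coin probabilities are illustrative, not from the quote):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))  # ≈ 0.47 bits: a biased coin is more predictable
```

Entropy is largest when all outcomes are equally likely, and falls toward zero as the outcome becomes certain.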

The channel capacity C is given by C = W log2(1 + P/N).

A Mathematical Theory of Communication 1948
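In that formula W is the channel bandwidth, P the average signal power, and N the noise power; this is the Shannon–Hartley capacity for a band-limited channel with Gaussian noise. A quick sketch in Python, with telephone-grade numbers chosen only as an example:

```python
import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley capacity in bits per second: C = W * log2(1 + P/N)."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A roughly telephone-grade channel: 3000 Hz bandwidth, 30 dB SNR (P/N = 1000).
print(channel_capacity(3000, 1000, 1))  # ≈ 29900 bits per second
```

Capacity grows linearly with bandwidth but only logarithmically with signal-to-noise ratio, which is why widening the band is usually cheaper than boosting power.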

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.

A Mathematical Theory of Communication 1948

The amount of information is a logarithmic measure of the number of available choices.

A Mathematical Theory of Communication 1948
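Concretely: specifying one choice among n equally likely alternatives takes log2(n) binary digits. A two-line illustration in Python:

```python
import math

def bits_for_choices(n):
    """Binary digits needed to single out one of n equally likely messages."""
    return math.log2(n)

print(bits_for_choices(2))   # 1.0: one binary digit picks between two messages
print(bits_for_choices(64))  # 6.0
```

Doubling the number of available choices adds exactly one bit, which is what makes the logarithm the natural measure.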

Information theory is not about meaning, but about the statistical properties of symbols.

A Mathematical Theory of Communication 1948

The theory is largely a statistical theory, and it is not concerned with the meaning of messages.

A Mathematical Theory of Communication 1948

A message is a sequence of symbols.

A Mathematical Theory of Communication 1948

The redundancy of a language is a measure of the extent to which it is predictable.

A Mathematical Theory of Communication 1948

The true measure of information is the reduction of uncertainty.

A Mathematical Theory of Communication 1948


Information is a measure of the unexpectedness of a message.

A Mathematical Theory of Communication 1948

The more probable the message, the less information it conveys.

A Mathematical Theory of Communication 1948
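This inverse relationship between probability and information is captured by the self-information (surprisal) of a single message, I(p) = -log2(p). A minimal sketch in Python:

```python
import math

def self_information(p):
    """Surprisal in bits: I(p) = -log2(p); rarer messages carry more bits."""
    return -math.log2(p)

print(self_information(0.5))   # 1.0 bit
print(self_information(0.01))  # ≈ 6.64 bits: a rare message is far more informative
```

A message that is certain (p = 1) carries zero bits; averaging surprisal over all messages of a source gives back its entropy.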

A communication system consists of five parts: an information source, a transmitter, a channel, a receiver, and a destination.

A Mathematical Theory of Communication 1948

The purpose of the transmitter is to encode the message into a signal suitable for transmission over the channel.

A Mathematical Theory of Communication 1948