Claude Shannon
Father of information theory
Quotes by Claude Shannon
The theory of information is a mathematical theory, not a philosophical one.
I was always interested in puzzles and things like that.
I think the most important thing is to have fun.
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.
The capacity of a channel is the maximum rate at which information can be transmitted over it with arbitrarily small probability of error.
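As a concrete instance of this idea, the Shannon–Hartley theorem gives the capacity of a band-limited channel with additive white Gaussian noise as C = B·log2(1 + S/N). The bandwidth and SNR values below are illustrative examples, not taken from the quote; this is a minimal sketch of the formula.

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Capacity in bits/second of an AWGN channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 3 kHz telephone-style channel at 30 dB SNR (S/N = 1000).
capacity = shannon_hartley_capacity(3000, 1000)
print(f"{capacity:.0f} bits/s")  # about 29,902 bits/s
```

Below this rate, Shannon's theorem guarantees coding schemes exist with arbitrarily small error probability; above it, reliable transmission is impossible.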
The entropy of a source is a measure of the average information it produces.
The concept of information is a quantitative one.
The theory of communication is concerned with the transmission of information, not its meaning.
I'm not a philosopher, I'm a scientist.
I've always been interested in how things work.
Information is a measure of the freedom of choice in selecting a message.
Entropy is a measure of the uncertainty or randomness in a system.
The redundancy of ordinary English is roughly 50 percent.
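Redundancy can be estimated as 1 − H/H_max, where H_max = log2(26) for the English alphabet. The sketch below uses only single-letter frequencies on a short sample, which understates the roughly 50 percent figure; Shannon's estimate rests on longer-range statistics (digrams, words) that a unigram model cannot see.

```python
import math
from collections import Counter

def unigram_redundancy(text: str) -> float:
    """Redundancy 1 - H/H_max estimated from single-letter frequencies only."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1 - h / math.log2(26)

sample = ("the fundamental problem of communication is that of reproducing "
          "at one point either exactly or approximately a message selected "
          "at another point")
print(f"{unigram_redundancy(sample):.2f}")
```

The result is well below 0.5: most of English's redundancy lives in the dependencies between letters, not in the letter frequencies themselves.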
Noise is not a bug; it's a feature of communication.
The bit is the fundamental unit of information.
In chess, the computer will always beat humans, but in go, it's a different story.
The mouse is a clean device; it doesn't accumulate dirt like keyboards.
I just did some calculations that show that the human race is doomed.