But even before that paper, he had made a name for himself at MIT. His Master’s thesis there, “A Symbolic Analysis of Relay and Switching Circuits,” pointed out that the logical values true and false could be represented by one and zero, and that this would allow physical relays to perform logical calculations. Many have called it the most important Master’s thesis of the twentieth century.
The thesis was a profound breakthrough for its time, written a decade before the first electronic computer components were developed. Shannon showed that a machine could be made to perform logical operations, not just mathematical calculations. This made him the first to realize that a machine could be made to mimic the actions of human thought, and some call the thesis the genesis of artificial intelligence. It provided the push to develop computers by making it clear that machines could do far more than merely calculate.
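Shannon’s core observation can be sketched in a few lines. This is a toy illustration, not code from his thesis: it models switch contacts as Boolean values, with series wiring acting as AND and parallel wiring as OR, and builds a more complex function (XOR, a name chosen here for illustration) from those primitives.

```python
# Shannon's insight, sketched: series and parallel switch circuits
# compute Boolean AND and OR. True = closed contact (current flows),
# False = open contact.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

def relay_not(a: bool) -> bool:
    """A normally-closed relay contact inverts its coil input: NOT."""
    return not a

def xor(a: bool, b: bool) -> bool:
    """Built purely from the circuit primitives: (a OR b) AND NOT (a AND b)."""
    return series(parallel(a, b), relay_not(series(a, b)))

# Print the full truth table:
for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))
```

From these three circuit forms any logical expression can be wired up, which is exactly why relays could carry out “thinking” and not just arithmetic.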
Shannon joined Bell Labs as WWII was looming and went to work immediately on military projects such as cryptography and fire-control systems for antiaircraft guns. In his spare time, he worked on an idea he referred to as a fundamental theory of communication. He saw that it was possible to quantify information by the use of binary digits.
The resulting paper was one of those rare breakthroughs in science that are genuinely original rather than refinements of earlier work. Shannon saw information in a way that nobody else ever had: he showed that it could be quantified in a very precise way. His paper also contains the first published use of the word ‘bit’ to describe a discrete piece of information.
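The precise measure Shannon proposed is entropy, the average number of bits needed per symbol from an information source. As a toy illustration (not taken from the article), the formula H = −Σ p·log₂(p) can be computed directly:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss:
print(entropy_bits([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less:
print(entropy_bits([0.9, 0.1]))   # ~0.469
# A certain outcome carries no information at all:
print(entropy_bits([1.0]))        # 0.0
```

The point of the measure is that information is a quantity, like length or weight, and can therefore be counted, compressed, and transmitted with known limits.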
For those who might be interested, a copy of the paper is here. I read it many years ago and still find it well worth reading. It is so clearly written that it is still used today to teach at MIT.
What Shannon had done was show how we could measure and quantify the world around us. He made it clear how measurable data could be captured precisely and then transmitted without losing any of that precision. Since the work was done at Bell Labs, one of the first applications was to telephone signals. In the lab, engineers converted a voice signal into a digital code of 1s and 0s, transmitted it, and decoded it at the other end. The results were just as predicted: the voice that came out at the receiving end was as good as what went in at the transmitting end. Until that time, voice signals had been analog, which meant that any interference on the line between callers degraded the quality of the call.
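The conversion of a voice signal into 1s and 0s boils down to sampling and quantization. A minimal sketch, with parameter values (8 kHz sampling, 8-bit codes, as in classic telephone digitization) chosen for illustration rather than taken from the article:

```python
import math

def digitize(samples, bits=8):
    """Quantize samples in [-1, 1] to signed integer codes of the given width."""
    levels = 2 ** (bits - 1) - 1          # 127 levels each side for 8 bits
    return [round(s * levels) for s in samples]

def reconstruct(codes, bits=8):
    """Map integer codes back to approximate sample values."""
    levels = 2 ** (bits - 1) - 1
    return [c / levels for c in codes]

# Sample a short stretch of a 1 kHz tone at 8 kHz:
samples = [math.sin(2 * math.pi * 1000 * n / 8000) for n in range(8)]
codes = digitize(samples)
restored = reconstruct(codes)

# The round trip is exact to within one quantization step,
# regardless of any noise the integer codes survived in transit:
assert all(abs(a - b) <= 1 / 127 for a, b in zip(samples, restored))
```

This is why digital transmission beat analog: line noise can flip a waveform’s shape, but as long as the receiver can still tell a 1 from a 0, the decoded signal is identical to what was sent, up to the fixed quantization step.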
But of course, voice is not the only thing that can be encoded as digital signals, and as a society we have converted just about everything imaginable into 1s and 0s. Over time we applied digital coding to music, pictures, film, and text, and today everything on the Internet has been digitized.
The world reacted quickly to Shannon’s paper, and accolades came from everywhere. Within two years, scientists across many fields were talking about information theory and applying it to their own research. Shannon was never comfortable with the fame the paper brought, and he slowly withdrew from public life. He left Bell Labs and returned to teach at MIT, but he gradually withdrew from there as well and stopped teaching by the mid-1960s.
We owe a huge debt to Claude Shannon. His original insight gave rise to the components that let computers ‘think’, which gave a push to the nascent computer industry and was the genesis of the field of artificial intelligence. He also developed information theory, the basis for everything digital we do today. His work was unique and probably has more real-world applications than anything else developed in the twentieth century.