Claude Shannon: have you ever heard the name? How about Isaac Newton, Albert Einstein, and Charles Darwin? Those three names are universally familiar to the general public, even though all but a small segment of the population would find it difficult to elaborate significant details of the work that made them immortal in scientific history. In Shannon’s case, his name, his face, his genius, and his immense impact on our world are all virtually unknown, and thus unappreciated, outside the realms of mathematics and electrical engineering. Claude Shannon is the “father of information and communication theory” and is primarily responsible for the vast networks of computers, data processing, and mass communication that power modern society. It is my intention, here, to do at least minimal justice to his rightful legacy among the great minds of mathematics, science, and engineering.
Shannon’s contributions are numerous and varied, but a closer look reveals that the central theme of most of them is well characterized by the most famous of his many publications, A Mathematical Theory of Communication, which appeared in 1948. Most of Shannon’s published papers were issued under the imprimatur of the Bell System Technical Journal. Bell Labs had a long and illustrious run as an incredible incubator for many of the most important math, science, and engineering advancements in America during the twentieth century. Accordingly, many of the country’s top minds were associated with the Lab and its activities. Claude Shannon was one of them.
All Information Can Be Represented By Data 1’s and 0’s!
Have you ever marveled at the fact that modern computers can store and reproduce any-and-all information – text, audio, color pictures, and movies – using only organized collections of data 1’s and 0’s? Think of it! A modern computer is little more than a collection of millions of microscopic electronic switches (think light switches) which reside either in an “on” state (a data 1) or an “off” state (a data 0). If that reality has never occurred to you, pause for a few moments and reflect on the enormity of the fact that anything and everything called “media” can be displayed on-command by calling-up organized collections of data 1’s and 0’s which reside in the bowels of your personal computer! In addition, the computer’s “logical intelligence” – its ability to respond to your commands – also resides in the machine’s memory bank in the form of data 1’s and 0’s. By the nineteen-thirties, Claude Shannon was among the computing pioneers who understood the possibilities emerging from the burgeoning progress of electronics. Mechanical computation itself reaches back to the eighteen-thirties, when Charles Babbage designed and built his first bulky, mechanical computing machines – although Babbage’s engines worked in decimal; the fully binary (or two-state) approach awaited electrical switching.
Today, in our everyday lives, we use the decimal number system, which is inherently unsuited to computers because it requires each digit in a number to assume one of ten states, 0 through 9. Modern computers are designed around the binary (or two-state) system, in which each digit in a binary number assumes a value of either one or zero. A simple light switch or an electronic relay (open or closed) is an example of a simple two-state device which can represent any single digit in a binary number. In actuality, the two-state devices in modern computers are implemented using millions of microscopic, individual solid-state transistors which can be switched either “on” or “off.” For electronic implementation, the binary number system, requiring only simple two-state devices (or switches), is the natural choice.
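The idea can be seen in a few lines of code. The sketch below (the number 181 is an arbitrary illustration) shows how a decimal value reduces to a row of two-state “switches,” and how the value is recovered by summing the weight of each “on” switch:

```python
# A minimal sketch: any decimal number can be held as a row of two-state
# "switches" (bits). Python's built-in bin() shows the pattern directly.
number = 181                       # an arbitrary decimal value
bits = bin(number)[2:]             # '10110101' -- each digit is one switch
print(bits)

# Reconstruct the decimal value by summing the weight of each "on" switch:
# the k-th bit from the right carries a weight of 2**k.
value = sum(2**k for k, bit in enumerate(reversed(bits)) if bit == "1")
print(value)                       # 181 -- the original number recovered
```

The same positional logic underlies the decimal system; the only change is that each position holds one of two states rather than one of ten.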
Shannon would be the first to admit that he was never motivated to change the world by the work he pursued. Nor was he motivated by any prospects of fame and fortune for his efforts. Rather, he was endlessly fascinated by the challenges inherent in pursuing theoretical possibilities, regardless of any possible practicality or profit stemming from his efforts. Claude Shannon’s persona had multiple facets: a genius, out-of-the-box thinker, an inveterate tinkerer and inventor of gadgets, a juggler (circus-type), and a devotee of the unicycle – a conveyance he designed, built, and rode himself! This most unusual personality forged much of the “quiet legend” which surrounded the reclusive, mysterious Mr. Shannon. Even though he was a tinkerer and builder of “toys and gadgets,” he lived for and thrived on elevated ideas – creations of the mind. In many respects, he was much like Albert Einstein in his outlooks, his rampant curiosity, and his dogged persistence, all of which were on full display as Einstein tackled the mysteries of both special and general relativity.
The Most Important Master’s Degree Thesis Ever Submitted!
In 1937, Claude Shannon submitted a thesis for his master’s degree in electrical engineering at MIT. Normally, a master’s thesis proves to be significantly less impressive in terms of originality and impact than that required for a PhD. Shannon’s master’s thesis proved to be a startling exception to the rule – the first of many unorthodoxies that characterized his unusual career. As an undergraduate at the University of Michigan, he had earned dual degrees in mathematics and electrical engineering. It was at Michigan that he learned the “new math” developed by the English mathematician George Boole and introduced to the scholarly community in 1854 under the title, An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities. This work was the most important contribution to emerge from the genius of Boole, who died much too young, from pneumonia, at the age of forty-nine.
Shannon was prescient enough to recognize that Boole’s two-valued algebra of logic uncannily lent itself to the development of real-life logical systems (computers) which could be implemented using electrical relays – binary (two-state), on/off devices which had cost, space, power, and reliability issues, but which could nevertheless demonstrate computing principles in the nineteen-thirties and forties. In simplest terms, Shannon demonstrated in his master’s thesis that, using Boolean algebra and simple two-state electrical devices, a computer could be designed to “think logically” while processing and displaying stored information.
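Shannon’s insight can be sketched in miniature. Below, each “relay” is simulated by a true/false value, and two Boolean operations are combined into a half-adder – a circuit that adds two one-bit numbers. (The half-adder here is a standard textbook illustration, not a circuit from Shannon’s thesis itself.)

```python
# Boolean algebra mapped onto two-state switches: each "relay" is a bool.
def AND(a, b):
    return a and b       # closed only when both input relays are closed

def XOR(a, b):
    return a != b        # closed when exactly one input relay is closed

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit off, carry bit on
print(half_adder(True, True))    # (False, True)
```

Chain enough of these simple switch combinations together and you have arithmetic – which is precisely the door Shannon’s thesis opened.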
Shannon’s prescient recognition led to the characterization of his thesis as “The most important Master’s thesis ever written.” Indeed, Shannon opened the doors to a new and exciting vista, one that he vigorously explored while working at AT&T’s Bell Laboratories, and later, at MIT.
Shannon Sets These Major Goals for Himself – No Small Tasks!
How do we define “information,” how do we quantify information, and how can we transmit information most efficiently and reliably through communication channels?
I suggest that the reader pause a moment and ponder the thin air in which Claude Shannon pursued his goals. How in the world does one define and quantify such an “airy” concept as “information”?
Here are some examples – the easiest entry point into Shannon’s methodology for defining and quantifying information:
When we flip a coin, we receive one data-bit of information from the outcome, according to Shannon’s math! In this case, there are only two outcome possibilities, heads or tails – two “message” possibilities, if you will. Were we to represent “heads” as a binary data “1” and tails as a binary data “0”, we can visualize and quantify the outcome of the coin flip as the resulting state (“1” or “0”) of a single “binary digit” (or “bit”) of information gained in the process of flipping the coin. In Shannon’s world, the amount of information received would equal precisely one-bit of information in either case – heads or tails – because each case is equally probable, statistically. The final comment concerning probabilities is important.
Here is how probability and statistics enter into Shannon’s treatment of information: What would be the case if I had a bona fide, accurate crystal ball at my disposal and I queried it, “Will I still be alive on my upcoming eighty-second birthday – yes or no?” There are only two possible predictions (or messages), but, in this case, the information content of the message conveyed depends on which outcome is provided. If the answer is yes, I will make it to my 82nd birthday, I receive (happily) less than one bit of information, because actuarial tables of longevity indicate that, statistically speaking, the odds are in my favor. If the answer is no, I (unhappily) receive more than one bit of information, because the probability of not reaching my next birthday is less than 50/50. In Shannon’s mathematical model of information, a message revealing a less likely outcome conveys more information than a message affirming the more likely, predictable outcome.
Here is a third example of Shannon’s system: Consider the case of rolling a single die with six different faces identified as “1” through “6.” There are six possible outcomes, each one having the equal probability of 1/6. According to Shannon’s mathematical model, the amount of information gained from a single roll of the die is approximately 2.58 binary bits (log₂6 ≈ 2.585). The outcome of a single roll of a die thus carries about 2.58 bits of information vs. only one bit from the single flip of a coin. Why is that? It is because any one of six equally likely outcomes is less likely to occur than either outcome of a coin flip, which presents only two equally likely possibilities!
Lest you think that quantifying the information content of messages strictly on a statistical basis – with no regard for the meaning of the message itself – seems a silly bit of elite hair-splitting by math and engineering crackpots, I can assure you that you would be dreadfully mistaken. These and numerous other derivations and conclusions that sprang from the curious mind of Claude Shannon form the backbone of today’s trillion-dollar computer and communication industries! Shannon and his information and communication theories, like Einstein and his relativity theories, have been proven correct by both time and actual practice. Because of both men, our world has been immensely altered.
A Good Stopping Point for This Journey into Information/Communication Theory
At this point in the story of Claude Shannon and his information /communication theories, we approach the edge of a technical jungle, replete with a formidable underbrush of advanced mathematics, and this is as far as we should go, here. For those well-versed in mathematics and engineering, that jungle path is clearly marked with signposts signifying that “Shannon has passed this way and cleared the pathway of formidable obstacles: proceed…and marvel.” The pathway that Shannon forged guides fortunate, well-equipped adventurers through some deep and beautiful enclaves of human thought and accomplishment.
Claude Shannon was a remarkable original, an imaginative thinker and doer. Inevitably, great milestones in math, engineering, and science are not without some degree of precedence. In Shannon’s case, there was not much to build from, but there was some. Certainly, the Boolean algebra of George Boole was a gift. As mentioned earlier, Shannon’s first publication of his own findings, titled A Mathematical Theory of Communication, appeared in the Bell System Technical Journal of 1948.
His paper was quickly republished in book form in 1949 by the University of Illinois Press, with an interpretive introduction by Warren Weaver, under the slightly amended title The Mathematical Theory of Communication. In his paper, Shannon acknowledged the earlier work of Ralph V. L. Hartley and Harry Nyquist, both, like Shannon, Bell Laboratories researchers. Hartley published his prescient views on the nature of information in the Bell System Technical Journal of July 1928, in a paper titled The Transmission of Information. Although rudimentary, the paper was original and set in motion ideas that led Shannon to his triumphant 1948 publication in the Bell Journal. In Nyquist’s case, in addition to discussing the importance of optimal coding for the efficient transmission of information in an earlier, 1924 issue, Nyquist published, in 1928, his ground-breaking analysis of the minimum rate at which an analog (continuous) signal must be sampled in order to accurately reconstruct the original waveform from stored digital samples – as is routinely done today. Nyquist’s famous sampling theorem provided the necessary “bridge” between the world of analog information and its digital representation that was so necessary to make Shannon’s theories applicable to all formats containing information.
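The sampling theorem can be demonstrated in a few lines. The sketch below (all frequencies are illustrative) samples a 3 Hz sine wave at 10 Hz – comfortably above the Nyquist rate of 2 × 3 = 6 Hz – and then rebuilds the continuous waveform at an instant between samples using sinc interpolation:

```python
import numpy as np

f, fs = 3.0, 10.0                  # signal frequency and sampling rate, Hz
T = 1.0 / fs                       # sampling interval
n = np.arange(200)                 # sample indices
samples = np.sin(2 * np.pi * f * n * T)   # the stored digital samples

def reconstruct(t):
    """Whittaker-Shannon sinc interpolation from the stored samples."""
    return float(np.sum(samples * np.sinc((t - n * T) / T)))

t0 = 100.4 * T                     # an instant that falls between two samples
error = abs(reconstruct(t0) - np.sin(2 * np.pi * f * t0))
print(error)                       # tiny -- limited only by the finite sample count
```

Sample below 6 Hz instead, and the reconstruction fails: the sine wave masquerades as a lower-frequency impostor, the phenomenon known as aliasing.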
Two Crucially Important, Parallel Technology Upheavals Which Enabled Shannon’s Theories in the Real World
The first of these upheavals began with the announcement from Bell Labs of the solid-state transistor in 1948 – ironically, the same year that Shannon’s A Mathematical Theory of Communication appeared. The three Bell Labs researchers responsible – John Bardeen, Walter Brattain, and William Shockley – shared the 1956 Nobel Prize in Physics for their work. The transistor was a remarkable achievement which signaled the end of the cumbersome, power-hungry vacuum tubes that had powered electronics since the early years of the century, beginning with John Ambrose Fleming’s diode valve of 1904 and Lee de Forest’s triode of 1906. By 1955, the ultimate promise of the tiny and energy-efficient transistor had come into full view.
The second major technology upheaval began in 1958/59, when the integrated circuit was introduced by Jack Kilby of Texas Instruments and, independently, by a team under Robert Noyce at the newly founded Fairchild Semiconductor, right here in adjacent Mountain View – part of today’s Silicon Valley. The Fairchild planar process of semiconductor fabrication made possible the unprecedented progress which quickly powered the computer revolution. Today, billions of microscopic transistors are fabricated on one small silicon “chip” less than one inch square. The versatile transistor can act as an amplifier of analog signals and/or as a very effective, high-speed, and reliable binary switch.
These two parallel revolutions complete the trilogy of events begun by Shannon which determined our path to this present age of mass computation and communication.
A Final Summation
My goal was to make you, the reader, cognizant of Claude Shannon and his impact on our world, a world often taken for granted by many who daily benefit immensely from his legacy. We have come a very long way from the worlds of the telegraph (Morse and Vail), the telephone (Alexander Graham Bell), and radio (Marconi and Armstrong). The mathematical theories and characterizations proposed by Claude Shannon have essentially all been proven sound; his conclusions regarding the mathematical theory of communication are amazingly applicable to all modes of communication – from the simple telegraph, to radio, to our vast cellular networks, and to deep-space satellite communication.
I respectfully suggest you keep a few things in mind, going forward:
-Your computer is what it is and does what it does in no small part thanks to Claude Shannon’s insightful genius.
-Your cell phone can connect you anywhere in the world thanks largely to Claude Shannon.
-The ability to store a two-hour movie in high definition and full, living color on a digital versatile disc called a DVD is directly due to Claude Shannon.
-The error-correction capability digitally encoded on CDs and DVDs, which ensures playback with no detectable defects even from a badly scratched disc, is absolutely the result of Claude Shannon’s ground-breaking work on error-correcting digital codes.
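The flavor of error correction is easy to demonstrate. Shannon proved that such codes must exist; Richard Hamming, also at Bell Labs, built one of the first concrete examples. (CDs and DVDs actually use the more powerful Reed–Solomon codes, but the principle is the same.) The sketch below implements the classic Hamming(7,4) code: four data bits are protected by three parity bits, and any single flipped bit can be located and repaired.

```python
# Hamming(7,4): protect 4 data bits with 3 parity bits; any single-bit
# error can be found and corrected from the pattern of failing checks.

def encode(d):                       # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # codeword positions 1..7

def decode(c):                       # c = 7 received bits, possibly corrupted
    # Re-check each parity; the failing checks spell out the bad position.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # covers positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # covers positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # covers positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3       # 0 means "no error detected"
    if pos:
        c[pos - 1] ^= 1              # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]  # recover the 4 data bits

data = [1, 0, 1, 1]
sent = encode(data)
sent[4] ^= 1                         # simulate a "scratch": flip one bit
print(decode(sent))                  # [1, 0, 1, 1] -- perfectly repaired
```

Real disc formats interleave and scale this idea up so that even a long scratch, destroying many consecutive bits, lands as scattered, individually correctable errors.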
-Your ability to encrypt the data on your computer hard drive so that it is impenetrable to anyone – even experts – who does not possess the decoding key is, yet again, a direct result of Claude Shannon’s cryptography efforts.
And, finally, we arrive at the most surprising fact of them all: how is it that the vast majority of the world’s population has benefitted so immensely from the legacy of Claude Shannon, yet so few have ever heard of him? Perhaps there are some lessons here.
Kudos to Claude Shannon and all the other visionaries who made it happen.