Information Theory: How the Genius of Claude Shannon Changed Our Lives By Thinking “Outside the Box”

Claude Shannon: have you ever heard the name? How about Isaac Newton, Albert Einstein, and Charles Darwin? Those three names are universally familiar to the general public, even though all but a small segment of the population would find it difficult to elaborate significant details of the work that made them immortal in scientific history. In Shannon’s case, his name, his face, his genius, and his immense impact on our world are all virtually unknown, and thus unappreciated, outside the realms of mathematics and electrical engineering. Claude Shannon is the “father of information/communication theory” and the man primarily responsible for the vast networks of computers, data processing, and mass communication that power modern society. It is my intention, here, to at least do minimal justice to his rightful legacy among the great minds of mathematics, science, and engineering.

Shannon’s contributions are numerous and varied, but a closer look reveals that the central theme of most of them is well captured by the most famous of his many publications, The Mathematical Theory of Communication, which appeared in 1948. Most of Shannon’s published papers were issued under the imprimatur of the Bell System Technical Journal. Bell Labs had a long and illustrious run as an incredible incubator for many of the most important math, science, and engineering advancements in America during the twentieth century. Accordingly, many of the country’s top minds were associated with the Lab and its activities. Claude Shannon was one of them.

All Information Can Be Represented By Data 1’s and 0’s!

Have you ever marveled at the fact that modern computers can store and reproduce any and all information – text, audio, color pictures, and movies – using only organized collections of data 1’s and 0’s? Think of it! A modern computer is little more than a collection of millions of microscopic electronic switches (think light switches) which reside either in an “on” state (a data 1) or an “off” state (a data 0). If that reality has never occurred to you, pause for a few moments and reflect on the enormity of the fact that anything and everything called “media” can be displayed on command by calling up organized collections of data 1’s and 0’s which reside in the bowels of your personal computer! In addition, the computer’s “logical intelligence” – its ability to respond to your commands – also resides in the machine’s memory bank in the form of data 1’s and 0’s. In the nineteen-thirties, Claude Shannon was among the pioneers who understood the possibilities emerging from the burgeoning progress of electronics. The dream of mechanical computation itself reaches back to the eighteen-thirties, when Charles Babbage designed his first bulky, mechanical computing machines.

Today, in our everyday lives, we use the decimal number system, which is inherently unsuited to computers because it requires each digit in a number to assume one of ten states, 0 through 9. Modern computers are designed around the binary (or two-state) system, in which each digit in a binary number assumes a value of either one or zero. A simple light switch or an electronic relay (open or closed) is an example of a simple, two-state device which can represent any single digit in a binary number. In actuality, the two-state devices in modern computers are implemented using millions of microscopic, individual solid-state transistors which can be switched either “on” or “off.” The binary number system, requiring only simple two-state devices (or switches), is the optimal choice.
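To make the point tangible, here is a small illustration in Python (any language would do) of how numbers and text reduce to patterns of 1’s and 0’s:

```python
# Every number and character has a binary form; these built-in tools expose it.

number = 75
print(bin(number))    # prints 0b1001011: the value 75 stored as binary digits

for ch in "Hi":
    # ord() gives the character's numeric code; format() renders it in binary
    print(ch, format(ord(ch), '08b'))
# H 01001000
# i 01101001
```

Every photograph, song, and movie on your computer is, at bottom, just such a pattern, only vastly longer.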

Shannon would be the first to admit that he was never motivated to change the world by the work he pursued. Nor was he motivated by any prospects of fame and fortune for his efforts. Rather, he was endlessly fascinated by the challenges inherent in pursuing theoretical possibilities, regardless of any possible practicality or profit stemming from his efforts. Claude Shannon’s persona had multiple facets: a genius, out-of-the-box thinker, an inveterate tinkerer and inventor of gadgets, a juggler (circus-type), and a devotee of the unicycle – a conveyance he rode, designed, and built himself! This most unusual personality forged much of the “quiet legend” which surrounded the reclusive, mysterious Mr. Shannon. Even though he was a tinkerer and builder of “toys and gadgets,” he lived for and thrived on elevated ideas – creations of the mind. In many respects, he was much like Albert Einstein in his outlook, his rampant curiosity, and his dogged persistence, all of which were on full display as Einstein tackled the mysteries of both special and general relativity.

The Most Important Master’s Degree Thesis Ever Submitted!

In 1937, Claude Shannon submitted a thesis for his master’s degree in electrical engineering at MIT. Normally, a master’s thesis proves to be significantly less impressive in terms of originality and impact than that required for a PhD. Shannon’s master’s thesis proved to be a startling exception to the rule – the first of many unorthodoxies that characterized his unusual career. As an undergraduate at the University of Michigan, he had earned dual degrees in mathematics and electrical engineering. It was at Michigan that he learned the “new math” developed by the English mathematician George Boole and introduced to the scholarly community in 1854 under the title An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities. This work was the most important contribution to emerge from the genius of Boole, who died much too young from pneumonia at the age of forty-nine.

Shannon was prescient enough to recognize that Boole’s algebraic treatment of the binary number system uncannily lent itself to the development of real-life logical systems (computers) which could be simply implemented using electrical relays – binary (two-state), on/off devices which had cost, space, power, and reliability issues, but which could nevertheless demonstrate computing principles in the nineteen-thirties and forties. In simplest terms, Shannon demonstrated in his master’s thesis that, using Boolean algebra and simple two-state electrical devices, a computer could be designed to “think logically” while processing and displaying stored information.
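To get a feel for Shannon’s insight, consider this minimal sketch (Python functions standing in for relay contacts, purely for illustration): switches wired in series behave like Boolean AND, switches in parallel behave like OR, and a normally-closed contact behaves like NOT. From those three operations, genuinely useful circuits can be composed.

```python
# Boolean algebra realized as "relay logic": series = AND, parallel = OR,
# normally-closed contact = NOT.

def AND(a, b):   # two relays in series: current flows only if both are closed
    return a and b

def OR(a, b):    # two relays in parallel: current flows if either is closed
    return a or b

def NOT(a):      # a normally-closed contact inverts its input
    return not a

# Composing them yields arithmetic, here a one-bit "half adder":
def half_adder(a, b):
    total = AND(OR(a, b), NOT(AND(a, b)))   # exclusive-OR built from AND/OR/NOT
    carry = AND(a, b)
    return total, carry

print(half_adder(True, True))    # (False, True): 1 + 1 = binary 10
print(half_adder(True, False))   # (True, False): 1 + 0 = binary 01
```

Chain enough of these stages together and you have a machine that adds; that, in essence, is what Shannon’s thesis made rigorous.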

Shannon’s prescient recognition led to the characterization of his thesis as “The most important Master’s thesis ever written.” Indeed, Shannon opened the doors to a new and exciting vista, one that he vigorously explored while working at AT&T’s Bell Laboratories, and later, at MIT.

Shannon Sets These Major Goals for Himself – No Small Tasks!

How do we define “information,” how do we quantify information, and how can we transmit information most efficiently and reliably through communication channels?

I suggest that the reader pause a moment and ponder the thin air in which Claude Shannon pursued his goals. How in the world does one define and quantify such an “airy” concept as “information”?

Here are some examples, the easiest entry point into Shannon’s methodology for defining and quantifying information:

When we flip a coin, we receive one bit of information from the outcome, according to Shannon’s math! In this case, there are only two outcome possibilities, heads or tails – two “message” possibilities, if you will. Were we to represent “heads” as a binary data “1” and tails as a binary data “0”, we could visualize and quantify the outcome of the coin flip as the resulting state (“1” or “0”) of a single “binary digit” (or “bit”) of information gained in the process of flipping the coin. In Shannon’s world, the amount of information received would equal precisely one bit in either case – heads or tails – because each outcome is equally probable, statistically. That final comment concerning probabilities is important.
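In Shannon’s formulation, the information conveyed by learning an outcome of probability $p$ is

$$ I = \log_2\frac{1}{p} \ \text{bits}, $$

so a fair coin flip, with $p = 1/2$, yields exactly $\log_2 2 = 1$ bit.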

Here is how probability/statistics enters into Shannon’s treatment of information: What would be the case if I had a bona fide, accurate crystal ball at my disposal and I queried it, “Will I still be alive on my upcoming eighty-second birthday – yes or no?” There are only two possible predictions (or messages), but, in this case, the information content of the message conveyed depends on which outcome is provided. If the answer is yes, I will make it to my 82nd birthday, I receive (happily) less than one bit of information, because actuarial tables of longevity indicate that, statistically speaking, the odds are in my favor. If the answer is no, I (unhappily) receive more than one bit of information, because the probability of not reaching my next birthday is less than 50/50. In Shannon’s mathematical model, a message revealing a less likely outcome conveys more information than a message affirming the more likely, predictable outcome.

Here is a third example of Shannon’s system: Consider the case of rolling a single die with six different faces identified as “1” through “6.” There are six possible outcomes, each with an equal probability of 1/6. According to Shannon’s mathematical model, the amount of information gained from a single roll of the die is about 2.58 binary bits. The outcome of a single roll of a die carries roughly 2.58 bits of information vs. only one bit of information from the single flip of a coin. Why is that? It is because any one of six equally likely outcomes is less likely to occur than either outcome of a coin flip, which presents only two equally likely outcomes!
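These numbers are easy to verify; here is a short Python check (the 80% chance of reaching the birthday is a made-up illustrative figure, not an actuarial one):

```python
import math

def info_bits(p):
    """Shannon's information content, in bits, of an outcome of probability p."""
    return math.log2(1.0 / p)

print(info_bits(1/2))   # fair coin flip: 1.0 bit
print(info_bits(1/6))   # one face of a fair die: about 2.585 bits
print(info_bits(0.8))   # crystal ball says "yes" (assumed 80% likely): ~0.32 bits
print(info_bits(0.2))   # crystal ball says "no" (assumed 20% likely): ~2.32 bits
```

The rarer the outcome, the larger the number, exactly as Shannon’s model demands.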

Lest you think that quantifying the information content of messages strictly on a statistical basis, with no regard for the content of the message itself, seems a silly bit of elite hair-splitting on the part of math/engineering crackpots, I can assure you that you are dreadfully mistaken: these and numerous other derivations and conclusions that sprang from the curious mind of Claude Shannon form the backbone of today’s trillion-dollar computer and communication industries! Shannon and his information/communication theories, like Einstein and his relativity theories, have been proven correct by both time and actual practice. Because of both men, our world has been immensely altered.

A Good Stopping Point for This Journey into Information/Communication Theory

At this point in the story of Claude Shannon and his information/communication theories, we approach the edge of a technical jungle, replete with a formidable underbrush of advanced mathematics, and this is as far as we should go, here. For those well-versed in mathematics and engineering, that jungle path is clearly marked with signposts signifying that “Shannon has passed this way and cleared the pathway of formidable obstacles: proceed…and marvel.” The pathway that Shannon forged guides fortunate, well-equipped adventurers through some deep and beautiful enclaves of human thought and accomplishment.

Claude Shannon was a remarkable original, an imaginative thinker and doer. Inevitably, great milestones in math, engineering, and science are not without some degree of precedent. In Shannon’s case, there was not much to build from, but there was some. Certainly, the Boolean algebra of George Boole was a gift. As mentioned earlier, Shannon’s first publication of his own findings, titled The Mathematical Theory of Communication, appeared in the Bell System Technical Journal of 1948.


His paper was quickly republished in book form in 1949 by the University of Illinois Press. In his paper, Shannon mentioned the earlier work of Ralph V. L. Hartley and Harry Nyquist, both earlier Bell Laboratories employees, like Shannon. Hartley published his prescient views on the nature of information in the Bell System Technical Journal of July 1928, in a paper titled Transmission of Information. Although rudimentary, the paper was original and set in motion ideas that led Shannon to his triumphant 1948 publication in the Bell Journal. Nyquist, in addition to discussing the importance of optimal coding for the efficient transmission of information in an earlier, 1924 issue, published in the August 1928 Bell Journal his ground-breaking analysis of the minimum rate at which an analog (continuous) waveform must be sampled in order to accurately reconstruct the original waveform from stored digital data samples – as is routinely done today. Nyquist’s famous sampling theorem provided the necessary “bridge” between the world of analog information and digital representations of analog data – a bridge essential to making Shannon’s theories applicable to all formats containing information.
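Stated in modern notation, Nyquist’s criterion is remarkably compact: to reconstruct a signal containing no frequency components above $f_{\max}$, the sampling rate $f_s$ must satisfy

$$ f_s > 2 f_{\max}. $$

This is why audio CDs sample at 44.1 kHz, comfortably more than twice the roughly 20 kHz upper limit of human hearing.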

Two Crucially Important, Parallel Technology Upheavals Which Enabled Shannon’s Theories in the Real World

The first of these upheavals began with the announcement from Bell Labs of the solid-state transistor in 1948, ironically the same year that Bell Labs published Shannon’s The Mathematical Theory of Communication. Three Bell Labs researchers led by William Shockley won the 1956 Nobel Prize in physics for their work. The transistor was a remarkable achievement which signaled the end of the cumbersome, power-hungry vacuum tubes that had powered electrical engineering since Lee de Forest introduced his triode “audion” in 1906. By 1955, the ultimate promise of the tiny and energy-efficient transistor came into full view.

The second major technology upheaval began in 1958/59 when the integrated circuit was introduced by Jack Kilby of Texas Instruments and, independently, by a team under Robert Noyce at newly founded Fairchild Semiconductor, right here in adjacent Mountain View – part of today’s Silicon Valley. The Fairchild planar process of semiconductor fabrication enabled the unprecedented progress which quickly powered the computer revolution. Today, we have millions of microscopic transistors fabricated on one small silicon “chip” less than one inch square. The versatile transistor can act as an amplifier of analog signals or as a very effective, high-speed, and reliable binary switch.

These two parallel revolutions complete the trilogy of events begun by Shannon which determined our path to this present age of mass computation and communication.

A Final Summation

My goal was to make you, the reader, cognizant of Claude Shannon and his impact on our world, a world often taken for granted by many who benefit immensely from his legacy every day. We have come a very long way from the worlds of the telegraph (Morse and Vail), the telephone (Alexander Graham Bell), and radio (Marconi and Armstrong). The mathematical theories and characterizations proposed by Claude Shannon have essentially all been proven sound; his conclusions regarding the mathematical theory of communication are amazingly applicable to all modes of communication – from the simple telegraph, to radio, to our vast cellular networks, and to deep-space satellite communication.

I respectfully suggest you keep a few things in mind, going forward:

-Your computer is what it is and does what it does in no small part thanks to Claude Shannon’s insightful genius.

-Your cell phone can connect you anywhere in the world thanks largely to Claude Shannon.

-The ability to store a two-hour movie in high definition and full, living color on a digital versatile disc (DVD) is directly due to Claude Shannon.

-The error-correction capability digitally encoded on CDs and DVDs, which ensures playback with no detectable effects even from a badly scratched disc, is absolutely the result of Claude Shannon’s ground-breaking work on error-correcting digital codes (a taste of how such codes work appears in the sketch just after this list).

-Your ability to encrypt the data on your computer hard drive so that it is impenetrable to anyone (even an expert) who does not possess the decoding key is, yet again, a direct result of Claude Shannon’s cryptography efforts.
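To give just a taste of error correction, here is a minimal sketch of the classic Hamming(7,4) code in Python. (This is illustration only; the codes on real CDs and DVDs are far more powerful Reed-Solomon constructions, but the principle of adding calculated parity bits to repair damage is the same.)

```python
# Hamming(7,4): protect 4 data bits with 3 parity bits; any single flipped
# bit in the 7-bit codeword can be located and repaired.

def encode(d):                      # d = [d1, d2, d3, d4], each 0 or 1
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p4 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p4, d[1], d[2], d[3]]   # codeword positions 1..7

def decode(c):                      # c = received 7-bit codeword
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # parity check over positions 2,3,6,7
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]  # parity check over positions 4,5,6,7
    pos = 4 * s4 + 2 * s2 + s1      # the syndrome points at the flipped bit
    if pos:
        c[pos - 1] ^= 1             # repair the single-bit error
    return [c[2], c[4], c[5], c[6]] # extract the original 4 data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                        # simulate a "scratch": flip one bit
print(decode(word))                 # [1, 0, 1, 1], the data survives intact
```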

And, finally, we arrive at the most surprising fact of them all: how is it that virtually 90 percent of the world’s population has benefited so immensely from the legacy of Claude Shannon, yet so few have even heard of him? Perhaps there are some lessons, here?

Kudos to Claude Shannon and all the other visionaries who made it happen.

Back Grazing in Familiar Pastures; Shannon’s Milestone Book on Communication Theory


Do you use the internet and personal communication devices such as cell phones? Since you are here, you must! Who doesn’t these days? One look at people in public places with eyes riveted on phone screens or tablets speaks to the popularity of personal communication. DSL (Digital Subscriber Line) services like AT&T’s U-verse reliably bring broadband television and the Internet into our homes over lowly, antiquated, but ubiquitous twisted-pair phone wire connections. That miracle is only possible thanks to the power of modern digital communication theory.

The gospel of the engineering/mathematics that enables that capability is this 1949 book by Claude Shannon of AT&T’s famous Bell Telephone Laboratories. Its title: The Mathematical Theory of Communication. “Bell Labs” made immense contributions to our body of technical knowledge over many decades through its familiar, blue-wrappered Technical Journal. The authors of its featured papers include many of the most important scientists, engineers, and mathematicians of the past century.

Claude Shannon was one of them; the contents of his 1949 book, published by the University of Illinois Press, first appeared in the Bell System Technical Journal in 1948. The paper’s unique and important approach to reliably sending electrical/optical signals from one point (the source) to another (the destination) through a “channel” was instrumental in realizing today’s communication miracles. Shannon’s methods are not limited to this or that specific channel technology; rather, his work applies to virtually all forms of communication channels – from digital audio/video discs, to AM/FM broadcasting, to the technology of the Internet itself. The wide applicability of Shannon’s insights to communication systems as diverse as Samuel Morse’s original telegraph system and modern satellite communications is quite remarkable and underlines the importance of his findings.

Interestingly, some of the foundation for Shannon’s ideas emanated from the early design of Morse’s first telegraph system, which began service in 1844 between Washington and Baltimore. The first message sent over that line was Morse’s famous utterance in Morse code to his assistant, Alfred Vail: “What hath God wrought?” While Claude Shannon is fairly identified as the “father of communication theory” thanks to his famous 1948/49 publications, there were also many grandfathers! Most of them made valuable contributions to the speed and reliability of early communication vis-à-vis the telegraph and early telephony, as pioneered by Alexander Graham Bell. One of the early, key contributors to communication technology was R.V.L. Hartley who, in the July 1928 issue of the Bell System Technical Journal, published a very original treatise titled Transmission of Information. This paper of Hartley’s and one in the 1924 Journal by Harry Nyquist were acknowledged by Shannon as prime foundational sources for his later ideas.

The July 1928 Bell System Technical Journal containing Hartley’s paper, Transmission of Information

What Were Claude Shannon’s Contributions?

A brief but inclusive answer comes from the well-regarded book by J.R. Pierce, Symbols, Signals and Noise. I quote:

“The problem Shannon set for himself is somewhat different. Suppose we have a message source which produces messages of a given type, such as English text. Suppose we have a noisy communication channel of specified characteristics. How can we represent or encode messages from the message source by means of electrical signals so as to attain the fastest possible transmission over the noisy channel? Indeed, how fast can we transmit a given type of message over a given channel without error? In a rough and general way, this is the problem that Shannon set himself and solved.”

Although Shannon impressively refined our concepts regarding the statistical nature of communication, Samuel Morse and his assistant, Alfred Vail, had recognized the statistical ramifications long before, and that fact was reflected in their telegraph code. Notably, they made certain that the most commonly used letters of the alphabet had the simplest dot/dash representations in the Morse code, to minimize the overall transmission time of messages. For example, the most commonly used letter, “e,” was assigned a short, single “dot” as its telegraphic representation. Reportedly, this “code optimization” task was handled by Vail, who merely visited a local printing shop and examined the “type bins,” equating the frequency of a specific letter’s use in print to the size of its type bin! The printing industry had a good handle on the text statistics of the English language long before electrical technology arrived on the scene. The specific dot/dash coding of each letter in Morse’s code proceeded accordingly. From that practical and humble beginning, statistical communication theory reached full mathematical bloom in Shannon’s capable hands. As in Morse’s time, coding theory remains an important subset of modern digital communication theory.
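Vail’s insight is easy to demonstrate in miniature. In the Python sketch below, the letter frequencies are invented for illustration, but the dot/dash codes are the actual Morse assignments for those letters; note how badly the average message length suffers when the same codes are mis-assigned:

```python
# Matching short codes to frequent letters minimizes average message length.

freq  = {'e': 0.60, 't': 0.25, 'q': 0.10, 'z': 0.05}    # toy frequencies
morse = {'e': '.', 't': '-', 'q': '--.-', 'z': '--..'}  # real Morse codes

def avg_length(codes):
    """Expected symbols transmitted per letter, weighted by frequency."""
    return sum(freq[ch] * len(codes[ch]) for ch in freq)

print(avg_length(morse))    # 1.45 symbols per letter on average

# Now deliberately mis-assign the same four codes:
swapped = {'e': '--..', 't': '--.-', 'q': '-', 'z': '.'}
print(avg_length(swapped))  # 3.55 symbols per letter, far worse
```

Shannon’s source-coding results turn this rule of thumb into a theorem: no code can beat the entropy of the source, and good codes approach it.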

Revisiting Communication Theory:
Grazing Once Again in Technical Pastures of the Past

The most satisfying portion of my engineering career came later – particularly the last ten years – when I became immersed in the fundamentals of communication theory while working in the computer disk drive industry here in Silicon Valley. My job as an electrical engineer was to reliably record and retrieve digital data using the thin magnetic film deposited on spinning computer disks. As the data demands of personal computers rapidly increased during the 1990s, the challenge of reliably “communicating” with the magnetic film and its increasingly dense magnetically recorded bits of data was akin to the DSL task of cramming today’s broadband data streams down the old, low-tech telephone twisted-pair wires which have been resident in phone cables for many decades. Twisted-pair wires make a very poor high-speed communication cable compared to coaxial cable or the latest fiber-optic cable, but they had one huge advantage for DSL’s innovators: they already fed almost every home and business in the country!

I retired from engineering in 2001 after a thirty-seven year career and now find myself wandering back to “technical pastures of the past.” During the last ten and most exciting years of my career, I came to know and work with two brilliant electrical engineering PhDs from Stanford University. They had been doctoral students there under Professor John Cioffi, who is considered the “father of DSL.” The two were employed by our company to implement the latest communication technologies into disk storage by working closely with our product design teams. Accordingly, the fundamental communication theories of Shannon’s that enabled the DSL revolution were applied to our disk drive channels to increase data capacity and reliability. Under the technical leadership of the two Stanford PhDs, our design team produced the industry’s first successful production disk drive utilizing the radically new technology. IBM had preceded our efforts somewhat with their “concept” disk drive, but it never achieved full-scale production. After the successful introduction of our product, the disk drive industry never looked back, and soon everyone else was on board with the new design approach, known as a “Partial Response/Maximum Likelihood” (PRML) channel.

I always appreciated the strong similarities between the technology we implemented and that which made DSL possible, but I recently decided to learn more. I purchased a book, a tech-bible on DSL, co-authored in 1999 by Professor Cioffi. Thumbing through it, I recognize much of the engineering it contains. I have long felt privileged that our design team and I had the opportunity to work with the two young PhD engineers who studied with Cioffi and who knew communication theory inside-out. Along with their academic, theoretical brilliance, the two also possessed a rare, practical mindset toward hardware implementation which immensely helped us transfer theory into practice – in the form of a commercially successful, mass-produced computer product. Everyone on our company staff liked and deeply respected these two fellows.

When the junior of the two left our company as our drive design was nearing full production, he circulated a company-wide memo thanking the organization for the opportunity to work with us. He cited several of us engineers by name for special thanks, an act which really meant a lot to me…and, surely, to my colleagues – an uncommon courtesy, these days, and a class act in every sense of the word!

Even in this valley of pervasive engineering excellence, that particular experience was one of a select few during my career which allowed me a privileged glimpse into the rarefied world of “top minds” in engineering and mathematics – the best of the best. A still-higher category up the ladder of excellence and achievement is that of the “monster minds” (like Einstein, Bohr, and Pauli) which the Nobel physicist Richard Feynman so humorously wrote about in his book, Surely You’re Joking, Mr. Feynman! A very select club!

The recent event which tuned me in, once again, to this technology and my past recollections was the subject of my May 2, 2015 blog post, Two Books from 1948: Foundations of the Internet and Today’s Computer Technology. In it, I describe the incredible good fortune of stumbling upon one of the two scarce, foundational books on communication theory and computer control: Cybernetics by Norbert Wiener. More recently, I acquired a nice copy of Claude Shannon’s 1949 first edition, The Mathematical Theory of Communication (the other book). That one came at no give-away price like my copy of Cybernetics, but, given its importance, it still represents a bargain in my mind.

Like many engineers who are familiar with Shannon and his influence, I had never read his book, although I had taken a course on statistical communication theory in my master’s degree program over 45 years ago. Unlike many engineers today, whose gaze is fixed only upon the present and the future, I have a deep interest in the history of the profession and a healthy respect for its early practitioners and their foundational work. Accordingly, I have been brushing off some technical rust and am now immersed in Shannon’s book for the first time and in the subject material, once again.

Old, familiar pastures – a bit like coming home, again, to peacefully graze. While the overall “view” improves with age and experience, the “eyesight” is not so keen, anymore. But my curiosity is up, yet again, and I will soldier-on through the technical difficulties and see where that takes me, all the while relishing the journey and the landscape.

Two Books from 1948: Foundations of the Internet and Today’s Computer Technology

I had a very good day recently. I bought a beautiful $400 book for $20 in Ventura, California! It also happens to be a very important book – literally a foundational work for today’s Internet and our computer-based technological age. The book is titled Cybernetics. While traveling south three weeks ago to the Santa Clarita Cowboy Music Festival – an annual event for us (see last week’s post) – Linda and I stopped in downtown Ventura, California, also an annual ritual. As always, we had lunch at our favorite hole-in-the-wall Italian restaurant and browsed for a bit at one of our favorite used bookstores in town, The Calico Cat.

Often we find a book or two in this little shop, and sometimes we do not. After perusing various sections for close to an hour with no luck, I moved to the science/math section. As I ran my eyes along the shelves, I recognized many of the books they held. My scanning gaze froze as I came upon a pristine little book titled Cybernetics: or Control and Communication in the Animal and the Machine, written by Norbert Wiener. Wiener was a mathematics prodigy in his youth who enjoyed a long and distinguished career as a professor of mathematics at M.I.T., the Massachusetts Institute of Technology.

The book’s title reflects its ground-breaking categorization of the messaging and control systems inherent in the two closely related realms of computer control and the human/animal brain/body connection. Cybernetics appeared at precisely the same time as the first large-scale electronic computers, and this little book was instrumental in determining the future path of computing and control (robotics).

I immediately recognized the title and author as very important, but could this be the 1948 first edition – a book I knew to be of considerable value? I excitedly pulled it from the shelf and opened to the verso of the title page, which stated “Second printing. November, 1948.” I was holding the second printing of the first U.S. edition, printed by John Wiley and Sons, Inc. There was a companion edition of the text published in Paris by Hermann et Cie, also in 1948.

I became very excited and called Linda over to show her the book and explain, “I believe this book is worth several hundred dollars in the book trade: it is a very famous, seminal work in communication and control engineering. For a copy in the like-new condition of this one, $400 is easily a realistic value on the market.” The apparent penciled price for this pristine copy with an almost perfect dust jacket: $25! The store’s owner called my attention to the fact that I had misread the price, which was actually only $20. With no hint of hesitation, I coolly announced, “I’ll take it!” The book’s original price, still on the dust jacket: $3! Understandably, the store owners had no clue as to the book’s engineering/mathematical significance to today’s Internet and computer technology.

For a retired electrical engineer like myself, finding this little book in such perfect condition at such a price is akin to tripping over a diamond protruding from the footpath. My many years in Silicon Valley spent designing computer disk drives all ultimately stemmed from a very few foundational works (books and technical papers) such as this one. Summoning my engineering background, I can read and understand the material in this book – while difficult, it does not require a PhD in mathematics. That is the beauty of a foundational technical work such as this – profound, yet accessible to most engineers and scientists, given some effort.

The “Other” Book

There exists another similarly concise book whose pedigree exceeds even that of Cybernetics. That book was authored by Claude Shannon at the Bell Laboratories and titled The Mathematical Theory of Communication. Interestingly, the book was published in 1949 after being first introduced as a technical paper in the Bell System Technical Journal in 1948 – the same year that Cybernetics was published.

Dare I press my luck and hope to find a similar bargain for Mr. Shannon’s book? Not likely to happen, but it would be the perfect complement to Wiener’s little volume.

Shannon’s book elegantly achieves the unenviable task of defining “information” in mathematical terms, and in a manner which lends itself to quantifying the maximum flow of information possible over a given communication channel such as the Internet or the radio/television airwaves, to cite two of many possible real-life applications. Reading and decoding the magnetically recorded binary bits of data (1’s and 0’s) stored on computer disk drives occurs in the “read channel” of the drive electronics, as we engineers in the industry referred to it. All such applications concerning “communications” succumb to the mathematics presented by Claude Shannon in this little volume of a mere 117 pages! Shannon’s methods are equally applicable to yesterday’s analog channels (radio transmissions, for example) and today’s pervasive digital implementations (computers, the Internet, et al.).
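The centerpiece of that quantification is the celebrated channel-capacity result now known as the Shannon-Hartley theorem: a channel of bandwidth $B$ hertz with signal-to-noise power ratio $S/N$ can carry at most

$$ C = B \log_2\left(1 + \frac{S}{N}\right) $$

bits per second with arbitrarily small error. Every modem, DSL line, and disk-drive read channel is engineered against that limit.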

The next time we are passing through Ventura, I will keep a sharp eye out for this second book and any other bargains like Cybernetics! Finding such treasures takes luck, but when good luck comes knocking, one needs to recognize the sound!