J. Robert Oppenheimer and the Atomic Bomb: Triumph and Tragedy

J. Robert Oppenheimer ranks, along with Albert Einstein, among the most interesting and important figures in modern history. Although the two men differed greatly in world-view and personality, their names are both linked to arguably the most significant human endeavor, and resultant “success,” in recorded history: the monumental effort of the United States government to harness the energy of the atom in a new and devastating weapon of war, the atomic bomb. The super-secret Manhattan Project was a crash program formally authorized by President Franklin Roosevelt on December 6, 1941. The program’s goal: in a time-frame of less than four years and against all odds, to capitalize on very recent scientific discoveries and rapidly develop an operational military weapon of staggering destructive power.

Albert Einstein and the Atomic Bomb

Albert Einstein, whose scientific resume ranks just behind that of Isaac Newton, had virtually no role in this weapons program save for two notable exceptions. First and foremost, it was Einstein’s follow-up paper to his milestone 1905 theory of special relativity which showed that, contrary to long-standing belief, mass and energy are one and the same, theoretically convertible from one to the other. That relationship is expressed by the most famous equation in science, E = mc², where E is the energy inherent in mass, m is the mass in question, and c is the constant speed of light. One careful look at this relationship reveals its profundity: since the speed of light is a very large number (300 million meters per second), a tiny bit of mass (material) converted into its energy equivalent yields a phenomenal amount of energy. Note that Einstein had proposed a theoretical, nonetheless real, relationship in his equation. The big question: would it ever be possible to produce that predicted yield of energy in practice? In 1938, two chemists in Hitler’s Germany, Otto Hahn and Fritz Strassmann, demonstrated nuclear fission in the laboratory, on a tiny scale. That news spread quickly throughout the world physics community – like ripples on a giant pond. It now appeared feasible to harness the nuclear power inherent in the atom as expressed by Einstein’s equation.
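To get a feel for the magnitude involved, a few lines of arithmetic suffice. The sketch below computes the energy locked up in a single gram of matter, using the conventional conversion of 4.184 gigajoules per ton of TNT:

```python
# Energy equivalent of one gram of mass, via E = mc^2
c = 299_792_458            # speed of light, in meters per second
m = 0.001                  # one gram, expressed in kilograms

energy_joules = m * c**2   # E = mc^2, in joules

# Express the result in tons of TNT (1 ton TNT = 4.184e9 joules, by convention)
TNT_JOULES_PER_TON = 4.184e9
tons_of_tnt = energy_joules / TNT_JOULES_PER_TON

print(f"{energy_joules:.2e} joules, or about {tons_of_tnt:,.0f} tons of TNT")
```

One gram of mass, fully converted, corresponds to roughly 21,000 tons of TNT – on the order of the entire yield of the Nagasaki bomb.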

In August of 1939, alarmed by the recent news from Germany, Hungarian physicist Leo Szilard asked his colleague, Albert Einstein, to affix his signature to a letter addressed to President Roosevelt. The letter warned of recent German scientific advances and Germany’s sudden interest in uranium deposits in the Belgian Congo of Africa. Einstein, a German Jew who had fled his homeland in 1932 for fear of Hitler’s growing influence, dutifully but reluctantly signed his name to the letter. Einstein’s imprimatur on the letter was Szilard’s best hope of fixing Roosevelt’s attention on the growing feasibility of an atomic bomb. Einstein and many other European scientists were, from personal experience, justifiably terrified at the prospect of Hitler’s Germany acquiring such a weapon, and the Germans had first-class scientific talent available to tackle such a challenge.

Einstein, one of history’s great pacifists, was thus ironically tied to the atomic bomb program, but his involvement went no further. Einstein never worked on the project and, after the war when Germany was shown to have made no real progress toward a weapon, he stated: “Had I known that the Germans would not succeed in producing an atomic bomb, I never would have lifted a finger.”

Stranger Than Fiction: The High Desert of Los Alamos, New Mexico

By early 1943, peculiar “invitations” from Washington were being received by many of this country’s finest scientific and engineering minds. A significant number of the recipients ranked among the world’s top physicists, including Nobel Prize winners who had emigrated from Europe. These shadowy “requests” from the government called for the best and the brightest to head (with their families, in many cases) to the wide-open high desert country of New Mexico. Upon arrival, they would be further informed (to a limited extent) of the very important, secret work to be undertaken there. I have always believed that fact is stranger than fiction, and much more interesting and applicable. What transpired at Los Alamos over the next three years under the direction of J. Robert Oppenheimer and Army General Leslie Groves is scarcely believable, and yet it truly happened, and it has changed our lives irreversibly.

One of my favorite narratives from Jon Else’s wonderful documentary film on the atomic bomb, The Day After Trinity, beautifully describes the ludicrous situation: “Oppenheimer had brought scientists and their families fresh from distinguished campuses all over the country – ivied halls, soaring campaniles, vaulted chapels. Los Alamos was a boom town – hastily constructed wooden buildings, dirt streets, coal stoves, and [at one point] only five bathtubs / There were no sidewalks. The streets were all dirt. The water situation was always bad / It was not at all unusual to open your faucet and have worms come out.” Los Alamos was like a California gold-rush boom town, constructed in a jiffy with the greatest assemblage of world-class scientific talent that will ever be gathered in one location. General Groves once irreverently quipped (with humor and perhaps some frustration) that Los Alamos had the greatest assemblage of “crack-pots” the world has ever known.

As improbable as the situation and the task at hand appeared – even given an open checkbook from Roosevelt and Congress – Groves and Oppenheimer made it happen. I cannot think of any human endeavor in history so complex, so unlikely… and so “successful.” The triumph of NASA in space comes a close second, but even the realization of JFK’s promise of a man on the moon by 1969 cannot top the extraordinary scenario which unfolded at Los Alamos, New Mexico – all largely shielded from view.

The initial (and only) test of the atomic bomb took place on July 16, 1945, on the wide expanse of New Mexico desert at the Alamogordo bombing range, some two hundred miles south of Los Alamos. The test was code-named “Trinity.” The accompanying picture shows Oppenheimer and General Groves at ground zero of the blast, the site of the high tower atop which the bomb was detonated. Evidence abounds of desert sand fused into glass by the intense heat. The test was a complete technical success – vindication of the huge government outlay and of the dedication of so many who put their lives on hold by moving to the high desert of New Mexico and literally “willing” their work to success for fear of the Germans. By July of 1945, however, Germany had been vanquished without having made any real progress toward an atomic bomb.

The World Would Never Be the Same

That first nuclear detonation signaled a necessary reset for much of human thought and behavior, and events quickly followed that demonstrated the power of that statement. Of immediate impact was the abrupt termination of World War II, brought about by the two atomic bombs dropped on Japan just weeks after the first and only test of the device (Hiroshima, August 6, 1945; Nagasaki, August 9, 1945). The resulting destruction of these two cities accomplished what many thousands of invading U.S. troops might have taken months to complete – with terrible losses. The horrific effect of the two bombs on the people of Japan has been well documented since 1945. Many, including a significant number of those who worked on the development of these weapons, protested that such weapons should never be used again. Once the initial flush of “success” passed, the man most responsible for converting scientific theory into a practical weapon of mass destruction quickly realized that the “nuclear genie” was irretrievably out of the bottle, never to be predictably and reliably restrained. Indeed, Russia shocked the world by detonating its first atomic bomb in 1949. The inevitable arms race that Oppenheimer foresaw had already begun… the day after Trinity.

The Matter of J. Robert Oppenheimer, the Man

J. Robert Oppenheimer had been under tremendous pressure as technical leader of the super-secret Manhattan Project since being appointed by the military man in charge of the entire effort, Army General Leslie Groves. Groves was a military man through and through, accustomed to the disciplined hierarchy of the service, yet he hand-picked as technical lead for the whole program the brilliant physicist and mercurial liberal intellectual J. Robert Oppenheimer – the most unlikely of candidates. The communist ties of Oppenheimer’s wife and brother prompted the FBI to protest the choice vigorously. Groves got his way, however.

Groves’ choice of J. Robert Oppenheimer for the challenging and consuming task of technical leader on the project proved to be a stroke of genius; virtually everyone who worked on the Manhattan Project agreed that no one but Oppenheimer could have made it happen as it did.

“Oppie,” as he was known to many on the Manhattan Project, directed the efforts of hundreds of the finest scientific and engineering minds on the planet. Foreign-born Nobel Prize winners in physics were very much in evidence at Los Alamos. Despite the formidable scientific credentials of such luminaries as Hans Bethe, I.I. Rabi, Edward Teller, Enrico Fermi, and Richard Feynman, Oppenheimer proved to be their intellectual equal. Oppenheimer either already knew and understood the nuclear physics, the chemistry, and the metallurgy involved at Los Alamos, or he very quickly learned it from the others. His intellect was lightning-quick and very deep. His interests extended well beyond physics, as evidenced by his great interest in French metaphysical poetry and his multi-lingual capability. Almost more incredible than his technical grasp of all the work underway at Los Alamos was his unanticipated ability to manage every aspect of this, the most daring, ambitious, and important scientific/engineering endeavor ever undertaken. People who knew well his scientific brilliance from earlier years were amazed at the overnight evolution of “Oppie, the brilliant physicist and academic” into “Oppie, the effective, efficient manager” and co-leader of the project with General Groves.

Indelibly imprinted upon my mind is the interview scene with Nobel laureate Hans Bethe conducted by Jon Else, producer of The Day After Trinity. Bethe was Oppie’s pick to be group leader for all physics on the project. The following comments of Bethe, himself a giant of theoretical physics, cast a penetrating light on the intellectual brilliance of J. Robert Oppenheimer and his successful role in this, the most daring and difficult scientific project ever attempted:

– “He was a tremendous intellect. I don’t believe I have known another person who was quite so quick in comprehending both scientific and general knowledge.”
– “He knew and understood everything that went on in the laboratory, whether it was chemistry, theoretical physics, or machine-shop. He could keep it all in his head and coordinate it. It was clear also at Los Alamos, that he was intellectually superior to us.”

The work was long, hard, and often late into the night for Los Alamos’ two thousand residents, but there was a social life at Los Alamos, and, according to reports, Robert Oppenheimer was invariably the center of attention. He could and often did lead discussions, given his wide-ranging knowledge of… most everything! Dorothy McKibben (seated on Oppenheimer’s right in the following picture) was the “Gatekeeper of Los Alamos” according to all who (necessarily) passed through her tiny Manhattan Project office at 109 East Palace Avenue, Santa Fe, New Mexico. There, they checked in and collected the credentials and maps required to reach the highly secured desert site of Los Alamos. Ms. McKibben was effusive in her praise of Oppenheimer: “If you were in a large hall, and you saw several groups of people, the largest groups would be hovering around Oppenheimer. He was great at a party, and women simply loved him and still do.”

The Nuclear Weapons Advantage Proves to be Short-Lived

What was believed in 1945 to represent a long-term, decided military advantage for the United States turned out to be an illusion, much as Oppenheimer likely suspected. With the help of spies Klaus Fuchs at Los Alamos, Julius Rosenberg, and others, Russia detonated its first atomic bomb only four years later.

Oppenheimer knew better, because he understood the physics involved and knew that, once demonstrated, nuclear weapons would rapidly pose a problem for the world community. When interviewed years later at Princeton, where he headed the Institute for Advanced Study (making him Albert Einstein’s “boss”), he is shown in The Day After Trinity responding to the question, “[Can you tell us] what your thoughts are about the proposal of Senator Robert Kennedy that President Johnson initiate talks with the view to halt the spread of nuclear weapons?” Oppenheimer replied rather impatiently, “It’s twenty years too late. It should have been done the day after Trinity.”

J. Robert Oppenheimer fully appreciated, on July 16, 1945, the dangers inherent in the nuclear genie let loose from the bottle. His fears were well founded. Within a few years of Los Alamos, talk surfaced of a new, more powerful bomb based on nuclear fusion rather than fission – still in accordance with E = mc². This became popularly known as the “hydrogen bomb.” Physicist Edward Teller now stepped forward to promote its development, in opposition to Oppenheimer’s stated wish to curtail the further use and development of nuclear weapons.

Arguments raged over the “Super” bomb, as it was designated, and Teller prevailed. The first device was detonated by the U.S. in 1952. A complex and toxic cocktail of Oppenheimer’s reluctance toward development of the Super, combined with the past communist leanings of his wife, brother Frank, and other friends, led the Atomic Energy Commission, under President Eisenhower, to revoke Oppenheimer’s security clearance in 1954. That action ended any opportunity for Oppenheimer even to continue advising Washington on nuclear weapons policy. The Oppenheimer file was thick, and the ultimate security hearings were dramatic and difficult for all involved. As for the effect on J. Robert Oppenheimer, we have the observations of Hans Bethe and I.I. Rabi, both participants at Los Alamos and Nobel Prize winners in physics:

– I.I. Rabi: “I think to a certain extent it actually almost killed him, spiritually, yes. It achieved just what his opponents wanted to achieve. It destroyed him.”
– Hans Bethe: “He had very much the feeling that he was giving the best to the United States in the years during the war and after the war. In my opinion, he did. But others did not agree. And in 1954, he was hauled before a tribunal and accused of being a security risk – a risk to the United States. A risk to betray secrets.”

Later, in December of 1963, attitudes softened, and, at Edward Teller’s recommendation, Oppenheimer received the prestigious Enrico Fermi Award, presented by President Johnson. As I.I. Rabi observed, however, the preceding events had, for all intents and purposes, already destroyed him. Oppenheimer was a conflicted man with a brilliant, wide-ranging intellect. While one might readily agree with Hans Bethe’s assessment that Oppenheimer felt he was “giving the best to the United States in the years during and after the war,” there is perhaps more to the story than purely patriotic motivation. Oppenheimer was a supremely competent and confident individual whose impatient nature was tinged with a palpable arrogance. These characteristics often worked to his disadvantage with adversaries and co-workers.
Then there was the suggestion that, in addition to his patriotic motives, Oppenheimer was seized by “the glitter and the power of nuclear weapons” and by the unprecedented opportunity to do physics on a grand scale at Los Alamos, and that those, too, were major motivations. Other colleagues on the project later confessed to feeling the glitter and power of nuclear weapons themselves. A brilliant man of many contradictions was Oppenheimer – that much is certain. Nearly as certain is that the man was haunted afterward by misgivings concerning his pivotal role, whatever his motivations, in letting loose the nuclear genie; the sadness in his eyes late in life practically confirms the suspicion. That is the tragedy of J. Robert Oppenheimer. Triumph has a way of extracting its penalty, its pound of flesh. I can think of no better example than Oppenheimer.

Immediately upon hearing of the bombing of Hiroshima, Hans Bethe recalled, “The first reaction which we had was one of fulfillment. Now it has been done. Now the work which we have been engaged in has contributed to the war. The second reaction, of course, was one of shock and horror. What have we done? What have we done? And the third reaction: It shouldn’t be done again.”

Nuclear Weapons: The Current State and Future Outlook

In the headlines of today’s news broadcasts as I write this is the looming threat of North Korean nuclear-tipped intercontinental ballistic missiles. The North Koreans have developed and tested nuclear warheads and are currently test-launching long-range missiles which could reach the U.S. mainland, as far east as Chicago. Likewise, Iran is close to having both nuclear weapons and targetable intermediate-range missiles. Nuclear proliferation is alive and well on this earth.

To illustrate the present situation, consider one staple of the U.S. nuclear arsenal – the one-megaton thermonuclear, or hydrogen, bomb, with the explosive equivalent of just over one million tons of TNT. That explosive energy is roughly forty-five times that of the plutonium fission bomb which destroyed the city of Nagasaki, Japan (twenty-two thousand tons of TNT). The number of such powerful weapons in today’s U.S. and Russian nuclear stockpiles is truly staggering, especially when one considers that a single one-megaton weapon could essentially flatten and incinerate the core of Manhattan, New York. Such a threat is no longer limited to a device dropped from an aircraft. Nuclear-tipped ICBMs present an even more ominous threat.
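The scale comparison above is simple arithmetic, again assuming the conventional figure of 4.184 gigajoules per ton of TNT:

```python
# Compare a one-megaton hydrogen bomb with the ~22-kiloton Nagasaki bomb
TNT_JOULES_PER_TON = 4.184e9        # conventional definition of a "ton of TNT"

megaton_yield_tons = 1_000_000      # one-megaton thermonuclear weapon
nagasaki_yield_tons = 22_000        # "Fat Man," in tons of TNT

ratio = megaton_yield_tons / nagasaki_yield_tons
energy_joules = megaton_yield_tons * TNT_JOULES_PER_TON

print(f"A one-megaton bomb has {ratio:.0f} times the Nagasaki yield")
print(f"One megaton is about {energy_joules:.2e} joules")
```

The ratio works out to about forty-five; the commonly quoted "fifty times" rounds this figure upward.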

The surprise success of the first Russian earth-orbiting satellite, “Sputnik,” in 1957 had far more significance than the loss of prestige in space for the United States. Accordingly, the second monumental and historic U.S. government program – on the very heels of the Manhattan Project – was heralded by the creation of NASA in 1958 and its role in the race to the moon. President John F. Kennedy issued his audacious challenge in 1961 for NASA to regain lost technical ground in rocketry by being first to put a man on the moon …in the decade of the sixties – in barely eight years! Many in the technical community thought the challenge was simply “nuts” given the state of U.S. rocket technology in 1961. As with the then very recent, incredibly difficult and urgent program to build an atomic bomb, the nation once again accomplished the near-impossible by landing Armstrong and Aldrin on the moon on July 20, 1969 – well ahead of the Russians. And it was important that we surpassed Russia in rocket technology, for our ICBMs, the key delivery vehicles for nuclear weapons and thus crucial to U.S. strategic defense, were born of this country’s efforts in space.

“Fat Man,” the bomb used on Nagasaki – 22 kilotons of TNT

Photo: Paul Shambroom

B83 1 megaton hydrogen bombs…compact and deadly

The above picture of a man casually sweeping the warehouse floor in front of nearly ten megatons of explosive, destructive power – enough to level the ten largest cities in America – gives one pause to reflect. On our visit to Los Alamos in 2003, I recall the uneasy emotions I felt merely standing next to a dummy casing of this bomb in the visitor’s center and reflecting on the awesome power of the “live” device. Setting aside their huge development and high “delivery” costs, such bombs are, in fact, very “cheap” weapons from a military point of view.

One conclusion: Unlike the man with the broom in the above picture, we must never casually accept the presence of these weapons in our midst. One mistake, one miscalculation, and nuclear Armageddon may be upon us. The collective angels of man’s better nature had better soon decide on a way to render such weapons unnecessary on this planet. Albert Einstein expressed the situation elegantly and succinctly:

“The unleashing of [the] power of the atom has changed everything but our modes of thinking and thus we drift toward unparalleled catastrophes.”

Under a brilliant New Mexico sky on October 16, 1945, the residents of the Los Alamos mesa gathered for a ceremony on J. Robert Oppenheimer’s last day as director of the laboratory. The occasion: The receipt of a certificate of appreciation from the Secretary of War honoring the contributions of Oppenheimer and Los Alamos.

In his remarks, Oppenheimer stated: “It is our hope that in years to come we may look at this scroll, and all that it signifies, with pride. Today, that pride must be tempered with a profound concern. If atomic bombs are to be added as new weapons to the arsenals of a warring world, or to the arsenals of nations preparing for war, then the time will come when mankind will curse the names of Los Alamos and Hiroshima. The peoples of the world must unite, or they will perish.”

In today’s world, each step along the path of nuclear proliferation brings humanity ever closer to the ultimate fear shared by J. Robert Oppenheimer and Albert Einstein. The world had best heed their warnings.

Voices from My Past: Heard Through a Blog Post!

It is amazing how small this world has become thanks to technology and the reach of social media and blogs. My posts are viewed more than a thousand times each month, including a sizeable percentage of views from outside the United States. Two months ago, a Midwest reader responded to one of my earlier posts with the comment: “I believe we are related!” Inasmuch as I had long ago (in 1948, at age eight) moved with my family from Chicago, Illinois to California, I was surprised and intrigued.

It so happens that Mary is a “lost” second cousin of mine originally from Chicago whose Grandfather Elmer was my Uncle Elmer – the older brother of my dad. Here is Elmer standing in front of his father’s radio repair shop on Diversey Avenue in Chicago, sometime in the early nineteen fifties. His dad was also named Elmer, and he was my paternal grandfather.

It is my grandfather and his tiny radio repair shop, mentioned in that post of mine, which caught second cousin Mary’s eye. The last portion of the post contains a picture of my grandparents (Mary’s great-grandparents) standing behind the counter of their little shop in Chicago (circa 1947) – the only photo of its kind in the entire family, apparently.

Inasmuch as I grew up only a mile or two from my grandparents and their “mom & pop” store with living quarters in the back, I quite vividly recall that shop and have often wished there were another picture of it and them… somewhere. Mary fortunately was able to provide the first photo, used here, showing the exterior of the shop which no longer exists. I well recall the red/orange neon sign in the window announcing: “Radio Service.” My memory bell “rang” at first glance.

On a 2004 vacation trip to Chicago, my wife and I returned to the scenes of my boyhood. I was amazed to find that most everything was still there, including our old brick apartment building, all looking just as recalled some 56 years later. Sadly, the building which housed the little radio repair shop at 6755 Diversey Ave. had long ago been cleared away for a large banquet hall/restaurant which today covers much of the block. I had really hoped to find that little storefront, the seat of so much of our family’s history…and my boyhood consciousness.

Soon after “finding” second cousin Mary, I met her cousin Linda via e-mail. We have begun to fill in a number of blanks in the Kubitz family history by exchanging recollections and pictures. Interestingly, neither Mary nor Linda was at all sure about the history, or even the existence, of my grandparents’ radio repair shop on Diversey Ave. I, on the other hand, had no knowledge at all of their grandfather Elmer’s (pictured in the first photo) later radio repair shop on Belmont Ave. in Chicago. And so begins an interesting quest to learn more about the family history!

I am glad that second cousin Mary “discovered” me and my blog and took the time to verify the family connection. As so often happens, family history gets lost as time and distance take their inevitable toll. For me, leaving Chicago in 1948, when United Air Lines transferred my father, meant severing close ties with my grandparents, aunts, uncles, and cousins. There were no overt reasons why that should have happened as completely as it did. In a family of five kids, as my father’s was, family dynamics are always part of the equation, but, mainly, time and distance did their work. The daily scramble for a better life takes time and attention away from extended-family solidarity. That was especially true back then, when Chicago seemed so far away from San Francisco, California.

Thank goodness I was old enough to have collected indelible images and impressions of my close relatives before leaving them. I have always remained curious about them and sad that I never really got to know them as well as I would have liked.


“Out of Africa” / “The End of the Game”

There are few things that sadden me more than the inevitable fate of Africa’s wildlife at the hands of “civilization.” My message, here, touches on that theme while offering a broad-brush picture of Africa, past and present.

“I had a farm in Africa, at the foot of the Ngong Hills,” opens the richly written and highly acclaimed memoir, Out of Africa, by Karen Blixen (pen name: Isak Dinesen). In a curious chain of circumstances, my life-long fascination with Africa and its wildlife has recently been rekindled by her story and by other recent events. Many will recall the magnificent 1985 film, Out of Africa, which starred a young Meryl Streep and Robert Redford and was directed by Sydney Pollack. The story unfolds against the sweeping panorama that was East Africa and Kenya in the early days of its capital, Nairobi.

It was in Nairobi that many Brits and other Europeans opened up new colonial frontiers for the British Empire. People came for a multitude of reasons: land grants from the government, pure adventure, the opportunity to start a new life in a new place. Whatever the root motivation, those who came and risked the hardships of raw Africa in the early twentieth century were either fools or hardy adventurers, determined and pre-destined to succeed there.

One of those who could and did meet Africa’s challenge was the Scotsman John Alexander Hunter, who quickly became the most celebrated “white hunter,” and later game warden, in Africa’s history. Hunter arrived in Nairobi from Scotland in 1908 seeking adventure and a livelihood. Indeed, there was a need for “animal control” during the early days, when native Africans eked out an existence among the then-teeming wildlife that surrounded them. Men like Hunter dealt with marauding elephants and rhinos that destroyed the crops of the Masai and Kikuyu farmers. And there were the occasional man-eaters, lions which had tasted human flesh and found the taking too easy. In the early days of plentiful game, white hunters paid their bills by guiding hunting safaris of the rich and privileged. Few people had the foresight in the 1920s and ’30s to envision the dire wildlife situation which exists today in Africa. J.A. Hunter saw it coming as he later turned to work as a game warden and “gun safety” for strictly photographic safaris. It was J.A. Hunter’s iconic little book, Hunter, that first triggered my personal African odyssey more than fifty years ago – a story I related in an earlier blog post about J.A. Hunter.


One of Nairobi’s early settlers not destined to make East Africa her permanent home was Karen Blixen, who published Out of Africa in 1937, six years after returning to her native Denmark. Her memoir of the years 1913 to 1931, spent on her coffee farm at the foot of the Ngong Hills a dozen or so miles from Nairobi, creates a brilliant collage of Africa, Kenya’s bountiful wildlife, and the inscrutable native Africans who served her and her homestead. She related well to the Kikuyu and Somali Africans who worked her farm and household, even forming close bonds with several of those who served inside her home. Nonetheless, she appreciated her ultimate limits in that regard, as she noted: “On our safaris and on the farm, my acquaintance with the Natives developed into a settled and personal relationship. We were good friends. I reconciled myself to the fact that while I should never quite know or understand them, they knew me through and through, and were conscious of the decisions I was going to take, before I was certain about them, myself.”

Blixen’s eloquent depictions serve not only as a personal memoir of a bigger-than-life true story, but as a brilliant tapestry of early colonial East Africa, the traditions of safari, Africa’s teeming wildlife, and the challenging surroundings which engulfed the author. Ernest Hemingway, no stranger to literary honors, thought so highly of Out of Africa that he proposed Blixen as a worthy contender for the Nobel Prize in literature.

There is another prophetic book about Africa which, despite my early naivete concerning the subject, I was prescient enough to purchase in 1965, the year of its initial publication: The End of the Game, by Peter Beard. It has become something of a cult title, and despite the author’s unorthodox style, the book was prophetic about the end of “the game” in Africa. While expressing grave pessimism over the fate of Africa’s game animals, the book’s larger thrust is a poke at “the game” as played in Nairobi and elsewhere by the early, privileged white “invaders” from colonial Britain and Europe as they went about executing new “land grants,” confiscating land from the resident natives to build their empires. Does that sound familiar to students of the American West and its history? The pages of the book contain many glimpses of the early settlers in and around Kenya, including Baroness (Karen) Blixen and her lover, the storied, Oxford-educated white hunter Denys Finch Hatton. Blixen and Finch Hatton came from aristocratic, wealthy backgrounds in Denmark and England, respectively. Like so many others of privilege who comprised the early colonial settlements in Kenya, they seemed by background unfit and unprepared to deal, long-term, with the demands of African existence. Blixen and her Danish nobleman husband at the time, Bror Blixen, made the ill-informed decision to plant coffee on their farm, ignoring the fact that the land was too elevated for favorable results. That decision, and a later, devastating fire which destroyed the farm’s coffee-processing barn, doomed Karen Blixen’s prospects in Africa.

On the other hand, men like J.A. Hunter and H.K. Binks arrived in Africa already equipped with a steely inner-core and flexible attitudes, tempered by the experiences of a well-grounded, challenging early life at “home.”

H.K. “Pop” Binks came from Yorkshire and arrived in Nairobi in 1900, making him one of the town’s earliest white settlers. He lived very modestly in Nairobi with his wife “Binkie” until death claimed him in 1971. During his lifetime in Nairobi, Binks plied numerous trades, including local photographer, astronomer, and author. He exhibited the attitudes and adaptability that life in Africa demanded. I have read first-hand accounts concerning the initiative and resilience of Mr. Binks, and at least one educated voice who knew him personally claimed him to be “the most interesting person I ever knew.” In his book, African Rainbow, Binks stated: “I have been lonelier in the crowded streets of a city than in the great open spaces of Africa, with all wild things for companions.”

In 1965, I received a personal note from Mr. Binks in response to a letter I wrote to him at his Nairobi address. I had asked if he had any reminiscences of old Nairobi he could share with me – strictly because of my interest in East Africa and early settlers like him. He very kindly answered my “out of the blue” letter but explained he was already involved with a “home” publisher (perhaps his book, African Rainbow) and could not comply. He wished me luck in my search.

Needless to say, I was pleased and grateful that he cared enough to reply to me. I have kept that folded little note from Binks tucked in one of my J.A. Hunter books for over fifty years, now – a prized connection to the East Africa that once was.

Over time, Africa methodically weeded out its unfit would-be residents just as it has always done within its animal populations. Changing conditions hastened the demise of the colonials who enjoyed a privileged existence in old Africa. In a similar vein, evolving world and local conditions appear also to foretell the virtual demise of Africa’s crown jewel, its diverse animal populations, roaming free and wild.

Today, population pressure from within Africa threatens its wildlife like never before. Whereas J.A. Hunter was occasionally called upon to kill a marauding elephant or rhino intent on invading a local native village, today the local human populations expand inexorably outward occupying and fencing vast stretches of what were once grazing lands where animals roamed free. And today we must deal with organized poachers who continue to cull the finest wildlife specimens from the remaining small numbers still “protected” in game preserves.

Saving Africa’s wildlife can succeed only with world-wide support. Much lip service is paid, but a comprehensive, long-range plan and adequate moneys appear wanting.

The problems involved in protecting Africa’s wildlife are very challenging, yet, in the face of halting progress to date, it seems to me that man is a failed species, himself, if he cannot prevent the decimation of Africa’s (and nature’s) crowning glory. I venture to say, in that case, man will ultimately prove incapable of saving himself from himself. If so, perhaps we deserve no better fate as a species.

Sir Isaac Newton: “I Can Calculate the Motions of the Planets, but I Cannot Calculate the Madness of Men”

Isaac Newton, the most incisive mind in the history of science, reportedly uttered that sentiment about human nature. Why would he express such negativity about his fellow humans? Newton’s scientific greatness stemmed from his ability to see well beyond human horizons. His brilliance was amply demonstrated in his great book, Philosophiae Naturalis Principia Mathematica, in which he logically constructed his “system of the world” using mathematics. The book’s title translates from Latin as Mathematical Principles of Natural Philosophy, often shortened to “the Principia” for convenience.

The Principia is the greatest scientific book ever published. Its enduring fame reflects Newton’s ground-breaking application of mathematics, including aspects of his then-fledgling calculus, to the seemingly insurmountable difficulties of explaining motion physics. An overwhelming challenge for the best mathematicians and “natural philosophers” (scientists) in the year 1684 was to demonstrate mathematically that the planets in our solar system should revolve around the sun in elliptically shaped orbits as opposed to circles or some other geometric path. The fact that they do move in elliptical paths was carefully observed by Johannes Kepler and noted in his 1609 masterwork, Astronomia Nova.

In 1687, Newton’s Principia was published after three intense years of effort by the young, relatively unknown Cambridge professor of mathematics. Using mathematics and his revolutionary new concept of universal gravitation, Newton provided precise justification of Kepler’s laws of planetary motion in the Principia. In the process, he revolutionized motion physics and our understanding of how and why bodies of mass, big and small (planets, cannonballs, etc.), move the way they do. Newton did, indeed, as he stated, show us in the Principia how to calculate the motion of heavenly bodies.

In his personal relationships, Newton found dealing with people and human nature to be even more challenging than the formidable problems of motion physics. As one might suspect, Newton did not easily tolerate fools and pretenders in the fields of science and mathematics – “little smatterers in mathematicks,” he called them. Nor did he tolerate much of basic human nature and its shortcomings.

 In the Year 1720, Newton Came Face-to-Face with
His Own Human Vulnerability… in the “Stock Market!”

 In 1720, Newton’s own human fallibility was clearly laid bare as he invested foolishly and lost a small fortune in one of investing’s all-time market collapses. Within our own recent history, we have suffered through the stock market crash of 1929 and the housing market bubble of 2008/2009. In these more recent “adventures,” society and government allowed human nature and its propensity for greed to over-inflate Wall Street to a ridiculous extent, so much so that a collapse was quite inevitable to any sensible person…and still it continued.

Have you ever heard of the great South Sea Bubble in England? Investing in the South Sea Trading Company – a government-sponsored joint-stock venture out of London – became a favorite pastime of influential Londoners in the early eighteenth century. Can you guess who found himself caught up in the glitter of potential investment returns only to end up losing a very large sum? Yes, Isaac Newton was that individual, along with thousands of others.

It was this experience that occasioned the remark about his own inability to calculate the madness of men (including himself)!

Indeed, he should have known better than to re-enter the government-sponsored South Sea enterprise after initially making a tidy profit from an earlier investment in the stock. As can be seen from the graph below, Newton re-invested (with a lot!) in the South Sea offering for the second time as the bubble neared its peak and just prior to its complete collapse. Newton lost 20,000 English pounds (three million dollars in today’s valuations) when the bubble suddenly burst.

Clearly, Newton’s comment, which is the theme of this post, reflects his view that human nature is vulnerable to fits of emotion (like greed, envy, ambition) which in turn provoke foolish, illogical behaviors. When Newton looked in the mirror after his ill-advised financial misadventure, he saw staring back at him the very madness of men which he then proceeded to rail against! Knowing Newton through the many accounts of his life that I have studied, I can well imagine that his financial fiasco must have been a very tough pill for him to swallow.

Many are the times in his life that Newton “railed” to vent his anger against something or someone; his comment concerning the “madness of men” is typical of his outbursts. Certainly, he could disapprove of his fellow man for fueling such an obvious investment bubble. In the end, and most painful for him, was his realization that he had paid a stiff price for foolishly ignoring the bloody obvious. For anyone who has risked and lost on Wall Street, the mix of feelings is well understood. Even the great Newton had his human vulnerabilities – in spades, and greed was one of them. One might suspect that Newton, the absorbed scientist, was merely naïve when it came to money matters.

That would be a very erroneous assumption. Sir Isaac Newton held the top-level government position of Master of the Mint in England, during those later years of his scientific retirement – in charge of the entire coinage of the realm!


For more on Isaac Newton and the birth of the Principia click on the link: https://reasonandreflection.wordpress.com/2013/10/27/the-most-important-scientific-book-ever-written-conceived-in-a-london-coffee-house/

Is Life Becoming Too Complex? The Devil Is in the Details….! Can We Keep Up?

Details matter in this life, and they demand our attention – increasingly so. It is becoming impossible to live under illusions such as, “Details are confined mainly to the realm of specialists, like the computer programmer and the watchmaker.” The need for “attention to detail” on the part of everyman has never been greater.


I’ve been around for a while, now – over seventy-six years. Given all those years and, with the detached attitude of an impartial observer, I have reached some general conclusions regarding technology, time, and our quality of life, today.

Conclusion #1:
The opportunity for living a comfortable, meaningful, and rewarding life has never been greater – especially in this United States of America. We have so many choices today in this society, for better or for worse.

Conclusion #2:
The veracity of conclusion #1 is due to the positive influence of science and technology on our lives. Today’s information age has delivered the world, indeed, the universe (and Amazon, too) to our desktops and living rooms.


It is true that computers and the internet are virtually indispensable, now.  However, the tools and the technology of the scientific/information age change continually, at an ever more rapid pace. Can we humans continue to keep pace with it all without making painful choices and sacrifices in our lives? Have computer problems ever driven you nuts? Do we have too many choices and opportunities now, thanks to the internet and stores like Walmart? How often have you shopped for something specific in the supermarket or on Amazon and been bewildered by the blizzard of choices which accost you thanks to high-tech marketing? Even choosing a hair shampoo poses a challenge for today’s shopper.

Conclusion #3:
Scientific knowledge and the rapid technological progress it spawns have become, universally, a 50/50 proposition for the human race. The reality suggests that for every positive gain in our lives brought about by our growing technology base, there is, unrelentingly, a negative factor to be overcome as well – a price to be paid. There is virtually a one-to-one correspondence at play – seemingly like an unspoken law of nature which always holds sway – much like the influence of gravitational attraction! In familiar parlance, “There is no free lunch in life: Rather, a price to be paid for everything!”

The best example possible of this contention? Consider Einstein’s revelation in 1905 that mass and energy are interchangeable: e=mc2. This, the most famous equation in science, opened not only new frontiers in physics, but also the possibility of tremendous industrial power – at minimal cost. On the negative side, along with nuclear power plants, we now have nuclear weapons capable, in one day, of essentially ending life on this planet – thanks to that same simple equation. As for usable, nuclear-generated power, the potential price for such energy has been dramatically demonstrated in several notable cases around the globe over recent decades.
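To make the “tremendous power – at minimal cost” claim concrete, here is a back-of-the-envelope sketch in Python. The figures are illustrative only: c is rounded to 300 million meters per second as in the text, and the TNT-equivalence constant is the standard convention.

```python
# Back-of-the-envelope check of e = mc^2: the energy locked in a
# single gram of matter, expressed in tons of TNT equivalent.
# Illustrative figures only; c is rounded as in the text.

m = 0.001                  # mass in kilograms (one gram)
c = 3.0e8                  # speed of light, meters per second
e = m * c ** 2             # energy in joules

TNT_TON_JOULES = 4.184e9   # conventional energy of one ton of TNT

print(f"Energy in one gram of mass: {e:.2e} joules")
print(f"TNT equivalent: {e / TNT_TON_JOULES:,.0f} tons")
```

One gram of mass is equivalent to roughly 90 trillion joules, or on the order of 20,000 tons of TNT – which is why “a tiny bit of mass converted into its energy equivalent yields a phenomenal amount of energy.”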

Need another example? How about the information technology which enables those handy credit cards which make purchasing “goodies” so quick and easy? On the negative side, how about the punishing cost of credit for account balances not promptly paid? More disturbing is the fact that such technology in the hands of internet criminals makes one’s private financial information so vulnerable, today. I found out the hard way, recently, that just replacing your hacked credit card with a new one does not necessarily end your problems with unauthorized charges! The price in real money paid by society for foiling technology-savvy ne’er-do-wells is huge – in the billions of dollars every year.

Conclusion #4
Society, today, seems to discount the wisdom inherent in the old, familiar phrase, “The devil is in the details!” We are easily enticed by the lure of “user-friendly” computers and devices, and indeed, most are generally well-designed to be just that – considering what they can do for us. But today’s scientists and engineers fully understand the profundity of that “devil is in the details” contention as they burrow deeper and deeper into nature’s secrets. The lawyer and the businessman fully understand that message, given the importance of carefully reading “the fine print” embedded in today’s legal documents and agreements. How many of us take (or can even afford) the time to read all the paperwork/legalese which accompanies the purchase of a new automobile or a house! Increasingly, we seem unable/unwilling to keep up with the burgeoning demands imposed by the exponential growth of detail in our lives, and that is not a healthy trend.

I am convinced and concerned that many of us are in way over our heads when it comes to dealing with the more sophisticated aspects of today’s personal computers, and these systems are becoming increasingly necessary for families and seniors merely trying to get by in today’s internet world. Even those of us with engineering/computer backgrounds have our hands full keeping up with the latest developments and devices: I can personally attest to that! The devil IS in the details, and the details involved in computer science are growing exponentially. Despite the frequently quoted phrase “user-friendly interface,” I can assure you that the complexity lurking just below that user-friendly, top onion-skin layer of your computer or iPhone is vast, indeed, and that is why life gets sticky and help-entities like the Geek Squad will never lack for stymied customers.

Make no mistake: It is not merely a question of “Can we handle the specific complexities of operating/maintaining our personal computers?” Rather, the real question is, “Can we handle all the complexities/choices which the vast capabilities of the computer/internet age have spawned?”  

Remember those “user manuals?” Given the rapid technological progress of recent decades, the growth of choice and complexity is easily reflected in the growing size of user manuals, those how-to instructions for operating our new autos, ovens, cooktops, washing machines, and, now, phones and computers. Note: The “manuals” for phones and computers are now so complex that printed versions cannot possibly come with these products. Ironically, there are virtually no instructions “in the box.” Rather, many hundreds of megabytes of data now generate the dozens of on-line screens which demonstrate the devices’ intricacies. These software “manuals” necessarily accommodate the bulk and the constantly changing nature of the product itself. Long gone are the old “plug it in and press this button to turn it on” product advisories. More “helpful” product options result in significantly more complexity! Also gone are the “take it in for repair” days. My grandfather ran a radio repair shop in Chicago seventy years ago. Today, it is much cheaper and infinitely more feasible to replace rather than repair anything electronic.

An appropriate phrase to describe today’s burgeoning technologies is “exponential complexity.” What does that really mean and what does it tell us about our future ability to deal with the coming “advantages” of technology which will rain down upon us? I can illustrate what I mean.

Let us suppose that over my seventy-six years, the complexity of living in our society has increased by 5% per year – a modest assumption given the rapid technological gains in recent decades. Using a very simple “exponential” math calculation, at that rate, life for me today is over 40 times more complex than it was for my parents the day I was born!
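The arithmetic behind that claim is simple compound growth. A short Python sketch makes it explicit (the 5% annual rate is, of course, the assumed figure from the text, not a measurement):

```python
# Compound "complexity growth" at an assumed 5% per year over 76 years.

rate = 0.05
years = 76
factor = (1 + rate) ** years   # compound-growth multiplier

print(f"Complexity multiplier after {years} years: {factor:.1f}x")

# Rule-of-thumb cross-check: at 5% per year, the level doubles roughly
# every 72 / 5 ≈ 14 years; 76 years allows for about five doublings,
# and 2**5 = 32 is in the same ballpark as the exact result above.
```

The exact multiplier works out to just under 41, which is the “over 40 times more complex” figure quoted above.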

To summarize: Although many of the technological gains made over recent decades were intended to open new opportunities and to make life easier for us all, they have imposed upon us a very large burden in the form of the time, intelligence, and intellectual energy required to understand the technology and to use it both efficiently and wisely. Manual labor today is much minimized; the intellectual efforts required to cope with all the newest technology are, indeed, very significant and time-consuming. There is a price to be paid…for everything.

The major question: At what point does technology cease to help us as human beings and begin to subjugate us to the tyranny of its inherent, inevitable and necessary details? The realm in which the details live is also home to the devil.

The devil tempts. The burgeoning details and minutia in today’s society act to corrode our true happiness. We should be cautious lest we go too far up the technology curve and lose sight of life’s simpler pleasures… like reading a good book in a quiet place – cell phones off and out of reach. The noise and bustle of Manhattan can appear endlessly intoxicating to the visitor, but such an environment is no long-term substitute for the natural sounds and serenity of nature at her finest. The best approach to living is probably a disciplined and wisely proportioned concoction of both worlds.

The above recipe for true happiness involves judicious choices, especially when it comes to technology and all the wonderful opportunities it offers. Good choices can make a huge difference. That is the ultimate message of this post.

As I write this, I have recently made some personal choices: I am redoubling my efforts to gain a more solid grasp of Windows 10 and OS X on my Mac. Despite the cautionary message of this post regarding technology, I see this as an increasingly necessary (and interesting) challenge in today’s world. This is a choice I have made. I have, however, put activities like Facebook aside and have become much more choosy about time spent on the internet.

My parting comment and a sentiment which I hope my Grandkids will continue to heed: “So many good books; so little quality time!”

Sir Humphry Davy: Pioneer Chemist and His Invention of the Coal Miner’s “Safe Lamp” at London’s Royal Institution – 1815

Among the many examples to be cited of science serving the cause of humanity, one story stands out as exemplary. That narrative profiles a young, pioneering “professional” chemist and his invention which saved the lives of thousands of coal miners while enabling the industrial revolution in nineteenth-century England. The young man was Humphry Davy, who quickly rose to become the most famous chemist/scientist in all of England and Europe by the year 1813. His personal history and the effects of his invention on the growth of “professionalism” in science are a fascinating story.

The year was 1799, and a significant event had occurred. The place: London, England. The setting: The dawning of the industrial revolution, shortly to engulf England and most of Europe. The significant event of which I speak: The chartering of a new, pioneering entity located in the fashionable Mayfair district of London. In 1800, the Royal Institution of Great Britain began operation in a large building at 21 Albemarle Street. Its pioneering mission: To further the cause of scientific research/discovery, particularly as it serves commerce and humanity.


The original staff of the Royal Institution was tiny, headed by its founder, the notable scientist and bon vivant, Benjamin Thompson, also known as Count Rumford. Quickly, by 1802, a few key members of the founding staff, including Rumford, were gone, and the fledgling organization found itself in disarray and close to closing its doors. Just one year earlier, in 1801, two staff additions had materialized, men who were destined to make their scientific marks in physics and chemistry while righting the floundering ship of the R.I. by virtue of their brilliance – Thomas Young and the object of this post, a young, relatively unknown, pioneering chemist from Penzance/Cornwall, Humphry Davy.

By the year 1800, the industrial revolution was gaining momentum in England and Europe. Science and commerce had already begun to harness the forces of nature required to drive industrial progress rapidly forward. James Watt had perfected the steam engine, whose motive horsepower was now bridled and serving the cause by the year 1800. The looming industrial electrical age was to dawn two decades later, spearheaded by Michael Faraday, the most illustrious staff member of the Royal Institution, ever, and one of the greatest physicists in the history of science.

In the most unlikely of scenarios at the Royal Institution, Humphry Davy interviewed and hired the very young Faraday as a lab assistant (essentially lab “gofer”) in 1813. By that time, Davy’s star had risen as the premier chemist in England and Europe; little did he know that the young Faraday, who had less than a grade-school education and who worked previously as a bookbinder, would, in twenty short years, ascend to the pinnacle of physics and chemistry and proceed to father the industrial electrical age. The brightness of Faraday’s scientific star soon eclipsed even that of Davy’s, his illustrious benefactor and supervisor.

For more on that story click on this link to my previous post on Michael Faraday: https://reasonandreflection.wordpress.com/2013/08/04/the-electrical-age-born-at-this-place-and-fathered-by-this-great-man/

Wanted: Ever More Coal from England’s Mines 
at the Expense of Thousands Lost in Mine Explosions

Within two short years of obtaining his position at the Royal Institution in 1813, young Faraday found himself working with his idol/mentor Davy on an urgent research project – a chemical examination of the properties of methane gas, or “fire damp,” as it was known by the “colliers,” or coal miners.

The need for increasing amounts of coal to fuel the burgeoning boilers and machinery of the industrial revolution had forced miners deeper and deeper underground in search of rich coal veins. Along with the coal they sought far below the surface, the miners encountered larger pockets of methane gas which, when exposed to the open flame of their miner’s lamp, resulted in a growing series of larger and more deadly mine explosions. The situation escalated to a national crisis in England and resulted in numerous appeals for help from the colliers and from national figures.

By 1815, Humphry Davy at the Royal Institution had received several petitions for help, one of which came from a Reverend Dr. Gray from Sunderland, England, who served as a spokesman/activist for the colliers of that region.

Davy and the Miner’s Safe Lamp:
Science Serving the “Cause of Humanity”

Working feverishly from August and into October, 1815, Davy and Faraday produced what was to become known as the “miner’s safe lamp,” an open flame lamp designed not to explode the pockets of methane gas found deep underground. The first announcement of Davy’s progress and success in his work came in this historic letter to the Reverend Gray dated October 30, 1815.


The announcement heralds one of the earliest, concrete examples of chemistry (and science) put to work to provide a better life for humanity.

Royal Institution
Albermarle St.
Oct 30

 My Dear Sir

                               As it was in consequence of your invitation that I endeavored to investigate the nature of the fire damp I owe to you the first notice of the progress of my experiments.

 My results have been successful far beyond my expectations. I shall inclose a little sketch of my views on the subject & I hope in a few days to be able to send a paper with the apparatus for the Committee.

 I trust the safe lamp will answer all the objects of the collier.

 I consider this at present as a private communication. I wish you to examine the lamps I had constructed before you give any account of my labours to the committee. I have never received so much pleasure from the results of my chemical labours, for I trust the cause of humanity will gain something by it. I beg of you to present my best respects to Mrs. Gray & to remember me to your son.

 I am my dear Sir with many thanks for your hospitality & kindness when I was at Sunderland.


                                                                             H. Davy

This letter is clearly Davy’s initial announcement of a scientifically-based invention which ultimately had a pronounced real and symbolic effect on the nascent idea of “better living through chemistry” – a phrase I recall from early television ads run by a large industrial company like DuPont or Monsanto.


In 1818, Davy published his book on the urgent, but thorough scientific researches he and Faraday conducted in 1815 on the nature of the fire damp (methane gas) and its flammability.


Davy’s coal miner’s safety lamp was the subject of papers presented by Davy before the Royal Society of London in 1816. The Royal Society was, for centuries since its founding by King Charles II in 1662, the foremost scientific body in the world. Sir Isaac Newton, the greatest scientific mind in history, presided as its president from 1703 until his death in 1727. The Society’s presence and considerable influence are still felt today, long afterward.

Davy’s safe lamp had an immediate effect on mine explosions and miner safety, although there were problems which required refinements to the design. The first models featured a wire gauze cylinder surrounding the flame chamber which affected the temperature of the air/methane mixture in the vicinity of the flame. This approach took advantage of the flammability characteristics of methane gas which had been studied so carefully by Davy and his recently hired assistant, Michael Faraday. Ultimately, the principles of the Davy lamp were refined sufficiently to allow the deep-shaft mining of coal to continue in relative safety, literally fueling the industrial revolution.

Humphry Davy was a most unusual individual, as much poet and philosopher in addition to his considerable talents as a scientist. He was close friends with and a kindred spirit to the poets Coleridge, Southey, and Wordsworth. He relished rhetorical flourish and exhibited a personal idealism in his earlier years, a trait on open display in the letter to the Reverend Gray, shown above, regarding his initial success with the miner’s safe lamp.

“I have never received so much pleasure from the results of my chemical labours, for I trust the cause of humanity will gain something by it.”

As proof of the sincerity of this sentiment, Davy refused to patent his valuable contribution to the safety of thousands of coal miners!

Davy has many scientific “firsts” to his credit:

-Experimented with the physiological effects of the gas nitrous oxide (commonly known as “laughing gas”) and first proposed it as a possible medical/dental anesthetic – which it indeed became decades later.

-Pioneered the new science of electrochemistry using the largest voltaic pile (battery) in the world, constructed for Davy in the basement of the R.I. Alessandro Volta first demonstrated the principles of the electric pile in 1800, and within two years, Davy was using his pile to perfect electrolysis techniques for separating and identifying “new” fundamental elements from common chemical compounds.

-Separated/identified the elements potassium and sodium in 1807, soon followed by others such as calcium and magnesium.

-In his famous, award-winning Bakerian Lecture of 1806, On Some Chemical Agencies of Electricity, Davy shed light on the entire question concerning the constituents of matter and their chemical properties.

-Demonstrated the “first electric light” in the form of an electric arc-lamp which gave off brilliant light.

-Wrote several books including Elements of Chemical Philosophy in 1812.

In addition to his pioneering scientific work, Davy’s heritage still resonates today for other, more general reasons:

-He pioneered the notion of “professional scientist,” working, as he did, as paid staff in one of the world’s first organized/chartered bodies for the promulgation of science and technology, the Royal Institution of Great Britain.

-As previously noted, Davy is properly regarded as the savior of the Royal Institution. Without him, its doors surely would have closed after only two years. His public lectures in the Institution’s lecture theatre quickly became THE rage of established society in and around London. Davy’s charismatic and informative presentations brought the excitement of the “new sciences” like chemistry and electricity front and center to both ladies and gentlemen. Ladies were notably and fashionably present at his lectures, swept up by Davy’s personal charisma and seduced by the thrill of their newly acquired knowledge… and enlightenment!


The famous 1802 engraving/cartoon by satirist/cartoonist James Gillray
Scientific Researches!….New Discoveries on Pneumaticks!…or…An
Experimental Lecture on the Power of Air!

This very famous hand-colored engraving from 1802 satirically portrays an early public demonstration in the lecture hall of the Royal Institution of the powers of the gas, nitrous oxide (laughing gas). Humphry Davy is shown manning the gas-filled bellows! Note the well-heeled gentry in the audience including many ladies of London. Davy’s scientific reputation led to his eventual English title of Baronet and the honor of Knighthood, thus making him Sir Humphry Davy.

The lecture tradition at the R.I. was begun by Davy in 1801 and continued on for many years thereafter by the young, uneducated man hired by Davy himself in 1813 as lab assistant. Michael Faraday was to become, in only eight short years, the long-tenured shining star of the Royal Institution and a physicist whose contributions to science surpassed those of Davy and were but one rank below the legacies of Galileo, Newton, Einstein, and Maxwell. Faraday’s lectures at the R.I. were brilliantly conceived and presented – a must for young scientific minds, both professional and public – and the Royal Institution in London remained a focal point of science for more than three decades under Faraday’s reign, there.


The charter and by-laws of the R.I. published in 1800 and an admission ticket to Michael Faraday’s R.I. lecture on electricity written and signed by him: “Miss Miles or a friend / May 1833”

Although once again facing economic hard times, the Royal Institution exists today – in the same original quarters at 21 Albemarle Street. Its fabulous legacy of promulgating science for over 217 years would not exist were it not for Humphry Davy and Michael Faraday. It was Davy himself who ultimately offered that the greatest of all his discoveries was …Michael Faraday.

Post-Election Re-Post: The Best Government Money Can Buy? Follow the Money!

I believe this is a good time to re-post a much earlier piece I wrote for this blog concerning the greatest threat to this country, the United States of America. Aside from the potential world-wide proliferation of nuclear weapons, the greatest concern comes from within. The problem of money in government is manifest across both sides of the political spectrum. Both President-Elect Donald Trump and Democrat Bernie Sanders warned of the danger during the recent campaigns. It is my contention that all Americans, regardless of political affiliation, should be concerned. Dealing with this problem represents a great opportunity for the incoming administration. My earlier post (which is repeated here in its entirety) can be reached by clicking on the following link:


The Best Government Money Can Buy? Follow the Money!

For a very long time now, I have asked myself, “What is the biggest threat to the United States of America and our way of life?” From the beginning the answer seemed very clear, at least to my way of thinking, and the answer remains the same throughout the years that have passed since I first posed the question.

The Money

We already have the best government that money can buy, and the situation grows worse, particularly in Washington.

This has been the case under numerous administrations, and the situation knows no political party lines. Rather, it seems that such a dilemma is inevitable, somewhat akin to a perpetual law of nature – human nature. History has shown that while technology continues its explosive, exponential growth, human nature changes little as generations come and go; the problems inherent in a people attempting to wisely govern themselves stem from the biologically hard-wired nature of our mental make-ups. Self-preservation, self-interest, and just plain greed are always obvious and abundant in any society. Acting in one’s self-interest can be excused to a certain degree, but what is the proper label for a situation where wealthy and powerful factions “legally” influence (or control?) a society’s government along the lines of their best interests, often to the disadvantage of the middle classes and to the detriment of the society as a whole? Let’s face it, such a situation transcends the label of “political influence”; it is more properly called “corruption.”

From the middle-class vantage point here in the trenches, it seems that Wall Street and corporate interests, along with labor unions, have had a significant effect on the legislation which affects us all – often to our overall detriment. This happens when highly paid lobbyists representing those interests bestow “legal” favors and political campaign contributions on candidates running for political office, candidates who are desperate to be elected or who face a tough re-election campaign. Why do lobbyists do this? The best answer comes from that well-worn advice which is so often relevant: “Follow the money!”

The U.S. Tax Code

Take the U.S. tax code, for instance – please! We are reminded every April how ridiculously complex it is, yet it need not be. Why not simplify it, then? It will never happen under current conditions, because moneyed interests will always be on the backs of legislative committees to ensure loopholes and myriad exemptions favorable to their particular business or interest. It is precisely the ongoing tinkering – no, make that meddling – by influence-peddlers acting through Congress which results in a ridiculously complex tax code. In fact, many of the bills which emerge from Congress are unduly long and unwieldy for precisely the same reason. Why are so many corporations “incorporated” in obscure places like the Cayman Islands even though their businesses operate primarily within the U.S., Europe, and Asia? Follow the money – a more favorable tax base, of course.

Unsustainable Pension Obligations

Have you been hearing about the huge pension-obligation problems in bankrupt Detroit, in Stockton, California, and in many small communities across the country? We see only the tip of the iceberg on this one. The culprits: Labor unions and the politicians friendly to labor’s often excessive demands regarding benefits for their rank and file. Unions clearly have political and financial clout in political campaigns, and that is not without its long-term financial consequences – as we now understand. And then there are those political incumbents facing no imminent election challenges who just do not wish to deal with labor unrest during their tenure, so they readily cave to excessive union demands, “kicking the can down the road” and into the next person’s term of office. And the beat goes on.

A Second, Related Concern: Public Complacency

Nature has endowed humans not only with certain inalienable rights, but also with certain biological “defense mechanisms” – one of which is the tendency to put problems away for another day as long as a crisis is not imminent. This helps prevent ulcers, I suppose! We Americans, it seems, have been “kicking the can down the road” for some time now, while lobbyists have increasingly diverted governments, especially our Congress, away from truly representing “the people.” Politicians are too often focused on satisfying the wealthy and powerful who grant numerous “legal” favors to them and their office.

And, by the way, did you know that currently something like 42% of legislators who leave Congress become paid lobbyists in Washington? I wonder why they do that. Again, follow the money! I believe that percentage was less than 12% just a few decades ago. Why did the Roman Empire collapse after centuries of world dominance? The experts tell us that corruption and public complacency were the primary causes. Does anything ever really change?

But It’s All Legal!

Some would claim that the lobbying industry operates perfectly legally, within legitimate guidelines regulating such things as political campaign contributions. The rest of us would point out that many of the laws and regulatory guidelines established by Congress and interpreted by the courts have long been unduly influenced by powerful interests. This calls to mind the old adage of “the fox guarding the henhouse.”

At what point do laws which clearly benefit the wealthy and powerful to the detriment of the common citizen and the overall good of the country become recognized as symbols and agents of corruption? I believe “bribery” is another way of expressing the current situation.

Abraham Lincoln’s “Take” on America


I recall, from many years ago, the moving, talking automaton (robot) of Lincoln at Anaheim’s Disneyland. At that time, such computer-controlled realism was quite a new thing. The convincing figure of Lincoln recited a number of his prescient thoughts and memorable utterances. I vividly recall the central idea that struck me the most, though I must paraphrase very liberally here: “This country, with all its resources and potential, will never be conquered by outside forces. Rather, it has more to fear from decay and forces within than from foreign foes.” Amen.

And was it not Lincoln who said something to the effect that we should resolve that “government of the people, by the people, and for the people, shall not perish from the earth”?