His body rests at Cypress Lawn Cemetery near San Francisco, the city where he died in 1860. His head sits at the Warren Anatomical Museum in Boston. Here, Terry Hamburg tells us about a man who suffered a brain injury and the changes it caused – Phineas Gage.

Phineas Gage some time after his accident.

Phineas Gage is perhaps the most famous neurological patient in modern history, called one of the “great medical curiosities of all time” and a “living part of medical folklore.” Malcolm MacMillan of the University of Melbourne records that two-thirds of introductory psychology textbooks cover Gage and his significance: "He was the first case where you could say fairly definitely that injury to the brain produced some kind of change in personality.” At the time, the study of the brain is rudimentary. Phrenologists, who assessed personalities by measuring protrusions on the skull, are still respected. The famous case of Phineas Gage will become a critical step in modern brain science.

 

September 1848

The young, robust, gregarious lad is employed as munitions foreman for the Rutland & Burlington Railroad in Vermont. It is the most dangerous job in the crew. A standard blasting procedure involves boring a hole deep into rock, stuffing it with explosive powder and fuse, then using a tamping iron to pack in sand or clay to contain and direct the blast. Proud of his profession, Gage commissions an especially large custom-made tamping rod: three feet seven inches long, 1¼ inches in diameter, and weighing over thirteen pounds.

The most dreaded mishap in munitions is a premature explosion. The tamping rod rockets into the left side of Gage’s face in an upward direction just past the lower jaw angle. Traversing the upper jaw and fracturing the cheekbone, it passes behind the left eye, through the left side of the brain, and flies out the top of his skull.

Gage is catapulted, lands hard on his back, convulses for a time, but is able to speak after a few minutes. He walks with little assistance and sits upright in an oxcart for a bumpy one-mile ride to his town lodgings. True to the pioneer macho-man legend, Gage shrugs off the injury, announcing he is not “much hurt” and expecting to be back at work in a few days. His recovery from this horrific event is one of the top medical stories of the era. Doctors worldwide exchange ideas and theories on the details and implications of the accident. For the next generation, it becomes the standard against which other injuries to the brain are judged. Some refuse to believe that anyone could survive such an ordeal – it must be a fabrication or a trick.

Despite his own optimism, Gage’s convalescence is long, difficult, and uneven, requiring continued attendance by his physician, John M. Harlow, who garners fame as the doctor who treats the man who should not be alive. By April 1849, the patient is proclaimed to be in good physical health. Gage has, however, lost vision in his left eye and bears a large forehead scar and a deep depression on top of his head “beneath which the pulsations of the brain can be perceived,” Dr. Harlow notes. “He has no pain in his head but says it has a queer feeling which he is not able to describe.”

 

After recovery

For a brief time after recovery, Gage exploits his newfound fame as a one-man traveling exhibit at New England venues, including an event organized by P.T. Barnum, where he is the object of both morbid curiosity and praise. The novelty of such exposure soon wears off, and the still-robust Gage works at various jobs as a farmer, stable and coach service owner, and long-distance stagecoach driver. He suffers from occasional seizures and then epilepsy, dying in 1860, twelve years after his injury. There are many reports that he underwent dramatic and negative personality changes – becoming a dishonest, ill-tempered, brawling lout. Gage’s steady work history and other contemporary assessments suggest such claims are exaggerated.

Phrenologists contended that destruction of the mental “organs” of Veneration and Benevolence caused Gage’s behavioral changes. Harlow may have believed that the organ of Comparison was damaged as well.

Dr. Harlow requests and receives the patient’s skull. He is bequeathed the most famous tamping rod in history, which Gage carried wherever he went, inscribed: This is the bar that was shot through the head of Mr Phineas P. Gage at Cavendish, Vermont, Sept 14, 1848. He fully recovered from the injury & deposited this bar in the Museum of the Medical College of Harvard University.

These artifacts, along with a plaster cast of Phineas Gage’s head created during an 1850 examination, are the most sought-out items at the Warren Anatomical Museum on the Harvard Medical School campus.

 

 

Terry Hamburg is director emeritus of the Cypress Lawn Cemetery Heritage Foundation. His recently published book Land of the Dead: How The West Changed Death In America explores how the demands of survival and adaptation in the Gold Rush western migration changed a multitude of American customs, including the way we bury and grieve for our ancestors. California and San Francisco serve as case studies. Visit his author page: https://www.terryhamburgbooks.com.

Sir Barnes Neville Wallis, CBE, FRS, RDI, FRAeS, born on September 26, 1887 in Ripley, Derbyshire, is often remembered for his role in the development of the famous "bouncing bomb" during the Second World War. However, his contributions to science, engineering, and aeronautics extend far beyond this iconic invention. A visionary in the truest sense, Wallis left a legacy that includes groundbreaking work in airship design, aircraft development, and advanced weaponry, in addition to shaping the course of 20th-century technology.

Terry Bailey explains.

Barnes Neville Wallis.

Early life and education

Wallis's early life provided the foundation for his eventual career in engineering. His father, Charles Wallis, was a doctor, but young Barnes developed an early fascination with mechanical objects, much to his father's frustration. After attending Christ's Hospital school in Sussex, where he displayed a knack for mathematics and science, Wallis began an apprenticeship at Thames Engineering Works. However, he subsequently transferred his apprenticeship to J. Samuel White's, the shipbuilder based at Cowes on the Isle of Wight, where he originally trained as a marine engineer. He later took a degree in engineering via the University of London external program.

 

Contributions to Airship design

Wallis's early career saw him make significant contributions to the development of airships. In 1913, he joined Vickers, a company heavily involved in aeronautics, where he began working on lighter-than-air vehicles. He played a pivotal role in the design of the R100, a large British airship intended for long-range passenger travel.

The R100 project was part of a competition with the government-sponsored R101, a craft of quite different design, and the program ultimately ended in disaster with the crash of the R101. While the R101's failure effectively ended the British airship program, the R100 itself was a technical success, in large part due to Wallis's innovative structural design, which utilized a geodesic framework. This design became a hallmark of Wallis's work.

The geodesic framework was notable for combining strength with light weight. This not only enhanced the airship's durability but also reduced its overall weight, making it more fuel-efficient. The R100's successful transatlantic flight to Canada in 1930 was a testament to the efficacy of Wallis's design, even though the airship program was ultimately scrapped after the R101 disaster.

 

Transition to aircraft design

After the decline of airship development, Wallis turned his attention to aircraft design. His expertise in geodesic structures led him to work on the Vickers Wellington bomber, which was used extensively by the Royal Air Force (RAF) during the Second World War. The Wellington's geodesic structure made it incredibly resilient: unlike conventional aircraft, it could sustain considerable battle damage yet continue flying, retaining structural integrity even after losing large sections of its skin or framework.

This durability made it a valuable asset during the war, particularly during the early bombing campaigns. Wallis's work on the Wellington showcased his ability to apply innovative design principles to aircraft, extending the operational capabilities and survivability of warplanes. The Wellington aircraft became one of the most produced British bombers of the war, with more than 11,000 units built, attesting to the practical success of Wallis's engineering philosophy.

 

The Bouncing Bomb and the Dam Busters Raid

Wallis is perhaps most famous for his invention of the bouncing bomb, used in the Dam Busters Raid (Operation Chastise) in 1943. This operation targeted key dams in Germany's industrial Ruhr region, aiming to disrupt water supplies and manufacturing processes critical to the Nazi war effort. The bouncing bomb, officially known as "Upkeep," was an ingenious device that skimmed across the surface of the water before striking the dam and sinking to the optimal depth, where it detonated when a hydrostatic pistol fired. In addition to Upkeep, two smaller versions were also developed: Highball and Baseball.

The design of the bomb required not only advanced physics and mathematics but also extensive practical testing. Wallis conducted numerous experiments with scaled-down prototypes to perfect the bomb's trajectory and spin, ensuring it could bypass underwater defenses and inflict maximum damage, before moving on to half- and full-scale tests. The Dam Busters Raid, though not as strategically decisive as hoped, was a major tactical and propaganda victory that demonstrated the effectiveness of precision engineering in warfare. It also solidified Wallis's reputation as one of Britain's foremost wartime inventors and designers.

 

Beyond the Bouncing Bomb: The Tallboy and Grand Slam

While the bouncing bomb is Wallis's most well-known design, his development of the "Tallboy" and "Grand Slam" bombs was arguably more impactful. These were so-called "earthquake bombs," designed to penetrate deeply into the ground or fortifications before exploding, causing immense structural damage. The Tallboy, weighing 12,000 pounds, was used effectively against hardened targets such as U-boat pens, railway bridges, and even the German battleship Tirpitz, which was sunk by RAF bombers in 1944.

The Grand Slam, a 22,000-pound bomb, was the largest non-nuclear bomb deployed during the war. Its sheer destructive power was unparalleled, and it played a crucial role in the final stages of the conflict, helping to obliterate reinforced German bunkers and infrastructure. Wallis's work on these bombs demonstrated his understanding of the evolving nature of warfare, where the destruction of heavily fortified targets became a priority.

 

Post-War Contributions: Advancements in supersonic flight

After the war, Wallis continued to push the boundaries of engineering, particularly in the field of supersonic flight. He began working on designs for supersonic aircraft, foreseeing the need for faster travel in both military and civilian aviation. His proposed designs included the "Swallow," a supersonic development of the earlier "Wild Goose" concept. Designed in the mid-1950s, Swallow was a tailless aircraft controlled entirely by wing movement, with no separate control surfaces.

The design was intended to exploit laminar flow and could have been developed for either military or civil applications. Both Wild Goose and Swallow were flight-tested as large (30-foot span) flying scale models. However, despite promising wind tunnel and model work, the designs were not adopted, and government funding for Wild Goose and Swallow was cancelled due to defense cuts.

Although Wallis's supersonic aircraft designs were never fully realized during his lifetime, they laid the groundwork for later advancements in high-speed flight. The variable-sweep wing technology he envisioned was later incorporated into aircraft such as the F-111 Aardvark, and his broader thinking on supersonic flight anticipated concepts realized in the iconic Concorde passenger airliner. Wallis's vision of supersonic travel underlined his enduring ability to anticipate technological trends.

 

Marine engineering and submersible craft

Wallis's inventive spirit was not confined to aeronautics. In the post-war years, he became involved in marine engineering, focusing on the development of submersible craft and weaponry. One of his notable projects was an experimental rocket-propelled torpedo codenamed HEYDAY. Powered by compressed air and hydrogen peroxide, it had an unusual streamlined shape designed to maintain laminar flow over much of its length.

Wallis also explored the development of deep-sea submersibles. His work on underwater craft highlighted his interest in new forms of exploration and transportation, aligning with the burgeoning post-war interest in oceanography and underwater research. As part of this work, he proposed large cargo- and passenger-carrying submarines that would have reduced transportation costs drastically; however, nothing came of these designs, which might otherwise have transformed ocean-going transport.

Because of his experience in geodesic engineering, Wallis was engaged as a consultant on the Parkes Radio Telescope in Australia. Several of the ideas he suggested were identical or closely related to the final design, including supporting the dish at its center, the geodetic structure of the dish, and the master equatorial control system.

 

Later life and recognition

Throughout his life, Wallis maintained a strong commitment to education and mentorship. He was an advocate for the advancement of engineering as a discipline and frequently gave lectures to students and professionals alike. Wallis became a Fellow of the Royal Society in 1945, was knighted in 1968, and received an Honorary Doctorate from Heriot-Watt University in 1969 in recognition of his outstanding engineering achievements. Additionally, he was awarded the Royal Society's prestigious Rumford Medal in 1971 for his work in aerodynamics.

Even in his later years, Wallis remained active in engineering, particularly in exploring the future potential of space travel. His forward-thinking ideas on rocket propulsion and spacecraft design, though largely theoretical at the time, hinted at the emerging field of space exploration, which would become a global endeavor in the following decades.

Wallis passed away on October 30, 1979, leaving behind a legacy of innovation that continues to inspire engineers and inventors worldwide. His impact on both military and civilian technologies is a testament to his brilliance and his determination to pursue what he knew was possible, even when others doubted it.

 

Legacy

Sir Barnes Neville Wallis, CBE, FRS, RDI, FRAeS, was a true polymath whose influence extended across multiple disciplines. While he is best known for his wartime contributions, particularly the bouncing bomb, his legacy goes far beyond a single invention.

From the geodesic structures of airships and bombers to supersonic aircraft concepts, deep-sea exploration vehicles, and innovative ideas on ocean and space travel, Wallis's career spanned an astonishing range of technological advancements. His ability to marry theoretical physics with practical engineering solutions made him a giant of 20th-century science and technology.

Wallis's story is not just one of wartime ingenuity but of a lifetime spent striving to solve complex problems with creativity and persistence. His contributions continue to resonate today, reminding us that the spirit of innovation is timeless.

 


Britain possesses over sixty hill figures of varying size and design carved into her hillsides. They invite both admiration and curiosity. Most are laid out on hillsides with underlying chalk foundations. The figures are so huge that they can be viewed from miles around and are especially visually effective on bright summer days. Most are white horses, but amongst the others there are two giants (one in a naked depiction), a multitude of crosses, various military commemorations, a stag, a lion, a panda, a kiwi, and a kangaroo.

Steve Prout explains.

The Uffington White Horse. Source: World Wind, available here.

The carving of horses has been termed “leucippotomy” and the carving of giants “gigantotomy” (there are currently four of the latter kind in Britain). Whether these adopted terms are intended to be applied seriously is debatable, since they do not appear in the dictionary; the name usually attached to such figures is geoglyphs. The phenomenon has continued from the 1700s into modern times, one of the latest examples being a horse created in 1999 at Devizes, Wiltshire, though there have been others since. Wiltshire is home to most of Britain’s hill figures, but there are others in Scotland, Wales, and the North of England - these are few and far between, however; for example, only two figures are to be found in Scotland.

Only a handful are considered authentically ancient, with genuinely uncertain origins; of the others, few pose any mystery, and only minor ones if any. Those created in the twentieth century are the easiest to explain and can be traced to specific events. Perhaps the reasons behind these more modern examples offer the answer to some of the earlier, unexplained figures. Some created in the 1800s continue to present minor puzzles simply because their creation went unrecorded - forgotten, confused, or dismissed as a frivolous act. Examples of this are the first Westbury Horse and the Rockley Down and Broad Town Horses.

 

Wartime Commemorations

The early twentieth century saw a variety of new figures appear on numerous hillsides in Britain to join the plethora of white horses. Many were inspired by the events of the First World War. At the end of the war, people needed an outlet to grieve, remember, and honor the sacrifices they had suffered - something to match the gravity and sheer size of the tragedy. Every village in England had lost friends, relatives, and loved ones, and few families were unaffected by the war. Shoreham, a village in Kent, was one such place: between May and September 1920, a thirty-meter chalk cross in the Christian style was entrenched into a nearby hillside for this purpose. Lying solitary among quiet pastures, its presence resonates across the serene countryside. Its creator was Samuel Cheeseman, whose motivation emanated from the tragic loss of his two sons in the First World War. The memorial is also dedicated to the further forty-eight inhabitants of the village who perished alongside them.

A similar cross was carved for the same reasons at Lenham, a village in the same county of Kent, twenty-three miles from its earlier counterpart at Shoreham. Similar in design and double the size of the Shoreham Cross, it was created a year later by Mr G H Groom, the local headmaster.

 

In Wiltshire, a variety of military-inspired figures appeared across a landscape previously dominated by a multitude of white horses. The post-war period was a frustrating time for soldiers awaiting demobilization: the sheer size of the task made the process slow, both for the men waiting and for those charged with making this monumental task happen. Most of these men were not regular soldiers, only conscripted for the duration of the war, and they quite understandably wanted to return home quickly now that the fighting was over. This led to the problem of finding ways to keep the soldiers occupied.

The soldiers from New Zealand based at Bulford, Wiltshire set about carving a giant kiwi into the hillside above their camp. They clearly took their inspiration from the regimental badges of Fovant Down, created two years previously. The kiwi was designed in 1918 by a Captain H Clarke, an engineer. It is quite incongruous compared to some of the other figures present in Wiltshire, as the kiwi is not native to Britain - and it is interesting to note that as time passes, its significance and the one-time presence of its New Zealand creators may be forgotten, and the very existence of a kiwi will puzzle some heads. It stands over four hundred feet tall and covers about one and a half acres of land. For the present, the Bulford Kiwi still serves as a lasting reminder of the presence of the soldiers of New Zealand who fought on the side of Great Britain.

The First World War also inspired the creation of the regimental badges of Fovant Down, the best and most obvious example of such commemorations. Before the crosses of Shoreham and Lenham and the Bulford Kiwi, various regiments from the Dominions and the British Army gathered in 1916 to carve fourteen individual badges onto the hills at Fovant Down, Wiltshire. Again, the main reason was to distract the soldiers from the horrors being reported from the various fronts, alleviate the feeling of homesickness, and provide a release from the relentless grind of military training. Today the badges that remain still serve to remind us of the sacrifices made by those men, but sadly a number have now long been overgrown or lost beyond any hope of restoration due to a lack of maintenance. The badges included an outline map of Australia, a badge representing the Royal Army Medical Corps, the City of London Rifles, a rising sun for the Australian Commonwealth Forces, a kangaroo, the Devon Regiment, the Royal Fusiliers, and the Royal Warwickshire Regiment. Others were added after World War Two.

 

Royal connections

Britain’s incoming kings and queens also appear to have inspired the creation of other hill figures. The Hackpen White Horse in Wiltshire and the first White Horse in Litlington, Sussex were carved in 1838 to celebrate the accession of Queen Victoria to the throne. Other monarchs were celebrated in similar fashion. The White Horse in Osmington, just outside the coastal resort of Weymouth, is one of the largest white horse hill figures in Britain. It is unique in being the only figure to feature a rider, who represents King George III. From 1789 King George held Weymouth under royal patronage and visited the town on numerous occasions. In 1815 a group of army engineers is believed to have carved this figure into the hillside to occupy their time whilst they waited and prepared for an invasion by Napoleonic forces from the continent. The invasion never materialized, and the figure remains today. A different hillside tribute to a monarch is the Wye Crown, in the county of Kent, carved in 1902 by a local agricultural college to celebrate the coronation of King Edward VII.

 

Antique follies and examples of strange indulgences

 

There were other reasons for such a myriad of hill figures. The southern part of Britain, in particular Wiltshire, is abundant in white horse hill figures. Only the horse at Uffington is of genuine antiquity, and its origins remain unknown; the rest originate from the 1700s and 1800s. Most pose no mystery as to their origins, and for those that do, the mysteries are mostly very minor ones. Some hill figures were not inspired by history at all - some are recorded as the product of whimsical and frivolous acts on the part of their wealthy and indulged landowners. Examples of this are the first and second horses at Westbury, among the most famous and well known of all the hill figures of England.

The first Westbury figure is believed, according to investigations in the 1700s, to have been an antique folly. This very bizarre behavior was contagious amongst certain wealthy landowners of the period, who laid claim to possessing or discovering various objects of antiquity on their land. In some cases a cairn, burial mound, or hill figure would suddenly be “discovered,” as was the case at Westbury. Many of these claims went unchallenged at the time, since local tenants would not want to upset their wealthy and influential landlords - even if the more rational thinkers among the population saw it for what it was: a frivolous act that occupied much valuable land. An investigation in 1742, drawing on local testimony, put the creation of the first horse at around 1700, or, as the investigation put it, “wrought within memory of persons still living or recently dead.” The fact is that this “antique horse” was willingly destroyed in 1778 on the orders of the landowner and was quickly remodeled by a certain Mr Gee. Perhaps this suggests that they knew more of the horse’s origin than they let on, and that its claims to an older age were untrue.

The case of the nearby Cherhill Horse in Wiltshire (alternatively known as the Oldbury Horse, from its proximity to the castle of the same name) is another example of this strange behavior. It can be seen with the naked eye from the top of the hill above the Westbury Horse. It was designed by a Dr Alsop two years after the restoration of its relation at Westbury. That restoration allegedly gave him his inspiration, most probably combined with a puerile jealousy: if that landowner has one, then I want one too. To sum up the witnesses of the time, Alsop was referred to by local townsfolk as “the mad doctor,” owing to his unnatural preoccupation with carving a giant white horse on a Wiltshire slope whilst shouting instructions and directions to his workers through a megaphone!

Many more in Wiltshire were to follow, most of them accompanied by various conflicting accounts. Perhaps the act of turf cutting was becoming commonplace as others appeared at Pewsey, Alton Barnes, and Broad Town, and few people felt any urgency or importance in producing a correct account of these events. Many perhaps naturally thought that the figures were unlikely to remain permanent features - land, after all, was a valuable resource. In this they were mistaken, because many of these varying models prevail today.

Another example of whimsicality, in a completely different location, is the Kilburn Horse of Yorkshire, one of only a few hill figures in the North of England. It was carved at the whim of a travelling businessman called Thomas Taylor. Having witnessed the festivities during the scouring and maintenance of the Uffington Horse, he was inspired to carve a white horse in his own home county, and in 1857 he did just that. The horse can still be seen today in the Hambleton Hills near Thirsk.

There are two modern examples of surprising appearances on our hillsides. The first, from the 1980s, is the Luzley Horse near Manchester, carved by a retired railway worker, William Rawsthorne. He hid his work as he gradually cut the figure over a period of time, then unveiled it one night so that it would surprise the local inhabitants as a new day greeted them. It received a mixed reaction and, after being allowed to become overgrown by vegetation, is now lost - though its origins can be easily researched and its story found in local newspaper archives.

 

Another example is the case of the Laverstock Panda, which has now all but disappeared. It appeared in the early hours of a January morning in 1969 as a student prank, part of the “rag week” of undergraduates of the college at Bangor, North Wales. All kinds of explanations were offered: that the panda served as a homing device for Soviet satellites, or that it celebrated East-West co-operation over Chi-Chi, the London Zoo panda brought together with Moscow Zoo's panda for mating purposes. This is normal in the history of hill figures - in the absence of any solid facts, the most outlandish explanations appear, over time or almost immediately, to fill the void, while the more prosaic and usually correct explanations are often dismissed.

 

Conclusion

There is very little mystery about Britain’s hill figures to trouble historians’ or archaeologists’ heads. Some do lack any solid explanation; however, does that really create a mystery? Morris Marples, the leading authority on the subject, aptly summed up the overarching motive for their origins, their creation, and their continued existence. Discussing the Uffington Horse, he stated that “man has always liked to commemorate his achievements by the erection of some distinct monument and this is assuredly a very effective monument as later imitators realized.” This is certainly true of many of the hillside artworks we know about, especially those made in the twentieth century.

Aside from the English monarchy and the Great War, there exist further examples with further motives, such as the Dover Castle Aeroplane of 1909, which celebrated Louis Bleriot’s first crossing of the English Channel, and the second white horse at Devizes, Wiltshire, carved in 1999 to mark the coming of the millennium. Marples was right that man enjoys celebrating and always seeks to leave a lasting imprint of his efforts, sacrifices, and achievements in expressive and grandiose ways - so there is no reason to assume that our ancestors were any different. We just do not have the benefit of their history being recorded.

Hill figures are no different in purpose from, say, a cenotaph, a plaque, a stone cross, a monument, or even a building named in honor of a person or event. Therefore, for the few that pose minor mysteries, we can at best be satisfied with a close approximation of the truth. We should, as Marples said, accept the “simplest answer as it is usually the correct one.”

The art of hill cutting continues today, though less frequently and with a more muted response. Like our ancestors, we have day-to-day concerns that preoccupy us while these activities are undertaken. In Leicester, a procession of galloping white horses is cut into a main roundabout, and at a school in Devizes a smaller copy of the town’s famous horse has been cut into the playing fields. Whatever the motive, there is no argument that these figures make the landscape of Britain more intriguing and a pleasure to view. Furthermore, by diligently maintaining them we will continue to remember their significance in our history.

 


 

 

Sources

Discovering Hill Figures – Kate Bergamar – 1997 – Shire Publications

White Horses and other Hill Figures – Morris Marples – 1981 – Alan Sutton Publishing

Lost Gods of Albion - Paul Newman - 1998 – Sutton Publishing Ltd

The Hill Figure Homepage.co.uk – Dr Mark Howes


Candidate Donald Trump thrust immigration issues at the Southern border into the forefront of American politics in the early weeks of the 2016 presidential campaign. But even then, the issue was not new.

Joseph Bauer, author of Sailing for Grace (Running Wild Press 2024), explains.

A teacher, Mary R. Hyde, and students at the Carlisle Indian Training School. Source: here and here.

At least two years before 2016, large numbers of Central American families, nearly all from the Northern Triangle countries of Honduras, Guatemala, and El Salvador, began to arrive in walking caravans at the Texas border. Well before Donald Trump emerged, the U.S. immigration system at the border was overtaxed. Ronald Reagan knew it; George W. Bush (fluent in Spanish) knew it; Barack Obama knew it. All tried to address it, explaining that congressionally authorized resources were simply inadequate to manage the realities permitted by U.S. immigration law, especially the legal right of foreign nationals to seek asylum or refugee status under a federal statute essentially unchanged since 1965.

Efforts for change and improvement all died at the altar of partisan self-interest. Legislators from across the country—not only those with constituents on the border—concluded that any solution would be more problematic to their individual political fortunes than continuing the status quo and arguing the positions most favored by the particular local and regional voters they needed in order to be elected.

Stoked by political rhetoric from both sides, the public worried about large numbers of newcomers entering the country. Legitimate worries included concern about the strain on public health, school resources, and transportation infrastructure in local communities. Unfounded worries included concerns about crime. Numerous studies have documented that immigrants, including those entering legally by asylum and even those “undocumented” persons who are in the U.S. without legal status, commit crimes at materially lower rates than American-born citizens.[1] The other objection, that newcomers cause Americans to lose jobs, has also been refuted. Employers hiring large numbers of workers almost unanimously want as many immigrants allowed in as possible, to fill jobs for which they cannot otherwise find applicants. And increases in the number of immigrants, by adding to the economy and the success of businesses, actually increase the wages and employment opportunities of American-born citizens.[2]

 

2017 and 2018

But in 2017 and 2018, the Trump administration moved, without Congressional authority, to stop the Central American caravans with a new measure: the broadscale involuntary separation of parents from their children at the Texas border, primarily at El Paso. The intent of the new policy was deterrence: to discourage asylum-seeking families from making their journeys. If families began to be separated, it was thought, sometimes finding themselves in different countries, the seekers would stop seeking.

The program was initially undisclosed by the administration (on the ground that it was merely a “pilot program”) and drew little public or media attention.  It worked as follows.

A family presented itself to Border Patrol agents at the crossing (the recommended pathway for entrance) or on American soil near the border, having managed to reach it by other means (not recommended, but still a legal way to seek asylum under U.S. law). Following brief interviews, nearly all parents were either detained temporarily in a government facility without their children or summarily deported and sent to the Mexican border, again without their children. The children, regardless of age, were immediately deemed “unaccompanied minors,” since their parents were no longer present with them, and turned over to the custody of the federal Office of Refugee Resettlement for temporary housing, often in a church-affiliated respite center, and then ultimately placed in either American foster homes or the home of a qualifying relative somewhere in the U.S., if such a person could be identified. If a relative could be found at all, the process was often lengthy.

Data on the actual number of children taken from their parents during the Trump Administration are imprecise. But studies by relief organizations such as the U.S. Conference of Catholic Bishops, the Lutheran Immigration and Refugee Service, and the Washington Office on Latin America have documented that at least 2,000 children were removed from parents between February 2017 and the date the policy became public and official in May 2018; another 2,500 were separated in the 50 days thereafter. Some estimates are as high as nearly 6,000 children and 3,000 families. The Trump administration ostensibly stopped the practice on June 20, 2018, in response to public furor and condemnation from all sides of the political spectrum.[3] Anti-immigrant positions might earn votes for some politicians, but taking children from their parents earned votes for nobody. What it did was provoke the abhorrence of the vast majority of (but not all) Americans.

 

Historical context

But was the widescale forced separation of parents and children in 2018 actually new in historical American immigration policy and practice?  Could such a policy have been a reality earlier in the American experiment?  The truth—many would say sad truth—is that it was.  Two prior examples are obvious and known to most Americans.

The first was the long period of legal slavery in the U.S., when millions of African and Caribbean black men and women were forcibly transported to the United States with the approval of the federal and state governments and held here in involuntary servitude. Enslaved parents who were able to bring their children with them, or who gave birth to children once here, were routinely sold to new owners, never to see their children or grandchildren again. There is no denying that slavery in the U.S. was tragically replete with the separation of families.

The second instance was the common practice for decades in the 19th century of removing Native American children from their natural parents and tribes and placing them either in “Indian” boarding schools or the homes of white Christian strangers. (A moving portrait of the practice—and its harmful effects—is depicted in Conrad Richter’s classic novel, The Light in the Forest.) These involuntary relocations were massive. Federal and state governments separated as many as 35% of all American indigenous children from their families, according to a 1978 report by the US House of Representatives.[4]

Most of us learned of the above practices in our American educations, if incompletely.  But many may be surprised to learn that family separation in the U.S. occurred at the hands of some state and city governments even into the early 20th century, condoned or overlooked by the federal government.   Prior to 1920, when specific immigration rules were enacted by Congress, state and city governments, motivated by anti-Catholic sentiment, removed an estimated 150,000 to 200,000 children of Irish and Polish immigrants and placed them in Protestant or Anglo-American households, away from their local areas.[5]

We can hope that such interference with the family unit, based on religious hatred, would be unthinkable today.  But it is part of our past.

 

2015 report

In 2015, the American Bar Association’s Commission on Immigration published a report entitled Family Immigration Detention: Why the Past Cannot Be Prologue. The report addressed the difficult and sad question of the morality of detaining whole families at the border under the Obama administration. It preceded the family separation policy that the Trump administration implemented three years later, which most Americans believe was even sadder and more immoral.

The authors of the ABA report must be disappointed.  The “past” that it examined—the detention of whole families—did not improve.  Instead, it worsened, with a government policy that, at least temporarily, divided the nuclear family unit itself.

 

In this instance, we did not advance from the lessons of our past.  But in history, there is always hope.  Maybe we will learn them at last.

 

 

Joseph Bauer is the author of Sailing for Grace (Running Wild Press 2024), a novel that explores a white widower’s quest to fulfill a promise to his dying wife: to reunite Central American parents with their children separated from them at the Texas border in 2018. Mr. Bauer’s previously published novels are The Accidental Patriot (2020), The Patriot’s Angels (2022), and Too True to be Good (2023). His latest finished manuscript of historical fiction, about the lead-up to and conduct of WWII, is titled Arsenal of Secrecy: The FDR Years, A Novel.


[1] See e.g., Undocumented Immigrant Offending Rate Lower Than U.S.-Born Citizen Rate, University of Wisconsin research study funded by the National Institute of Justice (September 2024). This and many other studies conclude that undocumented “illegal” immigrants commit about half as many crimes as American-born citizens, or fewer, on a per-capita basis. This is true across all kinds of crimes, including murders, other violent crimes, and drug trafficking. Admitted asylum seekers and refugees also commit far fewer crimes than American-born counterparts.

 

[2] Immigration’s Effect on US Wages and Employment, Caiumi, Alessandro and Peri, Giovanni, National Bureau of Economic Research (August 2024).

[3] Many sources, including a U.S. government audit, have reported that family separations at the Texas border continued in significant numbers well after the Trump Administration announced a halt to them in June 2018. See e.g., Long, Colleen; Alonso-Zaldivar, Ricardo. “Watchdog: Thousands More Children May Have Been Separated”. U.S. News & World Report, January 18, 2019.

[4] Sinha, Anita. An American History of Separating Families, American Constitution Society, November 2, 2020.

[5] Americans today almost unanimously believe that our Constitution, by its First Amendment, assures inviolate an individual and collective right to freedom of religion and worship. But that Amendment, until applied to state and local governments much, much later, did not prevent any state from religious discrimination in its own laws. Catholics were so generally despised in Massachusetts in the early days of our nation that Catholic priests were forbidden by state law from living there, and subject to imprisonment and even execution if they did.

Thousands of political science books and magazines discuss the idea of democratic transformation. For example: how can a country once under authoritarian rule transform into one of individual liberty and democratic rule? And what do we truly know about dictatorships? Can a democratic country transform into a dictatorial one, despite the pre-existence of a constitution and elections?

Probably the best-known example of this is Germany, which had a parliament, a multi-party system, laws protecting elections, and laws protecting individual freedoms. At the time, the illiteracy rate was almost zero percent, yet the country transformed from a democracy into an expansionist dictatorship in 1933, after Hitler's rise to power.

Nora Manseur and Kaye Porter explain.

Banknotes awaiting distribution during the 1923 German hyperinflation. Source: Bundesarchiv, Bild 183-R1215-506 / CC-BY-SA 3.0, available here.

Early life of Adolf Hitler

A complex history created the foundation of a man who was able to order the deaths, directly or indirectly, of over 60 million people. Hitler was a frustrated painter and a vegetarian. His forces occupied eleven countries - some partially, others completely - among them Poland, France, Holland, Denmark, Norway, Luxembourg, Yugoslavia, and Greece. Whether we like it or not, the man who failed his entrance examinations and was passed over for positions of leadership still captured the psyche of nations. Hitler changed the course of history.

The leader of Nazi Germany was born in Austria in 1889. He had reason to hate and fear his father, who was violent towards his mother and used to beat them both severely. In 1907 he attempted to join the Academy of Fine Arts in Vienna but failed the entrance exam, and was rejected again on a second attempt. After the death of his mother, Klara Pölzl, at the end of the same year, he moved to Vienna, one of the most prominent capitals in Europe. At the time, Vienna's mayor was a known anti-Semite, Karl Lueger. For a young man who had experienced so much violence and rejection, settling in Vienna helped shape his ideas, both because of the prominence of the city's Jewish community and because of Lueger's hostility towards it.

World War I broke out in 1914, and the 25-year-old Hitler volunteered to join the army, where he was twice decorated for bravery during the war. Despite that, he was not promoted: according to his commanders at the time, Hitler did not have the necessary leadership skills. In 1918, the November Revolution took place in Germany, transforming the country from a federal constitutional monarchy into a democratic parliamentary republic.

 

End of the war

With the end of the war, Germany surrendered, and Kaiser Wilhelm II abdicated the throne and went into exile in the Netherlands. In 1919, the Treaty of Versailles was signed, obliging Germany to pay large reparations to the winning side. Yet this was also a new opportunity for Germany: freed from an authoritarian monarchy, a political opening was now possible. German philosophical studies flourished, and new political parties began to spread - and to spread their ideas.

The new authorities began to penetrate these new groups and parties, hoping to use this openness to learn more about their ideas and orientations. Hitler, who was still in the army, was one of the informants. In 1919, working undercover, he went to a bar where some of these parties met for discussions, to spy on one of the right-wing groups: the German Workers' Party, the forerunner of the Nazi Party.

Unlike others in Germany at the time, Hitler did not see this moment as an opportunity for the nation to grow and form new ideas. The sudden decision to surrender instead felt like a keen betrayal and only fed the anger inside the young man. After hearing the party's discussions, he was greatly impressed by its ideas about the politicians' betrayal of the German Army and its scapegoating of Jews for Germany's defeat. Rather than informing on them, Hitler joined them. In a short time, he became one of the most prominent leaders of the party, which eventually became known as the National Socialist German Workers' Party.

 

Nazi Party

Their goals, plainly, were directed against Judaism, communism, and capitalism. Their arrogance was equally lofty: they believed that, as members of the Aryan race, they were descendants of the inhabitants of the legendary continent of Atlantis. To them, who else should rule the world and return Germany to her proud place, with all her former glory, power, and prestige? The Nazi Party carried out propaganda and issued its own newspaper to spread its ideas and beliefs. It attracted the attention of additional officers who were against the surrender decision and the government's plans to reduce the size of the army.

An early ally of Hitler's was Ernst Röhm, an officer in the German Imperial Army and the co-founder and leader of the "Storm Troopers," the original paramilitary wing of the Nazi Party. Rather than dispose of the weapons he had taken possession of, Röhm armed the militias and party members with them. With a country left unstable by war, and weapons in the hands of angry men who blamed outsiders for their shame and defeat, the party was well positioned to strike for power.

In 1921, Hitler was elected leader of the Nazi Party, and in 1923 the golden opportunity appeared. Because Germany did not have the money to pay its reparations to the Allies, the government decided to print money. The amount of money in circulation increased without any corresponding increase in value, and the German mark collapsed. Prices soared, and a wave of inflation hit Germany that became known as the German hyperinflation.

 

Continuing instability

In response to Germany's failure to pay the reparations imposed by the victorious powers after World War I and the Treaty of Versailles, France and Belgium occupied the Ruhr region. Hitler felt that this was an opportunity to seize power without elections, and staged a coup d'etat. It failed. Hitler was arrested and sentenced to five years in prison. The Bavarian Supreme Court pardoned him, however, and he remained in prison for only nine months before his release.

In 1928, Hitler decided to participate in the elections, winning only about 2.5% of the vote as the Germans once again rejected the Nazi proposal. But when the American stock market collapsed in 1929, it had a major impact on the whole world. As unemployment in Germany reached an estimated 6 million, the atmosphere became ripe for radical proposals - fertile ground for the right-wing Nazis and the left-wing Communists.

The Nazi Party took advantage of the opportunity to appear as the savior of the German people - for example, by providing aid to the unemployed - which made it the most popular party in Germany. In 1932, the Nazi Party, led by Hitler, became the largest German party, winning 37% of the vote. In 1933, the President of Germany appointed Hitler chancellor, and he came to power.

 



Isambard Kingdom Brunel (1806–1859) is widely regarded as one of the greatest engineers in history. His pioneering work on bridges, railways, tunnels, and ships dramatically shaped the industrial landscape of Victorian Britain and left a lasting legacy on modern engineering and transportation that eventually reshaped the world. Brunel's genius for problem-solving and his relentless pursuit of innovation made him a towering figure of the 19th century, whose contributions to society endure to this day.

Terry Bailey explains.

The ship the SS Great Eastern in 1858.

Early Life and Education

Brunel was born on the 9th of April, 1806, in Portsmouth, England, to a French father, Marc Isambard Brunel, and an English mother, Sophia Kingdom. Marc Brunel was an accomplished engineer in his own right, working on various mechanical and civil engineering projects in Britain. From an early age, Isambard was exposed to the world of engineering, with his father encouraging his intellectual curiosity and fostering his talents.

Brunel was educated at prestigious institutions in England and France, where he developed a strong foundation in mathematics, mechanics, and engineering principles. His formal education began at the Henri-Quatre School in Paris and continued at Lycée Saint-Louis before he returned to England. He then apprenticed under his father, gaining practical experience and learning firsthand from an expert engineer. This mentorship would lay the groundwork for his illustrious career, and the pair would collaborate on several major projects.

 

The Thames Tunnel: A Landmark Feat of Engineering

One of the earliest and most significant projects that shaped Brunel's career was the Thames Tunnel, which he worked on alongside his father. Started in 1825, this tunnel was the first to be successfully constructed under a navigable river, connecting Rotherhithe and Wapping in London. It was an ambitious project fraught with challenges, including financial troubles, technical difficulties, and hazardous working conditions.

Brunel took on a leadership role as the chief assistant engineer, displaying his characteristic resourcefulness. He helped develop innovative techniques, such as the use of a tunnelling shield—a safety structure that allowed workers to dig safely through the riverbed without the tunnel collapsing. The tunnel itself was considered a marvel of engineering at the time, overcoming immense pressures and the threat of constant flooding.

Though the Thames Tunnel was beset by numerous setbacks and took nearly two decades to complete, its opening in 1843 was a triumph. The tunnel served only pedestrian traffic until the 1860s, when it was converted to railway use.

The Thames Tunnel solidified Brunel's reputation as a rising star in civil engineering. The tunnel remains in use today as part of the London rail network, a testament to its durability and Brunel's engineering vision.

 

The Great Western Railway: Revolutionizing Transportation

Brunel's most famous and far-reaching contributions came in the realm of railway engineering. In 1833, before the Thames Tunnel was complete and at just 27 years old, he was appointed chief engineer of the Great Western Railway (GWR), a project that would cement his legacy. His vision was to create a seamless rail connection between London and Bristol, facilitating the movement of goods and passengers across the country with unprecedented speed and efficiency.

Brunel's design for the GWR was innovative in multiple ways, but one of his most notable decisions was to use a broad gauge of 7 feet, rather than the standard gauge of 4 feet 8½ inches. He believed the wider gauge would allow for greater stability, faster speeds, and a more comfortable ride for passengers. While the broad gauge did offer some advantages, it was ultimately phased out in favor of the standard gauge, due to the logistical complications of operating different rail systems across the country.

Nevertheless, Brunel's work on the GWR was groundbreaking.

His commitment to high-quality engineering was evident in the construction of viaducts, tunnels, and stations, all designed with precision and an eye for aesthetics. Two of the most famous structures built for the GWR are the Box Tunnel and Maidenhead Railway Bridge.

The Box Tunnel, completed in 1841, was the longest railway tunnel in the world at the time, stretching 1.83 miles (approximately 2.95 kilometers) through the chalk hills of Wiltshire. It was an extraordinary feat of engineering, requiring meticulous planning and execution. Legend has it that Brunel aligned the tunnel's construction so that on his birthday, the 9th of April, the sunlight would shine straight through from end to end.

The earlier Maidenhead Railway Bridge, with its flat, wide arches, was another testament to Brunel's brilliance. Completed in 1838, it was revolutionary in its use of low-rise arches that allowed for a stable railway crossing without compromising the integrity of the structure. Today, the bridge remains in use, another symbol of Brunel's lasting contributions to British infrastructure.

 

The Clifton Suspension Bridge: A Masterpiece of Design

One of Brunel's most iconic works, the Clifton Suspension Bridge in Bristol, stands as a symbol of his bold engineering vision. While the bridge wasn't completed during his lifetime, Brunel began work on it in the early 1830s after winning a design competition. His daring design called for a suspension bridge spanning the Avon Gorge, with a total length of 234 yards (214 meters).

The construction of the bridge faced numerous financial and technical difficulties; however, it was eventually completed in 1864, five years after Brunel's death. The Clifton Suspension Bridge has since become an enduring symbol of British engineering, celebrated for both its functional design and its graceful beauty. The bridge, still in use today, is often seen as a testament to Brunel's ingenuity and his ambition to create structures that were as visually stunning as they were practical.

 

Engineering the Seas: Brunel's Ships

Brunel's talents were not limited to land-based engineering. His foray into maritime engineering led to the construction of three revolutionary ships that would set new standards in shipbuilding and sea travel.

SS Great Western (completed 1838): This was Brunel's first foray into shipbuilding and the world's first steamship designed for transatlantic service. The Great Western was a wooden paddle steamer, which was considered the largest passenger ship of its time. It successfully made the journey from Bristol to New York in 15 days, marking the beginning of regular steam-powered transatlantic crossings. The ship's success proved that steam-powered vessels could dominate long-distance sea travel.

SS Great Britain (completed 1845): Brunel continued to push the boundaries of maritime engineering with the SS Great Britain, the first ocean-going ship to be built with an iron hull and driven by a screw propeller. At 322 feet long (just over 98 meters), it was the largest ship afloat at the time. The SS Great Britain combined the best of both worlds, using both sail and steam power, and set a new benchmark for ship design. It revolutionized shipbuilding, influencing the design of future iron and steel ships.

SS Great Eastern (completed 1859): Brunel's most ambitious and controversial ship was the SS Great Eastern, intended to be the largest ship in the world and capable of carrying 4,000 passengers. It was an extraordinary engineering feat—692 feet long (almost 211 meters) and 18,915 tons (about 19,200 metric tons). Unfortunately, the Great Eastern was plagued by mismanagement, cost overruns, and technical difficulties, making it a commercial failure. However, its design innovations, particularly its double hull construction, had a lasting impact on shipbuilding practices.

 

Lasting Contributions and Legacy

Isambard Kingdom Brunel's career was marked by audacity, innovation, and an insatiable desire to push the limits of what was possible in engineering. His work on railways, bridges, tunnels, and ships not only transformed Britain's infrastructure but also laid the foundation for modern engineering practices.

Brunel's achievements extended beyond the technical. His vision of an interconnected Britain, where goods and people could move quickly and efficiently across the country and beyond, helped drive the Industrial Revolution and fostered economic growth. His pioneering use of materials such as iron, his development of new construction techniques, and his application of steam power to transport set new standards for engineering that would influence future generations of engineers.

While not all of his projects were commercial successes, Brunel's contributions to society are undeniable. His work on the Great Western Railway alone reshaped the British economy and transformed cities such as Bristol, which became key industrial hubs. Brunel's bridges, tunnels, and ships remain iconic landmarks, serving as testaments to his genius, to the transformative power of engineering, and to the interconnected travel he made possible.

In recognition of his extraordinary contributions, Brunel was honored in his lifetime and continues to be celebrated posthumously. In 2002, he was named second in a BBC poll of the 100 Greatest Britons, a fitting tribute to the man whose work helped build the modern world.

In conclusion, Isambard Kingdom Brunel's life and career were defined by an unwavering commitment to innovation and a drive to overcome engineering challenges. From the Thames Tunnel to the Great Western Railway, the Clifton Suspension Bridge, and his revolutionary ships, Brunel's work touched nearly every aspect of transportation in the 19th century. His inventions and achievements not only reshaped the physical landscape of Britain but also left an indelible mark on the history of engineering, making him a true visionary whose legacy continues to inspire.

 


The 1942 Cripps Mission took place in the middle of World War 2. It was an attempt in March of that year by Britain to secure greater Indian co-operation in the war effort. It involved Stafford Cripps, a member of the British cabinet, meeting various Indian political leaders.

Bilal Junejo explains.

A sketch of Stafford Cripps.

Whenever it is the purpose of a (political) mission which has to be ascertained, it behoves one to ask three questions without delay: why was the mission sent at all; why was it sent only when it was; and why did it comprise the individuals that it did. Unless such well-meaning cynicism is allowed to inform one’s analysis, it is not likely that one will be able to pierce the veil cast by official pronouncements for public consumption upon the true motives of those who were instrumental in bringing about the mission’s dispatch in the first place. There is, alas, no such thing as undue skepticism in the study of a political event.

So, to begin with, why was the mission in question — which brought with it an offer of an immediate share for Indians in the central government (Zachariah, 2004: 113) if they accepted “a promise of self-government for India via a postwar constituent assembly, subject only to the right of any province not to accede (Clarke and Toye, 2011)” — dispatched at all? A useful starting point would be Prime Minister Churchill’s declaration, when announcing in the House of Commons his administration’s decision to send a political mission to India, that:

“The crisis in the affairs of India arising out of the Japanese advance has made us wish to rally all the forces of Indian life to guard their land from the menace of the invader … We must remember also that India is one of the bases from which the strongest counter-blows must be struck at the advance of tyranny and aggression (The Times, 12 March 1942, page 4).”

 

Japan in the war

Since entering the war just three months earlier, Japan had already shown her might by achieving what Churchill would call “the worst disaster and largest capitulation in British history” — namely the surrender of over 70,000 British and Commonwealth troops in Singapore, a British possession, in February 1942 (Palmer, 1964: 299) — and occupying thereafter the British colony of Burma, on India’s eastern border, in March — a development which marked the first time since the outbreak of war in September 1939 that India, Great Britain’s most cherished imperial possession, was directly threatened by the enemy. No such threat (or a vociferous demand for independence) had arisen at the time of World War I, which was why no similar mission (with a concrete offer) had been dispatched then. For over two years after its outbreak, no mission was dispatched during World War II either, even though a clamor for independence, spearheaded by the Indian National Congress (India’s largest political party), existed this time. It was only the Japanese advance westward that changed the picture. In Burma, the Japanese had been “welcomed as liberators, since they established an all-Burmese government (Palmer, 1964: 63).” To the British, therefore, it was imperative that the Indians were sufficiently appeased, or sufficiently divided, to eliminate the risk of the Japanese finding hands to have the gates of India opened from within — not least because even before Japan entered the war, it had been reported that:

 

“Arrangements are in progress for an inter-Imperial conference on war supplies to be held in Delhi … [where] it is expected that the Governments of East Africa, South Africa, Australia, New Zealand, Burma, and Malaya will be represented … to confer … on mutually developing their resources to provide the maximum for self-defence and for Great Britain … India (my emphasis), with her vast and varied resources and her central position, is the natural pivot for such arrangements (The Times, 8 August 1940, page 3).”

 

Small wonder, then, that the premier should have described the proposals which the Mission would be bringing as “a constructive contribution to aid India in the realization of full self-government (The Times, 12 March 1942, page 5).” But whilst a desire to garner Indian support for repelling the Japanese would seem able to explain why the mission was sent at all (as well as when), would that desire have also been sufficient to elicit on its own a public offer of eventual self-government from an imperialist as committed as Winston Churchill? As late as October 1939, in a letter to Jawaharlal Nehru (one of the principal Indian leaders), the non-party Stafford Cripps, who had established quite a good rapport with Nehru (Nehru, 2005: 224-5), would be writing (with reference to the Chamberlain administration) that:

“I recognise that it is expecting a great deal more than is probable to expect this Government to do anything more than make a meaningless gesture. The addition of Winston Churchill [to the Cabinet, as First Lord of the Admiralty] has not added to the friends of Indian freedom, though he does look at matters with a realism that is an advantage (Nehru, 2005: 398).”

 

Realism?

Were the Mission’s proposals a (belated) sign of that ‘realism’ then? Even though just six months earlier, shortly after drawing up with President Roosevelt the Atlantic Charter — a declaration of eight common principles in international relations, one of which was “support for the right of peoples to choose their own form of government (Palmer, 1964: 35)” — Churchill had created “a considerable stir when [he] appeared to deny that the Atlantic Charter could have any reference to India (Low, 1984: 155)”? As it turned out, it was realism on Churchill’s part, but without having anything to do with recognizing Indian aspirations. That is because when Churchill announced the Mission, his intended audience were not the Indians at all — not least because they never needed to be. The indispensability of India to the war effort was indisputable, but there was hardly ever any need for Churchill to appease the Indians in order to save the Raj. Simply consider the ease with which the Government of India, notwithstanding the continuing proximity of Japanese forces to the subcontinent, was able to quell the Congress-launched Quit India Movement of August 1942 — which was even described in a telegram to the premier by the Viceroy, Lord Linlithgow, as “by far the most serious rebellion since that of 1857, the gravity and extent of which we have so far concealed from the world for reasons of military security (Zachariah, 2004: 117).” The quelling anticipated Churchill’s asseveration that “I have not become the King’s First Minister in order to preside over the liquidation of the British Empire. For that task, if ever it were prescribed, someone else would have to be found … (The Times, 11 November 1942, page 8).” When British might in India was still a force to be reckoned with, what consideration(s) could have possibly served to have induced the Mission’s dispatch just five months earlier? What would Churchill not have gained had he never sent it?

 

There are two aspects to that, the second of which also addresses the third of our original questions — namely why the Mission was led by the individual that it was. The first aspect was Churchill’s desire, following the debacle in Singapore, to reassure not just his compatriots but also his indispensable transatlantic allies that something was being done to safeguard resource-rich India from the enemy (Owen, 2002: 78-9). With India “now a crucial theatre of war in the path of Japanese advance, Cripps exploited US pressure to secure Churchill’s reluctant agreement to the ‘Cripps offer’ (Clarke and Toye, 2011).” This was not very surprising, for given that he was president of a country which not only owed her birth to anti-imperialism but had also just subscribed to the Atlantic Charter, Roosevelt could not afford domestically to be seen condoning (British) imperialism anywhere in the world. The American view was that Indian support for fighting Japan would be better secured by conciliation than by repression (The Daily Telegraph, 13 April 1942, page 2), and Roosevelt even sent his personal representative, Colonel Louis Johnson, to India to assist in the negotiations (Clarke and Toye, 2011). Under such circumstances, Churchill could have only confuted the Americans by first making an offer of which Washington approved to the Indians, and then proclaiming the futility thereof after it had been rejected by them (The Daily Telegraph, 1 April 1942, page 3). As he wrote before the Mission’s dispatch to Linlithgow, a fellow reactionary who would do much to sabotage the ‘Cripps offer’ by his (predictable) refusal to reconstruct the Executive Council in accordance with Congress’s wishes (removing thereby any incentive Congress might have had for consenting to postwar Dominion status) (Moore, 2011):

“It would be impossible, owing to unfortunate rumours and publicity, and the general American outlook, to stand on a purely negative attitude and the Cripps Mission is indispensable to proving our honesty of purpose … If that is rejected by the Indian parties … our sincerity will be proved to the world (Zachariah, 2004: 114).”

 

Public relations

As anticipated, this public relations gesture, “an unpalatable political necessity” for the gesturer (Moore, 2011) and therefore proof of his ‘realism’, worked — all the more after Cripps, who considered neither Churchill nor Linlithgow primarily responsible for his failure in India (Owen, 2002: 88), proceeded to “redeem his disappointment in Delhi by a propaganda triumph, aimed particularly at the USA, with the aim of unmasking Gandhi as the cause of failure. One result of the Cripps mission, then, was … [that] influential sections of American opinion swung to a less critical view of British policy. In this respect, Churchill owed a substantial, if largely unacknowledged, debt to Cripps (Clarke and Toye, 2011).” The ulterior motive behind sending the Mission became evident to some even at the time. As Nehru himself reflected after once more landing in gaol (for his participation in the Quit India Movement):

“The abrupt termination of the Cripps’ negotiations and Sir Stafford’s sudden departure came as a surprise. Was it to make this feeble offer, which turned out to be, so far as the present was concerned, a mere repetition of what had been repeatedly said before — was it for this that a member of the British War Cabinet had journeyed to India? Or had all this been done merely as a propaganda stunt for the people of the USA (Nehru, 2004: 515)?”

 

A desire, therefore, to satisfy the Americans, who were his intended audience, would explain why Churchill acquiesced in the Mission. But now we come to the other aspect which was alluded to earlier — namely why it was the Cripps Mission. To begin with, Cripps, a non-party person since his expulsion from the Labour Party in January 1939 for advocating a Popular Front with the communists (Kenyon, 1994: 97), had, shortly after the outbreak of war in September, embarked upon a world tour, convinced that “India, China, Russia, and the USA were the countries of the future (Clarke and Toye, 2011)”, and that it would therefore be worth his country’s while to ascertain their future aims. “In India Cripps was warmly received as the friend of Jawaharlal Nehru … [and] though unofficial in status, Cripps’s visit was undertaken with the cognizance of the India Office and was intended to explore the prospects of an agreed plan for progress towards Indian self-government (Clarke and Toye, 2011).” But whilst this visit helped establish his bona fides with the Indian leaders and gave him such a knowledge of Indian affairs as would later make him a publicly suitable choice for leading the Mission (The Daily Telegraph, 22 April 1952, page 7), Churchill had more private reasons for choosing Cripps in 1942 — as we shall now see.

 

Going abroad

After becoming prime minister in 1940, “Churchill [had] used foreign postings cannily to remove potential opponents and replace them with supporters; as well as Halifax, Hoare and Malcolm MacDonald (who was sent to Canada as high commissioner), he sent five other Chamberlainite former ministers abroad as the governors of Burma and Bombay, as minister resident in West Africa and as the high commissioners to Australia and South Africa. Several others were removed from the Commons through the time-honored expedient of ennobling them (Roberts, 2019: 622).” Similarly, the left-wing Cripps was also sent out of the country — as ambassador to Moscow, where he served for eighteen months, Churchill contemptuously observing when it was suggested Cripps be relocated that “he is a lunatic in a country of lunatics, and it would be a pity to move him (Roberts, 2019: 622).” To us, this remark shows how the Cripps Mission vis-à-vis India was inherently frivolous; for had Churchill considered the fulfilment of its ostensible aims at all important, would he have entrusted the Mission to a ‘lunatic’ (rather than to, say, Leopold Amery, who was his trusted Indian Secretary, and who had already dissuaded him from going to India himself (Lavin, 2015))?

However, after America entered the war, “Churchill [for reasons irrelevant to this essay] came to think Cripps a bigger menace in Russia than at home and sent permission for him to return to London, which he did in January 1942 … [to be] widely hailed as the man who had brought Russia into the war (Clarke and Toye, 2011)” — this at a time when Churchill himself was grappling with a weakened domestic position (Addison, 2018), which the fall of Singapore would do nothing to improve. Anxious to win over the non-party Cripps, who was now his foremost rival for the premiership (Roberts, 2019: 714), Churchill “brought him into the government as Lord Privy Seal and Leader of the House of Commons (Clarke and Toye, 2011).” Rather than engage in domestic politics, however, Cripps “chose to invest his windfall political capital in an initiative to break the political impasse in India (Clarke and Toye, 2011).” But, “as Churchill may well have calculated in advance, the Mission failed and Cripps’s reputation was diminished (Addison, 2018).” The political threat to Churchill decreased considerably, for failure in India meant that Cripps’s removal as Leader of the House of Commons was “inevitable” (The Times, 22 April 1952, page 6). Who could have aspired to the premiership under such circumstances? The Mission had not even been a gamble for Churchill (who would have never sent Cripps only to add to his political capital), since the offer’s provision, prudently inserted by Amery (Lavin, 2015), for a province’s right to refuse accession to a postwar Indian Dominion was certain to have been welcomed by the Muslim League (India’s foremost Muslim political party) — which had declared its quest for some form of partition as early as March 1940 (with the Lahore Resolution), and the retention of whose support during the war was crucial because the Muslims, “besides being a hundred million strong, [constituted] the main fighting part of the [Indian] Army (Kimball, 1984: 374)” — but equally certain to have been rejected by the Hindu-dominated Congress (which was already irked by the stipulation that Dominion status would be granted only after the war, which nobody at the time could have known would end but three years later). Not for nothing had Churchill privately assured an anxious King George VI shortly after the Mission’s dispatch that “[the situation] is like a three-legged stool. Hindustan, Pakistan, and Princestan. The latter two legs, being minorities, will remain under our rule (Roberts, 2019: 720-1).”

 

Conclusion

To conclude, given his views on both India and Cripps, it is not surprising that the premier should have entertained a paradoxical desire for the Mission to succeed by failing — which it did. By easing American pressure on Downing Street to conciliate the Indians and politically emasculating Stafford Cripps at the same time, the Mission served both of the purposes for which Prime Minister Churchill had so astutely dispatched it.

 


 

 

Bibliography

Addison, P. (2018) Sir Winston Leonard Spencer Churchill. Oxford Dictionary of National Biography [Online]. Available at https://doi.org/10.1093/ref:odnb/32413 [Accessed on 20.05.24]

Clarke, P. and Toye, R. (2011) Sir (Richard) Stafford Cripps. Oxford Dictionary of National Biography [Online]. Available at https://doi.org/10.1093/ref:odnb/32630 [Accessed on 20.05.24]

Kenyon, J. (1994) The Wordsworth Dictionary of British History. Wordsworth Editions Limited.

Kimball, W. (1984) Churchill & Roosevelt: the complete correspondence. Volume 1 (Alliance Emerging, October 1933 - November 1942). Princeton University Press.

Lavin, D. (2015) Leopold Charles Maurice Stennett Amery. Oxford Dictionary of National Biography [Online]. Available at https://doi.org/10.1093/ref:odnb/30401 [Accessed on 20.05.24]

Low, D. (1984) The mediator’s moment: Sir Tej Bahadur Sapru and the antecedents to the Cripps Mission to India, 1940-42. The Journal of Imperial and Commonwealth History [Online]. Available at https://doi.org/10.1080/03086538408582664 [Accessed on 20.05.24]

Moore, R. (2011) Victor Alexander John Hope, second marquess of Linlithgow. Oxford Dictionary of National Biography [Online]. Available at https://doi.org/10.1093/ref:odnb/33974 [Accessed on 20.05.24]

Nehru, J. (2004) The Discovery of India. Penguin Books India.

Nehru, J. (2005) A Bunch of Old Letters. Penguin Books India.

Owen, N. (2002) The Cripps mission of 1942: A reinterpretation. The Journal of Imperial and Commonwealth History [Online]. Available at https://doi.org/10.1080/03086530208583134 [Accessed on 20.05.24]

Palmer, A. (1964) A Dictionary of Modern History 1789-1945. Penguin Reference Books.

Roberts, A. (2019) Churchill. Penguin Books.

The Daily Telegraph (1 April 1942, 13 April 1942, 22 April 1952)

The Times (8 August 1940, 12 March 1942, 11 November 1942, 22 April 1952)

Zachariah, B. (2004) Nehru. Routledge Historical Biographies.


The 1876 U.S. presidential election is one of the most contentious and controversial elections in American history. It involved a dispute over electoral votes that was eventually resolved through a political compromise, and its legacy is fraught: after months of bitter fighting, lawmakers made a fateful bargain to put one man in office by effectively ending Reconstruction, leading to a century of intensified racial segregation in the South. It was not our country’s finest moment.

Lloyd W. Klein explains.

Rutherford B. Hayes.

The Candidates

The candidates were a reform-minded Democrat and a Reconstructionist Republican. Rutherford B. Hayes, the governor of Ohio, was the Republican candidate. His campaign focused on reform and a commitment to civil rights, particularly for African Americans in the South. Before the war he had been a Cincinnati lawyer and abolitionist. He ran against Samuel J. Tilden, the governor of New York, the Democratic candidate, who was known for his efforts in fighting corruption, particularly in New York City’s Tammany Hall. Tilden had been a War Democrat who opposed slavery; he opposed Abraham Lincoln in the 1860 presidential election but later supported him and the Union during the Civil War.

The Republican VP candidate was William A. Wheeler, a man about whom Hayes had recently said, “I am ashamed to say: who is Wheeler?” He was a congressman from New York who was even less prominent than Hayes himself. Wheeler was nominated because he was popular among his colleagues and had worked to avoid making enemies in Congress. In addition, he provided geographical balance to the ticket.

The Democratic VP candidate was Thomas A. Hendricks, the Governor of Indiana and a former US Senator and congressman. Hendricks’s wartime record consisted of challenging the military draft and the issuing of greenbacks; however, he supported the Union and the prosecution of the war, consistently voting in favor of wartime appropriations. Hendricks adamantly opposed Radical Reconstruction. After the war he argued that the Southern states had never been out of the Union and were therefore entitled to representation in the U.S. Congress. He also maintained that Congress had no authority over the affairs of state governments.

It was widely expected that Tilden and the Democrats would ride a popular wave into office after 16 years of Republicans and the scandals of President Grant’s administration.

Only one of these candidates had had successful military experience in the war. Hayes, a lawyer, businessman, and abolitionist, was a war hero who went on to serve in Congress and later as Ohio’s governor, where he championed African American suffrage. Wounded five times, most seriously at the Battle of South Mountain in 1862, he earned a reputation for bravery in combat, rising through the ranks to serve as brevet major general. A lawyer from Cincinnati (educated at Kenyon College and Harvard) with no formal military training, he displayed great courage in the Kanawha Division, serving under George Crook and David Hunter, and returned to politics after the war.

He was not the only potential candidate with a war background. William T. Sherman was the commander of the Army but had no interest whatsoever in running. Grant had intentionally given Winfield Scott Hancock, a Democrat, obscure assignments away from the South. Hancock did receive some votes for the nomination at the 1876 convention, but 1880 would be his real attempt at the office. It was widely assumed during 1875 that incumbent President Ulysses S. Grant would run for a third term as president despite the poor economic conditions, the numerous political scandals that had developed since he assumed office in 1869, and a longstanding tradition set by George Washington not to stay in office for more than two terms. Grant’s inner circle advised him to run for a third term and he almost did so, but on December 15, 1875, the House, by a sweeping 233–18 vote, passed a resolution declaring that the two-term tradition existed to prevent a dictatorship.

The initial favorite in 1876 was James G. Blaine of Maine. He had the lead in delegates but was 100 votes short of a majority, as the southern states would not support his views. Hayes won the nomination by appealing in a conciliatory manner to the Southern Republicans, which left Frederick Douglass confused about whether the new black southern vote was wanted.

 

The Election Campaigns

In 1876 it was the tradition that the candidates did not campaign, and their surrogates made their cases locally. The Republicans expected to lose. The poor economic conditions made the party in power unpopular. Both candidates concentrated on the swing states of New York and Indiana, as well as the three southern states—Louisiana, South Carolina, and Florida—where Reconstruction Republican governments still barely ruled, amid recurring political violence, including widespread efforts to suppress voting by freedmen. Democrats, whose voter base resided in the former Confederacy, had been partly shut out of the political sphere; now, with Republican Ulysses S. Grant facing charges of corruption, Tilden’s reform-minded candidacy seemed like a well-timed opportunity for Democrats to regain political power.

The Republican outlook was indeed bleak. Hayes was a virtual unknown outside his home state of Ohio. Henry Adams called Hayes “a third-rate nonentity whose only recommendations are that he is obnoxious to no one”. Hayes’s most important asset was his help to the Republican ticket in carrying Ohio, a crucial swing state. For the Democrats, the newspaperman John D. Defrees described Tilden as “a very nice, prim, little, withered-up, fidgety old bachelor, about one-hundred and twenty-pounds avoirdupois, who never had a genuine impulse for man nor any affection for woman”.

The Democratic strategy for victory in the South relied on paramilitary groups such as the Red Shirts and the White League. These groups saw themselves as the military wing of the Democrats. Using the strategy of the Mississippi Plan, they actively suppressed both black and white Republican voting. They violently disrupted meetings and rallies, attacked party organizers, and threatened potential voters with retaliation for voting Republican.

During the election of 1876, Southern Democrats who supported Wade Hampton for governor used mob violence to attack and intimidate African American voters in Charleston. Republican Governor Daniel Henry Chamberlain appealed to President Grant for military assistance. In October 1876, Grant, after issuing a proclamation, instructed Sherman to gather all available Atlantic region troops and dispatch them to South Carolina to stop the mob violence.

It’s a cliché to say that in America, race is always on the ballot. But in 1876, it was probably the central issue. The election process in Southern states was rife with voter fraud—on the part of both parties—and marked by violent voter suppression against black Americans. Under Reconstruction, African Americans had achieved unprecedented political power, and new federal legislation sought to provide a modicum of economic equality for newly enfranchised people. In response, white Southerners rebelled against African Americans’ newfound power and sought to intimidate and disenfranchise black voters through violence.

Voter suppression was rampant in the post-Confederacy South. Many historians argue that if votes had been counted accurately and fairly in Southern states, Hayes might have won the 1876 election outright. “[I]f you had a fair election in the south, a peaceful election, there’s no question that the Republican Hayes would have won a totally legitimate and indisputable victory,” wrote Eric Foner.

 

Election Night

On election night, Hayes was losing so badly that he prepared his concession speech before turning in for the night. His party chairman went to bed with a bottle of whiskey. “We soon fell into a refreshing sleep,” Hayes later wrote in his diary about the events of November 7, 1876. “[T]he affair seemed over.”

But after four months of fierce debate and negotiations, Hayes would be sworn into office as 19th president of the United States. Historians often describe his narrow, controversial win over Democrat Samuel J. Tilden as one of the most bitterly contested presidential elections in history.

In the days following the election, Tilden appeared poised to clinch a narrow victory. He had captured 51.5 percent of the popular vote to Hayes’s 48 percent, a margin of about 250,000 votes. But Tilden had amassed only 184 electoral votes—one shy of the 185 needed for the presidency. Hayes, meanwhile, had 165. Election returns from three Republican-controlled Southern states—Louisiana, Florida and South Carolina—were divided, with both sides declaring victory. Together, the states represented a total of 19 electoral votes, which along with one disputed elector from Oregon would be enough to swing the election Hayes’s way.

Hayes’ proponents realized that those contested votes could sway the election. They seized on the uncertainty of the moment, encouraging Republican leaders in the three states to stall, and argued that if black voters hadn’t been intimidated away from the polls—and if voter fraud hadn’t been as rampant—Hayes would have won the contested states. With a Republican-controlled Senate and a Democrat-controlled House, Congress was deadlocked over which returns to count.

 

The Disputed Election

At the end of election day, no clear winner emerged because the outcomes in South Carolina, Florida, and Louisiana were unclear. Both parties claimed victory in those states, but Republican-controlled “returning” boards would determine the official electoral votes. “The elections in three states—Florida, Louisiana, and South Carolina—were alleged to have been conducted illegally,” the senators of those states wrote in a statement.

Republicans and Democrats rushed to those three states to watch and try to influence the counting of the votes. The returning boards determined which votes to count and could discard ballots they deemed fraudulent. The boards in all three states argued that fraud, intimidation, and violence in certain districts invalidated votes, and they threw out enough Democratic ballots for Hayes to win. All three returning boards awarded their states’ electoral votes to Hayes.

Meanwhile in Oregon, a strange development added that state to the uncertain mix. Hayes won the state, but one of the Republican electors, John W. Watts, was also a postmaster, and the US Constitution forbids federal officeholders from being electors. Watts planned to resign from his position in order to serve as a Republican elector, but the governor of Oregon, who was a Democrat, disqualified Watts and instead certified a Tilden elector.

The U.S. Constitution provided no way of resolving the dispute, and now Congress would have to decide. As Democrats controlled the House of Representatives, and Republicans dominated in the Senate, the two sides compromised by creating a bipartisan electoral commission with five representatives, five senators and five Supreme Court justices.

Electors cast their ballots in state capitals on December 6, 1876. Generally, the process went smoothly, but in four capitals—Salem, Oregon; Columbia, South Carolina; Tallahassee, Florida; and New Orleans, Louisiana—rival sets of electors met and voted, so that the US Congress received two conflicting sets of electoral votes. At this point, Tilden had 184 electoral votes while Hayes had 165, with 20 votes still disputed.

When the Electoral College does not give a majority to any candidate, whether because of a tie or because of disputed electors, the result is what is called a contingent election. An example of a tie was the 1800 election, which required a compromise for Jefferson to be made president over Burr; after that, the Twelfth Amendment provided a legal remedy. Another contingent election came in 1824, when John Quincy Adams was ultimately elected over Andrew Jackson, a result that was reversed in 1828.

Why wasn’t the election resolved in the states, as such disputes are supposed to be? The Constitution outlines what is supposed to happen in these situations, but it didn’t actually happen in 1876.

That year the contingent election system was bypassed when there was a contested outcome. At the height of Reconstruction, the issue was not that no candidate got a majority in the Electoral College, but rather that the three Southern states – Florida, Louisiana and South Carolina – sent multiple slates of electoral votes to Washington, DC, after the state elections were disputed. And in Oregon, there was a dispute over one elector. The question was which were the legitimate sets of electors.

The Constitution stipulates that the electoral votes be directed to the President of the Senate, who was Republican Thomas W. Ferry. Although Republicans argued that he had the right to decide which votes to count, Democrats disagreed and argued that the Democratic majority in Congress should decide.

 

The Compromise of 1877

A compromise was reached. In an unprecedented move, Congress decided to create an extralegal, bipartisan “Electoral Commission” to determine which candidate should get the 20 disputed electoral votes. On January 29, 1877, the Electoral Commission Act established a commission of five senators (three Republicans, two Democrats), five representatives (three Democrats, two Republicans), and five Supreme Court justices (two Republicans, two Democrats, and one independent) to decide which votes to count and resolve the dispute. However, the independent Supreme Court justice refused to serve on the commission and was replaced by a Republican justice.

The Commission was to be composed of 15 members: five drawn from the U.S. House of Representatives, five from the U.S. Senate, and five from the U.S. Supreme Court. The majority party in each legislative chamber would get three seats on the Commission, and the minority party would get two. Both parties agreed to this arrangement because it was understood that the Commission would have seven Republicans, seven Democrats, and one independent.

Obviously that independent would be the one to decide, and both parties wanted the same man: Justice David Davis, a friend and former colleague of Abraham Lincoln and the most trusted independent in the nation. Davis was a brilliant and ethical man, reputed as such in his lifetime; this episode proves it beyond any doubt, and it is astounding that so few know about it. According to one historian, “No one, perhaps not even Davis himself, knew which presidential candidate he preferred.” Just as the Electoral Commission Bill was passing Congress, the legislature of Illinois elected Davis to the Senate. Democrats in the Illinois Legislature believed that they had purchased Davis’s support by voting for him. They had miscalculated: instead of staying on the Supreme Court so that he could serve on the Commission, Davis promptly resigned as a Justice in order to take his Senate seat. His replacement, a Republican, voted for Hayes.

In February, the commission voted 8-7 along party lines that Hayes had won all the contested states, and therefore the presidency, by just one electoral vote. It ultimately gave the votes to Republican Rutherford B. Hayes even though Democrat Samuel Tilden had won more popular votes. With 185 electoral votes to Tilden’s 184, Hayes was declared the winner two days before he was inaugurated.

And just exactly how was this decision reached? Tilden and the Democrats gave up the election, which in all fairness they probably did win, because they got something in return. Disputed returns and secret back-room negotiations put Republican Rutherford B. Hayes in the White House. The commission voted 8 to 7 to award the electoral votes from South Carolina, Florida, and Louisiana (and one from Oregon) to Hayes. But that was not the end of the election.

 

What Happened Behind Closed Doors

Democratic members of Congress threatened to prevent the count of electoral votes and delay the resolution of the election with frequent adjournments and filibusters. With the threat of delay, Democrats hoped to win some concessions from Republicans. Furious at the ruling of the special commission, they refused to accept it and threatened a filibuster. So, in long meetings behind closed doors, Democrats and Hayes’ Republican allies hashed out what came to be known as the Compromise of 1877 (also known as the Wormley Agreement, the Bargain of 1877, or the Corrupt Bargain), an informal but binding agreement.

Finally, just after 4 a.m. on March 2, 1877, the Senate president declared Hayes the president-elect of the United States. Hayes—dubbed “His Fraudulency” by a bitter Democratic press—would be publicly inaugurated just two days later.

A secret backroom deal decided the election. The negotiations put Republican Rutherford B. Hayes in the White House—and Democrats back in control of the South. Hayes would become president on the condition that he ended federal Reconstruction in the South; in return, Democrats ended their filibuster of the certified results and withdrew the threat of political violence. Two issues interested Democrats—restoring their control of governments, and thus white supremacy, in the South (and removing the last of the federal troops), and a federal subsidy for railroads. However, it is doubtful that Hayes, his supporters, and Democrats reached any sort of deal beyond what Hayes had promised to do in his letter of acceptance. Samuel J. Randall, the Democratic Speaker of the House, realizing that creating chaos would backfire on the Democrats, finally ruled the filibusterers out of order and forced the completion of the count in the early hours of March 2, 1877.

In fact, even as the electoral commission deliberated, national party leaders had been meeting in secret to hash out what would become known as the Compromise of 1877. Hayes agreed to cede control of the South to Democratic governments and back away from attempts at federal intervention in the region, as well as place a Southerner in his cabinet. In return, Democrats would not dispute Hayes’s election, and agreed to respect the civil rights of Black citizens. Just two months after his inauguration, Hayes made good on his compromise and ordered the removal of the last federal troops from Louisiana. These troops had been in place since the end of the Civil War and had helped enforce the civil and legal rights of many formerly enslaved individuals.

The disputed 1876 presidential election resulted in a compromise in which Republican Rutherford B. Hayes became president in exchange for the withdrawal of federal troops from the South. This effectively ended Reconstruction. Southern white Democrats, known as "Redeemers," regained control of state governments. They systematically dismantled Reconstruction-era reforms and restored white supremacy through laws, violence, and intimidation. The late 19th and early 20th centuries saw the establishment of Jim Crow laws, which enforced racial segregation and disenfranchised African Americans. These laws undid many of the advances made during Reconstruction. In conclusion, while the Radical Republicans initially succeeded in imposing their Reconstruction policies, their gains were largely undone by the end of the 19th century, leading to nearly a century of segregation and disenfranchisement for African Americans in the South.

More stringent enforcement and a more robust federal military presence and oversight in the South could have provided more protection for African Americans and ensured the implementation of Reconstruction policies. This would have helped prevent the rise of white supremacist groups and the rollback of civil rights gains. I don’t think extending the occupation past 1876 would have sufficed, however; by then the damage was done.

With this deal, Hayes ended the Reconstruction era and ushered in a period of Southern “home rule.” Soon after his inauguration, Hayes made good on his promise, ordering federal troops to withdraw from Louisiana and South Carolina, where they had been protecting Republican claimants to the governorships in those states. This action marked the effective end of the Reconstruction era, and began a period of solid Democratic control in the South. Soon, a reactionary, unfettered white supremacist rule rose to power in many Southern states. In the absence of federal intervention over the next several decades, hate groups such as the Ku Klux Klan flourished, and states enacted racist Jim Crow laws whose impacts continue to be felt today. For their part, white Southern Democrats did not honor their pledge to uphold the rights of Black citizens, but moved quickly to reverse as many of Reconstruction’s policies as possible. In the decades to come, disenfranchisement of Black voters throughout the South, often through intimidation and violence, helped ensure the racial segregation imposed by the Jim Crow laws—a system that endured for more than a half-century, until the advances of the civil rights movement in the 1960s. In this sense the 1876 presidential election provided the foundation for America’s political landscape, as well as race relations, for the next 100 years.

Key decisions by the U.S. Supreme Court struck at the protections afforded by Reconstruction-era constitutional amendments and legislation. The Court’s decision in the Slaughterhouse Cases (1873) established that the 14th Amendment applied only to formerly enslaved people, and protected only rights granted by the federal government, not by the states.

Three years later, in United States v. Cruikshank, the Supreme Court overturned the convictions of three white men convicted in connection with the massacre of more than 100 Black men in Colfax, Louisiana, in 1873 as part of a political dispute. The men had been convicted of violating the 1870 Enforcement Act, which banned conspiracies to deny citizens’ constitutional rights and had been intended to combat violence by the Ku Klux Klan against Black people in the South.

The Supreme Court’s ruling—that the 14th Amendment’s promise of due process and equal protection covered violations of citizens’ rights by the states, but not by individuals—would make prosecuting anti-Black violence increasingly difficult, even as the Klan and other white supremacist groups were helping to disenfranchise Black voters and reassert white control of the South.

Eleven years later, the debacle would also result in a long-overdue law: the Electoral Count Act of 1887, which codified Electoral College procedure. This was recently further supplemented after the events of January 6, 2021.

 


 

 

References

Foner, Eric (2002) [1988]. Reconstruction: America's Unfinished Revolution, 1863–1877. New York: Harper Perennial Modern Classics.

Grant, Ulysses S. (2003) [1885]. Personal Memoirs. New York: Barnes & Noble, Inc.

McFeely, William S. (1981). Grant: A Biography. New York: Norton.

https://www.history.com/news/reconstruction-1876-election-rutherford-hayes

https://millercenter.org/the-presidency/educational-resources/disputed-election-1876

https://www.smithsonianmag.com/smart-news/confusion-voter-suppression-and-constitutional-crisis-five-things-know-about-1876-presidential-election-180976677/

https://guides.loc.gov/presidential-election-1876

https://www.history.com/topics/us-presidents/compromise-of-1877

https://www.nationalgeographic.com/history/article/1876-election-most-divisive-united-states-history-how-congress-responded/

Robert F. Kennedy Jr.’s suspension of his third-party campaign for president and endorsement of Republican Donald J. Trump was a development with historical resonance. RFK, Jr. has long been known as a fiercely independent and idiosyncratic lawyer and environmentalist with an eclectic collection of positions and ideas, including vaccine skepticism. But among his other actions and assertions, RFK, Jr.’s embrace of Trump and, by extension, the Republican party, stands out for its direct opposition to the Democrats, the party of his forefathers, who did much to shape its values and lore and inspire future generations of adherents. RFK, Jr. is now campaigning energetically for Trump, and given the still-potent draw of the legendary Kennedy name, his support could conceivably make the difference in a razor-tight race.

Larry Deblinger explains.

Bobby Kennedy (left) with President Lyndon B. Johnson in 1966.

Upon hearing of RFK, Jr.’s decision, five of his eight surviving siblings released a brief statement condemning it as a “betrayal” of their family’s values and “a sad ending to a sad story.” Previously, at least 15 Kennedy family members had shunned RFK, Jr.’s candidacy and endorsed Joe Biden for president, before Biden dropped out. These relatives appear to view RFK, Jr. as a black sheep of the family, an aberration whose actions should be lamented and dismissed.

It might be tempting to view RFK, Jr.’s “sad story” through the operatic lens through which the Kennedy family saga has typically been chronicled, replete with tragic and untimely deaths, noble ideals, soaring oratory, and unrealized dreams. Indeed, RFK, Jr. hinted that his coming move rested on a high-minded principle, befitting a Kennedy. In April, RFK, Jr. asserted on CNN that President Joe Biden was a greater threat to American democracy than Trump, even though he called Trump’s attempts to subvert the 2020 election and other of his actions “appalling.” He argued that social media websites, under pressure from the Biden administration and its weaponization of government agencies, had blocked him from espousing his vaccine conspiracy theories, thus violating his Constitutional right to freedom of speech and threatening the most important pillar of democracy.

But it is worth noting that RFK, Jr.’s complaint was also of a direct and personal nature. And in this context, it must also be considered that alongside their good looks and charisma, drive, wit, brilliance, eloquence, and idealism, prominent Kennedys have shown a capacity to act out of sheer spite: personal, petty, mean-spirited, and hateful vindictiveness. Both RFK, Jr.’s father, Robert F. “Bobby” Kennedy, and his uncle Edward M. “Ted” Kennedy evinced this marked tendency in the political arena at key moments in American history. Through this lens, RFK, Jr.’s action appears not so much a “betrayal” of Kennedy family values as another recurrence of a familiar Kennedy failing, and his alliance with Trump little more than a personal and vindictive swipe against the Democratic party.

 

Youth

From his youth, Bobby Kennedy was a kind of family attack dog, keen to perceive and avenge any slights to himself or his family members. The “runt” of Rose and Joseph Kennedy’s storied litter, Bobby made up for his small size and limited talents (at least compared with his brothers) with tenacity and scrappiness in sports and academics, often spoiling for fights. It did not take much; as a student at Harvard, RFK once smashed a beer bottle over a young man’s head, sending him to the hospital for stitches, simply because he had the temerity to celebrate his birthday at the same Cambridge bar and same time as Bobby.1 And he held a grudge. “When Bobby hates you, you stay hated,” Joe Kennedy once said of the son who seemed most to take after him.2 As an adult, armed with a law degree from the University of Virginia, RFK became an assistant counsel to US Republican Senator Joseph R. McCarthy’s infamous investigative committee that during 1953-54 recklessly and often spuriously alleged Communist influence in the US government and media.

It was during this period that RFK first met then-Senate Majority Leader, Lyndon Johnson, a Democrat from Texas, and for Bobby, it was hatred at first sight. He had known of Johnson as a protégé of former President Franklin D. Roosevelt, the man who had recalled his father as US Ambassador to England in 1940 and fired him; Johnson was at FDR’s side during much of the humiliating process, and that, apparently, was enough for Bobby.3  FDR had clear and substantive reasons for his action, including Joe Kennedy’s early support for appeasement of Adolf Hitler in the late 1930s; publicly expressed pessimism over the survival of Great Britain and of democracy in Europe (and privately expressed antisemitism); suspicion of his being a Nazi sympathizer; and British Prime Minister Winston Churchill’s calls for Kennedy’s dismissal. Nonetheless, son Bobby saw the firing as a family offense not to be forgiven.

So, when Majority Leader Johnson entered the Senate cafeteria with two assistants one day in 1953 and passed a table where McCarthy was meeting with his staff, Bobby sat glowering in his seat while the rest of McCarthy’s team jumped up to shake the hand of the “Leader,” in keeping with Senate decorum.3 Not to be deterred, the towering LBJ, almost 6 feet 4 inches tall, stood over RFK and stuck out his hand, waiting for a long, awkward moment before Bobby finally rose and shook it without looking at Johnson.

 

Feud

The epic LBJ-RFK feud was on. There were Johnson’s repeated attempts after the first to squeeze handshakes out of Bobby Kennedy just to torment him, and a few disparaging comments from Johnson about Joe Kennedy’s ambassadorship in England. There was the incident in 1959 on Johnson’s ranch, where RFK was sent by his brother John to sound out Johnson on his intentions of running for president, when LBJ insisted on some deer hunting and Bobby was thrown flat on his back by a rifle recoil. “Son, you’ve got to learn how to handle a gun like a man,” Johnson said as he helped him up.4  

 

Beyond the insults, RFK despised Johnson as a man who in his opinion exhibited all the worst traits of the classic politician: an unprincipled and conniving lust for power, loose regard for the truth, rampant egoism, and selfish vanity. To RFK’s Northeastern elite sensibilities, Johnson’s rude and crude, dirt-poor, working-class Southwestern manners, physically overbearing political style, and segregationist past were repugnant and worthy of withering scorn, something Johnson fully recognized and resented.

But the true measure of RFK’s pettiness emerged with the ascendance of LBJ to Vice President in his brother John’s administration, and to the presidency after his brother’s death: an inability to respect the office however much he detested the man. Even though JFK, considering him vital to his electoral prospects, had offered LBJ the VP post, and LBJ had accepted, Bobby repeatedly visited Johnson in his hotel room during the Democratic convention to get him to decline the offer. RFK later insisted his attempts were at his brother’s behest, a contention that historians view with skepticism.5,6 It was during this episode that Johnson began calling RFK “that little shitass” and “worse” names, according to a close associate.7

The ill-will continued through JFK’s tragically shortened presidency, under which RFK served as Attorney General. JFK knew that the vice presidency was an extremely confining office for an accomplished power broker like Johnson, and he was determined that LBJ be treated with dignity, if only to assuage his massive ego. In general, JFK and Johnson enjoyed cordial, gentlemanly, and mutually respectful relations.8,9 Yet, RFK radiated disrespect towards Johnson, barging into his meetings without a word of apology and treating him like an underling9; indeed, for all practical purposes, Bobby was the number two in the JFK administration. The tight-knit Kennedy staffers called LBJ nicknames like “Uncle Cornpone” behind his back.10

 

Out of Office

It was even worse out of the office. Bobby and his wife Ethel held frequent parties for “Kennedy people” (Johnson called them “the Harvards”) at their home, Hickory Hill in Virginia, where the ridicule of LBJ turned kind of sick, according to historian Jeff Shesol in his 1997 book on the RFK-LBJ feud, Mutual Contempt:

Johnson jokes and Johnson stories were as inexhaustible as they were merciless. Those that percolated during the campaign had been humorous, but this new material betrayed a real bitterness, a mean-spiritedness that was hard to explain…Time (magazine)’s Hugh Sidey, a frequent visitor to Hickory Hill was appalled by the gang’s ridicule of LBJ, which he described as “just awful…inexcusable, really.” In October 1963, friends gave Bobby Kennedy an LBJ voodoo doll; “the merriment,” Sidey later reported, “was overwhelming.”11

 

 

The frivolity likely vanished after the assassination of JFK in Dallas, Texas, but not the feud between RFK and LBJ, exacerbated by the fact that the shooting occurred in Johnson’s home state. RFK, overwhelmed with grief, resolved to stay on as Attorney General, but without letting go of his animosity. “From the moment Air Force One (bearing JFK’s body) landed in Washington, and progressively in the days and weeks that followed, Bobby was ready to see slights to his brother, his brother’s widow, or himself in whatever Lyndon Johnson did or didn’t do,” wrote LBJ biographer Merle Miller.12         

Although Johnson performed faithfully and admirably in honoring JFK’s legacy and advancing his policy agenda, according to contemporary journalists and historians, his personal attempts as President to show respect and sensitivity to the Kennedys were all rudely rebuffed. “Overtures from Johnson to the Kennedy family after the Kennedy assassination were rejected in a manner that was thoroughly offensive and insulting,” observed contemporary Clark Clifford, an eminent Washington DC attorney and veteran Democratic party insider.12

And the hostility did not stop at mere personal gestures. As historian Shesol explains of Johnson’s early days as president:

Johnson desperately needed affirmation, and in the hour of his greatest burden, it came from unlikely sources—from the Congress, which had spurned and mocked him for a thousand days; from the cabinet, appointed by his predecessor; from the American people, who cherished John Kennedy in death as they had not in life. All rallied to the new president. They gave him their patience and their trust.

Bobby Kennedy was not among them, and in Bobby’s absence Johnson felt the suspicion and rejection he feared from the rest.13  

           

Ironically, a book that the Kennedy family members had commissioned expressly to control the narrative of the JFK assassination and aftermath, and protect their image, publicly exposed the intense antagonism towards LBJ, which shocked reviewers. Entitled “The Death of a President” and written by William Manchester, who was given extensive and exclusive access to the Kennedys and their records, the book was, in the words of Time magazine, “seriously flawed by the fact that its partisan portrayal of Lyndon Johnson is so hostile that it almost demeans the office itself.” It is impossible to parse exactly what proportion of this hostility might have come independently from the author, rather than the Kennedys (although the author was handpicked and vetted by the family). At any rate, the Kennedys were unhappy with the book for various reasons and sued to stop general publication of it before changes were made. “Bobby worried that the book might make it appear that the Kennedys had not given Johnson a chance to succeed in the Presidency and that their opposition was nothing more than a personal vendetta,” wrote Michael W. Schuyler, an historian at Kearney State College in Nebraska.14

           

Bobby Kennedy

LBJ went on to win election in his own right in 1964, by one of the largest landslide victories in American history. He then successfully pushed through epochal Civil Rights legislation and social welfare programs like Medicare and Medicaid, along with anti-poverty initiatives and other legislation ranging from the arts to immigration, environmental protection, education, and gun control, compiling a domestic record that, on the whole, remains a landmark achievement of American progressivism. But his controversial and disastrous Vietnam war policies rapidly undermined his presidency, compelled him to decline to run for re-election, and ended his political career. Bobby Kennedy left the Johnson administration to run for US Senator from New York, winning the seat in November 1964 and taking office in 1965. He was assassinated in 1968 while campaigning for president on an anti-war platform.

It might be reassuring, in terms of the Kennedy legacy, to think that the LBJ-RFK feud was entirely a one-off, generated by the forced proximity and interaction of two dynamic personalities who were almost uniquely born to clash. But that is not the case. A mere 12 years after Bobby’s violent death, a relatively brief but all-too-familiar spectacle of petty and personal spite and resentment involving a Kennedy took center stage in American politics.

 

1980 convention

The setting was the Democratic party convention of 1980, a presidential election year. The intraparty combatants were the incumbent US president, James Earl Carter, son of a peanut farmer from Georgia, and Ted Kennedy, US Senator from Massachusetts and scion of the wealthy, celebrated, star-crossed political family that some Americans viewed as royals in exile. Although Carter had won the party’s nomination handily after a bitter battle, he stood awkwardly at the podium, having completed his acceptance speech, waiting for Kennedy to arrive and, in effect, certify his candidacy, as though the senator were a higher authority.

The contest itself was inherently anomalous, and humiliating for Carter. “Never before had a sitting President, an elected President, with command of both houses of Congress and the party machinery, been so challenged by his own people. What was even more remarkable was the nature of the challenge—a charge of incompetence,” wrote contemporary journalist and historian Teddy White.15

By 1980, Carter’s presidency was foundering, beset on all sides by crises foreign and domestic. The economy was struggling with the combination of persistent inflation, slow economic growth, and high unemployment called “stagflation.” The 1979 revolution in Iran, which replaced the US-backed Shah with an Islamic theocracy, spooked Americans who remembered the Arab oil embargo of the early 1970s and drove them to hoard gas, resulting in long gas lines, dwindling supplies, and mounting hysteria, including killings and riots. The infamous Iran hostage crisis erupted later in 1979, when Iranian militants captured over 50 Americans at the US embassy in Tehran and held them for 444 days, prompted at least in part by Carter’s decision to allow the exiled Shah to enter the US for cancer treatment.

 

 

Ted Kennedy

Despite some landmark achievements, such as his forging of the Camp David Peace Accords between Israel and Egypt, Carter failed to convince the American people that he had a sure grip on the helm of state. He had a curiously stiff personal style, despite his ever-present wide smile, and a technician’s approach to solving national problems that was uninspiring to the public and did not always work. His closest advisers, like Carter himself, were from Georgia, and the team, including the President, came to office with a regional chip on its shoulder, bristling with peevish hyperawareness, if not combative pride, in being outsiders to the Washington establishment. As Carter’s approval ratings plunged, sinking to 28% in June of 1979, a bit of that Southern defiance appeared to flare when Carter was asked at a gathering of Congressmen whether he planned to run for re-election (a question insulting in itself), particularly given the possibility that Ted Kennedy might challenge him for the party’s nomination.

“I’m going to whip his ass,” Carter replied, referring to Kennedy, and then repeated it when asked (in disbelief) if that was what he meant.16 When confronted with the widely reported statement, Kennedy smoothly responded that the president must have been misquoted.

It was the first publicly overt expression of tension between Carter and Kennedy. Later in 1979, further signs of rivalry were palpable at the opening of the John F. Kennedy Library in Boston, where both men spoke. The event started inauspiciously for Carter when he leaned in to kiss Jacqueline Kennedy Onassis on the cheek in greeting, “just as a matter of courtesy,” and “she flinched away ostentatiously,” as Carter remembered decades later.17 In their speeches, ostensibly in honor of JFK, both Carter and Kennedy slyly inserted warnings, shots across the bow, aimed at each other.

Watching with growing disgust as Carter faltered in his efforts to be the president the American people wanted and needed, Kennedy became convinced that he could fill the leadership void, and announced his candidacy for the Democratic nomination.

 

Contest

But the matchup was a contest of weaknesses. While Carter had acquired the image of a bumbler, Kennedy was a deeply flawed and inept candidate. Grave doubts about his character relentlessly shadowed him over the 1969 incident at Chappaquiddick, Massachusetts, an island off Martha’s Vineyard, when he drove a car off a bridge and into a pond, causing the death of Mary Jo Kopechne, a young woman who was a passenger in the car. Although Kennedy swam to safety, he failed to call the police for 10 hours, during which Kopechne’s life might have been saved. Kennedy further undermined himself in a one-on-one interview on prime-time network television, in which he was unable to answer the direct question of why he wanted to be president, responding with an incoherent stream of hesitations and pointless phrases: an epic word salad. Mirroring this ambivalence, Kennedy campaigned with inconsistent energy and conviction, championing an old-line liberalism that many thought outdated.

A month before the convention, Carter had won enough primaries and delegates to secure his renomination, with a commanding lead over Kennedy; as promised, Carter had “whipped” Kennedy. And yet Kennedy refused to bow out, having adopted a “kamikaze-like state of mind,” according to Jon Ward in his 2019 book about the Carter-Kennedy rivalry, Camelot’s End. “Many in the Kennedy camp were disgusted by Carter,” wrote Ward. “They felt he was no better than (Republican presidential nominee Ronald) Reagan, and almost preferred to see Reagan win.”18

The Kennedy camp insisted on an “open convention,” meaning that delegates would be free to vote for whomever they wished, regardless of the choice of the rank-and-file primary voters they were supposedly pledged to represent. In the meantime, a poll showed Carter with a 77% national disapproval rating.19 The Democrats agreed to put the open-convention question to a floor vote.

Carter prevailed in that vote and finally won the nomination with almost two-thirds of the delegates. Kennedy conceded, but he was not done fighting. His camp insisted on a vote on the party platform, including liberal planks far to the left of Carter’s policies that would defy and embarrass the President; the vote would take place right after Senator Kennedy’s scheduled speech, so as to set the most favorable atmosphere for the planks’ approval.

The Carter people knew exactly what was planned and were losing patience. “If you have any wisdom and judgment at all, you know you don’t get carried away by personalities and pettiness in a political fight,” Carter’s campaign manager, Bob Strauss, recounted to The New Yorker. “Politics is tough enough…that you don’t cut each other’s throats.” Carter’s Press Secretary, Jody Powell, later wrote in his memoir of the election, “We neglected to take into account one of the most obvious facets of Kennedy’s character, an almost child-like self-centeredness.”

 

Kennedy’s speech        

In the event, Kennedy’s convention staff did behave childishly, like a bunch of drunken frat boys, on the day of his speech. Kennedy floor manager Harold Ickes invoked an obscure procedural rule to stop the afternoon convention activities, “in a gesture done purely out of spite,” wrote Ward in a 2024 Politico article. “We just said, ‘Fuck ‘em,’” explained Ickes in an interview. “I mean, we weren’t thinking about the country. We weren’t even thinking about the general election. It was, ‘Fuck ‘em.’ You know? To be blunt about it.”

Fistfights almost broke out on the convention floor when outraged Carter staffers confronted Ickes, who responded with “Go fuck yourself, I’m shutting this convention down.” The fisticuffs were averted, luckily, by a phone call from Kennedy in his hotel room, curious to know what had stopped the proceedings he was watching on television. When told the convention would be stalled for two hours, Kennedy, after a long pause, told Ickes to allow it to go forward.

Perhaps relieved of the burden of pursuing a losing cause, Kennedy gave a thoughtful, eloquent, stem-winding speech later that night, which is still remembered as one of the best in American political convention history. Kennedy invoked the Democratic party’s heritage of support for the common man, and the wisdom of the 19th-century poet Alfred, Lord Tennyson, with pleas to re-unite the country and the party, lyrically concluding, in a paean to big-hearted, big-spending liberalism, “the work goes on, the cause endures, the hope still lives, and the dream shall never die.”

And yet, the good vibes and elevated, Camelot-like aura were shattered by another Kennedy-driven spectacle before a prime-time national TV audience on the last, climactic night of the convention. Carter did not help his cause by starting off his acceptance and campaign kick-off speech with a shouted tribute to the late Democratic Senator and former Vice President Hubert Horatio Humphrey, whom he misnamed Hubert Horatio Hornblower (Horatio Hornblower was a fictional Napoleonic-era British naval officer in a popular 20th-century series of stories and novels) before hastily correcting himself. When he had finished his speech, almost 20 minutes ticked by as various party luminaries (and some not so luminary) joined him on the stage for a desultory show of unity, waiting for the final moment and leaving bored TV news commentators to mutter derisive comments to their audiences.

The Kennedy team had orchestrated that final moment by insisting that Kennedy would watch the speech not at the arena but in his hotel room, and would then make his way to the convention, thus making the dramatically delayed final appearance of the show, like the headliner of a rock concert or a champion boxer.

 

Handshake

When Kennedy did appear, to a roar of excitement, it was obvious to almost everyone watching, or made clear to them by the TV journalists on the scene, that Carter was looking for one thing: the classic political handshake of the party’s top politicians, former rivals standing together in full view of the spotlights and cameras, their interlocked hands thrust high in the air, in a thrilling and triumphant show of unity, strength, and expectation of victory, of party over personal interest, bitterness, and division. He never got it. Kennedy did shake Carter’s hand five times, by Ward’s count, but each time in a crowd, with the brief and perfunctory manner of a campaigner taking the hand of someone on a rope line. The TV commentators duly noted each increasingly embarrassing failure. As Carter followed him around, Kennedy began to “smirk” and “chuckle,” according to Ward; he finally patted Carter on the back before leaving the arena to cheers.

Two months later, Paul Corbin, a peripheral, unofficial member of the Kennedy campaign staff with longstanding ties as a helper to the Kennedy family, stole Carter’s briefing books for a general election debate with Reagan and gave them to the Reagan campaign, according to information gathered in a Congressional investigation and a 2009 book by political consultant and author Craig Shirley.20

 

After the 1980 election

Carter went on to lose the election to Reagan, but thereafter led one of the most active, productive, and distinguished post-presidential lives in American history. Ted Kennedy, who died in 2009, remained US Senator from Massachusetts for decades, compiling a highly distinguished legislative career featuring his steadfast advocacy for a national health care system, which was finally realized, in at least some form, in 2010 under the Obama administration.

With regard to health care reform, however, Carter charged in his presidential memoirs that his administration’s proposal for a national health plan, which was devised over a two-year period by an array of economic experts and government leaders, including Ted Kennedy, and had support from key Congressional leaders, was scuttled by Kennedy in 1979 when he opposed it “at the very end,” ultimately resulting in a 30-year delay in national health care.21 Carter repeated the charge in 2010 in TV interviews with 60 Minutes and Larry King, alleging that Kennedy acted “out of personal spite” and out of ambition to run for president and enact his own health care plan. In his own writings, Kennedy counter-charged that it was Carter who delayed the plan (https://www.cbsnews.com/news/time-has-not-cooled-jimmy-carter-ted-kennedy-feud/).

 

RFK, Jr.

And so, we arrive at RFK, Jr., son of Bobby and nephew of Ted, choosing to support Republican Donald J. Trump, a convicted felon facing dozens of additional criminal charges, in his campaign for re-election as president. RFK, Jr. appears to justify his stance at least partly as a defense of freedom of speech. But he has yet to explain how supporting a candidate whose relentless abuse and corruption of that very right, by knowingly spewing lies that have sown chaos, threatened the democratic system, and endangered public safety, could possibly serve to protect freedom of speech and democracy. Then again, RFK, Jr.’s stance might have little to do with anything so grand as ideas, principles, and the national interest. After all, he is a Kennedy.  

 


 

 

 

Print References

1. Caro, Robert A. (2012). The Years of Lyndon Johnson: The Passage of Power. Alfred A. Knopf; New York, NY: pg. 63.

2. Ibid., pg. 66.

3. Ibid., ppg. 61-3.

4. Shesol, Jeff. (1997). Mutual Contempt: Lyndon Johnson, Robert Kennedy, and the Feud that Defined a Decade. W.W. Norton & Company; New York, NY: pg. 10.

5. Caro, Robert A. (2012). The Years of Lyndon Johnson: The Passage of Power. Alfred A. Knopf; New York, NY: ppg. 122-40.

6. Shesol, Jeff. (1997). Mutual Contempt: Lyndon Johnson, Robert Kennedy, and the Feud that Defined a Decade. W.W. Norton & Company; New York, NY: ppg. 48-57.

7. Caro, Robert A. (2012). The Years of Lyndon Johnson: The Passage of Power. Alfred A. Knopf; New York, NY: pg. 139.

8. Ibid., ppg. 177-195.

9. Shesol, Jeff. (1997). Mutual Contempt: Lyndon Johnson, Robert Kennedy, and the Feud that Defined a Decade. W.W. Norton & Company; New York, NY: ppg. 77-79.

10. Caro, Robert A. (2012). The Years of Lyndon Johnson: The Passage of Power. Alfred A. Knopf; New York, NY: pg. 198.

11. Shesol, Jeff. (1997). Mutual Contempt: Lyndon Johnson, Robert Kennedy, and the Feud that Defined a Decade. W.W. Norton & Company; New York, NY: pg. 104.

12. Ball, Moira Ann. The phantom of the oval office: The John F. Kennedy assassination's symbolic impact on Lyndon B. Johnson, his key advisors, and the Vietnam decision-making process. Presidential Studies Quarterly. 1994;24(1):105-119.

13. Shesol, Jeff. (1997). Mutual Contempt: Lyndon Johnson, Robert Kennedy, and the Feud that Defined a Decade. W.W. Norton & Company; New York, NY: pg. 119.

14. Schuyler, M.W. Ghosts in the White House: LBJ, RFK, and the assassination of JFK. Presidential Studies Quarterly. 1987;17(3):503-518.

15. Ward, J. (2019). Camelot's End: Kennedy vs. Carter and the Fight that Broke the Democratic Party. Hachette Book Group; New York, NY: pg. 146.

16. Ibid., pg. 126.

17. Ibid., pg. 152.

18. Ibid., pg. 230.

19. Ibid., pg. 251.

20. Ibid., ppg. 284-5.

21. Carter, J. (2010). White House Diary. Farrar, Straus and Giroux; New York, NY: pg. 325.

Sir Isaac Newton was born (according to the Julian calendar then in use in England) on Christmas Day, the 25th of December 1642 (the 4th of January 1643 in the New Style calendar), at Woolsthorpe Manor in Woolsthorpe-by-Colsterworth, a hamlet in the county of Lincolnshire, England.

Sir Isaac Newton was one of the most influential scientists in human history; his groundbreaking work in mathematics, physics, and astronomy laid the foundations of classical mechanics and continues to shape modern science to this day.

Terry Bailey explains.

Isaac Newton in later life. Painting by James Thornhill.

Early Life and Education

Newton's early life was marked by personal hardships. His father died three months before he was born, and when Newton was three, his mother remarried, leaving him in the care of his maternal grandmother. As a child, Newton displayed a curiosity about the world that would later evolve into groundbreaking scientific inquiries. He was sent to The King's School in Grantham, where he demonstrated a gift for mathematics and mechanics, often constructing elaborate mechanical devices, such as sundials and windmills, during his free time.

Newton enrolled at Trinity College, Cambridge, in 1661, at the age of 18. Cambridge, however, offered a curriculum centered on Aristotelian philosophy, which Newton found inadequate to explain the natural world. During this time, he encountered the works of philosophers such as René Descartes and astronomers like Galileo Galilei, which inspired his independent thinking. It was during the mid-1660s, when Cambridge was closed (1665-1667) due to the Great Plague, that Newton made his first breakthroughs.

 

The Annus Mirabilis (The "Year of Wonders")

Newton's most productive period came during his time away from Cambridge between 1665 and 1667, often referred to as his "Annus Mirabilis." During these years, he developed the fundamental principles of calculus, formulated his theories on optics, and famously began to conceive the laws of motion and universal gravitation.

 

Calculus: The Foundation of Modern Mathematics

 

One of Newton's most profound achievements was the development of calculus, a new branch of mathematics that allowed for the analysis of continuously changing quantities. Although the invention of calculus is often attributed to both Newton and the German mathematician Gottfried Wilhelm Leibniz, Newton's work predated Leibniz's publication by several years. It is also important to note that Archimedes (c. 287-212 BCE) had already begun developing early concepts of integral calculus.

Newton used calculus to describe rates of change, which was crucial in his subsequent scientific discoveries. For example, calculus allowed Newton to analyze the motion of objects, calculate the changing velocities of falling bodies, and predict the paths of planets.
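As a minimal illustration of the idea, in modern notation rather than Newton's own fluxion notation, and using a standard textbook free-fall example rather than anything drawn from Newton's papers, calculus ties position, velocity, and acceleration together by differentiation:

\[ s(t) = \tfrac{1}{2} g t^{2}, \qquad v(t) = \frac{ds}{dt} = g t, \qquad a(t) = \frac{dv}{dt} = g \]

Here s(t) is the distance a body has fallen after time t, its derivative v(t) gives the steadily increasing speed, and differentiating once more recovers the constant acceleration g (about 9.8 m/s² at the Earth's surface).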

 

Optics: The Nature of Light and Color

During his retreat in 1666, Newton also conducted groundbreaking experiments with optics. Using a prism, Newton demonstrated that white light could be split into a spectrum of colors, showing that white light was a mixture of different wavelengths of light rather than a pure substance.

This discovery revolutionized the field of optics and dispelled prevailing theories that colors were produced by the modification of white light. His work on light also led him to build the first practical reflecting telescope, known as the Newtonian telescope, in 1668. This innovation eliminated chromatic aberration—a problem that plagued refracting telescopes—and allowed for sharper images of celestial objects.

 

The Principia and Newton's Laws of Motion

In 1687, Newton published his magnum opus, Philosophiæ Naturalis Principia Mathematica, often referred to simply as the Principia. This work laid the groundwork for classical mechanics and established Newton's lasting influence on science.

 

Newton's Laws of Motion

The Principia is perhaps most famous for the articulation of Newton's three laws of motion, which describe the relationship between an object's motion and the forces acting upon it:

First Law (Inertia): An object at rest stays at rest, and an object in motion stays in motion with the same speed and in the same direction unless acted upon by an unbalanced external force.

Second Law (Force and Acceleration): The acceleration of an object is directly proportional to the net force acting upon it and inversely proportional to its mass. This law is succinctly expressed by the formula F = ma (force equals mass times acceleration).

Third Law (Action and Reaction): For every action, there is an equal and opposite reaction. This principle explains why a rocket is propelled upward as gas is expelled downward.

 

These laws transformed the study of motion and became the foundation of classical mechanics, allowing scientists to predict the behavior of moving objects and understand phenomena like the orbits of planets and the trajectories of projectiles.
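To make the second law concrete, here is a small worked example (the numbers are chosen purely for illustration):

\[ a = \frac{F}{m} = \frac{10\ \text{N}}{2\ \text{kg}} = 5\ \text{m/s}^{2} \]

A net force of 10 newtons acting on a 2-kilogram mass produces an acceleration of 5 meters per second squared; for the same force, doubling the mass halves the acceleration.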

 

The Universal Law of Gravitation

Newton's law of universal gravitation is another keystone of his legacy. Newton proposed that every particle of matter in the universe attracts every other particle with a force that is proportional to the product of their masses and inversely proportional to the square of the distance between them. This revolutionary idea provided a unifying explanation for both terrestrial and celestial phenomena.
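In modern notation the law reads as follows (Newton stated the proportionality itself; the constant G was only measured over a century later, in Henry Cavendish's 1798 experiment):

\[ F = G \, \frac{m_{1} m_{2}}{r^{2}} \]

where F is the attractive force, m1 and m2 are the two masses, r is the distance between their centers, and G is the universal gravitational constant (about 6.674 × 10⁻¹¹ N·m²/kg²).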

Newton's law of gravitation explained why apples fall to the ground, why the Moon orbits the Earth, and why planets revolve around the Sun. It was the first time a mathematical theory provided a comprehensive explanation of the mechanics of the universe. With this law, Newton showed that the same forces governing falling objects on Earth were responsible for the motion of the planets, revolutionizing our understanding of the cosmos.
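A rough version of Newton's famous 'Moon test' illustrates this unification (using modern rounded values purely for illustration). The Moon orbits at about 60 Earth radii, so the inverse-square law predicts that gravity there should be weaker than surface gravity by a factor of 60²:

\[ a_{\text{Moon}} \approx \frac{g}{60^{2}} = \frac{9.8\ \text{m/s}^{2}}{3600} \approx 2.7 \times 10^{-3}\ \text{m/s}^{2} \]

which agrees with the Moon's measured centripetal acceleration, \( \omega^{2} r = \left( \frac{2\pi}{27.3\ \text{days}} \right)^{2} \times 3.84 \times 10^{8}\ \text{m} \approx 2.7 \times 10^{-3}\ \text{m/s}^{2} \).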

 

Later Life and Scientific Work

Following the publication of the Principia, Newton's reputation as one of the world's foremost scientists was firmly established. He was appointed Lucasian Professor of Mathematics at Cambridge, a position he held until 1696 when he moved to London to become Warden of the Royal Mint. There, Newton played a key role in reforming England's coinage and combating widespread counterfeiting.

 

Alchemy and Theology

Although Newton is best known for his contributions to mathematics and physics, he also spent a significant portion of his life studying alchemy and theology. Alchemy, a proto-scientific tradition, sought to transform base metals into gold and to discover the elixir of life (which, to the initiated, was a metaphorical concept for other scientific pursuits). While Newton never made significant strides in these areas, his alchemical work reveals the breadth of his intellectual curiosity, as he applied rigorous scientific methodology even to this pursuit.

Newton's theological writings were also substantial, though they remained unpublished during his lifetime. He was deeply interested in biblical prophecy and sought to reconcile his scientific work with his religious beliefs. Despite his unorthodox theological views, Newton believed that the universe operated under divine law, and this conviction reinforced his scientific inquiries. However, the more he studied these ideas, the more separate the two domains became in his thinking.

 

Newton's Legacy in Science

Isaac Newton's scientific achievements had a profound impact on future generations of scientists. His methods of inquiry—based on empirical observation, mathematical rigor, and logical reasoning—became the standard for scientific exploration.

 

Influence on Physics

Newton's work in physics formed the basis for much of what is now called classical mechanics. For over two centuries, Newton's laws of motion and gravitation remained the cornerstone of physics, providing a comprehensive framework for understanding the movement of bodies in the universe.

It was not until the 20th century, with the advent of Einstein's theory of relativity and quantum mechanics, that Newton's ideas were modified to account for the behavior of objects at extreme speeds and small scales. However, even in these contexts, Newton's laws remain a valid approximation for much of the physical world.

It is vital to understand that Newton was not incorrect; Einstein's theories were simply a furtherance of Newton's findings. Isaac Newton lived in the age of the horse and carriage, when the concept of the speed of light was virtually unknown. When Einstein brought the speed of light into the equations, science could move beyond Newton's discoveries, which is the true nature of scientific progress and, in its way, solid proof of Newton's legacy to the world.

 

Impact on Mathematics

Newton's development of calculus opened new avenues for mathematical exploration. His methods for calculating the rate of change and determining areas under curves became essential tools in mathematics, engineering, and physics. Calculus remains a central component of modern mathematics education and is used extensively in fields ranging from physics to economics.

 

Contributions to Astronomy

 

Newton's law of gravitation allowed astronomers to better understand planetary motion and celestial mechanics. Using Newton's theories, astronomers could predict the orbits of planets and comets with unprecedented accuracy. Newton's work also helped scientists understand the forces governing tides, the behavior of moons, and the dynamics of stars and galaxies.

Side note: Galileo Galilei had already discovered the four largest moons orbiting Jupiter, now known as the Galilean moons (satellites), on the 7th of January 1610. These moons eventually became known as Io, Europa, Ganymede, and Callisto.

 

Newton's Philosophical Impact

In addition to his scientific work, Newton influenced the philosophical understanding of nature and human knowledge. His emphasis on observation and mathematical explanation helped shape the Enlightenment view that nature operates according to discoverable laws. Philosophers like John Locke and Immanuel Kant were profoundly influenced by Newton's work, and his ideas were integral to the rise of empiricism and the scientific method.

In conclusion, Isaac Newton's life and work are solid evidence of the power of human curiosity and intellect. From his formulation of calculus and groundbreaking work in optics to his laws of motion and gravitation, Newton reshaped humanity's understanding of the natural world. His influence extended far beyond his era, setting the stage for centuries of scientific progress. Newton's legacy endures, not only in the discoveries he made but in the methods of inquiry and analysis he championed—methods that continue to drive science forward today.

 


 

 

Notes:

Reflecting telescope (Newtonian telescope)

A reflecting telescope of this kind, often referred to as a Newtonian telescope, was invented by the English scientist Sir Isaac Newton and uses a concave primary mirror and a flat diagonal secondary mirror. Newton's first reflecting telescope was completed in 1668 and is the earliest known functional reflecting telescope.

The Newtonian telescope's simple design has made it very popular with amateur telescope makers.

 

Refracting telescope

 

A refracting telescope, often referred to as a refractor, is a type of optical telescope that uses a lens as its objective to form an image. Also known as a dioptric telescope, it was the earliest type of optical telescope.

The first record of a refracting telescope appeared in the Netherlands about 1608, when a spectacle maker from Middelburg named Hans Lippershey unsuccessfully tried to patent one.

News of the patent spread fast, and Galileo Galilei, who happened to be in Venice in May 1609, heard of the invention, constructed a version of his own, and applied it to making astronomical discoveries.

 

Chromatic aberration

Chromatic aberration, also referred to as chromatic distortion, color fringing, and sphero-chromatism, is a common optical phenomenon that occurs when a lens cannot bring all wavelengths of light to a single converging point.

 

Jupiter's moons

Jupiter currently has 95 moons that have been officially confirmed and recognized by the International Astronomical Union, (IAU).