The word progressive is used as a badge of honor by some and as a means of attack by others in modern politics. But to be progressive meant something different in earlier times. Here, Joseph Larsen tells us about a new book on the subject: Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era, by Thomas C. Leonard.

Bernie Sanders, a self-styled progressive and contender for the Democratic presidential nomination in 2016. Pictured here in 2014.

The United States is in an election year with public confidence in government sinking – 2014 and 2015 Gallup polls show confidence in Congress at all-time lows.[1] Voters and pundits are engaged in bitter battles over the meaning of left and right, with the politically charged term “progressive” used and abused by voices across the political spectrum. Bernie Sanders and Hillary Clinton, the leading Democratic Party candidates, both wear it as a badge of honor. Yet the term, though widely used, is little understood. During Barack Obama’s first presidential term, one left-leaning history professor described a progressive as anyone “who believes that social problems have systemic causes and that governmental power can be used to solve those problems.”

Progressivism has an ugly history, too. The side of the Progressive Era the American left would rather forget is dredged up by Princeton University scholar Thomas C. Leonard in Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era. In a scathing criticism of the American Progressive Era, Leonard emphasizes the movement’s rejection of racial equality, individualism, and natural rights. Progressivism was inspired by the torrent of economic growth and urbanization that transformed late nineteenth-century America. Mass-scale industrialization had turned the autonomous individual into a relic. “Society shaped and made the individual rather than the other way around,” writes Leonard. “The only question was who shall do the shaping and molding” (p. 23). Naturally, the progressives chose themselves for that task.

Much of the book is devoted to eugenics. Defined as the effort to improve human heredity through selective breeding, the now-defunct pseudoscience was a pillar of early 20th century progressivism. Leonard argues that eugenics fit snugly into the movement’s faith in social control, economic regulation, and Darwinism (p. 88). But Darwin was ambiguous on whether natural selection produced not only change but also progress. This gave progressive biologists and social scientists a chance to exercise their self-styled expertise. To the eugenicists, random genetic variation and the survival of inferior traits were useless; what was needed was social selection, reproduction managed from above to ensure the proliferation of the fit and the removal of the unfit (p. 106). Experts could expose undesirables and remove them from the gene pool. Forced sterilization and racial immigration quotas were popular methods.

 

 

The book’s most memorable chapter analyzes minimum wage legislation. These days, this novelty of the administrative state is taken for granted – many on the left currently argue that raising the wage floor doesn’t destroy jobs – but Leonard finds its roots in Progressive Era biases against market exchange, immigrants, and racial minorities. Assuming that employers always hire the lowest-cost candidates and that non-Anglo-Saxon migrants (as a function of their supposedly inferior race) always underbid the competition, certain progressives undertook to push them out of the labor market. Their tool was the minimum wage. Writes Leonard:

The economists among labor reformers well understood that a minimum wage, as a wage floor, caused unemployment, while alternative policy options, such as wage subsidies for the working poor, could uplift unskilled workers without throwing the least skilled out of work … Eugenically minded economists such as [Royal] Meeker preferred the minimum wage to wage subsidies not in spite of the unemployment the minimum wage caused but because of it (p. 163).

 

In the hands of a lesser author, this book could have been a partisan attack on American liberalism, and one that would find a welcoming audience in the current political landscape. Leonard deftly stands above the left-right fray. Rather than give ammunition to the right, he argues that progressivism attracted people from both ends of the political spectrum. Take Teddy Roosevelt, a social conservative and nationalist who nonetheless used the presidency to promote a progressive agenda. “Right progressives, no less than left progressives were illiberal, glad to subordinate individual rights to their reading of the common good. American conservative thinking was never especially antistatist”, Leonard writes (p. 39). Furthermore, eugenics had followers among progressives, conservatives, and socialists alike. The true enemy of progressivism? Classical liberalism, the belief that society is a web of interactions between individuals and not a collective “social organism.”

 

Insights for today?

Leonard combines rigorous research with lucid writing, presenting a work that is intellectually sound, relevant, and original. Readers should take his insights to heart when asking how much of the Progressive Era still lives in 2016. The answer is not simple. Contemporary progressives like Clinton and Sanders certainly don’t espouse biological racism. For those who whip up anti-immigrant sentiment to win votes, “progressive” is a dirty word, not a badge of honor. Moreover, the American left long ago abandoned attempts to control the economy via technocratic experts.

But that doesn’t tell the whole story. Modern progressives still place a disturbing amount of faith in the administrative state and a lack of it in market exchange. Leonard closes by arguing that the Progressive Era lives on: “Progressivism reconstructed American liberalism by dismantling the free market of classical liberalism and erecting in its place the welfare state of modern liberalism.” (p. 191). It is up to the reader to decide whether that is something to be lauded or fought against.

 

Did you find the article interesting? If so, share it with others by clicking on one of the buttons below.

 

You can buy the book Illiberal Reformers: Race, Eugenics and American Economics in the Progressive Era, by Thomas C. Leonard here: Amazon US | Amazon UK

 

Joseph Larsen is a political scientist and journalist based in Tbilisi, Georgia. He writes about the pressing issues of today, yesterday, and tomorrow. You can follow him on Twitter @JosephLarsen2.

 

[1] “Confidence in Institutions.” Gallup.com. Accessed January 29, 2016. http://www.gallup.com/poll/1597/confidence-institutions.aspx/.

For most of us, cocaine brings to mind the image of drug-fueled discos or wealthy Wall Street stockbrokers, feeding an insatiable habit. However, the history of this addictive stimulant is a far more interesting tale than one might imagine. Liz Greene explains.

An 1885 advert for children's cocaine toothache drops.

The story of cocaine starts in the high mountain ranges of South America, where native Peruvians chewed the leaves of the coca plant in order to increase energy and strength. The stimulating effects of the leaf sped breathing, raising the oxygen level in their blood and countering the effects of living in thin mountain air. Once the Spanish arrived in the 1500s, word of the coca plant and its interesting effects began to spread.

 

The Wonder Drug

In 1859, German chemist Albert Niemann isolated, extracted, and named the purified alkaloid cocaine from a batch of coca leaves transported from South America. Despite the detailed information he provided on the alkaloid in his dissertation, it wouldn’t be until later in the century that its effects were recognized in the medical community.

As medical experiments testing cocaine’s analgesic properties began, other doctors were studying the drug’s more stimulating traits. In 1883, Theodor Aschenbrandt, a German army physician, administered cocaine to soldiers in the Bavarian Army. He reported that the drug reduced fatigue and enhanced the soldiers’ endurance during drills. These positive findings were published in a German medical journal, where they came to the attention of the famed psychoanalyst Sigmund Freud.

Freud’s findings on cocaine were based largely on his own experience with the drug. Not only did he use it regularly, he also prescribed it to his girlfriend, best friend, and father. In July 1884, he published Über Coca, a paper promoting cocaine as a treatment for everything from depression to morphine addiction. He concluded,

Absolutely no craving for the further use of cocaine appears after the first, or even after repeated taking of the drug...

 

Unfortunately, he was not only wrong, he was already addicted.

 

A Wider Audience

Inspired by Paolo Mantegazza’s reports of coca use in Peru, the French chemist Angelo Mariani developed a new drink concocted of claret and cocaine. With 6 milligrams of cocaine in every ounce, Vin Mariani became extremely popular, even among such heavy hitters as Queen Victoria, Pope Leo XIII, and Pope Saint Pius X.

Motivated by the success of Vin Mariani, a drugstore owner in Columbus, Georgia, decided in 1885 to formulate his own version. Unfortunately for John Pemberton, the county in which he lived passed prohibition legislation, forcing him to come up with a new recipe for his French Wine Nerve Tonic. In 1886 he created a new, nonalcoholic version based on both coca and kola nut extracts — giving rise to the name Coca-Cola. The euphoric and energizing effects of the drink helped to skyrocket the popularity of Coca-Cola by the turn of the century. Until 1903, a standard serving contained around 60 mg of cocaine.

But cocaine wasn’t limited to beverages. Throughout the early 1900s, unregulated patent medicines containing cocaine were sold en masse. Toothache drops, nausea pills, analgesic syrups — all were easy to obtain, and far more addictive than consumers realized. By 1902 there were an estimated 200,000 cocaine addicts in the United States.

An 1890s advert for Vin Mariani tonic wine.

A Serious Problem

As cocaine use in society increased, the dangers of the drug became more evident. In 1903, the New York Tribune ran an exposé that linked cocaine to crime in America, pressuring the Coca-Cola Company to remove cocaine from the soft drink. Eleven years later, the Harrison Narcotic Act came into effect, regulating the manufacture and dispensing of cocaine in the United States. With the passing of the Narcotic Drugs Import and Export Act in 1922, cocaine became so heavily regulated that usage began to decline sharply — and continued to do so through the 1960s.

In 1970, the Controlled Substances Act was signed into law by President Richard Nixon. It classified cocaine as a Schedule II controlled substance, meaning the drug could only be possessed with a written prescription from a practitioner. This allowed cocaine to still be used medically as a topical anesthetic, but not recreationally.

The passing of the Controlled Substances Act didn’t stop the popular media of the time from portraying cocaine as fashionable and glamorous. Rock stars, actors, and other popular figures of the time brandished paraphernalia like a trendy accessory, and America’s urban youth were watching.

Around this same time, a new, crystallized form of cocaine — known as crack — appeared. This cheaper alternative to cocaine made a name for itself in low-income communities during the 1980s. With such a high rate of addiction, users were willing to do almost anything for their next hit — leading to a dramatic rise in crime and a moral panic labeling crack as an epidemic.

Though cocaine use has steadily declined in recent years, the drug is still gathering about 1,600 new users each day. More than 40,000 people die from drug overdoses each year in the U.S. — around 5,000 of which are due to cocaine. It seems as though cocaine isn’t quite ready to let go of its place in society — nor does it appear to be going away anytime soon.

 

Liz Greene is a dog loving, beard envying, history and pop culture geek from the beautiful city of trees, Boise, Idaho. You can catch up with her latest misadventures on Instant Lo or follow her on Twitter @LizVGreene.

 

Did you find this article interesting? If so, tell the world – tweet about it, like it, or share it by clicking on one of the buttons below…


In 1860 Western forces burned the Old Summer Palace, a magnificent complex of palaces and gardens to the northwest of Beijing, China. British and French troops pillaged the palace, and then burned it to the ground in a terrifying act during the Second Opium War. Here, Scarlett Zhu explains what happened and responses to the attack.

The looting of the Summer Palace by Anglo-French forces in 1860.

"We call ourselves civilized and them barbarians," wrote the outraged author, Victor Hugo. "Here is what Civilization has done to Barbarity."

One of the deepest, unhealed and entrenched historical wounds of China stems from the destruction of the country's most beautiful palace in 1860 - the burning of the Old Summer Palace by the British and French armies. As Charles George Gordon, a soldier of the force, wrote about his experience, one can "scarcely imagine the beauty and magnificence of the places being burnt."

 

The palace, which once boasted the most extensive and invaluable art collection in China, was reduced to ruins within three days at the hands of some 3,500 screaming soldiers and their burning torches. Dense smoke and ashes eclipsed the sky, marble arches crumbled, and sacred texts were torn apart. At the heart of this merciless act stood Lord Elgin, the British High Commissioner to China, a man who preferred revenge and retaliation to peace talks and compromise. He was also a man highly sensitive to any injustices or humiliation suffered by his own country. Thus, the act was a response to the imprisonment and torture of the delegates sent to negotiate the Qing dynasty's surrender. However, as modern Chinese historians would argue, this was a far-from-satisfactory excuse for such an act of wickedness, as there had already been extensive looting by the French and British soldiers before the imprisonment took place, and the burning was only "the final blow".

The treasures of the Imperial Palace were irresistible and within the reach of the British and French. Officers and men seemed to have been seized with temporary insanity, said one witness; in body and soul they were absorbed in one pursuit: plunder. The British and the French helped themselves to all the porcelain, the silk and the ancient books - an estimated 1.5 million ancient Chinese relics were taken away. The extent of this rampant abuse was highlighted even more by the burning of the Emperor's courtiers, eunuch servants and maids - many estimates place the death toll in the hundreds. This atrocious indifference towards human life inflamed international opposition, notably illustrated by Hugo's scathing criticism.

 

The response to the attack

But there was no significant resistance to the looting, even though many Qing soldiers were in the vicinity - perhaps they had already anticipated the reality of colonial oppression, or did not trouble themselves over the painful loss suffered by the often-distant imperial family. But the Emperor, Xianfeng, was not an indifferent spectator; in fact, he was said to have vomited blood upon hearing the news.

However, there was evidence to suggest that some soldiers did feel that this was "a wretchedly demoralizing work for an army". As James M'Ghee, chaplain to the British forces, writes in his narrative, he shall "ever regret the stern but just necessity which laid them in ashes". He later acknowledged that it was "a sacrifice of all that was most ancient and most beautiful", yet he could not tear himself away from the palace's vanished glory. Historian Greg M. Thomas went so far as to argue that the French ambassador and generals refused to participate in the destruction because it "exceeded the military aims of their mission" and would cause irreparable damage to an important cultural monument.

Nowadays, what is left of the palace are the gigantic marble and stone blocks that once formed the backdrops of the European-style fountains, built to entertain the Emperor in a distant corner of the Imperial gardens; the structures made of timber and tile did not survive the fires. The remains act as a somber reminder of the West's ransacking and the East's "century of humiliation".

This is more than a story of patriotism, nationalism and universal discontent. We are often taught that patriotism isn't history, but rather propaganda in disguise. Yet how could one ignore and omit a historical event so demoralizing and compelling on its own that it is no longer a matter of morality and dignity, but a matter of seeking the truth, tracing the past and its inseparable link with the present? When considering the savage and blatant destruction of the Old Summer Palace, along with the unspoken hatred of the humiliated and the suppressed, it seems appropriate to end with the cries of the enraged Chinese commoners as they witnessed the worst of mankind's atrocities: “Kill the foreign devils! Kill the foreign devils!”

 

Did you find this article of interest? If so, tell the world – tweet about it, like it, or share it by clicking on one of the buttons below.

Bibliography

1. Hugo, Victor. The sack of the summer palace, November 1985

2. Bowlby, Chris. "The palace of shame that makes China angry"

3.  M'Ghee, Robert. How we got to Pekin: A Narrative of the Campaign in China of 1860, pp. 202-216, 1862

4. "The Burning of the Yuan Ming Yuan: 150 Years Later", http://granitestudio.org/2010/10/24/the-burning-of-the-yuanmingyuan-150-years-later

5. "Fine China, but at what cost?”, http://thepolitic.org/fine-china-but-at-what-cost/

In this article Janet Ford discusses the horrific act of infanticide in the nineteenth century with the help of records from London’s Old Bailey court – with cases from London and (from 1856) further afield. It provides an insightful look into this terrible crime in Victorian England…

The Old Bailey in the early nineteenth century.

In the nineteenth century there were 203 cases of infanticide recorded in the Old Bailey.

Of the 203 cases, 83 people were found guilty, 114 were found not guilty and one received a ‘misc’ verdict. Out of the 83 who were found guilty, only 18 were actually found guilty of killing, with three of those being found insane and two given a ‘recommendation’. 65 were not guilty of killing but guilty of the lesser crime of concealing the birth. This shows that even though it was a highly emotional and shocking crime, women were not automatically found guilty. The reason why so many were found not guilty of killing was often medical evidence, such as the health of the baby and mother. There was also an increased involvement of character witnesses in the courts, who could explain the background of the person, and an increased interest in the criminal mind, especially that of women. Finally, there was more of an understanding of childbirth itself.

 

What the cases show about the crime and society

The role of medical people

As all the cases involved doctors, surgeons or midwives, there was a need and desire for physical evidence, rather than just hearsay, in order to reach the right verdict and deliver justice. They would have knowledge and experience of all types of childbirth, and so they could provide evidence of a death being accidental, deliberate or too difficult to tell.

 

What it shows about childbirth and its effects on crime

The records show two main aspects of childbirth: the physical effect on the baby and the emotional aspect. The emotional aspect of childbirth was the shame of having a baby out of wedlock - but also of having the father run out during the pregnancy, not being sure who the father was, not wanting to be a single mother, or sexual assault. It meant that women felt they had to injure or kill their baby, conceal the birth or deliver it themselves. They were seen as criminals, which many were, but many were also victims of social attitudes and even of crimes themselves. The physical aspect of childbirth was the consequence of these elements, as women felt they had to deliver on their own. This meant there was no other person to help if the delivery was difficult. An example of the physical effect can be seen in this statement from Doctor Thomas Green in Ellen Millgate’s case.

Health of the mother and child

The cases show that the health of both the mother and baby were taken into consideration and used as evidence. The health of the mother, such as if she was epileptic, would have affected her ability to care for the baby properly. Poor health helped the mother’s case, as it was out of her control, as did the baby being premature. An example of health being used as evidence is shown with Ellen Middleship, who was found not guilty.


Born alive

One of the main reasons why so many were found not guilty or only guilty of concealing the birth was the baby being born dead on delivery. It was out of the mother’s control, and so she would have been found not guilty. In many cases, it was too difficult to tell if the baby had been born alive during the delivery, as shown with the case of Elizabeth Ann Poyle.

Personal aspects

Along with medical evidence, personal aspects were also taken into consideration. Personal elements such as good character, age, previous children and the relationship with the father were all taken into account. These elements could show that the mother could not have committed the crime, as it was out of character, or at least helped to lessen the punishment, which did happen with many women. An example is shown with Sarah Jeffery giving a statement about Jane Hale, who was guilty of concealing but not of killing.   

Violence

 

The most shocking aspect of the cases, whether the women were found guilty or not guilty, was violence. Violence could have been caused by cutting the cord, getting the child out, falling, or hitting. This was one of the most difficult aspects of a case, as it could be difficult to determine if injuries were caused by the birth or on purpose. What helped resolve this was medical knowledge, an understanding of childbirth, or eyewitness accounts. The understanding of childbirth helped to explain why there were marks on, for example, the neck and head. This was due to ribbons or rope being used to get the baby out, or the baby falling during childbirth. Even though the marks caused by childbirth were not made on purpose, it is still shocking to read, as shown with Ellen Millgate - the marks were around a vulnerable part of the baby. With the help of eyewitness accounts, it was only in a few cases that the injuries were determined to have been committed on purpose. An example of this can be seen with Ann Dakin giving evidence in the case of Joseph Burch and Caroline Nash, who were both found guilty and given four years' penal servitude.

It is one of the most shocking cases due to the violence and a reminder that parents could abuse their own children. But also, as with many of the other guilty cases, it shows that women could be quite cruel and violent. Another element of violence was getting rid of the body. The main example is from this description by James Stone of what he found in Martha Barratt’s room. She was found guilty of concealing the birth but not of killing.  

Mercy towards women

Even with the violence, and the shame of committing the crime, the verdicts and the punishments show that there was an understanding and sympathy towards women, as the majority were found not guilty of infanticide or guilty of a lesser crime. This was due to a better understanding of women, society, childbirth, and the criminal mind over the century.

The cases show that infanticide was a very complex crime, as it involved and was affected by so many factors - health, childbirth, social attitudes, babies, violence and high levels of emotion. It also shows the various sides of the 19th century…

 

If you found this article of interest, do tell others. Tweet about it, like it, or share it by clicking on one of the buttons below…

References

Anne-Marie Kilday, A history of infanticide in Britain, c. 1600 to the present (Palgrave Macmillan, 2013)

M Jackson, Infanticide: historical perspectives on child murder and concealment, 1550-2000 (Ashgate, 2002)

Old Bailey Online, January 1800-December 1899, Infanticide 

Ellen Millgate, 28th November 1842

Ellen Middleship, 21st October 1850

Elizabeth Ann Poyle, 22nd May 1882

Jane Hale, 28th November 1836

Joseph Nash and Caroline Nash, 24th October 1853

Martha Barratt, 9th April 1829

Body parts and the strangeness of the human anatomy have fascinated people for centuries. And they have been displayed and collected for some time. Here, Rebecca Anne Lush takes a look at how displays of ‘medical marvels’ have progressed through the ages…

An old scene from the Hunterian Museum in London.

With contents to both fascinate and repulse, it is no wonder medical museums continue to entice visitors. Gunther von Hagens’ Body Worlds has attracted thousands of visitors worldwide since its first exhibition in Tokyo in 1995. Today, there are nine exhibitions on display across the world. With a further four planned in the near future, it appears as though this museum has sustained the public’s interest. According to their mission statement, they endeavor to teach the public the ins and outs of anatomy. Body Worlds is not alone. The Mütter Museum in Philadelphia and the Hunterian and Wellcome Museums in London also continue to engage the public with their morbid and fascinating specimens.

The history of medical museums is incredibly rich, filled with mystery and mayhem, curiosity and control. In the Victorian era especially, they came to represent a conflict between the professional and the public. No longer could an individual pay a small fee to sit in on an autopsy and leave with a qualification. As the Victorian era progressed, pathology and anatomy schools both professionalized and specialized. Their conflict with the public realm is a curious case indeed.

 

Before the nineteenth century

Body parts have been displayed for centuries serving multiple purposes. It can be argued that medieval churches displaying relics and reliquaries were amongst some of the earliest in the Western world.

The collection and display of body parts became a more secular practice during the Renaissance. So-called cabinets of curiosities allowed avid collectors to organize their specimens and exhibit them to the public. Such cabinets could include human rarities to please and entertain visiting crowds.

It was not until the seventeenth century, however, that anatomical specimens were more carefully collected, labeled, and stored in permanent institutions. Many anatomy teachers during this period held private collections to increase their credibility. In the eighteenth century, two very famous brothers, William and John Hunter, collected anatomical specimens en masse; John's collection was later given to the Royal College of Surgeons. This was also a time of commercial anatomical displays, such as freak shows and travelling exhibitions of human oddities.

 

Dr Kahn’s Anatomical Museum

Such early examples were the foundations for Victorian public and professional medical museums. No public medical museum was more influential than Dr Kahn’s Anatomical Museum. Joseph Kahn, a self-professed medical doctor, moved from Alsace to London, opening his anatomical museum there in 1851. Initially entry was restricted to males who could afford the fee of two shillings. After two months, however, women were allowed to step inside during specific viewing times. Eight years later, the admission price halved to one shilling, attracting larger crowds and more inquisitive minds.

On entering the exhibition space, visitors encountered an anatomical wax Venus, the organs of which could be removed. The rest of the museum consisted of wax models, specimens held in jars, and special “doctors-only” rooms. Medical doctors frequented Dr Kahn’s until its closure in 1864.

 

Dr. Kahn.

Professional Museums

Developing alongside these public spectacles were the more professional museums, belonging to hospitals, pathology societies, private schools, universities and Royal Colleges.

More formal institutions collected specimens to aid in medical education. Acquiring both abnormal and normal specimens increased levels of anatomical knowledge and encouraged anatomy to transform into a professional activity that aimed to improve standards of health. Although some were open to the public, the majority were kept under lock and key.

 

Conflict

In 1857 the Obscene Publications Act prevented any ‘obscene’ anatomy from being displayed in a public setting. Dr Kahn’s museum was deemed immoral under this act, resulting in its later closure. Other public anatomy museums continued to operate until the mid-1870s.

Both professional and public museums were striving to be centers of education. At first, the professionals admired Dr Kahn’s museum, especially the rooms dedicated to their study. Not only were early opinions favorable, but there is also evidence to suggest there were close relationships. Robert Abercrombie, for example, affiliated himself with the Strand Museum in London, establishing a consultation room next to the museum. Visitors were able to not only visit the museum, but also receive medical care on site.

As the Victorian era progressed, and as anatomy became specialized, these public museums were regarded as inappropriate venues for disseminating such medical information. Ongoing legal and social battles ensured that the professional schools of anatomy and pathology alone remained the stakeholders in the industry. It was a conflict of words, with professional museums writing at length in their medical journals about their distrust and disgust.

 

Today

It is quite interesting to see another shift occurring in the past few decades. Today, even the more professional museums from the Victorian era are open to the wider public. No longer is all medical information guarded by the elite and trained; it is accessible to anyone who wants to learn. Accompanying this, public medical museums displaying wax models are again appearing on the medical landscape. The curious case of medical marvels is a comment on how medical museums have developed and been transformed in order to meet the human desire for knowledge.

 

Did you find this article interesting? If so, tell us why below…

References

Alberti, Samuel J. M. M. Morbid Curiosities: Medical Museums in Nineteenth- Century Britain. Oxford: Oxford University Press, 2011.

Bates, A. W. “Dr Kahn’s Museum: obscene anatomy in Victorian London.” Journal of the Royal Society of Medicine 99, no. 12 (2006): 618-624.

Bates, A. W. “Indecent and Demoralizing Representations: Public Anatomy Museums in mid-Victorian England.” Medical History 52, no. 1 (2008): 1-22.

Kahn, Dr. Joseph. Catalogue of Dr Kahn’s Celebrated Anatomical Museum. Leicester Square: W. J. Golbourn, 1853.

Kesteven, W. B. “The Indecency of the Exhibition of Dr Kahn’s Museum.” Letter. The British Medical Journal 1, no. 49 (1853): 1094.

“Medical News: Dr Kahn’s Anatomical Museum.” The Lancet 1, no. 1443 (April 26, 1851): 474.

Stephens, Elizabeth. Anatomy as Spectacle: Public Exhibitions of the Body from 1700 to the Present. Liverpool: Liverpool University Press, 2011.

It may seem strange, but there is very strong evidence that the White House killed a number of presidents in the mid-nineteenth century. The deaths of Zachary Taylor, William Henry Harrison, and James K. Polk are all linked to something in the White House – although many believed that some presidents were poisoned by their enemies. William Bodkin explains all…

A poster of Zachary Taylor, circa 1847. He is one of the presidents the White House may have helped to kill...

Being President of the United States is often considered the most stressful job in the world.  We watch fascinated as Presidents prematurely age before our eyes, greying under the challenges of the office.  Presidential campaigns have become a microcosm of the actual job, with the conventional wisdom being that any candidate who wilts under the pressures of a campaign could never withstand the rigors of the presidency.  But there was a time, not so long ago, when it was not just the stress of the job that was figuratively killing the Presidents.  In fact, living in the White House was, in all likelihood, literally killing them.

Between 1840 and 1850, living in the White House proved fatal for three of the four Presidents who served.  William Henry Harrison, elected in 1840, died after his first month in office.  James K. Polk, elected in 1844, died three months after he left the White House.  Zachary Taylor, elected in 1848, died about a year into his term, in 1850.  The only occupant of the Oval Office during that period to survive was John Tyler, who succeeded to the Presidency on Harrison’s death.  What killed these Presidents?  Historical legend tells us that William Henry Harrison “got too cold and died” and that Zachary Taylor “got too hot and died.”  But the truth, thanks to recent research, indicates that Harrison, Taylor, and Polk may have died from similar strains of bacteria that were coursing through the White House water supply.


Conspiracies and Legends

On July 9, 1850, President Zachary Taylor, Old Rough and Ready, former general and hero of the Mexican-American War, succumbed to what doctors called at the time “cholera morbus,” or, in today’s terms, gastroenteritis.  On July 4, 1850, President Taylor sat out on the National Mall for Independence Day festivities, including the laying of the cornerstone for the Washington Monument.  Taylor, legend has it, indulged freely in refreshments that day, including a bowl of fresh cherries and iced milk.  Taylor fell ill shortly after returning to the White House, suffering severe abdominal cramps.  The presidential doctors treated Taylor with no success.  Five days later, he was dead.

Taylor’s death shocked the nation.  Rumors began circulating immediately concerning his possible assassination.  The rumors arose for a good reason.  Taylor, a Southerner, opposed the growth of slavery in the United States despite being a slave owner himself.  While President, Taylor had worked to prevent the expansion of slavery into the newly acquired California and Utah territories, then under the control of the federal government.  Taylor prodded those future states, which he knew would draft state constitutions banning slavery, to finish those constitutions so that they could be admitted to the Union as free states.

Taylor’s position infuriated his southern supporters, including Jefferson Davis, who had been married to Taylor’s late daughter, Knox.  Davis, who would go on to be the first and only President of the Confederate States of America, had campaigned vigorously throughout the South for Taylor, assuring Southerners that Taylor would be friendly to their interests.  But in truth, no one really knew Taylor’s views.  A career military man, Taylor hewed to the time honored tradition of taking no public positions on political issues.  Taylor believed it was improper for him to take political positions because he had sworn to serve the Commander-in-Chief, without regard to person or party.  Indeed, he had never even voted in a Presidential election before running himself.

Tensions between Taylor and the South grew when Henry Clay proposed his Compromise of 1850, which offered something for every interest.  The slave trade would be abolished in the District of Columbia, but the Fugitive Slave Law would be strengthened.  The bill also carved out new territories in New Mexico and Utah.  The Compromise would allow the people of the territories to decide whether those territories would be slave or free by popular vote, circumventing Taylor’s effort to have slavery banned in their state constitutions.  But Taylor blocked passage of the compromise, even threatening in one exchange to hang the Secessionists if they chose to carry out their threats.


More speculation

Speculation on the true cause of Taylor’s death only increased throughout the years, particularly after his former son-in-law, Davis, who had been at Taylor’s bedside when he died, became President of the Confederacy.  The wondering reached a fever pitch in the late twentieth century, when a University of Florida professor, Clara Rising, persuaded Taylor’s closest living relative to agree to an exhumation of his body for a new forensic examination.  Rising, who was researching her book The Taylor File: The Mysterious Death of a President, had become convinced that Taylor was poisoned.  But the team of Kentucky medical examiners assembled to examine the corpse concluded that Taylor was not poisoned, but had died of natural causes, i.e. something akin to gastroenteritis, and that his illness was undoubtedly exacerbated by the conditions of the day.

But what caused Taylor’s fatal illness?  Was it the cherries and milk, or something more insidious?   While the culprit lurked in the White House when Zachary Taylor died, it was not at the President’s bedside, but rather, in the pipes.

During the first half of the nineteenth century, Washington D.C. had no sewer system.  It was not built until 1871.  The website of the DC Water and Sewer Authority notes that by 1850, most of the streets along Pennsylvania Avenue had spring or well water piped in, creating the need for a sanitary sewage process. Sewage was discharged into the nearest body of water.  With literally nowhere to go, the sewage seeped into the ground, forming a fetid marsh.  Perhaps even more shocking, the White House water supply itself was just seven blocks downstream from a depository for “night soil,” a euphemism for human feces collected from cesspools and outhouses.  This depository, which likely contaminated the White House’s water supply, would have been a breeding ground for salmonella bacteria and the gastroenteritis that typically accompanies it.  Ironically, the night soil deposited a few blocks from the White House had been brought there by the federal government.


Something in the water

It should come as no surprise, then, that Zachary Taylor succumbed to what was essentially an acute form of gastroenteritis.  The cause of Taylor’s gastroenteritis was probably salmonella bacteria, not cherries and iced milk.  James K. Polk, too, reported frequently in his diary that he suffered from explosive diarrhea while in the White House.  For example, Polk’s diary entry for Thursday, June 29, 1848 noted that “before sun-rise” that morning he was taken with a “violent diarrhea” accompanied by “severe pain,” which rendered him unable to move.  Polk, a noted workaholic, spent nearly his entire administration tethered to the White House.  After leaving office, weakened by years of gastric poisoning, Polk succumbed, reportedly like Taylor, to “cholera morbus”, a mere three months after leaving the Oval Office.

The White House is also a leading suspect in the death of William Henry Harrison. History has generally accepted that Harrison died of pneumonia after giving what remains the longest inaugural address on record, in a freezing rain without benefit of hat or coat.  However, Harrison’s gastrointestinal tract may have been a veritable playground for the bacteria in the White House water.

Harrison suffered from indigestion most of his life.  The standard treatment then was to use carbonated alkali, a base, to neutralize the gastric acid.  Unfortunately, in neutralizing the gastric acid, Harrison removed his natural defense against harmful bacteria.  As a result, it might have taken far less than the usual concentration of salmonella to cause gastroenteritis.  In addition, Harrison was treated during his final illness with opium, standard at the time, which slowed the ability of his body to get rid of bacteria, allowing them more time to get into his bloodstream.  It has been noted that, as Harrison lay dying, he had a sinking pulse and cold, blue extremities, which is consistent with septic shock.  Did Harrison die of pneumonia?  Possibly.  But the strong likelihood is that pneumonia was secondary to gastroenteritis.

Nor was this phenomenon limited to the mid-nineteenth century Presidents.  In 1803, Thomas Jefferson mentioned in a letter to his good friend, fellow founder Dr. Benjamin Rush, that “after all my life having enjoyed the benefit of well formed organs of digestion and deportation,” he was taken, “two years ago,” after moving into the White House, “with the diarrhea, after having dined moderately on fish.”  Jefferson noted he had never had it before.  The problem plagued him for the rest of his life.  Early reports of Jefferson’s death even stated that he had died because of dehydration from diarrhea.

Presidents after Zachary Taylor fared better, once D.C. built its sewer system.  The second accidental President, Millard Fillmore, lived another twenty years after succeeding Zachary Taylor.  But what about the myths surrounding these early Presidential deaths?  They were created, in part, by a lack of medical and scientific understanding of what really killed these men.  With the benefit of modern science we can turn a critical eye on these myths. But we should not forget that myth-making can serve an important purpose beyond simple deception.  In the case of Zachary Taylor, it provided a simple explanation for his unexpected death.  Suspicion or accusations of foul play would have further inflamed the sides of the slavery question that in another decade erupted into Civil War, perhaps even starting that war before Lincoln’s Presidency.  In Harrison’s case, the overcoat explanation helped the country get over the shock of the first President dying in office and permitted John Tyler to establish the precedent that the Vice-President became President upon the death of a President.  In sum, these nineteenth century myths helped the still new Republic march on to its ever brighter future.


What did you think of today’s article? Do you think it was the water that killed several Presidents? Let us know below…


Finally, William's previous pieces have been on George Washington (link here), John Adams (link here), Thomas Jefferson (link here), James Madison (link here), James Monroe (link here), John Quincy Adams (link here), Andrew Jackson (link here), Martin Van Buren (link here), William Henry Harrison (link here), John Tyler (link here), and James K. Polk (link here).


Sources

  • Catherine Clinton, “Zachary Taylor,” essay in “To The Best of My Ability:” The American Presidents, James M. McPherson, ed. (Dorling Kindersley, 2000)
  • Letter, Thomas Jefferson to Benjamin Rush, February 28, 1803
  • Milo Milton Quaife, ed., “Diary of James K. Polk During His Presidency, 1845-1849” (A.C. McClurg & Co., 1910)
  • Jane McHugh and Philip A. Mackowiak, “What Really Killed William Henry Harrison?” New York Times, March 31, 2014
  • Clara Rising, “The Taylor File: The Mysterious Death of a President” (Xlibris 2007)

Nineteenth century poet Margaret Fuller died in a tragic way in 1850. And it was the writer Ralph Waldo Emerson who was perhaps most devastated by the loss. Here Edward J. Vinski looks at the fascinating relationship between them and what happened after Fuller’s passing.

A nineteenth century engraving of Margaret Fuller.

Margaret Fuller

“On Friday, 19 July, Margaret dies on the rocks of Fire Island Beach within sight of & within 60 rods of the shore. To the last her country proves inhospitable to her.” (Emerson, 1850/1982, p. 511)

 

The Margaret to whom Ralph Waldo Emerson referred is Margaret Fuller, a writer and poet associated with American transcendentalism in the nineteenth century. Born in 1810, Fuller was educated under her father’s direction. Timothy Fuller’s tutelage was both intense and, in its own way, fortuitous. He began her instruction in Latin when she was but six years of age. Her lessons would last throughout the day, and young Margaret was often sent to bed overtaxed and unable to sleep. In spite of the nausea, bad dreams and headaches she incurred, Margaret appreciated that he held her to the same standards to which he would have held a son (Richardson, 1995).

Although they had mutual friends, Fuller and Emerson did not meet until the summer of 1836 when Fuller paid a three-week visit to the Emerson home in Concord, Massachusetts. Prior to this, she had attended some of Emerson’s talks and had wished to meet him for some time, but it was only after he read her translation of Goethe’s Tasso that Emerson returned the interest and offered her the long-awaited invitation (Richardson, 1995). Thus began a relationship between the two that would have a profound effect on both of them.

 

Fuller and Emerson

Richardson (1995) has remarked that “Fuller took less from Emerson than either Thoreau or Whitman, and she probably gave him more than either of them” (p. 239-240). Perhaps more than any person other than his deceased first wife, Ellen, Fuller knew best how to pierce the armor of his innermost life. Nowhere is this more clearly evident than in the fact that following their initial meeting, Emerson finished his book Nature, which had been drifting toward theoretical idealism. Fuller, according to Richardson (1995), pushed him toward an “idealism that is concerned with ideas only as they can be lived […] with the spiritual only when it animates the material” (p. 240).

Fuller, however, took from Emerson as well.  “From him,” she wrote, “I first learned what is meant by an inward life” (Fuller, n.d., as cited in Bullen, 2012, Chapter V, para 4). She had long searched for an intellectual mentor and by the time of her first visit to Emerson, she was fearful that she might never find one. In Emerson, she found someone with whom she could share her ideas as well as her intimacies. As their relationship developed, however, it became clear that she was requiring even more from Emerson. Since no written record of her requests survives, precisely what she asked of him is difficult to discern. Although married, he was clearly conflicted by his feelings for her. In his journal, he confessed that she was someone “Whom I always admire, most revere and sometimes love” (Emerson, 1841/1914, p. 167), and in a later entry recorded a nighttime river walk with her. Whatever the case may be, it is clear that Emerson’s second wife, Lydian, saw Fuller as a threat (Allen, 1981).

After editing The Dial, a transcendentalist magazine, for several years, Fuller left America for Europe in the summer of 1846 as a correspondent for the New York Tribune. After some time in England, she relocated to Italy with her husband, Giovanni Ossoli[1], a marquis who supported the Italian revolution. Fuller and her husband both took an active role in the revolution, and she chronicled its events in a book she had hoped to publish. When the revolt finally failed, the family, which now included a young son, was forced to return to America. Their ship, the Elizabeth, met with bad luck almost immediately. At Gibraltar, the captain died of smallpox, leaving the ship under the direction of its first mate. In the early morning of July 19, 1850, the ship ran aground on a sandbar a few hundred meters off Fire Island, NY. Later that day, Margaret Fuller, her husband, and her child drowned when the ship broke up.

 

Thoreau’s Mission

News of the disaster reached Concord some days later. On or about July 21, Emerson made the journal entry indicated above. In a letter to Marcus Spring, dated July 23, Emerson wrote:

At first, I thought I would go myself and see if I could help in the inquiry at the wrecking ground and act for the friends. But I have prevailed on my friend, Mr Henry D. Thoreau, to go for me and all the friends. Mr Thoreau is the most competent person that could be selected and […] he is authorized to act for them all (Emerson, 1850/1997, p. 385).

 

Emerson doubted that any manuscripts would have survived the wreck, but knowing that Fuller would have had with her the manuscript to her History of the Italian Revolution, he was willing to pay whatever costs Thoreau might incur in his attempt to salvage it.

Thoreau, for his part, set out immediately. On July 25, he wrote to Emerson describing what details he had learned of the disaster:

…the ship struck at ten minutes after four A.M., and all hands, being mostly in their nightclothes, made haste to the forecastle, the water coming in at once […] The first man got ashore at nine; many from nine to noon. At flood tide, about half past three o’clock, when the ship broke up entirely, they came out of the forecastle, and Margaret sat with her back to the foremast, with her hands on her knees, her husband and child already drowned. A great wave came and washed her aft. The steward had just before taken her child and started for shore. Both were drowned (Thoreau, 1850/1958a, p. 262).

 

Margaret Fuller’s remains and those of her husband were never found. Her son’s body washed ashore, dead but still warm. A desk, a trunk, and a carpet bag were recovered from the scene, but none of Margaret’s valuable papers were found. Thoreau promised to do what he could, holding out some hope that, since a significant part of the wreckage remained where the ship ran aground, some items might still be salvaged, but it is clear that he was not confident.

In a letter to abolitionist and future Senator Charles Sumner, whose brother Horace was also aboard, Thoreau wrote

I saw on the beach, four or five miles west of the wreck, a portion of a human skeleton, which was found the day before, probably from the Elizabeth, but I have not knowledge enough of anatomy to decide confidently, as many might, whether it was that of a male or a female (Thoreau, 1850/1958b, p. 263).[2]

 

After visiting nearby Patchogue, New York, where many of those who scavenged the wreckage instead of attempting a rescue were thought to reside, he returned to Fire Island empty handed.

In all, Thoreau’s mission was unproductive. “I have visited the child’s grave,” he wrote to Emerson. “Its body will probably be taken away today” (Thoreau, 1850/1958a, p. 262). The corpse of her son, a few insubstantial papers, and a button pried from her husband’s jacket by Thoreau himself were essentially the only relics of Margaret Fuller that would return to Massachusetts.

 

Conclusion

The relationship between Emerson and Margaret Fuller is enigmatic. She was not only his intellectual equal, but their interactions suggest “an only slightly erotic relationship, about which he clearly fretted” (Sacks, 2003, p. 51). Although Emerson’s life had been scarred by the losses of many loved ones, Fuller’s death clearly devastated him on many levels. The intellectual impact is obvious in a journal entry around the time of her death. “I have lost in her my audience,” he wrote (Emerson, 1850, p. 512). No longer would the two be able to exchange ideas with one another. It impacted him socially as well.  “She bound in the belt of her sympathy and friendship all whom I know and love,” (p. 511) he wrote. Perhaps he wondered what would happen now that the belt had been broken. But was there, in fact, something deeper? “Her heart, which few knew, was as great as her mind, which all knew,” (Emerson, 1850, p. 511-512). Emerson clearly knew her heart more intimately than most.

Why did Emerson dispatch Thoreau to Fire Island and not go himself as he had initially planned? Ostensibly, he wanted to begin work, at once, on a memorial book in Fuller’s honor. We may, however, speculate that there were deeper reasons as well. Years earlier, Emerson had opened the coffin of his first wife, Ellen, who had died of tuberculosis fourteen months before. While he gave no explanation for his action, it seems that he needed to view her decomposing corpse to somehow convince himself of the soul’s immortality (Richardson, 1995). This event marked a turning point in his life. His focus shifted from death to life, from the material to the ideal. 

The death of Margaret Fuller marked another profound turn. Ellen’s death due to illness, while tragic, was predictable. Fuller’s death was unexpected, and he would struggle mightily to recover from it. He became acutely aware of his own mortality. “I hurry now to my work admonished that I have few days left,” he wrote (Emerson, 1850/1982, p. 512). Fuller, who had pushed Emerson to focus on the spiritual as it animates the material was now, herself, inanimate. Emerson might well have stayed in Concord because he somehow sensed that the trip would be fruitless. It might also be that he could not bear the thought of once again standing over the lifeless body of a woman he loved.

 

Postscript

Years later, a small monument to Margaret Fuller was erected on the Fire Island beach not far from the wreck site. It stood as a memorial to a remarkable woman for 10 years. Then, it too was claimed by the sea (Field, n.d.).

 

What do you think of the article? Let us know by leaving a comment below…

 

References

  • Allen, G.W. (1981). Waldo Emerson. NY: Viking.
  • Bullen, D. (2012). The dangers of passion: The transcendental friendship of Ralph Waldo Emerson and Margaret Fuller. Amherst, MA: Levellers Press (Kindle Fire Version). Retrieved from http://www.amazon.com
  • Emerson, R. W. (1841/1914). Journal entry. In B. Perry (Ed.).The heart of Emerson’s journals. Boston: Houghton Mifflin.
  • Emerson, R.W. (1850/1982). Journal entry. In L. Rosenwald (Ed.). Ralph Waldo Emerson: Selected Journals 1841-1877. NY: Library of America.
  • Emerson, R.W. (1850/1997). Letter to Marcus Spring. In J. Meyerson (Ed.). The selected letters of Ralph Waldo Emerson (p. 358).  NY: Columbia University Press.
  • Field, V. R. (n.d.). The strange story of the bark ELIZABETH. http://longislandgenealogy.com/BarkElizabeth.html
  • Richardson, R. D. (1995). Emerson: The mind on fire. Berkeley, CA: University of California Press.
  • Sacks, K.S. (2003) Understanding Emerson: “The American Scholar” and his struggle for self-reliance. Princeton, NJ: Princeton University Press.
  • Thoreau, H.D. (1850/1958a). Letter to Ralph Waldo Emerson. In W. Hardy & C. Bode (Eds.). The correspondence of Henry David Thoreau (pp. 262-263). NY: NYU Press.
  • Thoreau, H.D. (1850/1958b). Letter to Charles Sumner. In W. Hardy & C. Bode (Eds.). The correspondence of Henry David Thoreau (p. 263). NY: NYU Press.

 

Footnotes

1. There is some question as to whether they were officially married.

2. Thoreau would incorporate some of his memories from this mission, including that of the skeleton, into his book Cape Cod.


The banjo has a popular place in American culture. But few people know of the instrument’s complex roots. In this article, Reed Parker discusses how a banjo-like instrument was originally brought to the US by African slaves - before being remodeled. And the complex cultural interactions between different groups and the banjo…

The Banjo Player, a painting by William Sidney Mount from 1856.

The Banjo Player, a painting by William Sidney Mount from 1856.

In 2005, the first Black Banjo Gathering took place at Appalachian State University in Boone, North Carolina. The purpose of the gathering was to celebrate the tradition of the banjo and bring awareness to the fact that, even though the banjo has become an emblem of white mountain culture, it is an African instrument at its core. The banjo as we know it today has a decidedly tragic origin story.

 

From Africa to America

Over the last few centuries, the banjo has secured a spot in the canon of traditional American music. In the time before the American Revolution, minstrels became a popular form of entertainment, and they often played an early relative of the banjo known as a banjar.

Other relatives of what would eventually become the banjo existed in many different areas of West Africa. There is the ngoni, which has anywhere from three to nine strings, the konou, which has two strings, and the juru keleni, which has just one string. One of the most elaborate of these variations is the kora, which has 21 strings and leather straps tied to the pole neck to hold the strings in place. These predecessors are still being played today in their native lands.

The direct predecessor of the banjo, most commonly known as a banjar, arrived on the slave ships that came from West Africa in the 17th century. The instrument was made from half of a gourd with animal skin stretched over it and a pole that acted as a neck. The strings of the banjar were made from waxed horsehair or from the intestines of animals, most commonly cattle or goat. The intestinal strings were referred to as catgut or simply gut strings. The banjar was easily constructed because the materials required were easy to find. Eventually the instrument evolved to include tuning pegs and a flat fretboard in place of the pole neck. This allowed for notes to be manipulated with slides and bends.

 

The banjar in the US

In West Africa, “talking drums” were a common method of long distance communication. This tradition was carried across the ocean to the plantations. In 1739, drums and brass horns were outlawed in the colonies as a result of the Stono Insurrection in which slaves on a South Carolina plantation coordinated an uprising against their slave owners. They had used these instruments to communicate the plan. Prior to this, ensembles of brass horns, drums, and banjars were quite popular. Afterward, however, solo banjar acts became more popular.

A sad reality of this time in the banjar’s life is that its burgeoning popularity had a lot to do with traveling white minstrels who would perform in blackface. The banjar acted as a prop for the minstrels to use in their acts, acts that often satirized aspects of African culture that had been brought to the US. It is also theorized that some white old-time musicians learned the oral tradition directly from black banjo players and merely wanted to continue the tradition, instead of satirizing it.

By the early 1800s, the European fiddle music that settlers had brought over with them and African banjar music were beginning to influence each other. The style of banjar play that started to emerge at this time was known as thumping, which would evolve into the clawhammer or “frailing” style, a style that combines rhythm and melody into one strumming pattern using a claw-shaped hand position.

 

The arrival of the banjo

Joel Sweeney, a Virginia man of Irish descent, has been credited with either inventing or popularizing the earliest form of the modern banjo, which features five strings, an open back, and a wooden rim. His contributions are contested, and some claim that it was actually the fourth string that was Sweeney’s invention and that the fifth came later.

Around the middle of the nineteenth century, minstrel groups traveled to Britain, spreading the banjo’s influence over the musical landscape. At the same time, the now booming steamboat travel business put African slaves, on lease from their owners, together with Irish and German immigrant laborers. These marginalized groups would entertain each other with jigs and reels. The mutual influence continued into the Civil War era, and the pairing of banjo and fiddle became, and would remain, the most popular combination in the Appalachian region into the twentieth century.

Fortunately, other events outside of blackface minstrel shows were developed to showcase banjo skill. Banjo contests and tournaments were held at a multitude of venues including bars, race tracks, and hotels. Before the Civil War, the contestants were almost exclusively white, but blacks began making an appearance when the war was over.

Further changes to banjo construction were made around this time such as tension rods and wire strings. Tension rods, or truss rods, were implemented to provide the ability to adjust the neck if it warped from dryness or humidity. Wire strings were a cheaper alternative to gut strings, but they were largely dismissed at first for the buzzing they produced.

In the early 1900s, full string bands began to emerge. These groups added a fuller sound to the banjo/fiddle duos with the addition of guitar, upright bass, mandolin, and sometimes other instruments. That is not to say that banjo/fiddle duos were replaced entirely, though. Many loyal traditionalist Appalachian banjo players, such as Roscoe Holcomb and Fred Cockerham, continued to play solo or with fiddle accompaniment. Also around this time, playing styles emerged that were starkly different from the Appalachian clawhammer style. Where clawhammer used the thumb and index finger, these styles used three-finger picking patterns that allow a higher volume of notes to be played in a short amount of time. These picking styles are collectively referred to as bluegrass style.

Through the mid-1900s, the banjo was used to evoke Appalachian imagery in contemporary folk and country music as well as pop culture. For example, the theme songs to the television show The Beverly Hillbillies and the film Deliverance became earworms that spread to a mainstream audience, even though their appeal was somewhat of a novelty.

 

The modern age

According to Robert Lloyd Webb, author of Ring the Banjar!, a major turning point for the banjo came in 2000 with the release of the film O Brother, Where Art Thou? The film’s Grammy-winning soundtrack was full of traditional music and was able to garner a more universal appeal. Among those captivated by the soundtrack were members of the band Mumford & Sons who, when they formed, began featuring the banjo in their Pop-Americana sound.

Additionally, celebrities such as Steve Martin and Ed Helms, whether inadvertently or not, have given mainstream credibility to the instrument. Martin, who has been playing the banjo for more than fifty years, has been touring extensively recently in support of his bluegrass albums. Helms recently put out a record with his group The Lonesome Trio and during his time on the sitcom The Office, his character Andy Bernard was shown playing the banjo.

The story of the banjo is a bitter one because of its roots in slavery and racism. Lately, efforts have emerged to bring that history full circle. In addition to the Black Banjo Gathering, bands like The Carolina Chocolate Drops are reviving old minstrel-style music that consists of a banjo, a fiddle, and a set of bones (a percussion instrument traditionally made from animal bones, but now more often from wood).

The banjo has proven itself to be a versatile instrument appearing in the genres of folk, bluegrass, country, and traditional, as well as jazz, swing, and blues. Deering banjos, one of the most popular manufacturers in the United States, has reported a surge in sales since 2011. Hopefully the growth in the banjo’s popularity will lead to a further fleshing out of its history.

 

Did you find this article interesting? If so, tell the world. Tweet about it, like it, or share it by clicking on one of the buttons below!



James K. Polk, eleventh US President, has gone down in history as the man who finished the westward expansion of America through a great plan to acquire California and Oregon. And even more remarkably, he achieved this very rapidly.

But, did he really have a grand strategy to expand America and achieve a number of great measures? Or did events just play their course? William Bodkin returns to the site and explains the legend of James K. Polk.

A portrait of James K. Polk.

What if the one thing America remembered about a President was false? James K. Polk, who seemingly came from nowhere to become America’s eleventh President, is remembered for the four “great measures” of his Administration: (1) obtaining California and its neighboring territories following the Mexican War; (2) negotiating the division of the Oregon territory with Great Britain; (3) lowering the nation’s tariff on imported goods to promote free trade; and (4) establishing an independent treasury to put an end to the nation’s money problems. Polk is celebrated for stating, at the outset of his Administration, that he would accomplish these goals in four short years.

Polk’s bold prediction and follow through led another President, Harry Truman, to describe him as the ideal Chief Executive.  Truman famously opined that Polk knew what he wanted to do, did it, and then left.  Unfortunately, while these are unquestionably Polk’s accomplishments, there is little to no evidence that he predicted them.  Instead, the prediction seems to have been created after the fact by one of Polk’s top advisors, historian George Bancroft.

 

The President From Nowhere

How did Polk become President?  In 1844, John Tyler was winding down William Henry Harrison’s term of office.  Tyler, in becoming President on Harrison’s death, alienated the two dominant political parties in America, the Democrats and the Whigs.  Tyler had angered the Democrats prior to becoming President, when, although a Democrat, he agreed to run with Harrison on the Whig ticket.  When he became President, Tyler governed mostly as a Democrat, angering the Whigs.

Waiting in the wings for the Democrats was Martin Van Buren, yearning to avenge his loss to Harrison.  Van Buren, however, before even receiving the nomination, stumbled on one of the key issues of the day, admitting Texas to the Union.  Texas had declared its independence from Mexico in 1836, seeking to join the United States.  Tyler, in one of the last acts of his Presidency, pushed to admit Texas, but failed.

The presumed Presidential nominees, though, both opposed admitting Texas. Henry Clay, for the Whigs, opposed Texas because it would be admitted as a slave state.  Van Buren, in a political calculation that backfired, claimed he opposed admitting Texas because he didn’t want to insult Mexico.  In truth, Van Buren believed that supporting Texas’s admission into the Union would cost him his traditional, staunchly abolitionist Northeast electoral base.  The gamble failed.  It cost Van Buren the support of the political powerhouse who had actually propelled him to the Presidency: Andrew Jackson.

Jackson favored admitting Texas.  Furious over Van Buren’s position, Jackson summoned Polk, his Tennessee protégé, to The Hermitage.  Polk, still reeling from a run of bad political luck, had been eyeing the Vice-Presidency.  A former Congressman, he had been Speaker of the House of Representatives from 1835-39, largely through Jackson’s support.  He left the Speaker’s chair to become Governor of Tennessee, but served only one term before being ousted in 1841.  In 1843, Polk tried and failed to win back the governor’s mansion.

At his estate, Jackson made his views plain. Van Buren’s Texas position must be fatal to him. The nominee would be an “annexation man,” preferably from what was then the American Southwest, meaning Tennessee. Polk was the best candidate. As usual, Jackson got what he wanted. At the Democrats’ Baltimore convention, Van Buren’s support eroded and the Democrats turned to Polk, who narrowly won election over Clay.

 

“Thigh-Slapping” Predictions

Polk, once in office, resolved that despite Jackson’s support, he would himself be President of the United States.  According to Polk’s Secretary of the Navy and Ambassador to Great Britain, historian George Bancroft, Polk set his goals early on.  Bancroft said that in a meeting with Polk during the early days of the Administration, the President “raised his hand high in the air,” brought it down “with great force on his thigh,” and declared the “four great measures” of his administration.  First, with Texas on the road to statehood, the question of Oregon would be settled with Great Britain.  Second, with Oregon and Texas secure, California and its adjacent areas would round out the continent.  Third, the tariff, which was crippling the Southern states economically, would be made less protective and more revenue based.  Fourth, an independent national Treasury, immune from the banking schemes of recent years, would be established. 

Bancroft’s tale is problematic in two respects. First, such a display was uncharacteristic of Polk. Polk has been described as peculiarly simple. He was a straightforward man and not particularly outspoken. Polk was a workaholic with few friends other than his wife, no children, and no interests outside politics. By most accounts, he was phlegmatic in disposition at best, and unlikely to engage in any dramatic exclamation.

The second problem with this story is that it comes from Bancroft. While a superb historian, Bancroft is unfortunately a dubious source. He served in Polk’s administration, wholeheartedly endorsed its expansionist policies, and burned to write Polk’s official biography. Polk rejected Bancroft as administration historian, instead seeking to have his former Secretary of War, William Marcy, do the job. Marcy had been in Washington for the entire administration, whereas Bancroft had left for London in 1846. Despite this, Bancroft remained loyal to Polk. By the late 1880s, Bancroft was the only living member of Polk’s cabinet.

This is significant because during the 1880s, a number of historians dismissed Polk as having been controlled by events around him and bullied into his expansionist policies. The young historian and future President Theodore Roosevelt took this view, finding Polk’s administration not to be particularly capable. Other historians viewed the Mexican War as having led to the Civil War, and condemned Polk for it.

Bancroft was offended by these assessments.  By the late 1880s, despite Polk’s previous opposition, Bancroft resolved to write a biography of Polk.  The earliest known mention of the “thigh-slapping” conversation is in an unpublished manuscript located in Bancroft’s papers titled “Biographical Sketch of James K. Polk,” apparently written in the late 1880s.  Historian James Schouler, in his “History of the United States of America, Under the Constitution,” first published the story.  Schouler noted that Bancroft had relayed the anecdote to him in a February 1887 letter.  After its initial publication, the “thigh-slapping” story was re-published, gradually taking on a life of its own.

Recent scholarship, however, indicates that Bancroft might have manufactured the incident.  On August 5, 1844, Bancroft wrote an admiring letter to Polk where he inventoried all of the administration’s accomplishments, including the annexation of Texas, the post-war purchase of New Mexico and California, the establishment of the Treasury and the overthrow of the protective tariff.  Bancroft wrote to Polk that these accomplishments “formed a series of measures, the like of which can hardly ever be crowded into one administration of four years & which in the eyes of posterity will single yours out among the administrations of the century.”

Did Bancroft help the “eyes of posterity” look more favorably toward James K. Polk? It seems likely. However, an historian, when examining primary sources, can never truly know the intent of historical actors and what motivated their writings. Despite seeming evidence to the contrary, the “thigh-slapping” story could have happened as Bancroft said it did. History, it has been said, is written by the victors. There are times, though, when the person who writes the history determines the identity of the victor and the extent of the victory.

 

Did you enjoy this article? If so, tell the world! Tweet about it, like it, or share it by clicking on one of the buttons below…

Finally, William's previous pieces have been on George Washington (link here), John Adams (link here), Thomas Jefferson (link here), James Madison (link here), James Monroe (link here), John Quincy Adams (link here), Andrew Jackson (link here), Martin Van Buren (link here), William Henry Harrison (link here), and John Tyler (link here).

  

References

  • Anthony Berger, “2014 Presidential Rankings, No. 7: James K. Polk,” www.deadpresidents.tumblr.com
  • Walter R. Borneman, “Polk: The Man Who Transformed the Presidency and America,” Random House, 2008.
  • Tom Chaffin, “Met His Every Goal? James K. Polk and the Legends of Manifest Destiny,” University of Tennessee Press, 2014.
  • Milo Milton Quaife, editor, “Diary of James K. Polk during His Presidency, 1845-1849.” A.C. McClurg & Co., Chicago, 1910.
  • Sean Wilentz, “The Rise of American Democracy: Jefferson to Lincoln,” WW Norton and Company, 2005.
  • Jules Witcover, “Party of the People, A History of the Democrats,” Random House, 2003.

 

Frederick Douglass was born a slave, but his life was to later move into a different world. He became an important figure in the US abolitionist movement in the mid-nineteenth century. Here, Christopher Benedict looks at Douglass’ views on the Fourth of July and whether slaves could really appreciate Independence Day when they were not free.

Frederick Douglass in 1856.

From Plantation to Platform

The Douglass family, which in 1848 consisted of Frederick and his wife Anna, not to mention their five children Rosetta, Lewis, Frederick Jr., Charles, and Annie, settled into their new nine room home at 4 Alexander Place in Rochester, New York.

From here, Douglass contributed to and edited the abolitionist newspaper North Star, embarked upon speaking engagements in New England, New York, Ohio, and Pennsylvania, made the acquaintances of John Brown and Elizabeth Cady Stanton (whose suffrage movement benefitted from his being the sole public voice of assent), lobbied for the desegregation of Rochester’s learning institutions when Rosetta was forced to leave her private school, supported Free Soil candidates Martin Van Buren and Charles Francis Adams, and sheltered numerous fugitive slaves while assisting them with safe passage to Canada.

These surroundings and circumstances may have been a far cry from the Maryland of his birth thirty years earlier, but his youth spent on Holme Hill Farm in Talbot County, and particularly his year as a rented resource to farm owner and brutal overseer Edward Covey, would never fade into distant memory. His mother was an enslaved woman named Harriet Bailey, and it was believed by fellow slaves, though never confirmed nor denied, that Frederick’s father was also his white master, Aaron Anthony, which would hardly have been an uncommon occurrence.

After escaping Baltimore for Wilmington, Delaware by train in 1838, using protection papers given to him by a merchant seaman, he first set foot in free territory upon reaching Philadelphia by steamer. A second locomotive journey landed Frederick in New York City, where he was reunited with Anna after their engagement back in Maryland and abandoned his birth name of Bailey in favor of the alias Johnson. It would be at the urging of the welcoming and securely protected black community in New Bedford, Massachusetts that he then dropped the all-too-common Johnson for Douglas, inspired by the character of the Scottish lord from Sir Walter Scott’s The Lady of the Lake (and adding the additional ‘s’).

Because he had become proficient at the trade of caulking at the Baltimore shipyards of his mostly benevolent former possessors Hugh and Sophia Auld, where he began as a bookkeeper after Sophia had taught him to read and write (which was then frowned upon and discouraged, necessitating his own covert self-education), Douglass easily found work in the storied whaling village, joined the congregation of the African Methodist Episcopal Zion Church, and subscribed to William Lloyd Garrison’s The Liberator.

Invited to appear at an abolitionist fair in Concord, MA, which was attended by Henry David Thoreau and Ralph Waldo Emerson, he then began what would become his hugely successful autobiography Narrative of the Life of Frederick Douglass, an American Slave, Written By Himself, published in 1845 (as an aside, this is still celebrated in New Bedford every February with a community read-a-thon sponsored by its Historical Society, which I proudly got to participate in while an unfortunately short-lived resident of the Bay State in 2011-12).

It strained reason for many to accept that an uncultured black man, one whom the bulk of white society took at face value to be an exchangeable and disposable commodity rather than a human being with hopes and dreams and love and hurt in his heart, could compose, without generous assistance, such a thoughtful, highly articulate work of literature.

Nonetheless, the man born into bondage had not only endeavored toward his liberation, but was now embraced within the most illustrious intellectual circles, walking freely and proudly into their literary salons and halls of academia.

Now a distinguished citizen of Rochester, Douglass was asked to deliver a speech from the stage of Corinthian Hall on July 5, 1852 commemorating the anniversary of America’s independence. The irony, if it was not intentional or, for that matter, even at first apparent to some, would be manifested brilliantly and manipulated scorchingly.

 

As With Rivers, So With Nations

Treading lightly while wading toward troubled waters, Douglass begins on a misleadingly modest note, offering apologies for “my limited powers of speech” and “distrust of my ability”, professing to have thrown “my thoughts hastily and imperfectly together” owing to “little experience and less learning”.

Douglass compares the deliverance of the country’s political freedom to the Passover celebrated by the emancipated children of god, noting the buoyancy inherent to the Republic’s relatively youthful age, 76 years, which he remarks is “a good old age for a man, but a mere speck in the life of a nation.” Perhaps, Frederick suggests, “Were the nation older, the patriot’s heart might be sadder, and the reformer’s brow heavier. Its future might be shrouded in gloom, and the hope of its prophets go out in sorrow.” 

Interestingly, Douglass refers to the free and independent states of America through the use of feminine pronouns, whether as a repudiation of their former British fatherland and/or the noble words and deeds of the nation’s Founding Fathers he feels are now being bastardized, or as an unspoken remembrance of his own birth-giver, the mother he last saw at the age of 7 or 8 when she presented him with a heart-shaped ginger cake and the pet name “Valentine”. 

“Great streams are not easily turned from channels, worn deep in the course of ages,” says Douglass. “They might sometimes rise in quiet and stately majesty and inundate the land, refreshing and fertilizing the earth with their mysterious properties. They may also rise in wrath and fury, and bear away on their angry waves the accumulated wealth of toil and hardship.”

While the river “may gradually flow back to the same old channel, and flow on serenely as ever,” Douglass begins the shift in his discourse with the warning that “it may dry up, and leave nothing behind but the withered branch, and the unsightly rock, to howl in the abyss-sweeping wind, the sad tale of departed glory.”

 

Dastards, Brave Men, and Mad Men

Conceding that “the point from which I am compelled to view them is not, certainly, the most favorable”, the nation’s founders were, in Douglass’ estimation, “brave men” and “great men”, also “peace men” who nonetheless “preferred revolution to peaceful submission to bondage”, “quiet men” who “did not shrink from agitating against oppression”, and men who “believed in order, but not in the order of tyranny.”

Likewise, they had intentionally not framed within their Declaration and Constitution the idea of an infallible government, one which Douglass believed had since become fashionable, while falling out of repute was the deliberate action of “agitators and rebels...to side with the right against the wrong, with the weak against the strong, and with the oppressed against the oppressor.”

Douglass’ assertion was that the natural clash of these contemporary ideologies culminated in the 1850 Fugitive Slave Act, which made legalized sport of hunting down and returning runaway slaves to their masters, and a grotesquely profitable one at that.

George Washington, Douglass pointed out, “could not die till he had broken the chains of his slaves. Yet his monument is built up by the price of human blood, and the traders in the bodies and souls of men.”

He drives this point home by quoting from Shakespeare’s Julius Caesar, “The evil that men do lives after them. The good is oft interred with their bones.”

 

Inhuman Mockery

Now comes Douglass’ direct confrontation of the question pertaining to why he was called upon to give this address on this occasion, the answer to which lay in the larger matter of whether the “life, liberty, and pursuit of happiness” Thomas Jefferson bequeathed to America’s countrymen were rights that extended to him, as well as his kith and kin. If there remained any doubt about the reply, Douglass demolished it.

“The sunlight that brought life and healing to you, has brought stripes and death to me. This Fourth of July is yours, not mine. You may rejoice, I must mourn.”

Unable to equivocate or excuse the great blasphemy of human slavery which made a mockery not only of the Constitution but of the Bible, Douglass declared to his “Fellow Americans” that “above your national, tumultuous joy, I hear the mournful wail of millions whose chains, heavy and grievous yesterday, are today rendered more intolerable by the jubilee shouts that reach them.”

He raises next the hypothetical argument of whether he and fellow abolitionists would be better served to “argue more and denounce less...persuade more and rebuke less.”

Again, his condemnation of these tactics arrives swift and decisive as a lightning strike.

“Am I to argue that it is wrong to make men brutes, to rob them of their liberty, to work them without wages, to keep them ignorant of their relations to their fellow men, to beat them with sticks, to flay their flesh with the lash, to load their limbs with irons, to hunt them with dogs, to sell them at auction, to sunder their families, to knock out their teeth, to starve them into obedience and submission to their masters?”

To do so, Douglass insisted, would “make myself ridiculous and to offer an insult to your understanding.”

 

Unholy License

If the “peculiar institution” of slavery was upheld by American religion in addition to American politics, was it to be viewed as somehow supernal?

If the church could largely ignore the Fugitive Slave Act, “an act of war against religious liberty”, how else, Douglass wonders, could its rituals be regarded but as “simply a form of worship, an empty ceremony and not a vital principle requiring benevolence, justice, love, and good will towards man?”

To this says Douglass, “welcome infidelity, welcome atheism, welcome anything in preference to the gospel as preached by those Divines.”

Using the word of god against itself with incendiary righteousness, he recites from the book of Isaiah. “Your new moons, and your appointed feasts my soul hateth. They are a trouble to me, I am weary to bear them, and when ye spread forth your hands I will hide mine eyes from you. Yea, when ye make many prayers, I will not hear. Your hands are full of blood. Cease to do evil, learn to do well. Seek judgment, relieve the oppressed. Judge for the fatherless, plead for the widow.”

Among the exceptionally noble men that Douglass gives name to are Brooklyn’s abolitionist firebrand Henry Ward Beecher, Syracuse’s Samuel J. May, and Reverend R. R. Raymond who shared the platform with him that day. Douglass charges them with the task of continuing “to inspire our ranks with high religious faith and zeal, and to cheer us on in the great mission of the slave’s redemption from his chains.”

 

Penetrating the Darkness

The Constitution will always remain open to the interpretation of those whose will is to bend and stretch the wording of its amendments one way or another to the advancement of a specific agenda. Regardless, Frederick Douglass maintained that it is “a glorious liberty document” in which “there is neither warrant, license, nor sanction of the hateful thing” that is slavery.

Similarly, he drew encouragement from the Declaration of Independence, “the great principles it contains, and the genius of American institutions.”

Knowledge and intelligence, time and space, were colliding in many wonderful ways which gave Douglass, ultimately, reason for hope and optimism.

“Notwithstanding the dark picture I have this day presented...I do not despair of this country. There are forces in operation which must inevitably work the downfall of slavery. No abuse, no outrage whether in taste, sport, or avarice, can now hide itself from the all-pervading light.”

And, despite the fact that they would shortly thereafter experience a bitter falling-out, Douglass ended on a conciliatory note, courtesy of a passage from William Lloyd Garrison:

In every clime be understood

The claims of human brotherhood

And each return for evil, good

Not blow for blow

That day will come all feuds to end

And change into a faithful friend

Each foe

 

Did you find this article interesting? If so, tell the world! Tweet about it, like it, or share it by clicking on one of the buttons below.

Sources

  • What to the Slave is the Fourth of July?, speech delivered by Frederick Douglass July 5, 1852 in Rochester, NY
  • Autobiographies: Narrative of the Life, My Bondage and My Freedom, and Life and Times by Frederick Douglass, edited and with notes by Henry Louis Gates Jr. (Library of America, 1994)