Switzerland had a curious position during World War Two. It was officially a neutral country, but that neutrality was not always strictly maintained. Here, Laura Kerr considers how neutral Switzerland really was and how helpful it may have been to Nazi Germany…

Fascist leaders Benito Mussolini and Adolf Hitler together in Munich in 1940. The pair discussed an invasion of Switzerland during World War Two.

Switzerland. Three things come to mind: watches, chocolate and neutrality. And for good reason. Firstly, Switzerland is home to both Rolex and Omega, which can boast the titles of ‘first watch on the moon’, ‘James Bond’s official watch since 1995’, and the watch of choice for both the American and British armies during World War One. However, despite its truly fascinating watch history, that is not the aspect of Switzerland that I am focusing on today.

Switzerland is the longest-standing neutral nation in the world and has not taken part in a foreign war since 1515. Its official stance of non-involvement was established at the Congress of Vienna in 1815, at which major European leaders met to decide the shape of Europe after the defeat of Napoleon.

Up until World War Two, Switzerland upheld her stance of neutrality rather admirably. But despite not engaging in combat during the war, Switzerland’s so-called ‘neutrality’ has been heavily scrutinized in recent years, with particular emphasis on border controls, banking and trade with Nazi Germany.

 

Hitler’s decision not to invade

The first question that needs to be answered to fully understand Switzerland’s position during WWII is why Hitler did not invade the country while trying to establish the Third Reich. Hitler described Switzerland as a “pimple on the face of Europe”, and both its geographical location and culture would seem to have made it a clear target for the Nazis.

A good way to summarize Hitler’s reasoning for not invading Switzerland is simply ‘risk versus reward’. Faced with the prospect of a German invasion, the Swiss invested heavily in improving their ‘National Redoubt’ (the Swiss national defense plan). Combined with the tough terrain and modern weaponry, this made Switzerland a far from easy target. Not only was the risk high, the reward wasn’t tremendously great for Hitler either. Switzerland and Germany already had a beneficial trading partnership which helped Germany’s war effort. Additionally, the neutral but infamous Swiss banks made Switzerland useful to the Nazis.

There’s little doubt that once the Allies had been defeated, Hitler would have mobilized an attack on Switzerland (a planned invasion was known as Operation Tannenbaum). But as it was, his attention and resources were occupied by bigger enemies, so any attack on Switzerland had to wait.

Nevertheless, by 1940 Switzerland was completely surrounded by the Axis powers and Nazi-occupied France, making it increasingly difficult to stay clear of the Second World War. It is the ways in which Switzerland accommodated, and in some ways assisted, Nazi Germany that make her “neutrality” so questionable.

 

Border control

After the Nazis gained power in Germany, many racial minorities attempted to flee to avoid persecution. Switzerland, a neighboring but impartial nation, seemed an obvious destination. As well as pledging neutrality, Switzerland had also pledged to be an asylum for persecuted groups in Europe. It had taken in Huguenots fleeing France in the 16th century and provided asylum for many liberals, socialists and anarchists from all over Europe in the 19th century. However, this tradition wasn’t exactly upheld during WWII.

For fear of angering Hitler and prompting an invasion, Swiss border regulations were tightened. Switzerland did establish internment camps which housed 200,000 refugees, of whom 20,000 were Jewish. Significantly, though, the Swiss government taxed the Swiss Jewish community for any Jewish refugees it allowed to enter the country.

In 1942 alone, over 30,000 Jews were denied entry into Switzerland, leaving them under the control of the Nazis. In an infamous speech, a Swiss government official stated that “our little lifeboat is full.” Although the prospect of leaving Jewish civilians to certain death under the Nazis is unthinkable, there are arguments in Switzerland’s defense. Switzerland was a small country (with a population of roughly 4 million) which was completely surrounded by Nazi troops and nations under Hitler’s control. In comparison, the USA (arguably the safest nation for fleeing Jews) repeatedly rejected Jewish refugees and accommodated only approximately 250,000 people between 1939 and 1945, a tiny number given its size. Historians today estimate that the USA could easily have accommodated over 6 million refugees.

But that is not the only controversy when it comes to Swiss border control. It was the Chief of the Swiss Federal Police, Dr Heinrich Rothmund, who proposed the idea of marking Jewish passports with a red ‘J’, a measure that became an important method of discrimination once adopted by the Nazis. The Swiss government wanted to monitor and control the number of Jews entering Switzerland, but the result was a measure that made fleeing from the Nazis even harder for Jews.

Interestingly, on March 8, 1995, the Swiss government made an official apology for its wartime cooperation with the Nazis, in particular its role in developing the ‘J’ stamp.

 

Banking

To this day, Swiss banks are known for the secretive but successful policies that helped create one of the strongest economies in the world. They were massively important during WWII, especially to high-ranking Nazis, and became another reason why Swiss neutrality has been questioned.

But why were they so important?

During the war, the Swiss franc was the only remaining freely convertible currency in the world. Both the Allies and the Axis powers therefore sold large amounts of gold to the Swiss National Bank and relied heavily on its economic stability. The German national currency was no longer a means of payment in international markets, which meant the Nazis relied on Swiss banks in order to buy war machinery and commodities from other countries.

But if the banks accepted gold from both sides, surely they were still technically neutral? That may be so, but it is the type of gold and the secretive way in which it was handled that has caused massive controversy in recent years. Over 581,000 francs’ worth of ‘Melmer’ gold taken from Holocaust victims was sold to and kept by Swiss banks. Following the defeat of the Nazis, Swiss banks struggled with what to do with the gold, whose rightful owners had been murdered in the genocide.

 

Trade

Prior to WWII, Switzerland had relied heavily on trade with Germany to build a strong and economically powerful nation. It was an industrialized country with virtually no raw materials, experiencing the same economic depression that was felt throughout both Europe and America. When World War Two commenced, Switzerland worried that any non-cooperation would lead to the end of vital trade and, even more significantly, to an invasion. As it was completely surrounded by Nazi-controlled countries, the Swiss had two choices: cooperate with Nazi trade policies or fight against them.

Between 1939 and 1945, roughly 10,276,000 tons of coal were transported from Germany to Switzerland, providing 41% of Switzerland’s energy requirements. This dependence demonstrates why the Swiss were keen to stay on good terms with Germany and continue this vital trade.

One thing Switzerland provided to the Nazis in return for important materials was access to the railway that ran through the country and connected Italy and Germany. In the event of an invasion, the Swiss army planned to destroy vital tunnels and bridges, immobilizing the railway for years and making transportation between Italy and Germany nearly impossible. To uphold its neutral stance, the Swiss government laid down restrictions on what could be transported over its railway, allowing sealed wagons to pass through without checking their contents in exchange for raw materials and trade. Officially, the Swiss banned any transportation of people (troops) or war goods over their railway, but the extent to which this was upheld is very questionable.

 

So, despite its attempts, Switzerland struggled to remain truly neutral during the Second World War. In fairness, World War Two was a ‘total war’, which made it hard for almost any nation to remain impartial. It is the type of involvement, however, that is interesting and less well known to people studying history.

The extent to which a country remains neutral during times of armed conflict goes beyond its lack of involvement in armed combat. A country can only be considered neutral if it demonstrates no bias in its business, social and economic activity.

Was Switzerland neutral? Arguably not.

But the extent to which they ‘helped’ the Nazis is a much more complex matter.

 


Jesse A. Heitz considers the issue of African security in a unique way by answering the question of “Which of the Four Horsemen of the Apocalypse - Conquest, War, Famine, or Pestilence - has most affected African security in the second half of the 20th century and early 21st century?”  He argues that of the Four Horsemen of the Apocalypse, war has posed the greatest threat to African security. But the other horsemen have had significant roles to play – and are often closely linked to war…

Four Horsemen of the Apocalypse by Viktor Vasnetsov. 1887.

Pestilence – Libya & Kenya

War has a great effect on the horseman known as Pestilence. Here the term pestilence will extend beyond its biblical connotation: it will comprise both its traditional identity of disease and what can be described as a political disease, namely political instability.

In 1969, Libyan King Idris was deposed in a military coup by Colonel Muammar Gaddafi.  The freshly-minted dictator quickly introduced state socialism and nationalized virtually all of the country’s industry, including the all-important oil industry.[1]  Over the next several decades Gaddafi’s Libya militarily intervened in neighboring states and its nationals engaged in terrorist acts around the globe, most notably the 1988 Lockerbie Bombing.[2]  In early 2011, violent protests broke out in Benghazi following the arrest of a human rights campaigner.[3]  Gaddafi’s security forces quickly retaliated, leading to a full-scale civil war.[4]  With help from allied airstrikes, Gaddafi was expelled from Tripoli in August of that year.  Within two months he had been captured and killed.[5] 

While Gaddafi had maintained his rule for four decades through the use of exceptional cunning and political mastery, the Libyan public had grown tired of the rampant corruption within his regime, whose officials often demanded millions of dollars in consultancy fees from foreign firms.[6]  He was documented to have extorted $1.5 billion from oil companies to pay for the Lockerbie settlement, and was said to have siphoned off tens of billions of dollars in state revenue into his own personal coffers.[7]  With Gaddafi’s corrupt but relatively stable government gone, Libya has been in a veritable state of violent flux ever since.

In Kenya, the course of events has been considerably different insofar as its government never experienced a period of state failure.  However, that is not to say that it did not fluctuate between effective and ineffective.[8]  The swansong of the British Empire in Kenya, the Kenyan Emergency, lasted from 1952 to 1960.  With the level of conflict and tension so fierce, Britain opted to hasten granting Kenya its independence.[9]  For the following forty years, Kenya was marked by tribal animosity, political assassinations, and human rights violations.[10]  In recent years Kenya has stabilized, but the U.S. State Department has warned that regional instability in the Horn of Africa is the greatest threat to its security.[11]  Kenya has thus far extracted itself from its tradition of political pestilence born out of years of armed conflict and opposition, only to have its newfound stability threatened by the wars taking place in neighboring lands.

The nations of Africa are not the sole actors in the creation of political instability.  Foreign actors continue to jeopardize the political stability of developing nations in Africa.  Once it was the colonial powers, then the dueling superpowers at the height of the Cold War; now it is nations that seek to serve their own national interests.  For example, Ian Smith’s Rhodesia, which waged war against Robert Mugabe’s forces throughout the 1960s and 1970s, and oppressive, apartheid-era South Africa both found commercial partners in the United States.  The U.S. and its firms purchased large quantities of manganese, platinum, and chromium from South Africa[12], while buying chromium from Rhodesia[13] as well.  It cannot be doubted that such transactions helped to fund and prolong the conflicts raging in those states.

 

Pestilence & Disease

The final manifestation of pestilence heavily influenced by war is disease itself.  The Darfur Conflict illustrates this well.  Since fighting broke out in 2003 between the Sudanese government, its allied rebel groups and militias, and its enemies in the western reaches of the country, some 2.7 million people have been displaced[14], with an estimated 300,000 deaths.  Of those 300,000 deaths, it is reported that 80% were due to disease.[15]  While humanitarian organizations have made strides in caring for refugees, the threat of violence and attacks on convoys diminishes the ability of aid groups to combat disease by providing medical care and immunizations, clean water, and the rations necessary to stave off malnutrition-related illness.[16]

During and in the wake of war, numerous endemic diseases have surfaced, plaguing civilian populations.  The massive migrations of refugees have allowed a disease such as malaria to infect millions, and as of 1998 Africa accounted for some 90% of the world’s cases of malaria.[17]  Additionally, sub-Saharan Africa is horribly afflicted with varying types of infectious illness ranging from cholera and tuberculosis to dysentery.  Authorities estimate that 70% of the deaths in this massive portion of Africa are due to infectious disease.[18]

Another disease which is decimating many African nations is HIV/AIDS.  According to the U.N., in 2011 there were 1.8 million new cases of HIV for a total of 23.5 million people living with the disease, with some 1.2 million people dying from AIDS in sub-Saharan Africa.[19]  Stable and relatively conflict-free states such as Botswana have achieved an 80% treatment level for their citizens suffering from HIV/AIDS.[20]  For war-torn and recovering states such as South Sudan and Somalia, the treatment rate falls to below 20%.[21]  Perhaps the most horrific correlation between HIV transmission and war is the widespread occurrence of sexual assault in war zones.  For example, scholars have alleged that there was a “willful transmission” of HIV, or the use of HIV as a weapon, during the Rwandan genocide, when an estimated 200,000 to 500,000 women were raped.[22]

One of the forgotten health concerns stemming from war is mental health.  Some sources have stated that the population of Uganda, which has been battling an insurrection in its northern territory for two decades, may have an incidence of PTSD in excess of 50%, and an incidence of clinical depression that sits above 70%.[23]  As shown, war can create and exacerbate the physical and psychological manifestations of pestilence.

 

Conquest – Troubles in Congo and Rwanda

The second horseman, Conquest, has been showcased in a series of intertwined wars that marred the Congo and its neighbors for decades and continue to define its security.  In the early 1950s, the native peoples of the Belgian colony of Congo achieved citizenship, which placed them on a more even footing with the Europeans that occupied their land.[24]  By 1958, the Congolese people began their march towards independence in earnest with the rise of Kasa-Vubu.[25]  Despite the tangible signs of progress, the call for immediate independence grew louder.  The Belgians had hoped to ever so slowly transition into releasing the reins on the Congo, but after riots in 1959, it was clear that such lofty aspirations were unrealistic.  By June the following year, the Belgians abruptly left their prized colony.[26]  Revolts and rioting quickly ensued, leading to several years of government instability, external interventions, and bloody conflict.[27] 

By November 1965, Joseph Mobutu had seized power in a coup and wasted little time in tightening his grip on the infrastructure-barren state, going so far as to rename it Zaire.  He cemented his control over the military, nationalized the industry within the state, and won the favor of Western governments who saw him as an opponent of the Communist sphere.[28]  Throughout the 1970s, he engorged himself on the industry he had absorbed and brutally crushed any opposition to his rule.[29]  By the 1980s, an opposition party under the leadership of Etienne Tshisekedi emerged and kick-started the process of eroding Mobutu’s position.  As the Soviet Union began to disintegrate, the West found decreasing utility in the murderous dictator and began applying diplomatic pressure on his regime.  Mobutu’s control continued to fade as his military began voicing its displeasure.[30]

Events in neighboring Rwanda in 1994 sealed Mobutu’s fate.  At that point Rwanda had a population of approximately seven million people and was rife with ethnic tension between the majority Hutus and the minority Tutsis.[31]  In April of that year, Rwandan President Habyarimana’s plane was shot down and violence erupted almost immediately.[32]  Officials capable of stemming the bloodshed were quickly assassinated.[33]  By the end of the 100-day genocide, nearly three-quarters of the Tutsi population had been wiped out.[34]  Refugees flooded into Zaire, while Tutsi rebel forces eventually launched a counterattack and regained control of Rwanda.  Then it was the turn of the perpetrators of the genocide to flee to Zaire.[35]

Congolese rebel forces under Laurent Kabila, a longtime Mobutu opponent, which had been growing in strength for years, led the charge against the Hutu rebels operating in Zaire.  With the support of Rwanda and Uganda, Kabila’s AFDL soon marched on Mobutu.  The First Congo War was well underway.  Kabila quickly overthrew Mobutu, who fled into exile, and renamed the nation the Democratic Republic of Congo (DRC).[36]  Yet, Kabila ruled with a firm hand.  Such a governing style was not in the best interests of his backers, who had hoped to plunder the DRC’s vast resources.[37]  Rwanda and Uganda then began funding the rebel groups fighting to unseat him.  Soon, Angola, Zimbabwe, Namibia, and Chad, all sent troops in support of Kabila, with the intent of serving their own economic interests.[38]

War continued to ravage the DRC in the years that followed.  By 1999, the United Nations had stepped in to support the Lusaka Peace Accord.[39]  All signatories except Rwanda and Uganda withdrew their troops.  With violence still raging, the U.N. greatly increased its peacekeeping force.[40]  In 2006, Rwandan President Paul Kagame stated that all of his troops had been removed from the DRC’s Kivu provinces.[41]  Later that year, Joseph Kabila, Laurent Kabila’s son and successor following his 2001 assassination, signed a new constitution which ushered in sweeping reforms.[42]

In 2008, Rwanda and the DRC, which had been steadily rebuilding the foundations of its government[43], joined forces to fight a rebel group named the Democratic Forces for the Liberation of Rwanda (FDLR), which had been operating in the DRC’s Kivu provinces.[44]  Unfortunately, by 2012, relations between the two rival states had broken down once more, with the DRC accusing Rwanda and Uganda of arming the M23 band of rebels.[45]  By the close of 2012, the U.N. was forced to maintain a 20,000-strong peacekeeping force in the DRC.[46]  This seemingly endless string of wars has devastated the DRC, with some four million people, nearly all of them civilians, perishing.[47]  The recent Kivu Conflict alone has displaced a reported three million people.[48]

 

Famine – From Ethiopia to Nigeria and beyond

The third horseman to be discussed is Famine.  Again, here we will extend beyond the word’s strict definition: it will cover both food shortages and economic difficulties, or hunger and poverty.  War is commonly cited as a factor capable of causing famine.  In times of war and targeted violence, fields and food production facilities are often damaged or destroyed, efficient transportation is often impaired, and large populations of people are relocated to sometimes barren refugee camps where rations may be substandard.

A prominent example of war worsening or even causing famine can be seen in a portion of the Ethiopian Civil War during the 1970s and 1980s, when Ethiopia’s dictator, Mengistu, withheld food assistance from the Tigrayan peasantry, from whom his opponents were drawn.[49]  In the Democratic Republic of Congo, war has worsened food shortages.  During the never-ending sequence of wars in that country, farmers in certain regions have lost up to 50% of their tools and 75% of their livestock.[50]  The 1984-85 famine in Ethiopia alone resulted in approximately one million deaths.[51]  The Nigerian Civil War, which took place from 1967 to 1970, saw 3,000 to 5,000 people lose their lives each day due to starvation.[52]  Famine, while complicated by numerous factors, can most certainly be both a cause and an effect of war.

The second form of famine takes the shape of economics.  War has the ability to directly affect the properties that can drive economic decline and stagnation.  War can, and often does, cripple infrastructure, displace civilians including laborers, and foster the growth and extension of disease that can greatly tax healthcare systems.  One only needs to look at Libyan GDP per capita from the years 2010 to 2012 to view the economic impacts war can cause.  In 2010 Libyan GDP per capita was $15,900.  In 2011, the year of the civil war that ousted Gaddafi, it was reduced by over half to a paltry $6,100.  The following year it had rebounded to $12,300.[53] 

As mentioned above, the African continent had long been pilfered by colonial occupiers, self-indulging dictators, and opportunistic states.  There may be no better example of such a situation than that of Sierra Leone during the 1990s.  Rich with diamonds, ominously nicknamed “blood diamonds”, Sierra Leone was once besieged by rebels so brutal that their hallmark was amputating the hands and arms of civilians, including children.  Yet its neighbors, such as Liberia, as well as nations and companies from several different continents, coldly picked sides based on who promised to auction off diamonds at the lowest price.[54]

 

War – The ultimate horseman?

In several African nations, economic growth is underway.  The mining and oil industries in particular are rushing into the “Dark Continent” with an almost unprecedented fervor[55], and the resultant influx of revenue for many once perpetually impoverished nations will only serve to bolster their security.  However, Malawian Vice President Justin Malewezi warned in 2002 that armed conflict in southern Africa was a threat to attracting meaningful foreign direct investment.[56]  In other words, war could jeopardize economic growth.

In terms of African security, war has proven itself to be the bringer of pestilence, famine, and conquest.  War can cripple entire institutions such as education[57], it can create armies of child soldiers, and it can propel itself by attracting arms traffickers[58].  What makes the case for war’s supremacy amongst its fellow horsemen is that it is quantifiable and visible; its barbarism and resultant chaos are in plain view.  In biblical terms, war is fully capable of being, and often is, the proverbial “Alpha and Omega”, the beginning and the end of the Four Horsemen of the Apocalypse.

 


 

[1] "Libya Profile." BBC News. BBC, 26 June 2013. Web. 12 July 2013. <http://www.bbc.co.uk/news/world-africa-13755445>.

[2] Ibid.

[3] Ibid.

[4] Ibid.

[5] Ibid.

[6] Lichtblau, Eric, David Rohde, and James Risen. "Shady Dealings Helped Qaddafi Build Fortune and Regime." Nytimes.com. New York Times, 24 Mar. 2011. Web. 13 July 2013. <http://www.nytimes.com/2011/03/24/world/africa/24qaddafi.html?pagewanted=all&_r=0>.

[7] Ibid.

[8] Charles Hornsby, Kenya: A History Since Independence, (London: I. B. Tauris, 2012), p.3

[9] Duncan Hill, World at War: 1945 to the Present Day, (Croxley Green, Hertfordshire, UK: Transatlantic, 2011), p.22

[10] "Kenya: A Political History." BBC News. BBC, 24 Dec. 1997. Web. 10 July 2013. <http://news.bbc.co.uk/2/hi/special_report/for_christmas/_new_year/kenyan_elections/41737.stm>.

[11] "U.S. Relations With Kenya." State.gov. U.S. Department of State, 11 Dec. 2012. Web. 16 July 2013. <http://www.state.gov/r/pa/ei/bgn/2962.htm>.

[12] Thomas G. Paterson, John Garry Clifford, and Kenneth J. Hagan, American Foreign Relations: Volume 2, Since 1895, (Boston, Mass.: Houghton Mifflin, 2000), p.424

[13] Paterson, p.384

[14] "Darfur--Overview." Unicef.org. UNICEF, Oct. 2008. Web. 18 July 2013. <http://www.unicef.org/infobycountry/sudan_darfuroverview.html>.

[15] Associated Press. "Study: Most Deaths in Darfur War from Disease." Msnbc.com. NBC News, 23 Jan. 2010. Web. 12 July 2013.

[16] "Darfur--Overview.”.

[17] Thomas C. Nchinda, "Malaria: A Reemerging Disease in Africa." Emerging Infectious Diseases 4.3 (1998): 398-403. Ncbi.nlm.nih.gov. World Health Organization. Web. 15 July 2013.

[18] Maire A. Connolly, and David L. Heymann. "Deadly Comrades: War and Infectious Diseases." The Lancet Supplement 360 (2002): 23-24. Rice University. Web. 11 July 2013.

[19] "Regional Fact Sheet 2012: Sub-Saharan Africa." Unaids.org. United Nations, n.d. Web. 9 July 2013. <http://www.unaids.org/en/media/unaids/contentassets/documents/epidemiology/2012/gr2012/2012_FS_regional_ssa_en.pdf>.

[20] Ibid.

[21] Ibid.

[22] Obijiofor Aginam, "Rape and HIV as Weapons of War." Unu.edu. United Nations University, 27 June 2012. Web. 13 July 2013.

[23] Stephen Leahy, "Africa: Untreated Mental Illness the Invisible Fallout of War and Poverty." Allafrica.com. All Africa, 10 Oct. 2012. Web. 19 July 2013.

[24] Sean Rorison, Congo: Democratic Republic and Republic, (Chalfont St. Peter: Bradt Travel Guides, 2008), p. 65

[25] Rorison, p. 66

[26] Rorison, p. 66

[27] Rorison, p. 67

[28] Rorison, p. 68

[29] Rorison, p. 69

[30] Rorison, p. 69

[31] "Genocide in Rwanda." Unitedhumanrights.org. United Human Rights Council, n.d. Web. 16 July 2013. <http://www.unitedhumanrights.org/genocide/genocide_in_rwanda.htm>.

[32] Ibid.

[33] Ibid.

[34] Ibid.

[35] Rorison, p. 70

[36] Rorison, p. 70

[37] "DR Congo." Refugeesinternational.org. Refugees International, n.d. Web. 17 July 2013.

[38] Rorison, p. 71

[39] Rorison, p. 72

[40] Rorison, p. 73

[41] Rorison, p. 74

[42] Rorison, p. 74

[43] "Q&A: DR Congo Conflict." BBC News. BBC, 20 Nov. 2012. Web. 11 July 2013. <http://www.bbc.co.uk/news/world-africa-11108589>.

[44] Ibid.

[45] Ibid.

[46] Ibid.

[47] Rorison, p. 71

[48] "DR Congo.”.

[49] "Ethiopian Famine 25th Anniversary - Questions and Answers." One.org.us. One, n.d. Web. 16 July 2013. <http://www.one.org/c/us/issuebrief/3127/>.

[50] "Congo: Grappling with Malnutrition and Post-Conflict Woes." Irinnews.org. IRIN Africa, 9 Aug. 2007. Web. 11 July 2013.

[51] "Ethiopian Famine 25th Anniversary - Questions and Answers.".

[52] Hurst, Ryan. "Nigerian Civil War (1967-1970)." Blackpast.org. The Black Past, n.d. Web. 18 July 2013. <http://www.blackpast.org/?q=gah/nigerian-civil-war-1967-1970>.

[53] "Libya." Cia.gov. The Central Intelligence Agency, 10 July 2013. Web. 16 July 2013.

 

[54] James Rupert, "Diamond Hunters Fuel Africa's Brutal Wars." Washingtonpost.com. The Washington Post, 16 Oct. 1999. Web. 14 July 2013. <http://www.washingtonpost.com/wp-srv/inatl/daily/oct99/sierra16.htm>.

[55] Leka, Acha, Susan Lund, Charles Roxburgh, and Arend Van Wamelen. "What's Driving Africa's Growth?" Www.Mckinsey.com. McKinsey & Company, June 2010.

[56] "Instability Scares Off Investment, Malawi Official Warns." Panapress.com. Panapress, 12 Jan. 2002. Web. 15 July 2013.

[57] "Conflict Makes Millions Miss School." Aljazeera.com. Al Jazeera English, 1 Mar. 2011. Web. 15 July 2013. <http://www.aljazeera.com/news/africa/2011/03/201131194628514946.html>.

[58] Kester Kenn Klomegah, "Russia Eyes Africa to Boost Arms Sales." Guardian.co.uk. Guardian News and Media, 04 Apr. 2013. Web. 18 July 2013. <http://www.guardian.co.uk/world/2013/apr/04/arms-trade-africa>.

The story of an incredible person… From the racialized world of Jim Crow Georgia and the boxing rings of England and France to the killing fields of World War One and the celebrated jazz clubs of the Montmartre—Eugene Bullard lived an exceptional life.

Eugene Bullard with his pet in 1917 as a pilot in the Lafayette Flying Corps in France.

Born in Columbus, Georgia in 1895, Bullard, like most Southern blacks of his generation, seemed destined for a life of crude “shotgun houses”, low-grade labor, perpetual deference, and limited social mobility. Jim Crow, the region’s racial caste system, proved insufferable as it subjected the region’s black residents to vitriolic racism, de jure segregation that was most certainly separate but anything but equal, and political disenfranchisement. Based on his skin color alone, Bullard was born into a lifetime of second-class citizenship. Part of being a second-class citizen meant living under the never-ceasing threat of racial violence. In fact, the Jim Crow South’s predilection for terror tactics made an early but paramount impression on the young Bullard. His father, a man known as “Chief Big Ox” for his vaunted strength and supposed Indian ancestry, was the victim of physical and verbal abuse at the warehouse where he worked as a drayman and stevedore along Columbus’s riverfront. Although he remonstrated with the warehouse’s owner, W.C. Brady, the abuse persisted. Infuriated by the elder Bullard’s plea to Brady, the supervisor responded by striking “Chief Big Ox” with an iron hook. The physically superior Bullard subdued his assailant and calmly launched him into a storage cellar. Brady quickly realized Bullard’s innocence in the situation and engineered a compromise between the two men. However, later that evening, a drunken white mob surrounded the Bullard home, attempting to push their way in through both doors. The elder Bullard waited inside with his shotgun in hand while the rest of his family huddled together in fright. Luckily the mob, apparently too inebriated to continue, disbanded, but Bullard, fearing for his safety, fled the city while tensions cooled. The elder Bullard narrowly escaped what would have most certainly been a lynching, but the incident illuminated the horrid reality of Jim Crow so clearly that even young Eugene, still only a child, could easily understand: though no longer slaves, Southern blacks were hardly free.

 

To another place

Feeling, on one hand, the intolerable restrictions on black life in the South, and the natural wanderlust of youth on the other, the young Eugene took to the road at the tender age of eleven. Even in his adolescence, the headstrong Bullard desired to be his own man, and after traveling with a band of gypsies and using his skillful horsemanship to earn a wage on a number of farms in southern Georgia, he realized that such a goal could never be achieved in caste-conscious America. The racially liberal environs of Western Europe, the gypsies assured him, had no such color line. Thus, after having his leg gashed open by a white passerby in downtown Atlanta for no reason other than that he was sporting a fashionable “Buster Brown” suit, Bullard hopped a series of trains and boats to Norfolk, Virginia, where he eventually stowed away on a ship bound for Hamburg, Germany.

Yet he only made it as far as Aberdeen, Scotland. From there, he migrated south, finally arriving in Liverpool. His time in the English port city would be formative, as it was there that he found steady pay in professions that brought him into tight-knit professional circles and earned him a modicum of notoriety. His first venture was show business. Upon arriving in Liverpool he found work at the Birkenhead amusement park, which proved to be his gateway into a much larger act: Belle Davis’s Freedman’s Pickaninnies, a vaudeville act specializing in minstrelsy. Modern readers recognize such shows as highly offensive and otherwise demeaning, but Europe was not America. Bullard, always highly self-aware, had few reservations about mocking racial stereotypes because he realized that doing so in Europe did not reinforce any particular racial order or hierarchy. He found the laugh of the European devoid of the malice and perversity that characterized the contemptuous American laugh. Having a steady job and steady pay allowed him to try his hand at boxing on the weekends. By the turn of the twentieth century boxing had become the sport of choice for working-class Englishmen, and a number of African American boxers had gained considerable fame across the Channel. Perhaps the most popular was a young Southerner named Aaron Lester Brown who, like Bullard, had fled the suffocating environment of the Jim Crow South, earning him the nickname the “Dixie Kid.” Bullard quickly became Brown’s understudy, and before long the two were touring across England and France on the same match card. While visiting Paris as a boxer, he fell in love with the city, which welcomed blacks and exhibited little apprehension about black and white interactions. He eventually relocated there, becoming, in his mind at least, a proud Frenchman.

 

War

However, the blaring guns of August 1914 cut his boxing career short. At the age of nineteen, Bullard joined the French Foreign Legion. He fought bravely at the Battle of the Somme, where he proved to be a highly efficient machine gunner. He survived the initial month of the bloody and prolonged Battle of Verdun, but a month into the fighting an incoming artillery barrage blew open a wound in his thigh as he was carrying a message from one officer to another. Though he would eventually be awarded the Croix de Guerre for his heroism, his service, at least as an infantryman, ended at Verdun. But Bullard would not be sidelined for long. After finishing his convalescence, he enrolled in the French aviation school, becoming the first African American military pilot. He went on to fly a number of missions, registering at least one acknowledged “kill”.

America’s involvement in the war, however, re-introduced Bullard to the racism he thought he had left behind. His accomplishments were not only ignored by the American press, but Edmund C. Gros, an influential American living in France, successfully terminated his piloting career almost as soon as it had begun. As American troops crossed the Atlantic, the American army sought to maintain the statutes of Jim Crow: black and white soldiers were kept separate, blacks were normally employed in menial services, and black troops were typically led by white officers. Bullard posed a threat to the standing system at home. A common Jim Crow assumption asserted that black men did not have the mental capacity to operate heavy machinery unsupervised, relegating them to mostly tenant farming and unskilled labor. Bullard, being a pilot, negated such a faulty assumption. More importantly, though, Bullard’s mere presence in France made American whites hoping not to upset the racial order uneasy. The French, while very accepting of black troops, were forced to comply with American demands to take Bullard off the front lines, as they were desperate for the added American manpower. Thus while Bullard became a national hero in France, he was, if nothing else, scorned by the white American military establishment. Just as in his early days in Columbus, America’s involvement in the Great War once again made him persona non grata.

 

But Bullard would carry on. Following the war, he began playing drums for a black American jazz band. His new role would prove fortunate, as Parisian nightlife yearned for this new, inherently African American brand of music. With his proficiency in English and French and his wealth of connections in show business from his time with the Freedman’s Pickaninnies, Bullard became a valuable hiring agent for the jazz clubs of the Montmartre. He quickly befriended Joe Zelli, a nightclub impresario who owned popular clubs in New York and London. With the help of Bullard’s friend Robert Henri, the two obtained an all-night club license and went into business together. Soon Zelli’s, the chosen name of the club, became the most popular club in Paris. Bullard then struck out on his own, buying the club Le Grand Duc. Though his ownership of the club is contested, his presence in the Montmartre scene was undeniable. He mediated contracts and recruited the black American musicians streaming across the Atlantic, finding them work and introducing them to highly influential and wealthy patrons. Bullard’s time in the Montmartre put him in contact with celebrities like Jack Dean and Fannie Ward, and even royalty: Edward Windsor, the Prince of Wales and heir to the British throne, was a frequent guest of Le Grand Duc.

Sadly, whereas the First World War paved the way for Bullard’s entrance into an elite circle of artists and celebrities, the Second World War marked his exit. Fearing the Nazi regime and its racial intolerance, he fled to New York, an ironic twist in an already perplexing life. In New York, he offered to use his influence to help various activist groups like the National Association for the Advancement of Colored People (NAACP). But much to his surprise, he was an unknown. Very few Americans knew about his wartime career and even fewer knew about his time in the Montmartre. Already an older gentleman, Bullard spent his last days as an elevator operator at New York’s Rockefeller Center. In 1959 he was the subject of a special edition of the Today Show, where his wartime service and extraordinary life were put on display. But even then, he was introduced only as the building’s black elevator operator, not as Eugene Bullard the vaunted prizefighter, jazz drummer, French national hero, celebrated pilot, or nightclub owner. He died soon after, in 1961 at the age of 66, thus ending a remarkable life that was both a triumph and a tragedy.

 


Eugene Bullard being interviewed on the Today Show in December 1959.

References

Craig Lloyd, Eugene Bullard, Black Expatriate in Jazz-Age Paris (Athens: University of Georgia Press, 2000).

http://www.blackpast.org/aah/bullard-eugene-jacques-1894-1961

http://www.georgiaencyclopedia.org/articles/history-archaeology/eugene-bullard-1895-1961


In this article Janet Ford discusses the horrific act of infanticide in the nineteenth century with the help of records from London’s Old Bailey court – with cases from London and (from 1856) further afield. It provides an insightful look into this terrible crime in Victorian England…

The Old Bailey in the early nineteenth century.

In the nineteenth century there were 203 cases of infanticide recorded in the Old Bailey.

Of the 203 cases, 83 people were found guilty, 114 were found not guilty and one received a ‘misc’ verdict. Of the 83 found guilty, only 18 were actually found guilty of killing, with three of those found insane and two given a ‘recommendation’; the other 65 were not guilty of killing but guilty of the lesser crime of concealing the birth. This shows that even though it was a highly emotional and shocking crime, women were not automatically found guilty. The reason so many were found not guilty of killing was often medical evidence, such as the health of the baby and mother. There was also an increased involvement of character witnesses in the courts, who could explain the background of the accused, and an increased interest in the criminal mind, especially that of women. Finally, there was more of an understanding of childbirth itself.

 

What the cases show about the crime and society

The role of Medical people

As all the cases involved doctors, surgeons or midwives, there was a desire for physical evidence, rather than just hearsay, in order to reach the right verdict and deliver justice. These medical witnesses had knowledge and experience of all types of childbirth, and so they could provide evidence of whether a death was accidental, deliberate, or too difficult to determine.

 

What it shows about Childbirth and its effects on crime

The records show two main aspects of childbirth: the physical effect on the baby and the emotional aspect. The emotional aspect of childbirth was the shame of having a baby out of wedlock - but also of having the father run out during the pregnancy, not being sure who the father was, not wanting to be a single mother, or sexual assault. It meant that women felt they had to injure or kill their baby, conceal the birth or deliver it alone. They were seen as criminals, which many were, but many were also victims of social attitudes and even of crimes themselves. The physical aspect of childbirth was the consequence of these elements, as women felt they had to deliver on their own, which meant there was no other person to help if the delivery was difficult. An example of the physical effect can be seen in a statement given by Doctor Thomas Green in Ellen Millgate’s case.

Health of the mother and child

The cases show that the health of both the mother and the baby was taken into consideration and used as evidence. The mother’s health, for example if she was epileptic, would have affected her ability to care for the baby properly. Poor health helped the mother’s case, as it was out of her control, as did the baby being premature. An example of health being used as evidence is shown in the case of Ellen Middleship, who was found not guilty.


Born alive

One of the main reasons why so many were found not guilty or only guilty of concealing the birth was the baby being born dead on delivery. It was out of the mother’s control, and so she would have been found not guilty. In many cases, it was too difficult to tell if the baby had been born alive during the delivery, as shown with the case of Elizabeth Ann Poyle.

Personal aspects

Along with medical evidence, personal aspects were also taken into consideration. Personal elements such as good character, age, previous children and the relationship with the father were all taken into account. These elements could show that the mother could not have committed the crime, as it was out of character, or at least helped to lessen the punishment, which did happen with many women. An example is shown with Sarah Jeffery giving a statement about Jane Hale, who was guilty of concealing but not of killing.   

Violence

 

The most shocking aspect of the cases, whether the women were found guilty or not, was violence. Violence could have been caused by cutting the cord, getting the child out, falling, or hitting. This was one of the hardest parts of a case to judge, as it could be difficult to determine whether injuries were caused by the birth or inflicted on purpose. What helped resolve this was medical knowledge, an understanding of childbirth, or eyewitness accounts. The understanding of childbirth helped to explain why there were marks on, for example, the neck and head: these could be due to ribbons or rope being used to get the baby out, or the baby falling during childbirth. Even though such marks were not inflicted on purpose, they are still shocking to read about, as shown with Ellen Millgate, where the marks were around a vulnerable part of the baby. With the help of eyewitness accounts, it was only in a few cases that the injuries were determined to have been inflicted on purpose. An example of this can be seen with Ann Dakin giving evidence in the case of Joseph Burch and Caroline Nash, who were both found guilty and given four years’ penal servitude.

It is one of the most shocking cases due to the violence, and a reminder that parents could abuse their own children. But, as with many of the other guilty cases, it also shows that women could be quite cruel and violent. Another element of violence was getting rid of the body. The main example comes from a description by James Stone of what he found in Martha Barratt’s room; she was found guilty of concealing the birth but not of killing.

Mercy towards women

Even with the violence, and the shame of committing the crime, the verdicts and the punishments show that there was understanding of and sympathy towards women, as the majority were found not guilty of infanticide or guilty only of a lesser crime. This was due to a better understanding of women, society, childbirth, and the criminal mind over the course of the century.

The cases show that infanticide was a very complex crime, as it involved and was affected by so many factors - health, childbirth, social attitudes, babies, violence and high levels of emotion. It also shows the various sides of the 19th century…

 


References

Anne-Marie Kilday, A history of infanticide in Britain, c. 1600 to the present (Palgrave Macmillan, 2013)

M Jackson, Infanticide: historical perspectives on child murder and concealment, 1550-2000 (Ashgate, 2002)

Old Bailey Online, January 1800-December 1899, Infanticide 

Ellen Millgate, 28th November 1842

Ellen Middleship, 21st October 1850

Elizabeth Ann Poyle, 22nd May 1882

Jane Hale, 28th November 1836

Joseph Nash and Caroline Nash, 24th October 1853

Martha Barratt, 9th April 1829

Body parts and the strangeness of the human anatomy have fascinated people for centuries, and they have been displayed and collected for some time. Here, Rebecca Anne Lush takes a look at how displays of ‘medical marvels’ have progressed through the ages…

An old scene from the Hunterian Museum in London.

With contents to both fascinate and repulse, it is no wonder medical museums continue to entice visitors. Gunther von Hagens’ Body Worlds has attracted thousands of visitors worldwide since its first exhibition in Tokyo in 1995. Today, there are nine exhibitions on display across the world, and with a further four planned in the near future, it appears as though this museum has sustained the public’s interest. According to its mission statement, it endeavors to teach the public the ins and outs of anatomy. Body Worlds is not alone. The Mütter Museum in Philadelphia and the Hunterian and Wellcome Museums in London also continue to engage the public with their morbid and fascinating specimens.

The history of medical museums is incredibly rich, filled with mystery and mayhem, curiosity and control. In the Victorian era especially, they came to represent a conflict between the professional and the public. No longer could an individual pay a small fee to sit in on an autopsy and leave with a qualification. As the Victorian era progressed, pathology and anatomy schools both professionalized and specialized. Their conflict with the public realm is a curious case indeed.

 

Before the nineteenth-century

Body parts have been displayed for centuries, serving multiple purposes. It can be argued that medieval churches displaying relics and reliquaries were amongst the earliest such displays in the Western world.

The collection and display of body parts became a more secular practice during the Renaissance. So-called Cabinets of Curiosities allowed avid collectors to organize their specimens and exhibit them to the public. Such cabinets could include human rarities to please and entertain visiting crowds.

It was not until the seventeenth century, however, that anatomical specimens were more carefully collected, labeled, and stored in permanent institutions. Many anatomy teachers during this period and after held private collections to increase their credibility. Two very famous eighteenth-century brothers, William and John Hunter, collected anatomical specimens en masse, collections later donated to the Royal College of Surgeons. This period was also a time for commercial anatomical displays, such as freak shows and travelling exhibitions of human oddities.

 

Dr Kahn’s Anatomical Museum

Such early examples were the foundations for Victorian public and professional medical museums. No public medical museum was more influential than Dr Kahn’s Anatomical Museum. Joseph Kahn, a self-professed medical doctor, moved from Alsace to London, opening his anatomical museum there in 1851. Initially entry was restricted to men who could afford the fee of two shillings. After two months, however, women were allowed inside during specific viewing times. Eight years later, the admission price was halved to one shilling, attracting larger crowds and more inquisitive minds.

On entering the exhibition space, visitors encountered an anatomical wax Venus, the organs of which could be removed. The rest of the museum consisted of wax models, specimens held in jars, and special “doctors-only” rooms. Medical doctors frequented Dr Kahn’s until its closure in 1864.

 

Dr. Kahn.

Professional Museums

Developing alongside these public spectacles were the more professional museums, belonging to hospitals, pathology societies, private schools, universities and Royal Colleges.

More formal institutions collected specimens to aid in medical education. Acquiring both abnormal and normal specimens increased levels of anatomical knowledge and encouraged anatomy to transform into a professional activity that aimed to improve standards of health. Although some were open to the public, the majority were kept under lock and key.

 

Conflict

In 1857 the Obscene Publications Act prevented any ‘obscene’ anatomy from being displayed in a public setting. Dr Kahn’s museum was deemed immoral under this act, resulting in its later closure. Other public anatomy museums continued to operate until the mid-1870s.

Both professional and public museums were striving to be centers of education. At first, the professionals admired Dr Kahn’s museum, especially the rooms dedicated to their study. Not only were early opinions favorable, but there is also evidence to suggest there were close relationships. Robert Abercrombie, for example, affiliated himself with the Strand Museum in London, establishing a consultation room next to the museum. Visitors were able to not only visit the museum, but also receive medical care on site.

As the Victorian era progressed, and as anatomy became specialized, these public museums were regarded as inappropriate venues for disseminating such medical information. Ongoing legal and social battles ensured that the professional schools of anatomy and pathology alone were the stakeholders in the industry. It was a conflict of words, with the professional museums writing at length in their medical journals about their distrust and disgust.

 

Today

It is quite interesting to see another shift occurring in the past few decades. Today, even the more professional museums from the Victorian era are open to the wider public. No longer is all medical information guarded by the elite and trained; it is accessible to anyone who wants to learn. Accompanying this, public medical museums displaying wax models are again appearing on the medical landscape. The curious case of medical marvels is a comment on how medical museums have developed and been transformed in order to meet the human desire for knowledge.

 


References

Alberti, Samuel J. M. M. Morbid Curiosities: Medical Museums in Nineteenth- Century Britain. Oxford: Oxford University Press, 2011.

Bates, A. W. “Dr Kahn’s Museum: obscene anatomy in Victorian London.” Journal of the Royal Society of Medicine 99, no. 12 (2006): 618-624.

Bates, A. W. “Indecent and Demoralizing Representations: Public Anatomy Museums in mid-Victorian England.” Medical History 52, no. 1 (2008): 1-22.

Kahn, Dr. Joseph. Catalogue of Dr Kahn’s Celebrated Anatomical Museum. Leicester Square: W. J. Golbourn, 1853.

Kesteven, W. B. “The Indecency of the Exhibition of Dr Kahn’s Museum.” Letter. The British Medical Journal 1, no. 49 (1853): 1094.

“Medical News: Dr Kahn’s Anatomical Museum.” The Lancet 1, no. 1443 (April 26, 1851): 474.

Stephens, Elizabeth. Anatomy as Spectacle: Public Exhibitions of the Body from 1700 to the Present. Liverpool: Liverpool University Press, 2011.

Jupiter Hammon was born into slavery in the early eighteenth century in what would become one of the Northern states. However, he fared better than most slaves, as his owners thought well of him and gave him a good education. Ultimately this contributed to his becoming America’s first published black poet. Christopher Benedict tells the fascinating story of Jupiter Hammon.

A depiction of Jupiter Hammon.

He Being Thy Captive Slave

Sometimes history exists, like those who contribute mightily to it, right under your nose and yet hidden in plain sight.

I have lived on Long Island, with one brief exception, for my entire 44-year lifespan. However, it took until a few months ago for a good friend and fellow history buff to point out the fact that the first black poet published in America was born and buried on an estate a mere seven miles from where I now reside.

Jupiter Hammon was born into slavery on October 17, 1711, his father Obadiah and mother Rose both duty-bound in the indentured servitude of Henry and Rebecca Lloyd on the little peninsula called the Manor of Queens Village.

This title was rather more regal-sounding than the name which preceded it. Horse Neck, derived from the seventeenth-century English equestrians from Huntington who stabled their steeds there, displaced the original designation bestowed upon it by the Matinecock Indians, Caumsett (translated as “place by sharp rock”), and would itself later be rechristened Lloyd Harbor as an ode to its residents of some 200 years.

The 1676 acquisition of Horse Neck by James Lloyd, an entrepreneurial Boston-based merchant, preceded its annexation to Oyster Bay of Queens County after he was officially granted its royal patent nine years later. Opting to stay in New England and look after business affairs firsthand, James instead leased this 300-acre plot to local farmers until gifting the neglected property to his son Henry, a 24-year-old shipper until then operating out of Newport, Rhode Island, who relocated and saw to the construction of his post-medieval Manor House (employing slave labor as well as hired hands paid with Bibles, needles, and other tradable commodities) in 1711, the year of Jupiter’s birth.

 

Firmly Fixed His Holy Word

While Jupiter still remains something resembling an enigma, next to nothing seems to be known regarding his parents, other than that Obadiah was literate and had made a number of unsuccessful escape attempts dating back to 1687 when he and Rose were among those comprising the first delivery of subjugated human cargo to the Lloyd estate.

As far as Jupiter is concerned, his warm feelings toward the Lloyd family were repaid in kind, as he was permitted not only personal living quarters within the Manor, but unfettered access to formal education. He attended classes alongside the Lloyd children and maintained a close enough relationship with the sons that he earned their affectionate nickname “Brother Jupiter”. 

Henry supplemented his fortune by continuing his father’s practice of renting parcels of land to be worked by tenant farmhands, and his import/export business also flourished as never before. That business often warranted unaccompanied journeys into New York City by the now fully grown Jupiter, who worked as a clerk when not tilling the fields surrounding the Manor House, to facilitate trade agreements; such was the unthinkable level of respect and trust established between master and servant.

How Jupiter’s Christian faith germinated is not clear, but it would be fed consistently and fervently throughout the decades, as would his general intellectual pursuits, cross-pollinating then blossoming into a historically significant 88-line poem, the first to be published in the yet-to-be liberated American Colonies by a person of African lineage.

An Evening Thought: Salvation by Christ, with Penitential Cries was printed and circulated as a one-sheet broadside in 1761 and contained the momentous byline, Composed by Jupiter Hammon, a Negro belonging to Mr. Lloyd, of Queen’s Village, on Long Island, the 25th of December, 1760.

As the title suggests, it reads like a hymn with the opening stanza:

Salvation comes by Jesus Christ alone,

The only Son of God,

Redemption now to every one,

That love his holy Word,

Dear Jesus, we would fly to Thee,

And leave off every Sin,

The Tender Mercy well agree,

Salvation from our King.

 

In other passages, however, innocuous sounding lines such as:

Ho, every one that hunger hath,

Or pineth after me,

Salvation be thy leading Staff,

To set the Sinner free.

Dear Jesus unto Thee we fly,

Depart, depart from sin.

 

trace the written origins of Hammon’s concept of slavery, which he would soon after fill in with explicit detail and later come under scathing attack for: slavery as almost sacramental atonement for misdeeds perpetrated against the heavenly father, the penance for which was subservience to the slave driver.

 

From Every Sinful Wound

Henry Lloyd died in 1763 and Jupiter, never emancipated, would afterwards live with Henry’s son Joseph, who had a Manor House of his own built on the estate three years later.

Before the British occupation of Long Island, which was made possible by their victory over George Washington’s forces in August 1776, Joseph, a steadfast patriot, fled to Hartford, Connecticut with the other members of the Lloyd family (those who were not Tories) in addition to the Conklins of nearby Huntington.

Jupiter would remain in their company, and with them return once hostilities had ended and true independence won.

An Address to Phillis Wheatley appeared in 1778, in which one is left to wonder whether Hammon’s purpose is to flatter or chastise the “Ethiopian Poetess”.

“Come, dear Phillis, be advis’d

To drink Samaria’s flood,

There’s nothing that shall suffice

But Christ’s redeeming blood.

While thousands muse with earthly toys,

and range about the street,

Dear Phillis, seek for Heaven’s joys,

Where we do hope to meet.”

 

Wheatley herself wrote glowingly of a nearly evangelical deliverance from her native Africa, which she maligns as a “pagan land”, much as Jupiter’s imagery of “a dark abode” mirrors her sentiments here. Their thoughts of one another, whatever they may have been, are not known, and relegated to the oblique lines composed by Hammon.

 

The Blessing of Many Ready to Perish

Jupiter was invited to speak before a meeting of the African Society of New York City on September 24, 1786 and delivered an oration which was published the following year under the title An Address to the Negroes in the State of New York.

The pamphlet was prefaced by an editorial assurance “To the Public” from “The Printers” that the following words “wrote in a better Stile than could be expected from a slave” were indeed those of the author, whose hand-written manuscript, they vowed, was “in our possession”.

Though he begins by intertwining the plights of the slaves and the Jews with a quotation from the apostle Paul that “I have great heaviness and continual sorrow in my heart for my brethren, my kinsmen according to the flesh,” he then turns an abrupt about-face.

“When I think of your ignorance and stupidity, and the great wickedness of the most of you, I am pained to the heart.”

It is shocking to read Jupiter’s assertion that, “for my own part I do not wish to be free”, and though he softens the blow with the following sentiment, “I should be glad if others, especially the young Negroes, were to be free”, he comes full circle by resigning to the fact that “many of us, who are grown up slaves, and have always had masters to take care of us, should hardly know how to take care of ourselves.”

Confessing that, “I have had such desires, a sense of my own ignorance, and unfitness to teach others,” Jupiter (at just shy of 75 years of age) nonetheless says that he feels obliged “to call upon you, with the tenderness of a father and friend, and to give you the last, and I may say dying advice, who wishes your best good in this world, and the world to come.”

In the 250 years that Hammon’s writings have been available for public consumption and examination, his accomplishments as an educated slave and published poet have been eclipsed, particularly in the eyes of contemporary critics, and dimmed considerably by the ignominious upbraiding of his fellow, far less fortunate, slaves during this address.

The first point belabored during his presentation is “Respecting obedience to masters,” elaborating that, “we cannot be happy unless we please them. This we cannot do without obeying them freely, without muttering or finding fault.”

The second “particular I would mention is honesty and faithfulness,” Hammon continued. “We have no right to stay when we are sent on errands any longer than to do the business we were sent upon. All time spent idly is spent wickedly, and is unfaithfulness to our masters.”

Refraining from profanity, specifically from taking “God’s holy name in vain”, will, Jupiter insists, enable those overseen by slave drivers in this world to slip the chains of Satan in the next and “sit with God in his kingdom as Kings and Priests, and rejoice forever and ever.”

Even sexual gratification occurs to Hammon as an evil deed, as “the carnal mind is not subject to the law of God.”

Jupiter submits that “If God has put us in bad circumstances, that is not our fault and he will not punish us for it. If any are wicked in keeping us so, we cannot help it, they must answer to God for it. The same God will judge both them and us.” That said, he also professes, “If God designs to set us free, he will do it in his own time and way.”

 

To Taste Things More Divine

Both Jupiter Hammon and Phillis Wheatley have been taken to task for their belief (some may say apology) that slavery was exercised upon African Americans as a biblical trial, out of which only the most virtuous would arise to reap Heavenly reward. Linked together as colonial sell-outs, if Phillis Wheatley was castigated as the Civil Rights movement’s “Aunt Jemima”, Jupiter Hammon became its “Uncle Tom”.

It is important to bear in mind that their personal experiences were unusual, if not unique, and differed drastically from the common hell on earth shared by many (mostly Southern) bondsmen and women. Neither Phillis nor Jupiter, both slaves of the Northern colonies, knew the weighty burden of shackles and chains, the mistrust or disgust of their masters, the sight and perhaps taste of their own blood drawn by the fist or the whip. While these conditions surely did not erode their capacity for empathy, it was a compassion channeled through a heavy current of pity rather than a true sense of commiseration.

And, as far as Jupiter’s seemingly condescending address is concerned, you will recall that Frederick Douglass likewise cautioned against woeful and wasteful pastimes, writing in his Narrative of the Life, “instead of spending the Sabbath in wrestling, boxing, and drinking whisky, we were trying to learn how to read the will of God; for they had much rather see us engaged in those degrading sports than to see us behaving like intellectual, moral, and accountable beings.”

Not only is he buried in an unmarked grave on the Lloyd estate, but the year of Jupiter Hammon’s death was not recorded and is thus open to historical speculation, which places it most likely in 1806 (making him about 95 at the time), but possibly as early as 1790.

In February 2013, Julie McCown, a student in Cedrick May’s English class at the University of Texas at Arlington’s College of Liberal Arts, was given an archival research assignment centered on Hammon’s Address to the Negroes in the State of New York, during which she and her professor would make a startling discovery.

McCown and May exhumed from the Yale University Manuscripts and Archives Library a never-published and thought-lost manuscript of An Essay on Slavery, written in Jupiter’s own hand. Dating to 1786, the 25-stanza poem is all the more remarkable for the somewhat more somberly defiant overtones not present in the address delivered that same year and conspicuously absent from his first published work a quarter of a century earlier. 

Our forefathers came from Africa

Tost over the raging main

To a Christian shore for to stay

And not return again.

Dark and dismal was the day

When slavery began

All humble thoughts were put away

Then slaves were made by man.

 

What do you think of the article and the views of Jupiter Hammon? Let us know by leaving a comment below…

Sources

  • An Evening thought: Salvation by Christ, with Penitential Cries by Jupiter Hammon (December 25, 1760)
  • An Address to Miss Phillis Wheatley by Jupiter Hammon (August 4, 1778)
  • An Address to the Negroes in the State of New York by Jupiter Hammon (Carroll and Patterson New York, 1787)
  • An Essay on Slavery, With Justification to Divine Providence, that God Rules Over All Things by Jupiter Hammon (1786, published in June 2013 Yale Alumni Magazine)
  • UT Arlington Professor, Graduate Student Discover Poem Written by 18th Century Slave from New York (UT Arlington News Release, February 5, 2013)
  • Jupiter Hammon: A New Appraisal by George Wallace (http://www.poetry.about.com)
  • Narrative of the Life of Frederick Douglass, an American Slave by Frederick Douglass (1960, Belknap Press)
  • http://lloydharborhistoricalsociety.org

 

It may seem strange, but there is very strong evidence that the White House killed a number of presidents in the mid-nineteenth century. The deaths of Zachary Taylor, William Henry Harrison, and James K. Polk are all linked to something in the White House – although many believed that some presidents were poisoned by their enemies. William Bodkin explains all…

A poster of Zachary Taylor, circa 1847. He is one of the presidents the White House may have helped to kill...

President of the United States is often considered the most stressful job in the world.  We watch fascinated as Presidents prematurely age before our eyes, greying under the challenges of the office.  Presidential campaigns have become a microcosm of the actual job, with the conventional wisdom being that any candidate who wilts under the pressures of a campaign could never withstand the rigors of the presidency.  But there was a time, not so long ago, when it was not just the stress of the job that was figuratively killing the Presidents.  In fact, living in the White House was, in all likelihood, literally killing them.

Between 1840 and 1850, living in the White House proved fatal for three of the four Presidents who served.  William Henry Harrison, elected in 1840, died after his first month in office.  James K. Polk, elected in 1844, died three months after he left the White House.  Zachary Taylor, elected in 1848, died about a year into his term, in 1850.  The only occupant of the White House during that period to survive was John Tyler, who succeeded to the Presidency on Harrison’s death.  What killed these Presidents?  Historical legend tells us that William Henry Harrison “got too cold and died” and that Zachary Taylor “got too hot and died.”  But the truth, thanks to recent research, indicates that Harrison, Taylor, and Polk may have died from similar strains of bacteria that were coursing through the White House water supply.


Conspiracies and Legends

On July 9, 1850, President Zachary Taylor, Old Rough and Ready, former general and hero of the Mexican-American War, succumbed to what doctors called at the time “cholera morbus,” or, in today’s terms, gastroenteritis.  On July 4, 1850, President Taylor sat out on the National Mall for Independence Day festivities, including the laying of the cornerstone for the Washington Monument.  Taylor, legend has it, indulged freely in refreshments that day, including a bowl of fresh cherries and iced milk.  Taylor fell ill shortly after returning to the White House, suffering severe abdominal cramps.  The presidential doctors treated Taylor with no success.  Five days later, he was dead.

Taylor’s death shocked the nation.  Rumors began circulating immediately concerning his possible assassination.  The rumors arose for a good reason.  Taylor, a Southerner, opposed the growth of slavery in the United States despite being a slave owner himself.  While President, Taylor had worked to prevent the expansion of slavery into the newly acquired California and Utah territories, then under the control of the federal government.  Taylor prodded those future states, which he knew would draft state constitutions banning slavery, to finish those constitutions so that they could be admitted to the Union as free states.

Taylor’s position infuriated his southern supporters, including Jefferson Davis, who had been married to Taylor’s late daughter, Knox.  Davis, who would go on to be the first and only President of the Confederate States of America, had campaigned vigorously throughout the South for Taylor, assuring Southerners that Taylor would be friendly to their interests.  But in truth, no one really knew Taylor’s views.  A career military man, Taylor hewed to the time honored tradition of taking no public positions on political issues.  Taylor believed it was improper for him to take political positions because he had sworn to serve the Commander-in-Chief, without regard to person or party.  Indeed, he had never even voted in a Presidential election before running himself.

Tensions between Taylor and the South grew when Henry Clay proposed the Compromise of 1850, which offered something for every interest.  The slave trade would be abolished in the District of Columbia, but the Fugitive Slave Law would be strengthened.  The bill also carved out new territories in New Mexico and Utah.  The Compromise would allow the people of the territories to decide whether those territories would be slave or free by popular vote, circumventing Taylor’s effort to have slavery banned in their state constitutions.  But Taylor blocked passage of the compromise, even threatening in one exchange to hang the Secessionists if they chose to carry out their threats.


More speculation

Speculation on the true cause of Taylor’s death only increased throughout the years, particularly after his former son-in-law, Davis, who had been at Taylor’s bedside when he died, became President of the Confederacy.  The wondering reached a fever pitch in the late twentieth century, when a University of Florida professor, Clara Rising, persuaded Taylor’s closest living relative to agree to an exhumation of his body for a new forensic examination.  Rising, who was researching her book The Taylor File: The Mysterious Death of a President, had become convinced that Taylor was poisoned.  But the team of Kentucky medical examiners assembled to examine the corpse concluded that Taylor was not poisoned, but had died of natural causes, i.e. something akin to gastroenteritis, and that his illness was undoubtedly exacerbated by the conditions of the day.

But what caused Taylor’s fatal illness?  Was it the cherries and milk, or something more insidious?   While the culprit lurked in the White House when Zachary Taylor died, it was not at the President’s bedside, but rather, in the pipes.

During the first half of the nineteenth century, Washington D.C. had no sewer system.  It was not built until 1871.  The website of DC Water (the District of Columbia Water and Sewer Authority) notes that by 1850, most of the streets along Pennsylvania Avenue had spring or well water piped in, creating the need for a sanitary sewage system. Sewage was discharged into the nearest body of water or, with literally nowhere else to go, seeped into the ground, forming a fetid marsh.  Perhaps even more shocking, the White House water supply itself was just seven blocks downstream from a depository for “night soil,” a euphemism for human feces collected from cesspools and outhouses.  This depository, which likely contaminated the White House’s water supply, would have been a breeding ground for salmonella bacteria and the gastroenteritis that typically accompanies it.  Ironically, the night soil deposited a few blocks from the White House had been brought there by the federal government.


Something in the water

It should come as no surprise, then, that Zachary Taylor succumbed to what was essentially an acute form of gastroenteritis.  The cause of Taylor’s gastroenteritis was probably salmonella bacteria, not cherries and iced milk.  James K. Polk, too, reported frequently in his diary that he suffered from explosive diarrhea while in the White House.  For example, Polk’s diary entry for Thursday, June 29, 1848 noted that “before sun-rise” that morning he was taken with a “violent diarrhea” accompanied by “severe pain,” which rendered him unable to move.  Polk, a noted workaholic, spent nearly his entire administration tethered to the White House.  Weakened by years of gastric poisoning, Polk succumbed, reportedly like Taylor, to “cholera morbus” a mere three months after leaving office.

The White House is also a leading suspect in the death of William Henry Harrison. History has generally accepted that Harrison died of pneumonia after giving what remains the longest inaugural address on record, in a freezing rain without benefit of hat or coat.  However, Harrison’s gastrointestinal tract may have been a veritable playground for the bacteria in the White House water.

Harrison suffered from indigestion most of his life.  The standard treatment then was to use carbonated alkali, a base, to neutralize the gastric acid.  Unfortunately, in neutralizing the gastric acid, Harrison removed his natural defense to harmful bacteria.  As a result, it might have taken far less than the usual concentration of salmonella to cause gastroenteritis.  In addition, Harrison was treated during his final illness with opium, standard at the time, which slowed the ability of his body to get rid of bacteria, allowing them more time to get into his bloodstream.  It has been noted that, as Harrison lay dying, he had a sinking pulse and cold, blue extremities, symptoms consistent with septic shock.  Did Harrison die of pneumonia?  Possibly.  But the strong likelihood is that the pneumonia was secondary to gastroenteritis.

Nor was this phenomenon limited to the mid-nineteenth century Presidents.  In 1803, Thomas Jefferson mentioned in a letter to his good friend, fellow founder Dr. Benjamin Rush, that “after all my life having enjoyed the benefit of well formed organs of digestion and deportation,” he was taken, “two years ago,” after moving into the White House, “with the diarrhea, after having dined moderately on fish.”  Jefferson noted he had never had it before.  The problem plagued him for the rest of his life.  Early reports of Jefferson’s death even stated that he had died because of dehydration from diarrhea.

Presidents after Zachary Taylor fared better, once D.C. built its sewer system.  The second accidental President, Millard Fillmore, lived more than twenty years after succeeding Zachary Taylor.  But what about the myths surrounding these early Presidential deaths?  They were created, in part, by a lack of medical and scientific understanding of what really killed these men.  With the benefit of modern science we can turn a critical eye on these myths. But we should not forget that myth-making can serve an important purpose beyond simple deception.  In the case of Zachary Taylor, it provided a simple explanation for his unexpected death.  Suspicion or accusations of foul play would have further inflamed the sides of the slavery question that in another decade erupted into Civil War, perhaps even starting that war before Lincoln’s Presidency.  In Harrison’s case, the explanation that he had caught his death of cold without an overcoat helped the country get over the shock of the first President dying in office and permitted John Tyler to establish the precedent that the Vice-President became President upon the death of a President.  In sum, these nineteenth century myths helped the still new Republic march on to its ever brighter future.


What did you think of today’s article? Do you think it was the water that killed several Presidents? Let us know below…


Finally, William's previous pieces have been on George Washington (link here), John Adams (link here), Thomas Jefferson (link here), James Madison (link here), James Monroe (link here), John Quincy Adams (link here), Andrew Jackson (link here), Martin Van Buren (link here), William Henry Harrison (link here), John Tyler (link here), and James K. Polk (link here).


Sources

  • Catherine Clinton, “Zachary Taylor,” essay in “To The Best of My Ability:” The American Presidents, James M. McPherson, ed. (Dorling Kindersley, 2000)
  • Letter, Thomas Jefferson to Benjamin Rush, February 28, 1803
  • Milo Milton Quaife, ed., “Diary of James K. Polk During His Presidency, 1845-1849” (A.C. McClurg & Co., 1910)
  • Jane McHugh and Philip A. Mackowiak, “What Really Killed William Henry Harrison?” New York Times, March 31, 2014
  • Clara Rising, “The Taylor File: The Mysterious Death of a President” (Xlibris 2007)

Nineteenth century poet Margaret Fuller died in a tragic way in 1850. And it was the writer Ralph Waldo Emerson who was perhaps most devastated by the loss. Here Edward J. Vinski looks at the fascinating relationship between them and what happened after Fuller’s passing.

A nineteenth century engraving of Margaret Fuller.

Margaret Fuller

“On Friday, 19 July, Margaret dies on the rocks of Fire Island Beach within sight of & within 60 rods of the shore. To the last her country proves inhospitable to her.” (Emerson, 1850/1982, p. 511)

 

The Margaret to whom Ralph Waldo Emerson referred is Margaret Fuller, a writer and poet associated with American transcendentalism in the nineteenth century. Born in 1810, Fuller was educated under her father’s direction. Timothy Fuller’s tutelage was both intense and, in its own way, fortuitous. He began her instruction in Latin when she was but six years of age. Her lessons would last throughout the day, and young Margaret was often sent to bed overtaxed and unable to sleep. In spite of the nausea, bad dreams and headaches she incurred, Margaret appreciated that he held her to the same standards to which he would have held a son (Richardson, 1995).

Although they had mutual friends, Fuller and Emerson did not meet until the summer of 1836 when Fuller paid a three-week visit to the Emerson home in Concord, Massachusetts. Prior to this, she had attended some of Emerson’s talks and had wished to meet him for some time, but it was only after he read her translation of Goethe’s Tasso that Emerson returned the interest and offered her the long-awaited invitation (Richardson, 1995). Thus began a relationship between the two that would have a profound effect on both of them.

 

Fuller and Emerson

Richardson (1995) has remarked that “Fuller took less from Emerson than either Thoreau or Whitman, and she probably gave him more than either of them” (p. 239-240). Perhaps more than any person other than his deceased first wife, Ellen, Fuller knew best how to pierce the armor of his innermost life. Nowhere is this more clearly evident than in the fact that following their initial meeting, Emerson finished his book Nature which had been drifting toward theoretical idealism. Fuller, according to Richardson (1995), pushed him toward an “idealism that is concerned with ideas only as they can be lived […] with the spiritual only when it animates the material” (p. 240).

Fuller, however, took from Emerson as well.  “From him,” she wrote, “I first learned what is meant by an inward life” (Fuller, n.d., as cited in Bullen, 2012, Chapter V, para 4). She had long searched for an intellectual mentor and by the time of her first visit to Emerson, she was fearful that she might never find one. In Emerson, she found someone with whom she could share her ideas as well as her intimacies. As their relationship developed, however, it became clear that she was requiring even more from Emerson. Since no written record of her requests survives, precisely what she asked of him is difficult to discern. Although married, he was clearly conflicted by his feelings for her. In his journal, he confessed that she was someone “Whom I always admire, most revere and sometimes love” (Emerson, 1841/1914, p. 167), and in a later entry recorded a nighttime river walk with her. Whatever the case may be, it is clear that Emerson’s second wife, Lydian, saw Fuller as a threat (Allen, 1981).

After editing The Dial, a transcendentalist magazine, for several years, Fuller left America for Europe in the summer of 1846 as a correspondent for the New York Tribune. After some time in England, she relocated to Italy with her husband, Giovanni Ossoli[1], a marquis who supported the Italian revolution. Fuller and her husband both took an active role in the revolution, and she chronicled its events in a book she had hoped to publish. When the revolt finally failed, the family, which now included a young son, was forced to return to America. Their ship, the Elizabeth, met with bad luck almost immediately. At Gibraltar, the captain died of smallpox, leaving the ship under the direction of its first mate. In the early morning of July 19, 1850, the ship ran aground on a sandbar a few hundred meters off Fire Island, NY. Later that day, Margaret Fuller, her husband, and her child drowned when the ship broke up.

 

Thoreau’s Mission

News of the disaster reached Concord some days later. On or about July 21, Emerson made the journal entry indicated above. In a letter to Marcus Spring, dated July 23, Emerson wrote:

At first, I thought I would go myself and see if I could help in the inquiry at the wrecking ground and act for the friends. But I have prevailed on my friend, Mr Henry D. Thoreau, to go for me and all the friends. Mr Thoreau is the most competent person that could be selected and […] he is authorized to act for them all (Emerson, 1850/1997, p. 385).

 

Emerson doubted that any manuscripts would have survived the wreck, but knowing that Fuller would have had with her the manuscript to her History of the Italian Revolution, he was willing to pay whatever costs Thoreau might incur in his attempt to salvage it.

Thoreau, for his part, set out immediately. On July 25, he wrote to Emerson describing what details he had learned of the disaster:

…the ship struck at ten minutes after four A.M., and all hands, being mostly in their nightclothes, made haste to the forecastle, the water coming in at once […] The first man got ashore at nine; many from nine to noon. At flood tide, about half past three o’clock, when the ship broke up entirely, they came out of the forecastle, and Margaret sat with her back to the foremast, with her hands on her knees, her husband and child already drowned. A great wave came and washed her aft. The steward had just before taken her child and started for shore. Both were drowned (Thoreau, 1850/1958a, p. 262).

 

Margaret Fuller’s remains and those of her husband were never found. Her son’s body washed ashore, dead but still warm. A desk, a trunk, and a carpet bag were recovered from the scene, but none of Margaret’s valuable papers were found. Thoreau promised to do what he could, holding out some hope that, since a significant part of the wreckage remained where the ship ran aground, some items might still be salvaged, but it is clear that he was not confident.

In a letter to abolitionist and future Senator Charles Sumner, whose brother Horace was also aboard, Thoreau wrote

I saw on the beach, four or five miles west of the wreck, a portion of a human skeleton, which was found the day before, probably from the Elizabeth, but I have not knowledge enough of anatomy to decide confidently, as many might, whether it was that of a male or a female (Thoreau, 1850/1958b, p. 263).[2]

 

After visiting nearby Patchogue, New York, where many of those who scavenged the wreckage instead of attempting a rescue were thought to reside, he returned to Fire Island empty handed.

In all, Thoreau’s mission was unproductive. “I have visited the child’s grave,” he wrote to Emerson. “Its body will probably be taken away today” (Thoreau, 1850/1958a, p. 262). The corpse of her son, a few insubstantial papers, and a button pried from her husband’s jacket by Thoreau himself were essentially Margaret Fuller’s only relics that would return to Massachusetts.

 

Conclusion

The relationship between Emerson and Margaret Fuller is enigmatic. She was not only his intellectual equal, but their interactions suggest “an only slightly erotic relationship, about which he clearly fretted” (Sacks, 2003, p. 51). Although Emerson’s life had been scarred by the losses of many loved ones, Fuller’s death clearly devastated him on many levels. The intellectual impact is obvious in a journal entry around the time of her death. “I have lost in her my audience,” he wrote (Emerson, 1850, p. 512). No longer would the two be able to exchange ideas with one another. It impacted him socially as well.  “She bound in the belt of her sympathy and friendship all whom I know and love,” (p. 511) he wrote. Perhaps he wondered what would happen now that the belt had been broken. But was there, in fact, something deeper? “Her heart, which few knew, was as great as her mind, which all knew,” (Emerson, 1850, p. 511-512). Emerson clearly knew her heart more intimately than most.

Why did Emerson dispatch Thoreau to Fire Island and not go himself as he had initially planned? Ostensibly, he wanted to begin work, at once, on a memorial book in Fuller’s honor. We may, however, speculate that there were deeper reasons as well. Years earlier, Emerson had opened the coffin of his first wife, Ellen, who had died of tuberculosis fourteen months before. While he gave no explanation for his action, it seems that he needed to view her decomposing corpse to somehow convince himself of the soul’s immortality (Richardson, 1995). This event marked a turning point in his life. His focus shifted from death to life, from the material to the ideal. 

The death of Margaret Fuller marked another profound turn. Ellen’s death due to illness, while tragic, was predictable. Fuller’s death was unexpected, and he would struggle mightily to recover from it. He became acutely aware of his own mortality. “I hurry now to my work admonished that I have few days left,” he wrote (Emerson, 1850/1982, p. 512). Fuller, who had pushed Emerson to focus on the spiritual as it animates the material was now, herself, inanimate. Emerson might well have stayed in Concord because he somehow sensed that the trip would be fruitless. It might also be that he could not bear the thought of once again standing over the lifeless body of a woman he loved.

 

Postscript

Years later, a small monument to Margaret Fuller was erected on the Fire Island beach not far from the wreck site. It stood as a memorial to a remarkable woman for 10 years. Then, it too was claimed by the sea (Field, n.d.).

 

What do you think of the article? Let us know by leaving a comment below…

 

References

  • Allen, G.W. (1981). Waldo Emerson. NY: Viking.
  • Bullen, D. (2012). The dangers of passion: The transcendental friendship of Ralph Waldo Emerson and Margaret Fuller. Amherst, MA: Levellers Press (Kindle Fire Version). Retrieved from http://www.amazon.com
  • Emerson, R. W. (1841/1914). Journal entry. In B. Perry (Ed.).The heart of Emerson’s journals. Boston: Houghton Mifflin.
  • Emerson, R.W. (1850/1982). Journal entry. In L. Rosenwald (Ed.). Ralph Waldo Emerson: Selected Journals 1841-1877. NY: Library of America.
  • Emerson, R.W. (1850/1997). Letter to Marcus Spring. In J. Meyerson (Ed.). The selected letters of Ralph Waldo Emerson (p. 358).  NY: Columbia University Press.
  • Field, V. R. (n.d.). The strange story of the bark ELIZABETH. http://longislandgenealogy.com/BarkElizabeth.html
  • Richardson, R. D. (1995). Emerson: The mind on fire. Berkeley, CA: University of California Press.
  • Sacks, K.S. (2003) Understanding Emerson: “The American Scholar” and his struggle for self-reliance. Princeton, NJ: Princeton University Press.
  • Thoreau, H.D. (1850/1958a). Letter to Ralph Waldo Emerson. In W. Harding & C. Bode (Eds.). The correspondence of Henry David Thoreau (pp. 262-263). NY: NYU Press.
  • Thoreau, H.D. (1850/1958b). Letter to Charles Sumner. In W. Harding & C. Bode (Eds.). The correspondence of Henry David Thoreau (p. 263). NY: NYU Press.

 

Footnotes

1. There is some question as to whether they were officially married.

2. Thoreau would incorporate some of his memories from this mission, including that of the skeleton, into his book Cape Cod.


The banjo has a popular place in American culture. But few people know of the instrument’s complex roots. In this article, Reed Parker discusses how a banjo-like instrument was originally brought to the US by African slaves before being remodeled, and explores the complex cultural interactions between different groups and the banjo…

The Banjo Player, a painting by William Sidney Mount from 1856.

In 2005, the first Black Banjo Gathering took place at Appalachian State University in Boone, North Carolina. The purpose of the gathering was to celebrate the tradition of the banjo and bring awareness to the fact that, even though the banjo has become an emblem of white mountain culture, it is an African instrument at its core. The banjo as we know it today has a decidedly tragic origin story.

 

From Africa to America

Over the last few centuries, the banjo has secured a spot in the canon of traditional American music. In the time before the American Revolution, minstrels became a popular form of entertainment, and they often played an early relative of the banjo known as a banjar.

Other relatives of what would eventually become the banjo existed in many different areas of West Africa. There is the ngoni, which had anywhere from three to nine strings, the konou, which has two strings, and the juru keleni, which has just one string. One of the most elaborate of these variations is the kora which has 21 strings and leather straps tied to the pole neck to hold the strings in place. These predecessors are still being played today in their native lands.

The direct predecessor of the banjo, most commonly known as a banjar, arrived on the slave ships that came from West Africa in the 17th century. The instrument was made from half of a gourd with animal skin stretched over it and a pole that acted as a neck. The strings of the banjar were made from waxed horsehair or from the intestines of animals, most commonly cattle or goat. The intestinal strings were referred to as catgut or simply gut strings. The banjar was easily constructed because the materials required were easy to find. Eventually the instrument evolved to include tuning pegs and a flat fretboard in place of the pole neck. This allowed for notes to be manipulated with slides and bends.

 

The banjar in the US

In West Africa, “talking drums” were a common method of long distance communication. This tradition was carried across the ocean to the plantations. In 1739, drums and brass horns were outlawed in the colonies as a result of the Stono Insurrection in which slaves on a South Carolina plantation coordinated an uprising against their slave owners. They had used these instruments to communicate the plan. Prior to this, ensembles of brass horns, drums, and banjars were quite popular. Afterward, however, solo banjar acts became more popular.

A sad reality of this time in the banjar’s life is that its burgeoning popularity had a lot to do with traveling white minstrels who would perform in blackface. The banjar acted as a prop for the minstrels to use in their acts, acts that often satirized aspects of African culture that were brought to the US. It is also theorized that some white old time musicians learned the oral tradition directly from black banjo players and merely wanted to continue the tradition, instead of satirizing it.

By the early 1800s, the European fiddle music that settlers brought over with them and African banjar music were beginning to mutually influence each other. The style of banjar play that started to emerge at this time was known as thumping, which would evolve to become the clawhammer or “frailing” style, a style that combines rhythm and melody into one strumming pattern using a claw-shaped hand position.

 

The arrival of the banjo

Joel Sweeney, a Virginia man of Irish descent, has been credited with either inventing or popularizing the earliest form of the modern banjo which features five strings, an open back, and a wooden rim. His contributions are contested and some claim that it was actually the fourth string that was Sweeney’s invention and that the fifth came later.

Around the middle of the nineteenth century, minstrel groups traveled to Britain, spreading the banjo’s influence over the musical landscape. At the same time, the now booming steamboat travel business put African slaves, on lease from their owners, together with Irish and German immigrant laborers. These marginalized groups would entertain each other with jigs and reels. The mutual influencing continued into the Civil War era and the musical pairing of the banjo and fiddle became and would stay the most popular in the Appalachian region into the twentieth century.

Fortunately, other events outside of blackface minstrel shows were developed to showcase banjo skill. Banjo contests and tournaments were held at a multitude of venues including bars, race tracks, and hotels. Before the Civil War, the contestants were almost exclusively white, but blacks began making an appearance when the war was over.

Further changes to banjo construction were made around this time such as tension rods and wire strings. Tension rods, or truss rods, were implemented to provide the ability to adjust the neck if it warped from dryness or humidity. Wire strings were a cheaper alternative to gut strings, but they were largely dismissed at first for the buzzing they produced.

In the early 1900s, full string bands began to emerge. These groups added a fuller sound to the banjo/fiddle duos with the addition of guitar, upright bass, mandolin, and sometimes other instruments. That is not to say that banjo/fiddle duos were replaced entirely though. Many loyal traditionalist Appalachian banjo players, such as Roscoe Holcomb and Fred Cockerham, continued to play solo or with fiddle accompaniment. Also around this time, playing styles emerged that were starkly different from the Appalachian clawhammer style. Where clawhammer used the thumb and index finger, these styles used three-finger picking patterns that allow a higher volume of notes to be played in a short amount of time. These picking styles are collectively referred to as bluegrass style.

Through the mid-1900s, the banjo was used to evoke Appalachian imagery in contemporary folk and country music as well as pop culture. For example, the theme songs to the television show The Beverly Hillbillies and the film Deliverance became earworms that spread to a mainstream audience, even though their appeal was somewhat of a novelty.

 

The modern age

According to Robert Lloyd Webb, author of Ring the Banjar!, a major turning point for the banjo came in 2000 with the release of the film O Brother, Where Art Thou? The film’s Grammy-winning soundtrack was full of traditional music and was able to garner a more universal appeal. Among those captivated by the soundtrack were members of the band Mumford & Sons who, when they formed, began featuring the banjo in their Pop-Americana sound.

Additionally, celebrities such as Steve Martin and Ed Helms, whether inadvertently or not, have given mainstream credibility to the instrument. Martin, who has been playing the banjo for more than fifty years, has been touring extensively recently in support of his bluegrass albums. Helms recently put out a record with his group The Lonesome Trio and during his time on the sitcom The Office, his character Andy Bernard was shown playing the banjo.

The story of the banjo is a bitter one because of its slavery and racism-laden roots. Lately efforts have emerged for the history to come full circle. In addition to the Black Banjo Gathering, bands like The Carolina Chocolate Drops are reviving old minstrel-style music that consists of a banjo, a fiddle, and a set of bones (a percussion instrument traditionally made from animal bones, but now more often from wood).

The banjo has proven itself to be a versatile instrument appearing in the genres of folk, bluegrass, country, and traditional, as well as jazz, swing, and blues. Deering banjos, one of the most popular manufacturers in the United States, has reported a surge in sales since 2011. Hopefully the growth in the banjo’s popularity will lead to a further fleshing out of its history.

 

Did you find this article interesting? If so, tell the world. Tweet about it, like it, or share it by clicking on one of the buttons below!


The Red Ball Express was a supply line that was set up to ensure that the Allied troops who invaded France in 1944 were well supplied. It wasn’t just any supply line though; it was vital to the Allies’ advance against Nazi Germany in the latter months of 1944… Here, Greg Bailey tells this World War Two story.

A Red Ball Express convoy is waved on near Alençon, France. September 1944.

Like the Pony Express, whose legend has lasted far longer than its short history, the Red Ball Express, the vital supply line across France supporting the Allies’ war-effort against Germany, has earned a well-deserved heroic reputation. The around-the-clock stream of truck convoys was as important as any battle fought in World War II.

The Red Ball Express was created on the battlefield to solve an unforeseen but welcome development. The planners of D-Day anticipated there would be enough supplies, primarily gasoline, to support the advancing combat units while engineers completed a gas supply line from the Normandy landing area to the rear of the combat area. For a time, as the Allies slowly fought their way through difficult hedgerow country, the supplies piled up. But after Bradley’s forces broke through the German lines, General George Patton saw an opening and aggressively took it. He charged across France and the army soon began to run out of supplies. By mid-August Patton had to slow down his advance for lack of fuel. The gasoline and other supplies his men needed were piled up far from the front. “My men can eat their belts,” Patton said, “but my tanks gotta have gas.”  The solution was a special unit running on designated roads to move the supplies. Borrowing the name from the railroads, the Red Ball Express was born.

 

The Express at work

The Red Ball Express only ran from the end of August to the middle of November 1944. Men and trucks from scattered units were hurriedly brought together.  During those few months the convoys, running on designated roads marked by red ball signs, hauled more than 400,000 tons of materials from the Normandy beaches to the ever-changing front lines of the Allied campaign. The loads included ammunition, medical supplies and food, but above all gasoline in five gallon jerricans that was needed to keep the fuel hungry tanks and other vehicles advancing toward the enemy. Patton called the operations of the Red Ball Express “our most important weapon.”

Patton’s most important weapon was a combination of one of the best examples of American ingenuity and one of the most shameful episodes of American history.  Although the army used several models of truck during the operations, the mainstay was the two-and-a-half-ton Jimmie, which had a five-ton cargo capacity.  A no-frills version of a civilian truck, the Jimmie was designed to be easily and quickly assembled. With simple, interchangeable parts, mechanics were able to swap out engines and transmissions by the side of the road during the Red Ball Express’ operations, often under enemy fire. Tires were a problem, often flattened on the road by discarded C-ration cans.  Under these tough conditions, each Jimmie had a life expectancy of less than a year.

 

Valiance in the face of Discrimination

What really pushed the operation was the men driving and repairing the trucks. Three-quarters of the Red Ball Express personnel were African Americans serving in all-black units with white officers over them, barred from serving in combat under the segregation laws of the time. The white troops lived in separate quarters and were kept away from their comrades during and after duty.  British Major General H. Essame said: "few who saw them will ever forget the enthusiasm of the Negro drivers, hell-bent whatever the risk, to get General Patton his supplies."

Despite the sting of discrimination, the men charged with the vital supply mission went above and beyond. On an average day 83 transportation units operated almost 900 trucks on the network of roads closed to all other military or civilian traffic.  On paper the speed limit for the five-truck convoys was 25 mph, with each truck spaced at 60-foot intervals. In reality drivers disabled the governors on the truck engines to exceed the posted speed limits, and the trucks were sometimes overloaded above their five-ton capacity.

During the first days of the Express, as the front lines nearly ran out of supplies, drivers set out with maps torn from the pages of the Stars and Stripes newspaper.  And while the route was a solid line on a map, in reality the roads were narrow and twisting, pockmarked with battle damage, running through fields of dead livestock and past hidden snipers. The trucks ran at night with obscured headlights, soon called cats’ eyes. Along the roads drivers passed the remains of trucks wrecked in accidents or destroyed by enemy fire.

Indeed, although the Red Ball Express was officially a non-combat unit, drivers were drawn into battles. Some of the trucks were fitted with .50 caliber machine guns and all of the personnel carried rifles with them. In these battles, black drivers left their trucks and fought alongside white soldiers, and then returned to their second-class status behind the wheels of their trucks marked with bullet holes. Against these hazards the Red Ball Express pushed on, with drivers completing the average 600-mile round-trip with little or no rest.

 

The murkier side

There was a dark side to the operation. In his 2000 book The Road to Victory author David Colley tells how bottles of premium French wine were traded for far more valuable cans of gasoline. Prostitutes along the way accepted jerricans as payment.  A few fully loaded trucks disappeared into the Paris black market under the unchallenged story that the trucks were destroyed by enemy fire.

By November other supply lines, including pipelines and secured ports and rail lines, had taken over the task of the Express. The Red Ball Express trucks were using a great amount of fuel to deliver gas to the increasingly distant destinations. The Red Ball Express had completed its mission. Other operations ran on other routes, but the Red Ball Express image lived on in part because of the red circles on the transportation units’ insignia.

 

Tributes

After the war the Red Ball Express was celebrated in the Broadway musical Call Me Mister. “Steam was hissing from the hoods when they showed up with the goods. But they turned around and went back for more.”  A wildly inaccurate film on the Red Ball Express was released in 1952 starring actor Jeff Chandler leading mixed white and black crews on trucks through burning villages to deliver gas to stranded tank crews. An equally inaccurate sitcom on the Express ran for a short time on CBS in the 1970s.

But perhaps the most sincere tribute was expressed by the simple words of Allied Supreme Commander Dwight Eisenhower. After calling the Red Ball Express the “lifeline between combat and supply”, Eisenhower said:

To it falls the tremendous task of getting vital supplies from ports and depots to the combat troops, when and where such supplies are needed, material without which the armies might fail. To you drivers and mechanics and your officers, who keep the ‘Red Ball’ vehicles constantly moving, I wish to express my deep appreciation. You are doing an excellent job.

 

Did you enjoy this article? If so, tell the world. Tweet about it, like it, or share it by clicking one of the buttons below.

 

Greg Bailey is a history writer from St. Louis. His book The Voyage of the F.H. Moore and Other 19th Century Whaling Accounts was published last year.