France first started to colonize Algeria in 1830, and its influence there grew over the following century; after World War II, however, mounting pressure for Algerian independence ultimately resulted in the Algerian War of Independence. Here, Will Desvallees tells us about French colonialism in Algeria and its lasting impacts on contemporary France.

A depiction of the 1836 Battle of Constantine in Algeria. The French lost this battle, but ultimately took control of Algeria.


In 1945, WWII came to an end, but the European presence in North Africa did not, and tensions between settlers and local populations grew in the years that followed. In the case of Algeria, a “malaise politique”[1] set in between Algerians and French settlers. Eventually, this deteriorating relationship would push Algeria to achieve independence from France in 1962. Under French control, Algerians suffered. Questions, ambitions, and public sentiments regarding national identity animated the conflict, which became increasingly violent in nature. The story of the Algerian War (1954-1962) and the history of Franco-Algerian relations before the conflict reveal how French colonialism took root and operated. The history, however, continues to resonate: the war’s cascading effects are present in the disturbing rise of anti-Semitic and anti-Islamic politics in contemporary France. The foundations of twentieth-century French nationalism are rooted not only in the civic commitments to liberté, égalité, and fraternité, but also in the suffering the French inflicted upon Algerians in defense of their imperial acquisition. In the last ten years, France has seen a rise in violence and nationalist far-right ambitions, much of which can be linked to the human rights abuses, violence, and torture Algerians underwent at the hands of French colonial forces as they sought independence.

 

France in Algeria

French involvement in Algeria began in 1830, when France took direct political control of port cities on the Algerian coast, seeing in the territory a vast supply of raw materials for its nascent industry and the starting point of a process of cumulative expansion. In addition to natural resources such as oil, the Algerian territory was ideal for wine production and other agricultural products.[2] The years that followed brought an increasing number of French settlers and a growing French presence: “By 1930, the lands resulting from this official colonization represented 1,500,000 of the 2,300,000 hectares owned by Europeans.”[3] French colonization of Algeria is only one example of the broader rise of European imperialism, as white settlers subjugated “natives” across the Global South. In 1919, the first Algerian social movement for independence was created under the leadership of Ferhat Abbas (1899-1985), and it would send representatives to the League of Nations to argue on behalf of Algerian independence. In the first half of the twentieth century, rightist ideology in European countries grew in response to social inequality, and the Algerian movement expanded in reach and popularity in response.

Following the Second World War, given Algeria’s economic dependence on French subsidies, the Algerian colonial economy was devastated: “The wine, grain, and livestock industries collapsed leaving an impoverished, unemployed proletariat of 10 million Muslims governed by an increasingly French colonial state” (Hitchcock 2003, 184). If the French were to stay in Algeria, how could they let its people suffer? Algerian resentment began to rise. In 1945, in a series of articles published in the French daily newspaper Combat, Albert Camus depicted the rising strength of Algerian opposition to French rule, diagnosing a “malaise politique” (political malaise):

The Algeria of 1945 is drowning in an economic and political crisis that it has always known, but that had never before reached this degree of acuity. In this admirable country, which an unequaled spring is at this moment covering with its flowers and its light, men are suffering from hunger and demanding justice. These are sufferings that cannot leave us indifferent, because we have known them ourselves.[4]

 

Growing tensions

Camus wrote this piece in mid-May, roughly a week after the beginning of a violent French reassertion of control on May 8, 1945, the very day France celebrated its own liberation. That day, Algerians began to protest in large numbers, and the French did not hesitate to use violence against those who participated in the demonstrations. One group of Algerians killed some twenty Europeans in turn. That month, in an effort to retaliate and demonstrate their strength, the French killed thousands of Algerians, and tensions between Algerian nationals and French authorities reached a tipping point: “Over a hundred Europeans died during this month of insurrection, Algerian deaths are unknown, but have been estimated at between 6,000 and 8,000.”[5]

One of the main concerns of French armed forces in Algeria can be traced to the military defeats they had suffered in Vietnam, largely because they were unprepared for the guerrilla tactics of the Viet Minh. That paranoia pushed the French military to employ ever more violent means of maintaining control in Algeria, using excessive force in an attempt to avoid repeating the defeats it had suffered in Indochina.

While France was winning the war militarily in Algeria in the late 1950s, the French public grew increasingly opposed to the methods of torture used by French military personnel there, which were exposed in lurid detail by numerous French publications. Among those covering the war was Claude Bourdet, a journalist for France Observateur, who in an article entitled “Votre Gestapo d’Algérie” (“Your Gestapo in Algeria”) gave his readers examples of the brutality employed by the French military: “impalement on a bottle or a stick, and blows with fists, feet, and bullwhips are not spared either. All of this explains why the torturers hand prisoners over to the judge only five to ten days after their arrest.”[6] In his article, Bourdet referred to French military officers as “Gestapistes,” drawing for a French public that had only very recently lived under Nazi occupation a sharp comparison between the methods used by French authorities and those employed by the German secret police.

 

Frantz Fanon on colonialism

Similar coverage in the French mass media created a snowball effect of domestic discontent and opposition to the war in Algeria. Indeed, the hypocrisy of employing Nazi-associated torture methods so soon after the ruthless devastation France had faced during WWII did not escape an increasingly conscious French public. The brutality of French colonial administration after WWII, in Indochina and Algeria, and the associated atrocities committed against “natives” pushed Frantz Fanon, a French psychiatrist and political philosopher from Martinique, to write The Wretched of the Earth. He published this work as France was finalizing the last stages of its official exit from Algeria. In the first part of the work, entitled “On Violence,” Fanon focuses on the vital role of violence as a necessary tool for activists fighting for independence. Basing his argument principally on current events and the recent history of Algeria, Fanon paints a portrait of decolonization as a violent process no matter where it occurs or who is involved. He relates this tendency to the colonial structure itself, which he defines as the presence of a native population inevitably dehumanized by the settlers. Two foundational principles come out of his work to explain the long-term impact of colonization. First, he explains that colonization is the replacement of one population by another. Second, he describes how natives, knowing that they too are human, develop progressively deepening rebellious and resentful attitudes towards the settlers. Camus was warning the French public of this in 1945 when he described the “malaise politique” he perceived growing rapidly in Algeria between the settlers and the settled.

Fanon also explains that the colonial process divides the native population into three distinguishable groups: native workers, valued by the settlers for their labor; “colonized intellectuals,” a term he uses for the more educated members of the native population, recruited by the settlers to convince natives that the settlers are acting properly; and the “lumpenproletariat,” a term Fanon borrowed from Marxist theory to refer to the least-advantaged classes of the native population. He explains that this third, least advantaged group will naturally be the first to use violence against the settlers, as they suffer the worst effects of colonization: “The native who decides to put the program into practice, and to become its moving force, is ready for violence at all times. From birth it is clear to him that this narrow world, strewn with prohibitions, can only be called in question by absolute violence.”[7] The long-term effects Fanon focused on help to explain the cultural and human impact of colonization. French violence during the occupation of Algeria, followed by the French-Algerian war, would have devastating long-term impacts on Algerian nationals and the generations to follow:

In ‘On Violence’, Fanon highlights the mechanisms of the colonized violence against themselves. (…) The exacerbated militarization of the ‘indigenous sector’ in Algeria manifests itself physically in the de-humanization of the colonial subjects who turn the colonial violence and repressed anger against themselves (madness, suicide) or against each other (physical fights, murder) in a desperate attempt to extricate themselves from and escape the sordid reality of colonialism.[8]

 

 

Fanon’s work is important not only in explaining the violence that Algerians, as the colonized, needed to use to fight for their independence, but also in highlighting the internal social and cultural devastation that would breed violence among Algerians themselves. Fanon suggests that the impact of colonialism can be linked directly to violence between the colonists and the natives, and indirectly to violence among the natives themselves, born of the frustration, pain, and suffering Algerians felt under colonial rule.

Fanon was an outspoken supporter of Algerian independence from France and of the FLN’s operations to accomplish this goal: “The immobility to which the native is condemned can only be called into question if the native decides to put an end to the history of colonization - the history of pillage - and to bring into existence the history of the nation - the history of decolonization.”[9] Fanon’s unique and powerful reflection on colonial violence and the long-term effects of colonization served to enlighten the French people about what was taking place in Algeria and why it needed to come to an end. Eventually, public attitudes and the seemingly endless violence in Algeria would push French President Charles de Gaulle to move towards granting Algeria independence and to put an end to French involvement in the region.

 

Charles de Gaulle’s impact

General Charles de Gaulle, who was elected president of France in 1958, made it one of his main responsibilities to move France out of Algeria as peacefully as possible. His plan consisted of a gradual withdrawal of French military personnel from Algeria, with the goal of keeping what remained of the relationship between the two countries as strong as possible. While he chose not to exit Algeria abruptly, de Gaulle wanted Algeria to be decolonized and eventually to declare its independence, all while preserving whatever relationship the two countries had had before the years of war: “Depending on one’s politics, the endgame that de Gaulle played in Algeria may be seen as the brilliant management of an explosive crisis in which he brought France to accept the inevitability of Algerian independence.”[10] De Gaulle brought the conflict to an end in 1962, when France formally recognized Algeria as an independent nation. On July 1, 1962, a referendum was held in Algeria with 6,549,736 registered voters. As the electoral commission recorded: “Consequently, the Central Commission of Control for the referendum notes that to the question ‘Do you want Algeria to become an independent state, cooperating with France under the conditions defined by the declarations of March 19, 1962?’, the voters responded in the affirmative by the majority indicated above.”[11] The declarations this referendum question refers to are the conditions of a structured French exit from Algeria under which both countries could continue to maintain a mutually positive relationship. Of those registered, 5,992,115 (91.5%) cast a ballot, and 5,975,581 (91.2% of registered voters) answered the question in the affirmative. In 1962, Algeria had an estimated population of approximately 11.62 million, which means that a large majority of the Algerian adult population participated in the referendum. The results were thus significant in showing the extent to which Algerians felt they had suffered under French control and supported a new, independent Algerian nation.

Many viewed France’s withdrawal from Algeria as another military defeat, like the one suffered in Indochina, and this perception became one of many factors that laid the foundations for a successful right-wing nationalist political party:

The purged collaborators of Vichy France joined virulent anti-communists and those disillusioned by the weakness of the Fourth Republic (1945-1958) to form a ready clientele for antisystem nationalist movements. The impetus for the Radical Right in postwar France was seventeen years of unsuccessful colonial war, first in Indochina (1945-1954) and especially in Algeria (1954-1962).[12]

 

After independence

Post-independence relations between Algeria and France led to a massive increase in legal migration of Algerians into France. The 1960s and 1970s naturally became a period in which many first-generation French citizens were born to non-French parents, accompanied by an increase in the number of mosques and Muslim establishments in France. A growing number of French families looked unfavorably on this transformation of the ethnic makeup of France’s population. The Front National’s (FN’s) rise can largely be connected to these trends, and Algeria was the principal country from which Muslims from the Maghreb immigrated to France. In 1999, Algerians were still the largest immigrant population in France, at 576,000. Today, more than 8.8% of the French population is Muslim, many of them second- or third-generation descendants of individuals who migrated from the Maghreb in the 1960s. In recent years, the resurgence of the Front National has come largely in response to the arrival of millions of Muslim migrants, many of them political refugees from Syria and other countries.

The French-Algerian War carried on for eight years - eight years of bloodshed in which hundreds of thousands of people died, the majority of them under-resourced and outmatched Algerian nationals. The violence and oppression natives experienced during this time has carried a burden for the generations that followed, a burden perpetuated and reinforced by Islamophobia and highly conservative views on immigration. Once Algeria declared its independence in 1962, many Algerians immigrated to France, making them the largest population of Muslim immigrants from North Africa. While one must be cautious about speculation, a link can certainly be established between far-right ideology, its resurgence in recent decades, and French colonial history. The implications of colonialism, as Fanon explains, can only lead to violence and long-term animosity between settlers and natives. The long-term systemic oppression facing French Muslim citizens of North African descent, perpetuated and reinforced by the populist far right of France, is the outcome Fanon forecast in 1961, and it is symbolic of the stigmatizing views shared by so many in our world today.

 

What do you think of France’s actions in Algeria? Let us know below.


[1] Camus, Albert. “Le Malaise Politique.” (Paris: Combat, 18 May 1945).

[2] William I. Hitchcock, The Struggle for Europe: the Turbulent History of a Divided Continent, 1945 to the Present (New York: Anchor Books, 2003), 184.

[3] Marie Fauré, La Guerre d’Algérie: La France aux Remous de la Décolonisation (Ixelles: Lemaitre Publishing, 2017), 7.

[4]  Camus, Albert. “Crise en Algérie,” Combat, 13 May 1945.

[5]  Hitchcock, The Struggle for Europe, 185.

[6] Bourdet, Claude. “Votre Gestapo d'Algérie.” France Observateur, 13 January 1955.

[7] Fanon, Frantz, Richard Philcox, Jean-Paul Sartre, and Homi K. Bhabha. The Wretched of the Earth. (Cape Town: Kwela Books, 2017), 37.

[8] Sajed, Alina. Postcolonial Encounters in International Relations: The Politics of Transgression in the Maghreb. (Taylor & Francis Group, 2013).

[9]  Fanon, Frantz, Richard Philcox, Jean-Paul Sartre, and Homi K. Bhabha. The Wretched of the Earth. (Cape Town: Kwela Books, 2017), 51.

[10] Hitchcock, The Struggle for Europe, 189.

[11]  Sator, Kaddour. Proclamation Des Résultats du Référendum D'Autodétermination Du 1er Juillet 1962. (Algerie: Commission Centrale de Contrôle Electorale, 3 July 1962.)

[12] Paxton, Robert O. The Anatomy of Fascism. (Vintage Books, 2005), 177.

Secondary Sources

Fauré, Marie, and 50 Minutes. La Guerre D'Algérie: La France Aux Remous De La Décolonisation. Vol. 47, (Lemaitre Publishing, 2017).

Hitchcock, William I. The Struggle for Europe: The Turbulent History of a Divided Continent, 1945 to the Present. (Anchor Books - A Division of Random House, Inc. New York, 2003).

Howell, Jennifer. The Algerian War in French-Language Comics: Postcolonial Memory, History, and Subjectivity. (Lexington Books, 2015).

Saada, Emmanuelle, and Arthur Goldhammer. Empires Children: Race, Filiation, and Citizenship in the French Colonies. (The University of Chicago Press, 2012).

Sajed, Alina. Postcolonial Encounters in International Relations: The Politics of Transgression in the Maghreb. (Taylor & Francis Group, 2013).

Silverman, Max. Frantz Fanon's Black Skin, White Masks: New Interdisciplinary Essays. (Manchester University Press, 2017).

 

Primary Sources

Boualam, Bachaga. Mon Pays… La France. Paris, France: (Editions France-Empire, 1962).

Bourdet, Claude. “Votre Gestapo d'Algérie.” (France Observateur, 13 January 1955).

Camus, Albert. “Crise en Algérie.” (Combat, 13 May 1945).

Camus, Albert. “Des Bateaux Et De La Justice.” (Combat, 16 May 1945).

Camus, Albert. “Le Malaise Politique.” (Combat, 18 May 1945).

Sator, Kaddour. Proclamation Des Résultats du Référendum D'Autodétermination Du 1er Juillet 1962. (Commission Centrale de Contrôle Électorale, 3 July 1962).

 Fanon, Frantz, Richard Philcox, Jean-Paul Sartre, and Homi K. Bhabha. The Wretched of the Earth. (Cape Town: Kwela Books, 2017).

The Korean War (1950-53) pitted capitalist South Korea, backed by the U.S., against communist North Korea. A major offensive by U.S.-led United Nations and South Korean forces into the North ultimately led to Communist China becoming involved in the war.

Here, Victor Gamma considers how failures in the U.S. understanding of China’s intentions, and overconfidence that China would not intervene, led U.S.-backed forces to move very close to the Chinese border.

You can also read Victor’s article on Henry VIII’s divorce of Catherine of Aragon here.

Fighting with the 2nd Inf. Div. north of the Chongchon River, Sfc. Major Cleveland, weapons squad leader, points out an enemy position to his machine gun crew. November 20,1950.


Seventy years ago, on June 25, 1950, the Cold War turned hot when North Korean forces launched a surprise attack across the 38th Parallel. The communist attempt to violently take over the entire peninsula was thwarted by the firm U.S. response and General Douglas MacArthur's brilliant counterattack at Inchon that September. Now it was the turn of the U.S.-led UN and South Korean forces to attempt the unification of the Korean Peninsula on their terms. UN forces moved towards the Yalu River, and victory seemed within sight. These hopes, however, were shattered when massive, skillfully handled attacks by Chinese forces beginning on November 25 inflicted a disastrous defeat on UN forces, including the longest retreat in U.S. military history. The attacks on what the Chinese called the Western Sector came to be known as the battles of the Ch'ongch'on River, among the most decisive engagements of the post-World War Two era. How could the world’s premier superpower be taken by surprise? How could the nation with arguably the best intelligence-gathering capacity in the world be completely fooled? The answer is complex. It includes persistent miscalculations by the American military and political leadership, preconceived assumptions regarding communist China, ignorance of local conditions and China's interests, the refusal of American officials to deal with the Communist Chinese or to take their government seriously, and an ongoing failure to correctly interpret intelligence.

 

Mistrust and Misunderstanding

Part of the problem can be traced to the atmosphere of heightened tension and suspicion that existed between East and West. This suspicion led to a breakdown in communications and effectively clouded understanding. The West viewed the communist world as a monolith; the bickering between China and the Soviets was unknown to Western decision-makers. The Korean attack was viewed as an attempt by Russia to gain advantage, and it was the Russians whom the United Nations had to resist. The West's suspicions are understandable: had not Kim Il-Sung been an officer in the Red Army? Hadn't the Soviets accumulated a track record of forcing 'friendly' regimes on every state that came under their dominance? Hadn't the Red Army left behind a hardline communist state devoid of free elections in North Korea? However, documents declassified in 1992 fail to support the contention that Kim Il-Sung was merely a puppet acting under Stalin's orders. On the contrary, post-Cold War research revealed that although Stalin ultimately gave his approval and was willing to aid Kim, he was not the initiator of the attack. Valentin Pak, Kim's translator, who read Stalin's communications to Kim, stated emphatically that Stalin did not encourage an attack on the South.

Dr. Kathryn Weathersby examined declassified Soviet documents and affirmed that Stalin thought it unwise to initiate hostilities in Korea. None of this was known to U.S. policy-makers, because American intelligence failed utterly to investigate the nature of the Soviet-Korean relationship; instead, the Americans went into the conflict viewing Kim as a Soviet proxy. The reasons for Stalin's final go-ahead to Kim are still unclear. Whatever he may have hoped to gain by a communist victory, his contribution to the effort was relatively meager, and he did not want to associate himself too closely with it.

At first, China did not appear strongly concerned, but the Chinese had several motives for intervening in Korea. First, they were alarmed by the rise of anti-communist sentiment in the West and saw the UN advance toward Manchuria as a military expression of it. North Korea was also deemed vital by the Chinese leadership, which had plans for large-scale industrial growth; hydroelectric resources in North Korea were considered essential to these plans. U.S. officials failed to grasp these local conditions and took little interest in Korea beyond viewing it as merely one more piece in the massive chess game against Soviet expansionism.

Rather than prompting talks with the Chinese, the animosity and mistrust that characterized East-West relations not only poisoned any attempt at accommodation but effectively disrupted clear interpretation of Chinese or Soviet intentions. Throughout the period before open Chinese intervention (June 25 - November 25, 1950), communication between China and the West was hampered by the absence of formal relations with Communist China. Any knowledge of Chinese intentions had to be gathered by reading the cables of foreign ambassadors stationed in China or through signals intelligence (SIGINT). The Armed Forces Security Agency (AFSA) had responsibility for intelligence gathering; information collected was passed on to the appropriate officials for evaluation, and these leaders would then act on it accordingly. There was sufficient intelligence during the summer and autumn of 1950 to alert Western officials that China had a vital interest in Korea and would back up its interests with force. China had communicated that it would accept a pro-Western regime in Korea if it did not involve direct Western intervention and if it occurred at the expense of the Russians. This communication was ignored or misunderstood. Cable traffic to and from the office of the Indian Ambassador in Beijing, Dr. Kavalam Madhava Panikkar, was being read by AFSA. Dr. Panikkar was probably the best-informed diplomat in Beijing and had built good relations with high-ranking Chinese officials; he was thus a quite valuable source of information regarding Chinese diplomacy. In July and August 1950, cables revealed that Panikkar had been informed that China would not intervene in Korea. However, decrypted cables began to reveal a dramatic change following the Inchon landing and the turn of the tide against North Korea.
For example, the cables of the Burmese ambassador now revealed that China intended to intervene militarily. A week later, on September 25, Panikkar cabled New Delhi that China would intervene if UN forces crossed the 38th Parallel. This information was transmitted to Washington, where officials dismissed it due to Panikkar's supposed pro-Chinese sympathies. Less than a week later, on October 1, 1950, Mao Tse Tung publicly declared: "The Chinese people will not tolerate foreign aggression and will not stand aside if the imperialists wantonly invade the territory of their neighbor". Mao continued by warning that if non-Korean forces crossed the 38th Parallel, the Chinese "will send troops to aid the People's Republic of Korea." That same day South Korean troops crossed the 38th Parallel. Accordingly, on the following day the Chinese Politburo made the decision to intervene in Korea. Mao Tse Tung ordered 260,000 troops to cross into Korea by October 15.

 

The Die is Cast

On October 2 the Indian ambassador was roused from sleep shortly after midnight and summoned to meet with Chou En-lai, China's premier and foreign minister. Chou informed him of China's intentions regarding Korea, and this information was passed to the West on October 3. Additionally, the Dutch chargé d'affaires cabled The Hague quoting Chou En-lai's comments that China would fight if U.S. forces crossed the 38th Parallel. These warnings were dismissed in Washington as a bluff.

Meanwhile the first U.S. forces crossed the 38th Parallel on October 5 and advanced on Pyongyang. MacArthur had not been totally incognizant of the possibility of Chinese intervention. In fact, in the days and weeks following Inchon, he repeatedly asked his subordinates whether any sign of Russians or Chinese had been noted. Since no reports of their presence were forthcoming, he assumed that the Soviets and Chinese had backed down at America's demonstration of will. The United Nations mandate of June 27 read in part: "furnish such assistance to the Republic of Korea as may be necessary to repel the armed attack and to restore international peace and security in the area." It was widely held that the only way to accomplish this objective was the complete removal of the Kim regime as a threat to the south by occupying the north and reunifying the entire peninsula.

Not everyone shared this sanguine hope, however. Several doubts about an advance into the north had been voiced. The Policy Planning Staff in Washington expressed grave doubts about the ability to invade the north without provoking a war with China. In addition, National Security Council report 81, circulated on September 1, emphasized the risks involved while also pointing out the unlikelihood of Soviet intervention. It recommended that only Republic of Korea (ROK) forces cross the 38th Parallel.

MacArthur harbored no such doubts, though, and the advance continued with non-Korean forces in support. On October 15 Chinese General Peng Dehuai received the order from Mao Tse Tung to begin moving his forces across the Yalu River into Korea. That night, the 372nd Regiment of the 42nd Army crossed into Korea. Before long, more than 300,000 Chinese troops had moved into North Korea, completely undetected by AFSA. Other warnings did occur, however. SIGINT observed changes in Soviet, Chinese, and North Korean military activity, indicating that some kind of major operation was impending. The CIA sent a top-secret coded memo (still only partially declassified) to President Truman stating that intelligence sources indicated the Chinese would intervene to protect their interests in the Suiho hydroelectric complex in North Korea; the memo also noted an increase in fighter aircraft in Manchuria. On October 21 AFSA reported from Chinese radio traffic that no fewer than three Chinese armies had been deployed along the Yalu River, and it reported heavy troop-train movement from Shanghai to Manchuria. All of this information was dismissed because it contradicted the dominant opinions of the U.S. intelligence community. An article in The Review of the World Situation dated October 18 exemplified this view: "Unless the USSR is ready to precipitate global war, or unless for some reason Peiping does not think that war with the U.S. would result from open intervention in Korea, the odds are that Communist China, like the USSR, will not openly intervene in North Korea." In Tokyo, Lieutenant Colonel Morton Rubin went over the intelligence indicating Chinese intentions to intervene with General MacArthur and his intelligence chief, General Charles Willoughby. Neither MacArthur nor Willoughby was convinced of the reality of the threat. In fact, the UN side also suffered from an atmosphere that discouraged healthy challenges to official decisions.
Surrounding MacArthur, especially after Inchon, was an aura of infallibility that effectively wilted any suggestions for greater vigilance against possible Chinese action. In this atmosphere, the disregarding of the Chinese threat became the “party line” and a healthy exchange of views was discouraged.  Matthew Ridgway, who was soon to replace MacArthur, added: "the great fault over there was poor evaluation of the intelligence that was obtained. They knew the facts but they were poorly evaluated. I don't know just why this was. It was probably in good part because of MacArthur's personality. If he did not want to believe something, he wouldn't." 

 

Now, read part 2 on how the U.S.-backed forces moved ever closer to China and the eventual Chinese counter-attack here.

Why do you think many in the U.S. did not think China would join the Korean War? Let us know below.

Sources

Aid, Matthew. The Secret Sentry, the Untold History of the National Security Agency. New York: Bloomsbury Press, 2009.

Alexander, Bevin R. Korea: The First War We Lost. New York, NY: Hippocrene Books, 1986.

Bacharach, Deborah. The Korean War. San Diego: Lucent Books, 1991.

Conway, John Richard. Primary Source Accounts of the Korean War. Berkeley Heights: Enslow Publishers, Inc., 2006.

Fehrenbach, T.R. This Kind of War: A Study in Unpreparedness. New York: The Macmillan Company, 1963.

Halberstam, David. The Coldest Winter: America and the Korean War. New York: Hyperion, 2007.

Halberstam, David. The Fifties. New York: Fawcett Books, 1993.

Hammel, Eric. Chosin: Heroic Ordeal of the Korean War. New York: The Vanguard Press, 1981.

Egendorf, Laura K., ed. Harry S. Truman. San Diego: Greenhaven Press, 2002.

Hastings, Max. The Korean War. New York: Simon and Schuster, 1987.

Kaufman, Burton I. The Korean War: Challenges in Crisis, Credibility and Command. McGraw-Hill Publishing Company, 1997. 

Goncharov, Sergei N., John W. Lewis, and Xue Litai. Uncertain Partners. Stanford: Stanford University Press, 1995.

Lashmar, Paul. New Statesman & Society, 2 February 1996, Vol. 9, Issue 388, p. 24.

Truman, Harry S. Memoirs by Harry S Truman: Year of Decisions. New York: Doubleday and Company, 1955. 

Truman, Margaret, ed. Where the Buck Stops: The Personal and Private Writings of Harry S. Truman. New York: Warner Books, 1989.

"Analysis: The Foreign Interventions: Stalin and USSR." www.mtholyoke.edu/~park25h/classweb/worldpolitics/analysisstalin.html

Charity Lamb (c. 1818-1879) was infamous in her time for being the first woman convicted of murder in the new Oregon territory (the territory in the north-west of the United States). Here, Jordann Stover returns and tells us about the murder, Charity’s trial, and the aftermath.

You can also read Jordann’s article on Princess Anastasia Romanova, the youngest daughter of Tsar Nicholas II here, and on Princess Olga ‘Olishka’ Nikolaevna, the Eldest Daughter of Tsar Nicholas II of Russia here.

The Oregon Hospital for the Insane, where Charity Lamb spent her years from 1862.

Charity Lamb: we do not know the exact date of her birth or what she looked like. We have photos of the asylum where she spent the rest of her days, and photos of her lawyer, but there is nothing of the woman herself. She was born around 1818 and died some sixty years later. She was convicted of murder, the first woman to receive such a conviction in the new Oregon territory, after she plunged an axe into the skull of her husband, Nathaniel.

Humans have always had an inherent curiosity about crime, the deadlier the better. We find ourselves captivated by blood spatter and ballistics, by the process of getting into the minds of the world’s most violent individuals. Just as we have done and continue to do in the face of horror, the world around Charity Lamb sensationalized her case. There was talk of love triangles, insanity, infidelity, and more. The Oregon territory would have had you believing that Lamb was certifiable, that she was a woman lusting after a young man under her husband's (and daughter’s) nose. She, who was almost certainly a housewife who had led a monotonous, ordinary life up until the beginning of this fiasco, was seen as a cold-blooded sex-fiend. The truth was, of course, far less Lifetime-y. The story of Charity Lamb is one born of an all too familiar circumstance: a woman trying desperately to survive her marriage to a violent man.

 

The Crime

It happened on a Saturday evening. Charity, her husband, and their children sat around the table for a dinner Charity had certainly spent some time preparing. At some point during the meal, Charity stood from the table and left the cabin. We cannot be sure if there was a reaction of any sort from the rest of the family until Charity returned just a moment later with an axe. She stepped up behind her husband and hit him as hard as she could in the back of the head, not once but twice. After doing so, she and her eldest child, Mary Ann, who was seventeen at the time, fled. The remaining children watched in horror as their father fell to the floor, his body “scrambl[ing] about a little” before he lost consciousness. Nathaniel did not die immediately; he held on for a few days.

What seems to have precipitated this event were the affections Mary Ann felt for a man named Collins, said to have been a farmhand working for a family nearby. There is no record to confirm whether the feelings were reciprocated. Perhaps Mary Ann had not gotten a chance to truly express her feelings before her father forbade her from being with him, which subsequently led the teenager to ask her mother for help in writing a secret letter to the young man.

Nathaniel discovered the letter on his wife and accused her of having feelings for Collins herself. We cannot be sure whether she truly did, but we can make assumptions; a case such as this makes a retelling without assumptions practically impossible. It is unlikely that this woman, nearing forty with a group of children, would have been pursuing a presumably penniless farmhand. What is far more believable is that Charity, a mother who knew very well how deeply her daughter’s feelings went, was doing her best to help. Regardless of the truth, Nathaniel was furious. He threatened Charity: threatened to take her children away, to murder her. Charity was quite obviously terrified, but according to their children, who testified at her trial, Nathaniel had frequently been violent with their mother. He had knocked her to the ground, kicked her, and forced her to work when she was sick. He was downright brutal with her for their entire marriage, which leaves us to wonder: what was it about this last threat that scared her so much? Charity was used to this violence, so whatever he said to her, whatever he might have done, was enough for her to legitimately believe her life was in jeopardy.

According to Charity, he had threatened her just as he had many times before; this time, however, he was serious. He told her that she would die before the end of the week, and that once she was gone he would take their children far away, hurting them in the process. He told her that if she ran, he would hunt her down and shoot her; it was well known how good a shot Nathaniel was, as he was an avid, accomplished hunter.

 

The Trial

Charity and Mary Ann were arrested following the events of that evening. The community was outraged; it hated them and saw them as monsters. Newspapers practically rewrote the events to match whatever story they believed would sell. They told salacious fable after salacious fable until Charity became the most hated woman in the Oregon territory.

Mary Ann went to trial before her mother and was acquitted. One can only imagine the relief Charity must have felt; this was her fight, certainly not something she wanted her daughter tangled up in any more than she already was. Charity’s trial followed a few days later and a similar outcome was expected; however, she would not be so lucky.

Part of the blame can be placed on the men who decided to defend her. They had her plead not guilty by reason of insanity, insisting that Charity was not mentally sound and therefore could not have known the consequences of her actions. They claimed that her husband’s behavior had driven her to insanity. This proved to be the beginning of the end for her hopes of acquittal, as anyone in the room could see that she was relatively competent. The judge, in a move that was questionable for someone supposed to remain impartial in such matters, sympathized with her. He instructed the jury to acquit if they truly believed her actions were taken in self-defense.

Despite the sympathies of the judge and the testimonies of the Lamb children confirming the abuses Charity claimed to experience, she was found guilty of murdering her husband. 

Charity wept loudly as the verdict was read. This woman who had survived the Oregon Trail, multiple pregnancies, life on the frontier, and a violent husband was sentenced to prison where she would be subjected to hard labor. The officers had to take her infant from her arms, depositing the child into the arms of another. 

There were no prisons for women in the Oregon territory; Charity was the first woman to be charged with such a crime in the area. The local prison where she was eventually sent had no provisions for her, and she remained its only female prisoner for her entire stay. She did the warden’s laundry and other household tasks to fulfill her sentence of hard labor until she was transferred to the Oregon Hospital for the Insane in 1862. She lived out the rest of her days in that hospital, smiling and proclaiming her innocence.

 

What do you think about the trial of Charity Lamb? Let us know below.

Sources

Lansing, Ronald B. "The Tragedy of Charity Lamb, Oregon's First Convicted Murderess." Oregon Historical Quarterly 40 (Spring 2000).

“Charity Lamb.” The Oregon Encyclopedia.

https://oregonencyclopedia.org/articles/lamb_charity_1879_/#.Xukj7kXYrrc

Botany Bay in Sydney, Australia has seen human habitation for thousands of years. But when Captain James Cook arrived in 1770 with his British ship HMS Endeavour, the area’s course changed greatly. Here, Spencer Striker tells us about what happened after the British arrived – and its negative effect on the native Aboriginal communities.

Botany Bay, a watercolour by Charles Gore from the late 1780s.

A penal colony

Back in 1770, the British, under the command of Lieutenant James Cook, landed in Australia for the first time, at what is now known as Botany Bay. Until then, the land belonged to the many indigenous Aboriginal tribes, who were believed to have resided there for at least the previous 5,000 years. The British landing marked a historical turning point, from which point onwards Australia was considered a profitable colony for the British Empire. What attracted the British was the wide variety of flora and fauna, and Cook decided to name the bay Botany Bay for its botanical biodiversity. A relatively unknown fact is that the land was not used as a plantation, as many colonies were, but as a place to which convicts and prisoners of the British Empire were relocated, though not under any restrictions. In practice, the British were said to use Botany Bay as a place to 'dump convicts' and other felons, and to serve British interests, one of which was lumbering, despite the harm this caused to the native populations and the natural habitat.

 

The rich resources

Like the plantation colonies of the Caribbean islands, the Australian colonies were the result of the ‘push and pull’ factors of competitive capitalism. The needs of the huge empire were enormous, and Australia's rich land, with untouched forests and natural resources, was valuable to the capitalist interests of the time. The colonizing processes included mining, agricultural activity, lumbering of the forests, and the use of the large water resources. It is also important to understand the strategic significance of the colonization of Australia, as a position that facilitated remote control over the "Indo-China trade routes”. Upon arrival, the British established a liberal economic system in the natives’ land, claiming it for the financial interests of Britain and the Crown. The relocation of convicts to Australia spared Britain the need for social reform at home. Another colonizing factor was the widely accepted philosophy of spreading ‘European’ socio-political and cultural influence over foreign territories, across borders and boundaries.

 

Terra Nullius

When the British colonizers came, they considered the land ‘terra nullius’, meaning 'land belonging to no one', a term used in post-colonial studies to explain the colonizer's ideology behind colonization and indigenous genocide. By considering the land empty, the British regarded the indigenous tribes of Australia as less than human, and they justified their atrocities against them for the sake of their Empire. This philosophy of 'terra nullius' has had an enormous impact on Australia, since the descendants of these Aboriginal natives still suffer from racism. The Aboriginals, however, were culturally advanced, and their existence revolved around spirituality and tribal practices grounded in "respect for the sanctity of life." By considering the land empty, the settlers reduced the Aboriginals to a state of ‘bare life’, a concept used to describe how people stripped of their human rights are treated as exceptions, not very differently from animals. This imperialist concept also meant that the lands were declared British property, and that the treatment of the natural habitat and environment fell into the hands of the Empire. The Australian land therefore suffered ecocide as well, as thoughtless wasting of natural resources and exhaustive cultivation of the land, along with deforestation, damaged its natural diversity.

 

A case of genocide

In contrast to the colonialists, the tribes were peaceful, and whatever conflicts arose “didn’t result in warfare.” With the arrival of the British, a physical genocide of the Native population became practically inevitable, since diseases foreign to the land were devastating to the population. In effect, the British waged a biological and physical war against the Natives by importing dangerous pathogens. Death obliterated the Native population through diseases such as "smallpox, syphilis, typhoid, whooping cough, diphtheria, tuberculosis, measles, dysentery, and influenza." The British settlers’ chauvinistic approach caused not only the death of vast numbers of the native population, but also a form of cultural genocide, or, as it is referred to in post-colonial studies, an "ontological violence" that did not allow room for the bare existence of the Native population in their own land. Similar cases of colonization and genocide took place in the New World with the arrival of Columbus in America in 1492. The very same "dispossession, with ruthless destructiveness" of the land, its people, and their culture shaped the future of the continent in ways difficult to untangle.

The history of this colonization is recorded in detail, but the side of the Aboriginal people, for whom this first encounter with the British was in fact an invasion of their land, is rarely represented. Revisiting historical archives can always shed light on shadowy historical events, and it has been giving voice to under-represented people, who can now come to the center of the hegemonic representation and tell their story as well.

 

Jiemba and the Death of the Rainbow Serpent

Jiemba is a fictitious character of the Eora tribe, the Aboriginal people around Sydney, who even had sub-tribes with variations in language. They used to call Botany Bay 'the blue bay', and they had been native to their land for countless generations. Their civilization was flourishing, until the British came with their ship "bringing with them sickness and aggression."

Jiemba's story starts around 1795, although the very first 'white men', as he calls them, were spotted seven years before.

With the help of the newly-published book History Adventures, World of Characters Revolutions & Industrialization, 1750 – 1900 by Spencer Striker, we get a glimpse of what it was like for a common Aboriginal man to witness the first British ship's arrival. Through the story of Jiemba, the indigenous witness, we can get closer to a wider view of the events, one that represents both parties and allows us to see the complex history of the colonization of Australia by the great power that Britain was.


More on Spencer’s book:

History is a fascinating subject, so why is it that so many students struggle with it? It's because of the way it is taught. Just being pumped full of names, events, and dates takes all the real meaning out of it. It's the stories and characters behind the happenings that make it memorable, which is what makes Spencer Striker PhD's interactive digital history book, History Adventures, World of Characters, Revolutions & Industrialization, 1750 – 1900, so interesting. 

Sources

Agamben, G. Homo Sacer: Sovereign Power and Bare Life. Trans. Daniel Heller-Roazen. Stanford, CA: Stanford University Press, 1998.

Banner, Stuart. "Why Terra Nullius? Anthropology and Property Law in Early Australia." Law & History Review 23 (2005): 95.

Genger, Peter. "The British Colonization of Australia: An Exposé of the Models, Impacts, and Pertinent Questions." Peace and Conflict Studies 25.1 (2018): 4. p. 2.

Gillen, Mollie. "The Botany Bay decision, 1786: convicts, not empire." The English Historical Review 97.CCCLXXXV (1982): 740-766.

Spencer Striker, PhD. History Adventures, World of Characters Revolutions & Industrialization, 1750 – 1900. 2020 [Online] https://books.apple.com/us/book/history-adventures-world-of-characters/id1505237819?ls=1


Margaret Bourke-White (1904-71) was a photographer who had a fascinating career. She went to the Soviet Union in 1930, photographed the Great Depression in 1930s America, and took photos in various wars. Parker Beverly explains – and we also include Parker’s documentary on Margaret below.

American Way of Life, a 1937 photo by Margaret Bourke-White

From the battlefields of Italy during World War II to the tranquil home of Mahatma Gandhi during the Partition of India, Margaret Bourke-White was there to capture it all. Born in the Bronx, New York in 1904, Bourke-White grew up in a modest household.[1] It was not until her college years that Margaret began exploring the art of photography.[2] A gifted writer, Bourke-White blurred the line between visual and written mediums, creating photographs that spoke to viewers' emotions and sensibilities. A serendipitous shoot at Cleveland's Otis Steel factory landed her industrial photography portfolio on the desk of Henry Luce, the publisher of Time magazine, who tasked her with being the first photographer for Fortune magazine.[3] Climbing atop ledges of the Chrysler Building and entering hazardous industrial factories, Bourke-White soon gained a reputation as a fearless photographer, undaunted by gender or occupational boundaries.

Her 1931 book Eyes on Russia, which documented her travels through the nascent industrial Soviet Union in photographs, made her the first photographer to capture the country's steadily growing industrialization.[4] In 1937, she and her then husband, Erskine Caldwell, published You Have Seen Their Faces, which detailed the plight of poor rural Southerners in the midst of economic hardship.[5] It was this experience with suffering, along with a Fortune assignment covering Midwestern droughts,[6] that changed Bourke-White's photography from an advertiser's lens to one depicting the human condition. One of her more famous photographs captures the stark difference between commercial imagery and racial realities in the United States: a line of African Americans seeking emergency aid pictured against a billboard depicting an unaffected white family.

 

Wars

In 1936, Margaret's career changed focus as she transitioned into her role as a staff photographer for Life magazine.[7] Well regarded for its photo essays, which documented everything from the building of the Fort Peck Dam to a wartime Thanksgiving, Life provided a national platform for Bourke-White's photography. Seeking to relay news from the battlefield to the home front, Life sent Margaret to photograph various scenes from World War II, including torpedo attacks, bombing missions, and the liberation of concentration camps.[8] Soon after, she captured the struggles of apartheid in South Africa and, lastly, documented the strife of the Korean War.[9] While in Korea, Bourke-White began noticing symptoms of Parkinson's disease, a condition she fought for nearly 20 years.[10]

I came across Margaret's fascinating story while researching for the National History Day (NHD) contest in 2017. I was struck by the lack of coverage on her remarkable and pioneering photojournalism career and knew I wanted to tell her story. Interviewing individuals such as Cokie Roberts and Judy Woodruff, I brought her noteworthy narrative to life through a documentary film seen in both the NHD contest and the All-American High School Film Festival. Three years later, I am still inspired by Margaret's overwhelming tenacity.  Today, her photographs still provide relevant discussion of important moments in history while her trailblazing career encourages others to follow their passions.

 

You can see Parker’s film on Margaret Bourke-White below. Let us know what you think below.

[1] Vicki Goldberg, Margaret Bourke-White:  A Biography (New York City, NY:  HarperCollins, 1986).  

[2]  Ibid.  

[3]  Beverly W. Brannan, “Margaret Bourke-White (1904-1971),” Library of Congress Prints & Photographs Division.  

[4]  Ibid.  

[5] Jay E. Caldwell, Erskine Caldwell, Margaret Bourke-White, and the Popular Front: Photojournalism in Russia (Athens, GA: University of Georgia Press, 2016).

[6] Beverly W. Brannan, “Margaret Bourke-White (1904-1971),” Library of Congress Prints & Photographs Division. https://loc.gov/rr/print/coll/womphotoj/bourkewhiteessay.html

[7]  Margaret Bourke-White, Portrait of Myself (New York City, New York:  Simon and Schuster, 1963). 

[8]  Ibid.  

[9]  Vicki Goldberg, Margaret Bourke-White:  A Biography (New York City, NY:  HarperCollins, 1986).

[10]  Ibid. 

The American Revolutionary War (1775-83) resulted in defeat for the British; however, its impact was felt very differently in other parts of the world. Here, Bilal Junejo explains how defeat in the war led Britain to strengthen its presence in India.

King George III of England, 1799/1800.

Of all the upheavals that dot the annals of the turbulent eighteenth century, few could readily vie in import or impact with the seminal War of American Independence. It was a landmark which, whilst it tolled the death knell of imperial aggrandizement at one end of the globe, simultaneously, if inadvertently, served to herald its flourish at the other, by dint of the virtual liquidation it secured of all non-Indian obstacles in the path of British expansion in India. Indeed, had it not been for this colossal western loss preceding the eventually colossal eastern gain, General Charles Cornwallis, Governor-General of India from 1786 to 1793, might never have been afforded the means of expiating his ignominious capitulation to General George Washington at Yorktown in 1781.1 What might have happened had the colonists been defeated at British hands must remain the sport of conjecture; what is certain is that with their victory, the eventual victory of their erstwhile masters in London became well-nigh certain in that illustrious subcontinent of Asia entitled India, the lure of the ages. The way Great Britain’s own fortunes were affected by the American fiasco directly determined the manner in which she would go on to determine those of India. Principally, the impact of the Revolution had two facets: one domestic and one foreign. Because the latter could scarcely have made any difference in the absence of the former, it is to the domestic aspect that we must first turn our attention, before contemplating how it operated in conjunction with the other to render the cumulative result of incorporating India as the brightest jewel in the British crown.

The immediate domestic consequence lay in the dissolution of that effete administration whose memory has become intertwined with the loss of the American colonies, and whose hallmark had lain in the anachronistic fantasies of a monarch and the correspondingly complaisant follies of his premier. The government of Lord Frederick North (1770-82) had distinguished itself not only by the acute myopia which had informed its dealings with the colonists since, at least, the Boston Tea Party (1773)2, but also by the slow yet steady erosion of those gains which had been consolidated in the practice of parliamentary government since the Glorious Revolution of 1688. King George III, in his early years the unfortunate disciple of the royalist tutelage that pervaded the philosophy of the ironically hapless Earl of Bute3, and in stark contrast to the relatively democratic predilections of the first two Hanoverians, ascended the throne with a vigorous resolve to effect the full exercise of royal powers in his personal capacity, a regression that would entail a gradual erosion of the need to govern through ministers responsible to parliament. The Settlement of 1689 had provided that thenceforth the government should be a constitutional monarchy, but the immediate consequence of that compromise, as Trevelyan explained, was to limit any further expansion of the royal prerogative rather than effect its transfer from the sovereign to their ministers, which only transpired gradually over the decades, a classic example of what the Fabian Sidney Webb called the ‘inevitability of gradualness’. The essence of this inexorable transformation’s culmination was succinctly delineated by one Lord Esher, in a memorandum that His Lordship prepared for King George V in 1913, during the constitutional troubles over the issue of Home Rule for Ireland:

“Has the King then no prerogatives? Yes, he has many, but when translated into action, they must be exercised on the advice of a Minister responsible to Parliament. In no case can the Sovereign take political action unless he is screened by a Minister who has to answer to Parliament. This proposition is fundamental, and differentiates a Constitutional Monarchy based upon the principles of 1688 from all other forms of government.”4

 

The impact in Britain

It is not for us to delve into the constitutional implications of George III’s untoward proclivities, for all that need concern us here are the political ramifications, in the light of that era’s constitutional status quo, that would likely have ensued following a British victory in America. In any given society, it is axiomatic that an overseas victory achieved by the incumbent regime will redound to its credit and increase its popularity amongst the electorate, whereas a loss will only undermine its popular appeal and support. Because the defeat in America was so categorical, the pretensions of the George-North administration were dealt a mortal blow, and the peril of a return to the polity of James II was practically expunged. Englishmen of the seventeenth century had waged a formidable Civil War for the blessings of political liberty and accountable government, restored Charles II when it seemed expedient to do so after the less than favorable developments following Cromwell’s demise, and then overthrown James II a mere five and twenty years later when it appeared that his deleterious inclinations promised a return to the autocracy of his father’s days. It is, therefore, highly unlikely that, had autocratic power begun to increase in the wake of a victory in America, the people (especially the Whigs) of Britain would have so submissively acquiesced in a renewed emulation of the traditions that still inspired the dilapidated ancien régime in neighboring France.
Indeed, the famous writer and politician Edmund Burke (1729-97) had begun to sound the alarm as early as 1770, even before the Revolution, when he published his pamphlet Thoughts on the Cause of the Present Discontents, arguing that King George III was upsetting the balance between crown and parliament in the British constitution by seeking to rule without due acknowledgement of the party political system.5 And in 1780, whilst the war was still going on, Dunning’s resolution, which lamented that “the influence of the Crown has increased, is increasing, and ought to be diminished”, was passed by a distrustful House of Commons.6 Thus, it is not fanciful to suppose that victory in America would have given a fresh lease of life to the George-North administration, any continuance of which could only have served to deepen the fissures in British society. If the King could block Catholic Emancipation, despite his American failure, for as long as he lived, then one can only wonder at what he might have done had he won that redoubtable contest of wills on transatlantic shores. As it happened, though, a contretemps in America averted the much greater danger of domestic unrest and civil war at home, which would scarcely have conduced to the acquisition of empire in the world. The last Jacobite uprising of 1745-6, with all its turbulence, was still a living memory, and Bonnie Prince Charlie, the Young Pretender, was destined to live until 1788, which means that it was not impossible for him, or his nominee, to become the figurehead of a popular resistance to a jubilant George-North oligarchy. An unstable metropolis cannot exude the aura of infallibility and serenity which is indispensable for cowing a foreign people into deferential submission, even against their will.

 

The rivalry with France

The second aspect that merits consideration here is the impact that the Americans’ victory had on France, Britain’s historic (and, in India, principal) rival and the chief abettor of seditious endeavors across the Atlantic. How the war affected France was aptly summed up by the historian Herbert Fisher, when he observed that “for Louis XVI and Marie Antoinette, no policy could have been more improvident, for not only did the American war give the final push to the tottering edifice of French finance, but the spectacle of republicanism triumphant and monarchy overthrown across the Atlantic kindled in every forward-reaching mind in France the vision of a Europe remade after the new American pattern of republican liberty.”7 Again, we can only speculate about what might have happened in the case of French neutrality or an American defeat, but what is certain is that after Washington’s triumph at Yorktown, and the ironic, not to mention portentous, fact that the treaty of peace and recognition between Great Britain and the new American democracy was signed at despotic Versailles, revolution in France became only a matter of time. The cost of the war would hardly have inflamed French finances to the degree that it had by the eve of the Bastille’s fall had it not been for the legitimate pride that the likes of Lafayette could take in the succor they had rendered the armies of Washington. France might have collapsed even earlier in the case of defeat in America, but it is also possible that she might have launched a fresh war of revenge in Europe to distract domestic opinion from real domestic issues with manufactured foreign perils. And if France had lost, then England would have won, thereby consolidating the insidious gains in royal power made by King George III up to then, with the result that British foreign policy would have come to reflect royal predilections more and more, as opposed to those of Parliament.
One must not forget that the English monarch of the day, a Hanoverian, was at the same time the Elector of Hanover. If France had decided to avenge an ignominious failure in America by attacking Hanover to her east (thereby precluding the need to reach a conclusion with the Royal Navy), George III might have focused his entire attention on saving his Electorate without worrying about Britain’s overseas possessions. And given the latent insanity with which, as hindsight tells us, he was afflicted, all sorts of untoward eventualities might have arisen.

 

The impact on India

How exactly did these two consequences cumulatively affect India? This question brings our discussion to its close. In 1623, the massacre of Amboina had forced the English to withdraw from the East Indies. Now Yorktown had necessitated a kindred evacuation from the American colonies, so India was perforce the main attraction left for imperial gratification. But such gratification, quite naturally, presupposes uninterrupted stability in the metropolis, and this was achieved by the Revolution when it shattered the autocratic ambitions of King George III, any realization of which might have imperiled the island state’s security by precipitating a fresh civil war. And we must not forget that towards the end of the eighteenth century and the beginning of the nineteenth, some of the most crucial battles that would determine the fate of the East India Company in India were fought (e.g. with Tipu Sahib of Mysore and the Marathas). Even though France was wracked with internal unrest, the contagion of which soon pervaded the rest of Europe and did not abate until 1815, she was nevertheless able to create great problems for the British.
Indeed, one of the main reasons for remembering Lord Wellesley, the Governor-General of India from 1798 to 1805, is his frustration of Napoleon’s plans, which encompassed burgeoning contacts with Tipu, to subvert the Indians.8 And when Admiral Nelson decimated the overweening French fleet at Aboukir Bay in August 1798, thereby annihilating any hopes of Napoleon’s advance eastwards to India, it was the East India Company that, out of profuse gratitude, rewarded him with a munificent ten thousand pounds sterling, a stupendous sum in those days.9 To judge from the magnitude of this largesse, such were the fears aroused by the grandiose ambitions of a feverish and unstable France that one can only wonder what might have happened had the Bastille not been stormed in 1789 – a distinct possibility, but for that eruption which commenced at Lexington and was carried to triumph under the auspices of French arms.

Thus, the inevitable conclusion we draw is that the American Revolution, by domestically strengthening Britain at the same time as it domestically weakened France, ensured that no serious challenge from without could henceforth arise to check the British rise within India. This was so because, to recollect the memorable verdict of Fisher, after the Peace of Versailles, “the continent merely saw that an empire had been lost. It did not perceive that a constitution had been saved. Yet such was the case. The failure of the king’s American policy involved the breakdown of the last effectual experiment in personal rule which has been tried in Britain.”10 And it was from the ashes of this humbled royal pride that there arose the Pax Britannica. God bless Peace, and God bless Britain.

 

What do you think of the article? Let us know below.

1 John Kenyon, The Wordsworth Dictionary of British History (first published 1981, Wordsworth 1994), 93

2 Ibid., 44

3 Ibid., 55

4 G. M. Trevelyan, The English Revolution 1688-1689 (Thornton Butterworth Ltd 1938), 193

5 John Kenyon, The Wordsworth Dictionary of British History (first published 1981, Wordsworth 1994), 54

6 Ibid., 118

7 H. A. L. Fisher, A History of Europe (first published 1935, The Fontana Library 1972), 861

8 Winston S. Churchill, A History of the English-Speaking Peoples (Cassell and Company Ltd 1957), Volume 3, 188-9

9 James Brown, The Life & Times of Lord Nelson (Parragon Book Service Ltd 1996), 41

10 H. A. L. Fisher, A History of Europe (first published 1935, The Fontana Library 1972), 862

Posted by George Levrier-Jones

The American Civil War ended in 1865, but its effects lasted a long time – and even linger to this day. Here, Daniel L. Smith returns and presents his views on how economic and social control emerged from the Civil War and persist to the present in America.

Daniel’s book on mid-19th century northern California is now available. Find out more here: Amazon US | Amazon UK

Freedmen voting in New Orleans in 1867.

It's far from over. In fact, it was never over. Here's a historical clarification to give some insight and background into the political 'shadow war' occurring today in Washington, D.C. and within states nationwide: the fallout of the ongoing American Civil War. American historians James McPherson and James Hogue, prominent scholars of the Civil War and Reconstruction era, give an eye-opening account of the Democratic Party’s intentions for America in 1857 and beyond.

“Slavery lies at the root of all shame, poverty, ignorance, tyranny, and imbecility…” The emphasis fell directly on the rogue political tactics used to subjugate the whole mass of society: “the lords of the lash” (Democratic politicians and business elites) were “not only absolute masters of blacks [but] of all non-slave-holding whites, whose freedom is merely nominal, and whose unparalleled illiteracy and degradation is purposely and fiendishly perpetuated.”[1]

R. H. Purdom, a master mechanic, stood up at a meeting in Jackson, Mississippi, to give an early warning to the elites controlling the southern economy: a “decided course for the speedy suppression of the intolerable abuses” borne by workers was absolutely necessary for the “permanent welfare of the institution of slavery itself.”[2] By this point, even the poor white working class was ready to turn coat on its own institutions.

In September 1865, a prominent Democratic politician (recently pardoned by the federal government after losing the Civil War) publicly scoffed at any idea of the Democratic Party remaining loyal to, or maintaining good relations with, the newly re-established United States government. Even Wade Hampton, one of the South’s wealthiest planters, said immediately after the Civil War that it “is our duty” (speaking of the post-war Confederates who had been legally pardoned of treason) to support the President of the United States; however, that loyalty would stay intact only if “he manifests a disposition to restore all our rights as a sovereign State.”[2]

 

After war’s end

Even though rebellious military action ceased weeks after the loss, the Democratic Party of the post-Civil War period only declared a momentary political ceasefire. And although they had formally lost, they did not willingly capitulate to the federal government (the Union) at the moment of military surrender. Between April 9 and November 6, 1865, a nearly invisible ‘shadow war’ marked the 'beginning of the end' for the future of political and social cohesion within America.

Democrats had regained power in most Southern states by the late 1870s. Later, this period came to be referred to as "Redemption". From 1890 to 1908, the Democrats passed statutes and amendments to state constitutions that effectively disenfranchised most African Americans and tens of thousands of poor whites. They did this through devices such as poll taxes for voters and literacy tests to “qualify” to vote (among other underhand tactics). By the late 1950s, the Democratic Party began to embrace the Civil Rights Movement, and the old argument that Southern whites had to vote for Democrats "to protect racial segregation" grew weaker.

The Democratic Party realized that regardless of the outcomes of the Civil War and Reconstruction, the policy of "slavery-by-color" was over. Segregation also became incompatible with the party’s ethic: to oppress the poor regardless of color. So what did they do? Modernization had brought factories, national businesses, and a more diverse culture to cities such as Atlanta, Dallas, Charlotte, and Houston, attracting many northern migrants, including many African Americans. Party leaders gave priority to modernization and economic growth over preservation of the "old ways" of the Democratic Party, but they still wanted social and economic control, a pursuit which had started earlier.

 

Social and economic control

Between 1865 and the late 1880s, prices were falling and people's incomes increased six-fold, offering Americans more purchasing power.[3] The politicians of the New South began to feel the pressure of big businesses' complaints that wages were rising too fast. It is because of this major economic shift that the attack on the "greedy worker" began. There was another shift as well: a social one. Now the freedmen (former slaves) and previously non-slave-holding whites were able to climb the free-market ladder unhindered. For the Democratic Party, it was time to shift the focus to social and economic control.

"Cut their wages to begin with. Make them work harder. To align their interests with their employers', put wage earners on piecework (payment by output). Above everything, do something to stop skilled workers from setting the pace of production and spreading to co-workers their spirit of 'manly' resistance to speed-ups" (hostile resistance to forced increases in manual labor). Much as in the modern institutions of fast food, gas, and retail, one laborer wrote: "You start in to be a man, but you become more and more a machine.... It's like any severe labor. It drags you down mentally and morally, just as it does physically."[4] Of course the iron workers of those times had it painstakingly hard physically, but the burden today has shifted to mental exhaustion.

With the Covid-19 pandemic, Republicans are screaming at Americans to "get out and live!" They want to encourage financial independence and societal success. The Democrats are screaming at Americans to "stay home and save lives!" At this point, for what? One Democratic politician was recently quoted telling Americans that they should just stay home and "get paid", with the federal government paying out a universal basic income for everybody. And in the future? Who knows, but the way things look, it could be something as simple as misleading everybody into eventually doing everything from home - and only home.[5]

It is apparent through history's evidence that control is the Democratic Party's modern end-game.

At least it seems that way.

​Enough said.

 

 

You can read a selection of Daniel’s past articles on: California in the US Civil War (here), Spanish Colonial Influence on Native Americans in Northern California (here), Christian ideology in history (here), the collapse of the Spanish Armada in 1588 (here), early Christianity in Britain (here), the First Anglo-Dutch War (here), and the 1918 Spanish Influenza outbreak (here).

Finally, Daniel Smith writes at complexamerica.org.

Bibliography

[1] McPherson, James M., and James K. Hogue. "The Problems of Peace and Presidential Reconstruction, 1865." In Ordeal by Fire: The Civil War and Reconstruction, 543. New York: McGraw-Hill, 2009.

[2] Beatty, Jack. "The Problems of Peace and Presidential Reconstruction, 1865." In Age of Betrayal: The Triumph of Money in America, 1865-1900, 543. New York: Vintage, 2008.

[3] “Mechanical Association,” Mississippian State Gazette, Dec. 29, 1858, 3.

[4] Perrow, Charles. "A Society of Organizations." Theory and Society 20 (1991), 791. doi:10.1007/bf00678095.

[5] Chris Talgo, Opinion Contributor. "Universal Basic Income and the End of the Republic." TheHill. Last modified May 12, 2020. https://thehill.com/opinion/finance/497244-universal-basic-income-and-the-end-of-the-republic.


With the current Covid-19 pandemic causing upheaval the world over, can we look to the past to learn lessons? Here, Mac Guffey continues a series considering lessons from the 1918 Influenza Epidemic, an epidemic that infected around a third of the world’s population and killed some 40 million people (exact estimates vary from 15 million to 50 million or more). He will consider the question: Can something that happened over a hundred years ago in a society so vastly different from today provide any useful guidance regarding the Covid-19 Pandemic?

Here, part 4 in the series considers some personal tales of the Great Flu of 1918 – and reflects on how little that flu is remembered today. After all, if we knew more about it, maybe the 2020 Flu Pandemic would have been less destructive.

If you missed it, the first article in the series considered what happened during the 1918 Influenza Pandemic and the lessons we can draw on the economy (here), part 2 considered the healthcare lessons from the pandemic by contrasting a successful and less successful approach (here), and part 3 looked at the importance of effective leadership (here).

Policemen in Seattle wearing masks made by the Red Cross, during the flu epidemic. December 1918.

My mom, who has long since passed away, was the first person to ever tell me about the “Spanish Flu” as she called it. Her uncle died from it in 1919 – several months after he returned from World War One. She was five at the time.

She had a photograph of him kneeling beside her in his “doughboy” uniform. He was quite a guy, I guess. Served with distinction, survived multiple “over-the-tops”, gas attacks, trench strafings, and came home to die in the third and last wave of the infamous influenza pandemic. 

He was one of the estimated 675,000 Americans who died from that virus.

Across America during the fall and winter of 1918-19, many such tragic memories were made. Here are a few from Mike Leavitt’s The Great Pandemic of 1918 State by State. (Leavit, 2006)

In Hartford, Connecticut, Beatrice Springer Wilde, a nurse, recounted the tragic story of four Yale students she treated. They had become ill while traveling and decided to get off the train in Hartford. Their last steps were taken from the train station to the hospital. Within twenty-four hours, all were dead. 

Bill Sardo, a funeral director in Washington, D.C., remembered:

"From the moment I got up in the morning to when I went to bed at night, I felt a constant sense of fear. We wore gauze masks. We were afraid to kiss each other, to eat with each other, to have contact of any kind. We had no family life, no church life, no community life. Fear tore people apart." 

 

All public gatherings were banned in Seattle, Washington including church services. Many of the local ministers complained until the mayor said publicly, “Religion which won’t keep for two weeks, is not worth having.” 

The town council in Rapid City, South Dakota made spitting on the sidewalks illegal. A local police officer was seen spitting shortly thereafter. He was arrested and fined $6 for committing the offense. No one was exempt.

Augusta, Georgia was the hardest-hit city in the state. The nurses in the local medical facilities were also struck down by the pandemic. As a consequence, nursing students were put in charge of shifts at a local hospital. Schoolteachers were enlisted to act as nurses, cooks and hospital clerks, and an emergency hospital was constructed on a local fairground. In Athens, Georgia, the University of Georgia indefinitely suspended classes.

An Ocala, Florida man named Olson traveled to Jacksonville, Florida for a carpentry job. Jacksonville was inundated with the flu at the time, and despite a citywide quarantine and the use of gauze masks, Olson contracted the flu. Eager to return to his hometown and family, he slipped past the quarantine and caught a train back home, taking the virus with him. Within days of his return, he had infected his family, and his neighborhood.

James Geiger, the U.S. Public Health Service Officer for Arkansas continuously downplayed the influenza threat to the state - even after he caught the flu, and his wife died from it.

 

Alaska & Authors

The 1918 pandemic also swept through Native American communities in Alaska, killing whole villages. One schoolteacher later reported that, in her area, three villages were wiped out entirely. Other villages, she said, averaged as many as 85% deaths, and probably 25% of those deaths were people too sick to get firewood, who froze before help arrived. When the pandemic passed, many who had been too sick to fish or hunt and store food for the winter died of starvation. Some were forced to eat their sled dogs, and some sled dogs, unfed and hungry, ate the dead and the dying.

This last story from 1918 is about the effect this epidemic had on one of America’s best known authors - Katherine Anne Porter. 

Porter, who would later earn a Pulitzer Prize for her short stories, was one of the thousands who became ill during the epidemic in Denver, Colorado. Porter contracted influenza while working as a journalist for the Rocky Mountain News. She could not be admitted to the hospital at first, because there was no room. Instead, she was threatened with eviction by her landlady and then cared for by an unknown boarder who nursed her until a bed opened at the hospital. Porter was so sick that her newspaper colleagues prepared an obituary, and her father chose a burial plot. That near-death experience changed Porter in a profound way. She said afterward, "It just simply divided my life, cut across it like that. So that everything before that was just getting ready, and after that I was in some strange way altered." Her book, Pale Horse, Pale Rider, is a fictionalized account of her experience in the 1918 pandemic.

 

Lesson Four: Conclusions – ‘Such a big event, so little public memory’

Will and Ariel Durant, the husband-and-wife co-authors of the massive eleven-volume study The Story of Civilization, also wrote a thought-provoking short work entitled The Lessons of History. On page eleven they ask:

As his studies come to a close the historian faces the challenge: Of what use have your studies been?... Have you derived from history any illumination of our present condition, any guidance for our judgments and policies, any guard against the rebuffs of surprise or the vicissitudes of change? (Durant, 1968)

 

While that quote is certainly apropos for this last article in a series entitled “Lessons from the 1918 Influenza Epidemic”, it’s not for that reason that I selected it. 

It’s for a far more personal reason.

When I grew up and became a historian, that epidemic in 1918-19, despite my personal connection to it, was never a topic in my teaching curriculum.

And it should have been. 

As an educator, I admit now that I was remiss in not teaching about pandemics and our nation’s susceptibility to them. Had I done so, perhaps one of my students (and there were many) would have gone on to do something in that field. Or perhaps, the 2020 Pandemic would have been less traumatic for all of them.

Every experience that we’ve had in 2020 - our delayed response to the threat of a pandemic - our overwhelmed medical personnel and inadequate supplies - the quarantines - the public pushback and even the key community “stakeholders” – was there in 1918. 

But no one paid attention. It’s unfortunate that we never seem to seek (or adequately teach) the lessons that the past provides us - until it’s too late. We are NOW facing the greatest threat to our Democracy and to our existence as a nation that the United States has faced since the Civil War. The lessons from the 1918 Influenza Epidemic would have helped us in so many ways.

During my research for this series, I came across a 2018 comment that someone left at the end of an article on the Philly Voice blog during the 100th anniversary of the 1918 Influenza Pandemic – “Such a big event, so little public memory.” (McGovern & Kopp, 2018)

Indeed. How many five-year-olds will lose a favorite uncle this time?

Food for thought.

 

Why do you think there is so little public knowledge of the 1918 Great Flu Pandemic? Let us know below.

Read more from Mac Guffey in the Amazing Women Airforce Service Pilots of World War Two here.

 

Works Cited

Durant, W. a. (1968). The Lessons of History. New York, New York: Simon and Schuster.

Leavit, M. (2006, January thru July). The Great Pandemic of 1918: State by State. Retrieved May 3, 2020, from Flu Trackers.com: https://flutrackers.com/forum/forum/welcome-to-the-scientific-library/-1918-pandemic-data-stories-history/14750-the-great-pandemic-of-1918-state-by-state

McGovern, B., & Kopp, J. (2018, September 28). "In 1918, Philadelphia was in 'the grippe' of misery and suffering". Retrieved April 10, 2020, from Philly Voice: https://www.phillyvoice.com/1918-philadelphia-was-grippe-misery-and-suffering/

With the current Covid-19 pandemic causing upheaval the world over, can we look to the past to learn lessons? Here, Mac Guffey continues a series considering lessons from the 1918 Influenza Epidemic, an epidemic that infected around a third of the world’s population and killed some 40 million people (exact estimates vary from 15 million to 50 million or more). He will consider the question: Can something that happened over a hundred years ago in a society so vastly different from today provide any useful guidance regarding the Covid-19 Pandemic?

Here, part 3 in the series considers the importance of effective leadership. Mac looks at how the cities of St. Louis, Milwaukee, and Minneapolis managed to have lower rates of infection when compared to other comparably sized cities thanks to effective leadership.

If you missed it, the first article in the series considered what happened during the 1918 Influenza Pandemic and the lessons we can draw on the economy (here) and part 2 considered the healthcare lessons from the pandemic by contrasting a successful and less successful approach (here).

A 1918 poster warning about ‘Spanish Flu’ and how it could impact war production for World War I.

A 1918 poster warning about ‘Spanish Flu’ and how it could impact war production for World War I.

The federal government’s role regarding public health is generally an advisory one. By and large, the real business of public health and safety is a local matter. State, county, and city health departments operate under a ragbag of rules and regulations that vary from community to community based on each community’s prior public health experiences. (Garrett, 2020)

Because of this, the way the 1918 Influenza Epidemic unfolded across the United States actually provides a tremendous series of independent case studies about what worked and what didn’t work.

The determining factor – community mortality rates.

Thirteen years ago, Anthony Fauci and David Morens did just that, writing an article about the 1918 Influenza Pandemic for The Journal of Infectious Diseases. It was subtitled “Insights for the 21st Century”.

In their article, they made several key points. One - historical evidence about pandemics suggests there are no predictable cycles; therefore, countries need to be prepared for the possibility of a pandemic at all times. Two - if a novel virus as virulent as that of 1918 were to reappear, a substantial number of potential fatalities could be prevented with aggressive public-health and medical interventions. 

But the best antidote, they said, was prevention - through vigilance, predetermined countermeasures, and planning. (Morens & Fauci, 2007)

Morens and Fauci’s recommendations were based in part on the similar ways several major urban areas truly “met the moment and prevailed”, achieving the lowest mortality outcomes during that exceptionally virulent second wave of the 1918 Influenza Epidemic.

It was all about leadership.

 

Lesson Three: Leadership – ‘Vigilance, Predetermined Countermeasures, and Planning’

In addition to St. Louis (covered in Parts 1 and 2 of this series and reviewed here for comparison), Milwaukee and Minneapolis also registered lower mortality rates than most urban areas of a comparable size during the 1918 Influenza Pandemic. 

These cities also encountered many of the same problems and challenges during that pandemic that we’ve faced across the nation in 2020 – disruptive citizens, pushback from churches, schools, and businesses, and failures to comply with mask and distancing mandates.

However, the way those city leaders approached these problems and challenges had a major impact on the civilian death rates in their cities.

 

St. Louis

As just a quick review, St. Louis was led by a strong-willed and capable health commissioner, Dr. Max C. Starkloff, who had the foresight to actively monitor the news as the influenza contagion spread westward. The city’s medical and political communities were quickly prepared for the inevitability that the epidemic would find its way to St. Louis. His first action was to issue a request through the influential St. Louis Medical Society that physicians voluntarily report to his office any and all cases of influenza they discovered. (St. Louis Globe-Democrat, 1918)

When St. Louis physicians reported their first cases of influenza, he asked the city’s Board of Aldermen to pass an emergency bill declaring influenza a contagious disease. This gave the mayor the legal authority to declare a state of public health emergency. The bill also levied stiff fines for physicians who failed to report any new cases of the disease. (St. Louis Globe-Democrat, 1918)

Starkloff and St. Louis Mayor Henry Kiel then executed an open-minded, flexible approach to quarantining, school closings, and other social distancing measures. They also maintained a unified front despite persistent pushback from various St. Louis constituencies. Because of the quick and sustained action by its leaders, St. Louis experienced one of the lowest excess death rates in the nation. (University of Michigan Center for the History of Medicine, 2016)

 

Milwaukee

Even with two influenza waves between October and December 1918, Milwaukee’s brush with the 1918 Influenza Epidemic was still less severe than that of other U.S. cities of a comparable size. In the aftermath, Milwaukee Health Commissioner George C. Ruhland believed there were three reasons for the better outcomes. (Milwaukee Health Department, 1918)

The first reason was the readiness of the public to comply with any regulatory measures. For that Ruhland credited the Milwaukee medical community’s plan to engage the public. With the support of the city’s newspaper editors, the group began an immense public education campaign - with printed literature in six languages, including English. They created flyers and speaker’s notes, selected respected physicians and city notables as speakers, and requested the area clergy to discuss the flyers from the pulpit. If citizens, business owners, and city government all understood exactly what they were facing, there might be greater cooperation and acceptance should any draconian measures be necessary to blunt the epidemic. (Milwaukee Sentinel, 1918)

The second reason Ruhland listed was the timing of the closing orders and the generally widespread compliance from Milwaukee’s citizens. What’s interesting is that because of the two waves – October and December - Ruhland’s team actually tried two different approaches to see which one worked better. The October approach involved mandated closings - all places of amusement, churches, public gatherings, and eventually the schools. (Milwaukee Journal, 1918)

However, as the number of new cases in Milwaukee declined, some citizens and business owners believed the influenza threat was almost over. They got together and sent a number of requests to Ruhland to lift the bans on public gatherings. He refused. As more businesses clamored for relief, Ruhland publicly pointed out the consequence of overconfidence in other cities - reopening prematurely resulted in another wave of the infectious disease. (Milwaukee Journal, 1918)

Despite Ruhland’s gradual reopening however, a resurgence of the virus occurred in December 1918.

This time, to avoid outright closures, Ruhland shifted the responsibility to the public. He recommended that masks be worn in public and set restrictions on the amount of personal space surrounding people in public - every other row was left vacant in theaters and churches, and retail customers were to keep six square feet of vacant space around themselves - and then he left it up to the people to govern themselves. The citizens, for the most part, ignored the self-restrictions, and the idea failed. (Milwaukee Journal, 1918)

The conclusion Ruhland came to after these two experiences has important ramifications for the world pandemic today. While closures don’t prevent influenza, they are very necessary in order to flatten or prevent the severe spikes in the number of influenza cases that can occur over a short period of time. It’s the severe spikes, he said, that overwhelm the available hospital facilities, healthcare workers, and medical supplies. Preventing those spikes flattens the mortality curve because those who do fall ill have access to better – not desperate – healthcare. (Milwaukee Wisconsin Department of Health, 1918)

The last factor that helped contribute to the lower mortality rates was the overall cooperation from all the community “leaders” during the epidemic – city government, physicians, hospital administrators, businessmen, the Red Cross and other relief agency leaders. Thanks to that cooperation, all necessary decisions were implemented rapidly and immediately. (Milwaukee Health Department, 1918)

In this city of 450,000 people, more than 30,000 of them came down with the flu during those two waves in 1918. Thanks to leadership vigilance, predetermined countermeasures, and planning, fewer than 500 died.

 

Minneapolis

“Spanish influenza does not exist in Minneapolis and never has, but it probably will reach here during the fall,” the City of Minneapolis Health Commissioner, Dr. H. M. Guilford, told residents on September 19, 1918. (Minneapolis Morning Tribune, 1918)

Less than a month later, the flu epidemic struck the city. By then, Guilford had a plan ready. The health department ordered all schools, churches, and non-essential businesses closed indefinitely. The measure was unanimously endorsed by the Minneapolis city council. The council also stipulated that the city’s department of health had the full authority to issue any closure orders with or without the consent of Minneapolis’s mayor or the council. (Minneapolis City Council, 1918)

Pushback, however, was almost immediate. 

The Minneapolis Board of Education disagreed with the shutdown order and reopened the schools. The Superintendent of Schools, B. B. Jackson, argued that leading medical authorities across the nation had determined that epidemic influenza was not a children’s disease. Guilford, however, refused to give ground, and at his request the Minneapolis Chief of Police met with the school board and persuaded them to close the schools once again. (Minneapolis Morning Tribune, 1918)

In spite of the school board resistance and a later protest by the owners of amusement businesses, Guilford kept the city closed down until November 15, when the number of new influenza cases reached what he deemed an acceptable level. At that point, schools and businesses were allowed to reopen. (University of Michigan Center for the History of Medicine, 2016)

However, in early December, the number of influenza cases spiked again – this time among the school population. Guilford reinstituted the school closures until the end of the month, but he added an important caveat: all students would be required to undergo a thorough examination to ensure they were free of any illness before being allowed to return to the classroom. (University of Michigan Center for the History of Medicine, 2016)

Strong leadership, sustained adherence to science, and a unified front both politically and medically throughout the 1918 Influenza Epidemic enabled Minneapolis to keep the mortality rate of its citizens lower than most urban centers of a comparable size.

 

Conclusion

One of the more important “negative” leadership lessons from the 1918 Influenza Epidemic was the “too little, too late” actions by many public officials at the national, state, and local levels that exacerbated the spread of that deadly pandemic. (Mihm, 2020)

That was not the case in St. Louis, Milwaukee, or Minneapolis. Doctors Starkloff, Ruhland, and Guilford each showed vigilance by tracking the progression of the epidemic in other cities and in the nearby military camps, and by mandating that their local medical communities report every new case of influenza. They all formed teams, set sound policy directions, communicated and educated about them, and implemented effective, predetermined countermeasures.

However, the greatest insight that the 1918 epidemic provides for our 21st-century health crisis is the determination of those leaders to maintain the aggressive public-health and medical interventions they put in place for the well-being of their citizens in the face of political, economic, and public pushback.

 

History Is Now

After taking office in 2017, the Trump administration fired the government’s entire pandemic response chain of command, including the White House management infrastructure, and disbanded the National Security Council’s pandemic team and a State Department program designed to identify outbreaks and other emerging threats around the world. (Garrett L. , 2020)

Then, in late December 2019 or early January 2020, Trump and his administration were informed by intelligence officials of a contagion raging in Wuhan, China. The administration, however, publicly treated the epidemic as a minor threat that was under control, at least domestically, and repeatedly assured the public that the risk to Americans was very low.

By the end of January, there were about 12,000 reported cases in China, and growing rapidly by the day. At this point, the U.S. had a handful of confirmed cases, but there was almost certainly already significant community spread in the Seattle area.

Finally, on January 27, the White House created the Coronavirus Task Force (publicly announced on January 29) and declared a public health emergency on January 31. At that point, the federal government began to put in motion the executive, legal, and regulatory pandemic response procedures already on the books. (Wallach & Myers, 2020)

On March 24, 2020, the U.S. death toll from the Covid-19 Pandemic stood at 705 Americans. (CDC, 2020) That day, President Donald Trump said in his then daily Coronavirus Task Force briefing:

“There is tremendous hope as we look forward and we begin to see the light at the end of the tunnel. Stay focused and stay strong and my administration and myself will deliver for you as we have in the past.” (Woodward & Yen, 2020)

 

Trump’s ‘hope’ versus the ‘vigilance, predetermined countermeasures, and planning’ of Starkloff, Ruhland, and Guilford.

As of June 1, 2020, America’s death toll stands at over 106,000 coronavirus-related deaths. (CDC, 2020)

Food for thought.

 

Now, read part 4 here: Lessons from the 1918 Influenza Epidemic: Part 4 – Conclusions – ’Such a big event, so little public memory’

What lessons do you think we can learn from the 1918 Influenza Pandemic? Let us know below.

References

CDC. (2020, April 30). “Coronavirus (COVID-19) pandemic – Overview: statistics”. Retrieved May 2, 2020, from Bing.com: https://www.bing.com/search?q=death+toll+coronavirus&form=EDNTHT&mkt=en-us&httpsmsn=1&msnews=1&rec_search=1&plvar=0&refig=60ce389eba704e0788409300929840cb&PC=HCTS&sp=1&ghc=1&qs=EP&pq=death+toll&sk=PRES1&sc=8-10&cvid=60ce389eba704e0788409300929840cb&cc=US&

Garrett, L. (2020, January 31). Trump Has Sabotaged America’s Coronavirus Response. Retrieved April 28, 2020, from FP (Foreign Policy): https://foreignpolicy.com/2020/01/31/coronavirus-china-trump-united-states-public-health-emergency-response/

Markel, H., et al. (2007). “Nonpharmaceutical interventions implemented by U.S. cities during the 1918-1919 influenza pandemic”. JAMA, 298:647.

Mihm, S. (2020, March 3). Lessons From the Philadelphia Flu of 1918: Prioritizing politics over public health is a recipe for disaster. Retrieved April 22, 2020, from Bloomberg Opinion: https://www.bloomberg.com/opinion/articles/2020-03-03/coronavirus-history-lesson-learning-from-1918-s-flu-epidemic

Milwaukee Health Department. (1918). Bulletin of the Milwaukee Health Department 8, no. 10-11. City of Milwaukee, Health. Milwaukee: np.

Milwaukee Journal. (1918, October 10). “City closed to fight flu”. Milwaukee Journal, p. 1.

Milwaukee Journal. (1918, December 2). "Schools closed to stop flu". Milwaukee Journal, pp. 1, 6.

Milwaukee Journal. (1918, October 26). "Weather Cause of Deaths". Milwaukee Journal, p. 2.

Milwaukee Sentinel. (1918, October 11). "City Starts Big Battle On Influenza". Milwaukee Sentinel, p. 6.

Milwaukee Wisconsin Department of Health. (1918). Forty-second annual report of the Commissioner of Health City of Milwaukee. Milwaukee: np.

Minneapolis City Council. (1918). Proceedings of the City Council of the City of Minneapolis, Minnesota, from January 1, 1918 to January 1, 1919 (p. 536). Minneapolis.

Minneapolis Morning Tribune. (1918, October 20). "Clash Over School Order Due Monday". Minneapolis Morning Tribune, p. 1.

Minneapolis Morning Tribune. (1918, September 20). “No Spanish Influenza in City, Says Guilford”. Minneapolis Morning Tribune, p. 2.

Morens, D. M., & Fauci, A. S. (2007). The 1918 Influenza Pandemic: Insights for the 21st Century. Journal of Infectious Diseases, Volume 195, Issue 7,, 1018-1028.

St. Louis Globe-Democrat. (1918, September 20). “Doctors Here Must Report Influenza”. St. Louis Globe-Democrat, p. 2.

St. Louis Globe-Democrat. (1918, October 6). “No Quarantine Here against Influenza, Says Dr. Starkloff". St. Louis Globe-Democrat, p. 8.

University of Michigan Center for the History of Medicine. (2016, September 19). City Essays. Retrieved April 21, 2020, from American Influenza Epidemic of 1918 - 1919: A Digital Encyclopedia.: http://www.influenzaarchive.org.

Wallach, P. A., & Myers, J. (2020, March 31). “The federal government’s coronavirus response—Public health timeline - part of the Series on Regulatory Process and Perspective”. Retrieved April 4, 2020, from Brookings: https://www.brookings.edu/research/the-federal-governments-corona

Woodward, C., & Yen, H. (2020, March 28). “Fact check: Donald Trump is a rosy outlier on the science of the virus”. Retrieved April 20, 2020, from Associated Press Website: https://apnews.com/

Wright, J. (2020, March 3). Four disastrous mistakes that leaders make during epidemics. Retrieved April 15, 2020, from The Washington Post: https://www.washingtonpost.com/outlook/2020/03/03/four-disastrous-mistakes-that-leaders-make-during-epidemics/

 

With the current Covid-19 pandemic causing upheaval the world over, can we look to the past to learn lessons? Here, Mac Guffey continues a series considering lessons from the 1918 Influenza Epidemic, an epidemic that infected around a third of the world’s population and killed some 40 million people (exact estimates vary from 15 million to 50 million or more). He will consider the question: Can something that happened over a hundred years ago in a society so vastly different from today provide any useful guidance regarding the Covid-19 Pandemic?

Here, part 2 in the series considers the medical readiness lessons for today, by contrasting the very different approaches of Philadelphia and St Louis in the 1918 Influenza Epidemic.

If you missed it, the first article in the series considered what happened during the 1918 Influenza Pandemic and the lessons we can draw on the economy: Available here.

With masks over their faces, members of the American Red Cross remove a victim of the 1918 Influenza Epidemic from a house at Etzel and Page Avenues, St. Louis, Missouri. St Louis managed the Epidemic better than many other US cities.


American life in 1918 was busy, demanding, and non-stop. 

A world war was raging in Europe; military camps were springing up all over the country to accommodate the military’s demand for more soldiers. Factories (and even community clubs, organizations, and families) were busy turning out provisions needed by those boys going “over there”.

But the demands of this war also drained the nation’s supply of healthcare workers and medical equipment, and diminished the quality of available civilian medical care everywhere. So when the second wave of the 1918 Influenza Epidemic struck toward the end of September, civilian hospitals and medical personnel were simply unprepared.

The state health officer for New Jersey announced on September 27th that the influenza “was unusually prevalent” throughout the state. Within the next three days, more than 2,000 new cases were reported. Newark’s medical facilities were so quickly overwhelmed that the city purchased a vacant furniture warehouse to be used as an emergency hospital. (Leavitt, 2006)

Makeshift hospitals like that one were hastily opened in almost every community to deal with the astronomical surge in people seeking medical help, but the virulence of this epidemic simply overwhelmed them all. 

One New Jersey physician recalled the outbreak: “There was no need to make appointments. You walked out of your office in the morning and people grabbed you as you walked down the street. You just kept going from one patient to another until late in the evening.” He treated more than 3,000 patients that month. (Leavitt, 2006)

Finally, in newspapers around the country, messages from desperate city health departments appeared:

. . . The spread of the Spanish Influenza is now a matter for the individual citizen. The city is doing what it can. Now it is up to the public. You can help keep the disease down. IT’S UP TO YOU—TAKE CARE OF YOURSELF  (Johnson City Health Department, 1918)

That 1918 directive – very pertinent in both substance and form as we currently battle our own pandemic of epic proportions – actually represented a capitulation of sorts by America’s city and state governments.

At that point, everyone was on his or her own.

 

Lesson Two: Healthcare - Two Cities - Two Outcomes – One Reason

The very virulence of the influenza virus that late summer and early fall doomed the unready medical system in this country almost immediately. The United States had 5,323 hospitals with just 612,000 beds available to accommodate a nationwide population of some 92 million people. Within forty-eight to seventy-two hours of almost every local outbreak, all of the hospitals in that area were filled beyond capacity. (U.S. Bureau of the Census, 1976)
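As a quick sanity check on those census figures, the national capacity works out to only a handful of hospital beds per thousand people – a thin margin once a virulent epidemic arrives. The sketch below simply re-derives that ratio from the numbers stated above:

```python
# National hospital capacity in 1918, per the census figures cited above.
hospitals = 5_323
beds = 612_000
population = 92_000_000

beds_per_1000 = beds / population * 1_000
avg_beds_per_hospital = beds / hospitals

print(f"{beds_per_1000:.1f} beds per 1,000 people")
print(f"~{avg_beds_per_hospital:.0f} beds per hospital on average")
```

That is roughly 6.7 beds per 1,000 people nationwide – and with outbreaks filling every local hospital within two or three days, no realistic bed count could have absorbed the surge without a plan for overflow care.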

During the initial outbreak of the Covid-19 Pandemic in spring 2020, the modern healthcare system in the United States came perilously close to the limits that the 1918 system crossed. How do we handle a second wave that is as virulent as our first wave, or more so?

As history would have it, there actually is an answer to that very question in the 1918 Epidemic. It’s a tale of two cities – Philadelphia and St. Louis.

 

Philadelphia

For the sake of establishing a timeline by which to compare these two cities, let’s reiterate the facts discussed in “Lesson One” of this series about the disastrous decisions by Philadelphia’s health director, Dr. Wilmer Krusen, that led to the city’s high mortality rate.

Despite evidence of the virus’s virulence in the various military camps surrounding Philadelphia, Krusen was quoted in a Harrisburg newspaper on September 14th as saying that he didn’t see any danger in the “Spanish Flu”. (Harrisburg Telegraph, 1918)

Three days later, on September 17th, the first cases of the flu in the city were reported. Krusen took no quarantine measures or other social precautions, and furthermore, he ignored pleas from the local medical community to cancel the September 28th parade through the city. Over one hundred thousand people witnessed and participated in the parade that day. (Hatchett, Mecher, & Lipsitch, 2007)

Within seventy-two hours, Philadelphia’s hospitals were overrun. As the disease spread, essential services collapsed. Nearly 500 policemen failed to report for duty. Firemen, garbage collectors, and city administrators fell ill. [1] But it wasn’t until October 3rd that the city finally closed schools, banned public gatherings, and took other citywide measures to suppress the epidemic. (Hatchett, Mecher, & Lipsitch, 2007)

Krusen’s fourteen-day delay between the first reported cases in the city and his decision to finally impose a quarantine played a major role in the deaths of more than 12,162 people from influenza and influenza-related complications between October 1 and November 2, 1918. (Dunnington, 2017)

One of the findings Thomas Garrett noted in his 2008 study of the 1918 Influenza Epidemic was that healthcare becomes irrelevant if there are no plans in place to ensure that a pandemic does not incapacitate the healthcare system, as it did in 1918. (Garrett, 2008)

 

St Louis

In St Louis, Dr. Max Starkloff, the St. Louis health commissioner, planned ahead.

Instead of waiting for the virus to start, Starkloff started. 

First, he changed his thinking from IF to WHEN. By September 20th, Starkloff had already published a list of social “Don’ts” regarding behavior that could spread the “epidemic of influenza” that was happening in the east. (Evening Missourian, 1918)

He also alerted the local medical community to be prepared, and with their help set up a network of volunteer nurses to treat residents in their homes when the hospitals ran out of space. Members of the Red Cross Motor Ambulance Corps were diverted from various camps to help transport civilian patients to the hospital and to deliver broth and food to those influenza patients being treated in their homes. (St Louis Post Dispatch, 2014)

Starkloff was ready.

When the first cases of the influenza epidemic were reported in St. Louis on October 5th, he and his staff moved rapidly. Two days later, on October 7th, they closed schools, theaters, playgrounds, and other public places. Churches and taverns were quickly added to that list, and attendance at funerals was restricted. Streetcars were limited to seated passengers only – the usual crowds of standing riders were forbidden. (St Louis Post Dispatch, 2014)

These restrictions were enforced too. Despite significant pushback from local religious leaders and business owners who complained about the “draconian” measures and predicted dire economic consequences because of the closings, Starkloff and Mayor Henry Kiel remained firm. (St Louis Post Dispatch, 2014)

 

One Reason

What’s so staggering is the contrast in the mortality figures for these two cities. Philadelphia experienced 12,162 (or more) deaths in one month; St. Louis experienced 1,703 deaths over four months – the lowest mortality rate among the nation’s largest cities. (Hatchett, Mecher, & Lipsitch, 2007)
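The contrast can be made concrete with a rough per-month calculation using only the death counts and durations stated above. (This deliberately ignores the two cities’ different populations – Philadelphia’s was roughly twice St. Louis’s – so the true per-capita gap is smaller, though still enormous.)

```python
# Comparing the two cities on a deaths-per-month basis,
# using only the figures stated in the article.
phl_deaths, phl_months = 12_162, 1   # Philadelphia: one month (Oct 1 - Nov 2)
stl_deaths, stl_months = 1_703, 4    # St. Louis: spread over four months

phl_per_month = phl_deaths / phl_months
stl_per_month = stl_deaths / stl_months

print(f"Philadelphia: {phl_per_month:,.0f} deaths/month")
print(f"St. Louis:   {stl_per_month:,.0f} deaths/month")
print(f"ratio: roughly {phl_per_month / stl_per_month:.0f}x")
```

Philadelphia lost over 12,000 people in a single month; St. Louis averaged around 426 deaths per month – a gap of nearly thirty-fold in raw monthly deaths.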

The one and only reason: In St. Louis, an intervention plan was in place and ready to go when the first cases were reported.

 

History is NOW

In a recent Washington Post interview, a frustrated ER nurse at Sinai Grace Hospital in Detroit, Michigan, Mikaela Sakal, described their struggle against the coronavirus:

“Nobody prepared us for this, because this didn’t exist. These aren’t the kinds of scenarios you go over in training. Where do you put 26 critical patients when you only have 12 rooms? How many stretchers fit into a hallway?” (Saslow, 2020)

 

The Covid-19 Pandemic is filled with healthcare anecdotes like this one.

While all kinds of “plans” supposedly had been designed to deal with a pandemic, there were no complete readiness plans that had been designed, practiced, corrected, approved, and waiting to be implemented. Even stockpiles of such common essentials as facemasks, hazmat suits, and ventilators – or the machines to make them – were wholly inadequate to handle the demands of this disaster.

And when a pandemic like 1918’s DID hit in 2020, what occurred were panicky, uncoordinated, reactionary moves, with no contingency plans for implementation problems, the long-term effects of each measure, or the gap between necessary and available resources.

“Nothing went by the book,” Ms. Sakal explained angrily. “Every night, we had to come into work and rewrite the rules.” (Saslow, 2020)

When there is no vaccine, “readiness” becomes the major factor in a government’s ability to protect its citizens from a pandemic. Faced with spiraling mortality rates across vast populations over a short period of time, a “virus war” requires preparation: the ready availability of healthcare workers, hospital beds, and huge stockpiles of medical equipment to meet the massive demand.

More importantly, as the Philadelphia and St. Louis examples demonstrated, a national readiness plan uniting all of these elements is necessary.

Otherwise, healthcare becomes irrelevant, and 675,000 Americans could die.

Food for thought.

Now, read part 3 here: Lessons from the 1918 Influenza Epidemic: Part 3 – Leadership – ‘Vigilance, Predetermined Countermeasures, and Planning’ 
