The US Civil War was the first modern war in which the productive capacities of the industrial state were fully integrated into the war effort. This had a significant impact on the capacity to kill and injure the enemy. Here, Richard Bluttal starts a three-part series on the impacts of trauma during wars by looking at the American Civil War.

Clara Barton, a nurse and founder of the American Red Cross, in the 1860s.

In early May 1864, Lieutenant General Ulysses S. Grant (1822-85) launched his Overland campaign, in which his Army of the Potomac clashed with Robert E. Lee’s Army of Northern Virginia in a series of battles in Virginia. Lt. J. E. Mallet of the Union army distinctly remembered the sensations he experienced upon being hit: “I imagined that a cannonball had struck me on the left hipbone, that it took a downward course, tearing the intestines in its course, and lodged against the marrow of the right thigh bone. I fancied I saw sparks of fire, and curtains of cobwebs wet with dew, sparkling in the sun. I heard a monstrous roar of distant cataracts. I felt my teeth chatter, a rush of blood to my eyes, ears, nose and to the ends of my fingers and toes. I tried to get up, fell, and became completely insensible.” He described his wait on the battlefield (at least a day) and the journey to the hospital transport ship quite matter-of-factly. Men in his regiment suffered similar wounds: the heavy, soft, unjacketed lead bullet flattened out on impact, producing severe wounds and carrying pieces of clothing into them.

The number of combat engagements during the American Civil War was the largest in history to that time, and exponential increases in the killing power of weapons produced rates of casualties beyond the imagination of military medical planners. In a four-year period, 2,196 combat engagements were fought, in which 620,000 men perished—360,000 in the Union Army and 260,000 in the Confederate Army. Some 67,000 Union soldiers were killed outright, 43,000 died of wounds, and 130,000 were disfigured for life, often with missing limbs; 94,000 Confederate soldiers died of wounds. Twice as many soldiers died of disease during the war as died in combat. During the 1860s, doctors had yet to develop bacteriology and were generally ignorant of the causes of disease. Generally, Civil War doctors underwent two years of medical school, though some pursued more education. Medicine in the United States was woefully behind that of Europe: Harvard Medical School did not even own a single stethoscope or microscope until after the war. Most Civil War surgeons had never treated a gunshot wound and many had never performed surgery. Medical boards admitted many "quacks" with little to no qualification. Yet, for the most part, the Civil War doctor (as understaffed, underqualified, and under-supplied as he was) did the best he could, muddling through the so-called "medical Middle Ages." Some 10,000 surgeons served in the Union army and about 4,000 served in the Confederate army. Medicine made significant gains during the war. However, it was the tragedy of the era that medical knowledge of the 1860s had not yet encompassed the use of sterile dressings, antiseptic surgery, and the recognition of the importance of sanitation and hygiene. As a result, thousands died from diseases such as typhoid or dysentery.

Why did so many have to die? The deadliest thing that faced the Civil War soldier was disease. For every soldier who died in battle, two died of disease. In particular, intestinal complaints such as dysentery and diarrhea claimed many lives. In fact, diarrhea and dysentery alone claimed more men than did battle wounds. The Civil War soldier also faced outbreaks of measles, smallpox, malaria, pneumonia, and camp itch. Soldiers were exposed to malaria when camping in damp areas that were conducive to breeding mosquitoes, while camp itch was caused by insects or a skin disease. In brief, the high incidence of disease was caused by inadequate physical examination of recruits, ignorance, the rural origin of soldiers, neglect of camp hygiene, insects and vermin, exposure, lack of clothing and shoes, and poor food and water.

The germ theory, which holds that microscopic bacteria and viruses cause disease, was not yet understood (Sohn). These pathogenic microorganisms thrived in filthy environments, and the conditions soldiers lived in were horrendous. Because of water shortages in camps, items were rarely cleaned, including medical tools. If scalpels or forceps were dropped on the ground, they were "only washed in tap water," according to one Civil War surgeon (Ledoux). Between operations, tools were not sterilized. Doctors rarely washed their hands, and even less often were their garments cleaned. No one yet knew why these post-surgery infections took place, nor how to prevent them.

Organization of Battlefield Medical Care

On July 16, 1861, Clara Barton watched more than 30,000 “noble, gallant, [and] handsome” Federal soldiers, “armed to the teeth,” march out of Washington to confront a Confederate army near Manassas Junction in what would become the First Battle of Bull Run, the war’s first major engagement. Many around the country, soldiers and citizens alike, naively expected a short conflict with no need to prepare for large numbers of wounded. In the days before the battle, as the US Army neared contact with the enemy, the army’s medical department made no preparation to set up hospital sites until after the fighting began. No permanent military hospital sites had been established in the city. Instead, sick soldiers were languishing in abandoned warehouses, churches, schools, and other public buildings. That meant Washington’s “hospitals” were already overflowing when the army left for battle; there was simply no space for more patients in these makeshift facilities. A significant number of Union wounded were left on the battlefield because the medical department did not have authority over most of the ambulances. The medical disaster at Bull Run in July 1861 convinced Clara Barton, ordinary citizens, and even the Union medical department to take the medical needs of the US Army in the aftermath of a battle more seriously.

How medical care was delivered on and off the battlefield changed during the war. Early on, stretcher bearers were members of the regimental band, and many fled when the battle started. There was no military ambulance corps in the Union Army until August of 1862; until that time, civilians drove the ambulances. Initially, the ambulance corps was under the Quartermaster Corps, which meant that ambulances were often commandeered to deliver supplies and ammunition to the front.

If a soldier was injured during battle, volunteers took the howling victim behind the front lines using a stretcher made from canvas and wooden poles. From there, a horse-and-buggy-type wagon would cart them to the nearest field hospital. The "stretcher-bearers" would assess the condition of the patient, dividing them into three main categories: mortally wounded, slightly wounded, and surgical cases. They would then assist the patient to the best of their ability in the back of the jostling horse-drawn vehicle. This system of evacuation was devised by Jonathan Letterman, medical director of the Army of the Potomac. His system evacuated the injured more efficiently and paved the way for our modern ambulance system.

Combat Related Injuries

In order to be reported, a soldier had to be either transported to or make it back to a field hospital, and this may have resulted in an underreporting of deaths from cannon fire. Most injuries resulted from the Minié ball, invented by the French officer Claude-Etienne Minié in 1849. The Minié ball was a 0.58-caliber bullet, slow moving and made from soft lead. It flattened on impact and created a wound that grew larger as the bullet moved deeper into tissues, shattering bone above and below the point of impact and usually not exiting the body. Because of its relatively slow muzzle velocity, it carried bits of clothing, skin, and bacteria into the wound. The majority of gunshot wounds occurred in the upper and lower extremities, but the fatality rate from these wounds was low. Only 18% of wounds were to the abdomen, but these were more often fatal because of intestinal perforation in the preantibiotic era.

Non-Combat Related Death and Illness

A variety of factors contributed to a high rate of noncombat-related illness, including overcrowded and filthy camps. Latrines were often not used, were not covered daily, or drained into drinking water supplies. Food quality was poor from several standpoints: it was poorly stored, poorly cooked, and lacked enough vitamin C to prevent scurvy. The Army of the Potomac eventually added a number of rules: camps had to be pitched on new ground and drained by ditches 18 inches deep, tents had to be struck twice a week to sun their floors, cooking had to be done only by company cooks, all refuse had to be burned or buried daily, soldiers had to bathe twice a week and change clothing at least once a week, and latrines had to be 8 feet deep and covered by 6 inches of dirt daily.

There were few useful medications at the time, and about two thirds of all drugs were botanicals. In 1860 Oliver Wendell Holmes stated at the annual meeting of the Massachusetts Medical Society, “I firmly believe that if the whole materia medica, as now used, could be sunk to the bottom of the sea, it would be all the better for mankind,—and all the worse for the fishes”. Medications that were helpful included quinine for malaria, morphine, chloroform, and ether, as well as paregoric. Many others were harmful. Fowler's solution, used to treat fevers, contained arsenic. Calomel (mercurous chloride) was used for diarrhea; mercury is excreted in high concentration in saliva, which led to excessive salivation, loss of teeth, and gangrene of the mouth and cheeks in some patients.

Battlefield Wounded and Surgery

Battlefield surgery was at best archaic. Doctors often took over houses, churches, schools, even barns for hospitals. The field hospital was located near the front lines -- sometimes only a mile behind the lines -- and was marked with a yellow flag with a green "H". Anesthesia's first recorded use was in 1846, and it was commonly in use during the Civil War; in fact, there are 800,000 recorded cases of its use. Chloroform was the most common anesthetic, used in 75% of operations. In a sample of 8,900 uses of anesthesia, only 43 deaths were attributed to the anesthetic, a remarkably low mortality rate of about 0.5%. Anesthesia was usually administered by the open-drop technique: the anesthetic was applied to a cloth held over the patient's mouth and nose and was withdrawn after the patient was unconscious. Surgeons worked all night, with piles of limbs reaching four or five feet. Lack of water and time meant they did not wash their hands or instruments. Bloody fingers often were used as probes. Bloody knives were used as scalpels. Doctors operated in pus-stained coats. Everything about Civil War surgery was septic; the antiseptic era and Lister's pioneering work in medicine were still in the future. Blood poisoning, sepsis, and pyemia (literally, pus in the blood) were common and often deadly, and surgical fevers and gangrene were constant threats. One witness described surgery as such: "Tables about breast high had been erected upon which the screaming victims were having legs and arms cut off. The surgeons and their assistants, stripped to the waist and bespattered with blood, stood around, some holding the poor fellows while others, armed with long, bloody knives and saws, cut and sawed away with frightful rapidity, throwing the mangled limbs on a pile nearby as soon as removed." If a soldier survived the table, he faced awful surgical fever. However, about 75% of amputees did survive.

Amputation was the most successful method used to halt the spread of deadly infections, like gangrene, caused by battle wounds during the Civil War. Contrary to popular belief, the process was not as barbaric as it seemed; it was efficient and effective. After a soldier was injured on the battlefield, he was immediately bandaged by medical volunteers. He was shuttled to either the nearest field hospital or a medical tent at a camp using the new ambulance system. On the way, the wounded soldier was given whiskey to ease his shock. Once the patient, still in great distress, was set on an "operating table," a chloroform-soaked cloth was held onto his nose and mouth. Tourniquets were tightly secured above the amputation area to prevent the patient from bleeding out. A long, though often dull, blade was used to sever tissue and ligaments, then a serrated saw was used to cut through the bone. An experienced field surgeon could perform an amputation in under ten minutes.

Medical advances and improvements leading up to World War I

The contributions to medical care that developed during the Civil War have not been fully appreciated, probably because the quality of care administered was compared against modern standards rather than the standards of the time. The specific accomplishments that constituted major advances were as follows.

1. Accumulation of adequate records and detailed reports for the first time permitted a complete military medical history. This led to the publication of the Medical and Surgical History of the War of the Rebellion, which was identified in Europe as the first major academic accomplishment by US medicine.

2. Development of a system of managing mass casualties, including aid stations, field hospitals, and general hospitals, set the pattern for management of the wounded in World War I, World War II, and the Korean War.

3. The pavilion-style general hospitals, which were well ventilated and clean, were copied in the design of large civilian hospitals over the next 75 years.

4. The importance of immediate, definitive treatment of wounds and fractures was demonstrated, and it was shown that major operative procedures, such as amputation, were optimally carried out in the first 24 hours after wounding.

5. The importance of sanitation and hygiene in preventing infection, disease, and death among the troops in the field was demonstrated.

6. Female nurses were introduced to hospital care and Catholic orders entered the hospital business.

7. The experience and training of thousands of physicians were upgraded, and they were introduced to new ideas and standards of care. These included familiarity with prevention and treatment of infectious disease, with anesthetic agents, and with surgical principles that rapidly advanced the overall quality of American medical practice.

8. The Sanitary Commission was formed, a civilian-organized soldier's relief society that set the pattern for the development of the American Red Cross.

In August of 1862, a physician named Jonathan Letterman set up the first ambulance system in the Union’s Army of the Potomac. With the support of Surgeon General William Hammond, he instituted a three-step system for evacuating soldiers from the battlefield and established the Ambulance Corps. The wounded soldiers’ first stop was a field dressing station, where tourniquets were applied and wounds were dressed. Then they moved to a field hospital, where doctors performed emergency medical procedures. Finally, ambulances would transport patients to a large hospital far from the battlefield for long-term treatment. The U.S. military uses the same basic system today.

What do you think of trauma during the US Civil War? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.

Weather has played a key role in shaping the progress of the world and its societies. Here, Kayla Vickery looks at how weather conditions shaped the 1789 French Revolution. She considers the Little Ice Age, the Laki volcano eruption, the poor weather of 1788, and the Great Hailstorm of Paris.

The Storming of the Bastille by Jean-Baptiste Lallemand, 1790.

Introduction

Historians have long debated the causes of the French Revolution. There have been falsehoods about King Louis XVI and Marie Antoinette (“Let them eat cake,” anyone?), finger-pointing at various players, and many theories. But ultimately, they all want to know the same thing: how did the Bourbon dynasty fall? While many events over the eighteenth century created a domino effect against the monarchy, did you know that extreme weather contributed greatly to France’s economic struggles? Over a few decades, several different weather events left the country in trouble and would eventually inspire the lower classes to rise up and overthrow the monarchy. From the Laki volcano eruption to the drought and harsh winter of 1788, these natural occurrences all had real consequences. While there were several ideological changes, the events of 1789 would not have been as severe were it not for the weather and the havoc it wreaked on the 18th-century French economy.

Little Ice Age

First was the Little Ice Age, a period of the Earth's cooling. Way less cool than the animated film Ice Age starring one of my favorite characters, Sid the Sloth; it is generally accepted that it started during the Middle Ages and ended sometime in the mid-nineteenth century. Researchers have also observed three periods of freezing weather, one of which occurred in 1770, a mere twenty years before the beginning of the French Revolution. Temperatures dropped globally at this time by as much as 4 degrees, and Europe was significantly affected. Like my moods, the weather was unpredictable and often swung from one extreme to another. This began to affect the crop yield and livelihood of the people of France. The peasants of 18th century France depended highly on good crop output to, ya know, eat. In the late 1700s, there was a significant income disparity between the classes. As the crops began to fail, the price of food began to rise, leaving workers with extraordinarily little extra income. The Little Ice Age and its negative impact on crops began to cause hunger throughout the country. At this same time, there was a huge population boom across France: growing from 22.5 million inhabitants in 1715 to 28.5 million in 1789 meant a growth of about 25 percent! History has shown that when people are cold, the need for warmth leads to a cuddle, which leads to how babies are made. As the number of people grew, so did the demand for goods, a demand that often could not be met because of the Little Ice Age. As the government became more entrenched in debt, it continued to raise the taxes paid by the lower classes. Nobles and the clergy were excluded from paying taxes (eat the rich!), so the government's debt came to rest on the shoulders of those most abused from above. The social order of France made it so that even the tiniest shortcomings would be detrimental to the lives of much of the population.

Laki Volcano Eruption

In 1775, after a poor grain harvest in northern France, people began to let their unhappiness show the only way they knew how… with good ol' fashioned riots! The people's anger was directed at the wealthy landowners and even managed to reach Versailles. This peasant uprising became known as the Flour War, a straightforward title because there was no time for cleverness when starting a Revolution! The uprising was squashed in a few weeks, but the damage had been done. The peasants of France had seen the power behind a widespread protest and knew what kneaded (I know what I did) to be done.

The Little Ice Age and its powerful effect on the crops and, inherently, the people growing them would be just one of many weather occurrences that would ignite the people of France to revolt. From 1783 to 1784, the Laki volcano continuously erupted in Iceland, sending ash across Europe. The ash blocked the sun, darkening the skies, lowering the temperature, and thoroughly convincing people they were living in actual Hell. With ships unable to move because of the fog from the ash, and weather patterns disrupted, the food crisis became even more severe. To understand the impact of poor harvests, one must realize how little the Third Estate had in the 1700s. Even though they made up 98 percent of the population, they were people with limited economic means who struggled even to reach subsistence. They were also forced to pay exorbitant taxes to the King and maintain their feudal obligations to their landlords. And they held no judicial power, meaning they could do nothing about the unfair circumstances forced onto them by the King. With such a decrease in their livelihoods, peasants cut back on spending, which hurt the economy even more. I like to point to this moment when trying to prove that my shopping addiction is, in fact, good for the economy!

The Great Hailstorm of Paris

In an economy already broken and struggling to recover after years of war and failing markets, the weather of 1788 would push the people of France over the edge. The spring of 1788 was a disaster for the planting season in France. After an abnormally dry spring that dramatically affected the crops of the already starving people, there came a summer of extreme temperatures and random downpours. The vast majority of the population was severely malnourished and was now pushed into yet another famine after a period of economic slump and hunger.

One such event, the Great Hailstorm of Paris, was a ferocious storm that ripped through the countryside, wreaking havoc on July 13, 1788. The destitution the storm caused would go on to infuriate the starving citizens of France, and the devastation of the crops would have dire consequences for the French economy. Bread prices would continue to soar, and citizens would find their incomes significantly lowered. As if they hadn't been through enough, the conditions of the lower classes after the Great Hailstorm of Paris would only worsen with the extreme winter ahead. The winter of 1788/1789 would be one of the coldest on record. During this harsh winter, Emmanuel Sieyès published his political pamphlet What Is the Third Estate?, an essay that attacked the privilege of the nobility and gave words to the struggle of the lower classes. Think Hamilton but with less rapping. With What Is the Third Estate?, the common people of France finally had a physical manifestation of their resentments against the other two estates. By April of 1789, the people of France were rioting regularly over the rising price of bread. The economy's downfall and the crops' failure over several years would push them over the edge and into Revolution. The mood in Paris before the fall of the Bastille was one of anger and desperation. There are many firsthand accounts of the rowdiness of the crowds in Paris who were rioting and demanding answers for the skyrocketing bread costs. Eventually, the hungry and abused crowds would march on the Bastille and overtake the prison, and the French government learned the very valuable lesson of never coming between the French population and a baguette.

Conclusion

After decades of unheard-of weather patterns working against their livelihoods and without help from their King, the resentment of the poor in France would eventually rise and change the course of history. The Little Ice Age, the Laki volcano, and the severe drought and winter of 1788 would lead to the uprising of the peasants, the fall of the Bastille, the abolition of the feudal system, and eventually, the heads of the French monarchy.

 What role do you think that weather played in the French Revolution? Let us know below.

Now, if you enjoy the site and want to help us out a little, click here.

References

“Dispatches from Paris (April-July 1789),” in The Old Regime and the French Revolution, ed. Keith Michael Baker (Chicago: University of Chicago Press, 1987)

Jessene, Jean-Pierre. The Social and Economic Crisis in France at the End of the Ancien Régime, 1st ed., 2013 Blackwell Publishing Ltd. Published 2013 by Blackwell Publishing Ltd.

Lanchester, John. “How the Little Ice Age Changed History.” The New Yorker, March 2019, https://www.newyorker.com/magazine/2019/04/01/how-the-little-ice-age-changed-history.

Loyseau, Charles "A Treatise on Orders," in The Old Regime and the French Revolution, ed. Keith Michael Baker (Chicago: University of Chicago Press, 1987)

McWilliams, Brendan, ‘The Fall of the Bastille’, The Irish Times, Jul 13, 1998, https://www.irishtimes.com/news/the-fall-of-the-bastille-1.172547

Neumann, J. and Dettwiller, J. “Great Historical Events that were Significantly Affected by the Weather: Part 9, the Year Leading to the Revolution of 1789 in France (II).” Bulletin of the American Meteorological Society

Popkin, Jeremy D, A New World Begins: The History of the French Revolution (New York: Basic Books, 2019)

Sieyes, Emmanuel-Joseph, "What is the Third Estate?" in The Old Regime and the French Revolution, ed. Keith Michael Baker (Chicago: University of Chicago Press, 1987)

Waldinger, Maria, Drought and the French Revolution: The effects of adverse weather conditions on peasant revolts in 1789, (2013)

Privateers were privately owned vessels, and the people who sailed them, authorized by governments to attack and plunder enemy ships. For many centuries, privateers played key roles in times of war. Here, Avery Scott looks at the important role that privateers played during the American Revolution.

Captain Luke Ryan. From Hibernian Magazine in May 1782.

From the beginning of time, rivers, lakes, and the sea all formed highways that allowed men, supplies, goods, and money to flow from one location to another. In times of war, these superhighways became significantly more important, as those who controlled the sea controlled the supply lines. Because of this, countries have always battled for control of the sea lanes linking them to others. However, the maintenance of a full navy is expensive, difficult, and, at times, impossible. The difficulty of maintaining a navy led to the rise of privateering commissions, or letters of marque. Letters of marque have played key roles in many periods of history, particularly from the 1500s to the golden age of piracy (1650-1730). Men like Sir Francis Drake, Henry Morgan, and William Kidd all obtained letters of marque that gave them the right to pillage and plunder enemy vessels during times of war and to act as a proxy for the nation's navy. Prior to the golden age of piracy, privateers were commonplace, at times replacing a country's navy. However, diplomatic relationships changed around the golden age of piracy, and fewer commissions were awarded. This led formerly government-sanctioned privateers to begin acting as pirates. Although the number of privateer commissions decreased, their operations did not disappear entirely; they would continue to appear in times of war whenever a particular government found their dastardly services advantageous. And, fortunately for privateers, the American Revolution would lead to the commissioning of hundreds of privateers and untold wealth for those brave enough to risk their lives against the powerful Royal Navy. These privateers, in many cases, are the untold heroes of the American Revolution. While they did not have the direct impact of men like George Washington, John Adams, Benjamin Franklin, the Marquis de Lafayette, and Baron von Steuben, they were a menace to the Crown.

The Black Prince

One such example is the story of Luke Ryan and his ship the Black Prince. Ryan began his career as an Irish smuggler turned privateer for the British. However, Ryan was unable to shake his former habits and returned to Ireland without completing his mission as a privateer; instead, he returned with large amounts of contraband. His vessel, the Friendship, was quickly impounded by customs officials, and the crew was thrown into jail. Ryan, having already vacated the ship, was not among them. He decided that he must break his crew out of prison and quickly sail out of Irish waters. Early one morning, the crew escaped from the prison, stole several smaller boats, and rowed out to the impounded Friendship. They cut the anchor cables and confined the guards aboard. After escaping from the Black Dog prison, Ryan renamed his ship the Black Prince and headed for France to obtain a privateer's commission from Benjamin Franklin.

Ryan faced a major challenge in obtaining this commission - he was Irish. Not only was Ryan Irish, but so was nearly his entire crew. Franklin could only grant commissions to vessels that had, at minimum, an American captain. However, it was not long before Ryan found Stephen Marchant, an American in France in search of a vessel to command. Ryan felt that Marchant could be easily manipulated into doing whatever Ryan and the other Irish sailors desired - essentially giving Ryan total control of the ship and Marchant nothing more than the appearance of a command. Only Ryan and his crew would be privy to the plan, which ultimately resulted in Franklin granting Marchant the commission. It was not long before the Black Prince became a successful privateer that struck fear into the hearts of British merchant vessels and seafaring towns alike. Soon after, Marchant was informed of Ryan's plan and relieved of the modicum of command he still maintained. Franklin also found out about the situation, but due to the overall success of the cruises he was unconcerned, even sending Ryan a gift to show his gratitude. In addition to his gift, Franklin was so impressed that he commissioned another vessel - the Black Princess. After another successful cruise, Franklin granted yet another commission, this one for a ship named the Fearnot. Again, these cruises were very successful. There were setbacks, but these were minimal compared to the number of successes that each cruise produced. However, Ryan's commissions were eventually recalled when it became apparent that privateers were hurting Franco-American relations, due to a number of factors, not the least being Franklin's leniency in granting commissions. Overall, Ryan's cruises amounted to 114 captured vessels and huge monetary damages from insurance rates, trade interruptions, and an overall distraction to the Royal Navy.

The Lee

Stories such as that of Luke Ryan happened frequently during the revolution, whether the privateers came from France or from American shores. Millions of dollars (billions in today's money) in damages were inflicted, along with damage to the Royal Navy's morale, supplies, and productivity. Another benefit privateers offered was their ability to obtain supplies needed by continental forces. Early in the revolution, in November 1775, the Lee was patrolling near the coast of Massachusetts when it came across a large vessel and decided to board. Captain John Manley sent several of his best sailors on board with concealed weapons, taking the British ship Nancy completely by surprise. The Nancy's Captain Robert Hunter was excited to see the men, assuming they were there to assist the Nancy, as the ship had battled tough seas and was in need of repair. Hunter soon realized that Manley's men were not there to help, but it was too late. Manley's men drew their weapons, and the Nancy's crew had to surrender. The Lee's crew discovered a huge military cargo that was one of the most valuable of the war. The cargo included 2,000 muskets, 7,000 cannonballs, and 30 tons of shot, in addition to many other wartime essentials. Washington was thrilled at the capture, calling it an "instance of divine favor."

Conclusion

These two stories show how privateers played a pivotal role in winning the revolution. They helped to obtain goods needed for fighting the war and harassed the Royal Navy ships and merchantmen coming to resupply British troops. They expanded the size of the navy while reducing the financial burden of supplying a full-time navy. They caused huge financial woes for British citizens, merchants, and the government. Finally, they helped to create a war-weary public that pushed for an end to the Revolutionary War. While often unacknowledged, ignored, or simply forgotten, privateers were true heroes of the Revolutionary War, and victory would likely not have been possible without their brave contributions.

What do you think of the role of privateering in the American Revolution? Let us know below.

Now read Avery’s article on Captain Henry Morgan and the escape from Maracaibo here.

References

Empire of Blue Water by Stephan Talty

Rebels at Sea by Eric Jay Dolin

The Republic of Pirates by Colin Woodard

Throughout history, many people have crossed the seas searching for new lands, a new life, cargo, and leisure, resulting in millions of journeys undertaken by boat. Life at sea was treacherous, and many sailors faced uncertainty as to what their voyage would entail. The many myths and legends of sea monsters, curses and superstitions attempted to offer an explanation for natural events or for things people couldn’t otherwise explain, such as strange sea creatures. Popular culture has romanticised the idea of sea myths and superstitions in a way that belittles the beliefs by which many men and women governed their lives and actions throughout history. Many films and television shows explore plots where crews are cursed for immoral actions at sea, such as Pirates of the Caribbean: The Curse of the Black Pearl. Superstitions surrounding sea travel dictated when to sail – not on Fridays – and who could and couldn’t be aboard the ship, such as priests. Some of these beliefs have religious connotations, such as not starting a voyage on a Friday. The idea of good and bad luck dictated by our actions or decisions persists today, like a lucky pair of socks or a ritual before taking an exam. While the ritual itself does not decide the future, it makes people feel they have control in a situation that has many variables.

Here, Amy Chandler explores the types of superstitions that were believed by sailors and why they became popular throughout history.

A depiction of the Mary Celeste in the 1860s (then known as Amazon).

Origins of superstitious beliefs

Superstitious beliefs or behaviours “arise through the incorrect assignment of cause and effect”. (1) In order to explain a strange event or action, an individual assigns what they think is a logical reason based on their current knowledge. (1) During the sixteenth and seventeenth centuries, there was a rise in belief in witchcraft and in ideas of what a witch was and how they behaved. It is not difficult to understand how small communities became susceptible to the idea of witchcraft, voodoo and individuals possessing ideas, knowledge and power beyond what is humanly possible. Many sailing families passed down stories and superstitions from generation to generation. These types of beliefs set the scene for other ideas of mermaids, krakens and sea monsters to thrive in the imagination. These stories and beliefs thrive on fear.

The fear of the unknown was common amongst sailors: with vast areas of the world undiscovered and undocumented, little was understood. Superstitions observed aboard ship and before setting sail helped many crewmembers feel in control of an uncontrollable situation. It is naval tradition to bless the boat and its crew before sailing; for example, in Britain it is customary to bless a ship by breaking a bottle of wine or champagne. (2) Life was difficult, with long periods away from land, poor hygiene and sanitary conditions, rife disease, and an inadequate diet.

Furthermore, boats throughout history have changed dramatically, from straight sterns to curved ones. (3) Oak was the most common wood for battleships in the eighteenth century because of its strong structure, while ships are now made of steel and reinforced to ensure a safer voyage, especially with the increase of cruise ships with many floors, guests, facilities and weight. The change in sailing vessels and the growth of scientific and technological knowledge have reduced the risks at sea, yet sea superstitions and myths are still prevalent in popular culture and society.

Influence of literature on myths

Literature like Homer’s The Odyssey features figures from Greek mythology such as Scylla and Charybdis, Circe, and other monsters that commonly lure men to their deaths in myth and popular culture. While these figures featured in Odysseus’ journey are fictional, they suggested an explanation as to why ill-fated events happened through an individual's actions. Similarly, in Samuel Coleridge’s poem, The Rime of the Ancient Mariner, the Mariner shoots down the albatross and dooms the voyage.

A famous example of a ship abandoned with no trace of its passengers is the Mary Celeste, discovered abandoned near the Azores, Portugal, on December 5, 1872. The boat had gone through a name change, major structural changes and several owners before this voyage. Originally called the Amazon, the vessel suffered many damages and mishaps and was sold in 1868 to Richard W. Haines, who renamed the ship. Sold again after major refurbishments, the ship set sail on November 7, 1872 from New York City to Genoa, Italy with 1,700 barrels of alcohol. The ship’s log recorded two weeks of bad weather, and when the ship was spotted by British ships near Portugal, they realised the boat was abandoned with no crew or captain, and the longboat was missing. (4) On closer inspection, the boat had taken on water but was still sailable. The crew and captain were never found, and what happened is only speculation. This incident is an example of an ‘ill-fated’ voyage in which the ship experienced bad weather and had had its name changed. While it is logical to assume the crew left the boat to avoid sinking and used the longboat to escape, the mystery still encouraged superstitious thought. Arthur Conan Doyle was inspired by the story and anonymously published J. Habakuk Jephson’s Statement in The Cornhill Magazine in January 1884. (5) The story became popular, and the press thought it was a real account from a survivor of the Mary Celeste.

In some incidents, the sensation of curses and myths becoming a reality was fuelled by the popular press, which thrived on creating reports that caused a stir. During the famous Titanic’s voyage in 1912, the ship was also said to be transporting an ancient Egyptian artefact, an ornate coffin belonging to the Princess of Amen-Ra. It was not until after the discovery of King Tutankhamun’s tomb in 1922 that the sensational story of an Egyptian curse was widely commented on. The idea of a curse was then applied in retrospect to Ancient Egyptian artefacts, such as the one said to be aboard the Titanic.

Popular superstitions at sea

Many weird and wonderful sailing superstitions exist. Practices said to bring good luck included having tattoos of lucky animals such as pigs or roosters, stepping onto the boat with the right foot, keeping cats aboard as good omens, and carrying salt in one's pockets, to name just a few. Similarly, it was bad luck to have bananas on board, to have red-headed women (or women in general) aboard, to whistle at sea, to lose a hat overboard, or to say goodbye before a voyage. These are just several of the many superstitions sailors believed throughout history.

The Nottingham Evening Post of February 7, 1931 reported that the crew of a British trading ship sailing near New Zealand began throwing cargo overboard because they “believed it was bringing them bad luck”. (6) This report is one of the few recorded events of maritime superstition impacting the day-to-day workings of a ship. The report continues that sailors would usually indulge in their superstitions in private “without advertising it to the outer world”, suggesting that people who are not sailors, and are not exposed to the myths and legends, do not understand why sailors behaved in certain ways.

Similarly, a report in the Fleetwood Chronicle in 1911 suggested that many people were aware of sailing superstitions, such as “the ill-luck which is said to belong to the ship whose name has been changed” and the “belief [that] prevails among seafaring men that the vessel whose name ends in ‘A’ rests, also, under an evil spell”. (7) The report goes on to give examples of incidents where a ship with a name ending in ‘A’ sank or suffered a disaster. This offered a clear reason, albeit an incorrect one, for why a ship met with disaster, such as HMS Victoria, which sank in the Mediterranean in 1893, killing 358 crewmembers. While that incident has a logical and reasonable explanation, a collision with another ship, it fuelled the idea that some names, places or objects are ‘cursed’ or ‘ill-fated’.

Another report, in the Westerham Herald in 1917, recounts that while aboard a ship travelling from Messina to Malta a passenger noticed the ship’s captain, an experienced sailor, “standing at the bow, muttering and pointing with his finger”. (8) The captain was supposedly breaking the force of the waves by making a ‘cross’ shape with his fingers and speaking a prayer. When asked, the captain replied that every ninth wave was dangerous and fatal to the ship’s safety, and the passenger said it was “strange to say, every ninth wave was much greater than any of the others, and threatened the ship with immediate destruction”. (8) Interestingly, when the ninth wave approached the ship and the captain signed a ‘cross’, the wave began to break and the danger was avoided. The report continues that “Arab sailors believe[d] that the high seas off the coast of Abyssinia [were] enchanted” and whenever they sailed through these waters they would “recite verses which they suppose have a tendency to subdue them”. (8) This report emphasises that superstitions, and the actions thought to cause good or bad luck, differ around the world according to cultural beliefs. Despite this passenger not having a logical reason why the waves were breaking, seeing the captain perform a ritual that coincided with the danger subsiding provided evidence to those who witnessed it that these superstitions and rituals worked. Such experiences were told to others and reported, which reinforced these actions and continued to support the idea of good and bad luck.

Science and superstition

Scientifically, there is a logical reason why sailors claim to see floating ghost ships. Objects that appear to float above the horizon or appear distorted are due to the Fata Morgana, a type of mirage. This mirage was named after Morgan le Fay of Arthurian legend; such mirages were believed to occur over the Strait of Messina, where fairy castles appeared to float in the air and false lands were created by witchcraft to lure sailors to their deaths. (9)

Scientifically, mirages and superior mirages are created by atmospheric refraction, in which light bends as it passes through layers of air of varying temperature and density, creating distorted images in which a distant object appears elongated, higher, and ghostly. A Fata Morgana is a superior mirage that is more complex. The mirage is created below the original object and distorts the object to the point of being unrecognisable. These mirages are not limited to the sea but can be seen on land and in polar or desert terrain, and use any distant objects, like boats, coasts or islands, to create an image. The images also change rapidly and stack on top of each other, creating an imposing picture of elongated and compressed versions of the original object. This combination of temperature change, light refraction and distortion creates the appearance of a ghostly, distorted image that bears little resemblance to the original object. (9) These boats appear to sit in the waves, on top of the water or parallel with the original object in a ghostly image. Sailors did not understand science in the same way we do now, and seeing such a phenomenon would prove to many men that the stories and legends were true.

Conclusion

Sailor superstitions are engrained in how the public view the profession historically, and many tourist seaside towns play on the idea of pirates in smugglers’ coves, strange stories of ghost ships, and noises that draw visitors into the local history of the area. Real events and fiction become blurred and embellished over time, especially as there are no sailors from history alive to contradict what they may have seen or heard. It is difficult to know what is fact or fiction when reading historical accounts. Scientifically, we can attempt to explain why these stories developed and demystify the myths and legends by studying mirages and weather conditions that alter the senses. However, a large proportion of the sea has not been explored, and it is possible that strange creatures, evolved over time, do exist in its depths. One of the many allures and fears of the sea is the unknown and what could be lurking beneath the surface. This unknown is what fiction and film thrive on and use to create imaginative situations and worlds that bear some resemblance to real events caused by nature, such as deep caverns and whirlpools.

What do you think of superstitions at sea? Let us know below.

Now read Amy’s article on the history of medicine at sea here.

References

(1) K.R. Foster and H. Kokko, ‘The evolution of superstitious and superstition-like behaviour’, Proc Biol Sci, vol. 276 (2009), p. 31.

(2) Royal Museums Greenwich, ‘Ship launching ceremonies’, 2023, Royal Museums Greenwich <https://www.rmg.co.uk/stories/topics/ship-launching-ceremonies> [accessed 24 April 2023].

(3) Royal Museums Greenwich, ‘Shipbuilding: 800–1800’, 2023, Royal Museums Greenwich <https://www.rmg.co.uk/stories/topics/shipbuilding-800-1800> [accessed 24 April 2023].

(4) A. Tikkanen. ‘Mary Celeste’. Encyclopedia Britannica, 2023, <https://www.britannica.com/topic/Mary-Celeste > [accessed 20 April 2023].

(5) The Arthur Conan Doyle Encyclopaedia, ‘J. Habakuk Jephson's Statement’, The Arthur Conan Doyle, 2023, < https://www.arthur-conan-doyle.com/index.php/J._Habakuk_Jephson%27s_Statement >[accessed 19 April 2023].

(6) British Newspaper Archive, ‘Superstition of the sea’, Nottingham Evening Post, (7 Feb 1931).

(7) British Newspaper Archive, ‘Good and Bad luck on the ocean waves’, Fleetwood Chronicle, (11 July 1911)

(8) British Newspaper Archive, ‘Superstitions of the sea’, Westerham Herald, (3 Nov 1917).

(9) SKYbrary, ‘Fata Morgana’, SKYbrary, 2023 <https://www.skybrary.aero/articles/fata-morgana> [accessed 15 April 2023].

Imperialism leads to war, bloodshed, and generations suffering from its consequences; it is rare to see it yield positive outcomes. While imperialism brought little benefit to China, the cabinet of Emperor Meiji brought about drastic changes in Japan that laid the foundations of the advanced nation we know today. The advanced military technologies adopted by Japan were a significant factor in its victory in the First Sino-Japanese War. Was modernization the only smart step toward building a strong country? What did the Tongzhi Restoration lack in comparison to the Meiji Restoration? Disha Mule explains.

If you missed them, you can read Disha’s articles on the First Sino-Japanese War here, on how the war may have led to the collapse of the Qing dynasty here, and on Korea in the 19th century here.

An image of Emperor Meiji in 1873. Photograph by Uchida Kuichi.

The Tokugawa Shogunate

As far back as the twelfth century, Japan was ruled by shoguns, or military generals. The emperor did not exercise much power; the shogun did not need the emperor’s permission to run the administration. The country was divided into numerous domains, each ruled by a daimyo. The stability of this system was disturbed during the fifteenth century, when Japan found itself in a constant state of war that continued for about a hundred years. It was in 1603, with the establishment of the Tokugawa shogunate, that the chaos ended. After its victory at Sekigahara, one of the major problems for the new shogunate was impressing its superiority upon the entire country.

The founder of the shogunate, Tokugawa Ieyasu, had a clever way of keeping the daimyos in check. This system was called sankin kotai. The daimyo had to be present in the shogun’s castle in Edo (now Tokyo) from time to time. In his absence, he had to leave his family there. The main purpose behind keeping these hostages was to ensure there was no possibility of any rebellion against the shogun. This system was made more strict during the time of the third shogun, Tokugawa Iemitsu(1).

However, the shogun’s rule was not entirely unquestioned. The daimyos at Choshu and Satsuma were among the strongest of his opponents. He ensured that these tozama (outsiders) daimyos remained far away from the capital of Edo(2).

And so, the Tokugawa rule continued for over two centuries. This period is also called the Edo period and saw many developments in the economic and educational fields.

The Western World Comes Knocking

As many European powers had started establishing colonies worldwide, the constant threat of invasion from the West loomed over kingdoms in Asia. It was inevitable that the kingdoms of the East, which had remained secluded for centuries, would have to open up. Japan’s isolation policy (sakoku) prevented the entry of foreigners and prohibited Japanese people from leaving the country. The only Western country that had contact with Japan during this period of seclusion was Holland. This changed with the signing of the Treaty of Kanagawa.

Japan was coerced into signing the Treaty of Kanagawa when Commodore Matthew Perry of the US Navy brought his fleet to Japanese shores. China’s defeat in the First Opium War was an important turning point in Japan’s perception of the West. The Japanese were convinced that the ‘barbarians’ would stop at nothing until they had everything going their way(3). The unequal treaties were calculated plans made by the Western powers to exploit the resources of the other country involved. The Treaty of Nanjing opened up five Chinese ports; the Treaty of Kanagawa was meant to serve a similar purpose. By agreeing to the treaty, the Japanese cleverly appeased the Westerners and, at the same time, got a chance to explore Western advancements.

Unlike the Chinese, the Japanese were already educated about the happenings around the world(4). The Dutch traded with Japan through the port of Nagasaki. They were required to submit reports to the shogun detailing everything that they learned about the world from the ships arriving at Nagasaki(5). The Tokugawa regime had also set up a similar outpost at Pusan, in order to maintain diplomatic relations with Korea(6).

It is also noteworthy that during the 1860s, the Qing dynasty in China was trying to bring back its popularity through the Tongzhi Restoration. The chief driving force behind the movement was the emperor’s mother, Empress Dowager Cixi. But it was not an easy task.

China was home to a diverse population, consisting of the Han, the Mongols, the Manchus, and others. These communities often clashed, making governing them difficult. The improper execution of Confucianism was labeled as the root cause of all the difficulties of the state(7).

The Self-Strengthening Movement in China helped improve the situation to some extent. In 1868, the Burlingame Mission was sent to countries including the US, Britain, and France. In the same year, China and the US signed the Burlingame-Seward Treaty, which reduced the hostility between the states and made traveling less complicated. Despite its intentions of reforming China, the Burlingame Mission could not make much of an impact due to the reluctance of certain pro-Confucian officials(8).

On the contrary, Japan was a more homogenous society. The sense of solidarity was strong amongst the Japanese youth who spoke the same language and belonged to the same culture. This unity proved to be advantageous for the shogunate as these scholars would later become the leaders of their domains and help in smooth administration(9). Japan had also started sending missions to other countries, even before the famous Iwakura Mission during the Meiji period. Traveling became much easier for Japanese citizens, thanks to the Tariff Convention of 1866 which removed the ban on overseas travel(10). A Chinese writer called Wei Yuan had written a book containing details about Western countries. Ironically, it was more popular among the Japanese than the Chinese(11).

Emperor Meiji

The Choshu and Satsuma domains were not particularly on friendly terms, but they shared a strong dislike for the shogunate. The age-old saying ‘The enemy of an enemy is a friend’ seems apt to describe the formation of the Satcho alliance in 1866.

The shogunate went to war with Choshu in 1866, where it had to accept defeat. The following months were marked by numerous rebellions - as many as 106 peasant protests(12). The daimyo at Tosa, another anti-Tokugawa domain, proposed making the administration bicameral. The shogun seemed to come to terms with the idea, provided the Tokugawas would remain the rightful owners of their land(13). However, the Choshu and Satsuma domains were not pleased that the shogun’s family would still, within the new system, hold a considerable amount of power through the lands they possessed. They marched to Kyoto and convinced the crown prince Mutsuhito, who had just ascended the throne, to take power into his own hands. This was the start of a massive civil war between the armies of the Tokugawa shogunate and the imperial loyalists, otherwise known as the Boshin War.

The last shogun, Tokugawa Yoshinobu, resigned on November 9, 1867. This formally marked the end of the shogunate. The power was handed over to the emperor. Mutsuhito was now known as Emperor Meiji.

In 1868, at the coronation ceremony of the new emperor, it was proclaimed that decisions would be taken after consulting the public and that ‘knowledge would be sought from all around the world’(14). Many Western military traditions, like firing a twenty-one-gun salute, soon became a prominent part of Japanese military practice. The emperor himself wore Western clothing but did not entirely give up his Japanese roots(15). It was the emperor's cabinet that was responsible for the rapid changes in society (the emperor was just a boy of 15 when he was crowned). But the emperor was sincerely curious about the developments in the nation. He valued education - both traditional and Western(16). He also encouraged the production of Japanese goods. Sakuma Shozan’s ideology of blending Eastern ethics with Western science is said to have influenced Meiji greatly(17). While major changes kept happening, a mission was sent abroad in 1871 to learn about the West through a closer lens. The Iwakura Mission was a milestone in the process of establishing a distinct identity for the imperial state and nullifying the effects of the unequal treaties.

It is quite interesting that both China and Japan faced similar kinds of crises. Japan systematically tackled them by making the necessary changes that the circumstances called for. The rise of imperialism in Japan overlapped with the decline of the Qing dynasty in China. The Qings, undoubtedly, made a blunder by ignoring the telltale signs of their incompetence, resulting in a rather humiliating defeat in the war with Japan in 1894.

What do you think of the Meiji Restoration? Let us know below.

Now read Disha’s article on the Hitler Youth here.

Bibliography

Gordon, Andrew. A Modern History of Japan: From Tokugawa Times to the Present. Oxford University Press, 2003.

Jansen, Marius B. The Making of Modern Japan. Cambridge, MA: The Belknap Press of Harvard University Press, 2000.

Keene, Donald. Emperor of Japan: Meiji and his World, 1852-1912. Columbia University Press, 2002.

Vogel, Ezra F. China and Japan: Facing History. Cambridge, MA: The Belknap Press of Harvard University Press, 2019.

Wilson, Noell H. “Western Whalers in 1860s’ Hakodate: How the Nantucket of the North Pacific Connected Restoration Era Japan to Global Flows.” Chapter. In The Meiji Restoration: Japan as a Global Nation, edited by Robert Hellyer and Harald Fuess, 40–61. Cambridge: Cambridge University Press, 2020.

References

1 Marius B. Jansen, The Making of Modern Japan, ‘The Tokugawa State’, 56-57.

2 Ezra F. Vogel, China and Japan: Facing History, ‘Trade without Transformative Learning, 838–1862’, 52.

3 Andrew Gordon, A Modern History of Japan: From Tokugawa Times to the Present, ‘The Overthrow of the Tokugawa’, 48-49.

4 Vogel, China and Japan, ‘Responding to Western Challenges and Reopening Relations, 1839–1882’, 67-68.

5 Ibid.

6 Gordon, A Modern History of Japan, ‘The Tokugawa Polity’, 18.

7 Vogel, China and Japan, ‘Responding to Western Challenges and Reopening Relations, 1839-1882’, 69.

8 Ibid, 71.

9 Ibid, 66.

10 Noell H. Wilson, ‘Western Whalers in 1860s’ Hakodate: How the Nantucket of the North Pacific Connected Restoration Era Japan to Global Flows’, in The Meiji Restoration: Japan as a Global Nation, 48-49.

11 Vogel, China and Japan, ‘Responding to Western Challenges and Reopening Relations, 1839–1882’, 67-68.

12 Gordon, A Modern History of Japan, ‘The Overthrow of the Tokugawa’, 57-58.

13 Ibid.

14 Vogel, China and Japan: Facing History, ‘Responding to Western Challenges and Reopening Relations, 1839–1882’, 73.

15 Keene, Emperor of Japan, 'Chapter 23', 214-215.

16 Ibid.

17 Ibid, ‘Chapter 21’, 193.

The Industrial Revolution, which saw many countries move from predominantly farming economies to industrial ones, began in England in approximately 1840. However, Spain did not experience that movement until at least roughly 1880. Janel Miller explains some of the reasons why.

The group who built a tramline from Barcelona to Mataro in mid-19th century Spain.

Multiple Reasons for the Delay

As late as 1855, only about 20 percent of Spain’s land was considered cultivated. The rest had been “blasted by a ruinous system of exploitation.” Around the same time, criminals and beggars were rampant, formal education was not widespread and free speech was not common.

For those reasons and likely others, very few roads had been built by 1855. Further hindering Spain’s ability to transport what goods it did produce, and thus grow its economy, was the fact that its geography is more mountainous than that of every other European country except Switzerland. Spain also had “virtually no” rivers or canals that ships could sail on smoothly.

In the limited instances where roads did exist, it appears that few bridges spanned waterways well into the 1860s, hindering some transportation efforts. In addition, in the 1850s and 1860s, a majority of Spain’s workforce was employed in agriculture-based work. Spain’s coal - the fuel that the United Kingdom (and likely at least several other countries) was burning in large amounts to support the industrial plants being built on their landscapes - was also apparently inferior to that of some of its neighbors.

As one author put it when describing Spain during some of the years discussed here, “the rest of the world had long since awakened to a life of freedom and joined in the race of modern development; Spain was still asleep, drugged with the fumes of prescribed ignorance and dictated intolerance.”

Spain’s Gross Domestic Product

Spain’s Gross Domestic Product (GDP) grew only about 1.1 percent annually from 1850 until 1935. This rate was better than that of countries such as Italy and Britain, whose yearly GDP grew 0.7 percent and 0.8 percent, respectively, during that 85-year span.

However, Spain’s growth was lower than that of France and Germany, each of whose GDP increased 1.6 percent annually from 1850 to 1935.

Efforts to Modernize Hit Roadblock

Various regimes in the 1850s and 1860s enacted several laws to try to modernize Spain’s economy. Some of these laws are discussed below.

One such law is known in English as the Disentailment Law. It allowed land that once belonged to the church, the state and local governments to be seized and sold to the highest bidder.

Another such law, whose name translates into English as the General Railway Act, removed many of the “administrative” difficulties that had previously stood in the way of building railways.

A third law is known in English as the Credit Company Act. It allowed the creation of investment banks similar in scope to those in countries that had already begun the economic modernization process.

However well-intended these laws may have been, Spain experienced a financial crisis from 1864 to 1866 that at least partially hindered that country’s growth.

In Context

Spain’s economy during the 1850s and 1860s, when compared to those of some of its European neighbors, may remind some readers of how Haiti’s economy compares with that of the relatively nearby United States. (Haiti was chosen randomly for the purposes of the comparisons that follow.)

The GDP of the United States is more than $20 trillion, placing it first among all countries, while Haiti’s GDP is approximately $21 billion, making it the poorest (or almost the poorest) country in the Americas per head of population. In addition, while roughly 10.5 percent of U.S. workers are in agricultural, food and related businesses, about 66 percent of Haitian workers are employed in farming.

Determining the appropriate GDP to maintain a decent standard of living and the suitable number of agricultural workers a country should employ is beyond the scope of this blog.

That said, since the 1860s, Spain has gained economic ground against its neighbors, providing hope that in the future, the difference between the lower, middle and upper classes in all countries will become less apparent.

What do you think of Spain’s position in the mid-19th century? Let us know below.

Now read Janel’s article on the role of Brazil in World War 2 here.

References

Britannica.com Editors. Britannica.com. “Industrial Revolution.” https://www.britannica.com/event/Industrial-Revolution. Accessed April 2, 2023.

Tapia FJB and Martinez-Galarraga J. “Inequality and Growth in a Developing Economy: Evidence from Regional Data (Spain 1860-1930).” Social Science History. Volume 44, Number 1. https://www.cambridge.org/core/journals/social-science-history/article/inequality-and-growth-in-a-developing-economy-evidence-from-regional-data-spain-18601930/802599439621953BD012A8797A684DC7. Accessed March 31, 2023.

Delmar A. “The Resources, Production and Social Condition of Spain.” American Philosophical Society. Volume 14, Number 94, Pages 301-343. https://www.jstor.org/stable/981861. Accessed March 21, 2023.

Simpson J. “Economic Development of Spain, 1850-1936.” The Economic History Review. Volume 50, Issue 2, Pages 348-359. https://doi.org/10.1111/1468-0289.00058. Accessed March 21, 2023.

Delmar A. “The Resources, Production and Social Condition of Spain.” American Philosophical Society. Volume 14, Number 94, Pages 301-343. https://www.jstor.org/stable/981861. Accessed March 21, 2023.

Clark G and Jacks D. “Coal and the Industrial Revolution.” European Review of Economic History. Volume 11, Number 1, Pages 39-72. https://www.jstor.org/stable/41378456. Accessed April 18, 2023.

Delmar A. “The Resources, Production and Social Condition of Spain.” American Philosophical Society. Volume 14, Number 94, Pages 301-343. https://www.jstor.org/stable/981861. Accessed March 21, 2023.

Moro A, et al. “A Twin Crisis with Multiple Banks of Issue: Spain in the 1860s.” European Central Bank. No. 1561. Published July 2013. Accessed March 31, 2023.

WorldPopulationReview.com Editors. WorldPopulationReview.com. “GDP Ranked by Country 2023.” https://worldpopulationreview.com/countries/by-gdp. Accessed April 17, 2023.

USDA.gov Editors. USDA.gov. “Ag and Food Sectors and the Economy.” https://www.ers.usda.gov/data-products/ag-and-food-statistics-charting-the-essentials/ag-and-food-sectors-and-the-economy/. Accessed April 17, 2023.

NationsEncyclopedia.com Editors. NationsEncyclopedia.com. “Haiti-Agriculture.” https://www.nationsencyclopedia.com/Americas/Haiti-AGRICULTURE.html. Accessed April 17, 2023.

Simpson J. “Economic Development of Spain, 1850-1936.” The Economic History Review. Volume 50, Issue 2, Pages 348-359. https://doi.org/10.1111/1468-0289.00058. Accessed March 21, 2023.


World War Two was full of very terrible atrocities, foremost among them being the murder of six million Jews during the Holocaust. In this article, Felix Debieux looks at how the sheer number of people murdered during the Holocaust was possible, with a particular focus on the role of the company IBM.

Edwin Black, author of the book IBM and the Holocaust. Source: Juda Engelmayer, available here.

The Convention on the Prevention and Punishment of the Crime of Genocide, better known as the Genocide Convention, represents a landmark in the field of international law. It was the first human rights treaty adopted by the UN General Assembly, and the first legal apparatus used to codify genocide as a crime. Since 1948, it has signified the international community’s commitment to ‘never again’ after the atrocities committed during the Second World War.

Ensuring that genocide is never repeated means providing the crime with a tight, verifiable definition. The treaty has this covered. “Genocide means any of the following acts committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group”:

  • Killing members of a group.

  • Causing serious bodily or mental harm to members of a group.

  • Deliberately inflicting on a group conditions of life calculated to bring about its physical destruction in whole or in part.

  • Imposing measures intended to prevent births within a group.

  • Forcibly transferring children of a group to another group.

A legal framework for genocide, however, has not prevented the murder of countless innocents since the end of the Second World War. From Rwanda to Cambodia, history is littered with appalling episodes of human-inflicted suffering which meet the technical threshold for genocide. Each episode is unique in its origins and execution. Also unique are the experiences of those who have survived genocide, each group having fought for justice with varying degrees of success.

Anyone who has read even a little into the subject of genocide is very likely to have stumbled into the, at times, vociferous debate surrounding the uniqueness of one genocide in particular: the murder of six million Jews during the Holocaust. This article isn’t about to intervene in the debate; a morbid contest of ‘who-suffered-the-most' is neither enlightening nor sensitive to the victims of genocide. It will, however, agree with those who attest to the uniqueness of the Holocaust on one thing: that the sheer number of people murdered would not have been possible were it not for the unprecedented application by the Nazis of advanced industrial, scientific and technological capabilities.

Where did the Nazis obtain these capabilities, the logistical capacity to manage the identification, transportation, ghettoization and extermination of so many? A full answer to this question means looking beyond the Nazi government itself, and considering the partnerships the regime forged with private companies. Indeed, companies implicated in the Holocaust range from Audi and BMW - who maximised the opportunities afforded by slave labour - to Deutsche Bank, who provided loans for the construction of Auschwitz. One company which perhaps contributed more than any other was the US multinational company IBM (International Business Machines Corporation), whose tabulation technology was used to track individuals, monitor their movements, and ultimately facilitate their transportation across a network of prison, labour and extermination camps. IBM technology, quite literally, ensured that the trains to Auschwitz ran on time. How did the company become involved in the Holocaust, how much deniability can it claim, and what does this tell us about corporate complicity in human rights abuses?

IBM’s origins

To understand IBM’s part in the Holocaust, we first need to take a look at the company’s roots in early data processing and the US census. This is not as dull as it might sound. Back in the 1880s, the US Census Bureau employed a young German-American statistician named Herman Hollerith. Hollerith would go on to make a name for himself as a seminal figure in the development of data processing, eventually founding a company that in 1911 was amalgamated to form the Computing-Tabulating-Recording Company (CTR) - renamed in 1924 as IBM. The young statistician’s role in this story is critical.

While working for the US Census Bureau, Hollerith conceived the idea that would make his company rich: machine-readable cards with standardised perforations, each representing individual traits such as nationality, sex, and occupation. When produced in their millions, these punch cards could be counted in the national census and tabulated based on the specific information they contained about each citizen. This innovation promised the US government a quantified snapshot of its population, filterable using demographic characteristics such as sex or occupation. One of the earliest customers for Hollerith’s machines was the US Census Bureau itself, which used them to tabulate the 1890 census.

Fast forward to the 1930s, and IBM had established itself as a major player in the global computing industry with a number of offices across Europe. Chief among them was Dehomag, IBM’s German subsidiary, headed by Chief Executive and enthusiastic Hitler supporter Willy Heidinger. The ability to quantify and analyse entire populations like never before would, naturally, greatly interest a regime hellbent on purifying its citizenry of undesirables. But how did the latest tools and techniques in data processing fall into Nazi hands? For a second time, we find that a national census provided the opportunity for IBM to showcase its technology.

A lucrative partnership

Hitler’s rise to power in 1933 was met with a spectrum of reactions. Where some saw a threat to peace, others quickly grasped at the business opportunities presented by regime change. Among those who sought to capitalise was IBM president Thomas J. Watson, who from the very first days of the Nazi government manoeuvred to form a partnership. Despite widespread international calls to boycott the new regime, Watson inserted himself extremely closely into the management of IBM’s German operation. Indeed, between 1933 and 1939, Watson travelled to Berlin at least twice annually to personally supervise Dehomag’s work. In this period, the Nazi government would become one of IBM’s most important overseas clients.

On April 12, 1933, Dehomag was presented with a huge opportunity to cement the partnership. This was the date on which the Nazis announced plans to conduct a long-delayed national census, a project which would enable identification of Jews, Roma and other minority groups deemed subhuman by the new order. First in line to offer their services was Dehomag, backed at every step by IBM’s US headquarters. Indeed, Watson personally travelled to Germany in October that year, and drastically expanded investment in Dehomag from 400,000 Reichsmarks to a staggering 7,000,000. This injection of capital gave Dehomag the means to purchase land in Berlin, and to start construction of IBM’s first German factory. The scaling up of operations in Germany would prepare IBM to take on a bigger role in Nazi atrocities. Indeed, it was tabulated census data that enabled the Nazis to expand their estimate of 400,000 to 600,000 Jews living in Germany to 2,000,000.

Some part of Watson must have known that his company's partnership with the Third Reich was immoral, if not embarrassing. Tellingly, he took great pains to ensure deniability through his continued insistence on direct verbal instructions to his German staff. Nothing was written down, even in the case of high-value contracts. And yet there was no denying the tight leash with which Watson directed business. For instance, correspondence written in German was translated by the IBM New York office for Watson’s personal comment. In one anecdote, German staff recalled having to wait for Watson’s express permission before they were allowed to paint a corridor. Watson’s tenure as CEO would see IBM’s partnership with the Nazis grow more intimate still.

Business gets intimate

Writing at a time in which multinational corporations are heavily scrutinised in the public eye for any role – no matter how small – in human rights abuses, we might be forgiven for assuming that IBM maintained at least some semblance of distance from the atrocities taking place across Nazi-occupied Europe. The reality, however, is much more disturbing. IBM was the regime’s sole supplier of punch cards and spare parts, and its trainees (or sometimes authorised dealers) were required to be physically present when servicing the tabulation machines – even those located at infamous sites like Dachau. More chilling still, each IBM machine was tailor-made not only to tabulate inputted information, but also to produce the data the Nazis were interested in analysing. There were no universal punch cards, and so IBM’s role in servicing the machines ensured that they continued to operate at maximum efficiency.

To give a sense of how it worked, it might be helpful to describe an application of IBM tabulation technology in action. One set of punch cards, for example, recorded religion, nationality and mother tongue. By creating additional columns and rows for ‘Jew’, ‘Polish language’, ‘Polish nationality’, ‘Berlin’, and ‘fur trade’, the Nazis were able to cross-tabulate at a rate of 25,000 cards per hour to identify precisely how many Berlin furriers were Jews of Polish origin. Train cars, which previously would have taken two weeks to mobilise, could be quickly dispatched in just two days by means of an immense network of IBM punch card machines. This same technology was also put to use in concentration camps. Each camp maintained its own Hollerith-Abteilung (Hollerith Department), assigned with keeping tabs on inmates through the use of IBM's punch cards. The machines were so sophisticated that they were even capable of matching the skills of prisoners with projects that needed slave labour. Chillingly, IBM’s code for a Jewish inmate was “6” and the code it used for gas chamber was “8”.

While Nazi Germany extended its domination across Europe, there is no evidence to suggest that IBM paused at any point to reflect on its role in facilitating industrial-scale murder. On the contrary, each nation that fell to the Nazi war machine was subjected to a census, which relied on the machinery and punch cards supplied by IBM. At the same time as Europe’s Jews were murdered in their millions, IBM decision-makers in New York were gleefully carving up sales territories. Edwin Black, whose 2001 book first brought to light the company’s instrumental role in the Holocaust, warns us not to think of IBM’s partnership with the Nazis as some rogue corporate element operating out of a basement. Far from it. This was a carefully micro-managed alliance spanning twelve years, which generated profit up until the last gasp of Hitler’s monstrous regime.

Legacy: IBM’s reaction and the role of big tech in genocide today

Revisiting his book twenty years later, Edwin Black makes the point that – with or without IBM – there would always have been a Holocaust. ‘Einsatzgruppen murder squads and their militia cohorts would still have heinously murdered East European Jews bullet by bullet in pits, ravines, and isolated clearings in the woods’. The question, however, is whether the Nazis would have been able to annihilate as many victims as they did without the data processing power offered by IBM technology. For Black, the answer to that question is never in doubt. IBM is responsible for facilitating the ‘industrial, high-speed, six-million-person Holocaust, metering ghetto residents out to trains, then carefully scheduling those trains to concentration camps for murder and cremation within hours, thus clearing the way for the next shipment of victims—day and night’. Put another way: without IBM, the death toll of the Holocaust would be measured in the hundreds of thousands, not in the millions.

To date, IBM has never directly denied any of the evidence of its role in the Holocaust. The company has previously insisted that most of its records from Europe were lost or destroyed during the war, and that it has no other information it can share about its operations during that time. It would seem IBM sees little benefit in attempting to refute or downplay its part in the Holocaust. Indeed, in the twenty years since Black published his book, he reminds us that ‘IBM has never requested a correction or denied any facts’. Since 2001, each edition of the book has provided further evidence of the company’s guilt.

Are there any lessons that we can draw from IBM’s role in the Holocaust? Importantly, the company’s facilitation of mass murder is a stark reminder of the power of data in the wrong hands. Indeed, we do not have to look too hard to find examples of authoritarian regimes using data to perpetuate genocide even today. From China's use of facial recognition technology to monitor and persecute its Uighur population, to Myanmar's use of social media to incite violence against Rohingya Muslims, we are bearing witness to new and alarming ways in which data is weaponised to inflict human rights abuses. While we do of course need to be vigilant about the ways in which governments – our own or further afield – might use data, we also need to remain extremely wary of non-governmental actors. Indeed, if IBM’s story shows us anything, it is that large multinational corporations are adept at evading accountability and continuing to function with impunity. Despite the millions that such organisations spend on PR management and glossy marketing campaigns, it is critical that we remain suspicious of what big tech can do to surveil, censor and unduly influence our lives.

What do you think of the role of IBM in the Holocaust? Let us know below.

Now read Felix’s article on Henry Ford’s calamitous utopia in Brazil: Fordlandia Here.

When we think of the Wild West, we usually picture cowboys, rangers, and formidable gangsters who followed their own laws. However, women also left their mark on this piece of American history.

In the 1800s, the way of life in the American West demanded tough character from both men and women. In order to survive and thrive, they had to be cunning, quick-witted, and often merciless. Not to mention skilled at shooting firearms. Men weren’t the only colorful figures of the Wild West. Women proved easily their equal.

It was during this transition period of the Old West that several women established names for themselves, names easily as famous as those of their male counterparts. Little has been written about some of these unheralded women, but each one had a major impact on the journey West and the formation of our nation.

Richard Bluttal explains.

A picture of Calamity Jane, around the year 1880.

Calamity Jane (1856-1903)

Martha “Calamity” Jane Canary was a frontierswoman who earned her nickname after rescuing a military captain caught in a Native American ambush. How did Martha Jane Canary go from an orphaned prostitute to one of the most famous women in the Wild West? In Wyoming, she began to develop the identity that would make her famous as Calamity Jane.

With questionable character, boldness, and the ability to captivate, Calamity Jane was a woman of all trades. Following the military from fort to fort on the frontier, Jane was no stranger to the Wild West.

Far from a blushing rose, Jane’s life story is peppered with wild tales that still inspire filmmakers and writers to this day. She was even known to claim children in her company as her own, only to never be seen with them again.

Calamity Jane, one of the rowdiest and most adventurous women in the Old West, was a frontierswoman and professional scout, known for her friendship with Wild Bill Hickok and her appearances in Buffalo Bill Cody’s Wild West Show.

In 1870, she joined General George Armstrong Custer as a scout at Fort Russell, Wyoming, donning the uniform of a soldier. This was the beginning of Calamity Jane’s habit of dressing like a man. Heading south, the campaign traveled to Arizona in their zest to put Native Americans on reservations. In her own words, Calamity would later say of this time, that she was the most reckless and daring rider and one of the best shots in the West.

Some legends say that she disguised herself as a man to accompany soldiers as a scout on expeditions, including the 1875 expedition of General George Crook against the Lakota. She developed a reputation for hanging out with the miners, railroad workers, and soldiers—enjoying heavy drinking with them. She was arrested, frequently, for drunkenness and disturbing the peace.

In 1877 and 1878, Edward L. Wheeler featured Calamity Jane in his popular Western dime novels, adding to her reputation. She became something of a local legend at this time because of her many eccentricities. Calamity Jane gained admiration when she nursed victims of a smallpox epidemic in 1878, also dressed as a man.

How did Jane get the moniker "Calamity Jane"? Many answers have been offered by historians and storytellers. "Calamity," some say, is what Jane would threaten to any man who bothered her. She also claimed the name was given to her because she was good to have around in a calamity, such as the smallpox epidemic of 1878. Maybe the name was a description of a very hard and tough life. Like much in her life, it's simply not certain.

 

Charley Parkhurst (1812-1879)

Charley Parkhurst was a legendary driver of six-horse stagecoaches during California’s Gold Rush — the “best whip in California,” by one account.

Times were rough for ladies in the Wild West, so this crackerjack stagecoach driver decided to live most of her life as a man. Born in 1812, Parkhurst lived well into her sixties, in spite of being a hard-drinking, tobacco-chewing, fearless, one-eyed brute. She drove stages for Wells Fargo and the California Stage Company, not an easy or particularly safe career. The job was treacherous and not for the faint of heart — pulling cargos of gold over tight mountain passes and open desert, at constant peril from rattlesnakes and desperadoes — but Parkhurst had the makeup for it: “short and stocky,” a whiskey drinker, cigar smoker and tobacco chewer who wore a black eyepatch after being kicked in the left eye by a horse. In California, she quickly became known for her ability to move passengers and gold safely over important routes between gold-mining outposts and major towns like San Francisco or Sacramento. “Only a rare breed of men (and women),” wrote the historian Ed Sams in his 2014 book “The Real Mountain Charley,” “could be depended upon to ignore the gold fever of the 1850s and hold down a steady job of grueling travel over narrow one-way dirt roads that swerved around mountain curves, plummeting into deep canyons and often forded swollen, icy streams.”

The legend really took off after her death, when the coroner discovered that Charley was a woman, who had been named Charlene and had once given birth. She had pulled off one of the most remarkable hoaxes on record. It was an amazing story and much talked about in California, where her exploits driving four-in-hand or six-in-hand teams were common knowledge and where so many in the livery business had personal recollections of her daring coolness in times of danger.

Using her secret identity, Parkhurst was a registered voter and may have been the first American woman to cast a ballot. She lived out the rest of her life raising cattle and chickens until her death in 1879. It was then that her true identity was revealed, much to the surprise of her friends.

Narcissa Whitman (1808-1847)

Narcissa Whitman was one of the first white women to cross the North American continent overland, on her way to serve as a missionary to the Cayuse Nation in present-day Washington. She and her husband Marcus helped facilitate the colonization of the Oregon Country via the Oregon Trail before ultimately being killed during an attack on the mission site in 1847. A pioneer and missionary in Oregon Country, Narcissa traveled some 3,000 miles from her home in upstate New York, and in 1836 became the first white woman to cross the Rocky Mountains, on her way to found the Whitman Mission among the Cayuse Indians near modern-day Walla Walla, Washington. She became one of the best-known figures of the 19th century through her diaries and the many letters she wrote to family and friends in the east.

Narcissa Prentiss married Marcus Whitman on February 18, 1836. She was 27; he, 33. Among the guests was one of two Nez Perce boys that Marcus had brought back with him, in hopes they would learn enough English to serve as translators once the new mission was established. He was the first Native American Narcissa had ever seen.

The Whitmans left for Oregon Country in March 1836 to begin their missionary activities among the Native Americans there. The 3,000-mile journey – made by sleigh, canal barge, wagon, river sternwheeler, on horseback and on foot – took about seven months. As the missionaries traveled in relative comfort on Missouri River steamboats, Narcissa reveled in the luxury of “servants, who stand at our elbows ready to supply every want” (March 28, 1836).

“Can scarcely resist the temptation to stand out to view the shores of the majestic river,” she wrote in her diary as the boat approached St. Louis. “Varied scenes present themselves as we pass up – beautiful landscapes – on the one side high and rugged bluffs, and on the other low plains” (March 28, 1836). She was in good spirits. “I think I shall endure the journey well – perhaps better than any of the rest of us” (April 7, 1836).

Ahead lay some 1,900 miles of prairie, mountain and desert. To cross in safety, the small missionary party joined the American Fur Company’s caravan of 70 or so traders on their way to the annual rendezvous in Green River, Wyoming. The missionaries were late setting out and ended up having to make several forced marches before they caught up with the caravan on May 26, 1836.

The next day, they encountered their first Indian villages. Narcissa and Eliza Spalding, the wife of fellow missionary Henry Spalding, were the first white women the Indians had ever seen. “We ladies were such a curiosity to them,” Narcissa wrote. “They would come in and stand around our tent, peep in, and grin in their astonishment to see such looking objects” (June 27, 1836).

The caravan’s route followed river valleys westward toward the Rocky Mountains. This part of the journey was long and tedious, covering only fifteen miles or so in a good day. The diet by that point consisted mostly of buffalo meat (supplied by the caravan’s hunters), supplemented with milk from the missionaries’ cows. Narcissa seemed to relish the experience. “I never was so contented and happy before, neither have I enjoyed such health for years,” she wrote (June 4, 1836).

Narcissa died on November 29, 1847, along with her husband and eleven other adult men. She was killed in an attack on the mission by a small group of Weyíiletpuu men who were motivated by the raging measles epidemic in their community and Dr. Whitman’s inability to cure their dying people.

Mary Fields (1835-1914)

Better known as “Stagecoach Mary,” Mary Fields was a force to be reckoned with: a pioneer who made a name for herself as the first African American woman to receive employment as a U.S. postal service star-route mail carrier.

Fields was born into slavery and was freed at the end of the Civil War. She eventually made her way out west to Montana where she worked for St. Peter’s Mission. She received her mail service contract in 1895 and held her contract for 8 years. Fields had the star route contract for the delivery of U.S. mail from Cascade, Montana, to Saint Peter's Mission.

In 1895, at sixty years old and with the help of the nearby Ursuline nuns, who relied on her for help at their mission, Fields secured a job as a Star Route Carrier, using a stagecoach to deliver mail through the unforgiving weather and rocky terrain of Montana. This made her the first African-American woman to work for the U.S. Postal Service. True to her fearless demeanor, she carried multiple firearms, most notably a .38 Smith & Wesson under her apron, to protect herself and the mail from wolves, thieves and bandits, driving the route with horses and a mule named Moses. She never missed a day, and her reliability earned her the nickname "Stagecoach Mary" due to her preferred mode of transportation. If the snow was too deep for her horses, Fields delivered the mail on snowshoes, carrying the sacks on her shoulders.

Mary’s legend grew after her death. She was made a hero, a symbol of Black female empowerment. Yet what did Montanans truly understand about her during her time in Cascade? Were people capable of understanding the autonomy, persona, and character of a freed, literate African American woman who did not conform to the ideals society imposed on her?

Mary drank and at times wore men’s clothing; she smoked and carried guns. Yet in death she has become this powerhouse woman. Mary had the ability to become the first African American woman Star Route Carrier at a time when the West was a predominantly white society, which speaks to her relentless character and larger-than-life personality.

Sacagawea: Translator and Guide (1788-1818/1819)

One of the best-known women of the American West, the native-born Sacagawea gained renown for her crucial role in helping the Lewis & Clark expedition successfully reach the Pacific coast.

President Thomas Jefferson dispatched Meriwether Lewis and William Clark to chart the new land and scout a Northwest Passage to the Pacific coast. After more than a year of planning and initial travel, the expedition reached the Hidatsa-Mandan settlement. Here they met Sacagawea and her husband, the trader Toussaint Charbonneau, whose combined language skills proved invaluable - especially Sacagawea’s ability to speak to the Shoshone.

Sacagawea, along with her newborn baby, was the only woman to accompany the 31 permanent members of the Lewis & Clark expedition to the Western edge of the nation and back. Her knowledge of the Shoshone and Hidatsa languages was a great help during their journey. She communicated with other tribes and interpreted for Lewis and Clark. She was also skilled at finding edible plants, which proved to be crucial to supplementing their rations along the journey. Further, Sacagawea was valuable to the expedition because her presence signified peace and trustworthiness.

Once they reached Idaho, Sacagawea’s knowledge of the landscape and the Shoshone language proved valuable. The expedition was eager to find the Shoshone and trade with them for horses. The success of the journey hinged on finding the tribe: without horses the explorers would be unable to get their supplies over the mountains. Recognizing landmarks in her old neighborhood, Sacagawea reassured the explorers that the Shoshone - and their horses - would soon be found. When the Expedition did meet the Shoshone, Sacagawea helped the Corps communicate, translating along with her husband.

Historians have debated the events of Sacagawea’s life after the journey’s end. Although opinions differ, it is believed that she died at Fort Manuel Lisa near present-day Kenel, South Dakota. At the time of her death she was not yet 30.

Short stories about other women

Mary Walton

Mary Walton was an early environmental pioneer. In 1879, she developed a way to deflect factory smokestack emissions using water tanks. This technology was later adapted for steam engines, which emitted large plumes of soot as they rode the rails.

Cathay Williams

She was the first African-American woman to enlist in the army, and did so by disguising herself as a man. Though she was hospitalized five times, no one ever discovered her secret. She called herself William Cathay and was deemed fit for duty. 

Biddy Mason

She started life as a slave, but after winning her freedom in court in 1856, she moved to Los Angeles and became a nurse and midwife. Ten years later, she bought her own land for $250, making her one of the first Black women to own land in Los Angeles.

Goldie Griffith

Goldie Griffith, often known as the “Rose of the Klondike,” was a well-known character during the late-nineteenth-century Klondike Gold Rush. She was born in Montana in 1871 and became involved in the Alaska gold rush when she was in her twenties.

Goldie soon rose to prominence as a prospector with the ability to hold her own in a male-dominated sector. She was also recognized for her beauty and charm, and she was well-liked by the region’s miners and prospectors.

Goldie staked a claim in the Yukon in 1898, becoming one of the few women to own and run a mine during the gold rush. She was also well-known for her involvement in a number of businesses, including a saloon and a hotel.

What do you think about women in the Wild West? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.

In 1961 Yuri Gagarin went to space, but more importantly he didn’t visit the United States immediately after. John F. Kennedy personally barred him from entering, scared of his popularity—so the Telegraph, Wikipedia, and countless blogs say. It has all the makings of a classic Cold War conspiracy theory: John F. Kennedy, fear of the Soviet Union, and the Space Race. There’s just one problem: it isn’t true. Yet while the evidence refutes this Cold War truism, it explains why the story was easily accepted. This myth says much more about the nature of the United States during the Red Scare than it does about Yuri Gagarin.

Steve Ewin explains.

Yuri Gagarin in Warsaw, Poland in 1961.

There are two main versions of the Gagarin Myth. The first, as stated in Britain’s Telegraph, is that John F. Kennedy was so alarmed by Gagarin’s popularity that he barred him from the United States. The second, as an extension of the first, is that Kennedy’s method of barring was via Executive Order.

The second version is the easiest to disprove: no executive order or proclamation exists that barred Gagarin from the United States.(1) The only references to Gagarin by Kennedy as official actions of the United States are those of congratulatory messages for his achievement.

Expanding this to other offices of the executive branch also produces no evidence. Responsibility for enforcing bans on specific individuals fell to the Immigration and Naturalization Service. Thousands of pages exist regarding Charlie Chaplin, barred from entering the United States in 1952.(2) Further, United States Citizenship and Immigration Services (a successor agency to the INS) has thousands of pages of documents related to John Lennon’s attempted barring.(3) In response to a FOIA request for records related to Gagarin, none were found. The stories of Chaplin and Lennon, however, are inseparable from the Red Scare and Cold War politics.

The politics of it all

The Red Scare is what makes the first version of this myth seem plausible. In 1952 the United States Congress passed the Immigration and Nationality Act of 1952, overriding President Truman’s veto. This act effectively barred any Soviet citizen from entry to the United States. A win trumpeted by American Cold Warriors, it quickly became a disaster for the United States abroad. A National Security Council report dated March 25, 1955 states that the general travel restriction:

placed [the US] in a paradoxical position, which is being exploited by Communist propaganda. Despite its traditional policy favoring freedom of travel and its record of having favored a liberal exchange of persons…the U.S. is being accused of maintaining an “Iron Curtain”; and these accusations are being made not only by representatives of international Communism but also by otherwise friendly persons in the free world.(4)

These restrictions were still in place during Gagarin’s goodwill tour post-space. Kennedy would not have needed a reason to personally bar Gagarin from the United States after his historic 1961 flight. He would not have been allowed in the United States by default.

There was a way around this. The Immigration Act of 1952 provided exemptions for official and diplomatic business. As the United States and the Soviet Union maintained diplomatic ties, an exemption was built into the act which allowed for members of “deportable” affiliations to be in the United States if on official business from their home governments. If Gagarin was invited to the United States as an official representative of the Soviet Union (or sent by the Soviet Union as one), the Immigration Act of 1952 would have allowed it. In the immediate aftermath of Gagarin’s flight such an invitation was recommended by the American Ambassador to the Soviet Union.(5)

Official discouragement

The timing of Gagarin’s flight was not opportune for an invitation. Five days after his triumphant flight, the ill-fated Bay of Pigs invasion occurred. The American-backed attempted invasion of a major Soviet ally greatly damaged American prestige. Yet, by the time Gagarin was on a goodwill tour, America had an answer. Alan Shepard became the first American in space on May 5, 1961. According to John Logsdon’s award-winning book, John F. Kennedy and the Race to the Moon, worldwide reaction to Shepard’s flight was more favourable than the reaction to Gagarin’s. According to a May 1961 report of the U.S. Information Agency, the United States was already winning the propaganda battle of space flights.(6)

A June 1961 State Department telegram is a not-quite-smoking gun. The formerly classified document states that “no invitation for Gagarin to visit [the] US” had been made. Further, it states that the United States government “has made efforts to discourage invitation.”(7) This is the closest any document comes to suggesting that Gagarin was banned from the United States: a discouragement. With the United States riding the wave of international support brought by Shepard’s flight, there was nothing to fear about Gagarin. Within a year, however, this discouragement would be moot.

Kennedy himself lifted the general travel restrictions in 1962. This decision was made upon recommendation by Secretary of State Dean Rusk and in consultation with the Central Intelligence Agency.(8) In April 1962, White House Press Secretary Pierre Salinger wrote a memorandum stating that Gagarin was expected to be in Washington, DC that summer.(9) On July 6, 1962, the United States informed the Soviet Ambassador to the United States that the travel restrictions had been removed.(10) On October 16, 1963, Yuri Gagarin appeared before the United Nations General Assembly in New York City.

While Gagarin’s purported banishment from the United States makes for a good Cold War story, the evidence simply does not support it. Legislation, and governmental opinion, would have allowed Gagarin entry into the United States at any point, had it been politically expedient. However, due to the political climate of the Cold War and the rivalry between the United States and the Soviet Union, the myth took root and flourished.

What do you think of Gagarin and JFK? Let us know below.

References

1 “Written Presidential Orders | The American Presidency Project,” n.d., https://www.presidency.ucsb.edu/documents/app-categories/presidential/written-presidential-orders

2 Electronic Reading Room - USCIS. “Charlie Chaplin,” December 25, 1977. Accessed April 11, 2023. https://www.uscis.gov/records/electronic-reading-room?ddt_mon=&ddt_yr=&query=Chaplin&items_per_page=10.

3 Electronic Reading Room - USCIS. “John Lennon,” December 8, 1980. Accessed April 11, 2023. https://www.uscis.gov/records/electronic-reading-room?ddt_mon=&ddt_yr=&query=john+lennon&items_per_page=10.

4 U.S. Department of State, Office of The Historian. “National Security Council Report NSC 5508/1,” March 26, 1955. Accessed April 11, 2023. https://history.state.gov/historicaldocuments/frus1955-57v24/d94.

5 US Department of State Staff Summary, April 17, 1961, Papers of John F. Kennedy. Presidential Papers. President's Office Files. Departments and Agencies. State, 1961: April-May, pg 166. https://www.jfklibrary.org/asset-viewer/archives/JFKPOF/088/JFKPOF-088-001?image_identifier=JFKPOF-088-001-p0001

6 J Logsdon. 2016. John F. Kennedy and the Race to the Moon. Palgrave Macmillan. 96-97.

7 State to Paris, Telegram 1839, June  26 1961, 033.6140/6-2461, 1960-63 CDF, RG59, USNA.

8 U.S. Department of State, Office of The Historian. “Memorandum From Secretary of State Rusk to President Kennedy,” April 25, 1962. Accessed April 11, 2023. https://history.state.gov/historicaldocuments/frus1955-57v24/d94.

9 Papers of John F. Kennedy. Presidential Papers. White House Central Subject Files. Outer Space (OS). OS: 4-1: Astronauts: General, 1962: 26 March-31 May, page 38. https://www.jfklibrary.org/asset-viewer/archives/JFKWHCSF/0655/JFKWHCSF-0655-007

10 American Foreign Policy, Current Documents. 1962. Department of State, 1966 .pp. 740-741.


In 1935 Fascist Italy, under Benito Mussolini’s rule, invaded Abyssinia, one of the few independent countries in Africa at the time. The war split opinion in Europe, and caused particular issues for Britain and France as they hoped to ally with Italy against Nazi Germany’s plans. Should they strongly intervene against Italy, or offer a more limited response? Stephen Prout explains.

Italian troops advancing on Addis Ababa during the Second Italo-Ethiopian War (1935-37).

Introduction

In 1935 Italy invaded Abyssinia (modern-day Ethiopia). Italy was then under the control of a fascist regime ruled by Benito Mussolini, and part of his grandiose plan was to expand Italy’s modest empire. In the years immediately after the First World War there was deep dissatisfaction with the terms of the Paris Peace Conference: Prime Minister Vittorio Orlando and his Foreign Minister, Sidney Sonnino, departed early, feeling betrayed by Italy’s Western allies. In the 1915 Treaty of London Italy’s former allies, Britain, France, and Russia, had promised her territory in the Balkans and North Africa in return for her participation in the war on the Allied side - but these promises were broken and she left empty-handed. Mussolini came to power in 1922. His aspiration was clear - to make Italy great, respected and feared. Part of that plan was the expansion of the Italian Empire.

Abyssinia was to be Italy’s first major territorial gain: invading in October 1935, the Italian Army conquered the country in less than a year. The League of Nations was outraged, and on the surface Britain and France were disapproving. Although the conflict was on a different continent, thousands of miles away, it had grave significance for European affairs. The outcome would place an isolated Italy in the Nazi camp and once again divide Europe into two opposing camps.

Abyssinia and its relations with Europe

The imperial powers had been present and hovering in the background since the late nineteenth century, ready to meddle in Abyssinian affairs. Italy had long carried an irreconcilable sense of national humiliation from her defeat by Abyssinian forces at Adwa in 1896. She had not had the opportunity to repair her international standing in the way Britain did after the Zulu army overcame British forces at Isandlwana in 1879. This had always been a blight on Italy’s new national pride.

As far as other European powers were concerned, in 1906 Italy along with Britain and France formed a Tripartite Pact in which spheres of influence in Abyssinia were established amongst the three powers, and they were ready to enact and occupy in the event of the country’s collapse following a period of turmoil.

Abyssinia was one of only two independent countries in Africa in the 1930s (the other was Liberia). That set it apart from the rest of the colonized continent (slight exceptions were South Africa and Egypt, which held semi-autonomous positions as a Dominion and a veiled protectorate respectively). Africa was still very much under European domination, mainly British and French. Italy was seeking expansion in North Africa, the Balkans and the Mediterranean. Abyssinia offered the sole opportunity as far as Africa was concerned; trying to seize British or French territory was militarily out of the question for her.

By European standards Abyssinia was very much behind economically, socially, and politically. It entered the twentieth century with many of its medieval ways and customs intact. It still operated a slave trade, which did not end until after the Second World War. The education system excluded much of the population and the army was largely equipped with traditional weaponry. Conversely, there was evidence of nascent modernization, as the country had modest trade with the USA, Germany, Britain, and Italy at the turn of the century. In 1906, for example, its exports to the USA amounted to 3 million dollars ($106 million in today’s money). Internally the education system was also progressing, as a government edict made education compulsory for all males and schooling was no longer restricted to religious instruction.

For Italy there was the unresolved matter of her military defeat in 1896 and the promise to expand the Italian Empire to make good the broken promises of the 1915 Treaty of London. The pretext Italy used to justify the war was retaliation for border violations after growing tensions, supported by a spurious claim that she would abolish the slave trade that continued in Abyssinia. Mussolini had made it clear that he wanted to build a new Roman Empire and make Italy respected and feared. Abyssinia was his opportunity, and he justified the action by believing that he was acting no differently from Britain and France in Africa. His mistake, however, was not realizing that empires and colonies had no place in the mid-twentieth century. Italy was fifty years too late for an African scramble.

The Dilemma for Europe

As far as the British public was concerned, on the surface this was a moral battle: an underdog nation was fighting for its existence against a more powerful aggressor. This outrage conveniently ignored the fact that Britain still had a firm grip on its own empire and was in some areas suppressing independence movements by force. The resolve that was expressed, however, amounted to any action short of war, with no intention or plan for any alternative. In diplomatic circles in Europe the perspective was very different indeed. The events following the Italian invasion of Abyssinia were more of a sideshow - how could a developing country some eight thousand kilometres away be a concern to the Western powers of Europe, and indeed Germany?

In truth the British and French were not concerned about Abyssinia. It was public opinion and the state of the League of Nations that forced the appearance of urgency from Britain and France. Underneath all this Italy was an ally despite its Fascist nature - and, more importantly, a member of the recently formed alliance with France and Britain known as the Stresa Front, which had the aim of maintaining peace and stability in Europe and containing a revisionist Germany. It was important that Italy was not irked or isolated by actions that Britain or France might be compelled to take under direction or pressure from the League of Nations.

Britain had only her own interests in mind: Western stability and the security of her own colonies. The Permanent Under-Secretary of State for the Colonies, John Maffey, quickly assured the Government that no British interests were at risk after the Italian invasion. At home not all British politicians shared the public outrage. Within the Conservative Party, Leo Amery expressed his support for the Italian actions, while Churchill remained quiet on the matter. The widely held belief about Amery is that he was an anti-appeaser, owing to the famous speech he made in 1940 demanding the resignation of Prime Minister Neville Chamberlain. In fact, quite the reverse was true. Amery voiced support for the Japanese invasion of Manchuria as well as for Italy’s actions in Africa. He argued in the case of the former that Japan had a “strong case for her invasion of Manchuria” and that Britain and France should have ceded Abyssinian territory to Italy much earlier and eschewed League intervention. More specifically, in 1936 he stated that the Italian intervention would give a “merciful deliverance to be released from Abyssinian control”.

Britain was also concerned with her appearance before the League of Nations and had to balance her own interests with her obligations as a leading member. The Abyssinian crisis showed how impotent the League could be in the face of aggression by a permanent member, especially when combined with the conflicting interests and agendas of other members. Some diplomats were quite willing to circumvent the League in such cases, as Lord Curzon had during Italy’s actions in Corfu. In the background there was the concern that the presence of an independent nation on the borders of Europe’s colonies could spread nationalistic ideas. This was especially a worry for Britain, who had already lost Egypt, Ireland, and Iraq, while India was showing noisy displays of dissent. The Italian invasion of Abyssinia would therefore not inconvenience her, nor the French, too much. Ever since the Tripartite Treaty of 1906, the three powers had been prepared to occupy this independent nation if their own interests were threatened or if the situation in the country did not favor them.

For Britain and France, Italy was more valuable kept on side within the 1935 Stresa Front. Italian membership and military support were vital if Germany was to be contained. The Western democracies were making overtures to Mussolini to avoid isolating him, while applying very modest and meek sanctions to keep up appearances in the League. Abyssinia would be a small price to pay for their own security and interests, but ultimately the Stresa Front would crumble.

The Outcome

The outcome of the war was a victory for Italy. Within a year of the conflict ending, the new Prime Minister of Great Britain, Neville Chamberlain, was already exchanging friendly correspondence with Mussolini, and in 1937 was ready to acquiesce to Mussolini’s requests for recognition of Italy’s complete annexation of Abyssinia. The spheres of influence established in the Tripartite Pact of 1906 were forgone, and the Western Powers even passed up the chance to fight for territory in Abyssinia. This showed, not so publicly, not only how quickly the Western Powers could move on but how unimportant the sovereignty of Abyssinia was to them. Britain and France were colonial powers, and the disappearance of a sovereign nation would help to extinguish ideas of independence reaching their own colonies.

Britain and France could potentially have saved some of Abyssinia had they chosen to, by invoking the 1935 Stresa Front. By occupying their self-proclaimed spheres, they could have denied much of the country to Italian forces. This would have been a grander gesture than the mild sanctions applied. It only strengthened the point that the fate of Abyssinia was just not important enough.

Like all wars it had its atrocities, and much attention has focused on Italy’s use of poison gas. Equally, Abyssinian combatants also acted outside the conditions of the Geneva Convention: the International Red Cross reported the castration of Italian prisoners of war by Abyssinian troops. Furthermore, it should be noted that under Italian occupation the slave trade was curtailed and outlawed, something which had shown no signs of being arrested under the country’s old rulers. During Italian rule two laws were issued, in October 1935 and April 1936, which abolished slavery and freed 420,000 Ethiopian slaves. While not condoning Fascist actions, the campaign was not as one-sided as some accounts suggest. As far as the rest of the world was concerned, indifference to Abyssinia’s fate was widely shared, as only six nations failed to recognise the Italian fait accompli.

The Abyssinians endured five years of Italian occupation. Europe had now been divided into two camps, with Italy firmly on the side of the totalitarian powers rather than acting as a counterweight to a growing and powerful Germany. Italian actions in Abyssinia, along with Japanese intervention in Manchuria, were portents. Italy would soon join Germany in intervening in Spain before fighting against her former allies in the Second World War. Was it inevitable, or was it a diplomatic tragedy?

What do you think of the Abyssinian Affair? Let us know below.

Now, if you enjoy the site and want to help us out a little, click here.

Sources

Encyclopaedia of Antislavery and Abolition [Two Volumes] - Greenwood Press, 2006 - Peter P. Hinks, John R. McKivigan, R. Owen Williams

Mussolini- A New Life – N Farrell – 2003 – Weidenfeld & Nicholson.

AJP Taylor – English History 1914 – 1945

Europe of The Dictators

Report of War Crimes and Atrocities Abyssinia – International Red Cross

Leo Amery’s Imperial Attitude to Appeasement in the 1930’s – Richard S Grayson – University of London 2006