World War Two caused immense misery and was a far more global conflict than World War One: its battles took place across Europe, Asia, and Africa, and in seas the world over. Here, Richard Bluttal concludes his three-part series on the impacts of trauma during wars by looking at World War 2.

If you missed it, read part one on the American Civil War here, and part two on World War One here.

Advert encouraging sign-ups to the Army Nurse Corps during World War 2.

Lawrence McCauley was a member of the 65th Armored Field Artillery Battalion, trained to drive trucks armed with .50-caliber machine guns, halftracks and landing craft, just in case. In England, preparing for the D-Day invasion, he became fast friends with Otto Lutz, a tall Chicagoan. “We were all very close,” he said of his unit when he was interviewed in 2020 at the age of 97 and living in Lewis Center. “You knew about their wives and children — everything you could know about your buddy, because there was nothing else to talk about.”

He and Otto were next to each other on a landing craft as it approached Omaha Beach. The front door dropped open and a bullet hit Otto in the forehead. McCauley remembers looking back and seeing his friend’s face sink beneath the water. “There was no stopping,” he said. “Our orders were ‘Don’t stop,’ because you’re better off as a moving target. That’s hard.”

The purpose of military medicine during World War II was the same as in previous wars: to conserve the strength and efficiency of the fighting forces so as to keep as many men at as many guns for as many days as possible. What transpired between 1939 and 1945 was a cataclysmic event made worse by the nature of the weapons the combatants used. The use of machine guns, submarines, airplanes, and tanks was widespread in World War I; but in World War II these weapons reached unimagined perfection as killing machines. In every theater of war, small arms, land- and sea-based artillery, torpedoes, and armor-piercing and antipersonnel bombs took a terrible toll in human life. In America's first major encounter at Pearl Harbor, the survivors of the Japanese attack could describe what modern warfare really meant. Strafing aircraft, exploding ordnance, and burning ships caused penetrating injuries, simple and compound fractures, traumatic amputations, blast injuries, and horrific burns, to name just a few. Total U.S. battle deaths in World War II numbered 292,131, with 671,801 reported wounded or missing.

Conserving fighting strength and enabling armies and navies to defeat the enemy also meant recognizing that disease, more than enemy action, often threatened this goal. For example, during the early Pacific campaign to subdue the Solomon Islands, malaria caused more casualties than Japanese bullets. Following the initial landings on Guadalcanal, the number of patients hospitalized with malaria exceeded all other diseases. Some units suffered 100 percent casualty rates, with personnel sometimes being hospitalized more than once. Only when malaria and other tropical diseases were controlled could the Pacific war be won.


World War II service members lived through an inflection point in the history of medicine and warfare. In all previous US wars, non-battle deaths—related to conditions like smallpox, typhoid, dysentery, yellow fever, tuberculosis, and influenza—outnumbered battle-related fatalities. During the Spanish-American War, more than 2,000 of the approximately 2,400 deaths were due to causes other than battle. During World War I, 53,000 died due to battle versus 63,000 who died due to other causes. World War II marked the first time the ratio was reversed. Of 16.1 million who served, 405,399 died—291,557 of them in battle, and 113,842 due to other causes. A variety of factors contributed to the shift. Crucially, during World War II, the government mobilized expansive public, professional, and private resources to enhance health-related research and development, as well as services offered by the Army Surgeon General’s Office, which oversaw care for soldiers. Also, rather than creating mobilization and treatment plans from scratch, the military health apparatus built on knowledge and administrative infrastructure developed during and after prior conflicts.

Organization of battlefield medical care

The military organized its medical services around its top priority: to care for battlefield casualties, make them well, and return them to duty. The systems developed by the army and navy worked similarly. In all theaters of war, but particularly in the Pacific, both army and navy medicine faced their greatest challenge dealing with the aftermath of intense, bloody warfare fought far from fixed hospitals. This put enormous pressure on medical personnel closest to the front and forced new approaches to primary care and evacuation.

Army medics or navy corpsmen were the first critical link in the evacuation chain. From the time a soldier suffered a wound on a battlefield in France or a marine was hit on an invasion beach at Iwo Jima, the medic or corpsman braved enemy fire to render aid. He applied a battle dressing, administered morphine and perhaps plasma or serum albumin, and tagged the casualty. Indeed, one of the lingering images of the World War II battlefield is the corpsman or medic crouched beside a wounded patient, his upstretched hand gripping a glass bottle. From the bottle flowed a liquid that brought many a marine or soldier back from the threshold of death. In the early days of the conflict that fluid was plasma. Throughout the war, scientists sought and finally developed a better blood substitute, serum albumin. Finally, in 1945, whole blood, rich in oxygen-carrying red cells, became available in medical facilities close to the battlefield.

If he was lucky, the medic or corpsman might commandeer a litter team to move the casualty out of harm's way and on to a battalion aid station or a collecting and clearing company for further treatment. This care would mean stabilizing the patient with plasma, serum albumin, or whole blood. In some cases, the casualty was then evacuated. Other casualties were taken to a divisional hospital, where doctors performed further stabilization including surgery, if needed. In the Pacific, where sailors, soldiers, and marines were doing the fighting, both navy and army hospital ships, employed mainly as ambulances, provided first aid and some surgical care for the casualties' needs while ferrying them to base hospitals in the Pacific or back to the United States for definitive care. As the war continued, air evacuation helped carry the load. Trained army and navy nurses, medics, and corpsmen staffed the evacuation aircraft.

Combat Related Injuries

The experience of a battle casualty in the Second World War was not radically different from that of the First World War. The most common injuries were caused by shells and bullets, and a casualty was evacuated through a similarly organized chain of medical posts, dressing stations and hospitals. Common combat injuries included second- and third-degree burns, broken bones, shrapnel wounds, brain injuries, spinal cord injuries, nerve damage, paralysis, loss of sight and hearing, post-traumatic stress disorder (PTSD), and limb loss.

Non-Combat Related Death and Injuries

Not all wounds are physical. The psychologically wounded had suffered from “nostalgia” during the Civil War and “shell-shock” in World War I. In World War II the condition was termed combat exhaustion or combat fatigue. Although the World War I experience of treating men at the front had been successful, military psychiatrists and psychologists at the beginning of World War II had to relearn those lessons. Nevertheless, the caregivers soon recognized that, given a respite from combat, a safe place to rest, regular food, and a clean environment, 85 to 90 percent of patients could again become efficient warriors. The more psychologically damaged received therapy in military hospitals.

In the Southwest Pacific, where death rates due to disease were highest, soldiers faced scourges like malaria, as well as tsutsugamushi fever, poliomyelitis, and diseases of the digestive system. In the northern theater—Alaska, Canada, Greenland, Iceland—threats included cold injuries like frostbite and trench foot. Neuropsychiatric disorders and venereal disease were widespread, regardless of where one served, including among those in the United States.

Army doctor Paul F. Russell recalled after the war an earlier statement from General Douglas MacArthur, who had reported that he “was not at all worried about defeating the Japanese, but he was greatly concerned about the failure up to that time to defeat the Anopheles mosquito,” the vector for malaria. By war’s end, more than 490,000 soldiers had been diagnosed with malaria, equating to a loss of approximately nine million “man-days.”

Between 1941 and 1944, more than 10 percent—roughly two million of 15 million examined men—were excluded from service; 37 percent of those dismissals were made based on neuropsychiatric findings. Still, diagnoses of mental “disorders” within the military catapulted well beyond expectations. A total of one million soldiers were admitted for neuropsychiatric illness, constituting approximately 6 percent of all wartime admissions. Within two years of American entry into the war, it was clear that so-called combat stress or “exhaustion” would pose a major threat to soldiers and the army they served—as it had during prior generations. Experiences and realizations of the World War II period had important implications for the future of military medicine.

Army officials began devoting more resources to neuropsychiatric treatment because of an imperative to increase return-to-duty rates, but the long-term impacts of care on individual service members were questionable. In early 1943, military psychiatrists noted that men in the Tunisian campaign diagnosed as “psychiatric casualties” were generally lost to their units after being transferred to distant base hospitals. To increase retention, they instituted principles of “forward psychiatry” that had been adopted by World War I-era armies but largely disregarded by World War II planners in the United States: treat patients quickly, in close proximity to battle, and with the expectation that they would recover. After army psychiatrist Frederick Hanson reported in the spring of 1943 that 70 percent of approximately 500 psychiatric battle casualties were returned to duty thanks to this approach, it was gradually adopted in other theaters. Still, military psychiatrists acknowledged the method was hardly a panacea. Systematic follow-up studies were lacking, but one contemporary account noted that many who underwent treatment were unable to return to combat, and some who did “relapsed after the first shot was fired.”

Medical Advancements and Improvements

Battlefield medicine improved throughout the course of the war. At the beginning, only plasma was available as a substitute for lost blood. Serum albumin, a more effective substitute, was developed during the war, and by 1945 whole blood, rich in the oxygen-carrying red cells that plasma lacks, was available close to the battlefield. This was also the first major war in which air evacuation of the wounded became available.

During the war, surgical techniques such as the removal of dead tissue resulted in fewer amputations than in any previous conflict. To treat bacterial infections, penicillin and streptomycin were administered for the first time in large-scale combat.

Service members with combat fatigue, which later became known as post-traumatic stress disorder, were given a safe place to stay away from battle zones with plenty of food and rest. This resulted in about 90% of patients recovering enough to return to the fight.

The war also brought about the mass production of antibacterial drugs, especially sulfanilamide and penicillin, and gave both widespread respect, production, and use.

In 1928, when Scottish bacteriologist Alexander Fleming noticed that a strange mold had taken over his Petri dishes and eliminated the bacteria on them, his findings didn’t get much notice. But Fleming continued his research and kept talking up what he called “mold juice” (he didn’t adopt the name “penicillin” until later), eventually winning a Nobel Prize and attracting the attention of drug maker Pfizer. The company soon began mass-producing the drug for distribution to medics during WWII and, ultimately, to doctors and hospitals across the country.

In 1932, German biochemist Gerhard Johannes Paul Domagk discovered that the compound sulfanilamide could vanquish deadly strains of bacteria, like the streptococcus in his lab mice and in his first human test subject, his gravely ill young daughter. The wide distribution of so-called “sulfa drugs” began when World War II soldiers carried powdered sulfanilamide in their first-aid kits. By the end of the war, doctors were routinely using these antibiotics to treat streptococcus, meningitis, and other infections.

In the tropical islands of the Pacific, malaria was a serious threat. Service members received atabrine, an antimalarial drug, before going into affected areas.

Service members were also vaccinated against smallpox, typhoid, tetanus, cholera, typhus, yellow fever and bubonic plague, depending on where they were sent.

Other improvements during World War II included better crash helmets. Because of improvements like these and others, the survival rate for the wounded and ill climbed to 50% during World War II from only 4% during World War I, according to Dr. Daniel P. Murphy, who published a paper on "Battlefield Injuries and Medicine."

As medical advancements progress, so too does the capability of our medical teams to treat our service men and women when they are injured in the field.

What do you think of trauma during World War II? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.


World War I was the war that caused the most deaths up until that time. The trenches of that war caused great horror and misery for many. Here, Richard Bluttal continues his three-part series on the impacts of trauma during wars by looking at World War One.

If you missed it, read part one on the American Civil War here.

A depiction of French surgeon Théodore Tuffier.

The pocket diary of Rifleman William Eve of 1/16th (County of London) Battalion (Queen’s Westminster Rifles):

“Poured with rain all day and night. Water rose steadily till knee deep when we had the order to retire to our trenches. Dropped blanket and fur coat in the water. Slipped down as getting up on parapet, got soaked up to my waist. Went sand-bag filling and then sewer guard for 2 hours. Had no dug out to sleep in, so had to chop and change about. Roache shot while getting water and [Rifleman PH] Tibbs shot while going to his aid (in the mouth). Laid in open all day, was brought in in the evening”, unconscious but still alive. Passed away soon after.

The war caused 350,000 total American casualties, of which over 117,000 were deaths. The best estimates today are 53,000 combat deaths, and 64,000 deaths from disease. (Official figures in 1919 were 107,000 total, with 50,000 combat deaths, and 57,000 deaths from disease.)  About half of the latter were from the great influenza epidemic, 1918-1920.  Considering that 4,450,000 men were mobilized, and half those were sent to Europe, the figure is far less than the casualty rates suffered by all of the other combatants.

World War 1 represented the coming of age of American military medicine.  The techniques and organizational principles of the Great War were greatly different from any earlier wars and were far more advanced.  Medical and surgical techniques, in contrast with previous wars, represented the best available in civilian medicine at the time.  Indeed, many of the leaders of American medicine were found on the battlefields of Europe in 1917 and 1918.  The efforts to meet the challenge were often hurried.  The results lacked polish and were far from perfect.  But the country can rightly be proud of the medical efforts made during the Great War.

The primary medical challenges for the U.S. upon entering the war were “creating a fit force of four million people, keeping them healthy and dealing with the wounded,” says Diane Wendt, curator of medicine and science at the Smithsonian’s National Museum of American History. “Whether it was moving them through a system of care to return them to the battlefield or take them out of service, we have a nation that was coming to grips with that.”

The First World War created thousands of casualties. New weapons such as the machine gun caused unprecedented damage to soldiers’ bodies. This presented new challenges to doctors on both sides in the conflict, as they sought to save their patients’ lives and limit the harm to their bodies. New types of treatment, organization and medical technologies were developed to reduce the number of deaths.

In addition to wounds, many soldiers became ill. Weakened immune systems and the presence of contagious disease meant that many men were in hospital for sickness, not wounds. Between October 1914 and May 1915 at the No 1 Canadian General Hospital, there were 458 cases of influenza and 992 of gonorrhea amongst officers and men.

Wounding also became a way for men to avoid the danger and horror of the trenches. Doctors were instructed to be vigilant for cases of ‘malingering’, where soldiers pretended to be ill or wounded themselves so that they did not have to fight. It was a common belief in the medical profession that wounds on the left hand were suspicious.

Wounding was not always physical. Thousands of men suffered emotional trauma from their war experience. ‘Shellshock’, as it came to be known, was viewed with suspicion by the War Office and by many doctors, who believed that it was another form of weakness or malingering. Sufferers were treated at a range of institutions.

Organization of Battlefield Medical Care

In response to the realities of the Western Front in Europe, the Medical Department established a treatment and evacuation system that could function in both static and mobile environments. Based on their history of success in the American Civil War, and on the best practices of the French and British systems, the Department created specific units designed to provide a sequence of continuous care from the front line to the rear area in what they labelled the Theater of Operations.

Casualties had to be taken from the field of battle to the places where doctors and nurses could treat them. They were collected by stretcher-bearers and moved by a combination of people, horse and cart, and later on by motorized ambulance ‘down the line’. Men would be moved until they reached a location where treatment for their specific injury would take place.

Where soldiers ended up depended largely on the severity of their wounds. Owing to the number of wounded, hospitals were set up in any available buildings, such as abandoned chateaux in France. Often Casualty Clearing Stations (CCS) were set up in tents. Surgery was often performed at the CCS; arms and legs were amputated, and wounds were operated on. As the battlefield became static and trench warfare set in, the CCS became more permanent, with better facilities for surgery and accommodation for female nurses, situated far from the male patients.

Combat Related Injuries

For World War I, ideas of the front lines entered the popular imagination through works as disparate as All Quiet on the Western Front and Blackadder. The strain and the boredom of trench warfare are part of our collective memory; the drama of war comes from two sources: mustard gas and machine guns. The use of chemical weapons and the mechanization of shooting brought horror to men’s lives at the front. Yet they were not the greatest source of casualties. By far, artillery was the biggest killer in World War I, and provided the greatest source of war wounded.

World War I was an artillery war. In his book Trench: A History of Trench Warfare on the Western Front (2010), Stephen Bull concluded that on the Western Front artillery was the biggest killer, responsible for “two-thirds of all deaths and injuries.” Of this total, a third resulted in death, two-thirds in injuries. Artillery wounded the whole body; when it did not obliterate the body entirely, it often dismembered it, tearing away arms, legs, ears, noses, and even faces. Even when there was no superficial damage, concussive injuries and “shell shock” put many men out of action. Of course, shooting, in combat as well as from snipers, was another great source of wounding. Gas attacks were a third. Phosgene, chlorine, mustard gas, and tear gas debilitated more than they killed, though many victims ended up suffering long-term disability. Overall, the war claimed about 10 million military dead and some 20–21 million military wounded, with about 5 percent of those wounds life-debilitating: roughly a million people.

August 1914 would dramatically alter the paradigm of casualty care. Gigantic cannons, high explosives, and the machine gun soon invalidated all pre-war suppositions and strategy. More than eighty percent of wounds were due to shell fragments, which caused multiple shredding injuries. "There were battles which were almost nothing but artillery duels," a chagrined Edmond Delorme observed. Mud and manured fields took care of the rest. Devitalized tissue was quickly occupied by Clostridia pathogens, and gas gangrene became a deadly consequence. Delays in wound debridement, prompted by standard military practice, caused astounding lethality. Some claimed more than fifty percent of deaths were due to negligent care. And the numbers of casualties were staggering. More than 200,000 were wounded in the first months alone: far too many for the outdated system of triage and evacuation envisioned just years before. American observer Doctor Edmund Gros visited the battlefield in 1914:

If a soldier is wounded in the open, he falls on the firing line and tries to drag himself to some place of safety. Sometimes the fire of the enemy is so severe that he cannot move a step. Sometimes, he seeks refuge behind a haystack or in some hollow or behind some knoll…. Under the cover of darkness, those who can do so walk with or without help to the Poste de Secours. . . . Stretcher-bearers are sent out to collect the severely wounded . . . peasants' carts and wagons [are used] . . . the wounded are placed on straw spread on the bottom of these carts without springs, and thus they are conveyed during five or six hours before they reach the sanitary train or temporary field hospital. What torture many of them must endure, especially those with multiple fractures!

Non-Combat Related Death and Illness

In the course of the First World War, many more soldiers died of disease than by the efforts of the enemy. Lice caused itching and transmitted infections such as typhus and trench fever. In summer it was impossible to keep food fresh, and food poisoning was rife. In winter men suffered from frostbite, exposure, and trench foot. There were no antibiotics, so deaths from gangrenous wounds and syphilis were common. Others died by suicide as a result of psychological stress.

Battlefield Wounded and Surgery

In the early years of the war, compound lower limb fractures caused by gunshots in trench warfare sparked debate over the traditional splinting practices that delayed surgery, leading to high mortality rates, particularly for open femoral fractures.

Femoral fractures stranded soldiers on the battlefield, and stretcher-bearers reached them only with difficulty, leaving many lying wounded for days or enduring rough transport, all of which left soldiers particularly vulnerable to gas gangrene and secondary hemorrhage. Australian surgeons in France reported injury-to-treatment times ranging from 36 hours to a week and averaging three to four days. Fracture immobilization during transport was poor, and in the early war years surgeons reported about 80% mortality for soldiers with femoral fractures transported from the field.

By 1915 medics and stretcher-bearers were routinely trained to apply immobilizing splints, and by 1917 specialized femur wards had been established; during this period mortality from all fractures fell to about 12% and below 20% for open femoral fractures.

Théodore Tuffier, a leading French surgeon, testified in 1915 to the Academy of Medicine that 70 percent of amputations were due to infection, not to the initial injury. “Professor Tuffier stated that antiseptics had not proven satisfactory, that cases of gas gangrene were most difficult to handle,” Crile wrote. “All penetrating wounds of the abdomen, he said, die of shock and infection. … He himself tried in fifteen instances to perform immediate operations in cases of penetrating abdominal wounds, and he lost every case. In fact, they have abandoned any attempt to operate penetrating wounds of the abdomen. All wounds large and small are infected. The usual antiseptics, bichloride, carbolic, iodine, etc., fail.”

Every war has its distinctive injury. For World War I, it was facial injuries, which affected 10–15% of casualties, or over half a million men. The nature of combat, with faces often exposed above the trench line, contributed to this high incidence. Most countries founded specialist hospitals, where surgeons like Johannes Esser in the Netherlands and Hippolyte Morestin in France dedicated their practices to developing new techniques to repair facial trauma.

World War I presented surgeons with myriad new challenges. They responded to these difficulties not only with courage and sedulity but also with an open mind and active investigation. Military medicine practiced in 1918 differed substantially from that in 1914. This shift did not occur by happenstance. It represented collaboration between some of the brightest minds in academia and professional military doctors, combining their expertise to solve problems, take care of patients, and preserve fighting strength. It required multiple inter-allied conferences both to identify common medical problems and to determine optimal solutions. Reams of books and pamphlets buttressed the in-person instruction consultants provided to educate young physicians on best practices. Most significantly, this change demanded a willingness to admit a given intervention was not working, creatively try something new, assess its efficacy using data from thousands of soldiers, disseminate the knowledge, and ensure widespread application of the novel practice. No step was easy, and executing them all while fighting the Great War required a remarkable degree of perseverance, intellectual honesty, and operational flexibility.

Medical advances and improvements leading up to World War 2

With most of the fighting set in the trenches of Europe and with the unexpected length of the war, soldiers were often malnourished, exposed to all weather conditions, sleep-deprived, and often knee-deep in mud along with the bodies of men and animals. In the wake of the mass slaughter, it became clear that the “only way to cope with the sheer numbers of casualties was to have an efficient administrative system that identified and prioritized injuries as they arrived.” This was the birth of the triage system. Medicine made major advances in several directions during World War I. The war is better known as the first mass killing of the 20th century, with an estimated 10 million military deaths alone, but for the injured, doctors learned enough to vastly improve a soldier’s chances of survival. They went from amputation as the only solution to being able to transport soldiers to hospital, disinfect their wounds, and operate on them to repair the damage wrought by artillery. Ambulances, antiseptics, and anesthesia, three elements of medicine taken entirely for granted today, emerged from the depths of suffering in the First World War.

Two Welshmen were responsible for one of the most important advances - the Thomas splint - which is still used in war zones today. It was invented in the late 19th century by pioneering surgeon Hugh Owen Thomas, often described as the father of British orthopedics, born in Anglesey to a family of “bone setters”.

In France, vehicles were commandeered to become mobile X-ray units. New antiseptics were developed to clean wounds, and soldiers became more disciplined about hygiene. Also, because the sheer scale of the destruction meant armies had to become better organized in looking after the wounded, surgeons were drafted in closer to the frontline and hospital trains used to evacuate casualties.

When the war broke out, the making of prosthetic limbs was a small industry in Britain. Production had to increase dramatically. One of the ways this was achieved was by employing men who had amputations to make prosthetic limbs – most commonly at Erskine and Roehampton, where they learnt the trade alongside established tradespeople. This had the added advantage of providing occupation for discharged soldiers who, because of their disabilities, would probably have had difficulty finding work.

While it was not an innovation of the war, the process of blood transfusion was greatly refined during World War I and contributed to medical progress. Previously, all blood stored near the front lines was at risk of clotting. Anticoagulant methods were implemented, such as adding citrate or using paraffin inside the storage vessel. As a result, blood could be successfully stored for an average of 26 days, simplifying transportation. The storage and maintenance of blood meant that by 1918 blood transfusions were being used in front-line casualty clearing stations (CCS), medical facilities positioned just beyond the reach of enemy fire.

One of the most profound medical advancements resulting from World War I was the exploration of mental illness and trauma. Previously, any individual showing symptoms of neurosis was immediately sent to an asylum and consequently forgotten. World War I brought forward a new type of warfare that no one was prepared for in its technological, military, and biological dimensions.

Another successful innovation came in the form of the base hospitals and clearing stations. These allowed doctors and medics to categorize men as serious or mild cases, and it came to light that many stress-related disorders were the result of exhaustion or deep trauma. “Making these distinctions was a breakthrough…the new system meant that mild cases could be rested then returned to their posts without being sent home.”

What do you think of trauma during World War I? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.The pocket diary of Rifleman William Eve of 1/16th (County of London) Battalion (Queen’s Westminster Rifles):

”Poured with rain all day and night. Water rose steadily till knee deep when we had the order to retire to our trenches. Dropped blanket and fur coat in the water. Slipped down as getting up on parapet, got soaked up to my waist. Went sand-bag filling and then sewer guard for 2 hours. Had no dug out to sleep in, so had to chop and change about. Roache shot while getting water and [Rifleman PH] Tibbs shot while going to his aid (in the mouth). Laid in open all day, was brought in in the evening”, unconscious but still alive. Passed away soon after.

The war caused 350,000 total American casualties, of which over 117,000 were deaths. The best estimates today are 53,000 combat deaths and 64,000 deaths from disease. (Official figures in 1919 were 107,000 total, with 50,000 combat deaths and 57,000 deaths from disease.) About half of the disease deaths came from the great influenza epidemic of 1918-1920. Considering that 4,450,000 men were mobilized, and half of those were sent to Europe, these losses were far lower than the casualty rates suffered by the other major combatants.

World War 1 represented the coming of age of American military medicine.  The techniques and organizational principles of the Great War were greatly different from any earlier wars and were far more advanced.  Medical and surgical techniques, in contrast with previous wars, represented the best available in civilian medicine at the time.  Indeed, many of the leaders of American medicine were found on the battlefields of Europe in 1917 and 1918.  The efforts to meet the challenge were often hurried.  The results lacked polish and were far from perfect.  But the country can rightly be proud of the medical efforts made during the Great War.

The primary medical challenges for the U.S. upon entering the war were, “creating a fit force of four million people, keeping them healthy and dealing with the wounded,” says Diane Wendt, curator of medicine and science at the Smithsonian’s National Museum of American History. “Whether it was moving them through a system of care to return them to the battlefield or take them out of service, we have a nation that was coming to grips with that.”

The First World War created thousands of casualties. New weapons such as the machine gun caused unprecedented damage to soldiers’ bodies. This presented new challenges to doctors on both sides in the conflict, as they sought to save their patients’ lives and limit the harm to their bodies. New types of treatment, organization and medical technologies were developed to reduce the number of deaths.

In addition to wounds, many soldiers became ill. Weakened immune systems and the presence of contagious disease meant that many men were in hospital for sickness, not wounds. Between October 1914 and May 1915 at the No 1 Canadian General Hospital, there were 458 cases of influenza and 992 of gonorrhea amongst officers and men.

Wounding also became a way for men to avoid the danger and horror of the trenches. Doctors were instructed to be vigilant for ‘malingering’, where soldiers feigned illness or wounded themselves so that they did not have to fight. The medical profession commonly regarded wounds to the left hand as suspicious, since a rifleman could inflict them on himself while keeping his firing hand intact.

Wounding was not always physical. Thousands of men suffered emotional trauma from their war experience. ‘Shellshock’, as it came to be known, was viewed with suspicion by the War Office and by many doctors, who believed that it was another form of weakness or malingering. Sufferers were treated at a range of institutions.

Organization of Battlefield Medical Care

In response to the realities of the Western Front in Europe, the Medical Department established a treatment and evacuation system that could function in both static and mobile environments. Based on their history of success in the American Civil War, and on the best practices of the French and British systems, the Department created specific units designed to provide a sequence of continuous care from the front line to the rear area in what they labelled the Theater of Operations.

Casualties had to be taken from the field of battle to the places where doctors and nurses could treat them. They were collected by stretcher-bearers and moved by a combination of people, horse and cart, and later on by motorized ambulance ‘down the line’. Men would be moved until they reached a location where treatment for their specific injury would take place.

Where soldiers ended up depended largely on the severity of their wounds. Owing to the number of wounded, hospitals were set up in any available buildings, such as abandoned chateaux in France. Often Casualty Clearing Stations (CCS) were set up in tents. Surgery was often performed at the CCS; arms and legs were amputated, and wounds were operated on. As the battlefield became static and trench warfare set in, the CCS became more permanent, with better facilities for surgery and separate accommodation for female nurses, situated away from the male patients.

Combat Related Injuries

For World War I, ideas of the front lines entered the popular imagination through works as disparate as All Quiet on the Western Front and Blackadder. The strain and the boredom of trench warfare are part of our collective memory, and in that memory the drama of the war comes from two sources: mustard gas and machine guns. The use of chemical weapons and the mechanization of shooting brought horror to men’s lives at the front. Yet they were not the greatest source of casualties. By far, artillery was the biggest killer in World War I, and it provided the greatest source of war wounded.

World War I was an artillery war. In his book Trench: A History of Trench Warfare on the Western Front (2010), Stephen Bull concluded that on the Western Front, artillery was the biggest killer, responsible for “two-thirds of all deaths and injuries.” Of this total, a third resulted in death, two-thirds in injuries. Artillery wounded the whole body; when it did not obliterate the body entirely, it often dismembered it, tearing away arms, legs, ears, noses, and even faces. Even when there was no superficial damage, concussive injuries and “shell shock” put many men out of action. Of course, shooting, in combat as well as from snipers, was another great source of wounding. Gas attacks were a third. Phosgene, chlorine, mustard gas, and tear gas debilitated more than they killed, though many victims ended up suffering long-term disability. Overall, the war claimed about 10 million military dead and about 20-21 million military wounded, with roughly 5% of those wounds permanently debilitating, that is, about a million men.

August 1914 would dramatically alter the paradigm of casualty care. Gigantic cannons, high explosives, and the machine gun soon invalidated all pre-war suppositions and strategy. More than eighty percent of wounds were due to shell fragments, which caused multiple shredding injuries. "There were battles which were almost nothing but artillery duels," a chagrined Edmond Delorme observed. Mud and manured fields took care of the rest. Devitalized tissue was quickly occupied by Clostridia pathogens, and gas gangrene became a deadly consequence. Delays in wound debridement, prompted by standard military practice, caused astounding lethality. Some claimed more than fifty percent of deaths were due to negligent care. And the numbers of casualties were staggering. More than 200,000 were wounded in the first months alone: far too many for the outdated system of triage and evacuation envisioned just years before. American observer Doctor Edmund Gros visited the battlefield in 1914:

If a soldier is wounded in the open, he falls on the firing line and tries to drag himself to some place of safety. Sometimes the fire of the enemy is so severe that he cannot move a step. Sometimes, he seeks refuge behind a haystack or in some hollow or behind some knoll…. Under the cover of darkness, those who can do so walk with or without help to the Poste de Secours. . . . Stretcher-bearers are sent out to collect the severely wounded . . . peasants' carts and wagons [are used] . . . the wounded are placed on straw spread on the bottom of these carts without springs, and thus they are conveyed during five or six hours before they reach the sanitary train or temporary field hospital. What torture many of them must endure, especially those with multiple fractures!

Non-Combat Related Death and Illness

In the course of the First World War, many more soldiers died of disease than by the efforts of the enemy. Lice caused itching and transmitted infections such as typhus and trench fever. In summer it was impossible to keep food fresh, and food poisoning was rife. In winter men suffered from frostbite, exposure, and trench foot. There were no antibiotics, so deaths from gangrenous wounds and syphilis were common. Others died by suicide as a result of psychological stress.

Battlefield Wounded and Surgery

In the early years of the war, compound lower limb fractures caused by gunshots in trench warfare sparked debate over the traditional splinting practices that delayed surgery, leading to high mortality rates, particularly for open femoral fractures.

Femoral fractures stranded soldiers on the battlefield, and stretcher-bearers reached them only with difficulty, leaving many lying wounded for days or enduring rough transport, all of which left soldiers particularly vulnerable to gas gangrene and secondary hemorrhage. Australian surgeons in France reported injury-to-treatment times ranging from 36 hours to a week and averaging three to four days. Fracture immobilization during transport was poor, and in the early war years surgeons reported about 80% mortality for soldiers with femoral fractures transported from the field.

By 1915 medics and stretcher-bearers were routinely trained to apply immobilizing splints, and by 1917 specialized femur wards had been established; during this period mortality from all fractures fell to about 12% and below 20% for open femoral fractures.

Théodore Tuffier, a leading French surgeon, testified in 1915 to the Academy of Medicine that 70 percent of amputations were due to infection, not to the initial injury. “Professor Tuffier stated that antiseptics had not proven satisfactory, that cases of gas gangrene were most difficult to handle,” Crile wrote. “All penetrating wounds of the abdomen, he said, die of shock and infection. … He himself tried in fifteen instances to perform immediate operations in cases of penetrating abdominal wounds, and he lost every case. In fact, they have abandoned any attempt to operate penetrating wounds of the abdomen. All wounds large and small are infected. The usual antiseptics, bichloride, carbolic, iodine, etc., fail.”

Every war has its distinctive injury. For World War I, it was facial injuries, which affected 10-15% of casualties, or over a half-million men. The nature of combat, with faces often exposed above the trench line, contributed to this high incidence. Most countries founded specialist hospitals with surgeons like Johannes Esser in the Netherlands and Hippolyte Morestin in France who dedicated their practices to developing new techniques to repair facial trauma.

World War I presented surgeons with myriad new challenges. They responded to these difficulties not only with courage and sedulity but also with an open mind and active investigation. Military medicine practiced in 1918 differed substantially from that in 1914. This shift did not occur by happenstance. It represented collaboration between some of the brightest minds in academia and professional military doctors, combining their expertise to solve problems, take care of patients, and preserve fighting strength. It required multiple inter-allied conferences both to identify common medical problems and to determine optimal solutions. Reams of books and pamphlets buttressed the in-person instruction consultants provided to educate young physicians on best practices. Most significantly, this change demanded a willingness to admit a given intervention was not working, creatively try something new, assess its efficacy using data from thousands of soldiers, disseminate the knowledge, and ensure widespread application of the novel practice. No step was easy, and executing them all while fighting the Great War required a remarkable degree of perseverance, intellectual honesty, and operational flexibility.

Medical advances and improvements leading up to World War 2

With most of the fighting set in the trenches of Europe and with the unexpected length of the war, soldiers were often malnourished, exposed to all weather conditions, sleep-deprived, and often knee-deep in the mud along with the bodies of men and animals. In the wake of the mass slaughter, it became clear that the “only way to cope with the sheer numbers of casualties was to have an efficient administrative system that identified and prioritized injuries as they arrived.” This was the birth of the triage system. Medicine in World War I made major advances in several directions. The war is better known as the first mass killing of the 20th century, with an estimated 10 million military deaths alone, but for the injured, doctors learned enough to vastly improve a soldier’s chances of survival. They went from amputation as the only solution, to being able to transport soldiers to hospital, to disinfecting their wounds and operating on them to repair the damage wrought by artillery. Ambulances, antiseptics, and anesthesia, three elements of medicine taken entirely for granted today, came of age amid the suffering of the First World War.

Two Welshmen were responsible for one of the most important advances - the Thomas splint - which is still used in war zones today. It was invented in the late 19th century by pioneering surgeon Hugh Owen Thomas, often described as the father of British orthopedics, born in Anglesey to a family of "bone setters". The second was his nephew, the surgeon Robert Jones, who championed the splint's use on the Western Front.

In France, vehicles were commandeered to become mobile X-ray units. New antiseptics were developed to clean wounds, and soldiers became more disciplined about hygiene. Also, because the sheer scale of the destruction meant armies had to become better organized in looking after the wounded, surgeons were drafted in closer to the frontline and hospital trains were used to evacuate casualties.

When the war broke out, the making of prosthetic limbs was a small industry in Britain. Production had to increase dramatically. One of the ways this was achieved was by employing men who had amputations to make prosthetic limbs – most commonly at Erskine and Roehampton, where they learnt the trade alongside established tradespeople. This had the added advantage of providing occupation for discharged soldiers who, because of their disabilities, would probably have had difficulty finding work.

While it was not an innovation of war, the process of blood transfusion was greatly refined during World War I and contributed to medical progress. Previously, all blood stored near the front lines was at risk of clotting. Anticoagulant methods were implemented, such as adding citrate or using paraffin inside the storage vessel. This resulted in blood being successfully stored for an average of 26 days, simplifying transportation. The storage and maintenance of blood meant that by 1918 blood transfusions were being used in front-line casualty clearing stations (CCS). Clearing stations were medical facilities that were positioned just out of enemy fire.

One of the most profound medical advancements resulting from World War I was the systematic study of mental illness and trauma. Previously, any individual showing symptoms of neurosis was immediately sent to an asylum and consequently forgotten. World War I brought forward a new type of warfare that no one was prepared for in its technological, military, and biological dimensions.

Another successful innovation came in the form of the base hospitals and clearing stations. These allowed doctors and medics to categorize cases as serious or mild, and it came to light that many stress-related disorders were the result of exhaustion or deep trauma. “Making these distinctions was a breakthrough…the new system meant that mild cases could be rested then returned to their posts without being sent home.”

What do you think of trauma during World War I? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.

Posted
Author: George Levrier-Jones

The US Civil War was the first modern war in which the productive capacities of the industrial state were completely integrated into the war effort. This had significant impacts on the ability to kill and injure the enemy. Here, Richard Bluttal starts a three-part series on the impacts of trauma during wars by looking at the American Civil War.

Clara Barton, a nurse and founder of the American Red Cross, in the 1860s.

In early May 1864, Lieutenant General Ulysses S. Grant (1822-85) launched his Overland campaign, in which his Army of the Potomac clashed with Robert E. Lee’s Army of Northern Virginia in a series of battles in Virginia. Lt. J. E. Mallet of the Union army distinctly remembered the sensations experienced upon being hit: “I imagined that a cannonball had struck me on the left hipbone, that it took a downward course, tearing the intestines in its course, and lodged against the marrow of the right thigh bone. I fancied I saw sparks of fire, and curtains of cobwebs wet with dew, sparkling in the sun. I heard a monstrous roar of distant cataracts. I felt my teeth chatter, a rush of blood to my eyes, ears, nose and to the ends of my fingers and toes. I tried to get up, fell, and became completely insensible.” He described his wait on the battlefield (at least a day) and the journey to the hospital transport ship quite matter-of-factly. Like many men in his regiment, he had been struck by a Minié ball; the heavy, soft, unjacketed lead bullet flattened out on impact, producing severe wounds and carrying pieces of clothing into them.

The number of combat engagements during the American Civil War was the largest in history to that time, and exponential increases in the killing power of weapons produced rates of casualties beyond the imagination of military medical planners. In a four-year period, 2,196 combat engagements were fought, in which 620,000 men perished—360,000 in the Union Army and 260,000 in the Confederate Army. Some 67,000 Union soldiers were killed outright, 43,000 died of wounds, and 130,000 were disfigured for life, often with missing limbs; 94,000 Confederate soldiers died of wounds. Twice as many soldiers died of disease during the war as died in combat. During the 1860s, doctors had yet to develop bacteriology and were generally ignorant of the causes of disease. Generally, Civil War doctors had two years of medical school training, though some pursued more education. Medicine in the United States was woefully behind Europe; Harvard Medical School did not even own a single stethoscope or microscope until after the war. Most Civil War surgeons had never treated a gunshot wound, and many had never performed surgery. Medical boards admitted many "quacks" with little to no qualification. Yet, for the most part, the Civil War doctor (as understaffed, underqualified, and under-supplied as he was) did the best he could, muddling through the so-called "medical Middle Ages." Some 10,000 surgeons served in the Union army and about 4,000 served in the Confederate. Medicine made significant gains during the war. However, it was the tragedy of the era that medical knowledge of the 1860s had not yet encompassed the use of sterile dressings, antiseptic surgery, and the recognition of the importance of sanitation and hygiene. As a result, thousands died from diseases such as typhoid or dysentery.

Why did so many have to die? The deadliest thing that faced the Civil War soldier was disease. For every soldier who died in battle, two died of disease. In particular, intestinal complaints such as dysentery and diarrhea claimed many lives. In fact, diarrhea and dysentery alone claimed more men than did battle wounds. The Civil War soldier also faced outbreaks of measles, smallpox, malaria, pneumonia, and camp itch. Soldiers were exposed to malaria when camping in damp areas conducive to breeding mosquitos, while camp itch was caused by insects or a skin disease. In brief, the high incidence of disease was caused by inadequate physical examination of recruits, ignorance, the rural origin of soldiers, neglect of camp hygiene, insects and vermin, exposure, lack of clothing and shoes, and poor food and water.

The germ theory, which states that microscopic bacteria and viruses cause disease, was not yet understood (Sohn). These pathogenic microorganisms thrived in filthy environments, and the conditions soldiers lived in were horrendous. Because of water shortages in camps, items were rarely cleaned, including all medical tools. If scalpels or forceps were dropped on the ground, they were "only washed in tap water," according to one Civil War surgeon (Ledoux). Between operations, tools were not sterilized. Doctors rarely washed their hands, and even less often were their garments cleaned. No one yet knew why these post-surgery infections took place, nor how to prevent them.

Organization of Battlefield Medical Care

On July 16, 1861, Clara Barton watched more than 30,000 “noble, gallant, [and] handsome” Federal soldiers, “armed to the teeth,” march out of Washington to confront a Confederate army near Manassas Junction in what became the first major engagement of the war, the First Battle of Bull Run. Many around the country, soldiers and citizens alike, naively expected a short conflict with no need to prepare for large numbers of wounded. In the days before the battle, as the US Army neared contact with the enemy, the army’s medical department made no preparation to set up hospital sites until after the battle began. No permanent military hospital sites had been established in the city. Instead, sick soldiers were languishing in abandoned warehouses, churches, schools, and other public buildings. That meant Washington’s “hospitals” were already overflowing when the army left for battle. There was simply no space for more patients in these makeshift facilities. A significant number of Union wounded were left on the battlefield because the medical department didn’t have authority over most of the ambulances. The medical disaster at Bull Run in July 1861 convinced Clara Barton, ordinary citizens, and even the Union medical department to take the medical needs of the US Army in the aftermath of a battle more seriously.

How medical care was delivered on and off the battlefield changed during the war. Early on, stretcher bearers were members of the regimental band, and many fled when the battle started. There was no military ambulance corps in the Union Army until August of 1862. Until that time, civilians drove the ambulances. Initially the ambulance corps was under the Quartermaster corps, which meant that ambulances were often commandeered to deliver supplies and ammunition to the front.

If a soldier was injured during battle, volunteers took the howling victim behind the front lines using a stretcher made from canvas and wooden poles. From there, a horse-and-buggy-type wagon would cart them to the nearest field hospital. The stretcher-bearers would assess the condition of the patient, dividing them into three main categories: mortally wounded, slightly wounded, and surgical cases. They would then assist the patient to the best of their ability in the back of the jostling horse-drawn vehicle. This system was devised by Jonathan Letterman, medical director of the Army of the Potomac, and became known as the Letterman ambulance plan. It evacuated the injured more efficiently and paved the way for our modern ambulance system.

Combat Related Injuries

In order to be reported, a soldier had to be either transported to or make it back to a field hospital, and this may have resulted in an underreporting of deaths from cannon fire.  Most injuries resulted from the Minié ball invented by the French officer Claude-Etienne Minié in 1849. The Minié ball is a 0.58-caliber bullet that is slow moving and is made from soft lead. It flattens on impact and creates a wound that grows larger as the bullet moves deeper into tissues. It shatters bone above and below impact and usually does not exit. Because of its relatively slow muzzle velocity, it brought bits of clothing, skin, and bacteria into the wound. The majority of gunshot wounds occurred in the upper and lower extremities, but the fatality rate from these wounds was low. Only 18% of wounds were to the abdomen, but these were more often fatal from intestinal perforation in the preantibiotic era.

Non-Combat Related Death and Illness

A variety of factors contributed to a high rate of noncombat-related illness, including overcrowded and filthy camps. Latrines were often not used, were left uncovered, or drained into drinking water supplies. Food quality was poor from several standpoints: it was poorly stored, poorly cooked, and lacked enough vitamin C to prevent scurvy. The Army of the Potomac eventually added a number of rules: camps had to be pitched on new ground and drained by ditches 18 inches deep, tents had to be struck twice a week to sun their floors, cooking had to be done only by company cooks, all refuse had to be burned or buried daily, soldiers had to bathe twice a week and change clothing at least once a week, and latrines had to be 8 feet deep and covered by 6 inches of dirt daily.

There were few useful medications at the time, and about two thirds of all drugs were botanicals. In 1860 Oliver Wendell Holmes stated at the annual meeting of the Massachusetts Medical Society, “I firmly believe that if the whole materia medica, as now used, could be sunk to the bottom of the sea, it would be all the better for mankind,—and all the worse for the fishes”. Medications that were helpful included quinine for malaria, morphine, chloroform, and ether, as well as paregoric. Many others were harmful. Fowler's solution, used to treat fevers, contained arsenic. Calomel (mercurous chloride) was used for diarrhea. Mercury is excreted in high concentration in saliva, which led to excessive salivation, loss of teeth, and gangrene of the mouth and cheeks in some patients.

Battlefield Wounded and Surgery

Battlefield surgery was also at best archaic. Doctors often took over houses, churches, schools, even barns for hospitals. The field hospital was located near the front lines -- sometimes only a mile behind the lines -- and was marked with a yellow flag with a green "H". Anesthesia's first recorded use was in 1846, and it was commonly in use during the Civil War; there are 800,000 recorded cases of its use. Chloroform was the most common anesthetic, used in 75% of operations. In a sample of 8,900 uses of anesthesia, only 43 deaths were attributed to the anesthetic, a remarkable mortality rate of about 0.5%. Anesthesia was usually administered by the open-drop technique: the anesthetic was applied to a cloth held over the patient's mouth and nose and was withdrawn after the patient was unconscious. Surgeons worked all night, with piles of limbs reaching four or five feet. Lack of water and time meant they did not wash their hands or instruments. Bloody fingers often were used as probes. Bloody knives were used as scalpels. Doctors operated in pus-stained coats. Everything about Civil War surgery was septic; the antiseptic era and Lister's pioneering work in medicine were in the future. Blood poisoning, known as sepsis or pyemia (literally, pus in the blood), was common and often very deadly. Surgical fevers and gangrene were constant threats. One witness described surgery as such: "Tables about breast high had been erected upon which the screaming victims were having legs and arms cut off. The surgeons and their assistants, stripped to the waist and bespattered with blood, stood around, some holding the poor fellows while others, armed with long, bloody knives and saws, cut and sawed away with frightful rapidity, throwing the mangled limbs on a pile nearby as soon as removed." If a soldier survived the table, he faced awful surgical fever. However, about 75% of amputees did survive.

Amputation was the most successful method used to halt the spread of deadly infections, like gangrene, caused by battle wounds during the Civil War. Contrary to popular belief, the process was not as barbaric as it seemed; it was efficient and effective. After a soldier was injured on the battlefield, he was immediately bandaged by medical volunteers. He was shuttled to either the nearest field hospital or medical tent at a camp using the new ambulance system. On the way, the wounded soldier was given whiskey to ease his shock. Once the patient, still in great distress, was set on an "operating table," a chloroform-soaked cloth was held onto the patient's nose and mouth. Tourniquets were tightly secured above the amputation area to prevent the patient from bleeding out. A long, though often dull, blade was used to sever tissue and ligaments, then a serrated saw was used to cut through the bone. An experienced field surgeon could perform an amputation in under ten minutes.

Medical advances and improvements leading up to World War 1

The contributions to medical care that developed during the Civil War have not been fully appreciated, probably because the quality of care administered was compared against modern standards rather than the standards of the time. The specific accomplishments that constituted major advances were as follows:

1. Accumulation of adequate records and detailed reports for the first time permitted a complete military medical history. This led to the publication of the Medical and Surgical History of the War of the Rebellion, which was recognized in Europe as the first major academic accomplishment by US medicine.

2. Development of a system of managing mass casualties, including aid stations, field hospitals, and general hospitals, set the pattern for management of the wounded in World War I, World War II, and the Korean War.

3. The pavilion-style general hospitals, which were well ventilated and clean, were copied in the design of large civilian hospitals over the next 75 years.

4. The importance of immediate, definitive treatment of wounds and fractures was demonstrated, and it was shown that major operative procedures, such as amputation, were optimally carried out in the first 24 hours after wounding.

5. The importance of sanitation and hygiene in preventing infection, disease, and death among the troops in the field was demonstrated.

6. Female nurses were introduced to hospital care, and Catholic orders entered the hospital business.

7. The experience and training of thousands of physicians were upgraded, and they were introduced to new ideas and standards of care, including familiarity with prevention and treatment of infectious disease, with anesthetic agents, and with surgical principles that rapidly advanced the overall quality of American medical practice.

8. The Sanitary Commission was formed, a civilian-organized soldiers' relief society that set the pattern for the development of the American Red Cross.

In August of 1862, a physician named Jonathan Letterman set up the first ambulance system in the Union’s Army of the Potomac. With the support of Surgeon General William Hammond, he instituted a three-step system for evacuating soldiers from the battlefield and established the Ambulance Corps. Their first stop was a field dressing station, where tourniquets were applied and wounds were dressed. Then they moved to a field hospital, where doctors performed emergency medical procedures. Finally, ambulances would transport patients to a large hospital far from the battlefield for long-term treatment. The U.S. military uses the same basic system today.

What do you think of trauma during the US Civil War? Let us know below.

Now read Richard’s piece on the history of slavery in New York here.