World War Two caused immense misery and was far more of a truly global conflict than World War One: its battles took place across Europe, Asia, and Africa, and in seas the world over. Here, Richard Bluttal concludes his three-part series on the impacts of trauma during wars by looking at World War Two.
If you missed them, read part one on the American Civil War here, and part two on World War One here.
Lawrence McCauley was a member of the 65th Armored Field Artillery Battalion, trained to drive trucks armed with .50-caliber machine guns, halftracks, and landing craft, just in case. In England, preparing for the D-Day invasion, he became fast friends with Otto Lutz, a tall Chicagoan. “We were all very close,” he said of his unit when he was interviewed in 2020, at the age of 97 and living in Lewis Center. “You knew about their wives and children — everything you could know about your buddy, because there was nothing else to talk about.”
He and Otto were next to each other on a landing craft as it approached Omaha Beach. The front door dropped open and a bullet hit Otto in the forehead. McCauley remembers looking back and seeing his friend’s face sink beneath the water. “There was no stopping,” he said. “Our orders were ‘Don’t stop,’ because you’re better off as a moving target. That’s hard.”
The purpose of military medicine during World War II was the same as in previous wars: to conserve the strength and efficiency of the fighting forces so as to keep as many men at as many guns for as many days as possible. What transpired between 1939 and 1945 was a cataclysmic event made worse by the nature of the weapons the combatants used. The use of machine guns, submarines, airplanes, and tanks was widespread in World War I; but in World War II these weapons reached unimagined perfection as killing machines. In every theater of war, small arms, land- and sea-based artillery, torpedoes, and armor-piercing and antipersonnel bombs took a terrible toll in human life. In America's first major encounter, at Pearl Harbor, the survivors of the Japanese attack could describe what modern warfare really meant. Strafing aircraft, exploding ordnance, and burning ships caused penetrating injuries, simple and compound fractures, traumatic amputations, blast injuries, and horrific burns, to name just a few. Total U.S. battle deaths in World War II numbered 292,131, with 671,801 reported wounded or missing.
Conserving fighting strength and enabling armies and navies to defeat the enemy also meant recognizing that disease, more than enemy action, often threatened this goal. For example, during the early Pacific campaign to subdue the Solomon Islands, malaria caused more casualties than Japanese bullets. Following the initial landings on Guadalcanal, the number of patients hospitalized with malaria exceeded all other diseases. Some units suffered 100 percent casualty rates, with personnel sometimes being hospitalized more than once. Only when malaria and other tropical diseases were controlled could the Pacific war be won.
World War II service members lived through an inflection point in the history of medicine and warfare. In all previous US wars, non-battle deaths—related to conditions like smallpox, typhoid, dysentery, yellow fever, tuberculosis, and influenza—outnumbered battle-related fatalities. During the Spanish-American War, more than 2,000 of the approximately 2,400 deaths were due to causes other than battle. During World War I, 53,000 died due to battle versus 63,000 who died due to other causes. World War II marked the first time the ratio was reversed. Of 16.1 million who served, 405,399 died—291,557 of them in battle, and 113,842 due to other causes. A variety of factors contributed to the shift. Crucially, during World War II, the government mobilized expansive public, professional, and private resources to enhance health-related research and development, as well as services offered by the Army Surgeon General’s Office, which oversaw care for soldiers. Also, rather than creating mobilization and treatment plans from scratch, the military health apparatus built on knowledge and administrative infrastructure developed during and after prior conflicts.
Organization of Battlefield Medical Care
The military's top priority was to organize its medical services to care for battlefield casualties, make them well, and return them to duty. The systems developed by the army and navy worked similarly. In all theaters of war, but particularly in the Pacific, both army and navy medicine faced their greatest challenge in dealing with the aftermath of intense, bloody warfare fought far from fixed hospitals. This put enormous pressure on medical personnel closest to the front and forced new approaches to primary care and evacuation.
Army medics or navy corpsmen were the first critical link in the evacuation chain. From the time a soldier suffered a wound on a battlefield in France or a marine was hit on an invasion beach at Iwo Jima, the medic or corpsman braved enemy fire to render aid. He applied a battle dressing, administered morphine and perhaps plasma or serum albumin, and tagged the casualty. Indeed, one of the lingering images of the World War II battlefield is the corpsman or medic crouched beside a wounded patient, his upstretched hand gripping a glass bottle. From the bottle flowed a liquid that brought many a marine or soldier back from the threshold of death. In the early days of the conflict, that fluid was plasma. Throughout the war, scientists sought a better blood substitute and eventually developed serum albumin. Finally, in 1945, whole blood, rich in oxygen-carrying red cells, became available in medical facilities close to the battlefield.
If he was lucky, the medic or corpsman might commandeer a litter team to move the casualty out of harm's way and on to a battalion aid station or a collecting and clearing company for further treatment. This care would mean stabilizing the patient with plasma, serum albumin, or whole blood. In some cases, the casualty was then evacuated. Other casualties were taken to a divisional hospital, where doctors performed further stabilization, including surgery if needed. In the Pacific, where sailors, soldiers, and marines were all doing the fighting, both navy and army hospital ships, employed mainly as ambulances, provided first aid and some surgical care while ferrying casualties to base hospitals in the Pacific or back to the United States for definitive care. As the war continued, air evacuation helped carry the load. Trained army and navy nurses, medics, and corpsmen staffed the evacuation aircraft.
Combat-Related Injuries
The experience of a battle casualty in the Second World War was not radically different from that of the First World War. The most common injuries were caused by shells and bullets, and a casualty was evacuated through a similarly organized chain of medical posts, dressing stations, and hospitals. Common combat injuries included second- and third-degree burns, broken bones, shrapnel wounds, brain injuries, spinal cord injuries, nerve damage, paralysis, loss of sight and hearing, post-traumatic stress disorder (PTSD), and limb loss.
Non-Combat-Related Deaths and Injuries
Not all wounds are physical. In previous eras, the psychologically wounded suffered from "nostalgia" during the Civil War and "shell shock" in World War I. In World War II this condition was termed combat exhaustion or combat fatigue. Although the World War I experience of treating men close to the front had been successful, military psychiatrists and psychologists at the beginning of World War II had to relearn those lessons. Nevertheless, caregivers soon recognized that, given a respite from combat, a safe place to rest, regular food, and a clean environment, 85 to 90 percent of patients could again become efficient warriors. The more psychologically damaged received therapy in military hospitals.
In the Southwest Pacific, where death rates due to disease were highest, soldiers faced scourges like malaria, as well as tsutsugamushi fever, poliomyelitis, and diseases of the digestive system. In the northern theater—Alaska, Canada, Greenland, Iceland—threats included cold injuries like frostbite and trench foot. Neuropsychiatric disorders and venereal disease were widespread, regardless of where one served, including among those in the United States.
Army doctor Paul F. Russell recalled after the war an earlier statement from General Douglas MacArthur, who had reported that he “was not at all worried about defeating the Japanese, but he was greatly concerned about the failure up to that time to defeat the Anopheles mosquito,” the vector for malaria. By war’s end, more than 490,000 soldiers had been diagnosed with malaria, equating to a loss of approximately nine million “man-days.”
Between 1941 and 1944, more than 10 percent of the roughly 15 million men examined—about two million—were excluded from service; 37 percent of those rejections were based on neuropsychiatric findings. Still, diagnoses of mental “disorders” within the military climbed well beyond expectations. A total of one million soldiers were admitted for neuropsychiatric illness, constituting approximately 6 percent of all wartime admissions. Within two years of American entry into the war, it was clear that so-called combat stress or “exhaustion” would pose a major threat to soldiers and the army they served, as it had in prior generations. The experiences and realizations of the World War II period had important implications for the future of military medicine.
Army officials began devoting more resources to neuropsychiatric treatment because of an imperative to increase return-to-duty rates, but the long-term impact of care on individual service members was questionable. In early 1943, military psychiatrists noted that men in the Tunisian campaign diagnosed as “psychiatric casualties” were generally lost to their units after being transferred to distant base hospitals. To increase retention, they instituted principles of “forward psychiatry” that had been adopted by World War I-era armies—and subsequently largely disregarded by World War II planners in the United States: treat patients quickly, in close proximity to battle, and with the expectation that they would recover. After army psychiatrist Frederick Hanson reported in the spring of 1943 that 70 percent of approximately 500 psychiatric battle casualties had been returned to duty thanks to this approach, it was gradually adopted in other theaters. Still, military psychiatrists acknowledged the method was hardly a panacea. Systematic follow-up studies were lacking, but one contemporary account noted that many who underwent treatment were unable to return to combat, and some who did “relapsed after the first shot was fired.”
Medical Advancements and Improvements
Battlefield medicine improved throughout the course of the war. At the beginning, only plasma was available as a substitute for lost blood. Serum albumin was developed next as a more effective substitute, and by 1945 whole blood, rich in the oxygen-carrying red cells and considerably more effective than plasma alone, was available close to the front. This was also the first major war in which air evacuation of the wounded became available.
During the war, surgical techniques such as the removal of dead tissue resulted in fewer amputations than in any previous conflict. To treat bacterial infections, penicillin and streptomycin were administered on a large scale in combat for the first time.
Service members with combat fatigue, a condition later known as post-traumatic stress disorder, were given a safe place to stay away from battle zones, with plenty of food and rest. About 90% of these patients recovered enough to return to the fight.
The war also brought about the mass production of antibacterial drugs, especially sulfanilamide and penicillin, helping both find widespread respect, production, and use.
In 1928, when Scottish bacteriologist Alexander Fleming noticed that a strange mold had taken over his Petri dishes and eliminated the bacteria on them, his findings didn’t get much notice. But Fleming continued his research and kept talking up what he called “mold juice” (he didn’t come up with the name “penicillin” until later), eventually winning a Nobel Prize and attracting the attention of drug maker Pfizer. The company soon began mass-producing the drug for distribution to medics during WWII and, ultimately, to doctors and hospitals across the country.
In 1932, German biochemist Gerhard Johannes Paul Domagk discovered that the compound sulfanilamide could vanquish deadly strains of bacteria, like the streptococcus in his lab mice and in his first human test subject, his gravely ill young daughter. The wide distribution of so-called “sulfa drugs” began when World War II soldiers carried powdered sulfanilamide in their first-aid kits. By the end of the war, doctors were routinely using these drugs to treat streptococcal infections, meningitis, and other diseases.
In the tropical islands of the Pacific, malaria was a serious threat. Service members received atabrine, an antimalarial drug, before going into affected areas.
Service members also received vaccinations against smallpox, typhoid, tetanus, cholera, typhus, yellow fever, and bubonic plague, depending on where they were sent.
Other improvements during World War II included better crash helmets. Because of improvements like these, the survival rate for the wounded and ill climbed to 50% during World War II from only 4% during World War I, according to Dr. Daniel P. Murphy, who published a paper on "Battlefield Injuries and Medicine."
As medical advancements progress, so does the capability of our medical teams to treat our servicemen and women when they are injured in the field.
What do you think of trauma during World War II? Let us know below.
Now read Richard’s piece on the history of slavery in New York here.