Propaganda and censorship have long been tools of war - particularly since the advent of the printing press and electronic means of communication. Here, Amy Chandler looks at their role in Britain during World War Two.

A British World War II propaganda poster related to the 1940 Battle of Britain.

The way that society consumes news and information today is ever changing, with the influx of social media and less direct channels of information such as podcasts, and a plethora of broadcasters all vying for attention in a saturated market. Many of these information sources have different validation processes, or have multiple eyes ensuring that information is correct and up-to-date. Recently, there have been cases of advanced technology such as artificial intelligence (AI) manipulating images, videos and voices to spread false information. The war against misinformation is rife, but during the Second World War (WW2) Britain was fighting not just Germany and its allies, but also a war to keep secrets out of enemy hands. The rules of censorship were strict, and the process of approving news reports was lengthy under the principle of ‘self enforcement’. Under this policy, newspapers were issued with topic guidelines that adhered to censorship, and reporters submitted stories for review. (1) These stories underwent rigorous review and were redacted under the official policy - for example, removing weather reports, the locations of military manoeuvres and any other information that could be used to compromise British operations. Only approved reports received an official stamp, with changes marked in blue pencil, and stories deemed unacceptable and not ‘Passed for censorship’ left their publishers liable to prosecution. In some instances the Ministry of Information (MOI) applied retrospective censorship to news outlets - for example, over the arrival of the British Expeditionary Force in France in 1939, which caused a crisis in Government and disgruntlement in the Press. (1) This article will explore how the British Government used wartime propaganda to boost morale, and how important censorship was in ensuring military victory.

 

Ministry of Information

In the face of war across Europe, the British Government passed the Emergency Powers (Defence) Act (1939), which granted the Government power to take any action deemed necessary in wartime, extending to control over many areas of society, such as rationing and blackouts. (2) These defence regulations superseded the usual channels and processes of law making. The outbreak of war changed the way society ran and in turn created a higher level of state control to achieve order. Britain is generally presented as a liberal country in which freedom of speech is a necessity. However, in the throes of war, controlling what information was broadcast was not easy. During wartime the MOI was a “servant of all Government Departments”, as the majority of departments needed to “use publicity campaigns to tell the public what they would like them to do and why.” (3) These campaigns were integral to Britain’s survival and operations. A Parliamentary debate in 1944 raised the issue of what to do when too many departments wanted to publicise a message or campaign, and how to ensure the public were not overwhelmed by mixed messages. The MOI was dedicated to co-ordinating important messages to the public and prioritising the most urgent campaigns. Many Members of Parliament were concerned about a “free-for-all competition” within Government for the “attention of the public, and for the very limited advertising space available in the Press.” (3) This concern hints at the complex workings behind the scenes of wartime Government in ensuring that all of society received its publicity campaigns. The campaigns were developed in close collaboration with Public Relations Officers and experts in advertising, and many messages relied greatly on regional offices to ensure that every region received the appropriate information.

Aside from radio, film, posters and other forms of propaganda, the MOI also published a large number of books and pamphlets that one Member of Parliament (MP) described as “a new technique in publishing.” (3) This referred to official war books, like no other publication, that sought to present in print and in picture a “conspectus of the many sides of Britain’s war achievements.” (3) These books were a great success nationally, with home sales of 23,000,000 copies, and similarly successful in the USA, where one book, Combined Operations, sold 350,000 copies in a year and was translated into 12 languages. (3) Despite the success of these publications, the process of producing such material was lengthy, vast and complex, with multiple departments working in collaboration to write, proofread, check and re-examine. The MOI self-published many books, but chose to publish twice as many through private publishers to keep up with the amount of information being produced. Not all attempts were successful: early attempts at distributing propaganda and information were forced, with pamphlets tucked inside books, and on one occasion the MOI became embroiled in a copyright dispute.

The MOI also employed other media, such as film, to circulate its public notices and propaganda. In 1943, the Ministry’s film division produced 160 films in English, although many members of the public dismissed these as “dreary” documentaries. Even in the midst of war, the MOI was already planning and preparing films to circulate across liberated Europe. These films were ready to be sent and shown in each country as it was liberated, displaying the role Britain had played in the war since “Goebbels’ blanket of darkness spread over their heads.” (3) Alongside films displaying Britain’s pivotal role, the MOI also intended to circulate a number of British-made entertainment and feature films. In 1944, for example, France received a batch of French films from Britain, and several films translated into 15 languages were awaiting distribution by that year. The British Government’s relationship with the film industry was a mutually beneficial partnership in which the MOI commissioned several feature-length films and in return helped with the production of 38 commercial films. (3) In many of these cases, it appears that Britain was preoccupied with its image and reputation in Europe, ensuring its efforts would not be forgotten after 1945. In many ways, this was also a way for Britain to assert dominance and reclaim its political standing in Europe after a period of political and economic crisis, and of fragmented and re-drawn borders and alliances on the international stage.

 

Keeping up the war effort

While Europe was at war, the bleak reality of life was unavoidable, so propaganda was designed to maintain morale and influence opinion abroad. At home, propaganda aimed to encourage public responsibility and a feeling of directly contributing to Britain’s fight, focusing on rationing, blackouts, secrecy and recruiting women into the workforce. One poster in particular, commissioned by the National Savings Committee in 1943 and titled Squander Bug, aimed to discourage wasteful or personal spending. (4) The poster depicted a series of scenes in which a woman went shopping and the squander bug encouraged her to buy products that were too expensive or unnecessary, all while the bug took pleasure in the damage the overspending did to the war effort. The poster was aimed at women and encouraged the public to save or invest money in the war effort. The poster’s artist, Phillip Boydell, created a bug covered with swastikas, the Nazi German symbol, to associate wasteful spending and ‘squandering’ money with helping the enemy rather than Britain. The poster’s slogan reads ‘Don’t take the squander bug when you go shopping.’ (4) This was another way for Britain to visualise Nazi Germany for the British public, instead of fighting an invisible enemy. The squander bug symbolised the enemy on a smaller scale, potentially suggesting that the enemy was inside the walls, waiting to take advantage.

Another poster, titled Dig for Victory (1939 - 1945), emphasised the importance of home-grown fruit and vegetables in sustaining food production all year round, after rationing was introduced in January 1940. The poster was brightly coloured and depicted a trug abundant with fresh vegetables and fruits, such as carrots, cabbage, courgettes, onions, peas and tomatoes. By the outbreak of war, 70% of Britain’s food was imported from abroad via key shipping routes that could easily be attacked or blockaded. (4) Interestingly, fruit and vegetables were never rationed despite the short supply network, while sugar, meat, fats and dairy products were. By 1943, over a million tons of fruit and vegetables were being grown across Britain. The poster was successful in encouraging the public to take control of food production, though the scarcity of products and long ration queues may also have been a deciding factor in why many grew their own vegetables.

Women were not the only ones targeted by propaganda; men were also the subject of an anti-gossip notice designed by Harold Foster, titled ‘Keep mum she’s not so dumb’ (1941). (4) This MOI campaign alerted the public to the threat of enemy spies and the danger of gossiping in social settings. The poster showed a woman in an evening dress surrounded by men in military uniform, gossiping and drinking, suggesting that anyone could be listening, even someone who appeared inconsequential. Many of these posters played on stereotypes and gender roles to promote their message. It was a form of control that did not necessarily stifle freedom of speech, but acted as a constant reminder that relied on feelings of accountability. Other posters covered salvaging and mending clothes, recruiting women to munitions factories and emphasising Britain’s allies, with political undertones.

Despite the MOI’s intention to use propaganda to boost morale and ensure the public played their part in the war effort, there were several forms of increased crime, such as breaches of the blackout and bending of the rules. The Blitz (1940-1941), in which Nazi Germany’s Luftwaffe bombed the East End of London and other major cities across Britain, provided new opportunities for looting. Historians have acknowledged that the Blitz created a determination to maintain the war effort through ‘Blitz Spirit’. But in a period of upheaval and turmoil, it is difficult to ascertain whether many carried on simply because they had no choice. On one occasion, looters used a bombing raid as an opportunity to raid a house in Dover; when the residents returned, they discovered their home had been stripped, even down to the carpets and pipes. While this case suggests uncontained thievery, it also paints a picture of desperation at a time when goods were heavily sought after and rationed. By 1940, 4,584 looting cases had been prosecuted at the Central Criminal Court (Old Bailey) in London, while others used bombings as cover for murder. Rational thought seemed not to exist for some looters: on one occasion a woman stole a pair of shoes from a shop window because “if those shoes were just left there, somebody will steal them”. (5)

Another report, in the Yorkshire Evening Post, questioned why someone would steal a sink. Wartime propaganda may have depicted a community working together to keep up the war effort, but it hid the darker aspects of society that flourished amid such chaos. (5)

 

 

Radio Hamburg & German Propaganda

British broadcasters and reporters were not the only ones the MOI had to worry about. One figure in particular was William Joyce, also known as Lord Haw Haw, who rose to popularity broadcasting German propaganda to British audiences; his broadcasts on Radio Hamburg reportedly reached 50% of the British public. Joyce was a firm supporter of the Nazis and travelled to Germany in August 1939 on a British passport, which he had lied to obtain, claiming to be a British citizen when he was in fact Irish. Once in Germany, he collaborated with the German Propaganda Ministry, beginning regular radio broadcasts in September 1939. He commonly issued threats and misinformation towards Britain in a bid to undermine morale. It is interesting that, while the MOI tried its best to censor and streamline exactly what information and propaganda the British public consumed, many still listened to Joyce’s broadcasts. (6)

Some historians have suggested that Britain’s deliberate decision not to lash out at enemy stations ensured that it did not ruin its reputation as a trusted news source. The BBC was advised to continue to report truthfully and accurately, but to withhold any information that would cause distress - for example, omitting the number of casualties while still reporting incidents. However, it should be asked whether the BBC was lulling the British public into a false sense of security instead of reporting the stark realities abroad. If Britain had banned such broadcasts, many would have found other ways to listen. The only solution was for the BBC to draw attention back to its own output in the form of entertaining content rather than dreary reports. But why did so many members of the British public tune in during the early years of the war? Joyce did not just spread propaganda; he also attempted to undermine key political figures such as the Prime Minister, Winston Churchill. The British public in many ways craved entertainment as an escape from the dreary news and uncertainty. When asked why they listened, some British listeners said they found the broadcasts entertaining and wondered whether what Joyce reported held a sliver of truth. Joyce was eventually captured and tried after Germany surrendered in 1945.

One example of detrimental censorship was the Hallsville School bombing in Canning Town, in the East End, where it was reported that only 77 civilians had been killed despite eyewitnesses claiming the figure was closer to 600. (7) The British Government denied the claim on the grounds that there was insufficient evidence to report such high numbers. It was seen as damaging for the Government to report such a devastating incident in case it destroyed mass morale. Furthermore, a media blackout was imposed on the Press to prevent publication of specific details of the location, photographs and casualties. This case emphasised the fine line between honesty and censorship, which could easily have caused the public to lose trust in a British Government denying something so blatantly obvious to eyewitnesses. (7)

 

Conclusion

In conclusion, the British Government’s desire to censor reports, withhold vital information and use propaganda was successful on the surface, but allowed darker and more sinister events, such as crime, to transpire at home. The British Government also took the opportunity to continue its legacy and reputation throughout liberated Europe through film, to secure its place in politics. It is worth noting that this is no different from how other countries employed political propaganda to ensure their success. While censorship and propaganda had many benefits in boosting morale, they also had negative consequences that alienated the public when lived events were reported incorrectly or denied outright. The war changed the way the media and radio operated and pushed the boundaries between dreary information and entertainment, as well as democratic principles. It is also significant that the BBC still censors what it broadcasts: for example, at the 2022 FIFA World Cup in Qatar, the BBC declined to show the opening ceremony on its main programming without explanation, although it was widely reported and implied that the ceremony did not align with its editorial values. Censorship still occurs, but in subtle ways that are not often recognised.

 


 

 

References

(1)   H. Irving, ‘Chaos and Censorship in the Second World War’, 2014, Gov.UK < https://history.blog.gov.uk/2014/09/12/chaos-and-censorship/ >[accessed 23 May 2024].

(2)   UK Parliament, ‘Emergency Powers (Defence) Act 1939’, 2024, UK Parliament < https://www.parliament.uk/about/living-heritage/transformingsociety/private-lives/yourcountry/collections/collections-second-world-war/second-world-war-legislation/emergency-powers-defence-act-c20-1940-/   >[accessed 22 May 2024].

(3)   HC Deb, 29 June 1944, vol 401, cols 822 – 825.

(4)   The National Archives, ‘Second World War Propaganda Posters’, 2024, BETA The National Archives < https://beta.nationalarchives.gov.uk/explore-the-collection/explore-by-time-period/second-world-war/second-world-war-propaganda-posters/#:~:text=During%20the%20Second%20World%20War,production%2C%20salvage%20and%20military%20recruitment>[accessed 24 May 2024].

(5)   BNA, ‘Crime and the Blitz’, 2015, The British Newspaper Archives < https://blog.britishnewspaperarchive.co.uk/2015/07/17/crime-and-the-blitz/ >[accessed 24 May 2024].

(6)   IWM, ‘The Rise and Fall of Lord Haw Haw During the Second World War’, 2024, IWM < https://www.iwm.org.uk/history/the-rise-and-fall-of-lord-haw-haw-during-the-second-world-war >[accessed 28 May 2024].

(7)   M. Oakley, ‘Second World War Bombing Raid South Hallsville School’, 2023, East London History < https://www.eastlondonhistory.co.uk/second-world-war-bombing-raid-south-hallsville-school/ >[accessed 29 May 2024].

Operation Biting, also known as the Bruneval Raid, was undertaken by Britain against Nazi Germany in February 1942. It involved a daring raid on a radar station in Nazi-occupied northern France. Terry Bailey explains.

A photo of the radar near Bruneval, France, in December 1941.

As the Nazi forces of fascist Germany ravaged Europe, Britain and the Commonwealth stood alone upholding the ideals of freedom, until the USA entered the war against Japanese imperialism and European fascist brutality after the infamous attack on Pearl Harbor on December 7, 1941.

Britain had earlier curtailed Hitler's plans to invade Britain (Operation Sea Lion) by defeating the German air force (Luftwaffe) in the skies over Britain, in what has become known as the Battle of Britain.

Great Britain and the Commonwealth continued the fight against Nazi terror across a broad front, in large and small-scale actions. Some were to protect oil supplies and reserves, as in the North African campaign, while other military ventures were purely to offer resistance against the Nazi threat while Great Britain continued to rearm after the lack of military spending between the two world wars.

Winston Churchill, Britain's prime minister, was always adventurous and a risk-taker, promoting bold action, whereas the higher echelons of the military believed large-scale, well-planned campaigns were the only way to defeat the Nazi threat.

However, Winston Churchill understood that it was impossible to stand by as Nazi Germany terrorized Europe while Britain took time to rearm. With this in mind he ordered the creation of the Special Operations Executive (SOE), with orders to set Europe ablaze, along with the Commando forces, the fledgling Airborne units and the Combined Operations organization, which was tasked with coordinating specialist tasks and raids against occupied Europe.

Combined Operations coordinated missions by gathering the appropriate force required from the Royal Navy, Royal Marines, Army and Air Force, including the Airborne units and the Commando forces. (Note: from 1942 onwards the Royal Marines progressively trained as commandos, and they remain Great Britain's elite commando force to this day.)

These small-scale raids caused the German occupation forces disproportionate disruption and vast logistical headaches, and the success of this form of warfare eventually prompted Adolf Hitler to issue his infamous Commando Order (see the notes at the bottom).

Behind all the tumultuous events of the war, from large-scale actions to small-scale raids, the first electronic warfare race was underway between Great Britain and Germany, who had by this point competed for nearly a decade to develop and improve radar. This technology, developed from the early work carried out by Robert Watson-Watt, had already aided the Royal Air Force in defeating the German Luftwaffe in the Battle of Britain.

However, the Germans had also developed an extensive radar network along the French coast, providing them with early warning of Allied air raids. One such radar installation was located near the small village of Bruneval, on the Normandy coast. Intelligence reports suggested that this site housed a Freya and Würzburg radar array, a sophisticated system that the Allies still did not fully understand.

R. V. Jones, a British scientist tasked with researching how advanced German radar was in comparison to Britain's systems, was able to convince doubters not only that the Germans had radar, but that they had two types of radar. The system consisted of the Freya array and a second element of the Freya set-up, referred to in Enigma decrypts as Würzburg.

Freya was a long-range early-warning radar system but lacked precision, whereas Würzburg had a much shorter range but was far more precise. For Jones and his team to develop countermeasures against the Würzburg system, they needed to study one of the sets, or at least its more vital pieces of technology.

The British War Office recognized the critical importance of acquiring detailed information about the Würzburg radar. If the Allies could capture and study this technology, it would significantly enhance their countermeasures against the Luftwaffe. Thus, the idea of a commando raid to seize the radar components and gather intelligence was conceived. The responsibility for planning and executing this daring mission fell to the newly formed Combined Operations Headquarters, under the command of Vice-Admiral Louis Mountbatten (Commodore at that time).

It is against this backdrop of indirectly linked events that Operation Biting (the Bruneval Raid) was proposed in 1941, as German air defenses, thanks to their radar capability, became more effective against the Allied bombing campaign being waged on Germany.

 

Planning the Raid

Operation Biting was meticulously planned, with careful consideration given to every detail. The operation required a combination of precise military action, technical expertise, logistical coordination and, above all, intelligence.

This intelligence came not only from Enigma decryptions but also from human sources, in the form of the French resistance, coordinated through the Free French forces in London and sponsored by both the British SIS and SOE. Human intelligence was gathered by Gilbert Renault, known to the British by the code-name 'Rémy', and by several members of his resistance network.

Major John Durnford-Slater (Breveted Lieutenant Colonel), an experienced and resourceful officer, was chosen to lead the raid. Durnford-Slater was the commanding officer of No. 3 Commando, an elite unit specially trained for such operations.

Although designated No. 3 Commando, Nos. 1 and 2 did not exist at the time the unit was raised; the intention was to raise those as airborne units. Durnford-Slater's unit was therefore the first commando unit raised during the Second World War, and he is considered the first British commando of the war.

However, due to the extensive coastal defenses erected by the Germans to protect the installation from a seaborne raid, the British believed that a commando raid from the sea would suffer heavy losses while giving the German defenders sufficient time to destroy the installation.

Therefore, the planners decided on a night-time airborne assault, a method chosen for its element of surprise and its ability to insert troops directly into the vicinity of the target. This type of mission was well suited to a glider-borne assault; however, the glider force was even more embryonic than the parachute force.

 

Needless to say, the final choice was parachute insertion, led by Major John Frost, officer commanding C Company, 2nd Battalion (2 Para), 1st Parachute Brigade, which was tasked with carrying out the airborne phase of the operation. Frost, who would later gain fame for his role in the Battle of Arnhem, was a seasoned and respected officer with a reputation for bravery and tactical acumen.

 

The Execution of the Raid

On the night of February 27-28, 1942, the operation commenced, as a fleet of Armstrong Whitworth Whitley bombers, modified for paratroop deployment, took off from RAF Thruxton, carrying the raiding party. The aircraft flew across the English Channel under the cover of darkness, navigating carefully to avoid detection by German radar.

As the planes approached Bruneval, the paratroopers prepared for the jump. The landing zone was a field near the radar site, carefully selected for its proximity and relative isolation. Despite challenging weather conditions and the inherent risks of a night jump, the paratroopers landed with remarkable precision. They quickly regrouped and moved towards their objective.

The raiding party encountered immediate resistance from German troops stationed at the radar site. A fierce firefight ensued, but the airborne troops, utilizing their training, aggressive fighting spirit and superior tactics, managed to overcome the defenders. During the engagement, the paratroopers captured several German personnel, including a radar technician who would later provide valuable intelligence.

 

Capturing the Radar

With the site secured, the technical team, led by Flight Sergeant Charles Cox, set to work dismantling the Würzburg radar. This was a delicate and complex task, requiring both technical skill and speed. Cox and his team managed to extract the most crucial components, including the radar dish and its associated equipment, all while under the threat of counterattacks and the ticking clock.

As dawn approached, the raiding party signaled for the extraction phase. Landing craft and Royal Navy Motor Gun Boats (MGBs) plus Motor Launches (MLs), under the command of Commander F. N. Cook of the Royal Australian Navy, were positioned offshore to evacuate the raiders. The airborne raiders made their way to the extraction point on the beach, carrying the valuable radar components and escorting their prisoners.

The evacuation was fraught with danger, as German reinforcements were rapidly approaching. The landing craft hit the beach, with the covering troops opening fire on the German soldiers gathering at the top of the cliff, while the radar equipment, the German prisoners and all but six of the raiding force were embarked and transferred to the motor gunboats.

The raiding force then withdrew under the cover of naval gunfire. By the time the Germans reached the beach, the raiders were already en route back to England, escorted by a Royal Naval destroyer and Royal Air Force Spitfires.

 

The Aftermath and Impact

Operation Biting was hailed as a resounding success. The captured radar components and the intelligence gleaned from the raid provided the Allies with crucial insights into German radar technology. This knowledge enabled the development of effective countermeasures, helping diminish the effectiveness of the German radar network.

The raid also had a profound psychological impact, demonstrating the capability and determination of Allied elite Special Forces, boosting morale and showcasing the potential of combined operations. For the Germans, it was a stark reminder of the Allies' ability to strike with precision and impunity, even in seemingly secure locations. Additionally, this operation helped secure the validity of airborne forces for specialist raids.

In conclusion, Operation Biting stands as a showcase for the courage, ingenuity, and determination of the Allied forces during the Second World War. The successful execution of the Bruneval Raid not only provided vital intelligence but also demonstrated the effectiveness of combined arms operations and elite Special Forces. The legacy of the raid and its commanders continues to inspire military strategists and historians, highlighting the enduring importance of adaptability and innovation in warfare.

The success of the raid prompted the War Office to expand the existing British airborne forces, setting up the Airborne Forces Depot and Battle School in Derbyshire in April 1942, creating the Parachute Regiment, and converting several infantry battalions into airborne battalions in August 1942.

The Bruneval Raid remains a shining example of what can be achieved when meticulous planning, exceptional leadership, and unwavering bravery converge on the same goal. It also serves as a reminder of the sacrifices made by those who undertook such perilous missions, and of the profound impact these operations had on the course of history.

 


 

 

Note:

SOE operated independently until the successful Royal Marine raid on the port of Bordeaux, known as Operation Frankton, carried out between 7 and 12 December 1942. Because Combined Operations (using the Royal Marines) and SOE had independently mounted duplicate missions against the same target, a clearing house for special operations was set up to prevent further duplication of missions - a policy that is now standard practice in all NATO member countries.

Even though the mission was highly successful, some of the Royal Marines involved were executed by the Germans under the Commando Order. Yet the Germans themselves reportedly described Operation Frankton as the most courageous raid of all time.

 

Decorations and awards

Nineteen decorations were awarded, including a Military Cross (MC) for Major John Frost, a Distinguished Service Cross (DSC) for Commander F. N. Cook, and a Military Medal (MM) for Flight Sergeant Cox. The remaining awards were:

2 additional Distinguished Service Crosses (DSCs)

2 further Distinguished Service Medals (DSMs)

Another Military Cross (MC)

2 further Military Medals (MMs)

9 Mentions in Dispatches (MiD)

 

In addition to these awards, a bar to the Distinguished Service Order (DSO) went to Wing Commander Percy Charles Pickard of No. 51 Squadron, Royal Air Force, which provided the aircraft and aircrew needed for the operation.

 

The Commanders and Their Legacies

Major John Frost, who led C Company, 2nd Battalion (2 Para), 1st Parachute Brigade during the raid, continued to distinguish himself throughout the war. He played a pivotal role in the Battle of Arnhem during Operation Market Garden in 1944, where his leadership and tenacity earned him widespread admiration. Although he was captured and endured the hardships of a prisoner of war, his legacy as a courageous and skilled leader remained intact.

After the war, he continued to serve in the British Army, eventually retiring as a major general. His memoirs, "A Drop Too Many," provide a detailed account of his wartime experiences and the Bruneval Raid.

Major John Durnford-Slater, the commander of No. 3 Commando, also had a distinguished military career, leading his unit in several other successful operations.

Durnford-Slater's leadership and innovative tactics, along with those of several other figures, helped shape the future of British Special Forces. After the war, he retired from the military and wrote "Commando: Memoirs of a Fighting Commando in World War II," which remains a seminal work on commando operations.

Once the war was over, he reverted to the rank of Captain, before being promoted to Major in January 1946, retiring a month later with the honorary rank of Brigadier. He maintained his contact with the military, however, and in 1947 went on to the Reserve list, where he remained until 1964, when he reached mandatory retirement age.

 

The German Commando Order

 

The order itself stated:

1.   For a long time now our opponents have been employing in their conduct of the war, methods which contravene the International Convention of Geneva. The members of the so-called Commandos behave in a particularly brutal and underhanded manner, and it has been established that those units recruit criminals not only from their own country but even former convicts set free in enemy territories. From captured orders, it emerges that they are instructed not only to tie up prisoners, but also to kill out-of-hand unarmed captives who they think might prove an encumbrance to them, or hinder them in successfully carrying out their aims. Orders have indeed been found in which the killing of prisoners has positively been demanded of them.

2.   In this connection, it has already been notified in an Appendix to Army Orders of 7.10.1942. that in future, Germany will adopt the same methods against these Sabotage units of the British and their Allies; i.e. that, whenever they appear, they shall be ruthlessly destroyed by the German troops.

3.   I order, therefore:— From now on all men operating against German troops in so-called Commando raids in Europe or in Africa, are to be annihilated to the last man. This is to be carried out whether they be soldiers in uniform, or saboteurs, with or without arms; and whether fighting or seeking to escape; and it is equally immaterial whether they come into action from Ships and Aircraft, or whether they land by parachute. Even if these individuals on discovery make obvious their intention of giving themselves up as prisoners, no pardon is on any account to be given. On this matter, a report is to be made on each case to Headquarters for the information of Higher Command.

 

 

4.   Should individual members of these Commandos, such as agents, saboteurs etc., fall into the hands of the Armed Forces through any means – as, for example, through the Police in one of the Occupied Territories – they are to be instantly handed over to the SD. To hold them in military custody – for example in P.O.W. Camps, etc. – even if only as a temporary measure, is strictly forbidden.

5.   This order does not apply to the treatment of those enemy soldiers who are taken prisoner or give themselves up in open battle, in the course of normal operations, large-scale attacks; or in major assault landings or airborne operations. Neither does it apply to those who fall into our hands after a sea-fight, nor to those enemy soldiers who, after air battle, seek to save their lives by parachute.

6.   I will hold all Commanders and Officers responsible under Military Law for any omission to carry out this order, whether by failure in their duty to instruct their units accordingly or if they themselves act contrary to it.

Henry Wallace was Franklin D. Roosevelt’s third-term vice president. He was forced off the Democratic ticket by Democratic Party leaders in 1944. But what would have happened had he kept his place and succeeded to the presidency? Here, Benn Steil considers how the Cold War might have unfolded under a Wallace presidency.

Henry Wallace in 1940.

In a 2012 “documentary” film and book titled The Untold History of the United States, filmmaker Oliver Stone contended that there would have been “no Cold War” had Henry Wallace, FDR’s third-term vice president, not been forced off the ticket by reactionary Democratic Party leaders in 1944.[1] Wallace, rather than Harry Truman, would have become president on FDR’s death the following April, and would, Stone claims, have successfully pursued a policy of peace.

Based on a multitude of primary-source accounts of the nomination battle between Wallace and Truman, and my review of the careers of all 1,176 Democratic convention delegates, whom Stone (and others) have alleged were bribed with ambassadorships and the like, I can safely conclude that this was no case of a “stolen election”—Truman won fairly and convincingly.[2] But this paper will look at the much more compelling and interesting question—which has been raised not just by the polemicist Stone, but by serious scholars—of whether the Cold War was avoidable with a different American president, pursuing very different policies.  In the case of a Wallace presidency, we know that there would have been no Truman Doctrine, no Marshall Plan, no NATO, no West Germany, no western European integration, and no policy of containment.  All of these initiatives, foundational to what has been called “the American Century,” Henry Wallace denounced as imperialistic and unjustifiably hostile to the Soviet Union.

 

Wallace’s Beliefs

With utter conviction, Henry Wallace believed that friendly, trusting cooperation between the United States and the Soviet Union was essential to spreading global peace and prosperity after the Second World War.  He also believed that the fault for rapidly deteriorating relations between the two great powers after the February 1945 Yalta conference lay primarily with the United States (and Great Britain), whose original sin was to oppose the Bolsheviks’ rise to power after 1917.  Wallace, a deeply religious man, abhorred Communism as a misguided godless ideology, but admired major elements of Soviet planning, such as agricultural collectivization, on the grounds that they were, to his mind, being driven by technocrats in the interest of advancing industrial progress and “economic democracy.” He was convinced that building a global “Century of the Common Man” required a blending of American political democracy with the Soviet economic version.

Though Wallace was a brilliant agricultural geneticist, who with great insight and persistence revolutionized the development of superior strains of crops, he was also fascinated, throughout his adult life, with what he considered alternative ways of “knowing.” These included astrology, theosophy, and mysticism.  He defended these interests on the basis of the writings of the neo-transcendentalist psychologist and philosopher of religion William James—highly controversial writings about the rationality of “belief.”

James was not interested in whether Jesus was the messiah, or whether the Jews were chosen.  For James, a “true” belief was one that was useful to the believer.  It was neither necessary nor useful to inquire as to whether a belief was true in the sense that it corresponded to some objective external reality, since that might be unknowable.  It was necessary only to ask whether the belief had practical value for the believer here and now, which in turn depended on the use to which he or she put it.  This conception of truth derived from the tenets of the pedigreed philosophical program of pragmatism.

Since much of what we require to make sense of the world is simply not available to us, it was, James argued, only rational to evaluate a belief based on whether it helped the believer to cope effectively.[3] Understanding “true” belief as being a property of the believer, and not something that could necessarily be shared by others, may not be commonplace.  Yet for those like Wallace, who internalized it, pragmatism freed them to examine spiritual systems and to reserve judgment until their effect on one’s ability to navigate the world could be evaluated.  Wallace embraced James’s controversial argument that it was often rational to believe without evidence, for the reason that access to evidence may first require the adoption of certain beliefs.[4] As a political figure, particularly at the apex of his career, Wallace would elevate James’s “beliefs about beliefs” to a central place in his quest to transform not just the content of American foreign policy, but the very way in which America conducted diplomacy.  He would never, however, take to heart the philosopher’s warning: that whereas “we have the right to believe” without evidence, we do so “at our own risk.”[5]

Wallace believed that peace with the Soviet Union would come naturally once Joseph Stalin and his government saw that American leaders truly believed in it—and set policy as if they believed it.  In Jamesian fashion, Wallace did not claim to have evidence that the Soviets would pursue peaceful policies if America did—that is, if it abandoned its overseas air bases, withdrew its troops from Asia, put its atomic bombs into UN escrow, and foreswore military and financial support for Greece, Turkey, and nationalist China.  Running for president as the Progressive Party candidate in 1948, Wallace explained that “you get peace by preparing for peace rather than for war.”[6] That is, peaceful behavior begets peaceful behavior in others.  He thus denied any legitimate role for military readiness or deterrence, contravening a basic tenet of thinking in international relations.  As observed by the scholar Hans Morgenthau, “the political aim of military preparations is . . . to make the actual application of military force unnecessary by inducing the prospective enemy to desist from [its] use.”[7] Consistent with Morgenthau’s thinking, Wallace had, in 1940, under the banner of “total preparation,” defended the buildup of American naval and air force bases in the Western Hemisphere.  “If we are properly prepared, we shall not have war on this hemisphere.”[8] His post-war political thinking therefore deviated radically not just from conventional thinking, but from his own pre-war expressions of it.

A few months after Wallace announced his candidacy for president, the U.S. Representative on the new UN Commission on Human Rights, Eleanor Roosevelt, who had staunchly opposed his removal from the ticket in 1944, wrote that Wallace was now “doing more wishful thinking than realistic facing of facts.” The Soviets, she said, “understand strength, not weakness.”[9] After garnering barely a million votes, and no electoral votes, in the 1948 election—coming in fourth behind Dixiecrat segregationist Strom Thurmond—Wallace became a political irrelevance to both the Soviets and the American Communists.  Stalin and six other Politburo members handling major foreign policy decisions voted in January 1949 to cease contacts with him.

With the advent of the Korean War in June 1950, however, Wallace found a convenient pretext to assert that the break with Moscow was his own doing.  Wallace condemned the Soviets for precipitating the North Korean invasion, and resigned from the Progressive Party.  Stalin, he asserted rightly (though without the documentary evidence we have now), precipitated the invasion to incite war between the United States and China.  He was now, he said in December, “convinced that Russia is out to dominate the world.”[10]

In 1952, he wrote a piece in the New York Times entitled “Where I Was Wrong,” in which he confessed his failure to see “the Soviet determination to enslave the common man morally, mentally, and physically for its own imperial purposes.” Though he had in 1948 blamed the Communist takeover in Czechoslovakia on the U.S. ambassador and “rightists” in the Czech government, he now regarded his earlier defense of Prague’s “Moscow-trained Communists” as “my greatest mistake.”

In a New York Times interview eleven years later, he went further.  “I was mistaken,” Wallace confessed, “in my estimate of the Russians’ intentions.  I believed then that Stalin was prepared to be the kind of partner in peace that he had been in war.  I believed that, if we could overcome the Russians’ centuries-old distrust of Western imperialism and their later fear of Western capitalism, they would collaborate in the rebuilding of a truly democratic world.”[11]

These were remarkable admissions, unacknowledged by Stone and other prominent Wallace acolytes, that he had been unjustified in blaming the United States for what he had previously termed “defensive” acts of Communist aggression and expansion.  “[W]e can do a great deal to end any abuses on [Russia’s] part,” he had said in 1947, “through economic assistance and sincere pledges of friendship with the Russian people.”[12]  Still, in spite of his now condemning “Russian Communism” as “something totally evil,” he maintained, illogically, that “the whole course of history” would have been different had Roosevelt “remained alive and in good health”—as if Roosevelt could have vanquished “evil” with unilateral disarmament and words of peace.[13]

 

Soviet Beliefs

On Wallace

The Soviets began paying keen attention to Wallace in 1942, when Andrey Gromyko, then counselor at the Soviet embassy in Washington, learned, and independently corroborated, that Wallace had defended the Soviet invasion of Finland three years prior.  Gromyko cabled the information to Moscow, stressing that Wallace was “the most probable Democratic Presidential candidate” in the 1944 election.[14]

In late May of 1944, less than two months before the Democratic convention at which Truman would replace him on the ticket, Wallace began a four-week tour of Siberia—a tour mischievously suggested by FDR after refusing Wallace’s request to visit Moscow.  The Soviets, at great cost, constructed a Potemkin continent for him, disguising labor camps and shepherding him, under intensive NKVD watch, through suddenly stocked stores, enterprises newly staffed by Communist officials, and concerts performed by political prisoners.  Despite the vice president’s glowing praise for Stalin’s accomplishments in Asia, intelligence agents stole and copied his diary—before confirming for Moscow that his public sentiments appeared genuine.

Wallace went on to meet with Chiang Kai-shek in Chungking, where the Soviets spied on him intensively.  They discovered that Wallace had, outside earshot of his State Department minder, urged Chiang to make territorial and commercial concessions to Moscow to smooth relations after the war.  The intelligence find naturally went up to Stalin.  The Soviets interpreted Wallace’s extraordinary unauthorized intervention as a sign that the U.S. administration would give them a free hand in Manchuria, leading to a rapacious nine-month occupation of the region from August 1945 to May 1946.  By the end of that occupation, Mao’s forces were able to use it as a base to defeat Chiang’s Kuomintang and unify the mainland under Communist control.

In perhaps the most concise summary of Soviet views of Wallace, assistant foreign minister Andrey Vyshinsky, lead prosecutor at the notorious Moscow show trials of 1934 to 1938, reported to Stalin in October 1947, after meeting with Wallace at the Soviet consulate in New York, that the soon-to-be Progressive Party presidential candidate was both “sympathetic to us” and “somewhat naïve.”[15] In March and April 1948, Wallace would meet secretly with Gromyko, now UN ambassador, to plead for Stalin’s endorsement of his peace ideas—ideas that Wallace said that Stalin could draft for him.  Yet the Soviets would still not accept his sincerity.  Gromyko cabled Moscow that Wallace’s thoughts on disarmament were, lamentably, “much like the official position of the Americans and the British, who consider trust a prerequisite for disarmament.” The Soviets were demanding immediate American nuclear disarmament, while rejecting their own participation in any international inspections regime; neither trust nor verification was to be part of the equation.  The Soviet treatment of Wallace, their most consistent and genuine friend in the Roosevelt and Truman administrations, bore out George Kennan’s quip, in 1946, that even if the United States were to disarm entirely, deliver its “air and naval forces to Russia,” and resign “powers of government to American Communists,” the Soviets would still smell a trap.[16]

The wider point is that the Soviets never showed the slightest regard for Wallace’s Jamesian belief in world peace. Wallace was an avowed capitalist (albeit one with an aberrant love of planning), and part of an imperialist establishment with which peace was only possible as a temporary political expedient.  To be sure, Stalin would have welcomed a Wallace presidency, but not because it would have reduced the need for expanded frontiers in eastern and central Europe and northeast Asia, or rapid development of an atom bomb.  Though Wallace insisted publicly that Stalin wanted peace “above everything else,” and that Soviet policy was directed at “the achievement of economic and social justice,”[17] the truth was quite different.

 

On Imperial Expansion

“I saw my mission in extending the borders of our Motherland as far as possible,” Vyacheslav Molotov, Stalin’s longtime foreign minister, would explain in retirement.  “It seems, Stalin and I, we coped with this task pretty well.”[18]  In Russian security thinking, there was never a meaningful distinction to be drawn between offense and defense.  With a western border stretching thousands of miles through unprotected plains, defense always required, in their view, extending Russian domination further into new “buffer” zones.  Stalin and Molotov would therefore have welcomed a Wallace presidency not because it meant “peace” but because it would have lessened American resistance to Soviet expansion.

Particularly telling is Molotov’s explanation of why Moscow abandoned territorial claims on Turkey and withdrew its 300,000 troops from the country’s borders in 1946.  “It was a good thing we retreated in time,” he said, referring to Truman’s warnings and show of naval force in the region.  “Otherwise it would have led to a joint [Anglo-American] aggression against us.” It was not American disarmament, military retrenchment, and pledges of peace that saved Turkey, but rather American resolve.[19] Yet Wallace opposed financial and military aid to Turkey in 1947, declaring blithely that “there is no Communist problem in Turkey.”[20]

In Greece, where there was most assuredly “a Communist problem,” Wallace opposed aid on the grounds that “Truman’s policy will spread Communism.” Each Communist death “by American bullets,” he said, would bring forth ten more Communists.[21]  Yet by October 1949, thanks to U.S. aid, the Communist guerillas would be defeated.  And in February 1952, Greece would become a member of the new U.S.-led NATO security alliance.  Stalin stayed out of the Greek civil war, and scolded the Yugoslavs to do so as well, not because of American peace pledges, but because he knew that Truman would not let the Communists win.  “[D]o you think that . . . the United States, the most powerful state in the world,” he scolded Yugoslav diplomats in early 1948, “will permit you to break their line of communication in the Mediterranean?  Nonsense!”[22]

In Germany, the heart of the early Cold War conflict, Wallace opposed the creation of a separate democratic state in the west.  Whereas Wallace believed that division of the country would lead to war with the Soviets, division in fact prevented it, as neither the United States nor the Soviet Union could countenance a united Germany being an ally of the other.  Stalin’s determination to dominate a unified country is clear.  “All of Germany must be ours,” he told Bulgarian and Yugoslav leaders in 1946.  “That is, Soviet, Communist.”[23]

In June of 1950, Stalin gave North Korean leader Kim Il-sung permission to invade the South, and urged Chinese leader Mao Tse-tung “to immediately concentrate nine Chinese divisions on the Korean border for volunteer action in case the adversary crosses the 38th parallel.” He pledged “to provide air cover” to protect them.[24]  His secret aim, Stalin explained to the Communist Czech president Klement Gottwald, was to “pull China into the struggle” and force the United States to “overextend itself.” This would “provide the time necessary to strengthen socialism in Europe” and “revolutionize the entire Far East.”[25]  These were hardly the words of a Soviet leader who, in Wallace’s eyes (until 1950), “really wants peace,” and was only reacting to American aggression.[26] Wallace concluded in 1952 that, “knowing more about Russia’s methods,” it had been “a serious mistake when we withdrew our troops” from the region in 1949—a withdrawal he had back then deemed essential to promoting world peace.[27]  He further explained that “Russian aggression” had caused him to reverse his opposition to the atom bomb.  Korea, he explained, now “justified” holding on to it.[28]

 

On Atom Bomb Development

On June 14, 1946, Bernard Baruch presented the U.S. atomic regulation plan to the new United Nations Atomic Energy Commission (UNAEC).  Andrey Gromyko countered with the Soviet plan five days later.

The two plans were fundamentally different.  The United States wanted internationalization of atomic energy control, but insisted on effective machinery for inspection and enforcement before giving up its bombs or the industrial technique to make them.  The Soviets held that international inspection would constitute intolerable interference into national sovereignty.  They wanted immediate American disarmament, with violations of any future treaty subject to remedy only by approval of the Security Council—and even then, only in cases involving “aggression.”[29] This framework appeared to give Moscow carte blanche to develop and deploy atomic bombs while America disarmed. Even if the Soviets were to use such bombs for “aggression,” they could veto any punishment.

Wallace, as Commerce secretary, had, in a July 23 letter to the president, attacked the U.S. atomic plan for its “fatal defect . . . of requiring other nations to enter into binding commitments not to conduct research into the military uses of atomic energy and to disclose their uranium and thorium resources while the United States retains the right to withhold its technical knowledge of atomic energy until the international control and inspection system is working to our satisfaction.”

“Is it any wonder,” Wallace asked rhetorically, “that the Russians did not show any great enthusiasm for our plan?” He predicted that the Russians would now “redouble their efforts to manufacture bombs,” and “may also decide to expand their ‘security zone’ in a serious way.” Such aggressive efforts would then be the fault of the United States.

But Wallace (or, rather, the Soviet agent who had drafted his letter—Harry Magdoff) had grossly mischaracterized the U.S. plan.  Rather than the various stages of disarmament and information-sharing being set according to U.S. whim and diktat, as Wallace had charged, Baruch’s proposal called for staged action according to “pre-arranged schedules.” This structure was precisely what Wallace was urging.

Wallace’s mischaracterizations were clearly taken from an article in the June 24 issue of Pravda, in which the Soviet journalist Boris Izakov charged, with no basis, that “the U.S. government [was] likely counting on determining on its own discretion the terms within which it will permit the international agency—‘in successive stages’—to take a peek at [its atomic] secrets.” It was, Izakov wrote, expecting “all other nations [to] show blind trust in [its] intentions.”[30] Wallace had, in fact, discussed the Pravda “atomic blast” with the Times’s Felix Belair back on June 25, and referred to it in his July 23 letter to Truman.

The resemblance between the Pravda and Wallace critiques of Baruch is uncanny.  Wallace had simply accepted a Soviet caricature of U.S. policy as accurate, and had not even bothered to speak with his own country’s U.N. delegation before sending his letter to the president.[31] Once confronted with clear evidence from Baruch that his claims were inaccurate, however, not to mention damaging to the credibility of U.S. negotiators, he might have been expected to concede his mistakes.  Instead, he chose to reiterate his original position—that is, Pravda’s position.

Wallace’s July 23 letter had also offered a muddled defense of Gromyko’s counterproposal.  The Soviets wanted the United States to destroy all stocks of atomic weapons, finished or unfinished, within three months of an agreement’s signing.  In this respect, at least, according to Wallace, Moscow’s plan “goes even further than our[s]” toward international control of atomic energy.

But this assertion was nonsensical, since Moscow’s plan contained no provision for international inspection and no mechanism for punishing violations.  What Wallace had not understood was that completing a Soviet bomb had, since Potsdam, become Stalin’s overriding national objective.  “International” control—which Stalin understood to be synonymous with American control—could not have been of less interest to him.

The purpose of the Gromyko plan, unveiled in June 1946, was, the State Department’s George Kennan argued, to exploit “the merciless spotlight of free information” in America to compel U.S. disarmament while the Soviets “proceed[ed] undisturbed with the development of atomic weapons in secrecy.”[32] For Washington, therefore, any credible international plan to eliminate the weapons had to manage the processes of disarmament, inspection, and control simultaneously.

Underscoring the seriousness with which the Baruch plan took the integrity of such efforts, it required the permanent members of the U.N. Security Council to renounce their vetoes with respect to any agreement.  This provision was meant to ensure that no U.N. member could stymie the legitimate sanctions authority of the new atomic control agency.  But the Soviets refused to accept any weakening of veto rights.  To do so, Izakov wrote in Pravda, would mean “renouncing their sovereignty . . . in favour of the USA.” Wallace, notably, defended the Soviets by arguing that the veto was “completely irrelevant,” since the treaty signatories could simply declare war on a violator.  Yet this point underscored that no action short of war—war unsanctioned by any international authority—would be available to the signatories if a Security Council veto could block enforcement or punishment action.

The New York Times concluded, charitably, that the “vagueness” of Wallace’s attack on U.S. policy reflected a failure to “fortify his idealism with the necessary facts.” Moreover, in being “unpardonably careless with the deadly fireworks of atomic policy,” he had undermined prospects for success in critical and delicate negotiations.[33] What even Baruch had not understood at the time, though, was that these negotiations never stood any practical chance of success.

On June 21, 1946,[34] two days after Gromyko presented his counterproposal to the UNAEC, former NKVD head Lavrenty Beria, now supervising the Soviet bomb project, submitted to Stalin for approval a draft decree of the Council of Ministers of the USSR to begin actual production of atom bombs—the first one to be ready for testing by January 1, 1948 (too optimistic by twenty months).[35] All technical hurdles had been surmounted.  Stalin was now sure he had his bomb in sight, and so his diplomacy aimed at pressuring the United States to disarm while spinning out U.N. negotiations until it could be completed.

The appointment of the relentless Gromyko as Soviet representative to the UNAEC was central to carrying out the strategy of badger and delay.  “[T]he American project [remains] unacceptable in substance,” according to instructions he received from the Soviet Foreign Ministry on December 27, 1946.[36] “For tactical reasons,” however, “we believe that it is necessary not to decline discussion, but to suggest its discussion point by point, simultaneously insisting on introducing amendments.  Such tactics are more flexible and may give better results.” By rejecting Soviet counterproposals, the Americans would “bring odium on themselves for the break up.”[37]

That, however, would not happen.  On December 30, Baruch, with Truman’s backing, demanded that the UNAEC vote.  It went 10–0 in favor of the United States, with abstentions by the Soviet Union and Poland.  The Soviet proposals of 1947, following a joint statement by Canada, China, France, and the U.K. condemning them, would be officially rejected on April 5, 1948, by a vote of 9–2.[38] The Soviets got their stalemate, but failed to achieve any propaganda victory.

It may be argued that since nothing like the Baruch plan could ever have secured Soviet support, given Stalin’s determination to build the bomb, Wallace’s attack on it did little damage.[39] The plan, however, represented a sincere and serious approach to marrying disarmament with a robust inspection regime, one widely supported by top peace-loving, internationalist-minded American scientists, as well as prominent liberal political figures such as Eleanor Roosevelt.  As such, it deserved better than the glib treatment to which Wallace had subjected it.[40] At the very least, Wallace, by parroting Pravda and discrediting Baruch’s efforts among many progressives, only helped the Soviets escape their share of responsibility for the horrific atomic arms race that followed.

 

So Was the Cold War Inevitable?

In short, Wallace’s Jamesian belief in peace was gravely misguided.  From what we today know of Soviet ambitions in the early postwar years, a Wallace presidency could only have resulted in a delayed Cold War—delayed, that is, until November 1948, at which time he would almost surely have been defeated in an election.  Wallace himself doubted he could have swung Congress or “public opinion” in his favor.  “[I]t is a very grave question whether I would have been [elected] with the tactics that I would have used in order to preserve the peace,” he reflected in retirement.  Most likely, he concluded, “I was done a very great favor when I was not named in ’44.”[41]

In any case, a delayed Cold War would have come at great cost to U.S. security and economic interests.  A failure to resist and deter Stalin would likely have meant Soviet domination of northern Iran, eastern Turkey, the Turkish straits, Hokkaido, the Korean Peninsula, Greece, and all of Germany.  Stalin, contrary to Wallace’s professions of belief, coveted these territories, and never valued peace for its own sake.  As Churchill said in his famous “Iron Curtain” speech of March 5, 1946, Stalin did not desire war but “the fruits of war and the indefinite expansion of Soviet power and doctrines.”[42] And so he valued the occasion that a passive United States would have afforded him to expand his empire.  In light of both Russian history and geography, one may choose to characterize Soviet expansionism as either opportunistic offense or pre-emptive defense, but expansionist probing and penetration was inevitable—whoever was in the White House.

 

 

Benn Steil is senior fellow and director of international economics at the Council on Foreign Relations and the author, most recently, of The World That Wasn’t: Henry Wallace and the Fate of the American Century.

References

Arkhiv Prezidenta Rossiiskoi Federatsii (The Archive of the President of the Russian Federation) [AP RF], Moscow, Russia.

Arkhiv vneshnei politiki Rossiiskoi Federatsii (The Archive of the Foreign Policy of the Russian Federation) [AVP RF], Moscow, Russia.

Baldwin, Hanson W. “Atomic Energy Control: The Points in Dispute.” New York Times. October 6, 1946.

Batiuk, V.I. “Plan Barukha i SSSR”—Kholodnaia Voina. Novye podkhody. Novye dokumenty. Moskva: Institut Vseobshchei istorii RAN, 1995. (Batiuk, V.I. “The Baruch Plan and the USSR,” in The Cold War. New Approaches. New Documents. Moscow: The Institute of General History, Russian Academy of Sciences, 1995.)

Blum, John Morton (ed.). The Price of Vision: The Diary of Henry A. Wallace. Boston: Houghton Mifflin, 1973.

Churchill, Winston. “Sinews of Peace.” Fulton, Missouri. March 5, 1946.

Chuev, F. Sto sorok besed s Molotovym: Iz dnevnika F. Chujeva. Moskva: Terra, 1991 (Chuev, Felix. One Hundred Forty Conversations with Molotov: From the Diary of F. Chuev. Moscow: Terra, 1991.)

Devine, Thomas W. Henry Wallace’s 1948 Presidential Campaign and the Future of Postwar Liberalism. Chapel Hill: University of North Carolina Press, 2013.

Djilas, Milovan. Conversations with Stalin. San Diego, New York, London: Harcourt Brace & Company, 1962.

Eleanor Roosevelt Papers, George Washington University, Washington, DC.

Feinberg, Alexander. “Contrasting Views on Russian Moves.” New York Times. March 20, 1946.

Foreign Relations of the United States [FRUS]. Washington, DC: U.S. Government Printing Office.

Gerber, Larry G. “The Baruch Plan and the Origins of the Cold War.” Diplomatic History. Vol. 6, No. 1 (Winter 1982): 69–95.

Goldschmidt, Bertrand. “A Forerunner of the NPT? The Soviet Proposals of 1947.” International Atomic Energy Agency Bulletin. Vol. 28, No. 1 (March 1986).

Grieder, Peter. The East German Leadership, 1946–73: Conflict and Crisis. Manchester: Manchester University Press, 2000.

Hamilton, Thomas J. “Baruch Counters Wallace, Says Atomic Policy Stands.” New York Times. September 20, 1946.

Henry A. Wallace Collection, University of Iowa, Iowa City, Iowa.

Izakov, Boris. “Mezhdunarodnoe obozrenie.” “Pravda,” 24 ijunia 1946. (Izakov, Boris. “International Review.” Pravda. June 24, 1946.)

James, William. The Varieties of Religious Experience: A Study in Human Nature. New York, London, and Bombay: Longmans, Green, 1902.

——— . The Will to Believe. New York, London, and Bombay: Longmans, Green, 1896 [1912].

Krock, Arthur. “Mr. Wallace Contributes to a Growing Impression.” New York Times. October 4, 1946.

MacDougall, Curtis D. Gideon’s Army. Three Volumes. New York: Marzani & Munsell, 1965.

Mal’kov, V.L. “Igra bez myacha: sotsial’no-politicheskii kontekst sovetskoi ‘atomnoi diplomatii’ (1945–1949).” Holodnaia voina 1945–1963. Istoricheskaia retrospektiva: Sbornik statei pod red. Jegorova, N.I., Chibarian, A.O. Moskva: OLMA-Press, 2003. (Malkov, V.L. “Off the Ball Game: Social-Psychological Context of the Soviet ‘Atomic Diplomacy’ (1945–1949).” In The Cold War 1945–1963. Historic Retrospective, edited by N.I. Jegorova and A.O. Chubaryan. Moscow: OLMA-Press, 2003.)

Morgenthau, Hans J. Politics Among Nations: The Struggle for Power and Peace, Brief Edition. Revised by Kenneth W. Thompson. New York: McGraw-Hill, 1948 [1993].

New York Times. “Wallace Says U.S. Force Should Quit Iceland Base.” March 22, 1946.

New York Times. “Text of Secretary Wallace’s Letter to President Truman on U.S. Foreign Policy.” September 18, 1946.

New York Times. “Statement by Baruch on Controversy with Wallace and Texts of Exchanges Between Them.” October 3, 1946.

New York Times. “Some Facts for Mr. Wallace.” October 4, 1946.

New York Times. “Baruch vs. Wallace.” October 6, 1946.

New York Times. “Russia Says Korea Justifies Atom Bomb.” August 11, 1950.

New York Times. “Wallace Says Russia Seeks to Rule World.” December 4, 1950.

New York Times. “Wallace Declares ‘Mr. X’ Story False.” March 18, 1952.

Pechatnov, Vladimir O. “The Soviet Union and the World, 1944–1953.” In The Cambridge History of the Cold War, Vol. I: Origins, edited by Melvyn P. Leffler and Odd Arne Westad. Cambridge: Cambridge University Press, 2010.

Pechatnov, Vladimir O., and C. Earl Edmondson. “The Russian Perspective.” In Debating the Origins of the Cold War: American and Russian Perspectives, by Ralph B. Levering, Vladimir O. Pechatnov, Verena Botzenhart-Viehe, and C. Earl Edmondson. Lanham, MD: Rowman & Littlefield, 2001.

Phillips, Cabell. “At 75, Henry Wallace Cultivates His Garden.” New York Times. October 6, 1963.

Pigliucci, Massimo. “The Ethics (or Lack Thereof) of Belief.” Philosophy as a Way of Life (blog), August 31, 2022.

Reminiscences of Henry Agard Wallace, 1951–1953, Columbia Center for Oral History [CCOH], Columbia University, New York, New York.

Roosevelt, Eleanor. “Plain Talk About Wallace.” Courage in a Dangerous World: The Political Writing of Eleanor Roosevelt. Edited by Allida M. Black. New York: Columbia University Press, 1999.

Rossiiskii gosudarstvennyi arkhiv sotsialno-politicheskoi istorii (The Russian State Archive of Social and Political History) [RGASPI], Moscow, Russia.

Schapsmeier, Edward L., and Frederick H. Schapsmeier. Henry A. Wallace of Iowa: The Agrarian Years, 1910–1940. Ames: Iowa State University Press, 1968.

Sovetsko-amerikanskie otnosheniia 1945–1948: Dokumenty / Pod obshchei redaktsijei Yakovleva A.N. Mezhdunarodnyi fond “Demokratiia.” Moskva: “Materik,” 2004. (Soviet-American Relations 1945–1948: Documents / Academic editor Sevostianov, G.N. International Foundation “Democracy.” Moscow: “Materik,” 2004.)

Steil, Benn, The World That Wasn’t: Henry Wallace and the Fate of the American Century, New York: Avid Reader Press / Simon & Schuster, 2024.

Stone, Oliver, and Peter Kuznick, The Untold History of the United States, New York: Gallery Books, 2012.

Wallace, Henry A. “The UN and Disarmament.” The New Republic. December 23, 1946.

——— . “The Fight for Peace Begins.” The New Republic. March 24, 1947.

——— . “Stand Up and Be Counted.” The New Republic. January 5, 1948.

——— . “Where I Was Wrong.” The Week Magazine. September 7, 1952.

Wilson Center Digital Archive, Washington, DC.


[1] Stone and Kuznick (2012).

[2] See chapter 8 of Steil (2024).

[3] See, in particular, James (1902).

[4] James (1896 [1912]): https://www.gutenberg.org/files/26659/26659-h/26659-h.htm.  For an excellent critique of James’s “ethics of belief,” see Pigliucci (August 31, 2022): https://philosophyasawayoflife.medium.com/the-ethics-of-belief-f1d459c572e3.

[5] James (1896 [1912]).

[6] Wallace (January 5, 1948).

[7] Morgenthau (1948 [1993]: 34).

[8] Schapsmeier and Schapsmeier (1968: 259).

[9] January 2, 1948, “My Day” by Eleanor Roosevelt, Eleanor Roosevelt Papers, George Washington University. Roosevelt (1999: 245). Devine (2013: 68).

[10] New York Times (December 4, 1950).

[11] Phillips (October 6, 1963).

[12] MacDougall I (1965: 170-171).

[13] New York Times (March 18, 1952). Wallace (September 7, 1952).

[14] A. Gromyko, “Record of conversation with Counsel (in the rank of Minister) of the Mexican government in Washington—Don Louis Quintanilla,” September 30, 1942, AVP RF, Fond 0129, op. 26, P 143, file 2, p. 27. A. Gromyko, Counsel, Soviet Embassy in the USA, to A.Ja. Vyshinsky, Assistant People’s Commissar of Foreign Affairs, November 13, 1942, AVP RF, Fond 0129, op. 26, P 143, file 6, p. 28 (NKID US Department entry stamp—January 23, 1943).

[15] “Record of conversation of Assistant Foreign Minister A.J. Vyshinsky and V.A. Zorin with US politician H. Wallace on Soviet-American relations, New York, October 14, 1947, Top Secret,” RGASPI, Fond 82, op. 2, file 1308, p. 68. L. Baranov to M. Suslov, February 27, 1948, RGASPI, Fond 17, op. 128, file 1138, p. 59.

[16] The Chargé in the Soviet Union (Kennan) to the Secretary of State, March 20, 1946, in FRUS, 1946, VI: 721–23.

[17] Feinberg (March 20, 1946). New York Times (March 22, 1946).

[18] Chuev (1991).

[19] Pechatnov and Edmondson (2001: 119).

[20] Wallace (March 24, 1947).

[21] Wallace (January 5, 1948).

[22] Djilas (1962: 141).

[23] Grieder (2000: 12); Djilas (1962: 139); Pechatnov (2010: 103); Pechatnov and Edmondson (2001: 109).

[24] Filippov [Stalin] to Soviet ambassador in Peking for Zhou Enlai, July 5, 1950—RGASPI, Fond 558, op. 11, file 334, p. 79.

[25] Filippov [Stalin] to Mikhail Silin, Soviet Ambassador in Prague, for passing the message orally to Klement Gottwald, August 27, 1950, Wilson Center Digital Archive, referring to a still classified file in RGASPI, Stalin Papers, Fond 558, op. 11, file 62, pp. 71–72.

[26] Wallace (December 23, 1946).

[27] Wallace (September 7, 1952).

[28] New York Times (August 11, 1950). See also Correspondence from Henry A. Wallace to Wayne T. Cottingham dated September 11, 1950, Henry A. Wallace correspondence [reel 47], August 1950–January 1951—Ia47-0439–Ia47-0440, Henry A. Wallace Collection, University of Iowa.

[29] Gerber (Winter 1982). Hamilton (September 20, 1946).

[30] Izakov (June 24, 1946) (italics added). 

[31] Krock (October 4, 1946).  Blum (1973: 581–82).

[32] Memorandum for Under Secretary of State Dean Acheson, July 18, 1946, in FRUS, 1946, I: 861–62.

[33] New York Times (October 3, 1946). New York Times (September 18, 1946), “Text of Secretary Wallace’s Letter to President Truman on U.S. Foreign Policy.” New York Times (October 4, 1946). New York Times (October 6, 1946). Krock (October 4, 1946). Baldwin (October 6, 1946). Izakov (June 24, 1946).

[34] It may have been shortly before June 21, 1946, but no later than that date.

[35] The letter of L.P. Beria to I.V. Stalin, submitting for [his] approval the draft of the Decision of SM [Council of Ministers] of the USSR, “On the plan for the development of works of CB [Construction Bureau]-11 under Laboratory No. 2 of the AN USSR Academy of Sciences of the USSR,” no later than June 21, 1946. Strictly Secret (Special File), referring to Ryabev II (1999: 432–34); sourced from AP RF, Fond 93, file 99/46, p. 20.

[36] Malkov (2003: 311).

[37] Soviet-American Relations VI (2004: 356–57).

[38] Goldschmidt (March 1986: 62–63).

[39] For a Russian (post-Soviet) statement of this position, see Batiuk (1995: 85–98).

[40] Gerber (Winter 1982) argues, unconvincingly, that Baruch’s position was so unyielding that it never represented a credible effort to reach agreement with the Soviets.  But Baruch was always willing to negotiate within the confines of the U.N. General Assembly’s mandate to the UNAEC, to which the Soviet Union subscribed, which included setting up “effective safeguards” to prevent the misuse of atomic energy.  The Soviets never made a counterproposal which encompassed such safeguards.

[41] Reminiscences of Henry Agard Wallace, CCOH, pp. 4567–70.

[42] Churchill, speech, “Sinews of Peace,” March 5, 1946: https://www.nationalchurchillmuseum.org/sinews-of-peace-iron-curtain-speech.html.

A 1998 U.S. Department of Commerce report provided the following assessment of the emergence of the internet:

"The internet's pace of adoption eclipses all other technologies that preceded it. Radio was in existence 38 years before 50 million people tuned in; TV took 13 years to reach that benchmark. Sixteen years after the first PC kit came out, 50 million people were using one. Once it was opened to the general public, the Internet crossed that line in four years".

Here, Felix Debieux returns to the site and considers the military origins of the internet - and the role that the Vietnam War played in them.

A DEC PDP-10 computer. Source: Gah4.

Despite the hold that the internet now has on everyday life, it is difficult to imagine it as something that was ever invented. Unlike the car, the telephone, or the aeroplane, the internet has managed to achieve ubiquity without us being able to point to an obvious creator. Of course, we see traces of the internet in apps, video games and email, but there is nothing we can picture holding, touching, or turning over in our hands for inspection.

Maybe it is the ubiquity of the internet which makes it such a daunting prospect for historians. With so many applications, how would one even begin to trace its origins? Historians might also be put off by the technical workings of the internet – the circuit boards, networks and switches. The truth, however, is that the history of the internet is more straightforward than we might expect. In fact, were it not for its simplicity, it is arguable that the meteoric success of the internet would never have occurred.

The emergence of the internet is a story of global proportions. Its inventors (yes, there were inventors!) worked everywhere from the French government-sponsored Cyclades computer network and England’s National Physical Laboratory to Xerox and the University of Hawaii. Normally placed in the foreground of the typical story are the freewheeling creatives and plucky entrepreneurs of Silicon Valley. Here, the internet is typically cast as a great liberating force that helped to decentralise power and spread democracy around the globe.

The glitz of this conventional narrative, however, obscures a much more sinister history explored in great detail by investigative reporter Yasha Levine. In Surveillance Valley: The Secret Military History of the Internet, Levine shifts the focus of the story onto one of its most important and yet consistently overlooked characters: The Advanced Research Projects Agency (ARPA). A generously funded research arm of the U.S. Department of Defense, ARPA was born out of America’s insecurities during the Cold War. Its remit grew quickly, however, to encompass a wide array of counterinsurgency and surveillance projects which played a key part in the genesis of the internet.

The best starting point for this story is the Cold War, when ARPA first appears on the scene.

 

The Cold War and ARPA

The origins of the internet are rooted in the heightened international tensions of the Cold War, a period during which the U.S. and Soviet Union vied for technological supremacy. Each boasted a deadly arsenal of nuclear weapons, and populations on both sides lived in fear of surprise attack. In the U.S., paranoia peaked in 1957 with the launch of the Soviet satellite Sputnik 1. The Soviets’ success in reaching space first shattered America’s sense of exceptionalism. Politicians seized on the launch as a sign of U.S. military and technological weakness. How had America fallen behind the reds in something so vital?

President Dwight Eisenhower was vilified for appearing to have fallen asleep at the wheel. Generals and political rivals spun tales of an impending Soviet conquest of Earth and space, and pushed for greater military spending. Even Vice President Richard Nixon publicly criticised Eisenhower, informing business leaders that the technology gap between America and the Soviet Union was too great for them to expect a tax cut. As the public reeled from defeat in the so-called Space Race, Eisenhower knew that the only way to save face was to do something big, bold and very public.

It was against this backdrop that Eisenhower established ARPA in 1958. The idea was straightforward. With a small staff and a large budget, ARPA would function as a civilian-led unit housed by the Pentagon. It would neither build nor run its own research facilities, but would instead operate as an executive management hub that identified priorities and farmed the research out to universities, private research institutes, and military contractors. ARPA would bring together some of the top scientific minds in the country, with the aim of keeping American military technology ahead of the communists. One area given priority was the perceived vulnerability of U.S. computer communications. Indeed, military commanders were keen to develop a computer communications system without a central core, and with no obvious headquarters that could easily be knocked out by a single Soviet strike (thus crippling the entire U.S. network).

 

The ARPANET

ARPA was therefore tasked with testing the feasibility of a large-scale computer network. Lawrence Roberts, the first person to have succeeded in connecting two computers, was responsible for developing the network, and collaborated closely with scientist Leonard Kleinrock. By 1969, the first packet-switched network had been built, and Kleinrock’s team successfully used it to send messages to another site. The ARPA Network, or ARPANET - the grandfather of the internet as we know it - was born.

Only four computers were connected when the ARPANET was created. These were located in the computer labs of UCLA (a Honeywell DDP-516 computer), the Stanford Research Institute (an SDS-940 computer), the University of California, Santa Barbara (an IBM 360/75) and the University of Utah (a DEC PDP-10). The first data exchange over this new network was between computers at UCLA and the Stanford Research Institute. On their first attempt to log into Stanford’s computer by typing “login”, UCLA researchers crashed the system after typing the letter “g”.

In time the network expanded and different models of computer were connected, giving rise to inevitable compatibility issues. The solution rested on an improved set of protocols called TCP/IP (Transmission Control Protocol/Internet Protocol), designed during the 1970s and adopted as the network’s standard in 1983. This worked by breaking data into IP (Internet Protocol) packets, akin to individually addressed digital envelopes. TCP (Transmission Control Protocol) then ensured that the packets were delivered to their destination and reassembled in the right sequence.
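The division of labour between the two protocols can be illustrated with a toy sketch in Python (a deliberate simplification, not actual TCP/IP code: real packets carry headers, checksums and acknowledgements):

```python
# Toy illustration of packetization and in-order reassembly.

def packetize(data: bytes, size: int):
    """Break data into (sequence_number, chunk) pairs - the 'envelopes'."""
    return [(i, data[i:i + size]) for i in range(0, len(data), size)]

def reassemble(packets):
    """Sort by sequence number and rejoin, as TCP does on arrival."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"HELLO ARPANET"
packets = packetize(message, 4)
shuffled = list(reversed(packets))   # packets may arrive out of order
assert reassemble(shuffled) == message
```

However the individual packets are routed, the sequence numbers let the receiving end put the original message back together.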

Through the ARPANET, we see the emergence of several computing innovations which we take for granted today. Notable examples include email (or electronic mail), a system that allows for simple messages to be sent to another person across the network (1971), telnet, a remote connection service for controlling a computer (1972) and file transfer protocol (FTP), which allows information to be sent from one computer to another in bulk (1973).

In its early days, the ARPANET was seen largely as a tool for academic engineers and computer scientists. It linked departments at several universities into a wider network, which by 1973 had expanded to over 30 institutions in locations as far apart as Hawaii and Norway. While we could choose to end the story of the ARPANET here, it is only by following the roots deeper that we are able to acquire a fuller understanding of how the technology came to be. Indeed, while academic scientists were busy using the ARPANET to establish connections between research sites, it was a conflict taking place thousands of miles away which provided the perfect conditions for the technology to be tested and refined.

 

Vietnam

Some further context is useful here. During the Cold War, the U.S. faced regional insurgencies against allied governments. From Algiers to Laos, from Nicaragua to Lebanon, most of these conflicts shared common characteristics: they were born out of local movements, they recruited local fighters, and they were supported by local populations. No matter how many nuclear weapons the U.S. boasted, countering insurgencies of this nature was not something conventional military operations were equipped to do. There was an obvious need to modernise and expand U.S. military capabilities, a case made very clear by President Kennedy in his 1961 message to Congress:

“The Free World’s security can be endangered not only by a nuclear attack, but also by being slowly nibbled away at the periphery, regardless of our strategic power, by forces of subversion, infiltration, intimidation, indirect or non-overt aggression, internal revolution, diplomatic blackmail, guerrilla warfare or a series of limited wars […] we need a greater ability to deal with guerrilla forces, insurrections, and subversion”.

 

In essence, Kennedy envisioned cleverer and more sophisticated ways of fighting communism. What better institution was there than ARPA to develop the modern technological capabilities which the U.S. so desperately needed? In response to Kennedy’s speech, the CIA, the Pentagon, and the State Department drew up plans for a massive programme of covert military, economic, and psychological warfare initiatives to deal with one of America’s key geopolitical problems: the growing insurrection in Vietnam.

Among the biggest beneficiaries in the funding pipeline was ARPA’s Project Agile, a high-tech counterinsurgency programme which aimed to support the government of South Vietnam in researching and developing new techniques for use against the Vietcong. More specifically, Project Agile sought to develop weapons and adapt counterinsurgency gadgets suited to the dense, sweltering jungles of South East Asia. Some of the initiatives in the project included:

§  Testing light combat arms for the South Vietnamese military, which led to the adoption of the AR-15 and M-16 as standard-issue rifles.

§  Developing a light surveillance aircraft that glided silently above the jungle canopy.

§  Formulating field rations and food suited to the hot, wet climate.

§  Developing sophisticated electronic surveillance systems and elaborate efforts to collect all manner of conflict-related intelligence.

§  Working to improve the function of military communication technology in dense rainforest.

§  Developing portable radar installations that could be floated up on a balloon.

Through Project Agile, ARPA would push the boundaries of what contemporaries considered technologically possible. It pioneered electronic surveillance systems that were decades ahead of their time.

Agile was by no means the only ARPA battlefield project. Indeed, among ARPA’s most ambitious initiatives was Project Igloo White, a multi-billion-dollar computerised surveillance barrier. Operated out of a secret air force base in Thailand, the project involved depositing thousands of radio-controlled seismic sensors, microphones, and heat and urine detectors in the jungle. These eavesdropping devices, disguised as sticks or plants and usually dropped from aeroplanes, transmitted signals to a centralised computer control centre which alerted technicians to any movement in the bush. If movement was detected, an air strike was called in and the area was blanketed with bombs and napalm. Igloo White could be described as a giant wireless alarm system that spanned hundreds of miles of jungle. As the US Air Force explained: “we are, in effect, bugging the battlefield”.
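Stripped of its hardware, the alarm-system logic described above is simple, and can be sketched in a few lines (a conceptual toy with invented sensor names and an arbitrary threshold, not the actual Igloo White software):

```python
# Toy sketch: field sensors report signal levels to a central station,
# which flags any reading strong enough to suggest movement.

THRESHOLD = 0.5  # arbitrary activation level for this illustration

def alerts(readings, threshold=THRESHOLD):
    """Return the IDs of sensors whose signal exceeds the threshold."""
    return [sensor_id for sensor_id, level in readings if level > threshold]

# Simulated transmissions: (sensor_id, signal_level)
incoming = [("seismic-17", 0.2), ("acoustic-04", 0.9), ("seismic-31", 0.7)]
print(alerts(incoming))  # movement flagged at two sensors
```

In the real system, of course, a flagged sensor meant an air strike rather than a printout.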

Soon we will see how ARPA’s burgeoning surveillance and counterinsurgency expertise fed into the development of the ARPANET. But first it is necessary to understand ARPA’s deepening role in the conflict.

 

Know thy enemy

While the development of high-tech counterinsurgency and surveillance equipment was an important dimension of ARPA’s role in the war, key to our story is the application of that technology to the study of rebellious peoples and their culture. This was the truly visionary part of ARPA’s mandate: to use advanced science not just to develop weapons, but to defeat the motivated and well-disciplined Vietnamese insurgents by understanding them. The idea was to understand why they resisted, to weaken their resolve, to predict insurgency, and to prevent it from maturing. The value of this science to a U.S. military deployed in a hostile environment it did not understand was obvious.

Stepping up its activities, ARPA was given license to figure out how to weaponize anthropology, psychology, and sociology in support of the military’s counterinsurgency work. It doled out millions of dollars to studies of Vietnamese peasants, captured Vietcong fighters, and the rebellious hill tribes of northern Thailand. Swarms of ARPA contractors – anthropologists, political scientists, linguists, and sociologists – put poor villages under the microscope, measuring, gathering data, interviewing, studying, assessing, and reporting on their inhabitants. As Levine explains, “the intention was to understand the enemy, to know their hopes, their fears, their dreams, their social networks, and their relationships to power”.

Studies commissioned by ARPA contractors sought answers to the questions which nagged at the American psyche: why were North Vietnamese fighters not defecting to the American side? What was so appealing about their cause? Did they not want to live like Americans did? Why was their morale so high? Some 2,400 interviews of North Vietnamese prisoners and defectors were conducted, generating tens of thousands of pages of intelligence. Perhaps the clearest illustration of ARPA’s application of science to socio-cultural problems was its work on the Strategic Hamlet Programme. Developed as part of Project Agile, this pacification effort involved the forced resettlement of South Vietnamese peasants from their traditional villages into new areas that were walled off and made “safe” from Vietcong infiltration.

In addition, ARPA funded several projects which sought to pinpoint the precise socio-cultural indicators that could be used to predict when tribes would go insurgent. One initiative involved a team of political scientists and anthropologists sent from UCLA and UC Berkeley to Thailand to map out the religious beliefs, value systems, group dynamics and civil-military relationships of Thai hill tribes, focusing in particular on predictive behaviour. A project report summarised the objective: “to determine the most likely sources of social conflict in north east Thailand, concentrating on those local problems and attitudes which could be exploited by the communists”.

Not content with merely gathering intelligence, ARPA also experimented with how it could shape indigenous populations. For example, one study carried out for ARPA by the CIA-connected American Institutes for Research (AIR) attempted to gauge the effectiveness of counterinsurgency techniques deployed against rebellious hill tribes. Techniques included assassinating tribal leaders, forcibly relocating villages, and using artificially-induced famine to pacify rebellious populations. A 1970 investigation for Ramparts magazine detailed the effects of these brutal counterinsurgency methods on the Meo hill tribe: “conditions in the Meo resettlement villages are harsh, strongly reminiscent of the American Indian reservations of the 19th century. The people lack sufficient rice and water, and corrupt local agents pocket the funds appropriated for the Meo in Bangkok.”

While ARPA’s activities in Vietnam were certainly disturbing, we have yet to explain their relevance to the creation of the ARPANET. An understanding of how ARPA’s science projects were adapted from the jungles of Vietnam and used back home on American citizens will bring us closer to an explanation.

 

Spying at home

To contemporaries, America in this period must have felt like it might explode at any second. Race riots, militant black activism, left-wing student movements, and anti-war protests were rife. In 1968 alone, Robert Kennedy and Martin Luther King Jr. were both assassinated, the death of the latter sparking riots across the country. In November the following year, three hundred thousand people descended on Washington D.C. for the largest anti-war protest in American history. Suspicion of communist agitation was high, and U.S. intelligence services focused their resources on weeding out communist troublemakers - both real and alleged.

This climate of suspicion and paranoia afforded ARPA the perfect conditions in which to refine everything it had learned in Vietnam in a domestic setting. Take its work on the Strategic Hamlet Programme. Perhaps the most unsettling aspect of the initiative is that it was, at least in part, always intended to serve as a model for counterinsurgency operations elsewhere in the world – including against black Americans living in inner cities back home. This was made explicit in the project proposal:

“The potential applicability of the findings in the United States will also receive special attention. In many of our key domestic programs, especially those directed at disadvantaged sub-cultures, the methodological problems are similar to those described in this proposal […] the application of the Thai findings at home constitutes a potentially most significant project contribution”.

 

Once the war in Vietnam had concluded, ARPA researchers returned to the U.S. and began applying the lessons of the programme to the domestic problems of racial and socio-economic inequality.

One of the strongest insights we have into ARPA’s domestic turn comes from a shocking exposé, which dragged the agency’s work out into the sunlight. From 2nd June 1975, NBC correspondent Ford Rowan appeared on the evening news every night for a week to warn millions of viewers that the country’s military had built a sophisticated computer communications network and was using it to spy on Americans:

“Our sources say, the Army’s information on thousands of American protesters has been given to the CIA, and some of it is in CIA computers now. We don’t know who gave the order to copy and keep the files. What we do know is that once the files are computerized, the Defense Department’s new technology makes it incredibly easy to move information from one computer to another […] this network links computers at the CIA, the Defense Intelligence Agency, the National Security Agency, more than 20 universities, and a dozen research centres, like the RAND Corporation”.

Rowan was no doubt talking about the ARPANET, and warning his fellow citizens of the threat that its networking capabilities posed to American freedoms. This was a story which Rowan had pieced together from the testimony of whistleblowing ARPA contractors, who had grown increasingly uncomfortable with this new application of the ARPANET. His reporting offers a vital window into how the U.S. defence and intelligence communities were using network technology to snoop on Americans in the very earliest version of the internet. Surveillance was baked into its original design.

A demonstration of the technology in action can be gleaned from its use against communists, both real and alleged, living in America.

 

Targeting communists

One of the most illustrative domestic counterinsurgency and surveillance operations conducted against Americans went by the name CONUS Intel (Continental United States Intelligence). Managed by U.S. Army Intelligence Command, CONUS Intel deployed thousands of undercover agents with the aim of infiltrating anti-war groups, monitoring left-wing activists, and filing reports in a centralised database on millions of U.S. citizens. The scale of CONUS Intel was far-reaching. Agents reported on the smallest of protests, monitored labour strikes, and kept detailed notes on union supporters. At the 1968 Democratic National Convention they even tapped the phone of Senator Eugene McCarthy, a vocal critic of the Vietnam War. Agents infiltrated a meeting of Catholic priests who protested the church’s ban on birth control. They spied on the funeral of Martin Luther King, mingling with mourners and recording any discussions they heard. They even infiltrated the 1970 Earth Day festival, photographing and filing reports on anti-pollution activists.

In January 1970, former military intelligence officer and whistleblower Christopher Pyle published an exposé in the Washington Monthly which revealed details of CONUS Intel to the public. “When this program began in the summer of 1965, its purpose was to provide early warning of civil disorders which the Army might be called upon to quell in the summer of 1967,” reported Pyle. “Today, the Army maintains files on the membership, ideology, programs, and practices of virtually every activist political group in the country”. This mirrored the way in which ARPA had catalogued the lives, behaviours and cultures of Vietnamese insurgents.

Pyle’s exposé went on to describe how CONUS Intel’s surveillance data was encoded onto IBM (International Business Machines Corporation) punch cards and fed into a computer located at the Army Counter Intelligence Corps Centre at Fort Holabird. The centre was equipped with a terminal link that could be used to access almost a hundred different information categories and to print out reports on individual people. As Pyle explained:

“In this respect, the Army’s data bank promises to be unique. Unlike similar computers now in use at the FBI’s National Crime Information Center in Washington and New York State’s Identification and Intelligence System in Albany, it will not be restricted to the storage of case histories of persons arrested for, or convicted of, crimes. Rather it will specialize in files devoted exclusively to the descriptions of the lawful political activity of civilians”.

As with Rowan’s exposé, Americans were warned again that their freedoms were under threat. In this case, we see how ARPA’s techniques and technology - which had first been used to predict communist uprisings in Vietnam - were now being applied for precisely the same purpose against perceived domestic enemies. Indeed, the same ARPANET which academics had used to link research sites also provided the intelligence community with some of its earliest network surveillance capabilities. On a fundamental level, the academic and military applications were no different. Weeding out communists required surveillance, intelligence, computers to process the intelligence, and networks to transmit the intelligence between advisors in remote locations and their commanding officers in the Pentagon, White House, or elsewhere. Surveillance of this kind cannot be separated from either the creation of the ARPANET or the emergence of the internet.

 

Conclusion

In examining the military origins of the internet we are confronted with a paradigm shift. Over a short span of time, we see how counterinsurgency and surveillance capabilities came to play a much greater role on the battlefield than trained soldiers. ARPA played a critical role in this shift. Indeed, its work on the ARPANET provided the ability to manage counterinsurgency and surveillance operations on an unprecedented scale. The networking technology it provided transformed the way in which enemies were monitored and studied, and the ways in which intelligence was gathered, processed and deployed. While ARPA outwardly preferred to talk up the academic benefits of the ARPANET, we cannot escape the military imperatives at the heart of its development.

Taking this a step further, the development of the ARPANET demonstrates how the boundaries between military and civilian life - already eroded by previous 20th century conflicts - took on a disturbing permanence during the Cold War era. Indeed, technology ostensibly developed for academic use proved highly effective in military operations. In turn, military operations provided the conditions needed for the further experimentation and refinement of the technology. In this context, the fine points of distinction between enemies on the battlefield and civilians back home seemed unimportant. The latter increasingly came to be seen as legitimate targets for counterinsurgency and mass surveillance.

To end on an optimistic note, we should also reflect on the importance of whistleblowers, testimony and exposé. At different points in our story, figures like Ford Rowan and Christopher Pyle were willing - some might even say brave enough - to call out what they saw as the immoral use of new technology against civil liberties. Their efforts to warn Americans of the dangers of mass surveillance and intelligence gathering played a part in holding those with power to account. This feels particularly important today. In recent years, we have seen the reach of the state dwarfed by the emergence of new players in the form of Big Tech. These companies, and the products and services they provide, have opened up new concerns about the way in which the internet works and who benefits from it. While these concerns are not likely to be resolved any time soon, it is worth remembering how people in the past sought to address the earliest ethical questions of the internet.

 

Find that piece of interest? If so, join us for free by clicking here.

 

 

References

Yasha Levine, Surveillance Valley: The Secret Military History of the Internet (2019)

'In the Beginning, There Was Arpanet', Air & Space Forces Magazine

'Inside DARPA, The Pentagon Agency Whose Technology Has "Changed the World"', NPR

'The Emerging Digital Economy', U.S. Department of Commerce

'Strategic Hamlet Program', Wikipedia: https://en.wikipedia.org/wiki/Strategic_Hamlet_Program

'How the Internet Was Invented', The Guardian

'INTERNET Prehistory: ARPANET Chronology'

'The Origin and Nature of the US "Military-Industrial Complex"'

June 28 this week marks the 110th anniversary of the assassination of Archduke Franz Ferdinand and his wife Sophie in Sarajevo, the capital of Bosnia and Herzegovina. The assassination was one of history's greatest turning points, setting in motion the diplomatic crisis that led to the First World War. However, it happened almost by accident, as a result of a whole series of mistakes and missed opportunities.

Alan Bardos, author of a related novel (Amazon US | Amazon UK) explains.

A depiction of the assassination of Archduke Franz Ferdinand of Austria in Sarajevo. From Domenica del Corriere, by Achille Beltrame.

Oskar Potiorek

Bosnia and Herzegovina was a hotly disputed territory in 1914. It had been annexed by the Austro-Hungarian Monarchy in 1908 from the crumbling Ottoman Empire, but was also claimed by neighboring Serbia and had a growing nationalist movement amongst its youth who wanted it to be part of a South Slav state. The decision to send the heir to the Austro-Hungarian throne into such an unstable region, to attend army maneuvers, was largely an attempt to strengthen the monarchy’s rule by charming the local population and demonstrating its military might.

The security for Franz Ferdinand’s visit fell to the military governor of Bosnia and Herzegovina, General Oskar Potiorek. Potiorek wanted to become Chief of the General Staff and saw the visit as an opportunity to stake his claim. Franz Ferdinand was a fastidious man, prone to fits of rage, and was known as ‘The Ogre’ in court circles. A single error could have finished the governor’s career, and the Archduke had already blocked Potiorek’s promotion twice.

Potiorek paid close attention to every aspect of the Archduke’s needs during the visit. He had an extraordinary eye for detail, overseeing all arrangements, from ensuring that the Archduke’s wine would be served at the correct temperature to building him a private chapel.

Nothing was left to chance - except, that is, the security of Franz Ferdinand’s visit to Sarajevo. Austro-Hungarian intelligence was aware of plots against the Archduke and of the danger posed by Serbia, which was attempting to resist Austro-Hungarian expansion in the region. There was, however, no definite evidence of a plot against Archduke Franz Ferdinand; threats of this kind were not unusual in the increasingly volatile Austro-Hungarian monarchy.

 

A schoolboy conspiracy

Unlike in other areas of the Monarchy, there had not been any violence attributed to nationalism in Bosnia. Potiorek did not recognize the growing nationalism among the youth that had inspired the Young Bosnia movement and the assassins.

This reflects the Austro-Hungarian Government’s attitude to the threat posed by the nationalist movements in its Balkan provinces. No attempt was made to counter them because the security services did not believe they existed. The idea that half-starved schoolboys could pose any kind of threat was too ridiculous to contemplate.

There were officials in Sarajevo who did understand the growing danger from these “schoolboys” and knew that they were working with Serbia. The police commissioner for Sarajevo, Dr Edmund Gerde, advised Potiorek of a conspiracy two weeks before Franz Ferdinand’s visit. Dr Sunaric, the vice president of the Bosnian Parliament, urged Potiorek to cancel the archducal visit because of possible Young Bosnia activity. Potiorek dismissed these warnings.

Archduke Ferdinand himself was warned about the possibility of an assassination attempt, but travelled to Bosnia nonetheless, albeit reluctantly. Franz Ferdinand’s wife Sophie insisted on accompanying her husband when she heard of the threats, as she did not believe anyone would shoot at him if a woman was by his side.

The night before the Royal couple were due to visit Sarajevo, Sophie met Doctor Sunaric at a state dinner and told him that he was wrong. Wherever they had been, ‘everyone had greeted them with great friendliness’. Doctor Sunaric responded ‘Your Highness, I pray to God that when I have the honor of meeting you again tomorrow night, you can repeat those words to me.’

One of the most tragic aspects of the whole affair is that Archduke Franz Ferdinand decided to cancel the visit to Sarajevo following the state dinner, possibly having been warned by the local police, but he was persuaded to complete the planned itinerary by Colonel von Merizzi, Governor Potiorek’s aide-de-camp.

 

Sarajevo, Sunday morning, June 28, 1914

Potiorek left the protection of Franz Ferdinand largely to the officers of the Archduke’s entourage. There were only 120 gendarmes lining the streets, to provide security in a city of over 50,000 people. Bringing in additional policemen was deemed to be too expensive, as all of the budget had gone on building the chapel for Franz Ferdinand. Potiorek also refused to use the troops from the maneuvers as extra security. He felt a strong military presence would offend the local inhabitants and the soldiers did not have their dress uniforms.

Consequently one of the conspirators in the assassination plot, Nedeljko Cabrinovic, was able to throw a bomb at the Archduke’s car as it drove to the official reception. The bomb missed, but when the motorcade reached the reception Potiorek took full responsibility and assured Franz Ferdinand that the danger had passed.

Potiorek spurned the suggestion that troops be used to clear the streets, stating, ‘do you think that Sarajevo is full of assassins?’ He did, however, suggest cutting the Archduke’s itinerary short and proceeding to his residence for lunch, to avoid any further danger in the narrow backstreets through which the Archduke was scheduled to drive.

The Archduke however wanted to visit the wounded from the earlier bombing. In the ensuing confusion the change of route was not communicated to the driver of the first car in the Archduke’s motorcade. When the motorcade left the reception the lead driver stuck to the original route and turned into a backstreet.

As the Archduke’s car began to follow, Potiorek realized the mistake and ordered the driver to stop - directly in front of 19-year-old Gavrilo Princip, who fired twice with a Browning semi-automatic pistol, killing the Archduke and Sophie and sparking the First World War.

The fact that Potiorek was in the car, and that Princip claimed to have been shooting at him when he hit Franz Ferdinand’s wife, has meant that Potiorek was never held to account for his actions. The blame was placed on Serbia. The assassins were indeed aided by elements in Serbian intelligence, but if Potiorek had followed the advice of his police chief, or acted decisively after the first assassination attempt, Gavrilo Princip - and the ensuing war - could have been stopped.

           

The events depicted in this article inspired Alan Bardos’ novel ‘The Assassins’, which can be purchased here: Amazon US | Amazon UK

 

Sources

'One Morning in Sarajevo', David James Smith

'The Archduke and the Assassin', Lavender Cassels

'The Road To Sarajevo', Vladimir Dedijer

'The Desperate Act', Roberta Strauss Feuerlicht.

'Archduke of Sarajevo: The Romance & Tragedy of Franz Ferdinand of Austria', Gordon Brook-Shepherd

'The Assassination of the Archduke', Greg King & Sue Woolmans

In the twilight of the 19th century, the world watched as China convulsed in a tumultuous uprising known as the Boxer Rebellion. This cataclysmic event, which erupted in 1900, was not merely a clash of arms, but a collision of civilizations, ideologies, and ambitions. At its core, the Boxer Rebellion was a struggle for the soul of China, pitting traditional values against encroaching foreign influence.

Here, Terry Bailey delves into the multifaceted dimensions of the rebellion, outlining the foreign powers involved, their political aims, the valor recognized through decorations like the Victoria Cross and Congressional Medal of Honor, and the perspectives of the Chinese Boxers, including the pivotal role played by Empress Dowager Cixi.

The photo shows foreign forces inside the Forbidden City in Beijing in November 1900 during the Boxer Rebellion.

Origins of the Boxer Movement

To comprehend the Boxer Rebellion, one must understand its roots deeply entwined with China's history of internal strife and external pressures. The late 19th century saw China reeling from a series of humiliations at the hands of foreign powers, compounded by internal turmoil and economic distress. The Boxers, officially known as the Society of Righteous and Harmonious Fists, emerged as a grassroots movement fueled by resentment towards foreign domination and perceived cultural erosion.

 

The International Response

As the Boxer movement gained momentum, foreign nationals and missionaries in China became targets of violent attacks, triggering international alarm. In response, an Eight-Nation Alliance composed of troops from Austria-Hungary, France, Germany, Italy, Japan, Russia, the United States and the United Kingdom intervened to quell the rebellion and protect their interests in China.

Each member of the alliance had its own political aims and agendas driving their involvement in the conflict. For instance, European powers sought to safeguard their economic privileges and spheres of influence in China, while Japan seized the opportunity to assert its growing regional power. The United States, keen on preserving its ‘Open Door Policy’ and ensuring the safety of American citizens, also joined the intervention force.

 

The Boxers' Perspective

Contrary to portrayals in Western accounts, the Boxers were not merely mindless fanatics but individuals driven by a complex blend of nationalism, religious fervor, and socio-economic grievances. Composed primarily of peasants and martial artists, the Boxers perceived themselves as defenders of Chinese tradition against the encroachment of Western imperialism and Christian missionary activities.

For the Boxers, their struggle was not just against foreign powers but also against the corruption and decadence of the Qing dynasty. Their rallying cry, "Support the Qing, destroy the foreigners," encapsulated their belief in restoring China's glory by expelling foreign influence and purging the nation of perceived traitors.

 

Empress Dowager Cixi's Role

At the heart of the Boxer Rebellion stood Empress Dowager Cixi, a formidable figure whose political maneuvering would shape the course of Chinese history. Initially hesitant to openly support the Boxers, Cixi eventually threw her support behind the movement, viewing it as a means to bolster her own waning authority and expel foreign influences.

Cixi's decision to align with the Boxers proved fateful, leading to a declaration of war against the Eight-Nation Alliance. Despite her efforts to galvanize Chinese forces, the coalition's superior firepower and logistical prowess ultimately overwhelmed the Boxer forces and brought about the collapse of their rebellion.

 

Legacy of the Boxer Rebellion

The Boxer Rebellion left an indelible mark on China and the world, reshaping geopolitical dynamics and fueling nationalist sentiments. While the intervention of the Eight-Nation Alliance temporarily quelled the uprising, it also deepened China's resentment towards foreign powers and sowed the seeds of future conflicts, in addition to further internal strife.

The rebellion's aftermath witnessed the imposition of harsh indemnities on China, further weakening the Qing dynasty and hastening its eventual collapse. The events of 1900 served as a stark reminder of the perils of imperialism and the enduring struggle for national sovereignty.

Sun Yat-sen, known in China as Sun Zhongshan, eventually galvanized the popular overthrow of the imperial dynasty through his force of personality; the revolution began on the 10th of October 1911. At the time of the successful overthrow of the two-thousand-year-old imperial system, Sun Yat-sen was in America attempting to raise funds for the future of China.

He was a highly educated individual who was strongly opposed to the actions of the Boxers before and during the rebellion, knowing that violent offensive action against the strong foreign powers would be detrimental to China’s future.

 

In conclusion

The Boxer Rebellion is an outstanding example of the complexities of history, where competing interests, ideologies, and aspirations converge in a crucible of conflict. Reflecting on this turbulent chapter, we are reminded of the enduring quest for dignity, autonomy, and justice that transcends borders and generations.

 

Additionally, the history of the Boxer Rebellion should provide a stark reminder to any nation that decides to intervene in another nation’s affairs while a hidden political agenda resides below the surface.

This reminder applies to all nations - not only where a politically fueled agenda influences an intervention by military force, but to any intervention in which the preservation and protection of life is not the prime concern of military action.

 

“War is a continuation of politics by other means.”

Carl Philipp Gottfried von Clausewitz, 1st of July 1780 – 16th of November 1831

 


 


Victoria Cross and Congressional Medal of Honor Recipients

The Boxer Rebellion witnessed acts of exceptional bravery and heroism, recognized through prestigious military decorations such as the Victoria Cross and the Congressional Medal of Honor, awarded to soldiers of the United Kingdom and the United States of America respectively.

 

Victoria Cross recipients

General Sir Lewis Stratford Tollemache Halliday VC, KCB

General Sir Lewis Stratford Tollemache Halliday VC, KCB (14th of May 1870 – 9th of March 1966) was an English recipient of the Victoria Cross, the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to British and Commonwealth forces.

Rank when awarded VC (and later highest rank): Captain, RMLI (later General)

 

His citation reads:

Captain (now Brevet Major) Lewis Stratford Tollemache Halliday, Royal Marine Light Infantry, on the 24th June, 1900. The enemy, consisting of Boxers and Imperial troops, made a fierce attack on the west wall of the British Legation, setting fire to the West Gate of the south stable quarters, and taking cover in the buildings which adjoined the wall. The fire, which spread to part of the stables, and through which and the smoke a galling fire was kept up by the Imperial troops, was with difficulty extinguished, and as the presence of the enemy in the adjoining buildings was a grave danger to the Legation, a sortie was organized to drive them out.

 A hole was made in the Legation Wall, and Captain Halliday, in command of twenty Marines, led the way into the buildings and almost immediately engaged a party of the enemy. Before he could use his revolver, however, he was shot through the left shoulder, at point blank range, the bullet fracturing the shoulder and carrying away part of the lung.

Notwithstanding the extremely severe nature of his wound, Captain Halliday killed three of his assailants, and telling his men to "carry on and not mind him," walked back unaided to the hospital, refusing escort and aid so as not to diminish the number of men engaged in the sortie.

 

Commander Basil John Douglas Guy VC, DSO

Commander Basil John Douglas Guy VC, DSO (9th of May 1882 – 29th of December 1956) was an English recipient of the Victoria Cross, the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to British and Commonwealth forces.

Rank when awarded VC (and later highest rank): Midshipman, RN (later Commander)

 

London Gazette citation

“Mr. (read Midshipman) Basil John Douglas Guy, Midshipman of Her Majesty’s Ship ‘Barfleur’.”

 

On 19th July, 1900, during the attack on Tientsin City, a very heavy cross-fire was brought to bear on the Naval Brigade, and there were several casualties. Among those who fell was one A.B.I. McCarthy, shot about 50 yards short of cover.

Mr. Guy stopped with him, and, after seeing what the injury was, attempted to lift him up and carry him in, but was not strong enough, so after binding up the wound Mr. Guy ran to get assistance.

In the meantime the remainder of the company had passed in under cover, and the entire fire from the city wall was concentrated on Mr. Guy and McCarthy. Shortly after Mr. Guy had got in under cover the stretchers came up, and again Mr. Guy dashed out and assisted in placing McCarthy on the stretcher and carrying him in.

The wounded man was however shot dead just as he was being carried into safety. During the whole time a very heavy fire had been brought to bear upon Mr. Guy, and the ground around him was absolutely ploughed up.

 

Congressional Medal of Honor Recipients

During the Boxer rebellion, 59 American servicemen received the Medal of Honor for their actions. Four of these were for Army personnel, twenty-two went to navy sailors and the remaining thirty-three went to Marines. Harry Fisher was the first Marine to receive the medal posthumously and the only posthumous recipient for this conflict.

 

Side note:

Total number of Victoria Crosses awarded

Since the inception of the Victoria Cross in 1856, there have been 1,358 VCs awarded. This total includes three bars granted to soldiers who won a second VC and the cross awarded to the unknown American soldier.

The most recent was awarded to Lance Corporal Joshua Leakey of 1st Battalion, The Parachute Regiment, whose VC was gazetted in February 2015 following an action in Afghanistan on the 22nd of August 2013. This information was correct at the time of writing.

 

Total number of Congressional Medals of Honor (MOH) awarded

Since the inception of the MOH in 1861, there have been 3,536 medals awarded.

The most recent award was made to former Army Capt. Larry L. Taylor by President Joe Biden during a ceremony at the White House on September 5, 2023. This information was correct at the time of writing.

The Medal of Honor was introduced for the Naval Service in 1861, followed in 1862 by a version for the Army.

The British Labour Party has long been at the forefront of progressive social change in the United Kingdom, introducing such policy innovations over the years as the NHS, comprehensive education, and the national minimum wage. Labour has also left its mark in local government, where historically the party has often been successful in putting its socialist principles into practice. Early twentieth-century Labour councils built a reputation not only for providing more generous levels of social assistance in comparison to non-Labour councils, but also for fostering improvements in education and public health, lowering municipal gas and electricity rates, and serving as good local employers. However, it is nationally that Labour has had the greatest impact on people’s lives - a trend that started exactly 100 years ago.

Vittorio Trevitt explains.

Prime Minister Ramsay MacDonald (middle of bottom row in lighter suit) and his ministers in January 1924.

This January marked the centenary of the formation of the First Labour Government. The product of an inconclusive election the previous month, in which the Conservatives won the most seats but were unable to win a parliamentary vote of confidence, Labour was able to form a minority government with the backing of the Liberal Party.

The coming to power of the First Labour Government has long been of great interest, due to the fact that it marked the first time that Labour, a democratic socialist party committed to radical social change, had come to lead the United Kingdom. Unsurprisingly, Labour’s ascension caused mixed reactions, with elements of the press incredulous at the thought of socialist “wildmen” running Britain, while the Annual Register, by contrast, referred to this historic event as representing ‘A revolution in British politics as profound as that associated with the Reform Act of 1832’. It also led to an overhaul of the two-party system, in which power had for decades alternated between the Conservatives and their traditional rivals, the Liberals. It was now Labour that held the mantle of chief rival to the Conservative Party - a role that it has continued to play to this day. Symbolically for a worker-oriented party, the new prime minister, Ramsay MacDonald, was the first person from a working-class background to hold that notable position.

Apart from the symbolism of Labour finally holding the reins of power after years in opposition, it is important to ask what the Party actually achieved in office. One should not, I believe, celebrate a socialist government coming into being if it is unable to implement policies of social justice that represent the ideals of democratic socialism. The First Labour Government’s actions, however, certainly lived up to these ideals.

 

War’s end

Labour came to power during the period following the end of the First World War when a number of other socialist parties took office throughout Europe for the first time, either as senior or junior partners in coalitions. In Germany, the Social Democratic Party (the nation’s longest-established party) formed an all-socialist administration, while other European states like Hungary, Poland, Estonia, and Austria witnessed members of social-democratic parties assuming ministerial positions, giving their members the opportunity to drive and influence positive social change. British Labour was no different, although numerous policy proposals put forward during Labour’s time in opposition failed to see the light of day, hampered by the Party’s minority status in Parliament. A proposed capital levy never materialised, while Labour failed to secure passage of a number of bills, such as one aimed at regulating rents and a private bill focusing on shop assistants’ hours of work. Despite these shortcomings, Labour succeeded in implementing a broad range of reforms during its relatively short period in office, many of which left an indelible stamp on society. Duties on certain foodstuffs were reduced, while improvements were made in financial support for the unemployed. Benefits were increased by a fifth for men and women, eligibility for payments to dependents was widened, more people were brought under the umbrella of unemployment insurance, and a statutory right to cash benefits was introduced. The government also acted to improve conditions for pensioners, raising pensions and extending the old-age pension to all those over the age of 70 in need. In regard to earnings, action was taken on trade boards, with the number of minimum rate enforcement inspectors increased by a third, Grocery Trade Boards revived (after having previously lapsed) and an official investigation launched into certain sections of the catering trade. Additionally, the machinery for fixing minimum wage rates for agricultural workers (which had been dismantled in 1921) was re-established.

 

Infrastructure

Emphasis was placed on developing infrastructure, with money made available for drainage, roads, and repairs and improvements for dwellings dating back to the First World War. Also of significance was the setting up of Royal Commissions on schooling and health insurance to formulate plans for delivering future changes in those areas (with the latter focusing on the uneven coverage of health insurance in Britain, amongst other aspects of the system), together with a Royal Commission on mental illness law, whose work culminated in important developments in provisions for people with mental illnesses in later years. Agricultural research received a sizeable cash injection, while the Small Debt (Scotland) Act offered support to poorer individuals in that part of the UK, with provisions such as a rise in the amount that could not be attached from wages and the direct payment by instalments of sums found due in small debt courts in rent arrear cases. As a sign of the spirit of the times, a parliamentary motion was adopted in March 1924 calling on the government to establish a Commission of Inquiry to look into the setting of minimum pay scales for working people.

The grant terms of the Unemployment Grants Committee, a body set up 4 years earlier to provide grants to local authorities offering work schemes to jobless persons, were also improved. More areas, for instance, became eligible for grants, while a 6 month probation of 75% or 87.5% of wages was eliminated and a stipulation introduced whereby contracts had to include Fair Wage Clauses. Provisions for war veterans and dependents were also improved, in keeping with the commitment made by Labour in its election programme to ensure “fair play” for this segment of British society.

True to its progressive principles, the First Labour Government reversed various austerity measures introduced in the years following the Armistice, which had entailed cutbacks in areas such as health, education, and housing. Cuts made to the educational system (including the abolition of state scholarships to universities) were reversed, a grant for adult education was bolstered, and regulations on the construction of schools were eased. Efforts were made to reduce the number of unqualified teachers, while class sizes in elementary schools were brought down. Reflecting a policy adopted by Labour a year earlier to make universal secondary education a reality, the government increased the number of free secondary school places available - a policy development that resulted in nearly 50% of all secondary school children receiving their education for free by 1931.

 

Housing Act of 1924

Arguably the most radical measure of the First Labour Government was the Housing Act of 1924. The result of the work of health minister John Wheatley, this far-reaching piece of legislation, which raised government subsidies to housing let at regulated rents, facilitated the construction of more than half a million homes, increased the standard of council housing built, and included a fair wages clause for those involved in the building of these homes. This landmark law was, according to one historian, the First Labour Government’s “most significant domestic reform,” and to me represents a perfect example of progressive politics in action.

Despite these noteworthy accomplishments, Labour’s aforementioned lack of a legislative majority meant that it was unable to implement (in comparison with future Labour administrations) a programme of radical change, and it lasted less than a year before losing the support of the Liberals and failing to win a snap election. In the run-up to this election, Labour was confronted with accusations that it was “soft” on the USSR, as arguably demonstrated by its recognition of, and promise of a loan to, the latter - decisions which possibly contributed to its electoral defeat.

The fate of Labour’s first administration was sealed by the controversy surrounding the “Campbell Case,” in which the government dropped a prosecution against the left-wing journalist John Campbell, who was accused of encouraging British troops to commit acts of mutiny by calling on soldiers to ignore orders to fire on striking workers. A vote of confidence was held, which Labour lost, and in the subsequent election the Conservatives returned to office with a massive majority of seats in Parliament. Anti-communist propaganda deployed by the Conservatives during the election campaign arguably contributed to Labour’s loss, with various Tory candidates equating the Labour Party with communism or accusing it of leading Britain down that path. The Campbell case, along with the government’s building of bridges with the Soviet Union, gave ample ammunition to right-wing propagandists.

What helped their successful anti-socialist crusade was the publication in the Daily Mail (a few days prior to the election) of the Zinoviev Letter. Purportedly from Communist International president Grigory Zinoviev to an official of the British Communist Party, the letter encouraged British communists to foment revolution. Although its authenticity remains open to debate, it is likely that this document played a part in turning potential voters against Labour, bringing its first stint in power to a premature end.

 

Conclusion

In spite of its brevity in office, and the challenges it faced, it is to Labour’s credit that it was able to do so much, while demonstrating to voters that Labourites were democrats who believed in change through constitutional means and could be trusted to safely run the country while also being a credible alternative to the Conservatives. Undoubtedly, the positive achievements of the First Labour Government under the circumstances it found itself in demonstrated both Labour’s effectiveness as a governing party and its commitment to changing Britain for the better.

The lesson that progressives can learn from the record of the First Labour Government is that social change can be achieved even when a reforming administration lacks a majority in the chambers of power, as long as there is the will to do so. At the time of writing, Labour looks set to emerge victorious in the upcoming general election. If it does, a Starmer Administration should, in the face of a difficult economic situation, do its best to carry out as much in the way of social-democratic reform as possible if it wishes to make Britain a more just society for all in the years ahead. To do so would not only be to the benefit of ordinary people, but would serve as a great tribute to the memory of the First Labour Government.

 


Today, when most people think of Afghanistan, they recall the Biden administration’s calamitous withdrawal in the summer of 2021 and the end of what many have termed a ‘forever war.’ Tragically, the Taliban’s victory reversed two decades of effort to establish liberal institutions and women’s rights in the war-ravaged country. Many commentators have compared America’s retreat from Afghanistan to the country’s hasty evacuation of Vietnam in 1975. Indeed, there are similarities in the chaotic nature of the two withdrawals and the resulting tragic effects for the people of Afghanistan and Vietnam, respectively.

Brian Morra looks at the lessons the Biden Administration could have taken from earlier Soviet and American wars in Afghanistan.

Soviet troops atop a tank in Kabul in 1986.

Regarding Afghanistan, Americans are less likely to remember the Soviet Union’s war there and Moscow’s own rather ignominious pull-out. This is unfortunate because there are lessons to be learned from the USSR’s ill-fated foray into Afghanistan that US policymakers ought to have heeded during our own twenty-year war. My latest historical novel, The Righteous Arrows (Amazon US | Amazon UK), published by Koehler Books, devotes a good deal of ink to the missteps made by Washington and Moscow in that long-ago war. My intention with The Righteous Arrows is to entertain while providing the reader with a sense of what should have been learned from the Soviets’ failed adventure in Afghanistan.

 

What was Moscow’s war in Afghanistan all about?

Like many wars, it began with what seemed to be good intentions. The Kremlin leaders who made the decision to go to war thought that it would not really be a war at all but a ‘police action’ or a ‘special military operation’ if you like. The Kremlin leadership expected their engagement in Afghanistan to be sharp and quick. Instead, it turned into a decade-long, bloody slog that contributed to the later implosion of the USSR itself. Talk about unintended consequences!

 

How did the Soviet foray into Afghanistan start?

It began over the Christmas season in 1979 when the Soviet leader Leonid Brezhnev was convinced to come to the aid of a weak, pro-Russian socialist regime in Kabul. The initial operation was led by the KGB with support from the GRU (Soviet Military Intelligence) and Army Airborne units. After initial success, the Kremlin quickly became embroiled in a war with tribal militias who did not like either the socialist puppet regime that Moscow was propping up or the Soviet occupation.

What was supposed to be a quick operation became a ferocious guerrilla war that lasted most of the 1980s and killed some 16,000 Soviet troops. The war sapped the strength of the Soviet armed forces and exposed to anyone who was paying attention just how weak the USSR had become. By 1986, the reformist Soviet leader Mikhail Gorbachev had decided to get out of the bloody quagmire in Afghanistan. He found it was not that easy to leave and the last Russian forces did not depart Afghanistan until February 1989.

 

United States’ involvement

Beginning with the Jimmy Carter administration, the United States provided arms to the Afghan Islamic rebels fighting the Soviets. Military support from the US grew exponentially under President Ronald Reagan and by 1986 Washington was arming the Mujaheddin with advanced weapons, including the Stinger surface-to-air missiles that decimated Soviet airpower. America’s weapons turned the tide against the Soviet occupiers, but Washington also rolled the dice by arming Islamic rebels that it could not control.

Not only did the Afghan Islamic fighters become radicalized, but they were also joined by idealistic jihadis from all over the world. The Soviets’ ten-year occupation of Afghanistan became a magnet for recruiting jihadis, as did NATO’s two-decade-long occupation some years later. The founder of al Qaeda, Usama bin Laden, brought together and funded Arab fighters in Afghanistan, ostensibly to fight the Russians, but mainly to build his own power base. Although Washington never armed bin Laden’s fighters, he used his presence in Afghanistan during the Soviet war and occupation as a propaganda bonanza. He trumpeted the military prowess of al Qaeda, which was largely a myth of bin Laden’s own creation, and claimed that he brought down the Soviet bear. His propaganda machine claimed that if al Qaeda could defeat one superpower (the USSR), then it also could beat the other one (the USA).

During the 1980s, Washington officials downplayed the danger of arming radical Islamic fighters. It was far more important for the White House to bring down the Soviet Union than to worry about a handful of Mujaheddin. One must admit that the Americans’ proxy war against the USSR in Afghanistan was the most successful one it conducted during the entire Cold War. On the other hand, Washington opened a virtual Pandora’s box of militarized jihadism and has been dealing with the consequences ever since.

 

The aftermath

The Soviet occupation encouraged most of Afghanistan’s middle class to flee the country, leaving an increasingly radicalized and militarized society in its wake. This was the Afghanistan the United States invaded in the fall of 2001, shortly after bin Laden’s 9/11 attacks on the World Trade Center and the Pentagon. Policymakers in Washington failed to grasp just how radically Afghan society had changed because of the Soviet occupation.

There was discussion in Washington’s national security circles in the immediate aftermath of 9/11, warning of the dangers of fighting in Afghanistan. Bromides were offered, calling the country the ‘graveyard of empires’, but none of it had much impact on policy. Most officials in the George W. Bush administration did not understand how radicalized Afghan society had become and how severe the costs of fighting a counter-insurgency operation in Afghanistan might turn out to be.

The CIA-led operation to defeat the Soviets in Afghanistan in the 1980s with small numbers of Americans was the game plan Washington also used in the fall of 2001 to defeat al Qaeda and bring down the ruling Taliban regime. The playbook worked brilliantly in both cases. Unfortunately, for the United States and our NATO allies, the initial defeat of the Taliban did not make for a lasting victory or an enduring peace.

For twenty years, the United States fought two different wars in Afghanistan. One was a counter-terror war, the fight to defeat al Qaeda and its affiliates and to prevent them from reconstituting. The other war was a counterinsurgency against the Taliban and their allies. The reason the United States and NATO went into Afghanistan was to prosecute the first war – the anti-terror war. We fell into a counterinsurgency conflict as the Taliban reconstituted with help from Pakistan and others. This was a classic case of ‘mission creep’ and it required large combat forces to be deployed, in contrast to the light footprint of the counter-terror operation. The first war – the counter-terror war – prevented another 9/11 style major attack on the United States, while the second one required the US and NATO to deploy massive force and – ultimately – depended on the soundness of the Afghan government we supported.

The sad fact is that the successful campaign against the Taliban and the routing of al Qaeda in 2001 and 2002 led to an unfocused twenty-year war that ended with the Taliban back in charge and a humiliated United States leaving hundreds of thousands of vulnerable Afghan allies behind. In sworn Congressional testimony, General Milley, who in 2021 was Chairman of the Joint Chiefs, and General McKenzie, who was Commander of Central Command, have stated that they forcefully advised President Biden to leave a small footprint of US forces and contractors in Afghanistan to prosecute the counter-terror war. Our NATO allies were willing to stay and in fact increased their forces in Afghanistan shortly after Biden was inaugurated. Not only did President Biden not heed his military advisors, but he also later denied that they ever counseled him to keep a small force in Afghanistan. Some have described Biden’s decision to pull out of Afghanistan as ‘snatching defeat from the jaws of victory’.

President Biden further asserted that, by withdrawing, he was merely honoring the agreement President Trump had made earlier with the Taliban. This claim does not stand up to objective scrutiny because the Taliban repeatedly violated the terms of the Trump agreement, which gave the White House ample opportunity to declare it null and void.

 

What are the lessons the United States should have learned from the Soviet and American wars in Afghanistan?

  1. Keep your war aims limited and crystal clear.

  2. Fight mission creep and do not allow it to warp the original war aims or plans for a light footprint of forces.

  3. Beware of the law of unintended consequences. Consider the downside risks of arming the enemy of one’s enemy.

  4. Be willing to invest for the long-term or do not get involved. The United States still has forces in Germany, Italy, and Japan nearly eighty years after the end of World War II. Some victories are worth protecting.

  5. Ensure that the Washington tendency toward ‘group think’ does not hijack critical thinking. Senior policymakers must think and act strategically, so that ‘hope’ does not become the plan.

 

Footnote on Ukraine

I will close with a footnote about the Russian war in Ukraine. The Soviet war in Afghanistan bears grim similarities to Russia’s ‘special military operation’ underway today in Ukraine. In Afghanistan, Soviet forces killed indiscriminately and almost certainly committed numerous war crimes. The war also militarized Afghan society – a condition that persists to this day, and one that had a profound impact on America’s war in Afghanistan. In Ukraine, Russia’s invasion has been characterized by war crimes, mass emigration, and the militarization of Ukrainian society.

 

Much as in Iraq and Afghanistan, senior US policymakers are failing to identify clear strategic outcomes in Ukraine. It is the role of our most senior policy officials to focus on strategic outcomes and an exit strategy (if one is warranted) before committing American forces or treasure to foreign wars. Too often, platitudes have masqueraded as strategy. When one thinks of great wartime presidents like Lincoln and Franklin Roosevelt, the trait they shared was a singular focus on strategic outcomes and on how to shape the post-war environment. The Soviets failed to do so in their war in Afghanistan. The US also fell short in Afghanistan. Unfortunately, the banalities that pass for foreign policy strategy in Washington today indicate that we have not learned from the past.

 

 

Brian J. Morra is the author of two historical novels: “The Able Archers” (Amazon US | Amazon UK) and the recently-published “The Righteous Arrows” (Amazon US | Amazon UK).

 

 

More about Brian:

Brian is a former U.S. intelligence officer and a retired senior aerospace executive. He helped lead the American intelligence team in Japan that uncovered the true story behind the Soviet Union's shootdown of Korean Airlines flight 007 in September 1983. He also served on the Air Staff at the Pentagon while on active duty. As an aerospace executive he worked on many important national security programs. Morra earned a BA from William and Mary, an MPA from the University of Oklahoma, an MA in National Security Studies from Georgetown University, and completed the Advanced Management Program at Harvard Business School. He has provided commentary for CBS, Netflix and the BBC. Learn more at: www.brianjmorra.com

Astrophysics, the study of the universe beyond Earth's atmosphere, testifies to humanity's insatiable curiosity and relentless pursuit of knowledge. From the earliest civilizations to the modern era, our understanding of the cosmos has evolved exponentially, propelled by the brilliance of countless minds across centuries.

Here, Terry Bailey takes us on a brief yet captivating journey through history, tracing the origins and development of astrophysics and astronomy from its humble beginnings to the awe-inspiring advancements of the present day.

A depiction of Renaissance astronomer Nicolaus Copernicus.

Our journey commences in ancient times, where the seeds of astrophysics were sown amidst the fertile intellectual landscapes of early civilizations such as ancient Mesopotamia, Egypt, China and Greece, to name a few. Among the pioneers of this era was Democritus, born in Thrace in the 5th century BCE, whose revolutionary atomic theory posited that all matter consisted of indivisible particles called atoms. Although his ideas primarily pertained to terrestrial phenomena, they laid a foundational framework for understanding the fundamental building blocks of the universe.

Another luminary of antiquity was Aristarchus of Samos, whose heliocentric model challenged the prevailing geocentric (Earth-centered) worldview. In the 3rd century BCE, Aristarchus proposed that the Earth and other planets orbited the Sun—a concept far ahead of its time. Despite facing resistance from adherents of Aristotle's geocentric view, Aristarchus's visionary insight foreshadowed the Copernican revolution millennia later. Additionally, he estimated the distances to the Moon and the Sun and the size of the Sun; it was after realizing that the Sun was far larger than the Earth that he concluded the Earth and other planets orbited it.

Eratosthenes of Cyrene emerges as yet another luminary of antiquity, renowned for his groundbreaking contributions to geometry, astronomy and mathematics. In the 3rd century BCE, Eratosthenes accurately calculated the circumference of the Earth using simple geometric principles, and also calculated the Earth's axial tilt. In addition, Eratosthenes worked on calculating the distances to the Moon and Sun, as well as the diameter of the Sun, adding to the work of Aristarchus of Samos and showcasing the ancient world's nascent grasp of celestial mechanics.
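Eratosthenes' circumference calculation can be sketched with the figures traditionally attributed to him (a reconstruction: the exact values he used, and the length of his stadion, are still debated):

```python
# Eratosthenes' reconstruction: at noon on the solstice the Sun was directly
# overhead at Syene but cast a shadow angle of about 7.2 degrees at Alexandria,
# roughly 5,000 stadia to the north.
shadow_angle_deg = 7.2          # 1/50 of a full circle
alexandria_to_syene = 5_000     # stadia

# The shadow angle equals the arc between the two cities, so the full
# circumference is the distance scaled up by 360 / angle.
circumference = alexandria_to_syene * 360 / shadow_angle_deg
print(f"{circumference:,.0f} stadia")  # 250,000 stadia
```

Depending on the length assumed for the stadion (roughly 157 to 185 meters), that answer is within a few percent of the modern value of about 40,000 km.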

 

Renaissance

As the world transitioned into the Renaissance period, the torch of astrophysical inquiry continued to burn brightly in the hands of visionaries such as Nicolaus Copernicus. Building upon the heliocentric model proposed by Aristarchus, Copernicus's seminal work "De Revolutionibus Orbium Coelestium" (On the Revolutions of the Celestial Spheres) revolutionized our understanding of the solar system. By placing the Sun at the center of the cosmos, Copernicus catalyzed a paradigm shift that would forever alter humanity's perception of its place in the universe.

The Scientific Revolution ushered in a golden age of scientific discovery, with luminaries such as Johannes Kepler and Galileo Galilei making indelible contributions to astrophysics / astronomy. Kepler's laws of planetary motion provided a mathematical framework for understanding the dynamics of celestial bodies, while Galileo's telescopic observations offered compelling evidence in support of the heliocentric model.

 

20th century

The dawn of the 20th century witnessed the birth of modern astrophysics / astronomy, marked by transformative developments in theoretical physics and observational techniques. Albert Einstein's theory of general relativity completely changed our understanding of gravity; building upon Isaac Newton's findings, Einstein's work offered profound insight into the curvature of space-time and the behavior of massive objects in the cosmos. Meanwhile, advancements in spectroscopy and telescopic technology facilitated unprecedented discoveries, allowing astronomers to peer deeper into the universe than ever before.

The latter half of the 20th century witnessed a surge in astrophysical / astronomical research, fueled by technological innovations such as space-based observatories and supercomputers. The advent of radio, X-ray and gamma-ray astronomy opened new vistas of exploration, enabling scientists to study cosmic phenomena beyond the visible spectrum. Concurrently, the emergence of particle astrophysics shed light on the enigmatic nature of dark matter and dark energy—two elusive entities that comprise the majority of the universe's mass-energy content.

In recent decades, the field of astrophysics has witnessed a convergence of disciplines, as researchers unravel the mysteries of black holes, gravitational waves, and the cosmic microwave background radiation (CMBR). Groundbreaking discoveries such as the detection of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO) have provided empirical validation for Einstein's predictions, while opening new avenues for studying the dynamics of space-time.

As we stand on the cusp of a new era of exploration, the future of astrophysics / astronomy appears more promising and tantalizing than ever before. From the quest to uncover the origins of cosmic inflation to the search for extraterrestrial life, humanity's insatiable curiosity continues to drive scientific inquiry to unprecedented heights. As we gaze upon the vast expanse of the cosmos, we are reminded of our shared quest to unravel the mysteries of existence and unlock the secrets of the universe.

 

In perspective

In retracing the illustrious history of astrophysics / astronomy, we are reminded of humanity's enduring quest for knowledge and understanding. From the ancient musings of Democritus and Aristarchus to the groundbreaking discoveries of modern-day physicists, the journey of astrophysics serves as a testament to the boundless potential of the human intellect. Gazing upon the celestial weave of the cosmos, we are reminded of our humble place within the vast expanse of space and time—a reminder of the profound interconnectedness that binds us to the universe itself.

I should add that amateur astronomers are now playing an increasingly active role in cosmic research; a number of projects are currently running that actively engage amateur astronomers to aid in searching the cosmos.

The vast expanse of the cosmos means it is impossible to observe the whole universe all the time. By engaging experienced home-based astronomers across the globe, more of the cosmos can be observed, providing researchers with extra eyes to report possible finds.

Amateur astronomy is currently one of the fastest growing pastimes. As a professional astrophysicist, I actively encourage everyone to look skyward and explore the cosmos, either for pure pleasure and wonder or with the aim of possibly becoming engaged in a live project.

 


 

 

 

Special notes:

Size of the observable universe

The observable universe extends more than 46 billion light-years in any direction from Earth, making it some 93 billion light-years in diameter. Given the constant expansion of the universe, the observable universe grows by at least another light-year every Earth year. It is important to note that this is only the observable universe: Earth is simply part of the universe, not at its center.

One light-year is equivalent to about 9.46 trillion kilometers.
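These figures can be sketched quickly in code. The 46.5 billion light-year comoving radius below is the commonly cited figure behind "more than 46 billion" above, and is an assumption of this sketch:

```python
# Distance light travels in one Julian year (365.25 days), in kilometers.
speed_of_light_km_s = 299_792.458          # defined value of c
seconds_per_year = 365.25 * 24 * 3600      # Julian year = 31,557,600 s
light_year_km = speed_of_light_km_s * seconds_per_year
print(f"{light_year_km:.3e} km")           # ≈ 9.461e+12 km, i.e. about 9.46 trillion

# Radius to diameter for the observable universe (assumed ~46.5 Gly radius).
radius_gly = 46.5
print(f"{2 * radius_gly:.0f} billion light-years across")  # 93 billion light-years
```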

 

 

Point of interest:

Recent groundbreaking research has identified evidence suggesting black holes are the source of dark energy; however, it must be remembered that this research is in its early stages and requires more work.

Although other researchers have proposed sources for dark energy, what makes this research unique is that it is the first observational work in which nothing new was added to explain the source of dark energy in the universe. It simply uses existing, proven physics: in other words, black holes in Einstein's theory of gravity are the dark energy.

If further research and empirical testing provide the same answers, the mystery of the source of dark energy will at last be known, resolving a physics conundrum.

 

Theory of general relativity

A simple summing up of the core principles of general relativity.

John Wheeler, theoretical physicist, summed up the core of Albert Einstein’s theory of general relativity. “Matter tells space-time how to curve, and curved space-time tells matter how to move”.

The historic preservation movement has shifted its focus multiple times and broadened its purposes throughout its existence. Here, Roy Williams returns and considers how it has evolved over time – and how it can be focused today.

The American Heritage Documentation Programs team measures the Kentucky School for the Blind in Louisville, Kentucky, 1934.

In the beginning, the historic preservation movement's emphasis was on preserving heritage through the built environment. Preserving the built environment provided a framework and grounding point for understanding the culture and heritage of a nation, and with it the core concepts, institutions, and values that make up that nation. Alongside the preservation of the built environment, the National Park Service was initially formed to protect landscapes from the destructive industrial ravages of the 19th century. Ian Tyrrell argues that the rise of support for the National Park Service came as a direct result of a perceived global threat to the environment and the immediate need for conservation.[1] In this, environmental conservation and historic preservation share similar causes and roots. The two major rationales for preservation are pragmatic conservation for the utility of future use, and preservation for the sake of the continued existence of land and sites regardless of their utility to humans.

 

1970s

The historic preservation movement largely held to its goals of preserving heritage and identity until the 1970s, when environmental conservation began to become an important aspect of historic preservation. Questions of how much environmental destruction was wrought by the demolition of buildings, and by the effects of new construction, provided another argument for the importance of historic preservation. The energy embodied in the construction of an old building suggested that older buildings should be preserved both for cultural continuity and for the pragmatic reason that the energy of construction would be wasted in demolition. The embodied energy of new construction would also add to the overall energy costs on top of those of demolishing the older building. While this conclusion seems straightforward at first, the concept has its opponents. Helena Meryman accepts the concept of embodied energy in buildings but delves deeper into the materials associated with preserving the built environment, with an emphasis upon maintenance and environmental conservation. Meryman argues that while many materials should be recycled and traditional methods of craftsmanship should be used in maintaining historic reconstructions, evaluating the potential of certain resources and their environmental impacts remains paramount. Specifically, Meryman offers a counter-argument to the proponents of wood as a sustainable material: the lumber industry does contribute to deforestation, and "The rates of timber consumption are exceeding the rate of renewal of natural forests."[2] Instead, Meryman argues for extending the life of the wood already present in historic structures and using newly harvested wood for repairs only when absolutely necessary.
The National Park Service utilizes a concept known as replacement in kind, which at times allows the use of new materials, providing flexibility in the maintenance and rehabilitation of historic structures. The problem with replacement in kind, however, is its continued dependence on newly harvested lumber. The National Park Service provides the guidelines for when replacement in kind may be utilized as follows:

“The Secretary of the Interior's Standards for Rehabilitation generally require that deteriorated distinctive architectural features of a historic property be repaired rather than replaced. Standard 6 of the Standards for Rehabilitation further states that when replacement of a distinctive feature is necessary, the new feature must “match the old in composition, design, color, texture, and other visual properties, and, where possible, materials” (emphasis added). While the use of matching materials to replace historic ones is always preferred under the Standards for Rehabilitation, the Standards also purposely recognize that flexibility may sometimes be needed when it comes to new and replacement materials as part of a historic rehabilitation project. Substitute materials that closely match the visual and physical properties of historic materials can be successfully used on many rehabilitation projects in ways that are consistent with the Standards.”[3]

 

Embodied energy

Embodied energy is a largely accepted element of the debate between preservation, demolition, and new construction. The creation of materials such as lumber, brick, and steel carries substantial costs, both monetarily and in carbon output. The energy expended in moving materials to the construction site is also substantial, particularly the carbon emitted by gasoline and diesel engines in transportation. Finally, the human labor used to create residential and commercial buildings remains a factor in the overall energy embedded in a building. The question then arises: is an older building, retrofitted and updated, more environmentally friendly than a newly constructed one? At first glance this seems straightforward, but the answer is not so simple, and it has engendered a small but passionate debate about the nexus of preservation, environmental conservation, demolition, and construction.

Some scholars argue that the previous costs, or embodied energies, of an older building should not factor into current decisions about its demolition or preservation. This current of thought holds that the energy that went into constructing a building 50 to 100 years ago is water under the bridge and has no bearing on the decision. Tristan Roberts drives this point home, arguing that resources expended in the past are not relevant to the present: "when it comes to the energy expended in the 19th century to build that structure, that’s not a good reason for saving a building from demolition — it’s water under the bridge. Energy spent 2, 20, or 200 years ago to build a building simply isn’t a resource to us today."[4] Rather, they argue that the only two factors to consider are the costs of demolition and the potential benefits of more energy-efficient construction. If the environmental impact of demolition cannot be offset quickly by more energy-efficient new construction, then it is best to wait and leave the current building in use. If it can be offset quickly, then demolition should proceed. The problem with this argument is that it rests on a philosophical binary regarding environmental destruction and energy efficiency rather than a larger, more nuanced analysis. What does it matter which option is technically more efficient if both continue to add carbon-based pollution to the atmosphere and destroy the environment? Even with the recycling of materials, demolition and new construction both still pollute the environment and add to overall carbon emissions. Is the goal of this binary decision-making to take the better of two bad options, or is it to solve the problem and work towards an environmentally sustainable future? The National Trust for Historic Preservation argues that,

“A new, high-performance building needs between 10-80 years, depending on the building type and where it is built, to offset the environmental impact of its construction. In comparing new and retrofitted buildings of similar size, function, and performance, energy savings in retrofitted buildings ranged from 4-46 percent higher than new construction. The benefits of retrofitting and reusing existing buildings are even more pronounced in regions powered by coal and that experience wider climate variations.”[5]

 

Green buildings

The United States Environmental Protection Agency details that, "600 million tons of C&D debris were generated in the United States in 2018, which is more than twice the amount of generated municipal solid waste. Demolition represents more than 90 percent of total C&D debris generation, while construction represents less than 10 percent. Just over 455 million tons of C&D debris were directed to next use and just under 145 million tons were sent to landfills. Aggregate was the main next use for the materials in the C&D debris."[6] These figures underline how construction and demolition (C&D) waste accounts for twice the amount of municipal solid waste. While new construction alone may produce less C&D waste, the reality that older buildings are demolished to make room for new ones makes the problem inherently interconnected.

The sentiment of the environmental turn in historic preservation is best summarized in the words of Carl Elefante, former president of the American Institute of Architects: "The greenest building is the one that already exists."[7] Simply put, preserving buildings for the purpose of environmental conservation serves the goals of both the environmental movement and the historic preservation movement. Writer Stewart Brand took this concept and developed it extensively, presenting how architects should not be confined to the realm of mastering space but should become artists of time.[8] Brand argues that buildings should be constantly updated, refined, and utilized by humans to ensure their potential and utility; accordingly, he believes buildings should be continually adapted by their occupants rather than destroyed for new construction. Brand further argues that all buildings are made up of six shearing layers: Site, Structure, Skin, Services, Space Plan, and Stuff. The organization Restore Oregon provides supplementary evidence for this premise regarding environmental impact, drawing on an ECONorthwest study which detailed that,

“Renovating a 1,500 SF older home reduces embedded CO2 emissions by 126 metric tons, versus tearing down the same structure and replacing it with a new 3,000 SF residential building. Such savings may be better understood this way: a savings of 126 metric tons of embedded CO2 is roughly equivalent to the prevention of 44,048 gallons of gasoline emissions being released into the atmosphere. In the case of a 10,000 SF commercial building, which would typically utilize more energy-intensive materials and construction techniques than residential construction, the CO2 emissions savings would be 1,383 metric tons, or the equivalent of 484,127 gallons of gasoline burned.”

The ECONorthwest study also found that preserving buildings has a far greater positive impact on CO2 emissions than taking combustion-engine vehicles off the road. Accounting for the average 474 gallons of gasoline consumed annually by an American vehicle, it argues that “renovating an older home, rather than demolishing and replacing it, equates to removing 93 cars from the road for an entire year, while a single commercial renovation equates to removing 1,028 cars from the road for the same period of time.” This is significant statistical support for treating historic preservation and environmental conservation as united fields.
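The study's “cars off the road” equivalences follow directly from its gallon figures; a quick sketch of the arithmetic (numbers taken from the quoted passages above, recomputed here as an illustration rather than reproduced from the study's own worksheets):

```python
# Gallon-equivalent CO2 savings quoted from the ECONorthwest study
residential_gallons = 44_048     # 1,500 SF home renovation
commercial_gallons = 484_127     # 10,000 SF commercial renovation
gallons_per_car_per_year = 474   # average annual US vehicle gasoline use

residential_cars = residential_gallons / gallons_per_car_per_year
commercial_cars = commercial_gallons / gallons_per_car_per_year

print(round(residential_cars))  # prints 93, matching the study
# The commercial figure works out to roughly 1,021 from these inputs;
# the study's stated 1,028 presumably reflects rounding in its
# intermediate values.
print(round(commercial_cars))
```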

 

Reuse and deconstruction movement

A particular bright spot regarding sustainability, and a middle ground between preservation and demolition, is the recent advancement of the reuse and deconstruction movement. Rather than following the current large-scale trend of demolition, deconstruction aims to dismantle older buildings so that their materials can be reused and recycled. Deconstruction ordinances such as Portland, Oregon’s dictate that older residential buildings must be deconstructed and their materials reutilized. This concept could pave the way for a more circular economy in which waste is recycled and reused rather than discarded in favor of new construction. However, the movement remains in its early stages. “There isn’t a salvage economy in the U.S. for commercial buildings,”[9] said Jason F. McLennan, the chief executive of McLennan Design and the creator of the Living Building Challenge, and most projects remain residential in nature. Reuse at this stage is not necessarily more profitable, given the labor involved in deconstruction and the cost of storing materials. The greatest advantage of reuse remains the preservation of historic materials, along with the embodied energy they retain.

Re:Purpose Savannah, a 501(c)(3) nonprofit, advocates for sustainability through deconstruction, salvage, and reuse of historic buildings.[10] This effort provides a viable alternative to the waste and pollution generated by conventional demolition. After researching and documenting a structure’s history, Re:Purpose Savannah deconstructs it, storing the materials and selling the salvaged lumber at its own yard in Savannah. Its documentation of deconstructed historic buildings includes detailed analysis of each building’s history and owners, photos, mapping, and the building’s larger connections to American history, all digitized on the organization’s website for the purpose of preservation even as the materials are recycled. Another potentially viable form of building reuse is the growth of deconstruction and recycling companies such as The Re Store, Re-Use Consulting, and Unbuilders, which resell recycled building materials to builders. These companies originated with the first deconstruction ordinances in the 1990s. Localities that have enacted deconstruction ordinances include Portland, Oregon; King County, Washington; and Vancouver, British Columbia.[11]

Portland, Oregon’s deconstruction ordinance of October 31, 2016, dictates that all structures built in 1940 or earlier (amended from an original cutoff of 1916) are subject to its requirements. Buildings falling within the ordinance must be deconstructed rather than mechanically demolished, so that valuable materials are recycled rather than crushed and landfilled.[12] The city further regulates the process by requiring that buildings be deconstructed by a certified deconstruction contractor rather than by an unlicensed, unregulated construction firm or individual. The ordinance does allow exemptions if a building is deemed unsafe for human life, but because the most valuable reusable material lies in the framing of the house, there is very little room for exemptions.

 

Economic benefits

While the marketplace for deconstruction is still developing, there are certainly economic benefits to its further implementation. The Urban Sustainability Directors Network (USDN) details the benefits of recycling C&D debris and of reusing material on the project site. Specifically, USDN shows how deconstruction ordinances can significantly reduce waste and the amount of material disposed of in landfills. According to USDN, the Foster Hill, California ordinance requires that 50% of all C&D tonnage be diverted from landfills to reuse; similarly, Portland’s deconstruction ordinance diverts an estimated 8 million pounds of material to reuse annually.[13] As deconstruction ordinances expand, the amount of material reused rather than discarded will increase, significantly improving overall sustainability. The largest remaining challenges are implementing ordinances and expanding the market from residential buildings to commercial deconstruction.

The principles of historic preservation show that abandonment can be one of the worst things for a building. When a building sits vacant, it becomes vulnerable to the degradation of time, climate, and wildlife, which will occupy the building regardless of humans but may destroy parts of it. Keeping buildings in use is a significant shared goal for those focused on the environmental stewardship of the built environment and those concerned with historic preservation. Preserving buildings in cities across the country, with environmental conservation as the utmost priority, can serve multiple goals at once: maintaining a form of historic preservation, practicing environmental conservation, and preventing cities from losing their historical character to the ever-present cultural danger of urban environments hollowing out.

One piece of artwork that captures the environmental turn within the historic preservation movement, and the larger consideration of embodied energy, is the gas-can building artwork entitled Preservation: Reusing America’s Energy, produced by the National Trust for Historic Preservation for Preservation Week in 1980.[14] The artwork depicts a commercial-style building as a gasoline can to represent the embodied energy present in the building, which should be preserved rather than destroyed for the purpose of new construction. The building is depicted as embodying 640,000 gallons of gasoline and, the poster argues, should be preserved to continue utilizing that energy rather than be destroyed. The artwork is certainly representative of its times, given the stagflation and high gas prices of the late 1970s and into the 1980s, and it puts the importance of preservation for the purpose of utility on full display. While representing embodied energy as gasoline may feel dated as we transition from fossil fuels to more sustainable energy sources, the conservation of the embodied energy stored in buildings remains an ever-present issue.

 

Paradigm change

“Changing the Paradigm from Demolition to Reuse—Building Reuse Ordinances,” by Tom Mayes, provides another argument, directly implicating current urban planning and city management practices that profess to be environmentally friendly yet demolish buildings without a second thought. Mayes argues that “few cities actively promote the reuse of existing buildings as a green strategy.”[15] Most demolished buildings’ materials end up in a landfill, while replacement materials are obtained through extractive methods and transported using fossil fuels. This cycle of demolition and construction continues the process of environmental destruction, even as cities claim to be solving the issue with sleek, ultra-modern, energy-efficient new construction. It does not aid environmental conservation in any meaningful way but perpetuates a culture of disposability that continues to damage the environment. Nigel Whiteley describes this culture of disposability and hyper-consumerism, established in the 1950s and 1960s, in which products were designed with the explicit understanding that they would soon become obsolete so that new products could take their place.[16] Disposability moved from fashion to automobiles to construction. Logically, a throwaway culture eventually leads to the disposal of buildings in favor of newer, more developed ones, with no concern for the energy expended in their original construction. This also has a significant impact on historic preservation, as the continuity of urban environments is broken for new construction rather than sustained through the historic structures that grounded the identity of the area.

The present issues faced by both the historic preservation and environmentalist movements can be best summarized in the words of John Muir: “People need Beauty as well as Bread.” The pragmatism of a working economy cannot be overlooked: the world of human interaction and commerce cannot stop to let the environment recover, but new and innovative practices can be put in place to ensure that the environment is conserved for future use. Much as human identity is preserved through the environment that has shaped human history, the preservation of the built environment is inextricably unified with this purpose. Whether in reducing CO2 emissions by preserving and retrofitting older buildings or in protecting vast swathes of landscape to safeguard both environmental and cultural identity, the two movements are linked together. Adaptation and innovation remain essential in addressing the challenges both fields currently face.[17]

 

 


 

 

Bibliography

Primary Sources

Lindlaw, Scott. “Preservations Urge Weighing Environmental Impact of Teardowns.” New Bedford Standard-Times, April 9, 2008. https://www.southcoasttoday.com/story/lifestyle/2008/04/09/preservations-urge-weighing-environmental-impact/52453454007/.

 

 

Secondary Sources

Journal Articles

Adam, Robert. “‘The Greenest Building Is the One That Already Exists.’” The Architects’ Journal, August 13, 2021. https://www.architectsjournal.co.uk/news/opinion/the-greenest-building-is-the-one-that-already-exists#:~:text=Carl%20Elefante%2C%20former%20president%20of,the%20one%20that%20already%20exists’.

“Deconstruction Requirements.” Portland.gov. October 31, 2016. https://www.portland.gov/bps/climate-action/decon/deconstruction-requirements.

“Encouraging and Mandating Building Deconstruction.” Urban Sustainability Directors Network. https://sustainableconsumption.usdn.org/initiatives-list/encouraging-and-mandating-building-deconstruction.

Mayes, Tom. “Changing the Paradigm from Demolition to Reuse—Building Reuse Ordinances.” In Bending the Future: Fifty Ideas for the Next Fifty Years of Historic Preservation in the United States, edited by Max Page and Marla R. Miller, 162–65. University of Massachusetts Press, 2016. http://www.jstor.org/stable/j.ctt1hd19hg.31.

McMahon, Edward T., and A. Elizabeth Watson. “In My Opinion: In Search of Collaboration: Historic Preservation and the Environmental Movement.” History News 48, no. 6 (1993): 26–27. http://www.jstor.org/stable/42655670.

Meryman, Helena. “Structural Materials in Historic Restoration: Environmental Issues and Greener Strategies.” APT Bulletin: The Journal of Preservation Technology 36, no. 4 (2005): 31–38. http://www.jstor.org/stable/40003161.

National Park Service. “Evaluating Substitute Materials in Historic Buildings.” Last Modified October 6, 2023. https://www.nps.gov/subjects/taxincentives/evaluating-substitute-materials.htm.

“Our Mission.” Re:Purpose Savannah. 2023. https://www.repurposesavannah.org/mission.

Preservation Green Lab, “The Greenest Building: Quantifying the Environmental Value of Building Reuse,” National Trust for Historic Preservation, 2011.

Prevost, Lisa. “Sustainability Advocates Ask: Why Demolish When You Can Deconstruct?” New York Times, September 1, 2021. https://www.nytimes.com/2021/09/01/business/waste-salvage-deconstruction-sustainability.html.

Roberts, Tristan. “Does Saving Historic Buildings Save Energy.” Green Building Advisor, May 2, 2011. https://www.greenbuildingadvisor.com/article/does-saving-historic-buildings-save-energy.

“Successes of a Sister City: Deconstruction around the World.” Re-Store.org. July 25, 2019. https://re-store.org/successes-of-a-sister-city-deconstruction-around-the-world/.

“Sustainable Management of Construction and Demolition Materials.” United States Environmental Protection Agency. Accessed November 7, 2023. https://www.epa.gov/smm/sustainable-management-construction-and-demolition-materials#:~:text=Demolition%20represents%20more%20than%2090,materials%20in%20the%20C%26D%20debris.

Tyrrell, Ian. “America’s National Parks: The Transnational Creation of National Space in the Progressive Era.” Journal of American Studies 46, no. 1 (2012): 1–21. http://www.jstor.org/stable/41427306.

Whiteley, Nigel. “Toward a Throw-Away Culture. Consumerism, ‘Style Obsolescence’ and Cultural Theory in the 1950s and 1960s.” Oxford Art Journal 10, no. 2 (1987): 3–27. http://www.jstor.org/stable/1360444.

 

 

 

Books

Miller, Marla R., and Max Page. Bending the Future: Fifty Ideas for the Next Fifty Years of Historic Preservation in the United States. UPCC Book Collections on Project MUSE. Amherst: University of Massachusetts Press, 2016. https://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,shib&db=nlebk&AN=1425207&site=eds-live&scope=site.

Brand, Stewart. How Buildings Learn: What Happens After They're Built. United States: Penguin Publishing Group, 1995.


[1] Ian Tyrrell, “America’s National Parks: The Transnational Creation of National Space in the Progressive Era,” Journal of American Studies 46, no. 1 (2012): 1–21. http://www.jstor.org/stable/41427306.

[2] Helena Meryman, “Structural Materials in Historic Restoration: Environmental Issues and Greener Strategies,” The Journal of Preservation Technology 36, no. 4 (2005): 31–38. http://www.jstor.org/stable/40003161.

[3] “Evaluating Substitute Materials in Historic Buildings,” National Park Service, Last Modified October 6, 2023, https://www.nps.gov/subjects/taxincentives/evaluating-substitute-materials.htm.

[4] Tristan Roberts, “Does Saving Historic Buildings Save Energy,” Green Building Advisor, May 2, 2011, https://www.greenbuildingadvisor.com/article/does-saving-historic-buildings-save-energy.

[5] Preservation Green Lab, “The Greenest Building: Quantifying the Environmental Value of Building Reuse,” National Trust for Historic Preservation, 2011.

[6] “Sustainable Management of Construction and Demolition Materials,” United States Environmental Protection Agency, Accessed November 7, 2023, https://www.epa.gov/smm/sustainable-management-construction-and-demolition-materials#:~:text=Demolition%20represents%20more%20than%2090,materials%20in%20the%20C%26D%20debris.

[7] Robert Adam, “‘The Greenest Building Is the One That Already Exists,’” The Architects’ Journal, August 13, 2021, https://www.architectsjournal.co.uk/news/opinion/the-greenest-building-is-the-one-that-already-exists#:~:text=Carl%20Elefante%2C%20former%20president%20of,the%20one%20that%20already%20exists’.

[8] Stewart Brand, How Buildings Learn: What Happens After They're Built, United States: Penguin Publishing Group, 1995.

[9] Lisa Prevost, “Sustainability Advocates Ask: Why Demolish When You Can Deconstruct?” New York Times, September 1, 2021, https://www.nytimes.com/2021/09/01/business/waste-salvage-deconstruction-sustainability.html.

[10] “Our Mission,” Re:Purpose Savannah, 2023, https://www.repurposesavannah.org/mission.

[11] “Successes of a Sister City: Deconstruction around the World,” Re-Store.org, July 25, 2019, https://re-store.org/successes-of-a-sister-city-deconstruction-around-the-world/.

[12] “Deconstruction Requirements,” Portland.gov, October 31, 2016, https://www.portland.gov/bps/climate-action/decon/deconstruction-requirements.

[13] “Encouraging and Mandating Building Deconstruction,” Urban Sustainability Directors Network, https://sustainableconsumption.usdn.org/initiatives-list/encouraging-and-mandating-building-deconstruction.

[14] Scott Lindlaw, “Preservations Urge Weighing Environmental Impact of Teardowns,” New Bedford Standard-Times, April 9, 2008, https://www.southcoasttoday.com/story/lifestyle/2008/04/09/preservations-urge-weighing-environmental-impact/52453454007/.

[15] Tom Mayes, “Changing the Paradigm from Demolition to Reuse—Building Reuse Ordinances,” In Bending the Future: Fifty Ideas for the Next Fifty Years of Historic Preservation in the United States, edited by Max Page and Marla R. Miller, 162–65. University of Massachusetts Press, 2016. http://www.jstor.org/stable/j.ctt1hd19hg.31.

[16] Nigel Whiteley, “Toward a Throw-Away Culture. Consumerism, ‘Style Obsolescence’ and Cultural Theory in the 1950s and 1960s,” Oxford Art Journal 10, no. 2 (1987): 3–27. http://www.jstor.org/stable/1360444.

[17] Edward T. McMahon and A. Elizabeth Watson, “In My Opinion: In Search of Collaboration: Historic Preservation and the Environmental Movement,” History News 48, no. 6 (1993): 26–27. http://www.jstor.org/stable/42655670.