Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.

The first medical breakthrough came in 1623, after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Spurred by this catastrophe to find a cure, Jesuit missionaries in Peru learned that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. From that bark came quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theater in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the latter half of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

The Sunday Times: Rumsfeld was the wrong man at the wrong time

Bush’s war supremo brought about his own worst fear: another Vietnam

July 4, 2021

On the whole, sacked US defence secretaries should avoid quoting Winston Churchill as they depart from the White House, in much the same way as disgraced preachers should leave off quoting Jesus as they are led away in handcuffs. Donald Rumsfeld, who died aged 88 last week, was the chief architect of the American invasions of Afghanistan and Iraq. The aftermath was considered a failure, however, and President George W Bush reversed course after Rumsfeld’s departure in December 2006, sending in extra troops to stop Iraq’s descent into civil war. Rumsfeld handled being fired with his customary mix of puckish humour and pugnacious bravado. In his final speech he paraphrased Churchill, saying: “I have benefited greatly from criticism and at no time have I suffered a lack thereof.”

The last bit was true then, and it continued to be until his death. Rumsfeld’s critics, who faulted his refusal to commit sufficient fighting troops in either Afghanistan or Iraq, could fill a football pitch. (The first of many damning appraisals appeared in 2007, entitled Rumsfeld: An American Disaster.) But the claim that he benefited from criticism was so laughable to anyone who knew him that it only highlighted the catastrophic deficiencies of the man. Rumsfeld was incapable of listening to anyone who didn’t already agree with him. He was the wrong man for the job at the most inopportune time in America’s history.

As several obituaries of Rumsfeld pointed out, his first stint as defence secretary, under Gerald Ford in 1975-77, happened under arguably worse circumstances. A survivor from Richard Nixon’s administration, where he stood out for his unwavering commitment to civil rights, Rumsfeld was the White House chief of staff during the last days of Saigon in April 1975. Appointed defence secretary shortly afterwards, Rumsfeld regarded it as his mission to keep the military ready and competitive but essentially inactive. This wasn’t cowardice but good Cold War strategy.

Rumsfeld’s reputation as a strategic thinker was subsequently borne out by his wildly successful transition to the business world. He was also a clear, no-nonsense communicator, whose fondness for aphorisms and golden rules became the stuff of legend. When Rumsfeld left the White House for the first time, he bequeathed a memorandum of best practices, Rumsfeld’s Rules, 11 of which were specific to the secretary of defence. They began with the reminder: “The secretary of defence is not a super general or admiral. His task is to exercise civilian control over the department for the commander-in-chief [the president] and the country”, and included such important pieces of advice as: “Establish good relations between the departments of Defence, State, the National Security Council, CIA and the Office of Management and Budget.”

When Rumsfeld returned to the White House in 2001, aged 68, he broke almost every one of his own rules. Not only did he allow relations between the departments to become utterly dysfunctional but he so alienated the generals and joint chiefs of staff that it was widely assumed he “hated” the military. The serving chairman of the joint chiefs, General Hugh Shelton, complained that Rumsfeld operated “the worst style of leadership I witnessed in 38 years of service”. Rumsfeld treated all soldiers as boneheaded grunts, and the generals simply as boneheaded grunts with the medals to prove it.

His planned military transformations suffered from an overly technocratic mentality. He believed that the army was costly and lacked efficiency (what army doesn’t?), as though bottom lines apply equally in business and fighting. Rumsfeld wanted to remake the military as one that relied more on air power and technical advantages and less on soldiers with guns. The charge against him is that he eviscerated the military’s ground capabilities and reduced its fighting numbers at precisely the moment both were paramount to American success in Afghanistan and Iraq.

What was going through Rumsfeld’s mind? Millions of words have been spilt in the effort to explain why the defence secretary doggedly pursued a losing policy in the teeth of protests from the military. In his last year in the job, six retired generals publicly rebuked him. Part of the answer lies in his character: he was a micromanager who failed to engage, a bureaucrat who despised professionals, an aggressor who disliked fighting. But the real driver of his actions can be traced back to 1975. More than anything else, even more than winning perhaps, he wanted to avoid a repeat of the Vietnam War, with its “limited war” rhetoric that was belied by the fact it was the first of what the US media have dubbed America’s “forever wars” — metastasising conflicts without a clear end in sight.

Rumsfeld emerged from his first period in the White House blaming the military for having misled successive administrations into committing themselves to an unwinnable and unpopular war. Hence, he believed that nothing the military said could be taken at face value. He was not going to allow himself to be taken prisoner by the top brass. Unlike Robert McNamara, the defence secretary most associated with US involvement in Vietnam, Rumsfeld was determined to stick to quick, in-and-out military objectives. There would be no quagmires, mass body bags, forever wars or attempts at nation-building on his watch.

Yet he was a prisoner all the same. Even though the causes and conditions were different, the Vietnam War remained the lens through which Americans judged the Iraq war. A few months after the coalition’s initial victory in Iraq in 2003, Senator John McCain, who spent five years as a PoW in Vietnam, warned the Bush administration that stopping the army doing its job, as interpreted by McCain, would risk “the most serious American defeat on the global stage since Vietnam”. By 2004, Democrats were saying: “Iraq is George Bush’s Vietnam”. Rumsfeld ended up being directly compared to McNamara, even though, by winding down, rather than ratcheting up, the US presence in the Middle East, he was doing the opposite of his predecessor. A 2005 column in the Washington Post intoned: “Just as Vietnam became McNamara’s war, Iraq has become Rumsfeld’s war”.

The successful troop “surge” of 2007-08 appeared to erase Rumsfeld’s Vietnam legacy. Only it didn’t. Barack Obama’s foreign policy — summed up as “leading from behind” — was Rumsfeldian in its horror of American military entanglement in foreign affairs. The Trump administration’s was even more so. More than six trillion war dollars later, with countless lives lost, if Rumsfeld leaves anything behind, it’s the lesson that America’s forever war syndrome is a state of mind, not just a reality.

Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, from a Latin phrase meaning “he who sues for himself does also for the king.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Winning Ways of Moving the Troops

Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare

The Wall Street Journal

May 6, 2021

The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.

How troops move into battle positions is hardly the stuff of poetry. Homer’s “The Iliad” begins with the Greeks having already spent 10 years besieging Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.

The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.

The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Vandals. Their Central European horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.

Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.

Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.

The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.

The Prussians referred to this kind of troop-maneuvering warfare as Bewegungskrieg, or war of movement, using it to crushing effect over the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.

The Germans started World War II with their production capacity lagging that of the Allies; they compensated by updating Bewegungskrieg to what became known as blitzkrieg, or lightning war, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.

In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)

Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.

Historically Speaking: Undying Defeat: The Power of Failed Uprisings

From the Warsaw Ghetto to the Alamo, doomed rebels live on in culture

John Wayne said that he saw the Alamo as ‘a metaphor for America’. PHOTO: ALAMY

Earlier this month, Israel commemorated the 75th anniversary of the Warsaw Ghetto Uprising of April 1943. The annual Remembrance Day of the Holocaust and Heroism, as it is called, reminds Israelis of the moral duty to fight to the last.

The Warsaw ghetto battle is one of many doomed uprisings across history that have cast their influence far beyond their failures, providing inspiration to a nation’s politics and culture.

Nearly 500,000 Polish Jews once lived in the ghetto. By January 1943, the Nazis had marked the surviving 55,000 for deportation. The Jewish Fighting Organization had just one machine gun and fewer than a hundred revolvers for a thousand or so sick and starving volunteer soldiers. The Jews started by blowing up some tanks and fought on until May 16. The Germans executed 7,000 survivors and deported the rest.

For many Jews, the rebellion offered a narrative of resistance, an alternative to the grim story of the fortress of Masada, where nearly 1,000 besieged fighters chose suicide over slavery during the First Jewish-Roman War (A.D. 66–73).

The story of the Warsaw ghetto uprising has also entered the wider culture. The title of Leon Uris’s 1961 novel “Mila 18” comes from the street address of the headquarters of the Jewish resistance in their hopeless fight. Four decades later, Roman Polanski made the uprising a crucial part of his 2002 Oscar-winning film, “The Pianist,” whose musician hero aids the effort.

Other doomed uprisings have also been preserved in art. The 48-hour Paris Uprising of 1832, fought by 3,000 insurrectionists against 30,000 regular troops, gained immortality through Victor Hugo, who made the revolt a major plot point in “Les Misérables” (1862). The novel was a hit on its debut and ever after—and gave its world-wide readership a set of martyrs to emulate.

Even a young country like the U.S. has its share of national myths, of desperate last stands serving as touchstones for American identity. One has been the Battle of the Alamo in 1836 during the War of Texas Independence. “Remember the Alamo” became the Texan war cry only weeks after roughly 200 ill-equipped rebels, among them the frontiersman Davy Crockett, were killed defending the Alamo mission in San Antonio against some 2,000 Mexican troops.

The Alamo’s imagery of patriotic sacrifice became popular in novels and paintings but really took off during the film era, beginning in 1915 with the D.W. Griffith production, “Martyrs of the Alamo.” Walt Disney got in on the act with his 1950s TV miniseries, “Davy Crockett: King of the Wild Frontier.” John Wayne’s 1960 “The Alamo,” starring Wayne as Crockett, immortalized the character for a generation.

Wayne said that he saw the Alamo as “a metaphor of America” and its will for freedom. Others did too, even in very different contexts. During the Vietnam War, President Lyndon Johnson, whose hometown wasn’t far from San Antonio, once told the National Security Council why he believed U.S. troops needed to be fighting in Southeast Asia: “Hell,” he said, “Vietnam is just like the Alamo.”