Historically Speaking: The Hunt for a Better Way to Vote

Despite centuries of innovation, the humble 2,500-year-old ballot box is here to stay.

The Wall Street Journal

January 4, 2024

At least 40 national elections will take place around the world over the next year, with some two billion people going to the polls. Thanks to the 2,500-year-old invention of the ballot box, in most races these votes will actually count and be counted.

Ballot boxes were first used in Athens during the 5th century B.C., but in trials rather than elections. Legal cases were tried before a gathering of male citizens, known as the Assembly, and decided by vote. Jurors indicated their verdict by dropping either a marked or unmarked pebble into an urn, which protected them against violence by keeping their decision secret.

The first recorded case of ballot box stuffing also took place in 5th-century Athens. To exile an unpopular Athenian via an “ostracism election,” Assembly voters simply had to scratch his name on an ostrakon, a pottery shard, and whoever reached a certain threshold of votes was banished for 10 years. It is believed that Themistocles’s political enemies rigged his ostracism vote in 472 B.C. by distributing pre-etched shards throughout the Assembly.

In 139 B.C., the Romans passed a series of voter secrecy laws starting with the Lex Gabinia, which introduced the secret ballot for magistrate elections. A citizen would write his vote on a wax-covered wooden tablet and then deposit it in a wicker basket called a cista. The cistae were so effective at protecting voters from public scrutiny that many senators, including Cicero, regarded the ballot box as an attack on their authority and a dangerous concession to mob rule.

ILLUSTRATION: THOMAS FUCHS

The fact that secret ballots allowed men to vote as they pleased was one reason why King Charles I of England ordered all “balloting boxes” taken out of circulation in 1637; they did not return until the 19th century.

Even then, there was strong resistance to ballot boxes in Britain and America on the grounds that they were unmanly: A citizen ought to display his vote, not hide it. In any case, there was nothing special about a 19th-century ballot box except its convenience for stealing or stuffing. One notorious election scam in San Francisco in the 1850s involved a ballot box with a false bottom.

Public outrage over rigged elections in the U.S. led some states to adopt Samuel Jollie’s tamper-proof glass ballot box, which New York first used in 1857. But a transparent design couldn’t prevent these boxes from mysteriously disappearing, nor would it have saved Edgar Allan Poe, who is thought to have died in Baltimore from being “cooped,” a practice in which kidnap victims were drugged into docility and made to vote multiple times.

To better guarantee the integrity of elections, New York in 1892 introduced a new machine, invented by Jacob Myers, that allowed voters to choose candidates in private by pulling a lever, dispensing with ballots and ballot boxes altogether. Other inventors quickly improved on the design, and by 1900 Jollie’s glass ballot box had become obsolete. By World War II almost every city had switched over to mechanical voting systems, which tallied votes automatically.

The simple ballot box seemed destined to disappear until controversies over machine irregularities in the 2000 presidential election resulted in the Help America Vote Act, which requires all votes to have a paper record. The ballot box still isn’t the perfect shield against fraud. But then, neither is anything else.

Historically Speaking: Sending Cards for a Happy Birthday

On Oct. 26, imprisoned WSJ reporter Evan Gershkovich will turn 32. Since ancient times, birthdays have been occasions for poems, letters and expressions of solidarity.

The Wall Street Journal

October 13, 2023

Wall Street Journal reporter Evan Gershkovich turns 32 on Oct. 26. This year he will be spending his birthday in Lefortovo prison in Moscow, a detention center for high-profile and political prisoners. He has been there for the past six months, accused of espionage—a charge vehemently denied by the U.S. government and The Journal.

Despite the extreme restrictions placed on Lefortovo prisoners, it is still possible to send Evan messages of support, such as a birthday card, via the U.S. Embassy in Moscow or the freegershkovich.com website, to let him know that the world cares.

Birthday cards are so cheap and plentiful it is easy to miss their cultural value. They are the modern iteration of a literary tradition that goes back at least to Roman times. Poets were especially given to composing birthday odes to their friends and patrons. The Augustan poet Horace dedicated many of his poems to Maecenas, whose birthday, he wrote, “is almost more sacred to me than that of my own birth.”

The custom of birthday salutations petered out along with much else during the Dark Ages but was revived with the spread of mass literacy. Jane Austen would write to her siblings on their birthdays, wishing them the customary “joy,” but toward the end of her life she began to experiment with the form. In 1817, she sent her three-year-old niece Cassy a special birthday letter written in reverse spelling, beginning with “Ym raed Yssac.”

Austen’s sense that a birthday letter ought to be unique coincided with a technological race in the printing industry. One of the first people to realize the commercial potential of greeting cards was Louis Prang, a German immigrant in Boston, who began selling printed cards in 1856. Holiday cards were an instant success, but birthday cards were less popular until World War I, when many American families had a relative fighting overseas.

Demand for birthday cards stayed high after the war, as did the importance attached to them. King George V seized on their popularity to introduce the royal tradition of sending every British citizen who reaches 100 a congratulatory birthday card. In 1926, to show how much they appreciated the gift of U.S. aid, more than 5 million Poles signed a 30,000-page birthday card commemorating America’s 150th anniversary.

During the Cold War, the symbolism of the birthday card became a power in itself. In 1984, Illinois Rep. John Edward Porter and other members of Congress sent birthday cards to Mart Niklus, an Estonian civil rights campaigner imprisoned in the U.S.S.R. By coincidence, the Soviets released Niklus in July 1988, the same month that Nelson Mandela received more than 50,000 cards for his 70th birthday. The frustrated South African prison authorities allowed him to have 12 of them. But the writing was on the wall, as it were, and Mandela was released from prison two years later.

Rep. Porter didn’t know what effect his birthday card to Niklus would have. “I doubt he will get them,” he told the House. “Yet by sending these birthday cards…we let the Soviet officials know that we will not forget him.”

I am sending my birthday card to Evan in the same spirit.

Historically Speaking: Tourists Behaving Badly

When today’s travelers get in trouble for knocking over statues or defacing temples, they’re following an obnoxious tradition that dates back to the Romans.

The Wall Street Journal

September 8, 2023

Tourists are giving tourism a bad name. The industry is a vital cog in the world economy, generating more than 10% of global GDP in 2019. But the antisocial behavior of a significant minority is causing some popular destinations to enact new rules and limits. Among this year’s egregious tourist incidents: two drunk Americans had to be rescued off the Eiffel Tower, a group of Germans in Italy knocked over a 150-year-old statue while taking selfies, and a Canadian teen in Japan defaced an 8th-century temple.

It’s ironic that sightseeing, one of the great perks of civilization, has become one of its downsides. The ancient Greeks called it theoria and considered it to be both good for the mind and essential for appreciating one’s own culture. As the 5th-century B.C. Greek poet Lysippus quipped: “If you’ve never seen Athens, your brain’s a morass./If you’ve seen it and weren’t entranced, you’re an ass.”

The Romans surpassed the Greeks in their love of travel. Unfortunately, they became the prototype for that tourist cliché, the “ugly American,” since they were rich, entitled and careless of other people’s heritage. The Romans never saw an ancient monument they didn’t want to buy, steal or cover in graffiti. The word “miravi”—“I was amazed”—was the Latin equivalent of “Kilroy was here,” and can be found all over Egypt and the Mediterranean.

ILLUSTRATION: THOMAS FUCHS

Mass tourism picked up during the Middle Ages, facilitated by the Crusades and the popularity of religious pilgrimages. But so did the Roman habit of treating every ancient building like a public visitor’s book. The French Crusader Lord de Coucy actually painted his family coat of arms onto one of the columns of the Church of the Nativity in Bethlehem.

In 17th- and 18th-century Europe, the scions of aristocratic families would embark on a Grand Tour of famous sights, especially in France and Italy. The idea was to turn them into sophisticated men of the world, but for many young men, the real point of the jaunt was to sample bordellos and be drunk and disorderly without fear of their parents finding out.

Even after the Grand Tour went out of fashion, the figure of the tourist usually conjured up negative images. Visiting Alexandria in Egypt in the 19th century, the French novelist Gustave Flaubert raged at the “imbecile” who had painted the words “Thompson of Sunderland” in six-foot-high letters on Pompey’s Pillar. The perpetrators were in fact the rescued crew of a ship named the Thompson.

Flaubert was nevertheless right about the sheer destructiveness of some tourists. Souvenir hunters were among the worst offenders. In the Victorian era, Stonehenge in England was chipped and chiseled with such careless disregard that one of its massive stones eventually collapsed.

Sometimes tourists go beyond vandalism to outright madness. Jerusalem Syndrome, first recognized in the Middle Ages, is the sudden onset of religious delusions while visiting the biblical city. Stendhal Syndrome is an acute psychosomatic reaction to the beauty of Florence’s artworks, named for the French writer who suffered such an attack in 1817. There’s also Paris Syndrome, a transient psychosis triggered by extreme feelings of letdown on encountering the real Paris.

As for Stockholm Syndrome, when an abused person identifies with their abuser, there’s little chance of it developing in any of the places held hostage by hordes of tourists.

Historically Speaking: Saving Lives With Lighthouses

Since the first one was built in ancient Alexandria, lighthouses have helped humanity master the danger of the seas.

The Wall Street Journal

July 21, 2023

For those who dream big, there will be a government auction on Aug. 1 for two decommissioned lighthouses, one in Cleveland, Ohio, the other in Michigan’s Upper Peninsula. Calling these lighthouses “fixer-uppers,” however, hardly does justice to the challenge of converting them into livable homes. Lighthouses were built so man could fight nature, not sit back and enjoy it.

France’s Cordouan Lighthouse. GETTY IMAGES

The Lighthouse of Alexandria, the earliest one recorded, was one of the ancient Seven Wonders of the World. An astonishing 300 feet tall or more, it was commissioned in 290 B.C. by Ptolemy I Soter, the founder of Egypt’s Ptolemaic dynasty, to guide ships into the harbor and keep them from the dangerous shoals surrounding the entrance. No word existed for lighthouse, hence it was called the Pharos of Alexandria, after the small islet on which it was located.

The Lighthouse did wonders for the Ptolemies’ reputation as the major power players in the region. The Romans implemented the same strategy on a massive scale. Emperor Trajan’s Torre de Hercules in A Coruña, in northwestern Spain, can still be visited. But after the empire’s collapse, its lighthouses were abandoned.

More than a thousand years passed before Europe again possessed the infrastructure and maritime capacity to need lighthouses, let alone build them. The contrasting approaches of France and England say much about the two cultures. The French regarded lighthouses as a government priority, resulting in such architectural masterpieces as Bordeaux’s Cordouan Lighthouse, commissioned by Henri III in 1584. The English entrusted theirs to Trinity House, a private charity, which led to inconsistent implementation. In 1707, poor lighthouse guidance contributed to the sinking of Admiral Sir Cloudesley Shovell’s fleet off the coast of the Scilly Isles, costing his life and roughly 1,500 others.


In 1789, the U.S. adopted a third approach. Alexander Hamilton, the first Secretary of the Treasury, argued that federal oversight of lighthouses was an important symbol of the new government’s authority. Congress ordered the states to transfer control of their existing lighthouses to a new federal agency, the U.S. Lighthouse Establishment. But in the following decades Congress’s chief concern was cutting costs. America’s lighthouses were decades behind Europe’s in adopting the Fresnel lens, invented in France in 1822, which concentrated light into a powerful beam.

The U.S. had caught up by the time of the Civil War, but no amount of engineering improvements could lessen the hardship and dangers involved in lighthouse-keeping. Isolation, accidents and deadly storms took their toll, yet it was one of the few government jobs open to women. Ida Lewis saved at least 18 people from drowning during her 54-year tenure at Lime Rock Station off Newport, R.I.

Starting in the early 1900s, there were moves to convert lighthouses to electricity. The days of the lighthouse keeper were numbered. Fortunately, when a Category 4 hurricane hit Galveston, Texas, on Sept. 8, 1900, its lighthouse station was still fully manned. The keeper, Henry C. Claiborne, managed to shelter 125 people in his tower before the storm surge engulfed the lower floors. Among them were the nine survivors of a stranded passenger train. Claiborne labored all night, manually rotating the lens after its mechanical parts became stuck. The lighthouse was a beacon of safety during the storm, and a beacon of hope afterward.

To own a lighthouse is to possess a piece of history, plus unrivaled views and not a neighbor in sight—a bargain whatever the price.

Historically Speaking: The Long Road to Pensions for All

ILLUSTRATION: THOMAS FUCHS

From the Song Dynasty to the American Civil War, governments have experimented with ways to support retired soldiers and workers

The Wall Street Journal

April 6, 2023

“Will you still need me, will you still feed me,/When I’m sixty-four?” sang the Beatles on their 1967 album “Sgt. Pepper’s Lonely Hearts Club Band.” These were somewhat hypothetical questions at a time when the mean age of American men taking retirement was 64, and their average life expectancy was 67. More than a half-century later, the Beatles song resonates in a different way, because there are so few countries left where retirement on a state pension at 64 is even possible.

Historically, governments preferred not to be in the retirement business, but self-interest sometimes achieved what charitable impulses could not. In 6 A.D., a well-founded fear of civil unrest encouraged Augustus Caesar to institute the first state pension system, the aerarium militare, which looked after retired army veterans. He earmarked a 5% tax on inheritances to pay for the scheme, which served as a stabilizing force in the Roman Empire for the next 400 years. The Sack of Rome in 410 by Alaric, leader of the Visigoths, probably could have been avoided if Roman officials had kept their promise to pay his allied troops their military pensions.

In the 11th century, the Song emperor Shenzong invited the brilliant but mercurial governor of Nanjing, Wang Anshi, to reform China’s entire system of government. Wang’s far-reaching “New Laws” included state welfare plans to care for the aged and infirm. Some of his ideas were accepted but not the retirement plan, which achieved the remarkable feat of uniting both conservatives and radicals against him: The former regarded state pensions as an assault on family responsibility, the latter thought it gave too much power to the government. Wang was forced to retire in 1075.

Leaders in the West were content to muddle along until, like Augustus, they realized that a large nation-state requires a national army to defend it. England’s Queen Elizabeth I oversaw the first army and navy pensions in Europe. She also instituted the first Poor Laws, which codified the state’s responsibility toward its citizens. The problem with the Poor Laws, however, was that they transferred a national problem to the local level and kept it there.

Before he fell victim to the Terror during the French Revolution, the Marquis de Condorcet tried to figure out how France might pay for a national pension system. The question was largely ignored in the U.S. until the Civil War forced the federal government into a reckoning. A military pension system that helped fewer than 10,000 people in 1861 grew into a behemoth serving over 300,000 in 1885. By 1894 military pensions accounted for 37% of the federal budget. One side effect was to hamper the development of national and private pension schemes. Among the few companies to offer retirement pensions for employees were the railroads and American Express.

By the time Frances Perkins, President Franklin Roosevelt’s Labor Secretary, ushered in Social Security in 1935, Germany’s national pension scheme was almost 50 years old. But the German system started at age 70, far too late for most people, which was the idea. As Jane Austen’s Mrs. Dashwood complained in “Sense and Sensibility,” “People always live forever when there is an annuity to be paid to them.” The last Civil War pensioner was Irene Triplett, who died in 2020. She was receiving $73.13 every month for her father’s Union service.

Historically Speaking: The Fungus That Fed Gods And Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty, 25-220 AD, was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.

ILLUSTRATION: THOMAS FUCHS

There is no hiding the fact that mushrooms have a dark side, however. Fewer than 100 species are actually poisonous out of the thousands of varieties that have been identified. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another suspected victim of the death cap mushroom was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII’s attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: The Modern Flush Toilet Has Ancient Origins

Even the Minoans of Crete found ways to whisk away waste with flowing water.

The Wall Street Journal

June 9, 2021

Defecation is a great equalizer. As the 16th-century French Renaissance philosopher Michel de Montaigne put it trenchantly in his Essays, “Kings and philosophers shit, as do ladies.”

Yet, even if each person is equal before the loo, not all toilets are considered equal. Sitting or squatting, high or low-tech, single or dual flush: Every culture has preferences and prejudices. A top-end Japanese toilet with all the fixtures costs as much as a new car.

ILLUSTRATION: THOMAS FUCHS

Pride in having the superior bathroom experience goes back to ancient times. As early as 2500 B.C., wealthy Mesopotamians could boast of having pedestal lavatories and underfloor piping that fed into cesspits. The Harappans of the Indus Valley Civilization went one better, building public drainage systems that enabled even ordinary dwellings to have bathrooms and toilets. Both, however, were surpassed by the Minoans of Crete, who invented the first known flush toilet, using roof cisterns that relied on the power of gravity to flush the contents into an underground sewer.

The Romans’ greatest contribution to sanitary comfort was the public restroom. By 300 B.C., Rome had nearly 150 public toilet facilities. These were communal, multi-seater affairs consisting of long stone benches with cut-out holes set over a channel of continuously running water. Setting high standards for hygiene, the restrooms had a second water trough for washing and sponging.

Although much knowledge and technology was lost during the Dark Ages, the Monty Python depiction of medieval society as unimaginably filthy was somewhat of an exaggeration. Castle bedrooms were often en-suite, with pipes running down the exterior walls or via internal channels to moats or cesspits. Waste management was fraught with danger, though—from mishaps as much as disease.

The most famous accident was the Erfurt Latrine Disaster. In 1184, King Henry VI of Germany convened a royal gathering at Petersburg Citadel in Erfurt, Thuringia. Unfortunately, the ancient hall was built over the citadel’s latrines. The meeting was in full swing when the wooden flooring suddenly collapsed, hurling many of the assembled nobles to their deaths in the cesspit below.

Another danger was the vulnerability of the sewage system to outside penetration. Less than 20 years after Erfurt, French troops in Normandy captured the English-held Chateau Gaillard by climbing up the waste shafts.

Sir John Harington, a godson of Queen Elizabeth I, rediscovered the flushable toilet in 1596. Her Majesty had one installed at Richmond Palace. But the contraption failed to catch on, perhaps because smells could travel back up the pipe. The Scottish inventor Alexander Cumming solved that problem in the late 18th century by introducing an S-shaped pipe below the bowl that prevented sewer gas from escaping.

Thomas Crapper, contrary to lore, didn’t invent the modern toilet: He was the chief supplier to the royal household. Strangely for a country renowned for its number of bathrooms per household, the U.S. granted its first patent for a toilet—or “plunger closet”—only in 1857. As late as 1940, some 45% of households still had outhouses. The American toilet race, like the space race, only took off later, in the ’50s. There is no sign of its slowing down. Coming to a galaxy near you: The cloud-connected toilet that keeps track of your vitals and cardiovascular health.

Historically Speaking: Inflation Once Had No Name, Let Alone Remedy

Empires from Rome to China struggled to restore the value of currencies that spiraled out of control

The Wall Street Journal

May 27, 2022

Even if experts don’t always agree on the specifics, there is broad agreement on what inflation is and on its dangers. But this consensus is relatively new: The term “inflation” only came into general usage during the mid-19th century.

Long before that, Roman emperors struggled to address the nameless affliction by debasing their coinage, which only worsened the problem. By 268 AD, the silver content of the denarius had dropped to 0.5%, while the price of wheat had risen almost 200-fold. In 301, Emperor Diocletian tried to restore the value of Roman currency by imposing rigid controls on the economy. But the reforms addressed inflation’s symptoms rather than its causes. Even Diocletian’s government preferred to collect taxes in kind rather than in specie.

A lack of knowledge about the laws of supply and demand also doomed early Chinese experiments in paper money during the Southern Song, Mongol and Ming Dynasties. Too many notes wound up in circulation, leading to rampant inflation. Thinking that paper was the culprit, the Chongzhen Emperor hoped to restore stability by switching to silver coins. But these introduced other vulnerabilities. In the 1630s, the decline of Spanish silver from the New World (alongside a spate of crop failures) resulted in a money shortage—and a new round of inflation. The Ming Dynasty collapsed not long after, in 1644.

Spain was hardly in better shape. The country endured unrelenting inflation during the so-called Price Revolution in Europe in the 16th and 17th centuries, as populations increased, demand for goods spiraled and the purchasing power of silver collapsed. The French political theorist Jean Bodin recognized as early as 1568 that rising prices were connected to the amount of money circulating in the system. But his considered view was overlooked in the rush to find scapegoats, such as the Jews.

ILLUSTRATION: THOMAS FUCHS

The great breakthrough came in the 18th century as classical economists led by Adam Smith argued that the market was governed by laws and could be studied like any other science. Smith also came close to identifying inflation, observing that wealth is destroyed when governments attempt to “inflate the currency.” The term “inflation” became common in the mid-19th century, particularly in the U.S., in the context of boom and bust cycles caused by an unsecured money supply.

Ironically, the worst cases of inflation during the 20th century coincided with the rise of increasingly sophisticated models for predicting it. The hyperinflation of the German Papiermark during the Weimar Republic in 1921-23 may be the most famous, but it pales in comparison to the Hungarian Pengo in 1945-46. Inflamed by the government’s weak response, prices doubled every 15 hours at their peak. The one billion trillion Pengo note was worth about one pound sterling. By 1949 the currency had gone—and so had Hungary’s democracy.

In 1982, the U.S. Federal Reserve under Paul Volcker achieved a historic victory over what became known as the Great Inflation of the 1960s and ’70s. It did so through an aggressive regimen of high interest rates to curb spending. Ordinary Americans suffered high unemployment as a result, but the country endured. As with any affliction, it isn’t enough for doctors to identify the cause: The patient must be prepared to take his medicine.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.

ILLUSTRATION: JAMES STEINBERG

The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Urged on by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. Surveying the Pacific theater in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the waning years of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the war. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th-century Russian painting
PHOTO: ALAMY

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.