WSJ Historically Speaking: In Epidemics, Leaders Play a Crucial Role

ILLUSTRATION: JON KRAUSE

Lessons in heroism and horror as a famed flu pandemic hits a milestone

A century ago this week, an army cook named Albert Gitchell at Fort Riley, Kansas, paid a visit to the camp infirmary, complaining of a severe cold. It’s now thought that he was America’s patient zero in the Spanish Flu pandemic of 1918.

The disease killed more than 40 million people world-wide, including 675,000 Americans. In this case, as in so many others throughout history, the pace of the pandemic’s deadly progress depended on the actions of public officials.

Spain, neutral in World War I, had allowed unrestricted reporting about the flu, so people mistakenly believed it originated there. Other countries, including the U.S., squandered thousands of lives by suppressing news and delaying health measures. Chicago kept its schools open, citing a state commission that had declared the epidemic at a “standstill,” while the city’s public health commissioner said, “It is our duty to keep the people from fear. Worry kills more people than the epidemic.”

Worry had indeed sown chaos, misery and violence in many previous outbreaks, such as the Black Death. The disease, caused by the bacterium Yersinia pestis and spread by fleas living on rodents, swept through Asia and Europe during the 1340s, killing up to a quarter of the world’s population. In Europe, where over 50 million died, a search for scapegoats led to widespread pogroms against Jews. In 1349, the city of Strasbourg, then part of the Holy Roman Empire, put to death hundreds of Jews and expelled the rest.

But not all authorities lost their heads at the first sign of contagion. Pope Clement VI (1291-1352), one of a series of popes who ruled from the southern French city of Avignon, declared that the Jews had not caused the plague and issued two papal bulls against their persecution.

In Italy, Venetian authorities took the practical approach: They didn’t allow ships from infected ports to dock and subjected all travelers to a period of isolation. The term quarantine comes from the Italian quaranta giorni, meaning “40 days”—the official length of time until the Venetians granted foreign ships the right of entry.

Less exalted rulers could also show prudence and compassion in the face of a pandemic. After plague struck the village of Eyam in England in 1665, the vicar William Mompesson persuaded its several hundred inhabitants not to flee, to prevent the disease from spreading to other villages. The biggest landowner in the county, the Earl of Devonshire, ensured a regular supply of food and necessities to the stricken community. Some 260 villagers died during their self-imposed quarantine, but their decision likely saved thousands of lives.

The response to more recent pandemics has not always met that same high standard. When the viral disease severe acute respiratory syndrome (SARS) emerged in China in November 2002, the government’s refusal to acknowledge the outbreak allowed the disease to spread to Hong Kong, a hub for the West and much of Asia, thus creating a world problem. On a more hopeful note, when Ebola was spreading uncontrollably through West Africa in 2014, the Ugandans leapt into action, saturating their media with warnings and enabling quick reporting of suspected cases, and successfully contained their outbreak.

Pandemics always create a sense of crisis. History shows that public leadership is the most powerful weapon in keeping them from becoming full-blown tragedies.

WSJ Historically Speaking: The Quest for Unconsciousness: A Brief History of Anesthesia

The ancient Greeks used alcohol and opium. Patients in the 12th century got a ‘soporific sponge.’ A look at anesthetics over the centuries

ILLUSTRATION: ELLEN WEINSTEIN

Every year, some 21 million Americans undergo a general anesthetic. During recent minor surgery, I became one of the roughly 26,000 Americans a year who experience “anesthetic awareness” during sedation: I woke up. I still can’t say what was more disturbing: being conscious or seeing the horrified faces of the doctors and nurses.

The best explanation my doctors could give was that not all brains react in the same way to a general anesthetic. Redheads, for example, seem to require higher dosages than brunettes. While not exactly reassuring, this explanation does highlight one of the many mysteries behind the science of anesthesia.

Although being asleep and being unconscious might look the same, they are very different states. Until the mid-19th century, a medically induced deep unconsciousness was beyond the reach of science. Healers had no reliable way to control, let alone eliminate, a patient’s awareness or pain during surgery, though not for lack of trying.

The ancient Greeks generally relied on alcohol, opium from the poppy or mandrake root to sedate patients. Evidence from the “Sushruta Samhita,” an ancient Sanskrit medical text, suggests that Indian healers used cannabis incense. The Chinese developed acupuncture at some point before 100 B.C., and in Central and South America, shamans used the spit from chewed coca leaves as a numbing balm.

Little changed over the centuries. In the 12th century, Nicholas of Salerno recorded in a treatise the recipe for a “soporific sponge” with ingredients that hadn’t advanced much beyond the medicines used by the Greeks: a mixture of opium, mulberry juice, lettuce seed, mandrake, ivy and hemlock.

Discoveries came but weren’t exploited. In 1540, the Swiss alchemist and astrologer Paracelsus (aka Theophrastus Bombastus von Hohenheim) noted that liquid ether could induce sleep in animals. In 1772, the English chemist Joseph Priestley discovered nitrous oxide gas (laughing gas). Using it became the thing to do at parties—in 1799, the poet Coleridge described trying the gas—but apparently no one tried using ether or nitrous oxide for medicinal purposes.

In 1811, the novelist Fanny Burney had no recourse when she went under the knife for suspected breast cancer. She wrote later, “O Heaven!—I then felt the Knife rackling against the breast bone—scraping it!”

Despite the ordeal, Burney lived into her 80s, dying in 1840—just before everything changed. Ether, nitrous oxide and later chloroform soon became common in operating theaters. On Oct. 16, 1846, a young Boston dentist named William Morton made history by anesthetizing a surgical patient with ether. It was such a success that, a few months later, Frances Appleton Longfellow, wife of Henry Wadsworth Longfellow, became the first American to receive anesthesia during childbirth.

But these wonder drugs were lethal if not administered properly. A German study compiled in 1934 estimated that the number of chloroform-related deaths was as high as 1 in 3,000 operations. The drive for safer drugs produced such breakthroughs as halothane, an inhaled anesthetic introduced in the mid-1950s.

Yet for all the continuous advances in anesthesia, scientists still don’t entirely understand how it works. A study published in the December 2017 issue of Annals of Botany reveals that anesthetics can also stop motion in plants like the Venus flytrap—which, as far as we know, doesn’t have a brain. Clearly, we still have a lot to learn about consciousness in every form.

WSJ Historically Speaking: Life Beyond the Three-Ring Circus

Why ‘The Greatest Show on Earth’ foundered—and what’s next

ILLUSTRATION: THOMAS FUCHS

The modern circus, which celebrates its 250th anniversary this year, has attracted such famous fans as Queen Victoria, Charles Dickens and Ernest Hemingway, who wrote in 1953, “It’s the only spectacle I know that, while you watch it, gives the quality of a truly happy dream.”

Recently, however, the “happy dream” has struggled with lawsuits, high-profile bankruptcies and killer-clown scares inspired in part by the evil Pennywise in Stephen King’s “It.” Even the new Hugh Jackman-led circus film, “The Greatest Showman,” comes with an ironic twist. The surprise hit—about the legendary impresario P.T. Barnum, co-founder of “The Greatest Show on Earth”—arrives on the heels of last year’s closing of the actual Ringling Bros. and Barnum & Bailey Circus, after 146 years in business.

The word circus is Latin, but Roman and modern circuses do not share the same roots. Rome’s giant Circus Maximus, which could hold some 150,000 people, was more of a sporting arena than a theatrical venue, built to hold races, athletic competitions and executions. The Roman satirist Juvenal was alluding to the popular appeal of such spectacles when he coined the phrase “bread and circuses,” assailing citizens’ lack of interest in politics.

In fact, the entertainments commonly linked with the modern circus—acrobatics, animal performances and pantomimes—belong to traditions long predating the Romans. Four-millennia-old Egyptian paintings show female jugglers; in China, archaeologists have found 2,000-year-old clay figurines of tumblers.

Circus-type entertainments could be hideously violent: In 17th-century Britain, dogs tore into bears and chimpanzees. A humane change of pace came in 1768, when Philip Astley, often called the father of the modern circus, put on his first show in London, in a simple horse-riding ring. He found that a circle 42 feet in diameter was ideal for using centrifugal force as an aid in balancing on a horse’s back while doing tricks. It’s a size still used today. Between the horse shows, he scheduled clowning and tumbling acts.

Circuses in fledgling America, with its long distances, shortage of venues and lack of large cities, found the European model too static and costly. In 1808, Hachaliah Bailey took the circus in a new direction by making animals the real stars, particularly an African elephant named Old Bet. The focus on animal spectacles became the American model, while Europeans still emphasized human performers.

When railroads spread across America, circuses could ship their menageries. Already famous for his museums and “freak shows,” P.T. Barnum and his partners joined forces with rivals and used special circus trains to create the largest circus in the country. Although Barnum played up the animal and human oddities in his “sideshow,” the marquee attraction was Jumbo the Elephant. In its final year, the Ringling Bros. animal contingent, according to a news report, included tigers, camels, horses, kangaroos and snakes. The elephants had already retired.

Once animal-rights protests and rising travel costs started eroding profitability in the late 20th century, the American circus became trapped by its own history. But the success of Canada’s Cirque du Soleil, which since its 1984 debut has conquered the globe with its astounding acrobatics and staging, shows that the older European tradition introduced by Astley still has the power to inspire wonder. The future may well lie in looking backward, to the era when the stars of the show were the people in the ring.

WSJ Historically Speaking: Remembering the Pueblo: Hostages as Propaganda Tools

The Pueblo incident, involving the North Korean takeover of a spy ship, turns 50

ILLUSTRATION: THOMAS FUCHS

Fifty years ago, on Jan. 23, 1968, North Korean forces captured the U.S. Navy spy ship Pueblo in international waters. North Korea took 82 crew members hostage (one was killed in the attack) and subjected them to 11 months of sporadic torture and starvation, humiliating appearances and forced confessions before an international radio and TV audience. Communications technology had given the ancient practice of hostage-taking a whole new purpose as a tool of propaganda.

Hostages have always been a part of warfare. By the second millennium B.C., Egyptians would take the young princes of conquered states and hold them as surety for good behavior, treating the young nobles well with the aim of turning them into future allies.

The Romans admired this tactic and imitated it. But others were simply interested in money. As a young man, Julius Caesar (100-44 B.C.) was held for ransom by pirates. An ancient biographer writes that, while held captive, Caesar amused himself by reading his poems and speeches to his captors. The pirates assumed he was mad, especially when he promised to come back and hang them all. Once the ransom had been paid, the future general fulfilled his vow, hunting down the pirates and executing all of them.

During the Middle Ages, a hostage was better than money in the bank. Negotiating parties used hostages to enforce peace treaties, trade deals and even safe passage. In 1412, for instance, a French political faction sealed an alliance with the English King Henry IV. As part of the guarantee, the 12-year-old John of Orléans, Count of Angoulême, was sent to England, where he remained a political hostage for the next 32 years.

If a deal fell apart, however, retribution could be devastating. During the Third Crusade (1189–1192), King Richard I of England, known as the Lionheart, ordered the massacre of nearly 3,000 Muslim hostages after the Sultan Saladin reneged on his promise to pay a ransom and return his Christian prisoners along with relics of the True Cross.

Brutality toward hostages has been a lamentably common feature of modern warfare. The Germans showed little compunction during the Franco-Prussian War of 1870-71, when they used civilians as human shields on military trains. During World War II, amid a range of other atrocities, the Nazis killed thousands of civilian hostages across Europe, often in reprisal for earlier attacks. During one massacre in German-occupied Serbia in 1941, 100 hostages were to be shot for each dead German soldier.

The idea of hostage-taking as an end in itself is largely a 20th-century development—a way to exploit the powerful reach of mass media. The North Koreans were hardly alone. Domestic extremists also saw the propaganda value of hostages, as in the 1974 kidnapping of Patty Hearst by the Symbionese Liberation Army.

Just five years later, students supporting Iran’s Islamic revolution stormed the U.S. Embassy in Tehran and took 66 American hostages. The students had various demands, among them the extradition of the deposed shah. But their real motivation seemed to be inflicting pain on the captive Americans—who were beaten, threatened with death and paraded in blindfolds before a mob—and on the U.S. itself. There were some early releases, but 52 hostages were held under appalling conditions for 444 days.

Today, memories of the Pueblo incident and the Iran hostage crisis have faded, but both hostage-takings have had a lasting influence on American attitudes. In certain ways, they still define U.S. relations with the regimes of North Korea and Iran.

WSJ Historically Speaking: The First Fixer-Upper: A Look at White House Renovations

Rude visitors, sinking pianos and dismayed presidential residents

ILLUSTRATION: THOMAS FUCHS

This year marks the bicentennial of the public reopening of the White House after the War of 1812, when the British burned the executive mansion and sent President James Madison fleeing. Though the grand house has legions of devotees today, its occupants haven’t always loved the place.

The problems began in the 1790s, as the Founding Fathers struggled with the question of how grand a residence should be for an elected president in a popular government. Was the building to be a government office with sleeping arrangements, a private home, the people’s palace or all of the above? Frequent name changes reflected the confusion: President’s Palace, President’s House and Executive Mansion. President Theodore Roosevelt made “the White House” the official name only in 1901.


WSJ Historically Speaking: The Long and Winding Road to New Year’s

The hour, date and kind of celebration have changed century to century

With its loud TV hosts, drunken parties and awful singing, New Year’s Eve might seem to have been around forever. Yet when it comes to the timing and treatment of the holiday, our version of New Year’s—the eve and day itself—is a relatively recent tradition.

ILLUSTRATION: THOMAS FUCHS

The Babylonians celebrated New Year’s in March, when the vernal equinox—a day of equal light and darkness—takes place. To them, New Year’s was a time of pious reckoning rather than raucous partying. The Egyptians got the big parties going: Their celebration fell in line with the annual flooding of the Nile River. It was a chance to get roaring drunk for a few weeks rather than just for a few hours. The holiday’s timing, though, was nearly the opposite of ours: It fell in July.


WSJ Historically Speaking: The Ancient Magic of Mistletoe

The plant’s odyssey from a Greek festival to a role in the works of Dickens and Trollope

ILLUSTRATION: THOMAS FUCHS

Is mistletoe naughty or nice? The No. 1 hit single for Christmas 1952 was young Jimmy Boyd warbling how he caught “mommy kissing Santa Claus underneath the mistletoe last night.” It may very well have been daddy in costume—but, if not, that would make mistletoe very naughty indeed. For this plant, that would be par for the course.

Mistletoe, in its various species, is found all over the world and has played a part in fertility rituals for thousands of years. The plant’s ability to live off other trees—it’s a parasite—and remain evergreen even in the dead of winter awed the earliest agricultural societies. Mistletoe became a go-to plant for sacred rites and poetic inspiration.

Kissing under the mistletoe may have begun with the Greeks’ Kronia agricultural festival. Its Roman successor, the Saturnalia, combined licentious behavior with mistletoe. The naturalist Pliny the Elder, who died in A.D. 79, noticed to his surprise that mistletoe was just as sacred, if not more so, to the Druids of Gaul. Its growth on certain oak trees, which the Druids believed to possess magical powers, spurred them to use mistletoe in ritual sacrifices and in medicinal potions to cure ailments such as infertility.

Mistletoe’s mystical properties also earned it a starring role in the 13th-century Old Norse collection of mythical tales known as the Prose Edda. Here mistletoe becomes a deadly weapon in the form of an arrow that kills the sun-god Baldur. His mother Frigga, the goddess of love and marriage, weeps tears that turn into white mistletoe berries. In some versions, this brings Baldur back to life, carrying faint echoes of the reincarnation myths of ancient Mesopotamia. Either way, Frigga declares mistletoe to be the symbol of peace and love.

Beliefs about mistletoe’s powers managed to survive the Catholic Church’s official disapproval for all things pagan. People used the plant as a totem to scare away trolls, thwart witchcraft, prevent fires and bring about reconciliations. But such superstitions fizzled out in the wake of the Enlightenment.


WSJ Historically Speaking: Kylo Ren, Meet Huck Finn: A History of Sequels and Their Heroes

The pedigree of sequels is as old as storytelling itself

ILLUSTRATION: RUTH GWILY

“Star Wars: The Last Jedi” may end up being the most successful movie sequel in the biggest sequel-driven franchise in the history of entertainment. That’s saying something, given Hollywood’s obsession with sequels, prequels, reboots and remakes. Although this year’s “Guardians of the Galaxy Vol. 2” was arguably better than the first, plenty of people—from critics to stand-up comedians—have wondered why in the world we needed a 29th “Godzilla,” an 11th “Pink Panther” or “The Godfather Part III.”

But sequels aren’t simply about chasing the money. They have a distinguished pedigree, as old as storytelling itself. Homer gets credit for popularizing the trend in the eighth century B.C., when he followed up “The Iliad” with “The Odyssey,” in which one of the relatively minor characters in the original story triumphs over sexy immortals, scary monsters and evil suitors of his faithful wife. Presumably with an eye to drawing in fans of the “Iliad,” Homer was sure to throw in a flashback about the Trojan horse.

WSJ Historically Speaking: Short but Tasty History of Pumpkin Pie

An odyssey from colonial staple to political emblem to holiday standby

ILLUSTRATION: THOMAS FUCHS

Pumpkin pie may not compete with its apple-filled rival for most of the year, but on Thanksgiving, it’s the iconic dessert, despite often resembling a giant helping of baby food. As a slice of Americana, the pie has a history as complicated as the country itself.

The pumpkin’s ancestors were ancient gourds that left Asia some 60 million years ago. Known botanically as Cucurbitaceae, the plant family slowly spread to the African, Australian and American continents, laying down roots (and vines) to become such familiar garden goodies as the melon, the cucumber and the squash.


WSJ Historically Speaking: A History of the Unloved Electoral College

Opponents have ranged from John Adams to Richard Nixon. Why has the system survived?

PHOTO: THOMAS FUCHS

The 2016 election results caused plenty of bitterness—not the least of which had to do with the Electoral College. Donald Trump won the presidency a year ago this week but lost the popular vote—something that has happened a handful of times in the republic’s history and twice in the past two decades. In a December press conference, President Barack Obama declared the system to be past its sell-by date: “It’s a carry-over from an earlier vision of how our federal government was going to work.”

What were the Founding Fathers thinking? At the 1787 Constitutional Convention, they created a unique system for choosing the president. Each state got a number of electors based on the total of its U.S. senators (two) and U.S. representatives (as set by census). Each state legislature could decide the method of picking electors, but if the electors’ vote was inconclusive, the choice would be sent to the House of Representatives. “The original idea,” wrote Federal Election Commission official William C. Kimberling in 1992, “was for the most knowledgeable and informed individuals from each State to select the president based solely on merit and without regard to State of origin or political party.”