When Royal Love Affairs Go Wrong

From Cleopatra to Edward VIII, monarchs have followed their hearts—with disastrous results.

ILLUSTRATION: THOMAS FUCHS

“Ay me!” laments Lysander in Shakespeare’s “A Midsummer Night’s Dream.” “For aught that I could ever read, / Could ever hear by tale or history, / The course of true love never did run smooth.” What audience would disagree? Thwarted lovers are indeed the stuff of history and art—especially when the lovers are kings and queens.

But there were good reasons why the monarchs of old were not allowed to follow their hearts. Realpolitik and royal passion do not mix, as Cleopatra VII (69-30 B.C.), the anniversary of whose death falls on Aug. 12, found to her cost. Her theatrical seduction of and subsequent affair with Julius Caesar insulated Egypt from Roman imperial designs. But in 41 B.C., she let her heart rule her head and fell in love with Mark Antony, who was fighting Caesar’s adopted son Octavian for control of Rome.

Cleopatra’s demand that Antony divorce his wife Octavia—sister of Octavian—and marry her instead was a catastrophic misstep. It made Egypt the target of Octavian’s fury, and forced Cleopatra into fighting Rome on Antony’s behalf. The couple’s defeat at the sea battle of Actium in 31 B.C. didn’t only end in personal tragedy: the 300-year-old Ptolemaic dynasty was destroyed, and Egypt was reduced to a Roman province.

In Shakespeare’s play “Antony and Cleopatra,” Antony laments, “I am dying, Egypt, dying.” It is a reminder that, as Egypt’s queen, Cleopatra was the living embodiment of her country; their fates were intertwined. That is why royal marriages have usually been inseparable from international diplomacy.

In 1339, when Prince Pedro of Portugal fell in love with his wife’s Castilian lady-in-waiting, Inés de Castro, the problem wasn’t the affair per se but the opportunity it gave to neighboring Castile to meddle in Portuguese politics. In 1355, Pedro’s father, King Afonso IV, took the surest way of separating the couple—who by now had four children together—by having Inés murdered. Pedro responded by launching a bloody civil war against his father that left northern Portugal in ruins. The dozens of romantic operas and plays inspired by the tragic love story neglect to mention its political repercussions; for decades afterward, the Portuguese throne was weak and the country divided.

Perhaps no monarchy in history bears more scars from Cupid’s arrow than the British. From Edward II (1284-1327), whose poor choice of male lovers unleashed murder and mayhem on the country—he himself was allegedly killed with a red-hot poker—to Henry VIII (1491-1547), who bullied and butchered his way through six wives and destroyed England’s Catholic way of life in the process, British rulers have been remarkable for their willingness to place personal happiness above public responsibility.

Edward VIII (1894-1972) was a chip off the old block, in the worst way. The moral climate of the 1930s couldn’t accept the King of England marrying a twice-divorced American. Declaring he would have Wallis Simpson or no one, Edward plunged the country into crisis by abdicating in 1936. With European monarchies falling on every side, Britain’s suddenly looked extremely vulnerable. The current Queen’s father, King George VI, quite literally saved it from collapse.

According to a popular saying, “Everything in the world is about sex except sex. Sex is about power.” That goes double when the lovers wear royal crowns.

No more midlife crisis – I’m riding the U-curve of happiness

Evidence shows people become happier in their fifties, but achieving that takes some soul-searching

I used not to believe in the “midlife crisis”. I am ashamed to say that I thought it was a convenient excuse for self-indulgent behaviour — such as splurging on a Lamborghini or getting buttock implants. So I wasn’t even aware that I was having one until earlier this year, when my family complained that I had become miserable to be around. I didn’t shout or take to my bed, but five minutes in my company was a real downer. The closer I got to my 50th birthday, the more I radiated dissatisfaction.

Can you be simultaneously contented and discontented? The answer is yes. Surveys of “national wellbeing” in several countries, including one conducted in the UK by the Office for National Statistics, have revealed a fascinating U-curve relating happiness to age. In Britain, feelings of stress and anxiety appear to peak at 49 and subsequently fade as the years increase. Interestingly, a 2012 study showed that chimpanzees and orang-utans exhibited a similar U-curve of happiness as they reached middle age.

On a rational level, I wasn’t the least bit disappointed with my life. The troika of family, work and friends made me very happy. And yet something was eating away at my peace of mind. I regarded myself as a failure — not in terms of work but as a human being. Learning that I wasn’t alone in my daily acid bath of gloom didn’t change anything.

One of F Scott Fitzgerald’s most memorable lines is: “There are no second acts in American lives.” It’s so often quoted that it’s achieved the status of a truism. It’s often taken to be an ironic commentary on how Americans, particularly men, are so frightened of failure that they cling to the fiction that life is a perpetual first act. As I thought about the line in relation to my own life, Fitzgerald’s meaning seemed clear. First acts are about actions and opportunities. There is hope, possibility and redemption. Second acts are about reactions and consequences.

Old habits die hard, however. I couldn’t help conducting a little research into Fitzgerald’s life. What was the author of The Great Gatsby really thinking when he wrote the line? Would it even matter?

The answer turned out to be complicated. As far as the quotation goes, Fitzgerald actually wrote the reverse. The line appears in a 1935 essay entitled My Lost City, about his relationship with New York: “I once thought that there were no second acts in American lives, but there was certainly to be a second act to New York’s boom days.”

It reappeared in the notes for his Hollywood novel, The Love of the Last Tycoon, which was half finished when he died in 1940, aged 44. Whatever he had planned for his characters, the book was certainly meant to have been Fitzgerald’s literary comeback — his second act — after a decade of drunken missteps, declining book sales and failed film projects.

Fitzgerald may not have subscribed to the “It’s never too late to be what you might have been” school of thought, but he wasn’t blind to reality. Of course he believed in second acts. The world is full of middle-aged people who successfully reinvented themselves a second or even third time. The meteoric rise of Emperor Claudius (10BC to AD54) is one of the earliest historical examples of the true “second act”.

According to Suetonius, Claudius’s physical infirmities had made him the butt of scorn among his powerful family. But his lowly status saved him after the assassination of his nephew, Caligula. The plotters found the 50-year-old Claudius cowering behind a curtain. On the spur of the moment, instead of killing him, as they did Caligula’s wife and daughter, the plotters decided the stumbling and stuttering scion of the Julio-Claudian dynasty could be turned into a puppet emperor. It was a grave miscalculation. Claudius seized on his changed circumstances. The bumbling persona was dropped and, although flawed, he became a forceful and innovative ruler.

Mostly, however, it isn’t a single event that shapes life after 50 but the willingness to stay the course long after the world has turned away. It’s extraordinary how the granting of extra time can turn tragedy into triumph. In his heyday, General Mikhail Kutuzov was hailed as Russia’s greatest military leader. But by 1800 the 55-year-old was prematurely aged. Stiff-limbed, bloated and blind in one eye, Kutuzov looked more suited to play the role of the buffoon than the great general. He was Alexander I’s last choice to lead the Russian forces at the Battle of Austerlitz in 1805, but was the first to be blamed for the army’s defeat.

Kutuzov was relegated to the sidelines after Austerlitz. He remained under official disfavour until Napoleon’s army was halfway to Moscow in 1812. Only then, with the army and the aristocracy begging for his recall, did the tsar agree to his reappointment. Thus, in Russia’s hour of need it ended up being Kutuzov, the disgraced general, who saved the country.

Winston Churchill had a similar apotheosis in the Second World War. For most of the 1930s he was considered a political has-been by friends and foes alike. His elevation to prime minister in 1940 at the age of 65 changed all that, of course. But had it not been for the extraordinary circumstances created by the war, Robert Rhodes James’s Churchill: A Study in Failure, 1900-1939 would have been the epitaph rather than the prelude to the greatest chapter in his life.

It isn’t just generals and politicians who can benefit from second acts. For writers and artists, particularly women, middle age can be extremely liberating. The Booker prize-winning novelist Penelope Fitzgerald published her first book at 59 after a lifetime of teaching while supporting her children and alcoholic husband. Thereafter she wrote at a furious pace, producing nine novels and three biographies before she died at 83.

I could stop right now and end with a celebratory quote from Morituri Salutamus by the American poet Henry Wadsworth Longfellow: “For age is opportunity no less / Than youth itself, though in another dress, / And as the evening twilight fades away / The sky is filled with stars, invisible by day.”

However, that isn’t — and wasn’t — what was troubling me in the first place. I don’t think the existential anxieties of middle age are caused or cured by our careers. Sure, I could distract myself with happy thoughts about a second act where I become someone who can write a book a year rather than one a decade. But that would still leave the problem of the flesh-and-blood person I had become in reality. What to think of her? It finally dawned on me that this had been my fear all along: it doesn’t matter which act I am in; I am still me.

My funk lifted once the big day rolled around. I suspect that joining a gym and going on a regular basis had a great deal to do with it. But I had also learnt something valuable during these past few months. Worrying about who you thought you would be or what you might have been fills a void but leaves little space for anything else. It’s coming to terms with who you are right now that really matters.

 

In Awe of the Grand Canyon

Since the 16th century, travelers have recorded the overwhelming impact of a natural wonder.

ILLUSTRATION BY THOMAS FUCHS

Strange as it may sound, it was watching Geena Davis and Susan Sarandon in the tragic final scene of “Thelma & Louise” (1991) that convinced me I had to go to the Grand Canyon one day and experience its life-changing beauty. Nearly three decades have passed, but I’m finally here. Instead of a stylish 1966 Ford Thunderbird, however, I’m driving a mammoth RV, with my family in tow.

The overwhelming presence of the Grand Canyon is just as I dreamed. Yet I’m acutely aware of how one-sided the relationship is. As the Pulitzer Prize-winning poet Carl Sandburg wrote in “Many Hats” in 1928: “For each man sees himself in the Grand Canyon—each one makes his own Canyon before he comes.”

The first Europeans to encounter the Canyon were Spanish conquistadors searching for the legendary Seven Golden Cities of Cibola. In 1540, Hopi guides took a small scouting party led by García López de Cárdenas to the South Rim (60 miles north of present-day Williams, Ariz.). In Cárdenas’s mind, the Canyon was a route to riches. After trying for three days to find a path to reach the river below, he cut his losses in disgust and left. Cárdenas saw no point to the Grand Canyon if it failed to yield any treasure.

Three centuries later, in 1858, the first Euro-American to follow in Cárdenas’s footsteps, Lt. Joseph Christmas Ives of the U.S. Army Corps of Topographical Engineers, had a similar reaction. In his official report, Ives waxed lyrical about the magnificent scenery but concluded, “The region is, of course, altogether valueless….Ours has been the first, and will doubtless be the last, party of whites to visit this profitless locality.”

Americans only properly “discovered” the Grand Canyon through the works of artists such as Thomas Moran. A devotee of the Hudson River School of painters, Moran found his spiritual and artistic home in the untamed landscapes of the West. His romantic pictures awakened the public to the natural wonder in their midst. Eager to see the real thing, visitors arrived in growing numbers, and the trickle turned into a stream by the late 1880s.

The effusive reactions to the Canyon recorded by tourists who made the arduous trek from Flagstaff, Ariz. (a railway to Grand Canyon Village was only built in 1901), have become a familiar refrain: “Not for human needs was it fashioned, but for the abode of gods…. To the end it effaced me,” wrote Harriet Monroe, the founder of Poetry magazine, in 1899.

But there was one class of people who were apparently insensible to the Canyon: copper miners. Watching their thoughtless destruction of the landscape, Monroe wondered, “Do they cease to feel it?” President Theodore Roosevelt feared so, and in 1908 he made an executive decision to protect 800,000 acres from exploitation by creating the Grand Canyon National Monument.

Roosevelt’s farsightedness may have put a crimp in the profits of mining companies, but it paid dividends in other ways. By the 1950s, the Canyon had become a must-see destination, attracting visitors from all over the world. Among them were the tragic Sylvia Plath, author of “The Bell Jar,” and her husband, Ted Hughes, the future British Poet Laureate. Thirty years later, the visit to the Canyon still haunted Hughes: “I never went back and you are dead. / But at odd moments it comes, / As if for the first time.” He is not alone, I suspect, in never fully leaving the Canyon behind.

The Gym, for Millennia of Bodies and Souls

Today’s gyms, which depend on our vanity and body envy, are a far cry from what the Greeks envisioned

ILLUSTRATION: THOMAS FUCHS

Going to the gym takes on a special urgency at this time of year, as we prepare to put our bodies on display at the pool and beach. Though the desire to live a virtuous life of fitness no doubt plays its part, vanity and body envy are, I suspect, the main motivation for our seasonal exertions.

The ancient Greeks, who invented gyms (the Greek gymnasion means “school for naked exercise”), were also body-conscious, but they saw a deeper point to the sweat. No mere muscle shops, Greek gymnasia were state-sponsored institutions aimed at training young men to embody, literally, the highest ideals of Greek virtue. In Plato’s “The Republic,” Socrates says that the two branches of physical and academic education “seem to have been given by some god to men…to ensure a proper harmony between energy and initiative on the one hand and reason on the other, by tuning each to the right pitch.”

Physical competition, culminating in the Olympics, was a form of patriotic activity, and young men went to the gym to socialize, bathe and learn to think. Aristotle founded his school of philosophy at the Lyceum, a gymnasium that also offered physical training.

The Greek concept fell out of favor in the West with the rise of Christianity. The abbot St. Bernard of Clairvaux (1090–1153), who advised five popes, wrote, “The spirit flourishes more strongly…in an infirm and weak body,” neatly summing up the medieval ambivalence toward physicality.

Many centuries later, an eccentric German educator named Friedrich Jahn (1778-1852) played a key role in the gym’s revival. Convinced that Prussia’s defeat by Napoleon was due to his compatriots’ descent into physical and moral weakness, Jahn decided that a Greek-style gym would “preserve young people from laxity and…prepare them to fight for the fatherland.” In 1811, he opened a gym in Berlin for military-style physical training (not to be confused with the older German usage of the term gymnasium for the most advanced level of secondary schools).

By the mid-19th century, Europe’s upper-middle classes had sufficient wealth and leisure time to devote themselves to exercise for exercise’s sake. Hippolyte Triat opened two of the first truly commercial gyms in Brussels and Paris in the late 1840s. A retired circus strongman, he capitalized on his physique to sell his “look.”

But broader spiritual ideas still influenced the spread of physical fitness. The 19th-century movement Muscular Christianity sought to transform the working classes into healthy, patriotic Christians. One offshoot, the Young Men’s Christian Association, became famous for its low-cost gyms.

By the mid-20th century, Americans were using their gyms for two different sets of purposes. Those devoted to “manliness” worked out at places like Gold’s Gym and aimed to wow others with their physiques. The other group, “health and fitness” advocates, expanded sharply after Jack LaLanne, who founded his first gym in 1936, turned a healthy lifestyle into a salable commodity. A few decades later, Jazzercise, aerobics, disco and spandex made the gym a liberating, fashionable and sexy place.

More than 57 million Americans belong to a health club today, but until local libraries start adding spinning classes and CrossFit, the gym will remain a shadow of the original Greek ideal. We prize our sound bodies, but we aren’t nearly as devoted to developing sound mind and character.

With Big Prizes Often Comes Controversy

It’s not just the Nobel: Award-giving missteps have a long history

ILLUSTRATION: THOMAS FUCHS

This spring, controversies have engulfed three big prizes.

The Swedish Academy isn’t awarding the Nobel Prize for Literature this year while it deals with the fallout from a scandal over allegations of sexual assault and financial impropriety.

In the U.S., the author Junot Díaz has stepped down as Pulitzer Prize chairman while the board investigates allegations of sexual misconduct. In a statement through his literary agent earlier this month, Mr. Díaz did not address individual accusations but said in part, “I take responsibility for my past.” Finally, the organizers of the Echo, Germany’s version of the Grammys, said they would no longer bestow the awards after one of this year’s prizes went to rappers who used anti-Semitic words and images in their lyrics and videos.

Prize-giving controversies—some more serious than others—go back millennia. I know something about prizes, having served as chairwoman of the literary Man Booker Prize jury.

The ancient Greeks gave us the concept of the arts prize. To avoid jury corruption in the drama competitions at the Festival of Dionysus, the Athenians devised a complicated system of votes and lotteries that is still not entirely understood today. Looking back now, the quality of the judging seems questionable. Euripides, the greatest tragedian of classical Greece, habitually challenged his society’s assumptions in tragedies like “Medea,” which sympathetically portrayed female desperation in a society where men ruled absolutely. In a three-way competition, “Medea,” which still holds the stage today, placed last.

Controversy surrounding a competition can be a revitalizing force—especially when the powers that be support the dissidents. By the 1860s, France’s Academy of Fine Arts, the defender of official taste, was growing increasingly out of touch with contemporary art. In 1863, the jury of the prestigious annual Salon exhibition, which the academy controlled, rejected artists such as Paul Cézanne, Camille Pissarro and Édouard Manet.

The furor caused Emperor Napoleon III to order a special exhibition, the Salon des Refusés (“Salon of the Rejects”), to “let the public judge” who was right. The public was divided, but the artists felt emboldened, and many scholars regard 1863 as the birthdate of modern painting. The Academy ultimately relinquished its control of the Salon in 1881. Its time was over.

At other times, controversies over prizes are more flash than substance. As antigovernment student protests swept Paris and many other places in 1968, a group of filmmakers tried to show solidarity with the protesters by shutting down the venerable Cannes Film Festival. At one point, directors hung from a curtain to prevent a film from starting. The festival was canceled but returned in 1969 without the revolutionary changes some critics were hoping for.

In contrast, a recent dispute at the festival over its refusal to admit to its competition Netflix films that bypass French theaters for streaming was relatively quiet, but it reflects a serious power struggle between streaming services and theatrical movie distributors.

As the summer approaches and the beleaguered festivals around the world take a breather, here’s some advice from a survivor of the prize process: Use this time to reflect and revive.

The Serendipity of Science Is Often Born of Years of Labor

Over the centuries, lucky discoveries have depended on training and discernment

ILLUSTRATION: THOMAS FUCHS

One recent example comes from an international scientific team studying the bacterium Ideonella sakaiensis 201-F6, which makes an enzyme that breaks down the most commonly used form of plastic, thus allowing the bacterium to eat it. As reported last month in the Proceedings of the National Academy of Sciences, in the course of their research the scientists accidentally created an enzyme even better at dissolving the plastic. It’s still early days, but we may have moved a step closer to solving the man-made scourge of plastics pollution.

The development illustrates a truth about seemingly serendipitous discoveries: The “serendipity” part is usually the result of years of experimentation—and failure. A new book by two business professors at Wharton and a biology professor, “Managing Discovery in the Life Sciences,” argues that governments and pharmaceutical companies should adopt more flexible funding requirements—otherwise innovation and creativity could end up stifled by the drive for quick, concrete results. As one of the authors, Philip Rea, argues, serendipity means “getting answers to questions that were never posed.”

So much depends on who has observed the accident, too. As Joseph Henry, the first head of the Smithsonian Institution, said, “The seeds of great discoveries are constantly floating around us, but they only take root in minds well prepared to receive them.”

One famously lucky meeting of perception and conception happened in 1666, when Isaac Newton observed an apple fall from the tree. (The details remain hazy, but there’s no evidence that the fruit actually hit him, as legend has it.) Newton had seen apples fall before, of course, but this time the sight inspired him to ask questions about gravity’s relationship to the rules of motion that he was contemplating. Still, it took Newton another 20 years of work before he published his Law of Universal Gravitation.

Bad weather was the catalyst for another revelation, leading to physicist Henri Becquerel’s discovery of radioactivity in 1896. Unable to continue his photographic X-ray experiments on the effect of sunlight on uranium salts, Becquerel put the plates in a drawer. Incredibly, they developed images without any exposure to light. Realizing that he had been pursuing the wrong question, Becquerel started again, this time focusing on uranium itself as a radiation emitter.

As for inventions, accident and inadvertence played a role in the development of Post-it Notes and microwave heating. During the 1990s, Viagra failed miserably in trials as a treatment for angina, but alert researchers at Pfizer realized that one of the side effects could have global appeal.

The most famous accidental medical discovery is antibiotics. The biologist Alexander Fleming discovered penicillin in 1928 after he went on vacation, leaving a petri dish of bacteria out in the laboratory. On his return, the dish had developed mold, with a clean area around it. Fleming realized that something in the mold must have killed off the bacteria.

That ability to ask the right questions can be more important than knowing the right answers. Funders of science should take note.

 

How Mermaid-Merman Tales Got to This Year’s Oscars

ILLUSTRATION: DANIEL ZALKUS

‘The Shape of Water,’ the best-picture winner, extends a tradition of ancient tales of these water creatures and their dealings with humans

Popular culture is enamored with mermaids. This year’s Best Picture Oscar winner, Guillermo del Toro’s “The Shape of Water,” about a lonely mute woman and a captured amphibious man, is a new take on an old theme. “The Little Mermaid,” Disney’s enormously successful 1989 animated film, was based on the Hans Christian Andersen story of the same name; it was later turned into a Broadway musical that is still being staged across the country.

The fascination with mermythology began with the ancient Greeks. In the beginning, mermen were few and far between. As for mermaids, they were simply members of a large chorus of female sea creatures that included the benign Nereids, the sea-nymph daughters of the sea god Nereus, and the Sirens, whose singing led sailors to their doom—a fate Odysseus barely escapes in Homer’s epic “The Odyssey.”

Over the centuries, the innocuous mermaid became interchangeable with the deadly sirens. They led Scottish sailors to their deaths in one of the variations of the anonymous poem “Sir Patrick Spens,” probably written in the 15th century: “Then up it raise the mermaiden, / Wi the comb an glass in her hand: / ‘Here’s a health to you, my merrie young men, / For you never will see dry land.’”

In pictures, mermaids endlessly combed their hair while sitting semi-naked on the rocks, lying in wait for seafarers. During the Elizabethan era, a “mermaid” was a euphemism for a prostitute. Poets and artists used them to link feminine sexuality with eternal damnation.

But in other tales, the original, more innocent idea of a mermaid persisted. Andersen’s 1837 story followed an old literary tradition of a “virtuous” mermaid hoping to redeem herself through human love.

Andersen purposely broke with the old tales. As he acknowledged to a friend, his fishy heroine would “follow a more natural, more divine path” that depended on her own actions rather than that of “an alien creature.” Egged on by her sisters to murder the prince whom she loves and return to her mermaid existence, she chooses death instead—a sacrifice that earns her the right to a soul, something that mermaids were said to lack.

Richard Wagner’s version of mermaids—the Rhine maidens who guard the treasure of “Das Rheingold”—also bucked the “temptress” cliché. While these maidens could be cruel, they gave valuable advice later in the “Ring” cycle.

The cultural rehabilitation of mermaids gained steam in the 20th century. In T.S. Eliot’s 1915 poem, “The Love Song of J. Alfred Prufrock,” their erotic power becomes a symbol of release from stifling respectability. The sad protagonist laments, “I have heard the mermaids singing, each to each. / I do not think that they will sing to me.” By 1984, when a gorgeous mermaid (Daryl Hannah) fell in love with a nerdy man (Tom Hanks) in the film comedy “Splash,” audiences were ready to accept that mermaids might offer a liberating alternative to society’s hang-ups, and that the obstacle to perfect happiness is not female sexuality but humans themselves.

What makes “The Shape of Water” unusual is that a scaly male, not a sexy mermaid, is the object of affection to be rescued. Andersen probably wouldn’t recognize his Little Mermaid in Mr. del Toro’s nameless, male amphibian, yet the two tales are mirror images of the same fantasy: Love conquers all.

In Epidemics, Leaders Play a Crucial Role

ILLUSTRATION: JON KRAUSE

Lessons in heroism and horror as a famed flu pandemic hits a milestone

A century ago this week, an army cook named Albert Gitchell at Fort Riley, Kansas, paid a visit to the camp infirmary, complaining of a severe cold. It’s now thought that he was America’s patient zero in the Spanish Flu pandemic of 1918.

The disease killed more than 40 million people world-wide, including 675,000 Americans. In this case, as in so many others throughout history, the pace of the pandemic’s deadly progress depended on the actions of public officials.

Spain had allowed unrestricted reporting about the flu, so people mistakenly believed it originated there. Other countries, including the U.S., squandered thousands of lives by suppressing news and delaying health measures. Chicago kept its schools open, citing a state commission that had declared the epidemic at a “standstill,” while the city’s public health commissioner said, “It is our duty to keep the people from fear. Worry kills more people than the epidemic.”

Worry had indeed sown chaos, misery and violence in many previous outbreaks, such as the Black Plague. The disease, probably caused by bacteria-infected fleas living on rodents, swept through Asia and Europe during the 1340s, killing up to a quarter of the world’s population. In Europe, where over 50 million died, a search for scapegoats led to widespread pogroms against Jews. In 1349, the city of Strasbourg (in modern-day France), already somewhat affected by the plague, put to death hundreds of Jews and expelled the rest.

But not all authorities lost their heads at the first sign of contagion. Pope Clement VI (1291-1352), one of a series of popes who ruled from the southern French city of Avignon, declared that the Jews had not caused the plague and issued two papal bulls against their persecution.

In Italy, Venetian authorities took the practical approach: They didn’t allow ships from infected ports to dock and subjected all travelers to a period of isolation. The term quarantine comes from the Italian quaranta giorni, meaning “40 days”—the official length of time until the Venetians granted foreign ships the right of entry.

Less exalted rulers could also show prudence and compassion in the face of a pandemic. After plague struck the village of Eyam in England in 1665, the vicar William Mompesson persuaded its several hundred inhabitants not to flee, to prevent the disease from spreading to other villages. The biggest landowner in the county, the earl of Devonshire, ensured a regular supply of food and necessities to the stricken community. Some 260 villagers died during their self-imposed quarantine, but their decision likely saved thousands of lives.

The response to more recent pandemics has not always met that same high standard. When viral severe acute respiratory syndrome (SARS) began in China in November 2002, the government’s refusal to acknowledge the outbreak allowed the disease to spread to Hong Kong, a hub for the West and much of Asia, thus creating a world problem. On a more hopeful note, when Ebola was spreading uncontrollably through West Africa in 2014, the Ugandans leapt into action, saturating their media with warnings and enabling quick reporting of suspected cases, and successfully contained their outbreak.

Pandemics always create a sense of crisis. History shows that public leadership is the most powerful weapon in keeping them from becoming full-blown tragedies.

The Quest for Unconsciousness: A Brief History of Anesthesia

The ancient Greeks used alcohol and opium. Patients in the 12th century got a ‘soporific sponge.’ A look at anesthetics over the centuries

ILLUSTRATION: ELLEN WEINSTEIN

Every year, some 21 million Americans undergo a general anesthetic. During recent minor surgery, I became one of the roughly 26,000 Americans a year who experience “anesthetic awareness” during sedation: I woke up. I still can’t say which was more disturbing: being conscious or seeing the horrified faces of the doctors and nurses.

The best explanation my doctors could give was that not all brains react in the same way to a general anesthetic. Redheads, for example, seem to require higher dosages than brunettes. While not exactly reassuring, this explanation does highlight one of the many mysteries behind the science of anesthesia.

Although being asleep and being unconscious might look the same, they are very different states. Until the mid-19th century, a medically induced deep unconsciousness was beyond the reach of science. Healers had no reliable way to control, let alone eliminate, a patient’s awareness or pain during surgery, though not for lack of trying.

The ancient Greeks generally relied on alcohol, opium or mandrake root to sedate patients. Evidence from the “Sushruta Samhita,” an ancient Sanskrit medical text, suggests that Indian healers used cannabis incense. The Chinese developed acupuncture at some point before 100 B.C., and in Central and South America, shamans used the spit from chewed coca leaves as a numbing balm.

Little changed over the centuries. In the 12th century, Nicholas of Salerno recorded in a treatise the recipe for a “soporific sponge” with ingredients that hadn’t advanced much beyond the medicines used by the Greeks: a mixture of opium, mulberry juice, lettuce seed, mandrake, ivy and hemlock.

Discoveries came but weren’t exploited. In 1540, the Swiss alchemist and physician Paracelsus (aka Theophrastus Bombastus von Hohenheim) noted that liquid ether could induce sleep in animals. In 1772, the English chemist Joseph Priestley discovered nitrous oxide gas (laughing gas). Using it became the thing to do at parties—in 1799, the poet Coleridge described trying the gas—but apparently no one tried using ether or nitrous oxide for medical purposes.

In 1811, the novelist Fanny Burney had no recourse when she went under the knife for suspected breast cancer. She wrote later, “O Heaven!—I then felt the Knife rackling against the breast bone—scraping it!”

Despite the ordeal, Burney lived into her 80s, dying in 1840—just before everything changed. Ether, nitrous oxide and later chloroform soon became common in operating theaters. On Oct. 16, 1846, a young dentist from Boston named William Morton made history by administering ether to a patient undergoing surgery. It was such a success that, a few months later, Frances Appleton Longfellow, wife of Henry Wadsworth Longfellow, became the first American to receive anesthesia during childbirth.

But these wonder drugs were lethal if not administered properly. A German study compiled in 1934 estimated that the number of chloroform-related deaths was as high as 1 in 3,000 operations. The drive for safer drugs produced such breakthroughs as halothane, a safer inhaled anesthetic introduced in the 1950s.

Yet for all the continuous advances in anesthesia, scientists still don’t entirely understand how it works. A study published in the December 2017 issue of Annals of Botany reveals that anesthetics can also stop motion in plants like the Venus flytrap—which, as far as we know, doesn’t have a brain. Clearly, we still have a lot to learn about consciousness in every form.

Life Beyond the Three-Ring Circus

Why ‘The Greatest Show on Earth’ foundered—and what’s next

ILLUSTRATION: THOMAS FUCHS

The modern circus, which celebrates its 250th anniversary this year, has attracted such famous fans as Queen Victoria, Charles Dickens and Ernest Hemingway, who wrote in 1953, “It’s the only spectacle I know that, while you watch it, gives the quality of a truly happy dream.”

Recently, however, the “happy dream” has struggled with lawsuits, high-profile bankruptcies and killer clown scares inspired in part by the evil Pennywise in Stephen King’s “It.” Even the new Hugh Jackman-led circus film, “The Greatest Showman,” comes with an ironic twist. The surprise hit—about the legendary impresario P.T. Barnum, co-founder of “The Greatest Show on Earth”—arrives on the heels of last year’s closing of the actual Ringling Bros. and Barnum & Bailey Circus, after 146 years in business.

The word circus is Latin, but Roman and modern circuses do not share the same roots. Rome’s giant Circus Maximus, which could hold some 150,000 people, was more of a sporting arena than a theatrical venue, built to hold races, athletic competitions and executions. The Roman satirist Juvenal was alluding to the popular appeal of such spectacles when he coined the phrase “bread and circuses,” assailing citizens’ lack of interest in politics.

In fact, the entertainments commonly linked with the modern circus—acrobatics, animal performances and pantomimes—belong to traditions long predating the Romans. Four-millennia-old Egyptian paintings show female jugglers; in China, archaeologists have found 2,000-year-old clay figurines of tumblers.

Circus-type entertainments could be hideously violent: In 17th-century Britain, dogs tore into bears and chimpanzees. A humane change of pace came in 1768, when Philip Astley, often called the father of the modern circus, put on his first show in London, in a simple horse-riding ring. He found that a circle 42 feet in diameter was ideal for using centrifugal force as an aid in balancing on a horse’s back while doing tricks. It’s a size still used today. Between the horse shows, he scheduled clowning and tumbling acts.

Circuses in fledgling America, with its long distances, shortage of venues and lack of large cities, found the European model too static and costly. In 1808, Hachaliah Bailey took the circus in a new direction by making animals the real stars, particularly an African elephant named Old Bet. The focus on animal spectacles became the American model, while Europeans still emphasized human performers.

When railroads spread across America, circuses could ship their menageries nationwide. Already famous for his museums and “freak shows,” P.T. Barnum and his partners joined forces with rivals and used special circus trains to create the largest circus in the country. Although Barnum played up the animal and human oddities in his “sideshow,” the marquee attraction was Jumbo the Elephant. In its final year, the Ringling Bros. animal contingent, according to a news report, included tigers, camels, horses, kangaroos and snakes. The elephants had already retired.

Once animal-rights protests and rising travel costs started eroding profitability in the late 20th century, the American circus became trapped by its own history. But the success of Canada’s Cirque du Soleil, which since its 1984 debut has conquered the globe with its astounding acrobatics and staging, shows that the older European tradition introduced by Astley still has the power to inspire wonder. The future may well lie in looking backward, to the era when the stars of the show were the people in the ring.