When Royal Love Affairs Go Wrong

From Cleopatra to Edward VIII, monarchs have followed their hearts—with disastrous results.

ILLUSTRATION: THOMAS FUCHS

“Ay me!” laments Lysander in Shakespeare’s “A Midsummer Night’s Dream.” “For aught that I could ever read, / Could ever hear by tale or history, / The course of true love never did run smooth.” What audience would disagree? Thwarted lovers are indeed the stuff of history and art—especially when the lovers are kings and queens.

But there were good reasons why the monarchs of old were not allowed to follow their hearts. Realpolitik and royal passion do not mix, as Cleopatra VII (69-30 B.C.), the anniversary of whose death falls on Aug. 12, found to her cost. Her theatrical seduction of and subsequent affair with Julius Caesar insulated Egypt from Roman imperial designs. But in 41 B.C., she let her heart rule her head and fell in love with Mark Antony, who was vying with Caesar’s adopted son and heir, Octavian, for control of Rome.

Cleopatra’s demand that Antony divorce his wife Octavia—sister of Octavian—and marry her instead was a catastrophic misstep. It made Egypt the target of Octavian’s fury and forced Cleopatra to fight Rome on Antony’s behalf. The couple’s defeat at the sea battle of Actium in 31 B.C. ended in more than personal tragedy: the 300-year-old Ptolemaic dynasty was destroyed, and Egypt was reduced to a Roman province.

In Shakespeare’s play “Antony and Cleopatra,” Antony laments, “I am dying, Egypt, dying.” It is a reminder that, as Egypt’s queen, Cleopatra was the living embodiment of her country; their fates were intertwined. That is why royal marriages have usually been inseparable from international diplomacy.

In 1339, when Prince Pedro of Portugal fell in love with his wife’s Castilian lady-in-waiting, Inés de Castro, the problem wasn’t the affair per se but the opportunity it gave to neighboring Castile to meddle in Portuguese politics. In 1355, Pedro’s father, King Afonso IV, took the surest way of separating the couple—who by now had four children together—by having Inés murdered. Pedro responded by launching a bloody civil war against his father that left northern Portugal in ruins. The dozens of romantic operas and plays inspired by the tragic love story neglect to mention its political repercussions; for decades afterward, the Portuguese throne was weak and the country divided.

Perhaps no monarchy in history bears more scars from Cupid’s arrow than the British. From Edward II (1284-1327), whose poor choice of male lovers unleashed murder and mayhem on the country—he himself was allegedly killed with a red-hot poker—to Henry VIII (1491-1547), who bullied and butchered his way through six wives and destroyed England’s Catholic way of life in the process, British rulers have been remarkable for their willingness to place personal happiness above public responsibility.

Edward VIII (1894-1972) was a chip off the old block, in the worst way. The moral climate of the 1930s couldn’t accept the King of England marrying a twice-divorced American. Declaring he would have Wallis Simpson or no one, Edward plunged the country into crisis by abdicating in 1936. With European monarchies falling on every side, Britain’s suddenly looked extremely vulnerable. The current Queen’s father, King George VI, quite literally saved it from collapse.

According to a popular saying, “Everything in the world is about sex except sex. Sex is about power.” That goes double when the lovers wear royal crowns.

In Awe of the Grand Canyon

Since the 16th century, travelers have recorded the overwhelming impact of a natural wonder.

ILLUSTRATION: THOMAS FUCHS

Strange as it may sound, it was watching Geena Davis and Susan Sarandon in the tragic final scene of “Thelma & Louise” (1991) that convinced me I had to go to the Grand Canyon one day and experience its life-changing beauty. Nearly three decades have passed, but I’m finally here. Instead of a stylish 1966 Ford Thunderbird, however, I’m driving a mammoth RV, with my family in tow.

The overwhelming presence of the Grand Canyon is just as I dreamed. Yet I’m acutely aware of how one-sided the relationship is. As the Pulitzer Prize-winning poet Carl Sandburg wrote in “Many Hats” in 1928: “For each man sees himself in the Grand Canyon—each one makes his own Canyon before he comes.”

The first Europeans to encounter the Canyon were Spanish conquistadors searching for the legendary Seven Golden Cities of Cibola. In 1540, Hopi guides took a small scouting party led by García López de Cárdenas to the South Rim (60 miles north of present-day Williams, Ariz.). In Cárdenas’s mind, the Canyon was a route to riches. After trying for three days to find a path to reach the river below, he cut his losses in disgust and left. Cárdenas saw no point to the Grand Canyon if it failed to yield any treasure.

Three centuries later, in 1858, the first Euro-American to follow in Cárdenas’s footsteps, Lt. Joseph Christmas Ives of the U.S. Army Corps of Topographical Engineers, had a similar reaction. In his official report, Ives waxed lyrical about the magnificent scenery but concluded, “The region is, of course, altogether valueless…. Ours has been the first, and will doubtless be the last, party of whites to visit this profitless locality.”

Americans only properly “discovered” the Grand Canyon through the works of artists such as Thomas Moran. A devotee of the Hudson River School of painters, Moran found his spiritual and artistic home in the untamed landscapes of the West. His romantic pictures awakened the public to the natural wonder in their midst, and as people grew eager to see the real thing, the trickle of visitors turned into a stream by the late 1880s.

The effusive reactions to the Canyon recorded by tourists who made the arduous trek from Flagstaff, Ariz. (a railway to Grand Canyon Village was only built in 1901), have become a familiar refrain: “Not for human needs was it fashioned, but for the abode of gods…. To the end it effaced me,” wrote Harriet Monroe, the future founder of Poetry magazine, in 1899.

But there was one class of people who were apparently insensible to the Canyon: copper miners. Watching their thoughtless destruction of the landscape, Monroe wondered, “Do they cease to feel it?” President Theodore Roosevelt feared so, and in 1908 he made an executive decision to protect 800,000 acres from exploitation by creating the Grand Canyon National Monument.

Roosevelt’s farsightedness may have put a crimp in the profits of mining companies, but it paid dividends in other ways. By the 1950s, the Canyon had become a must-see destination, attracting visitors from all over the world. Among them were the tragic Sylvia Plath, author of “The Bell Jar,” and her husband, Ted Hughes, the future British Poet Laureate. Thirty years later, the visit to the Canyon still haunted Hughes: “I never went back and you are dead. / But at odd moments it comes, / As if for the first time.” He is not alone, I suspect, in never fully leaving the Canyon behind.

The Gym, for Millennia of Bodies and Souls

Today’s gyms, which depend on our vanity and body envy, are a far cry from what the Greeks envisioned

ILLUSTRATION: THOMAS FUCHS

Going to the gym takes on a special urgency at this time of year, as we prepare to put our bodies on display at the pool and beach. Though the desire to live a virtuous life of fitness no doubt plays its part, vanity and body envy are, I suspect, the main motivation for our seasonal exertions.

The ancient Greeks, who invented gyms (the Greek gymnasion means “school for naked exercise”), were also body-conscious, but they saw a deeper point to the sweat. No mere muscle shops, Greek gymnasia were state-sponsored institutions aimed at training young men to embody, literally, the highest ideals of Greek virtue. In Plato’s “The Republic,” Socrates says that the two branches of physical and academic education “seem to have been given by some god to men…to ensure a proper harmony between energy and initiative on the one hand and reason on the other, by tuning each to the right pitch.”

Physical competition, culminating in the Olympics, was a form of patriotic activity, and young men went to the gym to socialize, bathe and learn to think. Aristotle founded his school of philosophy at the Lyceum, a gymnasium where physical training took place alongside teaching.

The Greek concept fell out of favor in the West with the rise of Christianity. The abbot St. Bernard of Clairvaux (1090-1153), who advised five popes, wrote, “The spirit flourishes more strongly…in an infirm and weak body,” neatly summing up the medieval ambivalence toward physicality.

Many centuries later, an eccentric German educator named Friedrich Jahn (1778-1852) played a key role in the gym’s revival. Convinced that Prussia’s defeat by Napoleon was due to his compatriots’ descent into physical and moral weakness, Jahn decided that a Greek-style gym would “preserve young people from laxity and…prepare them to fight for the fatherland.” In 1811, he opened a gym in Berlin for military-style physical training (not to be confused with the older German usage of the term gymnasium for the most advanced level of secondary schools).

By the mid-19th century, Europe’s upper-middle classes had sufficient wealth and leisure time to devote themselves to exercise for exercise’s sake. Hippolyte Triat opened two of the first truly commercial gyms in Brussels and Paris in the late 1840s. A retired circus strongman, he capitalized on his physique to sell his “look.”

But broader spiritual ideas still influenced the spread of physical fitness. The 19th-century movement Muscular Christianity sought to transform the working classes into healthy, patriotic Christians. One offshoot, the Young Men’s Christian Association, became famous for its low-cost gyms.

By the mid-20th century, Americans were using their gyms for two different sets of purposes. Those devoted to “manliness” worked out at places like Gold’s Gym and aimed to wow others with their physiques. The other group, “health and fitness” advocates, expanded sharply after Jack LaLanne, who founded his first gym in 1936, turned a healthy lifestyle into a salable commodity. A few decades later, Jazzercise, aerobics, disco and spandex made the gym a liberating, fashionable and sexy place.

More than 57 million Americans belong to a health club today, but until local libraries start adding spinning classes and CrossFit, the gym will remain a shadow of the original Greek ideal. We prize our sound bodies, but we aren’t nearly as devoted to developing sound mind and character.

The Quest for Unconsciousness: A Brief History of Anesthesia

The ancient Greeks used alcohol and opium. Patients in the 12th century got a ‘soporific sponge.’ A look at anesthetics over the centuries

ILLUSTRATION: ELLEN WEINSTEIN

Every year, some 21 million Americans undergo a general anesthetic. During recent minor surgery, I became one of the roughly 26,000 Americans a year who experience “anesthetic awareness” during sedation: I woke up. I still can’t say what was more disturbing: being conscious or seeing the horrified faces of the doctors and nurses.

The best explanation my doctors could give was that not all brains react in the same way to a general anesthetic. Redheads, for example, seem to require higher dosages than brunettes. While not exactly reassuring, this explanation does highlight one of the many mysteries behind the science of anesthesia.

Although being asleep and being unconscious might look the same, they are very different states. Until the mid-19th century, a medically induced deep unconsciousness was beyond the reach of science. Healers had no reliable way to control, let alone eliminate, a patient’s awareness or pain during surgery, though not for lack of trying.

The ancient Greeks generally relied on alcohol, poppy opium or mandrake root to sedate patients. Evidence from the “Sushruta Samhita,” an ancient Sanskrit medical text, suggests that Indian healers used cannabis incense. The Chinese developed acupuncture at some point before 100 B.C., and in Central and South America, shamans used the spit from chewed coca leaves as a numbing balm.

Little changed over the centuries. In the 12th century, Nicholas of Salerno recorded in a treatise the recipe for a “soporific sponge” with ingredients that hadn’t advanced much beyond the medicines used by the Greeks: a mixture of opium, mulberry juice, lettuce seed, mandrake, ivy and hemlock.

Discoveries came but weren’t exploited. In 1540, the Swiss alchemist and astrologer Paracelsus (aka Theophrastus Bombastus von Hohenheim) noted that liquid ether could induce sleep in animals. In 1772, the English chemist Joseph Priestley discovered nitrous oxide (laughing gas). Inhaling it became the thing to do at parties—in 1799, the poet Coleridge described trying the gas—but apparently no one tried using ether or nitrous oxide for medical purposes.

In 1811, the novelist Fanny Burney had no recourse when she went under the knife for suspected breast cancer. She wrote later, “O Heaven!—I then felt the Knife rackling against the breast bone—scraping it!”

Despite the ordeal, Burney lived into her 80s, dying in 1840—just before everything changed. Ether, nitrous oxide and later chloroform soon became common in operating theaters. On Oct. 16, 1846, a young Boston dentist named William Morton made history by publicly administering ether to a patient undergoing surgery. It was such a success that, a few months later, Frances Appleton Longfellow, wife of Henry Wadsworth Longfellow, became the first American to receive anesthesia during childbirth.

But these wonder drugs were lethal if not administered properly. A German study compiled in 1934 estimated that chloroform-related deaths ran as high as 1 in 3,000 operations. The drive for safer drugs produced such breakthroughs as halothane, a far safer inhaled anesthetic introduced in the mid-1950s.

Yet for all the continuous advances in anesthesia, scientists still don’t entirely understand how it works. A study published in the December 2017 issue of Annals of Botany reveals that anesthetics can also stop motion in plants like the Venus flytrap—which, as far as we know, doesn’t have a brain. Clearly, we still have a lot to learn about consciousness in every form.

Life Beyond the Three-Ring Circus

Why ‘The Greatest Show on Earth’ foundered—and what’s next

ILLUSTRATION: THOMAS FUCHS

The modern circus, which celebrates its 250th anniversary this year, has attracted such famous fans as Queen Victoria, Charles Dickens and Ernest Hemingway, who wrote in 1953, “It’s the only spectacle I know that, while you watch it, gives the quality of a truly happy dream.”

Recently, however, the “happy dream” has struggled with lawsuits, high-profile bankruptcies and killer-clown scares inspired in part by the evil Pennywise in Stephen King’s “It.” Even the new Hugh Jackman-led circus film, “The Greatest Showman,” comes with an ironic twist. The surprise hit—about the legendary impresario P.T. Barnum, co-founder of “The Greatest Show on Earth”—arrives on the heels of last year’s closing of the actual Ringling Bros. and Barnum & Bailey Circus, after 146 years in business.

The word circus is Roman, but Roman and modern circuses do not share the same roots. Rome’s giant Circus Maximus, which could hold some 150,000 people, was more of a sporting arena than a theatrical venue, built to hold races, athletic competitions and executions. The Roman satirist Juvenal was alluding to the popular appeal of such spectacles when he coined the phrase “bread and circuses,” assailing citizens’ lack of interest in politics.

In fact, the entertainments commonly linked with the modern circus—acrobatics, animal performances and pantomimes—belong to traditions long predating the Romans. Four-millennia-old Egyptian paintings show female jugglers; in China, archaeologists have found 2,000-year-old clay figurines of tumblers.

Circus-type entertainments could be hideously violent: In 17th-century Britain, dogs tore into bears and chimpanzees. A humane change of pace came in 1768, when Philip Astley, often called the father of the modern circus, put on his first show in London, in a simple horse-riding ring. He found that a circle 42 feet in diameter was ideal for using centrifugal force as an aid in balancing on a horse’s back while doing tricks. It’s a size still used today. Between the horse shows, he scheduled clowning and tumbling acts.

Circuses in fledgling America, with its long distances, shortage of venues and lack of large cities, found the European model too static and costly. In 1808, Hachaliah Bailey took the circus in a new direction by making animals the real stars, particularly an African elephant named Old Bet. The focus on animal spectacles became the American model, while Europeans still emphasized human performers.

When railroads spread across America, circuses could ship their menageries. P.T. Barnum, already famous for his museums and “freak shows,” joined forces with rivals and used special circus trains to create the largest circus in the country. Although Barnum played up the animal and human oddities in his “sideshow,” the marquee attraction was Jumbo the Elephant. In its final year, the Ringling Bros. animal contingent, according to a news report, included tigers, camels, horses, kangaroos and snakes. The elephants had already retired.

Once animal-rights protests and rising travel costs started eroding profitability in the late 20th century, the American circus became trapped by its own history. But the success of Canada’s Cirque du Soleil, which since its 1984 debut has conquered the globe with its astounding acrobatics and staging, shows that the older European tradition introduced by Astley still has the power to inspire wonder. The future may well lie in looking backward, to the era when the stars of the show were the people in the ring.

The First Fixer-Upper: A Look at White House Renovations

Rude visitors, sinking pianos and dismayed presidential residents

ILLUSTRATION: THOMAS FUCHS

This year marks the bicentennial of the public reopening of the White House after the War of 1812, when the British burned the executive mansion and sent President James Madison fleeing. Though the grand house has legions of devotees today, its occupants haven’t always loved the place.

The problems began in the 1790s, as the Founding Fathers struggled with the question of how grand a residence should be for an elected president in a popular government. Was the building to be a government office with sleeping arrangements, a private home, the people’s palace or all of the above? Frequent name changes reflected the confusion: President’s Palace, President’s House and Executive Mansion. Only in 1901 did President Theodore Roosevelt make “the White House” its official name.


The Long and Winding Road to New Year’s

The hour, date and kind of celebration have changed century to century

With its loud TV hosts, drunken parties and awful singing, New Year’s Eve might seem to have been around forever. Yet when it comes to the timing and treatment of the holiday, our version of New Year’s—the eve and day itself—is a relatively recent tradition.

ILLUSTRATION: THOMAS FUCHS

The Babylonians celebrated New Year’s in March, when the vernal equinox—a day of equal light and darkness—takes place. To them, New Year’s was a time of pious reckoning rather than raucous partying. The Egyptians got the big parties going: Their celebration fell in line with the annual flooding of the Nile River. It was a chance to get roaring drunk for a few weeks rather than just for a few hours. The holiday’s timing, though, was nearly the opposite of ours: it fell in July.


The Ancient Magic of Mistletoe

The plant’s odyssey from a Greek festival to a role in the works of Dickens and Trollope

ILLUSTRATION: THOMAS FUCHS

Is mistletoe naughty or nice? The No. 1 hit single for Christmas 1952 was young Jimmy Boyd warbling how he caught “mommy kissing Santa Claus underneath the mistletoe last night.” It may very well have been daddy in costume—but, if not, that would make mistletoe very naughty indeed. For this plant, that would be par for the course.

Mistletoe, in its various species, is found all over the world and has played a part in fertility rituals for thousands of years. The plant’s ability to live off other trees—it’s a parasite—and remain evergreen even in the dead of winter awed the earliest agricultural societies. Mistletoe became a go-to plant for sacred rites and poetic inspiration.

Kissing under the mistletoe may have begun with the Greeks’ Kronia agricultural festival. Its Roman successor, the Saturnalia, combined licentious behavior with mistletoe. The naturalist Pliny the Elder, who died in A.D. 79, noticed to his surprise that mistletoe was just as sacred, if not more so, to the Druids of Gaul. Its growth on certain oak trees, which the Druids believed to possess magical powers, spurred them to use mistletoe in ritual sacrifices and in medicinal potions to cure ailments such as infertility.

Mistletoe’s mystical properties also earned it a starring role in the 13th-century Old Norse collection of mythical tales known as the Prose Edda. Here mistletoe becomes a deadly weapon in the form of an arrow that kills the sun-god Baldur. His mother Frigga, the goddess of love and marriage, weeps tears that turn into white mistletoe berries. In some versions, this brings Baldur back to life, carrying faint echoes of the reincarnation myths of ancient Mesopotamia. Either way, Frigga declares mistletoe to be the symbol of peace and love.

Beliefs about mistletoe’s powers managed to survive the Catholic Church’s official disapproval of all things pagan. People used the plant as a totem to scare away trolls, thwart witchcraft, prevent fires and bring about reconciliations. But such superstitions fizzled out in the wake of the Enlightenment.


Kylo Ren, Meet Huck Finn: A History of Sequels and Their Heroes

The pedigree of sequels is as old as storytelling itself

ILLUSTRATION: RUTH GWILY

“Star Wars: The Last Jedi” may end up being the most successful movie sequel in the biggest sequel-driven franchise in the history of entertainment. That’s saying something, given Hollywood’s obsession with sequels, prequels, reboots and remakes. Although this year’s “Guardians of the Galaxy Vol. 2” was arguably better than the first, plenty of people—from critics to stand-up comedians—have wondered why in the world we needed a 29th “Godzilla,” an 11th “Pink Panther” or “The Godfather Part III.”

But sequels aren’t simply about chasing the money. They have a distinguished pedigree, as old as storytelling itself. Homer gets credit for popularizing the trend in the eighth century B.C., when he followed up “The Iliad” with “The Odyssey,” in which one of the relatively minor characters in the original story triumphs over sexy immortals, scary monsters and evil suitors of his faithful wife. Presumably with an eye to drawing in fans of the “Iliad,” Homer was sure to throw in a flashback about the Trojan horse. Continue reading…