Historically Speaking: Democracy Helped Seed National Parks

Green spaces and nature preserves have long existed, but the idea of protecting natural wonders for human enjoyment has American roots.

The Wall Street Journal

March 3, 2022

Yellowstone, the world’s oldest national park, turned 150 this month. The anniversary of its founding is a timely reminder that democracy isn’t just a political system but a way of life. The Transcendentalist writer Henry David Thoreau was one of the earliest Americans to link democratic values with national parks. Writing in 1858, he declared that, having “renounced the king’s authority” over their land, Americans should use their hard-won freedom to create national preserves for “inspiration and our own true re-creation.”

There had been nature reserves, royal hunting grounds, pleasure gardens and parks long before Yellowstone, of course. The origins of green spaces can be traced to ancient Egypt’s temple gardens. Hyde Park, which King Charles I opened to Londoners in 1637, led the way for public parks. In the 3rd century B.C., King Devanampiya Tissa of Sri Lanka created the Mihintale nature reserve as a wildlife sanctuary, prefiguring by more than 2,000 years the likely first modern nature reserve, which the English naturalist Charles Waterton built on his estate in Yorkshire in the 1820s.

The Grand Canyon of the Yellowstone River, Yellowstone National Park, 1920. GAMMA-KEYSTONE/GETTY IMAGES

The 18th century saw a flowering of interest in man’s relationship with nature, and these ideas encouraged better management of the land. The English scientist Stephen Hales demonstrated the correlation between tree coverage and rainfall, leading a British MP named Soame Jenyns to convince Parliament to found the Tobago Main Ridge Forest Reserve in 1776. The protection of the Caribbean colony’s largest forest was a watershed moment in the history of conservation. Highly motivated individuals soon started conservation projects in multiple countries.

As stewards of their newly independent nation, Americans regarded their country’s natural wonders as places to be protected for the people rather than from them. (Niagara Falls, already marred by development when Alexis de Tocqueville visited in 1831, served as a cautionary example of legislative failure.) The first attempt to create a public nature reserve was at the state level: In 1864 President Abraham Lincoln signed the Yosemite Valley Grant Act, giving the land to California “upon the express conditions that the premises shall be held for public use, resort, and recreation.” But the initiative lacked any real oversight.

Many groups pushed for a federal system of national parks. Among them were the Transcendentalists, environmentalists and landscape painters such as George Catlin, Thomas Cole and Albert Bierstadt, but the ultimate credit belongs to the geologist Ferdinand Hayden, who surveyed Yellowstone in 1871. The mass acclaim following his expedition finally convinced Congress to turn Yellowstone into a national park.

Unfortunately, successive administrations failed to provide sufficient funds for its upkeep, and Yellowstone suffered years of illegal poaching and exploitation. In desperation, the federal government sent the U.S. Army in to take control of the park in 1886. The idea proved to be an inspired one. The military was such a conscientious custodian that its management style became the model for the newly created National Park Service in 1916.

The NPS currently oversees 63 National Parks. But the ethos hasn’t changed since the Yellowstone Act of 1872 set aside 2 million pristine acres “for the benefit and enjoyment of the people.” These words are now engraved above the north entrance to the park, an advertisement, as the novelist and environmentalist Wallace Stegner once wrote, that national parks are “absolutely American, absolutely democratic.”

Historically Speaking: Anorexia’s Ancient Roots And Present Toll

The deadly affliction, once called self-starvation, has become much more common during the confinement of the pandemic.

The Wall Street Journal

February 18, 2022


Two years ago, when countries suspended the routines of daily life in an attempt to halt the spread of Covid-19, the mental health of children took a plunge. One worrying piece of evidence for this was an extraordinary spike in hospitalizations for anorexia and other eating disorders among adolescents, especially girls between the ages of 12 and 18, not just in the U.S. but around the world. U.S. hospitalizations for eating disorders doubled between March and May 2020. England’s National Health Service recorded a 46% increase in eating disorder referrals by 2021 compared with 2019. Perth Children’s Hospital in Australia saw a 104% increase in hospitalizations, and in Canada the rate tripled.

Anorexia nervosa has a higher death rate than any other mental illness. According to the National Eating Disorders Association, 75% of its sufferers are female. And while the affliction might seem relatively new, it has ancient antecedents.

As early as the sixth century B.C., adherents of Jainism in India regarded “santhara,” fasting to death, as a purifying religious ritual, particularly for men. Emperor Chandragupta, founder of the Mauryan dynasty, died in this way in 297 B.C. St. Jerome, who lived in the fourth and fifth centuries A.D., portrayed extreme asceticism as an expression of Christian piety. In 384, one of his disciples, a young Roman woman named Blaesilla, died of starvation. Perhaps because she fits the contemporary stereotype of the middle-class, female anorexic, Blaesilla rather than Chandragupta is commonly cited as the first known case.

The label given to spiritual and ascetic self-starvation is anorexia mirabilis, or “holy anorexia,” to differentiate it from the modern diagnosis of anorexia nervosa. There were two major outbreaks in history. The first began around 1300 and was concentrated among nuns and deeply religious women, some of whom were later elevated to sainthood. The second took off during the 19th century. So-called “fasting girls” or “miraculous maids” in Europe and America won acclaim for appearing to survive without food. Some were exposed as fakes; others, tragically, were allowed to waste away.

But, confusingly, there are other historical examples of anorexic-like behavior that didn’t involve religion or women. The first medical description of anorexia, written by Dr. Richard Morton in 1689, concerned two patients—an adolescent boy and a young woman—who simply wouldn’t eat. Unable to find a physical cause, Morton called the condition “nervous consumption.”

A subject under study in the Minnesota Starvation Experiment, 1945. WALLACE KIRKLAND/THE LIFE PICTURE COLLECTION/SHUTTERSTOCK

Almost two centuries passed before French and English doctors accepted Morton’s suspicion that the malady had a psychological component. In 1873, Queen Victoria’s physician, Sir William Gull, coined the term “Anorexia Nervosa.”

Naming the disease was a huge step forward. But its treatment was guided by an ever-changing understanding of anorexia’s causes, which has run the gamut from the biological to the psychosexual, from bad parenting to societal misogyny.

The first breakthrough in anorexia treatment, however, came from an experiment involving men. The Minnesota Starvation Experiment, a World War II-era study on how to treat starving prisoners, found that the 36 male volunteers exhibited many of the same behaviors as anorexics, including food obsessions, excessive chewing, bingeing and purging. The study showed that the malnourished brain reacts in predictable ways regardless of race, class or gender.

Recent research suggests that a genetic predisposition could account for as much as 60% of the risk of developing the disease. If this knowledge leads to new specialized treatments, it will do so at a desperate time: At the start of the year, the Lancet medical journal called on governments to take action before mass anorexia cases become mass deaths. The lockdown is over. Now save the children.

A shorter version appeared in The Wall Street Journal

Historically Speaking: A Mollusk With a Storied Past in Human Seduction

Long associated with Aphrodite, oysters graced the menus of Roman orgies, Gold Rush eateries and Manhattan brothels.

The Wall Street Journal

February 4, 2022

The oyster is one of nature’s great survivors—or it was. Today it is menaced by the European green crab, which has been taking over Washington’s Lummi Sea Pond and outer coastal areas. Last month’s emergency order by Gov. Jay Inslee, backed up by almost $9 million in funds, speaks to the threat facing the Pacific Northwest shellfish industry if the invaders take over.

As any oyster lover knows, the true oyster, a member of the family Ostreidae, is the edible kind—not to be confused with the pearl-making oysters of the family Pteriidae. But both are bivalves, meaning they have hinged shells, and they have been around for at least 200 million years.

King James I of England is alleged to have remarked, “He was a very valiant man, who first adventured on eating of oysters.” That man may have lived as long as 164,000 years ago, when evidence from Africa suggests humans were already eating shellfish.

The ancient Greeks were the first to make an explicit connection between oysters and, ahem, sex. Aphrodite, the goddess of love, was said by the 8th-century B.C. poet Hesiod to have risen from the sea foam when the Titan god Kronos cut off the genitals of his father Ouranos and hurled them into the sea. Thereafter, Greek artists frequently depicted her emerging from a flat shell, making a visual pun on the notion that oysters resemble female genitalia.

By Roman times it had become a truism that oysters were an aphrodisiac, and they graced the menus of orgies. The Roman engineer Sergius Orata, who is credited with being the father of underfloor heating, also designed the first oyster farms.

ILLUSTRATION: THOMAS FUCHS

Many skills and practices were lost during the Dark Ages, but not the eating of oysters. In his medical treatise, “A Golden Practice of Physick,” the 16th-century Swiss physician Felix Platter recommended eating oysters for restoring a lost libido. The great Italian seducer Giacomo Casanova clearly didn’t suffer from that problem, but he did make oysters a part of his seductive arsenal: “Voluptuous reader, try it,” he urged in his memoirs.

In the 19th century, oysters were so large and plentiful in New York and San Francisco that they were a staple food. A dish from the Gold Rush, called the Hangtown Fry, was an omelet made with deep-fried oysters and bacon and is often cited as the start of Californian cuisine. In New York there were oyster restaurants for every class of clientele, from oyster cellars-cum-brothels to luxury oyster houses that catered to the aristocracy. The most sought-after was Thomas Downing’s Oyster House at 5 Broad Street. In addition to making Downing, the son of freed slaves, an extremely wealthy man, his oyster restaurant provided refuge for escaped slaves on the Underground Railroad to Canada.

At least since the 20th century, it has been well known that oysters play a vital role in filtering pollution out of our waters. And it turns out that their association with Aphrodite contains an element of truth as well. A 2009 study published in the Journal of Human Reproductive Sciences found a link between zinc deficiency and sexual dysfunction in rats. Per serving, the oyster contains more zinc than any other food. Nature has provided a cure, if the green crab doesn’t eat it first.

Historically Speaking: Water Has Long Eluded Human Mastery

From ancient Mesopotamia to the California desert, people have struggled to bend earth’s most plentiful resource to their will

The Wall Street Journal

January 21, 2022

In “Chinatown,” Roman Polanski’s classic 1974 film noir, loosely based on the events surrounding the diversion of water from the Owens Valley to Los Angeles in 1913, an ex-politician warns: “Beneath this building, beneath every street, there’s a desert. Without water the dust will rise up and cover us as though we’d never existed!”

The words resonate as California, indeed the entire American West, now enters the third decade of what scientists are terming a “mega-drought.” Water levels at Lake Mead in Nevada, the nation’s largest reservoir, and Lake Powell in Arizona, the second-largest, have dropped to historic lows. Earlier this month, the first ever federal water restrictions on the Colorado River system came into effect.

Since the earliest civilizations emerged in the Fertile Crescent in the Middle East, humankind has tried to master water resources, only to be brought low by its own hubris and nature’s resistance to control.

The Sumerians of Mesopotamia, builders of the first cities, created canals and irrigation systems to ensure that their crops could withstand the region’s frequent droughts. Competition between cities resulted in wars and conflicts—leading, around 2550 B.C., to history’s first recorded treaty: an agreement between the cities of Lagash and Umma to respect each other’s access to the water supply. Unfortunately, the Sumerians didn’t know that irrigation must be carefully managed to avoid pollution and excessive salinization of the land. In effect, they sowed their own earth with salt, ruining the soil and ultimately contributing to their civilization’s demise.

Water became a potent weapon in the ancient world. Invaders and defenders regularly poisoned water or blocked it from reaching their foes. When Julius Caesar was under siege in Alexandria in 47 B.C., Ptolemy XIII contaminated the local water supply in an effort to force the Romans to withdraw. But the Romans managed to dig two deep wells for fresh water within the territory they held.

ILLUSTRATION: THOMAS FUCHS

Desiccated ruins of once-great cities can be found on almost every continent. The last emperors of the Classic Maya civilization (A.D. 250-950) on the Yucatán Peninsula couldn’t overcome a crippling drought that started around 750 and continued intermittently until 1025. As the water dried up, Mayan society entered a death spiral of wars, famine and internal conflicts. Their cities in the southern lowlands were eventually reclaimed by the jungle.

In Southeast Asia during the 14th and 15th centuries, one of the most sophisticated hydraulic systems of its time couldn’t save Angkor, capital of the Khmer Empire, from the double onslaught of droughts and floods. The city is now a haunting ruin in the Cambodian jungle.

Modern technology, from desalination plants to hydroelectric dams, has enabled humans to stay one step ahead of nature’s vagaries, until now. According to U.N. and World Bank experts in 2018, some 40% of the world’s population struggles with water scarcity. Water conflicts are proliferating, including in the U.S. In California, Chinatown-type skullduggery may be a thing of the past, but tensions remain. Extreme drought in the Klamath Basin along the California-Oregon border has pitted communities against one another for decades, with no solution in sight.

In 1962, President John F. Kennedy declared: “Anyone who can solve the problems of water will be worthy of two Nobel Prizes—one for peace and one for science.” We are still waiting for that person.

The Sunday Times: I don’t want to fight about it but this talk of US civil war is overblown

Experts on conflict predict unrest, but America has a long way to go before it is as divided as it was in 1861

The Sunday Times

January 9, 2022

Violence is in the air. No one who saw the shocking scenes during the Capitol riot in Washington on January 6, 2021, can pretend that it was just a big misunderstanding. Donald Trump and his allies attempted to retain power at all costs. Terrible things happened that day. A year later the wounds are still raw and the country is still polarised. Only Democratic leaders participated in last week’s anniversary commemoration; Republicans stayed away. The one-year mark has produced a blizzard of warnings that the US is spiralling into a second civil war.

Only an idiot would ignore the obvious signs of a country turning against itself. Happy, contented electorates don’t storm their parliament (although terrified and oppressed peoples don’t either). America has reached the point where the mid-term elections are no longer a yawn but a test case for future civil unrest.

Predictably, the left and right are equally loud in their denunciations of each other. “Liberals” look at “conservatives” and see the alt-right: white supremacists and religious fanatics working together to suppress voting rights, women’s rights and democratic rights. Conservatives stare back and see antifa: essentially, progressive totalitarians making common cause with socialists and anarchists to undermine the pillars of American freedom and democracy. Put the two sides together and you have an electorate that has become angry, suspicious and volatile.

The looming threat of a civil war is almost the only thing that unites pundits and politicians across the political spectrum. Two new books, one by the Canadian journalist Stephen Marche and the other by the conflict analyst Barbara Walter, argue that the conditions for civil war are already in place. Walter believes that America is embracing “anocracy” (outwardly democratic, inwardly autocratic), joining a dismal list of countries that includes Turkey, Hungary and Poland. The two authors’ arguments have been boosted by the warnings of respected historians, including Timothy Snyder, who wrote in The New York Times that the US is teetering over the “abyss” of civil war.

If you accept the premise that America is facing, at the very least, a severe test of its democracy, then it is all the more important to subject the claims of incipient civil war to rigorous analysis. The fears aren’t baseless; the problem is that predictions are slippery things. How to prove a negative against something that hasn’t happened yet? There’s also the danger of the self-fulfilling prophecy: wishing and predicting don’t make things so, although they certainly help to fix the idea in people’s minds. The more Americans say that the past is repeating itself and the country has reached the point of no return, the more likely it will be believed.

Predictions based on comparisons to Weimar Germany, Nazi Germany, the Russian Revolution and the fall of Rome are simplistic and easy to dismiss. But, just as there is absolutely no basis for the jailed Capitol rioters to compare themselves to “Jews in Germany”, as one woman recently did, arguments that equate today’s fractured politics with the extreme violence that plagued the country just before the Civil War are equally overblown — not to mention trivialising its 1.5 million casualties.

There simply isn’t a correlation between the factors dividing America then and now. In the run-up to the war in 1861, the North and South were already distinct entities in terms of ethnicity, customs and law. Crucially, the North’s economy was based on free labour and was prone to slumps and financial panics, whereas the South’s depended on slavery and was richer and more stable. The 11 Southern states that seceded did so because they had local government, the military and judicial institutions on side.

Today there is a far greater plurality of voters spread out geographically. President Biden won Virginia and Georgia and almost picked up Texas in 2020; in 1860 there were ten Southern states where Abraham Lincoln didn’t even appear on the ballot.

When it comes to assessing the validity of generally accepted conditions for civil breakdown, the picture becomes more complicated. A 2006 study by the political scientists Håvard Hegre and Nicholas Sambanis found that at least 88 circumstances are used to explain civil war. The generally accepted ones include: a fragile economy, deep ethnic and religious divides, weak government, long-standing grievances and factionalised elites. But models and circumstances are like railway tracks: they take us down one path and blind us to the others.

In 2015 the European think tank VoxEU conducted a historical analysis of over 100 financial crises between 1870 and 2014. Researchers found a pattern of street violence, greater distrust of government, increased polarisation and a rise in popular and right-wing parties in the five years after a crisis. This would perfectly describe the situation in the US except for one thing: the polarisation and populism have coincided with falling unemployment and economic growth. The Capitol riot took place despite, not because of, the strength of the financial system.

A country can meet a whole checklist of conditions and not erupt into outright civil war (for example, Northern Ireland in the 1970s) or meet only a few of the conditions and become a total disaster. It’s not only possible for the US, a rich, developed nation, to share certain similarities with an impoverished, conflict-ridden country and yet not become one; it’s also quite likely, given that for much of its history it has held together while being a violent, populist-driven society seething with racial and religious antagonisms behind a veneer of civil discourse. This is not an argument for complacency; it is simply a reminder that theory is not destiny.

A more worrying aspect of the torrent of civil war predictions by experts and ordinary Americans alike is the readiness to demonise and assume the absolute worst of the other side. It’s a problem when millions of voters believe that the American polity is irredeemably tainted, whether by corruption, communism, elitism, racism or what have you. The social cost of this divide is enormous. According to the Armed Conflict Location and Event Data Project: “In this hyper-polarised environment, state forces are taking a more heavy-handed approach to dissent, non-state actors are becoming more active and assertive and counter-demonstrators are looking to resolve their political disputes in the street.”

Dissecting the roots of America’s lurch towards rebarbative populism requires a particular kind of micro-level human analysis involving real-life interviews with perpetrators and protesters as well as trawls through huge sets of data. The results have shown that, more often than not, the attribution of white supremacist motives to the Capitol rioters, or anti-Americanism to Black Lives Matter protesters, says more about the politics of the accuser than the accused.

Social media is an amplifier for hire — polarisation lies at the heart of its business models and algorithms. Researchers looking at the “how” rather than just the “why” of America’s political Balkanisation have also found evidence of large-scale foreign manipulation of social media. A recent investigation by ProPublica and The Washington Post revealed that after November 3, 2020, there were more than 10,000 posts a day on Facebook attacking the legitimacy of the election.

In 2018 a Rasmussen poll asked American voters whether the US would experience a second civil war within the next five years. Almost a third said it would. In a similar poll conducted last year the proportion had risen to 46 per cent. Is it concerning? Yes. Does it make the prediction true? Well, polls also showed a win for Hillary Clinton and a landslide for Joe Biden. So, no.

Historically Speaking: How the Waistband Got Its Stretch

Once upon a time, human girth was bound by hooks and buttons, and corsets had metal stays. Along came rubber and a whole new technology of flexible cloth.

The Wall Street Journal

January 7, 2022

The New Year has arrived, and if you’re like me, you’ve promised yourself a slimmer, fitter and healthier you in 2022. But in the meantime there is the old you to deal with—the you who overindulged at Thanksgiving and didn’t stop for the next 37 days. No miracle diet or resolution can instantaneously eradicate five weeks of wild excess. Fortunately, modern science has provided the next best thing to a miracle: the elasticated waistband.

Before the invention of elastic, adjustable clothing was dependent on technology that had hardly changed since ancient times. The Indus Valley Civilization made buttons from seashells as early as 2000 BC.

The first inkling that there might be an alternative to buttons, belts, hooks and other adjustable paraphernalia came in the late 18th century, with the discovery that rubber wasn’t only good for toys. It also had immensely practical applications for things such as pencil erasers and lid sealants. Rubber’s stretchable nature offered further possibilities in the clothing department. But there was no word for its special property until the poet William Cowper borrowed the 17th-century term “elastic,” used to describe the expansion and contraction of gases, for his translation of the Iliad in 1791: “At once he bent Against Tydides his elastic bow.”

PHOTO: GETTY IMAGES

By 1820, an enterprising English engineer named Thomas Hancock was making elastic straps and suspenders out of rubber. He also invented the “masticator,” a machine that rolled shredded rubber into sheets for industrial use. Elastic seemed poised to make a breakthrough: In the 1840s, Queen Victoria’s shoemaker, Joseph Sparkes Hall, popularized his invention of the elastic-gusset ankle boot, still known today as the Chelsea Boot.

But rubber had drawbacks. Not only was it a rare and expensive luxury that tended to wear out quickly, it was also sticky, sweaty and smelly. Elasticized textiles became popular only after World War I, helped by the demand for steel—and female workers—that led women to forego corsets with metal stays. Improved production techniques at last made elasticated girdles a viable alternative: In 1924, the Madame X rubber girdle promised to help women achieve a thinner form in “perfect comfort while you sit, work or play.”

The promise of comfort became real with the invention of Lastex, essentially rubber yarn, in 1930. Four years later, Alexander Simpson, a London tailor, removed the need for belts or suspenders by introducing the adjustable rubber waistband in men’s trousers.

The constant threat of rubber shortages sparked a global race to devise synthetic alternatives. The winner was the DuPont Company, which invented neoprene in 1930. That research led to an even more exciting invention: the nylon stocking. Sales were halted during World War II, creating such pent-up demand that in 1946 there were “nylon riots” throughout the U.S., including in Pittsburgh, where 40,000 people tried to buy 13,000 pairs of stockings.

DuPont scored another win in 1958 with spandex, also known under the brand name Lycra, which is not only more durable than nylon but also stretchier. Spandex made dreams possible by making fabrics more flexible and forgiving: It helped the astronaut Neil Armstrong to walk on the moon and Simone Biles to become the most decorated female gymnast in history. And it will help me to breathe a little easier until I can fit into my jeans again.

Historically Speaking: Boycotts that Brought Change

Modern rights movements have often used the threat of lost business to press for progress

The Wall Street Journal

November 12, 2021

Sixty-five years ago, on Nov. 13, 1956, the U.S. Supreme Court upheld Browder v. Gayle, putting an end to racial segregation on buses. The organizers of the Montgomery bus boycott, which had begun shortly after Rosa Parks’s arrest on Dec. 1, 1955, declared victory and called off the campaign as soon as the ruling came into effect. But that was only the beginning. As the Civil Rights Movement progressed, boycotts of businesses that supported segregation became a regular—and successful—feature of Black protests.

Boycotts are often confused with other forms of protest. At least four times between 494 and 287 B.C., Rome’s plebeian class marched out of the city en masse, withholding their labor in an effort to win more political rights. Some historians describe this as a boycott, but it more closely resembled a general strike. A better historical precedent is the West’s reaction to Sultan Mehmed II’s imposition of punitive taxes on Silk Road users in 1453. European traders simply avoided the overland networks via Constantinople in favor of new sea routes, initiating the so-called Age of Discovery.

The first modern boycotts began in the 18th century. Britain increased taxes on its colonial subjects following the Seven Years’ War (1756-1763), inciting a wave of civil disobedience. Merchants in Philadelphia, New York and Boston united to boycott British imports. Together with intense political lobbying by Benjamin Franklin among others, the boycott resulted in the levies’ repeal in 1766. Several years later, London again tried to raise revenue through taxes, touching off a more famous boycott, during which the self-styled Sons of Liberty dumped the contents of 342 tea chests into Boston Harbor on Dec. 16, 1773.

In 1791 the British abolitionist movement instigated a national boycott of products made by enslaved people. As many as 300,000 British families stopped buying West Indian sugar, causing sales to drop by a third to half in some areas. Although the French Revolutionary Wars stymied this campaign, its early successes demonstrated the power of consumer protest.

Despite the growing popularity of such protests, the term “boycott” wasn’t used until 1880. It was coined in Ireland as part of a campaign of civil disobedience against absentee landowners. Locals turned Captain Charles Cunningham Boycott, an unpopular land agent in County Mayo for the Earl of Erne, into a pariah. His isolation was so complete that the Boycott family relocated to England. The “boycott” of Boycott not only garnered international attention but inspired imitators.

Boycotts enabled oppressed people around the world to make their voices heard. But the same tool could also be a powerful weapon in the hands of oppressors. During the late 19th century in the American West, Chinese workers and the businesses that hired them were often subject to nativist boycotts. In 1933, German Nazi party leaders organized a nationwide boycott of Jewish-owned businesses in the lead-up to the passage of a law barring Jews from public sector employment.

Despite the potential for misuse, the popularity of economic and political boycotts increased after World War II. In January 1957, shortly after the successful conclusion of the Montgomery bus boycott, black South Africans in Johannesburg started their own bus boycott. From this local protest grew a national and then international boycott movement that continued until South Africa ended apartheid in 1991. Sometimes the penny, as well as the pen, is more powerful than the sword.

Historically Speaking: When Masquerade Was All the Rage

Before there was Halloween, there were costume balls and Carnival, among other occasions for the liberation of dressing up

The Wall Street Journal

October 28, 2021

Costume parades and Halloween parties are back after being canceled last year. Donning a costume and mask to go prancing around might seem like the height of frivolity, but the act of dressing-up has deep roots in the human psyche.

During the early classical era, worshipers at the annual festivals of Dionysus—the god of wine, ritual madness and impersonation, among other things—expanded mask-wearing from religious use to personal celebrations and plays performed in his honor. Masks symbolized the suspension of real world rules: A human could become a god, an ordinary citizen could become a king, a man could be a woman. Anthropologists call such practices “rituals of inversion.”

In Christianized Europe, despite official disapproval of paganism, rituals of inversion not only survived but flourished. Carnival—possibly a corruption of the Latin phrase “carne vale,” farewell to meat, because the festival took place before Lent—included the Feast of Fools, where junior clergymen are alleged to have dressed as nuns and bishops and danced in the streets.

By the 13th century, the Venetians had taken to dressing up and wearing masks with such gusto that the Venice Carnival became an occasion for ever more elaborate masquerade. The city’s Great Council passed special laws to keep the practice within bounds, such as banning masks while gambling or visiting convents.

ILLUSTRATION: THOMAS FUCHS

The liberation granted by a costume could be dangerous. In January of 1393, King Charles VI of France and his wife, Isabeau of Bavaria, held the Bal des Sauvages, or Wild Men’s Ball, to celebrate the wedding of one of her ladies-in-waiting. The king had already suffered his first bout of insanity, and it was hoped that the costume ball would be an emotional outlet for his disordered mind. But the farce became a tragedy. The king and his entourage, dressed as hairy wild men, were meant to perform a “crazy” dance. Horrifically, the costumes caught fire, and only Charles and one other knight survived.

The masked ball became a staple of royal entertainments, offering delicious opportunities for sexual subterfuge and social subversion. At a masquerade in 1745, Louis XV of France disguised himself as a yew tree so he could pursue his latest love, the future Madame de Pompadour. Meanwhile, the Dauphine danced the night away with a charming Spanish knight, not realizing he was a lowly cook who had tricked his way in. More ominously, a group of disaffected Swedish nobles infiltrated a masquerade in 1792 to assassinate King Gustav III. Five years later, the new ruler of Venice, Francis II of Austria, banned Carnival and forbade the city’s residents to wear masks.

Queen Victoria helped to return dress-up parties to respectability with historically-themed balls that celebrated creativity rather than debauchery. By 1893, American Vogue could run articles about fabulous Halloween costumes without fear of offense. The first Halloween parade took place not in cosmopolitan New York but in rural Hiawatha, Kansas, in 1914.

In the modern era, the taint of anarchy and licentiousness associated with dressing-up has been replaced by complaints about cultural appropriation, a concern that would have baffled our ancestors. Becoming what we are not, however briefly, is part of being who we are.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.

ILLUSTRATION: JAMES STEINBERG

The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Urged on by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theatre in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the waning years of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Dante’s Enduring Vision of Hell

The “Inferno” brought human complexity to the medieval conception of the afterlife

The Wall Street Journal

September 30, 2021

What is hell? For Plato, it was Tartarus, the lowest level of Hades where those who had sinned against the gods suffered eternal punishment. For Jean-Paul Sartre, the father of existentialism, hell was other people. For many travelers today, it is airport security.

No depiction of hell, however, has been more enduring than the “Inferno,” part one of the “Divine Comedy” by Dante Alighieri, the 700th anniversary of whose death is commemorated this year. Dante’s hell is divided into nine concentric circles, each one more terrifying and brutal than the last until the frozen center, where Satan resides alongside Judas, Brutus and Cassius. With Virgil as his guide, Dante’s spiritually bereft and depressed alter ego enters via a gate bearing the motto “Abandon all hope, ye who enter here”—a phrase so ubiquitous in modern times that it greets visitors to Disney’s Pirates of the Caribbean ride.

The inscription was a Dantean invention, but the idea of a physical gate separating the land of the living from a desolate one of the dead was already at least 3,000 years old: In the Sumerian Epic of Gilgamesh, written around 2150 B.C., two scorpionlike figures guard the gateway to an underworld filled with darkness and dust.

ILLUSTRATION: THOMAS FUCHS

The underworld of the ancient Egyptians was only marginally less bleak. Seven gates blocked the way to the Hall of Judgment, according to the Book of the Dead. Getting through them was arduous and fraught with failure. The successful then had to submit to having their hearts weighed against the Feather of Truth. Those found wanting were thrown into the fire of oblivion.

Zoroastrianism, the official religion of the ancient Persians, was possibly the first to divide the afterlife into two physically separate places, one for good souls and the other for bad. This vision contrasted with the Greek view of Hades as the catchall for the human soul and the early Hebrew Bible’s description of Sheol as a shadowy pit of nothingness. In the 4th century B.C., Alexander the Great’s Macedonian empire swallowed both Persia and Judea, and the three visions of the afterlife commingled. The word “hell” would later appear frequently in translations of the New Testament. But the word, scholars point out, was a single rendering for several distinct Hebrew and Greek terms.

Early Christianity offered more than one vision of hell, but all contained the essential elements of Satan, sinners and fire. The “Apocalypse of Peter,” a 2nd-century text, helped start the trend of listing every sadistic torture that awaited the wicked.

Dante was thus following a well-trod path with his imaginatively crafted punishments of boiling pitch for the dishonest and downpours of icy rain on the gluttonous. But he deviated from tradition by describing Hell’s occupants with psychological depth and insight. Dante’s narrator rediscovers the meaning of Christian truth and love through his encounters. In this way the “Inferno” speaks to the complexities of the human condition rather than serving merely as a literary zoo of the damned.

The “Divine Comedy” changed the medieval world’s conception of hell, and with it, man’s understanding of himself. Boccaccio, Chaucer, Milton, Balzac—the list of writers directly inspired by Dante’s vision goes on. “Dante and Shakespeare divide the world between them,” wrote T.S. Eliot. “There is no third.”