The Guardian: Kingmaker by Sonia Purnell review – a woman of influence

The extraordinary story of one of the greatest, most gossiped-about political fixers of the 20th century.

The Guardian

October 3, 2024

Pamela Digby after her wedding to Randolph Churchill. Photograph: Fred Ramage/Getty Images

In early 1965, the death of Alma Mahler-Gropius-Werfel, married to three of the most brilliant men of early 20th-century Europe and the lover of several others, inspired Tom Lehrer to compose a spoof paean to her wondrous assets. “Alma, tell us, all modern women are jealous,” he sang with a knowing chuckle.

The song tickled upper-crust New Yorkers; they had an English version of Alma living among them in the form of The Hon Pamela Churchill-Hayward. Pamela was then on marriage number two; her first, to Randolph Churchill, had been a wartime mistake. She exited it with a child, Winston, whom she neglected, and a famous last name that she protected fiercely. Her American do-over was Leland Hayward, the Broadway producer of such hits as South Pacific and The Sound of Music.

Her husbands were overshadowed, though, by the fabulously moneyed and titled lovers she acquired in between: Aly Khan, son of the Aga Khan, the Fiat heir Gianni Agnelli, Baron Elie de Rothschild, and William S Paley, owner of CBS, among them. She was one of Truman Capote’s “swans” – rich society women he befriended – and he waspishly joked that the collected tales of Pamela Hayward’s exploits would last not A Thousand and One Nights, but A Thousand and Twelve. Widowhood in 1971 lasted an unhappy six months until Averell Harriman became husband number three. The business magnate turned diplomat, statesman and eminence grise of the Democratic party was also, not so incidentally, a former lover and recent widower.

She was a youthful 51, he a somewhat spry 79; time was short. Pamela’s assault on Washington was still ongoing when Averell died in 1986. The city capitulated to her in the end, just as London, Paris and New York had done. With her face and reputation burnished in ways that only serious money can achieve, Pamela triumphantly returned to Europe in 1993 as President Bill Clinton’s ambassador to France. She died in the job four years later, neither the best nor the worst political appointee who ever went to Paris.

Pamela’s fascinating life earned her the scholarly attentions of the late Christopher Ogden, Sally Bedell Smith and now Sonia Purnell. Few attract one excellent biographer, let alone three. She hated the first two books, which exhumed every skeleton and buffed it up for display. Ogden’s was remarkably fair considering she reneged on their agreement and tried to stiff him out of his author’s fee. She would, however, have found this latest portrait just right.

Purnell makes the case for Pamela as a woman of substance. First, because her wartime lovers – Harriman, sent to London by Roosevelt as his special envoy, Edward R Murrow, the CBS news correspondent, and Frederick Anderson, head of Eighth Bomber Command – made her a useful conduit between the British and Americans. Second, because she helped transform political fundraising in the 1980s.

Admittedly, the time Pamela exchanged views with this or that person, or convened a meeting of top-level politicos, is a lot less fun to read about than the time she caught Gianni Agnelli in flagrante – he crashed his car into a tree while fleeing her wrath and was left with a permanent limp.

Nevertheless, Kingmaker is on to something important. Successful women are judged differently from men. A monster like Picasso gets a free pass, but woe betide the unlikable woman. What does it matter, asks Purnell, if Harriman was a ruthless social climber and an unsatisfactory friend, mother and stepmother? She was a brilliant operator and strategist, raising millions for the Democrats during the Reagan years.

Harriman’s political effectiveness ought to be separated from her personal faults, but she is still in the dock, in my opinion. Throughout history, backdoor influencers like her were among patriarchy’s biggest cheerleaders and beneficiaries. In 1981, the year Pamela and Averell established her fundraising political action committee, or PAC, Sandra Day O’Connor became the first female US supreme court justice and Jeane Kirkpatrick the first female US ambassador to the UN. Meanwhile, the Harriman dinners were still feeding men’s egos with port and cigars and forcing the women to take their coffee and bonbons in the drawing room.

Kingmaker: Pamela Churchill Harriman’s Astonishing Life of Seduction, Intrigue and Power by Sonia Purnell is published by Virago (£25).

To support the Guardian and Observer, order your copy at guardianbookshop.com. Delivery charges may apply.

The Spectator – Towards Zero: the gruesome countdown to the American Civil War

The North and South had been bitterly divided over slavery since the invention of the cotton gin in the 1790s, but the Battle of Fort Sumter in 1861 would prove the point of no return

Some 100,000 books have been written about the American Civil War since it ended in 1865. That’s hardly surprising, given the four-year conflict’s impact on society, and not just because of the immense death toll, which new estimates put as high as 750,000 – more than America’s losses from all its other wars combined. The effusion of blood created a new nation and a new mythology, anchored on the principles of freedom, equality and democracy.

Coloured lithograph of the bombardment of Fort Sumter, Charleston Harbor, 12-13 April 1861. [Lithograph by Currier & Ives, New York, 1861/ Getty Images]

There is not much room in this crowded field for Civil War neophytes. Erik Larson knows what he is about, however, in The Demon of Unrest – but do his critics? The mixed reception this book has received suggests not. As with his previous best-sellers, the author has taken a single event, the Battle of Fort Sumter in Charleston, South Carolina, in this case, and used it as a highly effective framing device for the immersive story he wishes to tell.

The actual event was a straightforward one. After holding out against Confederate forces for 108 days, the starving Union garrison relinquished its control of the fort on 13 April 1861. The battle was the point of no return, and although there hadn’t been any fatalities during the 34-hour bombardment leading up to it, a gruesome accident during the 50-gun salute did result in the first death of the war. Neither the Confederate besiegers nor the northern defenders of the fort had any inkling of the hell they were about to unleash on their fellow Americans.

This is not to say they were blindsided by the war. The country had been tearing itself apart over the issue of free vs slave labour ever since the invention of the cotton gin in the 1790s. Once slave-grown cotton could be efficiently cleaned and processed by machine, the South went from being an agricultural backwater to an economic powerhouse exporting four million cotton bales a year.

The production costs were cheap, the supply inexhaustible, and the southern states had a virtual monopoly on the global market. Unlike the northern states, they didn’t need immigration, education or industrialisation to grow rich – just millions of Africans, force-fed and force-bred into perpetual bondage. In the 1830s, southerners began referring to slavery as the ‘peculiar institution’, not because it was wicked and shameful, but on account of it being unique and inseparable from the southern way of life.

If northerners were at all uncertain about the non-negotiability of slavery – unlikely, given the political paralysis it caused in Washington – incidents on the floor of the Senate, such as the Mississippi senator Henry Foote brandishing a loaded revolver and the South Carolina congressman Preston Brooks beating the abolitionist senator Charles Sumner unconscious, helped to clear up any confusion.

International condemnation of slavery also fuelled southern bluster and arrogance. On the eve of the war, Britons were unamused to hear the South Carolina plantation owner James Hammond (rendered in gloriously repulsive detail by Larson) describe England as a vassal state in the southern empire. ‘Cotton is king,’ Hammond raged in a speech in 1858 that made him infamous on both sides of the Atlantic. ‘No power on Earth dares to make war on it.’ No power except the executive power of Abraham Lincoln, it turned out.

The pivotal role that individual action plays in momentous times is a recurring theme in Larson’s books. He is fascinated by two kinds of anti-heroes: the monster with a talent for propelling events, like Hammond, and the decent man whose limitations spur him towards catastrophe, like the professorial William Dodd, America’s ambassador to Germany the year Hitler came to power (In the Garden of Beasts, 2011). A colleague of Dodd’s later recalled he had seldom, if ever, ‘worked with a chief of mission who was more futile and ineffective’.

In The Demon of Unrest, Larson’s anti-heroes are more starkly drawn. The decent men may be doomed to die, like Lincoln, or fail in their objective, like US Major Robert Anderson, the commander of Fort Sumter, but their flaws and limitations render them more, rather than less, admirable.

Lincoln was elected the first Republican president in November 1860 with a clear majority in the electoral college. But he won less than 40 per cent of the popular vote, giving the impression that his victory was an accident. His failure to address this added fuel to the claims of southern fire-eaters that he intended to destroy the South’s economy using high tariffs, ram abolition down their throats and set off a race war between blacks and whites.

Having never visited the Deep South, Lincoln was unaware of how entrenched the secession movement had become or how desperately pro-Union southerners needed support and leadership. Even more damaging for the prospects of peace was the traditional four months’ grace between the election and the inauguration. The General Assembly of South Carolina, the state with the noisiest supporters and longest history of secession attempts, voted to become an ‘independent commonwealth’ on 20 December 1860. Half a dozen more followed soon afterwards, yet Lincoln was still in Illinois in early February when the seven announced the formation of the Confederate States of America under President Jefferson Davis.

Lincoln only settled into Washington at the end of February, by which time another four states were preparing to secede, bringing the final total to 11. At his inauguration on 4 March, his belated attempt to cool secession fever by insisting he would defend the Union to his last breath while promising that slavery was safe in his hands alienated everyone. Southerners were convinced he was lying, even as abolitionists hoped that he was.

Initially, the majority of Lincoln’s cabinet felt certain he was not up to the job. Several tried to sideline him, sowing chaos among the already confused attempts to prevent disunion. Lincoln never altered his position, however: slavery was negotiable, but federal authority was not. Only two naval fortifications were still in Union hands by the time he took office: Sumter in Charleston and Fort Pickens in Pensacola, Florida.

Back in December, John Floyd, Buchanan’s secessionist secretary of war, believed he had picked ‘one of us’ when he assigned a middle-aged southern officer from a slave-owning family to oversee Charleston’s fortifications. Major Robert Anderson’s rise up the ranks had stalled, despite an exemplary service record and being severely wounded in action during the Mexican-American War. He was teaching cadets at West Point when Floyd recalled him. It was not uncommon for such men to seek compensation by means of treachery. Anderson was the exception.

South Carolina’s secession was meant to have been the signal for Anderson to stand aside or evacuate his position. He did, slipping out under cover of darkness with a few dozen troops; but only to take up a stronger one. Although still under construction, Fort Sumter was the biggest of Charleston’s three forts. Anderson didn’t care about the rights or wrongs of slavery, nor did he care much about politics, but his honour and duty were sacred to him. To the indignation and fury of his now former brother officers, he made it clear that he would protect the Union flag flying over Sumter until he ran out of food or Confederate forces overwhelmed him.

In Larson’s dramatic rendering of the countdown to war, Lincoln was the one who cocked the starting gun by insisting that Major Anderson be resupplied, knowing it would provoke the Confederates; by engaging in a battle he knew he would lose, Anderson was the one who fired it. There’s an unmistakable aura of Greek tragedy to these men in The Demon of Unrest. They are reluctant heroes, forced to act the way they do because they are incapable of behaving in any other way.

It’s history as a form of catharsis, which leads to the deeper purpose of this work. The real project of the book may not be immediately divined from its soaring prose and ripping action scenes, yet it was shaped by the calamitous events at the Capitol on 6 January 2021. ‘I had the eerie feeling that present and past had merged,’ Larson writes in his foreword. It seemed to him as though America was once again in danger of slipping loose from its ideological moorings.

The Demon of Unrest is Larson’s attempt to call the country back to its senses. The book is not so much history in the traditional Thucydidean manner of causes and events, but rather a political argument posing as history in the manner of Xenophon, the father of popular narrative history. It is a full-throated defence of democratic values, individual agency and the power of collective action. Why read it? Because to understand the meaning of freedom for others is to know it in ourselves.

The Wall Street Journal: What I Learned on My Summer Vacation

Don’t Dream of Escaping the Diaper Pail

Travel can force us to have sudden realizations. Five writers share one thing they discovered on a trip.

 Junot Diaz found out how it feels to stand out in “a very American place.” Amanda Foreman proved that a car can smell too bad to steal. Kevin D. Williamson realized there are no good bears. Andrew Rannells found the limit to “me” time. And Joanne Lipman saw a vacation change her son’s life.

The twins’ fourth birthday was marked by the retirement of the diaper pail. Five children born in quick succession had kept it in continuous service. It had been my friend and savior during the circle-of-poo years but also an unyielding and tyrannical master. I wasn’t sorry to say goodbye. Giddy from our hard-won freedom, my husband and I threw caution to the wind that summer and took the family to Italy, a country we had last visited as newlyweds.

Illustrations by Ben Kirchner

Driving to our Umbrian B&B, we spotted the same restaurant we had passed 10 years earlier, the T and V of “TAVERNA” still flickering. Fate was surely calling out to us.

We needn’t have listened, and after visiting the bathroom, I wished we hadn’t. The ground-level squat toilet nonplussed the children. They could squat easily enough; the tricky part was not stepping or falling into the basin afterward.

“Why are the kids naked?” my husband asked when we returned. I handed him a bulky garbage bag. “Just like old times,” he commented, and tossed it in the back of the van. We drove the rest of the way with the windows down and the air conditioning on high.

“I feel sick,” said one of the twins, and vomited into my Kate Spade bag right as we pulled into the driveway. My husband grabbed my hand. “Don’t look,” he said, “it won’t help.” He zipped the bag shut and tossed it over his shoulder.

The village was tiny and we could forget about the car for two weeks. When it was time to set off for the airport, my husband opened the back and was almost overpowered by the force of escaping gases. We peered cautiously inside; the Kate Spade lay on its side, shrunken and inert. The clothes bag, on the other hand, seemed to be incubating new life.

I dared him to make first contact. “They’re your bags,” he said. We trapped them under the suitcases, hoping the smell would dissipate once we got going. The AC was turned up, the windows rolled down, and yet the stench grew worse. What else lurked in our van of horrors?

We stopped at a gas station and clambered out, gasping for air. Googling “strong sulfur smell” suggested that the odor might be coming from a broken catalytic converter. “I’ll trade you two flat tires instead,” I told the car. Our pantomime with the Italian garage mechanic was interrupted by the children jumping and pointing, as two men got into our van. Seconds later it was gone.

Illustrations by Ben Kirchner

The mechanic sweetly summoned a taxi to take us to the police station. We had been driving for 10 minutes when I caught sight of our luggage. The thieves had chucked the bags out of the van in ones and twos, turning the road into a giant slalom. The driver took the course at speed but clipped the Kate Spade and flattened the clothes bag.

He came to a hard stop next to our abandoned car. The doors were open and the engine still running.

Grim-faced and silent, we rescued what we could of our belongings and bundled everyone back in. Two excruciating hours later, the flickering “AERNA” sign appeared. We had come full circle. I caught my husband’s eye and felt the car speed up.

Historically Speaking: The Drama of Finding Lost Cities

We are always a discovery away from rewriting ancient history.

The Wall Street Journal

March 21, 2024

In January, scientists announced the discovery of the oldest pre-Columbian cities in the Amazon. They were built in the Upano Valley in Ecuador by an organized urban society that flourished between 800 B.C. and 600 A.D. Surprisingly, this lost Amazonian civilization created cities that resemble modern suburbs, complete with low-density housing, ample green space and manicured road networks. It would seem that ancient urban complexity didn’t just arise in the familiar model of the Eurasian walled city.

ILLUSTRATION: THOMAS FUCHS

This kind of challenge to the scholarly consensus is actually fairly common with the discovery of lost cities. When civilizations are erased from history, bringing them back into the narrative can lead to a dramatic realignment of facts.

The discovery of Troy is an early example of this phenomenon. In 1870, after 12 years of fruitless searching, a German businessman and amateur archaeologist named Heinrich Schliemann found Troy’s ruins near modern Hisarlik on the Turkish Aegean coast. His careless excavation of the site makes modern archaeologists cringe, but against incredible odds Schliemann proved that Homer’s Iliad was based on real events. The study of ancient Greece was changed forever.

Three major discoveries in the early 1900s also reshaped our understanding of the Bronze Age. In 1906, German archaeologists digging through the ruins of a windswept plateau in north-central Turkey uncovered the forgotten city of Hattusa. Its royal archive confirmed that the city had been the last capital of the Hittites and the nerve center of a powerful empire that rivaled Egypt. After a glorious run of five hundred years, the Hittite civilization disintegrated so completely by around 1200 B.C. that little was known about it beyond the mentions in the Bible. The recovery of Hattusa redrew the map of the ancient world, this time with the Hittites at the center.

Across the Aegean, Sir Arthur Evans similarly rescued the Minoans with his discovery in 1900 of the Palace of Knossos on Crete, which flourished between 1700 and 1500 B.C. Evans’s breakthrough was to show that Minoan culture was nonviolent, unlike the other ancient Greek societies known at the time. It was also perhaps matriarchal. The palace, which lacked fortifications, had frescoes of women in authoritative poses. Recovered sculptures often featured goddesses. Evans ultimately argued that Knossos was proof that gender inequality was not, as generally believed, intrinsic to civilization.

The discovery in 1911 of Mohenjo-daro in present-day Pakistan was no less revelatory. This once-vital center of the Indus Valley Civilization, a trading society that thrived between 2500 and 1700 B.C., notably lacked temples, palaces, noble houses or even rich or poor neighborhoods. The city seems to have spent its wealth instead on civic amenities, such as public granaries and universal plumbing.

The egalitarianism at the heart of Mohenjo-daro suggests that the more cities and civilizations we find, the more it will complicate our understanding of human society’s origins. It is humbling to know that we are always a discovery away from rewriting ancient history. It’s invigorating, too.

Essay: Princess Kate Isn’t Kim Kardashian

The public backlash against an altered photo of the royal family shows that, even in the age of social media, royalty is supposed to mean more than just celebrity.

The Wall Street Journal

March 15, 2024

There I was on CBS News at the beginning of March, as a commentator on the royals, pooh-poohing fears that the public was being deceived. The Princess of Wales’s three-month absence from public life following her surgery in January was generating far too much hysteria.

Yes, it was a little strange that Prince William had suddenly withdrawn from a scheduled public appearance citing a “personal matter.” And there was no denying that the two royal households were not in sync.

Buckingham Palace’s willingness to share photographs of the cancer-stricken King Charles III made the news blackout on Princess Catherine look suspicious. But Kensington Palace, the residence of the Waleses, had been quite clear from the outset that it wouldn’t be issuing any updates until her convalescence was over. Trust me, I had said, the real issue here is the fact we are paying too much attention to baseless rumors.

Two weeks later, where-is-Kate-gate had turned into photogate. The world had been clamoring for a picture to show that she wasn’t in a coma, on a beach filing for divorce or recovering from a Brazilian Butt Lift, as some speculated on social media. What it got on March 10, Mother’s Day in Britain, was an Adobe Photoshop special in the worst way.

The charming family portrait of Kate sitting on a bench, surrounded by her three children, was bound to be scrutinized pixel by pixel. At the very least, it needed to be devoid of any mystery. Instead, the image had been obviously but crudely digitally enhanced, giving credence to every crackpot theory out there. By being secretive and then looking manipulative, the Waleses had created a public relations catastrophe. And they had damaged their “brand,” which depends crucially on how their public role transcends that of mere celebrities.

For the record, I don’t have any inside information about Kate’s surgery. The Germans think it was diverticulitis; the British assume it was a hysterectomy. But leaving aside the knotty question of privacy, the royals’ junior varsity team needs a reset. Even if the fiasco was the result of a mislabeled file, or a key member of the comms office bunking off early on a Friday, William and Kate are playing like amateurs in a pro sport that kills its losers.

But the bigger issue exposed by photogate isn’t a “them” problem at all, it’s an “us” problem. Americans are behaving as though Kate cheated on them. The betrayal! She has done what no one else in this country has ever managed and united the conservative and liberal media around a cause they are both passionate about: the evils of royal photoshopping. It’s a breach of trust, don’t you know, and Americans won’t stand for it.

Lest we forget, in 1953 the newly crowned Elizabeth II deep-faked her official coronation photograph, which remains one of the most famous images of the 20th century. It was taken after she had left Westminster Abbey and returned to Buckingham Palace. The photographer Cecil Beaton sat her in front of a painted backdrop of the Abbey’s interior so he could make it look all dreamy and romantic, gloomy gothic not being his style.

No one accused Queen Elizabeth of scamming the public. But for Kate there’s no mercy. She has been “outed” as a fake. If people discovered that Kim Kardashian, the “Queen of Reality TV,” was actually a composite character played by identical triplets, I doubt it would cause this much outrage.

A print of Queen Elizabeth II’s 1953 coronation photo, which was taken in front of a painted backdrop, not at Westminster Abbey. PHOTO: HERITAGE IMAGES/GETTY IMAGES

In fact, it isn’t far-fetched to compare the Windsors and the Kardashians as living megabrands, family enterprises whose presence on social media and ability to sway public opinion are totally out of proportion to their actual physical footprint. But however much they may resemble each other in their reach, they are opposites in terms of what they represent, and that may be why the response to Kate’s doctored photo was so visceral.

Over the years, Kim Kardashian has tried not just to emulate Kate but to be the American equivalent. Whether it was delusional thinking or simply delusions of grandeur, the reality TV star modeled her own 2011 wedding to the NBA player Kris Humphries after William and Kate’s, even going so far as to order a similar cake at the reception. There were only four months between the two weddings, so comparisons were unavoidable. The press played along, sort of, variously referring to the Kardashian-Humphries $10 million extravaganza as “America’s Royal Wedding” and the “royal-ish wedding.”

Both weddings catapulted the brides into even greater stardom. But let us not forget the very different ways they have acquired and used their fame. In the age of social media and AI, the key differences between the two family brands hinge on what we now mean by “real” and “authentic” and “genuine.”

Kim Kardashian owns a global business empire, has 364.4 million followers on Instagram and still has her own reality TV show, renamed “The Kardashians.” She makes money playing a fake version of herself that people are willing to accept as “real” without caring which bits might be true and which are faked.

By contrast, Kate works for a nonprofit that pays the Waleses comparatively little and prohibits conspicuous displays of luxury. Her shared Instagram account with William has a mere 15.2 million followers. Although they, too, have a long-running TV show about their family, with their early romance concluding the most recent season, “The Crown” is too unflattering to be a vehicle for self-promotion.

Kate has no say or ownership in her show and has to watch an actor play a fake version of herself that people know isn’t real but nevertheless believe to be true on some level. She is a permanent resident inside a hall of mirrors that reflects two things: ratings and popularity.

But her job is of an older sort: Kings and queens have no career path or rank to attain. They exist in a different category from the rich, the famous and the beneficiaries of nepotism. The monarch can only be sui generis.

Over the years, the British monarchy has learned how to present itself as the embodiment of duty and public service. The royals are buoyed by the timeless values that define them. Queen Elizabeth II was authentic to a fault. The public’s grief at her passing reflects how deeply that quality still resonates.

Social media has trapped us all in a very different world, especially in the U.S., where public opinion has always been the ultimate power. It’s what caused Alexis de Tocqueville the most disquiet when he visited America in 1831. “It acts upon the will as well as upon the actions of men,” he wrote in “Democracy in America.” But when public opinion is the final arbiter of everything, where does that leave authenticity and the values it safeguards?

Fear that the royals are squandering the values and credibility of the monarchy is what provoked the internet-wide howl of rage over Kate’s photograph. I now realize why people rushed to fill the silence surrounding the Waleses with outrageous speculation. Scandal is somehow preferable to seeing Princess Catherine reduced to Kim Kardashian.

Historically Speaking: Here Comes the Rain Again

Storms have long shaped human destiny, as Californians know all too well.

The Wall Street Journal

February 15, 2024

Given that much of California was suffering a severe drought just two years ago, it might seem ungrateful to complain about too much rain. Yet Californians have already managed two record-breaking storms this year, and more are expected. The increase in the frequency and strength of these weather events spells trouble for the state. Some worry it is a sign that the “Big One”—a massive once-in-a-millennium storm—is nigh.

A man walks his dog on the edge of the Los Angeles River, Feb. 4.

Scientists think that these storms are growing more severe as a result of climate change, but mythical stories about destructive floods have haunted humans for eons, from the Sumerian Epic of Gilgamesh around 2000 B.C. to the biblical story of Noah and his ark.

Thousands of years of accumulated climate records combined with modern computing methods have led to new insights into the role rain has played in shaping human destiny. For example, excessive rain can now be added to the list of reasons behind the collapse of the Roman Empire. Apparently the final decades of the fifth century were unusually wet. Harvests failed and granaries rotted, setting off a cataclysmic chain of famines, wars and mass migration that hastened the empire’s demise.

Abnormal rainfall needn’t spell human disaster. In the early 13th century, 15 consecutive years of unprecedented rainfall turned the barren Mongolian steppe into fertile grassland. The region could finally feed the massive armies that allowed Genghis Khan to pursue his dream of a Mongol empire. But the intensely wet spring of 1242 may have pushed his descendants to abruptly leave Central Europe. The Mongol cavalry could seemingly defeat any foe except the bottomless mud of the Hungarian plain.

A series of engravings made for the first edition of the ‘Liber Genesis,’ 1612

A recurring theme in most Great Flood myths is how destruction can be creative; the washing away of the past being necessary for a redemptive transformation. A real-life example can be seen in Europe’s response to the crisis of 1816—the so-called Year Without a Summer. The 1815 eruption of Mt. Tambora in Indonesia ejected a huge cloud of sulfate gases into the atmosphere and created an unseasonable chill in much of the Northern Hemisphere. Endless rain watered already sodden fields. Communities starved; typhus outbreaks infected millions of people; rioting became endemic.

The sheer scale of the human emergency forced a reconceptualization in Britain and elsewhere of the purpose of government. No longer could it be concerned only with taxes, laws and diplomacy. The modern state must also consider public health, public administration and public responsibility.

California’s Great Flood of 1862, which remains the state’s worst disaster, was another catalyst for change. The storm began in late 1861 and lasted eight weeks. California’s Central Valley became an inland sea. At least 4,000 people died and a quarter of the state’s economy was destroyed. The state government relocated from Sacramento to San Francisco, which was also partially underwater.

The robust reconstruction efforts afterward marked a shift in attitude. Californians erected new flood defenses and instituted better building regulations. Sacramento rose again, literally 10 feet higher than before. The infrastructure may be up to 150 years old, but it is still doing its job. No matter what the rain brings, the answer isn’t an ark. It is being prepared.

Historically Speaking: Aspirin, a Pioneering Wonder Drug

The winding, millennia-long route from bark to Bayer.

The Wall Street Journal

February 1, 2024

For ages the most reliable medical advice was also the most simple: Take two aspirin and call me in the morning. This cheap pain reliever, which also thins blood and reduces inflammation, has been a medicine cabinet staple ever since it became available over the counter nearly 110 years ago.

Willow bark, a distant ancestor of aspirin, was a popular ingredient in ancient remedies to relieve pain and treat skin problems. Hippocrates, the father of medicine, was a firm believer in willow’s curative powers. For women with gynecological troubles in the fourth century B.C., he advised burning the leaves “until the steam enters the womb.”

That willow bark could reduce fevers wasn’t discovered until the 18th century. Edward Stone, an English clergyman, noticed its extremely bitter taste was similar to that of the cinchona tree, the source of the costly malaria drug quinine. Stone dried the bark and dosed himself to treat a fever. When he felt better, he tested the powder on others suffering from “ague,” or malaria. When their fevers disappeared, he reported triumphantly to the Royal Society in 1763 that he had found another malaria cure. In fact, he had identified a way to treat its symptoms.

Willows contain salicin, a compound with anti-inflammatory, fever-reducing and pain-relieving properties. Experiments with salicin, and its byproduct salicylic acid, began in earnest in Europe in the 1820s. In 1853 Charles Frédéric Gerhardt, a French chemist, discovered how to create acetylsalicylic acid, the active ingredient in aspirin, but then abandoned his research and died young.

There is some debate over how aspirin became a blockbuster drug for the German company Bayer. Its official history credits Felix Hoffmann, a Bayer chemist, with synthesizing acetylsalicylic acid in 1897 in the hopes of alleviating his father’s severe rheumatic pain. Bayer patented aspirin in 1899 and by 1918 it had become one of the most widely used drugs in the world.

ILLUSTRATION: THOMAS FUCHS

But did Hoffmann work alone? Shortly before his death in 1949, Arthur Eichengrün, a Jewish chemist who had spent World War II in a concentration camp, published a paper claiming that Bayer had erased his contribution. In 2000 the BMJ published a study supporting Eichengrün’s claim. Bayer, which became part of the Nazi-backing conglomerate I.G. Farben in 1925, has denied that Eichengrün had a role in the breakthrough.

Aspirin shed its associations with the Third Reich after I.G. Farben sold off Bayer in the early 1950s, but the drug’s pain-relieving hegemony was fleeting. In 1956 Bayer’s British affiliate brought acetaminophen to the market. Ibuprofen became available in 1962.

The drug’s fortunes recovered after the New England Journal of Medicine published a study in 1989 that found the pill reduced the risk of a heart attack by 44%. Some public-health officials promptly encouraged anyone over 50 to take a daily aspirin as a preventive measure.

But as was the case with Rev. Stone, it seems the science is more complicated. In 2022 the U.S. Preventive Services Task Force officially advised against taking the drug prophylactically, given the risk of internal bleeding and the availability of other therapies. Aspirin may work wonders, but it can’t work miracles.

Historically Speaking: Our Fraught Love Affair With Cannabis

Ban it? Tax it? Humans have been hounded by these questions for millennia.

The Wall Street Journal

January 19, 2024

Ohio’s new marijuana law marks a watershed moment in the decriminalization of cannabis: more than half of Americans now live in places where recreational marijuana is legal. It is a profound shift, but only the latest twist in the long and winding saga of society’s relationship with pot.

Humans first domesticated cannabis sativa around 12,000 years ago in Central and East Asia as hemp, mostly for rope and other textiles. Later, some adventurous forebears found more interesting uses. In 2008, archaeologists in northwestern China discovered almost 800 grams of dried cannabis containing high levels of THC, the psychoactive ingredient in marijuana, among the burial items of a seventh century B.C. shaman.

The Greeks and Romans used cannabis for hemp, medicine and possibly religious purposes, but the plant was never as pervasive in the classical world as it was in ancient India. Cannabis indica, the sacred plant of the god Shiva, was revered for its ability to relieve physical suffering and bring spiritual enlightenment to the holy.

Cannabis gradually spread across the Middle East in the form of hashish, which is smoked or eaten. The first drug laws were enacted by Islamic rulers who feared their subjects wanted to do little else. The Mamluk sultan al-Zahir Baybars banned hashish cultivation and consumption in Egypt in 1266. When that failed, a successor tried taxing hashish instead in 1279. This filled local coffers, but consumption levels soared and the ban was restored.

The march of cannabis continued unabated across the old and new worlds, apparently reaching Stratford-upon-Avon by the 16th century. Fragments of some 400-year-old tobacco pipes excavated from Shakespeare’s garden were found to contain cannabis residue. If not the Bard, at least someone in the household was having a good time.

By the 1600s American colonies were cultivating hemp for the shipping trade, using its fibers for rigging and sails. George Washington and Thomas Jefferson grew cannabis on their Virginia plantations, seemingly unaware of its intoxicating properties.

Veterans of Napoleon’s Egypt campaign brought hashish to France in the early 1800s, where efforts to ban the habit may have enhanced its popularity. Members of the Club des Hashischins, which included Charles Baudelaire, Honoré de Balzac, Alexandre Dumas and Victor Hugo, would meet to compare notes on their respective highs.

ILLUSTRATION: THOMAS FUCHS

Although Queen Victoria’s own physician advocated using cannabis to relieve childbirth and menstrual pains, British lawmakers swung back and forth over whether to tax or ban its cultivation in India.

In the U.S., however, Americans lumped cannabis with the opioid epidemic that followed the Civil War. Early 20th-century politicians further stigmatized the drug by associating it with Black people and Latino immigrants. Congress outlawed nonmedicinal cannabis in 1937, a year after the movie “Reefer Madness” portrayed pot as a corrupting influence on white teenagers.

American views of cannabis have changed since President Nixon declared an all-out War on Drugs more than 50 years ago, yet federal law still classifies the drug alongside heroin. As lawmakers struggle to catch up with the zeitgeist, two things remain certain: Governments are often out of touch with their citizens, and what people want isn’t always what’s good for them.

Historically Speaking: The Hunt for a Better Way to Vote

Despite centuries of innovation, the humble 2,500-year-old ballot box is here to stay.

The Wall Street Journal

January 4, 2024

At least 40 national elections will take place around the world over the next year, with some two billion people going to the polls. Thanks to the 2,500-year-old invention of the ballot box, in most races these votes will actually count and be counted.

Ballot boxes were first used in Athens during the 5th century B.C., but in trials rather than elections. Legal cases were tried before a gathering of male citizens, known as the Assembly, and decided by vote. Jurors indicated their verdict by dropping either a marked or unmarked pebble into an urn, which protected them against violence by keeping their decision secret.

The first recorded case of ballot box stuffing also took place in 5th century Athens. To exile an unpopular Athenian via an “ostracism election,” Assembly voters simply had to scratch his name on an ostrakon, a pottery shard, and whoever reached a certain threshold of votes was banished for 10 years. It is believed that Themistocles’s political enemies rigged his ostracism vote in 472 B.C. by distributing pre-etched shards throughout the Assembly.

In 139 B.C., the Romans passed a series of voter secrecy laws starting with the Lex Gabinia, which introduced the secret ballot for magistrate elections. A citizen would write his vote on a wax-covered wooden tablet and then deposit it in a wicker basket called a cista. The cistae were so effective at protecting voters from public scrutiny that many senators, including Cicero, regarded the ballot box as an attack on their authority and a dangerous concession to mob rule.

ILLUSTRATION: THOMAS FUCHS

The fact that secret ballots allowed men to vote as they pleased was one reason why King Charles I of England ordered all “balloting boxes” to be taken out of circulation in 1637; they remained out of use until the 19th century.

Even then, there was strong resistance to ballot boxes in Britain and America on the grounds that they were unmanly: A citizen ought to display his vote, not hide it. In any case, there was nothing special about a 19th-century ballot box except its convenience for stealing or stuffing. One notorious election scam in San Francisco in the 1850s involved a ballot box with a false bottom.

Public outrage over rigged elections in the U.S. led some states to adopt Samuel Jollie’s tamper-proof glass ballot box, which New York first used in 1857. But a transparent design couldn’t prevent these boxes from mysteriously disappearing, nor would it have saved Edgar Allan Poe, who is thought to have died in Baltimore from being “cooped,” a practice where kidnap victims were drugged into docility and made to vote multiple times.

To better guarantee the integrity of elections, New York introduced in 1892 a new machine by Jacob Myers that allowed voters to privately choose candidates by pulling a lever, which dispensed with ballots and ballot boxes. Other inventors quickly improved on the design and by 1900 Jollie’s glass ballot box had become obsolete. By World War II almost every city had switched over to mechanical voting systems, which tallied votes automatically.

The simple ballot box seemed destined to disappear until controversies over machine irregularities in the 2000 presidential election resulted in the Help America Vote Act, which requires all votes to have a paper record. The ballot box still isn’t the perfect shield against fraud. But then, neither is anything else.

Historically Speaking: The Ancient Origins of the Christmas Wreath

Before they became a holiday symbol, wreaths were used to celebrate Egyptian gods, victorious Roman generals and the winter solstice.

On Christmas Eve, 1843, three ghosts visited Ebenezer Scrooge in the Charles Dickens novella “A Christmas Carol” and changed Christmas forever. Dickens is often credited with “inventing” the modern idea of Christmas because he popularized and reinvigorated such beloved traditions as the turkey feast, singing carols and saying “Merry Christmas.”

But one tradition for which he cannot take credit is the ubiquitous Christmas wreath, whose pedigree goes back many centuries, even to pre-Christian times.

ILLUSTRATION: THOMAS FUCHS

Wreaths can be found in almost every ancient culture and were worn or hung for many purposes. In Egypt, participants in the festival of Sokar, a god of the underworld, wore onion wreaths because the vegetable was venerated as a symbol of eternal life. The Greeks awarded laurel wreaths to the winners of competitions because the laurel tree was sacred to Apollo, the god of poetry and athletics. The Romans bestowed them on emperors and victorious generals.

Fixing wreaths and boughs onto doorposts to bring good luck was another widely practiced tradition. As early as the 6th century, Lunar New Year celebrants in central China would decorate their doors with young willow branches, a symbol of immortality and rebirth.

The early Christians did not, as might be assumed, create the Yuletide wreath. That was the pagan Vikings. They celebrated the winter solstice with mistletoe, evergreen wreaths made of holly and ivy, and 12 days of feasting, all of which were subsequently turned into Christian symbols. Holly, for example, has often been equated with the crown of thorns. Perhaps not surprisingly for the people who also introduced the words “knife,” “slaughter” and “berserk,” one Viking custom involved setting the Yuletide wreath on fire in the hope of attracting the sun’s attention.

Across northern Europe, the Norse wreath was eventually absorbed into the Christian calendar as the Advent wreath, symbolizing the four weeks before Christmas. It became part of German culture in 1839, when Johann Hinrich Wichern, a Lutheran pastor in Hamburg, captured the public’s imagination with his enormous Advent wreath. Made with a cartwheel and 24 candles, it was intended to help the children at his orphanage count the days to Christmas.

German immigrants to the U.S. brought the Advent wreath with them. Still, while Americans might accept a candlelit wreath inside the house, door wreaths were rare, especially in former Puritan strongholds such as Boston. The whiff of disapproval hung about until 1935, when Colonial Williamsburg appointed Louise B. Fisher, an underemployed and overqualified professor’s wife, to be its head of flowers and decorations. Inspired by her love of 15th-century Italian art, Fisher allowed her wreath designs to run riot on the excuse that she was only adding fruits and other whimsies that had been available during the Colonial era.

Thousands of visitors saw her designs and returned home to copy them. Like Wichern, she helped inspire a whole generation to be unapologetically joyous with their wreaths. Thanks in part to her efforts, America’s front doors are a thing to behold at this time of year, proving that it’s never too late to pour new energy into an old tradition.