The Sunday Times: Beyoncé’s NFL half-time show embraced Trump’s new America

A denim pick-up truck. Stetsons galore. And it all played out on a Texan football field. The star’s 12-minute, $20 million set arguably heralds a new cultural era

The Sunday Times

December 28, 2024

Beyoncé was joined by another US artist, Post Malone, and a jeans-clad jalopy at the Baltimore Ravens vs. Houston Texans game ALEX SLITZ/GETTY IMAGES

The United States during the Reagan years in the 1980s was proud, loud, and unabashed. Americans wore diamantes to bed and ten-gallon stetsons to dinner. They drove gas-guzzling sedans, watched MTV on cable, and boasted a president who owned a white Arabian stallion called El Alamein. It was America resurgent with brass knobs on.

Last week, for her NFL half-time Christmas Day concert, Beyoncé gave this long-discarded America its first public airing in nearly forty years. All the familiar elements were there: the bedazzling, the strutting, the clunky Yank Tanks — even the gleaming white horse which she rode into the stadium.

Beyoncé performing at the Netflix NFL Christmas Gameday halftime show NETFLIX/BEYONCÉ

Beyoncé began her set with an off-stage rendition of 16 Carriages, a defiant ode to a hardscrabble childhood, quite possibly her own. The contrast between the grim lyrics and the blindingly white country and western aesthetic was the first of several striking dissonances during the performance. The 32-time Grammy winner has an almost unique ability to channel the zeitgeist into politically charged songs that still have mainstream appeal.

American audiences know that when Beyoncé sings, she isn’t simply putting stories to music, she is communicating a series of messages that collectively form a meta-commentary about what it means to be a person of colour in white America today.

Viewers got an eyeful of the extra-white teeth, the extra-big cars, the extra-wide shoulder pads, and saw the return of 1980s Dallas chic when everybody wanted to own a pair of cowboy boots and the whole world was obsessed with who shot JR. The Guardian waxed lyrical about the joy and playfulness of Beyoncé’s routines rather than the charged racial undertones.

The world’s reaction to the Christmas Day concert was similarly context-free. Audiences took in the unbridled exuberance of Beyoncé’s version of Pop Americana and, like British viewers, saw only the 1980s-style flashiness that accompanied America’s resurgence under Ronald Reagan.

But there is more going on. There’s been a seismic shift in how America views itself. In recent years the country’s self-confidence has been dented. The combination of Black Lives Matter, Maga, #MeToo, QAnon, Stop the Steal, the Anti-Vax Movement and cancel culture in general has achieved what Vietnam, the oil crisis, 9/11, the Great Recession, and Covid-19 could not: turn America from an overly positive country into a hyper-negative one.

Americans are feeling bad about themselves, their country, and their history. Those who are not angry are anxious.

Donald Trump’s recent victory was his way of overturning that: a promise of what life could be like in the 1980s, America’s boom time. Whereas Reagan projected an open-handed, “what’s-not-to-like?” image, that of a movie star straight from a Western, now we have a loud, close-fisted, angry, defensive kind of brashness.

Trump’s election promises to impose tariffs on foreign goods, stay out of foreign conflicts, and deport more than a million illegal immigrants are a flat-out contradiction of Reagan’s pro-immigration, pro-trade, pro-American interventionism. Almost by sheer luck, the 40th president proved that the occasional arms race can be good for capitalism. But history offers multiple examples of what price controls coupled with isolationist policies do to a country.

In Beyoncé’s America, both sides can co-exist.

An image of Trump in the moments after an assassination attempt in July could define the era EVAN VUCCI

To my mind, Beyoncé was telling Americans to move on and embrace the now. It isn’t about slick LA or the liberal seaboard, but heartland Americana. Her performance was a carefully curated smorgasbord of nods and references to local and national symbols. She sang Dolly Parton’s Jolene, but her backing singers were country music singers who are also black women; she used a 200-strong marching band, but an all-black one recruited from one of the country’s historically black universities.

Beyoncé and backup singers in white cowboy hats and gowns at the halftime performance NETFLIX/BEYONCÉ

Texas, and Houston specifically, received a royal wave with line dancing, rodeo stars, and the traditional homecoming parade, complete with car cavalcade and not one but two homecoming queens, both former Texas beauty pageant winners.

Hispanic culture was also a beneficiary of Beyoncé’s cultural largesse with her mariachi-inspired costume decorations, and the deployment of lowrider Chevrolet Impalas, a ban on which was recently lifted by the governor of California, Gavin Newsom. A civilian truck was clad in denim in homage to Beyoncé’s song, LEVII’S JEANS.

Beyoncé was joined during the performance by her 12-year-old daughter Blue Ivy DAVID J PHILLIP

That said, the appearance of an American flag shrouded in sheet plastic was one of the biggest controversies of the performance. One half of angry America objected to its presence at all; the other complained it did not go far enough. The rest were anxious about its ambiguity: was it an embrace or a rejection of American values? Was Beyoncé ruining country music, a sacred American institution, with her non-white themes, rhythms and tropes, or reinventing it?

Beyoncé’s 2024 NFL Halftime Show performance YOUTUBE

2025 may show that neither interpretation of Beyoncé’s half-time concert was grounded in any sort of reality. I fear that Beyoncé’s “playful” references to a time when the Dukes of Hazzard could be watched without cringing were not meant to be taken literally. That America is not coming back. However, it will take a great deal more than twelve minutes of foot-stamping glory to usher in the other America, the one where self-hatred gives way to self-reinvention.

The Wall Street Journal – The Christmas Gift I’ll Never Forget: More Time Together

Some holiday presents are so perfect, or so awful, that you remember them forever.

The Wall Street Journal

December 20, 2024

Christmas Eve. Things are not going according to plan.

Wind back: It’s T-minus 15 days to Christmas. I’m shopping for reindeer hats when my phone rings. That X-ray for my husband Reg’s sore rib? The problem isn’t his rib.

T-minus 12 days: He has cancer!

T-minus 9: Don’t worry. Our physician thinks they caught it early.

T-minus 7: Worry. The surgeon says it’s stage 4. “But there is always hope,” she tells us.

T-minus 2: Reg goes in for more tests.

T-minus 1: And another biopsy. Ouch.

So here we are on a stormy Christmas Eve. I am driving to the country with unwrapped gifts in the trunk, five sleeping kids in the back and my semiconscious, possibly dying husband in the passenger seat.

My revised plan: Forget the turkey. Just wrap presents, stuff the stockings and sleep, so I’m not a wreck on Christmas morning.

Midnight: I’m ahead of the game.

2 a.m.: The wind has dislodged some roof tiles, and water is cascading through the children’s bedrooms to the dining room below. I find sleeping bags and herd the kids into the corridor. Yay! Christmas and camping, their two favorite activities. I don’t have enough buckets.

3 a.m.: Oh wait, I still need to do the stockings! I’d been thrilled to find ones that were the perfect Norman Rockwell red. Now, I hate them.

7:50 a.m.: I think I dozed off. Only the reindeer hats to stuff.

8:00 a.m.: Christmas Day! Reg is awake. His eyes widen when he sees me. “Why are you crying?” he asks.

“Everything. This. You.”

“Look at me,” he says. “I love you. I have everything I have ever wanted in you, our children, our life together. The only thing I no longer have is time.” Time. His words are the worst and best Christmas gifts rolled into one.

That was 15 years ago. It turned out there was hope—and more time, too.

Illustration by Mar Hernández

Historically Speaking: Paying With Plastic Is New, but Credit Isn’t

From the Knights Templar to the Diners Club, people have long traded in promises instead of cash

The Wall Street Journal

March 17, 2022

On March 6, Amex, Mastercard and Visa announced that they were suspending their operations in Russia. The move will deal a blow to Russian commerce, as a life tied to cash and checks can be cumbersome in the extreme.

The Mesopotamians were among the first known to grasp the usefulness of the charge account. The extensive trade between Mesopotamia and the Indus Valley made regular shipments of gold highly impracticable. Instead, traders used seals and clay tablets to keep running tallies for settlement at a future date.

Greek and Roman travelers relied on letters of creditworthiness from their personal banks. But this practice died out during the Dark Ages. The Knights Templar revived it during the 12th century as one of their services to Christians traveling to Jerusalem. Pilgrims could deposit their money at any Templar house, receive a “letter of credit” and use it to withdraw funds from the Temple stronghold in Jerusalem.

Illustration: Thomas Fuchs

The letter of credit eventually evolved into the bill of exchange, or promissory note, between banks, used for business transactions. In 1772, the London Exchange Banking Company in England offered its clients a version for everyday use: Called “circular notes,” they were issued in set denominations, could be cashed in many major cities and were guaranteed against loss and theft.

The idea was slow to catch on in the United States until a freight transport business called the American Express Company decided its real profits lay in facilitating the movement of money. Having already enjoyed considerable success with money orders, in 1891 it rolled out the American Express traveler’s check, which merely required the owner’s countersignature to be valid.

The traveler’s check was by no means the only alternative to cash. By the late 19th century, most department stores had tokens, often personalized metal key fobs, that loyal customers could present in lieu of immediate payment. After World War I, oil companies went a step further, offering “courtesy cards” that could be used at their gas stations. Airline companies and hotels did the same.

The profusion of charge cards soon became onerous. In 1946, a Brooklyn bank experimented with the Charg-It Card, which could be used at local businesses. Three years later, so the story goes, New York businessman Frank X. McNamara was dining at a restaurant with clients when he realized he was out of cash. The ensuing embarrassment inspired him to propose a new kind of charge card: one that was members-based, worked anywhere and earned its profit by charging each customer an annual fee.

The Diners Club card had more than 10,000 members by the end of its first year. The first bank to copy the idea was Bank of America in 1958. Its BankAmericard—which became Visa in 1976—allowed card owners to pay interest rather than settling their monthly bill. By the mid-’60s, other banks were scrambling to imitate what had effectively become a cash cow. The most successful competitor was a consortium of banks behind the Interbank Card, today known as Mastercard.

The first bank outside the U.S. to offer a credit card was Britain’s Barclays Bank in 1966. But by then Visa and Mastercard were already expanding to other countries, setting the stage for the global duopoly they are today.

Vladimir Putin may have difficulty charging his next holiday to his Visa. But he can still use his China-backed UnionPay card—for now.

Appeared in the March 19, 2022, print edition as ‘Paying With Plastic Is New, But Credit Isn’t’.

The Guardian: Kingmaker by Sonia Purnell review – a woman of influence

The extraordinary story of one of the greatest, most gossiped-about political fixers of the 20th century.

The Guardian

October 3, 2024

Pamela Digby after her wedding to Randolph Churchill. Photograph: Fred Ramage/Getty Images

In early 1965, the death of Alma Mahler-Gropius-Werfel, married to three of the most brilliant men of early 20th-century Europe and the lover of several others, inspired Tom Lehrer to compose a spoof paean to her wondrous assets. “Alma, tell us! All modern women are jealous,” he sang with a knowing chuckle.

The song tickled upper-crust New Yorkers; they had an English version of Alma living among them in the form of The Hon Pamela Churchill-Hayward. Pamela was then on marriage number two, her first a wartime mistake with Randolph Churchill. She exited it with a child, Winston, whom she neglected, and a famous last name that she protected fiercely. Her American do-over was Leland Hayward, the Broadway producer of such hits as South Pacific and The Sound of Music.

Her husbands were overshadowed, though, by the fabulously moneyed and titled lovers she acquired in between: Aly Khan, son of the Aga Khan, the Fiat heir Gianni Agnelli, Baron Elie de Rothschild, and William S Paley, owner of CBS, among them. She was one of Truman Capote’s “swans” – rich society women he befriended – and he waspishly joked that the collected tales of Pamela Hayward’s exploits would last not A Thousand and One Nights, but A Thousand and Twelve. Widowhood in 1971 lasted an unhappy six months until Averell Harriman became husband number three. The business magnate turned diplomat, statesman and eminence grise of the Democratic party was also, not so incidentally, a former lover and recent widower.

She was a youthful 51, he a somewhat spry 79; time was short. Pamela’s assault on Washington was still ongoing when Averell died in 1986. The city capitulated to her in the end, just as London, Paris and New York had done. With her face and reputation burnished in ways that only serious money can achieve, Pamela triumphantly returned to Europe in 1993 as President Bill Clinton’s ambassador to France. She died in the job four years later, neither the best nor the worst political appointee who ever went to Paris.

Pamela’s fascinating life earned her the scholarly attentions of the late Christopher Ogden, Sally Bedell Smith and now Sonia Purnell. Few attract one excellent biographer, let alone three. She hated the first two books, which exhumed every skeleton and buffed it up for display. Ogden’s was remarkably fair considering she reneged on their agreement and tried to stiff him out of his author’s fee. However, she would have found this latest portrait just right.

Purnell makes the case for Pamela as a woman of substance. First, because her wartime lovers – Harriman, sent to London by Roosevelt as his special envoy, Edward R Murrow, the CBS news correspondent, and Frederick Anderson, head of Eighth Bomber Command – made her a useful conduit between the British and Americans. Second, because she helped transform political fundraising in the 1980s.

Admittedly, the time Pamela exchanged views with this or that person, or convened a meeting of top-level politicos, is a lot less fun to read than the time she caught Gianni Agnelli in flagrante – he crashed into a tree while escaping her wrath, leaving him with a permanent limp.

Nevertheless, Kingmaker is on to something important. Successful women are judged differently than men. A monster like Picasso gets a free pass, but woe betide the unlikable woman. What does it matter, asks Purnell, if Harriman was a ruthless social climber and an unsatisfactory friend, mother and stepmother? She was a brilliant operator and strategist, raising millions for the Democrats during the Reagan years.

Harriman’s political effectiveness ought to be separated from her personal faults, but she is still in the dock, in my opinion. Throughout history, backdoor influencers like her were among patriarchy’s biggest cheerleaders and beneficiaries. In 1981, the year Pamela and Averell established her fundraising political action committee, or PAC, Sandra Day O’Connor became the first female US supreme court justice and Jeane Kirkpatrick the first female US ambassador to the UN. Meanwhile, the Harriman dinners were still feeding men’s egos with port and cigars and forcing the women to take their coffee and bonbons in the drawing room.

Kingmaker: Pamela Churchill Harriman’s Astonishing Life of Seduction, Intrigue and Power by Sonia Purnell is published by Virago (£25).

To support the Guardian and Observer, order your copy at guardianbookshop.com. Delivery charges may apply.

The Spectator – Towards Zero: the gruesome countdown to the American Civil War

The North and South had been bitterly divided over slavery since the invention of the cotton gin in the 1790s, but the Battle of Fort Sumter in 1861 would prove the point of no return

Some 100,000 books have been written about the American Civil War since it ended in 1865. That’s hardly surprising, given the four-year conflict’s impact on society, and not just because of the immense death toll, which new estimates put as high as 750,000 – more than America’s losses from all its other wars combined. The effusion of blood created a new nation and a new mythology, anchored in the principles of freedom, equality and democracy.

Coloured lithograph of the bombardment of Fort Sumter, Charleston Harbor, 12-13 April 1861. [Lithograph by Currier & Ives, New York, 1861/ Getty Images]

There is not much room in this crowded field for Civil War neophytes. Erik Larson knows what he is about, however, in The Demon of Unrest – but do his critics? The mixed reception this book has received suggests not. As with his previous best-sellers, the author has taken a single event, the Battle of Fort Sumter in Charleston, South Carolina, in this case, and used it as a highly effective framing device for the immersive story he wishes to tell.

The actual event was a straightforward one. After holding out against Confederate forces for 108 days, the starving Union garrison relinquished its control of the fort on 13 April 1861. The battle was the point of no return, and although there hadn’t been any fatalities during the 34-hour bombardment leading up to it, a gruesome accident during the 50-gun salute did result in the first death of the war. Neither the Confederate besiegers nor the northern defenders of the fort had any inkling of the hell they were about to unleash on their fellow Americans.

This is not to say they were blindsided by the war. The country had been tearing itself apart over the issue of free vs slave labour ever since the invention of the cotton gin in the 1790s. Once slave-grown cotton could be efficiently cleaned and processed by machine, the South went from being an agricultural backwater to an economic powerhouse exporting four million cotton bales a year.

The production costs were cheap, the supply inexhaustible, and the southern states had a virtual monopoly on the global market. Unlike the northern states, they didn’t need immigration, education or industrialisation to grow rich – just millions of Africans, force-fed and force-bred into perpetual bondage. In the 1830s, southerners began referring to slavery as the ‘peculiar institution’, not because it was wicked and shameful, but on account of it being unique and inseparable from the southern way of life.

If northerners were at all uncertain about the non-negotiability of slavery – unlikely, given the political paralysis it caused in Washington – incidents on the floor of the Senate, such as the Mississippi senator Henry Foote brandishing a loaded revolver and the South Carolina congressman Preston Brooks beating the abolitionist campaigner Charles Sumner unconscious, helped to clear up any confusion.

International condemnation of slavery also fuelled southern bluster and arrogance. On the eve of the war, Britons were unamused to hear the South Carolina plantation owner James Hammond (rendered in gloriously repulsive detail by Larson) describe England as a vassal state in the southern empire. ‘Cotton is king,’ Hammond raged in a speech in 1858 that made him infamous on both sides of the Atlantic. ‘No power on Earth dares to make war on it.’ No power except the executive power of Abraham Lincoln, it turned out.

The pivotal role that individual action plays in momentous times is a recurring theme in Larson’s books. He is fascinated by two kinds of anti-heroes: the monster with a talent for propelling events, like Hammond, and the decent man whose limitations spur him towards catastrophe, like the professorial William Dodd, America’s ambassador to Germany the year Hitler came to power (In the Garden of Beasts, 2011). A colleague of Dodd’s later recalled he had seldom, if ever, ‘worked with a chief of mission who was more futile and ineffective’.

In The Demon of Unrest, Larson’s anti-heroes are more starkly drawn. The decent men may be doomed to die, like Lincoln, or fail in their objective, like US Major Robert Anderson, the commander of Fort Sumter, but their flaws and limitations render them more, rather than less, admirable.

Lincoln was elected the first Republican president in November 1860 by a majority in the electoral college. But he won less than 40 per cent of the popular vote, giving the impression that his victory was an accident. His failure to address this added fuel to the claims of southern fire-eaters that he intended to destroy the South’s economy using high tariffs, ram abolition down their throats and set off a race war between blacks and whites.

Having never visited the Deep South, Lincoln was unaware of how entrenched the secession movement had become or how desperately pro-Union southerners needed support and leadership. Even more damaging for the prospects of peace was the traditional four months’ grace between the election and the inauguration. The General Assembly of South Carolina, the state with the noisiest supporters and longest history of secession attempts, voted to become an ‘independent commonwealth’ on 20 December 1860. Half a dozen more followed soon afterwards, yet Lincoln was still in Illinois in early February when the seven announced the formation of the Confederate States of America under President Jefferson Davis.

Lincoln only settled into Washington at the end of February, by which time another four states were preparing to secede, bringing the final total to 11. At his inauguration on 4 March, his belated attempt to cool secession fever by insisting he would defend the Union to his last breath while promising that slavery was safe in his hands alienated everyone. Southerners were convinced he was lying, even as abolitionists hoped that he was.

Initially, the majority of Lincoln’s cabinet felt certain he was not up to the job. Several tried to sideline him, sowing chaos among the already confused attempts to prevent disunion. Lincoln never altered his position, however: slavery was a negotiable issue, but federal authority was not. Only two naval fortifications were still in Union hands by the time he took office: Sumter in Charleston and Fort Pickens in Pensacola, Florida.

Back in December, John Floyd, Buchanan’s secessionist secretary of war, believed he had picked ‘one of us’ when he assigned a middle-aged southern officer from a slave-owning family to oversee Charleston’s fortifications. Major Robert Anderson’s rise up the ranks had stalled, despite an exemplary service record and being severely wounded in action during the Mexican-American War. He was teaching cadets at West Point when Floyd recalled him. It was not uncommon for such men to seek compensation by means of treachery. Anderson was the exception.

South Carolina’s secession was meant to have been the signal for Anderson to stand aside or evacuate his position. He did, slipping out under cover of darkness with a few dozen troops; but only to take up a stronger one. Although still under construction, Fort Sumter was the biggest of Charleston’s three forts. Anderson didn’t care about the rights or wrongs of slavery, nor did he care much about politics, but his honour and duty were sacred to him. To the indignation and fury of his now former brother officers, he made it clear that he would protect the Union flag flying over Sumter until he ran out of food or Confederate forces overwhelmed him.

In Larson’s dramatic rendering of the countdown to war, Lincoln was the one who cocked the starting gun by insisting that Major Anderson be resupplied, knowing it would provoke the Confederates; by engaging in a battle he knew he would lose, Anderson was the one who fired it. There’s an unmistakable aura of Greek tragedy to these men in The Demon of Unrest. They are reluctant heroes, forced to act the way they do because they are incapable of behaving in any other way.

It’s history as a form of catharsis, which leads to the deeper purpose of this work. The real project of the book may not be immediately divined from its soaring prose and ripping action scenes, yet it was shaped by the calamitous events at the Capitol on 6 January 2021. ‘I had the eerie feeling that present and past had merged,’ Larson writes in his foreword. It seemed to him as though America was once again in danger of slipping loose from its ideological moorings.

The Demon of Unrest is Larson’s attempt to call the country back to its senses. The book is not so much history in the traditional Thucydidean manner of causes and events, but rather a political argument posing as history in the manner of Xenophon, the father of popular narrative history. It is a full-throated defence of democratic values, individual agency and the power of collective action. Why read it? Because to understand the meaning of freedom for others is to know it in ourselves.

The Wall Street Journal: What I Learned on My Summer Vacation

Don’t Dream of Escaping the Diaper Pail

Travel can force us to have sudden realizations. Five writers share one thing they discovered on a trip.

Junot Diaz found out how it feels to stand out in “a very American place.” Amanda Foreman proved that a car can smell too bad to steal. Kevin D. Williamson realized there are no good bears. Andrew Rannells found the limit to “me” time. And Joanne Lipman saw a vacation change her son’s life.

The twins’ fourth birthday was marked by the retirement of the diaper pail. Five children born in quick succession had kept it in continuous service. It had been my friend and savior during the circle-of-poo years but also an unyielding and tyrannical master. I wasn’t sorry to say goodbye. Giddy from our hard-won freedom, my husband and I threw caution to the wind that summer and took the family to Italy, a country we had last visited as newlyweds.

Illustrations by Ben Kirchner

Driving to our Umbrian B&B, we spotted the same restaurant we had passed 10 years earlier, the T and V of “TAVERNA” still flickering. Fate was surely calling out to us.

We needn’t have listened, and after visiting the bathroom, I wished we hadn’t. The ground-level squatting nonplussed the children. They could squat easily enough; the tricky part was not stepping or falling into the basin afterward.

“Why are the kids naked?” my husband asked when we returned. I handed him a bulky garbage bag. “Just like old times,” he commented, and tossed it in the back of the van. We drove the rest of the way with the windows down and the air conditioning on high.

“I feel sick,” said one of the twins, and vomited into my Kate Spade bag right as we pulled into the driveway. My husband grabbed my hand. “Don’t look,” he said, “it won’t help.” He zipped the bag shut and tossed it over his shoulder.

The village was tiny and we could forget about the car for two weeks. When it was time to set off for the airport, my husband opened the back and was almost overpowered by the force of escaping gases. We peered cautiously inside; the Kate Spade lay on its side, shrunken and inert. The clothes bag, on the other hand, seemed to be incubating new life.

I dared him to make first contact. “They’re your bags,” he said. We trapped them under the suitcases, hoping the smell would dissipate once we got going. The AC was turned up, the windows rolled down, and yet the stench grew worse. What else lurked in our van of horrors?

We stopped at a gas station and clambered out, gasping for air. Googling “strong sulfur smell” suggested that the odor might be coming from a broken catalytic converter. “I’ll trade you two flat tires instead,” I told the car. Our pantomime with the Italian garage mechanic was interrupted by the children jumping and pointing, as two men got into our van. Seconds later it was gone.

Illustrations by Ben Kirchner

The mechanic sweetly summoned a taxi to take us to the police station. We had been driving for 10 minutes when I caught sight of our luggage. The thieves had chucked the bags out of the van in ones and twos, turning the road into a giant slalom. The driver took the course at speed but clipped the Kate Spade and flattened the clothes bag.

He came to a hard stop next to our abandoned car. The doors were open and the engine still running.

Grim-faced and silent, we rescued what we could of our belongings and bundled everyone back in. Two excruciating hours later, the flickering “AERNA” sign appeared. We had come full circle. I caught my husband’s eye and felt the car speed up.

Historically Speaking: The Drama of Finding Lost Cities

We are always a discovery away from rewriting ancient history.

The Wall Street Journal

March 21, 2024

In January, scientists announced the discovery of the oldest pre-Columbian cities in the Amazon. They were built in the Upano Valley in Ecuador by an organized urban society that flourished between 800 B.C. and 600 A.D. Surprisingly, this lost Amazonian civilization created cities that resemble modern suburbs, complete with low-density housing, ample green space and manicured road networks. It would seem that ancient urban complexity didn’t just arise in the familiar model of the Eurasian walled city.

ILLUSTRATION: THOMAS FUCHS

This kind of challenge to the scholarly consensus is actually fairly common with the discovery of lost cities. When civilizations are erased from history, bringing them back into the narrative can lead to a dramatic realignment of facts.

The discovery of Troy is an early example of this phenomenon. In 1870, after 12 years of fruitless searching, a German businessman and amateur archaeologist named Heinrich Schliemann found Troy’s ruins near modern Hisarlik on the Turkish Aegean coast. His careless excavation of the site makes modern archaeologists cringe, but against incredible odds Schliemann proved that Homer’s Iliad was based on real events. The study of ancient Greece was changed forever.

Three major discoveries in the early 1900s also reshaped our understanding of the Bronze Age. In 1906, German archaeologists digging through the ruins of a windswept plateau in north-central Turkey uncovered the forgotten city of Hattusa. Its royal archive confirmed that the city had been the last capital of the Hittites and the nerve center of a powerful empire that rivaled Egypt. After a glorious run of five hundred years, the Hittite civilization disintegrated so completely by around 1200 B.C. that little was known about it beyond the mentions in the Bible. The recovery of Hattusa redrew the map of the ancient world, this time with the Hittites at the center.

Across the Aegean, Sir Arthur Evans similarly rescued the Minoans with his discovery around 1905 of the Palace of Knossos on Crete, which flourished between 1700 and 1500 B.C. Evans’s breakthrough was to show that Minoan culture was nonviolent, unlike the other ancient Greek societies known at the time. It was also perhaps matriarchal. The palace, which lacked fortifications, had frescoes of women in authoritative poses. Recovered sculptures often featured goddesses. Evans ultimately argued that Knossos was proof that gender inequality was not, as generally believed, intrinsic to civilization.

The discovery in 1911 of Mohenjo-daro in present-day Pakistan was no less revelatory. This once-vital center of the Indus Valley Civilization, a trading society that thrived between 2500 and 1700 B.C., notably lacked temples, palaces, noble houses or even rich or poor neighborhoods. The city seems to have spent its wealth instead on civic amenities, such as public granaries and universal plumbing.

The egalitarianism at the heart of Mohenjo-daro suggests that the more cities and civilizations we find, the more it will complicate our understanding of human society’s origins. It is humbling to know that we are always a discovery away from rewriting ancient history. It’s invigorating, too.

Essay: Princess Kate Isn’t Kim Kardashian

The public backlash against an altered photo of the royal family shows that, even in the age of social media, royalty is supposed to mean more than just celebrity.

The Wall Street Journal

March 15, 2024

There I was on CBS News at the beginning of March, as a commentator on the royals, pooh-poohing fears that the public was being deceived. The Princess of Wales’s three-month absence from public life following her surgery in January was generating far too much hysteria.

Yes, it was a little strange that Prince William had suddenly withdrawn from a scheduled public appearance citing a “personal matter.” And there was no denying that the two royal households were not in sync.

Buckingham Palace’s willingness to share photographs of the cancer-stricken King Charles III made the news blackout on Princess Catherine look suspicious. But Kensington Palace, the residence of the Waleses, had been quite clear from the outset that it wouldn’t be issuing any updates until her convalescence was over. Trust me, I had said, the real issue here is the fact we are paying too much attention to baseless rumors.

Two weeks later, where-is-Kate-gate had turned into photogate. The world had been clamoring for a picture to show that she wasn’t in a coma, on a beach filing for divorce or recovering from a Brazilian Butt Lift, as some speculated on social media. What it got on March 10, Mother’s Day in Britain, was an Adobe Photoshop special in the worst way.

The charming family portrait of Kate sitting on a bench, surrounded by her three children, was bound to be scrutinized pixel by pixel. At the very least, it needed to be devoid of any mystery. Instead, the image had been obviously and crudely digitally altered, giving credence to every crackpot theory out there. By being secretive and then looking manipulative, the Waleses had created a public relations catastrophe. And they had damaged their “brand,” which depends crucially on how their public role transcends that of mere celebrities.

For the record, I don’t have any inside information about Kate’s surgery. The Germans think it was diverticulitis, the British assume it was a hysterectomy. But leaving aside the knotty question of privacy, the royals’ junior varsity team needs a reset. Even if the fiasco was the result of a mislabeled file, or a key member of the comms office bunking off early on a Friday, William and Kate are playing like amateurs in a pro sport that kills its losers.

But the bigger issue exposed by photogate isn’t a “them” problem at all, it’s an “us” problem. Americans are behaving as though Kate cheated on them. The betrayal! She has done what no one else in this country has ever managed to do and united the conservative and liberal media around a cause they are both passionate about: the evils of royal photoshopping. It’s a breach of trust, don’t you know, and Americans won’t stand for it.

Lest we forget, in 1953 the newly crowned Elizabeth II deep-faked her official coronation photograph, which remains one of the most famous images of the 20th century. It was taken after she had left Westminster Abbey and returned to Buckingham Palace. The photographer Cecil Beaton sat her in front of a painted backdrop of the Abbey’s interior so he could make it look all dreamy and romantic, gloomy gothic not being his style.

No one accused Queen Elizabeth of scamming the public. But for Kate there’s no mercy. She has been “outed” as a fake. If people discovered that Kim Kardashian, the “Queen of Reality TV,” was actually a composite character played by identical triplets, I doubt it would cause this much outrage.

A print of Queen Elizabeth II’s 1953 coronation photo, which was taken in front of a painted backdrop, not at Westminster Abbey. PHOTO: HERITAGE IMAGES/GETTY IMAGES

In fact, it isn’t far-fetched to compare the Windsors and the Kardashians as living megabrands, family enterprises whose presence on social media and ability to sway public opinion are totally out of proportion to their actual physical footprint. But however much they may resemble each other in their reach, they are opposites in terms of what they represent, and that may be why the response to Kate’s doctored photo was so visceral.

Over the years, Kim Kardashian has tried not just to emulate Kate but to be the American equivalent. Whether it was delusional thinking or simply delusions of grandeur, the reality TV star modeled her own 2011 wedding to the NBA player Kris Humphries after William and Kate’s, even going so far as to order a similar cake at the reception. There were only four months between the two weddings, so comparisons between the two were unavoidable. The press played along, sort of, variously referring to the Kardashian-Humphries $10 million extravaganza as “America’s Royal Wedding” and the “royal-ish wedding.”

Both weddings catapulted the brides into even greater stardom. But let us not forget the very different ways they have acquired and used their fame. In the age of social media and AI, the key differences between the two family brands hinge on what we now mean by “real” and “authentic” and “genuine.”

Kim Kardashian owns a global business empire, has 364.4 million followers on Instagram and still has her own reality TV show, renamed “The Kardashians.” She makes money playing a fake version of herself that people are willing to accept as “real” without caring which bits might be true and which are faked.

By contrast, Kate works for a nonprofit that pays the Waleses comparatively little and prohibits conspicuous displays of luxury. Her shared Instagram account with William has a mere 15.2 million followers. Although they, too, have a long-running TV show about their family, with their early romance concluding the most recent season, “The Crown” is too unflattering to be a vehicle for self-promotion.

Kate has no say or ownership in her show and has to watch an actor play a fake version of herself that people know isn’t real but nevertheless believe to be true on some level. She is a permanent resident inside a hall of mirrors that reflects two things: ratings and popularity.

But her job is of an older sort: Kings and queens have no career path or rank to attain. They exist in a different category from the rich, the famous and the beneficiaries of nepotism. The monarch can only be sui generis.

Over the years, the British monarchy has learned how to present itself as the embodiment of duty and public service. The royals are buoyed by the timeless values that define them. Queen Elizabeth II was authentic to a fault. The public’s grief at her passing reflects how deeply that quality still resonates.

Social media has trapped us all in a very different world, especially in the U.S., where public opinion has always been the ultimate power. It’s what caused Alexis de Tocqueville the most disquiet when he visited America in 1831. “It acts upon the will as well as upon the actions of men,” he wrote in “Democracy in America.” But when public opinion is the final arbiter of everything, where does that leave authenticity and the values it safeguards?

Fear that the royals are squandering the values and credibility of the monarchy is what provoked the internet-wide howl of rage over Kate’s photograph. I now realize why people rushed to fill the silence surrounding the Waleses with outrageous speculation. Scandal is somehow preferable to seeing Princess Catherine reduced to Kim Kardashian.

Historically Speaking: Aspirin, a Pioneering Wonder Drug

The winding, millennia-long route from bark to Bayer.

The Wall Street Journal

February 1, 2024

For ages the most reliable medical advice was also the most simple: Take two aspirin and call me in the morning. This cheap pain reliever, which also thins blood and reduces inflammation, has been a medicine cabinet staple ever since it became available over the counter nearly 110 years ago.

Willow bark, a distant ancestor of aspirin, was a popular ingredient in ancient remedies to relieve pain and treat skin problems. Hippocrates, the father of medicine, was a firm believer in willow’s curative powers. For women with gynecological troubles in the fourth century B.C., he advised burning the leaves “until the steam enters the womb.”

That willow bark could reduce fevers wasn’t discovered until the 18th century. Edward Stone, an English clergyman, noticed its extremely bitter taste was similar to that of the cinchona tree, the source of the costly malaria drug quinine. Stone dried the bark and dosed himself to treat a fever. When he felt better, he tested the powder on others suffering from “ague,” or malaria. When their fevers disappeared, he reported triumphantly to the Royal Society in 1763 that he had found another malaria cure. In fact, he had identified a way to treat its symptoms.

Willows contain salicin, a plant hormone with anti-inflammatory, fever-reducing and pain-relieving properties. Experiments with salicin, and its byproduct salicylic acid, began in earnest in Europe in the 1820s. In 1853 Charles Frédéric Gerhardt, a French chemist, discovered how to create acetylsalicylic acid, the active ingredient in aspirin, but then abandoned his research and died young.

There is some debate over how aspirin became a blockbuster drug for the German company Bayer. Its official history credits Felix Hoffmann, a Bayer chemist, with synthesizing acetylsalicylic acid in 1897 in the hopes of alleviating his father’s severe rheumatic pain. Bayer patented aspirin in 1899 and by 1918 it had become one of the most widely used drugs in the world.

ILLUSTRATION: THOMAS FUCHS

But did Hoffmann work alone? Shortly before his death in 1949, Arthur Eichengrün, a Jewish chemist who had spent World War II in a concentration camp, published a paper claiming that Bayer had erased his contribution. In 2000 the BMJ published a study supporting Eichengrün’s claim. Bayer, which became part of the Nazi-backing conglomerate I.G. Farben in 1925, has denied that Eichengrün had a role in the breakthrough.

Aspirin shed its associations with the Third Reich after I.G. Farben sold off Bayer in the early 1950s, but the drug’s pain-relieving hegemony was fleeting. In 1956, Bayer’s British affiliate brought acetaminophen to market. Ibuprofen became available in 1962.

The drug’s fortunes recovered after the New England Journal of Medicine published a study in 1989 that found the pill reduced the risk of a heart attack by 44%. Some public-health officials promptly encouraged anyone over 50 to take a daily aspirin as a preventive measure.

But as was the case with Rev. Stone, it seems the science is more complicated. In 2022 the U.S. Preventive Services Task Force officially advised against taking the drug prophylactically, given the risk of internal bleeding and the availability of other therapies. Aspirin may work wonders, but it can’t work miracles.

Historically Speaking: Our Fraught Love Affair With Cannabis

Ban it? Tax it? Humans have been hounded by these questions for millennia.

The Wall Street Journal

January 19, 2024

Ohio’s new marijuana law marks a watershed moment in the decriminalization of cannabis: more than half of Americans now live in places where recreational marijuana is legal. It is a profound shift, but only the latest twist in the long and winding saga of society’s relationship with pot.

Humans first domesticated Cannabis sativa around 12,000 years ago in Central and East Asia as hemp, mostly for rope and other textiles. Later, some adventurous forebears found more interesting uses. In 2008, archaeologists in northwestern China discovered almost 800 grams of dried cannabis containing high levels of THC, the psychoactive ingredient in marijuana, among the burial items of a seventh-century B.C. shaman.

The Greeks and Romans used cannabis for hemp, medicine and possibly religious purposes, but the plant was never as pervasive in the classical world as it was in ancient India. Cannabis indica, the sacred plant of the god Shiva, was revered for its ability to relieve physical suffering and bring spiritual enlightenment to the holy.

Cannabis gradually spread across the Middle East in the form of hashish, which is smoked or eaten. The first drug laws were enacted by Islamic rulers who feared their subjects wanted to do little else. The sultan al-Zahir Baybars banned hashish cultivation and consumption in Egypt in 1266. When that failed, a successor tried taxing hashish instead in 1279. This filled local coffers, but consumption levels soared and the ban was restored.

The march of cannabis continued unabated across the old and new worlds, apparently reaching Stratford-upon-Avon by the 16th century. Fragments of some 400-year-old tobacco pipes excavated from Shakespeare’s garden were found to contain cannabis residue. If not the Bard, at least someone in the household was having a good time.

By the 1600s American colonies were cultivating hemp for the shipping trade, using its fibers for rigging and sails. George Washington and Thomas Jefferson grew cannabis on their Virginia plantations, seemingly unaware of its intoxicating properties.

Veterans of Napoleon’s Egypt campaign brought hashish to France in the early 1800s, where efforts to ban the habit may have enhanced its popularity. Members of the Club des Hashischins, which included Charles Baudelaire, Honoré de Balzac, Alexandre Dumas and Victor Hugo, would meet to compare notes on their respective highs.

ILLUSTRATION: THOMAS FUCHS

Although Queen Victoria’s own physician advocated using cannabis to relieve childbirth and menstrual pains, British lawmakers swung back and forth over whether to tax or ban its cultivation in India.

In the U.S., however, Americans lumped cannabis with the opioid epidemic that followed the Civil War. Early 20th-century politicians further stigmatized the drug by associating it with Black people and Latino immigrants. Congress outlawed nonmedicinal cannabis in 1937, a year after the movie “Reefer Madness” portrayed pot as a corrupting influence on white teenagers.

American views of cannabis have changed since President Nixon declared an all-out War on Drugs more than 50 years ago, yet federal law still classifies the drug alongside heroin. As lawmakers struggle to catch up with the zeitgeist, two things remain certain: Governments are often out of touch with their citizens, and what people want isn’t always what’s good for them.