Essay: Princess Kate Isn’t Kim Kardashian

The public backlash against an altered photo of the royal family shows that, even in the age of social media, royalty is supposed to mean more than just celebrity.

The Wall Street Journal

March 15, 2024

There I was on CBS News at the beginning of March, as a commentator on the royals, pooh-poohing fears that the public was being deceived. The Princess of Wales’s three-month absence from public life following her surgery in January was generating far too much hysteria.

Yes, it was a little strange that Prince William had suddenly withdrawn from a scheduled public appearance citing a “personal matter.” And there was no denying that the two royal households were not in sync.

Buckingham Palace’s willingness to share photographs of the cancer-stricken King Charles III made the news blackout on Princess Catherine look suspicious. But Kensington Palace, the residence of the Waleses, had been quite clear from the outset that it wouldn’t be issuing any updates until her convalescence was over. Trust me, I had said, the real issue here is the fact we are paying too much attention to baseless rumors.

Two weeks later, where-is-Kate-gate had turned into photogate. The world had been clamoring for a picture to show that she wasn’t in a coma, on a beach filing for divorce or recovering from a Brazilian Butt Lift, as some speculated on social media. What it got on March 9, Mother’s Day in Britain, was an Adobe Photoshop special in the worst way.

The charming family portrait of Kate sitting on a bench, surrounded by her three children, was bound to be scrutinized pixel by pixel. At the very least, it needed to be devoid of any mystery. Instead, the image had been obviously and crudely digitally altered, giving credence to every crackpot theory out there. By being secretive and then looking manipulative, the Waleses had created a public relations catastrophe. And they had damaged their “brand,” which depends crucially on how their public role transcends that of mere celebrities.

For the record, I don’t have any inside information about Kate’s surgery. The Germans think it was diverticulitis; the British assume it was a hysterectomy. But leaving aside the knotty question of privacy, the royals’ junior varsity team needs a reset. Even if the fiasco was the result of a mislabeled file, or a key member of the comms office bunking off early on a Friday, William and Kate are playing like amateurs in a pro sport that kills its losers.

But the bigger issue exposed by photogate isn’t a “them” problem at all; it’s an “us” problem. Americans are behaving as though Kate cheated on them. The betrayal! She has done what no one else in this country has ever managed to do: unite the conservative and liberal media around a cause they are both passionate about, namely the evils of royal photoshopping. It’s a breach of trust, don’t you know, and Americans won’t stand for it.

Lest we forget, in 1953 the newly crowned Elizabeth II deep-faked her official coronation photograph, which remains one of the most famous images of the 20th century. It was taken after she had left Westminster Abbey and returned to Buckingham Palace. The photographer Cecil Beaton sat her in front of a painted backdrop of the Abbey’s interior so he could make it look all dreamy and romantic, gloomy gothic not being his style.

No one accused Queen Elizabeth of scamming the public. But for Kate there’s no mercy. She has been “outed” as a fake. If people discovered that Kim Kardashian, the “Queen of Reality TV,” was actually a composite character played by identical triplets, I doubt it would cause this much outrage.

A print of Queen Elizabeth II’s 1953 coronation photo, which was taken in front of a painted backdrop, not at Westminster Abbey. PHOTO: HERITAGE IMAGES/GETTY IMAGES

In fact, it isn’t far-fetched to compare the Windsors and the Kardashians as living megabrands, family enterprises whose presence on social media and ability to sway public opinion are totally out of proportion to their actual physical footprint. But however much they may resemble each other in their reach, they are opposites in terms of what they represent, and that may be why the response to Kate’s doctored photo was so visceral.

Over the years, Kim Kardashian has tried not just to emulate Kate but to be the American equivalent. Call it emulation or simply delusions of grandeur: the reality TV star modeled her own 2011 wedding to the NBA player Kris Humphries after William and Kate’s, even going so far as to order a similar cake at the reception. There were only four months between the two weddings, so comparisons were unavoidable. The press played along, sort of, variously referring to the Kardashian-Humphries $10 million extravaganza as “America’s Royal Wedding” and the “royal-ish wedding.”

Both weddings catapulted the brides into even greater stardom. But let us not forget the very different ways they have acquired and used their fame. In the age of social media and AI, the key differences between the two family brands hinge on what we now mean by “real” and “authentic” and “genuine.”

Kim Kardashian owns a global business empire, has 364.4 million followers on Instagram and still has her own reality TV show, renamed “The Kardashians.” She makes money playing a fake version of herself that people are willing to accept as “real” without caring which bits might be true and which are faked.

By contrast, Kate works for a nonprofit that pays the Waleses comparatively little and prohibits conspicuous displays of luxury. Her shared Instagram account with William has a mere 15.2 million followers. Although they, too, have a long-running TV show about their family, with their early romance concluding the most recent season, “The Crown” is too unflattering to be a vehicle for self-promotion.

Kate has no say or ownership in her show and has to watch an actor play a fake version of herself that people know isn’t real but nevertheless believe to be true on some level. She is a permanent resident inside a hall of mirrors that reflects two things: ratings and popularity.

But her job is of an older sort: Kings and queens have no career path or rank to attain. They exist in a different category from the rich, the famous and the beneficiaries of nepotism. The monarch can only be sui generis.

Over the years, the British monarchy has learned how to present itself as the embodiment of duty and public service. The royals are buoyed by the timeless values that define them. Queen Elizabeth II was authentic to a fault. The public’s grief at her passing reflects how deeply that quality still resonates.

Social media has trapped us all in a very different world, especially in the U.S., where public opinion has always been the ultimate power. It’s what caused Alexis de Tocqueville the most disquiet when he visited America in 1831. “It acts upon the will as well as upon the actions of men,” he wrote in “Democracy in America.” But when public opinion is the final arbiter of everything, where does that leave authenticity and the values it safeguards?

Fear that the royals are squandering the values and credibility of the monarchy is what provoked the internet-wide howl of rage over Kate’s photograph. I now realize why people rushed to fill the silence surrounding the Waleses with outrageous speculation. Scandal is somehow preferable to seeing Princess Catherine reduced to Kim Kardashian.

Historically Speaking: Here Comes the Rain Again

Storms have long shaped human destiny, as Californians know all too well.

The Wall Street Journal

February 15, 2024

Given that much of California was suffering a severe drought just two years ago, it might seem ungrateful to complain about too much rain. Yet Californians have already weathered two record-breaking storms this year, and more are expected. The increase in the frequency and strength of these weather events spells trouble for the state. Some worry it is a sign that the “Big One”—a massive once-in-a-millennium storm—is nigh.

A man walks his dog on the edge of the Los Angeles River, Feb. 4.

Scientists think that these storms are growing more severe as a result of climate change, but mythical stories about destructive floods have haunted humans for eons, from the Sumerian Epic of Gilgamesh around 2000 B.C. to the biblical story of Noah and his ark.

Thousands of years of accumulated climate records, combined with modern computing methods, have led to new insights into the role rain has played in shaping human destiny. For example, excessive rain can now be added to the list of reasons behind the collapse of the Roman Empire. Apparently the final decades of the fifth century were unusually wet. Harvests failed and granaries rotted, setting off a cataclysmic chain of famines, wars and mass migration that hastened the empire’s demise.

Abnormal rainfall needn’t spell human disaster. In the early 13th century, 15 consecutive years of unprecedented rainfall turned the barren Mongolian steppe into fertile grassland. The region could finally feed the massive armies that allowed Genghis Khan to pursue his dream of a Mongol empire. But the intensely wet spring of 1242 may have pushed his descendants to abruptly leave Central Europe. The Mongol cavalry could seemingly defeat any foe except the bottomless mud of the Hungarian plain.

A series of engravings made for the first edition of the ‘Liber Genesis,’ 1612

A recurring theme in most Great Flood myths is how destruction can be creative, the washing away of the past being necessary for a redemptive transformation. A real-life example can be seen in Europe’s response to the crisis of 1816—the so-called Year Without a Summer. The 1815 eruption of Mt. Tambora in Indonesia ejected a huge cloud of sulfate gases into the atmosphere and created an unseasonable chill in much of the Northern Hemisphere. Endless rain watered already sodden fields. Communities starved; typhus outbreaks infected millions of people; rioting became endemic.

The sheer scale of the human emergency forced a reconceptualization in Britain and elsewhere of the purpose of government. No longer could it be concerned only with taxes, laws and diplomacy. The modern state must also consider public health, public administration and public responsibility.

California’s Great Flood of 1862, which remains the state’s worst disaster, was another catalyst for change. The storm began in late 1861 and lasted eight weeks. California’s Central Valley became an inland sea. At least 4,000 people died and a quarter of the state’s economy was destroyed. The state government temporarily relocated from Sacramento to San Francisco, which was also partially underwater.

The robust reconstruction efforts afterward marked a shift in attitude. Californians erected new flood defenses and instituted better building regulations. Sacramento rose again, literally 10 feet higher than before. The infrastructure may be up to 150 years old, but it is still doing its job. No matter what the rain brings, the answer isn’t an ark. It is being prepared.

Historically Speaking: Aspirin, a Pioneering Wonder Drug

The winding, millennia-long route from bark to Bayer.

The Wall Street Journal

February 1, 2024

For ages the most reliable medical advice was also the simplest: Take two aspirin and call me in the morning. This cheap pain reliever, which also thins blood and reduces inflammation, has been a medicine cabinet staple ever since it became available over the counter nearly 110 years ago.

Willow bark, a distant ancestor of aspirin, was a popular ingredient in ancient remedies to relieve pain and treat skin problems. Hippocrates, the father of medicine, was a firm believer in willow’s curative powers. For women with gynecological troubles in the fourth century B.C., he advised burning the leaves “until the steam enters the womb.”

That willow bark could reduce fevers wasn’t discovered until the 18th century. Edward Stone, an English clergyman, noticed its extremely bitter taste was similar to that of the cinchona tree, the source of the costly malaria drug quinine. Stone dried the bark and dosed himself to treat a fever. When he felt better, he tested the powder on others suffering from “ague,” or malaria. When their fevers disappeared, he reported triumphantly to the Royal Society in 1763 that he had found another malaria cure. In fact, he had identified a way to treat its symptoms.

Willows contain salicin, a plant hormone with anti-inflammatory, fever-reducing and pain-relieving properties. Experiments with salicin, and its byproduct salicylic acid, began in earnest in Europe in the 1820s. In 1853 Charles Frédéric Gerhardt, a French chemist, discovered how to create acetylsalicylic acid, the active ingredient in aspirin, but then abandoned his research and died young.

There is some debate over how aspirin became a blockbuster drug for the German company Bayer. Its official history credits Felix Hoffmann, a Bayer chemist, with synthesizing acetylsalicylic acid in 1897 in the hopes of alleviating his father’s severe rheumatic pain. Bayer patented aspirin in 1899 and by 1918 it had become one of the most widely used drugs in the world.

ILLUSTRATION: THOMAS FUCHS

But did Hoffmann work alone? Shortly before his death in 1949, Arthur Eichengrün, a Jewish chemist who had spent World War II in a concentration camp, published a paper claiming that Bayer had erased his contribution. In 2000 the BMJ published a study supporting Eichengrün’s claim. Bayer, which became part of the Nazi-backing conglomerate I.G. Farben in 1925, has denied that Eichengrün had a role in the breakthrough.

Aspirin shed its associations with the Third Reich after I.G. Farben sold off Bayer in the early 1950s, but the drug’s pain-relieving hegemony was fleeting. In 1956, Bayer’s British affiliate brought acetaminophen to the market. Ibuprofen became available in 1962.

The drug’s fortunes recovered after the New England Journal of Medicine published a study in 1989 that found the pill reduced the risk of a heart attack by 44%. Some public-health officials promptly encouraged anyone over 50 to take a daily aspirin as a preventive measure.

But as was the case with Rev. Stone, it seems the science is more complicated. In 2022 the U.S. Preventive Services Task Force officially advised against taking the drug prophylactically, given the risk of internal bleeding and the availability of other therapies. Aspirin may work wonders, but it can’t work miracles.

Historically Speaking: Our Fraught Love Affair With Cannabis

Ban it? Tax it? Humans have been hounded by these questions for millennia.

The Wall Street Journal

January 19, 2024

Ohio’s new marijuana law marks a watershed moment in the decriminalization of cannabis: more than half of Americans now live in places where recreational marijuana is legal. It is a profound shift, but only the latest twist in the long and winding saga of society’s relationship with pot.

Humans first domesticated Cannabis sativa around 12,000 years ago in Central and East Asia as hemp, mostly for rope and other textiles. Later, some adventurous forebears found more interesting uses. In 2008, archaeologists in northwestern China discovered almost 800 grams of dried cannabis containing high levels of THC, the psychoactive ingredient in marijuana, among the burial items of a seventh-century B.C. shaman.

The Greeks and Romans used cannabis for hemp, medicine and possibly religious purposes, but the plant was never as pervasive in the classical world as it was in ancient India. Cannabis indica, the sacred plant of the god Shiva, was revered for its ability to relieve physical suffering and bring spiritual enlightenment to the holy.

Cannabis gradually spread across the Middle East in the form of hashish, which is smoked or eaten. The first drug laws were enacted by Islamic rulers who feared their subjects wanted to do little else. The Egyptian sultan al-Zahir Baybars banned hashish cultivation and consumption in 1266. When that failed, a successor tried taxing hashish instead in 1279. This filled local coffers, but consumption levels soared and the ban was restored.

The march of cannabis continued unabated across the Old and New Worlds, apparently reaching Stratford-upon-Avon by the 16th century. Fragments of some 400-year-old tobacco pipes excavated from Shakespeare’s garden were found to contain cannabis residue. If not the Bard, at least someone in the household was having a good time.

By the 1600s, American colonies were cultivating hemp for the shipping trade, using its fibers for rigging and sails. George Washington and Thomas Jefferson grew cannabis on their Virginia plantations, seemingly unaware of its intoxicating properties.

Veterans of Napoleon’s Egypt campaign brought hashish to France in the early 1800s, where efforts to ban the habit may have enhanced its popularity. Members of the Club des Hashischins, which included Charles Baudelaire, Honoré de Balzac, Alexandre Dumas and Victor Hugo, would meet to compare notes on their respective highs.

ILLUSTRATION: THOMAS FUCHS

Although Queen Victoria’s own physician advocated using cannabis to relieve childbirth and menstrual pains, British lawmakers swung back and forth over whether to tax or ban its cultivation in India.

In the U.S., however, Americans lumped cannabis with the opioid epidemic that followed the Civil War. Early 20th-century politicians further stigmatized the drug by associating it with Black people and Latino immigrants. Congress outlawed nonmedicinal cannabis in 1937, a year after the movie “Reefer Madness” portrayed pot as a corrupting influence on white teenagers.

American views of cannabis have changed since President Nixon declared an all-out War on Drugs more than 50 years ago, yet federal law still classifies the drug alongside heroin. As lawmakers struggle to catch up with the zeitgeist, two things remain certain: Governments are often out of touch with their citizens, and what people want isn’t always what’s good for them.

Historically Speaking: The Hunt for a Better Way to Vote

Despite centuries of innovation, the humble 2,500-year-old ballot box is here to stay.

The Wall Street Journal

January 4, 2024

At least 40 national elections will take place around the world over the next year, with some two billion people going to the polls. Thanks to the 2,500-year-old invention of the ballot box, in most races these votes will actually count and be counted.

Ballot boxes were first used in Athens during the 5th century B.C., but in trials rather than elections. Legal cases were tried before a gathering of male citizens, known as the Assembly, and decided by vote. Jurors indicated their verdict by dropping either a marked or unmarked pebble into an urn, which protected them against violence by keeping their decision secret.

The first recorded case of ballot box stuffing also took place in 5th-century Athens. To exile an unpopular Athenian via an “ostracism election,” Assembly voters simply had to scratch his name on an ostrakon, a pottery shard, and whoever reached a certain threshold of votes was banished for 10 years. It is believed that Themistocles’s political enemies rigged his ostracism vote in 472 B.C. by distributing pre-etched shards throughout the Assembly.

In 139 B.C., the Romans passed a series of voter secrecy laws starting with the Lex Gabinia, which introduced the secret ballot for magistrate elections. A citizen would write his vote on a wax-covered wooden tablet and then deposit it in a wicker basket called a cista. The cistae were so effective at protecting voters from public scrutiny that many senators, including Cicero, regarded the ballot box as an attack on their authority and a dangerous concession to mob rule.

ILLUSTRATION: THOMAS FUCHS

The fact that secret ballots allowed men to vote as they pleased was one reason why King Charles I of England ordered all “balloting boxes” to be taken out of circulation in 1637; they did not return until the 19th century.

Even then, there was strong resistance to ballot boxes in Britain and America on the grounds that they were unmanly: A citizen ought to display his vote, not hide it. In any case, there was nothing special about a 19th-century ballot box except its convenience for stealing or stuffing. One notorious election scam in San Francisco in the 1850s involved a ballot box with a false bottom.

Public outrage over rigged elections in the U.S. led some states to adopt Samuel Jollie’s tamper-proof glass ballot box, which New York first used in 1857. But a transparent design couldn’t prevent these boxes from mysteriously disappearing, nor would it have saved Edgar Allan Poe, who is thought to have died in Baltimore from being “cooped,” a practice where kidnap victims were drugged into docility and made to vote multiple times.

To better guarantee the integrity of elections, in 1892 New York introduced a new machine, designed by Jacob Myers, that allowed voters to choose candidates in private by pulling a lever, dispensing with ballots and ballot boxes altogether. Other inventors quickly improved on the design, and by 1900 Jollie’s glass ballot box had become obsolete. By World War II almost every city had switched over to mechanical voting systems, which tallied votes automatically.

The simple ballot box seemed destined to disappear until controversies over machine irregularities in the 2000 presidential election resulted in the Help America Vote Act, which requires all votes to have a paper record. The ballot box still isn’t the perfect shield against fraud. But then, neither is anything else.

Historically Speaking: The Ancient Origins of the Christmas Wreath

Before they became a holiday symbol, wreaths were used to celebrate Egyptian gods, victorious Roman generals and the winter solstice.

On Christmas Eve, 1843, three ghosts visited Ebenezer Scrooge in the Charles Dickens novella “A Christmas Carol” and changed Christmas forever. Dickens is often credited with “inventing” the modern idea of Christmas because he popularized and reinvigorated such beloved traditions as the turkey feast, singing carols and saying “Merry Christmas.”

But one tradition for which he cannot take credit is the ubiquitous Christmas wreath, whose pedigree goes back many centuries, even to pre-Christian times.

ILLUSTRATION: THOMAS FUCHS

Wreaths can be found in almost every ancient culture and were worn or hung for many purposes. In Egypt, participants in the festival of Sokar, a god of the underworld, wore onion wreaths because the vegetable was venerated as a symbol of eternal life. The Greeks awarded laurel wreaths to the winners of competitions because the laurel tree was sacred to Apollo, the god of poetry and athletics. The Romans bestowed them on emperors and victorious generals.

Fixing wreaths and boughs onto doorposts to bring good luck was another widely practiced tradition. As early as the 6th century, Lunar New Year celebrants in central China would decorate their doors with young willow branches, a symbol of immortality and rebirth.

The early Christians did not, as might be assumed, create the Yuletide wreath. That was the pagan Vikings. They celebrated the winter solstice with mistletoe, evergreen wreaths made of holly and ivy, and 12 days of feasting, all of which were subsequently turned into Christian symbols. Holly, for example, has often been equated with the crown of thorns. Perhaps not surprisingly for the people who also introduced the words “knife,” “slaughter” and “berserk,” one Viking custom involved setting the Yuletide wreath on fire in the hope of attracting the sun’s attention.

Across northern Europe, the Norse wreath was eventually absorbed into the Christian calendar as the Advent wreath, symbolizing the four weeks before Christmas. It became part of German culture in 1839, when Johann Hinrich Wichern, a Lutheran pastor in Hamburg, captured the public’s imagination with his enormous Advent wreath. Made with a cartwheel and 24 candles, it was intended to help the children at his orphanage count the days to Christmas.

German immigrants to the U.S. brought the Advent wreath with them. Still, while Americans might accept a candlelit wreath inside the house, door wreaths were rare, especially in former Puritan strongholds such as Boston. The whiff of disapproval hung about until 1935, when Colonial Williamsburg appointed Louise B. Fisher, an underemployed and overqualified professor’s wife, to be its head of flowers and decorations. Inspired by her love of 15th-century Italian art, Fisher allowed her wreath designs to run riot on the excuse that she was only adding fruits and other whimsies that had been available during the Colonial era.

Thousands of visitors saw her designs and returned home to copy them. Like Wichern, she helped inspire a whole generation to be unapologetically joyous with their wreaths. Thanks in part to her efforts, America’s front doors are a thing to behold at this time of year, proving that it’s never too late to pour new energy into an old tradition.

Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forward. Greek artists used them to indicate a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To lend his Santa a faraway but still benign look, he gave him a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: The Enduring Allure of a Close Shave

Men may be avoiding their razors for ‘Movember,’ but getting rid of facial and other body hair goes back millennia in many different cultures.

November is a tough time for razors. Huge numbers of them haven’t seen their owners since the start of “Movember,” the annual no-shave fundraiser for men’s health. But the razors needn’t fear: The urge of men to express themselves by trimming and removing their hair, facial and otherwise, runs deep.

Although no two cultures share the exact same attitude toward shaving, it has excited strong feelings in every time and place. The act is redolent with symbolism, especially religious and sexual. Even Paleolithic Man shaved his hair on occasion, using shells or flint blades as a crude razor. Ancient Egypt was the earliest society to make hair removal a way of life for both sexes. By 3000 B.C., the Egyptians were using gold and copper to manufacture razors with handles. The whole head was shaved and covered by a wig, a bald head being a practical solution against head lice and overheating. Hair’s association with body odor and poor hygiene also made its absence a sign of purity.

Around 1900 B.C., the Egyptians started experimenting with depilatory pastes made of beeswax and boiled caramel. The methods were so efficacious that many of them are still used today. The only parts never shaved were the eyebrows, except when mourning the death of a cat, a sacred animal in Egyptian culture. The Greek world adopted the shaved look following Alexander the Great, who thought beards were a liability in battle, since they could be grabbed and pulled. The Romans took their cue from him.

But they also shared the Egyptian obsession with body hair. The Roman fetish for tweezing created professional hair pluckers. The philosopher Seneca, who lived next door to a public bath, complained bitterly about the loud screaming from plucking sessions, which intruded upon the serenity he required for deep thoughts. Both pluckers and barbers disappeared during Rome’s decline.

Professional barbers only reappeared in significant numbers after 1163 when the Council of Tours banned the clergy, who often provided this service, from shedding blood of any kind. The skills of barber-surgeons improved, unlike their utensils, which hardly changed at all until the English invented the foldable straight razor in the late 17th century.

The innovation prompted a “safety” race between English and French blade manufacturers. The French surged ahead in the late 18th century with the Perret razor, which had protective guards on three sides. The English responded with the T-shaped razor patented by William Henson in 1847. In 1875, the U.S. leapfrogged its European rivals with electrolysis, still the best way to remove hair permanently.

Nevertheless, the holy grail of a safe, self-shaving experience remained out of reach until King Camp Gillette, a traveling salesman from New York, worked with an engineer to produce the first disposable, double-edged razor blade. His Gillette razor went on sale in 1903. By the end of World War I, he was selling millions—and not just to men.

Gillette seized on the fashion for sleeveless summer dresses to market the Milady Décolleté Gillette Razor for the “smooth underarm” look. The campaign to convince women to use a traditionally male tool was helped by a series of scandals in the 1920s and ’30s over tainted hair products. The worst involved Koremlu, a depilatory cream that was largely rat poison.

The razor may be at least 50,000 years old, but it remains an essential tool. And it’s a great stocking stuffer.

Historically Speaking: Marriage as a Mirror of Human Nature

From sacred ritual to declining institution, wedlock has always reflected our ideas about liberty and commitment.

The Wall Street Journal

October 26, 2023

Marriage is in decline in almost every part of the world. In the U.S., the marriage rate is roughly six per 1,000 people, a fall of nearly 60% since the 1970s. But this is still high compared with most of the highly developed countries in the Organization for Economic Cooperation and Development, where the average marriage rate has dropped below four per 1,000. Modern views on marriage are sharply divided: In a recent poll, two in five young adult Americans said that the institution has outlived its usefulness.

History of Marriage. ILLUSTRATION: THOMAS FUCHS

The earliest civilizations had no such thoughts. Marriage was an inseparable part of the religious and secular life of society. In Mesopotamian mythology, the first marriage was the heavenly union between Inanna/Ishtar, the goddess of war and love, and her human lover, the shepherd Dumuzi. Each year, the high point of the religious calendar was the symbolic re-enactment of the Sacred Marriage Rite by the king and the high priestess of the city.

Throughout the ancient world, marriage placed extra constraints on women while allowing polygamy for men. The first major change to the institution took place in ancient Greece. A marriage between one man and one woman, with no others involved, became the bedrock of democratic states. According to Athenian law, only the son of two married citizens could inherit the rights of citizenship. The change altered the definition of marriage to give it a civic purpose, although women’s subordination remained unchanged.

At the end of the 1st century B.C., Augustus Caesar, the founder of the Roman Empire, tried to use the law to reinvigorate “traditional” marriage values. But it was the Stoic philosophers who had the greatest impact on ideas about marriage, teaching that its purpose included personal fulfillment. The 1st-century philosopher Musonius Rufus argued that love and companionship weren’t just incidental benefits but major purposes of marriage.

The early Church’s general hostility toward sex did away with such views. Matrimony was considered less desirable than celibacy; priests didn’t start officiating at wedding ceremonies until the 800s. On the other hand, during the 12th century the Catholic Church made marriage one of the seven unbreakable sacraments. In the 16th century, its intransigence on divorce resulted in King Henry VIII establishing the Anglican Church so he could leave Catherine of Aragon and marry Anne Boleyn.

In the U.S. after the Civil War, thousands of former slaves applied for marriage certificates from the Freedmen’s Bureau. Concurrently, between 1867 and 1886, there were 328,716 divorces among all Americans. The simultaneous moves by some to escape the bonds of matrimony, and by others to have the right to claim it, highlight the institution’s peculiar place in our ideas of individual liberty.

In 1920, female suffrage transformed the nature of marriage yet again, implicitly recognizing the right of wives to a separate legal identity. Still, the institution survived and even thrived. At the height of World War II in 1942, weddings were up 83% from the previous decade.

Though marriage symbolizes stability, its meaning is unstable. It doesn’t date or fall behind; for better or worse, it simply reflects who we are.

Historically Speaking: Sending Cards for a Happy Birthday

On Oct. 26, imprisoned WSJ reporter Evan Gershkovich will turn 32. Since ancient times, birthdays have been occasions for poems, letters and expressions of solidarity.

The Wall Street Journal

October 13, 2023

Wall Street Journal reporter Evan Gershkovich turns 32 on Oct. 26. This year he will be spending his birthday in Lefortovo prison in Moscow, a detention center for high-profile and political prisoners. He has been there for the past six months, accused of espionage—a charge vehemently denied by the U.S. government and The Journal.

Despite the extreme restrictions placed on Lefortovo prisoners, it is still possible to send Evan messages of support, such as a birthday card, via the U.S. Embassy in Moscow or the freegershkovich.com website, to let him know that the world cares.

Birthday cards are so cheap and plentiful it is easy to miss their cultural value. They are the modern iteration of a literary tradition that goes back at least to Roman times. Poets were especially given to composing birthday odes to their friends and patrons. The Augustan poet Horace dedicated many of his poems to Maecenas, whose birthday, he wrote, “is almost more sacred to me than that of my own birth.”

The custom of birthday salutations petered out along with much else during the Dark Ages but was revived with the spread of mass literacy. Jane Austen would write to her siblings on their birthdays, wishing them the customary “joy,” but toward the end of her life she began to experiment with the form. In 1817, she sent her three-year-old niece Cassy a special birthday letter written in reverse spelling, beginning with “Ym raed Yssac.”

Austen’s sense that a birthday letter ought to be unique coincided with a technological race in the printing industry. One of the first people to realize the commercial potential of greeting cards was Louis Prang, a German immigrant in Boston, who began selling printed cards in 1856. Holiday cards were an instant success, but birthday cards were less popular until World War I, when many American families had a relative fighting overseas.

Demand for birthday cards stayed high after the war, as did the importance attached to them. King George V seized on their popularity to introduce the royal tradition of sending every British citizen who reaches 100 a congratulatory birthday card. In 1926, to show how much they appreciated the gift of U.S. aid, more than 5 million Poles signed a 30,000-page birthday card commemorating America’s 150th anniversary.

During the Cold War, the symbolism of the birthday card became a power in itself. In 1984, Illinois Rep. John Edward Porter and other members of Congress sent birthday cards to Mart Niklus, an Estonian civil rights campaigner imprisoned in the U.S.S.R. By coincidence, the Soviets released Niklus in July 1988, the same month that Nelson Mandela received more than 50,000 cards for his 70th birthday. The frustrated South African prison authorities allowed him to have 12 of them. But the writing was on the wall, as it were, and Mandela was released from prison two years later.

Rep. Porter didn’t know what effect his birthday card to Niklus would have. “I doubt he will get them,” he told the House. “Yet by sending these birthday cards…we let the Soviet officials know that we will not forget him.”

I am sending my birthday card to Evan in the same spirit.