The Spectator – Towards Zero: the gruesome countdown to the American Civil War

The North and South had been bitterly divided over slavery since the invention of the cotton gin in the 1790s, but the Battle of Fort Sumter in 1861 would prove the point of no return

Some 100,000 books have been written about the American Civil War since it ended in 1865. That’s hardly surprising, given the four-year conflict’s impact on society, and not just because of the immense death toll, which new estimates put as high as 750,000 – more than America’s losses from all its other wars combined. The effusion of blood created a new nation and a new mythology, anchored on the principles of freedom, equality and democracy.

Coloured lithograph of the bombardment of Fort Sumter, Charleston Harbor, 12-13 April 1861. [Lithograph by Currier & Ives, New York, 1861/ Getty Images]

There is not much room in this crowded field for Civil War neophytes. Erik Larson knows what he is about, however, in The Demon of Unrest – but do his critics? The mixed reception this book has received suggests not. As with his previous best-sellers, the author has taken a single event – in this case the Battle of Fort Sumter in Charleston, South Carolina – and used it as a highly effective framing device for the immersive story he wishes to tell.

The actual event was a straightforward one. After holding out against Confederate forces for 108 days, the starving Union garrison relinquished its control of the fort on 13 April 1861. The battle was the point of no return, and although there hadn’t been any fatalities during the 34-hour bombardment leading up to it, a gruesome accident during the 50-gun salute did result in the first death of the war. Neither the Confederate besiegers nor the northern defenders of the fort had any inkling of the hell they were about to unleash on their fellow Americans.

This is not to say they were blindsided by the war. The country had been tearing itself apart over the issue of free vs slave labour ever since the invention of the cotton gin in the 1790s. Once slave-grown cotton could be efficiently cleaned and processed by machine, the South went from being an agricultural backwater to an economic powerhouse exporting four million cotton bales a year.

The production costs were cheap, the supply inexhaustible, and the southern states had a virtual monopoly on the global market. Unlike the northern states, they didn’t need immigration, education or industrialisation to grow rich – just millions of Africans, force-fed and force-bred into perpetual bondage. In the 1830s, southerners began referring to slavery as the ‘peculiar institution’, not because it was wicked and shameful, but on account of it being unique and inseparable from the southern way of life.

If northerners were at all uncertain about the non-negotiability of slavery – unlikely, given the political paralysis it caused in Washington – incidents on the floor of the Senate, such as the Mississippi senator Henry Foote brandishing a loaded revolver and the South Carolina congressman Preston Brooks beating the abolitionist Massachusetts senator Charles Sumner unconscious, helped to clear up any confusion.

International condemnation of slavery also fuelled southern bluster and arrogance. On the eve of the war, Britons were unamused to hear the South Carolina plantation owner James Hammond (rendered in gloriously repulsive detail by Larson) describe England as a vassal state in the southern empire. ‘Cotton is king,’ Hammond raged in a speech in 1858 that made him infamous on both sides of the Atlantic. ‘No power on Earth dares to make war on it.’ No power except the executive power of Abraham Lincoln, it turned out.

The pivotal role that individual action plays in momentous times is a recurring theme in Larson’s books. He is fascinated by two kinds of anti-heroes: the monster with a talent for propelling events, like Hammond, and the decent man whose limitations spur him towards catastrophe, like the professorial William Dodd, America’s ambassador to Germany the year Hitler came to power (In the Garden of Beasts, 2011). A colleague of Dodd’s later recalled he had seldom, if ever, ‘worked with a chief of mission who was more futile and ineffective’.

In The Demon of Unrest, Larson’s anti-heroes are more starkly drawn. The decent men may be doomed to die, like Lincoln, or fail in their objective, like US Major Robert Anderson, the commander of Fort Sumter, but their flaws and limitations render them more, rather than less, admirable.

Lincoln was elected the first Republican president in November 1860 by a majority in the electoral college. But he fell far short of a majority of the popular vote, giving the impression that his victory was an accident. His failure to address this added fuel to the claims of southern fire-eaters that he intended to destroy the South’s economy using high tariffs, ram abolition down their throats and set off a race war between blacks and whites.

Having never visited the Deep South, Lincoln was unaware of how entrenched the secession movement had become or how desperately pro-Union southerners needed support and leadership. Even more damaging for the prospects of peace was the traditional four months’ grace between the election and the inauguration. South Carolina, the state with the noisiest secessionists and the longest history of secession attempts, voted to become an ‘independent commonwealth’ on 20 December 1860. Half a dozen more followed soon afterwards, yet Lincoln was still in Illinois in early February when the seven announced the formation of the Confederate States of America under President Jefferson Davis.

Lincoln only settled into Washington at the end of February, by which time another four states were preparing to secede, bringing the final total to 11. At his inauguration on 4 March, his belated attempt to cool secession fever by insisting he would defend the Union to his last breath while promising that slavery was safe in his hands alienated everyone. Southerners were convinced he was lying, even as abolitionists hoped that he was.

Initially, the majority of Lincoln’s cabinet felt certain he was not up to the job. Several tried to sideline him, sowing chaos among the already confused attempts to prevent disunion. Lincoln never wavered, however, from his position that slavery was negotiable but Federal authority was not. Only two southern forts of any consequence were still in Union hands by the time he took office: Sumter in Charleston and Fort Pickens in Pensacola, Florida. Back in December, John Floyd, Buchanan’s secessionist secretary of war, believed he had picked ‘one of us’ when he assigned a middle-aged southern officer from a slave-owning family to oversee Charleston’s fortifications. Major Robert Anderson’s rise up the ranks had stalled, despite an exemplary service record and a severe wound received in action during the Mexican-American War. He was teaching cadets at West Point when Floyd recalled him. It was not uncommon for such men to seek compensation through treachery. Anderson was the exception.

South Carolina’s secession was meant to have been the signal for Anderson to stand aside or evacuate his position. He did, slipping out under cover of darkness with a few dozen troops; but only to take up a stronger one. Although still under construction, Fort Sumter was the biggest of Charleston’s three forts. Anderson didn’t care about the rights or wrongs of slavery, nor did he care much about politics, but his honour and duty were sacred to him. To the indignation and fury of his now former brother officers, he made it clear that he would protect the Union flag flying over Sumter until he ran out of food or Confederate forces overwhelmed him.

In Larson’s dramatic rendering of the countdown to war, Lincoln was the one who cocked the starting gun by insisting that Major Anderson be resupplied, knowing it would provoke the Confederates; by engaging in a battle he knew he would lose, Anderson was the one who fired it. There’s an unmistakable aura of Greek tragedy to these men in The Demon of Unrest. They are reluctant heroes, forced to act the way they do because they are incapable of behaving in any other way.

It’s history as a form of catharsis, which leads to the deeper purpose of this work. The real project of the book may not be immediately divined from its soaring prose and ripping action scenes, yet it was shaped by the calamitous events at the Capitol on 6 January 2021. ‘I had the eerie feeling that present and past had merged,’ Larson writes in his foreword. It seemed to him as though America was once again in danger of slipping loose from its ideological moorings.

The Demon of Unrest is Larson’s attempt to call the country back to its senses. The book is not so much history in the traditional Thucydidean manner of causes and events, but rather a political argument posing as history in the manner of Xenophon, the father of popular narrative history. It is a full-throated defence of democratic values, individual agency and the power of collective action. Why read it? Because to understand the meaning of freedom for others is to know it in ourselves.

Historically Speaking: The Drama of Finding Lost Cities

We are always a discovery away from rewriting ancient history.

The Wall Street Journal

March 21, 2024

In January, scientists announced the discovery of the oldest pre-Columbian cities in the Amazon. They were built in the Upano Valley in Ecuador by an organized urban society that flourished between 800 B.C. and 600 A.D. Surprisingly, this lost Amazonian civilization created cities that resemble modern suburbs, complete with low-density housing, ample green space and manicured road networks. It would seem that ancient urban complexity didn’t just arise in the familiar model of the Eurasian walled city.

This kind of challenge to the scholarly consensus is actually fairly common with the discovery of lost cities. When civilizations are erased from history, bringing them back into the narrative can lead to a dramatic realignment of facts.

The discovery of Troy is an early example of this phenomenon. In 1870, after 12 years of fruitless searching, a German businessman and amateur archaeologist named Heinrich Schliemann found Troy’s ruins near modern Hisarlik on the Turkish Aegean coast. His careless excavation of the site makes modern archaeologists cringe, but against incredible odds Schliemann proved that Homer’s Iliad was based on real events. The study of ancient Greece was changed forever.

Three major discoveries in the early 1900s also reshaped our understanding of the Bronze Age. In 1906, German archaeologists digging through the ruins of a windswept plateau in north-central Turkey uncovered the forgotten city of Hattusa. Its royal archive confirmed that the city had been the last capital of the Hittites and the nerve center of a powerful empire that rivaled Egypt. After a glorious run of five hundred years, the Hittite civilization disintegrated so completely by around 1200 B.C. that little was known about it beyond the mentions in the Bible. The recovery of Hattusa redrew the map of the ancient world, this time with the Hittites at the center.

Across the Aegean, Sir Arthur Evans similarly rescued the Minoans with his discovery in 1900 of the Palace of Knossos on Crete, which flourished between 1700 and 1500 B.C. Evans’s breakthrough was to show that Minoan culture was nonviolent, unlike the other ancient Greek societies known at the time. It was also perhaps matriarchal. The palace, which lacked fortifications, had frescoes of women in authoritative poses. Recovered sculptures often featured goddesses. Evans ultimately argued that Knossos was proof that gender inequality was not, as generally believed, intrinsic to civilization.

The discovery in 1911 of Mohenjo-daro in present-day Pakistan was no less revelatory. This once-vital center of the Indus Valley Civilization, a trading society that thrived between 2500 and 1700 B.C., notably lacked temples, palaces, noble houses or even rich or poor neighborhoods. The city seems to have spent its wealth instead on civic amenities, such as public granaries and universal plumbing.

The egalitarianism at the heart of Mohenjo-daro suggests that the more cities and civilizations we find, the more complicated our understanding of human society’s origins will become. It is humbling to know that we are always a discovery away from rewriting ancient history. It’s invigorating, too.

Historically Speaking: Aspirin, a Pioneering Wonder Drug

The winding, millennia-long route from bark to Bayer.

The Wall Street Journal

February 1, 2024

For ages the most reliable medical advice was also the most simple: Take two aspirin and call me in the morning. This cheap pain reliever, which also thins blood and reduces inflammation, has been a medicine cabinet staple ever since it became available over the counter nearly 110 years ago.

Willow bark, a distant ancestor of aspirin, was a popular ingredient in ancient remedies to relieve pain and treat skin problems. Hippocrates, the father of medicine, was a firm believer in willow’s curative powers. For women with gynecological troubles in the fourth century B.C., he advised burning the leaves “until the steam enters the womb.”

That willow bark could reduce fevers wasn’t discovered until the 18th century. Edward Stone, an English clergyman, noticed its extremely bitter taste was similar to that of the cinchona tree, the source of the costly malaria drug quinine. Stone dried the bark and dosed himself to treat a fever. When he felt better, he tested the powder on others suffering from “ague,” or malaria. When their fevers disappeared, he reported triumphantly to the Royal Society in 1763 that he had found another malaria cure. In fact, he had identified a way to treat its symptoms.

Willows contain salicin, a compound with anti-inflammatory, fever-reducing and pain-relieving properties. Experiments with salicin, and its derivative salicylic acid, began in earnest in Europe in the 1820s. In 1853 Charles Frédéric Gerhardt, a French chemist, discovered how to create acetylsalicylic acid, the active ingredient in aspirin, but then abandoned his research and died young.

There is some debate over how aspirin became a blockbuster drug for the German company Bayer. Its official history credits Felix Hoffmann, a Bayer chemist, with synthesizing acetylsalicylic acid in 1897 in the hopes of alleviating his father’s severe rheumatic pain. Bayer patented aspirin in 1899 and by 1918 it had become one of the most widely used drugs in the world.

But did Hoffmann work alone? Shortly before his death in 1949, Arthur Eichengrün, a Jewish chemist who had been imprisoned in a concentration camp during World War II, published a paper claiming that Bayer had erased his contribution. In 2000 the BMJ published a study supporting Eichengrün’s claim. Bayer, which became part of the Nazi-backing conglomerate I.G. Farben in 1925, has denied that Eichengrün had a role in the breakthrough.

Aspirin shed its associations with the Third Reich after I.G. Farben sold off Bayer in the early 1950s, but the drug’s pain-relieving hegemony was fleeting. In 1956 Bayer’s British affiliate brought acetaminophen to the market, and ibuprofen followed in the 1960s.

The drug’s fortunes recovered after the New England Journal of Medicine published a study in 1989 that found the pill reduced the risk of a heart attack by 44%. Some public-health officials promptly encouraged anyone over 50 to take a daily aspirin as a preventive measure.

But as was the case with Rev. Stone, it seems the science is more complicated. In 2022 the U.S. Preventive Services Task Force officially advised against taking the drug prophylactically, given the risk of internal bleeding and the availability of other therapies. Aspirin may work wonders, but it can’t work miracles.

Historically Speaking: Our Fraught Love Affair With Cannabis

Ban it? Tax it? Humans have been hounded by these questions for millennia.

The Wall Street Journal

January 19, 2024

Ohio’s new marijuana law marks a watershed moment in the decriminalization of cannabis: more than half of Americans now live in places where recreational marijuana is legal. It is a profound shift, but only the latest twist in the long and winding saga of society’s relationship with pot.

Humans first domesticated Cannabis sativa around 12,000 years ago in Central and East Asia as hemp, mostly for rope and other textiles. Later, some adventurous forebears found more interesting uses. In 2008, archaeologists in northwestern China discovered almost 800 grams of dried cannabis containing high levels of THC, the psychoactive ingredient in marijuana, among the burial items of a seventh century B.C. shaman.

The Greeks and Romans used cannabis for hemp, medicine and possibly religious purposes, but the plant was never as pervasive in the classical world as it was in ancient India. Cannabis indica, the sacred plant of the god Shiva, was revered for its ability to relieve physical suffering and bring spiritual enlightenment to the holy.

Cannabis gradually spread across the Middle East in the form of hashish, which is smoked or eaten. The first drug laws were enacted by Islamic rulers who feared their subjects wanted to do little else. The Mamluk sultan al-Zahir Baybars banned hashish cultivation and consumption in Egypt in 1266. When that failed, a successor tried taxing hashish instead in 1279. This filled local coffers, but consumption levels soared and the ban was restored.

The march of cannabis continued unabated across the old and new worlds, apparently reaching Stratford-upon-Avon by the 16th century. Fragments of some 400-year-old tobacco pipes excavated from Shakespeare’s garden were found to contain cannabis residue. If not the Bard, at least someone in the household was having a good time.

By the 1600s American colonies were cultivating hemp for the shipping trade, using its fibers for rigging and sails. George Washington and Thomas Jefferson grew cannabis on their Virginia plantations, seemingly unaware of its intoxicating properties.

Veterans of Napoleon’s Egypt campaign brought hashish to France in the early 1800s, where efforts to ban the habit may have enhanced its popularity. Members of the Club des Hashischins, which included Charles Baudelaire, Honoré de Balzac, Alexandre Dumas and Victor Hugo, would meet to compare notes on their respective highs.

Although Queen Victoria’s own physician advocated using cannabis to relieve childbirth and menstrual pains, British lawmakers swung back and forth over whether to tax or ban its cultivation in India.

In the U.S., however, Americans lumped cannabis with the opioid epidemic that followed the Civil War. Early 20th-century politicians further stigmatized the drug by associating it with Black people and Latino immigrants. Congress outlawed nonmedicinal cannabis in 1937, a year after the movie “Reefer Madness” portrayed pot as a corrupting influence on white teenagers.

American views of cannabis have changed since President Nixon declared an all-out War on Drugs more than 50 years ago, yet federal law still classifies the drug alongside heroin. As lawmakers struggle to catch up with the zeitgeist, two things remain certain: Governments are often out of touch with their citizens, and what people want isn’t always what’s good for them.

Historically Speaking: The Hunt for a Better Way to Vote

Despite centuries of innovation, the humble 2,500-year-old ballot box is here to stay.

The Wall Street Journal

January 4, 2024

At least 40 national elections will take place around the world over the next year, with some two billion people going to the polls. Thanks to the 2,500-year-old invention of the ballot box, in most races these votes will actually count and be counted.

Ballot boxes were first used in Athens during the 5th century B.C., but in trials rather than elections. Legal cases were tried before large juries of male citizens and decided by vote. Jurors indicated their verdict by dropping either a marked or unmarked pebble into an urn, which protected them against violence by keeping their decision secret.

The first recorded case of ballot box stuffing also took place in 5th century Athens. To exile an unpopular Athenian via an “ostracism election,” citizens simply had to scratch his name on an ostrakon, a pottery shard, and whoever reached a certain threshold of votes was banished for 10 years. It is believed that Themistocles’s political enemies rigged his ostracism vote in 472 B.C. by distributing pre-etched shards among the voters.

In 139 B.C., the Romans passed a series of voter secrecy laws starting with the Lex Gabinia, which introduced the secret ballot for magistrate elections. A citizen would write his vote on a wax-covered wooden tablet and then deposit it in a wicker basket called a cista. The cistae were so effective at protecting voters from public scrutiny that many senators, including Cicero, regarded the ballot box as an attack on their authority and a dangerous concession to mob rule.

The fact that secret ballots allowed men to vote as they pleased was one reason why King Charles I of England ordered all “balloting boxes” to be taken out of circulation in 1637; they did not return until the 19th century.

Even then, there was strong resistance to ballot boxes in Britain and America on the grounds that they were unmanly: A citizen ought to display his vote, not hide it. In any case, there was nothing special about a 19th-century ballot box except its convenience for stealing or stuffing. One notorious election scam in San Francisco in the 1850s involved a ballot box with a false bottom.

Public outrage over rigged elections in the U.S. led some states to adopt Samuel Jollie’s tamper-proof glass ballot box, which New York first used in 1857. But a transparent design couldn’t prevent these boxes from mysteriously disappearing, nor would it have saved Edgar Allan Poe, who is thought to have died in Baltimore from being “cooped,” a practice where kidnap victims were drugged into docility and made to vote multiple times.

To better guarantee the integrity of elections, New York introduced in 1892 a new machine by Jacob Myers that allowed voters to privately choose candidates by pulling a lever, which dispensed with ballots and ballot boxes. Other inventors quickly improved on the design and by 1900 Jollie’s glass ballot box had become obsolete. By World War II almost every city had switched over to mechanical voting systems, which tallied votes automatically.

The simple ballot box seemed destined to disappear until controversies over machine irregularities in the 2000 presidential election resulted in the Help America Vote Act, which requires all votes to have a paper record. The ballot box still isn’t the perfect shield against fraud. But then, neither is anything else.

Historically Speaking: The Ancient Origins of the Christmas Wreath

Before they became a holiday symbol, wreaths were used to celebrate Egyptian gods, victorious Roman generals and the winter solstice.

On Christmas Eve, 1843, three ghosts visited Ebenezer Scrooge in the Charles Dickens novella “A Christmas Carol” and changed Christmas forever. Dickens is often credited with “inventing” the modern idea of Christmas because he popularized and reinvigorated such beloved traditions as the turkey feast, singing carols and saying “Merry Christmas.”

But one tradition for which he cannot take credit is the ubiquitous Christmas wreath, whose pedigree goes back many centuries, even to pre-Christian times.

Wreaths can be found in almost every ancient culture and were worn or hung for many purposes. In Egypt, participants in the festival of Sokar, a god of the underworld, wore onion wreaths because the vegetable was venerated as a symbol of eternal life. The Greeks awarded laurel wreaths to the winners of competitions because the laurel tree was sacred to Apollo, the god of poetry and athletics. The Romans bestowed them on emperors and victorious generals.

Fixing wreaths and boughs onto doorposts to bring good luck was another widely practiced tradition. As early as the 6th century, Lunar New Year celebrants in central China would decorate their doors with young willow branches, a symbol of immortality and rebirth.

The early Christians did not, as might be assumed, create the Yuletide wreath. That was the pagan Vikings. They celebrated the winter solstice with mistletoe, evergreen wreaths made of holly and ivy, and 12 days of feasting, all of which were subsequently turned into Christian symbols. Holly, for example, has often been equated with the crown of thorns. Perhaps not surprisingly for the people who also introduced the words “knife,” “slaughter” and “berserk,” one Viking custom involved setting the Yuletide wreath on fire in the hope of attracting the sun’s attention.

Across northern Europe, the Norse wreath was eventually absorbed into the Christian calendar as the Advent wreath, symbolizing the four weeks before Christmas. It became part of German culture in 1839, when Johann Hinrich Wichern, a Lutheran pastor in Hamburg, captured the public’s imagination with his enormous Advent wreath. Made with a cartwheel and 24 candles, it was intended to help the children at his orphanage count the days to Christmas.

German immigrants to the U.S. brought the Advent wreath with them. Still, while Americans might accept a candlelit wreath inside the house, door wreaths were rare, especially in former Puritan strongholds such as Boston. The whiff of disapproval hung about until 1935, when Colonial Williamsburg appointed Louise B. Fisher, an underemployed and overqualified professor’s wife, to be its head of flowers and decorations. Inspired by her love of 15th-century Italian art, Fisher allowed her wreath designs to run riot on the excuse that she was only adding fruits and other whimsies that had been available during the Colonial era.

Thousands of visitors saw her designs and returned home to copy them. Like Wichern, she helped inspire a whole generation to be unapologetically joyous with their wreaths. Thanks in part to her efforts, America’s front doors are a thing to behold at this time of year, proving that it’s never too late to pour new energy into an old tradition.

Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forwards. Greek artists used them to indicate a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To give his Santa a far-away but still benign look, Nast drew him in a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: The Enduring Allure of a Close Shave

Men may be avoiding their razors for ‘Movember,’ but getting rid of facial and other body hair goes back millennia in many different cultures

November is a tough time for razors. Huge numbers of them haven’t seen their owners since the start of “Movember,” the annual no-shave fundraiser for men’s health. But the razors needn’t fear: The urge of men to express themselves by trimming and removing their hair, facial and otherwise, runs deep.

Although no two cultures share the exact same attitude toward shaving, it has excited strong feelings in every time and place. The act is redolent with symbolism, especially religious and sexual. Even Paleolithic Man shaved his hair on occasion, using shells or flint blades as a crude razor. Ancient Egypt was the earliest society to make hair removal a way of life for both sexes. By 3000 B.C., the Egyptians were using gold and copper to manufacture razors with handles. The whole head was shaved and covered by a wig, a bald head being a practical solution against head lice and overheating. Hair’s association with body odor and poor hygiene also made its absence a sign of purity.

Around 1900 B.C., the Egyptians started experimenting with depilatory pastes made of beeswax and boiled caramel. The methods were so efficacious that many of them are still used today. The only parts never shaved were the eyebrows, except when mourning the death of a cat, a sacred animal in Egyptian culture. The Greco-Roman world adopted the shaved look following Alexander the Great, who thought beards were a liability in battle, since they could be grabbed and pulled. The Romans took their cue from him.

But they also shared the Egyptian obsession with body hair. The Roman fetish for tweezing created professional hair pluckers. The philosopher Seneca, who lived next door to a public bath, complained bitterly about the loud screaming from plucking sessions, which intruded upon the serenity he required for deep thoughts. Both pluckers and barbers disappeared during Rome’s decline.

Professional barbers only reappeared in significant numbers after 1163 when the Council of Tours banned the clergy, who often provided this service, from shedding blood of any kind. The skills of barber-surgeons improved, unlike their utensils, which hardly changed at all until the English invented the foldable straight razor in the late 17th century.

The innovation prompted a “safety” race between English and French blade manufacturers. The French surged ahead in the late 18th century with the Perret razor, which had protective guards on three sides. The English responded with the T-shaped razor patented by William Henson in 1847. In 1875, the U.S. leapfrogged its European rivals with electrolysis, still the best way to remove hair permanently.

Nevertheless, the holy grail of a safe, self-shaving experience remained out of reach until King Camp Gillette, a traveling salesman from New York, worked with an engineer to produce the first disposable, double-edged razor blade. His Gillette razor went on sale in 1903. By the end of World War I, he was selling millions—and not just to men.

Gillette seized on the fashion for sleeveless summer dresses to market the Milady Décolleté Gillette Razor for the “smooth underarm” look. The campaign to convince women to use a traditionally male tool was helped by a series of scandals in the 1920s and 30s over tainted hair products. The worst involved Koremlu, a depilatory cream that was largely rat poison.

The razor may be at least 50,000 years old, but it remains an essential tool. And it’s a great stocking stuffer.

Historically Speaking: Marriage as a Mirror of Human Nature

From sacred ritual to declining institution, wedlock has always reflected our ideas about liberty and commitment.

The Wall Street Journal

October 26, 2023

Marriage is in decline in almost every part of the world. In the U.S., the marriage rate is roughly six per 1,000 people, a fall of nearly 60% since the 1970s. But this is still high compared with most of the highly developed countries in the Organization for Economic Cooperation and Development, where the average marriage rate has dropped below four per 1,000. Modern views on marriage are sharply divided: In a recent poll, two in five young adult Americans said that the institution has outlived its usefulness.

The earliest civilizations had no such thoughts. Marriage was an inseparable part of the religious and secular life of society. In Mesopotamian mythology, the first marriage was the heavenly union between Inanna/Ishtar, the goddess of war and love, and her human lover, the shepherd Dumuzi. Each year, the high point of the religious calendar was the symbolic re-enactment of the Sacred Marriage Rite by the king and the high priestess of the city.

Throughout the ancient world, marriage placed extra constraints on women while allowing polygamy for men. The first major change to the institution took place in ancient Greece. A marriage between one man and one woman, with no others involved, became the bedrock of democratic states. According to Athenian law, only the son of two married citizens could inherit the rights of citizenship. The change altered the definition of marriage to give it a civic purpose, although women’s subordination remained unchanged.

At the end of the 1st century B.C., Augustus Caesar, the founder of the Roman Empire, tried to use the law to reinvigorate “traditional” marriage values. But it was the Stoic philosophers who had the greatest impact on ideas about marriage, teaching that its purpose included personal fulfillment. The 1st-century philosopher Musonius Rufus argued that love and companionship weren’t just incidental benefits but major purposes of marriage.

The early Church’s general hostility toward sex did away with such views. Matrimony was considered less desirable than celibacy; priests didn’t start officiating at wedding ceremonies until the 800s. On the other hand, during the 12th century the Catholic Church made marriage one of the seven unbreakable sacraments. In the 16th century, its intransigence on divorce resulted in King Henry VIII establishing the Anglican Church so he could leave Catherine of Aragon and marry Anne Boleyn.

In the U.S. after the Civil War, thousands of former slaves applied for marriage certificates from the Freedmen’s Bureau. Concurrently, between 1867 and 1886, there were 328,716 divorces among all Americans. The simultaneous moves by some to escape the bonds of matrimony, and by others to have the right to claim it, highlight the institution’s peculiar place in our ideas of individual liberty.

In 1920, female suffrage transformed the nature of marriage yet again, implicitly recognizing the right of wives to a separate legal identity. Still, the institution survived and even thrived. At the height of World War II in 1942, weddings were up 83% from the previous decade.

Though marriage symbolizes stability, its meaning is unstable. It doesn’t date or fall behind; for better or worse, it simply reflects who we are.

Historically Speaking: Sending Cards for a Happy Birthday

On Oct. 26, imprisoned WSJ reporter Evan Gershkovich will turn 32. Since ancient times, birthdays have been occasions for poems, letters and expressions of solidarity.

The Wall Street Journal

October 13, 2023

Wall Street Journal reporter Evan Gershkovich turns 32 on Oct. 26. This year he will be spending his birthday in Lefortovo prison in Moscow, a detention center for high-profile and political prisoners. He has been there for the past six months, accused of espionage—a charge vehemently denied by the U.S. government and The Journal.

Despite the extreme restrictions placed on Lefortovo prisoners, it is still possible to send Evan messages of support, such as a birthday card, via the U.S. Embassy in Moscow or the freegershkovich.com website, to let him know that the world cares.

Birthday cards are so cheap and plentiful it is easy to miss their cultural value. They are the modern iteration of a literary tradition that goes back at least to Roman times. Poets were especially given to composing birthday odes to their friends and patrons. The Augustan poet Horace dedicated many of his poems to Maecenas, whose birthday, he wrote, “is almost more sacred to me than that of my own birth.”

The custom of birthday salutations petered out along with much else during the Dark Ages but was revived with the spread of mass literacy. Jane Austen would write to her siblings on their birthdays, wishing them the customary “joy,” but toward the end of her life she began to experiment with the form. In 1817, she sent her three-year-old niece Cassy a special birthday letter written in reverse spelling, beginning with “Ym raed Yssac.”

Austen’s sense that a birthday letter ought to be unique coincided with a technological race in the printing industry. One of the first people to realize the commercial potential of greeting cards was Louis Prang, a German immigrant in Boston, who began selling printed cards in 1856. Holiday cards were an instant success, but birthday cards were less popular until World War I, when many American families had a relative fighting overseas.

Demand for birthday cards stayed high after the war, as did the importance attached to them. King George V seized on their popularity to introduce the royal tradition of sending every British citizen who reaches 100 a congratulatory birthday card. In 1926, to show how much they appreciated the gift of U.S. aid, more than 5 million Poles signed a 30,000-page birthday card commemorating America’s 150th anniversary.

During the Cold War, the symbolism of the birthday card became a power in itself. In 1984, Illinois Rep. John Edward Porter and other members of Congress sent birthday cards to Mart Niklus, an Estonian civil rights campaigner imprisoned in the U.S.S.R. By coincidence, the Soviets released Niklus in July 1988, the same month that Nelson Mandela received more than 50,000 cards for his 70th birthday. The frustrated South African prison authorities allowed him to have 12 of them. But the writing was on the wall, as it were, and Mandela was released from prison two years later.

Rep. Porter didn’t know what effect his birthday card to Niklus would have. “I doubt he will get them,” he told the House. “Yet by sending these birthday cards…we let the Soviet officials know that we will not forget him.”

I am sending my birthday card to Evan in the same spirit.