Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forward. Greek artists used them to mark a figure as a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To give his Santa a faraway but still benign look, he gave him a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: The Royal Origins of Tennis

The strict etiquette at Wimbledon and other tournaments is a reminder that the sport’s first players were French kings and aristocrats.

The Wall Street Journal

June 15, 2023

For the 136th Wimbledon Championships, opening on July 3, female competitors will be allowed to ignore the all-white clothing rule for the first time—though only as it applies to their undergarments. Tennis may never be the same.

The break with tradition is all the more surprising given the sport’s penchant for strict etiquette rules and dress codes. The earliest recorded version of tennis was a type of handball played by medieval monks in France. Called “jeu de paume,” “game of the palm,” it involved hitting a leather ball against the cloister wall.

As the sport spread beyond the monastic world, it gained a new name, “tenez,” the French word for “receive.” It had instant social cachet, since it could only be played in large, high-walled courtyards, thus narrowing the pool of players to kings and aristocrats.

Early “tenez” was not without its dangers. Legend has it that King Louis X of France, who introduced the first covered courts, took ill and died after an overly strenuous match in 1316. In 1498 another French king, Charles VIII, suffered an untimely end after banging his head on a lintel while hurrying to his tennis court.

By the 16th century, the game had evolved into something in between modern squash and tennis. Players used angled wooden rackets, and the ball could be bounced off the walls and sloping roof as well as hit over the net. This version, known as real or royal tennis, is still played at a small number of courts around the world.

The Royal Tennis Court at King Henry VIII’s Hampton Court Palace, outside London, was the most luxurious in Europe, but the sophisticated surroundings failed to elevate on-court behavior. In 1541, Sir Edmund Knyvett was condemned to have his hand chopped off for striking his opponent and drawing blood. Henry ended up granting him a reprieve—more than he did for wives two and five.

The association of tennis with royal privilege hastened its demise in the 18th century. On June 20, 1789, Louis XVI’s tennis court at Versailles hosted one of the most important events of the French Revolution. The new National Assembly gathered there after being locked out of its premises, and made a pledge, the Tennis Court Oath, not to disband until France had a constitution. It was a very brave or very foolish person who played the game after that.

Modern tennis—known at first as “lawn tennis,” since it was played on a grass court—began to emerge in the 1870s, when an eccentric British Army major named Walter Clopton Wingfield invented a version of the game using rubber balls. His name for it—“Sphairistike,” from the Greek word for ball playing—never caught on. But the social opportunities offered by tennis made it extremely popular among the upper classes.

The exclusive All England Croquet and Lawn Tennis Club in Wimbledon, whose championships began in 1877, inspired imitators on both sides of the Atlantic. Unfortunately, many tennis players expended nearly as much effort keeping the “wrong sort” out as they did keeping the ball in. For years, the major tennis tournaments offered no prizes and were only open to amateurs, meaning the wealthy. Professionals were relegated to a separate circuit.

Tennis’s own revolution took place in 1968, following sustained pressure from players and fans for the Grand Slam Tournaments to be open to all competitors. Fifty-five years on, the barricades—and the barriers—are still coming down.

Historically Speaking: The Long Road to Pensions for All

From the Song Dynasty to the American Civil War, governments have experimented with ways to support retired soldiers and workers

The Wall Street Journal

April 6, 2023

“Will you still need me, will you still feed me,/When I’m sixty-four?” sang the Beatles on their 1967 album “Sgt. Pepper’s Lonely Hearts Club Band.” These were somewhat hypothetical questions at a time when the mean age of American men taking retirement was 64, and their average life expectancy was 67. More than a half-century later, the Beatles song resonates in a different way, because there are so few countries left where retirement on a state pension at 64 is even possible.

Historically, governments preferred not to be in the retirement business, but self-interest sometimes achieved what charitable impulses could not. In 6 A.D., a well-founded fear of civil unrest encouraged Augustus Caesar to institute the first state pension system, the aerarium militare, which looked after retired army veterans. He earmarked a 5% tax on inheritances to pay for the scheme, which served as a stabilizing force in the Roman Empire for the next 400 years. The Sack of Rome in 410 by Alaric, leader of the Visigoths, probably could have been avoided if Roman officials had kept their promise to pay his allied troops their military pensions.

In the 11th century, the Song emperor Shenzong invited the brilliant but mercurial governor of Nanjing, Wang Anshi, to reform China’s entire system of government. Wang’s far-reaching “New Laws” included state welfare plans to care for the aged and infirm. Some of his ideas were accepted but not the retirement plan, which achieved the remarkable feat of uniting both conservatives and radicals against him: The former regarded state pensions as an assault on family responsibility, the latter thought the plan gave too much power to the government. Wang was forced to retire in 1075.

Leaders in the West were content to muddle along until, like Augustus, they realized that a large nation-state requires a national army to defend it. England’s Queen Elizabeth I oversaw the first army and navy pensions in Europe. She also instituted the first Poor Laws, which codified the state’s responsibility toward its citizens. The problem with the Poor Laws, however, was that they transferred a national problem to the local level and kept it there.

Before he fell victim to the Terror during the French Revolution, the Marquis de Condorcet tried to figure out how France might pay for a national pension system. The question was largely ignored in the U.S. until the Civil War forced the federal government into a reckoning. A military pension system that helped fewer than 10,000 people in 1861 grew into a behemoth serving over 300,000 in 1885. By 1894 military pensions accounted for 37% of the federal budget. One side effect was to hamper the development of national and private pension schemes. Among the few companies to offer retirement pensions for employees were the railroads and American Express.

By the time Frances Perkins, President Franklin Roosevelt’s Labor Secretary, ushered in Social Security in 1935, Germany’s national pension scheme was almost 50 years old. But the German system started at age 70, far too late for most people, which was the idea. As Jane Austen’s Mrs. Dashwood complained in “Sense and Sensibility,” “People always live forever when there is an annuity to be paid to them.” The last Civil War pensioner was Irene Triplett, who died in 2020. She was receiving $73.13 every month for her father’s Union service.

Historically Speaking: The Tragedy of Vandalizing the Past

The 20th anniversary of the destruction of the Bamiyan Buddhas in Afghanistan reminds us of the imperative of historical preservation

April 15, 2021

Twenty years ago this spring, the Taliban completed their obliteration of Afghanistan’s 1,500-year-old Buddhas of Bamiyan. The colossal stone sculptures had survived major assaults in the 17th and 18th centuries by the Mughal emperor Aurangzeb and the Persian king Nader Afshar. Lacking sufficient firepower, both gave up after partly defacing the monuments.

The Taliban’s methodical destruction recalled the calculated brutality of ancient days. By the time the Romans were finished with Carthage in 146 B.C., the entire city had been reduced to rubble. They were given a taste of their own medicine in 455 A.D. by Genseric, King of the Vandals, who stripped Rome bare in two weeks of systematic looting and destruction.

One of the Buddhas of Bamiyan in 1997, before their destruction.
PHOTO: ALAMY

As in other vanquished cities, Rome’s buildings became a source of free material. Emperor Constans II of Byzantium blithely stole the Pantheon’s copper roofing in the mid-7th century; a millennium later, Pope Urban VIII appropriated its bronze girders for Bernini’s baldacchino over the high altar in St. Peter’s Basilica.

When not dismantled, ancient buildings might be repurposed by new owners. Thus Hagia Sophia Cathedral became a mosque after the Ottomans captured Constantinople, and St. Radegund’s Priory was turned into Jesus College at Cambridge University under King Henry VII.

The idea that a country’s ancient heritage forms part of its cultural identity took hold in the wake of the French Revolution. Incensed by the Jacobins’ pillaging of churches, Henri Grégoire, the Constitutional Bishop of Blois, coined the term vandalisme. His protest inspired the novelist Victor Hugo’s efforts to save Notre Dame. But the architect chosen for the restoration, Eugène Emmanuel Viollet-le-Duc, added his own touches to the building, including the central spire that fell when the cathedral’s roof burned in 2019, spurring controversy over what to restore. Even in his own day, Viollet-le-Duc’s interpolations set off a fierce debate, led by the English art critic John Ruskin, about what constitutes proper historical preservation.

Ruskin inspired people to rethink society’s relationship with the past. There was uproar in England in 1883 when the London and South Western Railway tried to justify building a rail line alongside Stonehenge, claiming the ancient site was unused.

Public opinion in the U.S., when aroused, could be equally determined. The first preservation society was started in the 1850s by Ann Pamela Cunningham of South Carolina. Despite being disabled by a riding accident, Cunningham initiated a successful campaign to save George Washington’s Mount Vernon from ruin.

But developers have a way of getting what they want. Not even modernist architect Philip Johnson protesting in front of New York’s Penn Station was able to save the McKim, Mead & White masterpiece in 1963. Two years later, fearing that the world’s architectural treasures were being squandered, retired army colonel James Gray founded the International Fund for Monuments (now the World Monuments Fund). Without the WMF’s campaign in 1996, the deteriorating south side of Ellis Island, gateway for 12 million immigrants, might have been lost to history.

The fight never ends. I still miss the magnificent beaux-arts interior of the old Rizzoli Bookstore on 57th Street in Manhattan. The 109-year-old building was torn down in 2014. Nothing like it will ever be seen again.

The Telegraph: The Ascent of Woman, episode 4, review: passion and erudition

Source: BBC/Silver River

By Gerard O’Donovan

Watching the final part of Amanda Foreman’s The Ascent of Woman (BBC Two) was a reminder of how powerful, inspiring and important television can be at its best. One of Foreman’s chief arguments has been that women have contributed as much to history as men but have rarely been accorded the credit for it.

And this final episode, which focused on a series of extraordinary but little-known 19th- and 20th-century revolutionaries and campaigners, offered a formidable exposition of the extent to which so many women have, unforgivably, been written out of that history.

Literally so in the case of the French revolutionary Olympe de Gouges, who published her Declaration of the Rights of Woman in 1791, and whose champions Foreman met and interviewed, still marching the streets of Paris more than 200 years on to have her contributions fully recognised.

Time and again Foreman offered examples of revolutions in which the contributions of women were encouraged – until the subject of their own rights was broached. Perhaps most fascinatingly in the case of Alexandra Kollontai, an extraordinary firebrand who pushed feminism to the heart of the Bolshevik agenda during the Russian revolution – only to see it rolled back again by Stalin and her considerable achievements wiped from the record.

It was on the subject of forgotten heroines like this that the programme was at its most atmospheric, with Foreman joining candlelit memorial parades in Moscow, or interviewing Kollontai’s natural heirs, the members of Pussy Riot. But she was just as ardent, if not more so, in recalling the better-known achievements of campaigners such as Millicent Fawcett, founder of Newnham College, Cambridge, and Margaret Sanger in America, whose tireless (and wonderfully fearless) campaigning for access to birth control eventually led to the arrival of the contraceptive pill in 1960 – the moment when “women’s lives changed forever”.

There were times when Foreman could be accused of oversimplifying her argument, of glossing over the political and social factors, beyond an unalloyed male desire to suppress the rise of women, that perhaps contributed to the extinguishing of some of these feminist flames.

But to argue that would be to miss the point. What Foreman achieved in this episode was to distil the essence of the last two centuries of global striving for equality into the space of a single hour with enormous passion and erudition. Few who watched could be anything other than grateful for her efforts to redress the balance of history, or disagree with her conclusion that it is “vital for the future that we have a proper understanding of the past.”

The Sunday Times: To the barricades once more, ladies, and this time men shall not deny us

Photo: Steffan Hill

If the women-to-the-back debacle of Jeremy Corbyn’s new shadow cabinet has a silver lining, it’s the reminder that women are good for revolutions, but not all revolutions are good for us. For many on the left this is a painful truth, and one often avoided for fear of giving ammunition to the right. Yet for the future, let alone the history, of women, it’s a truth that has to be confronted.

Since the 18th century it has been the same old pattern. The people become restless. Women mobilise against injustice and the status quo. Their participation tips the scale in favour of change. The old regime collapses. A new order emerges. Political reforms ensue. Women demand their fair share. They go home empty-handed.
