Leaders Who Bowed Out Gracefully

Kings and politicians have used their last moments on the world stage to deliver words of inspiration.

November 5, 2020

The Wall Street Journal

The concession speech is one of the great accomplishments of modern democracy. The election is over and passions are running high, but the loser graciously concedes defeat, calls for national unity and reminds supporters that tomorrow is another day. It may be pure political theater, but it’s pageantry with a purpose.

For most of history, defeated rulers didn’t give concession speeches; they were too busy begging for their lives, since a king who lost his throne was usually killed shortly afterward. The Iliad recounts six separate occasions on which a defeated warrior asks his opponent for mercy, only to be hacked to death anyway. The Romans had no interest whatsoever in listening to defeated enemies—except once, in the 1st century, when the British chieftain Caractacus was brought in chains before the Senate.

Republican presidential candidate John McCain delivers his concession speech on Nov. 4, 2008, after losing the election to Barack Obama.

On a whim, the Emperor Claudius told Caractacus to give one reason why his life should be spared. According to the historian Cassius Dio, the defeated Briton gave an impassioned speech about the glory of Rome and how much greater it would be if he were spared: “If you save my life, I shall be an everlasting memorial of your clemency.” Impressed, the Senate set him free.

King Charles I had no hope for clemency on Jan. 30, 1649, when he faced execution after the English Civil War. But this made his speech all the more powerful, because Charles was speaking to posterity more than to his replacement, Oliver Cromwell. His final words have been a template for concession speeches ever since: After defending his record and reputation, Charles urged Cromwell to rule for the good of the country, “to endeavor to the last gasp the peace of the kingdom.”

In modern times, appeals to the nation became an important part of royal farewell speeches. When Napoleon Bonaparte abdicated as emperor of France in 1814, he stood in the courtyard of the palace of Fontainebleau and bade an emotional goodbye to the remnants of his Old Guard. He said that he was leaving to prevent further bloodshed, and ended with the exhortation: “I go, but you, my friends, will continue to serve France.”

Emperor Hirohito delivered a similar message in his radio broadcast on Aug. 14, 1945, announcing Japan’s surrender in World War II. The Emperor stressed that by choosing peace over annihilation he was serving the ultimate interests of the nation. He expected his subjects to do the same, to “enhance the innate glory of the Imperial State.” The shock of the Emperor’s words was compounded by the fact that no one outside the court and cabinet had ever heard his voice before.

In the U.S., the quality of presidential concession speeches rose markedly after they began to be televised in 1952. Over the years, Republican candidates, in particular, have elevated the art of losing to almost Churchillian heights. John McCain’s words on election night 2008, when he lost to Barack Obama, remain unmatched: “Americans never quit. We never surrender. We never hide from history. We make history.”

Historically Speaking: Tales That Go Bump in the Night

From Homer to Edgar Allan Poe, ghost stories have given us a chilling good time

The Wall Street Journal

October 23, 2020

As the novelist Neil Gaiman, a master of the macabre, once said, “Fear is a wonderful thing, in small doses.” In this respect, we’re no different from our ancestors: They, too, loved to tell ghost stories.

One of the earliest ghosts in literature appears in Homer’s Odyssey. Odysseus entertains King Alcinous of Phaeacia with an account of his trip to the Underworld, where he met the spirits of Greek heroes killed in the Trojan War. The dead Achilles complains that being a ghost is no fun: “I should choose, so I might live on earth, to serve as the hireling of another…rather than to be lord over all the dead.”


It was a common belief in both Eastern and Western societies that ghosts could sometimes return to right a great wrong, such as an improper burial. The idea that ghosts are intrinsically evil—the core of any good ghost story—received a boost from Plato, who believed that only wicked souls hang on after death; the good know when it’s time to let go.

Ghosts were problematic for early Christianity, which taught that sinners went straight to Hell; they weren’t supposed to be slumming it on Earth. The ghost story was dangerously close to heresy until the Church adopted the belief in Purgatory, a realm where the souls of minor sinners waited to be cleansed. The Byland Abbey tales, a collection of ghost stories recorded by an anonymous 15th-century English monk, suggest that the medieval Church regarded the supernatural as a useful form of advertising: Not paying the priest to say a mass for the dead could lead to a nasty case of haunting.

The ghost story reached its apogee in the early modern era with Shakespeare’s “Hamlet,” which opens with a terrified guard seeing the ghost of the late king on the battlements of Elsinore Castle. But the rise of scientific skepticism made the genre seem old-fashioned and unsophisticated. Ghosts were notably absent from English literature until Horace Walpole, son of Britain’s first prime minister, published the supernatural mystery novel “The Castle of Otranto” in 1764, as a protest against the deadening effect of “reason” on art.

Washington Irving was the first American writer to take the ghost story seriously, creating the Headless Horseman in his 1820 tale “The Legend of Sleepy Hollow.” He was a lightweight, however, compared with Edgar Allan Poe, who turned horror into an art form. His famous 1839 story “The Fall of the House of Usher” heightens the tension with ambiguity: For most of the story, it isn’t clear whether Roderick Usher’s house really is haunted, or if he is merely “enchained by certain superstitious impressions.”

Henry James used a similar technique when, unhappy with the tepid reception of his novels in the U.S., he decided to frighten Americans into liking him. The result was the 1898 psychological horror story “The Turn of the Screw,” about a governess who may or may not be seeing ghosts. The reviews expressed horror at the horror, with one critic describing it as “the most hopelessly evil story that we could have read in any literature.” With such universal condemnation, success was assured.

Stepping out of the Shadows

Sylvia Pankhurst by Rachel Holmes, review — finally having her moment.

Her mother and sister were once better known, but this fine biography shows just how remarkable the women’s rights activist was.

The Times

September 22, 2020

After decades of obscurity, Sylvia Pankhurst is finally having her moment. This is the third biography in seven years — not bad for a woman who spent much of her life being unfavourably compared with her more popular mother and sister.

The neglect is partly because Sylvia’s rich, complex life resists easy pigeonholing. Although she played an instrumental role in the suffrage movement, she was first and foremost a defender of the poor, the oppressed and the marginalised. Her political choices were often noble but lonely ones.

Sylvia inherited her appetite for social activism and boundless energy for work from her parents, Richard and Emmeline. A perpetually aspiring MP, Richard cheerfully espoused atheism, women’s suffrage, republicanism, anti-imperialism and socialism at a time when any one of these causes was sufficient to scupper a man’s electoral chances. Emmeline was just as politically involved and only slightly less radical.

Sylvia’s mother, the suffragette Emmeline Pankhurst

Despite financial troubles and career disappointments, the Pankhurst parents were a devoted couple and the household a happy one. Sylvia was born in 1882, the second of five children and the middle daughter between Christabel (her mother’s favourite) and Adela (no one’s favourite). She craved Emmeline’s good opinion, but was closer to her father. “Life is valueless without enthusiasms,” he once told her, a piece of advice she took to heart.

Sylvia was only 16 when her father died. Without his counter-influence, the three sisters (and their brother, Harry, who died of polio aged 20) lived in thrall to their powerful mother. After Emmeline and Christabel founded the Women’s Social and Political Union (WSPU) in 1903 — having become frustrated by the lack of support from the Independent Labour Party — there was no question that Sylvia and Adela would do anything other than sacrifice their personal interests for the good of the cause. Sylvia later admitted that one of her greatest regrets was being made to give up a promising art career for politics.

She was imprisoned for the first time in 1906. As the tactics of the WSPU became more extreme, so did the violence employed by the authorities against its members. Sylvia was the only Pankhurst to be subjected to force-feeding, an experience she likened to rape.

“Infinitely worse than the pain,” she wrote of the experience, “was the sense of degradation.” Indeed, in some cases that was the whole point of the exercise. While not widespread, vaginal and anal “feeding” was practised on the hunger strikers. Holmes hints, but doesn’t speculate, that Sylvia may have been one of its victims.

Pankhurst died in Ethiopia in 1960 after accepting an invitation from Emperor Haile Selassie, pictured, to emigrate to Africa

Ironically, Sylvia suffered the most while being the least convinced by the WSPU’s militant tactics. It wasn’t only the violence she abhorred. Emmeline and Christabel wanted the WSPU to remain an essentially middle-class, politically aloof organisation, whereas Sylvia regarded women’s rights as part of a wider struggle for revolutionary socialism. The differences between them became unbridgeable after Sylvia founded a separate socialist wing of the WSPU in the East End. Both she and Adela, whom Emmeline and Christabel dismissed as a talentless lightweight, were summarily expelled from the WSPU in February 1914. The four women would never again be in the same room together.

Sylvia had recognised early on that first-wave feminism suffered from a fundamental weakness. It was simultaneously too narrow and too broad to be a stand-alone political platform. The wildly different directions that were taken by the four Pankhursts after the victory of 1918 proved her right: Emmeline became a Conservative, Christabel a born-again Christian, Sylvia a communist and Adela a fascist, yet all remained loyal to their concept of women’s rights.

Once cut loose from the Pankhurst orbit, Sylvia claimed the freedom to think and act as her conscience directed. In 1918 she fell in love with an Italian anarchist socialist refugee, Silvio Corio, who already had three children by two women. Undeterred, she lived with him in Woodford Green, Essex, in a ramshackle home appropriately named Red Cottage. They remained partners for the best part of 30 years, writing, publishing and campaigning together. Even more distressing for her uptight family, not to mention society in general: at the advanced age of 45 she had a son by him, Richard, who was given her surname rather than Silvio’s.

Sylvia Pankhurst, here in 1940, became a communist after the victory of 1918

Broadly speaking, her life can be divided into four campaigns: after women’s suffrage came communism, then anti-fascism and finally Ethiopian independence. (The last has received the least attention, although Sylvia insisted it gave her the greatest pride.) None was an unalloyed success or without controversy. Her fierce independence would lead her to break with Lenin over their ideological differences, and later support her erstwhile enemy Winston Churchill when their views on fascism aligned. She never had any time for Stalin, left-wing antisemitism or liberal racism. In her mid-seventies and widowed, she cut all ties with Britain by accepting an invitation from Emperor Haile Selassie to emigrate to Ethiopia. She died there in 1960.

The genius of Holmes’s fascinating and important biography is that it approaches Sylvia’s life as if she were a man. The writing isn’t prettified or leavened by amusing anecdotes about Victorian manners; it’s dense and serious, as befits a woman who never wore make-up and didn’t care about clothes. To paraphrase the WSPU’s slogan, it is about deeds, not domesticity. Rather than dwelling on moods and relationships, Holmes is interested in ideas and consequences. It’s wonderfully refreshing.

Sylvia lived for her work; her literary output was astounding. In addition to publishing her own newspaper almost every week for over four decades, she wrote nonfiction, fiction, plays, poetry and investigative reports. She even taught herself Romanian so that she could translate the poems of the 19th-century Romantic poet Mihai Eminescu. It doesn’t matter whether Sylvia was right or wrong in her political enthusiasms; as Holmes rightly insists, what counts is that by acting on them she helped to make history.

Historically Speaking: The Sharp Riposte as a Battle Tactic

Caustic comebacks have been exchanged between military leaders for millennia, from the Spartans to World War II

The Wall Street Journal

August 6, 2020

In the center of Bastogne, Belgium (pop. 16,000), there is a statue of U.S. Army General Anthony C. McAuliffe, who died 45 years ago this week. It’s a small town with a big history. Bastogne came under attack during the final German offensive of World War II, known as the Battle of the Bulge. The town was the gateway to Antwerp, a vital port for the Allies, and all that stood between the Germans and their objective was Gen. McAuliffe and his 101st Airborne Division. Despite being outnumbered four to one, he refused to surrender, fighting on until Gen. George Patton’s reinforcements could break the siege.


While staving off the German attack, Gen. McAuliffe uttered the greatest comeback of the war. A typewritten ultimatum from Commander Heinrich von Lüttwitz of the 47th German Panzer Corps gave him two hours to surrender the town or face “annihilation.” With ammunition running low and casualties mounting, the general made his choice. He sent back the following typed reply:

December 22, 1944

To the German Commander,

N U T S !

The American Commander

The true laconic riposte is extremely rare. The Spartans, whose ancient homeland of Laconia inspired the term “laconic,” were masters of the art. When Philip II of Macedon ordered them to open their borders, he warned, “For if I bring my army into your land, I will destroy your farms, slay your people and raze your city.” According to the later account of the Greek philosopher Plutarch, they sent back a one-word message: “If.”

Once the age of the Spartans passed, it took more than a millennium for the laconic comeback to return in earnest. The man most responsible was a colorful Teutonic knight called Götz von Berlichingen, who participated in countless German conflicts, uprisings and skirmishes, including the Swabian War between the Habsburgs and the Swiss. He lost a hand to a cannonball and wore an iron prosthesis (hence his nickname, Götz of the Iron Hand). In 1515, sick of trading insults with an opponent who wouldn’t come out to fight, Götz abruptly ended the conversation with “soldt mich hinden leckhenn,” which roughly translates as “kiss my ass.”

He recorded the encounter in his memoirs, but it remained little known until Johann Wolfgang von Goethe adapted Götz’s autobiography into a successful play in 1773. From then on, the insult was popularly known in Germany as a “Swabian salute.”

In his novel “Les Misérables,” Victor Hugo—possibly inspired by Goethe—immortalized the French defeat in 1815 at the Battle of Waterloo with a scene of spectacular, if laconic, defiance that incorporated France’s most common expletive. According to Hugo, Gen. Pierre Cambronne, commander of Napoleon Bonaparte’s Imperial Guard, fought to the last syllable in the face of overwhelming force. Encircled by the British army, “They could hear the sound of the guns being reloaded and see the lighted slow matches gleaming like the eyes of tigers in the dusk. An English general called out to them, ‘Brave Frenchmen, will you not surrender?’ Cambronne answered, ‘Merde’” (that is, “shit”).

The scene’s veracity is still hotly debated, the fog of war making memories hazy. But Cambronne—who survived—later disavowed it, especially after “le mot de Cambronne” (Cambronne’s word) became a common euphemism for the profanity.

Historically Speaking: Playing Cards for Fun and Money

From 13th-century Egypt to the Wild West, the standard deck of 52 cards has provided entertainment—and temptation.


More than 8,500 people traveled to Las Vegas to play in this year’s World Series of Poker, which ended July 16—a near-record for the contest. I’m not a poker player myself, but I understand the fun and excitement of playing with real cards in an actual game rather than online. There’s something uniquely pleasurable about a pack of cards—the way they look and feel to the touch—that can’t be replicated on the screen.

Although the origins of the playing card are believed to lie in China, the oldest known examples come from the Mamluk Sultanate, an Islamic empire that stretched across Egypt and the eastern Mediterranean from 1260 to 1517. It’s significant that the empire was governed by a warrior caste of former slaves: A playing card can be seen as an assertion of freedom, since time spent playing cards is time spent freely. The Mamluk card deck consisted of 52 cards divided into four suits, whose symbols reflected the daily realities of soldiering—a scimitar, a polo stick, a chalice and a coin.

Returning Crusaders and Venetian traders were probably responsible for bringing cards to Europe. Church and state authorities were not amused: In 1377, Parisians were banned from playing cards on work days. Like dice, cards were classed as gateway vices that led to greater sins. The authorities may not have been entirely wrong: Some surviving card decks from the 15th century have incredibly bawdy themes.

The suits and symbols used in playing cards became more uniform and less ornate following the advent of the printing press. French printers added a number of innovations, including dividing the four suits into red and black, and giving us the heart, diamond, club and spade symbols. Standardization enabled cards to become a lingua franca across cultures, further enhancing their appeal as a communal leisure activity.

In the 18th century, the humble playing card was the downfall of many a noble family, with vast fortunes being won and lost at the gaming table. Cards also started to feature in paintings and novels as symbols of the vagaries of fortune. The 19th-century short story “The Queen of Spades,” by the Russian writer Alexander Pushkin, beautifully captures the card mania of the period. The anti-hero, Hermann, is destroyed by his obsession with winning at Faro, a game of chance that was as popular in the saloons of the American West as it was in the drawing rooms of Europe. The lawman Wyatt Earp may have won fame in the gunfight at the OK Corral, but he earned his money as a Faro dealer in Tombstone, Ariz.

In Britain, attempts to regulate card-playing through high taxes on the cards themselves were a failure, though they did result in one change: Every ace of spades had to show a government tax stamp, which is why it’s the card that traditionally carries the manufacturer’s mark. The last innovation in the card deck, like the first, had military origins. Many Civil War regiments killed time by playing Euchre, which requires an extra trump card. The Samuel Hart Co. duly obliged with a card which became the forerunner to the Joker, the wild card that doesn’t have a suit.

But we shouldn’t allow the unsavory association of card games with gambling to have the last word. As Charles Dickens wrote in “Nicholas Nickleby”: “Thus two people who cannot afford to play cards for money, sometimes sit down to a quiet game for love.”

Historically Speaking: How We Kept Cool Before Air Conditioning

Wind-catching towers and human-powered rotary fans were just some of the devices invented to fight the heat.


What would we do without our air conditioning? Given the number of rolling blackouts and brownouts that happen across the U.S. each summer, that’s not exactly a rhetorical question.

Fortunately, our ancestors knew a thing or two about staying cool even without electricity. The ancient Egyptians developed the earliest known technique, evaporative cooling: They hung wet reeds in front of windows, so that the air cooled as the water evaporated.

The Romans, the greatest engineers of the ancient world, had more sophisticated methods. By 312 B.C. they were piping fresh water into Rome via underground pipes and aqueducts, enabling the rich to cool and heat their houses using cold water pipes embedded in the walls and hot water pipes under the floor.

Nor were the Romans alone in developing clever domestic architecture to provide relief in hot climes. In the Middle East, architects constructed buildings with “wind catchers”—tall, four-windowed towers that funneled cool breezes down to ground level and allowed hot air to escape. These had the advantage of working on their own, without human labor. The Chinese had started using rotary fans as early as the second century, but they required a dedicated army of slaves to keep them moving. The addition of hydraulic power during the Song era, 960-1279, alleviated but didn’t end the manpower issue.

There had been no significant improvements in air conditioning designs for almost a thousand years when, in 1734, British politicians turned to Dr. John Theophilus Desaguliers, a former assistant to Isaac Newton, and begged him to find a way of cooling the overheated House of Commons. Desaguliers designed a marvelous Rube Goldberg-like system that used all three traditional methods: wind towers, pipes and rotary fans. It actually worked, so long as there was someone to crank the handle at all times.

But the machinery wore out in the late 1760s, leaving politicians as hot and bothered as ever. In desperation, the House invited Benjamin Franklin and other leading scientists to design something new. Their final scheme turned out to be no better than Desaguliers’s and required not one but two men to keep the system working.

The real breakthrough occurred in 1841, after the British engineer David Boswell Reid figured out how to control room temperature using steam power. St. George’s Hall in Liverpool is widely considered to be the world’s first air-conditioned building.

Indeed, Reid is one of history’s unsung heroes. His system worked so well that he was invited to install his pipe and ventilation design in hospitals and public buildings around the world. He was working in the U.S. at the start of the Civil War and was appointed inspector of military hospitals. Unfortunately, he died suddenly in 1863, leaving his proposed improvements to gather dust.

The chief problem with Reid’s system was that steam power was about to be overtaken by electricity. When President James Garfield was shot by an assassin in the summer of 1881, naval engineers attempted to keep him cool by using electric fans to blow air over blocks of ice. Two decades later, Willis Haviland Carrier invented the first all-electric air conditioning unit. Architects and construction engineers have been designing around it ever since.

Fears for our power grid may be exaggerated, but it’s good to know that if the unthinkable were to happen and we lost our air conditioners, history can offer us some cool solutions.

Harper’s Bazaar: Buckingham Palace is opening a new exhibition exploring the life of Queen Victoria this summer

This year marks the 200th anniversary of the birth of Queen Victoria and to celebrate, Buckingham Palace has announced a special exhibition as part of the summer opening of its State Rooms, co-curated by Dr. Amanda Foreman.

Harper’s Bazaar, May 7, 2019

by Katie Frost

A portrait of Queen Victoria by Thomas Sully

The exhibit will explore the life of the monarch and how she turned the once unloved palace into the royal residence we know today. Highlights will include a portrait of the young queen painted by Thomas Sully soon after she moved into her new home, along with Victoria’s personal insignia, the Star and Collar of the Order of the Bath.

Victoria moved into the palace in 1837 when she was just 18. It had been empty for seven years following the death of her uncle, George IV. After Victoria married Prince Albert and started a family, she wrote a letter to the Prime Minister, Sir Robert Peel, about her plans to revamp her family home. In it, she spoke of “the urgent necessity of doing something to Buckingham Palace” and “the total want of accommodation for our growing little family”, according to the Royal Collection Trust.

Historically Speaking: Beloved Buildings That Rose from the Ashes

From ancient Rome to modern London, great structures like Notre Dame have fallen and been built again

The Wall Street Journal, May 2, 2019

A disaster like the Notre Dame cathedral fire is as much a tragedy of the heart as it is a loss to architecture. But fortunately, unlike most love affairs, a building can be resurrected. In fact, throughout history communities have gone to remarkable lengths to rebuild monuments of sacred or national importance.

There is no shortage of inspirational examples of beloved buildings that have risen from the ashes. The Second Temple was built in Jerusalem in 515 B.C. following the destruction of the First by King Nebuchadnezzar II of Babylonia in 586 B.C.; Dresden’s Baroque Frauenkirche was faithfully rebuilt in 2005, after being destroyed by bombs in 1945.

Often the new structures are exact replicas, as with Venice and Barcelona’s opera houses, La Fenice and Gran Teatre del Liceu, both of which were rebuilt after suffering devastating fires in the 1990s. If France decides to rebuild Notre Dame according to the principle “as it was, where it was,” the skill and technology aren’t lacking.

In other cases, however, disasters have allowed beloved landmarks to be reimagined. The Great Fire of Rome in 64 A.D. led to a revolution in architectural styles and techniques. After Hagia Sophia cathedral was torched during riots in Constantinople in 532, the Byzantine Emperor Justinian asked his architects Anthemius and Isidore to build something bold and impressive. It was risky to change such a renowned symbol of the Eastern Roman Empire; moreover, for security and financial reasons, the work had to be completed in just six years. Still, the result dazzled Justinian, who exclaimed when he saw it, “Solomon, I have outdone thee.” Almost a thousand years later, following Constantinople’s fall to the Turks in 1453, Sultan Mehmet II had a similar reaction and ordered Hagia Sophia to be turned into a mosque rather than destroyed.

Sir Christopher Wren, who rebuilt St. Paul’s Cathedral in London, was not so lucky in the reactions to his creation. The Great Fire of 1666 had left the medieval church in ruins, but there was little appetite for an innovative reconstruction and no money in the Treasury to pay for it. Wren endured setbacks at every turn, including a chronic shortage of stone. At one point, Parliament suspended half his salary in protest at the slowness of the work, which took almost three decades and spanned the reigns of five monarchs.

The reason for the delay became clear after the finished building was revealed to the public. Inspired by drawings of Hagia Sophia, Wren had ignored the approved design for a traditional reconstruction and quietly opted for a more experimental approach. Ironically, many of his contemporaries were appalled by the now iconic great dome, especially the Protestant clergy, who deemed it too foreign and Catholic-looking. Yet Wren’s vision has endured. During the German bombing of London in World War II, St. Paul’s was the one building that Winston Churchill declared must be saved at all costs.

It is never easy deciding how to draw the line between history and modernity, particularly when dealing with the loss of an architectural masterpiece. There isn’t always a right answer, but it may help to remember Churchill’s words: “We shape our buildings and afterwards our buildings shape us.”

Historically Speaking: Real-Life Games of Thrones

From King Solomon to the Shah of Persia, rulers have used stately seats to project power.

The Wall Street Journal, April 22, 2019


“Uneasy lies the head that wears a crown,” complains the long-suffering King Henry IV in Shakespeare. But that is not a monarch’s only problem; uneasy, too, is the bottom that sits on a throne, for a throne is often a dangerous place to be. That is why the image of a throne made of swords, in the HBO hit show “Game of Thrones” (which last week began its eighth and final season), has served as such an apt visual metaphor. It strikingly symbolizes the endless cycle of violence between the rivals for the Iron Throne, the seat of power in the show’s continent of Westeros.

In real history, too, virtually every state once put its leader on a throne. The English word comes from the ancient Greek “thronos,” meaning “stately seat,” but the thing itself is much older. Archaeologists working at the 4th millennium B.C. site of Arslantepe, in eastern Turkey, where a pre-Hittite Early Bronze Age civilization flourished, recently found evidence of what is believed to be the world’s oldest throne. It seems to have consisted of a raised bench which enabled the ruler to display his or her elevated status by literally sitting above all visitors to the palace.

Thrones were also associated with divine power: The famous 18th-century B.C. basalt stele inscribed with the law code of King Hammurabi of Babylon, which can be seen at the Louvre, depicts the king receiving the laws directly from the sun god Shamash, who is seated on a throne.

Naturally, because they were invested with so much religious and political symbolism, thrones often became a prime target in war. According to Jewish legend, King Solomon’s spectacular gold and ivory throne was stolen first by the Egyptians, who then lost it to the Assyrians, who subsequently gave it up to the Persians, whereupon it became lost forever.

In India, King Solomon’s throne was reimagined in the early 17th century by the Mughal Emperor Shah Jahan as the jewel-and-gold-encrusted Peacock Throne, featuring the 186-carat Koh-i-Noor diamond. (Shah Jahan also commissioned the Taj Mahal.) This throne also came to an unfortunate end: It was stolen by Nader Shah during his sack of Delhi in 1739 and taken back to Persia. A mere eight years later, the Shah was assassinated by his own bodyguards and the Peacock Throne was destroyed, its valuable decorations stolen.

Perhaps the moral of the story is to keep things simple. Around 1300, King Edward I of England commissioned a coronation throne made of oak. For the past 700 years it has supported the heads and backsides of 38 British monarchs during the coronation ceremony at Westminster Abbey. No harm has ever come to it, save for the pranks of a few very naughty choir boys, one of whom carved on the back of the throne: “P. Abbott slept in this chair 5-6 July 1800.”

Historically Speaking: The Invention of Ice Hockey

Canada gave us the modern form of a sport that has been played for centuries around the world

The Wall Street Journal, February 11, 2019


Canadians like to say—and print on mugs and T-shirts—that “Canada is Hockey.” No fewer than five Canadian cities and towns claim to be the birthplace of ice hockey, including Windsor, Nova Scotia, which has an entire museum dedicated to the sport. Canada’s annual Hockey Day, which falls on February 9 this year, features a TV marathon of hockey games. Such is the country’s love for the game that last year’s broadcast was watched by more than 1 in 4 Canadians.

But as with many of humanity’s great advances, no single country or person can take the credit for inventing ice hockey. Stick-and-ball games are as old as civilization itself. The ancient Egyptians were playing a form of field hockey as early as the 21st century B.C., if a mural on a tomb at Beni Hasan, a Middle Kingdom burial site about 120 miles south of Cairo, is anything to go by. The ancient Greeks also played a version of the game, as did the early Christian Ethiopians, the Mesoamerican Teotihuacanos in the Valley of Mexico, and the Daur tribes of Inner Mongolia. And the Scottish and Irish versions of field hockey, known as shinty and hurling respectively, have strong similarities with the modern game.

Taking a ball and stick onto the ice was therefore a fairly obvious innovation, at least in places with snowy winters. The figures may be tiny, but three villagers playing an ice hockey-type game can be seen in the background of Pieter Bruegel the Elder’s 1565 painting “Hunters in the Snow.” There is no such pictorial evidence to show when the Mi’kmaq Indians of Nova Scotia first started hitting a ball on ice, but linguistic clues suggest that their hockey tradition existed before the arrival of European traders in the 16th century. The two cultures then proceeded to influence each other, with the Mi’kmaqs becoming the foremost makers of hockey sticks in the 19th century.

The earliest known use of the word hockey appears in a book, “Juvenile Sports and Pastimes,” written by Richard Johnson in London in 1776. Recently, Charles Darwin became an unlikely contributor to ice hockey history after researchers found a letter in which he reminisced about playing the game as a boy in the 1820s: “I used to be very fond of playing Hocky [sic] on the ice in skates.” On January 8, 1864, the future King Edward VII played ice hockey at Windsor Castle while awaiting the birth of his first child.

As for Canada, apart from really liking the game, what has been its real contribution to ice hockey? The answer is that it created the game we know today, from the official rulebook to the size and shape of the rink to the establishment of the Stanley Cup championship in 1894. The first indoor ice hockey game was played in Montreal in 1875, thereby solving the perennial problem of pucks getting lost. (The rink was natural ice, with Canada’s cold winter supplying the refrigeration.) The game involved two teams of nine players, each with a set position—three more than teams field today—a wooden puck, and a list of rules for fouls and scoring.

In addition to being the first properly organized game, the Montreal match also initiated ice hockey’s other famous tradition: brawling on the ice. In this case, the fighting erupted between the players, spectators and skaters who wanted the ice rink back for free skating. Go Canada!