Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.

ILLUSTRATION: THOMAS FUCHS

The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, messengers on foot covered impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to King Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. The assistance, unfortunately, led to his disqualification, voiding his time of 2:54:46.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the War. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th century Russian painting
PHOTO: ALAMY

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” comes from the Latin officium, meaning a position or duty, an amalgamation of ob and facere, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.

Historically Speaking: The Tragedy of Vandalizing the Past

The 20th anniversary of the destruction of the Bamiyan Buddhas in Afghanistan reminds us of the imperative of historical preservation

April 15, 2021

Twenty years ago this spring, the Taliban completed their obliteration of Afghanistan’s 1,500-year-old Buddhas of Bamiyan. The colossal stone sculptures had survived major assaults in the 17th and 18th centuries by the Mughal emperor Aurangzeb and the Persian king Nader Afshar. Lacking sufficient firepower, both gave up after partly defacing the monuments.

The Taliban’s methodical destruction recalled the calculated brutality of ancient days. By the time the Romans were finished with Carthage in 146 B.C., the entire city had been reduced to rubble. They were given a taste of their own medicine in 455 A.D. by Genseric, King of the Vandals, who stripped Rome bare in two weeks of systematic looting and destruction.

One of the Buddhas of Bamiyan in 1997, before their destruction.
PHOTO: ALAMY

As in other vanquished cities, Rome’s buildings became a source of free material. Emperor Constans II of Byzantium blithely stole the Pantheon’s copper roofing in the mid-7th century; a millennium later, Pope Urban VIII appropriated its bronze girders for Bernini’s baldacchino over the high altar in St. Peter’s Basilica.

When not dismantled, ancient buildings might be repurposed by new owners. Thus Hagia Sophia Cathedral became a mosque after the Ottomans captured Constantinople, and St. Radegund’s Priory was turned into Jesus College at Cambridge University on the orders of King Henry VIII.

The idea that a country’s ancient heritage forms part of its cultural identity took hold in the wake of the French Revolution. Incensed by the Jacobins’ pillaging of churches, Henri Grégoire, the Constitutional Bishop of Blois, coined the term vandalisme. His protest inspired the novelist Victor Hugo’s efforts to save Notre Dame. But the architect chosen for the restoration, Eugène Emmanuel Viollet-le-Duc, added touches of his own, including the central spire whose fall in the 2019 fire spurred controversy over what should be restored. Such interpolations set off a fierce debate, led by the English art critic John Ruskin, about what constitutes proper historical preservation.

Ruskin inspired people to rethink society’s relationship with the past. There was uproar in England in 1883 when the London and South Western Railway tried to justify building a rail-track alongside Stonehenge, claiming the ancient site was unused.

Public opinion in the U.S., when aroused, could be equally determined. The first preservation society was started in the 1850s by Ann Pamela Cunningham of South Carolina. Despite being disabled by a riding accident, Cunningham initiated a successful campaign to save George Washington’s Mount Vernon from ruin.

But developers have a way of getting what they want. Not even modernist architect Philip Johnson protesting in front of New York’s Penn Station was able to save the McKim, Mead & White masterpiece in 1963. Two years later, fearing that the world’s architectural treasures were being squandered, retired army colonel James Gray founded the International Fund for Monuments (now the World Monuments Fund). Without the WMF’s campaign in 1996, the deteriorating south side of Ellis Island, gateway for 12 million immigrants, might have been lost to history.

The fight never ends. I still miss the magnificent beaux-arts interior of the old Rizzoli Bookstore on 57th Street in Manhattan. The 109-year-old building was torn down in 2014. Nothing like it will ever be seen again.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, several hundred thousand candidates took it every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing they would limit the number of Jewish students.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.

Historically Speaking: Iron Curtains Are Older Than the Cold War

Winston Churchill made the term famous, but ideological rivalries have driven geopolitics since Athens and Sparta.

The Wall Street Journal

February 25, 2021

It was an unseasonably springlike day on March 5, 1946, when Winston Churchill visited Fulton, Missouri. The former British Prime Minister was ostensibly there to receive an honorary degree from Westminster College. But Churchill’s real purpose in coming was to urge the U.S. to form an alliance with Britain to keep the Soviet Union from expanding any further. Speaking before an august audience that included President Harry S. Truman, Churchill declared: “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent.”

ILLUSTRATION: THOMAS FUCHS

Churchill wasn’t the first person to employ the phrase “iron curtain” as a political metaphor. Originally a theatrical term for the safety barrier between the stage and the audience, by the early 20th century it was being used to mean a barrier between opposing powers. Nevertheless, “iron curtain” became indelibly associated with Churchill and with the defense of freedom and democracy.

This was a modern expression of an idea first articulated by the ancient Greeks: that political beliefs are worth defending. In the winter of 431-30 B.C., the Athenians had just endured the first brutal year of the Peloponnesian War against Sparta; a devastating plague would soon compound their suffering. The stakes couldn’t have been higher when the great statesman Pericles used a speech commemorating the war dead to define the struggle in terms that every Athenian would understand.

As reported by the historian Thucydides, Pericles told his compatriots that the fight wasn’t for more land, trade or treasure; it was for democracy pure and simple. Athens was special because its government existed “for the many instead of the few,” guaranteeing “equal justice to all.” No other regime, and certainly not the Spartans, could make the same claim.

Pericles died the following year, and Athens eventually went down in defeat in 404 B.C. But the idea that fighting for one’s country meant defending a political ideal continued to be influential. According to the 2nd-century Roman historian Cassius Dio, the Empire had to make war on the “lawless and godless” tribes living outside its borders. Fortifications such as Hadrian’s Wall in northern England weren’t just defensive measures but political statements: Inside bloomed civilization, outside lurked savagery.

The Great Wall of China, begun in 220 B.C. by Emperor Qin Shi Huang, had a similar function. In addition to keeping out the nomadic populations in the Mongolian plains, the wall symbolized the unity of the country under imperial rule and the Confucian belief system that supported it. Successive dynasties continued to fortify the Great Wall until the mid-17th century.

During the Napoleonic Wars, the British considered themselves to be fighting for democracy against dictatorship, like the ancient Athenians. In 1806, Napoleon instituted the Continental System, an economic blockade intended to cut off Britain from trading with France’s European allies and conquests. But the attack on free trade only strengthened British determination.

A similar resolve among the NATO allies eventually brought down the Iron Curtain: In 1991 the Soviet Union was dissolved and withdrew its armies from Eastern Europe. As Churchill had predicted, freedom and democracy are the ultimate shield against “war and tyranny.”

Historically Speaking: The Original Victims of Cancel Culture

Roman emperors and modern dictators have feared the social and spiritual penalties of excommunication.

The Wall Street Journal

January 28, 2021

Nowadays, all it takes for a person to be condemned to internal exile is a Twitter stampede of outrage. The lack of any regulating authority or established criteria for what constitutes repentance gives “cancel culture,” as it is popularly known, a particularly modern edge over more old-fashioned expressions of public shaming such as tar-and-feathering, boycotts and blacklists.

Portrait of Martin Luther by Lucas Cranach the Elder.
PHOTO: CORBIS/VCG/GETTY IMAGES

But the practice of turning nonconforming individuals into non-persons has been used with great effectiveness for centuries, nowhere more so than by the Roman Catholic Church through the punishment of excommunication. The penalties included social ostracism, refusal of communion and Christian burial, and the eternal damnation of one’s soul.

The fear inspired by excommunication was great enough to make even kings fall in line. In 390, soldiers under the command of the Roman emperor Theodosius I massacred thousands in the Greek city of Thessalonica. In response, Bishop Ambrose of Milan excommunicated Theodosius, forcing him to don sackcloth and ashes as public penance. Ambrose’s victory established the Church’s authority over secular rulers.

Later church leaders relied on the threat of excommunication to maintain their power, but the method could backfire. In 1054, Pope Leo IX of Rome excommunicated Patriarch Michael Cerularius of Constantinople, the head of the eastern Church, who retaliated by excommunicating Leo and the western Church. Since this Great Schism, the two churches, Roman Catholic and Eastern Orthodox, have never reunited.

During the Middle Ages, the penalty of excommunication broadened to include the cancellation of all legal protections, including the right to collect debts. Neither kings nor cities were safe. After being excommunicated by Pope Gregory VII in 1076, Holy Roman Emperor Henry IV stood barefoot in the snow for three days before the pontiff grudgingly welcomed him inside to hear his repentance. The entire city of Venice was excommunicated over half a dozen times, and on each occasion the frightened Venetians capitulated to papal authority.

But the excommunication of Martin Luther, the founder of Protestantism, by Pope Leo X in January 1521, 500 years ago this month, didn’t work out as planned. Summoned to explain himself at the Diet of Worms, a meeting presided over by the Holy Roman Emperor Charles V, Luther refused to recant and ask forgiveness, allegedly saying: “Here I stand, I can do no other.” In response, the Emperor declared him a heretic and outlaw, putting his life in danger. Luther was only saved from assassination by his patron, Frederick, Elector of Saxony, who hid him in a castle. Luther used the time to begin translating the Bible into German.

Napoleon Bonaparte was equally unconcerned about the spiritual consequences when he was excommunicated by Pope Pius VII in 1809. Nevertheless, he was sufficiently frightened of public opinion to kidnap the pontiff and keep him out of sight for several months. In 1938, angry over Nazi Germany’s takeover of Austria, the Italian dictator Benito Mussolini tried to persuade Pope Pius XI to excommunicate the German dictator Adolf Hitler, a nonpracticing Catholic. Who knows what would have happened if he had been successful.

Historically Speaking: Two Centuries of Exploring Antarctica

Charting the southern continent took generations of heroic sacrifice and international cooperation.

The Wall Street Journal

January 14, 2021

There is a place on Earth that remains untouched by war, slavery or riots. Its inhabitants coexist in peace, and all nationalities are welcomed. No, it’s not Neverland or Shangri-La—it’s Antarctica, home to the South Pole, roughly 20 million penguins and a transient population of about 4,000 scientists and support staff.

Antarctica’s existence was only confirmed 200 years ago. Following some initial sightings by British and Russian explorers in January 1820, Captain John Davis, a British-born American sealer and explorer, landed on the Antarctic Peninsula on Feb. 7, 1821. Davis was struck by its immense size, writing in his logbook, “I think this Southern Land to be a Continent.” It is, in fact, the fifth-largest of Earth’s seven continents.

Herbert Ponting is attacked by a penguin during the 1911 Scott expedition in Antarctica.
PHOTO: HERBERT PONTING/SCOTT POLAR RESEARCH INSTITUTE, UNIVERSITY OF CAMBRIDGE/GETTY IMAGES

People had long speculated that there had to be something down at the bottom of the globe—in cartographers’ terms, a Terra Australis Incognita (“unknown southern land”). The ancient Greeks referred to the putative landmass as “Ant-Arktos,” because it was on the opposite side of the globe from the constellation of Arktos, the Bear, which appears in the north. But the closest anyone came to penetrating the freezing wastes of the Antarctic Circle was Captain James Cook, the British explorer, who looked for a southern continent from 1772-75. He got within 80 miles of the coast, but the harshness of the region convinced Cook that “no man will ever venture further than I have done.”

Davis proved him wrong half a century later, but explorers were unable to make further progress until the heroic age of Antarctic exploration in the early 20th century. In 1911, the British explorer Robert F. Scott led a research expedition to the South Pole, only to be beaten by the Norwegian Roald Amundsen, who misled his backers about his true intentions and jettisoned scientific research for the sake of getting there quickly.

Extraordinarily bad luck led to the deaths of Scott and his teammates on their return journey. In 1915, Ernest Shackleton led a British expedition that aimed to make the first crossing of Antarctica by land, but his ship Endurance was trapped in the polar ice. The crew’s 18-month odyssey to return to civilization became the stuff of legend.

Soon exploration gave way to international competition over Antarctica’s natural resources. Great Britain marked almost two-thirds of the continent’s landmass as part of the British Empire, but a half dozen other countries also staked claims. In 1947 the U.S. joined the fray with Operation Highjump, a U.S. Navy-led mission to establish a research base that involved 13 ships and 33 aircraft.

Antarctica’s freedom and neutrality were in question during the Cold War. But in 1957, a group of geophysicists managed to launch a year-long Antarctic research project involving 12 countries. It was such a success that two years later the countries, including the U.S., the U.K. and the USSR, signed the Antarctic Treaty, guaranteeing the continent’s protection from militarization and exploitation. This goodwill toward men took a further 20 years to extend to women, but in 1979 American engineer Irene C. Peden became the first woman to work at the South Pole for an entire winter.

Historically Speaking: The Martini’s Contribution to Civilization

The cocktail was invented in the U.S., but it soon became a worldwide symbol of sophistication.

Wall Street Journal

December 18, 2020

In 1887, the Chicago Tribune hailed the martini as the quintessential Christmas drink, reminding readers that it is “made of Vermouth, Booth’s Gin, and Angostura Bitters.” That remains the classic recipe, even though no one can say for certain who created it.

The journalist H.L. Mencken famously declared that the martini was “the only American invention as perfect as the sonnet,” and there are plenty of claimants to the title of inventor. The city of Martinez, Calif., insists the martini was first made there in 1849, for a miner who wanted to celebrate a gold strike with something “special.” Another origin story gives the credit to Jerry Thomas, the bartender of the Occidental Hotel in San Francisco, in 1867.

Actor Pierce Brosnan as James Bond, with his signature martini.
PHOTO: MGM/EVERETT COLLECTION

Of course, just as calculus was discovered independently by Isaac Newton and Gottfried Leibniz, the martini may have sprung from multiple cocktail shakers. What soon made it stand out from all other gin cocktails was its association with high society. The hero of “Burning Daylight,” Jack London’s 1910 novel about a gold-miner turned entrepreneur, drinks martinis to prove to himself and others that he has “arrived.” Ernest Hemingway paid tribute to the drink in his 1929 novel “A Farewell to Arms” with the immortal line, “I had never tasted anything so cool and clean. They made me feel civilized.”

Prohibition was a golden age for the martini. Its adaptability was a boon: Even the coarsest bathtub gin could be made palatable with the addition of vermouth and olive brine (a dirty martini), a pickled onion (Gibson), lemon (twist), lime cordial (gimlet) or extra vermouth (wet). President Franklin D. Roosevelt was so attached to the cocktail that he tried a little martini diplomacy on Stalin during the Yalta conference of 1945. Stalin could just about stand the taste but informed Roosevelt that the cold on the way down wasn’t to his liking at all.

The American love affair with the martini continued in Hollywood films like “All About Eve,” starring Bette Davis, which portrayed it as the epitome of glamour and sophistication. But change was coming. In Ian Fleming’s 1954 novel “Live and Let Die,” James Bond ordered a martini made with vodka instead of gin. Worse, two years later in “Diamonds Are Forever,” Fleming described the drink as being “shaken and not stirred,” even though shaking weakens it. Then again, according to an analysis of Bond’s alcohol consumption published in the British Medical Journal in 2013, 007 sometimes downed the equivalent of 14 martinis in a 24-hour period, so his whole body would have been shaking.

American businessmen weren’t all that far behind. The three-martini lunch was a national pastime until business lunches ceased to be fully tax-deductible in the 1980s. Banished from meetings, the martini went back to its roots as a mixologists’ dream, reinventing itself as a ‘tini for all seasons.

The 1990s brought new varieties that even James Bond might have thought twice about, like the chocolate martini, made with creme de cacao, and the appletini, made with apple liqueur, cider or juice. Whatever your favorite, this holiday season let’s toast to feeling civilized.

Leaders Who Bowed Out Gracefully

Kings and politicians have used their last moments on the world stage to deliver words of inspiration.

The Wall Street Journal

November 5, 2020

The concession speech is one of the great accomplishments of modern democracy. The election is over and passions are running high, but the loser graciously concedes defeat, calls for national unity and reminds supporters that tomorrow is another day. It may be pure political theater, but it’s pageantry with a purpose.

For most of history, defeated rulers didn’t give concession speeches; they were too busy begging for their lives, since a king who lost his throne was usually killed shortly after. The Iliad recounts six separate occasions where a defeated warrior asks his opponent for mercy, only to be hacked to death anyway. The Romans had no interest whatsoever in listening to defeated enemies—except once, in the 1st century, when the British chieftain Caractacus was brought in chains before the Senate.

Republican presidential candidate John McCain delivers his concession speech on Nov. 4, 2008, after losing the election to Barack Obama.
PHOTO: ROBYN BECK/AGENCE FRANCE-PRESSE/GETTY IMAGES

On a whim, the Emperor Claudius told Caractacus to give one reason why his life should be spared. According to the historian Cassius Dio, the defeated Briton gave an impassioned speech about the glory of Rome, and how much greater it would be if he was spared: “If you save my life, I shall be an everlasting memorial of your clemency.” Impressed, the Senate set him free.

King Charles I had no hope for clemency on Jan. 30, 1649, when he faced execution after the English Civil War. But this made his speech all the more powerful, because Charles was speaking to posterity more than to his replacement, Oliver Cromwell. His final words have been a template for concession speeches ever since: After defending his record and reputation, Charles urged Cromwell to rule for the good of the country, “to endeavor to the last gasp the peace of the kingdom.”

In modern times, appeals to the nation became an important part of royal farewell speeches. When Napoleon Bonaparte abdicated as emperor of France in 1814, he stood in the courtyard of the palace of Fontainebleau and bade an emotional goodbye to the remnants of his Old Guard. He said that he was leaving to prevent further bloodshed, and ended with the exhortation: “I go, but you, my friends, will continue to serve France.”

Emperor Hirohito delivered a similar message in his radio broadcast on Aug. 14, 1945, announcing Japan’s surrender in World War II. The Emperor stressed that by choosing peace over annihilation he was serving the ultimate interests of the nation. He expected his subjects to do the same, to “enhance the innate glory of the Imperial State.” The shock of the Emperor’s words was compounded by the fact that no one outside the court and cabinet had ever heard his voice before.

In the U.S., the quality of presidential concession speeches rose markedly after they began to be televised in 1952. Over the years, Republican candidates, in particular, have elevated the art of losing to almost Churchillian heights. John McCain’s words on election night 2008, when he lost to Barack Obama, remain unmatched: “Americans never quit. We never surrender. We never hide from history. We make history.”