Leaders Who Bowed Out Gracefully

Kings and politicians have used their last moments on the world stage to deliver words of inspiration.

November 5, 2020

The Wall Street Journal

The concession speech is one of the great accomplishments of modern democracy. The election is over and passions are running high, but the loser graciously concedes defeat, calls for national unity and reminds supporters that tomorrow is another day. It may be pure political theater, but it’s pageantry with a purpose.

For most of history, defeated rulers didn’t give concession speeches; they were too busy begging for their lives, since a king who lost his throne was usually killed shortly after. The Iliad recounts six separate occasions where a defeated warrior asks his opponent for mercy, only to be hacked to death anyway. The Romans had no interest whatsoever in listening to defeated enemies—except once, in the 1st century, when the British chieftain Caractacus was brought in chains before the Senate.

Republican presidential candidate John McCain delivers his concession speech on Nov. 4, 2008, after losing the election to Barack Obama.
PHOTO: ROBYN BECK/AGENCE FRANCE-PRESSE/GETTY IMAGES

On a whim, the Emperor Claudius told Caractacus to give one reason why his life should be spared. According to the historian Cassius Dio, the defeated Briton gave an impassioned speech about the glory of Rome, and how much greater it would be if he were spared: “If you save my life, I shall be an everlasting memorial of your clemency.” Impressed, the Senate set him free.

King Charles I had no hope of clemency on Jan. 30, 1649, when he faced execution after the English Civil War. But this made his speech all the more powerful, because Charles was speaking to posterity more than to his replacement, Oliver Cromwell. His final words have been a template for concession speeches ever since: After defending his record and reputation, Charles urged Cromwell to rule for the good of the country, “to endeavor to the last gasp the peace of the kingdom.”

In modern times, appeals to the nation became an important part of royal farewell speeches. When Napoleon Bonaparte abdicated as emperor of France in 1814, he stood in the courtyard of the palace of Fontainebleau and bade an emotional goodbye to the remnants of his Old Guard. He said that he was leaving to prevent further bloodshed, and ended with the exhortation: “I go, but you, my friends, will continue to serve France.”

Emperor Hirohito delivered a similar message in his radio broadcast on Aug. 14, 1945, announcing Japan’s surrender in World War II. The Emperor stressed that by choosing peace over annihilation he was serving the ultimate interests of the nation. He expected his subjects to do the same, to “enhance the innate glory of the Imperial State.” The shock of the Emperor’s words was compounded by the fact that no one outside the court and cabinet had ever heard his voice before.

In the U.S., the quality of presidential concession speeches rose markedly after they began to be televised in 1952. Over the years, Republican candidates, in particular, have elevated the art of losing to almost Churchillian heights. John McCain’s words on election night 2008, when he lost to Barack Obama, remain unmatched: “Americans never quit. We never surrender. We never hide from history. We make history.”

Historically Speaking: Tales That Go Bump in the Night

From Homer to Edgar Allan Poe, ghost stories have given us a chilling good time

The Wall Street Journal

October 23, 2020

As the novelist Neil Gaiman, a master of the macabre, once said, “Fear is a wonderful thing, in small doses.” In this respect, we’re no different than our ancestors: They, too, loved to tell ghost stories.

One of the earliest ghosts in literature appears in Homer’s Odyssey. Odysseus entertains King Alcinous of Phaeacia with an account of his trip to the Underworld, where he met the spirits of Greek heroes killed in the Trojan War. The dead Achilles complains that being a ghost is no fun: “I should choose, so I might live on earth, to serve as the hireling of another…rather than to be lord over all the dead.”

ILLUSTRATION: THOMAS FUCHS

It was a common belief in both Eastern and Western societies that ghosts could sometimes return to right a great wrong, such as an improper burial. The idea that ghosts are intrinsically evil—the core of any good ghost story—received a boost from Plato, who believed that only wicked souls hang on after death; the good know when it’s time to let go.

Ghosts were problematic for early Christianity, which taught that sinners went straight to Hell; they weren’t supposed to be slumming it on Earth. The ghost story was dangerously close to heresy until the Church adopted the belief in Purgatory, a realm where the souls of minor sinners waited to be cleansed. The Byland Abbey tales, a collection of ghost stories recorded by an anonymous 15th-century English monk, suggest that the medieval Church regarded the supernatural as a useful form of advertising: Not paying the priest to say a mass for the dead could lead to a nasty case of haunting.

The ghost story reached its apogee in the early modern era with Shakespeare’s “Hamlet,” which opens with a terrified guard seeing the ghost of the late king on the battlements of Elsinore Castle. But the rise of scientific skepticism made the genre seem old-fashioned and unsophisticated. Ghosts were notably absent from English literature until Horace Walpole, son of Britain’s first prime minister, published the supernatural mystery novel “The Castle of Otranto” in 1764, as a protest against the deadening effect of “reason” on art.

Washington Irving was the first American writer to take the ghost story seriously, creating the Headless Horseman in his 1820 tale “The Legend of Sleepy Hollow.” He was a lightweight, however, compared with Edgar Allan Poe, who turned horror into an art form. His famous 1839 story “The Fall of the House of Usher” heightens the tension with ambiguity: For most of the story, it isn’t clear whether Roderick Usher’s house really is haunted, or if he is merely “enchained by certain superstitious impressions.”

Henry James used a similar technique in 1895, when, unhappy with the tepid reception of his novels in the U.S., he decided to frighten Americans into liking him. The result was the psychological horror story “The Turn of the Screw,” about a governess who may or may not be seeing ghosts. The reviews expressed horror at the horror, with one critic describing it as “the most hopelessly evil story that we could have read in any literature.” With such universal condemnation, success was assured.

Historically Speaking: The Power of Telling Stories in Pictures

‘Peanuts’ turns 70 this month, but the origins of narrative art go back to ancient Sumeria more than 4,000 years ago.

The Wall Street Journal

October 8, 2020

Good grief! It’s Snoopy’s 70th birthday. Charles M. Schulz’s “Peanuts” comic strip made its debut on Oct. 2, 1950, and though it ended with Schulz’s death in 2000, Snoopy, Charlie Brown, Lucy and their friends are still beloved today. Their longevity is a testament to the power of the comic strip—a form of storytelling that has been around for thousands of years.

The Sumerians were the first to integrate words and pictures to tell an individual’s story. The earliest example is the Stele of the Vultures, created around 2450 B.C., which depicts King Eannatum of Lagash’s crushing victory over the kingdom of Umma. The form, known as narrative sequential art, spread across the ancient world, reaching its apogee under the Roman emperor Trajan. A 126-foot-high column in Rome, completed in 113 A.D., recounts his successful campaign against the Dacians in 155 intricately carved scenes.

A detail from the Bayeux Tapestry.
PHOTO: PRINT COLLECTOR/GETTY IMAGES

Rather more humbly, though no less eloquently, a 1st-century tomb depicts the founding of the Roman city Capitolias, in what is now Jordan, through a series of wall paintings. One scene shows a stonemason at work. In the equivalent of a speech bubble, he says, “I am cutting stone.” Another figure says, “Alas for me! I am dead!”

Narrative sequential art developed independently in Asia as well as pre-Columbian America. Buildings in the Mayan city of Yaxchilan, now in Mexico, had carved stone lintels depicting scenes that associated the monarchy with divine rule. Lintel 24, made around 725 and now in the British Museum, shows King Shield Jaguar II and his wife Lady K’abal Xook undergoing painful blood rituals to prove their fitness to rule. Lady Xook pulls a thorn-encrusted rope through her tongue.

Royal and religious purposes were also served by the Bayeux Tapestry, made in the 11th century, which depicts the events leading up to the Norman conquest of England in 1066. Its original audience would have understood its more than 70 scenes as a drawn-out morality tale about the consequences of sin: England was invaded because King Harold broke his oath of loyalty to the Normans.

In the age of print, graphic narratives could reach much wider audiences. The English artist William Hogarth managed to make sin look sexy in “The Rake’s Progress,” a series of eight paintings that were turned into popular engravings in 1735.

But the comic strip as we know it today was invented in the 1830s, almost by chance, by the Swiss artist and caricaturist Rodolphe Töpffer. The comical cartoon sequences he drew for his friends attracted the admiration of Goethe, becoming so popular that publication was inevitable. His “Histoire de M. Vieux Bois” inspired hundreds of American imitators when it was published in the U.S. in 1842 as “The Adventures of Obadiah Oldbuck.”

The real golden age of American comic strips began in 1929, with Hal Foster’s adaptation of Tarzan into a continuous-action adventure strip. Marvel and DC Comics later turned Foster’s genius into a lucrative formula. Nearly a century later, the genre is still reaching new heights: In 2016, the late Congressman John Lewis’s memoir “March: Book Three” became the first graphic novel to win a National Book Award.

Historically Speaking: The Business and Pleasure of Dining Out

The food service industry will eventually overcome the pandemic, just as it bounced back from ancient Roman bans and Prohibition.

The Wall Street Journal

September 24, 2020

It remains anyone’s guess what America’s once-vibrant restaurant scene will look like in 2021. At the beginning of this year, there were 209 Michelin-starred restaurants in the U.S. This month, the editors of the Michelin Guide announced that just 29 had managed to reopen after the pandemic lockdown.

ILLUSTRATION: THOMAS FUCHS

The food-service industry has always had to struggle. In the Roman Empire, the typical eatery was the thermopolium, a commercial kitchen that sold mulled wine and a prepared meal—either to-go or, in larger establishments, to eat at the counter. They were extremely popular among the working poor—archaeologists have found over 150 in Pompeii alone—and therefore regarded with suspicion by the authorities. In 41 A.D., Emperor Claudius ordered a ban on thermopolia, but the setback was temporary at best.

In Europe during the Middle Ages, the “cook shop” served a similar function for the poor. For the wealthier sort, however, it was surprisingly difficult to find places to eat out. Only a few monasteries and taverns provided hospitality. In Geoffrey Chaucer’s 14th-century comic travelogue “The Canterbury Tales,” the pilgrims have to bring their own cook, Roger of Ware, who is said to be an expert at roasting, boiling, broiling and frying.

To experience restaurant-style dining with separate tables, waiters and a menu, one had to follow in the footsteps of the Venetian merchant Marco Polo to the Far East. The earliest prototype of the modern restaurant developed in Kaifeng, the last capital of the Song Dynasty (960-1279), to accommodate its vast transient population of merchants and officials. The accumulation of so many rich and homesick men led to a boom in sophisticated eateries offering meals cooked to order.

Europe had nothing similar until the French began to experiment with different forms of public catering in the 1760s. These new places advertised themselves as a healthy alternative to the tavern, offering restorative soups and broths—hence their name, the restaurant.

In 1782, this rather humble start inspired Antoine Beauvilliers to open the first modern restaurant, La Grande Taverne de Londres, which unashamedly replicated the luxury of royal dining. By the 1800s, the term “restaurant” in any language meant a superior establishment serving refined French cuisine.

In 1830, two Swiss brothers, John and Peter Delmonico, opened the first restaurant in the U.S., Delmonico’s in New York. It was a temple of haute cuisine, with uniformed waiters, imported linens and produce grown on a dedicated farm. What’s more, diners could make reservations ahead of time and order either a la carte or prix fixe—all novel concepts in 19th-century America.

Delmonico’s reign lasted until Prohibition, which forced thousands of U.S. restaurants out of business, unable to survive without alcohol sales. During that time, the only growth in the restaurant trade was in illegal speakeasies and family-friendly diners. Yet in 1934, just one year after Prohibition’s repeal, the art deco-themed Rainbow Room opened its doors at Rockefeller Plaza in New York. Out of the ashes of the old, a new age of elegance had begun.

Historically Speaking: Women Who Made the American West

From authors to outlaws, female pioneers helped to shape frontier society.

The Wall Street Journal

September 9, 2020

On Sept. 14, 1920, Connecticut became the 37th state to ratify the 19th Amendment, which guaranteed women the right to vote. The exercise was largely symbolic, since ratification had already been achieved thanks to Tennessee on Aug. 18. Still, the fact that Connecticut and the rest of the laggard states were located in the eastern part of the U.S. wasn’t a coincidence. Though women are often portrayed in Westerns as either vixens or victims, they played a vital role in the life of the American frontier.

The outlaw Belle Starr, born Myra Belle Shirley, photographed in 1886.
PHOTO: ROEDER BROTHERS/BUYENLARGE/GETTY IMAGES

Louisa Ann Swain of Laramie, Wyo., was the first woman in the U.S. to vote legally in a general election, in September 1870. The state was also ahead of the pack in granting women the right to sit on a jury, act as a justice of the peace and serve as a bailiff. Admittedly, it wasn’t so much enlightened thinking that opened up these traditionally male roles as it was the desperate shortage of women. No white woman crossed the continent until 17-year-old Nancy Kelsey traveled with her husband from Missouri to California in 1841. Once there, as countless pioneer women subsequently discovered, the family’s survival depended on her ability to manage without his help.

That women can and must fend for themselves was the essential message of the “Little House on the Prairie” series of books by Laura Ingalls Wilder, who was brought up on a series of homesteads in Wisconsin and Minnesota in the 1870s. Independence was so natural to her that she refused to say “I obey” in her marriage vows, explaining, “even if I tried, I do not think I could obey anybody against my better judgment.”

Although the American frontier represented incredible hardship and danger, for many women it also offered a unique kind of freedom. They could forge themselves anew, seizing opportunities that would have been impossible for women in the more settled and urbanized parts of the country.

This was especially true for women of color. Colorado’s first Black settler was a former slave named Clara Brown, who won her freedom in 1856 and subsequently worked her way west to the gold-mining town of Central City. Recognizing a need in the market, she founded a successful laundry business catering to miners and their families. Some of her profits went to buy land and shares in mines; the rest she spent on philanthropy, earning her the nickname “Angel of the Rockies.” After the Civil War, Brown made it her mission to locate her lost family, ultimately finding a grown-up daughter, Eliza.

However, the flip side of being able to “act like men” was that women had to be prepared to die like men, too. Myra Belle Shirley, aka Belle Starr, was a prolific Texas outlaw whose known associates included the notorious James brothers. Despite a long criminal career that mainly involved bootlegging and fencing stolen horses, Starr was convicted only once, resulting in a nine-month prison sentence in the Detroit House of Correction. Her luck finally ran out in 1889, two days before her 41st birthday. By then a widow for the third time, Belle was riding alone in Oklahoma when she was shot and killed in an ambush. The list of suspects included her own children, although the murder was never solved.

Historically Speaking: How Fear of Sharks Became an American Obsession

Since colonial times, we’ve dreaded what one explorer called ‘the most ravenous fish known in the sea’

The Wall Street Journal

August 27, 2020

There had never been a fatal shark incident in Maine until last month’s shocking attack on a woman swimmer by a great white near Bailey Island in Casco Bay. Scientists suspect that the recent rise in seal numbers, rather than the presence of humans, was responsible for luring the shark inshore.

A great white shark on the attack.
PHOTO: CHRIS PERKINS / MEDIADRUMWORLD / ZUMA PRESS

It’s often said that sharks aren’t the bloodthirsty killing machines portrayed in the media. In 2019 there were only 64 shark attacks world-wide, with just two fatalities. Still, they are feared for good reason.

The ancient Greeks knew well the horror that could await anyone unfortunate enough to fall into the wine-dark sea. Herodotus recorded how, in 492 B.C., a Persian invasion fleet of 300 ships was heading toward Greece when a sudden storm blew up around Mt. Athos. The ships broke apart, tossing some 20,000 men into the water. Those who didn’t drown immediately were “devoured” by sharks.

The Age of Discovery introduced European explorers not just to new landmasses but also to new shark species far more dangerous than the ones they knew at home. In a narrative of his 1593 journey to the South Seas, the explorer and pirate Richard Hawkins described the shark as “the most ravenous fishe knowne in the sea.”

It’s believed that the first deadly shark attack in the U.S. took place in 1642 at Spuyten Duyvil, an inlet on the Hudson River north of Manhattan. Antony Van Corlaer was attempting to swim across to the Bronx when a giant fish was seen to drag him under the water.

But the first confirmed American survivor of a shark attack was Brook Watson, a 14-year-old sailor from Boston. In 1749, Watson was serving on board a merchant ship when he was attacked while swimming in Cuba’s Havana Harbor. Fortunately, his crewmates were able to launch a rowboat and pull him from the water, leaving Watson’s right foot in the shark’s mouth.

Despite having a wooden leg, Watson enjoyed a successful career at sea before returning to his British roots to enter politics. He ended up serving as Lord Mayor of London and becoming Sir Brook Watson. His miraculous escape was immortalized by his friend the American painter John Singleton Copley. “Watson and the Shark” was completely fanciful, however, since Copley had never seen a shark.

The American relationship with sharks was changed irrevocably during the summer of 1916. The East Coast was gripped by both a heat wave and a polio epidemic, leaving the beach as one of the few safe places for Americans to relax. On July 1, a man was killed by a shark on Long Beach Island off the New Jersey coast. Over the next 10 days, sharks in the area killed three more people and left one severely injured. In the ensuing national uproar, President Woodrow Wilson offered federal funds to help get rid of the sharks, an understandable but impossible wish.

The Jersey Shore attacks served as an inspiration for Peter Benchley’s bestselling 1974 novel “Jaws,” which was turned into a blockbuster film the next year by Steven Spielberg. Since then the shark population in U.S. waters has dropped by 60%, in part due to an increase in shark-fishing inspired by the movie. Appalled by what he had unleashed, Benchley spent the last decades of his life campaigning for shark conservation.

Historically Speaking: The Sharp Riposte as a Battle Tactic

Caustic comebacks have been exchanged between military leaders for millennia, from the Spartans to World War II

The Wall Street Journal

August 6, 2020

In the center of Bastogne, Belgium (pop. 16,000), there is a statue of U.S. Army General Anthony C. McAuliffe, who died 45 years ago this week. It’s a small town with a big history. Bastogne came under attack during the final German offensive of World War II, known as the Battle of the Bulge. The town was the gateway to Antwerp, a vital port for the Allies, and all that stood between the Germans and their objective was Gen. McAuliffe and his 101st Airborne Division. Despite being outnumbered by a factor of four to one, he refused to surrender, fighting on until Gen. George Patton’s reinforcements could break the siege.

ILLUSTRATION: THOMAS FUCHS

While staving off the German attack, Gen. McAuliffe uttered the greatest comeback of the war. A typewritten ultimatum from Commander Heinrich von Lüttwitz of the 47th German Panzer Corps gave him two hours to surrender the town or face “annihilation.” With ammunition running low and casualties mounting, the general made his choice. He sent back the following typed reply:

December 22, 1944

To the German Commander,

N U T S !

The American Commander

The true laconic riposte is extremely rare. The Spartans, whose ancient homeland of Lakonia inspired the term “laconic,” were masters of the art. When Philip II of Macedon ordered them to open their borders, he warned them, “For if I bring my army into your land, I will destroy your farms, slay your people and raze your city.” According to the later account of the Greek philosopher Plutarch, they sent back a one-word message: “If.”

Once the age of the Spartans passed, it took more than a millennium for the laconic comeback to return in earnest. The man most responsible was a colorful Teutonic knight called Götz von Berlichingen, who participated in countless German conflicts, uprisings and skirmishes, including the Swabian War between the Hapsburgs and the Swiss. He lost a hand to a cannonball and wore an iron prosthesis (hence his nickname, Götz of the Iron Hand). In 1515, sick of trading insults with an opponent who wouldn’t come out to fight, Götz abruptly ended the conversation with: “soldt mich hinden leckhenn,” which literally meant “kiss my ass.”

He recorded the encounter in his memoirs, but it remained little known until Johann Wolfgang von Goethe adapted Götz’s autobiography into a successful play in 1773. From then on, the insult was popularly known in Germany as a “Swabian salute.”

In his novel “Les Misérables,” Victor Hugo—possibly inspired by Goethe—immortalized the French defeat in 1815 at the Battle of Waterloo with a scene of spectacular, if laconic, defiance that incorporated France’s most common expletive. According to Hugo, Gen. Pierre Cambronne, commander of Napoleon Bonaparte’s Imperial Guard, fought to the last syllable in the face of overwhelming force. Encircled by the British army, “They could hear the sound of the guns being reloaded and see the lighted slow matches gleaming like the eyes of tigers in the dusk. An English general called out to them, ‘Brave Frenchmen, will you not surrender?’ Cambronne answered, ‘Merde’” (that is, “shit”).

The scene’s veracity is still hotly debated, the fog of war making memories hazy. But Cambronne—who survived—later disavowed it, especially after “le mot de Cambronne” (Cambronne’s word) became a common euphemism for the profanity.

Historically Speaking: The American Invention of Summer Camp

Since 1876, children have looked forward to their long vacation as a time to build friendships and character.

July 23, 2020

The Wall Street Journal

Hello Muddah, hello Faddah,/Here I am at Camp Granada…

With more than half of the country’s 14,000 summer camps temporarily closed because of Covid-19, millions of children are missing out on experiences that have helped to shape young Americans for nearly 150 years.

I went hiking with Joe Spivey/He developed poison ivy…

Hayley Mills plays twins who meet at summer camp in ‘The Parent Trap’ (1961).
PHOTO: EVERETT COLLECTION

The idea that a spell in the great outdoors builds character has ancient roots. The Spartans practiced a particularly rigorous form: When a warrior-in-training reached the age of 12, he was sent into the wilderness for a year. Those who gave up were barred from attaining full citizenship.

And the head coach wants no sissies/So he reads to us from something called Ulysses…

But the modern summer camp can be traced to the Transcendentalist movement of the 1830s and ‘40s. Henry David Thoreau and Ralph Waldo Emerson were ardent proselytizers for learning to live at one with nature. Their message resonated with the environmentalist Joseph T. Rothrock, who founded the country’s first sleep-away camp, the North Mountain School of Physical Culture, in 1876 near Wilkes-Barre, Pa. Rothrock believed he could take “weakly boys” from the city and rehabilitate them into healthy young men.

Ernest Balch was moved by “the miserable condition of boys from well-to-do families” who spent their summers living in hotels, rather than out in nature. He was still a Dartmouth College student when he founded Camp Chocorua in New Hampshire in 1881. Its emphasis on self-reliance and character-building became the blueprint for other summer camps.

You remember Jeffrey Hardy/They’re about to organize a searching party…

By 1918 there were over 1,000 summer camps in the U.S. Charles W. Eliot, a former president of Harvard, went so far as to declare in 1922 that summer camp was “the most important step in education that America has given the world.”

This would have been news to Britain, where Robert Baden-Powell, founder of the Scout movement, had been running his own summer camp since 1907. But American camps were unique in their diversity, with options for every faith and political creed. The oldest camp for Black children, Camp Atwater, was founded in North Brookfield, Mass., in 1921, at a time when summer camping was segregated; it is still going strong today.

Take me home, oh Muddah, Faddah…

Summer camp retained a wholesome image for decades. Films such as “The Parent Trap” (1961), starring Hayley Mills as separated twin sisters who are unexpectedly reunited at a summer camp, focused on innocent fun. But darker themes were coming. The “Friday the 13th” franchise, launched in 1980, has a higher body count than many war films, with much of the carnage taking place at the fictional Camp Crystal Lake.

Wait a minute, it’s stopped hailing/Guys are swimming, guys are sailing…

But summer camps continued to grow. By the mid-2010s, according to the American Camp Association, they were an $18 billion industry serving 14 million campers every year. The disappointment of missing camp this summer will hopefully make it even more joyful to return next year. As Allan Sherman’s beloved satire concludes: Muddah, Faddah, kindly disregard this letter.

Historically Speaking: The Delicious Evolution of Mayonnaise

Ancient Romans ate a pungent version, but the modern egg-based spread was created by an 18th-century French chef.

July 9, 2020

The Wall Street Journal

I can’t imagine a summer picnic without mayonnaise—in the potato salad, the veggie dips, the coleslaw, and yes, even on the french fries. It feels like a great dollop of pure Americana in a creamy, satisfying sort of way. But like a lot of what makes our country so successful, mayonnaise originally came from somewhere else.

Where, exactly, is one of those food disputes that will never be resolved, along with the true origins of baklava pastry, hummus and the pisco sour cocktail. In all likelihood, the earliest version of mayonnaise was an ancient Roman concoction of garlic and olive oil, much praised for its medicinal properties by Pliny the Elder in his first-century encyclopedia “Naturalis Historia.” This strong-tasting, aioli-like proto-mayonnaise remained a southern Mediterranean specialty for millennia.

But most historians believe that modern mayonnaise was born in 1756 in the port city of Mahon, in the Balearic Islands off the coast of Spain. At the start of the Seven Years’ War between France and Britain, the French navy, led by the Duc de Richelieu, smashed Admiral Byng’s poorly armed fleet at the Battle of Minorca. (Byng was subsequently executed for not trying hard enough.) While preparing the fish course for Richelieu’s victory dinner, his chef coped with the lack of cream on the island by ingeniously substituting a goo of eggs mixed with oil and garlic.

The anonymous cook took the recipe for “mahonnaise” back to France, where it was vastly improved by Marie-Antoine Carême, the founder of haute cuisine. Carême realized that whisking rather than simply stirring the mixture created a soft emulsion that could be used in any number of dishes, from the savory to the sweet.

It wasn’t just the French who fell for Carême’s version. Mayonnaise blended easily with local cuisines, evolving into tartar sauce in Eastern Europe, remoulades in the Baltic countries and salad dressing in Britain. By 1838, the menu at the iconic New York restaurant Delmonico’s featured lobster mayonnaise as a signature dish.

All that whisking, however, made mayonnaise too laborious for home cooks until the invention of the mechanical eggbeater, first patented by the Black inventor Willis Johnson, of Cincinnati, Ohio, in 1884. “Try it once,” gushed Good Housekeeping magazine in 1889, “and you’ll never go back to the old way as long as you live.”

Making mayonnaise was one thing, preserving it quite another, since the raw egg made it spoil quickly. The conundrum was finally solved in 1912 by Richard Hellmann, a German-American deli owner in New York. By using his own trucks and factories, Hellmann was able to manufacture and transport mayonnaise faster. And in a revolutionary move, he designed the distinctive wide-necked Hellmann’s jar, encouraging liberal slatherings of mayo and thereby speeding up consumption.

Five years later, Eugenia Duke of South Carolina created Duke’s mayonnaise, which is eggier and has no sugar. The two brands are still dueling it out. But when it comes to eating, there are no winners and losers in the mayo department, just 14 grams of fat and 103 delicious calories per tablespoon.

Historically Speaking: Pioneers of America’s Black Press

Since the early 19th century, African-American publications have built community and challenged injustice.

June 18, 2020

The Wall Street Journal

It isn’t enough to have a voice, it must also be used and heard. “Too long have others spoken for us,” announced the first issue of Freedom’s Journal, the first black-owned and operated newspaper in the U.S. Founded by Samuel Cornish and John Russwurm in 1827, the weekly New York City paper published news and opinions; almost equally important, it provided an advertising platform for black businesses, charities and organizations. During its two-year run, Freedom’s Journal reached 50,000 households in 11 states, helping to foster a sense of community and pride among the free but disenfranchised African-Americans of the North.

Journalist Ida B. Wells, ca. 1893.
PHOTO: ALAMY

Black-owned newspapers multiplied in the decades before the Civil War. Some were abolitionist, such as Frederick Douglass’s The North Star, founded in Rochester, N.Y., in 1847. Others were religious: The Christian Recorder, the official periodical of the African Methodist Episcopal Church, was founded in 1848 and is still published today, making it the longest-running African-American periodical.

Black journalists were often targeted for violence by whites. In 1892, the lynching of three black men in Memphis, Tenn., prompted the young Ida B. Wells to begin an anti-lynching crusade in the Memphis Free Speech and Headlight, the newspaper she co-owned and edited. Her courageous reporting brought international attention to the atrocities taking place with impunity in the South. But it also made her a marked woman: The newspaper’s staff was attacked and its office burned down. Wells left Memphis and urged her readers to do the same: “There is, therefore, only one thing left to do; save our money and leave.”

In the early 20th century, the Chicago Defender was one of the most influential papers in the U.S., black or white. Its aggressive championing of what would become known as “The Great Migration” helped persuade many African-Americans to move to Chicago and other Northern cities. Literature and the arts were nourished by The Crisis, the monthly magazine of the NAACP. Founded in 1910 by W.E.B. Du Bois, it featured up-and-coming black writers such as Langston Hughes and Countee Cullen.

The influence of black-owned newspapers sometimes incurred the wrath of the U.S. government. During World War II, the Pittsburgh Courier campaigned for equal rights for black soldiers, leading FBI Director J. Edgar Hoover to try to charge its publishers with treason. The effort was thwarted by other members of the Roosevelt administration, and black newspapers continued to play a vital role in the civil-rights movement of the 1950s and ’60s.

But barriers remained. In 1968, Coretta Scott King had to demand that the press pool covering her husband’s funeral include a black photographer. Moneta Sleet of Ebony magazine was selected and his photograph of Mrs. King and her daughter went on to win a Pulitzer Prize, making him the first African-American man to receive the award.

The Pulitzer confirmed Ebony’s status as the leading black publication in the U.S. Founded by John H. Johnson in 1945, by the early 2000s it was read by almost 40% of all African-American adults, giving it a clout that corporate America couldn’t afford to ignore.

The black press is continually evolving and expanding, from the women’s magazine Essence, which celebrated its 50th birthday this year, to Blavity, a web magazine for millennials launched in 2014. The current Black Lives Matter protests show just how vital these voices continue to be.