Historically Speaking: Two Centuries of Exploring Antarctica

Charting the southern continent took generations of heroic sacrifice and international cooperation.

The Wall Street Journal

January 14, 2021

There is a place on Earth that remains untouched by war, slavery or riots. Its inhabitants coexist in peace, and all nationalities are welcomed. No, it’s not Neverland or Shangri-La—it’s Antarctica, home to the South Pole, roughly 20 million penguins and a transient population of about 4,000 scientists and support staff.

Antarctica’s existence was only confirmed 200 years ago. Following some initial sightings by British and Russian explorers in January 1820, Captain John Davis, a British-born American sealer and explorer, landed on the Antarctic Peninsula on Feb. 7, 1821. Davis was struck by its immense size, writing in his logbook, “I think this Southern Land to be a Continent.” It is, in fact, the fifth-largest of Earth’s seven continents.

Herbert Ponting is attacked by a penguin during the 1911 Scott expedition in Antarctica.
PHOTO: HERBERT PONTING/SCOTT POLAR RESEARCH INSTITUTE, UNIVERSITY OF CAMBRIDGE/GETTY IMAGES

People had long speculated that there had to be something down at the bottom of the globe—in cartographers’ terms, a Terra Australis Incognita (“unknown southern land”). The ancient Greeks referred to the putative landmass as “Ant-Arktos,” because it was on the opposite side of the globe from the constellation of Arktos, the Bear, which appears in the north. But the closest anyone came to penetrating the freezing wastes of the Antarctic Circle was Captain James Cook, the British explorer, who looked for a southern continent from 1772 to 1775. He got within 80 miles of the coast, but the harshness of the region convinced Cook that “no man will ever venture further than I have done.”

Davis proved him wrong half a century later, but explorers were unable to make further progress until the heroic age of Antarctic exploration in the early 20th century. In 1911, the British explorer Robert F. Scott led a research expedition to the South Pole, only to be beaten by the Norwegian Roald Amundsen, who misled his backers about his true intentions and jettisoned scientific research for the sake of getting there quickly.

Extraordinarily bad luck led to the deaths of Scott and his teammates on their return journey. In 1915, Ernest Shackleton led a British expedition that aimed to make the first crossing of Antarctica by land, but his ship Endurance was trapped in the polar ice. The crew’s 18-month odyssey to return to civilization became the stuff of legend.

Soon exploration gave way to international competition over Antarctica’s natural resources. Great Britain marked almost two-thirds of the continent’s landmass as part of the British Empire, but a half dozen other countries also staked claims. In 1947 the U.S. joined the fray with Operation High Jump, a U.S. Navy-led mission to establish a research base that involved 13 ships and 23 aircraft.

Antarctica’s freedom and neutrality were in question during the Cold War. But in 1957, a group of geophysicists managed to launch a year-long Antarctic research project involving 12 countries. It was such a success that two years later the countries, including the U.S., the U.K. and the USSR, signed the Antarctic Treaty, guaranteeing the continent’s protection from militarization and exploitation. This goodwill toward men took a further 20 years to extend to women, but in 1979 American engineer Irene C. Peden became the first woman to work at the South Pole for an entire winter.

Historically Speaking: Awed by the Meteor Shower of the New Year’s Sky

Human beings have always marveled at displays like this weekend’s Quadrantids, but now we can understand them as well.

The Wall Street Journal

January 1, 2021

If you wish upon a star this week, you probably won’t get your heart’s desire. But if you’re lucky, you’ll be treated to an outstanding display of the Quadrantids, the annual New Year’s meteor shower that rivals the Perseids in intensity and quality of fireballs. The Quadrantids are exceptionally brief, however: The peak lasts only a few hours on January 2, and a cloudy sky or full moon can ruin the entire show.

A long-exposure photograph of the Draconid meteor shower in October 2018.
PHOTO: SMITYUK YURI/TASS/ZUMA PRESS

Meteor showers happen when the Earth encounters dust and rock sloughed off by a comet as it orbits the sun. The streaks of light we see are produced by this debris burning up in the Earth’s atmosphere.

Human beings have been aware of the phenomenon since ancient times. Some Christian archaeologists have theorized that the biblical story of Sodom and Gomorrah was inspired by a massive meteor strike near the Dead Sea some 3,700 years ago, which wiped out the Bronze Age city of Tall el-Hammam in modern Jordan.

Aristotle believed that comets and meteors weren’t heavenly bodies but “exhalations” from the Earth that ignited in the sky. As a result, Western astronomers took little interest in them until the rise of modern science. By contrast, the Chinese began recording meteor events as early as 687 B.C. The Mayans were also fascinated by meteor showers: Studies of hieroglyphic records suggest that important occasions, such as royal coronations, were timed to coincide with the Eta Aquarid shower in the spring.

Even before telescopes were invented, it wasn’t hard to observe comets, meteors and meteor showers. The 11th-century Bayeux Tapestry contains a depiction of Halley’s comet, which appeared in 1066. But people couldn’t see meteors for what they really were. Medieval Christians referred to the annual Perseid shower as “the tears of St. Lawrence,” believing that the burning tears of the martyred saint lit up the sky on his feast day, August 10.

Things began to change in the 19th century, as astronomers noticed that some meteor showers recurred on a fixed cycle. In November 1799, the Leonid shower was recorded by Andrew Ellicott, an American surveyor on a mission to establish the boundary between the U.S. and the Spanish territory of Florida. Ellicott was on board a ship in the Florida Keys when he observed the Leonids, writing in his journal that “the whole heavens appeared as if illuminated with skyrockets, flying in an infinity of directions, and I was in constant expectation of some of them falling on the vessel.” When a similar spectacle lit up the skies in the eastern U.S. in 1833, astronomers realized that it was a recurrence of the same phenomenon and that the meteor storm must be linked to the orbit of a particular comet.

The origin of the Quadrantids was harder to locate. Astronomers kept looking for its parent comet until 2003, when NASA scientist Peter Jenniskens realized that they were on the wrong track: The shower is actually caused by a giant asteroid, designated 2003 EH1, which broke off from a comet 500 years ago. It is somehow fitting that a mystery of the New Year’s night sky yielded to the power of an open mind.

Historically Speaking: The Martini’s Contribution to Civilization

The cocktail was invented in the U.S., but it soon became a worldwide symbol of sophistication.

The Wall Street Journal

December 18, 2020

In 1887, the Chicago Tribune hailed the martini as the quintessential Christmas drink, reminding readers that it is “made of Vermouth, Booth’s Gin, and Angostura Bitters.” That remains the classic recipe, even though no one can say for certain who created it.

The journalist H.L. Mencken famously declared that the martini was “the only American invention as perfect as the sonnet,” and there are plenty of claimants to the title of inventor. The city of Martinez, Calif., insists the martini was first made there in 1849, for a miner who wanted to celebrate a gold strike with something “special.” Another origin story gives the credit to Jerry Thomas, the bartender of the Occidental Hotel in San Francisco, in 1867.

Actor Pierce Brosnan as James Bond, with his signature martini.
PHOTO: MGM/EVERETT COLLECTION

Of course, just as calculus was discovered simultaneously by Isaac Newton and Gottfried Leibniz, the martini may have sprung from multiple cocktail shakers. What soon made it stand out from all other gin cocktails was its association with high society. The hero of “Burning Daylight,” Jack London’s 1910 novel about a gold-miner turned entrepreneur, drinks martinis to prove to himself and others that he has “arrived.” Ernest Hemingway paid tribute to the drink in his 1929 novel “A Farewell to Arms” with the immortal line, “I had never tasted anything so cool and clean. They made me feel civilized.”

Prohibition was a golden age for the martini. Its adaptability was a boon: Even the coarsest bathtub gin could be made palatable with the addition of vermouth and olive brine (a dirty martini), a pickled onion (Gibson), lemon (twist), lime cordial (gimlet) or extra vermouth (wet). President Franklin D. Roosevelt was so attached to the cocktail that he tried a little martini diplomacy on Stalin during the Yalta conference of 1945. Stalin could just about stand the taste but informed Roosevelt that the cold on the way down wasn’t to his liking at all.

The American love affair with the martini continued in Hollywood films like “All About Eve,” starring Bette Davis, which portrayed it as the epitome of glamour and sophistication. But change was coming. In Ian Fleming’s 1954 novel “Live and Let Die,” James Bond ordered a martini made with vodka instead of gin. Worse, two years later in “Diamonds Are Forever,” Fleming described the drink as being “shaken and not stirred,” even though shaking weakens it. Then again, according to an analysis of Bond’s alcohol consumption published in the British Medical Journal in 2013, 007 sometimes downed the equivalent of 14 martinis in a 24-hour period, so his whole body would have been shaking.

American businessmen weren’t all that far behind. The three-martini lunch was a national pastime until business lunches ceased to be fully tax-deductible in the 1980s. Banished from meetings, the martini went back to its roots as a mixologists’ dream, reinventing itself as a ’tini for all seasons.

The 1990s brought new varieties that even James Bond might have thought twice about, like the chocolate martini, made with creme de cacao, and the appletini, made with apple liqueur, cider or juice. Whatever your favorite, this holiday season let’s toast to feeling civilized.

Leaders Who Bowed Out Gracefully

Kings and politicians have used their last moments on the world stage to deliver words of inspiration.

The Wall Street Journal

November 5, 2020

The concession speech is one of the great accomplishments of modern democracy. The election is over and passions are running high, but the loser graciously concedes defeat, calls for national unity and reminds supporters that tomorrow is another day. It may be pure political theater, but it’s pageantry with a purpose.

For most of history, defeated rulers didn’t give concession speeches; they were too busy begging for their lives, since a king who lost his throne was usually killed shortly afterward. The Iliad recounts six separate occasions on which a defeated warrior asks his opponent for mercy, only to be hacked to death anyway. The Romans had no interest whatsoever in listening to defeated enemies—except once, in the 1st century, when the British chieftain Caractacus was brought in chains before the Senate.

Republican presidential candidate John McCain delivers his concession speech on Nov. 4, 2008, after losing the election to Barack Obama.
PHOTO: ROBYN BECK/AGENCE FRANCE-PRESSE/GETTY IMAGES

On a whim, the Emperor Claudius told Caractacus to give one reason why his life should be spared. According to the historian Cassius Dio, the defeated Briton gave an impassioned speech about the glory of Rome, and how much greater it would be if he was spared: “If you save my life, I shall be an everlasting memorial of your clemency.” Impressed, the Senate set him free.

King Charles I had no hope of clemency on Jan. 30, 1649, when he faced execution after the English Civil War. But this made his speech all the more powerful, because Charles was speaking to posterity more than to his replacement, Oliver Cromwell. His final words have been a template for concession speeches ever since: After defending his record and reputation, Charles urged Cromwell to rule for the good of the country, “to endeavor to the last gasp the peace of the kingdom.”

In modern times, appeals to the nation became an important part of royal farewell speeches. When Napoleon Bonaparte abdicated as emperor of France in 1814, he stood in the courtyard of the palace of Fontainebleau and bade an emotional goodbye to the remnants of his Old Guard. He said that he was leaving to prevent further bloodshed, and ended with the exhortation: “I go, but you, my friends, will continue to serve France.”

Emperor Hirohito delivered a similar message in his radio broadcast on Aug. 14, 1945, announcing Japan’s surrender in World War II. The Emperor stressed that by choosing peace over annihilation he was serving the ultimate interests of the nation. He expected his subjects to do the same, to “enhance the innate glory of the Imperial State.” The shock of the Emperor’s words was compounded by the fact that no one outside the court and cabinet had ever heard his voice before.

In the U.S., the quality of presidential concession speeches rose markedly after they began to be televised in 1952. Over the years, Republican candidates, in particular, have elevated the art of losing to almost Churchillian heights. John McCain’s words on election night 2008, when he lost to Barack Obama, remain unmatched: “Americans never quit. We never surrender. We never hide from history. We make history.”

Historically Speaking: Tales That Go Bump in the Night

From Homer to Edgar Allan Poe, ghost stories have given us a chilling good time

The Wall Street Journal

October 23, 2020

As the novelist Neil Gaiman, a master of the macabre, once said, “Fear is a wonderful thing, in small doses.” In this respect, we’re no different than our ancestors: They, too, loved to tell ghost stories.

One of the earliest ghosts in literature appears in Homer’s Odyssey. Odysseus entertains King Alcinous of Phaeacia with an account of his trip to the Underworld, where he met the spirits of Greek heroes killed in the Trojan War. The dead Achilles complains that being a ghost is no fun: “I should choose, so I might live on earth, to serve as the hireling of another…rather than to be lord over all the dead.”

ILLUSTRATION: THOMAS FUCHS

It was a common belief in both Eastern and Western societies that ghosts could sometimes return to right a great wrong, such as an improper burial. The idea that ghosts are intrinsically evil—the core of any good ghost story—received a boost from Plato, who believed that only wicked souls hang on after death; the good know when it’s time to let go.

Ghosts were problematic for early Christianity, which taught that sinners went straight to Hell; they weren’t supposed to be slumming it on Earth. The ghost story was dangerously close to heresy until the Church adopted the belief in Purgatory, a realm where the souls of minor sinners waited to be cleansed. The Byland Abbey tales, a collection of ghost stories recorded by an anonymous 15th-century English monk, suggest that the medieval Church regarded the supernatural as a useful form of advertising: Not paying the priest to say a mass for the dead could lead to a nasty case of haunting.

The ghost story reached its apogee in the early modern era with Shakespeare’s “Hamlet,” which opens with a terrified guard seeing the ghost of the late king on the battlements of Elsinore Castle. But the rise of scientific skepticism made the genre seem old-fashioned and unsophisticated. Ghosts were notably absent from English literature until Horace Walpole, son of Britain’s first prime minister, published the supernatural mystery novel “The Castle of Otranto” in 1764, as a protest against the deadening effect of “reason” on art.

Washington Irving was the first American writer to take the ghost story seriously, creating the Headless Horseman in his 1820 tale “The Legend of Sleepy Hollow.” Irving was a lightweight, however, compared with Edgar Allan Poe, who turned horror into an art form. His famous 1839 story “The Fall of the House of Usher” heightens the tension with ambiguity: For most of the story, it isn’t clear whether Roderick Usher’s house really is haunted, or if he is merely “enchained by certain superstitious impressions.”

Henry James used a similar technique in 1895, when, unhappy with the tepid reception of his novels in the U.S., he decided to frighten Americans into liking him. The result was the psychological horror story “The Turn of the Screw,” about a governess who may or may not be seeing ghosts. The reviews expressed horror at the horror, with one critic describing it as “the most hopelessly evil story that we could have read in any literature.” With such universal condemnation, success was assured.

Historically Speaking: The Business and Pleasure of Dining Out

The food service industry will eventually overcome the pandemic, just as it bounced back from ancient Roman bans and Prohibition.

The Wall Street Journal

September 24, 2020

It remains anyone’s guess what America’s once-vibrant restaurant scene will look like in 2021. At the beginning of this year, there were 209 Michelin-starred restaurants in the U.S. This month, the editors of the Michelin Guide announced that just 29 had managed to reopen after the pandemic lockdown.

ILLUSTRATION: THOMAS FUCHS

The food-service industry has always had to struggle. In the Roman Empire, the typical eatery was the thermopolium, a commercial kitchen that sold mulled wine and a prepared meal—either to go or, in larger establishments, to eat at the counter. They were extremely popular among the working poor—archaeologists have found over 150 in Pompeii alone—and therefore regarded with suspicion by the authorities. In 41 A.D., Emperor Claudius ordered a ban on thermopolia, but the setback was temporary at best.

In Europe during the Middle Ages, the “cook shop” served a similar function for the poor. For the wealthier sort, however, it was surprisingly difficult to find places to eat out. Only a few monasteries and taverns provided hospitality. In Geoffrey Chaucer’s 14th-century comic travelogue “The Canterbury Tales,” the pilgrims have to bring their own cook, Roger of Ware, who is said to be an expert at roasting, boiling, broiling and frying.

To experience restaurant-style dining with separate tables, waiters and a menu, one had to follow in the footsteps of the Venetian merchant Marco Polo to the Far East. The earliest prototype of the modern restaurant developed in Kaifeng, the last capital of the Song Dynasty (960-1279), to accommodate its vast transient population of merchants and officials. The accumulation of so many rich and homesick men led to a boom in sophisticated eateries offering meals cooked to order.

Europe had nothing similar until the French began to experiment with different forms of public catering in the 1760s. These new places advertised themselves as a healthy alternative to the tavern, offering restorative soups and broths—hence their name, the restaurant.

In 1782, this rather humble start inspired Antoine Beauvilliers to open the first modern restaurant, La Grande Taverne de Londres, which unashamedly replicated the luxury of royal dining. By the 1800s, the term “restaurant” in any language meant a superior establishment serving refined French cuisine.

In 1830, two Swiss brothers, John and Peter Delmonico, opened the first restaurant in the U.S., Delmonico’s in New York. It was a temple of haute cuisine, with uniformed waiters, imported linens and produce grown on a dedicated farm. What’s more, diners could make reservations ahead of time and order either a la carte or prix fixe—all novel concepts in 19th-century America.

Delmonico’s reign lasted until Prohibition, which forced thousands of U.S. restaurants out of business, unable to survive without alcohol sales. During that time, the only growth in the restaurant trade was in illegal speakeasies and family-friendly diners. Yet in 1934, just one year after Prohibition’s repeal, the art deco-themed Rainbow Room opened its doors at Rockefeller Plaza in New York. Out of the ashes of the old, a new age of elegance had begun.

Stepping out of the Shadows

Sylvia Pankhurst by Rachel Holmes, review — finally having her moment.

Her mother and sister were once better known, but this fine biography shows just how remarkable the women’s rights activist was.

The Times

September 22, 2020

After decades of obscurity, Sylvia Pankhurst is finally having her moment. This is the third biography in seven years — not bad for a woman who spent much of her life being unfavourably compared with her more popular mother and sister.

The neglect is partly owing to Sylvia’s rich, complex life not being easily pigeonholed. Although she played an instrumental role in the suffrage movement, she was first and foremost a defender of the poor, the oppressed and the marginalised. Her political choices were often noble but lonely ones.

Sylvia inherited her appetite for social activism and boundless energy for work from her parents, Richard and Emmeline. A perpetually aspiring MP, Richard cheerfully espoused atheism, women’s suffrage, republicanism, anti-imperialism and socialism at a time when any one of these causes was sufficient to scupper a man’s electoral chances. Emmeline was just as politically involved and only slightly less radical.

Sylvia’s mother, the suffragette Emmeline Pankhurst
ALAMY

Despite financial troubles and career disappointments, the Pankhurst parents were a devoted couple and the household a happy one. Sylvia was born in 1882, the second of five children and the middle daughter between Christabel (her mother’s favourite) and Adela (no one’s favourite). She craved Emmeline’s good opinion, but was closer to her father. “Life is valueless without enthusiasms,” he once told her, a piece of advice she took to heart.

Sylvia was only 16 when her father died. Without his counter-influence, the three sisters (and their brother, Harry, who died of polio aged 20) lived in thrall to their powerful mother. After Emmeline and Christabel founded the Women’s Social and Political Union (WSPU) in 1903 — having become frustrated by the lack of support from the Independent Labour Party — there was no question that Sylvia and Adela would do anything other than sacrifice their personal interests for the good of the cause. Sylvia later admitted that one of her greatest regrets was being made to give up a promising art career for politics.

She was imprisoned for the first time in 1906. As the tactics of the WSPU became more extreme, so did the violence employed by the authorities against its members. Sylvia was the only Pankhurst to be subjected to force-feeding, an experience she likened to rape.

“Infinitely worse than the pain,” she wrote of the experience, “was the sense of degradation.” Indeed, in some cases that was the whole point of the exercise. While not widespread, vaginal and anal “feeding” was practised on the hunger strikers. Holmes hints, but doesn’t speculate, that Sylvia may have been one of its victims.


Pankhurst died in Ethiopia in 1960 after accepting an invitation from Emperor Haile Selassie, pictured, to emigrate to Africa
PHOTO ARCHIVE/GETTY

Ironically, Sylvia suffered the most while being the least convinced by the WSPU’s militant tactics. It wasn’t only the violence she abhorred. Emmeline and Christabel wanted the WSPU to remain an essentially middle-class, politically aloof organisation, whereas Sylvia regarded women’s rights as part of a wider struggle for revolutionary socialism. The differences between them became unbridgeable after Sylvia founded a separate socialist wing of the WSPU in the East End. Both she and Adela, whom Emmeline and Christabel dismissed as a talentless lightweight, were summarily expelled from the WSPU in February 1914. The four women would never again be in the same room together.

Sylvia had recognised early on that first-wave feminism suffered from a fundamental weakness. It was simultaneously too narrow and too broad to be a stand-alone political platform. The wildly different directions that were taken by the four Pankhursts after the victory of 1918 proved her right: Emmeline became a Conservative, Christabel a born-again Christian, Sylvia a communist and Adela a fascist, yet all remained loyal to their concept of women’s rights.

Once cut loose from the Pankhurst orbit, Sylvia claimed the freedom to think and act as her conscience directed. In 1918 she fell in love with an Italian anarchist socialist refugee, Silvio Corio, who already had three children with two women. Undeterred, she lived with him in Woodford Green, Essex, in a ramshackle home appropriately named Red Cottage. They remained partners for the best part of 30 years, writing, publishing and campaigning together. Even more distressing for her uptight family, not to mention society in general, at the advanced age of 45 she had a son by him, Richard, who was given her surname rather than Silvio’s.

Sylvia Pankhurst, here in 1940, became a communist after the victory of 1918
ALAMY

Broadly speaking, her life can be divided into four campaigns: after women’s suffrage came communism, then anti-fascism and finally Ethiopian independence. (The last has received the least attention, although Sylvia insisted it gave her the greatest pride.) None was an unalloyed success or without controversy. Her fierce independence would lead her to break with Lenin over their ideological differences, and later support her erstwhile enemy Winston Churchill when their views on fascism aligned. She never had any time for Stalin, left-wing antisemitism or liberal racism. In her mid-seventies and widowed, she cut all ties with Britain by accepting an invitation from Emperor Haile Selassie to emigrate to Ethiopia. She died there in 1960.

The genius of Holmes’s fascinating and important biography is that it approaches Sylvia’s life as if she were a man. The writing isn’t prettified or leavened by amusing anecdotes about Victorian manners; it’s dense and serious, as befits a woman who never wore make-up and didn’t care about clothes. To paraphrase the WSPU’s slogan, it is about deeds not domesticity. Rather than dwelling on moods and relationships, Holmes is interested in ideas and consequences. It’s wonderfully refreshing. Sylvia lived for her work; her literary output was astounding. In addition to publishing her own newspaper almost every week for over four decades, she wrote nonfiction, fiction, plays, poetry and investigative reports. She even taught herself Romanian so that she could translate the poems of the 19th-century Romantic poet Mihai Eminescu. It doesn’t matter whether Sylvia was right or wrong in her political enthusiasms; as Holmes rightly insists, what counts is that by acting on them she helped to make history.

Historically Speaking: Women Who Made the American West

From authors to outlaws, female pioneers helped to shape frontier society.

The Wall Street Journal

September 9, 2020

On Sept. 14, 1920, Connecticut became the 37th state to ratify the 19th Amendment, which guaranteed women the right to vote. The exercise was largely symbolic, since ratification had already been achieved thanks to Tennessee on Aug. 18. Still, the fact that Connecticut and the rest of the laggard states were located in the eastern part of the U.S. wasn’t a coincidence. Though women are often portrayed in Westerns as either vixens or victims, they played a vital role in the life of the American frontier.

The outlaw Belle Starr, born Myra Belle Shirley, in 1886.
PHOTO: ROEDER BROTHERS/BUYENLARGE/GETTY IMAGES

Louisa Ann Swain of Laramie, Wyo., was the first woman in the U.S. to vote legally in a general election, in September 1870. The state was also ahead of the pack in granting women the right to sit on a jury, act as a justice of the peace and serve as a bailiff. Admittedly, it wasn’t so much enlightened thinking that opened up these traditionally male roles as it was the desperate shortage of women. No white woman crossed the continent until 17-year-old Nancy Kelsey traveled with her husband from Missouri to California in 1841. Once there, as countless pioneer women subsequently discovered, the family’s survival depended on her ability to manage without his help.

Women can and must fend for themselves was the essential message in the “Little House on the Prairie” series of books by Laura Ingalls Wilder, who was brought up on a series of homesteads in Wisconsin and Minnesota in the 1870s. Independence was so natural to her that she refused to say “I obey” in her marriage vows, explaining, “even if I tried, I do not think I could obey anybody against my better judgment.”

Although the American frontier represented incredible hardship and danger, for many women it also offered a unique kind of freedom. They could forge themselves anew, seizing opportunities that would have been impossible for women in the more settled and urbanized parts of the country.

This was especially true for women of color. Colorado’s first Black settler was a former slave named Clara Brown, who won her freedom in 1856 and subsequently worked her way west to the gold-mining town of Central City. Recognizing a need in the market, she founded a successful laundry business catering to miners and their families. Some of her profits went to buy land and shares in mines; the rest she spent on philanthropy, earning her the nickname “Angel of the Rockies.” After the Civil War, Brown made it her mission to locate her lost family, ultimately finding a grown-up daughter, Eliza.

However, the flip side of being able to “act like men” was that women had to be prepared to die like men, too. Myra Belle Shirley, aka Belle Starr, was a prolific Texas outlaw whose known associates included the notorious James brothers. Despite a long criminal career that mainly involved bootlegging and fencing stolen horses, Starr was convicted only once, resulting in a nine-month prison sentence in the Detroit House of Correction. Her luck finally ran out in 1889, two days before her 41st birthday. By now a widow for the third time, Belle was riding alone in Oklahoma when she was shot and killed in an ambush. The list of suspects included her own children, although the murder was never solved.

Historically Speaking: How Fear of Sharks Became an American Obsession

Since colonial times, we’ve dreaded what one explorer called ‘the most ravenous fish known in the sea’

The Wall Street Journal

August 27, 2020

There had never been a fatal shark incident in Maine until last month’s shocking attack on a woman swimmer by a great white near Bailey Island in Casco Bay. Scientists suspect that the recent rise in seal numbers, rather than the presence of humans, was responsible for luring the shark inshore.

A great white shark on the attack.
PHOTO: CHRIS PERKINS / MEDIADRUMWORLD / ZUMA PRESS

It’s often said that sharks aren’t the bloodthirsty killing machines portrayed in the media. In 2019 there were only 64 shark attacks world-wide, with just two fatalities. Still, they are feared for good reason.

The ancient Greeks knew well the horror that could await anyone unfortunate enough to fall into the wine-dark sea. Herodotus recorded how, in 492 B.C., a Persian invasion fleet of 300 ships was heading toward Greece when a sudden storm blew up around Mt. Athos. The ships broke apart, tossing some 20,000 men into the water. Those who didn’t drown immediately were “devoured” by sharks.

The Age of Discovery introduced European explorers not just to new landmasses but also to new shark species far more dangerous than the ones they knew at home. In a narrative of his 1593 journey to the South Seas, the explorer and pirate Richard Hawkins described the shark as “the most ravenous fishe knowne in the sea.”

It’s believed that the first deadly shark attack in the U.S. took place in 1642 at Spuyten Duyvil, an inlet on the Hudson River north of Manhattan. Antony Van Corlaer was attempting to swim across to the Bronx when a giant fish was seen to drag him under the water.

But the first confirmed American survivor of a shark attack was Brook Watson, a 14-year-old sailor from Boston. In 1749, Watson was serving on board a merchant ship when he was attacked while swimming in Cuba’s Havana Harbor. Fortunately, his crewmates were able to launch a rowboat and pull him from the water, leaving Watson’s right foot in the shark’s mouth.

Despite having a wooden leg, Watson enjoyed a successful career at sea before returning to his British roots to enter politics. He ended up serving as Lord Mayor of London and becoming Sir Brook Watson. His miraculous escape was immortalized by his friend the American painter John Singleton Copley. “Watson and the Shark” was completely fanciful, however, since Copley had never seen a shark.

The American relationship with sharks was changed irrevocably during the summer of 1916. The East Coast was gripped by both a heat wave and a polio epidemic, leaving the beach as one of the few safe places for Americans to relax. On July 1, a man was killed by a shark on Long Beach Island off the New Jersey coast. Over the next 10 days, sharks in the area killed three more people and left one severely injured. In the ensuing national uproar, President Woodrow Wilson offered federal funds to help get rid of the sharks, an understandable but impossible wish.

The Jersey Shore attacks served as an inspiration for Peter Benchley’s bestselling 1974 novel “Jaws,” which was turned into a blockbuster film the next year by Steven Spielberg. Since then the shark population in U.S. waters has dropped by 60%, in part due to an increase in shark-fishing inspired by the movie. Appalled by what he had unleashed, Benchley spent the last decades of his life campaigning for shark conservation.

Historically Speaking: The Sharp Riposte as a Battle Tactic

Caustic comebacks have been exchanged between military leaders for millennia, from the Spartans to World War II

The Wall Street Journal

August 6, 2020

In the center of Bastogne, Belgium (pop. 16,000), there is a statue of U.S. Army General Anthony C. McAuliffe, who died 45 years ago this week. It’s a small town with a big history. Bastogne came under attack during the final German offensive of World War II, known as the Battle of the Bulge. The town was the gateway to Antwerp, a vital port for the Allies, and all that stood between the Germans and their objective was Gen. McAuliffe and his 101st Airborne Division. Despite being outnumbered four to one, he refused to surrender, fighting on until Gen. George Patton’s reinforcements could break the siege.

ILLUSTRATION: THOMAS FUCHS

While staving off the German attack, Gen. McAuliffe uttered the greatest comeback of the war. A typewritten ultimatum from Commander Heinrich von Lüttwitz of the 47th German Panzer Corps gave him two hours to surrender the town or face “annihilation.” With ammunition running low and casualties mounting, the general made his choice. He sent back the following typed reply:

December 22, 1944

To the German Commander,

N U T S !

The American Commander

The true laconic riposte is extremely rare. The Spartans, whose ancient homeland of Lakonia inspired the term “laconic,” were masters of the art. When Philip II of Macedon ordered them to open their borders, he warned, “For if I bring my army into your land, I will destroy your farms, slay your people and raze your city.” According to the later account of the Greek philosopher Plutarch, they sent back a one-word message: “If.”

Once the age of the Spartans passed, it took more than a millennium for the laconic comeback to return in earnest. The man most responsible was a colorful Teutonic knight called Götz von Berlichingen, who participated in countless German conflicts, uprisings and skirmishes, including the Swabian War between the Hapsburgs and the Swiss. He lost a hand to a cannonball and wore an iron prosthesis (hence his nickname, Götz of the Iron Hand). In 1515, sick of trading insults with an opponent who wouldn’t come out to fight, Götz abruptly ended the conversation with: “soldt mich hinden leckhenn,” which literally meant “kiss my ass.”

He recorded the encounter in his memoirs, but it remained little known until Johann Wolfgang von Goethe adapted Götz’s autobiography into a successful play in 1773. From then on, the insult was popularly known in Germany as a “Swabian salute.”

In his novel “Les Misérables,” Victor Hugo—possibly inspired by Goethe—immortalized the French defeat in 1815 at the Battle of Waterloo with a scene of spectacular, if laconic, defiance that incorporated France’s most common expletive. According to Hugo, Gen. Pierre Cambronne, commander of Napoleon Bonaparte’s Imperial Guard, fought to the last syllable in the face of overwhelming force. Encircled by the British army, “They could hear the sound of the guns being reloaded and see the lighted slow matches gleaming like the eyes of tigers in the dusk. An English general called out to them, ‘Brave Frenchmen, will you not surrender?’ Cambronne answered, ‘Merde’” (that is, “shit”).

The scene’s veracity is still hotly debated, the fog of war making memories hazy. But Cambronne—who survived—later disavowed it, especially after “le mot de Cambronne” (Cambronne’s word) became a common euphemism for the profanity.