Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, named for a Latin phrase meaning “he who sues for the king as well as for himself.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached the journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that’s now the subject of a debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If that happens, it would be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. The anomaly came to a head in 1857, when a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, and while the owner was a citizen, he wasn’t the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker’s attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.

Historically Speaking: The Long Fight to Take the Weekend Off

Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.

The Wall Street Journal

April 1, 2021

Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so was the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.

In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.

A day of leisure in Newport Beach, Calif., 1928. PHOTO: DICK WHITTINGTON STUDIO/CORBIS/GETTY IMAGES

The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.

Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.

Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.

American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.

Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day, 40-hour workweek mandated by the Fair Labor Standards Act took full effect in the U.S. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, the exam drew several hundred thousand candidates every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge, became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing they would limit the number of Jewish students.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.

Historically Speaking: Masterpieces That Began as Failures

From Melville’s ‘Moby-Dick’ to Beethoven’s ‘Fidelio,’ some great works of art have taken a long time to win recognition.

The Wall Street Journal

November 19, 2020

Failure hurts. Yet history, the ultimate judge, shows that today’s failure sometimes turns out to be tomorrow’s stroke of genius—even if it takes many tomorrows.

Take Ludwig van Beethoven’s opera “Fidelio,” which tells the story of Leonore, a young woman who disguises herself as a man to rescue her husband Florestan, a political prisoner marked for execution. The opera contains some of the noblest and most inspiring music Beethoven ever wrote, but when it premiered in Vienna on Nov. 20, 1805, it was a fiasco.

PHOTO: ALAMY

The timing was terrible. The Austrian capital was practically deserted, having fallen to Napoleon’s army the week before. The audience consisted mainly of French army officers, who showed little interest in an opera celebrating female heroism, conjugal love and freedom from tyranny. “Fidelio” had just three performances before it was pulled from the stage.

Crestfallen at the failure of his first (and only) attempt at opera, Beethoven set about revising and shortening the work. When “Fidelio” finally returned to the Vienna stage on May 23, 1814, it was a great success, though the composer’s increasing deafness affected his conducting, almost throwing the performance into chaos. The chorus master saved the day by keeping time behind him.

Beethoven was lucky that the Viennese finally appreciated the revolutionary nature of “Fidelio” in his own lifetime. The 19th-century Danish philosopher Soren Kierkegaard, by contrast, lived and died a national joke. The problem wasn’t so much his ideas about being and suffering, though these fell on deaf ears, but his character. Kierkegaard was a feud-mongering eccentric so derided in his native Copenhagen that people would tease him in the street.

Kierkegaard’s most important book, “Either/Or,” was published at his own expense in 1843 and sold a mere 525 copies. He remained on the outer fringes of philosophy until his writings began to be translated into other languages. But in the aftermath of World War I, Kierkegaard inspired German and French thinkers to approach the meaning of life in an entirely new way, creating the school of thought known as existentialism.

Being a ground-breaker is often a lonely business, as the novelist Herman Melville discovered. His first five novels, starting with “Typee” in 1846, were highly popular adventure tales based on his experiences as a sailor. But more than money and success, Melville wanted to write a great book that would reimagine the American novel. He made the attempt in “Moby-Dick, or The Whale,” which appeared in 1851.

But readers loathed Melville’s highflown new style, and critics were bemused at best. His literary career never recovered. In 1856, he wrote to Nathaniel Hawthorne, “I have pretty much made up my mind to be annihilated.” He died in 1891 with his final book, “Billy Budd,” unfinished.

Still, Melville never wavered in his belief that life’s true winners are the strivers. As he wrote, “It is better to fail in originality than to succeed in imitation. He who has never failed somewhere, that man can not be great. Failure is the true test of greatness.”

Historically Speaking: Women Who Made the American West

From authors to outlaws, female pioneers helped to shape frontier society.

The Wall Street Journal

September 9, 2020

On Sept. 14, 1920, Connecticut became the 37th state to ratify the 19th Amendment, which guaranteed women the right to vote. The exercise was largely symbolic, since ratification had already been achieved thanks to Tennessee on August 18. Still, the fact that Connecticut and the rest of the laggard states were located in the eastern part of the U.S. wasn’t a coincidence. Though women are often portrayed in Westerns as either vixens or victims, they played a vital role in the life of the American frontier.

The outlaw Belle Starr, born Myra Belle Shirley, in an 1886 photograph.
PHOTO: ROEDER BROTHERS/BUYENLARGE/GETTY IMAGES

Louisa Ann Swain of Laramie, Wyo., was the first woman in the U.S. to vote legally in a general election, in September 1870. The state was also ahead of the pack in granting women the right to sit on a jury, act as a justice of the peace and serve as a bailiff. Admittedly, it wasn’t so much enlightened thinking that opened up these traditionally male roles as it was the desperate shortage of women. No white woman crossed the continent until 17-year-old Nancy Kelsey traveled with her husband from Missouri to California in 1841. Once there, as countless pioneer women subsequently discovered, the family’s survival depended on her ability to manage without his help.

That women can and must fend for themselves was the essential message of the “Little House on the Prairie” series of books by Laura Ingalls Wilder, who was brought up on a succession of homesteads in Wisconsin and Minnesota in the 1870s. Independence was so natural to her that she refused to say “I obey” in her marriage vows, explaining, “even if I tried, I do not think I could obey anybody against my better judgment.”

Although the American frontier represented incredible hardship and danger, for many women it also offered a unique kind of freedom. They could forge themselves anew, seizing opportunities that would have been impossible for women in the more settled and urbanized parts of the country.

This was especially true for women of color. Colorado’s first Black settler was a former slave named Clara Brown, who won her freedom in 1856 and subsequently worked her way west to the gold-mining town of Central City. Recognizing a need in the market, she founded a successful laundry business catering to miners and their families. Some of her profits went to buy land and shares in mines; the rest she spent on philanthropy, earning her the nickname “Angel of the Rockies.” After the Civil War, Brown made it her mission to locate her lost family, ultimately finding a grown-up daughter, Eliza.

However, the flip side of being able to “act like men” was that women had to be prepared to die like men, too. Myra Belle Shirley, aka Belle Starr, was a prolific Texas outlaw whose known associates included the notorious James brothers. Despite a long criminal career that mainly involved bootlegging and fencing stolen horses, Starr was convicted only once, resulting in a nine-month prison sentence in the Detroit House of Correction. Her luck finally ran out in 1889, two days before her 41st birthday. By then a widow for the third time, Belle was riding alone in Oklahoma when she was shot and killed in an ambush. The list of suspects included her own children, although the murder was never solved.

Historically Speaking: How Fear of Sharks Became an American Obsession

Since colonial times, we’ve dreaded what one explorer called ‘the most ravenous fish known in the sea’

The Wall Street Journal

August 27, 2020

There had never been a fatal shark incident in Maine until last month’s shocking attack on a woman swimmer by a great white near Bailey Island in Casco Bay. Scientists suspect that the recent rise in seal numbers, rather than the presence of humans, was responsible for luring the shark inshore.

A great white shark on the attack.
PHOTO: CHRIS PERKINS / MEDIADRUMWORLD / ZUMA PRESS

It’s often said that sharks aren’t the bloodthirsty killing machines portrayed in the media. In 2019 there were only 64 shark attacks world-wide, with just two fatalities. Still, they are feared for good reason.

The ancient Greeks knew well the horror that could await anyone unfortunate enough to fall into the wine-dark sea. Herodotus recorded how, in 492 B.C., a Persian invasion fleet of 300 ships was heading toward Greece when a sudden storm blew up around Mt. Athos. The ships broke apart, tossing some 20,000 men into the water. Those who didn’t drown immediately were “devoured” by sharks.

The Age of Discovery introduced European explorers not just to new landmasses but also to new shark species far more dangerous than the ones they knew at home. In a narrative of his 1593 journey to the South Seas, the explorer and pirate Richard Hawkins described the shark as “the most ravenous fishe knowne in the sea.”

It’s believed that the first deadly shark attack in the U.S. took place in 1642 at Spuyten Duyvil, an inlet on the Hudson River north of Manhattan. Antony Van Corlaer was attempting to swim across to the Bronx when a giant fish was seen to drag him under the water.

But the first confirmed American survivor of a shark attack was Brook Watson, a 14-year-old sailor from Boston. In 1749, Watson was serving on board a merchant ship when he was attacked while swimming in Cuba’s Havana Harbor. Fortunately, his crewmates were able to launch a rowboat and pull him from the water, leaving Watson’s right foot in the shark’s mouth.

Despite having a wooden leg, Watson enjoyed a successful career at sea before returning to his British roots to enter politics. He ended up serving as Lord Mayor of London and becoming Sir Brook Watson. His miraculous escape was immortalized by his friend the American painter John Singleton Copley. “Watson and the Shark” was completely fanciful, however, since Copley had never seen a shark.

The American relationship with sharks was changed irrevocably during the summer of 1916. The East Coast was gripped by both a heat wave and a polio epidemic, leaving the beach as one of the few safe places for Americans to relax. On July 1, a man was killed by a shark on Long Beach Island off the New Jersey coast. Over the next 10 days, sharks in the area killed three more people and left one severely injured. In the ensuing national uproar, President Woodrow Wilson offered federal funds to help get rid of the sharks, an understandable but impossible wish.

The Jersey Shore attacks served as an inspiration for Peter Benchley’s bestselling 1974 novel “Jaws,” which was turned into a blockbuster film the next year by Steven Spielberg. Since then the shark population in U.S. waters has dropped by 60%, in part due to an increase in shark-fishing inspired by the movie. Appalled by what he had unleashed, Benchley spent the last decades of his life campaigning for shark conservation.

Historically Speaking: The Delicious Evolution of Mayonnaise

Ancient Romans ate a pungent version, but the modern egg-based spread was created by an 18th-century French chef.

The Wall Street Journal

July 9, 2020

I can’t imagine a summer picnic without mayonnaise—in the potato salad, the veggie dips, the coleslaw, and yes, even on the french fries. It feels like a great dollop of pure Americana in a creamy, satisfying sort of way. But like a lot of what makes our country so successful, mayonnaise originally came from somewhere else.

Where, exactly, is one of those food disputes that will never be resolved, along with the true origins of baklava pastry, hummus and the pisco sour cocktail. In all likelihood, the earliest version of mayonnaise was an ancient Roman concoction of garlic and olive oil, much praised for its medicinal properties by Pliny the Elder in his first-century encyclopedia “Naturalis Historia.” This strong-tasting, aioli-like proto-mayonnaise remained a southern Mediterranean specialty for millennia.

But most historians believe that modern mayonnaise was born in 1756 in the port city of Mahon, in the Balearic Islands off the coast of Spain. At the start of the Seven Years’ War between France and Britain, the French navy, led by the Duc de Richelieu, smashed Admiral Byng’s poorly armed fleet at the Battle of Minorca. (Byng was subsequently executed for not trying hard enough.) While preparing the fish course for Richelieu’s victory dinner, his chef coped with the lack of cream on the island by ingeniously substituting a goo of eggs mixed with oil and garlic.

The anonymous cook took the recipe for “mahonnaise” back to France, where it was vastly improved by Marie-Antoine Careme, the founder of haute cuisine. Careme realized that whisking rather than simply stirring the mixture created a soft emulsion that could be used in any number of dishes, from the savory to the sweet.

It wasn’t just the French who fell for Careme’s version. Mayonnaise blended easily with local cuisines, evolving into tartar sauce in Eastern Europe, remoulades in the Baltic countries and salad dressing in Britain. By 1838, the menu at the iconic New York restaurant Delmonico’s featured lobster mayonnaise as a signature dish.

All that whisking, however, made mayonnaise too laborious for home cooks until the invention of the mechanical eggbeater, first patented by the Black inventor Willis Johnson, of Cincinnati, Ohio, in 1884. “Try it once,” gushed Good Housekeeping magazine in 1889, “and you’ll never go back to the old way as long as you live.”

Making mayonnaise was one thing, preserving it quite another, since the raw egg made it spoil quickly. The conundrum was finally solved in 1912 by Richard Hellmann, a German-American deli owner in New York. By using his own trucks and factories, Hellmann was able to manufacture and transport mayonnaise faster. And in a revolutionary move, he designed the distinctive wide-necked Hellmann’s jar, encouraging liberal slatherings of mayo and thereby speeding up consumption.

Five years later, Eugenia Duke of South Carolina created Duke’s mayonnaise, which is eggier and has no sugar. The two brands are still dueling it out. But when it comes to eating, there are no winners and losers in the mayo department, just 14 grams of fat and 103 delicious calories per tablespoon.

Historically Speaking: Golfing With Emperors and Presidents

From medieval Scotland to the White House, the game has appealed to the powerful as well as the common man.

The Wall Street Journal

June 3, 2020

The history of golf is a tale of two sports: one played by the common man, the other by kings and presidents. The plebeian variety came first. Paganica, a game played with a bent stick and a hard ball stuffed with feathers, was invented by Roman soldiers as a way to relieve the monotony of camp life. It is believed that a version of Paganica was introduced to Scotland when the Roman emperor Septimius Severus invaded the country in 208 A.D.

Golf buddies Arnold Palmer (left) and Dwight Eisenhower.
PHOTO: AUGUSTA NATIONAL/GETTY IMAGES

Golf might also have been influenced by stick-and-ball games from other cultures, such as the medieval Chinese chuiwan (“hit-ball”) and Dutch colf, an indoor game using rubber balls and heavy clubs. But the game we know today originated in the 15th century on the Links—the long, grassy sand dunes that are such a distinctive feature of Scotland’s coastline. The terrain was perfect for all-weather play, as well as for keeping out of sight of the authorities: Scottish kings prohibited the game until 1502, anxious that it would interfere with archery practice.

Two years after lifting the ban, King James IV of Scotland played the first recorded golf match while staying at Falkland Palace near St. Andrews. In theory, anyone could play on the Links since it was common land. Starting in 1754, however, access was controlled by the Royal and Ancient Golf Club of St. Andrews, known today as the “Home of Golf.” The R & A did much to standardize the rules of the game, while cementing golf’s reputation as an aristocratic activity.

In the 19th century, innovations in lawn care and ball manufacturing lowered the cost of golf, but the perception of elitism persisted. When William Howard Taft ran for president in 1908, Teddy Roosevelt urged him to beware of projecting an upper-crust image: “photographs on horseback, yes; tennis, no. And golf is fatal.” Taft ignored Roosevelt’s advice, as did Woodrow Wilson, who played more rounds of golf—nearly 1,200 in all—than any other president. He even played in the snow, using a black-painted ball.

Wilson’s record was nearly matched by Dwight Eisenhower, who so loved the game that he had a putting green installed outside the Oval Office in 1954. At first the media criticized his fondness for a rich man’s game. But that changed after Arnold Palmer, one of the greatest and most charismatic golfers in history, became Eisenhower’s friend and regular golf partner. The frequent sight of the president and the sports hero playing together made golf appear attractive, aspirational and above all accessible, inspiring millions of ordinary Americans to try the game for the first time.

But that popularity has been dented in recent years. The number of golfers in the U.S. dropped from a high of 30 million in 2005 to 24.1 million in 2015. In addition to being pricey, golf is still criticized for being snobby. Earlier this year, Brooks Koepka, a professional golfer once ranked number one in the world, told GQ that he loved the game but not “the stuffy atmosphere that comes along with it.” “Golf has always had this persona of the triple-pleated khaki pants, the button-up shirt, very country club atmosphere,” he complained. Now that almost all of the country’s golf courses have reopened from pandemic-related shutdowns, golf has a new opportunity to make every player feel included.

Historically Speaking: Sleuthing Through the Ages

Illustration by Dominic Bugatto

From Oedipus to Sherlock Holmes, readers have flocked to stories about determined detectives.

The Wall Street Journal

May 21, 2020

I have to confess that I’ve spent the lockdown reading thrillers and whodunits. But judging by the domination of mystery titles on the bestseller lists, so has nearly everyone else. In uncertain times, crime fiction offers certainty, resolution and comfort.

The roots of the genre go back to the ancient Greeks. Sophocles’s “Oedipus the King,” written around 429 B.C., is in essence a murder mystery. The play begins with Oedipus swearing that he will not rest until he discovers who killed Laius, the previous king of Thebes. Like a modern detective, Oedipus questions witnesses and follows clues until the terrible truth is revealed: He is both the investigator and the criminal, having unwittingly murdered his father and married his mother.

The Chinese were the first to give crime fiction a name. Gong’an or “magistrate’s desk” literature developed during the Song dynasty (960-1279), featuring judges who recount the details of a difficult or dangerous case. Modern Western crime fiction adopted a more individualistic approach, making heroes out of amateurs. The 1819 novella “Mademoiselle de Scuderi,” by the German writer E.T.A. Hoffmann, is an early prototype: The heroine, an elderly writer, helps to solve a serial murder case involving stolen jewelry.

But it is Edgar Allan Poe who is generally regarded as the godfather of detective fiction. His short story “The Murders in the Rue Morgue,” published in 1841, features an amateur sleuth, Auguste Dupin, who solves the mysterious, gruesome deaths of two women. (Spoiler: The culprit was an escaped orangutan.) Poe invented some of the genre’s most important devices, including the “locked room” puzzle, in which a murder takes place under seemingly impossible conditions.

Toward the end of the 19th century, Arthur Conan Doyle’s Sherlock Holmes series added three innovations that quickly became conventions: the loyal sidekick, the arch-villain and the use of forensic science. In the violin-playing, drug-abusing Holmes, Doyle also created a psychologically complex character who enthralled readers—too much for Doyle’s liking. Desperate to be considered a literary writer, he killed off Holmes in 1893, only to be forced by public demand to resurrect him 12 years later.

When Doyle published his last Holmes story in 1927, the “Golden Age” of British crime fiction was in full swing. Writers such as Agatha Christie and Dorothy L. Sayers created genteel detectives who solved “cozy crimes” in upper-middle-class settings, winning a huge readership and inspiring American imitators like S.S. Van Dine, the creator of detective Philo Vance, who published a list of “Twenty Rules for Writing Detective Stories.”

As violence and corruption increased under Prohibition, American mystery writing turned toward more “hard-boiled” social realism. In Dashiell Hammett and Raymond Chandler’s noir fiction, dead bodies in libraries are replaced by bloody corpses in cars.

At the time, critics quarreled about which type of mystery was superior, though both can seem old-fashioned compared with today’s spy novels and psychological thrillers. The number of mystery subgenres seems to be infinite. Yet one thing will never change: our yearning for a hero who is, in Raymond Chandler’s words, “the best man in his world and a good enough man for any world.”