Historically Speaking: The Long Fight to Take the Weekend Off

Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.

Wall Street Journal

April 1, 2021

Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so would the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.

In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.

A day of leisure in Newport Beach, Calif., 1928. PHOTO: DICK WHITTINGTON STUDIO/CORBIS/GETTY IMAGES

The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.

Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.

Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.

American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.

Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day workweek became law in the U.S. under the Fair Labor Standards Act. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, several hundred thousand candidates took it every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing it would limit the number of Jewish students.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.

Historically Speaking: Masterpieces That Began as Failures

From Melville’s ‘Moby-Dick’ to Beethoven’s ‘Fidelio,’ some great works of art have taken a long time to win recognition.

The Wall Street Journal

November 19, 2020

Failure hurts. Yet history, the ultimate judge, shows that today’s failure sometimes turns out to be tomorrow’s stroke of genius—even if it takes many tomorrows.

Take Ludwig van Beethoven’s opera “Fidelio,” which tells the story of Leonore, a young woman who disguises herself as a man to rescue her husband Florestan, a political prisoner marked for execution. The opera contains some of the noblest and most inspiring music Beethoven ever wrote, but when it premiered in Vienna on Nov. 20, 1805, it was a fiasco.

PHOTO: ALAMY

The timing was terrible. The Austrian capital was practically deserted, having fallen to Napoleon’s army the week before. The audience was composed mainly of French army officers, who showed little interest in an opera celebrating female heroism, conjugal love and freedom from tyranny. “Fidelio” had just three performances before it was pulled from the stage.

Crestfallen at the failure of his first (and only) attempt at opera, Beethoven set about revising and shortening the work. When “Fidelio” finally returned to the Vienna stage on May 23, 1814, it was a great success, though the composer’s increasing deafness affected his conducting, almost throwing the performance into chaos. The chorus master saved the day by keeping time behind him.

Beethoven was lucky that the Viennese finally appreciated the revolutionary nature of “Fidelio” in his own lifetime. The 19th-century Danish philosopher Soren Kierkegaard, by contrast, lived and died a national joke. The problem wasn’t so much his ideas about being and suffering, though these fell on deaf ears, but his character. Kierkegaard was a feud-mongering eccentric so derided in his native Copenhagen that people would tease him in the street.

Kierkegaard’s most important book, “Either/Or,” was published at his own expense in 1843 and sold a mere 525 copies. He remained on the outer fringes of philosophy until his writings began to be translated into other languages. But in the aftermath of World War I, Kierkegaard inspired German and French thinkers to approach the meaning of life in an entirely new way, creating the school of thought known as existentialism.

Being a ground-breaker is often a lonely business, as the novelist Herman Melville discovered. His first five novels, starting with “Typee” in 1846, were highly popular adventure tales based on his experiences as a sailor. But more than money and success, Melville wanted to write a great book that would reimagine the American novel. He made the attempt in “Moby-Dick, or The Whale,” which appeared in 1851.

But readers loathed Melville’s highflown new style, and critics were bemused at best. His literary career never recovered. In 1856, he wrote to Nathaniel Hawthorne, “I have pretty much made up my mind to be annihilated.” He died in 1891 with his final book, “Billy Budd,” unfinished.

Still, Melville never wavered in his belief that life’s true winners are the strivers. As he wrote, “It is better to fail in originality than to succeed in imitation. He who has never failed somewhere, that man can not be great. Failure is the true test of greatness.”

Historically Speaking: Women Who Made the American West

From authors to outlaws, female pioneers helped to shape frontier society.

The Wall Street Journal

September 9, 2020

On Sept. 14, 1920, Connecticut became the 37th state to ratify the 19th Amendment, which guaranteed women the right to vote. The exercise was largely symbolic, since ratification had already been achieved thanks to Tennessee on Aug. 18. Still, the fact that Connecticut and the rest of the laggard states were located in the eastern part of the U.S. wasn’t a coincidence. Though women are often portrayed in Westerns as either vixens or victims, they played a vital role in the life of the American frontier.

The outlaw Belle Starr, born Myra Belle Shirley, photographed in 1886.
PHOTO: ROEDER BROTHERS/BUYENLARGE/GETTY IMAGES

Louisa Ann Swain of Laramie, Wyo., was the first woman in the U.S. to vote legally in a general election, in September 1870. The state was also ahead of the pack in granting women the right to sit on a jury, act as a justice of the peace and serve as a bailiff. Admittedly, it wasn’t so much enlightened thinking that opened up these traditionally male roles as it was the desperate shortage of women. No white woman crossed the continent until 17-year-old Nancy Kelsey traveled with her husband from Missouri to California in 1841. Once there, as countless pioneer women subsequently discovered, the family’s survival depended on her ability to manage without his help.

Women can and must fend for themselves was the essential message of the “Little House on the Prairie” series of books by Laura Ingalls Wilder, who was brought up on a series of homesteads in Wisconsin and Minnesota in the 1870s. Independence was so natural to her that she refused to say “I obey” in her marriage vows, explaining, “even if I tried, I do not think I could obey anybody against my better judgment.”

Although the American frontier represented incredible hardship and danger, for many women it also offered a unique kind of freedom. They could forge themselves anew, seizing opportunities that would have been impossible for women in the more settled and urbanized parts of the country.

This was especially true for women of color. Colorado’s first Black settler was a former slave named Clara Brown, who won her freedom in 1856 and subsequently worked her way west to the gold-mining town of Central City. Recognizing a need in the market, she founded a successful laundry business catering to miners and their families. Some of her profits went to buy land and shares in mines; the rest she spent on philanthropy, earning her the nickname “Angel of the Rockies.” After the Civil War, Brown made it her mission to locate her lost family, ultimately finding a grown-up daughter, Eliza.

However, the flip side of being able to “act like men” was that women had to be prepared to die like men, too. Myra Belle Shirley, aka Belle Starr, was a prolific Texas outlaw whose known associates included the notorious James brothers. Despite a long criminal career that mainly involved bootlegging and fencing stolen horses, Starr was convicted only once, resulting in a nine-month prison sentence in the Detroit House of Correction. Her luck finally ran out in 1889, two days before her 41st birthday. By then a widow for the third time, Belle was riding alone in Oklahoma when she was shot and killed in an ambush. The list of suspects included her own children, although the murder was never solved.

Historically Speaking: How Fear of Sharks Became an American Obsession

Since colonial times, we’ve dreaded what one explorer called ‘the most ravenous fish known in the sea’

The Wall Street Journal

August 27, 2020

There had never been a fatal shark incident in Maine until last month’s shocking attack on a woman swimmer by a great white near Bailey Island in Casco Bay. Scientists suspect that the recent rise in seal numbers, rather than the presence of humans, was responsible for luring the shark inshore.

A great white shark on the attack.
PHOTO: CHRIS PERKINS / MEDIADRUMWORLD / ZUMA PRESS

It’s often said that sharks aren’t the bloodthirsty killing machines portrayed in the media. In 2019 there were only 64 shark attacks world-wide, with just two fatalities. Still, they are feared for good reason.

The ancient Greeks knew well the horror that could await anyone unfortunate enough to fall into the wine-dark sea. Herodotus recorded how, in 492 B.C., a Persian invasion fleet of 300 ships was heading toward Greece when a sudden storm blew up around Mt. Athos. The ships broke apart, tossing some 20,000 men into the water. Those who didn’t drown immediately were “devoured” by sharks.

The Age of Discovery introduced European explorers not just to new landmasses but also to new shark species far more dangerous than the ones they knew at home. In a narrative of his 1593 journey to the South Seas, the explorer and pirate Richard Hawkins described the shark as “the most ravenous fishe knowne in the sea.”

It’s believed that the first deadly shark attack in the U.S. took place in 1642 at Spuyten Duyvil, an inlet on the Hudson River north of Manhattan. Antony Van Corlaer was attempting to swim across to the Bronx when a giant fish was seen to drag him under the water.

But the first confirmed American survivor of a shark attack was Brook Watson, a 14-year-old sailor from Boston. In 1749, Watson was serving on board a merchant ship when he was attacked while swimming in Cuba’s Havana Harbor. Fortunately, his crewmates were able to launch a rowboat and pull him from the water, leaving Watson’s right foot in the shark’s mouth.

Despite having a wooden leg, Watson enjoyed a successful career at sea before returning to his British roots to enter politics. He ended up serving as Lord Mayor of London and becoming Sir Brook Watson. His miraculous escape was immortalized by his friend the American painter John Singleton Copley. “Watson and the Shark” was completely fanciful, however, since Copley had never seen a shark.

The American relationship with sharks was changed irrevocably during the summer of 1916. The East Coast was gripped by both a heat wave and a polio epidemic, leaving the beach as one of the few safe places for Americans to relax. On July 1, a man was killed by a shark on Long Beach Island off the New Jersey coast. Over the next 10 days, sharks in the area killed three more people and left one severely injured. In the ensuing national uproar, President Woodrow Wilson offered federal funds to help get rid of the sharks, an understandable but impossible wish.

The Jersey Shore attacks served as an inspiration for Peter Benchley’s bestselling 1974 novel “Jaws,” which was turned into a blockbuster film the next year by Steven Spielberg. Since then the shark population in U.S. waters has dropped by 60%, in part due to an increase in shark-fishing inspired by the movie. Appalled by what he had unleashed, Benchley spent the last decades of his life campaigning for shark conservation.

Historically Speaking: The Delicious Evolution of Mayonnaise

Ancient Romans ate a pungent version, but the modern egg-based spread was created by an 18th-century French chef.

July 9, 2020

The Wall Street Journal

I can’t imagine a summer picnic without mayonnaise—in the potato salad, the veggie dips, the coleslaw, and yes, even on the french fries. It feels like a great dollop of pure Americana in a creamy, satisfying sort of way. But like a lot of what makes our country so successful, mayonnaise originally came from somewhere else.

Where, exactly, is one of those food disputes that will never be resolved, along with the true origins of baklava pastry, hummus and the pisco sour cocktail. In all likelihood, the earliest version of mayonnaise was an ancient Roman concoction of garlic and olive oil, much praised for its medicinal properties by Pliny the Elder in his first-century encyclopedia “Naturalis Historia.” This strong-tasting, aioli-like proto-mayonnaise remained a southern Mediterranean specialty for millennia.

But most historians believe that modern mayonnaise was born in 1756 in the port city of Mahon, in the Balearic Islands off the coast of Spain. At the start of the Seven Years’ War between France and Britain, the French navy, led by the Duc de Richelieu, smashed Admiral Byng’s poorly armed fleet at the Battle of Minorca. (Byng was subsequently executed for not trying hard enough.) While preparing the fish course for Richelieu’s victory dinner, his chef coped with the lack of cream on the island by ingeniously substituting a goo of eggs mixed with oil and garlic.

The anonymous cook took the recipe for “mahonnaise” back to France, where it was vastly improved by Marie-Antoine Careme, the founder of haute cuisine. Careme realized that whisking rather than simply stirring the mixture created a soft emulsion that could be used in any number of dishes, from the savory to the sweet.

It wasn’t just the French who fell for Careme’s version. Mayonnaise blended easily with local cuisines, evolving into tartar sauce in Eastern Europe, remoulades in the Baltic countries and salad dressing in Britain. By 1838, the menu at the iconic New York restaurant Delmonico’s featured lobster mayonnaise as a signature dish.

All that whisking, however, made mayonnaise too laborious for home cooks until the invention of the mechanical eggbeater, first patented by the Black inventor Willis Johnson, of Cincinnati, Ohio, in 1884. “Try it once,” gushed Good Housekeeping magazine in 1889, “and you’ll never go back to the old way as long as you live.”

Making mayonnaise was one thing, preserving it quite another, since the raw egg made it spoil quickly. The conundrum was finally solved in 1912 by Richard Hellmann, a German-American deli owner in New York. By using his own trucks and factories, Hellmann was able to manufacture and transport mayonnaise faster. And in a revolutionary move, he designed the distinctive wide-necked Hellmann’s jar, encouraging liberal slatherings of mayo and thereby speeding up consumption.

Five years later, Eugenia Duke of North Carolina created Duke’s mayonnaise, which is eggier and has no sugar. The two brands are still dueling it out. But when it comes to eating, there are no winners and losers in the mayo department, just 14 grams of fat and 103 delicious calories per tablespoon.

Historically Speaking: Golfing With Emperors and Presidents

From medieval Scotland to the White House, the game has appealed to the powerful as well as the common man.

June 3, 2020

The Wall Street Journal

The history of golf is a tale of two sports: one played by the common man, the other by kings and presidents. The plebeian variety came first. Paganica, a game played with a bent stick and a hard ball stuffed with feathers, was invented by Roman soldiers as a way to relieve the monotony of camp life. It is believed that a version of Paganica was introduced to Scotland when the Roman emperor Septimius Severus invaded the country in 208 A.D.

Golf buddies Arnold Palmer (left) and Dwight Eisenhower.
PHOTO: AUGUSTA NATIONAL/GETTY IMAGES

Golf might also have been influenced by stick-and-ball games from other cultures, such as the medieval Chinese chuiwan (“hit-ball”) and Dutch colf, an indoor game using rubber balls and heavy clubs. But the game we know today originated in the 15th century on the Links—the long, grassy sand dunes that are such a distinctive feature of Scotland’s coastline. The terrain was perfect for all-weather play, as well as for keeping out of sight of the authorities: Scottish kings prohibited the game until 1502, anxious that it would interfere with archery practice.

Two years after lifting the ban, King James IV of Scotland played the first recorded golf match while staying at Falkland Palace near St. Andrews. In theory, anyone could play on the Links since it was common land. Starting in 1754, however, access was controlled by the Royal and Ancient Golf Club of St. Andrews, known today as the “Home of Golf.” The R & A did much to standardize the rules of the game, while cementing golf’s reputation as an aristocratic activity.

In the 19th century, innovations in lawn care and ball manufacturing lowered the cost of golf, but the perception of elitism persisted. When William Howard Taft ran for president in 1908, Teddy Roosevelt urged him to beware of projecting an upper-crust image: “photographs on horseback, yes; tennis, no. And golf is fatal.” Taft ignored Roosevelt’s advice, as did Woodrow Wilson, who played more rounds of golf—nearly 1,200 in all—than any other president. He even played in the snow, using a black-painted ball.

Wilson’s record was nearly matched by Dwight Eisenhower, who so loved the game that he had a putting green installed outside the Oval Office in 1954. At first the media criticized his fondness for a rich man’s game. But that changed after Arnold Palmer, one of the greatest and most charismatic golfers in history, became Eisenhower’s friend and regular golf partner. The frequent sight of the president and the sports hero playing together made golf appear attractive, aspirational and above all accessible, inspiring millions of ordinary Americans to try the game for the first time.

But that popularity has been dented in recent years. The number of golfers in the U.S. dropped from a high of 30 million in 2005 to 24.1 million in 2015. In addition to being pricey, golf is still criticized for being snobby. Earlier this year, Brooks Koepka, a professional golfer once ranked number one in the world, told GQ that he loved the game but not “the stuffy atmosphere that comes along with it.” “Golf has always had this persona of the triple-pleated khaki pants, the button-up shirt, very country club atmosphere,” he complained. Now that almost all of the country’s golf courses have reopened from pandemic-related shutdowns, golf has a new opportunity to make every player feel included.

Historically Speaking: Sleuthing Through the Ages

Illustration by Dominic Bugatto

From Oedipus to Sherlock Holmes, readers have flocked to stories about determined detectives.

May 21, 2020

The Wall Street Journal

I have to confess that I’ve spent the lockdown reading thrillers and whodunits. But judging by the domination of mystery titles on the bestseller lists, so has nearly everyone else. In uncertain times, crime fiction offers certainty, resolution and comfort.

The roots of the genre go back to the ancient Greeks. Sophocles’s “Oedipus the King,” written around 429 B.C., is in essence a murder mystery. The play begins with Oedipus swearing that he will not rest until he discovers who killed Laius, the previous king of Thebes. Like a modern detective, Oedipus questions witnesses and follows clues until the terrible truth is revealed: He is both the investigator and the criminal, having unwittingly murdered his father and married his mother.

The Chinese were the first to give crime fiction a name. Gong’an or “magistrate’s desk” literature developed during the Song dynasty (960-1279), featuring judges who recount the details of a difficult or dangerous case. Modern Western crime fiction adopted a more individualistic approach, making heroes out of amateurs. The 1819 novella “Mademoiselle de Scuderi,” by the German writer E.T.A. Hoffmann, is an early prototype: The heroine, an elderly writer, helps to solve a serial murder case involving stolen jewelry.

But it is Edgar Allan Poe who is generally regarded as the godfather of detective fiction. His short story “The Murders in the Rue Morgue,” published in 1841, features an amateur sleuth, Auguste Dupin, who solves the mysterious, gruesome deaths of two women. (Spoiler: The culprit was an escaped orangutan.) Poe invented some of the genre’s most important devices, including the “locked room” puzzle, in which a murder takes place under seemingly impossible conditions.

Toward the end of the 19th century, Arthur Conan Doyle’s Sherlock Holmes series added three innovations that quickly became conventions: the loyal sidekick, the arch-villain and the use of forensic science. In the violin-playing, drug-abusing Holmes, Doyle also created a psychologically complex character who enthralled readers—too much for Doyle’s liking. Desperate to be considered a literary writer, he killed off Holmes in 1893, only to be forced by public demand to resurrect him 12 years later.

When Doyle published his last Holmes story in 1927, the “Golden Age” of British crime fiction was in full swing. Writers such as Agatha Christie and Dorothy L. Sayers created genteel detectives who solved “cozy crimes” in upper-middle-class settings, winning a huge readership and inspiring American imitators like S.S. Van Dine, the creator of detective Philo Vance, who published a list of “Twenty Rules for Writing Detective Stories.”

As violence and corruption increased under Prohibition, American mystery writing turned toward more “hard-boiled” social realism. In Dashiell Hammett and Raymond Chandler’s noir fiction, dead bodies in libraries are replaced by bloody corpses in cars.

At the time, critics quarreled about which type of mystery was superior, though both can seem old-fashioned compared with today’s spy novels and psychological thrillers. The number of mystery subgenres seems to be infinite. Yet one thing will never change: our yearning for a hero who is, in Raymond Chandler’s words, “the best man in his world and a good enough man for any world.”

 

Historically Speaking: Hobbies for Kings and the People

From collecting ancient coins to Victorian taxidermy, we’ve found ingenious ways to fill our free time.

Wall Street Journal, April 16, 2020

It’s no surprise that many Americans are turning or returning to hobbies during the current crisis. By definition, a hobby requires time outside of work.

Sofonisba Anguissola, ‘The Chess Game’ (1555)

We don’t hear much about hobbies in ancient history because most people never had any leisure time. They were too busy obeying their masters or just scraping by. The earliest known hobbyists may have been Nabonidus, the last king of Babylonia in the 6th century B.C., and his daughter Ennigaldi-Nanna. Both were passionate antiquarians: Nabonidus liked to restore ruined temples while Ennigaldi-Nanna collected ancient artifacts. She displayed them in a special room in her palace, effectively creating the world’s first museum.

Augustus Caesar, the first Roman emperor, was another avid collector of ancient objects, especially Greek gold coins. The Romans recognized the benefits of having a hobby, but for them the concept excluded any kind of manual work. When the poet Ovid, exiled by Augustus on unknown charges, wrote home that he yearned to tend his garden again, he didn’t mean with a shovel. That’s what slaves were for.

Hobbies long continued to be a luxury for potentates. But in the Renaissance, the printing press combined with higher standards of living to create new possibilities for hobbyists. The change can be seen in the paintings of Sofonisba Anguissola, one of the first Italian painters to depict her subjects enjoying ordinary activities like reading or playing an instrument. Her most famous painting, “The Chess Game” (1555), shows members of her family engaged in a match.

Upper-class snobbery toward any hobby that might be deemed physical still lingered, however. The English diplomat and scholar Sir Thomas Elyot warned readers in “The Boke Named the Governour” (1531) that playing a musical instrument was fine “for recreation after tedious or laborious affaires.” But it had to be kept private, lest the practitioner be mistaken for “a common servant or minstrel.”

Hobbies received a massive boost from the Industrial Revolution. It wasn’t simply that people had more free time; there were also many more things to do and acquire. Stamp collecting took off soon after the introduction of the world’s first adhesive stamp, the Penny Black, in Britain in 1840. As technology became cheaper, hobbies emerged that bridged the old division between intellectual and manual labor, such as photography and microscopy. Taxidermy allowed the Victorians to mash the macabre and the whimsical together: Ice-skating hedgehogs, card-playing mice and dancing cats were popular with taxidermists.

In the U.S., the adoption of hobbies increased dramatically during the Great Depression. For the unemployed, they were an inexpensive way to give purpose and achievement to their days. Throughout the 1930s, nonprofit organizations such as the Leisure League of America and the National Home Workshop Guild encouraged Americans to develop their talents. “You Can Write” was the hopeful title of a 1934 Leisure League publication.

Even Winston Churchill took up painting in his 40s, saying later that the hobby rescued him “in a most trying time.” We are in our own trying time, so why not go for it? I think I’ll teach myself to bake bread next week.

Historically Speaking: The Long Fight Against Unjust Taxes

From ancient Jerusalem to the American Revolution and beyond, rebels have risen up against the burden of taxation.

March 19, 2020

The Wall Street Journal

With the world in the grip of a major health crisis, historical milestones are passing by with little notice. But the Boston Massacre, whose 250th anniversary was this month, deserves to be remembered as a cautionary tale.

ILLUSTRATION: THOMAS FUCHS

The bloody encounter on March 5, 1770, began with the harassment of a British soldier by a crowd of Bostonians. Panicked soldiers responded by firing on the crowd, leaving five dead and six wounded. The colonists were irate about new taxes imposed by the British Parliament to pay for the expenses of the Seven Years’ War, which in North America pitted the British and Americans against the French and their Indian allies. Whether or not the tax increase was justified, the failure of British leaders to include the American colonies in the deliberative process was catastrophic. The slogan “No taxation without representation” became a rallying cry for the fledgling nation.

The attitude of tax collecting authorities had hardly changed since ancient times, when empires treated their subject populations with greed, brutality and arrogance. In 1st-century Judea, anger over the taxes imposed by Rome combined with religious grievances to provoke a full-scale Jewish revolt in 66-73 A.D. It was an unequal battle, as most tax rebellions are, and the resisters were made to pay dearly: Jerusalem was sacked and the Second Temple destroyed, and all Jews in the Roman Empire were forced to pay a punitive tax.

Even when tax revolts met with initial success, there was no guarantee that the authorities would carry out their promises. In 1381, a humble English roof tiler named Wat Tyler led an uprising, dubbed the Peasants’ Revolt, against a new poll tax. King Richard II met with Tyler and agreed to his demands, but only as a delaying tactic. The ringleaders were then rounded up and executed, and Richard revoked his concessions, claiming they had been made under duress.

Nevertheless, as the historian David F. Burg notes in his book “A World History of Tax Rebellions,” tax revolts have been more frequent than we realize, mainly because governments tend not to advertise them. In Germany, 210 separate protests and uprisings were recorded from 1300 to 1550, and at least 1,000 in Japan from 1600 to 1868.

The 19th century saw the rise of a new kind of tax rebel, the conscientious objector. In 1846, the writer and abolitionist Henry David Thoreau spent a night in the Concord, Mass., jail after he refused to pay a poll tax as a protest against slavery. He was released the next morning when his aunt paid it for him, against his will. But Thoreau would go on to withhold his taxes in protest against the Mexican-American War, arguing in his 1849 essay “Civil Disobedience” that it was better to go to jail than to “enable the State to commit violence and shed innocent blood.”

Irwin Schiff, a colorful antitax advocate and failed libertarian presidential candidate, wouldn’t get off so easily. Arguing that the income tax violated the U.S. Constitution, he refused to pay it, despite being convicted of tax evasion three times. In 2015, he died at age 87 in a federal prison—an ironic confirmation of Benjamin Franklin’s adage that “nothing can be said to be certain, except death and taxes.”

Fortunately for Americans at this time of national duress, tax day this year has been mercifully postponed.