This month we mark the centennial of the ratification of the Constitution’s 18th Amendment, better known as Prohibition. But the temperance movement was active for over a half-century before winning its great prize. As the novelist Anthony Trollope discovered to his regret while touring North America in 1861-62, Maine had been dry for a decade. The convivial Englishman condemned the ban: “This law, like all sumptuary laws, must fail,” he wrote.
Sumptuary laws had largely fallen into disuse by the 19th century, but they were once a near-universal tool, used in the East and West alike to control economies and preserve social hierarchies. A sumptuary law is a rule that regulates consumption in its broadest sense, from what a person may eat and drink to what they may own, wear or display. The oldest known example, the Locrian Law Code devised by the seventh-century B.C. Greek lawgiver Zaleucus, banned all citizens of Locri (except prostitutes) from ostentatious displays of gold jewelry.
Sumptuary laws were often political weapons disguised as moral pieties, aimed at less powerful groups, particularly women. In 215 B.C., at the height of the Second Punic War, the Roman Senate passed the Lex Oppia, which (among other restrictions) banned women from owning more than a half ounce of gold. Ostensibly a wartime austerity measure, the law seemed so ridiculous 20 years later as to be unenforceable. But during debate on its repeal in 195 B.C., Cato the Elder, its strongest defender, inadvertently revealed the Lex Oppia’s true purpose: “What [these women] want is complete freedom…. Once they have achieved equality, they will be your masters.”
Cato’s message about preserving social hierarchy echoed down the centuries. As trade and economic stability returned to Europe during the High Middle Ages (1000-1300), so did the use of sumptuary laws to keep the new merchant elites in their place. By the 16th century, sumptuary laws in Europe had extended from clothing to almost every aspect of daily life. The more they were circumvented, the more specific such laws became. An edict issued by King Henry VIII of England in 1517, for example, dictated the maximum number of dishes allowed at a meal: nine for a cardinal, seven for the aristocracy and three for the gentry.
The rise of modern capitalism ultimately made sumptuary laws obsolete. Trade turned once-scarce luxuries into mass commodities that simply couldn’t be controlled. Adam Smith’s “The Wealth of Nations” (1776) confirmed what had been obvious for over a century: Consumption and liberty go hand in hand. “It is the highest impertinence,” he wrote, “to pretend to watch over the economy of private people…either by sumptuary laws, or by prohibiting the importation of foreign luxuries.”
Smith’s pragmatic view was echoed by President William Howard Taft. He opposed Prohibition on the grounds that it was coercive rather than consensual, arguing that “experience has shown that a law of this kind, sumptuary in its character, can only be properly enforced in districts in which a majority of the people favor the law.” Mass immigration in early 20th-century America had changed many cities into ethnic melting-pots. Taft recognized Prohibition as an attempt by nativists to impose cultural uniformity on immigrant communities whose attitudes toward alcohol were more permissive. But his warning was ignored, and the disastrous course of Prohibition was set.
On January 12, 1819, Thomas Jefferson wrote to his friend Nathaniel Macon, “I have…entire confidence in the late and present Presidents…I slumber without fear.” He did concede, though, that market fluctuations can trip up even the best governments. Jefferson was prescient: A few days later, the country plunged into a full-blown financial panic. The trigger was a collapse in the overseas cotton market, but the crisis had been building for months. The factors that led to the crash included the actions of the Second Bank of the United States, which had helped to fuel a real estate boom in the West only to reverse course suddenly and call in its loans.
The recession that followed the panic of 1819 was prolonged and severe: Banks closed, lending all but ceased and businesses failed by the thousands. By the time it was over in 1823, almost a third of the population—including Jefferson himself—had suffered irreversible losses.
As we mark the 200th anniversary of the 1819 panic, it is worth pondering the role of governments in a financial crisis. During a panic in Rome in the year 33, the emperor Tiberius’s prompt action prevented a total collapse of the city’s finances. Rome was caught in the grip of a collapsing real estate bubble, falling property prices and a sudden credit crunch. Instead of waiting it out, Tiberius ordered interest rates to be lowered and released 100 million sestertii (large brass coins) into the banking system to avoid a mass default.
But not all government interventions have been as successful or timely. In 1124, King Henry I of England attempted to restore confidence in the country’s money by having the minters publicly castrated and their right hands amputated for producing substandard coins. A temporary fix at best, his bloody act neither deterred people from debasing the coinage nor allayed fears over England’s creditworthiness.
On the other side of the globe, China began using paper money in 1023. Successive emperors of the Ming Dynasty (1368-1644) failed, however, to limit the number of notes in circulation or to back the money with gold or silver specie. By the mid-15th century the economy was in the grip of hyperinflationary cycles. The emperor Yingzong simply gave up on the problem: China returned to coinage just as Europe was discovering the uses of paper.
The rise of commercial paper along with paper currencies allowed European countries to develop more sophisticated banking systems. But they also led to panics, inflation and dangerous speculation—sometimes all at once, as in France in 1720, when John Law’s disastrous Mississippi Company share scheme ended in mass bankruptcies for its investors and the collapse of the French livre.
As it turns out, it is easier to predict the consequences of a crisis than it is to prevent one from happening. In 2015, the U.K.’s Centre for Economic Policy Research published a paper on the effects of 100 financial crises in 20 Western countries over the past 150 years, down to the recession of 2007-09. They found two consistent outcomes. The first is that politics becomes more extreme and polarized following a crisis; the second is that countries become more ungovernable as violence, protests and populist revolts overshadow the rule of law.
With the U.S. stock market having suffered its worst December since the Great Depression of the 1930s, it is worth remembering that the only thing more frightening than a financial crisis can be its aftermath.
I don’t look forward to New Year’s Eve. When the bells start to ring, it isn’t “Auld Lang Syne” I hear but echoes from the Anglican “Book of Common Prayer”: “We have left undone those things which we ought to have done; And we have done those things which we ought not to have done.”
At least I’m not alone in my annual dip into the waters of woe. Experiencing the sharp sting of regret around the New Year has a long pedigree. The ancient Babylonians required their kings to offer a ritual apology during the Akitu festival of New Year: The king would go down on his knees before an image of the god Marduk, beg his forgiveness, insist that he hadn’t sinned against the god himself and promise to do better next year. The rite ended with the high priest giving the royal cheek the hardest possible slap.
There are sufficient similarities between the Akitu festival and Yom Kippur, Judaism’s Day of Atonement—which takes place 10 days after the Jewish New Year—to suggest that there was likely a historical link between them. Yom Kippur, however, is about accepting responsibility, with the emphasis on owning up to sins committed rather than pointing out those omitted.
In Europe, the 14th-century Middle English poem “Sir Gawain and the Green Knight” begins its strange tale on New Year’s Day. A green-skinned knight arrives at King Arthur’s Camelot and challenges the knights to strike at him, on the condition that he can return the blow in a year and a day. Sir Gawain reluctantly accepts the challenge, and embarks on a year filled with adventures. Although he ultimately survives his encounter with the Green Knight, Gawain ends up haunted by his moral lapses over the previous 12 months. For, he laments (in J.R.R. Tolkien’s elegant translation), “a man may cover his blemish, but unbind it he cannot.”
New Year’s Eve in Shakespeare’s era was regarded as a day for gift-giving rather than as a catalyst for regret. But Sonnet 30 shows that Shakespeare was no stranger to the melancholy that looking back can inspire: “I summon up remembrance of things past, / I sigh the lack of many a thing I sought, / And with old woes new wail my dear time’s waste.”
For a full dose of New Year’s misery, however, nothing beats the Victorians. “I wait its close, I court its gloom,” declared the poet Walter Savage Landor in “Mild Is the Parting Year.” Not to be outdone, William Wordsworth offered his “Lament of Mary Queen of Scots on the Eve of a New Year”: “Pondering that Time tonight will pass / The threshold of another year; /…My very moments are too full / Of hopelessness and fear.”
Fortunately, there is always Charles Dickens. In 1844, Dickens followed up the wildly successful “A Christmas Carol” with a slightly darker but still uplifting seasonal tale, “The Chimes.” Trotty Veck, an elderly messenger, takes stock of his life on New Year’s Eve and decides that he has been nothing but a burden on society. He resolves to kill himself, but the spirits of the church bells intervene, showing him a vision of what would happen to the people he loves.
Today, most Americans recognize this story as the basis of the bittersweet 1946 Frank Capra film “It’s a Wonderful Life.” As an antidote to New Year’s blues, George Bailey’s lesson holds true for everyone: “No man is a failure who has friends.”
Queen Victoria and family with their Christmas tree in 1848.
My family never had a pink-frosted Christmas tree, though Lord knows my 10-year-old self really wanted one. Every year my family went to Sal’s Christmas Emporium on Wilshire Boulevard in Los Angeles, where you could buy neon-colored trees, mechanical trees that played Christmas carols, blue and white Hanukkah bushes or even a real Douglas fir if you wanted to go retro. We were solidly retro.
Decorating the Christmas tree remains one of my most treasured memories, and according to the National Christmas Tree Association, the tradition is still thriving in our digital age: In 2017 Americans bought 48.5 million real and artificial Christmas trees. Clearly, bringing a tree into the house, especially during winter, taps into something deeply spiritual in the human psyche.
Nearly every society has at some point venerated the tree as a symbol of fertility and rebirth, or as a living link between the heavens, the earth and the underworld. In the ancient Near East, “tree of life” motifs appear on pottery as early as 7000 B.C. By the second millennium B.C., variations of the motif were being carved onto temple walls in Egypt and fashioned into bronze sculptures in southern China.
The early Christian fathers were troubled by the possibility that the faithful might identify the Garden of Eden’s trees of life and knowledge, described in the Book of Genesis, with paganism’s divine trees and sacred groves. Accordingly, in 572 the Council of Braga banned Christians from participating in the Roman celebration of Saturnalia—a popular winter solstice festival in honor of Saturn, the god of agriculture, that included decking the home with boughs of holly, his sacred symbol.
It wasn’t until the late Middle Ages that evergreens received a qualified welcome from the Church, as props in the mystery plays that told the story of Creation. In Germany, mystery plays were performed on Christmas Eve, traditionally celebrated in the church calendar as the feast day of Adam and Eve. The original baubles that hung on these “paradise trees,” representing the trees in the Garden of Eden, were round wafer breads that symbolized the Eucharist.
The Christmas tree remained a northern European tradition until Queen Charlotte, the German-born wife of George III, had one erected for a children’s party at Windsor Castle in 1800. The British upper classes quickly followed suit, but the rest of the country remained aloof until 1848, when the Illustrated London News published a charming picture of Queen Victoria and her family gathered around a large Christmas tree. Suddenly, every household had to have one for the children to decorate. It didn’t take long for President Franklin Pierce to introduce the first Christmas tree to the White House, in 1853—a practice that every President has honored except Theodore Roosevelt, who in 1902 refused to have a tree on conservationist grounds. (His children objected so much to the ban that he eventually gave in.)
Many writers have tried to capture the complex feelings that Christmas trees inspire, particularly in children. Few, though, can rival T.S. Eliot’s timeless meditation on joy, death and life everlasting, in his 1954 poem “The Cultivation of Christmas Trees”: “The child wonders at the Christmas Tree: / Let him continue in the spirit of wonder / At the Feast as an event not accepted as a pretext; / So that the glittering rapture, the amazement / Of the first-remembered Christmas Tree /…May not be forgotten.”
The tell-all memoir has been a feature of American politics ever since Raymond Moley, an ex-aide to Franklin Delano Roosevelt, published his excoriating book “After Seven Years” while FDR was still in office. What makes the Trump administration unusual is the speed at which such accounts are appearing—most recently, “Unhinged,” by Omarosa Manigault Newman, a former political aide to the president.
Spilling the beans on one’s boss may be disloyal, but it has a long pedigree. Alexander the Great is thought to have inspired the genre. His great run of military victories, beginning with the Battle of Chaeronea in 338 B.C., was so unprecedented that several of his generals felt the urge—unknown in Greek literature before then—to record their experiences for posterity.
Unfortunately, their accounts didn’t survive, save for the memoir of Ptolemy Soter, the founder of the Ptolemaic dynasty in Egypt, which exists in fragments. The great majority of Roman political memoirs have also disappeared—many by official suppression. Historians particularly regret the loss of the memoirs of Agrippina, the mother of Emperor Nero, who once boasted that she could bring down the entire imperial family with her revelations.
The Heian period (794-1185) in Japan produced four notable court memoirs, all by noblewomen. Dissatisfaction with their lot was a major factor behind these accounts—particularly for the anonymous author of “The Gossamer Years,” written around 974. The author was married to Fujiwara no Kane’ie, the regent for the Emperor Ichijo. Her exalted position at court masked a deeply unhappy private life; she was made miserable by her husband’s serial philandering, describing herself as “rich only in loneliness and sorrow.”
In Europe, the first modern political memoir was written by the Duc de Saint-Simon (1675-1755), a frustrated courtier at Versailles who took revenge on Louis XIV with his pen. Saint-Simon’s tales hilariously reveal the drama, gossip and intrigue that surrounded a king whose intellect, in his view, was “beneath mediocrity.”
But even Saint-Simon’s memoirs pale next to those of the Korean noblewoman Lady Hyegyeong (1735-1816), wife of Crown Prince Sado of the Joseon Dynasty. Her book, “Memoirs Written in Silence,” tells shocking tales of murder and madness at the heart of the Korean court. Sado, she writes, was a homicidal psychopath who went on a bloody killing spree that was only stopped by the intervention of his father King Yeongjo. Unwilling to see his son publicly executed, Yeongjo had the prince locked inside a rice chest and left to die. Understandably, Hyegyeong’s memoirs caused a huge sensation in Korea when they were first published in 1939, following the death of the last Emperor in 1926.
Fortunately, the Washington political memoir has been free of this kind of violence. Still, it isn’t just Roman emperors who have tried to silence uncomfortable voices. According to the historian Michael Beschloss, President John F. Kennedy had the White House household staff sign agreements to refrain from writing any memoirs. But eventually, of course, even Kennedy’s secrets came out. Perhaps every political leader should be given a plaque that reads: “Just remember, your underlings will have the last word.”
At the first Thanksgiving dinner, eaten by the Wampanoag Indians and the Pilgrims in 1621, the menu was rather different from what’s served today. For one thing, the pumpkin was roasted, not made into a pie. And there definitely wasn’t a side dish of mashed potatoes.
In fact, the first hundred Thanksgivings were spud-free, since potatoes weren’t grown in North America until 1719, when Scotch-Irish settlers began planting them in New Hampshire. Mashed potatoes were an even later invention. The first recorded recipe for the dish appeared in 1747, in Hannah Glasse’s splendidly titled “The Art of Cookery Made Plain and Easy, Which Far Exceeds Any Thing of the Kind yet Published.”
By then, the potato had been known in Europe for a full two centuries. It was first introduced by the Spanish conquerors of Peru, where the Incas had revered the potato and even invented a natural way of freeze-drying it for storage. Nevertheless, despite its nutritional value and ease of growing, the potato didn’t catch on in Europe. It wasn’t merely foreign and ugly-looking; to wheat-growing farmers it seemed unnatural—possibly even un-Christian, since there is no mention of the potato in the Bible. Outside of Spain, it was generally grown for animal feed.
The change in the potato’s fortunes was largely due to the efforts of a Frenchman named Antoine-Augustin Parmentier (1737-1813). During the Seven Years’ War, he was taken prisoner by the Prussians and forced to live on a diet of potatoes. To his surprise, he stayed relatively healthy. Convinced he had found a solution to famine, Parmentier dedicated his life after the war to popularizing the potato’s nutritional benefits. He even persuaded Marie-Antoinette to wear potato flowers in her hair.
Among the converts to his message were the economist Adam Smith, who realized the potato’s economic potential as a staple food for workers, and Thomas Jefferson, then the U.S. Ambassador to France, who was keen for his new nation to eat well in all senses of the word. Jefferson is credited with introducing Americans to french fries at a White House dinner in 1802.
As Smith predicted, the potato became the fuel for the Industrial Revolution. A study published in 2011 by Nathan Nunn and Nancy Qian in the Quarterly Journal of Economics estimates that up to a quarter of the world’s population growth from 1700 to 1900 can be attributed solely to the introduction of the potato. As Louisa May Alcott observed in “Little Men,” in 1871, “Money is the root of all evil, and yet it is such a useful root that we cannot get on without it any more than we can without potatoes.”
In 1887, two Americans, Jacob Fitzgerald and William H. Silver, patented the first potato ricer, which forced a cooked potato through a cast iron sieve, ending the scourge of lumpy mash. Still, the holy grail of “quick and easy” mashed potatoes remained elusive until the late 1950s. Using the flakes produced by the potato ricer and a new freeze-drying method, U.S. government scientists perfected instant mashed potatoes, which require only the simple step of adding hot water or milk to the mix. The days of peeling, boiling and mashing were now optional, and for millions of cooks, Thanksgiving became a little easier. And that’s something to be thankful for.
Ever since Neolithic people wandered the earth, inadvertently bringing the mouse along for the ride, humans have been responsible for introducing animal and plant species into new environments. But problems can arise when a non-native species encounters no barriers to population growth, allowing it to rampage unchecked through the new habitat, overwhelming the ecosystem. On more than one occasion, humans have transplanted a species for what seemed like good reasons, only to find out too late that the consequences were disastrous.
One of the most famous examples is celebrating its 150th anniversary this year: the introduction of Japanese knotweed to the U.S. A highly aggressive plant, it can grow 15 feet high and has roots that spread up to 45 feet. Knotweed had already been a hit in Europe because of its pretty little white flowers, and, yes, its miraculous indestructibility.
First mentioned in botanical articles in 1868, knotweed was brought to New York by the Hogg brothers, James and Thomas, eminent American horticulturalists and among the earliest collectors of Japanese plants. Thanks to their extensive contacts, knotweed found a home in arboretums, botanical gardens and even Central Park. Not content with importing one of the world’s most invasive shrubs, the Hoggs also introduced Americans to the wonders of kudzu, a dense vine that can grow a foot a day.
Impressed by the vigor of kudzu, agriculturalists recommended using these plants to provide animal fodder and prevent soil erosion. In the 1930s, the government was even paying Southern farmers $8 per acre to plant kudzu. Today it is known as the “vine that ate the South,” because of the way it covers huge tracts of land in a green blanket of death. And Japanese knotweed is still spreading, colonizing entire habitats from Mississippi to Alaska, where only the Arctic tundra holds it back from world domination.
Knotweed has also reached Australia, a country that has been ground zero for the worst excesses of invasive species. In the 19th century, the British imported non-native animals such as rabbits, cats, goats, donkeys, pigs, foxes and camels, causing mass extinctions of Australia’s native mammal species. Australians are still paying the price; there are more rabbits in the country today than wombats, more camels than kangaroos.
Yet the lesson wasn’t learned. In the 1930s, scientists in both Australia and the U.S. decided to import the South American cane toad as a form of biowarfare against beetles that eat sugar cane. The experiment failed, and it turned out that the cane toad was poisonous to any predator that ate it. There’s also the matter of the 30,000 eggs it can lay at a time. Today, the cane toad can be found all over northern Australia and south Florida.
So is there anything we can do once an invasive species has taken up residence? The answer is yes, but it requires more than just fences, traps and pesticides; it means changing human incentives. Today, for instance, the voracious Indo-Pacific lionfish is gobbling up local fish in the west Atlantic, while the Asian carp threatens the ecosystem of the Great Lakes. There is only one solution: We must eat them, dear reader. These invasive fish can be grilled, fried or consumed as sashimi, and they taste delicious. Likewise, kudzu makes great salsa, and Japanese knotweed can be treated like rhubarb. Eat for America and save the environment.
As Halloween approaches, decorations featuring scary black cats are starting to make their seasonal appearance. But what did the black cat ever do to deserve its reputation as a symbol of evil? Why is it considered bad luck to have a black cat cross your path?
It wasn’t always this way. In fact, the first human-cat interactions were benign and based on mutual convenience. The invention of agriculture in the Neolithic era led to surpluses of grain, which attracted rodents, which in turn motivated wild cats to hang around humans in the hope of catching dinner. Domestication soon followed: The world’s oldest pet cat was found in a 9,500-year-old grave in Cyprus, buried alongside its human owner.
Yet as the ancient Egyptians realized, even when domesticated, the cat retains its independence. The Egyptians were fascinated by divine opposites and cosmic symmetries, and they saw this kind of duality in the cat—a fierce predator that was also a loyal guardian. Several Egyptian deities were depicted in part-cat, part-human form, including Bastet, who was a goddess of violence as well as fertility. One of her sacred colors was black, which is how the black cat first achieved its special status.
According to the Roman writer Polyaenus, who lived in the second century A.D., the Egyptian veneration of cats led to disaster at the Battle of Pelusium in 525 B.C. The invading Persian army carried cats on the front lines, rightly calculating that the Egyptians would rather accept defeat than kill a cat.
The Egyptians were unique in their extreme veneration of cats, but they weren’t alone in regarding them as having a special connection to the spirit world. In Greek mythology the cat was a familiar of Hecate, goddess of magic, sorcery and witchcraft. Hecate’s pet had once been a serving maid named Galanthis, who was turned into a cat as punishment by the goddess Hera for being rude.
When Christianity became the official religion of Rome in 380, the association of cats with paganism and witchcraft made them suspect. Moreover, the cat’s independence suggested a willful rebellion against the teaching of the Bible, which said that Adam had dominion over all the animals. The cat’s reputation worsened during the medieval era, as the Catholic Church battled against heresies and dissent. Fed lurid tales by his inquisitors, in 1233 Pope Gregory IX issued a papal bull, “Vox in Rama,” which accused heretics of using black cats in their nighttime sex orgies with Lucifer—who was described as half-cat in appearance.
In Europe, countless numbers of cats were killed in the belief that they could be witches in disguise. In 1484, Pope Innocent VIII fanned the flames of anti-cat prejudice with his papal bull on witchcraft, “Summis Desiderantes Affectibus,” which stated that the cat was “the devil’s favorite animal and idol of all witches.”
The Age of Reason ought to have rescued the black cat from its pariah status, but superstitions die hard. (How many modern apartment buildings lack a 13th floor?) Cats had plenty of ardent fans among 19th-century writers, including Charles Dickens and Mark Twain, who wrote, “I simply can’t resist a cat, particularly a purring one.” But Edgar Allan Poe, the master of the gothic tale, felt otherwise: In his 1843 story “The Black Cat,” the spirit of a dead cat drives its killer to madness and destruction.
So pity the poor black cat, which through no fault of its own has gone from being an instrument of the devil to the convenient tool of the horror writer—and a favorite Halloween cliché.
These days, every neighborhood bar celebrates Oktoberfest, but the original fall beer festival is the one in Munich, Germany—still the largest of its kind in the world. Oktoberfest was started in 1810 by the Bavarian royal family as a celebration of Crown Prince Ludwig’s marriage to Princess Therese von Sachsen-Hildburghausen. Nowadays, it lasts 16 days and attracts some 6 million tourists, who guzzle almost 2 million gallons of beer.
Yet these staggering numbers conceal the fact that, outside of the developing world, the beer industry is suffering. Beer sales in the U.S. last year accounted for 45.6% of the alcohol market, down from 48.2% in 2010. In Germany, per capita beer consumption has dropped by one-third since 1976. It is a sad decline for a drink that has played a central role in the history of civilization. Brewing beer, like baking bread, is considered by archaeologists to be one of the key markers in the development of agriculture and communal living.
In Sumer, the ancient empire in modern-day Iraq where the world’s first cities emerged in the 4th millennium B.C., up to 40% of all grain production may have been devoted to beer. It was more than an intoxicating beverage; beer was nutritious and much safer to drink than ordinary water because it was boiled first. The oldest known beer recipe comes from a Sumerian hymn to Ninkasi, the goddess of beer, composed around 1800 B.C. The fact that a female deity oversaw this most precious commodity reflects the importance of women in its production. Beer was brewed in the kitchen and was considered as fundamental a skill for women as cooking and needlework.
The ancient Egyptians similarly regarded beer as essential for survival: Construction workers for the pyramids were usually paid in beer rations. The Greeks and Romans were unusual in preferring wine; blessed with climates that aided viticulture, they looked down on beer-drinking as foreign and unmanly. (There’s no mention of beer in Homer.)
Northern Europeans adopted wine-growing from the Romans, but beer was their first love. The Vikings imagined Valhalla as a place where beer perpetually flowed. Still, beer production remained primarily the work of women. With most occupations in the Middle Ages restricted to members of male-only guilds, widows and spinsters could rely on ale-making to support themselves. Among her many talents as a writer, composer, mystic and natural scientist, the renowned 12th-century Rhineland abbess Hildegard of Bingen was also an expert on the use of hops in beer.
The female domination of beer-making lasted in Europe until the 15th and 16th centuries, when the growth of the market economy helped to transform it into a profitable industry. As professional male brewers took over production and distribution, female brewers lost their respectability. By the 19th century, women were far more likely to be temperance campaigners than beer drinkers.
When Prohibition ended in the U.S. in 1933, brewers struggled to get beer into American homes. Their solution was an ad campaign selling beer to housewives—not to drink it but to cook with it. In recent years, beer ads have rarely bothered to address women at all, which may explain why only a quarter of U.S. beer drinkers are female.
As we’ve seen recently in the Kavanaugh hearings, a male-dominated beer-drinking culture can be unhealthy for everyone. Perhaps it’s time for brewers to forget “the king of beers”—Budweiser’s slogan—and seek their once and future queen.
Writer and historian Amanda Foreman took her family on an epic motorhome adventure. Would it drive them all round the bend?
Imagining yourself behind the wheel of an RV and actually driving one are two completely different things. I discovered this shortly after we hit the road in our 30ft rental for a 1,500-mile odyssey from Denver, Colorado, to Las Vegas, Nevada. As soon as I pressed down on the accelerator, other faculties such as breathing and thinking stopped functioning. My husband, Jonathan, took over, and remained in the driving seat for the rest of the trip, while I quietly buried my pride beneath a pile of flip-flops and rain jackets.
The latest research offers hope, though. The best antidote to the rapacious demands of the internet, experts say, is its reverse — the power of lived experience. And so we decided to put that theory to the test by taking to the road.
Everything was new and exciting at first. Look, a herd of cattle blocking the road! Oh, we blew the electrics! It was when we left the cool mountain ranges of Colorado and descended into Utah, a dip that sent the thermometer climbing into the 30s, that reality took over.
We were heading for Moab and its two national parks and one state park. Some of the best scenery in the totemic road-trip movie Thelma and Louise is here. I kept repeating this fact until Helena reminded me that they’d never seen the film and couldn’t care less. The atmosphere of the RV, already thick with sweat and hormones, was beginning to grow heavy. Knees and elbows were making contact where they shouldn’t.
Our first hike was through Arches National Park, named for the formations that stud the desert like a thousand oversized Durdle Doors. Our trail took us to Devils Garden, with its giant phallic pillars. Another led to Landscape Arch, the longest natural arch in the world. A peculiar tension emanates from it, as though at any second something will snap and the whole thing will collapse. I recognised the feeling.
Three children claimed to be on the edge of survival by day’s end (heat, thirst, sand in shoe, etc). A night’s camping at Devils Garden did it for the rest.
The evening had started well. We ate under the starriest sky I’d ever seen. The intense blackness around us served to highlight the strange symphony of sounds wafting from the brush. After dinner, we sat in companionable silence, listening to the music of the desert. If we could have stayed that way until morning, the night would have been perfect. But shutting down an RV for the night requires concentration — not the futzing and farting about of a family of camping novices. By the time we had secured the awning, washed and put everything away, cleared the only escape route of detritus, converted the “dinette” into a quasi-bed large enough for a small person, or in our case reluctant twins, turned the banquette into a sofa bed (tricky with the seatbelt fittings running down the middle) and stashed the teenagers into the bunk above the driver’s seat, the mercury had started rising in the worst way.
I lay on my side in the back alcove, feeling the grit between my toes and the sweat between my eyes, waiting for the inevitable blow-up. There was a yell, a thwack, and the RV started shaking on its plastic levellers. My husband and I leapt out of bed and turned on the lights. The children froze, the twins in mid-wrestle. Each returned sulkily to his or her place.
Once peace was restored, Jonathan and I listened for signs of a round two: was that it, or merely the first skirmish in a battle to the death? “It depends,” I said, “on whether we kill them first.”
“Where do we go from here?” Evita sang over the sound system as we headed to our next stop the following day. There were three more destinations to see before we arrived at the Grand Canyon. “This isn’t where we intended to be,” she warbled.
“Don’t listen to her,” my husband told the children, “she’s being a wuss.”
I was beginning to fear the wuss factor would end up trumping the wow factor. Your basic renters’ RV is essentially a tin can on wheels with a lavatory attached. By midweek we had wrecked the place. Only the desperate and foolhardy dared face the loo. Every spare inch of the RV was occupied by wet and drying clothes, creating a steam-room effect without the refreshing scent of eucalyptus.
A gentle rain accompanied our arrival at Bryce Canyon. Despite the name, it’s not a canyon at all, but a series of natural stone amphitheatres created by millions of years of frost-thaw cycles. Each one is filled with densely packed pillars and spires called “hoodoos”, which turn crimson and ginger in the sun. Our goal, as we set off from Ruby’s Campground, was to see the sun set over Bryce’s most famous amphitheatre, the hauntingly named Silent City. There was a whiff of mutiny in the damp air. Nothing obvious, but if I had suggested going back to the RV, no one would have complained.
We trudged through the pines towards Inspiration Point (by God, did we need some) in a long, straggly line. Glimpses of Silent City flashed through breaks in the trees. The hoodoos seemed not alive exactly, but immobile. I looked back and saw that the stragglers had stopped. They, too, were staring at the rocks.
At the Point, we waited in vain for a break in the clouds. Our son suddenly shouted: “It’s a peregrine falcon, I’m sure of it.” His sisters clustered around, taking photographs as the bird swooped and then banked hard into the air. “They do this to impress their mates,” Theo informed us.
Back at camp, we broke out the chocolate Oreos in celebration of Theo’s surprise expertise in all things avian. The rain also cleared in time for us to light the barbecue and go full-on carnivore. There wasn’t a peep from anyone that night.
The next morning, we reached Zion National Park right on schedule, a first for us. The day’s plan was a hike up the Virgin River to the Narrows, where the orange-brown walls of the canyon rise 1,000ft but the gorge is only 30ft across. The water was cold, the rocks slippery, providing ample opportunities for a whinge. None came. I noticed a subtle change as we waded upriver: the children were the ones out in front. That wasn’t all. Over the next few days, blisters that would have been incapacitating at the beginning of the trip were now displayed with pride.
The news that we were camping on the rustic side of the Grand Canyon, at the North Rim, didn’t faze anyone at all. The South Rim, which overlooks the Colorado River, gets 10 times more visitors and has two dozen main viewing points. The North is higher, colder, smaller, with just three principal viewpoints, and is so quiet that you can hear the trees rustling in the early-morning breeze. It means you have one of the great natural wonders of the world all to yourself. We could picnic among the most spectacular scenery without another person in sight.
If there was a wuss among the group, it was me. We had trekked to Cape Royal, where a wooded trail leads to a dramatic plateau that offers one of the widest panoramas of the canyon. The children were naturally drawn to the ledge, carelessly dangling their legs over the side. I was torn between wanting to capture the moment for ever and shouting at them to get back. Jonathan intervened. “It’s all right,” he said. “They are free.”
There wasn’t an ounce of regret when we dropped off the camper in Las Vegas. I doubt we will ever rent an RV again. Yet we ended the week feeling truly content. Without being sappy about it, to experience nature in its untrammelled state is to feel insignificant and uplifted in equal measure.
We came, we saw, and were conquered, happily.
Amanda’s guide to a stress-free US motorhome holiday

Do book your campground berths at least six months in advance. I’m not exaggerating.
Do find out what your RV comes with. Usually, it’s practically nothing.
Do check the depot for “left behinds”. We picked up a barbecue that way, and donated it back at the end.
Do bring physical maps with you — there isn’t much coverage out in the sticks. Praise be.
Do have a games bag with amusements that don’t need charging, such as playing cards, colouring books and chess.
Do take the desert heat seriously. Load up on hats, sunscreen, insulated water bottles and handheld fans.
Do make a list of chores. There should be no passengers on your trip, only crew members.
Don’t overpack.
Don’t attempt more than four hours’ driving a day. It’s not much fun for the people stuck in the back.
Don’t assume that you’ll be allowed to use your generator past 8pm. Plan your meals ahead of time.
Don’t leave out any food overnight. Nature is not your friend. Nature wants to swarm over your leftovers and/or eat you.
Don’t forget to check your water levels every day. Remember how disgusting it is when you flush the loo on a plane and nothing happens when the trap opens? Now multiply that by five.
Don’t expect kids to be as rhapsodic about the scenery as you are.
Hiring a six-berth RV for 10 days in November, picking up in Denver and dropping off in Las Vegas, starts at £830, including one-way fee, with Road Bear (roadbearrv.com). A 10-day fly-drive package from Denver in October costs £1,225pp for a family of six, including RV hire (bon-voyage.co.uk). A World on Fire by Amanda Foreman is published by Penguin.
Have you successfully bonded as a family on holiday? Or had a disaster? Tell us and you could win a £250 holiday voucher; see Letters for details. Email firstname.lastname@example.org