I don’t look forward to New Year’s Eve. When the bells start to ring, it isn’t “Auld Lang Syne” I hear but echoes from the Anglican “Book of Common Prayer”: “We have left undone those things which we ought to have done; And we have done those things which we ought not to have done.”
At least I’m not alone in my annual dip into the waters of woe. Experiencing the sharp sting of regret around the New Year has a long pedigree. The ancient Babylonians required their kings to offer a ritual apology during the Akitu New Year festival: The king would go down on his knees before an image of the god Marduk, beg his forgiveness, insist that he hadn’t sinned against the god himself and promise to do better next year. The rite ended with the high priest giving the royal cheek the hardest possible slap.
There are sufficient similarities between the Akitu festival and Yom Kippur, Judaism’s Day of Atonement—which takes place 10 days after the Jewish New Year—to suggest that there was likely a historical link between them. Yom Kippur, however, is about accepting responsibility, with the emphasis on owning up to sins committed rather than pointing out those omitted.
In Europe, the 14th-century Middle English poem “Sir Gawain and the Green Knight” begins its strange tale on New Year’s Day. A green-skinned knight arrives at King Arthur’s Camelot and challenges the knights to strike at him, on the condition that he can return the blow in a year and a day. Sir Gawain reluctantly accepts the challenge, and embarks on a year filled with adventures. Although he ultimately survives his encounter with the Green Knight, Gawain ends up haunted by his moral lapses over the previous 12 months. For, he laments (in J.R.R. Tolkien’s elegant translation), “a man may cover his blemish, but unbind it he cannot.”
New Year’s Eve in Shakespeare’s era was regarded as a day for gift-giving rather than as a catalyst for regret. But Sonnet 30 shows that Shakespeare was no stranger to the melancholy that looking back can inspire: “I summon up remembrance of things past, / I sigh the lack of many a thing I sought, / And with old woes new wail my dear time’s waste.”
For a full dose of New Year’s misery, however, nothing beats the Victorians. “I wait its close, I court its gloom,” declared the poet Walter Savage Landor in “Mild Is the Parting Year.” Not to be outdone, William Wordsworth offered his “Lament of Mary Queen of Scots on the Eve of a New Year”: “Pondering that Time tonight will pass / The threshold of another year; /…My very moments are too full / Of hopelessness and fear.”
Fortunately, there is always Charles Dickens. In 1844, Dickens followed up the wildly successful “A Christmas Carol” with a slightly darker but still uplifting seasonal tale, “The Chimes.” Trotty Veck, an elderly messenger, takes stock of his life on New Year’s Eve and decides that he has been nothing but a burden on society. He resolves to kill himself, but the spirits of the church bells intervene, showing him a vision of what would happen to the people he loves.
Today, most Americans recognize this story as the basis of the bittersweet 1946 Frank Capra film “It’s a Wonderful Life.” As an antidote to New Year’s blues, George Bailey’s lesson holds true for everyone: “No man is a failure who has friends.”
From Saturnalia to Christmas Eve, people have always had a spiritual need for greenery in the depths of winter
Queen Victoria and family with their Christmas tree in 1848.
My family never had a pink-frosted Christmas tree, though Lord knows my 10-year-old self really wanted one. Every year my family went to Sal’s Christmas Emporium on Wilshire Boulevard in Los Angeles, where you could buy neon-colored trees, mechanical trees that played Christmas carols, blue and white Hanukkah bushes or even a real Douglas fir if you wanted to go retro. We were solidly retro.
Decorating the Christmas tree remains one of my most treasured memories, and according to the National Christmas Tree Association, the tradition is still thriving in our digital age: In 2017 Americans bought 48.5 million real and artificial Christmas trees. Clearly, bringing a tree into the house, especially during winter, taps into something deeply spiritual in the human psyche.
Nearly every society has at some point venerated the tree as a symbol of fertility and rebirth, or as a living link between the heavens, the earth and the underworld. In the ancient Near East, “tree of life” motifs appear on pottery as early as 7000 B.C. By the second millennium B.C., variations of the motif were being carved onto temple walls in Egypt and fashioned into bronze sculptures in southern China.
The early Christian fathers were troubled by the possibility that the faithful might identify the Garden of Eden’s trees of life and knowledge, described in the Book of Genesis, with paganism’s divine trees and sacred groves. Accordingly, in 572 the Council of Braga banned Christians from participating in the Roman celebration of Saturnalia—a popular winter solstice festival in honor of Saturn, the god of agriculture, that included decking the home with boughs of holly, his sacred symbol.
It wasn’t until the late Middle Ages that evergreens received a qualified welcome from the Church, as props in the mystery plays that told the story of Creation. In Germany, mystery plays were performed on Christmas Eve, traditionally celebrated in the church calendar as the feast day of Adam and Eve. The original baubles that hung on these “paradise trees,” representing the trees in the Garden of Eden, were round wafer breads that symbolized the Eucharist.
The Christmas tree remained a northern European tradition until Queen Charlotte, the German-born wife of George III, had one erected for a children’s party at Windsor Castle in 1800. The British upper classes quickly followed suit, but the rest of the country remained aloof until 1848, when the Illustrated London News published a charming picture of Queen Victoria and her family gathered around a large Christmas tree. Suddenly, every household had to have one for the children to decorate. It didn’t take long for President Franklin Pierce to introduce the first Christmas tree to the White House, in 1853—a practice that every President has honored except Theodore Roosevelt, who in 1902 refused to have a tree on conservationist grounds. (His children objected so much to the ban that he eventually gave in.)
Many writers have tried to capture the complex feelings that Christmas trees inspire, particularly in children. Few, though, can rival T.S. Eliot’s timeless meditation on joy, death and life everlasting, in his 1954 poem “The Cultivation of Christmas Trees”: “The child wonders at the Christmas Tree: / Let him continue in the spirit of wonder / At the Feast as an event not accepted as a pretext; / So that the glittering rapture, the amazement / Of the first-remembered Christmas Tree /…May not be forgotten.”
The tell-all memoir has been a feature of American politics ever since Raymond Moley, an ex-aide to Franklin Delano Roosevelt, published his excoriating book “After Seven Years” while FDR was still in office. What makes the Trump administration unusual is the speed at which such accounts are appearing—most recently, “Unhinged,” by Omarosa Manigault Newman, a former political aide to the president.
Spilling the beans on one’s boss may be disloyal, but it has a long pedigree. Alexander the Great is thought to have inspired the genre. His great run of military victories, beginning with the Battle of Chaeronea in 338 B.C., was so unprecedented that several of his generals felt the urge—unknown in Greek literature before then—to record their experiences for posterity.
Unfortunately, their accounts didn’t survive, save for the memoir of Ptolemy Soter, the founder of the Ptolemaic dynasty in Egypt, which exists in fragments. The great majority of Roman political memoirs have also disappeared—many by official suppression. Historians particularly regret the loss of the memoirs of Agrippina, the mother of Emperor Nero, who once boasted that she could bring down the entire imperial family with her revelations.
The Heian period (794-1185) in Japan produced four notable court memoirs, all by noblewomen. Dissatisfaction with their lot was a major factor behind these accounts—particularly for the anonymous author of “The Gossamer Years,” written around 974. The author was married to Fujiwara no Kane’ie, the regent for the Emperor Ichijo. Her exalted position at court masked a deeply unhappy private life; she was made miserable by her husband’s serial philandering, describing herself as “rich only in loneliness and sorrow.”
In Europe, the first modern political memoir was written by the Duc de Saint-Simon (1675-1755), a frustrated courtier at Versailles who took revenge on Louis XIV with his pen. Saint-Simon’s tales hilariously reveal the drama, gossip and intrigue that surrounded a king whose intellect, in his view, was “beneath mediocrity.”
But even Saint-Simon’s memoirs pale next to those of the Korean noblewoman Lady Hyegyeong (1735-1816), wife of Crown Prince Sado of the Joseon Dynasty. Her book, “Memoirs Written in Silence,” tells shocking tales of murder and madness at the heart of the Korean court. Sado, she writes, was a homicidal psychopath whose bloody killing spree was stopped only by the intervention of his father, King Yeongjo. Unwilling to see his son publicly executed, Yeongjo had the prince locked inside a rice chest and left to die. Understandably, Hyegyeong’s memoirs caused a huge sensation in Korea when they were first published in 1939, following the death of the last Emperor in 1926.
Fortunately, the Washington political memoir has been free of this kind of violence. Still, it isn’t just Roman emperors who have tried to silence uncomfortable voices. According to the historian Michael Beschloss, President John F. Kennedy had the White House household staff sign agreements to refrain from writing any memoirs. But eventually, of course, even Kennedy’s secrets came out. Perhaps every political leader should be given a plaque that reads: “Just remember, your underlings will have the last word.”
At the first Thanksgiving dinner, eaten by the Wampanoag Indians and the Pilgrims in 1621, the menu was rather different from what’s served today. For one thing, the pumpkin was roasted, not made into a pie. And there definitely wasn’t a side dish of mashed potatoes.
In fact, the first hundred Thanksgivings were spud-free, since potatoes weren’t grown in North America until 1719, when Scotch-Irish settlers began planting them in New Hampshire. Mashed potatoes were an even later invention. The first recorded recipe for the dish appeared in 1747, in Hannah Glasse’s splendidly titled “The Art of Cookery Made Plain and Easy, Which Far Exceeds Any Thing of the Kind yet Published.”
By then, the potato had been known in Europe for a full two centuries. It was first introduced by the Spanish conquerors of Peru, where the Incas had revered the potato and even invented a natural way of freeze-drying it for storage. Yet despite its nutritional value and ease of cultivation, the potato didn’t catch on in Europe. It wasn’t merely foreign and ugly-looking; to wheat-growing farmers it seemed unnatural—possibly even un-Christian, since there is no mention of the potato in the Bible. Outside of Spain, it was generally grown for animal feed.
The change in the potato’s fortunes was largely due to the efforts of a Frenchman named Antoine-Augustin Parmentier (1737-1813). During the Seven Years’ War, he was taken prisoner by the Prussians and forced to live on a diet of potatoes. To his surprise, he stayed relatively healthy. Convinced he had found a solution to famine, Parmentier dedicated his life after the war to popularizing the potato’s nutritional benefits. He even persuaded Marie-Antoinette to wear potato flowers in her hair.
Among the converts to his message were the economist Adam Smith, who realized the potato’s economic potential as a staple food for workers, and Thomas Jefferson, then the U.S. Ambassador to France, who was keen for his new nation to eat well in all senses of the word. Jefferson is credited with introducing Americans to french fries at a White House dinner in 1802.
As Smith predicted, the potato became the fuel for the Industrial Revolution. A study published in 2011 by Nathan Nunn and Nancy Qian in the Quarterly Journal of Economics estimates that up to a quarter of the world’s population growth from 1700 to 1900 can be attributed solely to the introduction of the potato. As Louisa May Alcott observed in “Little Men,” in 1871, “Money is the root of all evil, and yet it is such a useful root that we cannot get on without it any more than we can without potatoes.”
In 1887, two Americans, Jacob Fitzgerald and William H. Silver, patented the first potato ricer, which forced a cooked potato through a cast-iron sieve, ending the scourge of lumpy mash. Still, the holy grail of “quick and easy” mashed potatoes remained elusive until the late 1950s. Using flakes made from riced potatoes and a new freeze-drying method, U.S. government scientists perfected instant mashed potatoes, which require only the simple step of adding hot water or milk to the mix. The days of peeling, boiling and mashing were now optional, and for millions of cooks, Thanksgiving became a little easier. And that’s something to be thankful for.
From Japanese knotweed to cane toads, humans have introduced invasive species to new environments with disastrous results
Ever since Neolithic people wandered the earth, inadvertently bringing the mouse along for the ride, humans have been responsible for introducing animal and plant species into new environments. But problems can arise when a non-native species encounters no barriers to population growth, allowing it to rampage unchecked through the new habitat, overwhelming the ecosystem. On more than one occasion, humans have transplanted a species for what seemed like good reasons, only to find out too late that the consequences were disastrous.
One of the most famous examples is celebrating its 150th anniversary this year: the introduction of Japanese knotweed to the U.S. A highly aggressive plant, it can grow 15 feet high and has roots that spread up to 45 feet. Knotweed had already been a hit in Europe because of its pretty little white flowers, and, yes, its miraculous indestructibility.
First mentioned in botanical articles in 1868, knotweed was brought to New York by the Hogg brothers, James and Thomas, eminent American horticulturalists and among the earliest collectors of Japanese plants. Thanks to their extensive contacts, knotweed found a home in arboretums, botanical gardens and even Central Park. Not content with importing one of the world’s most invasive shrubs, the Hoggs also introduced Americans to the wonders of kudzu, a dense vine that can grow a foot a day.
Impressed by the vigor of kudzu, agriculturalists recommended using these plants to provide animal fodder and prevent soil erosion. In the 1930s, the government was even paying Southern farmers $8 per acre to plant kudzu. Today it is known as the “vine that ate the South,” because of the way it covers huge tracts of land in a green blanket of death. And Japanese knotweed is still spreading, colonizing entire habitats from Mississippi to Alaska, where only the Arctic tundra holds it back from world domination.
Knotweed has also reached Australia, a country that has been ground zero for the worst excesses of invasive species. In the 19th century, the British imported non-native animals such as rabbits, cats, goats, donkeys, pigs, foxes and camels, causing mass extinctions of Australia’s native mammal species. Australians are still paying the price; there are more rabbits in the country today than wombats, more camels than kangaroos.
Yet the lesson wasn’t learned. In the 1930s, scientists in both Australia and the U.S. decided to import the South American cane toad as a form of biowarfare against beetles that eat sugar cane. The experiment failed, and it turned out that the cane toad was poisonous to any predator that ate it. There’s also the matter of the 30,000 eggs it can lay at a time. Today, the cane toad can be found all over northern Australia and south Florida.
So is there anything we can do once an invasive species has taken up residence? The answer is yes, but it requires more than just fences, traps and pesticides; it means changing human incentives. Today, for instance, the voracious Indo-Pacific lionfish is gobbling up local fish in the west Atlantic, while the Asian carp threatens the ecosystem of the Great Lakes. There is only one solution: We must eat them, dear reader. These invasive fish can be grilled, fried or consumed as sashimi, and they taste delicious. Likewise, kudzu makes great salsa, and Japanese knotweed can be treated like rhubarb. Eat for America and save the environment.
As Halloween approaches, decorations featuring scary black cats are starting to make their seasonal appearance. But what did the black cat ever do to deserve its reputation as a symbol of evil? Why is it considered bad luck to have a black cat cross your path?
It wasn’t always this way. In fact, the first human-cat interactions were benign and based on mutual convenience. The invention of agriculture in the Neolithic era led to surpluses of grain, which attracted rodents, which in turn motivated wild cats to hang around humans in the hope of catching dinner. Domestication soon followed: The world’s oldest pet cat was found in a 9,500-year-old grave in Cyprus, buried alongside its human owner.
Yet as the ancient Egyptians realized, even when domesticated, the cat retains its independence. The Egyptians were fascinated by divine opposites and cosmic symmetries, and they saw this kind of duality in the cat—a fierce predator that was also a loyal guardian. Several Egyptian deities were depicted in part-cat, part-human form, including Bastet, who was a goddess of violence as well as fertility. One of her sacred colors was black, which is how the black cat first achieved its special status.
According to the Roman writer Polyaenus, who lived in the second century A.D., the Egyptian veneration of cats led to disaster at the Battle of Pelusium in 525 B.C. The invading Persian army carried cats on the front lines, rightly calculating that the Egyptians would rather accept defeat than kill a cat.
The Egyptians were unique in their extreme veneration of cats, but they weren’t alone in regarding them as having a special connection to the spirit world. In Greek mythology the cat was a familiar of Hecate, goddess of magic, sorcery and witchcraft. Hecate’s pet had once been a serving maid named Galanthis, who was turned into a cat as punishment by the goddess Hera for being rude.
When Christianity became the official religion of Rome in 380, the association of cats with paganism and witchcraft made them suspect. Moreover, the cat’s independence suggested a willful rebellion against the teaching of the Bible, which said that Adam had dominion over all the animals. The cat’s reputation worsened during the medieval era, as the Catholic Church battled against heresies and dissent. Fed lurid tales by his inquisitors, in 1233 Pope Gregory IX issued a papal bull, “Vox in Rama,” which accused heretics of using black cats in their nighttime sex orgies with Lucifer—who was described as half-cat in appearance.
In Europe, countless numbers of cats were killed in the belief that they could be witches in disguise. In 1484, Pope Innocent VIII fanned the flames of anti-cat prejudice with his papal bull on witchcraft, “Summis Desiderantes Affectibus,” which stated that the cat was “the devil’s favorite animal and idol of all witches.”
The Age of Reason ought to have rescued the black cat from its pariah status, but superstitions die hard. (How many modern apartment buildings lack a 13th floor?) Cats had plenty of ardent fans among 19th-century writers, including Charles Dickens and Mark Twain, who wrote, “I simply can’t resist a cat, particularly a purring one.” But Edgar Allan Poe, the master of the gothic tale, felt otherwise: In his 1843 story “The Black Cat,” the spirit of a dead cat drives its killer to madness and destruction.
So pity the poor black cat, which through no fault of its own has gone from being an instrument of the devil to the convenient tool of the horror writer—and a favorite Halloween cliché.
From ancient times until the Renaissance, beer-making was considered a female specialty
These days, every neighborhood bar celebrates Oktoberfest, but the original fall beer festival is the one in Munich, Germany—still the largest of its kind in the world. Oktoberfest was started in 1810 by the Bavarian royal family as a celebration of Crown Prince Ludwig’s marriage to Princess Therese von Sachsen-Hildburghausen. Nowadays, it lasts 16 days and attracts some 6 million tourists, who guzzle almost 2 million gallons of beer.
Yet these staggering numbers conceal the fact that, outside of the developing world, the beer industry is suffering. Beer sales in the U.S. last year accounted for 45.6% of the alcohol market, down from 48.2% in 2010. In Germany, per capita beer consumption has dropped by one-third since 1976. It is a sad decline for a drink that has played a central role in the history of civilization. Brewing beer, like baking bread, is considered by archaeologists to be one of the key markers in the development of agriculture and communal living.
In Sumer, the ancient civilization in modern-day Iraq where the world’s first cities emerged in the 4th millennium B.C., up to 40% of all grain production may have been devoted to beer. It was more than an intoxicating beverage; beer was nutritious and much safer to drink than ordinary water because it was boiled first. The oldest known beer recipe comes from a Sumerian hymn to Ninkasi, the goddess of beer, composed around 1800 B.C. The fact that a female deity oversaw this most precious commodity reflects the importance of women in its production. Beer was brewed in the kitchen and was considered as fundamental a skill for women as cooking and needlework.
The ancient Egyptians similarly regarded beer as essential for survival: Construction workers for the pyramids were usually paid in beer rations. The Greeks and Romans were unusual in preferring wine; blessed with climates that aided viticulture, they looked down on beer-drinking as foreign and unmanly. (There’s no mention of beer in Homer.)
Northern Europeans adopted wine-growing from the Romans, but beer was their first love. The Vikings imagined Valhalla as a place where beer perpetually flowed. Still, beer production remained primarily the work of women. With most occupations in the Middle Ages restricted to members of male-only guilds, widows and spinsters could rely on ale-making to support themselves. Among her many talents as a writer, composer, mystic and natural scientist, the renowned 12th-century Rhineland abbess Hildegard of Bingen was also an expert on the use of hops in beer.
The female domination of beer-making lasted in Europe until the 15th and 16th centuries, when the growth of the market economy helped to transform it into a profitable industry. As professional male brewers took over production and distribution, female brewers lost their respectability. By the 19th century, women were far more likely to be temperance campaigners than beer drinkers.
When Prohibition ended in the U.S. in 1933, brewers struggled to get beer into American homes. Their solution was an ad campaign selling beer to housewives—not to drink it but to cook with it. In recent years, beer ads have rarely bothered to address women at all, which may explain why only a quarter of U.S. beer drinkers are female.
As we’ve seen recently in the Kavanaugh hearings, a male-dominated beer-drinking culture can be unhealthy for everyone. Perhaps it’s time for brewers to forget “the king of beers”—Budweiser’s slogan—and seek their once and future queen.
Writer and historian Amanda Foreman took her family on an epic motorhome adventure. Would it drive them all round the bend?
Imagining yourself behind the wheel of an RV and actually driving one are two completely different things. I discovered this shortly after we hit the road in our 30ft rental for a 1,500-mile odyssey from Denver, Colorado, to Las Vegas, Nevada. As soon as I pressed down on the accelerator, other faculties such as breathing and thinking stopped functioning. My husband, Jonathan, took over, and remained in the driving seat for the rest of the trip, while I quietly buried my pride beneath a pile of flip-flops and rain jackets.
The latest research offers hope, though. The best antidote to the rapacious demands of the internet, experts say, is its reverse — the power of lived experience. And so we decided to put that theory to the test by taking to the road.
Everything was new and exciting at first. Look, a herd of cattle blocking the road! Oh, we blew the electrics! It was when we left the cool mountain ranges of Colorado and descended into Utah, a dip that sent the thermometer climbing into the 30s, that reality took over.
We were heading for Moab and its two national parks and one state park. Some of the best scenery in the totemic road-trip movie Thelma and Louise is here. I kept repeating this fact until Helena reminded me that they’d never seen the film and couldn’t care less. The atmosphere of the RV, already thick with sweat and hormones, was beginning to grow heavy. Knees and elbows were making contact where they shouldn’t.
Our first hike was through Arches National Park, so named after the formations that stud the desert like a thousand oversized Durdle Doors. Our trail took us to Devils Garden, with its giant phallic pillars. Another led to Landscape Arch, the longest natural arch in the world. A peculiar tension emanates from it, as though at any second something will snap and the whole thing will collapse. I recognised the feeling.
Three children claimed to be on the edge of survival by day’s end (heat, thirst, sand in shoe, etc). A night’s camping at Devils Garden did it for the rest.
The evening had started well. We ate under the starriest sky I’d ever seen. The intense blackness around us served to highlight the strange symphony of sounds wafting from the brush. After dinner, we sat in companionable silence, listening to the music of the desert. If we could have stayed that way until morning, the night would have been perfect. But shutting down an RV for the night requires concentration — not the futzing and farting about of a family of camping novices. By the time we had secured the awning, washed and put everything away, cleared the only escape route of detritus, converted the “dinette” into a quasi-bed large enough for a small person, or in our case reluctant twins, turned the banquette into a sofa bed (tricky with the seatbelt fittings running down the middle) and stashed the teenagers into the bunk above the driver’s seat, the mercury had started rising in the worst way.
I lay on my side in the back alcove, feeling the grit between my toes and the sweat between my eyes, waiting for the inevitable blow-up. There was a yell, a thwack, and the RV started shaking on its plastic levellers. My husband and I leapt out of bed and turned on the lights. The children froze, the twins in mid-wrestle. Each returned sulkily to his or her place.
Once peace was restored, Jonathan and I listened for signs of a round two: was that it, or merely the first skirmish in a battle to the death? “It depends,” I said, “on whether we kill them first.”
“Where do we go from here?” Evita sang over the sound system as we headed to our next stop the following day. There were three more destinations to see before we arrived at the Grand Canyon. “This isn’t where we intended to be,” she warbled.
“Don’t listen to her,” my husband told the children, “she’s being a wuss.”
I was beginning to fear the wuss factor would end up trumping the wow factor. Your basic renters’ RV is essentially a tin can on wheels with a lavatory attached. By midweek we had wrecked the place. Only the desperate and foolhardy dared face the loo. Every spare inch of the RV was occupied by wet and drying clothes, creating a steam-room effect without the refreshing scent of eucalyptus.
A gentle rain accompanied our arrival at Bryce Canyon. Despite the name, it’s not a canyon at all, but a series of natural stone amphitheatres created by millions of years of frost-thaw cycles. Each one is filled with densely packed pillars and spires called “hoodoos”, which turn crimson and ginger in the sun. Our goal, as we set off from Ruby’s Campground, was to see the sun set over Bryce’s most famous amphitheatre, the hauntingly named Silent City. There was a whiff of mutiny in the damp air. Nothing obvious, but if I had suggested going back to the RV, no one would have complained.
We trudged through the pines towards Inspiration Point (by God, did we need some) in a long, straggly line. Glimpses of Silent City flashed through breaks in the trees. The hoodoos seemed not alive exactly, but immobile. I looked back and saw that the stragglers had stopped. They, too, were staring at the rocks.
At the Point, we waited in vain for a break in the clouds. Our son suddenly shouted: “It’s a peregrine falcon, I’m sure of it.” His sisters clustered around, taking photographs as the bird swooped and then banked hard into the air. “They do this to impress their mates,” Theo informed us.
Back at camp, we broke out the chocolate Oreos in celebration of Theo’s surprise expertise in all things avian. The rain also cleared in time for us to light the barbecue and go full-on carnivore. There wasn’t a peep from anyone that night.
The next morning, we reached Zion National Park right on schedule, a first for us. This day was a hike through the Virgin River to the Narrows, where the orange-brown walls of the canyon rise 1,000ft, but the gorge is only 30ft across. The water was cold, the rocks slippery, providing ample opportunities for a whinge. None came. I noticed a subtle change as we waded upriver: the children were the ones out in front. That wasn’t all. Over the next few days, blisters that would have been incapacitating at the beginning of the trip were now displayed with pride.
The news that we were camping on the rustic side of the Grand Canyon, at the North Rim, didn’t faze anyone at all. The South Rim, which overlooks the Colorado River, gets 10 times more visitors and has two dozen main viewing points. The North is higher, colder, smaller, with just three principal viewpoints, and is so quiet that you can hear the trees rustling in the early-morning breeze. It means you have one of the great natural wonders of the world all to yourself. We could picnic among the most spectacular scenery without another person in sight.
If there was a wuss among the group, it was me. We had trekked to Cape Royal, where a wooded trail leads to a dramatic plateau that offers one of the widest panoramas of the canyon. The children were naturally drawn to the ledge, carelessly dangling their legs over the side. I was torn between wanting to capture the moment for ever and shouting at them to get back. Jonathan intervened. “It’s all right,” he said. “They are free.”
There wasn’t an ounce of regret when we dropped off the camper in Las Vegas. I doubt we will ever rent an RV again. Yet we ended the week feeling truly content. Without being sappy about it, to experience nature in its untrammelled state is to feel insignificant and uplifted in equal measure.
We came, we saw, and were conquered, happily.
Amanda’s guide to a stress-free US motorhome holiday

Do book your campground berths at least six months in advance. I’m not exaggerating.
Do find out what your RV comes with. Usually, it’s practically nothing.
Do check the depot for “left behinds”. We picked up a barbecue that way, and donated it back at the end.
Do bring physical maps with you — there isn’t much coverage out in the sticks. Praise be.
Do have a games bag with amusements that don’t need charging, such as playing cards, colouring books and chess.
Do take the desert heat seriously. Load up on hats, sunscreen, insulated water bottles and handheld fans.
Do make a list of chores. There should be no passengers on your trip, only crew members.
Don’t overpack.
Don’t attempt more than four hours’ driving a day. It’s not much fun for the people stuck in the back.
Don’t assume that you’ll be allowed to use your generator past 8pm. Plan your meals ahead of time.
Don’t leave out any food overnight. Nature is not your friend. Nature wants to swarm over your leftovers and/or eat you.
Don’t forget to check your water levels every day. Remember how disgusting it is when you flush the loo on a plane and nothing happens when the trap opens? Now multiply that by five.
Don’t expect kids to be as rhapsodic about the scenery as you are.
Hiring a six-berth RV for 10 days in November, picking up in Denver and dropping off in Las Vegas, starts at £830, including one-way fee, with Road Bear (roadbearrv.com). A 10-day fly-drive package from Denver in October costs £1,225pp for a family of six, including RV hire (bon-voyage.co.uk). A World on Fire by Amanda Foreman is published by Penguin
Have you successfully bonded as a family on holiday? Or had a disaster? Tell us and you could win a £250 holiday voucher; see Letters for details. Email email@example.com
Today’s jet passengers may think they have it bad, but delay and discomfort have been a part of journeys since the Mayflower
Fifty years ago, on September 30, 1968, the world’s first 747 Jumbo Jet rolled out of Boeing’s plant in Everett, Washington, near Seattle. It was hailed as the future of commercial air travel, complete with fine dining, live piano music and glamorous stewardesses. And perhaps we might still be living in that future, were it not for the 1978 Airline Deregulation Act signed into law by President Jimmy Carter.
Deregulation was meant to increase the competitiveness of the airlines, while giving passengers more choice about the prices they paid. It succeeded in greatly expanding the accessibility of air travel, but at the price of making it a far less luxurious experience. Today, flying is a matter of “calculated misery,” as Columbia Law School professor Tim Wu put it in a 2014 article in the New Yorker. Airlines deliberately make travel unpleasant in order to force economy passengers to pay extra for things that were once considered standard, like food and blankets.
So it has always been with mass travel, since its beginnings in the 17th century: a test of how much discomfort and delay passengers are willing to endure. For the English Puritans who sailed to America on the Mayflower in 1620, light and ventilation were practically non-existent, the food was terrible and the sanitation primitive. All 102 passengers were crammed into a tiny living area just 80 feet long and 20 feet wide. To cap it all, the Mayflower took 66 days to arrive instead of the usual 47 for a trans-Atlantic crossing and was 600 miles off course from its intended destination of Virginia.
The introduction of the commercial stage coach in 1610, by a Scottish entrepreneur who offered trips between Edinburgh and Leith, made it easier for the middle classes to travel by land. But it was still an expensive and unpleasant experience. Before the invention of macadam roads—which rely on layers of crushed stone to create a flat and durable surface—in Britain in the 1820s, passengers sat cheek by jowl on springless benches, in a coach that trundled along at around five miles per hour.
The new paving technology improved the travel times but not necessarily the overall experience. Charles Dickens had already found fame with his comic stories of coach travel in “The Pickwick Papers” when he and Mrs. Dickens traveled on an American stage coach in Ohio in 1842. They paid to have the coach to themselves, but the journey was still rough: “At one time we were all flung together in a heap at the bottom of the coach.” Dickens chose to go by rail for the next leg of the trip, which wasn’t much better: “There is a great deal of jolting, a great deal of noise, a great deal of wall, not much window.”
Despite its primitive beginnings, 19th-century rail travel evolved to offer something revolutionary to its paying customers: quality service at an affordable price. In 1868, the American inventor George Pullman introduced his new designs for sleeping and dining cars. For a modest extra fee, the distinctive green Pullman cars provided travelers with hotel-like accommodation, forcing rail companies to raise their standards on all sleeper trains.
By contrast, the transatlantic steamship operators pampered their first-class passengers and abused the rest. In 1879, a reporter at the British Pall Mall Gazette sailed Cunard’s New York to Liverpool route in steerage in order to “test [the] truth by actual experience.” He was appalled to find that passengers were treated worse than cattle. No food was provided, “despite the fact that the passage is paid for.” The journalist noted that two steerage passengers “took one look at the place” and paid for an upgrade. I think we all know how they felt.
From ‘cantarella’ to polonium, governments have used toxins to terrorize and kill their enemies
ILLUSTRATION: THOMAS FUCHS
Among the pallbearers at Senator John McCain’s funeral in Washington last weekend was the Russian dissident Vladimir Kara-Murza. Mr. Kara-Murza is a survivor of two poisoning attempts, in 2015 and 2017, which he believes were intended as retaliation for his activism against the Putin regime.
Indeed, Russia is known or suspected to be responsible for several notorious recent poisoning cases, including the attempted murder this past March of Sergei Skripal, a former Russian spy living in Britain, and his daughter Yulia with the nerve agent Novichok. They survived the attack, but several months later a British woman died of Novichok exposure a few miles from where the Skripals lived.
Poison has long been a favorite tool of brutal statecraft: It both terrorizes and kills, and it can be administered without detection. The Arthashastra, an ancient Indian political treatise that out-Machiavels Machiavelli, contains hundreds of recipes for toxins, as well as advice on when and how to use them to eliminate an enemy.
Most royal and imperial courts of the classical world were also awash with poison. Though it is impossible to prove so many centuries later, the long list of putative victims includes Alexander the Great (poisoned wine), Emperor Augustus (poisoned figs) and Emperor Claudius (poisoned mushrooms), as well as dozens of royal heirs, relatives, rivals and politicians. King Mithridates of Pontus, an ancient Hellenistic kingdom, was so paranoid—having survived a poison attempt by his own mother—that he took daily microdoses of every known toxin in order to build up his immunity.
Poisoning reached its next peak during the Italian Renaissance. Every ruling family, from the Medicis to the Viscontis, either fell victim to poison or employed it as a political weapon. The Borgias were even reputed to have their own secret recipe, a variation of arsenic called “cantarella.” Although a large number of their rivals conveniently dropped dead, the Borgias were small fry compared with the Republic of Venice. The records of the Venetian Council of Ten reveal that a secret poison program went on for decades. Remarkably, two victims are known to have survived their assassination attempts: Count Francesco Sforza in 1450 and the Ottoman Sultan Mehmed II in 1477.
In the 20th century, the first country known to have established a targeted poisoning program was Russia under the Bolsheviks. According to Boris Volodarsky, a former Russian agent, Lenin ordered the creation of a poison laboratory called the “Special Room” in 1921. By the Cold War, the one-room lab had evolved into an international factory system staffed by hundreds, possibly thousands, of scientists. Their specialty was untraceable poisons delivered by ingenious weapons—such as a cigarette packet made in 1954 that could fire bullets filled with potassium cyanide.
In 1978, the prizewinning Bulgarian writer Georgi Markov, then working for the BBC in London, was killed by an umbrella tip that shot a pellet containing the poison ricin into his leg. After the international outcry, the Soviet Union toned down its poisoning efforts but didn’t end them. And Putin’s Russia has continued to use similar techniques. In 2006, according to an official British inquiry, Russian secret agents murdered the ex-spy Alexander Litvinenko by slipping polonium into his drink during a meeting at a London hotel. It was the beginning of a new wave of poisonings whose end is not yet in sight.