With one magnificent renovation, Queen Victoria revamped the monarchy

The Sunday Times

A new exhibition reveals how the monarch’s redesign of Buckingham Palace created a home for her family and a focus for the nation, writes its co‑curator, Amanda Foreman.

Did Queen Victoria reign over Britain or did she rule? The difference may seem like splitting hairs, but the two words go to the heart of modern debates about the way society perceives women in power. A sovereign can be chained in a dungeon and still reign, but there’s no mistaking the action implied in the verb “to rule”. The very word has a potency to it that the mealy-mouthed “reign” does not.

The Victorians could never quite resolve in their minds whether Victoria was ruling or reigning over them. To mark her diamond jubilee in 1897, the poet laureate, Alfred Austin, composed Victoria, an embarrassing poem that attempted to have it both ways — praising the monarch for 60 dutiful years on the throne while dismissing her: “But, being a woman only, I can be / Not great, but good . . . Nor in the discords that distract a Realm / Be seen or heard.”

Despite a wealth of new scholarship and biographies about Victoria, most people still find it hard to say what she actually achieved, aside from reigning for a really long time. It’s as though she simply floated through life in her womanly way, pausing only to fall in love, have babies and reportedly say things such as “We are not amused”. Her personal accomplishments are diminished, ascribed to Prince Albert’s genius or ignored.

I have co-curated, with Lucy Peter of the Royal Collection Trust, an exhibition for this year’s Buckingham Palace summer opening. It is an attempt to redress the balance. Queen Victoria’s Palace argues that Victoria’s building programme at Buckingham Palace helped to redefine the monarchy for the modern age.

The new design enabled a more open, welcoming and inclusive relationship to develop between the royal family and the public.

The house Victoria inherited in 1837 was nothing like the building we know today. The Queen’s House, as Buckingham Palace was then known, was a mishmash of rooms and styles from three reigns.

The entertaining rooms and public spaces were too small, the kitchens dilapidated, the private apartments inadequate and the plumbing and heating barely functional.

Victoria, and then Albert after their marriage, put up with its failings until there was no room for their growing family.

It’s certainly true that Albert was more involved than Victoria in the decoration of the interior. But it was Victoria’s conception of female power that dictated the palace’s final form. Kingship, as Austin’s jubilee poem helpfully pointed out, is expressed by such manly virtues as strength, glory and military might, none of which Victoria could claim without appearing to betray her feminine identity.

Instead she made queenship a reflection of her own moral values, placing the emphasis on family, duty, patriotism and public service. These four “female” virtues formed the pillars not only of her reign but of every one that followed.

Today it would be impossible to conceive of the monarchy in any other way. It is one of the very few instances where gendered power has worked in favour of women.

The Buckingham Palace that emerged from its scaffolding in 1855 was a triumph. The additions included a nursery for the children, a large balcony on the east front, state rooms for diplomatic visits and a ballroom that was large enough to accommodate 2,000 guests.

For the next six years the palace was the epicentre of the monarchy. The death of Albert on December 14, 1861, brought an abrupt end to its glory.

Incapacitated by grief, Victoria hid herself away, much to the consternation of her family and subjects.

The story of Victoria’s eventual return to public life is reflected in the slow but sure rejuvenation of the palace. There were some things that she could never bear to do there, because they intruded too much on personal memories: she never again attended a concert at the palace or played host to a visiting head of state or gave a ball like the magnificent ones of old.

But Victoria developed other ways of opening the palace to the wider world. One of the most visible was the summer garden party, a tradition that now brings 30,000 people to the palace every year.

She also allowed Prince George — later George V — and Princess Mary to appear on the balcony after their wedding, cementing a tradition now watched by hundreds of millions. The palace balcony appearance has become so ingrained in the national consciousness that each occasion receives the most intense scrutiny.

At last month’s Trooping the Colour, lip-readers were brought in by media outlets to decipher the exchange between the Duke and Duchess of Sussex. (It’s believed Prince Harry told Meghan to turn around.)

By the end of Victoria’s life, the monarchy’s popularity was greater than ever. Buckingham Palace was also back in the people’s affections, having a starring role in Victoria’s diamond jubilee in 1897 as the physical and emotional focus for the London celebrations.

After her death in 1901, much of Victoria and Albert’s taste was swept away in the name of modernity, including the east front, which was refaced by George V. The Buckingham Palace of the 21st century looks quite different from the one she built. But its purpose is the same.

The palace still functions as a private home. It is still the administrative headquarters of the monarchy. And, perhaps most important of all, it is still the place where the nation gathers to celebrate and be celebrated.

This is her legacy and the proof, if such is needed, that Victoria reigned, ruled and did much else besides.

Queen Victoria’s Palace is at Buckingham Palace until September 29

Feminist queen: show explores how Victoria transformed monarchy

Caroline Davies, The Guardian
Wed 17 Jul 2019

Queen Victoria began some of our best-known traditions, says Buckingham Palace exhibition curator Dr. Amanda Foreman

Queen Victoria was responsible for a “feminist transformation” of the monarchy and initiated some of its best-known traditions, according to the curator of a new exhibition at Buckingham Palace.
The story of how Victoria and Prince Albert rebuilt the palace into the most glittering court in Europe is explored through paintings, sketches and costumes, and includes a Hollywood-produced immersive experience that brings to life the balls for which she was famous.
Visiting the exhibition, Victoria’s great-great-granddaughter, the Queen, was “totally engrossed” as she watched virtual-reality dancers recreate a quadrille, a dance that was fashionable at 19th-century balls. “Thank God we don’t have to do that any more,” said the Queen.
Quadrilles, in which four couples dance together, may no longer be performed but many of Victoria’s innovations remain. She created the balcony, and bequeathed balcony appearances and garden parties to a nation. “It is now unimaginable you would have a national celebration without this balcony, so embedded is it in the nation’s consciousness,” said Dr Amanda Foreman, the historian and co-curator of the exhibition, Queen Victoria’s Palace.
Queen Victoria’s maternal role is highlighted in the sketches she made of her nine children, as well as an ornate casket containing their milk teeth and marble sculptures she had made of their tiny arms and feet.
The centrepiece of the exhibition, which marks the 200th anniversary of Victoria’s birth, is a recreation of the grand ballroom which she had built. She believed the picture gallery was too small for lavish entertainment, noting in her journal how the dresses got squashed and ruined during an attempt at a quadrille.
Digital technology by a Hollywood-based production company recreates the ballroom as it looked during a ball in 1856, with images of the wall furnishings and paintings, as shown in contemporary watercolours, projected on to its walls.
A quadrille is recreated through a hologram effect, using actors in replicas of the costumes featured in the watercolour. The technology was inspired by the Victorian illusionist trick known as Pepper’s Ghost, which used angled glass to reflect images on to the Victorian stage.
“Queen Victoria transformed Buckingham Palace, the fabric of this building, and in so doing created new traditions, those traditions which we now associate with the modern monarchy,” said Foreman.

“It is significant that it was a woman who was responsible for these traditions and a woman who defined our nation’s understanding and concept of sovereign power, how it’s experienced, how it’s expressed.
“It’s very much a feminist transformation, although Queen Victoria herself would not have used those words, and those words would not have meant to the Victorians what they mean to us today.
“We tend to diminish the contribution of women in particular. We assign their success to the men around them. We tend to simply forget who was responsible for certain things. So by putting on this exhibition, we are stripping away those layers of oblivion, forgetfulness, discounting, and allowing Queen Victoria the space to shine.”
Victoria turned the once-unloved palace into a home fit for state, public and private events. But for 10 years after her beloved Albert’s death, she rarely set foot in it, describing it in her journals as “one of my saddest of sad houses”.

The Immortal Charm of Daffodils

The humble flower has been a favorite symbol in myth and art since ancient times

ILLUSTRATION: THOMAS FUCHS

On April 15, 1802, the poet William Wordsworth and his sister Dorothy were enjoying a spring walk through the hills and vales of the English Lake District when they came across a field of daffodils. Dorothy was so moved that she recorded the event in her journal, noting how the flowers “tossed and reeled and danced and seemed as if they verily laughed with the wind that blew upon them over the Lake.” And William decided there was nothing for it but to write a poem, which he published in its final version in 1815. “I Wandered Lonely as a Cloud” is one of his most famous reflections on the power of nature: “For oft, when on my couch I lie/In vacant or in pensive mood,/They flash upon that inward eye/Which is the bliss of solitude;/And then my heart with pleasure fills,/And dances with the daffodils.”

Long dismissed as a common field flower, unworthy of serious attention by the artist, poet or gardener, the daffodil enjoyed a revival thanks in part to Wordsworth’s poem. The painters Claude Monet, Berthe Morisot and Vincent van Gogh were among its 19th-century champions. Today, the daffodil is so ubiquitous, in gardens and in art, that it’s easy to overlook.

But the flower deserves respect for being a survivor. Every part of the narcissus, to use its scientific name, is toxic to humans, animals and even other flowers, and yet—as many cultures have noted—it seems immortal. There are still swaths of daffodils on the lakeside meadow where the Wordsworths ambled two centuries ago.

The daffodil originated in the ancient Mediterranean, where it was regarded with deep ambivalence. The ancient Egyptians associated narcissi with the idea of death and resurrection, using them in tomb paintings. The Greeks also gave the flower contrary mythological meanings. Its scientific name comes from the story of Narcissus, a handsome youth who faded away after being cursed into falling in love with his own image. At the last moment, the gods saved him from death by granting him a lifeless immortality as a daffodil. In another Greek myth, the daffodil’s luminous beauty was used by Hades to lure Persephone away from her friends so that he could abduct her into the underworld. During her four-month captivity the only flower she saw was the asphodelus, which grew in abundance on the fields of Elysium—and whose name inspired the English derivative “daffodil.”

But it isn’t only Mediterranean cultures that have fixated on the daffodil’s mysterious alchemy of life and death. A fragrant variety of the narcissus—the sweet-smelling paper white—traveled along the Silk Road to China. There, too, the flower appeared to encapsulate the happy promise of spring, but also more painful emotions such as loss and yearning. The famous Ming Dynasty scroll painting “Narcissi and Plum Blossoms” by Qiu Ying (ca. 1494-1552), for instance, is a study in contrasts, juxtaposing exquisitely rendered flowers with the empty desolation of winter.

The English botanist John Parkinson introduced the traditional yellow variety from Spain in 1618. Aided by a soggy but temperate climate, daffodils quickly spread across lawns and fields, causing their foreign origins to be forgotten. By the 19th century they had become quintessentially British—so much so that missionaries and traders, nostalgic for home, planted bucketfuls of bulbs wherever they went. Their legacy in North America is a burst of color each year just when the browns and grays of winter have worn out their welcome.

The Invention of Ice Hockey

Canada gave us the modern form of a sport that has been played for centuries around the world

ILLUSTRATION: THOMAS FUCHS

Canadians like to say—and print on mugs and T-shirts—that “Canada is Hockey.” No fewer than five Canadian cities and towns claim to be the birthplace of ice hockey, including Windsor, Nova Scotia, which has an entire museum dedicated to the sport. Canada’s annual Hockey Day, which falls on February 9 this year, features a TV marathon of hockey games. Such is the country’s love for the game that last year’s broadcast was watched by more than 1 in 4 Canadians.

But as with many of humanity’s great advances, no single country or person can take the credit for inventing ice hockey. Stick-and-ball games are as old as civilization itself. The ancient Egyptians were playing a form of field hockey as early as the 21st century B.C., if a mural on a tomb at Beni Hasan, a Middle Kingdom burial site about 120 miles south of Cairo, is anything to go by. The ancient Greeks also played a version of the game, as did the early Christian Ethiopians, the Mesoamerican Teotihuacanos in the Valley of Mexico, and the Daur tribes of Inner Mongolia. And the Scottish and Irish versions of field hockey, known as shinty and hurling respectively, have strong similarities with the modern game.

Taking a ball and stick onto the ice was therefore a fairly obvious innovation, at least in places with snowy winters. The figures may be tiny, but three villagers playing an ice hockey-type game can be seen in the background of Pieter Bruegel the Elder’s 1565 painting “Hunters in the Snow.” There is no such pictorial evidence to show when the Mi’kmaq Indians of Nova Scotia first started hitting a ball on ice, but linguistic clues suggest that their hockey tradition existed before the arrival of European traders in the 16th century. The two cultures then proceeded to influence each other, with the Mi’kmaq becoming the foremost makers of hockey sticks in the 19th century.

The earliest known use of the word hockey appears in a book, “Juvenile Sports and Pastimes,” written by Richard Johnson in London in 1776. Recently, Charles Darwin became an unlikely contributor to ice hockey history after researchers found a letter in which he reminisced about playing the game as a boy in the 1820s: “I used to be very fond of playing Hocky [sic] on the ice in skates.” On January 8, 1864, the future King Edward VII played ice hockey at Windsor Castle while awaiting the birth of his first child.

As for Canada, apart from really liking the game, what has been its real contribution to ice hockey? The answer is that it created the game we know today, from the official rulebook to the size and shape of the rink to the establishment of the Stanley Cup championship in 1894. The first indoor ice hockey game was played in Montreal in 1875, thereby solving the perennial problem of pucks getting lost. (The rink was natural ice, with Canada’s cold winter supplying the refrigeration.) The game involved two teams of nine players, each with a set position—three more than teams field today—a wooden puck, and a list of rules for fouls and scoring.

In addition to being the first properly organized game, the Montreal match also initiated ice hockey’s other famous tradition: brawling on the ice. In this case, the fighting erupted between the players, spectators and skaters who wanted the ice rink back for free skating. Go Canada!

Unenforceable Laws Against Pleasure

The 100th anniversary of Prohibition is a reminder of how hard it is to regulate consumption and display

ILLUSTRATION: THOMAS FUCHS

This month we mark the centennial of the ratification of the Constitution’s 18th Amendment, better known as Prohibition. But the temperance movement was active for over a half-century before winning its great prize. As the novelist Anthony Trollope discovered to his regret while touring North America in 1861-2, Maine had been dry for a decade. The convivial Englishman condemned the ban: “This law, like all sumptuary laws, must fail,” he wrote.

Sumptuary laws had largely fallen into disuse by the 19th century, but they were once a near-universal tool, used in the East and West alike to control economies and preserve social hierarchies. A sumptuary law is a rule that regulates consumption in its broadest sense, from what a person may eat and drink to what they may own, wear or display. The oldest known example, the Locrian Law Code devised by the seventh-century B.C. Greek lawgiver Zaleucus, banned all citizens of Locri (except prostitutes) from ostentatious displays of gold jewelry.

Sumptuary laws were often political weapons disguised as moral pieties, aimed at less powerful groups, particularly women. In 215 B.C., at the height of the Second Punic War, the Roman Senate passed the Lex Oppia, which (among other restrictions) banned women from owning more than a half ounce of gold. Ostensibly a wartime austerity measure, the law appeared so ridiculous 20 years later as to be unenforceable. But during debate on its repeal in 195 B.C., Cato the Elder, its strongest defender, inadvertently revealed the Lex Oppia’s true purpose: “What [these women] want is complete freedom…. Once they have achieved equality, they will be your masters.”

Cato’s message about preserving social hierarchy echoed down the centuries. As trade and economic stability returned to Europe during the High Middle Ages (1000-1300), so did the use of sumptuary laws to keep the new merchant elites in their place. By the 16th century, sumptuary laws in Europe had extended from clothing to almost every aspect of daily life. The more they were circumvented, the more specific such laws became. An edict issued by King Henry VIII of England in 1517, for example, dictated the maximum number of dishes allowed at a meal: nine for a cardinal, seven for the aristocracy and three for the gentry.

The rise of modern capitalism ultimately made sumptuary laws obsolete. Trade turned once-scarce luxuries into mass commodities that simply couldn’t be controlled. Adam Smith’s “The Wealth of Nations” (1776) confirmed what had been obvious for over a century: Consumption and liberty go hand in hand. “It is the highest impertinence,” he wrote, “to pretend to watch over the economy of private people…either by sumptuary laws, or by prohibiting the importation of foreign luxuries.”

Smith’s pragmatic view was echoed by President William Howard Taft. He opposed Prohibition on the grounds that it was coercive rather than consensual, arguing that “experience has shown that a law of this kind, sumptuary in its character, can only be properly enforced in districts in which a majority of the people favor the law.” Mass immigration in early 20th-century America had changed many cities into ethnic melting-pots. Taft recognized Prohibition as an attempt by nativists to impose cultural uniformity on immigrant communities whose attitudes toward alcohol were more permissive. But his warning was ignored, and the disastrous course of Prohibition was set.

New Year, Old Regrets

From the ancient Babylonians to Victorian England, the year’s end has been a time for self-reproach and general misery

For the Wall Street Journal

ILLUSTRATION: ALAN WITSCHONKE

I don’t look forward to New Year’s Eve. When the bells start to ring, it isn’t “Auld Lang Syne” I hear but echoes from the Anglican “Book of Common Prayer”: “We have left undone those things which we ought to have done; And we have done those things which we ought not to have done.”

At least I’m not alone in my annual dip into the waters of woe. Experiencing the sharp sting of regret around the New Year has a long pedigree. The ancient Babylonians required their kings to offer a ritual apology during the Akitu festival of New Year: The king would go down on his knees before an image of the god Marduk, beg his forgiveness, insist that he hadn’t sinned against the god himself and promise to do better next year. The rite ended with the high priest giving the royal cheek the hardest possible slap.

There are sufficient similarities between the Akitu festival and Yom Kippur, Judaism’s Day of Atonement—which takes place 10 days after the Jewish New Year—to suggest that there was likely a historical link between them. Yom Kippur, however, is about accepting responsibility, with the emphasis on owning up to sins committed rather than pointing out those omitted.

In Europe, the 14th-century Middle English poem “Sir Gawain and the Green Knight” begins its strange tale on New Year’s Day. A green-skinned knight arrives at King Arthur’s Camelot and challenges the knights to strike at him, on the condition that he can return the blow in a year and a day. Sir Gawain reluctantly accepts the challenge, and embarks on a year filled with adventures. Although he ultimately survives his encounter with the Green Knight, Gawain ends up haunted by his moral lapses over the previous 12 months. For, he laments (in J.R.R. Tolkien’s elegant translation), “a man may cover his blemish, but unbind it he cannot.”

New Year’s Eve in Shakespeare’s era was regarded as a day for gift-giving rather than as a catalyst for regret. But Sonnet 30 shows that Shakespeare was no stranger to the melancholy that looking back can inspire: “I summon up remembrance of things past, / I sigh the lack of many a thing I sought, / And with old woes new wail my dear time’s waste.”

For a full dose of New Year’s misery, however, nothing beats the Victorians. “I wait its close, I court its gloom,” declared the poet Walter Savage Landor in “Mild Is the Parting Year.” Not to be outdone, William Wordsworth offered his “Lament of Mary Queen of Scots on the Eve of a New Year”: “Pondering that Time tonight will pass / The threshold of another year; /…My very moments are too full / Of hopelessness and fear.”

Fortunately, there is always Charles Dickens. In 1844, Dickens followed up the wildly successful “A Christmas Carol” with a slightly darker but still uplifting seasonal tale, “The Chimes.” Trotty Veck, an elderly messenger, takes stock of his life on New Year’s Eve and decides that he has been nothing but a burden on society. He resolves to kill himself, but the spirits of the church bells intervene, showing him a vision of what would happen to the people he loves.

Today, most Americans recognize this story as the basis of the bittersweet 1946 Frank Capra film “It’s a Wonderful Life.” As an antidote to New Year’s blues, George Bailey’s lesson holds true for everyone: “No man is a failure who has friends.”

Trees of Life and Wonder

From Saturnalia to Christmas Eve, people have always had a spiritual need for greenery in the depths of winter

Queen Victoria and family with their Christmas tree in 1848. PHOTO: GETTY IMAGES

My family never had a pink-frosted Christmas tree, though Lord knows my 10-year-old self really wanted one. Every year my family went to Sal’s Christmas Emporium on Wilshire Boulevard in Los Angeles, where you could buy neon-colored trees, mechanical trees that played Christmas carols, blue and white Hanukkah bushes or even a real Douglas fir if you wanted to go retro. We were solidly retro.

Decorating the Christmas tree remains one of my most treasured memories, and according to the National Christmas Tree Association, the tradition is still thriving in our digital age: In 2017 Americans bought 48.5 million real and artificial Christmas trees. Clearly, bringing a tree into the house, especially during winter, taps into something deeply spiritual in the human psyche.

Nearly every society has at some point venerated the tree as a symbol of fertility and rebirth, or as a living link between the heavens, the earth and the underworld. In the ancient Near East, “tree of life” motifs appear on pottery as early as 7000 B.C. By the second millennium B.C., variations of the motif were being carved onto temple walls in Egypt and fashioned into bronze sculptures in southern China.

The early Christian fathers were troubled by the possibility that the faithful might identify the Garden of Eden’s trees of life and knowledge, described in the Book of Genesis, with paganism’s divine trees and sacred groves. Accordingly, in 572 the Council of Braga banned Christians from participating in the Roman celebration of Saturnalia—a popular winter solstice festival in honor of Saturn, the god of agriculture, that included decking the home with boughs of holly, his sacred symbol.

It wasn’t until the late Middle Ages that evergreens received a qualified welcome from the Church, as props in the mystery plays that told the story of Creation. In Germany, mystery plays were performed on Christmas Eve, traditionally celebrated in the church calendar as the feast day of Adam and Eve. The original baubles that hung on these “paradise trees,” representing the trees in the Garden of Eden, were round wafer breads that symbolized the Eucharist.

The Christmas tree remained a northern European tradition until Queen Charlotte, the German-born wife of George III, had one erected for a children’s party at Windsor Castle in 1800. The British upper classes quickly followed suit, but the rest of the country remained aloof until 1848, when the Illustrated London News published a charming picture of Queen Victoria and her family gathered around a large Christmas tree. Suddenly, every household had to have one for the children to decorate. It didn’t take long for President Franklin Pierce to introduce the first Christmas tree to the White House, in 1853—a practice that every President has honored except Theodore Roosevelt, who in 1902 refused to have a tree on conservationist grounds. (His children objected so much to the ban that he eventually gave in.)

Many writers have tried to capture the complex feelings that Christmas trees inspire, particularly in children. Few, though, can rival T.S. Eliot’s timeless meditation on joy, death and life everlasting, in his 1954 poem “The Cultivation of Christmas Trees”: “The child wonders at the Christmas Tree: / Let him continue in the spirit of wonder / At the Feast as an event not accepted as a pretext; / So that the glittering rapture, the amazement / Of the first-remembered Christmas Tree /…May not be forgotten.”

The Tradition of Telling All

From ancient Greece to modern Washington, political memoirs have been an irresistible source of gossip about great leaders

ILLUSTRATION: THOMAS FUCHS

The tell-all memoir has been a feature of American politics ever since Raymond Moley, an ex-aide to Franklin Delano Roosevelt, published his excoriating book “After Seven Years” while FDR was still in office. What makes the Trump administration unusual is the speed at which such accounts are appearing—most recently, “Unhinged,” by Omarosa Manigault Newman, a former political aide to the president.

Spilling the beans on one’s boss may be disloyal, but it has a long pedigree. Alexander the Great is thought to have inspired the genre. His great run of military victories, beginning with the Battle of Chaeronea in 338 B.C., was so unprecedented that several of his generals felt the urge—unknown in Greek literature before then—to record their experiences for posterity.

Unfortunately, their accounts didn’t survive, save for the memoir of Ptolemy Soter, the founder of the Ptolemaic dynasty in Egypt, which exists in fragments. The great majority of Roman political memoirs have also disappeared—many by official suppression. Historians particularly regret the loss of the memoirs of Agrippina, the mother of Emperor Nero, who once boasted that she could bring down the entire imperial family with her revelations.

The Heian period (794-1185) in Japan produced four notable court memoirs, all by noblewomen. Dissatisfaction with their lot was a major factor behind these accounts—particularly for the anonymous author of “The Gossamer Years,” written around 974. The author was married to Fujiwara no Kane’ie, the regent for the Emperor Ichijo. Her exalted position at court masked a deeply unhappy private life; she was made miserable by her husband’s serial philandering, describing herself as “rich only in loneliness and sorrow.”

In Europe, the first modern political memoir was written by the Duc de Saint-Simon (1675-1755), a frustrated courtier at Versailles who took revenge on Louis XIV with his pen. Saint-Simon’s tales hilariously reveal the drama, gossip and intrigue that surrounded a king whose intellect, in his view, was “beneath mediocrity.”

But even Saint-Simon’s memoirs pale next to those of the Korean noblewoman Lady Hyegyeong (1735-1816), wife of Crown Prince Sado of the Joseon Dynasty. Her book, “Memoirs Written in Silence,” tells shocking tales of murder and madness at the heart of the Korean court. Sado, she writes, was a homicidal psychopath who went on a bloody killing spree that was stopped only by the intervention of his father, King Yeongjo. Unwilling to see his son publicly executed, Yeongjo had the prince locked inside a rice chest and left to die. Understandably, Hyegyeong’s memoirs caused a huge sensation in Korea when they were first published in 1939, following the death of the last Emperor in 1926.

Fortunately, the Washington political memoir has been free of this kind of violence. Still, it isn’t just Roman emperors who have tried to silence uncomfortable voices. According to the historian Michael Beschloss, President John F. Kennedy had the White House household staff sign agreements to refrain from writing any memoirs. But eventually, of course, even Kennedy’s secrets came out. Perhaps every political leader should be given a plaque that reads: “Just remember, your underlings will have the last word.”

How Potatoes Conquered the World

It took centuries for the spud to travel from the New World to the Old and back again

For the Wall Street Journal

At the first Thanksgiving dinner, eaten by the Wampanoag Indians and the Pilgrims in 1621, the menu was rather different from what’s served today. For one thing, the pumpkin was roasted, not made into a pie. And there definitely wasn’t a side dish of mashed potatoes.

In fact, the first hundred Thanksgivings were spud-free, since potatoes weren’t grown in North America until 1719, when Scotch-Irish settlers began planting them in New Hampshire. Mashed potatoes were an even later invention. The first recorded recipe for the dish appeared in 1747, in Hannah Glasse’s splendidly titled “The Art of Cookery Made Plain and Easy, Which Far Exceeds Any Thing of the Kind yet Published.”

By then, the potato had been known in Europe for a full two centuries. It was first introduced by the Spanish conquerors of Peru, where the Incas had revered the potato and even invented a natural way of freeze-drying it for storage. Yet despite its nutritional value and ease of cultivation, the potato didn’t catch on in Europe. It wasn’t merely foreign and ugly-looking; to wheat-growing farmers it seemed unnatural—possibly even un-Christian, since there is no mention of the potato in the Bible. Outside of Spain, it was generally grown for animal feed.

The change in the potato’s fortunes was largely due to the efforts of a Frenchman named Antoine-Augustin Parmentier (1737-1813). During the Seven Years’ War, he was taken prisoner by the Prussians and forced to live on a diet of potatoes. To his surprise, he stayed relatively healthy. Convinced he had found a solution to famine, Parmentier dedicated his life after the war to popularizing the potato’s nutritional benefits. He even persuaded Marie-Antoinette to wear potato flowers in her hair.

Among the converts to his message were the economist Adam Smith, who realized the potato’s economic potential as a staple food for workers, and Thomas Jefferson, then the U.S. Ambassador to France, who was keen for his new nation to eat well in all senses of the word. Jefferson is credited with introducing Americans to french fries at a White House dinner in 1802.

As Smith predicted, the potato became the fuel for the Industrial Revolution. A study published in 2011 by Nathan Nunn and Nancy Qian in the Quarterly Journal of Economics estimates that up to a quarter of the world’s population growth from 1700 to 1900 can be attributed solely to the introduction of the potato. As Louisa May Alcott observed in “Little Men,” in 1871, “Money is the root of all evil, and yet it is such a useful root that we cannot get on without it any more than we can without potatoes.”

In 1887, two Americans, Jacob Fitzgerald and William H. Silver, patented the first potato ricer, which forced a cooked potato through a cast iron sieve, ending the scourge of lumpy mash. Still, the holy grail of “quick and easy” mashed potatoes remained elusive until the late 1950s. Using the flakes produced by the potato ricer and a new freeze-drying method, U.S. government scientists perfected instant mashed potatoes, which require only the simple step of adding hot water or milk to the mix. The days of peeling, boiling and mashing were now optional, and for millions of cooks, Thanksgiving became a little easier. And that’s something to be thankful for.


Overrun by Alien Species

From Japanese knotweed to cane toads, humans have introduced invasive species to new environments with disastrous results

Ever since Neolithic people wandered the earth, inadvertently bringing the mouse along for the ride, humans have been responsible for introducing animal and plant species into new environments. But problems can arise when a non-native species encounters no barriers to population growth, allowing it to rampage unchecked through the new habitat, overwhelming the ecosystem. On more than one occasion, humans have transplanted a species for what seemed like good reasons, only to find out too late that the consequences were disastrous.

One of the most famous examples is celebrating its 150th anniversary this year: the introduction of Japanese knotweed to the U.S. A highly aggressive plant, it can grow 15 feet high and has roots that spread up to 45 feet. Knotweed had already been a hit in Europe because of its pretty little white flowers, and, yes, its miraculous indestructibility.

First mentioned in botanical articles in 1868, knotweed was brought to New York by the Hogg brothers, James and Thomas, eminent American horticulturalists and among the earliest collectors of Japanese plants. Thanks to their extensive contacts, knotweed found a home in arboretums, botanical gardens and even Central Park. Not content with importing one of the world’s most invasive shrubs, the Hoggs also introduced Americans to the wonders of kudzu, a dense vine that can grow a foot a day.

Impressed by the vigor of kudzu, agriculturalists recommended using these plants to provide animal fodder and prevent soil erosion. In the 1930s, the government was even paying Southern farmers $8 per acre to plant kudzu. Today it is known as the “vine that ate the South,” because of the way it covers huge tracts of land in a green blanket of death. And Japanese knotweed is still spreading, colonizing entire habitats from Mississippi to Alaska, where only the Arctic tundra holds it back from world domination.

Knotweed has also reached Australia, a country that has been ground zero for the worst excesses of invasive species. In the 19th century, the British imported non-native animals such as rabbits, cats, goats, donkeys, pigs, foxes and camels, causing mass extinctions of Australia’s native mammal species. Australians are still paying the price; there are more rabbits in the country today than wombats, more camels than kangaroos.

Yet the lesson wasn’t learned. In the 1930s, scientists in both Australia and the U.S. decided to import the South American cane toad as a form of biowarfare against beetles that eat sugar cane. The experiment failed, and it turned out that the cane toad was poisonous to any predator that ate it. There’s also the matter of the 30,000 eggs it can lay at a time. Today, the cane toad can be found all over northern Australia and south Florida.

So is there anything we can do once an invasive species has taken up residence? The answer is yes, but it requires more than just fences, traps and pesticides; it means changing human incentives. Today, for instance, the voracious Indo-Pacific lionfish is gobbling up local fish in the west Atlantic, while the Asian carp threatens the ecosystem of the Great Lakes. There is only one solution: We must eat them, dear reader. These invasive fish can be grilled, fried or consumed as sashimi, and they taste delicious. Likewise, kudzu makes great salsa, and Japanese knotweed can be treated like rhubarb. Eat for America and save the environment.