Harper’s Bazaar: Buckingham Palace is opening a new exhibition exploring the life of Queen Victoria this summer

This year marks the 200th anniversary of the birth of Queen Victoria, and to celebrate, Buckingham Palace has announced a special exhibition, co-curated by Dr. Amanda Foreman, as part of its Summer Opening of the State Rooms.

Harper’s Bazaar, May 7, 2019

by Katie Frost

A portrait of Queen Victoria by Thomas Sully

The exhibit will explore the life of the monarch and how she turned the once unloved palace into the royal residence we know today. Highlights will include a portrait of the young queen painted by Thomas Sully soon after she moved into her new home, along with Victoria’s personal insignia, the Star and Collar of the Order of the Bath.

Victoria moved into the palace in 1837 when she was just 18. It had been empty for seven years following the death of her uncle, George IV. After Victoria married Prince Albert and started a family, she wrote a letter to the Prime Minister, Sir Robert Peel, about her plans to revamp her family home. In it, she spoke of “the urgent necessity of doing something to Buckingham Palace” and “the total want of accommodation for our growing little family”, according to the Royal Collection Trust.

Historically Speaking: Overrun by Alien Species

From Japanese knotweed to cane toads, humans have introduced invasive species to new environments with disastrous results

The Wall Street Journal, November 1, 2018

Ever since Neolithic people wandered the earth, inadvertently bringing the mouse along for the ride, humans have been responsible for introducing animal and plant species into new environments. But problems can arise when a non-native species encounters no barriers to population growth, allowing it to rampage unchecked through the new habitat, overwhelming the ecosystem. On more than one occasion, humans have transplanted a species for what seemed like good reasons, only to find out too late that the consequences were disastrous.

One of the most famous examples is celebrating its 150th anniversary this year: the introduction of Japanese knotweed to the U.S. A highly aggressive plant, it can grow 15 feet high and has roots that spread up to 45 feet. Knotweed had already been a hit in Europe because of its pretty little white flowers, and, yes, its miraculous indestructibility.

First mentioned in botanical articles in 1868, knotweed was brought to New York by the Hogg brothers, James and Thomas, eminent American horticulturalists and among the earliest collectors of Japanese plants. Thanks to their extensive contacts, knotweed found a home in arboretums, botanical gardens and even Central Park. Not content with importing one of the world’s most invasive shrubs, the Hoggs also introduced Americans to the wonders of kudzu, a dense vine that can grow a foot a day.

Impressed by the vigor of kudzu, agriculturalists recommended using these plants to provide animal fodder and prevent soil erosion. In the 1930s, the government was even paying Southern farmers $8 per acre to plant kudzu. Today it is known as the “vine that ate the South,” because of the way it covers huge tracts of land in a green blanket of death. And Japanese knotweed is still spreading, colonizing entire habitats from Mississippi to Alaska, where only the Arctic tundra holds it back from world domination.

Knotweed has also reached Australia, a country that has been ground zero for the worst excesses of invasive species. In the 19th century, the British imported non-native animals such as rabbits, cats, goats, donkeys, pigs, foxes and camels, causing mass extinctions of Australia’s native mammal species. Australians are still paying the price; there are more rabbits in the country today than wombats, more camels than kangaroos.

Yet the lesson wasn’t learned. In the 1930s, scientists in both Australia and the U.S. decided to import the South American cane toad as a form of biowarfare against beetles that eat sugar cane. The experiment failed, and it turned out that the cane toad was poisonous to any predator that ate it. There’s also the matter of the 30,000 eggs it can lay at a time. Today, the cane toad can be found all over northern Australia and south Florida.

So is there anything we can do once an invasive species has taken up residence? The answer is yes, but it requires more than just fences, traps and pesticides; it means changing human incentives. Today, for instance, the voracious Indo-Pacific lionfish is gobbling up local fish in the west Atlantic, while the Asian carp threatens the ecosystem of the Great Lakes. There is only one solution: We must eat them, dear reader. These invasive fish can be grilled, fried or consumed as sashimi, and they taste delicious. Likewise, kudzu makes great salsa, and Japanese knotweed can be treated like rhubarb. Eat for America and save the environment.

Historically Speaking: Poison and Politics

From ‘cantarella’ to polonium, governments have used toxins to terrorize and kill their enemies

The Wall Street Journal, September 7, 2018

ILLUSTRATION: THOMAS FUCHS

Among the pallbearers at Senator John McCain’s funeral in Washington last weekend was the Russian dissident Vladimir Kara-Murza. Mr. Kara-Murza is a survivor of two poisoning attempts, in 2015 and 2017, which he believes were intended as retaliation for his activism against the Putin regime.

Indeed, Russia is known or suspected to be responsible for several notorious recent poisoning cases, including the attempted murder this past March of Sergei Skripal, a former Russian spy living in Britain, and his daughter Yulia with the nerve agent Novichok. They survived the attack, but several months later a British woman died of Novichok exposure a few miles from where the Skripals lived.

Poison has long been a favorite tool of brutal statecraft: It both terrorizes and kills, and it can be administered without detection. The Arthashastra, an ancient Indian political treatise that out-Machiavels Machiavelli, contains hundreds of recipes for toxins, as well as advice on when and how to use them to eliminate an enemy.

Most royal and imperial courts of the classical world were also awash with poison. Though it is impossible to prove so many centuries later, the long list of putative victims includes Alexander the Great (poisoned wine), Emperor Augustus (poisoned figs) and Emperor Claudius (poisoned mushrooms), as well as dozens of royal heirs, relatives, rivals and politicians. King Mithridates of Pontus, an ancient Hellenistic kingdom, was so paranoid—having survived a poison attempt by his own mother—that he took daily microdoses of every known toxin in order to build up his immunity.

Poisoning reached its next peak during the Italian Renaissance. Every ruling family, from the Medicis to the Viscontis, either fell victim to poison or employed it as a political weapon. The Borgias were even reputed to have their own secret recipe, a variation of arsenic called “cantarella.” Although a large number of their rivals conveniently dropped dead, the Borgias were small fry compared with the republic of Venice. The records of the Venetian Council of Ten reveal that a secret poison program went on for decades. Remarkably, two victims are known to have survived their assassination attempts: Count Francesco Sforza in 1450 and the Ottoman Sultan Mehmed II in 1477.

In the 20th century, the first country known to have established a targeted poisoning program was Russia under the Bolsheviks. According to Boris Volodarsky, a former Russian agent, Lenin ordered the creation of a poison laboratory called the “Special Room” in 1921. By the Cold War, the one-room lab had evolved into an international factory system staffed by hundreds, possibly thousands of scientists. Their specialty was untraceable poisons delivered by ingenious weapons—such as a cigarette packet made in 1954 that could fire bullets filled with potassium cyanide.

In 1978, the prizewinning Bulgarian writer Georgi Markov, then working for the BBC in London, was killed by an umbrella tip that shot a pellet containing the poison ricin into his leg. After the international outcry, the Soviet Union toned down its poisoning efforts but didn’t end them. And Putin’s Russia has continued to use similar techniques. In 2006, according to an official British inquiry, Russian secret agents murdered the ex-spy Alexander Litvinenko by slipping polonium into his drink during a meeting at a London hotel. It was the beginning of a new wave of poisonings whose end is not yet in sight.

WSJ Historically Speaking: A British Milestone in the Fight for Freedom

Photo: ERASMO VASQUEZ LENDECHY/WIKIMEDIA COMMONS

The British officially abolished slavery throughout their empire on Aug. 1, 1834, freeing some 800,000 Africans from bondage. The date should be forever commemorated—but so should slavery’s own history of resistance and rebellion.

That slaves have always found ways to rebel is reflected in the earliest surviving legal texts. In the 21st century B.C., King Ur-Nammu of Ur, an ancient city in what is now Iraq, proclaimed that “if a slave escapes from the city limits and someone returns him, the owner shall pay two shekels to the one who returned him.”

As slavery became more deeply ingrained in society, so did the nature of the resistance. The Greeks were severe toward rebellious slaves. But no society was as cruel or inventive as Sparta. Having subjugated the neighboring Messenians into helotry in the seventh century B.C. (helots were the property of the state), the Spartans inflicted a reign of terror on them: During annual culls, young warriors were encouraged to hunt and kill the strongest helots.

A catastrophic earthquake in 464 B.C. prompted a short-lived rebellion, but the helots remained trapped in their wretched existence for another century. Finally, another opportunity to revolt came in 371 B.C. after the city-state of Thebes defeated Sparta at the Battle of Leuctra. Aided by the victorious Thebans, the Messenians rose up and drove the Spartans from their land.

Continue reading…

The New York Times Book Review: ‘Joan of Arc: A History,’ by Helen Castor

Engraving by J.C. Buttre, via Corbis

Fame is like a parasite. It feeds off its host — infecting, extracting, consuming its victim until there’s nothing left but an empty husk. For the lucky (or unlucky, depending on your point of view), with the emptiness comes the possibility of a long afterlife as one of the blowup dolls of history.

These women — and they’re almost always women — become the public’s playthings in perpetuity. Stripped of truth, deprived of personhood, they can be claimed and used by anyone for any purpose. Exhibit A is Joan of Arc, simultaneously canonized by Pope Benedict XV and the women’s suffrage movement; sometime mascot of 19th-century French republicans, 20th-century Vichy France and the 21st-century National Front. She has over a dozen operas and several dozen movies to her name. And she’s the single thread that unites a bewilderingly diverse crowd of playwrights, writers, philosophers, poets and novelists, from Shakespeare to Voltaire, Robert Southey, Mark Twain, George Bernard Shaw, Vita Sackville-West and Bertolt Brecht.

No wonder the British historian Helen Castor begins her highly satisfying biography of Joan of Arc by stating the obvious: “In the firmament of history,” the Maid of Orléans is a “massive star” whose “light shines brighter than that of any other figure of her time and place.” Indeed, Castor insists, Joan’s star still shines. But what a travesty if all people can see is the reflected vainglory of their own desires.

Castor’s corrective approach to the problem of Joan’s fame is to turn the mirror outward, changing the point of view from Joan herself to the times in which she lived. Follow her too closely, Castor argues, and “it can seem, unnervingly, as though Joan’s star might collapse into a black hole.” To those who think they know her story, this statement might seem unnerving. But Castor doesn’t mean the facts are wrong or need revising.

Continue reading…

The Sunday Times: Hillary’s emails honour the creed of hiding, twisting, leaking at the top

Photo: Pawel Kadysz

WHY DID Great Britain stay neutral during the American Civil War? Back when I was researching this question, one answer that seemed particularly intriguing was the claim — made at the time in America and by subsequent historians — that it was due to a severe wheat shortage.

Repeated crop failures in the early 1860s had led to a massive reliance on imports from America and Russia. Ergo, Britain intervening in the war between the states would have been an unaffordable risk.

I combed through four years of cabinet reports, memoranda, letters and diaries, looking for proof. Cotton, slavery, Canada, blockade running, the balance of power: these were frequent subjects of fretful debate, but never wheat. The paper record showed the theory to be an utter dud, thereby freeing me to find the true causes of British neutrality.

I tell this story because I don’t see any point in hiding the fact that I am entirely partisan in the debate about government transparency. I believe that everything should be maintained in its proper place. What is classified should remain so, what is public should be open, and all must be preserved for future scrutiny.

Continue reading…

Smithsonian Magazine: The British View the War of 1812 Quite Differently Than Americans Do

Photo: Bettmann/CORBIS

As we look forward to celebrating the bicentennial of the “Star-Spangled Banner” by Francis Scott Key, I have to admit, with deep shame and embarrassment, that until I left England and went to college in the U.S., I assumed the words referred to the War of Independence. In my defense, I suspect I’m not the only one to make this mistake.

For people like me, who have got their flags and wars mixed up, I think it should be pointed out that there may have been only one War of 1812, but there are four distinct versions of it—the American, the British, the Canadian and the Native American. Moreover, among Americans, the chief actors in the drama, there are multiple variations of the versions, leading to widespread disagreement about the causes, the meaning and even the outcome of the war.

Continue reading…

WSJ Historically Speaking: When a Monarch Calls It Quits

Photo: THOMAS FUCHS

Abdication fever is sweeping the royal palaces of Europe. Recently, Spain’s King Juan Carlos became the third monarch in just over a year to renounce his crown. In January 2013, Queen Beatrix of the Netherlands declared that she was stepping down in favor of her son, Prince Willem-Alexander. King Albert II of Belgium followed six months later.

Abdication in the old days was usually a prelude to execution. Lucius Tarquinius Superbus, or Tarquin the Proud (who ruled from 534 to 509 B.C.), is one of the earliest recorded examples of a monarch who was forced to abdicate and still lived to tell the tale. Tarquin was the seventh and last king of the Romans. Burdened by heavy taxes, the aristocracy was already wishing to be rid of Tarquin when his son raped the pious Lucretia. The crime proved to be the catalyst for the birth of the Roman republic.

Tarquin eventually retired to the court of a neighboring tyrant. There, bored and angry, he plotted endlessly to reconquer Rome. Today, if Tarquin is remembered at all, it is by the generations of British schoolchildren who grew up learning to recite “Horatius at the Bridge,” Thomas Babington Macaulay’s stirring ballad on Tarquin’s defeat: “Lars Porsena of Clusium, / by the Nine Gods he swore, / That the great house of Tarquin / Should suffer wrong no more… And how can man die better / Than facing fearful odds / For the ashes of his fathers / And the temples of his gods.”

Continue reading…