Historically Speaking: Duels Among the Clouds

Aerial combat was born during World War I, giving the world a new kind of military hero: the fighter pilot

“Top Gun” is back. The 1986 film about Navy fighter pilots is getting a sequel next year, with Tom Cruise reprising his role as Lt. Pete “Maverick” Mitchell, the sexy flyboy who can’t stay out of trouble. Judging by the trailer released by Paramount in July, the new movie, “Top Gun: Maverick,” will go straight to the heart of current debates about the future of aerial combat. An unseen voice tells Mr. Cruise, “Your kind is headed for extinction.”

The mystique of the fighter pilot began during World War I, when fighter planes first entered military service. The first aerial combat took place on Oct. 5, 1914, when French and German biplanes engaged in an epic contest in the sky, watched by soldiers on both sides of the trenches. At this early stage, neither aircraft carried fixed armament, but the German pilot had a rifle and the French pilot a machine gun; the latter won the day.

A furious arms race ensued. The Germans turned to the Dutch engineer Anthony Fokker, who devised a way to synchronize a plane’s machine gun with its propeller so that bullets passed safely between the spinning blades, creating a flying weapon of deadly accuracy. The Allies soon caught up, ushering in the era of the dogfight.

From the beginning, the fighter pilot seemed to belong to a special category of warrior—the dueling knight rather than the ordinary foot soldier. Flying aces of all nationalities gave each other a comradely respect. In 1916, the British marked the downing of the German fighter pilot Oswald Boelcke by dropping a wreath in his honor on his home airfield in Germany.

But not until World War II could air combat decide the outcome of an entire campaign. During the Battle of Britain in the summer of 1940, the German air force, the Luftwaffe, dispatched up to 1,000 aircraft in a single attack. The Royal Air Force’s successful defense of the skies led to British Prime Minister Winston Churchill’s famous declaration, “Never in the field of human conflict was so much owed by so many to so few.”

The U.S. air campaigns over Germany taught American military planners a different lesson. Rather than focusing on pilot skills, they concentrated on building planes with superior firepower. In the decades after World War II, the invention of air-to-air missiles was supposed to herald the end of the dogfight. But during the Vietnam War, steep American aircraft losses caused by the acrobatic, Soviet-built MiG fighter showed that one-on-one combat still mattered. The U.S. response to this threat was the highly maneuverable twin-engine F-15 and the formation of a new pilot training academy, the Navy Fighter Weapons School, which inspired the original “Top Gun.”

Since that film’s release, however, aerial combat between fighter planes has largely happened on screen, not in the real world. The last dogfight involving a U.S. aircraft took place in 1999, during the NATO air campaign in Kosovo. The F-14 Tomcats flown by Mr. Cruise’s character have been retired, and his aircraft carrier, the USS Enterprise, has been decommissioned.

Today, conventional wisdom again holds that aerial combat is obsolete. The new F-35 Joint Strike Fighter is meant to replace close-up dogfights with long-range weapons. But not everyone seems to have read the memo about the future of air warfare. Increasingly, U.S. and NATO pilots are having to scramble their planes to head off Russian incursions. The knights of the skies can’t retire just yet.

Historically Speaking: A Palace Open to the People

From the Pharaohs to Queen Victoria, royal dwellings have been symbols of how rulers think about power.

Every summer, Queen Elizabeth II opens the state rooms of Buckingham Palace to the public. This year’s opening features an exhibition that I curated, “Queen Victoria’s Palace,” the result of a three-year collaboration with Royal Collection Trust. The exhibition uses paintings, objects and even computer-generated imagery to show how Victoria transformed Buckingham Palace into both a family home and the headquarters of the monarchy. In the process, she modernized not only the building itself but also the relationship between the Royal Family and the British people.

Plenty of rulers before Victoria had built palaces, but it was always with a view to enhancing their power rather than sharing it. Consider Amarna in Egypt, the temple-palace complex created in the 14th century B.C. by Amenhotep IV, better known as Akhenaten. Supported by his beautiful wife Nefertiti, the heretical Akhenaten made himself the head of a new religion that revered the divine light of the sun’s disk, the Aten.

The Great Palace reflected Akhenaten’s megalomania: The complex featured vast open-air courtyards where the public was required to engage in mass worship of the Pharaoh and his family. Akhenaten’s palace was hated as much as his religion, and both were abandoned after his death.

Weiyang, the Endless Palace, built in 200 B.C. in western China by the Emperor Gaozu, was also designed to impart a religious message. Until its destruction in the 9th century by Tibetan invaders, Weiyang extended over two square miles, making it the largest imperial palace in history. Inside, the halls and courtyards were laid out along specific axial and symmetrical lines to ensure that the Emperor existed in harmony with the landscape and, by extension, with his people. Each chamber was ranked according to its proximity to the Emperor’s quarters; every person knew his place and obligations according to his location in the palace.

Western Europe had nothing comparable to Weiyang until King Louis XIV built the Palace of Versailles in 1682. With its unparalleled opulence—particularly the glittering Hall of Mirrors—and spectacular gardens, Versailles was a cult of personality masquerading as architecture. Louis, the self-styled Sun King at the center of this artificial universe, created a living stage where seeing and being seen was the highest form of social currency.

The offstage reality was grimmer. Except for the royal family’s quarters, Versailles lacked even such basic amenities as plumbing. The cost of upkeep swallowed a quarter of the government’s annual tax receipts. Louis XIV’s fantasy lasted a century before being swept away by the French Revolution in 1789.

Although Victoria would never have described herself as a social revolutionary, the many changes she made to Buckingham Palace were an extraordinary break with the past. From the famous balcony where the Royal Family gathers to share special occasions with the nation, to the spaces for entertaining that can welcome thousands of guests, the revitalized palace created a more inclusive form of royal architecture. It sidelined the old values of wealth, lineage, power and divine right to emphasize new ones based on family, duty, loyalty and patriotism. Victoria’s palace was perhaps less awe-inspiring than its predecessors, but it may prove to be more enduring.

The Sunday Times: With one magnificent renovation, Queen Victoria revamped the monarchy

A new exhibition reveals how the monarch’s redesign of Buckingham Palace created a home for her family and a focus for the nation, writes its co‑curator, Amanda Foreman.

Did Queen Victoria reign over Britain or did she rule? The difference may seem like splitting hairs, but the two words go to the heart of modern debates about the way society perceives women in power. A sovereign can be chained in a dungeon and still reign, but there’s no mistaking the action implied in the verb “to rule”. The very word has a potency to it that the mealy-mouthed “reign” does not.

The Victorians could never quite resolve in their minds whether Victoria was ruling or reigning over them. To mark her diamond jubilee in 1897, the poet laureate, Alfred Austin, composed Victoria, an embarrassing poem that attempted to have it both ways — praising the monarch for 60 dutiful years on the throne while dismissing her: “But, being a woman only, I can be / Not great, but good . . . Nor in the discords that distract a Realm / Be seen or heard.”

Despite a wealth of new scholarship and biographies about Victoria, most people still find it hard to say what she actually achieved, aside from reigning for a really long time. It’s as though she simply floated through life in her womanly way, pausing only to fall in love, have babies and reportedly say things such as “We are not amused”. Her personal accomplishments are diminished, ascribed to Prince Albert’s genius or ignored.

I have co-curated, with Lucy Peter of the Royal Collection Trust, an exhibition for this year’s Buckingham Palace summer opening. It is an attempt to redress the balance. Queen Victoria’s Palace argues that Victoria’s building programme at Buckingham Palace helped to redefine the monarchy for the modern age.

The new design enabled a more open, welcoming and inclusive relationship to develop between the royal family and the public.

The house Victoria inherited in 1837 was nothing like the building we know today. The Queen’s House, as Buckingham Palace was then known, was a mishmash of rooms and styles from three reigns.

The entertaining rooms and public spaces were too small, the kitchens dilapidated, the private apartments inadequate and the plumbing and heating barely functional.

Victoria, and then Albert after their marriage, put up with its failings until there was no room for their growing family.

It’s certainly true that Albert was more involved than Victoria in the decoration of the interior. But it was Victoria’s conception of female power that dictated the palace’s final form. Kingship, as Austin’s jubilee poem helpfully pointed out, is expressed by such manly virtues as strength, glory and military might, none of which Victoria could claim without appearing to betray her feminine identity.

Instead she made queenship a reflection of her own moral values, placing the emphasis on family, duty, patriotism and public service. These four “female” virtues formed the pillars not only of her reign but of every one that followed.

Today it would be impossible to conceive of the monarchy in any other way. It is one of the very few instances where gendered power has worked in favour of women.

The Buckingham Palace that emerged from its scaffolding in 1855 was a triumph. The additions included a nursery for the children, a large balcony on the east front, state rooms for diplomatic visits and a ballroom that was large enough to accommodate 2,000 guests.

For the next six years the palace was the epicentre of the monarchy. The death of Albert on December 14, 1861, brought an abrupt end to its glory.

Incapacitated by grief, Victoria hid herself away, much to the consternation of her family and subjects.

The story of Victoria’s eventual return to public life is reflected in the slow but sure rejuvenation of the palace. There were some things that she could never bear to do there, because they intruded too much on personal memories: she never again attended a concert at the palace or played host to a visiting head of state or gave a ball like the magnificent ones of old.

But Victoria developed other ways of opening the palace to the wider world. One of the most visible was the summer garden party, a tradition that now brings 30,000 people to the palace every year.

She also allowed Prince George — later George V — and Princess Mary to appear on the balcony after their wedding, cementing a tradition now watched by hundreds of millions. The palace balcony appearance has become so ingrained in the national consciousness that each occasion receives the most intense scrutiny.

At last month’s Trooping the Colour, lip-readers were brought in by media outlets to decipher the exchange between the Duke and Duchess of Sussex. (It’s believed Prince Harry told Meghan to turn around.)

By the end of Victoria’s life, the monarchy’s popularity was greater than ever. Buckingham Palace was also back in the people’s affections, having a starring role in Victoria’s diamond jubilee in 1897 as the physical and emotional focus for the London celebrations.

After her death in 1901, much of Victoria and Albert’s taste was swept away in the name of modernity, including the east front, which was refaced by George V. The Buckingham Palace of the 21st century looks quite different from the one she built. But its purpose is the same.

The palace still functions as a private home. It is still the administrative headquarters of the monarchy. And, perhaps most important of all, it is still the place where the nation gathers to celebrate and be celebrated.

This is her legacy and the proof, if such is needed, that Victoria reigned, ruled and did much else besides.

Queen Victoria’s Palace is at Buckingham Palace until September 29

Historically Speaking: Playing Cards for Fun and Money

From 13th-century Egypt to the Wild West, the standard deck of 52 cards has provided entertainment—and temptation.

ILLUSTRATION: THOMAS FUCHS

More than 8,500 people traveled to Las Vegas to play in this year’s World Series of Poker, which ended July 16—a near-record for the contest. I’m not a poker player myself, but I understand the fun and excitement of playing with real cards in an actual game rather than online. There’s something uniquely pleasurable about a pack of cards—the way they look and feel to the touch—that can’t be replicated on the screen.

Although the origins of the playing card are believed to lie in China, the oldest known examples come from the Mamluk Sultanate, an Islamic empire that stretched across Egypt and the eastern Mediterranean from 1260 to 1517. It’s significant that the empire was governed by a warrior caste of former slaves: A playing card can be seen as an assertion of freedom, since time spent playing cards is time spent freely. The Mamluk card deck consisted of 52 cards divided into four suits, whose symbols reflected the daily realities of soldiering—a scimitar, a polo stick, a chalice and a coin.

Returning Crusaders and Venetian traders were probably responsible for bringing cards to Europe. Church and state authorities were not amused: In 1377, Parisians were banned from playing cards on work days. Like dice, cards were classed as gateway vices that led to greater sins. The authorities may not have been entirely wrong: Some surviving card decks from the 15th century have incredibly bawdy themes.

The suits and symbols used in playing cards became more uniform and less ornate following the advent of the printing press. French printers added a number of innovations, including dividing the four suits into red and black, and giving us the heart, diamond, club and spade symbols. Standardization enabled cards to become a lingua franca across cultures, further enhancing their appeal as a communal leisure activity.

In the 18th century, the humble playing card was the downfall of many a noble family, with vast fortunes being won and lost at the gaming table. Cards also started to feature in paintings and novels as symbols of the vagaries of fortune. The 19th-century short story “The Queen of Spades,” by the Russian writer Alexander Pushkin, beautifully captures the card mania of the period. The anti-hero, Hermann, is destroyed by his obsession with winning at Faro, a game of chance that was as popular in the saloons of the American West as it was in the drawing rooms of Europe. The lawman Wyatt Earp may have won fame in the gunfight at the OK Corral, but he earned his money as a Faro dealer in Tombstone, Ariz.

In Britain, attempts to regulate card-playing through high taxes on the cards themselves were a failure, though they did result in one change: Every ace of spades had to show a government tax stamp, which is why it’s the card that traditionally carries the manufacturer’s mark. The last innovation in the card deck, like the first, had military origins. Many Civil War regiments killed time by playing Euchre, which requires an extra trump card. The Samuel Hart Co. duly obliged with a card which became the forerunner to the Joker, the wild card that doesn’t have a suit.

But we shouldn’t allow the unsavory association of card games with gambling to have the last word. As Charles Dickens wrote in “Nicholas Nickleby”: “Thus two people who cannot afford to play cards for money, sometimes sit down to a quiet game for love.”

Historically Speaking: Beware the Red Tide

Massive algae blooms that devastate ocean life have been recorded since antiquity—and they are getting worse.

Real life isn’t so tidy. Currently, there is no force, biological or otherwise, capable of stopping the algae blooms that are attacking coastal waters around the world with frightening regularity, turning thousands of square miles into odoriferous graveyards of dead and rotting fish. In the U.S., one of the chief culprits is the Karenia brevis algae, a common marine microorganism that blooms when exposed to sunlight, warm water and phosphorus or nitrates. The result is a toxic sludge known as a red tide, which depletes the oxygen in the water, poisons shellfish and emits a foul vapor strong enough to irritate the lungs.

The red tide isn’t a new phenomenon, though its frequency and severity have certainly gotten worse thanks to pollution and rising water temperatures. There used to be decades between outbreaks, but since 1998 the Gulf Coast has suffered one every year.

The earliest description of a red tide may have come from Tacitus, the first-century Roman historian, in his “Annals”: “the Ocean had appeared blood-red and…the ebbing tide had left behind it what looked to be human corpses.” The Japanese recorded their first red tide catastrophe in 1234: An algae bloom in Osaka Bay invaded the Yodo River, a major waterway between Kyoto and Osaka, which led to mass deaths among humans and fish alike.

The earliest reliable accounts of red tide invasions in the Western Hemisphere come from 16th-century Spanish sailors in the Gulf of Mexico. The colorful explorer Álvar Núñez Cabeza de Vaca (ca. 1490-1560) almost lost his entire expedition to red tide poisoning while sailing in Apalachee Bay on the west coast of Florida in July 1528. Unaware that local Native American tribes avoided fishing in the area at that time of year, he allowed his men to gorge themselves on oysters. “The journey was difficult in the extreme,” he wrote afterward, “because neither the horses were sufficient to carry all the sick, nor did we know what remedy to seek because every day they languished.”

Red tides started appearing everywhere in the late 18th and early 19th centuries. Charles Darwin recorded seeing red-tinged water off the coast of Chile during his 1832 voyage on HMS Beagle. Scientists finally identified K. brevis as the culprit behind the outbreaks in 1946-47, but this was small comfort to Floridians, who were suffering the worst red tide invasion in U.S. history. It started in Naples and spread all the way to Sarasota, hanging around for 18 months, destroying the fishing industry and making life unbearable for residents. A 35-mile-long stretch of sea was so thick with rotting fish carcasses that the government dispatched Navy warships to try to break up the mass. People compared the stench to poison gas.

The red tide invasion of 2017-18 was particularly terrible, lasting some 15 months and covering 145 miles of Floridian coastline. The loss to tourism alone neared $100 million. Things are looking better this summer, fortunately, but we need more than hope or luck to combat this plague; we need a weapon that hasn’t yet been invented.

Historically Speaking: How We Kept Cool Before Air Conditioning

Wind-catching towers and human-powered rotary fans were just some of the devices invented to fight the heat.

ILLUSTRATION: THOMAS FUCHS

What would we do without our air conditioning? Given the number of rolling blackouts and brownouts that happen across the U.S. each summer, that’s not exactly a rhetorical question.

Fortunately, our ancestors knew a thing or two about staying cool even without electricity. The ancient Egyptians developed the earliest known technique, evaporative cooling: they hung wet reeds in front of windows so that the air cooled as the water evaporated.

The Romans, the greatest engineers of the ancient world, had more sophisticated methods. By 312 B.C. they were piping fresh water into Rome via underground pipes and aqueducts, enabling the rich to cool and heat their houses using cold water pipes embedded in the walls and hot water pipes under the floor.

Nor were the Romans alone in developing clever domestic architecture to provide relief in hot climes. In the Middle East, architects constructed buildings with “wind catchers”—tall, four-windowed towers that funneled cool breezes down to ground level and allowed hot air to escape. These had the advantage of working on their own, without human labor. The Chinese had started using rotary fans as early as the second century, but they required a dedicated army of slaves to keep them moving. The addition of hydraulic power during the Song era (960-1279) alleviated but didn’t end the manpower issue.

There had been no significant improvements in air conditioning designs for almost a thousand years when, in 1734, British politicians turned to Dr. John Theophilus Desaguliers, a former assistant to Isaac Newton, and begged him to find a way of cooling the overheated House of Commons. Desaguliers designed a marvelous Rube Goldberg-like system that used all three traditional methods: wind towers, pipes and rotary fans. It actually worked, so long as there was someone to crank the handle at all times.

But the machinery wore out in the late 1760s, leaving politicians as hot and bothered as ever. In desperation, the House invited Benjamin Franklin and other leading scientists to design something new. Their final scheme turned out to be no better than Desaguliers’s and required not one but two men to keep the system working.

The real breakthrough occurred in 1841, when the British engineer David Boswell Reid figured out how to control room temperature using steam power. Fitted with his system, St. George’s Hall in Liverpool is widely considered to be the world’s first air-conditioned building.

Indeed, Reid is one of history’s unsung heroes. His system worked so well that he was invited to install his pipe and ventilation design in hospitals and public buildings around the world. He was working in the U.S. at the start of the Civil War and was appointed inspector of military hospitals. Unfortunately, he died suddenly in 1863, leaving his proposed improvements to gather dust.

The chief problem with Reid’s system was that steam power was about to be overtaken by electricity. When President James Garfield was shot by an assassin in the summer of 1881, naval engineers attempted to keep him cool by using electric fans to blow air over blocks of ice. Two decades later, Willis Haviland Carrier invented the first all-electric air conditioning unit. Architects and construction engineers have been designing around it ever since.

Fears for our power grid may be exaggerated, but it’s good to know that if the unthinkable were to happen and we lost our air conditioners, history can offer us some cool solutions.

Historically Speaking: The Pleasures and Pains of Retirement

Since the Roman Empire, people have debated whether it’s a good idea to stop working in old age

The Wall Street Journal, June 7, 2019

The new film “All Is True,” directed by and starring Kenneth Branagh, imagines how William Shakespeare might have lived after he stopped writing plays. Alas for poor Shakespeare, in this version of his life retirement is hell. By day he potters in his garden, waging a feeble battle against Mother Nature; by night he is equally ineffectual against the verbal onslaughts of his resentful family.

In real life, people have been arguing the case for and against retirement since antiquity. The Roman statesman Marcus Tullius Cicero was thoroughly against the idea. In his essay “Cato the Elder on Aging,” Cicero argued that the best antidote to old age was a purposeful life. “I am in my eighty-fourth year,” he wrote, “yet, as you see, old age has not quite unnerved or shattered me. The senate and the popular assembly never find my vigor wanting.” Cicero lived by the pen—he was the greatest speechwriter in history—but he died by the sword, murdered on the orders of Mark Antony for his support of the waning Roman Republic.

Knowing when to exit from public life is a difficult art. The Roman Emperor Diocletian (ca. 245-316) retired to his palace in Split, in modern Croatia, after ruling for 21 years. According to Edward Gibbon, Diocletian was so content for the last six years of his life that when emissaries from Rome tried to persuade him to return, he replied that he couldn’t possibly leave his cabbages.

For most of history, of course, the average person had no choice but to carry on working until they died. But in the 18th century, longer lifespans created a dilemma: The old were outliving their usefulness. Not realizing that he had two more productive decades left, the 60-year-old Voltaire told his friend Madame du Deffand: “I advise you to go on living solely to enrage those who are paying your annuities…. It is the only pleasure I have left.”

By the end of the 19th century, it had become possible for at least some ordinary people to spend their last years in retirement. In 1889, the German Chancellor Otto von Bismarck bowed to popular opinion and introduced state pensions for workers over 70; the qualifying age was later lowered to 65, which became the official age of retirement.

But some critics argued that this was the thin end of the wedge. If people could be forced out of careers and jobs on the basis of an arbitrary age limit, what else could be done to them? Troubled by what he regarded as ageism, the novelist Anthony Trollope published “The Fixed Period,” a dystopian novel about a society where anyone over the age of 67 is euthanized for his or her own good. The naysayers against any form of government retirement plan held sway in the U.S. until President Franklin Roosevelt signed the Social Security Act in 1935, by which time half of America’s elderly were living in poverty.

Today, the era of leisurely retirements may be passing into history. Whether driven by financial need or personal preference, for many people retirement simply means changing their occupation. According to the AARP, the number of adults working past age 65 has doubled since 1985.

Even the rich and famous aren’t retiring: President George W. Bush is a painter; the Oscar-nominated actor Gene Hackman is a novelist; and Microsoft founder Bill Gates is saving the planet. In this day and age, flush from his success on Broadway, a retired Shakespeare might start his own podcast.

Historically Speaking: The Ancient Origins of the Vacation

Once a privilege of the elite, summer travel is now a pleasure millions can enjoy.

The Wall Street Journal, June 6, 2019

ILLUSTRATION: THOMAS FUCHS

Finally, Americans are giving themselves a break. For years, according to the U.S. Travel Association, more than half of American workers didn’t use all their paid vacation days. But in a survey released in May by Discover, 71% of respondents said they were planning a summer vacation this year, up from 58% last year—meaning a real getaway, not just a day or two to catch up on chores or take the family to an amusement park.

The importance of vacations for health and happiness has been accepted for thousands of years. The ancient Greeks probably didn’t invent the vacation, but they perfected the idea of the tourist destination by providing quality amenities at festivals, religious sites and thermal springs. A cultured person went places. According to the “Crito,” one of Plato’s dialogues, Socrates’ stay-at-home mentality made him an exception: “You never made any other journey, as other people do, and you had no wish to know any other city.”

The Romans took a different approach. Instead of touring foreign cities, the wealthy preferred to vacation together in resort towns such as Pompeii, where they built ostentatious villas featuring grand areas for entertaining. The Emperor Nero was relaxing at his beach palace at Antium, modern Anzio, when the Great Fire of Rome broke out in the year 64.

The closest thing to a vacation that medieval Europeans could enjoy was undertaking pilgrimages to holy sites. Santiago de Compostela in northern Spain, where St. James was believed to be buried, was a favorite destination, second only to Rome in popularity. As Geoffrey Chaucer’s bawdy “Canterbury Tales” shows, a pilgrimage provided all sorts of opportunities for mingling and carousing, not unlike a modern cruise ship.

The vacation went upmarket in the late 17th century, as European aristocrats rediscovered the classical idea of tourism for pleasure. Broadening one’s horizons via the Grand Tour—a sightseeing trip through the major classical and Renaissance sites of France and Italy—became de rigueur for any gentleman. The spa town, too, enjoyed a spectacular revival. The sick and infertile would gather to “take the cure,” bathing in or drinking from a thermal spring. Bath in England became as renowned for its party scene as for its waters. Jane Austen, a frequent visitor to the city, set two of her novels there. The U.S. wasn’t far behind, with Saratoga Springs, N.Y., known as the “Queen of Spas,” becoming a popular resort in 1815.

But where was the average American to go? Resorts, with their boardwalks, grand hotels and amusement arcades, were expensive. In any case, the idea of vacations for pure pleasure sat uneasily with most religious leaders. The answer lay in the great outdoors, which were deemed to be good for the health and improving to the mind.

Cheap rail travel, and popular guidebooks such as William H.H. Murray’s “Adventures in the Wilderness; or, Camp-Life in the Adirondacks,” helped to entice the middle classes out of the city. The American Chautauqua movement, originally a New York-based initiative aimed at improving adult education, further served to democratize the summer vacation by providing cheap accommodation and wholesome entertainment for families.

The summer vacation was ready to become a national tradition. In 1910, President William Howard Taft even proposed to Congress that all workers should be entitled to two to three months of paid vacation. But the plan stalled, and it was left to France to pass the first guaranteed-vacation law, in 1919. Since then, most developed countries have recognized vacation time as a legal right. It’s not too late, America, to try again.

Harper’s Bazaar: Buckingham Palace is opening a new exhibition exploring the life of Queen Victoria this summer

This year marks the 200th anniversary of the birth of Queen Victoria, and to celebrate, Buckingham Palace has announced a special exhibition as part of the summer opening of its state rooms, co-curated by Dr. Amanda Foreman.

Harper’s Bazaar, May 7, 2019

by Katie Frost

A portrait of Queen Victoria by Thomas Sully

The exhibit will explore the life of the monarch and how she turned the once unloved palace into the royal residence we know today. Highlights will include a portrait of the young queen painted by Thomas Sully soon after she moved into her new home, along with Victoria’s personal insignia, the Star and Collar of the Order of the Bath.

Victoria moved into the palace in 1837 when she was just 18. It had been empty for seven years following the death of her uncle, George IV. After she married Prince Albert and started a family, Victoria wrote a letter to the Prime Minister, Sir Robert Peel, about her plans to revamp her family home. In it, she spoke of “the urgent necessity of doing something to Buckingham Palace” and “the total want of accommodation for our growing little family”, according to the Royal Collection Trust.

Historically Speaking: Beloved Buildings That Rose from the Ashes

From ancient Rome to modern London, great structures like Notre Dame have fallen and been built again

The Wall Street Journal, May 2, 2019

A disaster like the Notre Dame cathedral fire is as much a tragedy of the heart as it is a loss to architecture. But fortunately, unlike most love affairs, a building can be resurrected. In fact, throughout history communities have gone to remarkable lengths to rebuild monuments of sacred or national importance.

There is no shortage of inspirational examples of beloved buildings that have risen from the ashes. The Second Temple was built in Jerusalem in 515 B.C. following the destruction of the First by King Nebuchadnezzar II of Babylonia in 586 B.C.; Dresden’s Baroque Frauenkirche was faithfully rebuilt in 2005, after being destroyed by bombs in 1945.

Often the new structures are exact replicas, as with Venice and Barcelona’s opera houses, La Fenice and Gran Teatre del Liceu, both of which were rebuilt after suffering devastating fires in the 1990s. If France decides to rebuild Notre Dame according to the principle “as it was, where it was,” the skill and technology aren’t lacking.

In other cases, however, disasters have allowed for beloved landmarks to be reimagined. The Great Fire of Rome in 64 A.D. led to a revolution in architectural styles and techniques. After Hagia Sophia cathedral was torched during riots in Constantinople in 532, the Byzantine Emperor Justinian asked his architects Anthemius and Isidore to build something bold and impressive. It was risky to change such a renowned symbol of the Eastern Roman Empire; moreover, for security and financial reasons, the work had to be completed in just six years. Still, the result dazzled Justinian, who exclaimed when he saw it, “Solomon, I have outdone thee.” Almost a thousand years later, following Constantinople’s fall to the Turks in 1453, Sultan Mehmet II had a similar reaction and ordered Hagia Sophia to be turned into a mosque rather than destroyed.

Sir Christopher Wren, who rebuilt St. Paul’s Cathedral in London, was not so lucky in the reactions to his creation. The Great Fire of 1666 had left the medieval church in ruins, but there was little appetite for an innovative reconstruction and no money in the Treasury to pay for it. Wren endured setbacks at every turn, including a chronic shortage of stone. At one point, Parliament suspended half his salary in protest at the slowness of the work, which took almost three decades and spanned the reigns of five monarchs.

The reason for the delay became clear after the finished building was revealed to the public. Inspired by drawings of Hagia Sophia, Wren had ignored the approved design for a traditional reconstruction and quietly opted for a more experimental approach. Ironically, many of his contemporaries were appalled by the now iconic great dome, especially the Protestant clergy, who deemed it too foreign and Catholic-looking. Yet Wren’s vision has endured. During the German bombing of London in World War II, St. Paul’s was the one building that Winston Churchill declared must be saved at all costs.

It is never easy deciding how to draw the line between history and modernity, particularly when dealing with the loss of an architectural masterpiece. There isn’t always a right answer, but it may help to remember Churchill’s words: “We shape our buildings and afterwards our buildings shape us.”