Buckingham Palace is opening a new exhibition exploring the life of Queen Victoria this summer

It will highlight how the young monarch transformed the landmark into a royal residence.

Article by Katie Frost for Harper’s Bazaar detailing the upcoming Buckingham Palace Summer Opening, curated by Dr. Amanda Foreman.

A portrait of Queen Victoria by Thomas Sully

This year marks the 200th anniversary of the birth of Queen Victoria and to celebrate, Buckingham Palace has announced a special exhibition as part of its state opening this summer.

The exhibit will explore the life of the monarch and how she turned the once unloved palace into the royal residence we know today. Highlights will include a portrait of the young queen painted by Thomas Sully soon after she moved into her new home, along with Victoria’s personal insignia, the Star and Collar of the Order of the Bath.

Victoria moved into the palace in 1837 when she was just 18. It had been empty for seven years following the death of her uncle, George IV. After Victoria married Prince Albert and started a family, Victoria wrote a letter to the Prime Minister, Sir Robert Peel, about her plans to revamp her family home. In it, she spoke of “the urgent necessity of doing something to Buckingham Palace” and “the total want of accommodation for our growing little family”, according to the Royal Collection Trust.

Beloved Buildings That Rose from the Ashes

From ancient Rome to modern London, great structures like Notre Dame have fallen and been built again

A disaster like the Notre Dame cathedral fire is as much a tragedy of the heart as it is a loss to architecture. But fortunately, unlike most love affairs, a building can be resurrected. In fact, throughout history communities have gone to remarkable lengths to rebuild monuments of sacred or national importance.

There is no shortage of inspirational examples of beloved buildings that have risen from the ashes. The Second Temple was built in Jerusalem in 515 B.C. following the destruction of the First by King Nebuchadnezzar II of Babylonia in 586 B.C.; Dresden’s Baroque Frauenkirche was faithfully rebuilt in 2005, after being destroyed by bombs in 1945.

Often the new structures are exact replicas, as with Venice and Barcelona’s opera houses, La Fenice and Gran Teatre del Liceu, both of which were rebuilt after suffering devastating fires in the 1990s. If France decides to rebuild Notre Dame according to the principle “as it was, where it was,” the skill and technology aren’t lacking.

In other cases, however, disasters have allowed for beloved landmarks to be reimagined. The Great Fire of Rome in 64 A.D. led to a revolution in architectural styles and techniques. After Hagia Sophia cathedral was torched during riots in Constantinople in 532, the Byzantine Emperor Justinian asked his architects Anthemius and Isidore to build something bold and impressive. It was risky to change such a renowned symbol of the Eastern Roman Empire; moreover, for security and financial reasons, the work had to be completed in just six years. Still, the result dazzled Justinian, who exclaimed when he saw it, “Solomon, I have outdone thee.” Almost a thousand years later, following Constantinople’s fall to the Turks in 1453, Sultan Mehmet II had a similar reaction and ordered Hagia Sophia to be turned into a mosque rather than destroyed.

Sir Christopher Wren, who rebuilt St. Paul’s Cathedral in London, was not so lucky in the reactions to his creation. The Great Fire of 1666 had left the medieval church in ruins, but there was little appetite for an innovative reconstruction and no money in the Treasury to pay for it. Wren endured setbacks at every turn, including a chronic shortage of stone. At one point, Parliament suspended half his salary in protest at the slowness of the work, which took more than three decades and spanned the reigns of five monarchs.

The reason for the delay became clear after the finished building was revealed to the public. Inspired by drawings of Hagia Sophia, Wren had ignored the approved design for a traditional reconstruction and quietly opted for a more experimental approach. Ironically, many of his contemporaries were appalled by the now iconic great dome, especially the Protestant clergy, who deemed it too foreign and Catholic-looking. Yet Wren’s vision has endured. During the German bombing of London in World War II, St. Paul’s was the one building that Winston Churchill declared must be saved at all costs.

It is never easy deciding how to draw the line between history and modernity, particularly when dealing with the loss of an architectural masterpiece. There isn’t always a right answer, but it may help to remember Churchill’s words: “We shape our buildings and afterwards our buildings shape us.”

Real-Life Games of Thrones

From King Solomon to the Shah of Persia, rulers have used stately seats to project power.

ILLUSTRATION BY THOMAS FUCHS

“Uneasy lies the head that wears a crown,” complains the long-suffering King Henry IV in Shakespeare. But that is not a monarch’s only problem; uneasy, too, is the bottom that sits on a throne, for thrones are often dangerous places to be. That is why the image of a throne made of swords, in the HBO hit show “Game of Thrones” (which last week began its eighth and final season), has served as such an apt visual metaphor. It strikingly symbolizes the endless cycle of violence between the rivals for the Iron Throne, the seat of power in the show’s continent of Westeros.

In real history, too, virtually every state once put its leader on a throne. The English word comes from the ancient Greek “thronos,” meaning “stately seat,” but the thing itself is much older. Archaeologists working at the 4th millennium B.C. site of Arslantepe, in eastern Turkey, where a pre-Hittite Early Bronze Age civilization flourished, recently found evidence of what is believed to be the world’s oldest throne. It seems to have consisted of a raised bench which enabled the ruler to display his or her elevated status by literally sitting above all visitors to the palace.

Thrones were also associated with divine power: The famous 18th-century B.C. basalt stele inscribed with the law code of King Hammurabi of Babylon, which can be seen at the Louvre, depicts the king receiving the laws directly from the sun god Shamash, who is seated on a throne.

Naturally, because they were invested with so much religious and political symbolism, thrones often became a prime target in war. According to Jewish legend, King Solomon’s spectacular gold and ivory throne was stolen first by the Egyptians, who then lost it to the Assyrians, who subsequently gave it up to the Persians, whereupon it became lost forever.

In India, King Solomon’s throne was reimagined in the early 17th century by the Mughal Emperor Shah Jahan as the jewel-and-gold-encrusted Peacock Throne, featuring the 186-carat Koh-i-Noor diamond. (Shah Jahan also commissioned the Taj Mahal.) This throne also came to an unfortunate end: It was stolen during the sack of Delhi by the Shah of Iran and taken back to Persia. A mere eight years later, the Shah was assassinated by his own bodyguards and the Peacock Throne was destroyed, its valuable decorations stolen.

Perhaps the moral of the story is to keep things simple. Around 1300, King Edward I of England commissioned a coronation throne made of oak. Since 1308 it has supported the heads and backsides of 38 British monarchs during the coronation ceremony at Westminster Abbey. Little harm has ever come to it, save for the pranks of a few very naughty choir boys, one of whom carved on the back of the throne: “P. Abbott slept in this chair 5-6 July 1800.”

Overcoming the Labor of Calculation

Inventors tried many times over the centuries to build an effective portable calculator—but no one succeeded until Jerry Merryman.

ILLUSTRATION: THOMAS FUCHS

The world owes a debt of gratitude to Jerry Merryman, who died on Feb. 27 at the age of 86. It was Merryman who, in 1965, worked with two other engineers at Texas Instruments to invent the world’s first pocket calculator. Today we all carry powerful calculators on our smartphones, but Merryman was the first to solve a problem that had plagued mathematicians at least since Gottfried Leibniz, one of the creators of calculus in the 17th century, who observed: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could safely be relegated to anyone else if machines were used.”

In ancient times, the only alternative to mental arithmetic was the abacus, with its simple square frame, wooden rods and movable beads. Most civilizations, including the Romans, Chinese, Indians and Aztecs, had their own version—from counting boards for keeping track of sums to more sophisticated designs that could calculate square roots.

We know of just one counting machine from antiquity that was more complex. The 2nd-century B.C. Antikythera Mechanism, discovered in 1901 in the remains of an ancient Greek shipwreck, was a 30-geared bronze calculator that is believed to have been used to track the movement of the heavens. Despite its ingenious construction, the mechanism had a limited range of capabilities: It could calculate dates and constellations and little else.

In the 15th century, Leonardo da Vinci drew up designs for a mechanical calculating device that consisted of 13 “digit-registering” wheels. In 1967, IBM commissioned the Italian scholar and engineer Roberto Guatelli to build a replica based on the sketches. Guatelli believed that Leonardo had invented the first calculator, but other experts disagreed. In any case, it turned out that the metal wheels would have generated so much friction that the frame would probably have caught fire.

Technological advances in clockmaking helped the French mathematician and inventor Blaise Pascal to build the first working mechanical calculator, called the Pascaline, in 1644. It wasn’t a fire hazard, and it could add and subtract. But the Pascaline was very expensive to make, fragile in the extreme, and far too limited to be really useful. Pascal made only about 50 in his lifetime, of which fewer than 10 survive today.

Perhaps the greatest calculator never to see the light of day was designed by Charles Babbage in the early 19th century. He actually designed two machines—the Difference Engine, which could perform arithmetic, and the Analytical Engine, which was theoretically capable of a range of functions from direct multiplication to parallel processing. Ada, Countess of Lovelace, the daughter of the poet Lord Byron, made important contributions to the development of Babbage’s Analytical Engine. According to the historian Walter Isaacson, Lovelace realized that any process based on logical symbols could be used to represent entities other than quantity—the same principle used in modern computing.

Unfortunately, despite many years and thousands of pounds of government funding, Babbage only ever managed to build a small prototype of the Difference Engine. Even for most of the 20th century, his dream of a portable calculator stayed a dream, while the go-to instrument for doing large sums remained the slide rule. It took Merryman’s invention to allow us all to become “excellent men” and leave the labor of calculation to the machines.

The Immortal Charm of Daffodils

The humble flower has been a favorite symbol in myth and art since ancient times

ILLUSTRATION: THOMAS FUCHS

On April 15, 1802, the poet William Wordsworth and his sister Dorothy were enjoying a spring walk through the hills and vales of the English Lake District when they came across a field of daffodils. Dorothy was so moved that she recorded the event in her journal, noting how the flowers “tossed and reeled and danced and seemed as if they verily laughed with the wind that blew upon them over the Lake.” And William decided there was nothing for it but to write a poem, which he published in its final version in 1815. “I Wandered Lonely as a Cloud” is one of his most famous reflections on the power of nature: “For oft, when on my couch I lie/In vacant or in pensive mood,/They flash upon that inward eye/Which is the bliss of solitude;/And then my heart with pleasure fills,/And dances with the daffodils.”

Long dismissed as a common field flower, unworthy of serious attention by the artist, poet or gardener, the daffodil enjoyed a revival thanks in part to Wordsworth’s poem. The painters Claude Monet, Berthe Morisot and Vincent van Gogh were among its 19th-century champions. Today, the daffodil is so ubiquitous, in gardens and in art, that it’s easy to overlook.

But the flower deserves respect for being a survivor. Every part of the narcissus, to use its scientific name, is toxic to humans, animals and even other flowers, and yet—as many cultures have noted—it seems immortal. There are still swaths of daffodils on the lakeside meadow where the Wordsworths ambled two centuries ago.

The daffodil originated in the ancient Mediterranean, where it was regarded with deep ambivalence. The ancient Egyptians associated narcissi with the idea of death and resurrection, using them in tomb paintings. The Greeks also gave the flower contrary mythological meanings. Its scientific name comes from the story of Narcissus, a handsome youth who faded away after being cursed into falling in love with his own image. At the last moment, the gods saved him from death by granting him a lifeless immortality as a daffodil. In another Greek myth, the daffodil’s luminous beauty was used by Hades to lure Persephone away from her friends so that he could abduct her into the underworld. During her four-month captivity the only flower she saw was the asphodel, which grew in abundance on the fields of Elysium—and whose name inspired the English derivative “daffodil.”

But it isn’t only Mediterranean cultures that have fixated on the daffodil’s mysterious alchemy of life and death. A fragrant variety of the narcissus—the sweet-smelling paper white—traveled along the Silk Road to China. There, too, the flower appeared to encapsulate the happy promise of spring, but also other painful emotions such as loss and yearning. The famous Ming Dynasty scroll painting “Narcissi and Plum Blossoms” by Qiu Ying (ca. 1494-1552), for instance, is a study in contrasts, juxtaposing exquisitely rendered flowers with the empty desolation of winter.

The English botanist John Parkinson introduced the traditional yellow variety from Spain in 1618. Aided by a soggy but temperate climate, daffodils quickly spread across lawns and fields, and their foreign origins were soon forgotten. By the 19th century they had become quintessentially British—so much so that missionaries and traders, nostalgic for home, planted bucketfuls of bulbs wherever they went. Their legacy in North America is a burst of color each year just when the browns and grays of winter have worn out their welcome.

Fantasies of Alien Life

Human beings have never encountered extra-terrestrials, but we’ve been imagining them for thousands of years

Fifty years ago this month, Kurt Vonnegut published “Slaughterhouse-Five,” his classic semi-autobiographical, quasi-science fiction novel about World War II and its aftermath. The story follows the adventures of Billy Pilgrim, an American soldier who survives the bombing of Dresden in 1945, only to be abducted by aliens from the planet Tralfamadore and exhibited in their zoo. Vonnegut’s absurd-looking Tralfamadorians (they resemble green toilet plungers) are essentially vehicles for his meditations on the purpose of life.

Some readers may dismiss science fiction as mere genre writing. But the idea that there may be life on other planets has engaged many of history’s greatest thinkers, starting with the ancient Greeks. On the pro-alien side were the Pythagoreans, a fifth-century B.C. sect, which argued that life must exist on the moon; in the third century B.C., the Epicureans believed that there was an infinite number of life-supporting worlds. But Plato, Aristotle and the Stoics argued the opposite. In “On the Heavens,” Aristotle specifically rejected the possibility that other worlds might exist, on the grounds that the Earth is at the center of a perfect and finite universe.

The Catholic Church sided with Plato and Aristotle: If there was only one God, there could be only one world. But in Asia, early Buddhism encouraged philosophical explorations into the idea of multiverses and parallel worlds. Buddhist influence can be seen in the 10th-century Japanese romance “The Bamboo Cutter,” whose story of a marooned moon princess and a lovelorn emperor was so popular in its time that it is mentioned in Murasaki Shikibu’s seminal novel, “The Tale of Genji.”

During the Renaissance, Copernicus’s heliocentric theory, advanced in his book “On the Revolutions of the Celestial Spheres” (1543), and Galileo’s telescopic observations of the heavens in 1610 proved that the Church’s traditional picture of the cosmos was wrong. These discoveries prompted Western thinkers to imagine the possibility of alien civilizations. From Johannes Kepler to Voltaire, imagining life on the moon (or elsewhere) became a popular pastime among advanced thinkers. In “Paradise Lost” (1667), the poet John Milton wondered “if Land be there,/Fields and Inhabitants.”

Such benign musings about extraterrestrial life didn’t survive the impact of industrialization, colonialism and evolutionary theory. In the 19th century, debates over whether aliens have souls morphed into fears about humans becoming their favorite snack food. This particular strain of paranoia reached its apogee in the alien-invasion novel “The War of the Worlds,” published in 1897 by the British writer H.G. Wells. Wells’s downbeat message—that contact with aliens would lead to a Darwinian fight for survival—resonated throughout the 20th century.

And it isn’t just science fiction writers who ponder “what if.” The physicist Stephen Hawking once compared an encounter with aliens to Christopher Columbus landing in America, “which didn’t turn out very well for the Native Americans.” More hopeful visions—such as Steven Spielberg’s 1982 film “E.T. the Extra-Terrestrial,” about a lovable alien who wants to get back home—have been exceptions to the rule.

The real mystery about aliens is the one described by the so-called “Fermi paradox.” The 20th-century physicist Enrico Fermi observed that, given the number of stars in the universe, it is highly probable that alien life exists. So why haven’t we seen it yet? As Fermi asked, “Where is everybody?”

Insuring Against Disasters

Insurance policies go back to the ancient Babylonians and were crucial in the early development of capitalism

ILLUSTRATION: THOMAS FUCHS

Living in a world without insurance, free from all those claim forms and high deductibles, might sound like a little bit of paradise. But the only thing worse than dealing with the insurance industry is trying to conduct business without it. In fact, the basic principle of insurance—pooling risk in order to minimize liability from unforeseen dangers—is one of the things that made modern capitalism possible.

The first merchants to tackle the problem of risk management in a systematic way were the Babylonians. The 18th-century B.C. Code of Hammurabi shows that they used a primitive form of insurance known as “bottomry.” According to the Code, merchants who took high-interest loans tied to shipments of goods could have the loans forgiven if the ship was lost. The practice benefited both traders and their creditors, who charged a premium of up to 30% on such loans.
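The arithmetic behind bottomry can be sketched as a toy expected-value calculation. This is an illustrative model only: the loan amount and the 15% loss probability below are assumptions, not figures from the Code.

```python
def bottomry_return(principal, premium_rate, loss_prob):
    """Lender's expected net profit on a bottomry loan.

    If the ship arrives safely, the lender collects principal plus
    premium; if the ship is lost, the loan is forgiven and the
    principal is gone.
    """
    repaid = principal * (1 + premium_rate)        # owed on safe arrival
    expected_receipts = (1 - loss_prob) * repaid   # weighted by survival odds
    return expected_receipts - principal           # net of capital at risk

# At a 30% premium, lending stays profitable in expectation as long
# as fewer than roughly 23% of voyages (1 - 1/1.3) end in shipwreck.
profit = bottomry_return(1000, 0.30, 0.15)         # roughly 105 per 1,000 lent
```

Seen this way, the steep premium of up to 30% priced the interest on the loan and the shipwreck risk in a single number.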

The Athenians, realizing that bottomry was a far better hedge against disaster than relying on the Oracle of Delphi, subsequently developed the idea into a maritime insurance system. They had professional loan syndicates, official inspections of ships and cargoes, and legal sanctions against code violators.

With the first insurance schemes, however, came the first insurance fraud. One of the oldest known cases comes from Athens in the 3rd century B.C. Two men named Hegestratos and Xenothemis obtained bottomry insurance for a shipment of corn from Syracuse to Athens. Halfway through the journey they attempted to sink the ship, only to have their plan foiled by an alert passenger. Hegestratos jumped (or was thrown) from the ship and drowned. Xenothemis was taken to Athens to meet his punishment.

In Christian Europe, insurance was widely frowned upon as a form of gambling—betting against God. Even after Pope Gregory IX decreed in the 13th century that the premiums charged on bottomry loans were not usury, because of the risk involved, the industry grew only slowly. Innovations came mainly in response to catastrophes: The Great Fire of London in 1666 led to the growth of fire insurance, while the Lisbon earthquake of 1755 did the same for life insurance.

It took the Enlightenment to bring widespread changes in the way Europeans thought about insurance. Probability became subject to numbers and statistics rather than hope and prayer. In addition to his contributions to mathematics, astronomy and physics, Edmond Halley (1656-1742), of Halley’s comet fame, developed the foundations of actuarial science—the mathematical measurement of risk. This helped to create a level playing field for sellers and buyers of insurance. By the end of the 18th century, those who abjured insurance were regarded as stupid rather than pious. Adam Smith declared that to do business without it “was a presumptuous contempt of the risk.”
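Actuarial pricing of the kind Halley pioneered reduces, at its simplest, to an expected-value computation. A minimal sketch, with invented numbers rather than figures from Halley's actual mortality tables:

```python
def fair_premium(payout, event_prob, loading=0.0):
    """Actuarially fair premium: the expected payout, plus an
    optional loading for the insurer's costs and profit."""
    return payout * event_prob * (1 + loading)

# A policy paying 100 pounds against an event with a 2% annual chance
# costs 2 pounds in expectation; a 25% loading raises that to 2.50.
premium = fair_premium(100, 0.02, loading=0.25)
```

Measuring `event_prob` from records rather than guessing at it is what put sellers and buyers of insurance on an equal footing.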

But insurance only works if it can be trusted in a crisis. For the modern American insurance industry, the deadly San Francisco earthquake of 1906 was a day of reckoning. The devastation resulted in insured losses of $235 million—equivalent to $6.3 billion today. Many American insurers balked, but in Britain, Lloyd’s of London announced that every one of its customers would have their claims paid in full within 30 days. This prompt action saved countless livelihoods and ensured that business would be able to go on.

And that’s why we pay our premiums: You can’t predict tomorrow, but you can plan for it.

The Invention of Ice Hockey

Canada gave us the modern form of a sport that has been played for centuries around the world

ILLUSTRATION: THOMAS FUCHS

Canadians like to say—and print on mugs and T-shirts—that “Canada is Hockey.” No fewer than five Canadian cities and towns claim to be the birthplace of ice hockey, including Windsor, Nova Scotia, which has an entire museum dedicated to the sport. Canada’s annual Hockey Day, which falls on February 9 this year, features a TV marathon of hockey games. Such is the country’s love for the game that last year’s broadcast was watched by more than 1 in 4 Canadians.

But as with many of humanity’s great advances, no single country or person can take the credit for inventing ice hockey. Stick-and-ball games are as old as civilization itself. The ancient Egyptians were playing a form of field hockey as early as the 21st century B.C., if a mural on a tomb at Beni Hasan, a Middle Kingdom burial site about 120 miles south of Cairo, is anything to go by. The ancient Greeks also played a version of the game, as did the early Christian Ethiopians, the Mesoamerican Teotihuacanos in the Valley of Mexico, and the Daur tribes of Inner Mongolia. And the Scottish and Irish versions of field hockey, known as shinty and hurling respectively, have strong similarities with the modern game.

Taking a ball and stick onto the ice was therefore a fairly obvious innovation, at least in places with snowy winters. The figures may be tiny, but three villagers playing an ice hockey-type game can be seen in the background of Pieter Bruegel the Elder’s 1565 painting “Hunters in the Snow.” There is no such pictorial evidence to show when the Mi’kmaq Indians of Nova Scotia first started hitting a ball on ice, but linguistic clues suggest that their hockey tradition existed before the arrival of European traders in the 16th century. The two cultures then proceeded to influence each other, with the Mi’kmaq becoming the foremost makers of hockey sticks in the 19th century.

The earliest known use of the word hockey appears in a book, “Juvenile Sports and Pastimes,” written by Richard Johnson in London in 1776. Recently, Charles Darwin became an unlikely contributor to ice hockey history after researchers found a letter in which he reminisced about playing the game as a boy in the 1820s: “I used to be very fond of playing Hocky [sic] on the ice in skates.” On January 8, 1864, the future King Edward VII played ice hockey at Windsor Castle while awaiting the birth of his first child.

As for Canada, apart from really liking the game, what has been its real contribution to ice hockey? The answer is that it created the game we know today, from the official rulebook to the size and shape of the rink to the establishment of the Stanley Cup championship in 1894. The first indoor ice hockey game was played in Montreal in 1875, thereby solving the perennial problem of pucks getting lost. (The rink was natural ice, with Canada’s cold winter supplying the refrigeration.) The game involved two teams of nine players, each with a set position—three more than teams field today—a wooden puck, and a list of rules for fouls and scoring.

In addition to being the first properly organized game, the Montreal match also initiated ice hockey’s other famous tradition: brawling on the ice. In this case, the fighting erupted between the players, spectators and skaters who wanted the ice rink back for free skating. Go Canada!

Unenforceable Laws Against Pleasure

The 100th anniversary of Prohibition is a reminder of how hard it is to regulate consumption and display

ILLUSTRATION: THOMAS FUCHS

This month we mark the centennial of the ratification of the Constitution’s 18th Amendment, better known as Prohibition. But the temperance movement was active for over a half-century before winning its great prize. As the novelist Anthony Trollope discovered to his regret while touring North America in 1861-2, Maine had been dry for a decade. The convivial Englishman condemned the ban: “This law, like all sumptuary laws, must fail,” he wrote.

Sumptuary laws had largely fallen into disuse by the 19th century, but they were once a near-universal tool, used in the East and West alike to control economies and preserve social hierarchies. A sumptuary law is a rule that regulates consumption in its broadest sense, from what a person may eat and drink to what they may own, wear or display. The oldest known example, the Locrian Law Code devised by the seventh century B.C. Greek lawgiver Zaleucus, banned all citizens of Locri (except prostitutes) from ostentatious displays of gold jewelry.

Sumptuary laws were often political weapons disguised as moral pieties, aimed at less powerful groups, particularly women. In 215 B.C., at the height of the Second Punic War, the Roman Senate passed the Lex Oppia, which (among other restrictions) banned women from owning more than a half ounce of gold. The law was ostensibly a wartime austerity measure, but 20 years later it appeared so ridiculous as to be unenforceable. During debate on its repeal in 195 B.C., Cato the Elder, its strongest defender, inadvertently revealed the Lex Oppia’s true purpose: “What [these women] want is complete freedom…. Once they have achieved equality, they will be your masters.”

Cato’s message about preserving social hierarchy echoed down the centuries. As trade and economic stability returned to Europe during the High Middle Ages (1000-1300), so did the use of sumptuary laws to keep the new merchant elites in their place. By the 16th century, sumptuary laws in Europe had extended from clothing to almost every aspect of daily life. The more they were circumvented, the more specific such laws became. An edict issued by King Henry VIII of England in 1517, for example, dictated the maximum number of dishes allowed at a meal: nine for a cardinal, seven for the aristocracy and three for the gentry.

The rise of modern capitalism ultimately made sumptuary laws obsolete. Trade turned once-scarce luxuries into mass commodities that simply couldn’t be controlled. Adam Smith’s “The Wealth of Nations” (1776) confirmed what had been obvious for over a century: Consumption and liberty go hand in hand. “It is the highest impertinence,” he wrote, “to pretend to watch over the economy of private people…either by sumptuary laws, or by prohibiting the importation of foreign luxuries.”

Smith’s pragmatic view was echoed by President William Howard Taft. He opposed Prohibition on the grounds that it was coercive rather than consensual, arguing that “experience has shown that a law of this kind, sumptuary in its character, can only be properly enforced in districts in which a majority of the people favor the law.” Mass immigration in early 20th-century America had changed many cities into ethnic melting-pots. Taft recognized Prohibition as an attempt by nativists to impose cultural uniformity on immigrant communities whose attitudes toward alcohol were more permissive. But his warning was ignored, and the disastrous course of Prohibition was set.

The High Cost of Financial Panics

Roman emperors and American presidents alike have struggled to deal with sudden economic crashes

ILLUSTRATION: THOMAS FUCHS

On January 12, 1819, Thomas Jefferson wrote to his friend Nathaniel Macon, “I have…entire confidence in the late and present Presidents…I slumber without fear.” He did concede, though, that market fluctuations can trip up even the best governments. Jefferson was prescient: A few days later, the country plunged into a full-blown financial panic. The trigger was a collapse in the overseas cotton market, but the crisis had been building for months. The factors that led to the crash included the actions of the Second Bank of the United States, which had helped to fuel a real estate boom in the West only to reverse course suddenly and call in its loans.

The recession that followed the panic of 1819 was prolonged and severe: Banks closed, lending all but ceased and businesses failed by the thousands. By the time it was over in 1823, almost a third of the population—including Jefferson himself—had suffered irreversible losses.

As we mark the 200th anniversary of the 1819 panic, it is worth pondering the role of governments in a financial crisis. During a panic in Rome in the year 33, the emperor Tiberius’s prompt action prevented a total collapse of the city’s finances. Rome was caught in a bursting real estate bubble, with property prices falling and credit suddenly scarce. Instead of waiting it out, Tiberius ordered interest rates to be lowered and released 100 million sestertii (large brass coins) into the banking system to avoid a mass default.

But not all government interventions have been as successful or timely. In 1124, King Henry I of England attempted to restore confidence in the country’s money by having the mint-makers publicly castrated and their right hands amputated for producing substandard coins. A temporary fix at best, his bloody act neither deterred people from debasing the coinage nor allayed fears over England’s creditworthiness.

On the other side of the globe, China began using paper money in 1023. Successive emperors of the Ming Dynasty (1368-1644) failed, however, to limit the number of notes in circulation or to back the money with gold or silver specie. By the mid-15th century the economy was in the grip of hyperinflationary cycles. The emperor Yingzong simply gave up on the problem: China returned to coinage just as Europe was discovering the uses of paper.

The rise of commercial paper along with paper currencies allowed European countries to develop more sophisticated banking systems. But they also led to panics, inflation and dangerous speculation—sometimes all at once, as in France in 1720, when John Law’s disastrous Mississippi Company share scheme ended in mass bankruptcies for its investors and the collapse of the French livre.

As it turns out, it is easier to predict the consequences of a crisis than it is to prevent one from happening. In 2015, the U.K.’s Centre for Economic Policy Research published a paper on the effects of 100 financial crises in 20 Western countries over the past 150 years, down to the recession of 2007-09. They found two consistent outcomes. The first is that politics becomes more extreme and polarized following a crisis; the second is that countries become more ungovernable as violence, protests and populist revolts overshadow the rule of law.

With the U.S. stock market having suffered its worst December since the Great Depression of the 1930s, it is worth remembering that the only thing more frightening than a financial crisis can be its aftermath.