The Pleasures and Pains of Retirement

Since ancient Rome, people have debated whether it’s a good idea to stop working in old age

The new film “All Is True,” directed by and starring Kenneth Branagh, imagines how William Shakespeare might have lived after he stopped writing plays. Alas for poor Shakespeare, in this version of his life retirement is hell. By day he potters in his garden, waging a feeble battle against Mother Nature; by night he is equally ineffectual against the verbal onslaughts of his resentful family.

In real life, people have been arguing the case for and against retirement since antiquity. The Roman statesman Marcus Tullius Cicero was thoroughly against the idea. In his essay “Cato the Elder on Old Age,” Cicero argued that the best antidote to old age was a purposeful life. “I am in my eighty-fourth year,” he wrote in the voice of the aged Cato, “yet, as you see, old age has not quite unnerved or shattered me. The senate and the popular assembly never find my vigor wanting.” Cicero lived by the pen—he was one of the greatest orators in history—but he died by the sword, murdered on the orders of Mark Antony for his support of the waning Roman Republic.

Knowing when to exit from public life is a difficult art. The Roman Emperor Diocletian (ca. 245-316) retired to his palace in Split, in modern Croatia, after ruling for 21 years. According to Edward Gibbon, Diocletian was so content for the last six years of his life that when emissaries from Rome tried to persuade him to return, he replied that he couldn’t possibly leave his cabbages.

For most of history, of course, the average person had no choice but to carry on working until they died. But in the 18th century, longer lifespans created a dilemma: The old were outliving their usefulness. Not realizing that he had two more productive decades left, the 60-year-old Voltaire told his friend Madame du Deffand: “I advise you to go on living solely to enrage those who are paying your annuities…. It is the only pleasure I have left.”

By the end of the 19th century, it had become possible for at least some ordinary people to spend their last years in retirement. In the 1880s, the German Chancellor Otto von Bismarck bowed to popular opinion and introduced the world’s first state old-age pensions, initially for workers over 70; the qualifying age was later lowered to 65, which in time became the standard age of retirement.

But some critics argued that this was the thin end of the wedge. If people could be forced out of careers and jobs on the basis of an arbitrary age limit, what else could be done to them? Troubled by what he regarded as ageism, the novelist Anthony Trollope published “The Fixed Period,” a dystopian novel about a society where anyone over the age of 67 is euthanized for his or her own good. The naysayers against any form of government retirement plan held sway in the U.S. until President Franklin Roosevelt signed the Social Security Act in 1935, by which time half of America’s elderly were living in poverty.

Today, the era of leisurely retirements may be passing into history. Whether driven by financial need or personal preference, for many people retirement simply means changing their occupation. According to the AARP, the number of adults working past age 65 has doubled since 1985.

Even the rich and famous aren’t retiring: President George W. Bush is a painter; the Oscar-nominated actor Gene Hackman is a novelist; and Microsoft founder Bill Gates is saving the planet. In this day and age, flush from his success on Broadway, a retired Shakespeare might start his own podcast.

The Ancient Origins of the Vacation

Once a privilege of the elite, summer travel is now a pleasure millions can enjoy.

Finally, Americans are giving themselves a break. For years, according to the U.S. Travel Association, more than half of American workers didn’t use all their paid vacation days. But in a survey released in May by Discover, 71% of respondents said they were planning a summer vacation this year, up from 58% last year—meaning a real getaway, not just a day or two to catch up on chores or take the family to an amusement park.

The importance of vacations for health and happiness has been accepted for thousands of years. The ancient Greeks probably didn’t invent the vacation, but they perfected the idea of the tourist destination by providing quality amenities at festivals, religious sites and thermal springs. A cultured person went places. According to the “Crito,” one of Plato’s dialogues, Socrates’ stay-at-home mentality made him an exception: “You never made any other journey, as other people do, and you had no wish to know any other city.”

The Romans took a different approach. Instead of touring foreign cities, the wealthy preferred to vacation together in resort towns such as Pompeii, where they built ostentatious villas featuring grand areas for entertaining. The Emperor Nero was relaxing at his beach palace at Antium, modern Anzio, when the Great Fire of Rome broke out in the year 64.

The closest thing to a vacation that medieval Europeans could enjoy was undertaking pilgrimages to holy sites. Santiago de Compostela in northern Spain, where St. James was believed to be buried, was a favorite destination, second only to Rome in popularity. As Geoffrey Chaucer’s bawdy “Canterbury Tales” shows, a pilgrimage provided all sorts of opportunities for mingling and carousing, not unlike a modern cruise ship.

The vacation went upmarket in the late 17th century, as European aristocrats rediscovered the classical idea of tourism for pleasure. Broadening one’s horizons via the Grand Tour—a sightseeing trip through the major classical and Renaissance sites of France and Italy—became de rigueur for any gentleman. The spa town, too, enjoyed a spectacular revival. The sick and infertile would gather to “take the cure,” bathing in or drinking from a thermal spring. Bath in England became as renowned for its party scene as for its waters. Jane Austen, a frequent visitor to the city, set two of her novels there. The U.S. wasn’t far behind, with Saratoga Springs, N.Y., known as the “Queen of Spas,” becoming a popular resort in 1815.

But where was the average American to go? Resorts, with their boardwalks, grand hotels and amusement arcades, were expensive. In any case, the idea of vacations for pure pleasure sat uneasily with most religious leaders. The answer lay in the great outdoors, which were deemed to be good for the health and improving to the mind.

Cheap rail travel and popular guidebooks such as William H.H. Murray’s “Adventures in the Wilderness; or, Camp-Life in the Adirondacks” helped to entice the middle classes out of the city. The American Chautauqua movement, originally a New York-based initiative aimed at improving adult education, further served to democratize the summer vacation by providing cheap accommodation and wholesome entertainment for families.

The summer vacation was ready to become a national tradition. In 1910, President William Howard Taft even proposed to Congress that all workers should be entitled to two to three months of paid vacation. But the plan stalled, and it was left to Europe to lead the way: France guaranteed paid vacations by law in 1936. Since then, most developed countries have recognized vacation time as a legal right. It’s not too late, America, to try again.

Beloved Buildings That Rose from the Ashes

From ancient Rome to modern London, great structures like Notre Dame have fallen and been built again

A disaster like the Notre Dame cathedral fire is as much a tragedy of the heart as it is a loss to architecture. But fortunately, unlike most love affairs, a building can be resurrected. In fact, throughout history communities have gone to remarkable lengths to rebuild monuments of sacred or national importance.

There is no shortage of inspirational examples of beloved buildings that have risen from the ashes. The Second Temple was built in Jerusalem in 515 B.C. following the destruction of the First by King Nebuchadnezzar II of Babylonia in 586 B.C.; Dresden’s Baroque Frauenkirche was faithfully rebuilt in 2005, after being destroyed by bombs in 1945.

Often the new structures are exact replicas, as with Venice’s and Barcelona’s opera houses, La Fenice and the Gran Teatre del Liceu, both of which were rebuilt after suffering devastating fires in the 1990s. If France decides to rebuild Notre Dame according to the principle “as it was, where it was,” the skill and technology aren’t lacking.

In other cases, however, disasters have allowed for beloved landmarks to be reimagined. The Great Fire of Rome in 64 A.D. led to a revolution in architectural styles and techniques. After Hagia Sophia cathedral was torched during riots in Constantinople in 532, the Byzantine Emperor Justinian asked his architects Anthemius and Isidore to build something bold and impressive. It was risky to change such a renowned symbol of the Eastern Roman Empire; moreover, for security and financial reasons, the work had to be completed in just six years. Still, the result dazzled Justinian, who exclaimed when he saw it, “Solomon, I have outdone thee.” Almost a thousand years later, following Constantinople’s fall to the Turks in 1453, Sultan Mehmet II had a similar reaction and ordered Hagia Sophia to be turned into a mosque rather than destroyed.

Sir Christopher Wren, who rebuilt St. Paul’s Cathedral in London, was not so lucky in the reactions to his creation. The Great Fire of 1666 had left the medieval church in ruins, but there was little appetite for an innovative reconstruction and no money in the Treasury to pay for it. Wren endured setbacks at every turn, including a chronic shortage of stone. At one point, Parliament suspended half his salary in protest at the slowness of the work, which took almost three decades and spanned the reigns of five monarchs.

The reason for the delay became clear after the finished building was revealed to the public. Inspired by drawings of Hagia Sophia, Wren had ignored the approved design for a traditional reconstruction and quietly opted for a more experimental approach. Ironically, many of his contemporaries were appalled by the now iconic great dome, especially the Protestant clergy, who deemed it too foreign and Catholic-looking. Yet Wren’s vision has endured. During the German bombing of London in World War II, St. Paul’s was the one building that Winston Churchill declared must be saved at all costs.

It is never easy deciding how to draw the line between history and modernity, particularly when dealing with the loss of an architectural masterpiece. There isn’t always a right answer, but it may help to remember Churchill’s words: “We shape our buildings and afterwards our buildings shape us.”

Real-Life Games of Thrones

From King Solomon to the Shah of Persia, rulers have used stately seats to project power.

ILLUSTRATION: THOMAS FUCHS

“Uneasy lies the head that wears a crown,” complains the long-suffering King Henry IV in Shakespeare. But that is not a monarch’s only problem; uneasy, too, is the bottom that sits on a throne, for thrones are often dangerous places to be. That is why the image of a throne made of swords, in the HBO hit show “Game of Thrones” (which last week began its eighth and final season), has served as such an apt visual metaphor. It strikingly symbolizes the endless cycle of violence between the rivals for the Iron Throne, the seat of power in the show’s continent of Westeros.

In real history, too, virtually every state once put its leader on a throne. The English word comes from the ancient Greek “thronos,” meaning “stately seat,” but the thing itself is much older. Archaeologists working at the 4th millennium B.C. site of Arslantepe, in eastern Turkey, where a pre-Hittite Early Bronze Age civilization flourished, recently found evidence of what is believed to be the world’s oldest throne. It seems to have consisted of a raised bench which enabled the ruler to display his or her elevated status by literally sitting above all visitors to the palace.

Thrones were also associated with divine power: The famous 18th-century B.C. basalt stele inscribed with the law code of King Hammurabi of Babylon, which can be seen at the Louvre, depicts the king receiving the laws directly from the sun god Shamash, who is seated on a throne.

Naturally, because they were invested with so much religious and political symbolism, thrones often became a prime target in war. According to Jewish legend, King Solomon’s spectacular gold and ivory throne was stolen first by the Egyptians, who then lost it to the Assyrians, who subsequently gave it up to the Persians, whereupon it became lost forever.

In India, King Solomon’s throne was reimagined in the early 17th century by the Mughal Emperor Shah Jahan as the jewel-and-gold-encrusted Peacock Throne, featuring the 186-carat Koh-i-Noor diamond. (Shah Jahan also commissioned the Taj Mahal.) This throne also came to an unfortunate end: It was carried off to Persia when Nader Shah sacked Delhi in 1739. A mere eight years later, Nader Shah was assassinated by his own bodyguards and the Peacock Throne was destroyed, its valuable decorations stolen.

Perhaps the moral of the story is to keep things simple. Around 1300, King Edward I of England commissioned a coronation throne made of oak. For the past 700 years it has supported the heads and backsides of 38 British monarchs during the coronation ceremony at Westminster Abbey. No harm has ever come to it, save for the pranks of a few very naughty choir boys, one of whom carved on the back of the throne: “P. Abbott slept in this chair 5-6 July 1800.”

Overcoming the Labor of Calculation

Inventors tried many times over the centuries to build an effective portable calculator—but no one succeeded until Jerry Merryman.

ILLUSTRATION: THOMAS FUCHS

The world owes a debt of gratitude to Jerry Merryman, who died on Feb. 27 at the age of 86. It was Merryman who, in 1965, worked with two other engineers at Texas Instruments to invent the world’s first pocket calculator. Today we all carry powerful calculators on our smartphones, but Merryman was the first to solve a problem that had plagued mathematicians at least since Gottfried Leibniz, one of the creators of calculus in the 17th century, who observed: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could safely be relegated to anyone else if machines were used.”

In ancient times, the only alternative to mental arithmetic was the abacus, with its simple square frame, wooden rods and movable beads. Most civilizations, including the Romans, Chinese, Indians and Aztecs, had their own version—from counting boards for keeping track of sums to more sophisticated designs that could calculate square roots.

We know of just one counting machine from antiquity that was more complex. The 2nd-century B.C. Antikythera Mechanism, discovered in 1901 in the remains of an ancient Greek shipwreck, was a 30-geared bronze calculator that is believed to have been used to track the movement of the heavens. Despite its ingenious construction, the mechanism had a limited range of capabilities: It could calculate dates and constellations and little else.

In the 15th century, Leonardo da Vinci drew up designs for a mechanical calculating device that consisted of 13 “digit-registering” wheels. In 1967, IBM commissioned the Italian scholar and engineer Roberto Guatelli to build a replica based on the sketches. Guatelli believed that Leonardo had invented the first calculator, but other experts disagreed. In any case, it turned out that the metal wheels would have generated so much friction that the frame would probably have caught fire.

Technological advances in clockmaking helped the French mathematician and inventor Blaise Pascal to build the first working mechanical calculator, called the Pascaline, in 1644. It wasn’t a fire hazard, and it could add and subtract. But the Pascaline was very expensive to make, fragile in the extreme, and far too limited to be really useful. Pascal made only about 50 in his lifetime, of which fewer than 10 survive today.

Perhaps the greatest calculator never to see the light of day was designed by Charles Babbage in the early 19th century. He actually designed two machines—the Difference Engine, which could perform arithmetic, and the Analytical Engine, which was theoretically capable of a range of functions from direct multiplication to parallel processing. Ada, Countess of Lovelace, the daughter of the poet Lord Byron, made important contributions to the development of Babbage’s Analytical Engine. According to the historian Walter Isaacson, Lovelace realized that any process based on logical symbols could be used to represent entities other than quantity—the same principle used in modern computing.

Unfortunately, despite many years and thousands of pounds of government funding, Babbage only ever managed to build a small prototype of the Difference Engine. For most of the 20th century, the dream of a portable calculator remained just that, and the go-to instrument for doing large sums was still the slide rule. It took Merryman’s invention to allow us all to become “excellent men” and leave the labor of calculation to the machines.

Fantasies of Alien Life

Human beings have never encountered extra-terrestrials, but we’ve been imagining them for thousands of years

Fifty years ago this month, Kurt Vonnegut published “Slaughterhouse-Five,” his classic semi-autobiographical, quasi-science fiction novel about World War II and its aftermath. The story follows the adventures of Billy Pilgrim, an American soldier who survives the bombing of Dresden in 1945, only to be abducted by aliens from the planet Tralfamadore and exhibited in their zoo. Vonnegut’s absurd-looking Tralfamadorians (they resemble green toilet plungers) are essentially vehicles for his meditations on the purpose of life.

Some readers may dismiss science fiction as mere genre writing. But the idea that there may be life on other planets has engaged many of history’s greatest thinkers, starting with the ancient Greeks. On the pro-alien side were the Pythagoreans, a fifth-century B.C. sect, which argued that life must exist on the moon; in the third century B.C., the Epicureans believed that there was an infinite number of life-supporting worlds. But Plato, Aristotle and the Stoics argued the opposite. In “On the Heavens,” Aristotle specifically rejected the possibility that other worlds might exist, on the grounds that the Earth is at the center of a perfect and finite universe.

The Catholic Church sided with Plato and Aristotle: If there was only one God, there could be only one world. But in Asia, early Buddhism encouraged philosophical explorations into the idea of multiverses and parallel worlds. Buddhist influence can be seen in the 10th-century Japanese romance “The Bamboo Cutter,” whose story of a marooned moon princess and a lovelorn emperor was so popular in its time that it is mentioned in Murasaki Shikibu’s seminal novel, “The Tale of Genji.”

During the Renaissance, Copernicus’s heliocentric theory, advanced in his book “On the Revolutions of the Celestial Spheres” (1543), and Galileo’s telescopic observations of the heavens in 1610 proved that the Church’s traditional picture of the cosmos was wrong. The discovery prompted Western thinkers to imagine the possibility of alien civilizations. From Johannes Kepler to Voltaire, imagining life on the moon (or elsewhere) became a popular pastime among advanced thinkers. In “Paradise Lost” (1667), the poet John Milton wondered “if Land be there,/Fields and Inhabitants.”

Such benign musings about extraterrestrial life didn’t survive the impact of industrialization, colonialism and evolutionary theory. In the 19th century, debates over whether aliens have souls morphed into fears about humans becoming their favorite snack food. This particular strain of paranoia reached its apogee in the alien-invasion novel “The War of the Worlds,” published in 1897 by the British writer H.G. Wells. Wells’s downbeat message—that contact with aliens would lead to a Darwinian fight for survival—resonated throughout the 20th century.

And it isn’t just science fiction writers who ponder “what if.” The physicist Stephen Hawking once compared an encounter with aliens to Christopher Columbus landing in America, “which didn’t turn out very well for the Native Americans.” More hopeful visions—such as Steven Spielberg’s 1982 film “E.T. the Extra-Terrestrial,” about a lovable alien who wants to get back home—have been exceptions to the rule.

The real mystery about aliens is the one described by the so-called “Fermi paradox.” The 20th-century physicist Enrico Fermi observed that, given the number of stars in the universe, it is highly probable that alien life exists. So why haven’t we seen it yet? As Fermi asked, “Where is everybody?”

Insuring Against Disasters

Insurance policies go back to the ancient Babylonians and were crucial in the early development of capitalism

ILLUSTRATION: THOMAS FUCHS

Living in a world without insurance, free from all those claim forms and high deductibles, might sound like a little bit of paradise. But the only thing worse than dealing with the insurance industry is trying to conduct business without it. In fact, the basic principle of insurance—pooling risk in order to minimize liability from unforeseen dangers—is one of the things that made modern capitalism possible.

The first merchants to tackle the problem of risk management in a systematic way were the Babylonians. The 18th-century B.C. Code of Hammurabi shows that they used a primitive form of insurance known as “bottomry.” According to the Code, merchants who took high-interest loans tied to shipments of goods could have the loans forgiven if the ship was lost. The practice benefited both traders and their creditors, who charged a premium of up to 30% on such loans.

The Athenians, realizing that bottomry was a far better hedge against disaster than relying on the Oracle of Delphi, subsequently developed the idea into a maritime insurance system. They had professional loan syndicates, official inspections of ships and cargoes, and legal sanctions against code violators.

With the first insurance schemes, however, came the first insurance fraud. One of the oldest known cases comes from Athens in the 4th century B.C. Two men named Hegestratos and Zenothemis obtained bottomry insurance for a shipment of grain from Syracuse to Athens. Halfway through the journey they attempted to sink the ship, only to have their plan foiled by an alert passenger. Hegestratos jumped (or was thrown) from the ship and drowned. Zenothemis was taken to Athens to meet his punishment.

In Christian Europe, insurance was widely frowned upon as a form of gambling—betting against God. Even after Pope Gregory IX decreed in the 13th century that the premiums charged on bottomry loans were not usury, because of the risk involved, the industry rarely expanded. Innovations came mainly in response to catastrophes: The Great Fire of London in 1666 led to the growth of fire insurance, while the Lisbon earthquake of 1755 did the same for life insurance.

It took the Enlightenment to bring widespread changes in the way Europeans thought about insurance. Probability became subject to numbers and statistics rather than hope and prayer. In addition to his contributions to mathematics, astronomy and physics, Edmond Halley (1656-1742), of Halley’s comet fame, developed the foundations of actuarial science—the mathematical measurement of risk. This helped to create a level playing field for sellers and buyers of insurance. By the end of the 18th century, those who abjured insurance were regarded as stupid rather than pious. Adam Smith declared that to do business without it “was a presumptuous contempt of the risk.”

But insurance only works if it can be trusted in a crisis. For the modern American insurance industry, the deadly San Francisco earthquake of 1906 was a day of reckoning. The devastation resulted in insured losses of $235 million—equivalent to $6.3 billion today. Many American insurers balked, but in Britain, Lloyd’s of London announced that every one of its customers would have their claims paid in full within 30 days. This prompt action saved livelihoods and ensured that business would be able to go on.

And that’s why we pay our premiums: You can’t predict tomorrow, but you can plan for it.

The Invention of Ice Hockey

Canada gave us the modern form of a sport that has been played for centuries around the world

ILLUSTRATION: THOMAS FUCHS

Canadians like to say—and print on mugs and T-shirts—that “Canada is Hockey.” No fewer than five Canadian cities and towns claim to be the birthplace of ice hockey, including Windsor, Nova Scotia, which has an entire museum dedicated to the sport. Canada’s annual Hockey Day, which falls on February 9 this year, features a TV marathon of hockey games. Such is the country’s love for the game that last year’s broadcast was watched by more than 1 in 4 Canadians.

But as with many of humanity’s great advances, no single country or person can take the credit for inventing ice hockey. Stick-and-ball games are as old as civilization itself. The ancient Egyptians were playing a form of field hockey as early as the 21st century B.C., if a mural on a tomb at Beni Hasan, a Middle Kingdom burial site about 120 miles south of Cairo, is anything to go by. The ancient Greeks also played a version of the game, as did the early Christian Ethiopians, the Mesoamerican Teotihuacanos in the Valley of Mexico, and the Daur tribes of Inner Mongolia. And the Scottish and Irish versions of field hockey, known as shinty and hurling respectively, have strong similarities with the modern game.

Taking a ball and stick onto the ice was therefore a fairly obvious innovation, at least in places with snowy winters. The figures may be tiny, but three villagers playing an ice hockey-type game can be seen in the background of Pieter Bruegel the Elder’s 1565 painting “Hunters in the Snow.” There is no such pictorial evidence to show when the Mi’kmaq Indians of Nova Scotia first started hitting a ball on ice, but linguistic clues suggest that their hockey tradition existed before the arrival of European traders in the 16th century. The two cultures then proceeded to influence each other, with the Mi’kmaqs becoming the foremost makers of hockey sticks in the 19th century.

The earliest known use of the word hockey appears in a book, “Juvenile Sports and Pastimes,” written by Richard Johnson in London in 1776. Recently, Charles Darwin became an unlikely contributor to ice hockey history after researchers found a letter in which he reminisced about playing the game as a boy in the 1820s: “I used to be very fond of playing Hocky [sic] on the ice in skates.” On January 8, 1864, the future King Edward VII played ice hockey at Windsor Castle while awaiting the birth of his first child.

As for Canada, apart from really liking the game, what has been its real contribution to ice hockey? The answer is that it created the game we know today, from the official rulebook to the size and shape of the rink to the establishment of the Stanley Cup championship in 1894. The first indoor ice hockey game was played in Montreal in 1875, thereby solving the perennial problem of pucks getting lost. (The rink was natural ice, with Canada’s cold winter supplying the refrigeration.) The game involved two teams of nine players, each with a set position—three more than teams field today—a wooden puck, and a list of rules for fouls and scoring.

In addition to being the first properly organized game, the Montreal match also initiated ice hockey’s other famous tradition: brawling on the ice. In this case, the fighting erupted between the players, spectators and skaters who wanted the ice rink back for free skating. Go Canada!

The High Cost of Financial Panics

Roman emperors and American presidents alike have struggled to deal with sudden economic crashes

ILLUSTRATION: THOMAS FUCHS

On January 12, 1819, Thomas Jefferson wrote to his friend Nathaniel Macon, “I have…entire confidence in the late and present Presidents…I slumber without fear.” He did concede, though, that market fluctuations can trip up even the best governments. Jefferson was prescient: A few days later, the country plunged into a full-blown financial panic. The trigger was a collapse in the overseas cotton market, but the crisis had been building for months. The factors that led to the crash included the actions of the Second Bank of the United States, which had helped to fuel a real estate boom in the West only to reverse course suddenly and call in its loans.

The recession that followed the panic of 1819 was prolonged and severe: Banks closed, lending all but ceased and businesses failed by the thousands. By the time it was over in 1823, almost a third of the population—including Jefferson himself—had suffered irreversible losses.

As we mark the 200th anniversary of the 1819 panic, it is worth pondering the role of governments in a financial crisis. During a panic in Rome in the year 33, the emperor Tiberius’s prompt action prevented a total collapse of the city’s finances. Rome was caught in a bursting real estate bubble, with property prices falling and credit suddenly drying up. Instead of waiting it out, Tiberius ordered interest rates to be lowered and released 100 million sesterces (large brass coins) into the banking system to avoid a mass default.

But not all government interventions have been as successful or timely. In 1124, King Henry I of England attempted to restore confidence in the country’s money by having his moneyers publicly castrated and their right hands amputated for producing substandard coins. A temporary fix at best, his bloody act neither deterred people from debasing the coinage nor allayed fears over England’s creditworthiness.

On the other side of the globe, China began using paper money in 1023. Successive emperors of the Ming Dynasty (1368-1644) failed, however, to limit the number of notes in circulation or to back the money with gold or silver specie. By the mid-15th century the economy was in the grip of hyperinflationary cycles. The emperor Yingzong simply gave up on the problem: China returned to coinage just as Europe was discovering the uses of paper.

The rise of commercial paper along with paper currencies allowed European countries to develop more sophisticated banking systems. But they also led to panics, inflation and dangerous speculation—sometimes all at once, as in France in 1720, when John Law’s disastrous Mississippi Company share scheme ended in mass bankruptcies for its investors and the collapse of the French livre.

As it turns out, it is easier to predict the consequences of a crisis than it is to prevent one from happening. In 2015, the U.K.’s Centre for Economic Policy Research published a paper on the effects of 100 financial crises in 20 Western countries over the past 150 years, down to the recession of 2007-09. Its authors found two consistent outcomes. The first is that politics becomes more extreme and polarized following a crisis; the second is that countries become more ungovernable as violence, protests and populist revolts overshadow the rule of law.

With the U.S. stock market having suffered its worst December since the Great Depression of the 1930s, it is worth remembering that the only thing more frightening than a financial crisis can be its aftermath.

At Age 50, a Time of Second Acts

Amanda Foreman finds comfort in countless examples of the power of reinvention after five decades.

ILLUSTRATION: TONY RODRIGUEZ

I turned 50 this week, and like many people I experienced a full-blown midlife crisis in the lead-up to the Big Day. The famous F. Scott Fitzgerald quotation, “There are no second acts in American lives,” dominated my thoughts. I wondered, now that my first act was over, whether my life would no longer be about opportunities and would instead consist largely of consequences.

Fitzgerald, who left the line among his notes for “The Last Tycoon,” had ample reason for pessimism. He had hoped the novel would lead to his own second act after failing to make it in Hollywood, but he died at 44, broken and disappointed, leaving the book unfinished. Yet the truth about his grim line is more complicated. Several years earlier, Fitzgerald had used it to make an almost opposite point, in the essay “My Lost City”: “I once thought that there were no second acts in American lives, but there was certainly to be a second act to New York’s boom days.”

The one comfort we should take from countless examples in history is the power of reinvention. The Victorian poet William Ernest Henley was right when he wrote, “I am the master of my fate/ I am the captain of my soul.”

The point is to seize the moment. The disabled Roman Emperor Claudius (10 B.C.-A.D. 54) spent most of his life being victimized by his awful family. Claudius was 50 when his nephew, Caligula, met his end at the hands of some of his own household security, the Praetorian Guards. The historian Suetonius writes that a soldier discovered Claudius, who had tried to hide, trembling in the palace. The guards decided to make Claudius their puppet emperor. It was a grave miscalculation. Claudius grabbed his chance, shed his bumbling persona and became a forceful and innovative ruler of Rome.

In Russia many centuries later, the general Mikhail Kutuzov was in his 60s when his moment came. In 1805, Czar Alexander I had unfairly blamed Kutuzov for the army’s defeat at the Battle of Austerlitz and relegated him to desk duties. Russian society cruelly treated the general, who looked far from heroic—a character in Tolstoy’s “War and Peace” notes the corpulent Kutuzov’s war scars, especially his “bleached eyeball.” But when the country needed a savior in 1812, Kutuzov, the “has-been,” drove Napoleon and his Grande Armée out of Russia.

Winston Churchill had a similar apotheosis in World War II when he was in his 60s. Until then, his political career had been a catalog of failures, the most famous being the Gallipoli Campaign of 1915-16 that left Britain and its allies with more than 100,000 casualties.

As for writers and artists, they often find middle age extremely liberating. They cease being afraid to take risks in life. Another Fitzgerald—the Man Booker Prize-winning novelist Penelope—lived on the brink of homelessness, struggling as a tutor and teacher (she later recalled “the stuffy and inky boredom of the classroom”) until she published her first book at 58.

Anna Mary Robertson Moses, better known as Grandma Moses, may be the greatest example of self-reinvention. After many decades of farm life, around age 75 she began a new career, becoming one of America’s best known folk painters.

Perhaps I’ll be inspired to master Greek when I am 80, as some say the Roman statesman Cato the Elder did. But what I’ve learned, while coming to terms with turning 50, is that time spent worrying about “what you might have been” is better passed with friends and family—celebrating the here and now.