Historically Speaking: Our Fraught Love Affair With Cannabis

Ban it? Tax it? Humans have been hounded by these questions for millennia.

The Wall Street Journal

January 19, 2024

Ohio’s new marijuana law marks a watershed moment in the decriminalization of cannabis: more than half of Americans now live in places where recreational marijuana is legal. It is a profound shift, but only the latest twist in the long and winding saga of society’s relationship with pot.

Humans first domesticated cannabis sativa around 12,000 years ago in Central and East Asia as hemp, mostly for rope and other textiles. Later, some adventurous forebears found more interesting uses. In 2008, archaeologists in northwestern China discovered almost 800 grams of dried cannabis containing high levels of THC, the psychoactive ingredient in marijuana, among the burial items of a seventh century B.C. shaman.

The Greeks and Romans used cannabis for hemp, medicine and possibly religious purposes, but the plant was never as pervasive in the classical world as it was in ancient India. Cannabis indica, the sacred plant of the god Shiva, was revered for its ability to relieve physical suffering and bring spiritual enlightenment to the holy.

Cannabis gradually spread across the Middle East in the form of hashish, which is smoked or eaten. The first drug laws were enacted by Islamic rulers who feared their subjects wanted to do little else but consume it. The Mamluk sultan al-Zahir Baybars banned hashish cultivation and consumption in Egypt in 1266. When that failed, a successor tried taxing hashish instead in 1279. This filled local coffers, but consumption levels soared and the ban was restored.

The march of cannabis continued unabated across the old and new worlds, apparently reaching Stratford-upon-Avon by the 16th century. Fragments of some 400-year-old tobacco pipes excavated from Shakespeare’s garden were found to contain cannabis residue. If not the Bard, at least someone in the household was having a good time.

By the 1600s, American colonies were cultivating hemp for the shipping trade, using its fibers for rigging and sails. George Washington and Thomas Jefferson grew cannabis on their Virginia plantations, seemingly unaware of its intoxicating properties.

Veterans of Napoleon’s Egypt campaign brought hashish to France in the early 1800s, where efforts to ban the habit may have enhanced its popularity. Members of the Club des Hashischins, which included Charles Baudelaire, Honoré de Balzac, Alexandre Dumas and Victor Hugo, would meet to compare notes on their respective highs.


Although Queen Victoria’s own physician advocated using cannabis to relieve childbirth and menstrual pains, British lawmakers swung back and forth over whether to tax or ban its cultivation in India.

In the U.S., however, Americans lumped cannabis with the opiate epidemic that followed the Civil War. Early 20th-century politicians further stigmatized the drug by associating it with Black people and Latino immigrants. Congress outlawed nonmedicinal cannabis in 1937, a year after the movie “Reefer Madness” portrayed pot as a corrupting influence on white teenagers.

American views of cannabis have changed since President Nixon declared an all-out War on Drugs more than 50 years ago, yet federal law still classifies the drug alongside heroin. As lawmakers struggle to catch up with the zeitgeist, two things remain certain: Governments are often out of touch with their citizens, and what people want isn’t always what’s good for them.

Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forward. Greek artists used them to indicate a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To give his Santa a far-away but still benign look, he gave him a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: When Generals Run the State

Military leaders have been rulers since ancient times, but the U.S. has managed to keep them from becoming kings or dictators.

The Wall Street Journal

April 29, 2022

History has been kind to General Ulysses S. Grant, less so to President Grant. The hero of Appomattox, born 200 years ago this month, oversaw an administration beset by scandal. In his farewell address to Congress in 1876, Grant insisted lamely that his “failures have been errors of judgment, not of intent.”

Yet Grant’s presidency could as well be remembered for confirming the strength of American democracy at a perilous time. Emerging from the trauma of the Civil War, Americans sent a former general to the White House without fear of precipitating a military dictatorship. As with the separation of church and state, civilian control of the military is one of democracy’s hard-won successes.

In ancient times, the earliest kings were generals by definition. The Sumerian word for leader was “Lugal,” meaning “Big Man.” Initially, a Lugal was a temporary leader of a city-state during wartime. But by the 24th century B.C., Lugal had become synonymous with governor. The title wasn’t enough for Sargon the Great, c. 2334-2279 B.C., who called himself “Sharrukin,” or “True King,” in celebration of his subjugation of all Sumer’s city-states. Sargon’s empire lasted for three more generations.

In subsequent ancient societies, military and political power intertwined. The Athenians elected their generals, who could also be political leaders, as was the case for Pericles. Sparta was the opposite: The top Spartan generals inherited their positions. The Greek philosopher Aristotle described the Spartan monarchy—shared by two kings from two royal families—as a “kind of unlimited and perpetual generalship,” subject to some civic oversight by a 30-member council of elders.


By contrast, ancient Rome was first a traditional monarchy whose kings were expected to fight with their armies, then a republic that prohibited actively serving generals from bringing their armies back from newly conquered territories into Italy, and finally a militarized autocracy led by a succession of generals-cum-emperors.

In later periods, boundaries between civil and military leadership blurred in much of the world. At the most extreme end, Japan’s warlords seized power in 1192, establishing the Shogunate, essentially a military dictatorship, and reducing the emperor to a mere figurehead until the Meiji Restoration in 1868. Napoleon trod a well-worn route in his trajectory from general to first consul, to first consul for life and finally emperor.

After defeating the British, General George Washington might have gone on to govern the new American republic in the manner of Rome’s Julius Caesar or England’s Oliver Cromwell. Instead, Washington chose to govern as a civilian and step down at the end of two terms, ensuring the transition to a new administration without military intervention. Astonished that a man would cling to his ideals rather than to power, King George III declared that if Washington stayed true to his word, “he will be the greatest man in the world.”

The trust Americans have in their army is reflected in the tally of 12 former generals who have been U.S. presidents, from George Washington to Dwight D. Eisenhower. President Grant may not have fulfilled the hopes of the people, but he kept the promise of the republic.

Historically Speaking: The Game of Queens and Grandmasters

Chess has captivated minds for 1,500 years, surviving religious condemnation, Napoleonic exile and even the Russian Revolution

The Wall Street Journal

April 15, 2022

Fifty years ago, the American chess grandmaster Bobby Fischer played the reigning world champion Boris Spassky at the “Match of the Century” in Reykjavik, Iceland. The Cold War was at its height, and the Soviets had held the title since 1948. More was riding on the competition than just the prize money.


The press portrayed the Fischer-Spassky match as a duel between the East and the West. But for the West to win, Fischer had to play, and the temperamental chess genius wouldn’t agree to the terms. He boycotted the opening ceremony on July 1, 1972, prompting then-National Security Adviser Henry Kissinger to call Fischer and tell him that it was his patriotic duty to go out there and play. Fischer relented—and won, using the Queen’s Gambit in game 9 (a move made famous by the Netflix series about a fictional woman chess player).

The Fischer-Spassky match reignited global enthusiasm for a 1,500-year-old game. From its probable origins in India around the 6th century, the basic idea of chess spread rapidly across Asia, the Middle East and Europe. Religious authorities initially condemned the game; even so, the ability to play became an indispensable part of courtly culture.

Chess was a slow-moving game until the 1470s, when new rules were introduced that made it faster and more aggressive. The most important changes were greater mobility for the bishops and the transformation of the queen into the most powerful piece on the board. The instigator remains unknown, although the tradition seems to have started in Spain, inspired, perhaps, by Queen Isabella, who ruled jointly with King Ferdinand.

The game captivated some of the greatest minds of the Renaissance. Around 1500, the Italian mathematician Luca Pacioli, known as the Father of Accounting, analyzed more than 100 plays and strategies in “De ludo schaccorum” (On the Game of Chess). The hand of Leonardo da Vinci has been detected in some of the illustrations in the only known copy of the book.

Although called the “game of kings,” chess was equally popular with generals. But, as a frustrated Napoleon discovered, triumph on the battlefield was no guarantee of success on the board. Nevertheless, during his exile on St. Helena, Napoleon played so often that one of the more enterprising escape attempts by his supporters involved hidden plans inside an ivory chess set.


London’s Great Exhibition of 1851 inspired the British chess master Howard Staunton, who gave his name to the first standardized chess pieces, to organize the first international chess tournament. Travel delays meant that none of the great Russian players were able to participate, despite the country’s enthusiasm for the game. In a letter to his wife, Russia’s greatest poet Alexander Pushkin declared, “[Chess] is a must for any well-organized family.” It was one of the few bourgeois pastimes to survive the Revolution unscathed.

The Russians regained the world title after the 1972 Fischer-Spassky match. However, in the 1990s they faced a new challenger that wasn’t a country but a computer. Grandmaster Garry Kasparov easily defeated IBM’s Deep Blue in 1996, only to suffer a shock defeat in 1997. Mr. Kasparov even questioned whether the opposition had played fair. Six years later, he agreed to a showdown at The FIDE Man Versus Machine World Chess Championship, against the new and improved Deep Junior. The 2003 match was a draw, leaving chess the winner.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the War. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th-century Russian painting

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized, the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: The Winning Ways of Moving the Troops

Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare

The Wall Street Journal

May 6, 2021

The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.

How troops move into battle positions is hardly the stuff of poetry. Homer’s “The Iliad” begins with the Greeks already in the 10th year of their siege of Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.

The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.


The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Vandals. Their Central European horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.

Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.

Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.

The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.

The Prussians referred to this kind of troop-maneuvering warfare as bewegungskrieg, war of movement, using it to crushing effect over the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.

The Germans started World War II with their production capacity lagging that of the Allies; they compensated by updating bewegungskrieg to what became known as blitzkrieg, or lightning war, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.

In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)

Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.

Historically Speaking: Iron Curtains Are Older Than the Cold War

Winston Churchill made the term famous, but ideological rivalries have driven geopolitics since Athens and Sparta.

The Wall Street Journal

February 25, 2021

It was an unseasonably springlike day on March 5, 1946, when Winston Churchill visited Fulton, Missouri. The former British Prime Minister was ostensibly there to receive an honorary degree from Westminster College. But Churchill’s real purpose in coming was to urge the U.S. to form an alliance with Britain to keep the Soviet Union from expanding any further. Speaking before an august audience that included President Harry S. Truman, Churchill declared: “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent.”


Churchill wasn’t the first person to employ the phrase “iron curtain” as a political metaphor. Originally a theatrical term for the safety barrier between the stage and the audience, by the early 20th century it was being used to mean a barrier between opposing powers. Nevertheless, “iron curtain” became indelibly associated with Churchill and with the defense of freedom and democracy.

This was a modern expression of an idea first articulated by the ancient Greeks: that political beliefs are worth defending. In the winter of 431-30 B.C., the Athenians were staggering under a devastating plague while simultaneously fighting Sparta in the Peloponnesian War. The stakes couldn’t have been higher when the great statesman Pericles used a speech commemorating the war dead to define the struggle in terms that every Athenian would understand.

As reported by the historian Thucydides, Pericles told his compatriots that the fight wasn’t for more land, trade or treasure; it was for democracy pure and simple. Athens was special because its government existed “for the many instead of the few,” guaranteeing “equal justice to all.” No other regime, and certainly not the Spartans, could make the same claim.

Pericles died the following year, and Athens eventually went down in defeat in 404 B.C. But the idea that fighting for one’s country meant defending a political ideal continued to be influential. According to the 2nd-century Roman historian Cassius Dio, the Empire had to make war on the “lawless and godless” tribes living outside its borders. Fortifications such as Hadrian’s Wall in northern England weren’t just defensive measures but political statements: Inside bloomed civilization, outside lurked savagery.

The Great Wall of China, begun in 220 B.C. by Emperor Qin Shi Huang, had a similar function. In addition to keeping out the nomadic populations in the Mongolian plains, the wall symbolized the unity of the country under imperial rule and the Confucian belief system that supported it. Successive dynasties continued to fortify the Great Wall until the mid-17th century.

During the Napoleonic Wars, the British considered themselves to be fighting for democracy against dictatorship, like the ancient Athenians. In 1806, Napoleon instituted the Continental System, an economic blockade intended to cut off Britain from trading with France’s European allies and conquests. But the attack on free trade only strengthened British determination.

A similar resolve among the NATO allies led to the collapse of the Iron Curtain in 1991, when the Soviet Union was dissolved and withdrew its armies from Eastern Europe. As Churchill had predicted, freedom and democracy are the ultimate shield against “war and tyranny.”

Historically Speaking: The Original Victims of Cancel Culture

Roman emperors and modern dictators have feared the social and spiritual penalties of excommunication.

The Wall Street Journal

January 28, 2021

Nowadays, all it takes for a person to be condemned to internal exile is a Twitter stampede of outrage. The lack of any regulating authority or established criteria for what constitutes repentance gives “cancel culture,” as it is popularly known, a particularly modern edge over more old-fashioned expressions of public shaming such as tar-and-feathering, boycotts and blacklists.

Portrait of Martin Luther by Lucas Cranach the Elder.

But the practice of turning nonconforming individuals into non-persons has been used with great effectiveness for centuries, never more so than in the punishment of excommunication by the Roman Catholic Church. The penalties included social ostracism, refusal of communion and Christian burial, and eternal damnation of one’s soul.

The fear inspired by excommunication was great enough to make even kings fall in line. In 390, soldiers under the command of the Roman emperor Theodosius I massacred thousands in the Greek city of Thessalonica. In response, Bishop Ambrose of Milan excommunicated Theodosius, forcing him to don sackcloth and ashes as public penance. Ambrose’s victory established the Church’s authority over secular rulers.

Later church leaders relied on the threat of excommunication to maintain their power, but the method could backfire. In 1054, Pope Leo IX of Rome excommunicated Patriarch Michael Cerularius of Constantinople, the head of the eastern Church, who retaliated by excommunicating Leo and the western Church. Since this Great Schism, the two churches, Roman Catholic and Eastern Orthodox, have never reunited.

During the Middle Ages, the penalty of excommunication broadened to include the cancellation of all legal protections, including the right to collect debts. Neither kings nor cities were safe. After being excommunicated by Pope Gregory VII in 1076, Holy Roman Emperor Henry IV stood barefoot in the snow for three days before the pontiff grudgingly welcomed him inside to hear his repentance. The entire city of Venice was excommunicated over half a dozen times, and on each occasion the frightened Venetians capitulated to papal authority.

But the excommunication of Martin Luther, the founder of Protestantism, by Pope Leo X in January 1521, 500 years ago this month, didn’t work out as planned. Summoned to explain himself at the Diet of Worms, a meeting presided over by the Holy Roman Emperor Charles V, Luther refused to recant and ask forgiveness, allegedly saying: “Here I stand, I can do no other.” In response, the Emperor declared him a heretic and outlaw, putting his life in danger. Luther was only saved from assassination by his patron, Frederick, Elector of Saxony, who hid him in a castle. Luther used the time to begin translating the Bible into German.

Napoleon Bonaparte was equally unconcerned about the spiritual consequences when he was excommunicated by Pope Pius VII in 1809. Nevertheless, he was sufficiently frightened of public opinion to kidnap the pontiff and keep him out of sight for several months. In 1938, angry over Nazi Germany’s takeover of Austria, the Italian dictator Benito Mussolini tried to persuade Pope Pius XI to excommunicate the German dictator Adolf Hitler, a nonpracticing Catholic. Who knows what would have happened if he had been successful.

Historically Speaking: Funding Wars Through the Ages

U.S. antiterror efforts have cost nearly $6 trillion since the 9/11 attacks. Earlier governments from the ancient Greeks to Napoleon have had to get creative to finance their fights

The Wall Street Journal

October 31, 2019

The successful operation against Islamic State leader Abu Bakr al-Baghdadi is a bright spot in the war on terror that the U.S. declared in response to the attacks of 9/11. The financial costs of this long war have been enormous: nearly $6 trillion to date, according to a recent report by the Watson Institute for International and Public Affairs at Brown University, which took into account not just the defense budget but other major costs, like medical and disability care, homeland security and debt.


War financing has come a long way since the ancient Greeks formed the Delian League in 478 B.C., which required each member state to contribute an agreed amount of money each year rather than troops. With the League’s financial backing, Athens became the Greek world’s first military superpower—at least until the Spartans, helped by the Persians, built up their naval fleet with tribute payments extracted from dependent states.

The Romans maintained their armies through tributes and taxes until the Punic Wars—three lengthy conflicts between 264 and 146 B.C.—proved so costly that the government turned to debasing the coinage in an attempt to increase the money supply. The result was runaway inflation and, eventually, a sovereign debt crisis a half-century later, during the Social War between Rome and several breakaway Italian cities. The government ended up defaulting in 86 B.C., sealing the demise of the ailing Roman Republic.

After the fall of Rome in the late fifth century, wars in Europe were generally financed by plunder and other haphazard means. William the Conqueror financed the Norman invasion of England in 1066 the ancient Roman way, by debasing his currency. He learned his lesson and paid for all subsequent operations out of tax receipts, which stabilized the English monetary system and established a new model for financing war.

Taxation worked until European wars became too expensive for state treasuries to fund alone. Rulers then resorted to a number of different methods. During the 16th century, Philip II of Spain turned to the banking houses of Genoa to raise the money for his Armada invasion fleet against England. Seizing the opportunity, Sir Francis Walsingham, Elizabeth I’s chief spymaster, sent agents to Genoa with orders to use all legal means to sabotage and delay the payment of Philip’s bills of credit. The operation bought England a crucial extra year of preparation.

In his own financial preparations to fight England, Napoleon had better luck than Philip II: In 1803 he was able to raise a war chest of over $11 million in cash by selling the Louisiana Territory to the U.S.

Napoleon was unusual in having a valuable asset to offload. By the time the American Civil War broke out in 1861, governments had become reliant on a combination of taxation, printing money or borrowing to pay for war. But the U.S. had lacked a regulated banking system since President Andrew Jackson dismantled the Second Bank of the United States in the 1830s. The South resorted to printing paper money, which depreciated dramatically. The North could afford to be more innovative. In 1862 the financier Jay Cooke invented the war bond, which was marketed with great success to ordinary citizens. By the war’s end, the bonds had covered two-thirds of the North’s costs.

Incurring debt is still how the U.S. funds its wars. It has helped to shield the country from the full financial effects of its prolonged conflicts. But in the future it is worth remembering President Calvin Coolidge’s warning: “In any modern campaign the dollars are the shock troops…. A country loaded with debt is devoid of the first line of defense.”

The Sunday Times: No more midlife crisis – I’m riding the U-curve of happiness

Evidence shows people become happier in their fifties, but achieving that takes some soul-searching

I used not to believe in the “midlife crisis”. I am ashamed to say that I thought it was a convenient excuse for self-indulgent behaviour — such as splurging on a Lamborghini or getting buttock implants. So I wasn’t even aware that I was having one until earlier this year, when my family complained that I had become miserable to be around. I didn’t shout or take to my bed, but five minutes in my company was a real downer. The closer I got to my 50th birthday, the more I radiated dissatisfaction.

Can you be simultaneously contented and discontented? The answer is yes. Surveys of “national wellbeing” in several countries, including those by the UK’s Office for National Statistics, have revealed a fascinating U-curve in relation to happiness and age. In Britain, feelings of stress and anxiety appear to peak at 49 and subsequently fade as the years increase. Interestingly, a 2012 study showed that chimpanzees and orang-utans exhibit a similar U-curve of happiness as they reach middle age.

On a rational level, I wasn’t the least bit disappointed with my life. The troika of family, work and friends made me very happy. And yet something was eating away at my peace of mind. I regarded myself as a failure — not in terms of work but as a human being. Learning that I wasn’t alone in my daily acid bath of gloom didn’t change anything.

One of F Scott Fitzgerald’s most memorable lines is: “There are no second acts in American lives.” It’s so often quoted that it’s achieved the status of a truism. It’s often taken to be an ironic commentary on how Americans, particularly men, are so frightened of failure that they cling to the fiction that life is a perpetual first act. As I thought about the line in relation to my own life, Fitzgerald’s meaning seemed clear. First acts are about actions and opportunities. There is hope, possibility and redemption. Second acts are about reactions and consequences.

Old habits die hard, however. I couldn’t help conducting a little research into Fitzgerald’s life. What was the author of The Great Gatsby really thinking when he wrote the line? Would it even matter?

The answer turned out to be complicated. As far as the quotation goes, Fitzgerald actually wrote the reverse. The line appears in a 1935 essay entitled My Lost City, about his relationship with New York: “I once thought that there were no second acts in American lives, but there was certainly to be a second act to New York’s boom days.”

It reappeared in the notes for his Hollywood novel, The Love of the Last Tycoon, which was half finished when he died in 1940, aged 44. Whatever he had planned for his characters, the book was certainly meant to have been Fitzgerald’s literary comeback — his second act — after a decade of drunken missteps, declining book sales and failed film projects.

Fitzgerald may not have subscribed to the “It’s never too late to be what you might have been” school of thought, but he wasn’t blind to reality. Of course he believed in second acts. The world is full of middle-aged people who successfully reinvented themselves a second or even third time. The mercurial rise of Emperor Claudius (10 BC to AD 54) is one of the earliest historical examples of the true “second act”.

According to Suetonius, Claudius’s physical infirmities had made him the butt of scorn among his powerful family. But his lowly status saved him after the assassination of his nephew, Caligula. The plotters found the 50-year-old Claudius cowering behind a curtain. On the spur of the moment, instead of killing him, as they did Caligula’s wife and daughter, the plotters decided the stumbling and stuttering scion of the Julio-Claudian dynasty could be turned into a puppet emperor. It was a grave miscalculation. Claudius seized on his changed circumstances. The bumbling persona was dropped and, although flawed, he became a forceful and innovative ruler.

Mostly, however, it isn’t a single event that shapes life after 50 but the willingness to stay the course long after the world has turned away. It’s extraordinary how the granting of extra time can turn tragedy into triumph. In his heyday, General Mikhail Kutuzov was hailed as Russia’s greatest military leader. But by 1800 the 55-year-old was prematurely aged. Stiff-limbed, bloated and blind in one eye, Kutuzov looked more suited to play the role of the buffoon than the great general. He was Alexander I’s last choice to lead the Russian forces at the Battle of Austerlitz in 1805, but was the first to be blamed for the army’s defeat.

Kutuzov was relegated to the sidelines after Austerlitz. He remained under official disfavour until Napoleon’s army was halfway to Moscow in 1812. Only then, with the army and the aristocracy begging for his recall, did the tsar agree to his reappointment. Thus, in Russia’s hour of need it ended up being Kutuzov, the disgraced general, who saved the country.

Winston Churchill had a similar apotheosis in the Second World War. For most of the 1930s he was considered a political has-been by friends and foes alike. His elevation to prime minister in 1940 at the age of 65 changed all that, of course. But had it not been for the extraordinary circumstances created by the war, Robert Rhodes James’s Churchill: A Study in Failure, 1900-1939 would have been the epitaph rather than the prelude to the greatest chapter in his life.

It isn’t just generals and politicians who can benefit from second acts. For writers and artists, particularly women, middle age can be extremely liberating. The Booker prize-winning novelist Penelope Fitzgerald published her first book at 59 after a lifetime of teaching while supporting her children and alcoholic husband. Thereafter she wrote at a furious pace, producing nine novels and three biographies before she died at 83.

I could stop right now and end with a celebratory quote from Morituri Salutamus by the American poet Henry Wadsworth Longfellow: “For age is opportunity no less / Than youth itself, though in another dress, / And as the evening twilight fades away / The sky is filled with stars, invisible by day.”

However, that isn’t — and wasn’t — what was troubling me in the first place. I don’t think the existential anxieties of middle age are caused or cured by our careers. Sure, I could distract myself with happy thoughts about a second act where I become someone who can write a book a year rather than one a decade. But that would still leave the problem of the flesh-and-blood person I had become in reality. What to think of her? It finally dawned on me that this had been my fear all along: it doesn’t matter which act I am in; I am still me.

My funk lifted once the big day rolled around. I suspect that joining a gym and going on a regular basis had a great deal to do with it. But I had also learnt something valuable during these past few months. Worrying about who you thought you would be or what you might have been fills a void but leaves little space for anything else. It’s coming to terms with who you are right now that really matters.