Historically Speaking: Cities Built by Royal Command

From ancient Egypt to modern Russia, rulers have tried to build new capitals from the ground up.

March 5, 2020

The Wall Street Journal

In 1420, the Yongle Emperor moved China’s capital from Nanjing to the city then known as Beiping. To house his government he built the Forbidden City, the largest wooden complex in the world, with more than 70 palace compounds spread across 178 acres. Incredibly, an army of 100,000 artisans and one million laborers finished the project in only three years. Shortly after moving in, the emperor renamed the city Beijing, “northern capital.”

The monument of Peter the Great in St. Petersburg, Russia.
PHOTO: GETTY IMAGES

In the 600 years since, countless visitors have marveled at Yongle’s creation. As its name suggests, the Forbidden City could only be entered by permission of the emperor. But the capital city built around it was an impressive symbol of imperial power and social order, a kind of 3-D model of the harmony of the universe.

Beijing’s success and longevity marked an important leap forward in the history of purpose-built cities. The earliest attempts at building a capital from scratch were usually hubristic affairs that vanished along with their founders. Nothing remains of the fabled Akkad, commissioned by Sargon the Great in the 24th century B.C. following his victory over the Sumerians.

But the most vainglorious and ultimately futile capital ever constructed must surely be Akhetaten, later known as Amarna, on the east bank of the Nile River in Egypt. It was built by the Pharaoh Akhenaten around 1346 B.C. to serve as a living temple to Aten, the god of the sun. The pharaoh hoped to make Aten the center of a new monotheistic cult, replacing the ancient pantheon of Egyptian deities. In his eyes, Amarna was a glorious act of personal worship. But to the Egyptians, this hastily erected city of mud and brick was an indoctrination camp run by a crazed fanatic. Neither Akhenaten’s religion nor his city long survived his death.

In 330 B.C., Alexander the Great was responsible for one of the worst acts of cultural vandalism in history when he allowed his army to burn down Persepolis, the magnificent Achaemenid capital founded by Darius I, in revenge for the Persian destruction of Athens 150 years earlier.

Ironically, the year before he destroyed a metropolis in Persia, the Macedonian emperor had created one in Egypt. Legend has it that Alexander chose the site of Alexandria after being visited by the poet Homer in a dream. He may also have been influenced by the advantages of the location, near the island of Pharos on the Mediterranean coast, which boasted two harbors as well as a limitless supply of fresh water. Working closely with the famed Greek architect Dinocrates, Alexander designed the walls, city quarters and street grid himself. Alexandria went on to become a center of Greek and Roman civilization, famous for its library, museum and lighthouse.

No European ruler would rival the urban ambitions of Alexander the Great, let alone Emperor Yongle, until Tsar Peter the Great. In 1703, he founded St. Petersburg on the marshy archipelago where the Neva River meets the Baltic Sea. His goal was to replace Moscow, the old Russian capital, with a new city built according to modern, Western ideas. Those ideas were unpopular with Russia’s nobility, and after Peter’s death his successor moved the capital back to Moscow. But in 1732 the Romanovs transferred their court permanently to St. Petersburg, where it remained until 1917. Sometimes, the biggest urban-planning dreams actually come true.

 

Historically Speaking: Women Who Popped the Question

Tradition holds that women only propose marriage on leap days, but queens have never been afraid to take the initiative.

February 20, 2020

The Wall Street Journal

ILLUSTRATION: THOMAS FUCHS

An old tradition holds that every leap year, on Feb. 29, women may propose marriage to men without censure or stigma. Sources disagree about the origin of this privilege. One attributes it to St. Brigid, who became concerned for all the unmarried women in 5th-century Ireland and persuaded St. Patrick to grant them this relief. Another gives the credit to Queen Margaret of Scotland, who supposedly had the custom written into Scottish law in 1288.

Neither story is likely to be true: St. Brigid, if she even existed, would have been a child at the time of St. Patrick’s death, and Margaret died at the age of 7 in 1290. But around the world, there have always been a few women who exercised the usually male privilege of proposing.

In the Bible, the widowed Ruth, future great-grandmother of King David, asks her kinsman Boaz to marry her—not with words but by lying down at the foot of his bed. On the Bissagos Islands off the coast of Guinea-Bissau, women propose by offering the man of their choice a ceremonial dish of fish marinated in palm oil.

Queen Ankhesenamun of Egypt, who lived in the 14th century B.C., is believed to have made the earliest recorded marriage proposal by a woman. Based on a surviving letter in the Hittite royal archives, scholars have theorized that Ankhesenamun, the widow of the boy king Tutankhamun, secretly asked the Hittite king Suppiluliuma to agree to a match with one of his sons, so that she could avoid a forced marriage to Ay, her late husband’s grand vizier. The surprised and suspicious king eventually sent her his son Zannanza. Unfortunately, the plan leaked out and the Hittite wedding party was massacred at the Egyptian border. Ankhesenamun disappears from the historical record shortly after.

The Roman princess Justa Grata Honoria, sister of Emperor Valentinian III, had marginally better luck. In 450 she appealed to Attila, King of the Huns, to marry her, in order to escape an arranged marriage with a minor senator. When Valentinian learned of the plan, he refused to allow it and forced Honoria to wed the senator. In retaliation, Attila launched an attack against Rome on the pretext of claiming his bride, ravaging Gaul and advancing as far as the Po River in northern Italy.

The idea that marriage was a sentimental union between two individuals, rather than an economic or strategic pact between families, gained ground in Europe in the late 18th century. Jane Austen highlighted the clash of values between generations in her 1813 novel “Pride and Prejudice”: Lady Catherine de Bourgh insists that her daughter and Mr. Darcy have been engaged since birth, while the heroine Elizabeth Bennet declares she will have Darcy if she wants.

A quarter-century later, a 20-year-old Queen Victoria came down on the side of love when she chose her cousin Albert to be her husband. As a ruling monarch, Victoria had both the right and the duty to make the proposal herself. As she recorded in her diary on October 15, 1839, “I said to him … that it would make me too happy if he would consent to what I wished.” The 21-year marriage was one of the most successful in royal history.

Although it’s still the custom in most countries for men to propose marriage, leap year or not, there’s more to courtship than getting down on one knee. As the Irving Berlin song goes, “A man chases a girl (until she catches him).”

Historically Speaking: Plagues From the Animal Kingdom

The coronavirus is just the latest of many deadly diseases to cross over to human beings from other species.

February 6, 2020

The Wall Street Journal

Earlier this week, the still-rising death toll in mainland China from the coronavirus surpassed the 349 fatalities recorded during the 2003 SARS epidemic. Although both viruses are believed to have originated in bats, they don’t behave in the same way. SARS spread slowly, but its mortality rate was 9.6%, compared with about 2% for the swift-moving coronavirus.

A scientist examines Ötzi, a 5,300-year-old mummy discovered in the Tyrolean Alps in 1991.
PHOTO: EURAC/MARION LAFOGLER

Statistics tell only one part of the story, however. Advances in the genetic sequencing of diseases have revealed that a vast hinterland of growth and adaptation precedes the appearance of a new disease. Cancer, for example, predates human beings themselves: Last year scientists announced that they had discovered traces of bone cancer in the fossil of a 240-million-year-old shell-less turtle from the Triassic period. This easily surpasses the oldest example of human cancer, which was found in a 1.7-million-year-old toe bone in South Africa. The findings confirm that even though cancer has all kinds of modern triggers such as radiation exposure, asbestos and smoking, the disease is deeply rooted in our evolutionary past.

Unlike cancer, the majority of human diseases are zoonotic, meaning that they are passed between animals and people by viruses, fungi, parasites or bacteria. The rise of agriculture around 10,000 years ago, which forced humans into close contact with animals, was probably the single greatest factor behind the spread of infectious disease.

Rabies was one of the earliest diseases to be recognized as having an animal origin. The law code of Eshnunna, a Mesopotamian city that flourished around 2000 B.C., mandated harsh punishments against owners of mad dogs that bit people. Lyme disease was only identified by scientists in 1975, but it too was an ancient scourge. Ötzi, the 5,300-year-old mummy who was discovered in the Tyrolean Alps with cracked ribs and an arrow wound in his shoulder, was an unlucky fellow even before he was killed. DNA sequencing in 2010 revealed that while he was alive, Ötzi was lactose intolerant, had clogged arteries and suffered from Lyme disease.

Smallpox, which was eradicated by the World Health Organization in 1979, had been one of the most feared diseases for most of human history, with a mortality rate of 30%. Those who survived were often left with severe scarring; the telltale lesions of smallpox have been identified on the mummy of Pharaoh Ramesses V, who died in 1145 B.C. The disease is caused by the variola virus, which is thought to have crossed over to human beings from an animal, likely a rodent, in prehistoric times.

Although it can’t be proved for certain, it is likely that smallpox was behind the terrible plague that killed 20% of the Athenian population in 430 B.C. The historian Thucydides, who lived through it, described the agony of those infected with red pustules, the dead bodies piled high in the temples and the scars left on the survivors. He also noticed that those who did survive acquired immunity to the disease.

Thucydides’s observation turned out to be the key to one of humanity’s greatest weapons against infectious disease, vaccination. But apart from smallpox, the only eradication programs to have made some progress have been against viruses transmitted from human to human, such as polio and measles. Meanwhile, since the 1970s more than 40 new infectious diseases have emerged from the animal realm, including HIV, swine flu and Zika. And those are just the ones we know about.

Historically Speaking: Royal Treasures, Lost and Found

From Montezuma’s gold to the crown jewels of Scotland, some of the world’s most famous valuables have gone missing.

January 23, 2020

The Wall Street Journal

What sort of nitwit goes off in a snowstorm to feed leftovers to the chickens while still wearing her Christmas Day finery? In my defense, I was just trying to share the love. Alas, I ended up sharing an antique ring along with the Brussels sprouts. Only the chickens know where it is, and they aren’t talking.

The Honours of Scotland were recovered in 1818 after being lost for decades. PHOTO: ALAMY

One doesn’t need to read J.R.R. Tolkien’s “The Lord of the Rings” to be reminded that lost treasures are often the result of epic folly. In October 1216, King John of England lost the crown jewels while leading a campaign against rebellious barons. Against all advice, John—who is chiefly remembered for being forced to sign the Magna Carta, one of the cornerstones of civil liberty—took a shortcut via the Wash, a tidal estuary on England’s east coast. He got across the causeway just in time to see the waters come rushing up behind him. The wagon train with all his supplies and baggage—including, crucially, the king’s treasury—sank without a trace. The incident has given countless British schoolchildren the joy of being able to say, “Bad King John lost his crown in the wash.”

Folly also played a starring role in the disappearance of the treasure of the Aztec Emperor Montezuma II. In 1520, the inhabitants of the Aztec city of Tenochtitlan rose up against the occupying Spanish forces led by Hernán Cortés. By July 1, the situation was so critical that the outnumbered conquistadors attempted a midnight escape from the city. Hampered by their haul of plunder, however, the Spanish were too slow in crossing Lake Texcoco. Unable to run or fight, the desperate conquistadors abandoned their greed and tossed the treasures into the water. Despite losing half his men on what he called “La Noche Triste,” the Night of Sorrows, Cortés captured the Aztec capital a year later. But he never found the lost gold.

It was a case of absent-mindedness that led to the disappearance of the Scottish royal sword, scepter and crown, known collectively as the Honours of Scotland. Having been successfully hidden during the Interregnum, England’s brief experiment with republicanism in 1649-60, the Honours were returned to Edinburgh Castle for safekeeping. Too safe, it turned out: No one could remember where they were. But the story has a happy ending. In 1818, the Scottish novelist Sir Walter Scott received permission to conduct his own search of the castle. He found the Honours in a locked storeroom, inside a trunk packed with linens.

Occasionally, royal treasures have been lost on purpose. One of the last rulers to be buried with his jewels was Emperor Tu Duc of Vietnam (1829-83). To outwit potential grave robbers, he left orders that he should not be buried in his elaborate official tomb but in a secret location; to ensure it stayed secret, the laborers who interred him were executed. In 1913, Georges Mahé, the French colonial administrator of the Vietnamese city of Hue, provoked a national outcry after he dug up Tu Duc’s official tomb in the hope of finding the hidden jewels. The French swiftly apologized and Mahé was sacked.

Tu Duc’s treasure remains lost, but it may not stay that way forever. Earlier this month, scientists in Mexico City confirmed that a gold bar found on a construction site is one of the ingots discarded by Cortés and his fleeing conquistadors almost exactly 500 years ago.

 

Historically Speaking: The Blessing and Curse of ‘Black Gold’

From the pharaohs to John D. Rockefeller, oil has been key to the growth of civilization—but it comes at a high cost.

January 10, 2020

The Wall Street Journal

This January marks the 150th anniversary of the Standard Oil Company, incorporated in 1870 by John D. Rockefeller and three partners. Such was their drive and ruthlessness that within a decade Standard Oil became a vast monopoly, controlling over 90% of America’s oil refineries.

Spindletop Hill in Beaumont, Texas, was the site of the first Texas oil gusher in 1901. PHOTO: TEXAS ENERGY MUSEUM/NEWSMAKERS/GETTY IMAGES

Standard Oil’s tentacle-like grip on U.S. commerce was finally pried loose in 1911, when the Supreme Court broke it up into 33 separate companies. But this victory didn’t put an end to the nefarious activities surrounding “black gold.” In the 1920s, tycoon Edward Doheny was drawn into the Teapot Dome scandal after he gave a $100,000 bribe to Secretary of the Interior Albert Fall. Doheny served as the inspiration for the corrupt and blood-soaked tycoon J. Arnold Ross in Upton Sinclair’s 1927 novel “Oil!” (which in turn inspired the Oscar-winning 2007 film “There Will Be Blood”).

Though clearly responsible for a great many evils, oil has also been key to the growth of human civilization. As the Bible attests, bitumen—a naturally forming liquid found in oil sands and pitch lakes—was used in ancient times for waterproofing boats and baskets. It also played an important role in Egyptian burial practices: The word “mummy” is derived from mumiyyah, the Arabic word for bitumen.

Over the centuries, oil proved useful in a variety of ways. As early as the fourth century, the Chinese were drilling for oil with bamboo pipes and burning it as heating fuel. In Central Eurasia it was a treatment for mange in camels. By the ninth century, Persian alchemists had discovered how to distill oil into kerosene to make light. The oil fields of medieval Baku, in today’s Azerbaijan, brought trade and culture to the region, rather than political oppression and underdevelopment, as is often the case in oil-rich countries today.

The drilling of the first commercial oil well in the U.S., in Pennsylvania in 1859, brought a raft of benefits. In the 19th century, an estimated 236,000 sperm whales were killed to make oil for lamps. The whaling industry died overnight once Standard Oil began marketing a clean-smelling version of kerosene. Plentiful oil also made the automobile industry possible. In 1901, when a massive oil gusher was discovered in Spindletop, Texas, there were 14,800 cars in the U.S.; two decades later, there were 8.5 million.

After World War II, the world’s oil supply was dominated not by private companies like Standard Oil but by global alliances such as OPEC. When OPEC nations declared an embargo in 1973, the resulting crisis caused the price of oil to climb nearly 400%.

At the time, the U.S. depended on foreign suppliers for 36% of its oil supply. Last month, the U.S. Energy Information Administration announced that, thanks to new technologies such as hydraulic fracturing, the country had become a net exporter of oil for the first time in 75 years.

Though helpful geopolitically, America’s oil independence doesn’t solve the environmental problems caused by carbon emissions. Ironically, some of John D. Rockefeller’s own descendants, aided by the multibillion-dollar fortune he bequeathed, are now campaigning against Exxon Mobil—one of the 33 Standard Oil offshoots—over its record on climate change.

Historically Speaking: Whiskey Is the Original ‘Cup of Kindness’

The barley fields of Scotland and Ireland gave birth to a drink that became popular around the world.

The Wall Street Journal, December 27, 2019

On New Year’s Eve, the song “Auld Lang Syne” urges us to “take a cup of kindness.” It’s an old Scottish saying, meaning to share a friendly tipple—presumably of single malt whisky.

Nowadays there are whisky (Scottish, with no e), whiskey (Irish and North American), rye whiskey (North American) and bourbon (American), but no matter what the drink is called, the method for making it is essentially the same. Like beer, its ancient precursor, whiskey is made with water, grains and yeast. It’s the distillation process that leads to a higher alcohol content in whiskey. For that we must thank an Egyptian alchemist from around the 2nd century named Maria Hebraea (Mary the Jewess) of Alexandria, whose celebrated inventions include the distillation pot.

ILLUSTRATION: THOMAS FUCHS

In the boggy moors of Scotland and Ireland, where never a grape will grow but barley is plentiful, medieval monks learned how to make a whiskey fiery enough to take a man down if he wasn’t careful. The Annals of Clonmacnoise, a 15th-century chronicle of early Irish history, records that in 1405 the clan chieftain Richard MaGranell drank a “surfeit” of whiskey over Christmas and died.

From the 17th century onward, Scotch-Irish emigrants to the New World brought their distillation techniques with them. But it was an Englishman, the Jamestown colonist George Thorpe, who learned that whiskey could be made just as easily with Indian corn. Thorpe was killed in the 1622 Powhatan massacre, but his inventiveness helped to create the American love affair with bourbon, which is made from corn and aged in charred new oak barrels.

Back in the mother country, the punitive Malt Tax of 1725 drove the British whiskey industry underground. Scottish distilleries became secretive nighttime operations, which is how the nickname “moonshine” came to be. A similar whiskey tax levied by the U.S. government in 1791, to pay down the country’s Revolutionary War debt, met with fierce resistance, sparking the so-called Whiskey Rebellion among Pennsylvania farmers. Whiskey production was a lucrative business, so much so that in retirement George Washington turned his Mount Vernon plantation into the country’s largest distillery.

British visitors to the U.S. during the 19th century were delighted by the myriad ways that bourbon could be enjoyed. In 1842, Charles Dickens passed a night in Baltimore drinking mint juleps with Washington Irving; it was, he wrote, “among the most memorable of my life.” American tourists were equally impressed with British know-how around a single malt. In 1874, Mark Twain wrote to his wife Olivia, “Ever since I have been in London I have taken in a wine glass what is called a cock-tail.” The ingredients consisted of “a bottle of scotch whisky, a lemon, some crushed sugar, and a bottle of Angostura bitters.”

Whiskey’s reputation as the liquor of choice for Prohibition-era bootleggers and tortured geniuses like the novelist William Faulkner made it deeply unfashionable among the vodka-drinking MTV generation. But there’s a revival under way, aided by millennials’ love of cocktail culture. In October, a bottle of 1926 Macallan single malt whisky smashed all auction records for a wine or spirit, selling for $1.9 million. That’s $76,000 per ounce of kindness, my dear, or $158,333 a cocktail.

Historically Speaking: Electric Lights for Yuletide

In 1882, Thomas Edison’s business partner put up a Christmas tree decorated with 80 red, white and blue bulbs—and launched an American tradition.

The Wall Street Journal, December 5, 2019

As a quotation often attributed to Maya Angelou has it, “You can tell a lot about a person by the way (s)he handles these three things: a rainy day, lost luggage and tangled Christmas tree lights.” I’m not sure what it says about me that I actually look forward to getting my hands on the latter. My house is so brightly decorated with energy-saving LEDs that it could double as a landing beacon on a foggy night. It’s the one thing I really missed when I lived abroad—no other country does Christmas lights like America. More than 80 million households put up lighting displays each December, creating a seasonal spike in U.S. energy use that’s bigger than the annual consumption of some small countries.

Holiday lights in Brooklyn, 2015. PHOTO: GETTY IMAGES

Hanging festive lighting during the winter solstice is an ancient practice, but the modern version owes its origins to Thomas Edison and his business partner Edward H. Johnson. Edison perfected the first fully functional lightbulb in 1879. For Christmas the following year, he strung up lights outside his Menlo Park factory—partly to provide good cheer but mostly to advertise the benefits of electrification. Johnson went one further in 1882, placing a Christmas tree decorated with 80 red, white and blue blinking lightbulbs on a revolving turntable in his parlor window in Manhattan.

Johnson repeated the display every year, much to the delight of New Yorkers, striving to make it bigger and better each time. He thus founded the other great American tradition: the competitive light display. The first person to take up Johnson’s challenge was President Grover Cleveland, who in 1894 erected an enormous multi-light Christmas tree in the White House, thereby starting a new presidential tradition.

The initial $300 price tag for an electrified Christmas tree (about $2,000 today) put it beyond the reach of the average consumer. The alternative was clip-on candles, but they were so hazardous that by 1910 most home insurance policies contained a nonpayment clause for house fires caused by candlelit Christmas trees. Although it was possible to rent electric Christmas lights for the season, and the General Electric Company was beginning to produce easy-to-assemble kits, the stark difference between lit and unlit homes threatened to become a powerful symbol of social inequality.

Fortunately, a New Yorker named Emilie D. Lee Herreshoff was on the case. She persuaded the city council to allow an electrified Christmas tree to be put up in Madison Square Park. The inaugural tree-lighting celebration, on December 24, 1912, generated so much public enthusiasm that within two years over 300 cities and towns were holding similar ceremonies.

Not content with just one festive tree, in 1920 the city of Pasadena, Calif., agreed to light up the 150 mature evergreens lining Santa Rosa Avenue, leading to its nickname “Christmas Tree Lane.” This was quite a feat of electrical engineering, given that outdoor Christmas lights didn’t become commercially available until 1927.

To encourage buyers, GE began sponsoring local holiday lighting contests, unleashing a competitive spirit each Yuletide that only seems to have grown stronger with the passing decades. Since 2014, the Guinness Book of World Records title for the most lights on a private residence has been held by the Gay family of LaGrangeville, N.Y., with strong competition from Australia. To which I say, “God bless them, everyone.”

Historically Speaking: ‘Sesame Street’ Wasn’t the First to Make Learning Fun

The show turns 50 this month, but the idea that education can be entertaining goes back to ancient Greece.

The Wall Street Journal, November 21, 2019

“Sesame Street,” which first went on the air 50 years ago this month, is one of the most successful and cost-effective tools ever created for preparing preschool tots for the classroom. Now showing in 70 languages in more than 150 countries around the world, “Sesame Street” is that rare thing in a child’s life: truly educational entertainment.

The Muppets of ‘Sesame Street’ in the 1993-94 season. PHOTO: EVERETT COLLECTION

Historically, those two words have rarely appeared together. In the 4th century B.C., Plato and Aristotle both agreed that children can learn through play. In “The Republic,” Plato went so far as to advise, “Do not use compulsion, but let early education be a sort of amusement.” Unfortunately, his advice failed to catch on.

In Europe during the Middle Ages, play and learning were almost diametrically opposed. Monks were in charge of boys’ education, which largely consisted of Latin grammar and religious teaching. (Girls learned domestic skills at home.) The invention of the printing press in 1440 helped spread literacy among young readers, but the works written for them weren’t exactly entertaining. A book like “A token for children: Being an exact account of the conversion, holy and exemplary lives, and joyful deaths of several young children,” by the 17th-century English Puritan James Janeway, surely didn’t follow Plato’s injunction to be amusing as well as instructional.

Social attitudes toward children’s entertainment changed considerably, however, in the wake of the English philosopher John Locke’s 1693 treatise “Some Thoughts Concerning Education.” Locke followed Plato’s line on education, writing, “I always have had a fancy that learning might be made a play and recreation to children.” The publisher John Newbery heeded Locke’s advice; in 1744, he published “A Little Pretty Pocket-Book Intended for the Instruction and Amusement of Little Master Tommy and Pretty Miss Polly,” which was sold along with a ball for boys and a pincushion for girls. In the introduction, Newbery promised parents and guardians that the book would not only make their children “strong, healthy, virtuous, wise” but also “happy.”

When it came to early children’s television in the U.S., however, “play and recreation” usually squeezed out educational content. Many popular shows existed primarily to sell toys and products: “Howdy Doody,” the pioneering puppet show that ran on NBC from 1947 to 1960, was sponsored by RCA to pitch color televisions. Parents became so indignant over the exploitation of their children by the TV industry that, in 1968, grass-roots activists started the nonprofit Action for Children’s Television, which petitioned the Federal Communications Commission to ban advertising on children’s programming.

This cultural mood led to the birth of “Sesame Street.” The show’s co-creators, Joan Ganz Cooney and Lloyd Morrisett, were particularly devoted to using TV to combat educational inequality in minority communities. They spent three years working with teachers, child psychologists and Jim Henson’s Muppets to get the right mix of education and entertainment. The pilot episode, broadcast on public television stations on Nov. 10, 1969, introduced the world to Big Bird, Bert and Ernie, Oscar the Grouch, and their cast of multiethnic friends and neighbors. “You’re gonna love it,” says Gordon, one of the show’s human characters, welcoming a young newcomer named Sally in the first show’s opening lines. And we have.

Historically Speaking: Funding Wars Through the Ages

U.S. antiterror efforts have cost nearly $6 trillion since the 9/11 attacks. Earlier governments from the ancient Greeks to Napoleon have had to get creative to finance their fights.

The Wall Street Journal, October 31, 2019

The successful operation against Islamic State leader Abu Bakr al-Baghdadi is a bright spot in the war on terror that the U.S. declared in response to the attacks of 9/11. The financial costs of this long war have been enormous: nearly $6 trillion to date, according to a recent report by the Watson Institute for International and Public Affairs at Brown University, which took into account not just the defense budget but other major costs, like medical and disability care, homeland security and debt.

ILLUSTRATION: THOMAS FUCHS

War financing has come a long way since 478 B.C., when the ancient Greeks formed the Delian League, which required each member state to contribute an agreed sum of money each year rather than troops. With the League’s financial backing, Athens became the Greek world’s first military superpower—at least until the Spartans, helped by the Persians, built up their naval fleet with tribute payments extracted from dependent states.

The Romans maintained their armies through tributes and taxes until the Punic Wars—three lengthy conflicts between 264 and 146 B.C.—proved so costly that the government turned to debasing the coinage in an attempt to increase the money supply. The result was runaway inflation and eventually a sovereign debt crisis during the Social War between Rome and its breakaway Italian allies a half-century later. The government ended up defaulting in 86 B.C., sealing the demise of the ailing Roman Republic.

After the fall of Rome in the late fifth century, wars in Europe were generally financed by plunder and other haphazard means. William the Conqueror financed the Norman invasion of England in 1066 the ancient Roman way, by debasing his currency. He learned his lesson and paid for all subsequent operations out of tax receipts, which stabilized the English monetary system and established a new model for financing war.

Taxation worked until European wars became too expensive for state treasuries to fund alone. Rulers then resorted to a number of different methods. During the 16th century, Philip II of Spain turned to the banking houses of Genoa to raise the money for his Armada invasion fleet against England. Seizing the opportunity, Sir Francis Walsingham, Elizabeth I’s chief spymaster, sent agents to Genoa with orders to use all legal means to sabotage and delay the payment of Philip’s bills of credit. The operation bought England a crucial extra year of preparation.

In his own financial preparations to fight England, Napoleon had better luck than Philip II: In 1803 he was able to raise a war chest of over $11 million in cash by selling the Louisiana Territory to the U.S.

Napoleon was unusual in having a valuable asset to offload. By the time the American Civil War broke out in 1861, governments had become reliant on a combination of taxation, printing money or borrowing to pay for war. But the U.S. had lacked a regulated banking system since President Andrew Jackson’s dismantling of the Second Bank of the United States in the 1830s. The South resorted to printing paper money, which depreciated dramatically. The North could afford to be more innovative: In 1862 the financier Jay Cooke pioneered the modern war bond, which was marketed with great success to ordinary citizens. At the war’s end, the bonds had covered two-thirds of the North’s costs.

Incurring debt is still how the U.S. funds its wars. It has helped to shield the country from the full financial effects of its prolonged conflicts. But in the future it is worth remembering President Calvin Coolidge’s warning: “In any modern campaign the dollars are the shock troops…. A country loaded with debt is devoid of the first line of defense.”

Historically Speaking: The Many Roads to Vegetarianism

Health, religion and animal rights have all been advanced as reasons not to eat meat.

The Wall Street Journal, October 18, 2019

ILLUSTRATION: PETER ARKLE

The claim that today’s ingeniously engineered fake meat tastes like the real thing and helps the planet is winning over consumers from the carnivore side of the food aisle. According to Barclays, the alt-meat market could be worth $140 billion a year a decade from now. But the argument over the merits of vegetarianism is nothing new; it’s been going on since ancient times.

Meat played a pivotal role in the evolution of the human brain, providing the necessary calories and protein to enable it to increase in size. Nonetheless, meat-eating remained a luxury in the diets of most early civilizations. It wasn’t much of a personal sacrifice, therefore, when the Greek philosopher Pythagoras (ca. 570-495 B.C.), author of the famous theorem, became what many consider the first vegetarian by choice. Pythagoreans believed that humans could be reincarnated as animals and vice versa, meaning that if you ate meat, Aunt Lydia could end up on your plate.

The anti-meat school of thought was joined a century later by Plato, who argued in “The Republic” that meat consumption encouraged decadence and warlike behavior. These views were strongly countered by Aristotelian philosophy, which taught that animals exist for human use—an opinion that the Romans heartily endorsed.

The avoidance of meat for moral and ascetic reasons also found a home in Buddhism and Hinduism. Ashoka the Great, the 3rd-century B.C. Buddhist emperor of the Maurya Dynasty of India, abolished animal sacrifice and urged his people to abstain from eating flesh.

It wasn’t until the Enlightenment, however, that Western moralists and philosophers began to argue for vegetarianism on the grounds that we have a moral duty to avoid causing animals pain. In 1641 the Massachusetts Bay Colony passed one of the earliest laws against animal cruelty. By the early 19th century, the idea that animals have rights had started to take hold: The English Romantic poet Percy Bysshe Shelley proselytized for vegetarianism, as did the American transcendentalist thinker Henry David Thoreau, who wrote in “Walden”: “I have no doubt that it is part of the destiny of the human race … to leave off eating animals.”

The word “vegetarian” first appeared in print in England in 1842. Within a decade there were vegetarian societies in Britain and America. Echoing the Platonists rather than Pythagoras, these societies were guided by self-denial rather than by animal welfare. Sylvester Graham, the leader of the early American vegetarian movement, also urged sexual abstinence on his followers.

Vegetarianism finally escaped its moralistic straitjacket at the end of the 19th century, when the health guru John Harvey Kellogg, the inventor of corn flakes, popularized meat-free living for reasons of bodily well-being at his Battle Creek Sanitarium in Michigan.

There continue to be mixed motivations for vegetarianism today. Burger King’s meatless Impossible Whopper may be “green,” but it has less protein and virtually the same number of calories as the original. A healthier version will no doubt appear before long, and some people hope that when lab-grown meat hits the market in a few years, it will be as animal- and climate-friendly as plant-based food. With a lot of science and a bit of luck, vegetarians and meat-eaters may end up in the same place.