Historically Speaking: Duels Among the Clouds

Aerial combat was born during World War I, giving the world a new kind of military hero: the fighter pilot

“Top Gun” is back. The 1986 film about Navy fighter pilots is getting a sequel next year, with Tom Cruise reprising his role as Lt. Pete “Maverick” Mitchell, the sexy flyboy who can’t stay out of trouble. Judging by the trailer released by Paramount in July, the new movie, “Top Gun: Maverick,” will go straight to the heart of current debates about the future of aerial combat. An unseen voice tells Mr. Cruise, “Your kind is headed for extinction.”

The mystique of the fighter pilot began during World War I, when fighter planes first entered military service. The first aerial combat took place on Oct. 5, 1914, when French and German biplanes engaged in an epic contest in the sky, watched by soldiers on both sides of the trenches. At this early stage, neither plane was armed, but the German pilot had a rifle and the French pilot a machine gun; the latter won the day.

A furious arms race ensued. The Germans turned to the Dutch engineer Anthony Fokker, who devised a way to synchronize a machine gun’s fire with the spinning propeller, allowing bullets to pass safely between the blades and turning the aircraft itself into a weapon of deadly accuracy. The Allies soon caught up, ushering in the era of the dogfight.

From the beginning, the fighter pilot seemed to belong to a special category of warrior—the dueling knight rather than the ordinary foot soldier. Flying aces of all nationalities gave each other a comradely respect. In 1916, the British marked the downing of the German fighter pilot Oswald Boelcke by dropping a wreath in his honor on his home airfield in Germany.

But not until World War II could air combat decide the outcome of an entire campaign. During the Battle of Britain in the summer of 1940, the German air force, the Luftwaffe, dispatched up to 1,000 aircraft in a single attack. The Royal Air Force’s successful defense of the skies led to British Prime Minister Winston Churchill’s famous declaration, “Never in the field of human conflict was so much owed by so many to so few.”

The U.S. air campaigns over Germany taught American military planners a different lesson. Rather than focusing on pilot skills, they concentrated on building planes with superior firepower. In the decades after World War II, the invention of air-to-air missiles was supposed to herald the end of the dogfight. But during the Vietnam War, steep American aircraft losses caused by the acrobatic, Soviet-built MiG fighter showed that one-on-one combat still mattered. The U.S. response to this threat was the highly maneuverable twin-engine F-15 and the formation of a new pilot training academy, the Navy Fighter Weapons School, which inspired the original “Top Gun.”

Since that film’s release, however, aerial combat between fighter planes has largely happened on screen, not in the real world. The last dogfight involving a U.S. aircraft took place in 1999, during the NATO air campaign in Kosovo. The F-14 Tomcats flown by Mr. Cruise’s character have been retired, and his aircraft carrier, the USS Enterprise, has been decommissioned.

Today, conventional wisdom again holds that aerial combat is obsolete. The new F-35 Joint Strike Fighter is meant to replace close-up dogfights with long-range weapons. But not everyone seems to have read the memo about the future of air warfare. Increasingly, U.S. and NATO pilots are having to scramble their planes to head off Russian incursions. The knights of the skies can’t retire just yet.

Historically Speaking: A Palace Open to the People

From the Pharaohs to Queen Victoria, royal dwellings have been symbols of how rulers think about power.

Every summer, Queen Elizabeth II opens the state rooms of Buckingham Palace to the public. This year’s opening features an exhibition that I curated, “Queen Victoria’s Palace,” the result of a three-year collaboration with Royal Collection Trust. The exhibition uses paintings, objects and even computer-generated imagery to show how Victoria transformed Buckingham Palace into both a family home and the headquarters of the monarchy. In the process, she modernized not only the building itself but also the relationship between the Royal Family and the British people.

Plenty of rulers before Victoria had built palaces, but it was always with a view to enhancing their power rather than sharing it. Consider Amarna in Egypt, the temple-palace complex created in the 14th century B.C. by Amenhotep IV, better known as Akhenaten. Supported by his beautiful wife Nefertiti, the heretical Akhenaten made himself the head of a new religion that revered the divine light of the sun’s disk, the Aten.

The Great Palace reflected Akhenaten’s megalomania: The complex featured vast open-air courtyards where the public was required to engage in mass worship of the Pharaoh and his family. Akhenaten’s palace was hated as much as his religion, and both were abandoned after his death.

Weiyang, the Endless Palace, built in 200 B.C. in western China by the Emperor Gaozu, was also designed to impart a religious message. Until its destruction in the 9th century by Tibetan invaders, Weiyang extended over two square miles, making it the largest imperial palace in history. Inside, the halls and courtyards were laid out along specific axial and symmetrical lines to ensure that the Emperor existed in harmony with the landscape and, by extension, with his people. Each chamber was ranked according to its proximity to the Emperor’s quarters; every person knew his place and obligations according to his location in the palace.

Western Europe had nothing comparable to Weiyang until King Louis XIV built the Palace of Versailles in 1682. With its unparalleled opulence—particularly the glittering Hall of Mirrors—and spectacular gardens, Versailles was a cult of personality masquerading as architecture. Louis, the self-styled Sun King at the center of this artificial universe, created a living stage where seeing and being seen was the highest form of social currency.

The offstage reality was grimmer. Except for the royal family’s quarters, Versailles lacked even such basic amenities as plumbing. The cost of upkeep swallowed a quarter of the government’s annual tax receipts. Louis XIV’s fantasy lasted a century before being swept away by the French Revolution in 1789.

Although Victoria would never have described herself as a social revolutionary, the many changes she made to Buckingham Palace were an extraordinary break with the past. From the famous balcony where the Royal Family gathers to share special occasions with the nation, to the spaces for entertaining that can welcome thousands of guests, the revitalized palace created a more inclusive form of royal architecture. It sidelined the old values of wealth, lineage, power and divine right to emphasize new ones based on family, duty, loyalty and patriotism. Victoria’s palace was perhaps less awe-inspiring than its predecessors, but it may prove to be more enduring.

Historically Speaking: Playing Cards for Fun and Money

From 13th-century Egypt to the Wild West, the standard deck of 52 cards has provided entertainment—and temptation.

ILLUSTRATION: THOMAS FUCHS

More than 8,500 people traveled to Las Vegas to play in this year’s World Series of Poker, which ended July 16—a near-record for the contest. I’m not a poker player myself, but I understand the fun and excitement of playing with real cards in an actual game rather than online. There’s something uniquely pleasurable about a pack of cards—the way they look and feel to the touch—that can’t be replicated on the screen.

Although the origins of the playing card are believed to lie in China, the oldest known examples come from the Mamluk Sultanate, an Islamic empire that stretched across Egypt and the eastern Mediterranean from 1260 to 1517. It’s significant that the empire was governed by a warrior caste of former slaves: A playing card can be seen as an assertion of freedom, since time spent playing cards is time spent freely. The Mamluk card deck consisted of 52 cards divided into four suits, whose symbols reflected the daily realities of soldiering—a scimitar, a polo stick, a chalice and a coin.

Returning Crusaders and Venetian traders were probably responsible for bringing cards to Europe. Church and state authorities were not amused: In 1377, Parisians were banned from playing cards on work days. Like dice, cards were classed as gateway vices that led to greater sins. The authorities may not have been entirely wrong: Some surviving card decks from the 15th century have incredibly bawdy themes.

The suits and symbols used in playing cards became more uniform and less ornate following the advent of the printing press. French printers added a number of innovations, including dividing the four suits into red and black, and giving us the heart, diamond, club and spade symbols. Standardization enabled cards to become a lingua franca across cultures, further enhancing their appeal as a communal leisure activity.

In the 18th century, the humble playing card was the downfall of many a noble family, with vast fortunes being won and lost at the gaming table. Cards also started to feature in paintings and novels as symbols of the vagaries of fortune. The 19th-century short story “The Queen of Spades,” by the Russian writer Alexander Pushkin, beautifully captures the card mania of the period. The anti-hero, Hermann, is destroyed by his obsession with winning at Faro, a game of chance that was as popular in the saloons of the American West as it was in the drawing rooms of Europe. The lawman Wyatt Earp may have won fame in the gunfight at the OK Corral, but he earned his money as a Faro dealer in Tombstone, Ariz.

In Britain, attempts to regulate card-playing through high taxes on the cards themselves were a failure, though they did result in one change: Every ace of spades had to show a government tax stamp, which is why it’s the card that traditionally carries the manufacturer’s mark. The last innovation in the card deck, like the first, had military origins. Many Civil War regiments killed time by playing Euchre, which requires an extra trump card. The Samuel Hart Co. duly obliged with a card which became the forerunner to the Joker, the wild card that doesn’t have a suit.

But we shouldn’t allow the unsavory association of card games with gambling to have the last word. As Charles Dickens wrote in “Nicholas Nickleby”: “Thus two people who cannot afford to play cards for money, sometimes sit down to a quiet game for love.”

Historically Speaking: Beware the Red Tide

Massive algae blooms that devastate ocean life have been recorded since antiquity—and they are getting worse.

Real life isn’t so tidy. Currently, there is no force, biological or otherwise, capable of stopping the algae blooms that are attacking coastal waters around the world with frightening regularity, turning thousands of square miles into odoriferous graveyards of dead and rotting fish. In the U.S., one of the chief culprits is the Karenia brevis algae, a common marine microorganism that blooms when exposed to sunlight, warm water and phosphorus or nitrates. The result is a toxic sludge known as a red tide, which depletes the oxygen in the water, poisons shellfish and emits a foul vapor strong enough to irritate the lungs.

The red tide isn’t a new phenomenon, though its frequency and severity have certainly gotten worse thanks to pollution and rising water temperatures. There used to be decades between outbreaks, but since 1998 the Gulf Coast has suffered one every year.

The earliest description of a red tide may have come from Tacitus, the first-century Roman historian, in his “Annals”: “the Ocean had appeared blood-red and…the ebbing tide had left behind it what looked to be human corpses.” The Japanese recorded their first red tide catastrophe in 1234: An algae bloom in Osaka Bay invaded the Yodo River, a major waterway between Kyoto and Osaka, which led to mass deaths among humans and fish alike.

The earliest reliable accounts of red tide invasions in the Western Hemisphere come from 16th-century Spanish sailors in the Gulf of Mexico. The colorful explorer Álvar Núñez Cabeza de Vaca (ca. 1490-1560) almost lost his entire expedition to red tide poisoning while sailing in Apalachee Bay on the west coast of Florida in July 1528. Unaware that local Native American tribes avoided fishing in the area at that time of year, he allowed his men to gorge themselves on oysters. “The journey was difficult in the extreme,” he wrote afterward, “because neither the horses were sufficient to carry all the sick, nor did we know what remedy to seek because every day they languished.”

Red tides started appearing everywhere in the late 18th and early 19th centuries. Charles Darwin recorded seeing red-tinged water off the coast of Chile during his 1832 voyage on HMS Beagle. Scientists finally identified K. brevis as the culprit behind the outbreaks in 1946-47, but this was small comfort to Floridians, who were suffering the worst red tide invasion in U.S. history. It started in Naples and spread all the way to Sarasota, hanging around for 18 months, destroying the fishing industry and making life unbearable for residents. A 35-mile-long stretch of sea was so thick with rotting fish carcasses that the government dispatched Navy warships to try to break up the mass. People compared the stench to poison gas.

The red tide invasion of 2017-18 was particularly terrible, lasting some 15 months and covering 145 miles of Floridian coastline. The loss to tourism alone neared $100 million. Things are looking better this summer, fortunately, but we need more than hope or luck to combat this plague; we need a weapon that hasn’t yet been invented.

Historically Speaking: How We Kept Cool Before Air Conditioning

Wind-catching towers and human-powered rotary fans were just some of the devices invented to fight the heat.


What would we do without our air conditioning? Given the number of rolling blackouts and brownouts that happen across the U.S. each summer, that’s not exactly a rhetorical question.

Fortunately, our ancestors knew a thing or two about staying cool even without electricity. The ancient Egyptians developed the earliest known technique: Evaporative cooling involved hanging wet reeds in front of windows, so that the air cooled as the water evaporated.

The Romans, the greatest engineers of the ancient world, had more sophisticated methods. By 312 B.C. they were piping fresh water into Rome via underground pipes and aqueducts, enabling the rich to cool and heat their houses using cold water pipes embedded in the walls and hot water pipes under the floor.

Nor were the Romans alone in developing clever domestic architecture to provide relief in hot climes. In the Middle East, architects constructed buildings with “wind catchers”—tall, four-windowed towers that funneled cool breezes down to ground level and allowed hot air to escape. These had the advantage of working on their own, without human labor. The Chinese had started using rotary fans as early as the second century, but they required a dedicated army of slaves to keep them moving. The addition of hydraulic power during the Song era, 960-1279, alleviated but didn’t end the manpower issue.

There had been no significant improvements in air conditioning designs for almost a thousand years when, in 1734, British politicians turned to Dr. John Theophilus Desaguliers, a former assistant to Isaac Newton, and begged him to find a way of cooling the overheated House of Commons. Desaguliers designed a marvelous Rube Goldberg-like system that used all three traditional methods: wind towers, pipes and rotary fans. It actually worked, so long as there was someone to crank the handle at all times.

But the machinery wore out in the late 1760s, leaving politicians as hot and bothered as ever. In desperation, the House invited Benjamin Franklin and other leading scientists to design something new. Their final scheme turned out to be no better than Desaguliers’s and required not one but two men to keep the system working.

The real breakthrough occurred in 1841, when the British engineer David Boswell Reid figured out how to control room temperature using steam power. Thanks to his ventilation system, St. George’s Hall in Liverpool is widely considered to be the world’s first air-conditioned building.

Indeed, Reid is one of history’s unsung heroes. His system worked so well that he was invited to install his pipe and ventilation design in hospitals and public buildings around the world. He was working in the U.S. at the start of the Civil War and was appointed inspector of military hospitals. Unfortunately, he died suddenly in 1863, leaving his proposed improvements to gather dust.

The chief problem with Reid’s system was that steam power was about to be overtaken by electricity. When President James Garfield was shot by an assassin in the summer of 1881, naval engineers attempted to keep him cool by using electric fans to blow air over blocks of ice. Two decades later, Willis Haviland Carrier invented the first all-electric air conditioning unit. Architects and construction engineers have been designing around it ever since.

Fears for our power grid may be exaggerated, but it’s good to know that if the unthinkable were to happen and we lost our air conditioners, history can offer us some cool solutions.

Historically Speaking: The Pleasures and Pains of Retirement

Since the Roman Empire, people have debated whether it’s a good idea to stop working in old age

The Wall Street Journal, June 7, 2019

The new film “All Is True,” directed by and starring Kenneth Branagh, imagines how William Shakespeare might have lived after he stopped writing plays. Alas for poor Shakespeare, in this version of his life retirement is hell. By day he potters in his garden, waging a feeble battle against Mother Nature; by night he is equally ineffectual against the verbal onslaughts of his resentful family.

In real life, people have been arguing the case for and against retirement since antiquity. The Roman statesman Marcus Tullius Cicero was thoroughly against the idea. In his essay “Cato the Elder on Aging,” Cicero argued that the best antidote to old age was a purposeful life. “I am in my eighty-fourth year,” he has his spokesman Cato declare, “yet, as you see, old age has not quite unnerved or shattered me. The senate and the popular assembly never find my vigor wanting.” Cicero lived by the pen—he was the greatest speechwriter in history—but he died by the sword, murdered on the orders of Mark Antony for his support of the waning Roman Republic.

Knowing when to exit from public life is a difficult art. The Roman Emperor Diocletian (ca. 245-316) retired to his palace in Split, in modern Croatia, after ruling for 21 years. According to Edward Gibbon, Diocletian was so content for the last six years of his life that when emissaries from Rome tried to persuade him to return, he replied that he couldn’t possibly leave his cabbages.

For most of history, of course, the average person had no choice but to carry on working until they died. But in the 18th century, longer lifespans created a dilemma: The old were outliving their usefulness. Not realizing that he had two more productive decades left, the 60-year-old Voltaire told his friend Madame du Deffand: “I advise you to go on living solely to enrage those who are paying your annuities…. It is the only pleasure I have left.”

By the end of the 19th century, it had become possible for at least some ordinary people to spend their last years in retirement. In 1889, the German Chancellor Otto von Bismarck bowed to popular opinion and introduced state pensions for workers over 70; the qualifying age was later lowered to 65, which became the official age of retirement.

But some critics argued that this was the thin end of the wedge. If people could be forced out of careers and jobs on the basis of an arbitrary age limit, what else could be done to them? Troubled by what he regarded as ageism, the novelist Anthony Trollope published “The Fixed Period,” a dystopian novel about a society where anyone over the age of 67 is euthanized for his or her own good. Opponents of any form of government retirement plan held sway in the U.S. until President Franklin Roosevelt signed the Social Security Act in 1935, by which time half of America’s elderly were living in poverty.

Today, the era of leisurely retirements may be passing into history. Whether driven by financial need or personal preference, for many people retirement simply means changing their occupation. According to the AARP, the number of adults working past age 65 has doubled since 1985.

Even the rich and famous aren’t retiring: President George W. Bush is a painter; the Oscar-nominated actor Gene Hackman is a novelist; and Microsoft founder Bill Gates is saving the planet. Were he alive today, a retired Shakespeare, flush from his success on Broadway, might well start his own podcast.

Historically Speaking: The Ancient Origins of the Vacation

Once a privilege of the elite, summer travel is now a pleasure millions can enjoy.

The Wall Street Journal, June 6, 2019


Finally, Americans are giving themselves a break. For years, according to the U.S. Travel Association, more than half of American workers didn’t use all their paid vacation days. But in a survey released in May by Discover, 71% of respondents said they were planning a summer vacation this year, up from 58% last year—meaning a real getaway, not just a day or two to catch up on chores or take the family to an amusement park.

The importance of vacations for health and happiness has been accepted for thousands of years. The ancient Greeks probably didn’t invent the vacation, but they perfected the idea of the tourist destination by providing quality amenities at festivals, religious sites and thermal springs. A cultured person went places. According to the “Crito,” one of Plato’s dialogues, Socrates’ stay-at-home mentality made him an exception: “You never made any other journey, as other people do, and you had no wish to know any other city.”

The Romans took a different approach. Instead of touring foreign cities, the wealthy preferred to vacation together in resort towns such as Pompeii, where they built ostentatious villas featuring grand areas for entertaining. The Emperor Nero was relaxing at his beach palace at Antium, modern Anzio, when the Great Fire of Rome broke out in the year 64.

The closest thing to a vacation that medieval Europeans could enjoy was undertaking pilgrimages to holy sites. Santiago de Compostela in northern Spain, where St. James was believed to be buried, was a favorite destination, second only to Rome in popularity. As Geoffrey Chaucer’s bawdy “Canterbury Tales” shows, a pilgrimage provided all sorts of opportunities for mingling and carousing, not unlike a modern cruise ship.

The vacation went upmarket in the late 17th century, as European aristocrats rediscovered the classical idea of tourism for pleasure. Broadening one’s horizons via the Grand Tour—a sightseeing trip through the major classical and Renaissance sites of France and Italy—became de rigueur for any gentleman. The spa town, too, enjoyed a spectacular revival. The sick and infertile would gather to “take the cure,” bathing in or drinking from a thermal spring. Bath in England became as renowned for its party scene as for its waters. Jane Austen, a frequent visitor to the city, set two of her novels there. The U.S. wasn’t far behind, with Saratoga Springs, N.Y., known as the “Queen of Spas,” becoming a popular resort in 1815.

But where was the average American to go? Resorts, with their boardwalks, grand hotels and amusement arcades, were expensive. In any case, the idea of vacations for pure pleasure sat uneasily with most religious leaders. The answer lay in the great outdoors, which was deemed good for the health and improving to the mind.

Cheap rail travel, and popular guidebooks such as William H.H. Murray’s “Adventures in the Wilderness; or, Camp-Life in the Adirondacks,” helped to entice the middle classes out of the city. The American Chautauqua movement, originally a New York-based initiative aimed at improving adult education, further served to democratize the summer vacation by providing cheap accommodation and wholesome entertainment for families.

The summer vacation was ready to become a national tradition. In 1910, President William Howard Taft even proposed to Congress that all workers should be entitled to two to three months of paid vacation. But the plan stalled, and it was left to France to pass the first guaranteed-vacation law, in 1919. Since then, most developed countries have recognized vacation time as a legal right. It’s not too late, America, to try again.

Historically Speaking: Beloved Buildings That Rose from the Ashes

From ancient Rome to modern London, great structures like Notre Dame have fallen and been built again

The Wall Street Journal, May 2, 2019

A disaster like the Notre Dame cathedral fire is as much a tragedy of the heart as it is a loss to architecture. But fortunately, unlike most love affairs, a building can be resurrected. In fact, throughout history communities have gone to remarkable lengths to rebuild monuments of sacred or national importance.

There is no shortage of inspirational examples of beloved buildings that have risen from the ashes. The Second Temple was built in Jerusalem in 515 B.C. following the destruction of the First by King Nebuchadnezzar II of Babylonia in 586 B.C.; Dresden’s Baroque Frauenkirche was faithfully rebuilt in 2005, after being destroyed by bombs in 1945.

Often the new structures are exact replicas, as with Venice and Barcelona’s opera houses, La Fenice and Gran Teatre del Liceu, both of which were rebuilt after suffering devastating fires in the 1990s. If France decides to rebuild Notre Dame according to the principle “as it was, where it was,” the skill and technology aren’t lacking.

In other cases, however, disasters have allowed for beloved landmarks to be reimagined. The Great Fire of Rome in 64 A.D. led to a revolution in architectural styles and techniques. After Hagia Sophia cathedral was torched during riots in Constantinople in 532, the Byzantine Emperor Justinian asked his architects Anthemius and Isidore to build something bold and impressive. It was risky to change such a renowned symbol of the Eastern Roman Empire; moreover, for security and financial reasons, the work had to be completed in just six years. Still, the result dazzled Justinian, who exclaimed when he saw it, “Solomon, I have outdone thee.” Almost a thousand years later, following Constantinople’s fall to the Turks in 1453, Sultan Mehmet II had a similar reaction and ordered Hagia Sophia to be turned into a mosque rather than destroyed.

Sir Christopher Wren, who rebuilt St. Paul’s Cathedral in London, was not so lucky in the reactions to his creation. The Great Fire of 1666 had left the medieval church in ruins, but there was little appetite for an innovative reconstruction and no money in the Treasury to pay for it. Wren endured setbacks at every turn, including a chronic shortage of stone. At one point, Parliament suspended half his salary in protest at the slowness of the work, which took almost three decades and spanned the reigns of five monarchs.

The reason for the delay became clear after the finished building was revealed to the public. Inspired by drawings of Hagia Sophia, Wren had ignored the approved design for a traditional reconstruction and quietly opted for a more experimental approach. Ironically, many of his contemporaries were appalled by the now iconic great dome, especially the Protestant clergy, who deemed it too foreign and Catholic-looking. Yet Wren’s vision has endured. During the German bombing of London in World War II, St. Paul’s was the one building that Winston Churchill declared must be saved at all costs.

It is never easy deciding how to draw the line between history and modernity, particularly when dealing with the loss of an architectural masterpiece. There isn’t always a right answer, but it may help to remember Churchill’s words: “We shape our buildings and afterwards our buildings shape us.”

Historically Speaking: Real-Life Games of Thrones

From King Solomon to the Shah of Persia, rulers have used stately seats to project power.

The Wall Street Journal, April 22, 2019


“Uneasy lies the head that wears a crown,” complains the long-suffering King Henry IV in Shakespeare. But that is not a monarch’s only problem; uneasy, too, is the bottom that sits on a throne, for thrones are often dangerous places to be. That is why the image of a throne made of swords, in the HBO hit show “Game of Thrones” (which last week began its eighth and final season), has served as such an apt visual metaphor. It strikingly symbolizes the endless cycle of violence between the rivals for the Iron Throne, the seat of power in the show’s continent of Westeros.

In real history, too, virtually every state once put its leader on a throne. The English word comes from the ancient Greek “thronos,” meaning “stately seat,” but the thing itself is much older. Archaeologists working at the 4th millennium B.C. site of Arslantepe, in eastern Turkey, where a pre-Hittite Early Bronze Age civilization flourished, recently found evidence of what is believed to be the world’s oldest throne. It seems to have consisted of a raised bench which enabled the ruler to display his or her elevated status by literally sitting above all visitors to the palace.

Thrones were also associated with divine power: The famous 18th-century B.C. basalt stele inscribed with the law code of King Hammurabi of Babylon, which can be seen at the Louvre, depicts the king receiving the laws directly from the sun god Shamash, who is seated on a throne.

Naturally, because they were invested with so much religious and political symbolism, thrones often became a prime target in war. According to Jewish legend, King Solomon’s spectacular gold and ivory throne was stolen first by the Egyptians, who then lost it to the Assyrians, who subsequently gave it up to the Persians, whereupon it became lost forever.

In India, King Solomon’s throne was reimagined in the early 17th century by the Mughal Emperor Shah Jahan as the jewel-and-gold-encrusted Peacock Throne, featuring the 186-carat Koh-i-Noor diamond. (Shah Jahan also commissioned the Taj Mahal.) This throne also came to an unfortunate end: It was stolen during the sack of Delhi by the Shah of Iran and taken back to Persia. A mere eight years later, the Shah was assassinated by his own bodyguards and the Peacock Throne was destroyed, its valuable decorations stolen.

Perhaps the moral of the story is to keep things simple. Around 1300, King Edward I of England commissioned a coronation throne made of oak. For the past 700 years it has supported the heads and backsides of 38 British monarchs during the coronation ceremony at Westminster Abbey. No harm has ever come to it, save for the pranks of a few very naughty choir boys, one of whom carved on the back of the throne: “P. Abbott slept in this chair 5-6 July 1800.”

Historically Speaking: Overcoming the Labor of Calculation

Inventors tried many times over the centuries to build an effective portable calculator—but no one succeeded until Jerry Merryman.

The Wall Street Journal, April 5, 2019


The world owes a debt of gratitude to Jerry Merryman, who died on Feb. 27 at the age of 86. It was Merryman who, in 1965, worked with two other engineers at Texas Instruments to invent the world’s first pocket calculator. Today we all carry powerful calculators on our smartphones, but Merryman was the first to solve a problem that had plagued mathematicians at least since Gottfried Leibniz, one of the creators of calculus in the 17th century, who observed: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could safely be relegated to anyone else if machines were used.”

In ancient times, the only alternative to mental arithmetic was the abacus, with its simple square frame, wooden rods and movable beads. Most civilizations, including the Romans, Chinese, Indians and Aztecs, had their own version—from counting boards for keeping track of sums to more sophisticated designs that could calculate square roots.

We know of just one counting machine from antiquity that was more complex. The 2nd-century B.C. Antikythera Mechanism, discovered in 1901 in the remains of an ancient Greek shipwreck, was a 30-geared bronze calculator that is believed to have been used to track the movement of the heavens. Despite its ingenious construction, the Antikythera had a limited range of capabilities: It could calculate dates and constellations and little else.

In the 15th century, Leonardo da Vinci drew up designs for a mechanical calculating device that consisted of 13 “digit-registering” wheels. In 1967, IBM commissioned the Italian scholar and engineer Roberto Guatelli to build a replica based on the sketches. Guatelli believed that Leonardo had invented the first calculator, but other experts disagreed. In any case, it turned out that the metal wheels would have generated so much friction that the frame would probably have caught fire.

Technological advances in clockmaking helped the French mathematician and inventor Blaise Pascal to build the first working mechanical calculator, called the Pascaline, in 1644. It wasn’t a fire hazard, and it could add and subtract. But the Pascaline was very expensive to make, fragile in the extreme, and far too limited to be really useful. Pascal made only about 50 in his lifetime, of which fewer than 10 survive today.

Perhaps the greatest calculator never to see the light of day was designed by Charles Babbage in the early 19th century. He actually designed two machines—the Difference Engine, which could perform arithmetic, and the Analytical Engine, which was theoretically capable of a range of functions from direct multiplication to parallel processing. Ada, Countess of Lovelace, the daughter of the poet Lord Byron, made important contributions to the development of Babbage’s Analytical Engine. According to the historian Walter Isaacson, Lovelace realized that any process based on logical symbols could be used to represent entities other than quantity—the same principle used in modern computing.

Unfortunately, despite many years and thousands of pounds of government funding, Babbage only ever managed to build a small prototype of the Difference Engine. Even for most of the 20th century, his dream of a portable calculator stayed a dream, while the go-to instrument for doing large sums remained the slide rule. It took Merryman’s invention to allow us all to become “excellent men” and leave the labor of calculation to the machines.