The Sunday Times – Special relationship: US mourns the Queen as it would a president

America’s relationship with the monarchy has always been complicated, but she brought the two nations together

The Sunday Times

Saturday, September 10 2022

My phone started ringing at 7am New York time on Thursday. The news about the Queen’s health had reached the networks and they were calling everyone in. By 9am I was in the CBS newsroom and so began a long day that has not yet ended.

To say that Americans are reacting strongly to the death of the Queen hardly does justice to the immense coverage it has received. Not since the funeral of Pope John Paul II in 2005 has an international figure been given the kind of treatment normally reserved for departed presidents.

Kamala Harris, the US vice-president, signs the condolence book at the British embassy in Washington

That includes the American tendency to start criticising long before any European outlet would consider it seemly to do so. As any psychologist will tell you, extreme love and hate are two sides of the same coin.

In the studio I was struck by the comment of a “lifestyle” journalist who remarked that Elizabeth II had been the star of a reality TV family, years before the concept had been invented. “Is that why we are so obsessed with the Queen and the monarchy?” mused the anchor before proceeding to display a level of knowledge about the Queen that would put a royal correspondent to shame.

On Thursday night New York was a blinking skyline of mauve, purple and lavender tributes, led by the Empire State Building in silver and purple. It was almost exactly 65 years ago, in October 1957, that the Queen had gone to the top of the building to see the view. Cities that the Queen had never visited also paid their respects. Las Vegas illuminated its High Roller observation wheel — the world’s second tallest — in neon purple.

American sport paused briefly, too. On Thursday a minute’s silence was observed and an image of the Queen projected on to large screens before play at the NFL season opener in Los Angeles, at the US Open tennis tournament and at the baseball game in Yankee Stadium.

Not everyone was in mourning though. With a breathtaking lack of decorum, The New York Times rushed out an opinion piece that accused the Queen of being a participant in Britain’s cover-up of its colonial crimes. This was tepid compared with some outbursts on social media. A linguistics professor at Carnegie Mellon University in Pennsylvania tweeted her hope that the Queen’s final hours were “excruciating”.

America’s relationship with the monarchy has always been complicated. After victory in the War of Independence George Washington angrily dismissed suggestions that he become king of the new United States. But there were still officers in his own army who thought that a monarchy would be good for the country. It was an idea that never went away.

Britain was one of the first countries to recognise the Republic of Texas in 1836, and Texans repaid the compliment by openly flirting with the idea of joining the British Empire. At the start of the American Civil War in 1861, the British consulate in Charleston reported high-level discussions on whether it would aid the Confederacy to invite a younger member of the royal family to become king once the South had achieved independence.

Around the same time that the 19th-century journalist Thomas Nichols asked “Why does America hate England?”, this anglophobic nation gave the Prince of Wales, the future Edward VII, the biggest welcome yet shown to a foreign dignitary.

Like a cross between a Greek tragedy and a French farce, the drama of Anglo-American relations during the 19th century mainly consisted of young America alternating between vying for Britain’s good opinion and wanting to destroy her.

The Queen had the good fortune to come to the throne after the US had passed through this awkward adolescent phase. Aged 25, she was stepping into a role that had played an essential part in shaping American identity. Not quite mother figure, certainly not hated stepmother, and yet familial, maternal, and in the origin-story she represented, eternal.

It is no wonder that during her first state visit in 1957, a million people lined the roads to watch her arrive in Washington. There’s no escaping family. Still, family, like biology, is not destiny. It was her choice to embrace the past as proof that the two countries had put it behind them, like mature adults.

That first state visit took place after a crack in Anglo-American relations caused by Suez. Much depended, therefore, on the Queen’s four-night stay at the White House. It wasn’t merely a matter of repairing relations with President Eisenhower; she also needed to charm the public — which she did by, among other things, showing herself to be interested in the things that interested them, like watching a football game.

The recovery of good relations was a testament to the Queen’s personal touch with US presidents. In 1976, at a state dinner in Washington, she rather boldly referred to the British burning of the White House, only to add: “But these early quarrels are long buried. What is more important is that our shared language, traditions and history have given us a common vision of what is right and just.”

Yet this mutual heritage alone was not enough to bind the nations, she came to believe. Active reminders and a stroking of the ego of whoever occupied the White House were equally important.

The success of this policy was borne out by what did not happen during the Falklands conflict in 1982. Despite strong reasons to avoid supporting Britain against Argentina, the US proved to be a reliable ally. Much has been made of the role played by Margaret Thatcher’s relationship with Ronald Reagan. But the Queen’s adept handling of Nancy Reagan during the latter’s visit for the royal wedding the year before had helped to lay the groundwork. After the wedding, the Queen issued a rare private invitation to the Reagans to stay with her at Windsor Castle the following year. A small thing, perhaps, and yet personal touches can decide the fate of a nation.

Before President Obama’s first visit to London in 2009 it was reported that he had little interest in cultivating a special relationship, but Operation Obama proved the most successful charm offensive in the Queen’s history of diplomacy.

Perhaps it was the moment that the Queen and Michelle Obama put their arms around each other, or the amount of private time that the Queen and Prince Philip gave to the Obamas. Either way, by the end of Obama’s two terms relations between the White House and Buckingham Palace had never been better.

Americans are in shock not because, or not merely because, the Queen was famous and her family make for good copy; she held up a mirror in which this nation saw a different self, perhaps the best self that only a mother sees.

Historically Speaking: The Noble Elf Has a Devilish Alter-Ego

Pointy-eared magical creatures abound in folklore, but they weren’t always cute

The Wall Street Journal

September 8, 2022

“The Rings of Power” series, Amazon’s prequel to J.R.R. Tolkien’s fantasy epic, “The Lord of the Rings,” reserves a central role for heroic elves. Members of this tall, immortal race are distinguished by their beauty and wisdom and bear little resemblance to the merry, diminutive helpers in Santa’s workshop.

Yet both are called elves. One reason for the confusion is that the idea of pointy-eared magical beings has never been culturally specific. The ancient Greeks didn’t have elves per se, but their myths did include sex-mad satyrs, Dionysian half-human, half-animal nature spirits whose ears were pointed like a horse’s.

Before their conversion to Christianity, the Anglo-Saxons, like their Norse ancestors, believed in magical beings such as water spirits, elves and dragons. Later, in the epic poem Beowulf, written down around 1000, the “ylfe” is among the monsters spawned by the biblical Cain.

Benjamin Walker as Gil-galad, High King of the Elves of the West, in “The Rings of Power”
PHOTO: AMAZON STUDIOS

The best accounts of the early Norse myths come from two medieval Icelandic collections known as the Eddas, which are overlaid with Christian cosmology. The Prose Edda divided elves into the “light elves,” who are fair and wondrous, and the “dark elves,” who live underground and are interchangeable with dwarves. Both kinds appeared in medieval tales to torment or, occasionally, help humans.

When not portrayed as the cause of unexplained illnesses, elves were avatars for sexual desire. In Chaucer’s comic tale, the lusty “Wife of Bath” describes the elf queen as sex personified and then complains that the friars have chased all the fair folk away.

The popular conception of elves continued to evolve during the Renaissance under the influence of French “faerie” folklore, Celtic myths and newly available translations of Ovid’s “Metamorphoses” and Virgil’s “Georgics.” Shakespeare took something from almost every tradition in “A Midsummer Night’s Dream,” from Puck the naughty little sprite to Queen Titania, seductress of hapless humans.

But while elves were becoming smaller and cuter in English literature, in Northern Europe they retained their malevolence. Inspired by the Germanic folk tale of the Elf King who preys on little children, in 1782 Goethe composed “Der Erlkönig,” about a boy’s terror as he is chased through the forest to his death. Schubert liked the ballad so much that he set it to music.

In the 19th century, the Brothers Grimm, Jacob and Wilhelm, along with Hans Christian Andersen, brought ancient fairy tales and folk whimsy to a world eager for relief from rampant industrialization. The Grimms put a cheerful face on capitalism with the story of a cobbler and the industrious elves who work to make him wealthy. Clement Clarke Moore made elves the consumer’s friend in his night-before-Christmas poem, “A Visit From St. Nicholas,” where a “jolly old elf” stuffs every stocking with toys.

On the more serious side, the first English translation of Beowulf appeared in 1837, marking the beginning of the Victorians’ obsession with the supernatural and all things gothic. The poem’s negative associations with elves burst back into the open with Richard Wagner’s Ring Cycle, based on Germanic legends, which portrayed the Elf King Alberich as an evil dwarf.

The elfin future would likely have been silly or satanic were it not for Tolkien’s restoration of the “light elf” tradition. For now, at least, the lovely royal elf Galadriel rules.

Historically Speaking: The Ancient Art of the Tattoo

Body ink has been used to elevate, humiliate and decorate people since the times of mummies.

The Wall Street Journal

August 25, 2022

Earlier this month the celebrity couple Kim Kardashian and Pete Davidson announced that their nine-month relationship was over. Ms. Kardashian departed with her memories, but Mr. Davidson was left with something a little more tangible: the words “my girl is a lawyer” tattooed on his shoulder in (premature) homage to Ms. Kardashian’s legal aspirations. He has since been inundated on social media with suggestions on how to cover it up. In 1993, following his split with actress Winona Ryder, the actor Johnny Depp changed “Winona Forever” to “Wino Forever.”

Throughout history, humans have tattooed themselves—and others—for reasons spanning the gamut from religion to revenge. The earliest evidence for tattooing comes from a 5,300-year-old ice mummy nicknamed Ötzi after its discovery in the Ötztal Alps in Europe. An analysis of Ötzi’s remains revealed that he had been killed by an arrow. Even before his violent death, however, he appeared to have suffered from various painful ailments. Scientists found 61 tattoo marks across Ötzi’s body, with many of them placed on known acupuncture points, prompting speculation that the world’s oldest tattoos were used as a health aid.

Similar purposes may have prompted the ancient Nubians to apply tattoos on some pregnant women. Tomb paintings and mummified remains of women in Egypt show that they also adopted the custom, possibly for the same reason.

PHOTO: THOMAS FUCHS

The indelible aspect of tattooing inspired diametrically opposed attitudes. Some ancient peoples, such as the Thracians and the Gauls, regarded tattoos as a mark of noble status and spiritual power. But the Persians, Greeks and Romans used them as a form of punishment or humiliation. Those who imported slaves to Rome from Asia paid duties and tattooed “tax paid” on the foreheads of those they enslaved.

In Polynesian cultures, tattoos were imbued with symbolism. The traditional tatau, which gave rise to the English word tattoo via Captain James Cook in the 18th century, covered the bodies of Samoan men from the waist to the knees. The ritual application took many weeks and entailed excruciating pain plus the danger of septicemia, but anyone who gave up brought shame upon his family for generations.

As a rule, Christian missionaries tried to stamp out the practice during the 19th century. But their disapproval was no match for royal enthusiasm. Fascinated by irezumi, the Japanese decorative art of tattooing inspired by woodblock printing, the future King George V of Britain and his brother Prince Albert Victor both had themselves inked in 1881 while on a royal visit. Noting its subsequent spread among the American upper classes, the New York Herald complained, “The Tattooing Fad has Reached New York Via London.”

In the 20th century, the practice retained a dark side as a symbol of criminality and oppression—most notably associated with the Nazis’ tattooing of inmates at Auschwitz. At the same time, however, so many returning U.S. servicemen had them that the Marlboro Man sported one on his hand in the advertisements of the day.

Historically Speaking: Passports Haven’t Always Been Liberating

France’s Louis XIV first required international travelers to carry an official document. By the 20th century, most other countries did the same for reasons of national security.

The Wall Street Journal

August 12, 2022

As anyone who has recently applied for a passport can attest, U.S. passport agencies are still catching up from the pandemic lockdown. But even with the current delays and frustrations, a passport is, quite literally, our pass to freedom.

The exact concept did not exist in ancient times. An approximation was the official letter of introduction or safe conduct that guaranteed the security of the traveler holding it. The Hebrew Bible recounts that the prophet Nehemiah, cup-bearer to Artaxerxes I of Persia, requested a letter from the king for his mission to Judea. As an indispensable tool of international business and diplomacy, such documents were considered sacrosanct. In medieval England, King Henry V decreed that any attack on a bearer of one would be treated as high treason.

Another variation was the official credential proving the bearer had permission to travel. The Athenian army controlled the movements of officers between bases by using a clay token system. In China, by the time of the Tang dynasty in the early 7th century, trade along the Silk Road had become regulated by the paper-backed guosuo system. Functioning as a pass and identity card, possession of a signed guosuo document was the only means of legitimate travel between towns and cities.

The birth of the modern passport may be credited in part to King Louis XIV of France, who decreed in 1669 that all individuals, whether leaving or entering his country, were required to register their personal details with the appropriate officials and carry a copy of their travel license. Ironically, the passport requirement helped to foil King Louis XVI and Marie Antoinette’s attempt to escape from Paris in 1791.

The rise of middle-class tourism during the 19th century exposed the ideological gulf between the continental and Anglo-Saxon view of passports. Unlike many European states, neither Great Britain nor America required its citizens to carry an identity card or request government permission to travel. Only 785 Britons applied for a passport in 1847, mainly out of the belief that a document personally signed by the foreign secretary might elevate the bearer in the eyes of the locals.

By the end of World War I, however, most governments had come around to the idea that passports were an essential buttress of national security. The need to own one coincided with mass upheaval across Europe: Countries were redrawn, regimes toppled, minorities persecuted, creating millions of stateless refugees.

Into this humanitarian crisis stepped an unlikely savior, the Norwegian diplomat Fridtjof Nansen. In 1922, as the first high commissioner for refugees for the League of Nations, Nansen used his office to create a temporary passport for displaced persons, enabling them to travel, register and work in over 50 countries. Among the hundreds of thousands saved by a “Nansen passport” were the artist Marc Chagall and the writer Vladimir Nabokov. With unfortunate timing, the program lapsed in 1938, the year that Nazi Germany annexed Austria and invaded Czechoslovakia.

For a brief time during the Cold War, Americans experienced the kind of politicization that shaped most other passport systems. In the 1950s, the State Department could and did revoke the passports of suspected communist sympathizers. My father Carl Foreman was temporarily stripped of his after he finished making the anti-McCarthyite film classic “High Noon.”

Nowadays, neither race nor creed nor political opinions can come between an American and his passport. But delays of up to 12 weeks are currently unavoidable.

Historically Speaking: The Mystical Origins of Wordplay

From oracular riddles to the daily Wordle, humans have always had the urge to decode

The Wall Street Journal

July 28, 2022

In 2021, a software engineer named Josh Wardle uploaded Wordle, a 5-letter word puzzle, for a few friends and relatives. By February this year, the number of players had jumped to the millions, and Wardle’s daily Wordle game had become a global phenomenon and the property of the New York Times.

Wordle’s success is unusual but not unprecedented. The urge to decode patterns lies deep within the human psyche. Puzzles, whether mathematical or linguistic, were originally associated with cosmic truths and celestial communication. In the Rhind papyrus, which the British Museum dates to around 1550 B.C., the Egyptian scribe Ahmes presented 84 mathematical problems that he claimed contained the key to “knowledge of all existing things.” The Chinese I Ching, or Book of Changes, a set of 64 hexagrams (six-lined figures) thought to have been written down with commentary around 800 B.C., was also said to provide a basis for understanding the universe.

ILLUSTRATION: THOMAS FUCHS

The Greeks saw riddles as a means of interacting with the gods. Pilgrims would visit oracles, believing that the gods spoke through their priestesses by means of riddles. Interpreting these utterances, however, was fraught with danger. According to Herodotus, King Croesus of Lydia took the Oracle of Delphi’s prediction that a great empire would fall if he attacked the Persians to mean that victory was assured. It was—for the Persians.

With the spread of literacy, anagrams and acrostics became an alternative medium for heavenly messaging. The Hebrew Bible contains several examples, including the alphabetic acrostic Psalm 119, which symbolizes the presence of God in everything from A to Z (aleph to tav).

It remains a matter of scholarly dispute whether the acrostic Sator Square, a two-dimensional, five-line palindrome square made up of five five-letter Latin words, was just clever Roman graffiti or a means of transmitting Christian messages. From its earliest appearance in 1st-century Pompeii, the Sator Square has been found in medieval churches across Europe. It may have started out as a bit of fun, but it ended up as something deeply serious.
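For readers who have never seen it, the square’s five words read the same across and down, and the whole 25-letter sequence is a palindrome:

    SATOR
    AREPO
    TENET
    OPERA
    ROTAS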

One of the most famous nonreligious word squares is “Xuanji Tu,” composed by the 4th-century Chinese poet Su Hui to win back her errant husband. The 29-by-29-character grid can be read in any direction and is said to contain 7,958 poems.

Although word squares, puzzles and riddles gradually shed their magical connotations, their popularity in the West remained undimmed. During the 17th century, King Louis XIII of France even employed his own Royal Anagrammatist. In colonial America, Benjamin Franklin wisely included all manner of puzzles in his Poor Richard’s Almanack.

The Disney-fication of Lewis Carroll’s 1865 fantasy novel, “Alice’s Adventures in Wonderland,” has obscured its triumph as a game of language from start to finish. An Oxford mathematician, Carroll was an inveterate puzzler. His playful inventions include an early form of Scrabble and the “word ladder,” whereby a series of one-letter changes transforms a word into its opposite.
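Carroll’s ladder is, in modern terms, a shortest-path puzzle, and a breadth-first search solves it neatly. Below is a minimal sketch in Python; the tiny word list and the helper names are inventions for this illustration rather than any standard library, but the ladder it finds is the classic HEAD-to-TAIL solution usually credited to Carroll.

```python
from collections import deque
import string

# Toy word list chosen so that the classic HEAD -> TAIL ladder exists.
# Any real dictionary of four-letter words would work just as well.
WORDS = {"head", "heal", "teal", "tell", "tall", "tail", "hear", "heat", "seal"}

def neighbors(word):
    """Yield every word in WORDS differing from `word` by exactly one letter."""
    for i in range(len(word)):
        for c in string.ascii_lowercase:
            candidate = word[:i] + c + word[i + 1:]
            if candidate != word and candidate in WORDS:
                yield candidate

def word_ladder(start, goal):
    """Breadth-first search: returns a shortest ladder from start to goal."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in neighbors(path[-1]):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no ladder exists within this word list

print(" -> ".join(word_ladder("head", "tail")))
# head -> heal -> teal -> tell -> tall -> tail
```

Breadth-first search guarantees the ladder found is the shortest possible, which is the whole point of the game.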

By contrast, the crossword was born of necessity. In 1913, Arthur Wynne, the color supplement editor of Joseph Pulitzer’s New York World, had a blank page to fill and resorted to a word-square puzzle. He called it a “word-cross” and invited readers to complete the grid by solving a series of clues. Like Wordle, the game became an overnight sensation. Nevertheless, it took 85 years for crossword puzzles to achieve the ultimate accolade of civilization—a place in The Wall Street Journal.

Historically Speaking: The Women Who Have Gone to War

There have been female soldiers since antiquity, but only in modern times have military forces accepted and integrated them

The Wall Street Journal

July 14, 2022

“War is men’s business,” Prince Hector of Troy declares in Homer’s Iliad, a sentiment shared by almost every culture since the beginning of history. But Hector was wrong. War is women’s business, too, even though their roles are frequently overlooked.

This month marks 75 years since the first American woman received a regular Army commission. Florence Aby Blanchfield, superintendent of the 59,000-strong Army Nurse Corps during World War II, was appointed Lt. Colonel by General Dwight Eisenhower in July 1947. Today, women make up approximately 19% of the officer corps of the Armed Forces.

The integration of women into the military is a fundamental difference between the ancient and modern worlds. In the past, a weapon-wielding woman was seen as symbolizing the shame and emasculation of men. Among the foundation myths depicted on the Parthenon are scenes of the Athenian army defeating the Amazons, a race of warrior women.

Such propaganda couldn’t hide, however, the fact that in real life the Greeks and Romans on occasion fought and even lost against female commanders. Artemisia I of Caria was one of the Persian king Xerxes’s most successful naval commanders. Hearing about her exploits against the Greeks during the Battle of Salamis in 480 B.C., he is alleged to have exclaimed: “My men have become women, and my women men.” The Romans crushed Queen Boudica’s revolt in what is now eastern England in 61 A.D., but not before she had destroyed the 9th Roman Legion and massacred 70,000 others.

The medieval church was similarly torn between ideology and reality in its attitude toward female Christian warriors. Yet women did take part in the Crusades. Most famously, Queen Eleanor of Aquitaine accompanied her husband King Louis VII of France on the Second Crusade (1147-1149) and was by far the better strategist of the two. However, Eleanor’s enemies cited her presence as proof that she was a gender-bending harlot.

Florence Aby Blanchfield was the first woman to receive a regular commission with the U.S. Army

For centuries, the easiest way for a woman to become a soldier was to pass as a boy. In 1782, Massachusetts-born Deborah Sampson became one of the first American women to fight for her country by enlisting as a youth named Robert Shurtleff. During the Civil War, anywhere between 400 and 750 women practiced similar deceptions.

A dire personnel shortage finally opened a legal route for women to enter the Armed Forces. Unable to meet its recruitment targets, in March 1917 the U.S. Navy announced that it would allow all qualified persons to enlist in the reserves. Loretta Perfectus Walsh, a secretary in the Philadelphia naval recruiting office, signed up almost immediately. The publicity surrounding her enlistment as the Navy’s first female chief yeoman encouraged thousands more to step forward.

World War II proved to be similarly transformative. In the U.S., more than 350,000 women served in uniform. In Britain, the future Queen Elizabeth II made history by becoming a military mechanic in the women’s branch of the British Army.

Although military women have made steady gains in terms of parity, the debate over their presence is by no means over. Yet the “firsts” keep coming. In June, Adm. Linda L. Fagan of the U.S. Coast Guard became the first woman to lead a branch of the U.S. Armed Forces. In the past as now, whatever the challenge, there’s always been a woman keen to accept it.

Historically Speaking: The Quest to Understand Skin Cancer

The 20th-century surgeon Frederic Mohs made a key breakthrough in treating a disease first described in ancient Greece.

The Wall Street Journal

June 30, 2022

July 1 marks the 20th anniversary of the death of Dr. Frederic Mohs, the Wisconsin surgeon who revolutionized the treatment of skin cancer, the most common form of cancer in the U.S. Before Mohs achieved his breakthrough in 1936, the best available treatment was drastic surgery without even the certainty of a cure.

Skin cancer is by no means a new illness or confined to one part of the world; paleopathologists have found evidence of it in the skeletons of 2,400-year-old Peruvian mummies. But it wasn’t recognized as a distinct cancer by ancient physicians. Hippocrates in the 5th century B.C. came the closest, noting the existence of deadly “black tumors (melas oma) with metastasis.” He was almost certainly describing malignant melanoma, a skin cancer that spreads quickly, as opposed to the other two main types, basal cell and squamous cell carcinoma.

ILLUSTRATION: THOMAS FUCHS

After Hippocrates, nearly 2,000 years elapsed before earnest discussions about black metastasizing tumors began to appear in medical writings. The first surgical removal of a melanoma took place in London in 1787. The surgeon involved, a Scotsman named John Hunter, was mystified by the large squishy thing he had removed from his patient’s jaw, calling it a “cancerous fungus excrescence.”

The “fungoid disease,” as some referred to skin cancer, yielded up its secrets by slow degrees. In 1806 René Laënnec, the inventor of the stethoscope, published a paper in France on the metastatic properties of “La Melanose.” Two decades later, Arthur Jacob in Ireland identified basal cell carcinoma, which was initially referred to as “rodent ulcer” because the ragged edges of the tumors looked as though they had been gnawed by a mouse.

By the beginning of the 20th century, doctors had become increasingly adept at identifying skin cancers in animals as well as humans, making the lack of treatment options all the more frustrating. In 1933, Mohs was a 23-year-old medical student assisting on cancer research in rats when he noticed the destructive effect of zinc chloride on malignant tissue. Excited by its potential, within three years he had developed a zinc chloride paste and a technique for using it on cancerous lesions.

He initially described it as “chemosurgery” since the cancer was removed layer by layer. The results for his patients, all of whom were either inmates of the local prison or the mental health hospital, were astounding. Even so, his method was so novel that the Dane County Medical Association in Wisconsin accused him of quackery and tried to revoke his medical license.

Mohs continued to encounter stiff resistance until the early 1940s, when the Quislings, a prominent Wisconsin family, turned to him out of sheer desperation. Their son, Abe, had a lemon-sized tumor on his neck which other doctors had declared to be inoperable and fatal. His recovery silenced Mohs’s critics, although the doubters remained an obstacle for several more decades. Nowadays, a modern version of “Mohs surgery,” using a scalpel instead of a paste, is the gold standard for treating many forms of skin cancer.

Historically Speaking: The Modern Flush Toilet Has Ancient Origins

Even the Minoans of Crete found ways to whisk away waste with flowing water.

The Wall Street Journal

June 9, 2022

Defecation is a great equalizer. As the 16th-century French Renaissance philosopher Michel de Montaigne put it trenchantly in his Essays, “Kings and philosophers shit, as do ladies.”

Yet, even if each person is equal before the loo, not all toilets are considered equal. Sitting or squatting, high or low-tech, single or dual flush: Every culture has preferences and prejudices. A top-end Japanese toilet with all the fixtures costs as much as a new car.

ILLUSTRATION: THOMAS FUCHS

Pride in having the superior bathroom experience goes back to ancient times. As early as 2500 B.C., wealthy Mesopotamians could boast of having pedestal lavatories and underfloor piping that fed into cesspits. The Harappans of the Indus Valley Civilization went one better, building public drainage systems that enabled even ordinary dwellings to have bathrooms and toilets. Both, however, were surpassed by the Minoans of Crete, who invented the first known flush toilet, using roof cisterns that relied on the power of gravity to flush the contents into an underground sewer.

The Romans’ greatest contribution to sanitary comfort was the public restroom. By 300 B.C., Rome had nearly 150 public toilet facilities. These were communal, multi-seater affairs consisting of long stone benches with cut-out holes set over a channel of continuously running water. Setting high standards for hygiene, the restrooms had a second water trough for washing and sponging.

Although much knowledge and technology was lost during the Dark Ages, the Monty Python depiction of medieval society as unimaginably filthy was somewhat of an exaggeration. Castle bedrooms were often en-suite, with pipes running down the exterior walls or via internal channels to moats or cesspits. Waste management was fraught with danger, though—from mishaps as much as disease.

The most famous accident was the Erfurt Latrine Disaster. In 1184, King Henry VI of Germany convened a royal gathering at Petersburg Citadel in Erfurt, Thuringia. Unfortunately, the ancient hall was built over the citadel’s latrines. The meeting was in full swing when the wooden flooring suddenly collapsed, hurling many of the assembled nobles to their deaths in the cesspit below.

Another danger was the vulnerability of the sewage system to outside penetration. Less than 20 years after Erfurt, French troops in Normandy captured the English-held Chateau Gaillard by climbing up the waste shafts.

Sir John Harington, a godson of Queen Elizabeth I, rediscovered the flushable toilet in 1596. Her Majesty had one installed at Richmond Palace. But the contraption failed to catch on, perhaps because smells could travel back up the pipe. The Scottish inventor Alexander Cumming solved that problem in the late 18th century by introducing an S-shaped pipe below the bowl that prevented sewer gas from escaping.

Thomas Crapper, contrary to lore, didn’t invent the modern toilet: He was the chief supplier to the royal household. Strangely for a country renowned for its number of bathrooms per household, the U.S. granted its first patent for a toilet—or “plunger closet”—only in 1857. As late as 1940, some 45% of households still had outhouses. The American toilet race, like the space race, only took off later, in the ’50s. There is no sign of its slowing down. Coming to a galaxy near you: The cloud-connected toilet that keeps track of your vitals and cardiovascular health.

Historically Speaking: Inflation Once Had No Name, Let Alone Remedy

Empires from Rome to China struggled to restore the value of currencies that spiraled out of control

The Wall Street Journal

May 27, 2022

Even if experts don’t always agree on the specifics, there is broad agreement on what inflation is and on its dangers. But this consensus is relatively new: The term “inflation” only came into general usage during the mid-19th century.

Long before that, Roman emperors struggled to address the nameless affliction by debasing their coinage, which only worsened the problem. By 268 A.D., the silver content of the denarius had dropped to 0.5%, while the price of wheat had risen almost 200-fold. In 301, Emperor Diocletian tried to restore the value of Roman currency by imposing rigid controls on the economy. But the reforms addressed inflation’s symptoms rather than its causes. Even Diocletian’s government preferred to collect taxes in kind rather than in specie.

A lack of knowledge about the laws of supply and demand also doomed early Chinese experiments in paper money during the Southern Song, Mongol and Ming Dynasties. Too many notes wound up in circulation, leading to rampant inflation. Thinking that paper was the culprit, the Chongzhen Emperor hoped to restore stability by switching to silver coins. But these introduced other vulnerabilities. In the 1630s, the decline of Spanish silver from the New World (alongside a spate of crop failures) resulted in a money shortage—and a new round of inflation. The Ming Dynasty collapsed not long after, in 1644.

Spain was hardly in better shape. The country endured unrelenting inflation during the so-called Price Revolution in Europe in the 16th and 17th centuries, as populations increased, demand for goods spiraled and the purchasing power of silver collapsed. The French political theorist Jean Bodin recognized as early as 1568 that rising prices were connected to the amount of money circulating in the system. But his considered view was overlooked in the rush to find scapegoats, such as the Jews.

ILLUSTRATION: THOMAS FUCHS

The great breakthrough came in the 18th century as classical economists led by Adam Smith argued that the market was governed by laws and could be studied like any other science. Smith also came close to identifying inflation, observing that wealth is destroyed when governments attempt to “inflate the currency.” The term “inflation” became common in the mid-19th century, particularly in the U.S., in the context of boom and bust cycles caused by an unsecured money supply.

Ironically, the worst cases of inflation during the 20th century coincided with the rise of increasingly sophisticated models for predicting it. The hyperinflation of the German Papiermark during the Weimar Republic in 1921-23 may be the most famous, but it pales in comparison to the Hungarian pengő in 1945-46. Inflamed by the government’s weak response, inflation reached the point where prices doubled every 15 hours. The one billion trillion pengő note was worth about one pound sterling. By 1949 the currency had gone—and so had Hungary’s democracy.
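To put the pengő’s 15-hour doubling time in perspective (a back-of-the-envelope editorial illustration, not a figure from the original column): doubling every 15 hours compounds over a 24-hour day to a multiplier of 2^(24/15) ≈ 3.03, meaning prices roughly tripled every day, an inflation rate on the order of 200% a day.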

In 1982, the U.S. Federal Reserve under Paul Volcker achieved a historic victory over what became known as the Great Inflation of the 1960s and ’70s. It did so through an aggressive regimen of high interest rates to curb spending. Ordinary Americans suffered high unemployment as a result, but the country endured. As with any affliction, it isn’t enough for doctors to identify the cause: The patient must be prepared to take his medicine.

Historically Speaking: Typos Have Been Around as Long as Writing Itself

Egyptian engravers, medieval scribes and even Shakespeare’s printer made little mistakes that have endured

The Wall Street Journal

May 12, 2022

The Lincoln Memorial in Washington, D.C., is 100 years old this month. The beloved national monument is no less perfect for having one slight flaw: The word “future” in the Second Inaugural Address was mistakenly carved as “Euture.” It is believed that the artist, Ernest C. Bairstow, accidentally picked up the “e” stencil instead of the “f.” He tried to fix it by filling in the bottom line, but the smudged outline is still visible.

Bairstow was by no means the first engraver to rely on the power of fillers. It wasn’t uncommon for ancient Egyptian carvers—most of whom were illiterate—to botch their inscriptions. The seated Pharaoh statue at the Penn Museum in Philadelphia depicts Rameses II, third Pharaoh of the 19th dynasty, who lived during the 13th century B.C. Part of the inscription ended up being carved backward, which the artist tried to hide with a bit of filler and paint. But time and wear have made the mistake, well, unmistakable.

ILLUSTRATION: THOMAS FUCHS

Medieval scribes were notorious for botching their illuminated manuscripts, using all kinds of paint tricks to hide their errors. But for big mistakes—for example, when an entire line was dropped—the monks could get quite inventive. In a 13th-century Book of Hours at the Walters Art Museum in Baltimore, an English monk solved the problem of a missing sentence by writing it at the bottom of the page and drawing a ladder with a man on it, pulling the sentence by a rope to where it was meant to be.

The phrase “the devil is in the details” may have been inspired by Titivillus, the medieval demon of typos. Monks were warned that Titivillus ensured that every scribal mistake was collected and logged, so that it could be held against the offender at Judgment Day.

The warning seems to have had only limited effect. The English poet Geoffrey Chaucer was so enraged by his copyist Adam Pinkhurst that he attacked him in verse, complaining he had “to rub and scrape: and all is through thy negligence and rape.”

The move to print failed to solve the problem of typos. When Shakespeare’s Cymbeline was first committed to print, the name of the heroine was accidentally changed from Innogen to Imogen, which is how she is known today. Little typos could have big consequences, such as the so-called Wicked Bible of 1631, whose printers managed to leave out the “not” in the seventh commandment, thereby telling Christians that “thou shalt commit adultery.”

The rise of the newspaper deadline in the 19th century inevitably led to typos big and small, unfortunate and unlikely. In 1838, British readers of the Manchester Guardian were informed that “writers” rather than “rioters” had caused extensive property damage during a protest meeting in Yorkshire.

In the age of computers, a single typo can have catastrophic consequences. On July 22, 1962, NASA’s Mariner 1 probe to Venus exploded just 293 seconds after launching. The failure was traced to an inputting error. A single hyphen was inadvertently left off one of the codes.