Historically Speaking: Passports Haven’t Always Been Liberating

France’s Louis XIV first required international travelers to carry an official document. By the 20th century, most other countries did the same for reasons of national security.

The Wall Street Journal

August 12, 2022

As anyone who has recently applied for a passport can attest, U.S. passport agencies are still catching up from the pandemic lockdown. But even with the current delays and frustrations, a passport is, quite literally, our pass to freedom.

The exact concept did not exist in ancient times. An approximation was the official letter of introduction or safe conduct that guaranteed the security of the traveler holding it. The Hebrew Bible recounts that the prophet Nehemiah, cup-bearer to Artaxerxes I of Persia, requested a letter from the king for his mission to Judea. As an indispensable tool of international business and diplomacy, such documents were considered sacrosanct. In medieval England, King Henry V decreed that any attack on a bearer of one would be treated as high treason.

Another variation was the official credential proving the bearer had permission to travel. The Athenian army controlled the movements of officers between bases by using a clay token system. In China, by the time of the Tang dynasty in the early 7th century, trade along the Silk Road had become regulated by the paper-backed guosuo system. Functioning as both pass and identity card, a signed guosuo document was the only means of legitimate travel between towns and cities.

The birth of the modern passport may be credited in part to King Louis XIV of France, who decreed in 1669 that all individuals, whether leaving or entering his country, were required to register their personal details with the appropriate officials and carry a copy of their travel license. Ironically, the passport requirement helped to foil King Louis XVI and Marie Antoinette’s attempt to escape from Paris in 1791.

The rise of middle-class tourism during the 19th century exposed the ideological gulf between the continental and Anglo-Saxon view of passports. Unlike many European states, neither Great Britain nor America required its citizens to carry an identity card or request government permission to travel. Only 785 Britons applied for a passport in 1847, mainly out of the belief that a document personally signed by the foreign secretary might elevate the bearer in the eyes of the locals.

By the end of World War I, however, most governments had come around to the idea that passports were an essential buttress of national security. The need to own one coincided with mass upheaval across Europe: Countries were redrawn, regimes toppled, minorities persecuted, creating millions of stateless refugees.

Into this humanitarian crisis stepped an unlikely savior, the Norwegian diplomat Fridtjof Nansen. In 1922, as the first high commissioner for refugees for the League of Nations, Nansen used his office to create a temporary passport for displaced persons, enabling them to travel, register and work in over 50 countries. Among the hundreds of thousands saved by a “Nansen passport” were the artist Marc Chagall and the writer Vladimir Nabokov. With unfortunate timing, the program lapsed in 1938, the year that Nazi Germany annexed Austria and invaded Czechoslovakia.

For a brief time during the Cold War, Americans experienced the kind of politicization that shaped most other passport systems. In the 1950s, the State Department could and did revoke the passports of suspected communist sympathizers. My father Carl Foreman was temporarily stripped of his after he finished making the anti-McCarthyite film classic “High Noon.”

Nowadays, neither race nor creed nor political opinions can come between an American and his passport. But delays of up to 12 weeks are currently unavoidable.

Historically Speaking: How the Waistband Got Its Stretch

Once upon a time, human girth was bound by hooks and buttons, and corsets had metal stays. Along came rubber and a whole new technology of flexible cloth.

The Wall Street Journal

January 7, 2022

The New Year has arrived, and if you’re like me, you’ve promised yourself a slimmer, fitter and healthier you in 2022. But in the meantime there is the old you to deal with—the you who overindulged at Thanksgiving and didn’t stop for the next 37 days. No miracle diet or resolution can instantaneously eradicate five weeks of wild excess. Fortunately, modern science has provided the next best thing to a miracle: the elasticated waistband.

Before the invention of elastic, adjustable clothing was dependent on technology that had hardly changed since ancient times. The Indus Valley Civilization made buttons from seashells as early as 2000 BC.

The first inkling that there might be an alternative to buttons, belts, hooks and other adjustable paraphernalia came in the late 18th century, with the discovery that rubber wasn’t only good for toys. It also had immensely practical applications for things such as pencil erasers and lid sealants. Rubber’s stretchable nature offered further possibilities in the clothing department. But there was no word for its special property until the poet William Cowper borrowed the 17th-century term “elastic,” used to describe the expansion and contraction of gases, for his translation of the Iliad in 1791: “At once he bent Against Tydides his elastic bow.”


By 1820, an enterprising English engineer named Thomas Hancock was making elastic straps and suspenders out of rubber. He also invented the “masticator,” a machine that rolled shredded rubber into sheets for industrial use. Elastic seemed poised to make a breakthrough: In the 1840s, Queen Victoria’s shoemaker, Joseph Sparkes Hall, popularized his invention of the elastic-gusset ankle boot, still known today as the Chelsea Boot.

But rubber had drawbacks. Not only was it a rare and expensive luxury that tended to wear out quickly, it was also sticky, sweaty and smelly. Elasticized textiles became popular only after World War I, helped by the demand for steel—and female workers—that led women to forgo corsets with metal stays. Improved production techniques at last made elasticated girdles a viable alternative: In 1924, the Madame X rubber girdle promised to help women achieve a thinner form in “perfect comfort while you sit, work or play.”

The promise of comfort became real with the invention of Lastex, essentially rubber yarn, in 1930. Four years later, Alexander Simpson, a London tailor, removed the need for belts or suspenders by introducing the adjustable rubber waistband in men’s trousers.

The constant threat of rubber shortages sparked a global race to devise synthetic alternatives. The winner was the DuPont Company, which invented neoprene in 1930. That research led to an even more exciting invention: the nylon stocking. Sales were halted during World War II, creating such pent-up demand that in 1946 there were “nylon riots” throughout the U.S., including in Pittsburgh, where 40,000 people tried to buy 13,000 pairs of stockings.

DuPont scored another win in 1958 with spandex, also known under the brand name Lycra, which is not only more durable than nylon but also stretchier. Spandex made dreams possible by making fabrics more flexible and forgiving: It helped the astronaut Neil Armstrong to walk on the moon and Simone Biles to become the most decorated female gymnast in history. And it will help me to breathe a little easier until I can fit into my jeans again.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.


The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Spurred by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theater in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the waning years of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the war. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th century Russian painting

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized, the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: Duels Among the Clouds

Aerial combat was born during World War I, giving the world a new kind of military hero: the fighter pilot

“Top Gun” is back. The 1986 film about Navy fighter pilots is getting a sequel next year, with Tom Cruise reprising his role as Lt. Pete “Maverick” Mitchell, the sexy flyboy who can’t stay out of trouble. Judging by the trailer released by Paramount in July, the new movie, “Top Gun: Maverick,” will go straight to the heart of current debates about the future of aerial combat. An unseen voice tells Mr. Cruise, “Your kind is headed for extinction.”

The mystique of the fighter pilot began during World War I, when fighter planes first entered military service. The first aerial combat took place on Oct. 5, 1914, when French and German biplanes engaged in an epic contest in the sky, watched by soldiers on both sides of the trenches. At this early stage, neither plane carried fixed armament, but the German pilot had a rifle and the French pilot a machine gun; the latter won the day.

A furious arms race ensued. The Germans turned to the Dutch engineer Anthony Fokker, who devised a way to synchronize a plane’s propeller with its machine gun, creating a flying weapon of deadly accuracy. The Allies soon caught up, ushering in the era of the dogfight.

From the beginning, the fighter pilot seemed to belong to a special category of warrior—the dueling knight rather than the ordinary foot soldier. Flying aces of all nationalities gave each other a comradely respect. In 1916, the British marked the downing of the German fighter pilot Oswald Boelcke by dropping a wreath in his honor on his home airfield in Germany.

But not until World War II could air combat decide the outcome of an entire campaign. During the Battle of Britain in the summer of 1940, the German air force, the Luftwaffe, dispatched up to 1,000 aircraft in a single attack. The Royal Air Force’s successful defense of the skies led to British Prime Minister Winston Churchill’s famous declaration, “Never in the field of human conflict was so much owed by so many to so few.”

The U.S. air campaigns over Germany taught American military planners a different lesson. Rather than focusing on pilot skills, they concentrated on building planes with superior firepower. In the decades after World War II, the invention of air-to-air missiles was supposed to herald the end of the dogfight. But during the Vietnam War, steep American aircraft losses caused by the acrobatic, Soviet-built MiG fighter showed that one-on-one combat still mattered. The U.S. response to this threat was the highly maneuverable twin-engine F-15 and the formation of a new pilot training academy, the Navy Fighter Weapons School, which inspired the original “Top Gun.”

Since that film’s release, however, aerial combat between fighter planes has largely happened on screen, not in the real world. The last dogfight involving a U.S. aircraft took place in 1999, during the NATO air campaign in Kosovo. The F-14 Tomcats flown by Mr. Cruise’s character have been retired, and his aircraft carrier, the USS Enterprise, has been decommissioned.

Today, conventional wisdom again holds that aerial combat is obsolete. The new F-35 Joint Strike Fighter is meant to replace close-up dogfights with long-range weapons. But not everyone seems to have read the memo about the future of air warfare. Increasingly, U.S. and NATO pilots are having to scramble their planes to head off Russian incursions. The knights of the skies can’t retire just yet.

WSJ Historically Speaking: Postal Pitfalls, From Beacons to Emails

Charles Francis Adams was an American historical editor, politician and diplomat. He was the son of President John Quincy Adams and grandson of President John Adams, of whom he wrote a major biography. PHOTO: LIBRARY OF CONGRESS


It’s now 150 years since a trans-Atlantic cable finally crackled into continuous action after nine years of false starts and disappointments. The transmission speed of up to eight words a minute seemed to the Victorians almost godlike. Small wonder that the first telegram in the U.S., sent about two decades earlier, had read, “What hath God wrought.”

Our desire for instantaneous dialogue is as old as language itself. Contemporaries praised the masterful use of rapid communication by Persian King Xerxes I, who ruled from 486 to 465 B.C. and was famous for having slaughtered the Spartans at the Battle of Thermopylae. According to the Greek historian Herodotus, Xerxes’ messengers were the best in the ancient world, for “neither snow nor rain nor heat nor night holds back for the accomplishment of the course.” That sentiment, translated a bit differently, ended up chiseled in stone above the front columns of the New York City Post Office on Eighth Avenue.

WSJ Historically Speaking: London’s WWI Exhibit and Other Memorable Memorials


One of the most beautiful and sublime war memorials in modern history, the ceramic sea of poppies around the Tower of London, will vanish forever in two weeks’ time. The temporary installation, “Blood Swept Lands and Seas of Red” by artist Paul Cummins and designer Tom Piper, has attracted more than 4 million visitors since volunteers began “planting” the 888,246 poppies in early August—one poppy for each British or Commonwealth soldier who died.

The universal chorus of praise enjoyed by “Blood Swept Lands” is an important reminder that art and commemoration aren’t incompatible. Although it is no easy task to capture the tragedy of death or the essence of personal greatness, it can be done.

That said, public memorials are a notoriously sensitive business, and the result often ends in tears—if not for the artist, then for the public. Lord Byron thought the whole enterprise was a bad idea, especially when it came to commemorating public figures. “What is the end of Fame?” he asked in his poem “Don Juan.” “To have, when the original is dust, / A name, a wretched picture, and worse bust.”

Byron may have been influenced by the public scandal that surrounded the 1822 unveiling of Richard Westmacott’s statue in honor of the duke of Wellington. Westmacott created an 18-foot bronze colossus featuring Achilles in all his naked glory. It was the first nude statue in London since Roman times. The outcry was so great that the artist meekly added a fig leaf, thereby ruining the classical purity of the statue and making it seem more pornographic rather than less. Since then, the statue has been vandalized twice—in the predictable place.


The Wall Street Journal: Literature: The Tragic Poets of World War I


In the early 1940s, the English man of letters Robert Graves observed that patriotic verse had always been written in time of war—but only in World War I did the terms “war poet” and “war poetry” come into use, and both were “peculiar to it.” The soldiers in the trenches included enormous numbers of highly educated young men from nonmilitary backgrounds, who brought a new and different sensibility to the experience of war.

The first notable war poet to emerge was the young Rupert Brooke. His 1915 poem “The Soldier” captured the early spirit of duty and sacrifice: “If I should die, think only this of me: / That there’s some corner of a foreign field / That is for ever England.”

But the Brooke model was short-lived. Later poets challenged the idea that patriotism had any connection with such slaughter. Two in particular, Wilfred Owen (“Dulce et Decorum Est”) and Siegfried Sassoon (“How to Die”), came to symbolize the disillusionment of an entire generation. Where Sassoon was sarcastic, Owen was blunt, as in “Anthem for Doomed Youth”: “What passing-bells for these who die as cattle? / Only the monstrous anger of the guns. / Only the stuttering rifles’ rapid rattle.”


WSJ Historically Speaking: A Brief History of Avoiding Exercise


Winter storms have become so frequent in the U.S. that they now have names, like hurricanes. This week saw the arrival of Seneca, making for a touch-and-go race to see which will run out first: the alphabet or the jet stream. The weather in the eastern U.S. has been brutal enough this year that millions of Americans have been confined to their homes. In a country where, according to the Centers for Disease Control and Prevention, only one in six of us does anything like the recommended amount of physical activity, “Snowmaggedon” is a danger to the country’s health as well as its roads.

The ancients knew well that people will use any excuse to avoid exercise—bad weather, of course, being among the most popular. To counteract the natural human tendency toward inertia, the Greeks had their Olympics, the Chinese their tai chi and the Indians their yoga. The Romans went so far as to make exercise a legal requirement for all male citizens age 17 to 60. With the exception of Thomas Aquinas, who was colossally fat, lack of exercise was rarely a problem in the Middle Ages. Few people had time for aerobics when survival was the order of the day.


WSJ Historically Speaking: ’Tis the Season to Stop Fighting


For some, a traditional Christmas means church and carols; for others, it means presents under the tree. But for countless millions, Christmas also means a day of epic family arguments. As the novelist Graham Greene once observed, “Christmas it seems to me is a necessary festival; we require a season when we can regret all the flaws in our human relationships: it is the feast of failure, sad but consoling.”

A recent survey conducted for the British hotel chain Travelodge appears to support Greene’s gloomy contention. Two years ago, the chain noticed a sharp upswing in bookings for Christmas Day. Hoping to capitalize on the trend, its marketing department commissioned a poll of 2,500 households to see how the typical British family spends Christmas Day. The findings offered few useful insights for the company but proved a gold mine for sociologists.
The respondents revealed that, on average, the first fight of the day takes place no later than 10:13 a.m., usually after the discovery that someone has consumed all the chocolate. A lull then ensues while presents are opened and the drinks cabinet raided. At 11:42 or so, the children express their disappointment with their haul while the parents become enraged by their lack of gratitude. At noon comes a “discussion” of the level of alcohol consumption before lunch, followed by simmering tension until everyone finally sits down to eat around 2:23. The fragile truce established during the turkey carving is destroyed by a massive family row at 3:24. Exhaustion then sets in until 6:05, when tempers flare over the remote control. At 10:15, there is one final blowup before everyone goes to bed.
