Historically Speaking: Tourists Behaving Badly

When today’s travelers get in trouble for knocking over statues or defacing temples, they’re following an obnoxious tradition that dates back to the Romans.

The Wall Street Journal

September 8, 2023

Tourists are giving tourism a bad name. The industry is a vital cog in the world economy, generating more than 10% of global GDP in 2019. But the antisocial behavior of a significant minority is causing some popular destinations to enact new rules and limits. Among the list of egregious tourist incidents this year, two drunk Americans had to be rescued off the Eiffel Tower, a group of Germans in Italy knocked over a 150-year-old statue while taking selfies, and a Canadian teen in Japan defaced an 8th-century temple.

It’s ironic that sightseeing, one of the great perks of civilization, has become one of its downsides. The ancient Greeks called it theoria and considered it to be both good for the mind and essential for appreciating one’s own culture. As the 5th-century B.C. Greek poet Lysippus quipped: “If you’ve never seen Athens, your brain’s a morass./If you’ve seen it and weren’t entranced, you’re an ass.”

The Romans surpassed the Greeks in their love of travel. Unfortunately, they became the prototype for that tourist cliché, the “ugly American,” since they were rich, entitled and careless of other people’s heritage. The Romans never saw an ancient monument they didn’t want to buy, steal or cover in graffiti. The word “miravi”—“I was amazed”—was the Latin equivalent of “Kilroy was here,” and can be found all over Egypt and the Mediterranean.

Thomas Fuchs

Mass tourism picked up during the Middle Ages, facilitated by the Crusades and the popularity of religious pilgrimages. But so did the Roman habit of treating every ancient building like a public visitor’s book. The French Crusader Lord de Coucy actually painted his family coat of arms onto one of the columns of the Church of the Nativity in Bethlehem.

In 17th- and 18th-century Europe, the scions of aristocratic families would embark on a Grand Tour of famous sights, especially in France and Italy. The idea was to turn them into sophisticated men of the world, but for many young men, the real point of the jaunt was to sample bordellos and be drunk and disorderly without fear of their parents finding out.

Even after the Grand Tour went out of fashion, the figure of the tourist usually conjured up negative images. Visiting Alexandria in Egypt in the 19th century, the French novelist Gustave Flaubert raged at the “imbecile” who had painted the words “Thompson of Sunderland” in six-foot-high letters on Pompey’s Pillar. The perpetrators were in fact the rescued crew of a ship named the Thompson.

Flaubert was nevertheless right about the sheer destructiveness of some tourists. Souvenir hunters were among the worst offenders. In the Victorian era, Stonehenge in England was chipped and chiseled with such careless disregard that one of its massive stones eventually collapsed.

Sometimes tourists go beyond vandalism to outright madness. Jerusalem Syndrome, first recognized in the Middle Ages, is the sudden onset of religious delusions while visiting the biblical city. Stendhal Syndrome is an acute psychosomatic reaction to the beauty of Florence’s artworks, named for the French writer who suffered such an attack in 1817. There’s also Paris Syndrome, a transient psychosis triggered by extreme feelings of letdown on encountering the real Paris.

As for Stockholm Syndrome, when an abused person identifies with their abuser, there’s little chance of it developing in any of the places held hostage by hordes of tourists.

Historically Speaking: The Quest to Look Young Forever

From drinking gold to injecting dog hormones, people have searched for eternal youth in some unlikely places.

The Wall Street Journal

May 18, 2023

A study explaining why mouse hairs turn gray made global headlines last month. Not because the little critters are in desperate need of a makeover, but because knowing the “why” in mice could lead to a cure for graying locks in humans. Everyone nowadays seems to be chasing after youth, whether to keep it, find it or just remember it.

The ancient Greeks believed that seeking eternal youth and immortality was hubris, inviting punishment by the gods. Eos, goddess of dawn, asked Zeus to make her human lover Tithonus immortal. He granted her wish, but not quite the way she expected: Tithonus lived on and on as a prisoner of dementia and decrepitude.

The Egyptians believed it was possible for a person to achieve eternal life; the catch was that he had to die first. Also, for a soul to be reborn, every spell, ritual and test outlined in the Book of the Dead had to be executed perfectly, or else death was permanent.

Since asking the gods or dying first seemed like inadvisable ways to defy aging, people in the ancient world often turned to lotions and potions that promised to give at least the appearance of eternal youth. Most anti-aging remedies were reasonably harmless. Roman recipes for banishing wrinkles included a wide array of ingredients, from ass’s milk, swan’s fat and bean paste to frankincense and myrrh.

But ancient elixirs of life often contained substances with allegedly magical properties that were highly toxic. China’s first emperor Qin Shi Huang, who lived in the 3rd century B.C., is believed to have died from mercury poisoning after drinking elixirs meant to make him immortal. Perversely, his failure was subsequently regarded as a challenge. During the Tang Dynasty, from 618 to 907, noxious concoctions created by court alchemists to prolong youth killed as many as six emperors.

Thomas Fuchs

Even nonlethal beauty aids could be dangerous. In 16th-century France, Diane de Poitiers, the mistress of King Henri II, was famous for looking the same age as her lover despite being 20 years older. Regular exercise and moderate drinking probably helped, but a study of Diane’s remains published in 2009 found that her hair contained extremely high levels of gold, likely due to daily sips of a youth-potion containing gold chloride, diethyl ether and mercury. The toxic combination would have ravaged her internal organs and made her look ghostly white.

By the 19th century, elixirs, fountains of youth and other magical nonsense had been replaced by quack medicine. In 1889, a French doctor named Charles Brown-Séquard started a fashion for animal gland transplants after he claimed spectacular results from injecting himself with a serum containing canine testicle fluid. This so-called rejuvenation treatment, which promised to restore youthful looks and sexual vigor to men, went through various iterations until it fell out of favor in the 1930s.

Advances in plastic surgery following World War I meant that people could skip tedious rejuvenation therapies and instantly achieve younger looks with a scalpel. Not surprisingly, in a country where ex-CNN anchor Don Lemon could call a 51-year-old woman “past her prime,” women accounted for 85% of the facelifts performed in the U.S. in 2019. For men, there’s nothing about looking old that can’t be fixed by a Lamborghini and a 21-year-old girlfriend. For women, the problem isn’t the mice, it’s the men.

Historically Speaking: Before Weather Was a Science

Modern instruments made accurate forecasting possible, but humans have tried to predict the weather for thousands of years.

The Wall Street Journal, August 31, 2019

ILLUSTRATION: THOMAS FUCHS

Labor Day weekend places special demands on meteorologists, even when there’s not a hurricane like Dorian on the way. September weather is notoriously variable: In 1974, Labor Day in Iowa was a chilly 43 degrees, while the following year it was a baking 103.

Humanity has always sought ways to predict the weather. The invention of writing during the 4th millennium B.C. was an important turning point for forecasting: It allowed the ancient Egyptians to create the first weather records, using them as a guide to predict the annual flood level of the Nile. Too high meant crop failures; too low meant drought.

Some early cultures, such as the ancient Greeks and the Mayans, based their weather predictions on the movements of the stars. Others relied on atmospheric signs and natural phenomena. One of the oldest religious texts in Indian literature, the Chandogya Upanishad from the 8th century B.C., includes observations on various types of rain clouds. In China, artists during the Western Han Dynasty (206 B.C.-9 A.D.) painted “cloud charts” on silk for use as weather guides.

These early forecasting attempts weren’t simply products of magical thinking. The ancient adage “red sky at night, shepherd’s delight,” which Jesus mentions in the gospel of Matthew, is backed by hard science: The sky appears red when a high-pressure front moves in from the west, driving the clouds away.

In the 4th century B.C., Aristotle tried to provide rational explanations for weather phenomena in his treatise Meteorologica. His use of the scientific method laid the foundations for modern meteorology. The problem was that nothing could be built on Aristotle’s ideas until the invention of such tools as the thermometer (an early version was produced by Galileo in 1593) and the barometer (invented by his pupil Torricelli in 1643).

Such instruments couldn’t predict anything on their own, but they made possible accurate daily weather observations. Realizing this, Thomas Jefferson, a pioneer in modern weather forecasting, ordered Meriwether Lewis and William Clark to keep meticulous weather records during their 1804-06 expedition to the American West. He also made his own records wherever he resided, writing in his meteorological diary, “My method is to make two observations a day.”

Most governments, however, remained dismissive of weather forecasting until World War I. Suddenly, knowing which way the wind would blow tomorrow meant the difference between gassing your own side or the enemy’s.

To make accurate predictions, meteorologists needed a mathematical model that could combine different types of data into a single forecast. The first attempt, by the English mathematician Lewis Fry Richardson in 1917, took six weeks to calculate and turned out to be completely wrong.

There were still doubts about the accuracy of weather forecasting when the Allied meteorological team told Supreme Commander Dwight Eisenhower that there was only one window of opportunity for a Normandy landing: June 6, 1944. Despite his misgivings, Eisenhower acted on the information, surprising German meteorologists who had predicted that storms would continue in the English Channel until mid-June.

As we all know, meteorologists still occasionally make the wrong predictions. That’s when the old proverb comes into play: “There is no such thing as bad weather, only inappropriate clothes.”

Historically Speaking: Fantasies of Alien Life

Human beings have never encountered extra-terrestrials, but we’ve been imagining them for thousands of years.

The Wall Street Journal, March 7, 2019

Fifty years ago this month, Kurt Vonnegut published “Slaughterhouse-Five,” his classic semi-autobiographical, quasi-science fiction novel about World War II and its aftermath. The story follows the adventures of Billy Pilgrim, an American soldier who survives the bombing of Dresden in 1945, only to be abducted by aliens from the planet Tralfamadore and exhibited in their zoo. Vonnegut’s absurd-looking Tralfamadorians (they resemble green toilet plungers) are essentially vehicles for his meditations on the purpose of life.

Some readers may dismiss science fiction as mere genre writing. But the idea that there may be life on other planets has engaged many of history’s greatest thinkers, starting with the ancient Greeks. On the pro-alien side were the Pythagoreans, a 5th-century B.C. sect that argued life must exist on the moon; in the 3rd century B.C., the Epicureans believed that there was an infinite number of life-supporting worlds. But Plato, Aristotle and the Stoics argued the opposite. In “On the Heavens,” Aristotle specifically rejected the possibility that other worlds might exist, on the grounds that the Earth is at the center of a perfect and finite universe.

The Catholic Church sided with Plato and Aristotle: If there was only one God, there could be only one world. But in Asia, early Buddhism encouraged philosophical explorations into the idea of multiverses and parallel worlds. Buddhist influence can be seen in the 10th-century Japanese romance “The Bamboo Cutter,” whose story of a marooned moon princess and a lovelorn emperor was so popular in its time that it is mentioned in Murasaki Shikibu’s seminal novel, “The Tale of Genji.”

During the Renaissance, Copernicus’s heliocentric theory, advanced in his book “On the Revolutions of the Celestial Spheres” (1543), and Galileo’s telescopic observations of the heavens in 1610 proved that the Church’s traditional picture of the cosmos was wrong. The discovery prompted Western thinkers to imagine the possibility of alien civilizations. From Johannes Kepler to Voltaire, imagining life on the moon (or elsewhere) became a popular pastime among advanced thinkers. In “Paradise Lost” (1667), the poet John Milton wondered “if Land be there,/Fields and Inhabitants.”

Such benign musings about extraterrestrial life didn’t survive the impact of industrialization, colonialism and evolutionary theory. In the 19th century, debates over whether aliens have souls morphed into fears about humans becoming their favorite snack food. This particular strain of paranoia reached its apogee in the alien-invasion novel “The War of the Worlds,” published in 1897 by the British writer H.G. Wells. Wells’s downbeat message—that contact with aliens would lead to a Darwinian fight for survival—resonated throughout the 20th century.

And it isn’t just science fiction writers who ponder “what if.” The physicist Stephen Hawking once compared an encounter with aliens to Christopher Columbus landing in America, “which didn’t turn out very well for the Native Americans.” More hopeful visions—such as Steven Spielberg’s 1982 film “E.T. the Extra-Terrestrial,” about a lovable alien who wants to get back home—have been exceptions to the rule.

The real mystery about aliens is the one described by the so-called “Fermi paradox.” The 20th-century physicist Enrico Fermi observed that, given the number of stars in the universe, it is highly probable that alien life exists. So why haven’t we seen it yet? As Fermi asked, “Where is everybody?”

Historically Speaking: Breaking Up Has Always Been Hard to Do

Photo: THOMAS FUCHS

As Valentine’s Day draws near, let’s not forget its Roman ancestor: the festival of Lupercalia, a fertility rite (celebrated every Ides of February) that was about as romantic as a trip to the abattoir. The highlight of the day involved priests dipping their whips into goat’s blood and trolling the streets of Rome, playfully slapping any women who passed by. The ancients had no use for frilly hearts and chocolates.

Nevertheless, our classical forebears did know a few things about the flip side of Valentine’s Day: the art of the breakup. The Romans were masters of the poetic put-down. The 1st-century poet Ovid could offer some exquisitely worded insults; here is Elegy VI in his “Amores,” as translated by Christopher Marlowe in the 16th century: “Either she was foul, or her attire was bad,/ Or she was not the wench I wished t’ have had./ Idly I lay with her, as if I loved not,/ And like a burden grieved the bed that moved not.”
