Historically Speaking: Tourists Behaving Badly

When today’s travelers get in trouble for knocking over statues or defacing temples, they’re following an obnoxious tradition that dates back to the Romans.

The Wall Street Journal

September 8, 2023

Tourists are giving tourism a bad name. The industry is a vital cog in the world economy, generating more than 10% of global GDP in 2019. But the antisocial behavior of a significant minority is causing some popular destinations to enact new rules and limits. Among this year’s egregious tourist incidents, two drunk Americans had to be rescued from the Eiffel Tower, a group of Germans in Italy knocked over a 150-year-old statue while taking selfies, and a Canadian teen in Japan defaced an 8th-century temple.

It’s ironic that sightseeing, one of the great perks of civilization, has become one of its downsides. The ancient Greeks called it theoria and considered it to be both good for the mind and essential for appreciating one’s own culture. As the 5th-century B.C. Greek poet Lysippus quipped: “If you’ve never seen Athens, your brain’s a morass./If you’ve seen it and weren’t entranced, you’re an ass.”

The Romans surpassed the Greeks in their love of travel. Unfortunately, they became the prototype for that tourist cliché, the “ugly American,” since they were rich, entitled and careless of other people’s heritage. The Romans never saw an ancient monument they didn’t want to buy, steal or cover in graffiti. The word “miravi”—“I was amazed”—was the Latin equivalent of “Kilroy was here,” and can be found all over Egypt and the Mediterranean.

Thomas Fuchs

Mass tourism picked up during the Middle Ages, facilitated by the Crusades and the popularity of religious pilgrimages. But so did the Roman habit of treating every ancient building like a public visitor’s book. The French Crusader Lord de Coucy actually painted his family coat of arms onto one of the columns of the Church of the Nativity in Bethlehem.

In 17th- and 18th-century Europe, the scions of aristocratic families would embark on a Grand Tour of famous sights, especially in France and Italy. The idea was to turn them into sophisticated men of the world, but for many young men, the real point of the jaunt was to sample bordellos and be drunk and disorderly without fear of their parents finding out.

Even after the Grand Tour went out of fashion, the figure of the tourist usually conjured up negative images. Visiting Alexandria in Egypt in the 19th century, the French novelist Gustave Flaubert raged at the “imbecile” who had painted the words “Thompson of Sunderland” in six-foot-high letters on Pompey’s Pillar. The perpetrators were in fact the rescued crew of a ship named the Thompson.

Flaubert was nevertheless right about the sheer destructiveness of some tourists. Souvenir hunters were among the worst offenders. In the Victorian era, Stonehenge in England was chipped and chiseled with such careless disregard that one of its massive stones eventually collapsed.

Sometimes tourists go beyond vandalism to outright madness. Jerusalem Syndrome, first recognized in the Middle Ages, is the sudden onset of religious delusions while visiting the biblical city. Stendhal Syndrome is an acute psychosomatic reaction to the beauty of Florence’s artworks, named for the French writer who suffered such an attack in 1817. There’s also Paris Syndrome, a transient psychosis triggered by extreme feelings of letdown on encountering the real Paris.

As for Stockholm Syndrome, when an abused person identifies with their abuser, there’s little chance of it developing in any of the places held hostage by hordes of tourists.

Historically Speaking: The Many Ingredients of Barbecue

Native Americans, European settlers and African slaves all contributed to creating an American culinary tradition.

The Wall Street Journal

August 18, 2023

There are more than 30,000 BBQ joints in the U.S., but as far as the Michelin Guide is concerned, not one of them is worthy of a coveted star. Many Americans would say that the fault, dear Brutus, is not in ourselves but in the stars.

A 1562 illustration shows Timucua Indians roasting animals on a raised wooden platform, the original form of barbecuing.

The American barbecue—cooking meat with an indirect flame at low temperature over seasoned wood or charcoal—is a centuries-old tradition. (Using the term for any kind of outdoor grilling came much later.) Like America itself, it is a cultural hybrid. Indigenous peoples in the Caribbean and the Americas would place a whole animal carcass on a wooden platform several feet above a fire and let the smoke do the cooking. The first Spanish arrivals were fascinated by the technique, and translated a native word for the platform as “barbacoa.”

Lyndon B. Johnson (right) and Hubert Humphrey celebrate their election victory with barbecue, November 1964.

The Europeans began to barbecue pigs and cattle, non-native animals that easily adapted to the New World. Another important culinary contribution—using a ground trench instead of a raised platform—may have been spread by African slaves. The 18th-century African abolitionist Olaudah Equiano described seeing the Miskito of Honduras, a mixed community of Indians and Africans, barbecue an alligator: “Their manner of roasting is by digging a hole in the earth, and filling it with wood, which they burn to coal, and then they lay sticks across, on which they set the meat.”

European basting techniques also played a role. The most popular recipes for barbecue sauce reflect historic patterns of immigration to the U.S.: British colonists used a simple concoction of vinegar and spices, French émigrés insisted on butter, and German settlers preferred their native mustard. In the American West, two New World ingredients, tomatoes and molasses, formed the basis of many sauces. The type of meat became another regional difference: pork was more plentiful in the South, beef in the West.

Although labor-intensive, a barbecued whole hog can feed up to 150 people, making it the ideal food for communal gatherings. In 1793, President George Washington celebrated the laying of the cornerstone for the Capitol building with an enormous barbecue featuring a 500-pound ox.

In the South before the Civil War, a barbecue meant a hog cooked by slaves. The choicest cuts from the pig’s back went to the grandees, hence the phrase “living high on the hog.” Emancipation brought about a culinary reckoning; Southern whites wanting a barbecue had to turn to cookbooks, such as “Mrs. Hill’s New Cook Book,” published in 1867 for “inexperienced Southern housekeepers…in this peculiar crisis of our domestic as well as national affairs.”

In the 20th century, the slower rate of urbanization outside the North helped to keep the outdoor barbecue alive. As a Texan, President Lyndon B. Johnson used “barbecue diplomacy” to project a folksy image, breaking with the refined European style of the Kennedys to endear himself to ordinary Americans. The ingredients in Lady Bird Johnson’s barbecue sauce embraced as many regional varieties as possible by including butter, ketchup and vinegar.

Commercial BBQ sauces, which first became available in 1909, offer a convenient substitute for making your own. But for most people, to experience real barbecue requires that other quintessentially American pastime, the road trip. Just leave the Michelin Guide at home.

Historically Speaking: The Enduring Technology of the Book

Durable, stackable and skimmable, books have been the world’s favorite way to read for two millennia and counting.

The Wall Street Journal

August 3, 2023

A fragment of the world’s oldest book was discovered earlier this year. Dated to about 260 B.C., the 6-by-10-inch piece of papyrus survived thanks to ancient Egyptian embalmers who recycled it for cartonnage, a papier-mache-like material used in mummy caskets. The Graz Mummy Book, so-called because it resides in the library of Austria’s Graz University, is 400 years older than the previous record holder, a fragment of a Latin book from the 2nd century A.D.

Stitching on the papyrus shows that it was part of a book with pages rather than a scroll. Scrolls served well enough in the ancient world, when only priests and scribes used them, but as the literacy rate in the Roman Empire increased, so did the demand for a more convenient format. A durable, stackable, skimmable, stitched-leaf book made sense. Its resemblance to a block of wood inspired the Latin name caudex, “bark stem,” which evolved into codex, the word for an ancient manuscript. The 1st-century Roman poet and satirist Martial was an early adopter: A codex contained more pages than the average scroll, he told his readers, and could even be held in one hand!

Thomas Fuchs

The book developed in different forms around the world. In India and parts of southeast Asia, dried palm-leaves were sewn together like venetian blinds. The Chinese employed a similar technique using bamboo or silk until the 3rd century A.D., when hemp paper became a reliable alternative. In Mesoamerica, the Mayans made their books from fig-tree bark, which was pliable enough to be folded into leaves. Only four codices escaped the mass destruction of Mayan culture by Franciscan missionaries in the 16th century.

Gutenberg’s printing press, perfected in 1454, made that kind of annihilation impossible in Europe. By the 16th century, more than nine million books had been printed. Authorities still tried their best to exert control, however. In 1538, England’s King Henry VIII prohibited the selling of “naughty printed books” by unlicensed booksellers.

Licensed or not, the profit margins for publishers were irresistible, especially after Jean Grolier, a 16th-century Treasurer-General of France, started the fashion for expensively decorated book covers made of leather. Bookselling became a cutthroat business. Shakespeare was an early victim of book-piracy: Shorthand stenographers would hide among the audience and surreptitiously record his plays so they could be printed and sold.

Beautiful leather-bound books never went out of fashion, but by the end of the 18th century, there was a new emphasis on cutting costs and shortening production time. Germany experimented with paperbacks in the 1840s, but these were downmarket prototypes that failed to catch on.

The paperback revolution was started in 1935 by the English publisher Allen Lane, who one day found himself stuck at a train station with nothing to read. Books were too rarefied and expensive, he decided. Facing down skeptics, Lane created Penguin and proceeded to publish 10 literary novels as paperbacks, including Ernest Hemingway’s “A Farewell to Arms.” A Penguin book had a distinctive look that signaled quality, yet it cost the same as a packet of cigarettes. The company sold a million paperbacks in its first year.

Radio was predicted to mean the downfall of books; so were television, the Internet and ebooks. For the record, Americans bought over 788.7 million physical books last year. Not bad for an invention well into its third millennium.

Historically Speaking: Saving Lives With Lighthouses

Since the first one was built in ancient Alexandria, lighthouses have helped humanity master the danger of the seas.

The Wall Street Journal

July 21, 2023

For those who dream big, there will be a government auction on Aug. 1 for two decommissioned lighthouses, one in Cleveland, Ohio, the other in Michigan’s Upper Peninsula. Calling these lighthouses “fixer-uppers,” however, hardly does justice to the challenge of converting them into livable homes. Lighthouses were built so man could fight nature, not sit back and enjoy it.

France’s Cordouan Lighthouse. GETTY IMAGES

The Lighthouse of Alexandria, the earliest one recorded, was one of the ancient Seven Wonders of the World. An astonishing 300 feet tall or more, it was commissioned in 290 B.C. by Ptolemy I Soter, the founder of Egypt’s Ptolemaic dynasty, to guide ships into the harbor and keep them from the dangerous shoals surrounding the entrance. No word existed for lighthouse, hence it was called the Pharos of Alexandria, after the small islet on which it was located.

The Lighthouse did wonders for the Ptolemies’ reputation as the major power players in the region. The Romans implemented the same strategy on a massive scale. Emperor Trajan’s Torre de Hercules in A Coruña, in northwestern Spain, can still be visited. But after the empire’s collapse, its lighthouses were abandoned.

More than a thousand years passed before Europe again possessed the infrastructure and maritime capacity to need lighthouses, let alone build them. The contrasting approaches of France and England say much about the two cultures. The French regarded them as a government priority, resulting in such architectural masterpieces as Bordeaux’s Cordouan Lighthouse, commissioned by Henri III in 1584. The English entrusted theirs to Trinity House, a private charity, which led to inconsistent implementation. In 1707, poor lighthouse guidance contributed to the sinking of Admiral Sir Cloudesley Shovell’s fleet off the coast of the Scilly Isles, costing his own life and roughly 1,500 others.

Ida Lewis saved at least 18 people from drowning as the lighthouse keeper of Lime Rock in Newport, R.I.

In 1789, the U.S. adopted a third approach. Alexander Hamilton, the first Secretary of the Treasury, argued that federal oversight of lighthouses was an important symbol of the new government’s authority. Congress ordered the states to transfer control of their existing lighthouses to a new federal agency, the U.S. Lighthouse Establishment. But in the following decades Congress’s chief concern was cutting costs. America’s lighthouses were decades behind Europe’s in adopting the Fresnel lens, invented in France in 1822, which concentrated light into a powerful beam.

The U.S. had caught up by the time of the Civil War, but no amount of engineering improvements could lessen the hardship and dangers involved in lighthouse-keeping. Isolation, accidents and deadly storms took their toll, yet it was one of the few government jobs open to women. Ida Lewis saved at least 18 people from drowning during her 54-year tenure at Lime Rock Station in Newport, R.I.

Starting in the early 1900s, there were moves to convert lighthouses to electricity. The days of the lighthouse keeper were numbered. Fortunately, when a Category 4 hurricane hit Galveston, Texas, on Sept. 8, 1900, its lighthouse station was still fully manned. The keeper, Henry C. Claiborne, managed to shelter 125 people in his tower before the storm surge engulfed the lower floors. Among them were the nine survivors of a stranded passenger train. Claiborne labored all night, manually rotating the lens after its mechanical parts became stuck. The lighthouse was a beacon of safety during the storm, and a beacon of hope afterward.

To own a lighthouse is to possess a piece of history, plus unrivaled views and not a neighbor in sight—a bargain whatever the price.

Historically Speaking: The Royal Origins of Tennis

The strict etiquette at Wimbledon and other tournaments is a reminder that the sport’s first players were French kings and aristocrats.

The Wall Street Journal

June 15, 2023

For the 136th Wimbledon Championships, opening on July 3, lady competitors will be allowed to ignore the all-white clothing rule for the first time—though only as it applies to their undergarments. Tennis may never be the same.

The break with tradition is all the more surprising given the sport’s penchant for strict etiquette rules and dress codes. The earliest recorded version of tennis was a type of handball played by medieval monks in France. Called “jeu de paume,” “game of the palm,” it involved hitting a leather ball against the cloister wall.

Thomas Fuchs

As the sport spread beyond the monastic world, it gained a new name, “tenez,” the French word for “receive.” It had instant social cachet, since it could only be played in large, high-walled courtyards, thus narrowing the pool of players to kings and aristocrats.

Early “tenez” was not without its dangers. Legend has it that King Louis X of France, who introduced the first covered courts, took ill and died after an overly strenuous match in 1316. In 1498 another French king, Charles VIII, suffered an untimely end after banging his head on a lintel while hurrying to his tennis court.

By the 16th century, the game had evolved into something in between modern squash and tennis. Players used angled wooden rackets, and the ball could be bounced off the walls and sloping roof as well as hit over the net. This version, known as real or royal tennis, is still played at a small number of courts around the world.

The Royal Tennis Court at King Henry VIII’s Hampton Court Palace, outside London, was the most luxurious in Europe, but the sophisticated surroundings failed to elevate on-court behavior. In 1541, Sir Edmund Knyvett was condemned to have his hand chopped off for striking his opponent and drawing blood. Henry ended up granting him a reprieve—more than he did for wives two and five.

The association of tennis with royal privilege hastened its demise in the 18th century. On June 20, 1789, Louis XVI’s tennis court at Versailles hosted one of the most important events of the French Revolution. The new National Assembly gathered there after being locked out of its premises, and made a pledge, the Tennis Court Oath, not to disband until France had a constitution. It was a very brave or very foolish person who played the game after that.

Modern tennis—known at first as “lawn tennis,” since it was played on a grass court—began to emerge in the 1870s, when an eccentric British Army major named Walter Clopton Wingfield invented a version of the game using rubber balls. His name for it—“Sphairistike,” from the Greek word for ball playing—never caught on. But the social opportunities offered by tennis made it extremely popular among the upper classes.

The exclusive All England Croquet and Lawn Tennis Club in Wimbledon, whose championships began in 1877, inspired imitators on both sides of the Atlantic. Unfortunately, many tennis players expended nearly as much effort keeping the “wrong sort” out as they did keeping the ball in. For years, the major tennis tournaments offered no prizes and were only open to amateurs, meaning the wealthy. Professionals were relegated to a separate circuit.

Tennis’s own revolution took place in 1968, following sustained pressure from players and fans for the Grand Slam Tournaments to be open to all competitors. Fifty-five years on, the barricades—and the barriers—are still coming down.

Historically Speaking: The Many Breeds of Unicorn

Ancient India, China and Greece all told stories about one-horned creatures, each with a different kind of magic.

The Wall Street Journal

June 1, 2023

There are around 1,280 active unicorns in the world, with just over 50% located in the United States. These aren’t the four-footed, one-horned kind, but privately held startups valued at $1 billion or more. When angel investor Aileen Lee coined the term back in 2013, it seemed apt since such startups were incredibly rare, magically successful and free of worldly taint—just like a unicorn. Ten years on, however, it is clear that modern-day unicorns also represent some of the less appealing aspects of their ancient brethren. Not only are they vulnerable to fraud, they can also be a conduit for irrational feelings.

The mythical unicorn seems to have appeared independently in several Eastern and Western cultures. The earliest known images appear on seals used by the Indus Valley Civilization in India during the 3rd millennium B.C.; the animal depicted may be a now-extinct type of aurochs, but the shape is unmistakably unicorn-ish, with an elongated body and a slender arching neck. Scholars identify the single horn as a symbol of pure, concentrated sexual virility, probably in reference to an ancient Indo-European deity known as the “master of the animals.”

The unicorn found in ancient Chinese myths, the qilin, was different. It was a multicolored, fantastical creature whose appearance heralded good news, such as the birth of an emperor. Meanwhile, the ancient Greeks were convinced by travelers’ reports of oryxes and the occasional Indian rhinoceros that the unicorn was a real creature. In the 1st century A.D., Pliny the Elder described the unicorn as “the fiercest animal” alive.

Thomas Fuchs

Greek and Latin translators of the Hebrew Bible rendered the word re’em, “horned animal,” as unicornis. Its presence in the Bible remained problematic for Christians until the theologian Tertullian, in the 3rd century, declared that its horn symbolized the beam of the holy cross. Character-wise, the Western unicorn was depicted as benevolent, like its Chinese counterpart, but also powerful like the Eurasian iteration. As in Indian tradition, unicorns were believed to have magic inside their (phallic) horns, though they were powerless against virgins. It is not hard to discern the psychosexual drama in “The Hunt of the Unicorn,” a series of tapestries made in the Netherlands around 1500, which tell the allegorical tale of a unicorn lured into a deadly trap by a beautiful virgin.

The widespread European belief that unicorn horns could protect against poison created a lucrative market for fakes, usually narwhal horns. In 1533 Pope Clement VII bought one for 17,000 ducats, roughly $5.3 million today. Once narwhals became better known the market dissipated, as did the belief in unicorns.

Its exposure as a tall tale rescued the unicorn from the clutches of charlatans and misogynists to live freely in the imaginations of 19th-century writers and artists like the French symbolist painter Gustave Moreau. In the realm of the unreal, the unicorn accrued far more cultural potency than it had while theoretically alive. Lewis Carroll predicted its ascendancy in “Through the Looking-Glass”: “‘Well, now that we have seen each other,’ said the unicorn, ‘if you’ll believe in me, I’ll believe in you.’” Quite so.

Historically Speaking: The Quest to Look Young Forever

From drinking gold to injecting dog hormones, people have searched for eternal youth in some unlikely places.

The Wall Street Journal

May 18, 2023

A study explaining why mouse hairs turn gray made global headlines last month. Not because the little critters are in desperate need of a makeover, but because knowing the “why” in mice could lead to a cure for graying locks in humans. Everyone nowadays seems to be chasing after youth, whether to keep it, find it or just remember it.

The ancient Greeks believed that seeking eternal youth and immortality was hubris, inviting punishment by the gods. Eos, goddess of dawn, asked Zeus to make her human lover Tithonus immortal. He granted her wish, but not quite the way she expected: Tithonus lived on and on as a prisoner of dementia and decrepitude.

The Egyptians believed it was possible for a person to achieve eternal life; the catch was that he had to die first. Also, for a soul to be reborn, every spell, ritual and test outlined in the Book of the Dead had to be executed perfectly, or else death was permanent.

Since asking the gods or dying first seemed like inadvisable ways to defy aging, people in the ancient world often turned to lotions and potions that promised to give at least the appearance of eternal youth. Most anti-aging remedies were reasonably harmless. Roman recipes for banishing wrinkles included a wide array of ingredients, from ass’s milk, swan’s fat and bean paste to frankincense and myrrh.

But ancient elixirs of life often contained substances with allegedly magical properties that were highly toxic. China’s first emperor Qin Shi Huang, who lived in the 3rd century B.C., is believed to have died from mercury poisoning after drinking elixirs meant to make him immortal. Perversely, his failure was subsequently regarded as a challenge. During the Tang Dynasty, from 618 to 907, noxious concoctions created by court alchemists to prolong youth killed as many as six emperors.

THOMAS FUCHS

Even nonlethal beauty aids could be dangerous. In 16th-century France, Diane de Poitiers, the mistress of King Henri II, was famous for looking the same age as her lover despite being 20 years older. Regular exercise and moderate drinking probably helped, but a study of Diane’s remains published in 2009 found that her hair contained extremely high levels of gold, likely due to daily sips of a youth-potion containing gold chloride, diethyl ether and mercury. The toxic combination would have ravaged her internal organs and made her look ghostly white.

By the 19th century, elixirs, fountains of youth and other magical nonsense had been replaced by quack medicine. In 1889, a French doctor named Charles Brown-Sequard started a fashion for animal gland transplants after he claimed spectacular results from injecting himself with a serum containing canine testicle fluid. This so-called rejuvenation treatment, which promised to restore youthful looks and sexual vigor to men, went through various iterations until it fell out of favor in the 1930s.

Advances in plastic surgery following World War I meant that people could skip tedious rejuvenation therapies and instantly achieve younger looks with a scalpel. Not surprisingly, in a country where ex-CNN anchor Don Lemon could call a 51-year-old woman “past her prime,” women accounted for 85% of the facelifts performed in the U.S. in 2019. For men, there’s nothing about looking old that can’t be fixed by a Lamborghini and a 21-year-old girlfriend. For women, the problem isn’t the mice, it’s the men.

Historically Speaking: Using Forensic Evidence to Solve Crimes

Today’s DNA techniques are just the latest addition to a toolkit used by detectives since ancient times.

The Wall Street Journal

May 5, 2023

In February, police in Burlington, Vt., announced they had solved the city’s oldest cold case, the 1971 murder of 24-year-old schoolteacher Rita Curran. Taking advantage of genetic genealogy using DNA databases—the latest addition to the forensic science toolbox—the police were able to prove that the killer was a neighbor in Curran’s apartment building, William DeRoos.

The practice of forensic science, the critical examination of crime scenes, existed long before it had a name. The 1st-century A.D. Roman jurist Quintilian argued that evidence was not the same as proof unless supported by sound method and reasoning. Nothing should be taken at face value: Even blood stains on a toga, he pointed out, could be the result of a nosebleed or a messy religious sacrifice rather than a murder.

In 6th-century Byzantium, the Justinian Law Code allowed doctors to serve as expert witnesses, recognizing that murder cases required specialized knowledge. In Song Dynasty China, coroners were guided by Song Ci, a 13th-century judge who wrote “The Washing Away of Wrongs,” a comprehensive handbook on criminology and forensic science. Using old case studies, Song provided step-by-step instructions on how to tell if a drowned person had been alive before hitting the water and whether blowflies could be attracted by traces of blood on a murder weapon.

As late as the 17th century, however, Western investigators were still prone to attributing unexplained deaths to supernatural causes. In 1691 the sudden death of Henry Sloughter, the colonial governor of New York, provoked public hysteria. It subsided after an autopsy performed by six physicians proved that blocked lungs, not spells or poison, were responsible. The case was a watershed in placing forensic pathology at the heart of the American judicial system.

THOMAS FUCHS

Growing confidence in scientific methods resulted in more systematic investigations, which increased the chances of a case being solved. In England in 1784, the conviction of John Toms for the murder of Edward Culshaw hinged on a paper scrap pulled from Culshaw’s bullet wound. The jagged edge was found to match up perfectly with a torn sheet of paper found in Toms’s pocket.

Still, the only way to determine whether a suspect was present at the scene of a crime was by visual identification. By the late 19th century, studies by Charles Darwin’s cousin, the anthropologist Sir Francis Galton, and others had established that every individual has unique fingerprints. Fingerprint evidence might have helped to identify Jack the Ripper in 1888, but official skepticism kept the police from pursuing it.

Four years later, in Argentina, fingerprints were used to solve a crime for the first time. Two police officers, Juan Vucetich and Eduardo Alvarez, ignored their superiors’ distrust of the method to prove that a woman had murdered her children so she could marry her lover.

The success of fingerprinting ushered in a golden age of forensic innovation, driven by ambition but guided by scientific principles. By the 1930s, dried blood stains could be analyzed for their blood type and bullets could be matched to the guns that fired them. Almost a century later, the first principle of forensic science still stands: Every contact leaves a trace.

Historically Speaking: The Search for Better Weapons Against Pests

From sulfur to DDT, farmers have spent millennia looking for ways to stop crop-destroying insects.

The Wall Street Journal

April 20, 2023

The Mesopotamians realized the necessity of pest control as early as 2500 B.C. They were fortunate to have access to elemental sulfur, which they made into a powder or burned as fumes to kill mites and insects. Elsewhere, farmers experimented with biological weaponry. The predatory ant Oecophylla smaragdina feasts on the caterpillars and boring beetles that destroy citrus trees. Farmers in ancient China learned to place colonies of these ants next to their orange groves and tie ropes between branches, enabling the ants to spread easily from tree to tree.

The scientific breakthroughs of the 17th century, such as the compound microscope, made the natural world more intelligible and therefore controllable. By the 18th century, a farmer’s arsenal included nicotine, mercury and arsenic-based insecticides.

Pierre Marie Alexis Millardet, Bordeaux

Illustration: Thomas Fuchs

But the causes of fungal blight remained a mystery. In 1843, the pathogen behind potato late blight, Phytophthora infestans, jumped from South America to New York and Philadelphia, and then crossed the Atlantic. Ireland had all the ingredients for an agricultural catastrophe: cool, wet winters, water-retaining clay soil, and reliance on a single potato variety as a food staple, combined with the custom of storing old and new potato crops together. Four years of heavily infected harvests, made worse by the British government’s failed response to the emergency, led to more than a million deaths.

In the 1880s, French vineyards were under attack from a different blight, downy mildew (Plasmopara viticola). By a happy coincidence, Pierre Marie Alexis Millardet, a botany professor at the University of Bordeaux, noticed that some grape vines growing next to a country road were free of the mildew, while those further away were riddled with it. The owner explained that he had doused the roadside vines with a mix of copper sulfate and lime to deter casual picking. Armed with this knowledge, in 1885 Millardet perfected the Bordeaux mixture, the first preventive fungicide, which is still used today.

At the same time, an American entomologist named Albert Koebele was experimenting with biological pest controls. Citrus trees were once again under attack, only now it was the cottony cushion scale insect. Koebele went to Australia in 1888 and brought back its best-known predator, the vedalia beetle, thereby saving California’s citrus industry.

In 1936, the development of DDT, the first synthetic insecticide, was hailed as a miracle of science, offering the first real defense against malaria and other insect-borne diseases. But it was later discovered to be toxic to animals and humans, and it killed insects indiscriminately. Among its many victims was the vedalia beetle, which led to a resurgence of cottony cushion scale. Ultimately, the Environmental Protection Agency banned DDT’s use in 1972.

The race to invent environmentally safe alternatives is ramping up, but so are the pests. After a wet winter on the West Coast and a warm one in the East, experts are predicting great swarms of bloodsucking insects. Help!

Historically Speaking: The Long Road to Pensions for All

ILLUSTRATION: THOMAS FUCHS

From the Song Dynasty to the American Civil War, governments have experimented with ways to support retired soldiers and workers.

The Wall Street Journal

April 6, 2023

“Will you still need me, will you still feed me,/When I’m sixty-four?” sang the Beatles on their 1967 album “Sgt. Pepper’s Lonely Hearts Club Band.” These were somewhat hypothetical questions at a time when the mean age of American men taking retirement was 64, and their average life expectancy was 67. More than a half-century later, the Beatles song resonates in a different way, because there are so few countries left where retirement on a state pension at 64 is even possible.

Historically, governments preferred not to be in the retirement business, but self-interest sometimes achieved what charitable impulses could not. In 6 A.D., a well-founded fear of civil unrest encouraged Augustus Caesar to institute the first state pension system, the aerarium militare, which looked after retired army veterans. He earmarked a 5% tax on inheritances to pay for the scheme, which served as a stabilizing force in the Roman Empire for the next 400 years. The Sack of Rome in 410 by Alaric, leader of the Visigoths, probably could have been avoided if Roman officials had kept their promise to pay his allied troops their military pensions.

In the 11th century, the Song emperor Shenzong invited the brilliant but mercurial governor of Nanjing, Wang Anshi, to reform China’s entire system of government. Wang’s far-reaching “New Laws” included state welfare plans to care for the aged and infirm. Some of his ideas were accepted but not the retirement plan, which achieved the remarkable feat of uniting both conservatives and radicals against him: The former regarded state pensions as an assault on family responsibility, the latter thought it gave too much power to the government. Wang was forced to retire in 1075.

Leaders in the West were content to muddle along until, like Augustus, they realized that a large nation-state requires a national army to defend it. England’s Queen Elizabeth I oversaw the first army and navy pensions in Europe. She also instituted the first Poor Laws, which codified the state’s responsibility toward its citizens. The problem with the Poor Laws, however, was that they transferred a national problem to the local level and kept it there.

Before he fell victim to the Terror during the French Revolution, the Marquis de Condorcet tried to figure out how France might pay for a national pension system. The question was largely ignored in the U.S. until the Civil War forced the federal government into a reckoning. A military pension system that helped fewer than 10,000 people in 1861 grew into a behemoth serving over 300,000 in 1885. By 1894 military pensions accounted for 37% of the federal budget. One side effect was to hamper the development of national and private pension schemes. Among the few companies to offer retirement pensions for employees were the railroads and American Express.

By the time Frances Perkins, President Franklin Roosevelt’s Labor Secretary, ushered in Social Security in 1935, Germany’s national pension scheme was almost 50 years old. But the German system started at age 70, far too late for most people, which was the idea. As Jane Austen’s Mrs. Dashwood complained in “Sense and Sensibility,” “People always live forever when there is an annuity to be paid to them.” The last Civil War pensioner was Irene Triplett, who died in 2020. She was receiving $73.13 every month for her father’s Union service.