Historically Speaking: The Many Breeds of Unicorn

Ancient India, China and Greece all told stories about one-horned creatures, each with a different kind of magic.

The Wall Street Journal

June 1, 2023

There are around 1,280 active unicorns in the world, with just over 50% located in the United States. These aren’t the four-footed, one-horned kind, but privately held startups valued at $1 billion or more. When angel investor Aileen Lee coined the term back in 2013, it seemed apt since such startups were incredibly rare, magically successful and free of worldly taint—just like a unicorn. Ten years on, however, it is clear that modern-day unicorns also represent some of the less appealing aspects of their ancient brethren. Not only are they vulnerable to fraud, they can also be a conduit for irrational feelings.

The mythical unicorn seems to have appeared independently in several Eastern and Western cultures. The earliest known images appear on seals used by the Indus Valley Civilization in India during the 3rd millennium B.C.; the animal depicted may be a now-extinct type of aurochs, but the shape is unmistakably unicorn-ish, with an elongated body and a slender arching neck. Scholars identify the single horn as a symbol of pure, concentrated sexual virility, probably in reference to an ancient Indo-European deity known as the “master of the animals.”

The unicorn found in ancient Chinese myths, the qilin, was different. It was a multicolored, fantastical creature whose appearance heralded good news, such as the birth of an emperor. Meanwhile, the ancient Greeks were convinced by travelers’ reports of oryxes and the occasional Indian rhinoceros that the unicorn was a real creature. In the 1st century A.D., Pliny the Elder described the unicorn as “the fiercest animal” alive.

ILLUSTRATION: THOMAS FUCHS

Greek and Latin translators of the Hebrew Bible rendered the word re’em, “horned animal,” as unicornis. Its presence in the Bible remained problematic for Christians until the theologian Tertullian, in the 3rd century, declared that its horn symbolized the beam of the holy cross. Character-wise, the Western unicorn was depicted as benevolent, like its Chinese counterpart, but also powerful like the Eurasian iteration. As in Indian tradition, unicorns were believed to have magic inside their (phallic) horns, though they were powerless against virgins. It is not hard to discern the psychosexual drama in “The Hunt of the Unicorn,” a series of tapestries made in the Netherlands around 1500, which tell the allegorical tale of a unicorn lured into a deadly trap by a beautiful virgin.

The widespread European belief that unicorn horns could protect against poison created a lucrative market for fakes, usually narwhal horns. In 1533 Pope Clement VII bought one for 17,000 ducats, roughly $5.3 million today. Once narwhals became better known, the market dissipated, as did the belief in unicorns.

Its exposure as a tall tale rescued the unicorn from the clutches of charlatans and misogynists to live freely in the imaginations of 19th-century writers and artists like the French symbolist painter Gustave Moreau. In the realm of the unreal, the unicorn accrued far more cultural potency than it had while theoretically alive. Lewis Carroll predicted its ascendancy in “Through the Looking-Glass”: “‘Well, now that we have seen each other,’ said the unicorn, ‘if you’ll believe in me, I’ll believe in you.’” Quite so.

Historically Speaking: The Quest to Look Young Forever

From drinking gold to injecting dog hormones, people have searched for eternal youth in some unlikely places.

The Wall Street Journal

May 18, 2023

A study explaining why mouse hairs turn gray made global headlines last month. Not because the little critters are in desperate need of a makeover, but because knowing the “why” in mice could lead to a cure for graying locks in humans. Everyone nowadays seems to be chasing after youth, whether to keep it, find it or just remember it.

The ancient Greeks believed that seeking eternal youth and immortality was hubris, inviting punishment by the gods. Eos, goddess of dawn, asked Zeus to make her human lover Tithonus immortal. He granted her wish, but not quite the way she expected: Tithonus lived on and on as a prisoner of dementia and decrepitude.

The Egyptians believed it was possible for a person to achieve eternal life; the catch was that he had to die first. Also, for a soul to be reborn, every spell, ritual and test outlined in the Book of the Dead had to be executed perfectly, or else death was permanent.

Since asking the gods or dying first seemed like inadvisable ways to defy aging, people in the ancient world often turned to lotions and potions that promised to give at least the appearance of eternal youth. Most anti-aging remedies were reasonably harmless. Roman recipes for banishing wrinkles included a wide array of ingredients, from ass’s milk, swan’s fat and bean paste to frankincense and myrrh.

But ancient elixirs of life often contained substances with allegedly magical properties that were highly toxic. China’s first emperor Qin Shi Huang, who lived in the 3rd century B.C., is believed to have died from mercury poisoning after drinking elixirs meant to make him immortal. Perversely, his failure was subsequently regarded as a challenge. During the Tang Dynasty, from 618 to 907, noxious concoctions created by court alchemists to prolong youth killed as many as six emperors.

ILLUSTRATION: THOMAS FUCHS

Even nonlethal beauty aids could be dangerous. In 16th-century France, Diane de Poitiers, the mistress of King Henri II, was famous for looking the same age as her lover despite being 20 years older. Regular exercise and moderate drinking probably helped, but a study of Diane’s remains published in 2009 found that her hair contained extremely high levels of gold, likely due to daily sips of a youth-potion containing gold chloride, diethyl ether and mercury. The toxic combination would have ravaged her internal organs and made her look ghostly white.

By the 19th century, elixirs, fountains of youth and other magical nonsense had been replaced by quack medicine. In 1889, a French doctor named Charles-Édouard Brown-Séquard started a fashion for animal gland transplants after he claimed spectacular results from injecting himself with a serum containing canine testicle fluid. This so-called rejuvenation treatment, which promised to restore youthful looks and sexual vigor to men, went through various iterations until it fell out of favor in the 1930s.

Advances in plastic surgery following World War I meant that people could skip tedious rejuvenation therapies and instantly achieve younger looks with a scalpel. Not surprisingly, in a country where ex-CNN anchor Don Lemon could call a 51-year-old woman “past her prime,” women accounted for 85% of the facelifts performed in the U.S. in 2019. For men, there’s nothing about looking old that can’t be fixed by a Lamborghini and a 21-year-old girlfriend. For women, the problem isn’t the mice, it’s the men.

Historically Speaking: Using Forensic Evidence to Solve Crimes

Today’s DNA techniques are just the latest addition to a toolkit used by detectives since ancient times.

The Wall Street Journal

May 5, 2023

In February, police in Burlington, Vt., announced they had solved the city’s oldest cold case, the 1971 murder of 24-year-old schoolteacher Rita Curran. Taking advantage of genetic genealogy based on DNA databases—the latest addition to the forensic science toolbox—the police were able to prove that the killer was a neighbor in Curran’s apartment building, William DeRoos.

The practice of forensic science, the critical examination of crime scenes, existed long before it had a name. The 1st-century A.D. Roman jurist Quintilian argued that evidence was not the same as proof unless supported by sound method and reasoning. Nothing should be taken at face value: Even blood stains on a toga, he pointed out, could be the result of a nosebleed or a messy religious sacrifice rather than a murder.

In 6th-century Byzantium, the Justinian Law Code allowed doctors to serve as expert witnesses, recognizing that murder cases required specialized knowledge. In Song Dynasty China, coroners were guided by Song Ci, a 13th-century judge who wrote “The Washing Away of Wrongs,” a comprehensive handbook on criminology and forensic science. Using old case studies, Song provided step-by-step instructions on how to tell if a drowned person had been alive before hitting the water and whether blowflies could be attracted by traces of blood on a murder weapon.

As late as the 17th century, however, Western investigators were still prone to attributing unexplained deaths to supernatural causes. In 1691 the sudden death of Henry Sloughter, the colonial governor of New York, provoked public hysteria. It subsided after an autopsy performed by six physicians proved that blocked lungs, not spells or poison, were responsible. The case was a watershed in placing forensic pathology at the heart of the American judicial system.

ILLUSTRATION: THOMAS FUCHS

Growing confidence in scientific methods resulted in more systematic investigations, which increased the chances of a case being solved. In England in 1784, the conviction of John Toms for the murder of Edward Culshaw hinged on a paper scrap pulled from Culshaw’s bullet wound. The jagged edge was found to match up perfectly with a torn sheet of paper found in Toms’s pocket.

Still, the only way to determine whether a suspect was present at the scene of a crime was by visual identification. By the late 19th century, studies by Charles Darwin’s cousin, the anthropologist Sir Francis Galton, and others had established that every individual has unique fingerprints. Fingerprint evidence might have helped to identify Jack the Ripper in 1888, but official skepticism kept the police from pursuing it.

Four years later, in Argentina, fingerprints were used to solve a crime for the first time. Two police officers, Juan Vucetich and Eduardo Alvarez, ignored their superiors’ distrust of the method to prove that a woman had murdered her children so she could marry her lover.

The success of fingerprinting ushered in a golden age of forensic innovation, driven by ambition but guided by scientific principles. By the 1930s, dried blood stains could be analyzed for their blood type and bullets could be matched to the guns that fired them. Almost a century later, the first principle of forensic science still stands: Every contact leaves a trace.

Historically Speaking: The Search for Better Weapons Against Pests

From sulfur to DDT, farmers have spent millennia looking for ways to stop crop-destroying insects.

The Wall Street Journal

April 20, 2023

The Mesopotamians realized the necessity for pest control as early as 2500 B.C. They were fortunate to have access to elemental sulfur, which they made into a powder or burned as fumes to kill mites and insects. Elsewhere, farmers experimented with biological weaponry. The predatory ant Oecophylla smaragdina feasts on the caterpillars and boring beetles that destroy citrus trees. Farmers in ancient China learned to place colonies of these ants next to their orange groves and tie ropes between branches, enabling the ants to spread easily from tree to tree.

The scientific breakthroughs of the 17th century, such as the compound microscope, made the natural world more intelligible and therefore controllable. By the 18th century, a farmer’s arsenal included nicotine, mercury and arsenic-based insecticides.

Pierre Marie Alexis Millardet of Bordeaux. ILLUSTRATION: THOMAS FUCHS

But the causes of fungal blight remained a mystery. In 1843, the pathogen behind potato late blight, Phytophthora infestans, jumped from South America to New York and Philadelphia, and then crossed the Atlantic. Ireland had all the ingredients for an agricultural catastrophe: cool, wet winters, water-retaining clay soil, and reliance on a single potato variety as a food staple, combined with the custom of storing old and new potato crops together. Four years of heavily infected harvests, made worse by the British government’s failed response to the emergency, led to more than a million deaths.

In the 1880s, French vineyards were under attack from a different blight, Uncinula necator. By a happy coincidence, Pierre Marie Alexis Millardet, a botany professor at the University of Bordeaux, noticed that some grape vines growing next to a country road were free of the powdery mildew, while those further away were riddled with it. The owner explained that he had doused the roadside vines with a mix of copper sulfate and lime to deter casual picking. Armed with this knowledge, in 1885 Millardet perfected the Bordeaux mixture, the first preventive fungicide, which is still used today.

At the same time, an American entomologist named Albert Koebele was experimenting with biological pest controls. Citrus trees were once again under attack, only now it was the cottony cushion scale insect. Koebele went to Australia in 1888 and brought back its best-known predator, the vedalia beetle, thereby saving California’s citrus industry.

In 1936, the development of DDT, the first synthetic insecticide, was hailed as a miracle of science, offering the first real defense against malaria and other insect-borne diseases. But it was later discovered to be toxic to animals and humans, and it killed insects indiscriminately. Among its many victims was the vedalia beetle, which led to a resurgence of cottony cushion scale. Ultimately, the Environmental Protection Agency banned DDT’s use in 1972.

The race to invent environmentally safe alternatives is ramping up, but so are the pests. After a wet winter on the West Coast and a warm one in the East, experts are predicting great swarms of bloodsucking insects. Help!

Historically Speaking: The Long Road to Pensions for All

ILLUSTRATION: THOMAS FUCHS

From the Song Dynasty to the American Civil War, governments have experimented with ways to support retired soldiers and workers.

The Wall Street Journal

April 6, 2023

“Will you still need me, will you still feed me,/When I’m sixty-four?” sang the Beatles on their 1967 album “Sgt. Pepper’s Lonely Hearts Club Band.” These were somewhat hypothetical questions at a time when the mean age of American men taking retirement was 64, and their average life expectancy was 67. More than a half-century later, the Beatles song resonates in a different way, because there are so few countries left where retirement on a state pension at 64 is even possible.

Historically, governments preferred not to be in the retirement business, but self-interest sometimes achieved what charitable impulses could not. In 6 A.D., a well-founded fear of civil unrest encouraged Augustus Caesar to institute the first state pension system, the aerarium militare, which looked after retired army veterans. He earmarked a 5% tax on inheritances to pay for the scheme, which served as a stabilizing force in the Roman Empire for the next 400 years. The Sack of Rome in 410 by Alaric, leader of the Visigoths, probably could have been avoided if Roman officials had kept their promise to pay his allied troops their military pensions.

In the 11th century, the Song emperor Shenzong invited the brilliant but mercurial governor of Nanjing, Wang Anshi, to reform China’s entire system of government. Wang’s far-reaching “New Laws” included state welfare plans to care for the aged and infirm. Some of his ideas were accepted but not the retirement plan, which achieved the remarkable feat of uniting both conservatives and radicals against him: The former regarded state pensions as an assault on family responsibility, the latter thought it gave too much power to the government. Wang was forced to retire in 1075.

Leaders in the West were content to muddle along until, like Augustus, they realized that a large nation-state requires a national army to defend it. England’s Queen Elizabeth I oversaw the first army and navy pensions in Europe. She also instituted the first Poor Laws, which codified the state’s responsibility toward its citizens. The problem with the Poor Laws, however, was that they transferred a national problem to the local level and kept it there.

Before he fell victim to the Terror during the French Revolution, the Marquis de Condorcet tried to figure out how France might pay for a national pension system. The question was largely ignored in the U.S. until the Civil War forced the federal government into a reckoning. A military pension system that helped fewer than 10,000 people in 1861 grew into a behemoth serving over 300,000 in 1885. By 1894 military pensions accounted for 37% of the federal budget. One side effect was to hamper the development of national and private pension schemes. Among the few companies to offer retirement pensions for employees were the railroads and American Express.

By the time Frances Perkins, President Franklin Roosevelt’s Labor Secretary, ushered in Social Security in 1935, Germany’s national pension scheme was almost 50 years old. But the German system started at age 70, far too late for most people, which was the idea. As Jane Austen’s Mrs. Dashwood complained in “Sense and Sensibility,” “People always live forever when there is an annuity to be paid to them.” The last Civil War pensioner was Irene Triplett, who died in 2020. She was receiving $73.13 every month for her father’s Union service.

Historically Speaking: In Search of a Good Night’s Sleep

People have been pursuing the secrets of slumber ever since the ancient Egyptians opened sacred ‘sleep clinics.’

The Wall Street Journal

March 24, 2023

It is a riddle worthy of the Sphinx: What is abundant and yet in short supply, free and yet frequently paid for, necessary for life and yet resembles death? The answer is sleep. According to the Centers for Disease Control and Prevention, more than a third of Americans are sleep deprived.

Worries about sleeplessness also kept the ancient Egyptians up at night. They recognized its importance, and many of their methods for treating insomnia are still used today. Smelling or applying lavender oil was the first recommendation. Failing that, there was drinking camomile tea or taking a narcotic such as opium. The Egyptians also invented the sleep clinic. Among the medical services offered by priests was a night in a sacred chamber, where special rituals might send patients into a deep sleep in the hope of receiving a curative visitation from a god.

Ancient Indian and Chinese physicians treated sleep disorders with herbs such as Indian winter cherry, known as ashwagandha, and Korean red ginseng. But they also advocated practical measures to encourage the body to sleep, such as chanting, burning incense and, in the case of the Chinese, choosing the right kind of bed.

Greek physicians copied the practice of divine healing through ritualized sleep, calling it “incubation.” But Greek philosophers were puzzled by sleep, since it seemed to defy categorization. Ever the pragmatist, Aristotle decided in his treatise “On Sleep and Sleeplessness” that sleep was connected to digestion, the final stage of the whole complicated process.

ILLUSTRATION: THOMAS FUCHS

It turns out, however, that ancient prescriptions were based on a different model for sleeping than the one we use today. The historian A. Roger Ekirch has argued that before industrialization changed the workday, most societies practiced segmented sleep—sleeping in two shifts with a break in the middle. Waking up and thinking or futzing for a couple of hours was considered both normal and welcome.

In the 14th century, new translations of ancient Greek and Persian medical texts gave European physicians fresh insights into sleep conditions. They searched for new remedies for insomnia, with laudanum becoming the gold standard after the 16th-century Swiss-German physician Paracelsus discovered how to dissolve opium’s active ingredients in alcohol.

Another important question was how much control a person has over their actions while asleep. Pope Clement V, who took office in 1305, was the first pope to absolve Christians of sinful acts committed while sleepwalking. Centuries later, in 1846, a Boston jury acquitted Albert Tirrell of killing his mistress Mary Bickford, who was found in bed with her throat cut, on the grounds that he was a habitual sleepwalker with a history of somnambulistic violence.

The science of sleep had hardly advanced by 1925, when physiologist Nathaniel Kleitman persuaded the University of Chicago to fund the world’s first sleep laboratory. Helped by a team of students and test subjects that included his own family, Kleitman showed that sleep could be subjected to the same rigorous experiments as any other condition. He discovered REM sleep, which is when dreams occur, as well as the internal body clock and the effects of prolonged sleeplessness. Kleitman demystified the mechanics of sleep but not, alas, the alchemy behind the perfect night’s sleep.

Historically Speaking: When Taxis Were Drawn by Horses

Long before Uber, there were Roman litters, Venetian gondolas and other variations on the ride for hire.

The Wall Street Journal

March 10, 2023

Last month drivers working for Uber and Lyft went on strike in cities in the U.S., Great Britain and the Netherlands. This was on top of strikes in December. The digital ride-hailing companies are looking increasingly like the taxi industry of old: Customers complain about the prices while drivers say they are struggling financially.

Taxis have been a hallmark of urban life since ancient times. In Rome, the most ubiquitous taxi was the cisium, a two-wheeled horse-drawn carriage that could accommodate a single passenger plus a small amount of luggage. Customers could hail them from designated cisium ranks on the city’s main roads. In theory, drivers could be fined for speeding and other infractions, but they were very often a public nuisance. In the 1st century A.D., the emperor Claudius went so far as to ban all for-hire wheeled vehicles from city centers, forcing travelers and those without private carriages to rely on independent litter-bearers and sedan chair porters.

During the Middle Ages, one of the first European cities to have strict regulations governing for-hire vehicles was Venice. In a city of canals, the vehicles in question were gondolas, and from at least the mid-14th century every gondolier had to belong to a guild, known as a scuola. Membership was relatively open and offered a way for immigrants to get ahead. A lawsuit between two scuolas in 1514 reveals that Giovanni the Ethiopian was one of the chief guild officers.

Other cities had battles over competition. In 1635, London’s watermen lobbied King Charles I to ban hackney carriage drivers from accepting journeys shorter than 3 miles. Neither this nor subsequent bans were well-enforced. In 1660, King Charles II prohibited hackney carriages from picking up fares in the streets, which, again, people ignored. On Nov. 22, 1660, the diarist Samuel Pepys wrote of hailing a coach after dinner and sending his wife home in it.

ILLUSTRATION: THOMAS FUCHS

The regulation of public carriage services in Paris and London is a tale of two very different approaches. Single horse-drawn cabriolets, from which the term “cab” derives, emerged in Paris in the 18th century, but the city saw them as a threat to traditional coachmen and so refused to license them. Customers complained that Paris coachmen were rude and unhelpful for want of competition.

London, for its part, frequently revised its licensing rules and coach regulations. Most important, it introduced the “knowledge” system, consisting of routes that every driver must be able to recite to pass the license exam—a requirement that still exists today. A London taxi was, and is, expensive, but driving one is a middle-class job, and the industry has largely avoided the labor strife that plagues New York.

Nor did London have the same terrible accident rates. In 1931, New York recorded more than 23,000 traffic accidents; incredibly, 21,000 of them involved a taxicab. Although the iconic Checker cabs provided more room for passengers and even helped to give New York its distinctive look until their production ended in 1982, taxi-driving was still regarded as a dangerous and low-status profession, exemplified by the introduction of the bulletproof partition in the 1990s.

The arrival of the new ride-hailing companies in 2009 disrupted the traditional taxi industry and forced cities to think about improving their systems. Now it is the turn of the disrupters to do the same.

Historically Speaking: Even Ancient Children Did Homework

Americans have long debated the value of take-home assignments, but children have been struggling with them for millennia.

The Wall Street Journal

February 24, 2023

If American schoolchildren no longer had to do hours of homework each night, a lot of dogs might miss out on their favorite snack, if an old excuse is to be believed. But would the children be worse off? Americans have been debating whether or not to abolish homework for almost a century and a half. Schoolwork and homework became indistinguishable during Covid, when children were learning from home. But the normal school day has returned, and so has the issue.

The ancient Greek philosophers thought deeply about the purpose of education. In “The Republic,” Plato argued that girls as well as boys should receive physical, mental and moral training because it was good for the state. But the Greeks were less concerned about where this training should take place, school or home, or what kind of separation should exist between the two. The Roman statesman Cicero wrote that he learned at home as much as he did outside of it.

ILLUSTRATION: THOMAS FUCHS

But, of course, practice makes perfect, as Pliny the Younger told his students of oratory. Elementary schoolchildren in Greco-Roman Egypt were expected to practice their letters on wax writing tablets. A homework tablet from the 2nd century A.D., now in the British Museum, features two lines of Greek written by the teacher and a child’s attempt to copy them underneath.

Tedious copying exercises also plagued the lives of ancient Chinese students. In 1900 an enormous trove of Buddhist scrolls, between 900 and 1,500 years old, was discovered in a cave near Dunhuang in northwestern China. Scattered among the texts were homework copies made by bored monastery pupils, who scribbled things like, “This is Futong incurring another person’s anger.”

What many people generally think of as homework today—after-class assignments forced on children regardless of their pedagogical usefulness—has its origins in the Prussian school system. In the 18th and 19th centuries, Prussia led the world in mass education. Fueled by the belief that compulsory schooling was the best means of controlling the peasantry, the authorities devised a rigorous system based on universal standards and applied methods. Daily homework was introduced, in part because it was a way of inserting school oversight, and by extension the state, into the home.

American educationalists such as Horace Mann in Massachusetts sought to create a free school system based on the Prussian model. Dividing children into age groups and other practical reforms faced little opposition. But as early as 1855, the American Medical Monthly was warning of the dangers to children’s health from lengthy homework assignments. In the 1880s, the Boston school board expressed its concern by voting to reduce the amount of arithmetic homework in elementary schools.

As more parents complained of lost family time and homework wars, the Ladies’ Home Journal began to campaign for its abolition, calling after-school work a “national crime” in 1900. The California legislature agreed, abolishing all elementary school homework in 1901. The homework debate seesawed until the Russians launched Sputnik in 1957 and shocked Americans out of complacency. Congress quickly passed a $1 billion education spending program. More, not less, homework became the mantra until the permissive ’70s, only to reverse in response to Japan’s economic ascendancy in the ’80s.

All the old criticisms of homework remain today, but perhaps the bigger threat to such assignments is technological, in the form of the universal homework butler known as ChatGPT.

Historically Speaking: The Ancient Elixir Made by Bees

Honey has always been a sweet treat, but it has also long served as a preservative, medicine and salve.

The Wall Street Journal

February 9, 2023

The U.S. Department of Agriculture made medical history last month when it approved the first vaccine for honey bees. Hives will be inoculated against American Foulbrood, a highly contagious bacterial disease that kills bee larvae. Our buzzy friends need all the help they can get. In 2021, a national survey of U.S. beekeepers reported that 45.5% of managed colonies died during the preceding year. Since more than one-third of the foods we eat depend on insect pollinators, a bee-less world would drastically alter everyday life.

The loss of bees would also cost us honey, a foodstuff that throughout human history has been much more than a pleasant sugar substitute. Energy-dense, nutritionally rich wild honey, ideal for brain development, may have helped our earliest human ancestors along the path of evolution. The importance of honey foraging can be inferred from its frequent appearance in Paleolithic art. The Araña Caves of Valencia, Spain, are notable for a particularly evocative line drawing of a honey harvester dangling precariously while thrusting an arm into a beehive.

Honey is easily fermented, and there is evidence that the ancient Chinese were making a mixed fruit, rice and honey alcoholic beverage as early as 7000 B.C. The Egyptians may have been the first to domesticate bees. A scene in the sun temple of Pharaoh Nyuserre Ini, built around 2400 B.C., depicts beekeepers blowing smoke into hives as they collect honey. They loved the taste, of course, but honey also played a fundamental role in Egyptian culture. It was used in religious rituals, as a preservative (for embalming) and, because of its antibacterial properties, as an ingredient in hundreds of concoctions from contraceptives to gastrointestinal medicines and salves for wounds.

The oldest known written reference to honey comes from a 4,000-year-old recipe for a skin ointment, noted on a cuneiform clay tablet found among the ruins of Nippur in the Iraqi desert.

The ancient Greeks judged honey like fine wine, rating its qualities by bouquet and region. The thyme-covered slopes of Mount Hymettus, near Athens, were thought to produce the best varieties, prompting sometimes violent competition between beekeepers. The Greeks also appreciated its preservative properties. In 323 B.C., the body of Alexander the Great was allegedly transported in a vat of honey to prevent it from spoiling.

Honey’s many uses were also recognized in medieval Europe. In fact, in 1403 honey helped to save the life of the 16-year-old Prince Henry, the future King Henry V of England. During the Battle of Shrewsbury, an arrowhead became embedded in his cheekbone. The extraction process was long and painful, resulting in a gaping hole. Knowing the dangers of an open wound, the royal surgeon John Bradmore treated the cavity with a honey mixture that kept it safe from dirt and bacteria.

Despite Bradmore’s success, honey was relegated to folk-remedy status until World War I, when medical shortages encouraged Russian doctors to use it in wound treatments. Honey was soon upstaged by the discovery of penicillin in 1928, but today its time has come again.

A 2021 study in the medical journal BMJ found honey to be a cheap and effective treatment for the symptoms of upper respiratory tract infections. Scientists are exploring its potential uses in fighting cancer, diabetes, asthma and cardiovascular disease.

To save ourselves, however, first we must save the bees.

Historically Speaking: Fakers, Con Men and Pretenders to the Throne

George Santos is far from the first public figure to have assumed an identity later discovered to be rife with fictions.

The Wall Street Journal

January 27, 2023

Few would have thought it possible in the age of the internet, and yet U.S. Rep. George Santos turns out to have invented a long list of details of the life story he told as a candidate.

It was much easier to be an impostor in the ancient world, when travel was difficult, communications slow, and few people even knew what their rulers looked like. One of history’s oldest and strangest examples depended on this ignorance.

According to the Greek historian Herodotus, King Cambyses II of Persia became so jealous of his younger brother Bardiya, also known in history as Smerdis, that he ordered his assassination in 522 B.C. Imagine Cambyses’s shock, therefore, when news came to him while he was away fighting in Egypt that a man calling himself Smerdis had seized the throne.

ILLUSTRATION: THOMAS FUCHS

Cambyses died before he could confront the impostor. But Prince Darius, a former ally of Cambyses, suspected that the new king was actually a Magian priest named Gaumata. Herodotus relates that Darius knew a crucial fact about Gaumata: his ears had been cut off.

Royal etiquette kept the king at a distance from everyone—even the queen lived and slept in separate quarters. But Darius persuaded her to find an opportunity to be in the king’s apartment when he was asleep and check his ears. They were missing! Darius promptly denounced this Smerdis as an impostor, proclaimed himself the savior of the kingdom and took the throne. Modern historians suspect that the real impostor was Darius and that he invented the Gaumata story to justify his coup against Smerdis.

The Middle Ages were a boom time for royal impostors. Kings and crown princes were often assassinated or executed under conditions that could plausibly be spun into tales of miraculous escape. Some of those impersonating long-lost monarchs managed to get quite far. King Henry VII of England spent eight years fighting a rebellion led by Perkin Warbeck, who claimed to be one of the two princes held in the Tower of London and killed by King Richard III in the 1480s.

During the Renaissance, the most famous case in Europe involved neither a king nor the heir to a great fortune but a French peasant named Martin Guerre. In 1548, the feckless Guerre abandoned his wife, Bertrande, and their son. Eight years later, a look-alike named Arnaud du Thil suddenly appeared, claiming to be Guerre. He settled down and became a good member of the community—too good for those who had known the old Guerre. But Bertrande insisted he was the same man. The deception unraveled in 1560 when the real Martin Guerre made a sensational return in the middle of a court trial to decide du Thil’s identity. Du Thil was executed for impersonation, but the judge declared Bertrande innocent of adultery on the grounds that women are known to be very silly creatures and easily deceived.

Opportunities to claim titles and thrones diminished after the 18th century, and a new class of impostor arose: the confidence man. Mr. Santos isn’t yet a match for the Czech-American fraudster “Count” Victor Lustig. In 1925, posing as a French government official, Lustig successfully auctioned off the Eiffel Tower. In Chicago the following year, he posed as an investor and swindled Al Capone out of $5,000.

Lustig’s long list of crimes eventually landed him in Alcatraz. Where Mr. Santos is heading is a mystery—much like where he’s been.