Historically Speaking: In Search of a Good Night’s Sleep

People have been pursuing the secrets of slumber ever since the ancient Egyptians opened sacred ‘sleep clinics.’

The Wall Street Journal

March 24, 2023

It is a riddle worthy of the Sphinx: What is abundant and yet in short supply, free and yet frequently paid for, necessary for life and yet resembles death? The answer is sleep. According to the Centers for Disease Control and Prevention, more than a third of Americans are sleep deprived.

Worries about sleeplessness also kept the ancient Egyptians up at night. They recognized its importance, and many of their methods for treating insomnia are still used today. Smelling or applying lavender oil was the first recommendation. Failing that, there was drinking chamomile tea or taking a narcotic such as opium. The Egyptians also invented the sleep clinic. Among the medical services offered by priests was a night in a sacred chamber, where special rituals might send patients into a deep sleep in the hope of receiving a curative visitation from a god.

Ancient Indian and Chinese physicians treated sleep disorders with herbs such as Indian winter cherry, known as ashwagandha, and Korean red ginseng. But they also advocated practical measures to encourage the body to sleep, such as chanting, burning incense and, in the case of the Chinese, choosing the right kind of bed.

Greek physicians copied the practice of divine healing through ritualized sleep practices, calling it “incubation.” But Greek philosophers were puzzled by sleep, since it seemed to defy categorization. Ever the pragmatist, Aristotle decided in his treatise “On Sleep and Sleeplessness” that sleep was connected to digestion, the final stage of the whole complicated process.

ILLUSTRATION: THOMAS FUCHS

It turns out, however, that ancient prescriptions were based on a different model for sleeping than the one we use today. The historian A. Roger Ekirch has argued that before industrialization changed the workday, most societies practiced segmented sleep—sleeping in two shifts with a break in the middle. Waking up and thinking or futzing for a couple of hours was considered both normal and welcome.

In the 14th century, new translations of ancient Greek and Persian medical texts gave European physicians fresh insights into sleep conditions. Scientists searched for new remedies for insomnia, with laudanum becoming the gold standard after the 16th-century Swiss-German physician Paracelsus discovered how to dissolve opium's active ingredients in alcohol.

Another important question was how much control a person has over their actions while asleep. Pope Clement V, who took office in 1305, was the first pope to absolve Christians of sinful acts committed while sleepwalking. Centuries later, in 1846, a Boston jury acquitted Albert Tirrell of killing his mistress Mary Bickford, who was found in bed with her throat cut, on the grounds that he was a habitual sleepwalker with a history of somnambulistic violence.

The science of sleep had hardly advanced by 1925, when physiologist Nathaniel Kleitman persuaded the University of Chicago to fund the world’s first sleep laboratory. Helped by a team of students and test subjects that included his own family, Kleitman showed that sleep could be subjected to the same rigorous experiments as any other condition. He discovered REM sleep, which is when dreams occur, as well as the internal body clock and the effects of prolonged sleeplessness. Kleitman demystified the mechanics of sleep but not, alas, the alchemy behind the perfect night’s sleep.

Historically Speaking: When Taxis Were Drawn by Horses

Long before Uber, there were Roman litters, Venetian gondolas and other variations on the ride for hire.

The Wall Street Journal

March 10, 2023

Last month drivers working for Uber and Lyft went on strike in cities in the U.S., Great Britain and the Netherlands. This was on top of strikes in December. The digital ride-hailing companies are looking increasingly like the taxi industry of old: Customers complain about the prices while drivers say they are struggling financially.

Taxis have been a hallmark of urban life since ancient times. In Rome, the most ubiquitous taxi was the cisium, a two-wheeled horse-drawn carriage that could accommodate a single passenger plus a small amount of luggage. Customers could hail them from designated cisium ranks on the city’s main roads. In theory, drivers could be fined for speeding and other infractions, but they were very often a public nuisance. In the first century, the emperor Claudius went so far as to ban all for-hire wheeled vehicles from city centers, forcing travelers and those without private carriages to rely on independent litter-bearers and sedan chair porters.

During the Middle Ages, one of the first European cities to have strict regulations governing for-hire vehicles was Venice. In a city of canals, the vehicles in question were gondolas, and from at least the mid-14th century every gondolier had to belong to a guild, known as a scuola. Membership was relatively open and offered a way for immigrants to get ahead. A lawsuit between two scuole in 1514 reveals that Giovanni the Ethiopian was one of the chief guild officers.

Other cities had battles over competition. In 1635, London’s watermen lobbied King Charles I to ban hackney carriage drivers from accepting journeys shorter than 3 miles. Neither this nor subsequent bans were well-enforced. In 1660, King Charles II prohibited hackney carriages from picking up fares in the streets, which, again, people ignored. On Nov. 22, 1660, the diarist Samuel Pepys wrote of hailing a coach after dinner and sending his wife home in it.


The regulation of public carriage services in Paris and London is a tale of two very different approaches. Single horse-drawn cabriolets, from which the term “cab” derives, emerged in Paris in the 18th century, but the city saw them as a threat to traditional coachmen and so refused to license them. Customers complained that Paris coachmen were rude and unhelpful for want of competition.

London, for its part, frequently revised its licensing rules and coach regulations. Most important, it introduced the "knowledge" system, consisting of routes that every driver must be able to recite to pass the license exam—a requirement that still exists today. A London taxi was, and is, expensive, but driving is a middle-class job, and the industry has largely avoided the labor strife that plagues New York.

Nor did London have the same terrible accident rates. In 1931, New York recorded more than 23,000 traffic accidents; incredibly, 21,000 of them involved a taxicab. Although the iconic Checker cabs provided more room for passengers and even helped to give New York its distinctive look until their production ended in 1982, taxi-driving was still regarded as a dangerous and low-status profession, exemplified by the introduction of the bulletproof partition in the 1990s.

The arrival of the new ride-hailing companies in 2009 disrupted the traditional taxi industry and forced cities to think about improving their systems. Now it is the turn of the disrupters to do the same.

Historically Speaking: Even Ancient Children Did Homework

Americans have long debated the value of take-home assignments, but children have been struggling with them for millennia.

The Wall Street Journal

February 24, 2023

If American schoolchildren no longer had to do hours of homework each night, a lot of dogs might miss out on their favorite snack, if an old excuse is to be believed. But would the children be worse off? Americans have been debating whether or not to abolish homework for almost a century and a half. Schoolwork and homework became indistinguishable during Covid, when children were learning from home. But the normal school day has returned and so has the issue.

The ancient Greek philosophers thought deeply about the purpose of education. In "The Republic," Plato argued that girls as well as boys should receive physical, mental and moral training because it was good for the state. But the Greeks were less concerned about where this training should take place, school or home, or what kind of separation should exist between the two. The Roman statesman Cicero wrote that he learned at home as much as he did outside of it.


But, of course, practice makes perfect, as Pliny the Younger told his students of oratory. Elementary schoolchildren in Greco-Roman Egypt were expected to practice their letters on wax writing tablets. A homework tablet from the second century AD, now in the British Museum, features two lines of Greek written by the teacher and a child’s attempt to copy them underneath.

Tedious copying exercises also plagued the lives of ancient Chinese students. In 1900 an enormous trove of Buddhist scrolls, between 900 and 1,500 years old, was discovered in a cave near Dunhuang in northwestern China. Scattered among the texts were homework copies made by bored monastery pupils who scribbled things like, "This is Futong incurring another person's anger."

What many people generally think of as homework today—after-class assignments forced on children regardless of their pedagogical usefulness—has its origins in the Prussian school system. In the 18th and 19th centuries, Prussia led the world in mass education. Fueled by the belief that compulsory schooling was the best means of controlling the peasantry, the authorities devised a rigorous system based on universal standards and applied methods. Daily homework was introduced, in part because it was a way of inserting school oversight, and by extension the state, into the home.

American educationalists such as Horace Mann in Massachusetts sought to create a free school system based on the Prussian model. Dividing children into age groups and other practical reforms faced little opposition. But as early as 1855, the American Medical Monthly was warning of the dangers to children’s health from lengthy homework assignments. In the 1880s, the Boston school board expressed its concern by voting to reduce the amount of arithmetic homework in elementary schools.

As more parents complained of lost family time and homework wars, the Ladies’ Home Journal began to campaign for its abolition, calling after-school work a “national crime” in 1900. The California legislature agreed, abolishing all elementary school homework in 1901. The homework debate seesawed until the Russians launched Sputnik in 1957 and shocked Americans out of complacency. Congress quickly passed a $1 billion education spending program. More, not less, homework became the mantra until the permissive ‘70s, only to reverse in response to Japan’s economic ascendancy in the ‘80s.

All the old criticisms of homework remain today, but perhaps the bigger threat to such assignments is technological, in the form of the universal homework butler known as ChatGPT.

Historically Speaking: The Ancient Elixir Made by Bees

Honey has always been a sweet treat, but it has also long served as a preservative, medicine and salve.

The Wall Street Journal

February 9, 2023

The U.S. Department of Agriculture made medical history last month when it approved the first vaccine for honey bees. Hives will be inoculated against American Foulbrood, a highly contagious bacterial disease that kills bee larvae. Our buzzy friends need all the help they can get. In 2021, a national survey of U.S. beekeepers reported that 45.5% of managed colonies died during the preceding year. Since more than one-third of the foods we eat depend on insect pollinators, a bee-less world would drastically alter everyday life.

The loss of bees would also cost us honey, a foodstuff that throughout human history has been much more than a pleasant sugar substitute. Energy-dense, nutritionally rich wild honey, ideal for brain development, may have helped our earliest human ancestors along the path of evolution. The importance of honey foraging can be inferred from its frequent appearance in Paleolithic art. The Araña Caves of Valencia, Spain, are notable for a particularly evocative line drawing of a honey harvester dangling precariously while thrusting an arm into a beehive.

Honey is easily fermented, and there is evidence that the ancient Chinese were making a mixed fruit, rice and honey alcoholic beverage as early as 7000 B.C. The Egyptians may have been the first to domesticate bees. A scene in the sun temple of Pharaoh Nyuserre Ini, built around 2400 B.C., depicts beekeepers blowing smoke into hives as they collect honey. They loved the taste, of course, but honey also played a fundamental role in Egyptian culture. It was used in religious rituals, as a preservative (for embalming) and, because of its anti-bacterial properties, as an ingredient in hundreds of concoctions from contraceptives to gastrointestinal medicines and salves for wounds.

The oldest known written reference to honey comes from a 4,000-year-old recipe for a skin ointment, noted on a cuneiform clay tablet found among the ruins of Nippur in the Iraqi desert.

The ancient Greeks judged honey like fine wine, rating its qualities by bouquet and region. The thyme-covered slopes of Mount Hymettus, near Athens, were thought to produce the best varieties, prompting sometimes violent competition between beekeepers. The Greeks also appreciated its preservative properties. In 323 B.C., the body of Alexander the Great was allegedly transported in a vat of honey to prevent it from spoiling.

Honey’s many uses were also recognized in medieval Europe. In fact, in 1403 honey helped to save the life of 16-year-old Prince Henry, the future King Henry V of England. During the Battle of Shrewsbury, an arrowhead became embedded in his cheekbone. The extraction process was long and painful, resulting in a gaping hole. Knowing the dangers of an open wound, the royal surgeon John Bradmore treated the cavity with a honey mixture that kept it safe from dirt and bacteria.

Despite Bradmore’s success, honey was relegated to folk remedy status until World War I. Then medical shortages encouraged Russian doctors to use honey in wound treatments. Honey was soon after upstaged by the discovery of penicillin in 1928, but today its time has come.

A 2021 study in the medical journal BMJ found honey to be a cheap and effective treatment for the symptoms of upper respiratory tract infections. Scientists are exploring its potential uses in fighting cancer, diabetes, asthma and cardiovascular disease.

To save ourselves, however, first we must save the bees.

Historically Speaking: Fakers, Con Men and Pretenders to the Throne

George Santos is far from the first public figure to have assumed an identity later discovered to be rife with fictions

The Wall Street Journal

January 27, 2023

Few would have thought it possible in the age of the internet, and yet U.S. Rep. George Santos turns out to have invented a long list of details of the life story he told as a candidate.

It was much easier to be an impostor in the ancient world, when travel was difficult, communications slow, and few people even knew what their rulers looked like. One of history’s oldest and strangest examples depended on this ignorance.

According to the Greek historian Herodotus, King Cambyses II of Persia became so jealous of his younger brother Bardiya, also known in history as Smerdis, that he ordered his assassination in 522 B.C. Imagine Cambyses’s shock, therefore, when news came to him while he was away fighting in Egypt that a man calling himself Smerdis had seized the throne.


Cambyses died before he could confront the impostor. But Prince Darius, a former ally of Cambyses, suspected that the new king was actually a Magian priest named Gaumata. Herodotus relates that Darius knew a crucial fact about Gaumata: his ears had been cut off.

Royal etiquette kept the king at a distance from everyone—even the queen lived and slept in separate quarters. But Darius persuaded her to find an opportunity to be in the king’s apartment when he was asleep and check his ears. They were missing! Darius promptly denounced this Smerdis as an impostor, proclaimed himself the savior of the kingdom and took the throne. Modern historians suspect that the real impostor was Darius and that he invented the Gaumata story to justify his coup against Smerdis.

The Middle Ages were a boom time for royal impostors. Kings and crown princes were often assassinated or executed under conditions that could plausibly be spun into tales of miraculous escape. Some of those impersonating long-lost monarchs managed to get quite far. King Henry VII of England spent eight years fighting a rebellion led by Perkin Warbeck, who claimed to be one of the two princes held in the Tower of London and killed by King Richard III in the 1480s.

During the Renaissance, the most famous case in Europe involved neither a king nor the heir to a great fortune but a French peasant named Martin Guerre. In 1548, the feckless Guerre abandoned his wife, Bertrande, and their son. Eight years later, a look-alike named Arnaud du Thil suddenly appeared, claiming to be Guerre. He settled down and became a good member of the community—too good for those who had known the old Guerre. But Bertrande insisted he was the same man. The deception unraveled in 1560 when the real Martin Guerre made a sensational return in the middle of a court trial to decide du Thil’s identity. Du Thil was executed for impersonation, but the judge declared Bertrande innocent of adultery on the grounds that women are known to be very silly creatures and easily deceived.

Opportunities to claim titles and thrones diminished after the 18th century, and a new class of impostor arose: the confidence man. Mr. Santos isn’t yet a match for the Czech-American fraudster “Count” Victor Lustig. In 1925, posing as a French government official, Lustig successfully auctioned off the Eiffel Tower. In Chicago the following year, he posed as an investor and swindled Al Capone out of $5,000.

Lustig’s long list of crimes eventually landed him in Alcatraz. Where Mr. Santos is heading is a mystery—much like where he’s been.

Historically Speaking: The Long, Dark Shadow of the Real ‘Bedlam’

Today’s debate over compulsory treatment for the mentally ill has roots in a history of good intentions gone awry

The Wall Street Journal

January 12, 2023

This year, California and New York City will roll out plans to force the homeless mentally ill to receive hospital treatment. The initiatives face fierce legal challenges despite their backers’ good intentions and promised extra funds.

Opposition to compulsory hospitalization has its roots in the historic maltreatment of mental patients. For centuries, the biggest problem regarding the care of the mentally ill was the lack of it. Until the 18th century, Britain was typical in having only one public insane asylum, Bethlem Royal Hospital. The conditions were so notorious, even by contemporary standards, that the hospital’s nickname, Bedlam, became synonymous with violent anarchy.

Plate 8 of William Hogarth’s ‘A Rake’s Progress,’ titled ‘In The Madhouse,’ was painted around 1735 and depicted the hospital known as ‘Bedlam.’
PHOTO: HERITAGE IMAGES VIA GETTY IMAGES

The cost of treatment at Bedlam, which consisted of pacifying the patients through pain and terror, was offset by viewing fees. Anyone could pay to stare or laugh at the inmates, and thousands did. But social attitudes toward mental illness were changing. By the end of the 18th century, psychiatric reformers such as Benjamin Rush in America and Philippe Pinel in France had demonstrated the efficacy of more humane treatment.

In a burst of optimism, New York Hospital created a ward for the “curable” insane in 1792. The Quaker-run “Asylum for the Relief of Persons Deprived of the Use of Their Reason” in Pennsylvania became the first dedicated mental hospital in the U.S. in 1813. By the 1830s there were at least a dozen private mental hospitals in America.

The public authorities, however, were still shutting the mentally ill in prisons, as the social reformer Dorothea Dix was appalled to discover in 1841. Dix’s energetic campaigning bore fruit in New Jersey, which soon built its first public asylum. Designed by Thomas Kirkbride to provide state-of-the-art care amid pleasant surroundings, Trenton State Hospital served as a model for more than 70 purpose-built asylums that sprang up across the nation after Congress approved government funding for them in 1860.

Unfortunately, the philanthropic impetus driving the public mental hospital movement created as many problems as it solved. Abuse became rampant. It was so easy to have a person committed that in the 1870s, the future President Grover Cleveland, then still an aspiring politician, successfully silenced the mother of his illegitimate son by having her spirited away to an asylum.

In 1887, the journalist Nellie Bly went undercover as a patient in the Women’s Lunatic Asylum on Blackwell’s Island (now Roosevelt Island), New York. She exposed both the brutal practices of that institution and the general lack of legal safeguards against unwarranted incarceration.


During the first half of the 20th century, the best-run public mental hospitals lived up to the ideals that had inspired them. But the worst seemed to confirm fears that patients on the receiving end of state benevolence lost all basic rights. At Trenton State Hospital between 1907 and 1930, the director Henry Cotton performed thousands of invasive surgeries in the mistaken belief that removing patients’ teeth or organs would cure their mental illnesses. He ended up killing almost a third of those he treated and leaving the rest damaged and disfigured. The public uproar was immense. And yet, just a decade later, some mental hospitals were performing lobotomies on patients with or without consent.

In 1975 the ACLU persuaded the Supreme Court that the mentally ill had the right to refuse hospitalization, making public mental-health care mostly voluntary. But while legal principles are black and white, mental illness comes in shades of gray: A half century later, up to a third of people living on the streets are estimated to be mentally ill. As victories go, the Supreme Court decision was also a tragedy.

Historically Speaking: You Might Not Want to Win a Roman Lottery

Humans have long liked to draw lots as a way to win fortunes and settle fates

The Wall Street Journal

November 25, 2022

Someone in California won this month’s $2.04 billion Powerball lottery—the largest in U.S. history. The odds are staggering. The likelihood of death by plane crash (often estimated at 1 in 11 million for the average American) is greater than that of winning the Powerball or Mega Millions lottery (1 in roughly 300 million). Despite this, just under 50% of American adults bought a lottery ticket last year.

What drives people to risk their luck playing the lottery is more than just lousy math. Lotteries tap into a deep need among humans to find meaning in random events. Many ancient societies, from the Chinese to the Hebrews, practiced cleromancy, or the casting of lots to enable divine will to express itself. It is stated in the Bible’s Book of Proverbs: “The lot is cast into the lap, but its every decision is from the Lord.”

The ancient Greeks were among the first to use lotteries to ensure impartiality for non-religious purposes. The Athenians relied on a special device called a “kleroterion” for selecting jurors and public officials at random, to avoid unfair interference. The Romans had a more terrible use for drawing lots: A kind of collective military punishment known as “decimation” required a disgraced legion to select 1 out of every 10 soldiers at random and execute them. The last known use of the practice was in World War I by French and possibly Italian commanders.

The Romans also found less bloody uses for lotteries, including as a source of state revenue. Emperor Nero was likely the first ruler to use a raffle system as a means of filling the treasury without raising taxes.


Following the fall of Rome, lotteries found other uses in the West—for example, as a means of allocating market stalls. But state lotteries only returned to Europe after 1441, when the city of Bruges successfully experimented with one as a means to finance its community projects. These fundraisers didn’t always work, however. A lack of faith in the English authorities severely dampened ticket sales for Queen Elizabeth I’s first (and last) National Lottery in 1567.

When they did work, the bonanzas could be significant: In the New World, lotteries helped to pay for the first years of the Jamestown Colony, as well as Harvard, Yale, Princeton, Columbia and many other institutions. And in France in 1729, the philosopher Voltaire got very rich by winning the national lottery, which was meant to sell bonds by making each bond a ticket for a jackpot drawing. He did it by unsavory means: Voltaire was part of a consortium of schemers who took advantage of a flaw in the lottery’s design by buying up enormous numbers of very cheap bonds.

Corruption scandals and failures eventually took their toll. Critics such as the French novelist Honoré de Balzac, who called lotteries the “opium of poverty,” denounced them for exploiting the poor. Starting in the late 1820s, a raft of anti-lottery laws were enacted on both sides of the Atlantic. Debates continued about them even where they remained legal. The Russian novelist Anton Chekhov highlighted their debilitating effects in his 1887 short story “The Lottery Ticket,” about a contented couple who are tormented and finally turned into raging malcontents by the mere possibility of winning.

New Hampshire was the first American state to roll back the ban on lotteries in 1964. Since then, state lotteries have proven to be neither the disaster nor the cure-all predicted. As for the five holdouts that still have no state lottery—Alabama, Alaska, Hawaii, Nevada and Utah—they are doing just fine.

Historically Speaking: Modern Dentistry’s Painful Past

Just be thankful that your teeth aren’t drilled with a flint or numbed with cocaine

The Wall Street Journal

November 3, 2022

Since the start of the pandemic, a number of studies have uncovered a surprising link: The presence of gum disease, the first sign often being bloody gums when brushing, can make a patient with Covid three times more likely to be admitted to the ICU with complications. Putting off that visit to the dental hygienist may not be such a good idea just now.

Many of us have unpleasant memories of visits to the dentist, but caring for our teeth has come a very long way over the millennia. What our ancestors endured is incredible. In 2006 an Italian-led team of evolutionary anthropologists working in Balochistan, in southwestern Pakistan, found several 9,000-year-old skeletons whose infected teeth had been drilled using a pointed flint tool. Attempts at re-creating the process found that the patients would have had to sit still for up to a minute. Poppies grow in the region, so there may have been some local expertise in using them for anesthesia.

Acupuncture may have been used to treat tooth pain in ancient China, but the chronology remains uncertain. In ancient Egypt, dentistry was considered a distinct medical skill, but the Egyptians still had some of the worst teeth in the ancient world, mainly from chewing on food adulterated with grit and sand. They were averse to dental surgery, relying instead on topical pain remedies such as amulets, mouthwashes and even pastes made from dead mice. Faster relief could be obtained in Rome, where tooth extraction—and dentures—were widely available.


In Europe during the Middle Ages, dentistry fell under the purview of barbers. They could pull teeth but little else. The misery of mouth pain continued unabated until the early 18th century, when the French physician Pierre Fauchard became the first doctor to specialize in teeth. A rigorous scientist, Fauchard helped to found modern dentistry by recording his innovative methods and discoveries in a two-volume work, “The Surgeon Dentist.”

Fauchard elevated dentistry to a serious profession on both sides of the Atlantic. Before he became famous for his midnight ride, Paul Revere earned a respectable living making dentures. George Washington’s false teeth were actually a marvel of colonial-era technology, combining human teeth with elephant and walrus ivory to create a realistic look.

During the 1800s, the U.S. led the world in dental medicine, not only as an academic discipline but in the standardization of the practice, from the use of automatic drills to the dentist’s chair.

Perhaps the biggest breakthrough was the invention of local anesthetic in 1884. In July of that year, Sigmund Freud published a paper in Vienna on the potential uses of cocaine. Four months later in America, Richard Hall and William S. Halsted, whose pioneering work included the first radical mastectomy for breast cancer, decided to test cocaine’s numbing properties by injecting it into their dental nerves. Hall had an infected incisor filled without feeling a thing.

For patients, the experiment was a miracle. For Hall and Halsted, it was a disaster; both became cocaine addicts. Fortunately, dental surgery would be made safe by the invention of non-habit-forming Novocain 20 years later.

With orthodontics, veneers, implants and teeth whiteners, dentists can give anyone a beautiful smile nowadays. But the main thing is that oral care doesn’t have to hurt, and it could just save your life. So make that appointment.

Historically Speaking: The Fungus That Fed Gods And Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty, A.D. 25-220, was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.


There is no hiding the fact that mushrooms have a dark side, however. Fewer than 100 species are actually poisonous out of the thousands of varieties that have been identified. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another speculated victim of the death cap mushroom was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII’s attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: A Pocket-Sized Dilemma for Women

Unlike men’s clothing, women’s fashion has for centuries been indifferent to giving women ways to stash things in their garments

The Wall Street Journal

September 29, 2022

The current round of Fashion Weeks started in New York on Sept. 9 and will end in Paris on Oct. 4, with London and Milan slotted in between. Amid the usual impractical and unwearable outfits on stage, some designers went their own way and featured—gasp—women’s wear with large pockets.

The anti-pocket prejudice in women’s clothing runs deep. In 1954, the French designer Christian Dior stated: “Men have pockets to keep things in, women for decoration.” Designers seem to think that their idea of how a woman should look outweighs what she needs from her clothes. That mentality probably explains why a 2018 survey found that 100% of the pockets in men’s jeans were large enough to fit a midsize cellphone, but only 40% of women’s jeans pockets measured up.

The pocket is an ancient idea, initially designed as a pouch that was tied or sewn to a belt beneath a layer of clothing. Ötzi, the 5,300-year-old ice mummy I wrote about recently for having the world’s oldest known tattoos, also wore an early version of a pocket; it contained his fire-starting tools.

ILLUSTRATION: THOMAS FUCHS

The ancient concept was so practical that the same basic design was still in use during the medieval era. Attempts to find other storage solutions usually came up short. In the 16th century a man’s codpiece sometimes served as an alternative holdall, despite the awkwardness of having to fish around your crotch to find things. Its fall from favor at the end of the 1600s coincided with the first in-seam pockets for men.

It was at this stage that the pocket divided into “his” and “hers” styles. Women retained the tie-on version; the fashion for wide dresses allowed plenty of room to hang a pouch underneath the layers of petticoats. But the arrangement was impractical, since reaching a pocket meant lifting those layers up.

Moralists looked askance at women’s pockets, which seemed to defy male oversight and could potentially be a hiding place for love letters, money and makeup. On the other hand, in the 17th century a maidservant was able to thwart the unwelcome advances of the diarist Samuel Pepys by grabbing a pin from her pocket and threatening to stab him with it, according to his own account.

Matters looked up for women in the 18th century with the inclusion of side slits on dresses that allowed them direct access to their pockets. Newspapers began to carry advertisements for articles made especially to be carried in them. Sewing kits and snuff boxes were popular items, as were miniature “conversation cards” containing witty remarks “to create mirth in mixed companies.”

Increasingly, though, the essential difference between men’s and women’s pockets—his being accessible and almost anywhere, hers being hidden and nestled near her groin—gave them symbolism. Among the macabre acts committed by the Victorian serial killer Jack the Ripper was the disemboweling of his victims’ pockets, which he left splayed open next to their bodies.

Women had been agitating for more practical dress styles since the formation of the Rational Dress Society in Britain in 1881, but it took the upheavals caused by World War I for real change to happen. Women’s pantsuits started to appear in the 1920s. First lady Eleanor Roosevelt caused a sensation by appearing in one in 1933. The real revolution began in 1934, with the introduction of Levi’s bluejeans for women, 61 years after the originals for men. The women’s front pocket was born. And one day, with luck, it will grow up to be the same size as men’s.