Historically Speaking: Even Ancient Children Did Homework

Americans have long debated the value of take-home assignments, but children have been struggling with them for millennia.

The Wall Street Journal

February 24, 2023

If American schoolchildren no longer had to do hours of homework each night, a lot of dogs might miss out on their favorite snack, if an old excuse is to be believed. But would the children be worse off? Americans have been debating whether to abolish homework for almost a century and a half. Schoolwork and homework became indistinguishable during Covid, when children were learning from home. But the normal school day has returned, and so has the issue.

The ancient Greek philosophers thought deeply about the purpose of education. In the Republic, Plato argued that girls as well as boys should receive physical, mental and moral training because it was good for the state. But the Greeks were less concerned about where this training should take place, school or home, or what kind of separation should exist between the two. The Roman statesman Cicero wrote that he learned at home as much as he did outside of it.


But, of course, practice makes perfect, as Pliny the Younger told his students of oratory. Elementary schoolchildren in Greco-Roman Egypt were expected to practice their letters on wax writing tablets. A homework tablet from the second century AD, now in the British Museum, features two lines of Greek written by the teacher and a child’s attempt to copy them underneath.

Tedious copying exercises also plagued the lives of ancient Chinese students. In 1900 an enormous trove of Buddhist scrolls, between 900 and 1,500 years old, was discovered in a cave near Dunhuang in northwestern China. Scattered among the texts were homework copies made by bored monastery pupils, who scribbled things like, “This is Futong incurring another person’s anger.”

What many people think of as homework today—after-class assignments forced on children regardless of their pedagogical usefulness—has its origins in the Prussian school system. In the 18th and 19th centuries, Prussia led the world in mass education. Fueled by the belief that compulsory schooling was the best means of controlling the peasantry, the authorities devised a rigorous system based on universal standards and applied methods. Daily homework was introduced in part because it was a way of inserting the school’s oversight, and by extension the state’s, into the home.

American educationalists such as Horace Mann in Massachusetts sought to create a free school system based on the Prussian model. Dividing children into age groups and other practical reforms faced little opposition. But as early as 1855, the American Medical Monthly was warning of the dangers to children’s health from lengthy homework assignments. In the 1880s, the Boston school board expressed its concern by voting to reduce the amount of arithmetic homework in elementary schools.

As more parents complained of lost family time and homework wars, the Ladies’ Home Journal began to campaign for its abolition, calling after-school work a “national crime” in 1900. The California legislature agreed, abolishing all elementary school homework in 1901. The homework debate seesawed until the Russians launched Sputnik in 1957 and shocked Americans out of complacency. Congress quickly passed a $1 billion education spending program. More, not less, homework became the mantra until the permissive ‘70s, only to reverse in response to Japan’s economic ascendancy in the ‘80s.

All the old criticisms of homework remain today, but perhaps the bigger threat to such assignments is technological, in the form of the universal homework butler known as ChatGPT.

Historically Speaking: The Ancient Elixir Made by Bees

Honey has always been a sweet treat, but it has also long served as a preservative, medicine and salve.

The Wall Street Journal

February 9, 2023

The U.S. Department of Agriculture made medical history last month when it approved the first vaccine for honey bees. Hives will be inoculated against American Foulbrood, a highly contagious bacterial disease that kills bee larvae. Our buzzy friends need all the help they can get. In 2021, a national survey of U.S. beekeepers reported that 45.5% of managed colonies died during the preceding year. Since more than one-third of the foods we eat depend on insect pollinators, a bee-less world would drastically alter everyday life.

The loss of bees would also cost us honey, a foodstuff that throughout human history has been much more than a pleasant sugar-substitute. Energy-dense, nutritionally-rich wild honey, ideal for brain development, may have helped our earliest human ancestors along the path of evolution. The importance of honey foraging can be inferred from its frequent appearance in Paleolithic art. The Araña Caves of Valencia, Spain, are notable for a particularly evocative line drawing of a honey harvester dangling precariously while thrusting an arm into a beehive.

Honey is easily fermented, and there is evidence that the ancient Chinese were making a mixed fruit, rice and honey alcoholic beverage as early as 7000 B.C. The Egyptians may have been the first to domesticate bees. A scene in the sun temple of Pharaoh Nyuserre Ini, built around 2400 B.C., depicts beekeepers blowing smoke into hives as they collect honey. They loved the taste, of course, but honey also played a fundamental role in Egyptian culture. It was used in religious rituals, as a preservative (for embalming) and, because of its anti-bacterial properties, as an ingredient in hundreds of concoctions from contraceptives to gastrointestinal medicines and salves for wounds.

The oldest known written reference to honey comes from a 4,000-year-old recipe for a skin ointment, noted on a cuneiform clay tablet found among the ruins of Nippur in the Iraqi desert.

The ancient Greeks judged honey like fine wine, rating its qualities by bouquet and region. The thyme-covered slopes of Mount Hymettus, near Athens, were thought to produce the best varieties, prompting sometimes violent competition between beekeepers. The Greeks also appreciated its preservative properties. In 323 B.C., the body of Alexander the Great was allegedly transported in a vat of honey to prevent it from spoiling.

Honey’s many uses were also recognized in medieval Europe. In fact, in 1403 honey helped to save the life of the 16-year-old Prince Henry, the future King Henry V of England. During the Battle of Shrewsbury, an arrowhead became embedded in his cheekbone. The extraction process was long and painful, resulting in a gaping hole. Knowing the dangers of an open wound, the royal surgeon John Bradmore treated the cavity with a honey mixture that kept it safe from dirt and bacteria.

Despite Bradmore’s success, honey was relegated to folk-remedy status until World War I, when medical shortages encouraged Russian doctors to use honey in wound treatments. It was upstaged soon after by the discovery of penicillin in 1928, but today its time has come again.

A 2021 study in the medical journal BMJ found honey to be a cheap and effective treatment for the symptoms of upper respiratory tract infections. Scientists are exploring its potential uses in fighting cancer, diabetes, asthma and cardiovascular disease.

To save ourselves, however, first we must save the bees.

Historically Speaking: Fakers, Con Men and Pretenders to the Throne

George Santos is far from the first public figure to have assumed an identity later discovered to be rife with fictions

The Wall Street Journal

January 27, 2023

Few would have thought it possible in the age of the internet, and yet U.S. Rep. George Santos turns out to have invented a long list of details of the life story he told as a candidate.

It was much easier to be an impostor in the ancient world, when travel was difficult, communications slow, and few people even knew what their rulers looked like. One of history’s oldest and strangest examples depended on this ignorance.

According to the Greek historian Herodotus, King Cambyses II of Persia became so jealous of his younger brother Bardiya, also known in history as Smerdis, that he ordered his assassination in 522 B.C. Imagine Cambyses’s shock, therefore, when news came to him while he was away fighting in Egypt that a man calling himself Smerdis had seized the throne.


Cambyses died before he could confront the impostor. But Prince Darius, a former ally of Cambyses, suspected that the new king was actually a Magian priest named Gaumata. Herodotus relates that Darius knew a crucial fact about Gaumata: his ears had been cut off.

Royal etiquette kept the king at a distance from everyone—even the queen lived and slept in separate quarters. But Darius persuaded her to find an opportunity to be in the king’s apartment when he was asleep and check his ears. They were missing! Darius promptly denounced this Smerdis as an impostor, proclaimed himself the savior of the kingdom and took the throne. Modern historians suspect that the real impostor was Darius and that he invented the Gaumata story to justify his coup against Smerdis.

The Middle Ages were a boom time for royal impostors. Kings and crown princes were often assassinated or executed under conditions that could plausibly be spun into tales of miraculous escape. Some of those impersonating long-lost monarchs managed to get quite far. King Henry VII of England spent eight years fighting a rebellion led by Perkin Warbeck, who claimed to be one of the two princes held in the Tower of London and killed by King Richard III in the 1480s.

During the Renaissance, the most famous case in Europe involved neither a king nor the heir to a great fortune but a French peasant named Martin Guerre. In 1548, the feckless Guerre abandoned his wife, Bertrande, and their son. Eight years later, a look-alike named Arnaud du Thil suddenly appeared, claiming to be Guerre. He settled down and became a good member of the community—too good for those who had known the old Guerre. But Bertrande insisted he was the same man. The deception unraveled in 1560 when the real Martin Guerre made a sensational return in the middle of a court trial to decide du Thil’s identity. Du Thil was executed for impersonation, but the judge declared Bertrande innocent of adultery on the grounds that women are known to be very silly creatures and easily deceived.

Opportunities to claim titles and thrones diminished after the 18th century, and a new class of impostor arose: the confidence man. Mr. Santos isn’t yet a match for the Czech-American fraudster “Count” Victor Lustig. In 1925, posing as a French government official, Lustig successfully auctioned off the Eiffel Tower. In Chicago the following year, he posed as an investor and swindled Al Capone out of $5,000.

Lustig’s long list of crimes eventually landed him in Alcatraz. Where Mr. Santos is heading is a mystery—much like where he’s been.

Historically Speaking: The Long, Dark Shadow of the Real ‘Bedlam’

Today’s debate over compulsory treatment for the mentally ill has roots in a history of good intentions gone awry

The Wall Street Journal

January 12, 2023

This year, California and New York City will roll out plans to force the homeless mentally ill to receive hospital treatment. The initiatives face fierce legal challenges despite their backers’ good intentions and promised extra funds.

Opposition to compulsory hospitalization has its roots in the historic maltreatment of mental patients. For centuries, the biggest problem regarding the care of the mentally ill was the lack of it. Until the 18th century, Britain was typical in having only one public insane asylum, Bethlehem Royal Hospital. The conditions were so notorious, even by contemporary standards, that the hospital’s nickname, Bedlam, became synonymous with violent anarchy.

Plate 8 of William Hogarth’s ‘A Rake’s Progress,’ titled ‘In The Madhouse,’ was painted around 1735 and depicts the hospital known as ‘Bedlam.’

Treatment at Bedlam consisted of pacifying the patients through pain and terror, and its cost was offset by viewing fees. Anyone could pay to stare or laugh at the inmates, and thousands did. But social attitudes toward mental illness were changing. By the end of the 18th century, psychiatric reformers such as Benjamin Rush in America and Philippe Pinel in France had demonstrated the efficacy of more humane treatment.

In a burst of optimism, New York Hospital created a ward for the “curable” insane in 1792. The Quaker-run “Asylum for the Relief of Persons Deprived of the Use of Their Reason” in Pennsylvania became the first dedicated mental hospital in the U.S. in 1813. By the 1830s there were at least a dozen private mental hospitals in America.

The public authorities, however, were still shutting the mentally ill in prisons, as the social reformer Dorothea Dix was appalled to discover in 1841. Dix’s energetic campaigning bore fruit in New Jersey, which soon built its first public asylum. Designed by Thomas Kirkbride to provide state-of-the-art care amid pleasant surroundings, Trenton State Hospital served as a model for more than 70 purpose-built asylums that sprang up across the nation after Congress approved government funding for them in 1860.

Unfortunately, the philanthropic impetus driving the public mental hospital movement created as many problems as it solved. Abuse became rampant. It was so easy to have a person committed that in the 1870s the future President Grover Cleveland, then still an aspiring politician, successfully silenced the mother of his illegitimate son by having her spirited away to an asylum.

In 1887, the journalist Nellie Bly went undercover as a patient in the Women’s Lunatic Asylum on Blackwell’s Island (now Roosevelt Island), New York. She exposed both the brutal practices of that institution and the general lack of legal safeguards against unwarranted incarceration.


During the first half of the 20th century, the best-run public mental hospitals lived up to the ideals that had inspired them. But the worst seemed to confirm fears that patients on the receiving end of state benevolence lost all basic rights. At Trenton State Hospital between 1907 and 1930, the director Henry Cotton performed thousands of invasive surgeries in the mistaken belief that removing patients’ teeth or organs would cure their mental illnesses. He ended up killing almost a third of those he treated and leaving the rest damaged and disfigured. The public uproar was immense. And yet, just a decade later, some mental hospitals were performing lobotomies on patients with or without consent.

In 1975 the ACLU persuaded the Supreme Court that the mentally ill had the right to refuse hospitalization, making public mental-health care mostly voluntary. But while legal principles are black and white, mental illness comes in shades of gray: A half century later, up to a third of people living on the streets are estimated to be mentally ill. As victories go, the Supreme Court decision was also a tragedy.

Historically Speaking: When Porcelain Wares Were ‘White Gold’

The fine china we set out for the holidays was once a mysterious imported substance that European alchemists struggled to recreate

The Wall Street Journal

December 22, 2022

It is that time of year again, when the table is laden, the candles are lit, and the good china comes out of the cupboard. The rest of the time it sits gathering dust, which is silly because almost all modern china, even the most expensive kind, is now dishwasher safe. In any case, porcelain is stronger than it appears. The first Western imports of Chinese porcelain were able to survive journeys that had taken many months and covered thousands of miles.

Europe’s love affair with china began in 1499 when the Portuguese explorer Vasco da Gama brought home a dozen pieces for his patron, Dom Manuel I. The king of Portugal fell in love with this strange ceramic that had the strength of ordinary pottery but the luster and translucency of a seashell. That shell-like quality inspired Marco Polo in the 13th century to dub the material “porcellana,” the Italian slang for cowrie shell. Dom Manuel was the first monarch to commission a set of china from China. When it arrived in 1520, at least one piece had been painted with his coat of arms upside down.


As demand grew, Chinese artists in Jingdezhen, eastern China, where the porcelain factories had been concentrated since the 10th century, became more expert at catering to European tastes. Centuries earlier, the Chinese had developed blue-and-white porcelain in response to Middle Eastern taste and demand. The Ming emperors fell in love with it, and the Chinese appropriated the Arabesque designs but added peonies, dragons, herons and phoenixes to the repertoire.

Chinese porcelain was called “white gold” and priced accordingly. Try as they might, would-be European competitors couldn’t replicate its formula, even once they knew that kaolin clay, which China has in abundance, was an essential ingredient. Alchemists working for Frederick the Great, the king of Prussia, made 30,000 failed attempts.

Augustus II, Elector of Saxony, forced a particularly talented young alchemist named Johann Friedrich Bottger to labor under house arrest until he devised what is generally thought to be the approximate European recipe in 1708. Two years later, Augustus founded the famous Meissen factory, near Dresden. Within three decades, industrial espionage on a rampant scale had enabled rival china factories to proliferate across Europe.

The power of porcelain went beyond its beauty. Gifts of exquisite china sets were a popular tool of royal diplomacy. The British potter Josiah Wedgwood made an anti-slavery medallion in 1787, featuring an African man in chains with the words “Am I not a man and a brother?” on top. The image became an international symbol of protest, appearing on everything from plates to cuff links.

In the U.S., presidents would come to be associated with specific china patterns. George Washington initiated this tradition by purchasing from China a 302-piece dinner service decorated with the emblem of the Society of Cincinnati, founded by French and American officers who had fought together in the Revolutionary War.

Imported china dominated the U.S. fine china market until the late 19th century, when homegrown rivals Willard Pickard in Wisconsin and Walter Scott Lenox in New Jersey appeared on the scene. In 1918 President Woodrow Wilson commissioned Lenox China to manufacture the first set of American-made china for the White House. Pickard made the latest, for the Obama White House, in 2015. Some traditions are worth saving.

Historically Speaking: You Might Not Want to Win a Roman Lottery

Humans have long liked to draw lots as a way to win fortunes and settle fates

The Wall Street Journal

November 25, 2022

Someone in California won this month’s $2.04 billion Powerball lottery—the largest in U.S. history. The odds are staggering. The likelihood of death by plane crash (often estimated at 1 in 11 million for the average American) is greater than that of winning the Powerball or Mega Millions lottery (1 in roughly 300 million). Despite this, just under 50% of American adults bought a lottery ticket last year.

What drives people to risk their luck playing the lottery is more than just lousy math. Lotteries tap into a deep need among humans to find meaning in random events. Many ancient societies, from the Chinese to the Hebrews, practiced cleromancy, or the casting of lots to enable divine will to express itself. It is stated in the Bible’s Book of Proverbs: “The lot is cast into the lap, but its every decision is from the Lord.”

The ancient Greeks were among the first to use lotteries to ensure impartiality for non-religious purposes. The Athenians relied on a special device called a “kleroterion” for selecting jurors and public officials at random, to avoid unfair interference. The Romans had a more terrible use for drawing lots: A kind of collective military punishment known as “decimation” required a disgraced legion to select 1 out of every 10 soldiers at random and execute them. The last known use of the practice was in World War I by French and possibly Italian commanders.

The Romans also found less bloody uses for lotteries, including as a source of state revenue. Emperor Nero was likely the first ruler to use a raffle system as a means of filling the treasury without raising taxes.


Following the fall of Rome, lotteries found other uses in the West—for example, as a means of allocating market stalls. But state lotteries only returned to Europe after 1441, when the city of Bruges successfully experimented with one as a means to finance its community projects. These fundraisers didn’t always work, however. A lack of faith in the English authorities severely dampened ticket sales for Queen Elizabeth I’s first (and last) National Lottery in 1567.

When they did work, the bonanzas could be significant: In the New World, lotteries helped to pay for the first years of the Jamestown Colony, as well as Harvard, Yale, Princeton, Columbia and many other institutions. And in France in 1729, the philosopher Voltaire got very rich by winning the national lottery, which was meant to sell bonds by making each bond a ticket for a jackpot drawing. He did it by unsavory means: Voltaire was part of a consortium of schemers who took advantage of a flaw in the lottery’s design by buying up enormous numbers of very cheap bonds.

Corruption scandals and failures eventually took their toll. Critics such as the French novelist Honoré de Balzac, who called lotteries the “opium of poverty,” denounced them for exploiting the poor. Starting in the late 1820s, a raft of anti-lottery laws were enacted on both sides of the Atlantic. Debates continued about them even where they remained legal. The Russian novelist Anton Chekhov highlighted their debilitating effects in his 1887 short story “The Lottery Ticket,” about a contented couple who are tormented and finally turned into raging malcontents by the mere possibility of winning.

New Hampshire was the first American state to roll back the ban on lotteries in 1964. Since then, state lotteries have proven to be neither the disaster nor the cure-all predicted. As for the five holdouts—Alabama, Alaska, Hawaii, Nevada and Utah still have no state lotteries—they are doing just fine.

The Sunday Times: Kate’s in touch with American over-40s but Meghan is where the money is

As the Prince and Princess of Wales head stateside for their first US tour in eight years, Amanda Foreman assesses the British monarchy’s popularity across the Atlantic

The Sunday Times

November 26, 2022

Two royal events dominated the American headlines in 1981. The first was the great “curtsy scandal” in April, when the White House chief of protocol, Leonore Annenberg, was photographed curtsying to Prince Charles during his brief visit to the US. The idea of an American bending the knee to royalty — and English royalty no less — sent people into a frenzy of righteous indignation. There were editorials, letters, television debates, and seemingly endless outrage at the insult to American republican values. The State Department was forced to make a statement, and Annenberg never lived it down. Three months later, more than 14 million US households either stayed up all night or got up before dawn to watch Charles and Diana’s wedding.

As the French like to say, “plus ça change, plus c’est la même chose”, the more things change, the more they stay the same. Four decades later, Americans remain viscerally tied to, and eternally conflicted about, the royal family, only now they have been joined by two players from the other side who are pumping as much oxygen as possible to keep this psychodrama alive. The Duke and Duchess of Sussex’s move to California has made the volume louder, the talk trashier, and the stakes for the royal family higher than ever.

Their two-hour chat with Oprah Winfrey turned a 70-year reign of selfless dedication and duty into a mere sideshow compared with the vital question of who said what to whom. Imagine what the Sussexes’ documentary series and Harry’s memoir will do. With William and Kate, the new Prince and Princess of Wales, due to arrive in the US on Wednesday for a three-day trip — their first overseas visit since Queen Elizabeth’s death — there is going to be a royal smackdown of sorts that will allow a direct, peer-to-peer comparison between the working Windsors and the working-it Windsors.

If you believe the British press, the Sussexes are wearing out their welcome in the US. Their popularity is slipping down the scale, just like each episode of Meghan Markle’s Archetypes podcast on the Spotify rankings (down 76 places last week). But if that were actually true, the Sussexes would not be attending one of the most expensive charity galas in New York on December 6 to accept an award for their humanitarian work. The Robert F Kennedy Human Rights foundation, named after the younger brother of President John F Kennedy, is giving the couple the Ripple of Hope Award in recognition of their “heroic” fight against the royal family’s “structural racism”. Everyone knows that charity galas select their recipients with both ears and eyes focused on getting bums on seats. It’s the oldest game in town. Some charities even demand that they commit to buying a certain number of tables. There happen to be four other “name brand” recipients at the event, including the Ukrainian president Volodymyr Zelensky, but they are being treated like the proverbial chopped liver compared with the excitement over having Harry and Meghan come to the East Coast.

While the younger branches of two globally famous families make common cause in New York, that same week in Boston the John F Kennedy Library Foundation and the Royal Foundation will jointly award five £1 million Earthshot prizes to innovators in environmentalism. The Prince and Princess of Wales will be in attendance. Only 190 miles as the crow flies will separate the rival Windsor-Kennedys, but it could be a million or a trillion as far as their supporters are concerned. On paper, there is no question which event will be the more important or carry more gravitas. By every possible metric, from the historic to the meaningful, the Waleses come out ahead. The Earthshot Prize has the potential to save the planet. Its prizewinners are anonymous worker bees, not big shots who have made it to the top. But, to put it crudely, William and Kate are trading in yesterday’s currency. Leaving aside sheer sartorial glamour, where Kate is unmatched, the Waleses offer the world a fixed basket of virtues: duty, probity, discipline, decency, discretion, loyalty, and commitment. It is a worthy one, to be sure, and also totally — fatally — in step with the values of the over-40 crowd: Baby Boomers, Generation X, and some millennials. But the Duke and Duchess of Sussex are dealers in today’s currency: self-actualisation, self-healing, self-identity, self-care, self-expression, self-confidence, and self-love.

It is a duel representative of a cultural and generational divide in America. The Waleses and the Sussexes carry extraordinary weight — but it’s with their own constituencies rather than each other’s. The US broadsheets and magazines that cater to readers who were alive before the internet have been questioning for some time now whether the luxurious lifestyles of Harry and Meghan are out of kilter with their narrative of victimhood.

In August, New York Magazine published a 6,000-word interview with the duchess, rather cheekily entitled “Meghan of Montecito”, in which she was given the opportunity to talk about herself without interruption. It was a risky decision to go beyond her natural fanbase on TV and social media. Unlike in the infamous Oprah interview, the reporter kept a modicum of distance from her subject. One choice line read: “She has been media-trained and then royal media-trained and sometimes converses like she has a tiny Bachelor producer in her brain directing what she says.”

The medium was a poor fit for Meghan, and the message that came across was less than flattering: she’s a piece of work; he’s out of his depth. It made no difference; the podcast debuted at No 1 anyway.

The best way to understand the Sussexes’ relationship with ordinary Americans is to put it in the same context as Gwyneth Paltrow and her Goop lifestyle company. Goop was valued at $250 million in 2018, having started out as a newsletter in 2008. Experts routinely criticise the company for peddling expensive tat to credulous consumers who are chasing after something called “wellness”. Her supporters could not care less because they don’t need the products to be real, they just need them to be emotionally satisfying at the point of sale.

Harry and Meghan are selling something similar; call it “me-spiration”. It’s not a philosophy so much as an ego massage and it’s a pure money-maker. Netflix and Random House each have millions of dollars invested in its success. Random House is sitting pretty. Books dishing the dirt on the royal family always sell, no matter what. Netflix has more of a challenge on its hands. But an Oscar-winning screenwriter who spoke to me on the condition of anonymity revealed that the streaming company regards them in the same way it does the Obamas, who also have their own producing deal. The Sussexes must succeed because they are too expensive to fail. In any case, they have a star power that goes beyond the normal considerations of content or success because their brand is cross-collateralised with The Crown, still one of Netflix’s flagship series, which regularly tops the streamer’s viewing figures in the US.

With two media juggernauts each working overtime to protect their assets, and a formula that milks two of the biggest obsessions in America today, royalty and identity, the Sussexes are assured of their place in the social mediaverse. There will be no shortage of interviews, humanitarian awards, and self-generated documentaries in their future. The war between the Waleses and the Sussexes was over before it started.

Historically Speaking: Modern Dentistry’s Painful Past

Just be thankful that your teeth aren’t drilled with a flint or numbed with cocaine

The Wall Street Journal

November 3, 2022

Since the start of the pandemic, a number of studies have uncovered a surprising link: The presence of gum disease, the first sign often being bloody gums when brushing, can make a patient with Covid three times more likely to be admitted to the ICU with complications. Putting off that visit to the dental hygienist may not be such a good idea just now.

Many of us have unpleasant memories of visits to the dentist, but caring for our teeth has come a very long way over the millennia. What our ancestors endured is incredible. In 2006 an Italian-led team of evolutionary anthropologists working in Balochistan, in southwestern Pakistan, found several 9,000-year-old skeletons whose infected teeth had been drilled using a pointed flint tool. Attempts at re-creating the process found that the patients would have had to sit still for up to a minute. Poppies grow in the region, so there may have been some local expertise in using them for anesthesia.

Acupuncture may have been used to treat tooth pain in ancient China, but the chronology remains uncertain. In ancient Egypt, dentistry was considered a distinct medical skill, but the Egyptians still had some of the worst teeth in the ancient world, mainly from chewing on food adulterated with grit and sand. They were averse to dental surgery, relying instead on topical pain remedies such as amulets, mouthwashes and even pastes made from dead mice. Faster relief could be obtained in Rome, where tooth extraction—and dentures—were widely available.


In Europe during the Middle Ages, dentistry fell under the purview of barbers. They could pull teeth but little else. The misery of mouth pain continued unabated until the early 18th century, when the French physician Pierre Fauchard became the first doctor to specialize in teeth. A rigorous scientist, Fauchard helped to found modern dentistry by recording his innovative methods and discoveries in a two-volume work, “The Surgeon Dentist.”

Fauchard elevated dentistry to a serious profession on both sides of the Atlantic. Before he became famous for his midnight ride, Paul Revere earned a respectable living making dentures. George Washington’s false teeth were actually a marvel of colonial-era technology, combining human teeth with elephant and walrus ivory to create a realistic look.

During the 1800s, the U.S. led the world in dental medicine, not only as an academic discipline but in the standardization of the practice, from the use of automatic drills to the dentist’s chair.

Perhaps the biggest breakthrough was the invention of local anesthetic in 1884. In July of that year, Sigmund Freud published a paper in Vienna on the potential uses of cocaine. Four months later in America, Richard Hall and William S. Halsted, whose pioneering work included the first radical mastectomy for breast cancer, decided to test cocaine’s numbing properties by injecting it into their dental nerves. Hall had an infected incisor filled without feeling a thing.

For patients, the experiment was a miracle. For Hall and Halsted, it was a disaster; both became cocaine addicts. Fortunately, dental surgery would be made safe by the invention of non-habit-forming Novocain 20 years later.

With orthodontics, veneers, implants and teeth whiteners, dentists can give anyone a beautiful smile nowadays. But the main thing is that oral care doesn’t have to hurt, and it could just save your life. So make that appointment.

Historically Speaking: The Fungus That Fed Gods And Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty, 25-220 AD, was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.


There is no hiding the fact that mushrooms have a dark side, however. Fewer than 100 species are actually poisonous out of the thousands of varieties that have been identified. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another victim of the death cap mushroom, it has been speculated, was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII’s attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, the Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: A Pocket-Sized Dilemma for Women

Unlike men’s clothes, female fashion has been indifferent for centuries to creating ways for women to stash things in their garments

The Wall Street Journal

September 29, 2022

The current round of Fashion Weeks started in New York on Sept. 9 and will end in Paris on Oct. 4, with London and Milan slotted in between. Amid the usual impractical and unwearable outfits on stage, some designers went their own way and featured—gasp—women’s wear with large pockets.

The anti-pocket prejudice in women’s clothing runs deep. In 1954, the French designer Christian Dior stated: “Men have pockets to keep things in, women for decoration.” Designers seem to think that their idea of how a woman should look outweighs what she needs from her clothes. That mentality probably explains why a 2018 survey found that 100% of the pockets in men’s jeans were large enough to fit a midsize cellphone, but only 40% of women’s jeans pockets measured up.

The pocket is an ancient idea, initially designed as a pouch that was tied or sewn to a belt beneath a layer of clothing. Otzi, the 5,300-year-old ice mummy I wrote about recently for having the world’s oldest known tattoos, also wore an early version of a pocket; it contained his fire-starting tools.


The ancient concept was so practical that the same basic design was still in use during the medieval era. Attempts to find other storage solutions usually came up short. In the 16th century a man’s codpiece sometimes served as an alternative holdall, despite the awkwardness of having to fish around your crotch to find things. Its fall from favor at the end of the 1600s coincided with the first in-seam pockets for men.

It was at this stage that the pocket divided into “his” and “hers” styles. Women retained the tie-on version; the fashion for wide dresses allowed plenty of room to hang a pouch underneath the layers of petticoats. But it was also impractical since reaching a pocket required lifting the layers up.

Moralists looked askance at women’s pockets, which seemed to defy male oversight and could potentially be a hiding place for love letters, money and makeup. On the other hand, in the 17th century a maidservant was able to thwart the unwelcome advances of the diarist Samuel Pepys by grabbing a pin from her pocket and threatening to stab him with it, according to his own account.

Matters looked up for women in the 18th century with the inclusion of side slits on dresses that allowed them direct access to their pockets. Newspapers began to carry advertisements for articles made especially to be carried in them. Sewing kits and snuff boxes were popular items, as were miniature “conversation cards” containing witty remarks “to create mirth in mixed companies.”

Increasingly, though, the essential difference between men’s and women’s pockets—his being accessible and almost anywhere, hers being hidden and nestled near her groin—gave them symbolism. Among the macabre acts committed by the Victorian serial killer Jack the Ripper was the disemboweling of his victims’ pockets, which he left splayed open next to their bodies.

Women had been agitating for more practical dress styles since the formation of the Rational Dress Society in Britain in 1881, but it took the upheavals caused by World War I for real change to happen. Women’s pantsuits started to appear in the 1920s. First lady Eleanor Roosevelt caused a sensation by appearing in one in 1933. The real revolution began in 1934, with the introduction of Levi’s bluejeans for women, 61 years after the originals for men. The women’s front pocket was born. And one day, with luck, it will grow up to be the same size as men’s.