Historically Speaking: Fakers, Con Men and Pretenders to the Throne

George Santos is far from the first public figure to have assumed an identity later discovered to be rife with fictions

The Wall Street Journal

January 27, 2023

Few would have thought it possible in the age of the internet, and yet U.S. Rep. George Santos turns out to have invented a long list of details in the life story he told as a candidate.

It was much easier to be an impostor in the ancient world, when travel was difficult, communications slow, and few people even knew what their rulers looked like. One of history’s oldest and strangest examples depended on this ignorance.

According to the Greek historian Herodotus, King Cambyses II of Persia became so jealous of his younger brother Bardiya, also known in history as Smerdis, that he ordered his assassination in 522 B.C. Imagine Cambyses’s shock, therefore, when news came to him while he was away fighting in Egypt that a man calling himself Smerdis had seized the throne.

ILLUSTRATION: THOMAS FUCHS

Cambyses died before he could confront the impostor. But Prince Darius, a former ally of Cambyses, suspected that the new king was actually a Magian priest named Gaumata. Herodotus relates that Darius knew a crucial fact about Gaumata: his ears had been cut off.

Royal etiquette kept the king at a distance from everyone—even the queen lived and slept in separate quarters. But Darius persuaded her to find an opportunity to be in the king’s apartment when he was asleep and check his ears. They were missing! Darius promptly denounced this Smerdis as an impostor, proclaimed himself the savior of the kingdom and took the throne. Modern historians suspect that the real impostor was Darius and that he invented the Gaumata story to justify his coup against Smerdis.

The Middle Ages were a boom time for royal impostors. Kings and crown princes were often assassinated or executed under conditions that could plausibly be spun into tales of miraculous escape. Some of those impersonating long-lost monarchs managed to get quite far. King Henry VII of England spent eight years fighting a rebellion led by Perkin Warbeck, who claimed to be one of the two princes held in the Tower of London and killed by King Richard III in the 1480s.

During the Renaissance, the most famous case in Europe involved neither a king nor the heir to a great fortune but a French peasant named Martin Guerre. In 1548, the feckless Guerre abandoned his wife, Bertrande, and their son. Eight years later, a look-alike named Arnaud du Thil suddenly appeared, claiming to be Guerre. He settled down and became a good member of the community—too good for those who had known the old Guerre. But Bertrande insisted he was the same man. The deception unraveled in 1560 when the real Martin Guerre made a sensational return in the middle of a court trial to decide du Thil’s identity. Du Thil was executed for impersonation, but the judge declared Bertrande innocent of adultery on the grounds that women are known to be very silly creatures and easily deceived.

Opportunities to claim titles and thrones diminished after the 18th century, and a new class of impostor arose: the confidence man. Mr. Santos isn’t yet a match for the Czech-American fraudster, “Count” Victor Lustig. In 1925, posing as a French government official, Lustig successfully auctioned off the Eiffel Tower. In Chicago the following year, he posed as an investor and swindled Al Capone out of $5,000.

Lustig’s long list of crimes eventually landed him in Alcatraz. Where Mr. Santos is heading is a mystery—much like where he’s been.

Historically Speaking: The Long, Dark Shadow of the Real ‘Bedlam’

Today’s debate over compulsory treatment for the mentally ill has roots in a history of good intentions gone awry

The Wall Street Journal

January 12, 2023

This year, California and New York City will roll out plans to force the homeless mentally ill to receive hospital treatment. The initiatives face fierce legal challenges despite their backers’ good intentions and promised extra funds.

Opposition to compulsory hospitalization has its roots in the historic maltreatment of mental patients. For centuries, the biggest problem regarding the care of the mentally ill was the lack of it. Until the 18th century, Britain was typical in having only one public insane asylum, Bethlehem Royal Hospital. The conditions were so notorious, even by contemporary standards, that the hospital’s nickname, Bedlam, became synonymous with violent anarchy.

Plate 8 of William Hogarth’s ‘A Rake’s Progress,’ titled ‘In The Madhouse,’ was painted around 1735 and depicted the hospital known as ‘Bedlam.’
PHOTO: HERITAGE IMAGES VIA GETTY IMAGES

The cost of treatment at Bedlam, which consisted of pacifying the patients through pain and terror, was offset by viewing fees. Anyone could pay to stare or laugh at the inmates, and thousands did. But social attitudes toward mental illness were changing. By the end of the 18th century, psychiatric reformers such as Benjamin Rush in America and Philippe Pinel in France had demonstrated the efficacy of more humane treatment.

In a burst of optimism, New York Hospital created a ward for the “curable” insane in 1792. The Quaker-run “Asylum for the Relief of Persons Deprived of the Use of Their Reason” in Pennsylvania became the first dedicated mental hospital in the U.S. in 1813. By the 1830s there were at least a dozen private mental hospitals in America.

The public authorities, however, were still shutting the mentally ill in prisons, as the social reformer Dorothea Dix was appalled to discover in 1841. Dix’s energetic campaigning bore fruit in New Jersey, which soon built its first public asylum. Designed by Thomas Kirkbride to provide state-of-the-art care amid pleasant surroundings, Trenton State Hospital served as a model for more than 70 purpose-built asylums that sprang up across the nation after Congress approved government funding for them in 1860.

Unfortunately, the philanthropic impetus driving the public mental hospital movement created as many problems as it solved. Abuse became rampant. It was so easy to have a person committed that in the 1870s, President Grover Cleveland, while still an aspiring politician, successfully silenced the mother of his illegitimate son by having her spirited away to an asylum.

In 1887, the journalist Nellie Bly went undercover as a patient in the Women’s Lunatic Asylum on Blackwell’s Island (now Roosevelt Island), New York. She exposed both the brutal practices of that institution and the general lack of legal safeguards against unwarranted incarceration.

The social reformer Dorothea Dix advocated for public mental health care.

During the first half of the 20th century, the best-run public mental hospitals lived up to the ideals that had inspired them. But the worst seemed to confirm fears that patients on the receiving end of state benevolence lost all basic rights. At Trenton State Hospital between 1907 and 1930, the director Henry Cotton performed thousands of invasive surgeries in the mistaken belief that removing patients’ teeth or organs would cure their mental illnesses. He ended up killing almost a third of those he treated and leaving the rest damaged and disfigured. The public uproar was immense. And yet, just a decade later, some mental hospitals were performing lobotomies on patients with or without consent.

In 1975 the ACLU persuaded the Supreme Court that the mentally ill had the right to refuse hospitalization, making public mental-health care mostly voluntary. But while legal principles are black and white, mental illness comes in shades of gray: A half century later, up to a third of people living on the streets are estimated to be mentally ill. As victories go, the Supreme Court decision was also a tragedy.

Historically Speaking: When Porcelain Wares Were ‘White Gold’

The fine china we set out for the holidays was once a mysterious imported substance that European alchemists struggled to recreate

The Wall Street Journal

December 22, 2022

It is that time of year again, when the table is laden, the candles are lit, and the good china comes out of the cupboard. The rest of the time it sits gathering dust, which is silly because almost all modern china, even the most expensive kind, is now dishwasher safe. In any case, porcelain is stronger than it appears. The first Western imports of Chinese porcelain were able to survive journeys that had taken many months and covered thousands of miles.

Europe’s love affair with china began in 1499 when the Portuguese explorer Vasco da Gama brought home a dozen pieces for his patron, Dom Manuel I. The king of Portugal fell in love with this strange ceramic that had the strength of ordinary pottery but the luster and translucency of a seashell. That shell-like quality inspired Marco Polo in the 13th century to dub the material “porcellana,” the Italian slang for cowrie shell. Dom Manuel was the first monarch to commission a set of china from China. When it arrived in 1520, at least one piece had been painted with his coat of arms upside down.

ILLUSTRATION: THOMAS FUCHS

As demand grew, Chinese artists in Jingdezhen, eastern China, where the porcelain factories had been concentrated since the 10th century, became more expert at catering to European tastes. Centuries earlier, the Chinese had developed blue-and-white porcelain in response to Middle Eastern taste and demand. The Ming emperors fell in love with it, and the Chinese appropriated the Arabesque designs but added peonies, dragons, herons and phoenixes to the repertoire.

Chinese porcelain was called “white gold” and priced accordingly. Try as they might, would-be European competitors couldn’t replicate its formula, even once they knew that kaolin clay, which China has in abundance, was an essential ingredient. Alchemists working for Frederick the Great, the king of Prussia, made 30,000 failed attempts.

Augustus II, Elector of Saxony, forced a particularly talented young alchemist named Johann Friedrich Böttger to labor under house arrest until he devised what is generally thought to be the approximate European recipe in 1708. Two years later, Augustus founded the famous Meissen factory, near Dresden. Within three decades, industrial espionage on a rampant scale had enabled rival china factories to proliferate across Europe.

The power of porcelain went beyond its beauty. Gifts of exquisite china sets were a popular tool of royal diplomacy. The British potter Josiah Wedgwood made an anti-slavery medallion in 1787, featuring an African man in chains with the words “Am I not a man and a brother?” on top. The image became an international symbol of protest, appearing on everything from plates to cuff links.

In the U.S., presidents would come to be associated with specific china patterns. George Washington initiated this tradition by purchasing from China a 302-piece dinner service decorated with the emblem of the Society of the Cincinnati, founded by French and American officers who had fought together in the Revolutionary War.

Foreign exports dominated the U.S. fine china market until the late 19th century, when homegrown rivals Willard Pickard in Wisconsin and William Scott Lenox in New Jersey appeared on the scene. In 1918 President Woodrow Wilson commissioned Lenox China to manufacture the first set of American-made china for the White House. Pickard made the latest, for the Obama White House, in 2015. Some traditions are worth saving.

Historically Speaking: You Might Not Want to Win a Roman Lottery

Humans have long liked to draw lots as a way to win fortunes and settle fates

The Wall Street Journal

November 25, 2022

Someone in California won this month’s $2.04 billion Powerball lottery—the largest in U.S. history. The odds are staggering. The likelihood of death by plane crash (often estimated at 1 in 11 million for the average American) is greater than that of winning the Powerball or Mega Millions lottery (1 in roughly 300 million). Despite this, just under 50% of American adults bought a lottery ticket last year.

What drives people to risk their luck playing the lottery is more than just lousy math. Lotteries tap into a deep need among humans to find meaning in random events. Many ancient societies, from the Chinese to the Hebrews, practiced cleromancy, or the casting of lots to enable divine will to express itself. It is stated in the Bible’s Book of Proverbs: “The lot is cast into the lap, but its every decision is from the Lord.”

The ancient Greeks were among the first to use lotteries to ensure impartiality for non-religious purposes. The Athenians relied on a special device called a “kleroterion” for selecting jurors and public officials at random, to avoid unfair interference. The Romans had a more terrible use for drawing lots: A kind of collective military punishment known as “decimation” required a disgraced legion to select 1 out of every 10 soldiers at random and execute them. The last known use of the practice was in World War I by French and possibly Italian commanders.

The Romans also found less bloody uses for lotteries, including as a source of state revenue. Emperor Nero was likely the first ruler to use a raffle system as a means of filling the treasury without raising taxes.

ILLUSTRATION: THOMAS FUCHS

Following the fall of Rome, lotteries found other uses in the West—for example, as a means of allocating market stalls. But state lotteries only returned to Europe after 1441, when the city of Bruges successfully experimented with one as a means to finance its community projects. These fundraisers didn’t always work, however. A lack of faith in the English authorities severely dampened ticket sales for Queen Elizabeth I’s first (and last) National Lottery in 1567.

When they did work, the bonanzas could be significant: In the New World, lotteries helped to pay for the first years of the Jamestown Colony, as well as Harvard, Yale, Princeton, Columbia and many other institutions. And in France in 1729, the philosopher Voltaire got very rich by winning the national lottery, which was meant to sell bonds by making each bond a ticket for a jackpot drawing. He did it by unsavory means: Voltaire was part of a consortium of schemers who took advantage of a flaw in the lottery’s design by buying up enormous numbers of very cheap bonds.

Corruption scandals and failures eventually took their toll. Critics such as the French novelist Honoré de Balzac, who called lotteries the “opium of poverty,” denounced them for exploiting the poor. Starting in the late 1820s, a raft of anti-lottery laws were enacted on both sides of the Atlantic. Debates continued about them even where they remained legal. The Russian novelist Anton Chekhov highlighted their debilitating effects in his 1887 short story “The Lottery Ticket,” about a contented couple who are tormented and finally turned into raging malcontents by the mere possibility of winning.

New Hampshire was the first American state to roll back the ban on lotteries in 1964. Since then, state lotteries have proven to be neither the disaster nor the cure-all predicted. As for the five holdouts—Alabama, Alaska, Hawaii, Nevada and Utah—they are doing just fine without one.

The Sunday Times: Kate’s in touch with American over-40s but Meghan is where the money is

As the Prince and Princess of Wales head stateside for their first US tour in eight years, Amanda Foreman assesses the British monarchy’s popularity across the Atlantic

The Sunday Times

November 26, 2022

Two royal events dominated the American headlines in 1981. The first was the great “curtsy scandal” in April, when the White House chief of protocol, Leonore Annenberg, was photographed curtsying to Prince Charles during his brief visit to the US. The idea of an American bending the knee to royalty — and English royalty no less — sent people into a frenzy of righteous indignation. There were editorials, letters, television debates, and seemingly endless outrage at the insult to American republican values. The State Department was forced to make a statement, and Annenberg never lived it down. Three months later, more than 14 million US households either stayed up all night or got up before dawn to watch Charles and Diana’s wedding.

As the French like to say, “plus ça change, plus c’est la même chose”, the more things change, the more they stay the same. Four decades later, Americans are still viscerally tied to and eternally conflicted about the royal family, only now they have been joined by two players from the other side who are pumping as much oxygen as possible to keep this psychodrama alive. The Duke and Duchess of Sussex’s move to California has made the volume louder, the talk trashier, and the stakes for the royal family never higher.

Their two-hour chat with Oprah Winfrey turned a 70-year reign of selfless dedication and duty into a mere sideshow compared with the vital question of who said what to whom. Imagine what the Sussexes’ documentary series and Harry’s biography will do. With William and Kate, the new Prince and Princess of Wales, due to arrive in the US on Wednesday for a three-day trip — their first US visit since Queen Elizabeth’s death — there is going to be a royal smackdown of sorts that will allow a direct, peer-to-peer comparison between the working Windsors and the working-it Windsors.

If you believe the British press, the Sussexes are wearing out their welcome in the US. Their popularity is slipping down the scale, just like each episode of Meghan Markle’s Archetypes podcast on the Spotify rankings (down 76 places last week). But if that were actually true, the Sussexes would not be attending one of the most expensive charity galas in New York on December 6 to accept an award for their humanitarian work. The Robert F Kennedy Human Rights foundation, named after the younger brother of President John F Kennedy, is giving the couple the Ripple of Hope Award in recognition of their “heroic” fight against the royal family’s “structural racism”. Everyone knows that charity galas select their recipients with both ears and eyes focused on getting bums on seats. It’s the oldest game in town. Some charities even demand that honorees commit to buying a certain number of tables. There happen to be four other “name brand” recipients at the event, including the Ukrainian president Volodymyr Zelensky, but they are being treated like the proverbial chopped liver compared with the excitement over having Harry and Meghan come to the East Coast.

While the younger branches of two globally famous families make common cause in New York, that same week in Boston the John F Kennedy Library Foundation and the Royal Foundation will jointly award five £1 million Earthshot prizes to innovators in environmentalism. The Prince and Princess of Wales will be in attendance. Only 190 miles as the crow flies will separate the rival Windsor-Kennedys, but it could be a million or a trillion as far as their supporters are concerned. On paper, there is no question which event will be the more important or carry more gravitas. By every possible metric, from the historic to the meaningful, the Waleses come out ahead. The Earthshot prize has the potential to save the planet. Its prizewinners are anonymous worker-bees, not big shots who have made it to the top. But, to put it crudely, William and Kate are trading in yesterday’s currency. Leaving aside sheer sartorial glamour, where Kate is unmatched, the Waleses offer the world a fixed basket of virtues: duty, probity, discipline, decency, discretion, loyalty, and commitment. It is a worthy one, to be sure, and also totally — fatally — in step with the values of the over-40 crowd: Baby Boomers, Generation X, and some millennials. But the Duke and Duchess of Sussex are dealers in today’s currency: self-actualisation, self-healing, self-identity, self-care, self-expression, self-confidence, and self-love.

It is a duel representative of a cultural and generational divide in America. The Waleses and the Sussexes carry extraordinary weight — but it’s with their own constituencies rather than each other’s. The US broadsheets and magazines that cater to readers who were alive before the internet have been questioning for some time now whether the luxurious lifestyles of Harry and Meghan are out of kilter with their narrative of victimhood.

In August, New York Magazine published a 6,000-word interview with the duchess, rather cheekily entitled “Meghan of Montecito”, where she was given the opportunity to talk about herself without interruption. It was a risky decision to go beyond her natural fanbase on TV and social media. Unlike the infamous Oprah interview, the reporter kept a modicum of distance from her subject. One choice line read: “She has been media-trained and then royal media-trained and sometimes converses like she has a tiny Bachelor producer in her brain directing what she says.”

The medium was a poor fit for Meghan, and the message that came across was less than flattering: she’s a piece of work, he’s out of his depth. It made no difference: the podcast debuted at No 1 anyway.

The best way to understand the Sussexes’ relationship with ordinary Americans is to put it in the same context as Gwyneth Paltrow and her Goop lifestyle company. Goop was valued at $250 million in 2018, having started out as a newsletter in 2008. Experts routinely criticise the company for peddling expensive tat to credulous consumers who are chasing after something called “wellness”. Her supporters could not care less, because they don’t need the products to be real; they just need them to be emotionally satisfying at the point of sale.

Harry and Meghan are selling something similar, call it “me-spiration”. It’s not a philosophy so much as an ego massage and it’s a pure money-maker. Netflix and Random House each have millions of dollars invested in its success. Random House is sitting pretty. Books dishing the dirt on the royal family always sell, no matter what. Netflix has more of a challenge on its hands. But an Oscar-winning screenwriter who spoke to me on the condition of anonymity revealed that the streaming company regards them in the same way they do the Obamas, who also have their own producing deal. The Sussexes must succeed because they are too expensive to fail. In any case, they have a star power that goes beyond the normal considerations of content or success because their brand is cross-collateralised with The Crown, still one of Netflix’s flagship series, which regularly tops the streamer’s viewing figures in the US.

With two media juggernauts each working overtime to protect their assets, and a formula that milks two of the biggest obsessions in America today, royalty and identity, the Sussexes are assured their place in the social mediaverse. There will be no shortage of interviews, humanitarian awards, and self-generated documentaries in their future. The war between the Waleses and the Sussexes was over before it started.

Historically Speaking: Modern Dentistry’s Painful Past

Just be thankful that your teeth aren’t drilled with a flint or numbed with cocaine

The Wall Street Journal

November 3, 2022

Since the start of the pandemic, a number of studies have uncovered a surprising link: The presence of gum disease, the first sign often being bloody gums when brushing, can make a patient with Covid three times more likely to be admitted to the ICU with complications. Putting off that visit to the dental hygienist may not be such a good idea just now.

Many of us have unpleasant memories of visits to the dentist, but caring for our teeth has come a very long way over the millennia. What our ancestors endured is incredible. In 2006 an Italian-led team of evolutionary anthropologists working in Balochistan, in southwestern Pakistan, found several 9,000-year-old skeletons whose infected teeth had been drilled using a pointed flint tool. Attempts at re-creating the process found that the patients would have had to sit still for up to a minute. Poppies grow in the region, so there may have been some local expertise in using them for anesthesia.

Acupuncture may have been used to treat tooth pain in ancient China, but the chronology remains uncertain. In ancient Egypt, dentistry was considered a distinct medical skill, but the Egyptians still had some of the worst teeth in the ancient world, mainly from chewing on food adulterated with grit and sand. They were averse to dental surgery, relying instead on topical pain remedies such as amulets, mouthwashes and even pastes made from dead mice. Faster relief could be obtained in Rome, where tooth extraction—and dentures—were widely available.

ILLUSTRATION: THOMAS FUCHS

In Europe during the Middle Ages, dentistry fell under the purview of barbers. They could pull teeth but little else. The misery of mouth pain continued unabated until the early 18th century, when the French physician Pierre Fauchard became the first doctor to specialize in teeth. A rigorous scientist, Fauchard helped to found modern dentistry by recording his innovative methods and discoveries in a two-volume work, “The Surgeon Dentist.”

Fauchard elevated dentistry to a serious profession on both sides of the Atlantic. Before he became famous for his midnight ride, Paul Revere earned a respectable living making dentures. George Washington’s false teeth were actually a marvel of colonial-era technology, combining human teeth with elephant and walrus ivory to create a realistic look.

During the 1800s, the U.S. led the world in dental medicine, not only as an academic discipline but in the standardization of the practice, from the use of automatic drills to the dentist’s chair.

Perhaps the biggest breakthrough was the invention of local anesthetic in 1884. In July of that year, Sigmund Freud published a paper in Vienna on the potential uses of cocaine. Four months later in America, Richard Hall and William S. Halsted, whose pioneering work included the first radical mastectomy for breast cancer, decided to test cocaine’s numbing properties by injecting it into their dental nerves. Hall had an infected incisor filled without feeling a thing.

For patients, the experiment was a miracle. For Hall and Halsted, it was a disaster; both became cocaine addicts. Fortunately, dental surgery would be made safe by the invention of non-habit-forming Novocain 20 years later.

With orthodontics, veneers, implants and teeth whiteners, dentists can give anyone a beautiful smile nowadays. But the main thing is that oral care doesn’t have to hurt, and it could just save your life. So make that appointment.

Historically Speaking: The Fungus That Fed Gods And Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty, 25-220 AD, was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.

ILLUSTRATION: THOMAS FUCHS

There is no hiding the fact that mushrooms have a dark side, however. Of the thousands of varieties that have been identified, fewer than 100 species are actually poisonous. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another speculated victim of the death cap mushroom was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII’s attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: A Pocket-Sized Dilemma for Women

Unlike men’s clothes, female fashion has been indifferent for centuries to creating ways for women to stash things in their garments

The Wall Street Journal

September 29, 2022

The current round of Fashion Weeks started in New York on Sept. 9 and will end in Paris on Oct. 4, with London and Milan slotted in between. Amid the usual impractical and unwearable outfits on stage, some designers went their own way and featured—gasp—women’s wear with large pockets.

The anti-pocket prejudice in women’s clothing runs deep. In 1954, the French designer Christian Dior stated: “Men have pockets to keep things in, women for decoration.” Designers seem to think that their idea of how a woman should look outweighs what she needs from her clothes. That mentality probably explains why a 2018 survey found that 100% of the pockets in men’s jeans were large enough to fit a midsize cellphone, but only 40% of women’s jeans pockets measured up.

The pocket is an ancient idea, initially designed as a pouch that was tied or sewn to a belt beneath a layer of clothing. Ötzi, the 5,300-year-old ice mummy I wrote about recently for having the world’s oldest known tattoos, also wore an early version of a pocket; it contained his fire-starting tools.

ILLUSTRATION: THOMAS FUCHS

The ancient concept was so practical that the same basic design was still in use during the medieval era. Attempts to find other storage solutions usually came up short. In the 16th century a man’s codpiece sometimes served as an alternative holdall, despite the awkwardness of having to fish around your crotch to find things. Its fall from favor at the end of the 1600s coincided with the first in-seam pockets for men.

It was at this stage that the pocket divided into “his” and “hers” styles. Women retained the tie-on version; the fashion for wide dresses allowed plenty of room to hang a pouch underneath the layers of petticoats. But it was also impractical since reaching a pocket required lifting the layers up.

Moralists looked askance at women’s pockets, which seemed to defy male oversight and could potentially be a hiding place for love letters, money and makeup. On the other hand, in the 17th century a maidservant was able to thwart the unwelcome advances of the diarist Samuel Pepys by grabbing a pin from her pocket and threatening to stab him with it, according to his own account.

Matters looked up for women in the 18th century with the inclusion of side slits on dresses that allowed them direct access to their pockets. Newspapers began to carry advertisements for articles made especially to be carried in them. Sewing kits and snuff boxes were popular items, as were miniature “conversation cards” containing witty remarks “to create mirth in mixed companies.”

Increasingly, though, the essential difference between men’s and women’s pockets—his being accessible and almost anywhere, hers being hidden and nestled near her groin—invested them with a potent symbolism. Among the macabre acts committed by the Victorian serial killer Jack the Ripper was the disemboweling of his victims’ pockets, which he left splayed open next to their bodies.

Women had been agitating for more practical dress styles since the formation of the Rational Dress Society in Britain in 1881, but it took the upheavals caused by World War I for real change to happen. Women’s pantsuits started to appear in the 1920s. First lady Eleanor Roosevelt caused a sensation by appearing in one in 1933. The real revolution began in 1934, with the introduction of Levi’s bluejeans for women, 61 years after the originals for men. The women’s front pocket was born. And one day, with luck, it will grow up to be the same size as men’s.

Harper’s Bazaar: Behind her eyes: celebrating the Queen as a cultural icon

Our steadfast hope

Harper’s Bazaar

June 2022

If you’ve ever had a dream involving the Queen, you are not alone. After her Silver Jubilee in 1977, it was estimated that more than a third of Britons had dreamt about her at least once, with even ardent republicans confessing to receiving royal visits in their slumbers. For the past 70 years, the Queen has been more than just a presence in our lives, subconscious or otherwise; she has been a source of fascination, inspiration and national pride.

Queen Elizabeth II in 2002

When Princess Elizabeth became Queen in 1952, the country was still struggling to emerge from the shadow of World War II. Her youth offered a break with the past. Time magazine in the United States named her its ‘Woman of the Year’, not because of anything she had achieved but because of the hope she represented for Britain’s future. A barrister and political hopeful named Margaret Thatcher wrote in the Sunday Graphic that having a queen ought to remove “the last shreds of prejudice against women aspiring to the highest places”. After all, Elizabeth II was a wife and mother of two small children, and yet no one was suggesting that family life made her unfit to rule.

Thatcher’s optimism belied the Queen’s dilemma over how to craft her identity as a modern monarch in a traditional role. At the beginning, tradition seemed to have the upper hand: a bagpiper played beneath her window every morning (a holdover from Queen Victoria). The Queen knew she didn’t want to be defined by the past. “Some people have expressed the hope that my reign may mark a new Elizabethan age,” she stated in 1953. “Frankly, I do not myself feel at all like my great Tudor forebear.”

Nevertheless, the historical parallels between the two queens are instructive. Elizabeth I created a public persona, yet made it authentic. Fakery was impossible, since “we princes,” she observed, “are set on stages in the sight and view of all the world.” Although Elizabeth I was a consummate performer, her actions were grounded in sincere belief. She began her reign by turning her coronation into a great public event. Observers were shocked by her willingness to interact with the crowds, but the celebrations laid the foundation for a new relationship between the queen and her subjects.

The introduction of television cameras for Elizabeth II’s coronation performed a similar function. In the 1860s, the journalist Walter Bagehot observed that society itself is a kind of “theatrical show” where “the climax of the play is the Queen”. The 1953 broadcast enabled 27 million Britons and 55 million Americans to participate in the ‘show’ from the comfort of their homes. It was a new kind of intimacy that demanded more from Elizabeth II than any previous monarch.

Images and quotes from the Queen’s coronavirus address in April 2020 displayed across London

The Queen had resisted being filmed, but having been convinced by Prince Philip of its necessity, she worked to master the medium. She practised reading from a teleprompter so that her 1957 Christmas speech, the first to be telecast, would appear warm and natural. Harking back to Elizabeth I, she admitted: “I cannot lead you into battle, I do not give you laws or administer justice, but I can do something else, I can give you my heart and my devotion.” She vowed to fight for “fundamental principles” while not being “afraid of the future”.

In practice, embracing the future could be as groundbreaking as instituting the royal “walkabout”, or as subtle as adjusting her hemline to rest at the knee. Indeed, establishing her own sense of fashion was one of the first successes of Elizabeth II’s reign. Its essence was pure glamour, but the designs were performing a double duty: nothing could be too patterned, too hot, too shiny or too sheer, or else it wouldn’t photograph well. Her wardrobe carried the subversive message that dresses should be made to work for the wearer, not the other way round. In an era when female celebrity was becoming increasingly tied to “sexiness”, the Queen offered a different kind of confident femininity. Never afraid to wear bright blocks of colour, she has encouraged generations of women to think beyond merely blending in.

The opportunity to demonstrate her “fundamental principles” on the international stage came in 1961, during a Cold War crisis involving Ghana. The Queen was due to go on a state visit, until growing violence there led to calls for it to be cancelled. She not only insisted on keeping the engagement, but during the wildly popular trip, she also made a point of dancing with President Kwame Nkrumah at a state ball. Her adept handling of the situation helped to prevent Ghana from switching allegiance to the Soviet Union. Just as important, though, was the coverage given to her rejection of contemporary racism. As Harold Macmillan noted: “She loves her duty and means to be Queen and not a puppet.” This determination has seen her through 14 prime ministers, 14 US presidents, seven popes and 265 official overseas visits.

At the beginning of the Covid pandemic in 2020, with the nation in shock at the sudden cessation of ordinary life, Elizabeth II spoke directly to the country, sharing a wartime memory to remind people of what can be endured. “We will succeed,” she promised, and in that desperate moment, she united us all in hope. The uniqueness of the Queen lies in her ability to weather change with grace and equanimity – as the poet Philip Larkin once wrote: “In times when nothing stood/but worsened, or grew strange/there was one constant good:/she did not change.” That steadfast continuity, so rare in a world of permanent flux, is an endless source of inspiration for artists and writers, designers and composers, all of us.

The Mail on Sunday: No miniskirts. No railing about being a working mother.

Leading historian AMANDA FOREMAN explains why the Queen was a true feminist icon who changed the world for millions of women – in very surprising ways.

The Mail on Sunday

September 17, 2022

Ask someone for the name of a famous feminist and no doubt you’ll get one of a few prominent women batted back to you. Germaine Greer. Gloria Steinem. Hillary Clinton. But Elizabeth Windsor? That would be a no. She looked the opposite of today’s powerful women with her knee-length tweeds and distinctly unfashionable court shoes.

I, though, argue differently. As a historian with a particular interest in female power, I believe one thing above all puts the Queen in a special category of achievement. Not the length of her reign. Not even her link to the courageous wartime generation. No, it is her global impact on the cause of gender equality that should be remembered, all without donning a miniskirt or wailing MeToo. All without spilling emotions, making herself a victim or hiding the effects of age and motherhood.

I believe the Queen is the ultimate feminist icon of the 20th Century, more a symbol of women’s progress than icons like Madonna or Beyoncé could ever dream of being. Women everywhere, particularly those past menopause, have much to thank her for.

But when it has been previously suggested the Queen was a feminist, or that women should celebrate her life, critics have bitten back sharply.

In 2019 Olivia Colman, who portrayed the Queen in the Netflix drama The Crown, provoked equal cheers and jeers for describing her as ‘the ultimate feminist’. A few years before, Woman’s Hour chief presenter Emma Barnett had her intellectual credentials questioned for calling the Queen a ‘feminist icon’.

They justified the view for different reasons. For Colman, it was because the Queen had shown a wife could assume a man’s role while retaining her femininity. The argument went in reverse for Barnett: the Queen had shown her gender was ‘irrelevant to her capacity to do her job’.

Yet no King would ever have his masculinity and the definition of manhood conflated in the same way. It’s doubtful anyone will question whether King Charles defines the essence of what it is to be a man.

In the midst of all the grief for the Queen, we should remember at the beginning of her reign Elizabeth’s potential power to effect change provoked as much unease as it did anticipation. In a patriarchal world, female empowerment is a force to fear. After all, we never talk about ‘male empowerment’, do we?

Our two other long-lived queens, Elizabeth I and Victoria, faced the same scrutiny. Foreign affairs, great questions of state, probity in government, what did that matter compared to the burning issue of what it meant to have a woman placed above the heads of men?

It was not easy for Elizabeth II to escape from under the shadow of Queen Victoria, the figurative mother of the nation.

Initially, it wasn’t even clear she wanted to. Though the command for brides to obey their husbands had not been part of the Book of Common Prayer since 1928, Elizabeth included it in her wedding vows.

Aged 25, she was a mother-of-two when she made her accession speech before the Privy Council. Accompanied by her husband, Elizabeth looked even younger than her years, surrounded by a roomful of mostly old men. But after the Privy Council meeting, the comparisons with Victoria stopped. And you can begin to see her innate feminism come to the fore. Elizabeth did not lose her self-confidence in between pregnancies and hand over the red boxes or deputise Philip to meet her Ministers. Far from it. She took on the role of sovereign and Philip accepted his as the world’s most famous house-husband.

In reality, there were few actions or speeches of the Queen’s that could be classed as declaratively feminist – such as the time she drove Crown Prince Abdullah of Saudi Arabia around Balmoral in her Land Rover when Saudi women were forbidden to drive, going at such breakneck speed while chatting that the Prince begged her to slow down.

Or her few comments about the work of the WI, or the potential to be tapped if only society can ‘find ways to allow girls and women to play their full part’.

No, instead of examples like these, the Queen was a feminist for reasons most women can instantly relate to: first, she established clear boundaries between the demands of her job and those of her family.

Society still expects wives will drop everything for the family, no matter how consuming their careers, so husbands can go to work. Not once did the Queen say or imply she ought to shift her weekly audience with the Prime Minister, or cancel the ribbon-cutting of a hospital because of some domestic concern.

Second, society judges working mothers much more harshly than working fathers, giving the latter a free pass if their job is important enough but condemning the former as a terrible person if her children don’t turn out to be outstanding successes. The Queen’s fitness as sovereign has never been tied to her fitness as a mother. Although she always made her family a part of her life, Elizabeth did not allow it to define her as Victoria did.

Third, society makes middle-aged women feel that they are invisible. Their opinions stop mattering, their contributions don’t count and their bodies, according to fashion designers, don’t exist. Whispers that the Queen ought to abdicate began in her 50s. By 1977, her Silver Jubilee, critics wondered what she was good for now her youth and figure were in the rear-view mirror.

In answer, she embodied the reverse of Invisible Woman Syndrome. By refusing to countenance abdication, she showed what a working woman looks like past menopause. Rather than shrinking, she revved up a gear and demonstrated a woman’s age has no bearing on her agency and authority.

Her fabulous colour sense and ability to match dresses to the mood excited intense interest – but this didn’t make her a feminist icon. In an age when a woman’s sexiness is her currency, and empowerment is judged by how much of her body she exposes, she refused to make any concessions to fashion.

This was a confident femininity, an inner feminism based on absolute assuredness of who she was and why she mattered. For over five decades, the Queen showed what strength and purpose look like on the body of an older woman.

The next three generations of monarchs are due to be Kings. To some extent, the old way of doing things will return. So, it is up to us to honour Queen Elizabeth’s memory by following her example.

She tore up the rule book on gender roles without society falling apart or families breaking down. Despite heavy restrictions on what she could do as a woman let alone a Queen, she forged her own path – and invited the rest of us to follow.