Historically Speaking: Modern Dentistry’s Painful Past

Just be thankful that your teeth aren’t drilled with a flint or numbed with cocaine

The Wall Street Journal

November 3, 2022

Since the start of the pandemic, a number of studies have uncovered a surprising link: Gum disease, often first signaled by bleeding gums when brushing, can make a patient with Covid three times more likely to be admitted to the ICU with complications. Putting off that visit to the dental hygienist may not be such a good idea just now.

Many of us have unpleasant memories of visits to the dentist, but caring for our teeth has come a very long way over the millennia. What our ancestors endured is incredible. In 2006 an Italian-led team of evolutionary anthropologists working in Balochistan, in southwestern Pakistan, found several 9,000-year-old skeletons whose infected teeth had been drilled using a pointed flint tool. Attempts to re-create the process showed that the patients would have had to sit still for up to a minute. Poppies grow in the region, so there may have been some local expertise in using them for anesthesia.

Acupuncture may have been used to treat tooth pain in ancient China, but the chronology remains uncertain. In ancient Egypt, dentistry was considered a distinct medical skill, but the Egyptians still had some of the worst teeth in the ancient world, mainly from chewing on food adulterated with grit and sand. They were averse to dental surgery, relying instead on remedies such as amulets, mouthwashes and even pastes made from dead mice. Faster relief could be obtained in Rome, where tooth extraction—and dentures—were widely available.

In Europe during the Middle Ages, dentistry fell under the purview of barbers. They could pull teeth but little else. The misery of mouth pain continued unabated until the early 18th century, when the French physician Pierre Fauchard became the first doctor to specialize in teeth. A rigorous scientist, Fauchard helped to found modern dentistry by recording his innovative methods and discoveries in a two-volume work, “The Surgeon Dentist.”

Fauchard elevated dentistry to a serious profession on both sides of the Atlantic. Before he became famous for his midnight ride, Paul Revere earned a respectable living making dentures. George Washington’s false teeth were actually a marvel of colonial-era technology, combining human teeth with elephant and walrus ivory to create a realistic look.

During the 1800s, the U.S. led the world in dental medicine, not only as an academic discipline but in the standardization of the practice, from the use of automatic drills to the dentist’s chair.

Perhaps the biggest breakthrough was the invention of local anesthetic in 1884. In July of that year, Sigmund Freud published a paper in Vienna on the potential uses of cocaine. Four months later in America, Richard Hall and William S. Halsted, whose pioneering work included the first radical mastectomy for breast cancer, decided to test cocaine’s numbing properties by injecting it into their dental nerves. Hall had an infected incisor filled without feeling a thing.

For patients, the experiment was a miracle. For Hall and Halsted, it was a disaster; both became cocaine addicts. Fortunately, dental surgery would be made safe by the invention of non-habit-forming Novocain 20 years later.

With orthodontics, veneers, implants and teeth whiteners, dentists can give anyone a beautiful smile nowadays. But the main thing is that oral care doesn’t have to hurt, and it could just save your life. So make that appointment.

Historically Speaking: The Fungus That Fed Gods and Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty, 25-220 AD, was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.

There is no hiding the fact that mushrooms have a dark side, however. Of the thousands of varieties that have been identified, fewer than 100 species are actually poisonous. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another victim of the death cap mushroom, it has been speculated, was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII's attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, the Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: A Pocket-Sized Dilemma for Women

Unlike men's clothing, women's fashion has for centuries been indifferent to creating ways for women to stash things in their garments

The Wall Street Journal

September 29, 2022

The current round of Fashion Weeks started in New York on Sept. 9 and will end in Paris on Oct. 4, with London and Milan slotted in between. Amid the usual impractical and unwearable outfits on stage, some designers went their own way and featured—gasp—women's wear with large pockets.

The anti-pocket prejudice in women’s clothing runs deep. In 1954, the French designer Christian Dior stated: “Men have pockets to keep things in, women for decoration.” Designers seem to think that their idea of how a woman should look outweighs what she needs from her clothes. That mentality probably explains why a 2018 survey found that 100% of the pockets in men’s jeans were large enough to fit a midsize cellphone, but only 40% of women’s jeans pockets measured up.

The pocket is an ancient idea, initially designed as a pouch that was tied or sewn to a belt beneath a layer of clothing. Otzi, the 5,300-year-old ice mummy I wrote about recently for having the world’s oldest known tattoos, also wore an early version of a pocket; it contained his fire-starting tools.

The ancient concept was so practical that the same basic design was still in use during the medieval era. Attempts to find other storage solutions usually came up short. In the 16th century a man’s codpiece sometimes served as an alternative holdall, despite the awkwardness of having to fish around your crotch to find things. Its fall from favor at the end of the 1600s coincided with the first in-seam pockets for men.

It was at this stage that the pocket divided into "his" and "hers" styles. Women retained the tie-on version; the fashion for wide dresses allowed plenty of room to hang a pouch underneath the layers of petticoats. But the arrangement was impractical, since reaching a pocket meant lifting up the layers.

Moralists looked askance at women’s pockets, which seemed to defy male oversight and could potentially be a hiding place for love letters, money and makeup. On the other hand, in the 17th century a maidservant was able to thwart the unwelcome advances of the diarist Samuel Pepys by grabbing a pin from her pocket and threatening to stab him with it, according to his own account.

Matters looked up for women in the 18th century with the inclusion of side slits on dresses that allowed them direct access to their pockets. Newspapers began to carry advertisements for articles made especially to be carried in them. Sewing kits and snuff boxes were popular items, as were miniature “conversation cards” containing witty remarks “to create mirth in mixed companies.”

Increasingly, though, the essential difference between men’s and women’s pockets—his being accessible and almost anywhere, hers being hidden and nestled near her groin—gave them symbolism. Among the macabre acts committed by the Victorian serial killer Jack the Ripper was the disemboweling of his victims’ pockets, which he left splayed open next to their bodies.

Women had been agitating for more practical dress styles since the formation of the Rational Dress Society in Britain in 1881, but it took the upheavals caused by World War I for real change to happen. Women’s pantsuits started to appear in the 1920s. First lady Eleanor Roosevelt caused a sensation by appearing in one in 1933. The real revolution began in 1934, with the introduction of Levi’s bluejeans for women, 61 years after the originals for men. The women’s front pocket was born. And one day, with luck, it will grow up to be the same size as men’s.

Harper’s Bazaar: Behind her eyes: celebrating the Queen as a cultural icon

Our steadfast hope

Harper’s Bazaar

June 2022

If you’ve ever had a dream involving the Queen, you are not alone. After her Silver Jubilee in 1977, it was estimated that more than a third of Britons had dreamt about her at least once, with even ardent republicans confessing to receiving royal visits in their slumbers. For the past 70 years, the Queen has been more than just a presence in our lives, subconscious or otherwise; she has been a source of fascination, inspiration and national pride.

Queen Elizabeth II in 2002

When Princess Elizabeth became Queen in 1952, the country was still struggling to emerge from the shadow of World War II. Her youth offered a break with the past. Time magazine in the United States named her its ‘Woman of the Year’, not because of anything she had achieved but because of the hope she represented for Britain’s future. A barrister and political hopeful named Margaret Thatcher wrote in the Sunday Graphic that having a queen ought to remove “the last shreds of prejudice against women aspiring to the highest places”. After all, Elizabeth II was a wife and mother of two small children, and yet no one was suggesting that family life made her unfit to rule.

Thatcher’s optimism belied the Queen’s dilemma over how to craft her identity as a modern monarch in a traditional role. At the beginning, tradition seemed to have the upper hand: a bagpiper played beneath her window every morning (a holdover from Queen Victoria). The Queen knew she didn’t want to be defined by the past. “Some people have expressed the hope that my reign may mark a new Elizabethan age,” she stated in 1953. “Frankly, I do not myself feel at all like my great Tudor forbear.”

Nevertheless, the historical parallels between the two queens are instructive. Elizabeth I created a public persona, yet made it authentic. Fakery was impossible, since “we princes,” she observed, “are set on stages in the sight and view of all the world.” Although Elizabeth I was a consummate performer, her actions were grounded in sincere belief. She began her reign by turning her coronation into a great public event. Observers were shocked by her willingness to interact with the crowds, but the celebrations laid the foundation for a new relationship between the queen and her subjects.

The introduction of television cameras for Elizabeth II’s coronation performed a similar function. In the 1860s, the journalist Walter Bagehot observed that society itself is a kind of “theatrical show” where “the climax of the play is the Queen”. The 1953 broadcast enabled 27 million Britons and 55 million Americans to participate in the ‘show’ from the comfort of their homes. It was a new kind of intimacy that demanded more from Elizabeth II than any previous monarch.

Images and quotes from the Queen's coronavirus address in April 2020 displayed across London

The Queen had resisted being filmed, but having been convinced by Prince Philip of its necessity, she worked to master the medium. She practised reading off a teleprompter so that her 1957 Christmas speech, the first to be telecast, would appear warm and natural. Harking back to Elizabeth I, she admitted: "I cannot lead you into battle, I do not give you laws or administer justice, but I can do something else, I can give you my heart and my devotion." She vowed to fight for "fundamental principles" while not being "afraid of the future".

In practice, embracing the future could be as groundbreaking as instituting the royal “walkabout”, or as subtle as adjusting her hemline to rest at the knee. Indeed, establishing her own sense of fashion was one of the first successes of Elizabeth II’s reign. Its essence was pure glamour, but the designs were performing a double duty: nothing could be too patterned, too hot, too shiny or too sheer, or else it wouldn’t photograph well. Her wardrobe carried the subversive message that dresses should be made to work for the wearer, not the other way round. In an era when female celebrity was becoming increasingly tied to “sexiness”, the Queen offered a different kind of confident femininity. Never afraid to wear bright blocks of colour, she has encouraged generations of women to think beyond merely blending in.

The opportunity to demonstrate her "fundamental principles" on the international stage came in 1961, during a Cold War crisis involving Ghana. The Queen was due to go on a state visit, until growing violence there led to calls for it to be cancelled. She not only insisted on keeping the engagement, but during the wildly popular trip, she also made a point of dancing with President Kwame Nkrumah at a state ball. Her adept handling of the situation helped to prevent Ghana from switching allegiance to the Soviet Union. Just as important, though, was the coverage given to her rejection of contemporary racism. As Harold Macmillan noted: "She loves her duty and means to be Queen and not a puppet." This determination has seen her through 14 prime ministers, 14 US presidents, seven popes and 265 official overseas visits.

At the beginning of the Covid pandemic in 2020, with the nation in shock at the sudden cessation of ordinary life, Elizabeth II spoke directly to the country, sharing a wartime memory to remind people of what can be endured. "We will succeed," she promised, and in that desperate moment, she united us all in hope. The uniqueness of the Queen lies in her ability to weather change with grace and equanimity – as the poet Philip Larkin once wrote: "In times when nothing stood/but worsened, or grew strange/there was one constant good:/she did not change." That steadfast continuity, so rare in a world of permanent flux, is an endless source of inspiration for artists and writers, designers and composers, all of us.

The Mail on Sunday: No miniskirts. No railing about being a working mother.

Leading historian AMANDA FOREMAN explains why the Queen was a true feminist icon who changed the world for millions of women – in very surprising ways.

The Mail on Sunday

September 17, 2022

Ask someone for the name of a famous feminist and no doubt you’ll get one of a few prominent women batted back to you. Germaine Greer. Gloria Steinem. Hillary Clinton. But Elizabeth Windsor? That would be a no. She looked the opposite of today’s powerful women with her knee-length tweeds and distinctly unfashionable court shoes.

I, though, argue differently. As a historian with a particular interest in female power, I believe one thing above all puts the Queen in a special category of achievement. Not the length of her reign. Not even her link to the courageous wartime generation. No, it is her global impact on the cause of gender equality that should be remembered, all without donning a miniskirt or wailing MeToo. All without spilling emotions, making herself a victim or hiding the effects of age and motherhood.

I believe the Queen is the ultimate feminist icon of the 20th Century, a greater symbol of women's progress than other icons like Madonna or Beyoncé could ever dream of being. Women everywhere, particularly those past menopause, have much to thank her for.

But when it has previously been suggested that the Queen was a feminist, or that women should celebrate her life, critics have bitten back sharply.

In 2019 Olivia Colman, who portrayed the Queen in the Netflix drama The Crown, provoked equal cheers and jeers for describing her as ‘the ultimate feminist’. A few years before, Woman’s Hour chief presenter Emma Barnett had her intellectual credentials questioned for calling the Queen a ‘feminist icon’.

They justified the view for different reasons. For Colman, it was because the Queen had shown a wife could assume a man’s role while retaining her femininity. The argument went in reverse for Barnett: the Queen had shown her gender was ‘irrelevant to her capacity to do her job’.

Yet no King would ever have his masculinity and the definition of manhood conflated in the same way. It's doubtful anyone will question whether King Charles defines the essence of what it is to be a man.

In the midst of all the grief for the Queen, we should remember that at the beginning of her reign, Elizabeth's potential power to effect change provoked as much unease as it did anticipation. In a patriarchal world, female empowerment is a force to fear. After all, we never talk about 'male empowerment', do we?

Our two other long-reigning queens, Elizabeth I and Victoria, faced the same scrutiny. Foreign affairs, great questions of state, probity in government, what did they matter compared with the burning issue of what it meant to have a woman placed above the heads of men?

It was not easy for Elizabeth II to escape from under the shadow of Queen Victoria, the figurative mother of the nation.

Initially, it wasn’t even clear she wanted to. Though the command for brides to obey their husbands had not been part of the Book of Common Prayer since 1928, Elizabeth included it in her wedding vows.

Aged 25, she was a mother-of-two when she made her accession speech before the Privy Council. Accompanied by her husband, Elizabeth looked even younger than her years, surrounded by a roomful of mostly old men. But after the Privy Council meeting, the comparisons with Victoria stopped. And you can begin to see her innate feminism come to the fore. Elizabeth did not lose her self-confidence between pregnancies, hand over the red boxes or deputise Philip to meet her Ministers. Far from it. She took on the role of sovereign and Philip accepted his as the world's most famous house-husband.

In reality, only a handful of the Queen's actions or speeches could be classed as declaratively feminist – such as the time she drove Crown Prince Abdullah of Saudi Arabia around Balmoral in her Land Rover when Saudi women were forbidden to drive, going at such breakneck speed while chatting that the Prince begged her to slow down.

Or her few comments about the work of the WI, or the potential to be tapped if only society can ‘find ways to allow girls and women to play their full part’.

No, instead of examples like these, the Queen was a feminist for reasons most women can instantly relate to: first, she established clear boundaries between the demands of her job and those of her family.

Society still expects wives to drop everything for the family, no matter how consuming their careers, so that husbands can go to work. Not once did the Queen say or imply she ought to shift her weekly audience with the Prime Minister, or cancel the ribbon-cutting at a hospital because of some domestic concern.

Second, society judges working mothers much more harshly than working fathers, giving the latter a free pass if their job is important enough but condemning the former as a terrible person if her children don’t turn out to be outstanding successes. The Queen’s fitness as sovereign has never been tied to her fitness as a mother. Although she always made her family a part of her life, Elizabeth did not allow it to define her as Victoria did.

Third, society makes middle-aged women feel that they are invisible. Their opinions stop mattering, their contributions don't count and their bodies, according to fashion designers, don't exist. Whispers that the Queen ought to abdicate began in her 50s. By 1977, her Silver Jubilee, critics wondered what she was good for now that her youth and figure were in the rear-view mirror.

In answer, she embodied the reverse of Invisible Woman Syndrome. By refusing to countenance abdication, she showed what a working woman looks like past menopause. Rather than shrinking, she revved up a gear and demonstrated a woman’s age has no bearing on her agency and authority.

Her fabulous colour sense and ability to match dresses to the mood excited intense interest – but this alone didn't make her a feminist icon. In an age when a woman's sexiness is her currency, and her empowerment is judged by how much of her body she exposes, she refused to make any concessions to fashion.

This was a confident femininity, an inner feminism based on absolute assuredness of who she was and why she mattered. For over five decades, the Queen showed what strength and purpose look like on the body of an older woman.

The next three generations of monarchs are due to be Kings. To some extent, the old way of doing things will return. So, it is up to us to honour Queen Elizabeth’s memory by following her example.

She tore up the rule book on gender roles without society falling apart or families breaking down. Despite heavy restrictions on what she could do as a woman, let alone a Queen, she forged her own path – and invited the rest of us to follow.

The Sunday Times – Special relationship: US mourns the Queen as it would a president

America’s relationship with the monarchy has always been complicated but she brought the two nations together

The Sunday Times

Saturday, September 10 2022

My phone started ringing at 7am New York time on Thursday. The news about the Queen’s health had reached the networks and they were calling everyone in. By 9am I was in the CBS newsroom and so began a long day that has not yet ended.

To say that Americans are reacting strongly to the death of the Queen hardly does justice to the immense coverage it has received. Not since the funeral of Pope John Paul II in 2005 has an international figure been given the kind of treatment normally reserved for departed presidents.

Kamala Harris, the US vice-president, signs the condolence book at the British embassy in Washington

That includes the American tendency to start the criticising long before any European outlet would consider it seemly to do so. As any psychologist will tell you, extreme love and hate are two sides of the same coin.

In the studio I was struck by the comment of a “lifestyle” journalist who remarked that Elizabeth II had been the star of a reality TV family, years before the concept had been invented. “Is that why we are so obsessed with the Queen and the monarchy?” mused the anchor before proceeding to display a level of knowledge about the Queen that would put a royal correspondent to shame.

On Thursday night New York was a blinking skyline of mauve, purple and lavender tributes, led by the Empire State Building in silver and purple. It was almost exactly 65 years ago, in October 1957, that the Queen went to the top of the building to see the view. Cities that the Queen had never visited also paid their respects. Las Vegas illuminated its High Roller observation wheel — the world's second tallest — in neon purple.

American sport paused briefly, too. On Thursday a minute's silence was observed and an image of the Queen was projected on to large screens before play at the NFL season opener in Los Angeles, at the US Open tennis tournament and at the baseball game at Yankee Stadium.

Not everyone was in mourning though. With a breathtaking lack of decorum, The New York Times rushed out an opinion piece that accused the Queen of being a participant in Britain’s cover-up of its colonial crimes. This was tepid compared with some outbursts on social media. A linguistics professor at Carnegie Mellon University in Pennsylvania tweeted her hope that the Queen’s final hours were “excruciating”.

America’s relationship with the monarchy has always been complicated. After victory in the War of Independence George Washington angrily dismissed suggestions that he become king of the new United States. But there were still officers in his own army who thought that a monarchy would be good for the country. It was an idea that never went away.

Britain was one of the first countries to recognise the Republic of Texas in 1836, and Texans repaid the compliment by openly flirting with the idea of joining the British Empire. At the start of the American Civil War in 1861, the British consulate in Charleston reported high-level discussions on whether it would aid the Confederacy to invite a younger member of the royal family to become king once the South had achieved independence.

At about the same time that the 19th-century journalist Thomas Nichols asked "Why does America hate England?", this anglophobic nation gave the Prince of Wales, the future Edward VII, the biggest welcome yet shown to a foreign dignitary.

Like a cross between a Greek tragedy and a French farce, the drama of Anglo-American relations during the 19th century mainly consisted of young America alternating between vying for Britain’s good opinion and wanting to destroy her.

The Queen had the good fortune to come to the throne after the US had passed through this awkward adolescent phase. Aged 25, she was stepping into a role that had played an essential part in shaping American identity. Not quite mother figure, certainly not hated stepmother, and yet familial, maternal, and in the origin-story she represented, eternal.

It is no wonder that during her first state visit in 1957, a million people lined the roads to watch her arrive in Washington. There’s no escaping family. Still, family, like biology, is not destiny. It was her choice to embrace the past as proof that the two countries had put it behind them, like mature adults.

That first state visit took place after a crack in Anglo-American relations caused by Suez. Much depended, therefore, on the Queen's four-night stay at the White House. It wasn't merely a matter of repairing relations with President Eisenhower; she also needed to charm the public — which she did by, among other things, showing herself to be interested in the things that interested them, like watching a football game.

The recovery of good relations was a testament to the Queen's personal touch with US presidents. In 1976, at a state dinner in Washington, she rather boldly referred to the British burning of the White House before adding: "But these early quarrels are long buried. What is more important is that our shared language, traditions and history have given us a common vision of what is right and just."

Yet this mutual heritage alone was not enough to bind the nations, she came to believe. Active reminders and a stroking of the ego of whoever occupied the White House were equally important.

The success of this policy was borne out by what did not happen during the Falklands conflict in 1982. Despite great reason to avoid supporting Britain against Argentina, the US proved to be a reliable ally. Much has been made of the role played by Margaret Thatcher’s relationship with Ronald Reagan. But the Queen’s adept handling of Nancy Reagan during the latter’s visit for the royal wedding the year before had helped to lay the groundwork. After the wedding, the Queen issued a rare private invitation to the Reagans to stay with her at Windsor Castle the following year. A small thing, perhaps, and yet personal touches can decide the fate of a nation.

Before President Obama’s first visit to London in 2009 it was reported that he had little interest in cultivating a special relationship, but Operation Obama proved the most successful charm offensive in the Queen’s history of diplomacy.

Perhaps it was the moment that the Queen and Michelle Obama put their arms around each other, or the amount of private time that the Queen and Prince Philip gave to the Obamas. Either way, by the end of Obama’s two terms relations between the White House and Buckingham Palace had never been better.

The reason Americans are in shock is not, or not merely, that the Queen was famous or that her family make for good copy; she held up a mirror in which this nation saw a different self, perhaps the best self that only a mother sees.

Historically Speaking: The Noble Elf Has a Devilish Alter Ego

Pointy-eared magical creatures abound in folklore, but they weren’t always cute

The Wall Street Journal

September 8, 2022

“The Rings of Power” series, Amazon’s prequel to J.R.R. Tolkien’s fantasy epic, “The Lord of the Rings,” reserves a central role for heroic elves. Members of this tall, immortal race are distinguished by their beauty and wisdom and bear little resemblance to the merry, diminutive helpers in Santa’s workshop.

Yet both are called elves. One reason for the confusion is that the idea of pointy-eared magical beings has never been culturally specific. The ancient Greeks didn’t have elves per se, but their myths did include sex-mad satyrs, Dionysian half-human-half-animal nature spirits whose ears were pointed like a horse’s.

Before their conversion to Christianity, the Anglo-Saxons, like their Norse ancestors, believed in magical beings such as water spirits, elves and dragons. Later, in the epic poem Beowulf, written down around 1000, the "ylfe" appear among the monsters spawned by the biblical Cain.

Benjamin Walker as Gil-galad, High King of the Elves of the West, in “The Rings of Power”
PHOTO: AMAZON STUDIOS

The best accounts of the early Norse myths come from two medieval Icelandic collections known as the Eddas, which are overlaid with Christian cosmology. The Prose Edda divided elves into the “light elves,” who are fair and wondrous, and the “dark elves,” who live underground and are interchangeable with dwarves. Both kinds appeared in medieval tales to torment or, occasionally, help humans.

When not portrayed as the cause of unexplained illnesses, elves were avatars for sexual desire. In Chaucer’s comic tale, the lusty “Wife of Bath” describes the elf queen as sex personified and then complains that the friars have chased all the fair folk away.

The popular conception of elves continued to evolve during the Renaissance under the influence of French “faerie” folklore, Celtic myths and newly available translations of Ovid’s “Metamorphoses” and Virgil’s “Georgics.” Shakespeare took something from almost every tradition in “A Midsummer Night’s Dream,” from Puck the naughty little sprite to Queen Titania, seductress of hapless humans.

But while elves were becoming smaller and cuter in English literature, in Northern Europe they retained their malevolence. Inspired by the Germanic folk tale of the Elf King who preys on little children, in 1782 Goethe composed "Der Erlkönig," about a boy's terror as he is chased through the forest to his death. Schubert liked the ballad so much that he set it to music.

In the 19th century, the Brothers Grimm, Jacob and Wilhelm, along with Hans Christian Andersen, brought ancient fairy tales and folk whimsy to a world eager for relief from rampant industrialization. The Grimms put a cheerful face on capitalism with the story of a cobbler and the industrious elves who work to make him wealthy. Clement Clarke Moore made elves the consumer’s friend in his night-before-Christmas poem, “A Visit From St. Nicholas,” where a “jolly old elf” stuffs every stocking with toys.

On the more serious side, the first English translation of Beowulf appeared in 1837, marking the beginning of the Victorians' obsession with the supernatural and all things gothic. The negative connotations the poem attached to elves burst into the open with Richard Wagner's Ring Cycle, based on Germanic legends, which portrayed the Elf King Alberich as an evil dwarf.

The elfin future would likely have been silly or satanic were it not for Tolkien’s restoration of the “light elf” tradition. For now, at least, the lovely royal elf Galadriel rules.

Historically Speaking: The Ancient Art of the Tattoo

Body ink has been used to elevate, humiliate and decorate people since the times of mummies.

The Wall Street Journal

August 25, 2022

Earlier this month the celebrity couple Kim Kardashian and Pete Davidson announced that their nine-month relationship was over. Ms. Kardashian departed with her memories, but Mr. Davidson was left with something a little more tangible: the words "my girl is a lawyer" tattooed on his shoulder in (premature) homage to Ms. Kardashian's legal aspirations. He has since been inundated on social media with suggestions on how to cover it up. There is precedent: In 1993, following his split from the actress Winona Ryder, the actor Johnny Depp changed "Winona Forever" to "Wino Forever."

Throughout history, humans have tattooed themselves—and others—for reasons spanning the gamut from religion to revenge. The earliest evidence for tattooing comes from a 5,300-year-old ice mummy nicknamed Otzi after its discovery in the Ötztal Alps in Europe. An analysis of Otzi’s remains revealed that he had been killed by an arrow. Even before his violent death, however, he appeared to have suffered from various painful ailments. Scientists found 61 tattoo marks across Otzi’s body, with many of them placed on known acupuncture points, prompting speculation that the world’s oldest tattoos were used as a health aid.

Similar purposes may have prompted the ancient Nubians to apply tattoos to some pregnant women. Tomb paintings and mummified remains show that women in Egypt also adopted the custom, possibly for the same reason.

The indelible aspect of tattooing inspired diametrically opposed attitudes. Some ancient peoples, such as the Thracians and the Gauls, regarded tattoos as a mark of noble status and spiritual power. But the Persians, Greeks and Romans used them as a form of punishment or humiliation. Those who imported slaves to Rome from Asia paid duties and tattooed “tax paid” on the foreheads of those they enslaved.

In Polynesian cultures, tattoos were imbued with symbolism. The traditional tatau, which gave rise to the English word tattoo via Captain James Cook in the 18th century, covered the bodies of Samoan men from the waist to the knees. The ritual application took many weeks and entailed excruciating pain plus the danger of septicemia, but anyone who gave up brought shame upon his family for generations.

As a rule, Christian missionaries tried to stamp out the practice during the 19th century. But their disapproval was no match for royal enthusiasm. Fascinated by irezumi, the Japanese decorative art of tattooing inspired by woodblock printing, the future King George V of Britain and his brother Prince Albert Victor both had themselves inked in 1881 while on a royal visit. Noting its subsequent spread among the American upper classes, the New York Herald complained, “The Tattooing Fad has Reached New York Via London.”

In the 20th century, the practice retained a dark side as a symbol of criminality and oppression—most notably associated with the Nazis’ tattooing of inmates at Auschwitz. At the same time, however, so many returning U.S. servicemen had them that the Marlboro Man sported one on his hand in the advertisements of the day.

Historically Speaking: Passports Haven’t Always Been Liberating

France’s Louis XIV first required international travelers to carry an official document. By the 20th century, most other countries did the same for reasons of national security.

The Wall Street Journal

August 12, 2022

As anyone who has recently applied for a passport can attest, U.S. passport agencies are still catching up from the pandemic lockdown. But even with the current delays and frustrations, a passport is, quite literally, our pass to freedom.

The exact concept did not exist in ancient times. An approximation was the official letter of introduction or safe conduct that guaranteed the security of the traveler holding it. The Hebrew Bible recounts that the prophet Nehemiah, cup-bearer to Artaxerxes I of Persia, requested a letter from the king for his mission to Judea. As an indispensable tool of international business and diplomacy, such documents were considered sacrosanct. In medieval England, King Henry V decreed that any attack on a bearer of one would be treated as high treason.

Another variation was the official credential proving the bearer had permission to travel. The Athenian army controlled the movements of officers between bases by using a clay token system. In China, by the time of the Tang dynasty in the early 7th century, trade along the Silk Road had become regulated by the paper-backed guosuo system. Functioning as a pass and identity card, possession of a signed guosuo document was the only means of legitimate travel between towns and cities.

The birth of the modern passport may be credited in part to King Louis XIV of France, who decreed in 1669 that all individuals, whether leaving or entering his country, were required to register their personal details with the appropriate officials and carry a copy of their travel license. Ironically, the passport requirement helped to foil King Louis XVI and Marie Antoinette’s attempt to escape from Paris in 1791.

The rise of middle-class tourism during the 19th century exposed the ideological gulf between the continental and Anglo-Saxon view of passports. Unlike many European states, neither Great Britain nor America required its citizens to carry an identity card or request government permission to travel. Only 785 Britons applied for a passport in 1847, mainly out of the belief that a document personally signed by the foreign secretary might elevate the bearer in the eyes of the locals.

By the end of World War I, however, most governments had come around to the idea that passports were an essential buttress of national security. The need to own one coincided with mass upheaval across Europe: Countries were redrawn, regimes toppled, minorities persecuted, creating millions of stateless refugees.

Into this humanitarian crisis stepped an unlikely savior, the Norwegian diplomat Fridtjof Nansen. In 1922, as the first high commissioner for refugees for the League of Nations, Nansen used his office to create a temporary passport for displaced persons, enabling them to travel, register and work in over 50 countries. Among the hundreds of thousands saved by a “Nansen passport” were the artist Marc Chagall and the writer Vladimir Nabokov. With unfortunate timing, the program lapsed in 1938, the year that Nazi Germany annexed Austria and invaded Czechoslovakia.

For a brief time during the Cold War, Americans experienced the kind of politicization that shaped most other passport systems. In the 1950s, the State Department could and did revoke the passports of suspected communist sympathizers. My father Carl Foreman was temporarily stripped of his after he finished making the anti-McCarthyite film classic “High Noon.”

Nowadays, neither race nor creed nor political opinions can come between an American and his passport. But delays of up to 12 weeks are currently unavoidable.

Historically Speaking: The Mystical Origins of Wordplay

From oracular riddles to the daily Wordle, humans have always had the urge to decode

The Wall Street Journal

July 28, 2022

In 2021, a software engineer named Josh Wardle uploaded Wordle, a five-letter word puzzle, for a few friends and relatives. By February this year, the number of players had jumped to the millions, and Wardle's daily Wordle game had become a global phenomenon and the property of the New York Times.

Wordle's success is unusual but not unprecedented. The urge to decode patterns lies deep within the human psyche. Puzzles, whether mathematical or linguistic, were originally associated with cosmic truths and celestial communication. In the Rhind papyrus, which the British Museum dates to around 1550 B.C., the Egyptian scribe Ahmes presented 84 mathematical problems that he claimed contained the key to "knowledge of all existing things." The Chinese I Ching, or Book of Changes, a set of 64 hexagrams (six-lined figures) thought to have been written down with commentary around 800 B.C., was also said to provide a basis for understanding the universe.

The Greeks saw riddles as a means of interacting with the gods. Pilgrims would visit oracles, believing that the gods spoke through their priestesses by means of riddles. Interpreting these utterances, however, was fraught with danger. According to Herodotus, King Croesus of Lydia took the Oracle of Delphi’s prediction that a great empire would fall if he attacked the Persians to mean that victory was assured. It was—for the Persians.

With the spread of literacy, anagrams and acrostics became an alternative medium for heavenly messaging. The Hebrew Bible contains several examples, including the alphabetic acrostic Psalm 119, which symbolizes the presence of God in everything from A to Z (aleph to tav).

It remains a matter of scholarly dispute whether the acrostic Sator Square, a two-dimensional, five-line palindrome square made up of five five-letter Latin words, was just clever Roman graffiti or a means of transmitting Christian messages. From its earliest appearance in 1st-century Pompeii, the Sator Square has been found in medieval churches across Europe. It may have started out as a bit of fun, but it ended up as something deeply serious.

One of the most famous nonreligious word squares is “Xuanji Tu,” composed by the 4th-century Chinese poet Su Hui to win back her errant husband. The 29-by-29-character grid can be read in any direction and is said to contain 7,958 poems.

Although word squares, puzzles and riddles gradually shed their magical connotations, their popularity in the West remained undimmed. During the 17th century, King Louis XIII of France even employed his own Royal Anagrammatist. In colonial America, Benjamin Franklin wisely included all manner of puzzles in his Poor Richard’s Almanack.

The Disney-fication of Lewis Carroll's 1865 fantasy novel, "Alice's Adventures in Wonderland," has obscured its triumph as a game of language from start to finish. An Oxford mathematician, Carroll was an inveterate puzzler. His playful inventions include an early form of Scrabble and the "word ladder," whereby a series of one-letter changes transforms a word into its opposite, turning HEAD into TAIL, for example.

By contrast, the crossword was born of necessity. In 1913, Arthur Wynne, the color supplement editor of Joseph Pulitzer’s New York World, had a blank page to fill and resorted to a word-square puzzle. He called it a “word-cross” and invited readers to complete the grid by solving a series of clues. Like Wordle, the game became an overnight sensation. Nevertheless, it took 85 years for crossword puzzles to achieve the ultimate accolade of civilization—a place in The Wall Street Journal.