Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forward. Greek artists used them to indicate a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To give his Santa a faraway but still benign look, he gave him a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: The Noble Elf Has a Devilish Alter-Ego

Pointy-eared magical creatures abound in folklore, but they weren’t always cute

The Wall Street Journal

September 8, 2022

“The Rings of Power” series, Amazon’s prequel to J.R.R. Tolkien’s fantasy epic, “The Lord of the Rings,” reserves a central role for heroic elves. Members of this tall, immortal race are distinguished by their beauty and wisdom and bear little resemblance to the merry, diminutive helpers in Santa’s workshop.

Yet both are called elves. One reason for the confusion is that the idea of pointy-eared magical beings has never been culturally specific. The ancient Greeks didn’t have elves per se, but their myths did include sex-mad satyrs, Dionysian nature spirits, half human and half animal, whose ears were pointed like a horse’s.

Before their conversion to Christianity, the Anglo-Saxons, like their Norse cousins, believed in magical beings such as water spirits, elves and dragons. Later, in the epic poem “Beowulf,” written down around 1000, the “ylfe” are among the monsters spawned by the biblical Cain.

Benjamin Walker as Gil-galad, High King of the Elves of the West, in “The Rings of Power”
PHOTO: AMAZON STUDIOS

The best accounts of the early Norse myths come from two medieval Icelandic collections known as the Eddas, which are overlaid with Christian cosmology. The Prose Edda divided elves into the “light elves,” who are fair and wondrous, and the “dark elves,” who live underground and are interchangeable with dwarves. Both kinds appeared in medieval tales to torment or, occasionally, help humans.

When not portrayed as the cause of unexplained illnesses, elves were avatars for sexual desire. In Chaucer’s comic tale, the lusty “Wife of Bath” describes the elf queen as sex personified and then complains that the friars have chased all the fair folk away.

The popular conception of elves continued to evolve during the Renaissance under the influence of French “faerie” folklore, Celtic myths and newly available translations of Ovid’s “Metamorphoses” and Virgil’s “Georgics.” Shakespeare took something from almost every tradition in “A Midsummer Night’s Dream,” from Puck the naughty little sprite to Queen Titania, seductress of hapless humans.

But while elves were becoming smaller and cuter in English literature, in Northern Europe they retained their malevolence. Inspired by the Germanic folk tale of the Elf King who preys on little children, in 1782 Goethe composed “Der Erlkönig,” about a boy’s terror as he is chased through the forest to his death. Schubert liked the ballad so much that he set it to music.

In the 19th century, the Brothers Grimm, Jacob and Wilhelm, along with Hans Christian Andersen, brought ancient fairy tales and folk whimsy to a world eager for relief from rampant industrialization. The Grimms put a cheerful face on capitalism with the story of a cobbler and the industrious elves who work to make him wealthy. Clement Clarke Moore made elves the consumer’s friend in his night-before-Christmas poem, “A Visit From St. Nicholas,” where a “jolly old elf” stuffs every stocking with toys.

On the more serious side, the first English translation of “Beowulf” appeared in 1837, marking the beginning of the Victorians’ obsession with the supernatural and all things gothic. The poem’s dark view of elves burst back into the open with Richard Wagner’s Ring Cycle, based on Germanic legends, which portrayed the Elf King Alberich as an evil dwarf.

The elfin future would likely have been silly or satanic were it not for Tolkien’s restoration of the “light elf” tradition. For now, at least, the lovely royal elf Galadriel rules.

Historically Speaking: Typos Have Been Around as Long as Writing Itself

Egyptian engravers, medieval scribes and even Shakespeare’s printer made little mistakes that have endured

May 12, 2022

The Lincoln Memorial in Washington, D.C., is 100 years old this month. The beloved national monument is no less perfect for having one slight flaw: The word “future” in the Second Inaugural Address was mistakenly carved as “Euture.” It is believed that the artist, Ernest C. Bairstow, accidentally picked up the “e” stencil instead of the “f.” He tried to fix it by filling in the bottom line, but the smudged outline is still visible.

Bairstow was by no means the first engraver to rely on the power of fillers. It wasn’t uncommon for ancient Egyptian carvers—most of whom were illiterate—to botch their inscriptions. The seated Pharaoh statue at the Penn Museum in Philadelphia depicts Rameses II, third Pharaoh of the 19th dynasty, who lived during the 13th century BC. Part of the inscription ended up being carved backward, which the artist tried to hide with a bit of filler and paint. But time and wear have made the mistake, well, unmistakable.

ILLUSTRATION: THOMAS FUCHS

Medieval scribes were notorious for botching their illuminated manuscripts, using all kinds of paint tricks to hide their errors. But for big mistakes—for example, when an entire line was dropped—the monks could get quite inventive. In a 13th-century Book of Hours at the Walters Art Museum in Baltimore, an English monk solved the problem of a missing sentence by writing it at the bottom of the page and drawing a ladder with a man on it, hauling the sentence up by a rope to where it was meant to be.

The phrase “the devil is in the details” may have been inspired by Titivillus, the medieval demon of typos. Monks were warned that Titivillus ensured that every scribal mistake was collected and logged, so that it could be held against the offender at Judgment Day.

The warning seems to have had only limited effect. The English poet Geoffrey Chaucer was so enraged by his copyist Adam Pinkhurst that he attacked him in verse, complaining he had “to rub and scrape: and all is through thy negligence and rape.”

The move to print failed to solve the problem of typos. When Shakespeare’s “Cymbeline” was first committed to print, the name of the heroine was accidentally changed from Innogen to Imogen, which is how she is known today. Little typos could have big consequences, such as the so-called Wicked Bible of 1631, whose printers managed to leave out the “not” in the seventh commandment, thereby telling Christians that “thou shalt commit adultery.”

The rise of the newspaper deadline in the 19th century inevitably led to typos big and small, some as unfortunate as they were unlikely. In 1838, British readers of the Manchester Guardian were informed that “writers” rather than “rioters” had caused extensive property damage during a protest meeting in Yorkshire.

In the age of computers, a single typo can have catastrophic consequences. On July 22, 1962, NASA’s Mariner 1 probe to Venus was destroyed just 293 seconds after launch. The failure was traced to an input error: a single hyphen had been inadvertently left out of one of the codes.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” is an amalgamation of the Latin officium, which meant a position or duty, and ob ficium, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.

Historically Speaking: The Ancient Origins of the Vacation

Once a privilege of the elite, summer travel is now a pleasure millions can enjoy.

The Wall Street Journal, June 6, 2019

ILLUSTRATION: THOMAS FUCHS

Finally, Americans are giving themselves a break. For years, according to the U.S. Travel Association, more than half of American workers didn’t use all their paid vacation days. But in a survey released in May by Discover, 71% of respondents said they were planning a summer vacation this year, up from 58% last year—meaning a real getaway, not just a day or two to catch up on chores or take the family to an amusement park.

The importance of vacations for health and happiness has been accepted for thousands of years. The ancient Greeks probably didn’t invent the vacation, but they perfected the idea of the tourist destination by providing quality amenities at festivals, religious sites and thermal springs. A cultured person went places. According to the “Crito,” one of Plato’s dialogues, Socrates’ stay-at-home mentality made him an exception: “You never made any other journey, as other people do, and you had no wish to know any other city.”

The Romans took a different approach. Instead of touring foreign cities, the wealthy preferred to vacation together in resort towns such as Pompeii, where they built ostentatious villas featuring grand areas for entertaining. The Emperor Nero was relaxing at his beach palace at Antium, modern Anzio, when the Great Fire of Rome broke out in the year 64.

The closest thing to a vacation that medieval Europeans could enjoy was undertaking pilgrimages to holy sites. Santiago de Compostela in northern Spain, where St. James was believed to be buried, was a favorite destination, second only to Rome in popularity. As Geoffrey Chaucer’s bawdy “Canterbury Tales” shows, a pilgrimage provided all sorts of opportunities for mingling and carousing, not unlike a modern cruise ship.

The vacation went upmarket in the late 17th century, as European aristocrats rediscovered the classical idea of tourism for pleasure. Broadening one’s horizons via the Grand Tour—a sightseeing trip through the major classical and Renaissance sites of France and Italy—became de rigueur for any gentleman. The spa town, too, enjoyed a spectacular revival. The sick and infertile would gather to “take the cure,” bathing in or drinking from a thermal spring. Bath in England became as renowned for its party scene as for its waters. Jane Austen, a frequent visitor to the city, set two of her novels there. The U.S. wasn’t far behind, with Saratoga Springs, N.Y., known as the “Queen of Spas,” becoming a popular resort in 1815.

But where was the average American to go? Resorts, with their boardwalks, grand hotels and amusement arcades, were expensive. In any case, the idea of vacations for pure pleasure sat uneasily with most religious leaders. The answer lay in the great outdoors, which was deemed to be good for the health and improving to the mind.

Cheap rail travel, and popular guidebooks such as William H.H. Murray’s “Adventures in the Wilderness; or, Camp-Life in the Adirondacks,” helped to entice the middle classes out of the city. The American Chautauqua movement, originally a New York-based initiative aimed at improving adult education, further served to democratize the summer vacation by providing cheap accommodation and wholesome entertainment for families.

The summer vacation was ready to become a national tradition. In 1910, President William Howard Taft even proposed to Congress that all workers should be entitled to two to three months of paid vacation. But the plan stalled, and it was left to France to pass the first guaranteed-vacation law, in 1919. Since then, most developed countries have recognized vacation time as a legal right. It’s not too late, America, to try again.

WSJ Historically Speaking: Where ‘King Arthur’ Came From, and Why the Film Failed

Charlie Hunnam, that sword and that stone, in ‘King Arthur: Legend of the Sword.’ PHOTO: WARNER BROS. PICTURES

In the movie business, even the stuff of legend is no sure bet: The box-office returns for the latest version of the perennially popular Arthurian stories, “King Arthur: Legend of the Sword,” have marked the film as one of the biggest flops of 2017 so far.

What went wrong? High on the list of critics’ complaints was the rewriting of Arthur’s character and story to make him seem more down-to-earth and less like the virtuous leader of legend. The Journal’s Joe Morgenstern called the film “a choppy hunt for the grim, the grungy, the darkness of dungeons and the clamor of a war-torn world.”