Historically Speaking: The Ancient Origins of the Christmas Wreath

Before they became a holiday symbol, wreaths were used to celebrate Egyptian gods, victorious Roman generals and the winter solstice.

On Christmas Eve, 1843, three ghosts visited Ebenezer Scrooge in the Charles Dickens novella “A Christmas Carol” and changed Christmas forever. Dickens is often credited with “inventing” the modern idea of Christmas because he popularized and reinvigorated such beloved traditions as the turkey feast, singing carols and saying “Merry Christmas.”

But one tradition for which he cannot take credit is the ubiquitous Christmas wreath, whose pedigree goes back many centuries, even to pre-Christian times.


Wreaths can be found in almost every ancient culture and were worn or hung for many purposes. In Egypt, participants in the festival of Sokar, a god of the underworld, wore onion wreaths because the vegetable was venerated as a symbol of eternal life. The Greeks awarded laurel wreaths to the winners of competitions because the laurel tree was sacred to Apollo, the god of poetry and athletics. The Romans bestowed them on emperors and victorious generals.

Fixing wreaths and boughs onto doorposts to bring good luck was another widely practiced tradition. As early as the 6th century, Lunar New Year celebrants in central China would decorate their doors with young willow branches, a symbol of immortality and rebirth.

The early Christians did not, as might be assumed, create the Yuletide wreath. That was the pagan Vikings. They celebrated the winter solstice with mistletoe, evergreen wreaths made of holly and ivy, and 12 days of feasting, all of which were subsequently turned into Christian symbols. Holly, for example, has often been equated with the crown of thorns. Perhaps not surprisingly for the people who also introduced the words “knife,” “slaughter” and “berserk,” one Viking custom involved setting the Yuletide wreath on fire in the hope of attracting the sun’s attention.

Across northern Europe, the Norse wreath was eventually absorbed into the Christian calendar as the Advent wreath, symbolizing the four weeks before Christmas. It became part of German culture in 1839, when Johann Hinrich Wichern, a Lutheran pastor in Hamburg, captured the public’s imagination with his enormous Advent wreath. Made with a cartwheel and 24 candles, it was intended to help the children at his orphanage count the days to Christmas.

German immigrants to the U.S. brought the Advent wreath with them. Still, while Americans might accept a candlelit wreath inside the house, door wreaths were rare, especially in former Puritan strongholds such as Boston. The whiff of disapproval hung about until 1935, when Colonial Williamsburg appointed Louise B. Fisher, an underemployed and overqualified professor’s wife, to be its head of flowers and decorations. Inspired by her love of 15th-century Italian art, Fisher allowed her wreath designs to run riot on the excuse that she was only adding fruits and other whimsies that had been available during the Colonial era.

Thousands of visitors saw her designs and returned home to copy them. Like Wichern, she helped inspire a whole generation to be unapologetically joyous with their wreaths. Thanks in part to her efforts, America’s front doors are a thing to behold at this time of year, proving that it’s never too late to pour new energy into an old tradition.

Historically Speaking: A Tale of Two Hats

Napoleon’s bicorne and Santa Claus’s red cap both trace their origins to the felted headgear worn in Asia Minor thousands of years ago.

December makes me think of hats—well, one hat in particular. Not Napoleon’s bicorne hat, an original of which (just in time for Ridley Scott’s movie) sold for $2.1 million at an auction last month in France, but Santa’s hat.

The two aren’t as different as you might imagine. They share the same origins and, improbably, tell a similar story. Both owe their existence to the invention of felt, a densely matted textile. The technique of felting was developed several thousand years ago by the nomads of Central Asia. Since felt stays waterproof and keeps its shape, it could be used to make tents, padding and clothes.

The ancient Phrygians of Asia Minor were famous for their conical felt hats, which resemble the Santa cap but with the peak curving upward and forward. Greek artists used them to indicate a barbarian. The Romans adopted a red, flat-headed version, the pileus, which they bestowed on freed slaves.

Although the Phrygian style never went out of fashion, felt was largely unknown in Western Europe until the Crusades. Its introduction released a torrent of creativity, but nothing matched the sensation created by the hat worn by King Charles VII of France in 1449. At a celebration to mark the French victory over the English in Normandy, he appeared in a fabulously expensive, wide-brimmed, felted beaver-fur hat imported from the Low Countries. Beaver hats were not unknown; the show-off merchant in Chaucer’s “Canterbury Tales” flaunts a “Flandrish beaver hat.” But after Charles, everyone wanted one.

Hat brims got wider with each decade, but even beaver fur is subject to gravity. By the 17th century, wearers of the “cavalier hat” had to cock or fold up one or both sides for stability. Thus emerged the gentleman’s three-sided cocked hat, or tricorne, as it later became known—the ultimate divider between the haves and the have-nots.

The Phrygian hat resurfaced in the 18th century as the red “Liberty Cap.” Its historical connections made it the headgear of choice for rebels and revolutionaries. During the Reign of Terror, any Frenchman who valued his head wore a Liberty Cap. But afterward, it became synonymous with extreme radicalism and disappeared. In the meantime, the hated tricorne had been replaced by the less inflammatory top hat. It was only naval and military men, like Napoleon, who could get away with the bicorne.

The wide-brimmed felted beaver hat was resurrected in the 1860s by John B. Stetson, then a gold prospector in Colorado. Using the felting techniques taught to him by his hatter father, Stetson made himself an all-weather head protector, turning the former advertisement for privilege into the iconic hat of the American cowboy.

Thomas Nast, the Civil War caricaturist and father of Santa Claus’s modern image, performed a similar rehabilitation on the Phrygian cap. To give his Santa a far-away but still benign look, he gave him a semi-Phrygian cap crossed with a camauro, the medieval clergyman’s cap. Subsequent artists exaggerated the peak and cocked it back, like a nightcap. Thus the red cap of revolution became the cartoon version of Christmas.

In this tale of two hats lies a possible rejoinder to the cry in T.S. Eliot’s “The Waste Land”: “Who is the third who walks always beside you?” It is history, invisible yet present, protean yet permanent—and sometimes atop Santa’s head.

Historically Speaking: The Enduring Allure of a Close Shave

Men may be avoiding their razors for ‘Movember,’ but getting rid of facial and other body hair goes back millennia in many different cultures.

November is a tough time for razors. Huge numbers of them haven’t seen their owners since the start of “Movember,” the annual no-shave fundraiser for men’s health. But the razors needn’t fear: The urge of men to express themselves by trimming and removing their hair, facial and otherwise, runs deep.

Although no two cultures share the exact same attitude toward shaving, it has excited strong feelings in every time and place. The act is redolent with symbolism, especially religious and sexual. Even Paleolithic Man shaved his hair on occasion, using shells or flint blades as crude razors. Ancient Egypt was the earliest society to make hair removal a way of life for both sexes. By 3000 B.C., the Egyptians were using gold and copper to manufacture razors with handles. The whole head was shaved and covered by a wig, a bald head being a practical solution against head lice and overheating. Hair’s association with body odor and poor hygiene also made its absence a sign of purity.

Around 1900 B.C., the Egyptians started experimenting with depilatory pastes made of beeswax and boiled caramel. The methods were so efficacious that many of them are still used today. The only parts never shaved were the eyebrows, except when mourning the death of a cat, a sacred animal in Egyptian culture. The Greco-Roman world adopted the shaved look following Alexander the Great, who thought beards were a liability in battle, since they could be grabbed and pulled. The Romans took their cue from him.

But they also shared the Egyptian obsession with body hair. The Roman fetish for tweezing created professional hair pluckers. The philosopher Seneca, who lived next door to a public bath, complained bitterly about the loud screaming from plucking sessions, which intruded upon the serenity he required for deep thoughts. Both pluckers and barbers disappeared during Rome’s decline.

Professional barbers only reappeared in significant numbers after 1163 when the Council of Tours banned the clergy, who often provided this service, from shedding blood of any kind. The skills of barber-surgeons improved, unlike their utensils, which hardly changed at all until the English invented the foldable straight razor in the late 17th century.

The innovation prompted a “safety” race between English and French blade manufacturers. The French surged ahead in the late 18th century with the Perret razor, which had protective guards on three sides. The English responded with the T-shaped razor patented by William Henson in 1847. In 1875, the U.S. leapfrogged its European rivals with electrolysis, still the best way to remove hair permanently.

Nevertheless, the holy grail of a safe, self-shaving experience remained out of reach until King Camp Gillette, a traveling salesman from New York, worked with an engineer to produce the first disposable, double-edged razorblade. His Gillette razor went on sale in 1903. By the end of World War I, he was selling millions—and not just to men.

Gillette seized on the fashion for sleeveless summer dresses to market the Milady Décolleté Gillette Razor for the “smooth underarm” look. The campaign to convince women to use a traditionally male tool was helped by a series of scandals in the 1920s and ’30s over tainted hair products. The worst involved Koremlu, a depilatory cream that was largely rat poison.

The razor may be at least 50,000 years old, but it remains an essential tool. And it’s a great stocking stuffer.

Historically Speaking: Marriage as a Mirror of Human Nature

From sacred ritual to declining institution, wedlock has always reflected our ideas about liberty and commitment.

The Wall Street Journal

October 26, 2023

Marriage is in decline in almost every part of the world. In the U.S., the marriage rate is roughly six per 1,000 people, a fall of nearly 60% since the 1970s. But this is still high compared with most of the highly developed countries in the Organization for Economic Cooperation and Development, where the average marriage rate has dropped below four per 1,000. Modern views on marriage are sharply divided: In a recent poll, two in five young adult Americans said that the institution has outlived its usefulness.

The earliest civilizations had no such thoughts. Marriage was an inseparable part of the religious and secular life of society. In Mesopotamian mythology, the first marriage was the heavenly union between Inanna/Ishtar, the goddess of war and love, and her human lover, the shepherd Dumuzi. Each year, the high point of the religious calendar was the symbolic re-enactment of the Sacred Marriage Rite by the king and the high priestess of the city.

Throughout the ancient world, marriage placed extra constraints on women while allowing polygamy for men. The first major change to the institution took place in ancient Greece. A marriage between one man and one woman, with no others involved, became the bedrock of democratic states. According to Athenian law, only the son of two married citizens could inherit the rights of citizenship. The change altered the definition of marriage to give it a civic purpose, although women’s subordination remained unchanged.

At the end of the 1st century B.C., Augustus Caesar, the founder of the Roman Empire, tried to use the law to reinvigorate “traditional” marriage values. But it was the Stoic philosophers who had the greatest impact on ideas about marriage, teaching that its purpose included personal fulfillment. The 1st-century philosopher Musonius Rufus argued that love and companionship weren’t just incidental benefits but major purposes of marriage.

The early Church’s general hostility toward sex did away with such views. Matrimony was considered less desirable than celibacy; priests didn’t start officiating at wedding ceremonies until the 800s. On the other hand, during the 12th century the Catholic Church made marriage one of the seven unbreakable sacraments. In the 16th century, its intransigence on divorce resulted in King Henry VIII establishing the Anglican Church so he could leave Catherine of Aragon and marry Anne Boleyn.

In the U.S. after the Civil War, thousands of former slaves applied for marriage certificates from the Freedmen’s Bureau. Concurrently, between 1867 and 1886, there were 328,716 divorces among all Americans. The simultaneous moves by some to escape the bonds of matrimony, and by others to have the right to claim it, highlight the institution’s peculiar place in our ideas of individual liberty.

In 1920, female suffrage transformed the nature of marriage yet again, implicitly recognizing the right of wives to a separate legal identity. Still, the institution survived and even thrived. At the height of World War II in 1942, weddings were up 83% from the previous decade.

Though marriage symbolizes stability, its meaning is unstable. It doesn’t date or fall behind; for better or worse, it simply reflects who we are.

Historically Speaking: Sending Cards for a Happy Birthday

On Oct. 26, imprisoned WSJ reporter Evan Gershkovich will turn 32. Since ancient times, birthdays have been occasions for poems, letters and expressions of solidarity.

The Wall Street Journal

October 13, 2023

Wall Street Journal reporter Evan Gershkovich turns 32 on Oct. 26. This year he will be spending his birthday in Lefortovo prison in Moscow, a detention center for high-profile and political prisoners. He has been there for the past six months, accused of espionage—a charge vehemently denied by the U.S. government and The Journal.

Despite the extreme restrictions placed on Lefortovo prisoners, it is still possible to send Evan messages of support, such as a birthday card, via the U.S. Embassy in Moscow or the freegershkovich.com website, to let him know that the world cares.

Birthday cards are so cheap and plentiful it is easy to miss their cultural value. They are the modern iteration of a literary tradition that goes back at least to Roman times. Poets were especially given to composing birthday odes to their friends and patrons. The Augustan poet Horace dedicated many of his poems to Maecenas, whose birthday, he wrote, “is almost more sacred to me than that of my own birth.”

The custom of birthday salutations petered out along with much else during the Dark Ages but was revived with the spread of mass literacy. Jane Austen would write to her siblings on their birthdays, wishing them the customary “joy,” but toward the end of her life she began to experiment with the form. In 1817, she sent her three-year-old niece Cassy a special birthday letter written in reverse spelling, beginning with “Ym raed Yssac.”

Austen’s sense that a birthday letter ought to be unique coincided with a technological race in the printing industry. One of the first people to realize the commercial potential of greeting cards was Louis Prang, a German immigrant in Boston, who began selling printed cards in 1856. Holiday cards were an instant success, but birthday cards were less popular until World War I, when many American families had a relative fighting overseas.

Demand for birthday cards stayed high after the war, as did the importance attached to them. King George V seized on their popularity to introduce the royal tradition of sending every British citizen who reaches 100 a congratulatory birthday card. In 1926, to show how much they appreciated the gift of U.S. aid, more than 5 million Poles signed a 30,000-page birthday card commemorating America’s 150th anniversary.

During the Cold War, the symbolism of the birthday card became a power in itself. In 1984, Illinois Rep. John Edward Porter and other members of Congress sent birthday cards to Mart Niklus, an Estonian civil rights campaigner imprisoned in the U.S.S.R. By coincidence, the Soviets released Niklus in July 1988, the same month that Nelson Mandela received more than 50,000 cards for his 70th birthday. The frustrated South African prison authorities allowed him to have 12 of them. But the writing was on the wall, as it were, and Mandela was released from prison two years later.

Rep. Porter didn’t know what effect his birthday card to Niklus would have. “I doubt he will get them,” he told the House. “Yet by sending these birthday cards…we let the Soviet officials know that we will not forget him.”

I am sending my birthday card to Evan in the same spirit.

Historically Speaking: Broken Hearts and How to Heal Them

Modern medicine confirms what people have known for thousands of years: heartbreak is more than a metaphor.

The Wall Street Journal

September 30, 2023

A mere generation ago, “heartbreak” was an overused literary metaphor but not an actual medical event. The first person to recognize it as a genuine condition was a Japanese cardiologist named Hikaru Sato. In 1990, Dr. Sato identified the curious case of a female patient who displayed the symptoms of a heart attack while testing negative for one. He named the condition “Takotsubo Syndrome” after noticing that the left ventricle of her heart changed shape during the episode to resemble a takotsubo, a traditional octopus trap. A Japanese study in 2001 not only confirmed Sato’s identification of a sudden cardiac event that mimics a heart attack but also highlighted the common factor of emotional distress in such patients. It had taken the medical profession 4,000 years to acknowledge what poets had been saying all along: Broken Heart Syndrome is real.

The heart has always been regarded as more than just a pump. The Sumerians of ancient Mesopotamia, now part of modern Iraq, understood it performed a physical function. But they also believed it was the source of all emotion, including love, happiness and despair. One of the first known references to “heartbreak” appears in a 17th century B.C. clay tablet containing a copy of “Atrahasis,” a Babylonian epic poem that parallels the Old Testament story of Noah’s Ark. The words “heart” and “break” are used to describe Atrahasis’s pain at being unable to save people from their imminent doom.

The heart also played a dual mind-body role in ancient Chinese medicine. There was a great emphasis on the importance of emotional regulation, since an enraged or greedy heart was believed to affect other organs. The philosopher Confucius used the heart as an analogy for the perfect relationship between the king and his people: Harmony in the latter and obedience from the former were both essential.

Heart surgeon Daniel Hale Williams. ILLUSTRATION BY THOMAS FUCHS

In the West, the early Catholic Church adopted a more top-down approach to the heart and its emotional problems. Submitting to Christ was the only treatment for what St. Augustine described as the discomfort of the unquiet heart. Even then, the avoidance of heart “pain” was not always possible. For the 16th-century Spanish saint Teresa of Avila, the agonizing sensation of being pierced in the heart was the necessary proof she had received God’s love.

By Shakespeare’s era, the idea of dying for love had become a cliché, but the deadly effects of heartbreak were accepted without question. Grief and anguish kill several of Shakespeare’s characters, including Lady Montague in “Romeo and Juliet,” King Lear, and Desdemona’s father in “Othello.” Shame drives Enobarbus to will his heart to stop in “Antony and Cleopatra”: “Throw my heart against the flint and hardness of my fault.”

London parish clerks continued to list grief as a cause of death until the 19th century, by which time advances in medical science had produced more mechanical explanations for life’s mysteries. In 1893, Daniel Hale Williams—founder of Provident Hospital in Chicago, the first Black-owned hospital in the U.S.—performed one of the earliest successful heart surgeries. He quite literally fixed the broken heart of a stabbing victim by sewing the pericardium, or heart sac, back together.

Nowadays, there are protocols for treating the coronary problem diagnosed by Dr. Sato. But although we can cure Broken Heart Syndrome, we still can’t cure a broken heart.

Historically Speaking: Tourists Behaving Badly

When today’s travelers get in trouble for knocking over statues or defacing temples, they’re following an obnoxious tradition that dates back to the Romans.

The Wall Street Journal

September 8, 2023

Tourists are giving tourism a bad name. The industry is a vital cog in the world economy, generating more than 10% of global GDP in 2019. But the antisocial behavior of a significant minority is causing some popular destinations to enact new rules and limits. Among the list of egregious tourist incidents this year, two drunk Americans had to be rescued off the Eiffel Tower, a group of Germans in Italy knocked over a 150-year-old statue while taking selfies, and a Canadian teen in Japan defaced an 8th-century temple.

It’s ironic that sightseeing, one of the great perks of civilization, has become one of its downsides. The ancient Greeks called it theoria and considered it to be both good for the mind and essential for appreciating one’s own culture. As the 5th-century B.C. Greek poet Lysippus quipped: “If you’ve never seen Athens, your brain’s a morass./If you’ve seen it and weren’t entranced, you’re an ass.”

The Romans surpassed the Greeks in their love of travel. Unfortunately, they became the prototype for that tourist cliché, the “ugly American,” since they were rich, entitled and careless of other people’s heritage. The Romans never saw an ancient monument they didn’t want to buy, steal or cover in graffiti. The word “miravi”—“I was amazed”—was the Latin equivalent of “Kilroy was here,” and can be found all over Egypt and the Mediterranean.


Mass tourism picked up during the Middle Ages, facilitated by the Crusades and the popularity of religious pilgrimages. But so did the Roman habit of treating every ancient building like a public visitor’s book. The French Crusader Lord de Coucy actually painted his family coat of arms onto one of the columns of the Church of the Nativity in Bethlehem.

In 17th- and 18th-century Europe, the scions of aristocratic families would embark on a Grand Tour of famous sights, especially in France and Italy. The idea was to turn them into sophisticated men of the world, but for many young men, the real point of the jaunt was to sample bordellos and be drunk and disorderly without fear of their parents finding out.

Even after the Grand Tour went out of fashion, the figure of the tourist usually conjured up negative images. Visiting Alexandria in Egypt in the 19th century, the French novelist Gustave Flaubert raged at the “imbecile” who had painted the words “Thompson of Sunderland” in six-foot-high letters on Pompey’s Pillar. The perpetrators were in fact the rescued crew of a ship named the Thompson.

Flaubert was nevertheless right about the sheer destructiveness of some tourists. Souvenir hunters were among the worst offenders. In the Victorian era, Stonehenge in England was chipped and chiseled with such careless disregard that one of its massive stones eventually collapsed.

Sometimes tourists go beyond vandalism to outright madness. Jerusalem Syndrome, first recognized in the Middle Ages, is the sudden onset of religious delusions while visiting the biblical city. Stendhal Syndrome is an acute psychosomatic reaction to the beauty of Florence’s artworks, named for the French writer who suffered such an attack in 1817. There’s also Paris Syndrome, a transient psychosis triggered by extreme feelings of letdown on encountering the real Paris.

As for Stockholm Syndrome, when an abused person identifies with their abuser, there’s little chance of it developing in any of the places held hostage by hordes of tourists.

Historically Speaking: The Many Ingredients of Barbecue

Native Americans, European settlers and African slaves all contributed to creating an American culinary tradition.

The Wall Street Journal

August 18, 2023

There are more than 30,000 BBQ joints in the U.S., but as far as the Michelin Guide is concerned, not one of them is worthy of a coveted star. Many Americans would say that the fault, dear Brutus, is not in ourselves but in the stars.

A 1562 illustration shows Timucua Indians roasting animals on a raised wooden platform, the original form of barbecuing.

The American barbecue—cooking meat with an indirect flame at low temperature over seasoned wood or charcoal—is a centuries-old tradition. (Using the term for any kind of outdoor grilling came much later.) Like America itself, it is a cultural hybrid. Indigenous peoples in the Caribbean and the Americas would place a whole animal carcass on a wooden platform several feet above a fire and let the smoke do the cooking. The first Spanish arrivals were fascinated by the technique, and translated a native word for the platform as “barbacoa.”

Lyndon B. Johnson (right) and Hubert Humphrey celebrate their election victory with barbecue, November 1964.

The Europeans began to barbecue pigs and cattle, non-native animals that easily adapted to the New World. Another important culinary contribution—using a ground trench instead of a raised platform—may have been spread by African slaves. The 18th-century African abolitionist Olaudah Equiano described seeing the Miskito of Honduras, a mixed community of Indians and Africans, barbecue an alligator: “Their manner of roasting is by digging a hole in the earth, and filling it with wood, which they burn to coal, and then they lay sticks across, on which they set the meat.”

European basting techniques also played a role. The most popular recipes for barbecue sauce reflect historic patterns of immigration to the U.S.: British colonists used a simple concoction of vinegar and spices, French émigrés insisted on butter, and German settlers preferred their native mustard. In the American West, two New World ingredients, tomatoes and molasses, formed the basis of many sauces. The type of meat became another regional difference: pork was more plentiful in the South, beef in the West.

Although labor-intensive, a barbecued whole hog can feed up to 150 people, making it the ideal food for communal gatherings. In 1793, President George Washington celebrated the laying of the cornerstone for the Capitol building with an enormous barbecue featuring a 500-pound ox.

In the South before the Civil War, a barbecue meant a hog cooked by slaves. The choicest cuts from the pig’s back went to the grandees, hence the phrase “living high on the hog.” Emancipation brought about a culinary reckoning; Southern whites wanting a barbecue had to turn to cookbooks, such as “Mrs. Hill’s New Cook Book,” published in 1867 for “inexperienced Southern housekeepers…in this peculiar crisis of our domestic as well as national affairs.”

In the 20th century, the slower rate of urbanization outside the North helped to keep the outdoor barbecue alive. As a Texan, President Lyndon B. Johnson used “barbecue diplomacy” to project a folksy image, breaking with the refined European style of the Kennedys to endear himself to ordinary Americans. The ingredients in Lady Bird Johnson’s barbecue sauce embraced as many regional varieties as possible by including butter, ketchup and vinegar.

Commercial BBQ sauces, which first became available in 1909, offer a convenient substitute for making your own. But for most people, to experience real barbecue requires that other quintessentially American pastime, the road trip. Just leave the Michelin Guide at home.

Historically Speaking: The Enduring Technology of the Book

Durable, stackable and skimmable, books have been the world’s favorite way to read for two millennia and counting.

The Wall Street Journal

August 3, 2023

A fragment of the world’s oldest book was discovered earlier this year. Dated to about 260 B.C., the 6-by-10-inch piece of papyrus survived thanks to ancient Egyptian embalmers who recycled it for cartonnage, a papier-mache-like material used in mummy caskets. The Graz Mummy Book, so-called because it resides in the library of Austria’s Graz University, is 400 years older than the previous record holder, a fragment of a Latin book from the 2nd century A.D.

Stitching on the papyrus shows that it was part of a book with pages rather than a scroll. Scrolls served well enough in the ancient world, when only priests and scribes used them, but as the literacy rate in the Roman Empire increased, so did the demand for a more convenient format. A durable, stackable, skimmable, stitched-leaf book made sense. Its resemblance to a block of wood inspired the Latin name caudex, “bark stem,” which evolved into codex, the word for an ancient manuscript. The 1st-century Roman poet and satirist Martial was an early adopter: A codex contained more pages than the average scroll, he told his readers, and could even be held in one hand!


The book developed in different forms around the world. In India and parts of southeast Asia, dried palm leaves were sewn together like venetian blinds. The Chinese employed a similar technique using bamboo or silk until the third century A.D., when hemp paper became a reliable alternative. In Mesoamerica, the Mayans made their books from fig-tree bark, which was pliable enough to be folded into leaves. Only four codices escaped the mass destruction of Mayan culture by Franciscan missionaries in the 16th century.

Gutenberg’s printing press, perfected in 1454, made that kind of annihilation impossible in Europe. By the 16th century, more than nine million books had been printed. Authorities still tried their best to exert control, however. In 1538, England’s King Henry VIII prohibited the selling of “naughty printed books” by unlicensed booksellers.

Licensed or not, the profit margins for publishers were irresistible, especially after Jean Grolier, a 16th-century Treasurer-General of France, started the fashion for expensively decorated book covers made of leather. Bookselling became a cutthroat business. Shakespeare was an early victim of book-piracy: Shorthand stenographers would hide among the audience and surreptitiously record his plays so they could be printed and sold.

Beautiful leather-bound books never went out of fashion, but by the end of the 18th century, there was a new emphasis on cutting costs and shortening production time. Germany experimented with paperbacks in the 1840s, but these were downmarket prototypes that failed to catch on.

The paperback revolution was started in 1935 by the English publisher Allen Lane, who one day found himself stuck at a train station with nothing to read. Books were too rarefied and expensive, he decided. Facing down skeptics, Lane created Penguin and proceeded to publish 10 literary novels as paperbacks, including Ernest Hemingway’s “A Farewell to Arms.” A Penguin book had a distinctive look that signaled quality, yet it cost the same as a packet of cigarettes. The company sold a million paperbacks in its first year.

Radio was predicted to spell the downfall of books; so were television, the Internet and e-books. For the record, Americans bought more than 788.7 million physical books last year. Not bad for an invention well into its third millennium.

Historically Speaking: Saving Lives With Lighthouses

Since the first one was built in ancient Alexandria, lighthouses have helped humanity master the danger of the seas.

The Wall Street Journal

July 21, 2023

For those who dream big, there will be a government auction on Aug. 1 for two decommissioned lighthouses, one in Cleveland, Ohio, the other in Michigan’s Upper Peninsula. Calling these lighthouses “fixer-uppers,” however, hardly does justice to the challenge of converting them into livable homes. Lighthouses were built so man could fight nature, not sit back and enjoy it.

France’s Cordouan Lighthouse. GETTY IMAGES

The Lighthouse of Alexandria, the earliest one recorded, was one of the ancient Seven Wonders of the World. An astonishing 300 feet tall or more, it was commissioned in 290 B.C. by Ptolemy I Soter, the founder of Egypt’s Ptolemaic dynasty, to guide ships into the harbor and keep them from the dangerous shoals surrounding the entrance. No word existed for lighthouse, hence it was called the Pharos of Alexandria, after the small islet on which it was located.

The Lighthouse did wonders for the Ptolemies’ reputation as the major power players in the region. The Romans implemented the same strategy on a massive scale. Emperor Trajan’s Torre de Hércules in A Coruña, in northwestern Spain, can still be visited. But after the empire’s collapse, its lighthouses were abandoned.

More than a thousand years passed before Europe again possessed the infrastructure and maritime capacity to need lighthouses, let alone build them. The contrasting approaches of France and England say much about the two cultures. The French regarded them as a government priority, resulting in such architectural masterpieces as Bordeaux’s Cordouan Lighthouse, commissioned by Henri III in 1584. The English entrusted theirs to Trinity House, a private charity, which led to inconsistent implementation. In 1707, poor lighthouse guidance contributed to the sinking of Admiral Sir Cloudesley Shovell’s fleet off the coast of the Scilly Isles, costing his life and roughly 1,500 others.

In 1789, the U.S. adopted a third approach. Alexander Hamilton, the first Secretary of the Treasury, argued that federal oversight of lighthouses was an important symbol of the new government’s authority. Congress ordered the states to transfer control of their existing lighthouses to a new federal agency, the U.S. Lighthouse Establishment. But in the following decades Congress’s chief concern was cutting costs. America’s lighthouses were decades behind Europe’s in adopting the Fresnel lens, invented in France in 1822, which concentrated light into a powerful beam.

The U.S. had caught up by the time of the Civil War, but no amount of engineering improvements could lessen the hardship and dangers involved in lighthouse-keeping. Isolation, accidents and deadly storms took their toll, yet it was one of the few government jobs open to women. Ida Lewis saved at least 18 people from drowning during her 54-year tenure at Lime Rock Station in Newport, R.I.

Starting in the early 1900s, there were moves to convert lighthouses to electricity. The days of the lighthouse keeper were numbered. Fortunately, when a Category 4 hurricane hit Galveston, Texas, on Sept. 8, 1900, its lighthouse station was still fully manned. The keeper, Henry C. Claiborne, managed to shelter 125 people in his tower before the storm surge engulfed the lower floors. Among them were the nine survivors of a stranded passenger train. Claiborne labored all night, manually rotating the lens after its mechanical parts became stuck. The lighthouse was a beacon of safety during the storm, and a beacon of hope afterward.

To own a lighthouse is to possess a piece of history, plus unrivaled views and not a neighbor in sight—a bargain whatever the price.