Historically Speaking: When Masquerade Was All the Rage

Before there was Halloween, there were costume balls and Carnival, among other occasions for the liberation of dressing up

The Wall Street Journal

October 28, 2021

Costume parades and Halloween parties are back after being canceled last year. Donning a costume and mask to go prancing around might seem like the height of frivolity, but the act of dressing up has deep roots in the human psyche.

During the early classical era, worshipers at the annual festivals of Dionysus—the god of wine, ritual madness and impersonation, among other things—expanded mask-wearing from religious use to personal celebrations and plays performed in his honor. Masks symbolized the suspension of real world rules: A human could become a god, an ordinary citizen could become a king, a man could be a woman. Anthropologists call such practices “rituals of inversion.”

In Christianized Europe, despite official disapproval of paganism, rituals of inversion not only survived but flourished. Carnival—possibly a corruption of the Latin phrase “carne vale,” farewell to meat, because the festival took place before Lent—included the Feast of Fools, where junior clergymen are alleged to have dressed as nuns and bishops and danced in the streets.

By the 13th century, the Venetians had taken to dressing up and wearing masks with such gusto that the Venice Carnival became an occasion for ever more elaborate masquerade. The city’s Great Council passed special laws to keep the practice within bounds, such as banning masks while gambling or visiting convents.

ILLUSTRATION: THOMAS FUCHS

The liberation granted by a costume could be dangerous. In January of 1393, King Charles VI of France and his wife, Isabeau of Bavaria, held the Bal des Sauvages, or Wild Men’s Ball, to celebrate the wedding of one of her ladies-in-waiting. The king had already suffered his first bout of insanity, and it was hoped that the costume ball would be an emotional outlet for his disordered mind. But the farce became a tragedy. The king and his entourage, dressed as hairy wild men, were meant to perform a “crazy” dance. Horrifically, the costumes caught fire, and only Charles and one other knight survived.

The masked ball became a staple of royal entertainments, offering delicious opportunities for sexual subterfuge and social subversion. At a masquerade in 1745, Louis XV of France disguised himself as a yew tree so he could pursue his latest love, the future Madame de Pompadour. Meanwhile, the Dauphine danced the night away with a charming Spanish knight, not realizing he was a lowly cook who had tricked his way in. More ominously, a group of disaffected nobles infiltrated a masquerade in Stockholm to assassinate King Gustav III of Sweden in 1792. Five years later, the new ruler of Venice, Francis II of Austria, banned Carnival and forbade the city’s residents to wear masks.

Queen Victoria helped to return dress-up parties to respectability with historically themed balls that celebrated creativity rather than debauchery. By 1893, American Vogue could run articles about fabulous Halloween costumes without fear of offense. The first Halloween parade took place not in cosmopolitan New York but in rural Hiawatha, Kansas, in 1914.

In the modern era, the taint of anarchy and licentiousness associated with dressing up has been replaced by complaints about cultural appropriation, a concern that would have baffled our ancestors. Becoming what we are not, however briefly, is part of being who we are.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 B.C., suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 B.C., at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 B.C.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.

ILLUSTRATION: JAMES STEINBERG

The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Urged on by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theater in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. military in the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Dante’s Enduring Vision of Hell

The “Inferno” brought human complexity to the medieval conception of the afterlife

The Wall Street Journal

September 30, 2021

What is hell? For Plato, it was Tartarus, the lowest level of Hades where those who had sinned against the gods suffered eternal punishment. For Jean-Paul Sartre, the father of existentialism, hell was other people. For many travelers today, it is airport security.

No depiction of hell, however, has been more enduring than the “Inferno,” part one of the “Divine Comedy” by Dante Alighieri, the 700th anniversary of whose death is commemorated this year. Dante’s hell is divided into nine concentric circles, each one more terrifying and brutal than the last until the frozen center, where Satan resides alongside Judas, Brutus and Cassius. With Virgil as his guide, Dante’s spiritually bereft and depressed alter ego enters via a gate bearing the motto “Abandon all hope, ye who enter here”—a phrase so ubiquitous in modern times that it greets visitors to Disney’s Pirates of the Caribbean ride.

The inscription was a Dantean invention, but the idea of a physical gate separating the land of the living from a desolate one of the dead was already at least 3,000 years old: In the Sumerian Epic of Gilgamesh, written around 2150 B.C., two scorpionlike figures guard the gateway to an underworld filled with darkness and dust.

ILLUSTRATION: THOMAS FUCHS

The underworld of the ancient Egyptians was only marginally less bleak. Seven gates blocked the way to the Hall of Judgment, according to the Book of the Dead. Getting through them was arduous and fraught with failure. The successful then had to submit to having their hearts weighed against the Feather of Truth. Those found wanting were thrown into the fire of oblivion.

Zoroastrianism, the official religion of the ancient Persians, was possibly the first to divide the afterlife into two physically separate places, one for good souls and the other for bad. This vision contrasted with the Greek view of Hades as the catchall for the human soul and the early Hebrew Bible’s description of Sheol as a shadowy pit of nothingness. In the 4th century B.C., Alexander the Great’s Macedonian empire swallowed both Persia and Judea, and the three visions of the afterlife commingled. “Hell” would then appear frequently in Greek versions of the New Testament. But the word, scholars point out, was a single translation for several distinct Hebrew terms.

Early Christianity offered more than one vision of hell, but all contained the essential elements of Satan, sinners and fire. The “Apocalypse of Peter,” a 2nd century text, helped start the trend of listing every sadistic torture that awaited the wicked.

Dante was thus following a well-trodden path with his imaginatively crafted punishments of boiling pitch for the dishonest and downpours of icy rain on the gluttonous. But he deviated from tradition by describing Hell’s occupants with psychological depth and insight. Dante’s narrator rediscovers the meaning of Christian truth and love through his encounters. In this way the Inferno speaks to the complexities of the human condition rather than serving merely as a literary zoo of the damned.

The “Divine Comedy” changed the medieval world’s conception of hell, and with it, man’s understanding of himself. Boccaccio, Chaucer, Milton, Balzac—the list of writers directly inspired by Dante’s vision goes on. “Dante and Shakespeare divide the world between them,” wrote T.S. Eliot. “There is no third.”

Historically Speaking: For Punishment or Penitence?

Fifty years ago, the Attica uprising laid bare the conflicting ideas at the heart of the U.S. prison system.

The Wall Street Journal

September 17, 2021

Fifty years ago this past week, inmates in Attica, New York, staged America’s deadliest prison uprising. The organizers held prison employees hostage while demanding better conditions. One officer and three inmates were killed during the rioting, and the revolt’s suppression left another 39 dead and at least 89 seriously wounded. The episode raised serious questions about prison conditions and ultimately led to some reforms.

Nearly two centuries earlier, the founders of the U.S. penal system had intended it as a humane alternative to those that relied on such physical punishments as mutilation and whipping. After the War of Independence, Benjamin Franklin and leading members of Philadelphia’s Quaker community argued that prison should be a place of correction and penitence. Their vision was behind the construction of the country’s first “penitentiary house” at the Walnut Street Jail in Philadelphia in 1790. The old facility threw all prisoners together; its new addition contained individual cells meant to prevent moral contagion and to encourage prisoners to spend time reflecting on their crimes.

Inmates protest prison conditions in Attica, New York, Sept. 10, 1971

Walnut Street inspired the construction of the first purpose-built prison, Eastern State Penitentiary, which opened outside of Philadelphia in 1829. Prisoners were kept in solitary confinement and slept, worked and ate in their cells—a model that became known as the Pennsylvania system. Neighboring New York adopted the Auburn system, which also enforced total silence but required prisoners to work in communal workshops and instilled discipline through surveillance, humiliation and corporal punishment. Although both systems were designed to prevent recidivism, the former stressed prisoner reform while the latter carried more than a hint of retribution.

Europeans were fascinated to see which system worked best. In 1831, the French government sent Alexis de Tocqueville and Gustave de Beaumont to investigate. Having inspected facilities in several states, they concluded that although the “penitentiary system in America is severe,” its combination of isolation and work offered hope of rehabilitation. But the novelist Charles Dickens reached the opposite conclusion. After touring Eastern State Penitentiary in 1842, he wrote that the intentions behind solitary confinement were “kind, humane and meant for reformation.” In practice, however, total isolation was “worse than any torture of the body”: It broke rather than reformed people.

Severe overcrowding—there was no parole in the 19th century—eventually undermined both systems. Prisoner violence became endemic, and regimes of control grew harsher. Sing Sing prison in New York meted out 36,000 lashes in 1843 alone. In 1870, the National Congress on Penitentiary and Reformatory Discipline proposed reforms, including education and work-release initiatives. Despite such efforts, recidivism rates remained high, physical punishment remained the norm and almost 200 serious prison riots were recorded between 1855 and 1955.

That year, Harry Manuel Shulman, a deputy commissioner in New York City’s Department of Correction, wrote an essay arguing that the country’s early failure to decide on the purpose of prison had immobilized the system, leaving it “with one foot in the road of rehabilitation and the other in the road of punishment.” Which would it choose? Sixteen years later, Attica demonstrated the consequences of ignoring the question.

Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.

ILLUSTRATION: THOMAS FUCHS

The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, foot-bound messengers ran impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to King Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. This, unfortunately, disqualified his time of 2:54:46.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: A Legacy of Tinderbox Forests

Long before climate change exacerbated the problem, policies meant to suppress wildfires served to fan the flames

The Wall Street Journal

August 19, 2021

This year’s heat waves and droughts have led to record-breaking wildfires across three continents. The fires in Siberia are so vast that smoke has reached the North Pole for what is believed to be the first time. In the United States, California’s Dixie Fire has become the largest single fire in the state’s history.

Humans have long wrestled with forest fire, seeking by turns to harness and to suppress it. Early European efforts to control forest fires were tentative and patchy. In the 14th century, the Sardinians experimented with fire breaks, but the practice was slow to catch on. In North America, by contrast, scientists have found 2,000-year-old evidence of controlled burnings by Native American tribes. But this practice died out with the arrival of European immigrants, because of local bans as well as the expulsion of tribes from native lands. As a consequence, forests not only became larger and denser but also filled with mounds of dead and dried vegetation, making them very susceptible to fire.

Disaster struck in the fall of 1871. Dozens of wildfires broke out simultaneously across the Upper Midwest and Great Lakes region, and on Oct. 8, the same night as the Chicago Fire, a firestorm engulfed the town of Peshtigo, Wisconsin, killing an estimated 1,200 people (and possibly many more). The fire was the deadliest in U.S. history.

ILLUSTRATION: ANTHONY FREDA

Early conservationists, such as Franklin Hough, sought to organize a national wildfire policy. The U.S. Forest Service was created in 1905 and was still figuring out its mission in 1910, when the northern Rockies went up in flames. Fires raced through Washington, Montana and Idaho, culminating in what is known as the “Big Blowup” of August 20 and 21.

One of the U.S. Forest Service’s newly minted rangers, Edward C. Pulaski, was leading a team of 45 firefighters near Wallace, Idaho. A firestorm surrounded the men, forcing Pulaski to lead them down a disused mine shaft. Several attempted to go back outside, believing they would be cooked alive in the smoke-filled mine. Pulaski managed to stop the suicidal exodus by threatening to shoot any man who tried to leave. To maintain morale, he organized a water bucket chain to prevent the blankets covering the exit from catching fire. Pulaski’s actions saved the lives of all but five of the firefighters, but his eyesight and lungs never recovered.

The Big Blowup destroyed more than 3 million acres in two days and killed at least 80 people. In response to the devastation, the Forest Service, with the public’s support, adopted the mistaken goal of total fire suppression rather than fire management. The policy remained in place until 1978, bequeathing to the country a legacy of tinderbox forests.

Nowadays, the Forest Service conducts controlled burns, known as prescribed fires, to mitigate the risk of wildfire. The technique is credited with helping to contain July’s 413,000-acre Bootleg Fire in Oregon.

Fire caused by the effects of climate change will require human intervention of another order. In the Bible’s Book of Isaiah, the unhappy “sinners of Zion” cry out: “Who of us can live where there is a consuming fire? Who among us can dwell with everlasting burnings?” Who, indeed.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the war. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th century Russian painting
PHOTO: ALAMY

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized, the need for animals increased. Artillery needed transporting; supplies, camps and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: Diabetes and the Miracle of Insulin

One hundred years ago, a team of Canadian researchers showed that an age-old disease didn’t have to mean a death sentence.

The Wall Street Journal

July 22, 2021

The human body runs on glucose, a type of sugar that travels through the bloodstream to the cells, where it is converted into energy. Some 34.2 million Americans are diabetic; their bodies cannot produce or properly use the hormone insulin, which regulates how glucose is processed and stored in the cells. Without treatment the condition is terminal. The discovery of insulin a century ago this year was one of the great medical breakthroughs of the 20th century.

Diabetes was first recognized some 4,000 years ago. The Ebers Papyrus, an Egyptian medical text written around 1550 B.C., refers to patients suffering from thirst, frequent urination and weight loss. An ancient Indian text, the Sushruta Samhita, composed after the 7th century B.C., advised testing for diabetes by seeing whether ants were attracted to the sugar in the urine.

The copious urination experienced by sufferers is probably the reason why the ancient Greeks called the disease “diabetes,” their word for “siphon” or “to pass through.” They also made the link to lifestyle: The 5th century B.C. physician Hippocrates, the “father of medicine,” advocated exercise as part of the treatment.

Early on, the Chinese recognized that an unbalanced diet of sweet, rich and fatty foods could play a role. Lady Dai, a minor aristocrat who died in the 2nd century B.C., was a textbook case. Her perfectly preserved mummy, discovered in southern China in 1971, revealed a life of dietary excess. She also suffered the consequences: osteoarthritis, high cholesterol, hypertension, liver disease, gall stones and, crucially, diabetes.

Over time, physicians became more expert at diagnosis. The role of sugar came into sharper focus in the 1770s after the English doctor Matthew Dobson discovered that it stayed in the blood as well as urine. But no further progress was made until the end of the 19th century. In 1889 Oskar Minkowski conducted experiments on dogs at the University of Strasbourg to prove that a nonfunctioning pancreas triggered diabetes.

ILLUSTRATION: THOMAS FUCHS

By the early 20th century, scientists knew that a pancreatic secretion was responsible for controlling glucose in the body, but they couldn’t isolate it. The Canadian researchers credited with finding the answer—John Macleod, Frederick Banting, Charles Best and James Collip—were an unlikely team. Banting was a surgeon, not a scientist. Yet he sufficiently impressed Macleod, a professor of physiology at the University of Toronto, that the latter lent him lab space and a research assistant, Best. The pairing almost ended in a fist fight, but Banting and Best got over their differences and in July 1921 successfully injected insulin into a dog.

Macleod then brought in Collip, a biochemist on a research sabbatical, to help make a human-compatible version. This led to more infighting and Collip threatened to leave. The dysfunctional group somehow held together long enough to try their insulin on a 14-year-old named Leonard Thompson. He lived another 13 years.

The research team sold their patent rights to the University of Toronto for $1, believing insulin was too vital to be exploited. Their idealism was betrayed: Today, manufacture of the drug is controlled by three companies, and according to a 2018 Yale study published in JAMA, its high cost is forcing 1 in 4 Americans to skimp on their medication. The next frontier for insulin is finding a way to make it affordable for all.

Historically Speaking: The Beacon of the Public Library

Building places for ordinary people to read and share books has been a passion project of knowledge-seekers since before Roman times.

The Wall Street Journal

July 8, 2021

“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.

His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.

Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.

ILLUSTRATION: THOMAS FUCHS

Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact, the library declined through institutional neglect over many years. Caesar was himself responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.

Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.

A thousand years after Soter and Demetrius built the Library of Alexandria, Renaissance Florence benefited from a similar partnership between Cosimo de’ Medici and the scholar Niccolò de’ Niccoli. At Niccolò’s death in 1437, Cosimo carried out his friend’s wishes to bequeath his books to the people. Not only was the magnificent Biblioteca San Marco Italy’s first purpose-built public library, but its emphasis on reading spaces and natural light became the template for library architecture.

By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.

The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.

But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, with 1,679 spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” derives from the Latin officium, meaning a position or duty, itself an amalgamation of ob and facere, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.