Historically Speaking: Dante’s Enduring Vision of Hell

The “Inferno” brought human complexity to the medieval conception of the afterlife

The Wall Street Journal

September 30, 2021

What is hell? For Plato, it was Tartarus, the lowest level of Hades where those who had sinned against the gods suffered eternal punishment. For Jean-Paul Sartre, the father of existentialism, hell was other people. For many travelers today, it is airport security.

No depiction of hell, however, has been more enduring than the “Inferno,” part one of the “Divine Comedy” by Dante Alighieri, the 700th anniversary of whose death is commemorated this year. Dante’s hell is divided into nine concentric circles, each one more terrifying and brutal than the last until the frozen center, where Satan resides alongside Judas, Brutus and Cassius. With Virgil as his guide, Dante’s spiritually bereft and depressed alter ego enters via a gate bearing the motto “Abandon all hope, ye who enter here”—a phrase so ubiquitous in modern times that it greets visitors to Disney’s Pirates of the Caribbean ride.

The inscription was a Dantean invention, but the idea of a physical gate separating the land of the living from a desolate one of the dead was already at least 3,000 years old: In the Sumerian Epic of Gilgamesh, written around 2150 B.C., two scorpionlike figures guard the gateway to an underworld filled with darkness and dust.

ILLUSTRATION: THOMAS FUCHS

The underworld of the ancient Egyptians was only marginally less bleak. Seven gates blocked the way to the Hall of Judgment, according to the Book of the Dead. Getting through them was arduous and fraught with failure. The successful then had to submit to having their hearts weighed against the Feather of Truth. Those found wanting were thrown into the fire of oblivion.

Zoroastrianism, the official religion of the ancient Persians, was possibly the first to divide the afterlife into two physically separate places, one for good souls and the other for bad. This vision contrasted with the Greek view of Hades as the catchall for the human soul and the early Hebrew Bible’s description of Sheol as a shadowy pit of nothingness. In the 4th century B.C., Alexander the Great’s Macedonian empire swallowed both Persia and Judea, and the three visions of the afterlife commingled. The word “hell” would later appear frequently in English versions of the New Testament. But the word, scholars point out, was a single translation for several distinct Hebrew terms.

Early Christianity offered more than one vision of hell, but all contained the essential elements of Satan, sinners and fire. The “Apocalypse of Peter,” a 2nd century text, helped start the trend of listing every sadistic torture that awaited the wicked.

Dante was thus following a well-trod path with his imaginatively crafted punishments of boiling pitch for the dishonest and downpours of icy rain on the gluttonous. But he deviated from tradition by describing hell’s occupants with psychological depth and insight. Dante’s narrator rediscovers the meaning of Christian truth and love through his encounters. In this way the “Inferno” speaks to the complexities of the human condition rather than serving merely as a literary zoo of the damned.

The “Divine Comedy” changed the medieval world’s conception of hell, and with it, man’s understanding of himself. Boccaccio, Chaucer, Milton, Balzac—the list of writers directly inspired by Dante’s vision goes on. “Dante and Shakespeare divide the world between them,” wrote T.S. Eliot. “There is no third.”

Historically Speaking: For Punishment or Penitence?

Fifty years ago, the Attica uprising laid bare the conflicting ideas at the heart of the U.S. prison system.

The Wall Street Journal

September 17, 2021

Fifty years ago this past week, inmates in Attica, New York, staged America’s deadliest prison uprising. The organizers held prison employees hostage while demanding better conditions. One officer and three inmates were killed during the rioting, and the revolt’s suppression left another 39 dead and at least 89 seriously wounded. The episode raised serious questions about prison conditions and ultimately led to some reforms.

Nearly two centuries earlier, the founders of the U.S. penal system had intended it as a humane alternative to those that relied on such physical punishments as mutilation and whipping. After the War of Independence, Benjamin Franklin and leading members of Philadelphia’s Quaker community argued that prison should be a place of correction and penitence. Their vision was behind the construction of the country’s first “penitentiary house” at the Walnut Street Jail in Philadelphia in 1790. The old facility threw all prisoners together; its new addition contained individual cells meant to prevent moral contagion and to encourage prisoners to spend time reflecting on their crimes.

Inmates protest prison conditions in Attica, New York, Sept. 10, 1971

Walnut Street inspired the construction of the first purpose-built prison, Eastern State Penitentiary, which opened outside of Philadelphia in 1829. Prisoners were kept in solitary confinement and slept, worked and ate in their cells—a model that became known as the Pennsylvania system. Neighboring New York adopted the Auburn system, which also enforced total silence but required prisoners to work in communal workshops and instilled discipline through surveillance, humiliation and corporal punishment. Although both systems were designed to prevent recidivism, the former stressed prisoner reform while the latter carried more than a hint of retribution.

Europeans were fascinated to see which system worked best. In 1831, the French government sent Alexis de Tocqueville and Gustave de Beaumont to investigate. Having inspected facilities in several states, they concluded that although the “penitentiary system in America is severe,” its combination of isolation and work offered hope of rehabilitation. But the novelist Charles Dickens reached the opposite conclusion. After touring Eastern State Penitentiary in 1842, he wrote that the intentions behind solitary confinement were “kind, humane and meant for reformation.” In practice, however, total isolation was “worse than any torture of the body”: It broke rather than reformed people.

Severe overcrowding—there was no parole in the 19th century—eventually undermined both systems. Prisoner violence became endemic, and regimes of control grew harsher. Sing Sing prison in New York meted out 36,000 lashes in 1843 alone. In 1870, the National Congress on Penitentiary and Reformatory Discipline proposed reforms, including education and work-release initiatives. Despite such efforts, recidivism rates remained high, physical punishment remained the norm and almost 200 serious prison riots were recorded between 1855 and 1955.

That year, Harry Manuel Shulman, a deputy commissioner in New York City’s Department of Correction, wrote an essay arguing that the country’s early failure to decide on the purpose of prison had immobilized the system, leaving it “with one foot in the road of rehabilitation and the other in the road of punishment.” Which would it choose? Sixteen years later, Attica demonstrated the consequences of ignoring the question.

The Sunday Times: Texas Talibanistas, take note: freedom will win

The blow to abortion rights is shocking, but this fight is nowhere near over

The Sunday Times

September 7, 2021

The pro-life movement in America finally got its wish this week: a little before midnight on Wednesday, in a 5-4 decision, the Supreme Court ruled against temporarily blocking a Texas state law passed in May, known as Senate Bill 8 (SB8), banning almost all abortions once a heartbeat can be detected by ultrasound — which is around six weeks after conception. The bill will still eventually return to the Supreme Court for a final decision, but by being allowed to stand unchanged it becomes the strictest anti-abortion law in the nation. There are no exceptions for child pregnancy, rape or incest.

But this isn’t the reason for the national uproar. SB8 goes further than any other anti-abortion bill yet crafted because of the way it allows the ban to be enforced. Under the new Texas law, a $10,000 bounty will be awarded to any US citizen who successfully sues a person or entity that helps a woman to obtain an abortion. “Help” includes providing money, transport, medicines or medical aid.

To speed up the process, Texas Right to Life, an anti-abortion organisation, has already set up an anonymous tip line for “whistleblowers”. That’s right, the second-largest state in the union by size and population is turning family against family, neighbour against neighbour, to create its own spy network of uterus police. Welcome to Gilead-on-the-Rio Grande. Cue outrage from all Americans who support legal abortion — and, according to recent polls, they amount to 58 per cent of the country.

There is no doubt that SB8 is a huge victory for the pro-life campaign. Texas joins 24 countries worldwide that have a total or near-total ban on abortion. Outside the big cities, large swathes of America are already abortion-free zones: only 11 per cent of counties have a hospital or clinic that provides such services.

In the short term the outlook for that most basic of human rights, a woman’s control over her body, is dire in America. The combination of a Republican-packed Supreme Court, thanks to Donald Trump’s last-minute appointment of Amy Coney Barrett following the death in September last year of Ruth Bader Ginsburg, and SB8’s sneaky bypassing of federal authority has closed down the obvious routes for legal redress. Moreover, the Senate is tied 50-50, making it impossible for Congress to pass a law mandating a woman’s unrestricted access to abortion. The Texas Talibanistas have gained the upper hand. Similar laws to SB8 will no doubt be passed in other Republican states.

The Texas appeal to vigilantism should also offend everyone who believes in democracy and the rule of law. But — and this may be hard to accept in the heat of the moment — SB8 is a gift to the pro-choice movement.

Pro-life Texans thought they were being clever by avoiding both the Supreme Court and Congress to slip through an abortion ban. But, as the saying goes, be careful what you wish for. “Lawfare” is a two-way street. Critics of SB8 point out that there is nothing to stop California passing a similar bill that enables citizens to bring civil lawsuits against people who utter “hate speech”, or to stop New York deputising bounty-hunters to sue gun-owners. Nor does the legal chaos stop there. SB8 could open the way for railways, car companies and airlines to become liable for providing travel assistance to an abortion-seeking woman, or supermarkets for selling the disinfectant Lysol and other substances that induce abortion. Forget about boycotts for a moment; the threat of a lawsuit is a powerful deterrent to corporations seeking to do business in Texas.

History is not the best predictor of the future. Nevertheless, the disastrous dalliance with prohibition, which lasted for 13 years between 1920 and 1933, offers a salient lesson in what happens when a long-held individual right is taken away from Americans. The non-metropolitan parts of the country forced their will on the urban parts. But drinking didn’t stop; it just went underground. Some states wouldn’t enforce the ban, and other states couldn’t. In Detroit, Michigan, the alcohol trade was the largest single contributor to the economy after the car industry. Prohibition fostered the American mafia; led to a rise in alcoholism, alcohol-related deaths, mass lawlessness and civil disobedience; and brought about extraordinary levels of corruption.

There is every reason to believe that abortions will continue in America no matter what anti-abortion zealots manage to pull off. It just won’t be pretty. A recent study published in The Lancet Global Health revealed that the countries with the greatest restrictions not only have the highest termination rates in the world but are also among the least economically successful. This is the club that awaits pro-life America.

The strangulation of women’s rights has been so slow that supporters of Roe v Wade, the 1973 ruling that made abortion legal, were lulled into a false sense of security. They assumed the minority of Americans fighting for a repeal would never overwhelm the will of the majority. SB8 has changed all that. Its underpinnings threaten so many constitutional rights that abortion is going to be front and centre in every state and federal election.

Democracy does work, even if, as with prohibition, it takes time to roll back injustices. Last year the Virginia state legislature voted to remove more than a decade’s worth of abortion restrictions. This is the body that in 2012 stood accused of “state-sanctioned rape” for passing a bill that required any woman seeking an abortion to submit to an ultrasound first, not by the usual external method but with a transvaginal wand.

Despite what anti-abortion fanatics believe, the US is a pro-choice country. The fight for women’s rights will go on, and on, until the people win.

Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.

ILLUSTRATION: THOMAS FUCHS

The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, foot-bound messengers ran impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to King Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. This, unfortunately, disqualified his time of 2:54:46.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: A Legacy of Tinderbox Forests

Long before climate change exacerbated the problem, policies meant to suppress wildfires served to fan the flames

The Wall Street Journal

August 19, 2021

This year’s heat waves and droughts have led to record-breaking wildfires across three continents. The fires in Siberia are so vast that smoke has reached the North Pole for what is believed to be the first time. In the United States, California’s Dixie Fire has become the largest single fire in the state’s history.

Humans have long wrestled with forest fire, seeking by turns to harness and to suppress it. Early European efforts to control forest fires were tentative and patchy. In the 14th century, the Sardinians experimented with fire breaks, but the practice was slow to catch on. In North America, by contrast, scientists have found 2,000-year-old evidence of controlled burnings by Native American tribes. But this practice died out with the arrival of European immigrants, because of local bans as well as the expulsion of tribes from native lands. As a consequence, forests not only became larger and denser but also filled with mounds of dead and dried vegetation, making them very susceptible to fire.

Disaster struck in the fall of 1871. Dozens of wildfires broke out simultaneously across the Upper Midwest and Great Lakes region, and on Oct. 8, the same night as the Chicago Fire, a firestorm engulfed the town of Peshtigo, Wisconsin, killing an estimated 1,200 people (and possibly many more). The fire was the deadliest in U.S. history.

ILLUSTRATION: ANTHONY FREDA

Early conservationists, such as Franklin Hough, sought to organize a national wildfire policy. The U.S. Forest Service was created in 1905 and was still figuring out its mission in 1910, when the northern Rockies went up in flames. Fires raced through Washington, Montana and Idaho, culminating in what is known as the “Big Blowup” of August 20 and 21.

One of the U.S. Forest Service’s newly minted rangers, Edward C. Pulaski, was leading a team of 45 firefighters near Wallace, Idaho. A firestorm surrounded the men, forcing Pulaski to lead them down a disused mine shaft. Several attempted to go back outside, believing they would be cooked alive in the smoke-filled mine. Pulaski managed to stop the suicidal exodus by threatening to shoot any man who tried to leave. To maintain morale, he organized a water bucket chain to prevent the blankets covering the exit from catching fire. Pulaski’s actions saved the lives of all but five of the firefighters, but his eyesight and lungs never recovered.

The Big Blowup destroyed more than 3 million acres in two days and killed at least 80 people. In response to the devastation, the Forest Service, with the public’s support, adopted the mistaken goal of total fire suppression rather than fire management. The policy remained in place until 1978, bequeathing to the country a legacy of tinderbox forests.

Nowadays, the Forest Service conducts controlled burns, known as prescribed fires, to mitigate the risk of wildfire. The technique is credited with helping to contain July’s 413,000-acre Bootleg Fire in Oregon.

Fire caused by the effects of climate change will require human intervention of another order. In the Bible’s Book of Isaiah, the unhappy “sinners of Zion” cry out: “Who of us can live where there is a consuming fire? Who among us can dwell with everlasting burnings?” Who, indeed.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the War. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th century Russian painting
PHOTO: ALAMY

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman siege by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: Diabetes and the Miracle of Insulin

One hundred years ago, a team of Canadian researchers showed that an age-old disease didn’t have to mean a death sentence.

The Wall Street Journal

July 22, 2021

The human body runs on glucose, a type of sugar that travels through the bloodstream to the cells where it converts into energy. Some 34.2 million Americans are diabetic; their bodies are unable to produce the hormone insulin, which regulates how glucose is processed and stored in the cells. Without treatment the condition is terminal. The discovery of insulin a century ago this year was one of the great medical breakthroughs of the 20th century.

Diabetes was first recognized some 4,000 years ago. The Ebers Papyrus, an Egyptian medical text written around 1550 B.C., refers to patients suffering from thirst, frequent urination and weight loss. An ancient Indian text, the Sushruta Samhita, composed after the 7th century B.C., advised testing for diabetes by seeing whether ants were attracted to the sugar in the urine.

The excessive urination experienced by sufferers is probably why the ancient Greeks called the disease “diabetes,” their word for “siphon” or “to pass through.” They also made the link to lifestyle: The 5th century B.C. physician Hippocrates, the “father of medicine,” advocated exercise as part of the treatment.

Early on, the Chinese recognized that an unbalanced diet of sweet, rich and fatty foods could play a role. Lady Dai, a minor aristocrat who died in the 2nd century B.C., was a textbook case. Her perfectly preserved mummy, discovered in southern China in 1971, revealed a life of dietary excess. She also suffered the consequences: osteoarthritis, high cholesterol, hypertension, liver disease, gall stones and, crucially, diabetes.

Over time, physicians became more expert at diagnosis. The role of sugar came into sharper focus in the 1770s after the English doctor Matthew Dobson discovered that it stayed in the blood as well as urine. But no further progress was made until the end of the 19th century. In 1889 Oskar Minkowski conducted experiments on dogs at the University of Strasbourg to prove that a nonfunctioning pancreas triggered diabetes.

ILLUSTRATION: THOMAS FUCHS

By the early 20th century, scientists knew that a pancreatic secretion was responsible for controlling glucose in the body, but they couldn’t isolate it. The Canadian researchers credited with finding the answer—John Macleod, Frederick Banting, Charles Best and James Collip—were an unlikely team. Banting was a surgeon, not a scientist. Yet he sufficiently impressed Macleod, a professor of physiology at the University of Toronto, that the latter lent him lab space and a research assistant, Best. The pairing almost ended in a fist fight, but Banting and Best got over their differences and in July 1921 successfully injected insulin into a dog.

Macleod then brought in Collip, a biochemist on a research sabbatical, to help make a human-compatible version. This led to more infighting and Collip threatened to leave. The dysfunctional group somehow held together long enough to try their insulin on a 14-year-old named Leonard Thompson. He lived another 13 years.

The research team sold their patent rights to the University of Toronto for $1, believing insulin was too vital to be exploited. Their idealism was betrayed: Today, manufacture of the drug is controlled by three companies, and according to a 2018 Yale study published in JAMA, its high cost is forcing 1 in 4 patients to skimp on their medication. The next frontier for insulin is finding a way to make it affordable for all.

Historically Speaking: The Beacon of the Public Library

Building places for ordinary people to read and share books has been a passion project of knowledge-seekers since before Roman times.

The Wall Street Journal

July 8, 2021

“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.

His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.

Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.

ILLUSTRATION: THOMAS FUCHS

Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact, the library declined through institutional neglect over many years. Caesar was himself responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.

Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.

More than a thousand years after Soter and Demetrius built the Library of Alexandria, Renaissance Florence benefited from a similar partnership between Cosimo de’ Medici and the scholar Niccolò Niccoli. At Niccolò’s death in 1437, Cosimo carried out his friend’s wishes to bequeath his books to the people. Not only was the magnificent Biblioteca San Marco Italy’s first purpose-built public library, but its emphasis on reading spaces and natural light became the template for library architecture.

By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.

The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.

But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, with 1,679 spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.

The Sunday Times: Rumsfeld was the wrong man at the wrong time

Bush’s war supremo brought about his own worst fear: another Vietnam

July 4, 2021

On the whole, sacked US defence secretaries should avoid quoting Winston Churchill as they depart from the White House, in much the same way as disgraced preachers should leave off quoting Jesus as they are led away in handcuffs. Donald Rumsfeld, who died aged 88 last week, was the chief architect of the American invasions of Afghanistan and Iraq. The aftermath was considered a failure, however, and President George W Bush reversed course after Rumsfeld’s departure in December 2006, sending in extra troops to stop Iraq’s descent into civil war. Rumsfeld handled being fired with his customary mix of puckish humour and pugnacious bravado. In his final speech he paraphrased Churchill, saying: “I have benefited greatly from criticism and at no time have I suffered a lack thereof.”

The last bit was true then, and it continued to be until his death. Critics of Rumsfeld’s refusal to commit sufficient fighting troops in either Afghanistan or Iraq could fill a football pitch. (The first of many damning appraisals appeared in 2007, entitled Rumsfeld: An American Disaster.) But the claim that he benefited from criticism was so laughable to anyone who knew him that it only highlighted the catastrophic deficiencies of the man. Rumsfeld was incapable of listening to anyone who didn’t already agree with him. He was the wrong man for the job at the most inopportune time in America’s history.

As several obituaries of Rumsfeld pointed out, his first stint as defence secretary, under Gerald Ford in 1975-77, happened under arguably worse circumstances. A survivor from Richard Nixon’s administration, where he stood out for his unwavering commitment to civil rights, Rumsfeld was the White House chief of staff during the last days of Saigon in April 1975. Appointed defence secretary shortly afterwards, Rumsfeld regarded it as his mission to keep the military ready and competitive but essentially inactive. This wasn’t cowardice but good Cold War strategy.

Rumsfeld’s reputation as a strategic thinker was subsequently borne out by his wildly successful transition to the business world. He was also a clear, no-nonsense communicator, whose fondness for aphorisms and golden rules became the stuff of legend. When Rumsfeld left the White House for the first time, he bequeathed a memorandum of best practices, Rumsfeld’s Rules, 11 of which were specific to the secretary of defence. They began with the reminder: “The secretary of defence is not a super general or admiral. His task is to exercise civilian control over the department for the commander-in-chief [the president] and the country”, and included such important pieces of advice as: “Establish good relations between the departments of Defence, State, the National Security Council, CIA and the Office of Management and Budget.”

When Rumsfeld returned to the White House in 2001, aged 68, he broke almost every one of his own rules. Not only did he allow relations between the departments to become utterly dysfunctional but he so alienated the generals and joint chiefs of staff that it was widely assumed he “hated” the military. The serving chairman of the joint chiefs, General Hugh Shelton, complained that Rumsfeld operated “the worst style of leadership I witnessed in 38 years of service”. Rumsfeld treated all soldiers as boneheaded grunts, and the generals simply as boneheaded grunts with the medals to prove it.

His planned military transformations suffered from an overly technocratic mentality. He believed that the army was costly and lacked efficiency — what army doesn’t? — as though bottom lines apply equally in business and fighting. Rumsfeld wanted to remake the military as one that relied more on air power and technical advantages and less on soldiers with guns. The charge against him is that he eviscerated the military’s ground capabilities and reduced its fighting numbers at precisely the moment both were paramount to American success in Afghanistan and Iraq.

What was going through Rumsfeld’s mind? Millions of words have been spilt in the effort to explain why the defence secretary doggedly pursued a losing policy in the teeth of protests from the military. In his last year in the job six retired generals publicly rebuked him. Part of the answer lies in his character: he was a micromanager who failed to engage, a bureaucrat who despised professionals, an aggressor who disliked to fight. But the real driver of his actions can be traced back to 1975. More than anything else, even more than winning perhaps, he wanted to avoid a repeat of the Vietnam War, with its “limited war” rhetoric that was belied by the fact it was the first of what the US media have dubbed America’s “forever wars” — metastasising conflicts without a clear end in sight.

Rumsfeld emerged from his first period in the White House blaming the military for having misled successive administrations into committing themselves to an unwinnable and unpopular war. Hence, he believed that nothing the military said could be taken at face value. He was not going to allow himself to be taken prisoner by the top brass. Unlike Robert McNamara, the defence secretary most associated with US involvement in Vietnam, Rumsfeld was determined to stick to quick, in-and-out military objectives. There would be no quagmires, mass body bags, forever wars or attempts at nation-building on his watch.

Yet he was a prisoner all the same. Even though the causes and conditions were different, the Vietnam War remained the lens through which Americans judged the Iraq war. A few months after the coalition’s initial victory in Iraq in 2003, Senator John McCain, who spent five years as a PoW in Vietnam, warned the Bush administration that stopping the army doing its job, as interpreted by McCain, would risk “the most serious American defeat on the global stage since Vietnam”. By 2004, Democrats were saying: “Iraq is George Bush’s Vietnam”. Rumsfeld ended up being directly compared to McNamara, even though, by winding down, rather than ratcheting up, the US presence in the Middle East, he was doing the opposite of his predecessor. A 2005 column in the Washington Post intoned: “Just as Vietnam became McNamara’s war, Iraq has become Rumsfeld’s war”.

The successful troop “surge” in 2008 appeared to erase Rumsfeld’s Vietnam legacy. Only it didn’t. Barack Obama’s foreign policy — summed up as “leading from behind” — was Rumsfeldian in its horror of American military entanglement in foreign affairs. The Trump administration’s was even more so. More than six trillion war dollars later, with countless lives lost, if Rumsfeld leaves anything behind, it’s the lesson that America’s forever war syndrome is a state of mind, not just a reality.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” is an amalgamation of the Latin officium, which meant a position or duty, and ob ficium, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.