September 17, 2021
Nearly two centuries earlier, the founders of the U.S. penal system had intended it as a humane alternative to those that relied on such physical punishments as mutilation and whipping. After the War of Independence, Benjamin Franklin and leading members of Philadelphia’s Quaker community argued that prison should be a place of correction and penitence. Their vision was behind the construction of the country’s first “penitentiary house” at the Walnut Street Jail in Philadelphia in 1790. The old facility threw all prisoners together; its new addition contained individual cells meant to prevent moral contagion and to encourage prisoners to spend time reflecting on their crimes.
Walnut Street inspired the construction of the first purpose-built prison, Eastern State Penitentiary, which opened outside of Philadelphia in 1829. Prisoners were kept in solitary confinement and slept, worked and ate in their cells—a model that became known as the Pennsylvania system. Neighboring New York adopted the Auburn system, which also enforced total silence but required prisoners to work in communal workshops and instilled discipline through surveillance, humiliation and corporal punishment. Although both systems were designed to prevent recidivism, the former stressed prisoner reform while the latter carried more than a hint of retribution.
Europeans were fascinated to see which system worked best. In 1831, the French government sent Alexis de Tocqueville and Gustave de Beaumont to investigate. Having inspected facilities in several states, they concluded that although the “penitentiary system in America is severe,” its combination of isolation and work offered hope of rehabilitation. But the novelist Charles Dickens reached the opposite conclusion. After touring Eastern State Penitentiary in 1842, he wrote that the intentions behind solitary confinement were “kind, humane and meant for reformation.” In practice, however, total isolation was “worse than any torture of the body”: It broke rather than reformed people.
Severe overcrowding—there was no parole in the 19th century—eventually undermined both systems. Prisoner violence became endemic, and regimes of control grew harsher. Sing Sing prison in New York meted out 36,000 lashes in 1843 alone. In 1870, the National Congress on Penitentiary and Reformatory Discipline proposed reforms, including education and work-release initiatives. Despite such efforts, recidivism rates remained high, physical punishment remained the norm and almost 200 serious prison riots were recorded between 1855 and 1955.
That year, Harry Manuel Shulman, a deputy commissioner in New York City’s Department of Correction, wrote an essay arguing that the country’s early failure to decide on the purpose of prison had immobilized the system, leaving it “with one foot in the road of rehabilitation and the other in the road of punishment.” Which would it choose? Sixteen years later, Attica demonstrated the consequences of ignoring the question.
“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.
His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.
Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.
Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact, the library eroded through institutional neglect over many years. Caesar himself was responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.
Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.
By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.
The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.
But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, 1,679 of which were spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.
June 12, 2021
The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.
The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.
The origins of whistleblowing, however, are far less idealistic. The idea is descended from Roman “qui tam” laws, named for a Latin phrase meaning “he who sues in this matter for the king as well as for himself.” The qui tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.
English law retained a crude version of qui tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to qui tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.
Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.
These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached the journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.
During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.
Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.
April 1, 2021
Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.
Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so would the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.
In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.
The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.
Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.
Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.
American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.
Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day workweek became law in the U.S. under the Fair Labor Standards Act. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.