Historically Speaking: When Generals Run the State

Military leaders have been rulers since ancient times, but the U.S. has managed to keep them from becoming kings or dictators.

The Wall Street Journal

April 29, 2022

History has been kind to General Ulysses S. Grant, less so to President Grant. The hero of Appomattox, born 200 years ago this month, oversaw an administration beset by scandal. In his farewell address to Congress in 1876, Grant insisted lamely that his “failures have been errors of judgment, not of intent.”

Yet Grant’s presidency could as well be remembered for confirming the strength of American democracy at a perilous time. Emerging from the trauma of the Civil War, Americans sent a former general to the White House without fear of precipitating a military dictatorship. As with the separation of church and state, civilian control of the military is one of democracy’s hard-won successes.

In ancient times, the earliest kings were generals by definition. The Sumerian word for leader was “Lugal,” meaning “Big Man.” Initially, a Lugal was a temporary leader of a city-state during wartime. But by the 24th century B.C., Lugal had become synonymous with governor. The title wasn’t enough for Sargon the Great, c. 2334—2279 B.C., who called himself “Sharrukin,” or “True King,” in celebration of his subjugation of all Sumer’s city-states. Sargon’s empire lasted for three more generations.

In subsequent ancient societies, military and political power intertwined. The Athenians elected their generals, who could also be political leaders, as was the case for Pericles. Sparta was the opposite: The top Spartan generals inherited their positions. The Greek philosopher Aristotle described the Spartan monarchy—shared by two kings from two royal families—as a “kind of unlimited and perpetual generalship,” subject to some civic oversight by a 30-member council of elders.

ILLUSTRATION: THOMAS FUCHS

By contrast, ancient Rome was first a traditional monarchy whose kings were expected to fight with their armies, then a republic that prohibited actively serving generals from bringing their armies back from newly conquered territories into Italy, and finally a militarized autocracy led by a succession of generals-cum-emperors.

In later periods, boundaries between civil and military leadership blurred in much of the world. At the most extreme end, Japan’s warlords seized power in 1192, establishing the Shogunate, essentially a military dictatorship, and reducing the emperor to a mere figurehead until the Meiji Restoration in 1868. Napoleon trod a well-worn route in his trajectory from general to first consul, to first consul for life and finally emperor.

After defeating the British, General George Washington might have gone on to govern the new American republic in the manner of Rome’s Julius Caesar or England’s Oliver Cromwell. Instead, Washington chose to govern as a civilian and step down at the end of two terms, ensuring the transition to a new administration without military intervention. Astonished that a man would cling to his ideals rather than to power, King George III declared that if Washington stayed true to his word, “he will be the greatest man in the world.”

The trust Americans have in their army is reflected in the tally of 12 former generals who have been U.S. presidents, from George Washington to Dwight D. Eisenhower. President Grant may not have fulfilled the hopes of the people, but he kept the promise of the republic.

Historically Speaking: How the Waistband Got Its Stretch

Once upon a time, human girth was bound by hooks and buttons, and corsets had metal stays. Along came rubber and a whole new technology of flexible cloth.

The Wall Street Journal

January 7, 2022

The New Year has arrived, and if you’re like me, you’ve promised yourself a slimmer, fitter and healthier you in 2022. But in the meantime there is the old you to deal with—the you who overindulged at Thanksgiving and didn’t stop for the next 37 days. No miracle diet or resolution can instantaneously eradicate five weeks of wild excess. Fortunately, modern science has provided the next best thing to a miracle: the elasticated waistband.

Before the invention of elastic, adjustable clothing was dependent on technology that had hardly changed since ancient times. The Indus Valley Civilization made buttons from seashells as early as 2000 B.C.

The first inkling that there might be an alternative to buttons, belts, hooks and other adjustable paraphernalia came in the late 18th century, with the discovery that rubber wasn’t only good for toys. It also had immensely practical applications for things such as pencil erasers and lid sealants. Rubber’s stretchable nature offered further possibilities in the clothing department. But there was no word for its special property until the poet William Cowper borrowed the 17th-century term “elastic,” used to describe the expansion and contraction of gases, for his translation of the Iliad in 1791: “At once he bent Against Tydides his elastic bow.”

PHOTO: GETTY IMAGES

By 1820, an enterprising English engineer named Thomas Hancock was making elastic straps and suspenders out of rubber. He also invented the “masticator,” a machine that rolled shredded rubber into sheets for industrial use. Elastic seemed poised to make a breakthrough: In the 1840s, Queen Victoria’s shoemaker, Joseph Sparkes Hall, popularized his invention of the elastic-gusset ankle boot, still known today as the Chelsea Boot.

But rubber had drawbacks. Not only was it a rare and expensive luxury that tended to wear out quickly, it was also sticky, sweaty and smelly. Elasticized textiles became popular only after World War I, helped by the demand for steel—and female workers—that led women to forgo corsets with metal stays. Improved production techniques at last made elasticated girdles a viable alternative: In 1924, the Madame X rubber girdle promised to help women achieve a thinner form in “perfect comfort while you sit, work or play.”

The promise of comfort became real with the invention of Lastex, essentially rubber yarn, in 1930. Four years later, the London tailor Alexander Simpson removed the need for belts or suspenders by introducing the adjustable rubber waistband in men’s trousers.

The constant threat of rubber shortages sparked a global race to devise synthetic alternatives. The winner was the DuPont Company, which invented neoprene in 1930. That research led to an even more exciting invention: the nylon stocking. Sales were halted during World War II, creating such pent-up demand that in 1946 there were “nylon riots” throughout the U.S., including in Pittsburgh, where 40,000 people tried to buy 13,000 pairs of stockings.

DuPont scored another win in 1958 with spandex, also known under the brand name Lycra, which is not only more durable than nylon but also stretchier. Spandex made dreams possible by making fabrics more flexible and forgiving: It helped the astronaut Neil Armstrong to walk on the moon and Simone Biles to become the most decorated female gymnast in history. And it will help me to breathe a little easier until I can fit into my jeans again.

Historically Speaking: For Punishment or Penitence?

Fifty years ago, the Attica uprising laid bare the conflicting ideas at the heart of the U.S. prison system.

The Wall Street Journal

September 17, 2021

Fifty years ago this past week, inmates in Attica, New York, staged America’s deadliest prison uprising. The organizers held prison employees hostage while demanding better conditions. One officer and three inmates were killed during the rioting, and the revolt’s suppression left another 39 dead and at least 89 seriously wounded. The episode raised serious questions about prison conditions and ultimately led to some reforms.

Nearly two centuries earlier, the founders of the U.S. penal system had intended it as a humane alternative to those that relied on such physical punishments as mutilation and whipping. After the War of Independence, Benjamin Franklin and leading members of Philadelphia’s Quaker community argued that prison should be a place of correction and penitence. Their vision was behind the construction of the country’s first “penitentiary house” at the Walnut Street Jail in Philadelphia in 1790. The old facility threw all prisoners together; its new addition contained individual cells meant to prevent moral contagion and to encourage prisoners to spend time reflecting on their crimes.

Inmates protest prison conditions in Attica, New York, Sept. 10, 1971

Walnut Street inspired the construction of the first purpose-built prison, Eastern State Penitentiary, which opened outside of Philadelphia in 1829. Prisoners were kept in solitary confinement and slept, worked and ate in their cells—a model that became known as the Pennsylvania system. Neighboring New York adopted the Auburn system, which also enforced total silence but required prisoners to work in communal workshops and instilled discipline through surveillance, humiliation and corporal punishment. Although both systems were designed to prevent recidivism, the former stressed prisoner reform while the latter carried more than a hint of retribution.

Europeans were fascinated to see which system worked best. In 1831, the French government sent Alexis de Tocqueville and Gustave de Beaumont to investigate. Having inspected facilities in several states, they concluded that although the “penitentiary system in America is severe,” its combination of isolation and work offered hope of rehabilitation. But the novelist Charles Dickens reached the opposite conclusion. After touring Eastern State Penitentiary in 1842, he wrote that the intentions behind solitary confinement were “kind, humane and meant for reformation.” In practice, however, total isolation was “worse than any torture of the body”: It broke rather than reformed people.

Severe overcrowding—there was no parole in the 19th century—eventually undermined both systems. Prisoner violence became endemic, and regimes of control grew harsher. Sing Sing prison in New York meted out 36,000 lashes in 1843 alone. In 1870, the National Congress on Penitentiary and Reformatory Discipline proposed reforms, including education and work-release initiatives. Despite such efforts, recidivism rates remained high, physical punishment remained the norm and almost 200 serious prison riots were recorded between 1855 and 1955.

That year, Harry Manuel Shulman, a deputy commissioner in New York City’s Department of Correction, wrote an essay arguing that the country’s early failure to decide on the purpose of prison had immobilized the system, leaving it “with one foot in the road of rehabilitation and the other in the road of punishment.” Which would it choose? Sixteen years later, Attica demonstrated the consequences of ignoring the question.

Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.

ILLUSTRATION: THOMAS FUCHS

The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, foot-bound messengers ran impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to King Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. The assistance, unfortunately, got him disqualified, and his time of 2:54:46 did not stand.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the War. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th century Russian painting
PHOTO: ALAMY

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In 198 A.D., the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers. Centuries earlier, in 184 B.C., the Carthaginian general Hannibal had won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized, the need for animals increased. Artillery needed transporting; supplies, camps and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: The Beacon of the Public Library

Building places for ordinary people to read and share books has been a passion project of knowledge-seekers since before Roman times.

The Wall Street Journal

July 8, 2021

“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.

His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.

Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.

ILLUSTRATION: THOMAS FUCHS

Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact, the library declined through institutional neglect over many years. Caesar himself was responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.

Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.

A thousand years after Soter and Demetrius built the Library of Alexandria, Renaissance Florence benefited from a similar partnership between Cosimo de’ Medici and the scholar Niccolò de’ Niccoli. At Niccolò’s death in 1437, Cosimo carried out his friend’s wishes to bequeath his books to the people. Not only was the magnificent Biblioteca San Marco Italy’s first purpose-built public library, but its emphasis on reading spaces and natural light became the template for library architecture.

By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.

The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.

But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, with 1,679 spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” is an amalgamation of the Latin officium, which meant a position or duty, and ob ficium, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.

Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to WikiLeaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, from a Latin phrase meaning “he who sues for the king as well as for himself.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with WikiLeaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that’s now the subject of a debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If the waiver goes through, it will be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. Their anomalous position came to a head in 1857 after a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, and while the owner was a citizen, he wasn’t the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker’s attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.

Historically Speaking: The Long Fight to Take the Weekend Off

Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.

The Wall Street Journal

April 1, 2021

Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so was the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.

In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.

A day of leisure in Newport Beach, Calif., 1928. PHOTO: DICK WHITTINGTON STUDIO/CORBIS/GETTY IMAGES

The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.

Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.

Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.

American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.

Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day workweek became law in the U.S. under the Fair Labor Standards Act. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.