The Sunday Times: Texas Talibanistas, take note: freedom will win

The blow to abortion rights is shocking, but this fight is nowhere near over

The Sunday Times

September 7, 2021

The pro-life movement in America finally got its wish this week: a little before midnight on Wednesday, in a 5-4 decision, the Supreme Court ruled against temporarily blocking a Texas state law passed in May, known as Senate Bill 8 (SB8), banning almost all abortions once a heartbeat can be detected by ultrasound — which is around six weeks into a pregnancy. The bill will still eventually return to the Supreme Court for a final decision, but by being allowed to stand unchanged it becomes the strictest anti-abortion law in the nation. There are no exceptions for child pregnancy, rape or incest.

But this isn’t the reason for the national uproar. SB8 goes further than any other anti-abortion bill yet crafted because of the way it allows the ban to be enforced. Under the new Texas law, a $10,000 bounty will be awarded to any US citizen who successfully sues a person or entity that helps a woman to obtain an abortion. “Help” includes providing money, transport, medicines or medical aid.

To speed up the process, Texas Right to Life, an anti-abortion organisation, has already set up an anonymous tip line for “whistleblowers”. That’s right, the second-largest state in the union by size and population is turning family against family, neighbour against neighbour, to create its own spy network of uterus police. Welcome to Gilead-on-the-Rio Grande. Cue outrage from all Americans who support legal abortion — and, according to recent polls, they amount to 58 per cent of the country.

There is no doubt that SB8 is a huge victory for the pro-life campaign. Texas joins 24 countries worldwide that have a total or near-total ban on abortion. Outside the big cities, large swathes of America are already abortion-free zones: only 11 per cent of counties have a hospital or clinic that provides such services.

In the short term the outlook for that most basic of human rights, a woman’s control over her body, is dire in America. The combination of a Republican-packed Supreme Court, thanks to Donald Trump’s last-minute appointment of Amy Coney Barrett following the death in September last year of Ruth Bader Ginsburg, and SB8’s sneaky bypassing of federal authority has closed down the obvious routes for legal redress. Moreover, the Senate is tied 50-50, making it impossible for Congress to pass a law mandating a woman’s unrestricted access to abortion. The Texas Talibanistas have gained the upper hand. Similar laws to SB8 will no doubt be passed in other Republican states.

The Texas appeal to vigilantism should also offend everyone who believes in democracy and the rule of law. But — and this may be hard to accept in the heat of the moment — SB8 is a gift to the pro-choice movement.

Pro-life Texans thought they were being clever by avoiding both the Supreme Court and Congress to slip through an abortion ban. But, as the saying goes, be careful what you wish for. “Lawfare” is a two-way street. Critics of SB8 point out that there is nothing to stop California passing a similar bill that enables citizens to bring civil lawsuits against people who utter “hate speech”, or to stop New York deputising bounty-hunters to sue gun-owners. Nor does the legal chaos stop there. SB8 could open the way for railways, car companies and airlines to become liable for providing travel assistance to an abortion-seeking woman, or supermarkets for selling the disinfectant Lysol and other substances that induce abortion. Forget about boycotts for a moment; the threat of a lawsuit is a powerful deterrent to corporations seeking to do business in Texas.

History is not the best predictor of the future. Nevertheless, the disastrous dalliance with prohibition, which lasted for 13 years between 1920 and 1933, offers a salient lesson in what happens when a long-held individual right is taken away from Americans. The non-metropolitan parts of the country forced their will on the urban parts. But drinking didn’t stop; it just went underground. Some states wouldn’t enforce the ban, and other states couldn’t. In Detroit, Michigan, the alcohol trade was the largest single contributor to the economy after the car industry. Prohibition fostered the American mafia, led to a rise in alcoholism, alcohol-related deaths, mass lawlessness and civil disobedience and brought about extraordinary levels of corruption.

There is every reason to believe that abortions will continue in America no matter what anti-abortion zealots manage to pull off. It just won’t be pretty. A recent study published in The Lancet Global Health revealed that the countries with the greatest restrictions not only have the highest termination rates in the world but are also among the least economically successful. This is the club that awaits pro-life America.

The strangulation of women’s rights has been so slow that supporters of Roe v Wade, the 1973 ruling that made abortion legal, were lulled into a false sense of security. They assumed the minority of Americans fighting for a repeal would never overwhelm the will of the majority. SB8 has changed all that. Its underpinnings threaten so many constitutional rights that abortion is going to be front and centre in every state and federal election.

Democracy does work, even if, as with prohibition, it takes time to roll back injustices. Last year the Virginia state legislature voted to remove more than a decade’s worth of abortion restrictions. This is the body that in 2012 stood accused of “state-sanctioned rape” for passing a bill that required any woman seeking an abortion to submit to an ultrasound first, not by the usual external method but with a transvaginal wand.

Despite what anti-abortion fanatics believe, the US is a pro-choice country. The fight for women’s rights will go on, and on, until the people win.

Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.


The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, foot-bound messengers ran impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to the Emperor Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. The assistance, unfortunately, disqualified him, nullifying his time of 2:54:46.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: A Legacy of Tinderbox Forests

Long before climate change exacerbated the problem, policies meant to suppress wildfires served to fan the flames

The Wall Street Journal

August 19, 2021

This year’s heat waves and droughts have led to record-breaking wildfires across three continents. The fires in Siberia are so vast that smoke has reached the North Pole for what is believed to be the first time. In the United States, California’s Dixie Fire has become the largest single fire in the state’s history.

Humans have long wrestled with forest fire, seeking by turns to harness and to suppress it. Early European efforts to control forest fires were tentative and patchy. In the 14th century, the Sardinians experimented with fire breaks, but the practice was slow to catch on. In North America, by contrast, scientists have found 2,000-year-old evidence of controlled burnings by Native American tribes. But this practice died out with the arrival of European immigrants, because of local bans as well as the expulsion of tribes from native lands. As a consequence, forests not only became larger and denser but also filled with mounds of dead and dried vegetation, making them very susceptible to fire.

Disaster struck in the fall of 1871. Dozens of wildfires broke out simultaneously across the Upper Midwest and Great Lakes region, and on Oct. 8, the same night as the Great Chicago Fire, a firestorm engulfed the town of Peshtigo, Wisconsin, killing an estimated 1,200 people (and possibly many more). The fire was the deadliest in U.S. history.


Early conservationists, such as Franklin Hough, sought to organize a national wildfire policy. The U.S. Forest Service was created in 1905 and was still figuring out its mission in 1910, when the northern Rockies went up in flames. Fires raced through Washington, Montana and Idaho, culminating in what is known as the “Big Blowup” of August 20 and 21.

One of the U.S. Forest Service’s newly minted rangers, Edward C. Pulaski, was leading a team of 45 firefighters near Wallace, Idaho. A firestorm surrounded the men, forcing Pulaski to lead them down a disused mine shaft. Several attempted to go back outside, believing they would be cooked alive in the smoke-filled mine. Pulaski managed to stop the suicidal exodus by threatening to shoot any man who tried to leave. To maintain morale, he organized a water bucket chain to prevent the blankets covering the exit from catching fire. Pulaski’s actions saved the lives of all but five of the firefighters, but his eyesight and lungs never recovered.

The Big Blowup destroyed more than 3 million acres in two days and killed at least 80 people. In response to the devastation, the Forest Service, with the public’s support, adopted the mistaken goal of total fire suppression rather than fire management. The policy remained in place until 1978, bequeathing to the country a legacy of tinderbox forests.

Nowadays, the Forest Service conducts controlled burns, known as prescribed fires, to mitigate the risk of wildfire. The technique is credited with helping to contain July’s 413,000-acre Bootleg Fire in Oregon.

Fire caused by the effects of climate change will require human intervention of another order. In the Bible’s Book of Isaiah, the unhappy “sinners of Zion” cry out: “Who of us can live where there is a consuming fire? Who among us can dwell with everlasting burnings?” Who, indeed.

Historically Speaking: Let Slip the Dogs, Birds and Donkeys of War

Animals have served human militaries with distinction since ancient times

The Wall Street Journal

August 5, 2021

Cher Ami, a carrier pigeon credited with rescuing a U.S. battalion from friendly fire in World War I, has been on display at the Smithsonian for more than a century. The bird made news again this summer, when DNA testing revealed that the avian hero was a “he” and not—as two feature films, several novels and a host of poems depicted—a “she.”

Cher Ami was one of more than 200,000 messenger pigeons Allied forces employed during the war. On Oct. 4, 1918, a battalion from the U.S. 77th Infantry Division in Verdun, northern France, was trapped behind enemy lines. The Germans had grown adept at shooting down any bird suspected of working for the other side. They struck Cher Ami in the chest and leg—but the pigeon still managed to make the perilous flight back to his loft with a message for U.S. headquarters.

Animals have played a crucial role in human warfare since ancient times. One of the earliest depictions of a war animal appears on the celebrated 4,500-year-old Sumerian box known as the Standard of Ur. One side shows scenes of war; the other, scenes of peace. On the war side, animals that are most probably onagers, a species of wild donkey, are shown dragging a chariot over the bodies of enemy soldiers.

War elephants of Pyrrhus in a 20th-century Russian painting

The two most feared war animals of the classical world were horses and elephants. Alexander the Great perfected the use of the former and introduced the latter after his foray into India in 327 B.C. For a time, the elephant was the ultimate weapon of war. At the Battle of Heraclea in 280 B.C., a mere 20 of them helped Pyrrhus, king of Epirus—whose costly victories inspired the term “Pyrrhic victory”—rout an entire Roman army.

War animals didn’t have to be big to be effective, however. The Romans learned how to defeat elephants by exploiting their fear of pigs. In 184 B.C., the Carthaginian general Hannibal won a surprise naval victory against King Eumenes II of Pergamon by catapulting “snake bombs”—jars stuffed with poisonous snakes—onto his ships. Centuries later, in A.D. 198, the citizens of Hatra, near Mosul in modern Iraq, successfully fought off a Roman attack by pouring scorpions on the heads of the besiegers.

Ancient war animals often suffered extraordinary cruelty. When the Romans sent pigs to confront Pyrrhus’s army, they doused the animals in oil and set them on fire to make them more terrifying. Hannibal would get his elephants drunk and stab their legs to make them angry.

Counterintuitively, as warfare became more mechanized, the need for animals increased. Artillery needed transporting; supplies, camps, and prisoners needed guarding. A favorite mascot or horse might be well treated: George Washington had Nelson, and Napoleon had Marengo. But the life of the common army animal was hard and short. The Civil War killed between one and three million horses, mules and donkeys.

According to the Imperial War Museum in Britain, some 16 million animals served during World War I, including canaries, dogs, bears and monkeys. Horses bore the brunt of the fighting, though, with as many as 8 million dying over the four years.

Dolphins and sea lions have conducted underwater surveillance for the U.S. Navy and helped to clear mines in the Persian Gulf. The U.S. Army relies on dogs to detect hidden IEDs, locate missing soldiers, and even fight when necessary. In 2016, four sniffer dogs serving in Afghanistan were awarded the K-9 Medal of Courage by the American Humane Association. As the troop withdrawal continues, the military’s four-legged warriors are coming home, too.

Historically Speaking: Diabetes and the Miracle of Insulin

One hundred years ago, a team of Canadian researchers showed that an age-old disease didn’t have to mean a death sentence.

The Wall Street Journal

July 22, 2021

The human body runs on glucose, a type of sugar that travels through the bloodstream to the cells, where it is converted into energy. Some 34.2 million Americans are diabetic; their bodies cannot produce, or cannot properly use, the hormone insulin, which regulates how glucose is processed and stored in the cells. Without treatment the condition is terminal. The discovery of insulin a century ago this year was one of the great medical breakthroughs of the 20th century.

Diabetes was first recognized some 4,000 years ago. The Ebers Papyrus, an Egyptian medical text written around 1550 B.C., refers to patients suffering from thirst, frequent urination and weight loss. An ancient Indian text, the Sushruta Samhita, composed after the 7th century B.C., advised testing for diabetes by seeing whether ants were attracted to the sugar in the urine.

The excessive urination experienced by sufferers is probably why the ancient Greeks called the disease “diabetes,” their word for “siphon” or “to pass through.” They also made the link to lifestyle: The 5th-century B.C. physician Hippocrates, the “father of medicine,” advocated exercise as part of the treatment.

Early on, the Chinese recognized that an unbalanced diet of sweet, rich and fatty foods could play a role. Lady Dai, a minor aristocrat who died in the 2nd century B.C., was a textbook case. Her perfectly preserved mummy, discovered in southern China in 1971, revealed a life of dietary excess. She also suffered the consequences: osteoarthritis, high cholesterol, hypertension, liver disease, gallstones and, crucially, diabetes.

Over time, physicians became more expert at diagnosis. The role of sugar came into sharper focus in the 1770s after the English doctor Matthew Dobson discovered that it stayed in the blood as well as urine. But no further progress was made until the end of the 19th century. In 1889 Oskar Minkowski conducted experiments on dogs at the University of Strasbourg to prove that a nonfunctioning pancreas triggered diabetes.


By the early 20th century, scientists knew that a pancreatic secretion was responsible for controlling glucose in the body, but they couldn’t isolate it. The Canadian researchers credited with finding the answer—John Macleod, Frederick Banting, Charles Best and James Collip—were an unlikely team. Banting was a surgeon, not a scientist. Yet he sufficiently impressed Macleod, a professor of physiology at the University of Toronto, that the latter lent him lab space and a research assistant, Best. The pairing almost ended in a fist fight, but Banting and Best got over their differences and in July 1921 successfully injected insulin into a dog.

Macleod then brought in Collip, a biochemist on a research sabbatical, to help make a human-compatible version. This led to more infighting and Collip threatened to leave. The dysfunctional group somehow held together long enough to try their insulin on a 14-year-old named Leonard Thompson. He lived another 13 years.

The research team sold their patent rights to the University of Toronto for $1, believing insulin was too vital to be exploited. Their idealism was betrayed: Today, manufacture of the drug is controlled by three companies, and according to a 2018 Yale study published in JAMA, its high cost is forcing 1 in 4 Americans to skimp on their medication. The next frontier for insulin is finding a way to make it affordable for all.

Historically Speaking: The Beacon of the Public Library

Building places for ordinary people to read and share books has been a passion project of knowledge-seekers since before Roman times.

The Wall Street Journal

July 8, 2021

“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.

His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.

Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.


Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact the library eroded through institutional neglect over many years. Caesar was himself responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.

Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.

A thousand years after Soter and Demetrius built the Library of Alexandria, Renaissance Florence benefited from a similar partnership between Cosimo de’ Medici and the scholar Niccolò de’ Niccoli. At Niccolò’s death in 1437, Cosimo carried out his friend’s wishes to bequeath his books to the people. Not only was the magnificent Biblioteca San Marco Italy’s first purpose-built public library, but its emphasis on reading spaces and natural light became the template for library architecture.

By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.

The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.

But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, with 1,679 spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.

The Sunday Times: Rumsfeld was the wrong man at the wrong time

Bush’s war supremo brought about his own worst fear: another Vietnam

July 4, 2021

On the whole, sacked US defence secretaries should avoid quoting Winston Churchill as they depart from the White House, in much the same way as disgraced preachers should leave off quoting Jesus as they are led away in handcuffs. Donald Rumsfeld, who died aged 88 last week, was the chief architect of the American invasions of Afghanistan and Iraq. The aftermath, however, was widely judged a failure, and President George W Bush reversed course after Rumsfeld’s departure in December 2006, sending in extra troops to stop Iraq’s descent into civil war. Rumsfeld handled being fired with his customary mix of puckish humour and pugnacious bravado. In his final speech he paraphrased Churchill, saying: “I have benefited greatly from criticism and at no time have I suffered a lack thereof.”

The last bit was true then, and it continued to be until his death. Critics of Rumsfeld’s refusal to commit sufficient fighting troops to either Afghanistan or Iraq could fill a football pitch. (The first of many damning appraisals appeared in 2007, entitled Rumsfeld: An American Disaster.) But the claim that he benefited from criticism was so laughable to anyone who knew him that it only highlighted the catastrophic deficiencies of the man. Rumsfeld was incapable of listening to anyone who didn’t already agree with him. He was the wrong man for the job at the most inopportune time in America’s history.

As several obituaries of Rumsfeld pointed out, his first stint as defence secretary, under Gerald Ford in 1975-77, happened under arguably worse circumstances. A survivor from Richard Nixon’s administration, where he stood out for his unwavering commitment to civil rights, Rumsfeld was the White House chief of staff during the last days of Saigon in April 1975. Appointed defence secretary shortly afterwards, Rumsfeld regarded it as his mission to keep the military ready and competitive but essentially inactive. This wasn’t cowardice but good Cold War strategy.

Rumsfeld’s reputation as a strategic thinker was subsequently borne out by his wildly successful transition to the business world. He was also a clear, no-nonsense communicator, whose fondness for aphorisms and golden rules became the stuff of legend. When Rumsfeld left the White House for the first time, he bequeathed a memorandum of best practices, Rumsfeld’s Rules, 11 of which were specific to the secretary of defence. They began with the reminder: “The secretary of defence is not a super general or admiral. His task is to exercise civilian control over the department for the commander-in-chief [the president] and the country”, and included such important pieces of advice as: “Establish good relations between the departments of Defence, State, the National Security Council, CIA and the Office of Management and Budget.”

When Rumsfeld returned to the White House in 2001, aged 68, he broke almost every one of his own rules. Not only did he allow relations between the departments to become utterly dysfunctional but he so alienated the generals and joint chiefs of staff that it was widely assumed he “hated” the military. The serving chairman of the joint chiefs, General Hugh Shelton, complained that Rumsfeld operated “the worst style of leadership I witnessed in 38 years of service”. Rumsfeld treated all soldiers as boneheaded grunts, and the generals simply as boneheaded grunts with the medals to prove it.

His planned military transformations suffered from an overly technocratic mentality. He believed that the army was costly and inefficient — what army isn’t? — as though bottom lines apply equally to business and fighting. Rumsfeld wanted to remake the military as one that relied more on air power and technical advantages and less on soldiers with guns. The charge against him is that he eviscerated the military’s ground capabilities and reduced its fighting numbers at precisely the moment both were paramount to American success in Afghanistan and Iraq.

What was going through Rumsfeld’s mind? Millions of words have been spilt in the effort to explain why the defence secretary doggedly pursued a losing policy in the teeth of protests from the military. In his last year in the job, six retired generals publicly rebuked him. Part of the answer lies in his character: he was a micromanager who failed to engage, a bureaucrat who despised professionals, an aggressor who disliked fighting. But the real driver of his actions can be traced back to 1975. More than anything else, even more than winning perhaps, he wanted to avoid a repeat of the Vietnam War, with its “limited war” rhetoric that was belied by the fact it was the first of what the US media have dubbed America’s “forever wars” — metastasising conflicts without a clear end in sight.

Rumsfeld emerged from his first period in the White House blaming the military for having misled successive administrations into committing themselves to an unwinnable and unpopular war. Hence, he believed that nothing the military said could be taken at face value. He was not going to allow himself to be taken prisoner by the top brass. Unlike Robert McNamara, the defence secretary most associated with US involvement in Vietnam, Rumsfeld was determined to stick to quick, in-and-out military objectives. There would be no quagmires, mass body bags, forever wars or attempts at nation-building on his watch.

Yet he was a prisoner all the same. Even though the causes and conditions were different, the Vietnam War remained the lens through which Americans judged the Iraq war. A few months after the coalition’s initial victory in Iraq in 2003, Senator John McCain, who spent five years as a PoW in Vietnam, warned the Bush administration that stopping the army doing its job, as interpreted by McCain, would risk “the most serious American defeat on the global stage since Vietnam”. By 2004, Democrats were saying: “Iraq is George Bush’s Vietnam”. Rumsfeld ended up being directly compared to McNamara, even though, by winding down, rather than ratcheting up, the US presence in the Middle East, he was doing the opposite of his predecessor. A 2005 column in the Washington Post intoned: “Just as Vietnam became McNamara’s war, Iraq has become Rumsfeld’s war”.

The successful troop “surge” of 2007 appeared to erase Rumsfeld’s Vietnam legacy. Only it didn’t. Barack Obama’s foreign policy — summed up as “leading from behind” — was Rumsfeldian in its horror of American military entanglement in foreign affairs. The Trump administration’s was even more so. More than six trillion war dollars later, with countless lives lost, if Rumsfeld leaves anything behind, it’s the lesson that America’s forever war syndrome is a state of mind, not just a reality.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” derives from the Latin officium, meaning a position or duty, formed from ob (“toward”) and facere (“to do”). Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.

Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, from a Latin phrase meaning “he who sues for himself does also for the king.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrong-doers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that’s now the subject of debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If that happens, it would be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. Their anomalous position came to a head in 1857 after a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, and while the owner was a citizen, he wasn’t the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker’s attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.