The Sunday Times: I don’t want to fight about it but this talk of US civil war is overblown

Experts on conflict predict unrest, but America has a long way to go before it is as divided as it was in 1861

The Sunday Times

January 9, 2022

Violence is in the air. No one who saw the shocking scenes during the Capitol riot in Washington on January 6, 2021, can pretend that it was just a big misunderstanding. Donald Trump and his allies attempted to retain power at all costs. Terrible things happened that day. A year later the wounds are still raw and the country is still polarised. Only Democratic leaders participated in last week’s anniversary commemoration; Republicans stayed away. The one-year mark has produced a blizzard of warnings that the US is spiralling into a second civil war.

Only an idiot would ignore the obvious signs of a country turning against itself. Happy, contented electorates don’t storm their parliament (although terrified and oppressed peoples don’t either). America has reached the point where the mid-term elections are no longer a yawn but a test case for future civil unrest.

Predictably, the left and right are equally loud in their denunciations of each other. “Liberals” look at “conservatives” and see the alt-right: white supremacists and religious fanatics working together to suppress voting rights, women’s rights and democratic rights. Conservatives stare back and see antifa: essentially, progressive totalitarians making common cause with socialists and anarchists to undermine the pillars of American freedom and democracy. Put the two sides together and you have an electorate that has become angry, suspicious and volatile.

The looming threat of a civil war is almost the only thing that unites pundits and politicians across the political spectrum. Two new books, one by the Canadian journalist Stephen Marche and the other by the conflict analyst Barbara Walter, argue that the conditions for civil war are already in place. Walter believes that America is embracing “anocracy” (outwardly democratic, inwardly autocratic), joining a dismal list of countries that includes Turkey, Hungary and Poland. The two authors’ arguments have been boosted by the warnings of respected historians, including Timothy Snyder, who wrote in The New York Times that the US is teetering over the “abyss” of civil war.

If you accept the premise that America is facing, at the very least, a severe test of its democracy, then it is all the more important to subject the claims of incipient civil war to rigorous analysis. The fears aren’t baseless; the problem is that predictions are slippery things. How to prove a negative against something that hasn’t happened yet? There’s also the danger of the self-fulfilling prophecy: wishing and predicting don’t make things so, although they certainly help to fix the idea in people’s minds. The more Americans say that the past is repeating itself and the country has reached the point of no return, the more likely it will be believed.

Predictions based on comparisons to Weimar Germany, Nazi Germany, the Russian Revolution and the fall of Rome are simplistic and easy to dismiss. But, just as there is absolutely no basis for the jailed Capitol rioters to compare themselves to “Jews in Germany”, as one woman recently did, arguments that equate today’s fractured politics with the extreme violence that plagued the country just before the Civil War are equally overblown — not to mention trivialising of its 1.5 million casualties.

There simply isn’t a correlation between the factors dividing America then and now. In the run-up to the war in 1861, the North and South were already distinct entities in terms of ethnicity, customs and law. Crucially, the North’s economy was based on free labour and was prone to slumps and financial panics, whereas the South’s depended on slavery and was richer and more stable. The 11 Southern states seceded because they had local government, the military and judicial institutions on side.

Today there is a far greater plurality of voters spread out geographically. President Biden won Virginia and Georgia and almost picked up Texas in 2020; in 1860 there were ten Southern states where Abraham Lincoln didn’t even appear on the ballot.

When it comes to assessing the validity of generally accepted conditions for civil breakdown, the picture becomes more complicated. A 2006 study by the political scientists Håvard Hegre and Nicholas Sambanis found that at least 88 circumstances are used to explain civil war. The generally accepted ones include: a fragile economy, deep ethnic and religious divides, weak government, long-standing grievances and factionalised elites. But models and circumstances are like railway tracks: they take us down one path and blind us to the others.

In 2015 the European think tank VoxEU conducted a historical analysis of over 100 financial crises between 1870 and 2014. Researchers found a pattern of street violence, greater distrust of government, increased polarisation and a rise in popular and right-wing parties in the five years after a crisis. This would perfectly describe the situation in the US except for one thing: the polarisation and populism have coincided with falling unemployment and economic growth. The Capitol riot took place despite, not because of, the strength of the financial system.

A country can meet a whole checklist of conditions and not erupt into outright civil war (for example, Northern Ireland in the 1970s) or meet only a few of the conditions and become a total disaster. It’s not only possible for the US, a rich, developed nation, to share certain similarities with an impoverished, conflict-ridden country and yet not become one; it’s also quite likely, given that for much of its history it has held together while being a violent, populist-driven society seething with racial and religious antagonisms behind a veneer of civil discourse. This is not an argument for complacency; it is simply a reminder that theory is not destiny.

A more worrying aspect of the torrent of civil war predictions by experts and ordinary Americans alike is the readiness to demonise and assume the absolute worst of the other side. It’s a problem when millions of voters believe that the American polity is irredeemably tainted, whether by corruption, communism, elitism, racism or what have you. The social cost of this divide is enormous. According to the Armed Conflict Location and Event Data Project: “In this hyper-polarised environment, state forces are taking a more heavy-handed approach to dissent, non-state actors are becoming more active and assertive and counter-demonstrators are looking to resolve their political disputes in the street.”

Dissecting the roots of America’s lurch towards rebarbative populism requires a particular kind of micro human analysis involving real-life interviews with perpetrators and protesters as well as trawls through huge sets of data. The results have shown that, more often than not, the attribution of white supremacist motives to the Capitol rioters, or anti-Americanism to Black Lives Matter protesters, says more about the politics of the accuser than the accused.

Social media is an amplifier for hire — polarisation lies at the heart of its business models and algorithms. Researchers looking at the “how” rather than just the “why” of America’s political Balkanisation have also found evidence of large-scale foreign manipulation of social media. A recent investigation by ProPublica and The Washington Post revealed that after November 3, 2020, there were more than 10,000 posts a day on Facebook attacking the legitimacy of the election.

In 2018 a Rasmussen poll asked American voters whether the US would experience a second civil war within the next five years. Almost a third said it would. In a similar poll conducted last year the proportion had risen to 46 per cent. Is it concerning? Yes. Does it make the prediction true? Well, polls also showed a win for Hillary Clinton and a landslide for Joe Biden. So, no.

Historically Speaking: How the Waistband Got Its Stretch

Once upon a time, human girth was bound by hooks and buttons, and corsets had metal stays. Along came rubber and a whole new technology of flexible cloth.

The Wall Street Journal

January 7, 2022

The New Year has arrived, and if you’re like me, you’ve promised yourself a slimmer, fitter and healthier you in 2022. But in the meantime there is the old you to deal with—the you who overindulged at Thanksgiving and didn’t stop for the next 37 days. No miracle diet or resolution can instantaneously eradicate five weeks of wild excess. Fortunately, modern science has provided the next best thing to a miracle: the elasticated waistband.

Before the invention of elastic, adjustable clothing was dependent on technology that had hardly changed since ancient times: the Indus Valley Civilization made buttons from seashells as early as 2000 BC.

The first inkling that there might be an alternative to buttons, belts, hooks and other adjustable paraphernalia came in the late 18th century, with the discovery that rubber wasn’t only good for toys. It also had immensely practical applications for things such as pencil erasers and lid sealants. Rubber’s stretchable nature offered further possibilities in the clothing department. But there was no word for its special property until the poet William Cowper borrowed the 17th-century term “elastic,” used to describe the expansion and contraction of gases, for his translation of the Iliad in 1791: “At once he bent Against Tydides his elastic bow.”


By 1820, an enterprising English engineer named Thomas Hancock was making elastic straps and suspenders out of rubber. He also invented the “masticator,” a machine that rolled shredded rubber into sheets for industrial use. Elastic seemed poised to make a breakthrough: In the 1840s, Queen Victoria’s shoemaker, Joseph Sparkes Hall, popularized his invention of the elastic-gusset ankle boot, still known today as the Chelsea Boot.

But rubber had drawbacks. Not only was it a rare and expensive luxury that tended to wear out quickly, it was also sticky, sweaty and smelly. Elasticized textiles became popular only after World War I, helped by the demand for steel—and female workers—that led women to forgo corsets with metal stays. Improved production techniques at last made elasticated girdles a viable alternative: In 1924, the Madame X rubber girdle promised to help women achieve a thinner form in “perfect comfort while you sit, work or play.”

The promise of comfort became real with the invention of Lastex, essentially rubber yarn, in 1930. Four years later, Alexander Simpson, a London tailor, removed the need for belts or suspenders by introducing the adjustable rubber waistband in men’s trousers.

The constant threat of rubber shortages sparked a global race to devise synthetic alternatives. The winner was the DuPont Company, which invented neoprene in 1930. That research led to an even more exciting invention: the nylon stocking. Sales were halted during World War II, creating such pent-up demand that in 1946 there were “nylon riots” throughout the U.S., including in Pittsburgh, where 40,000 people tried to buy 13,000 pairs of stockings.

DuPont scored another win in 1958 with spandex, also known under the brand name Lycra, which is not only more durable than nylon but also stretchier. Spandex made dreams possible by making fabrics more flexible and forgiving: It helped the astronaut Neil Armstrong to walk on the moon and Simone Biles to become the most decorated female gymnast in history. And it will help me to breathe a little easier until I can fit into my jeans again.

Historically Speaking: Boycotts that Brought Change

Modern rights movements have often used the threat of lost business to press for progress

The Wall Street Journal

November 12, 2021

Sixty-five years ago, on Nov. 13, 1956, the U.S. Supreme Court upheld Browder v. Gayle, putting an end to racial segregation on buses. The organizers of the Montgomery bus boycott, which had begun shortly after Rosa Parks’s arrest on Dec. 1, 1955, declared victory and called off the campaign as soon as the ruling came into effect. But that was only the beginning. As the Civil Rights Movement progressed, boycotts of businesses that supported segregation became a regular—and successful—feature of Black protests.

Boycotts are often confused with other forms of protest. At least four times between 494 and 287 B.C., Rome’s plebeian class marched out of the city en masse, withholding their labor in an effort to win more political rights. Some historians describe this as a boycott, but it more closely resembled a general strike. A better historical precedent is the West’s reaction to Sultan Mehmed II’s imposition of punitive taxes on Silk Road users in 1453. European traders simply avoided the overland networks via Constantinople in favor of new sea routes, initiating the so-called Age of Discovery.

The first modern boycotts began in the 18th century. Britain increased taxes on its colonial subjects following the Seven Years’ War (1756-1763), inciting a wave of civil disobedience. Merchants in Philadelphia, New York and Boston united to boycott British imports. Together with intense political lobbying by Benjamin Franklin, among others, the boycott resulted in the levies’ repeal in 1766. Several years later, London again tried to raise revenue through taxes, touching off a more famous boycott, during which the self-styled Sons of Liberty dumped the contents of 342 tea chests into Boston Harbor on Dec. 16, 1773.

In 1791 the British abolitionist movement instigated a national boycott of products made by enslaved people. As many as 300,000 British families stopped buying West Indian sugar, causing sales to drop by a third to half in some areas. Although the French Revolutionary Wars stymied this campaign, its early successes demonstrated the power of consumer protest.

Despite the growing popularity of the action, the term “boycott” wasn’t used until 1880. It was coined in Ireland as part of a campaign of civil disobedience against absentee landowners. Locals turned Captain Charles Cunningham Boycott, an unpopular land agent in County Mayo for the Earl of Erne, into a pariah. His isolation was so complete that the Boycott family relocated to England. The “boycott” of Boycott not only garnered international attention but inspired imitators.

Boycotts enabled oppressed people around the world to make their voices heard. But the same tool could also be a powerful weapon in the hands of oppressors. During the late 19th century in the American west, Chinese workers and the businesses that hired them were often subject to nativist boycotts. In 1933, German Nazi party leaders organized a nationwide boycott of Jewish-owned business in the lead-up to the passage of a law barring Jews from public sector employment.

Despite the potential for misuse, the popularity of economic and political boycotts increased after World War II. In January 1957, shortly after the successful conclusion of the Montgomery bus boycott, Black South Africans in Johannesburg started their own bus boycott. From this local protest grew a national and then international boycott movement that continued until South Africa ended apartheid in 1991. Sometimes the penny, as well as the pen, is more powerful than the sword.

Historically Speaking: When Masquerade Was All the Rage

Before there was Halloween, there were costume balls and Carnival, among other occasions for the liberation of dressing up

The Wall Street Journal

October 28, 2021

Costume parades and Halloween parties are back after being canceled last year. Donning a costume and mask to go prancing around might seem like the height of frivolity, but the act of dressing up has deep roots in the human psyche.

During the early classical era, worshipers at the annual festivals of Dionysus—the god of wine, ritual madness and impersonation, among other things—expanded mask-wearing from religious use to personal celebrations and plays performed in his honor. Masks symbolized the suspension of real-world rules: A human could become a god, an ordinary citizen could become a king, a man could be a woman. Anthropologists call such practices “rituals of inversion.”

In Christianized Europe, despite official disapproval of paganism, rituals of inversion not only survived but flourished. Carnival—possibly a corruption of the Latin phrase “carne vale,” farewell to meat, because the festival took place before Lent—included the Feast of Fools, where junior clergymen are alleged to have dressed as nuns and bishops and danced in the streets.

By the 13th century, the Venetians had taken to dressing up and wearing masks with such gusto that the Venice Carnival became an occasion for ever more elaborate masquerade. The city’s Great Council passed special laws to keep the practice within bounds, such as banning masks while gambling or visiting convents.


The liberation granted by a costume could be dangerous. In January of 1393, King Charles VI of France and his wife, Isabeau of Bavaria, held the Bal des Sauvages, or Wild Men’s Ball, to celebrate the wedding of one of her ladies-in-waiting. The king had already suffered his first bout of insanity, and it was hoped that the costume ball would be an emotional outlet for his disordered mind. But the farce became a tragedy. The king and his entourage, dressed as hairy wild men, were meant to perform a “crazy” dance. Horrifically, the costumes caught fire, and only Charles and one other knight survived.

The masked ball became a staple of royal entertainments, offering delicious opportunities for sexual subterfuge and social subversion. At a masquerade in 1745, Louis XV of France disguised himself as a yew tree so he could pursue his latest love, the future Madame de Pompadour. Meanwhile, the Dauphine danced the night away with a charming Spanish knight, not realizing he was a lowly cook who had tricked his way in. More ominously, a group of disaffected Swedish nobles infiltrated a masquerade to assassinate King Gustav III in 1792. Five years later, the new ruler of Venice, Francis II of Austria, banned Carnival and forbade the city’s residents to wear masks.

Queen Victoria helped to return dress-up parties to respectability with historically-themed balls that celebrated creativity rather than debauchery. By 1893, American Vogue could run articles about fabulous Halloween costumes without fear of offense. The first Halloween parade took place not in cosmopolitan New York but in rural Hiawatha, Kansas, in 1914.

In the modern era, the taint of anarchy and licentiousness associated with dressing up has been replaced by complaints about cultural appropriation, a concern that would have baffled our ancestors. Becoming what we are not, however briefly, is part of being who we are.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.


The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Urged on by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, the bark’s active compound, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theater in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the waning years of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Dante’s Enduring Vision of Hell

The “Inferno” brought human complexity to the medieval conception of the afterlife

The Wall Street Journal

September 30, 2021

What is hell? For Plato, it was Tartarus, the lowest level of Hades where those who had sinned against the gods suffered eternal punishment. For Jean-Paul Sartre, the father of existentialism, hell was other people. For many travelers today, it is airport security.

No depiction of hell, however, has been more enduring than the “Inferno,” part one of the “Divine Comedy” by Dante Alighieri, the 700th anniversary of whose death is commemorated this year. Dante’s hell is divided into nine concentric circles, each one more terrifying and brutal than the last until the frozen center, where Satan resides alongside Judas, Brutus and Cassius. With Virgil as his guide, Dante’s spiritually bereft and depressed alter ego enters via a gate bearing the motto “Abandon all hope, ye who enter here”—a phrase so ubiquitous in modern times that it greets visitors to Disney’s Pirates of the Caribbean ride.

The inscription was a Dantean invention, but the idea of a physical gate separating the land of the living from a desolate one of the dead was already at least 3,000 years old: In the Sumerian Epic of Gilgamesh, written around 2150 B.C., two scorpionlike figures guard the gateway to an underworld filled with darkness and dust.


The underworld of the ancient Egyptians was only marginally less bleak. Seven gates blocked the way to the Hall of Judgment, according to the Book of the Dead. Getting through them was arduous and fraught with failure. The successful then had to submit to having their hearts weighed against the Feather of Truth. Those found wanting were thrown into the fire of oblivion.

Zoroastrianism, the official religion of the ancient Persians, was possibly the first to divide the afterlife into two physically separate places, one for good souls and the other for bad. This vision contrasted with the Greek view of Hades as the catchall for the human soul and the early Hebrew Bible’s description of Sheol as a shadowy pit of nothingness. In the 4th century B.C., Alexander the Great’s Macedonian empire swallowed both Persia and Judea, and the three visions of the afterlife commingled. The word “hell” would later appear frequently in English translations of the New Testament. But the word, scholars point out, was a single translation for several distinct Greek and Hebrew terms.

Early Christianity offered more than one vision of hell, but all contained the essential elements of Satan, sinners and fire. The “Apocalypse of Peter,” a 2nd century text, helped start the trend of listing every sadistic torture that awaited the wicked.

Dante was thus following a well-trod path with his imaginatively crafted punishments of boiling pitch for the dishonest and downpours of icy rain on the gluttonous. But he deviated from tradition by describing Hell’s occupants with psychological depth and insight. Dante’s narrator rediscovers the meaning of Christian truth and love through his encounters. In this way the “Inferno” speaks to the complexities of the human condition rather than serving merely as a literary zoo of the damned.

The “Divine Comedy” changed the medieval world’s conception of hell, and with it, man’s understanding of himself. Boccaccio, Chaucer, Milton, Balzac—the list of writers directly inspired by Dante’s vision goes on. “Dante and Shakespeare divide the world between them,” wrote T.S. Eliot. “There is no third.”

Historically Speaking: For Punishment or Penitence?

Fifty years ago, the Attica uprising laid bare the conflicting ideas at the heart of the U.S. prison system.

The Wall Street Journal

September 17, 2021

Fifty years ago this past week, inmates in Attica, New York, staged America’s deadliest prison uprising. The organizers held prison employees hostage while demanding better conditions. One officer and three inmates were killed during the rioting, and the revolt’s suppression left another 39 dead and at least 89 seriously wounded. The episode raised serious questions about prison conditions and ultimately led to some reforms.

Nearly two centuries earlier, the founders of the U.S. penal system had intended it as a humane alternative to those that relied on such physical punishments as mutilation and whipping. After the War of Independence, Benjamin Franklin and leading members of Philadelphia’s Quaker community argued that prison should be a place of correction and penitence. Their vision was behind the construction of the country’s first “penitentiary house” at the Walnut Street Jail in Philadelphia in 1790. The old facility threw all prisoners together; its new addition contained individual cells meant to prevent moral contagion and to encourage prisoners to spend time reflecting on their crimes.

Inmates protest prison conditions in Attica, New York, Sept. 10, 1971

Walnut Street inspired the construction of the first purpose-built prison, Eastern State Penitentiary, which opened outside of Philadelphia in 1829. Prisoners were kept in solitary confinement and slept, worked and ate in their cells—a model that became known as the Pennsylvania system. Neighboring New York adopted the Auburn system, which also enforced total silence but required prisoners to work in communal workshops and instilled discipline through surveillance, humiliation and corporal punishment. Although both systems were designed to prevent recidivism, the former stressed prisoner reform while the latter carried more than a hint of retribution.

Europeans were fascinated to see which system worked best. In 1831, the French government sent Alexis de Tocqueville and Gustave de Beaumont to investigate. Having inspected facilities in several states, they concluded that although the “penitentiary system in America is severe,” its combination of isolation and work offered hope of rehabilitation. But the novelist Charles Dickens reached the opposite conclusion. After touring Eastern State Penitentiary in 1842, he wrote that the intentions behind solitary confinement were “kind, humane and meant for reformation.” In practice, however, total isolation was “worse than any torture of the body”: It broke rather than reformed people.

Severe overcrowding—there was no parole in the 19th century—eventually undermined both systems. Prisoner violence became endemic, and regimes of control grew harsher. Sing Sing prison in New York meted out 36,000 lashes in 1843 alone. In 1870, the National Congress on Penitentiary and Reformatory Discipline proposed reforms, including education and work-release initiatives. Despite such efforts, recidivism rates remained high, physical punishment remained the norm and almost 200 serious prison riots were recorded between 1855 and 1955.

That year, Harry Manuel Shulman, a deputy commissioner in New York City’s Department of Correction, wrote an essay arguing that the country’s early failure to decide on the purpose of prison had immobilized the system, leaving it “with one foot in the road of rehabilitation and the other in the road of punishment.” Which would it choose? Sixteen years later, Attica demonstrated the consequences of ignoring the question.

The Sunday Times: Texas Talibanistas, take note: freedom will win

The blow to abortion rights is shocking, but this fight is nowhere near over

The Sunday Times

September 7, 2021

The pro-life movement in America finally got its wish this week: a little before midnight on Wednesday, in a 5-4 decision, the Supreme Court ruled against temporarily blocking a Texas state law passed in May, known as Senate Bill 8 (SB8), banning almost all abortions once a heartbeat can be detected by ultrasound — which is around six weeks into a pregnancy. The bill will still eventually return to the Supreme Court for a final decision, but by being allowed to stand unchanged it becomes the strictest anti-abortion law in the nation. There are no exceptions for child pregnancy, rape or incest.

But this isn’t the reason for the national uproar. SB8 goes further than any other anti-abortion bill yet crafted because of the way it allows the ban to be enforced. Under the new Texas law, a $10,000 bounty will be awarded to any US citizen who successfully sues a person or entity that helps a woman to obtain an abortion. “Help” includes providing money, transport, medicines or medical aid.

To speed up the process, Texas Right to Life, an anti-abortion organisation, has already set up an anonymous tip line for “whistleblowers”. That’s right, the second-largest state in the union by size and population is turning family against family, neighbour against neighbour, to create its own spy network of uterus police. Welcome to Gilead-on-the-Rio Grande. Cue outrage from all Americans who support legal abortion — and, according to recent polls, they amount to 58 per cent of the country.

There is no doubt that SB8 is a huge victory for the pro-life campaign. Texas joins 24 countries worldwide that have a total or near-total ban on abortion. Outside the big cities, large swathes of America are already abortion-free zones: only 11 per cent of counties have a hospital or clinic that provides such services.

In the short term the outlook for that most basic of human rights, a woman’s control over her body, is dire in America. The combination of a Republican-packed Supreme Court, thanks to Donald Trump’s last-minute appointment of Amy Coney Barrett following the death in September last year of Ruth Bader Ginsburg, and SB8’s sneaky bypassing of federal authority has closed down the obvious routes for legal redress. Moreover, the Senate is tied 50-50, making it impossible for Congress to pass a law mandating a woman’s unrestricted access to abortion. The Texas Talibanistas have gained the upper hand. Similar laws to SB8 will no doubt be passed in other Republican states.

The Texas appeal to vigilantism should also offend everyone who believes in democracy and the rule of law. But — and this may be hard to accept in the heat of the moment — SB8 is a gift to the pro-choice movement.

Pro-life Texans thought they were being clever by avoiding both the Supreme Court and Congress to slip through an abortion ban. But, as the saying goes, be careful what you wish for. “Lawfare” is a two-way street. Critics of SB8 point out that there is nothing to stop California passing a similar bill that enables citizens to bring civil lawsuits against people who utter “hate speech”, or to stop New York deputising bounty-hunters to sue gun-owners. Nor does the legal chaos stop there. SB8 could open the way for railways, car companies and airlines to become liable for providing travel assistance to an abortion-seeking woman, or supermarkets for selling the disinfectant Lysol and other substances that induce abortion. Forget about boycotts for a moment; the threat of a lawsuit is a powerful deterrent to corporations seeking to do business in Texas.

History is not the best predictor of the future. Nevertheless, the disastrous dalliance with prohibition, which lasted for 13 years between 1920 and 1933, offers a salient lesson in what happens when a long-held individual right is taken away from Americans. The non-metropolitan parts of the country forced their will on the urban parts. But drinking didn’t stop; it just went underground. Some states wouldn’t enforce the ban, and other states couldn’t. In Detroit, Michigan, the alcohol trade was the largest single contributor to the economy after the car industry. Prohibition fostered the American mafia, led to a rise in alcoholism, alcohol-related deaths, mass lawlessness and civil disobedience and brought about extraordinary levels of corruption.

There is every reason to believe that abortions will continue in America no matter what anti-abortion zealots manage to pull off. It just won’t be pretty. A recent study published in The Lancet Global Health revealed that the countries with the greatest restrictions not only have the highest termination rates in the world but are also among the least economically successful. This is the club that awaits pro-life America.

The strangulation of women’s rights has been so slow that supporters of Roe v Wade, the 1973 ruling that made abortion legal, were lulled into a false sense of security. They assumed the minority of Americans fighting for a repeal would never overwhelm the will of the majority. SB8 has changed all that. Its underpinnings threaten so many constitutional rights that abortion is going to be front and centre in every state and federal election.

Democracy does work, even if, as with prohibition, it takes time to roll back injustices. Last year the Virginia state legislature voted to remove more than a decade’s worth of abortion restrictions. This is the body that in 2012 stood accused of “state-sanctioned rape” for passing a bill that required any woman seeking an abortion to submit to an ultrasound first, not by the usual external method but with a transvaginal wand.

Despite what anti-abortion fanatics believe, the US is a pro-choice country. The fight for women’s rights will go on, and on, until the people win.

Historically Speaking: The Long Haul of Distance Running

How the marathon became the world’s top endurance race

The Wall Street Journal

September 2, 2021

The New York City Marathon, the world’s largest, will hold its 50th race this autumn, after missing last year’s due to the pandemic. A podiatrist once told me that he always knows when there has been a marathon because of the sudden uptick in patients with stress fractures and missing toenails. Nevertheless, humans are uniquely suited to long-distance running.

Some 2-3 million years ago, our hominid ancestors began to develop sweat glands that enabled their bodies to stay cool while chasing after prey. Other mammals, by contrast, overheat unless they stop and rest. Thus, slow but sweaty humans won out over fleet but panting animals.

The marathon, at 26.2 miles, isn’t the oldest known long-distance race. Egyptian Pharaoh Taharqa liked to organize runs to keep his soldiers fit. A monument inscribed around 685 B.C. records a two-day, 62-mile race from Memphis to Fayum and back. The unnamed winner of the first leg (31 miles) completed it in about four hours.

ILLUSTRATION: THOMAS FUCHS

The considerably shorter marathon derives from the story of a Greek messenger, Pheidippides, who allegedly ran from Marathon to Athens in 490 B.C. to deliver news of victory over the Persians—only to drop dead of exhaustion at the end. But while it is true that the Greeks used long-distance runners, called hemerodromoi, or day runners, to convey messages, this story is probably a myth or a conflation of different events.

Still, foot-bound messengers ran impressive distances in their day. Within 24 hours of Hernán Cortés’s landing in Mexico in 1519, messenger relays had carried news of his arrival over 260 miles to Emperor Montezuma II in Tenochtitlan.

As a competitive sport, the marathon has a shorter history. The longest race at the ancient Olympic Games was about 3 miles. This didn’t stop the French philologist Michel Bréal from persuading the organizers of the inaugural modern Olympics in 1896 to recreate Pheidippides’s epic run as a way of adding a little classical flavor to the Games. The event exceeded his expectations: The Greek team trained so hard that it won 8 of the first 9 places. John Graham, manager of the U.S. Olympic team, was inspired to organize the first Boston Marathon in 1897.

Marathon runners became fitter and faster with each Olympics. But at the 1908 London Games the first runner to reach the stadium, the Italian Dorando Pietri, arrived delirious with exhaustion. He staggered and fell five times before concerned officials eventually helped him over the line. This, unfortunately, disqualified his time of 2:54:46.

Pietri’s collapse added fuel to the arguments of those who thought that a woman’s body could not possibly stand up to a marathon’s demands. Women were banned from the sport until 1964, when Britain’s Isle of Wight Marathon allowed the Scotswoman Dale Greig to run, with an ambulance on standby just in case. Organizers of the Boston Marathon proved more intransigent: Roberta Gibb and Kathrine Switzer tried to force their way into the race in 1966 and ’67, but Boston’s gender bar stayed in place until 1972. The Olympics held out until 1984.

Since that time, marathons have become a great equalizer, with men and women on the same course: For 26.2 miles, the only label that counts is “runner.”

Historically Speaking: A Legacy of Tinderbox Forests

Long before climate change exacerbated the problem, policies meant to suppress wildfires served to fan the flames

The Wall Street Journal

August 19, 2021

This year’s heat waves and droughts have led to record-breaking wildfires across three continents. The fires in Siberia are so vast that smoke has reached the North Pole for what is believed to be the first time. In the United States, California’s Dixie Fire has become the largest single fire in the state’s history.

Humans have long wrestled with forest fire, seeking by turns to harness and to suppress it. Early European efforts to control forest fires were tentative and patchy. In the 14th century, the Sardinians experimented with fire breaks, but the practice was slow to catch on. In North America, by contrast, scientists have found 2,000-year-old evidence of controlled burnings by Native American tribes. But this practice died out with the arrival of European immigrants, because of local bans as well as the expulsion of tribes from native lands. As a consequence, forests not only became larger and denser but also filled with mounds of dead and dried vegetation, making them very susceptible to fire.

Disaster struck in the fall of 1871. Dozens of wildfires broke out simultaneously across the Upper Midwest and Great Lakes region, and on Oct. 8, the same night as the Chicago Fire, a firestorm engulfed the town of Peshtigo, Wisconsin, killing an estimated 1,200 people (and possibly many more). The fire was the deadliest in U.S. history.

ILLUSTRATION: ANTHONY FREDA

Early conservationists, such as Franklin Hough, sought to organize a national wildfire policy. The U.S. Forest Service was created in 1905 and was still figuring out its mission in 1910, when the northern Rockies went up in flames. Fires raced through Washington, Montana and Idaho, culminating in what is known as the “Big Blowup” of August 20 and 21.

One of the U.S. Forest Service’s newly minted rangers, Edward C. Pulaski, was leading a team of 45 firefighters near Wallace, Idaho. A firestorm surrounded the men, forcing Pulaski to lead them into a disused mine tunnel. Several attempted to go back outside, believing they would be cooked alive in the smoke-filled mine. Pulaski managed to stop the suicidal exodus by threatening to shoot any man who tried to leave. To maintain morale, he organized a water bucket chain to prevent the blankets covering the exit from catching fire. Pulaski’s actions saved the lives of all but five of the firefighters, but his eyesight and lungs never recovered.

The Big Blowup destroyed more than 3 million acres in two days and killed at least 80 people. In response to the devastation, the Forest Service, with the public’s support, adopted the mistaken goal of total fire suppression rather than fire management. The policy remained in place until 1978, bequeathing to the country a legacy of tinderbox forests.

Nowadays, the Forest Service conducts controlled burns, known as prescribed fires, to mitigate the risk of wildfire. The technique is credited with helping to contain July’s 413,000-acre Bootleg Fire in Oregon.

Fire caused by the effects of climate change will require human intervention of another order. In the Bible’s Book of Isaiah, the unhappy “sinners of Zion” cry out: “Who of us can live where there is a consuming fire? Who among us can dwell with everlasting burnings?” Who, indeed.