Chess has captivated minds for 1,500 years, surviving religious condemnation, Napoleonic exile and even the Russian Revolution
April 15, 2022
Fifty years ago, the American chess grandmaster Bobby Fischer played the reigning world champion Boris Spassky at the “Match of the Century” in Reykjavik, Iceland. The Cold War was at its height, and the Soviets had held the title since 1948. More was riding on the competition than just the prize money.
The press portrayed the Fischer-Spassky match as a duel between the East and the West. But for the West to win, Fischer had to play, and the temperamental chess genius wouldn’t agree to the terms. He boycotted the opening ceremony on July 1, 1972, prompting then-National Security Adviser Henry Kissinger to call Fischer and tell him that it was his patriotic duty to go out there and play. Fischer relented—and won, using the Queen’s Gambit in game 6 (an opening made famous by the Netflix series about a fictional woman chess player).
The Fischer-Spassky match reignited global enthusiasm for a 1,500-year-old game. From its probable origins in India around the 6th century, the basic idea of chess spread rapidly across Asia, the Middle East and Europe. Religious authorities initially condemned the game; even so, the ability to play became an indispensable part of courtly culture.
Chess was a slow-moving game until the 1470s, when new rules were introduced that made it faster and more aggressive. The most important changes were greater mobility for the bishops and the transformation of the queen into the most powerful piece on the board. The instigator remains unknown, although the innovation seems to have originated in Spain, perhaps inspired by Queen Isabella, who ruled jointly with King Ferdinand.
The game captivated some of the greatest minds of the Renaissance. Around 1500, the Italian mathematician Luca Pacioli, known as the Father of Accounting, analyzed more than 100 plays and strategies in “De ludo scachorum” (On the Game of Chess). The hand of Leonardo da Vinci has been detected in some of the illustrations in the only known copy of the book.
Green spaces and nature preserves have long existed, but the idea of protecting natural wonders for human enjoyment has American roots.
March 3, 2022
Yellowstone, the world’s oldest national park, turned 150 this month. The anniversary of its founding is a timely reminder that democracy isn’t just a political system but a way of life. The Transcendentalist writer Henry David Thoreau was one of the earliest Americans to link democratic values with national parks. Writing in 1858, he declared that having “renounced the king’s authority” over their land, Americans should use their hard-won freedom to create national preserves for “inspiration and our own true re-creation.”
There had been nature reserves, royal hunting grounds, pleasure gardens and parks long before Yellowstone, of course. The origins of green spaces can be traced to ancient Egypt’s temple gardens. In the 3rd century B.C., King Devanampiya Tissa of Sri Lanka created the Mihintale nature reserve as a wildlife sanctuary, prefiguring by more than 2,000 years the likely first modern nature reserve, which the English naturalist Charles Waterton built on his estate in Yorkshire in the 1820s. Hyde Park, which King Charles I opened to Londoners in 1637, led the way for public parks.
The 18th century saw a flowering of interest in man’s relationship with nature, and these ideas encouraged better management of the land. The English scientist Stephen Hales demonstrated the correlation between tree coverage and rainfall, leading a British MP named Soame Jenyns to convince Parliament to found the Tobago Main Ridge Forest Reserve in 1776. The protection of the Caribbean colony’s largest forest was a watershed moment in the history of conservation. Highly motivated individuals soon started conservation projects in multiple countries.
As stewards of their newly independent nation, Americans regarded their country’s natural wonders as places to be protected for the people rather than from them. (Niagara Falls, already marred by development when Alexis de Tocqueville visited in 1831, served as a cautionary example of legislative failure.) The first attempt to create a public nature reserve was at the state level: In 1864 President Abraham Lincoln signed the Yosemite Valley Grant Act, giving the land to California “upon the express conditions that the premises shall be held for public use, resort, and recreation.” But the initiative lacked any real oversight.
Many groups pushed for a federal system of national parks. Among them were the Transcendentalists, environmentalists and landscape painters such as George Catlin, Thomas Cole and Albert Bierstadt, but the ultimate credit belongs to the geologist Ferdinand Hayden, who surveyed Yellowstone in 1871. The mass acclaim following his expedition finally convinced Congress to turn Yellowstone into a national park.
Unfortunately, successive administrations failed to provide sufficient funds for its upkeep, and Yellowstone suffered years of illegal poaching and exploitation. In desperation, the federal government sent the U.S. Army in to take control of the park in 1886. The idea proved to be an inspired one. The military was such a conscientious custodian that its management style became the model for the newly created National Park Service in 1916.
The NPS currently oversees 63 National Parks. But the ethos hasn’t changed since the Yellowstone Act of 1872 set aside 2 million pristine acres “for the benefit and enjoyment of the people.” These words are now engraved above the north entrance to the park, an advertisement, as the novelist and environmentalist Wallace Stegner once wrote, that national parks are “absolutely American, absolutely democratic.”
The deadly affliction, once called self-starvation, has become much more common during the confinement of the pandemic.
February 18, 2022
Two years ago, when countries suspended the routines of daily life in an attempt to halt the spread of Covid-19, the mental health of children took a plunge. One worrying piece of evidence was an extraordinary spike in hospitalizations for anorexia and other eating disorders among adolescents, especially girls between the ages of 12 and 18, and not just in the U.S. but around the world. U.S. hospitalizations for eating disorders doubled between March and May 2020. England’s National Health Service recorded a 46% increase in eating-disorder referrals in 2021 compared with 2019. Perth Children’s Hospital in Australia saw a 104% increase in hospitalizations, and in Canada the rate tripled.
Anorexia nervosa has a higher death rate than any other mental illness. According to the National Eating Disorders Association, 75% of its sufferers are female. And while the affliction might seem relatively new, it has ancient antecedents.
As early as the sixth century B.C., adherents of Jainism in India regarded “santhara,” fasting to death, as a purifying religious ritual, particularly for men. Emperor Chandragupta, founder of the Mauryan dynasty, died in this way in 297 B.C. St. Jerome, who lived in the fourth and fifth centuries A.D., portrayed extreme asceticism as an expression of Christian piety. In 384, one of his disciples, a young Roman woman named Blaesilla, died of starvation. Perhaps because she fits the contemporary stereotype of the middle-class, female anorexic, Blaesilla rather than Chandragupta is commonly cited as the first known case.
The label given to spiritual and ascetic self-starvation is anorexia mirabilis, or “holy anorexia,” to differentiate it from the modern diagnosis of anorexia nervosa. There were two major outbreaks in history. The first began around 1300 and was concentrated among nuns and deeply religious women, some of whom were later elevated to sainthood. The second took off during the 19th century. So-called “fasting girls” or “miraculous maids” in Europe and America won acclaim for appearing to survive without food. Some were exposed as fakes; others, tragically, were allowed to waste away.
But, confusingly, there are other historical examples of anorexic-like behavior that didn’t involve religion or women. The first medical description of anorexia, written by Dr. Richard Morton in 1689, concerned two patients—an adolescent boy and a young woman—who simply wouldn’t eat. Unable to find a physical cause, Morton called the condition “nervous consumption.”
Almost two centuries passed before French and English doctors accepted Morton’s suspicion that the malady had a psychological component. In 1873, Queen Victoria’s physician, Sir William Gull, coined the term “Anorexia Nervosa.”
Naming the disease was a huge step forward. But its treatment was guided by an ever-changing understanding of anorexia’s causes, which has run the gamut from the biological to the psychosexual, from bad parenting to societal misogyny.
The first breakthrough in anorexia treatment, however, came from an experiment involving men. The Minnesota Starvation Experiment, a World War II-era study on how to treat starving prisoners, found that the 36 male volunteers exhibited many of the same behaviors as anorexics, including food obsessions, excessive chewing, bingeing and purging. The study showed that the malnourished brain reacts in predictable ways regardless of race, class or gender.
Recent research suggests that genetic predisposition could account for as much as 60% of the risk of developing the disease. If this knowledge leads to new specialized treatments, it will do so at a desperate time: At the start of the year, the Lancet medical journal called on governments to take action before mass anorexia cases become mass deaths. The lockdown is over. Now save the children.
A shorter version appeared in The Wall Street Journal
Experts on conflict predict unrest, but America has a long way to go before it is as divided as it was in 1861
January 9, 2022
Violence is in the air. No one who saw the shocking scenes during the Capitol riot in Washington on January 6, 2021, can pretend that it was just a big misunderstanding. Donald Trump and his allies attempted to retain power at all costs. Terrible things happened that day. A year later the wounds are still raw and the country is still polarised. Only Democratic leaders participated in last week’s anniversary commemoration; Republicans stayed away. The one-year mark has produced a blizzard of warnings that the US is spiralling into a second civil war.
Only an idiot would ignore the obvious signs of a country turning against itself. Happy, contented electorates don’t storm their parliament (although terrified and oppressed peoples don’t either). America has reached the point where the mid-term elections are no longer a yawn but a test case for future civil unrest.
Predictably, the left and right are equally loud in their denunciations of each other. “Liberals” look at “conservatives” and see the alt-right: white supremacists and religious fanatics working together to suppress voting rights, women’s rights and democratic rights. Conservatives stare back and see antifa: essentially, progressive totalitarians making common cause with socialists and anarchists to undermine the pillars of American freedom and democracy. Put the two sides together and you have an electorate that has become angry, suspicious and volatile.
The looming threat of a civil war is almost the only thing that unites pundits and politicians across the political spectrum. Two new books, one by the Canadian journalist Stephen Marche and the other by the conflict analyst Barbara Walter, argue that the conditions for civil war are already in place. Walter believes that America is embracing “anocracy” (outwardly democratic, inwardly autocratic), joining a dismal list of countries that includes Turkey, Hungary and Poland. The two authors’ arguments have been boosted by the warnings of respected historians, including Timothy Snyder, who wrote in The New York Times that the US is teetering over the “abyss” of civil war.
If you accept the premise that America is facing, at the very least, a severe test of its democracy, then it is all the more important to subject the claims of incipient civil war to rigorous analysis. The fears aren’t baseless; the problem is that predictions are slippery things. How do you prove a negative about something that hasn’t happened yet? There’s also the danger of the self-fulfilling prophecy: wishing and predicting don’t make things so, although they certainly help to fix the idea in people’s minds. The more Americans say that the past is repeating itself and the country has reached the point of no return, the more likely it is to be believed.
Predictions based on comparisons to Weimar Germany, Nazi Germany, the Russian Revolution and the fall of Rome are simplistic and easy to dismiss. But, just as there is absolutely no basis for the jailed Capitol rioters to compare themselves to “Jews in Germany”, as one woman recently did, arguments that equate today’s fractured politics with the extreme violence that plagued the country just before the Civil War are equally overblown — not to mention trivialising its 1.5 million casualties.
There simply isn’t a correlation between the factors dividing America then and now. In the run-up to the war in 1861, the North and South were already distinct entities in terms of ethnicity, customs and law. Crucially, the North’s economy was based on free labour and was prone to slumps and financial panics, whereas the South’s depended on slavery and was richer and more stable. The 11 Southern states seceded because they had local government, the military and judicial institutions on side.
Today there is a far greater plurality of voters spread out geographically. President Biden won Virginia and Georgia and almost picked up Texas in 2020; in 1860 there were ten Southern states where Abraham Lincoln didn’t even appear on the ballot.
When it comes to assessing the validity of generally accepted conditions for civil breakdown, the picture becomes more complicated. A 2006 study by the political scientists Håvard Hegre and Nicholas Sambanis found that at least 88 circumstances are used to explain civil war. The generally accepted ones include: a fragile economy, deep ethnic and religious divides, weak government, long-standing grievances and factionalised elites. But models and circumstances are like railway tracks: they take us down one path and blind us to the others.
In 2015 the European think tank VoxEU conducted a historical analysis of over 100 financial crises between 1870 and 2014. Researchers found a pattern of street violence, greater distrust of government, increased polarisation and a rise in popular and right-wing parties in the five years after a crisis. This would perfectly describe the situation in the US except for one thing: the polarisation and populism have coincided with falling unemployment and economic growth. The Capitol riot took place despite, not because of, the strength of the financial system.
A country can meet a whole checklist of conditions and not erupt into outright civil war (for example, Northern Ireland in the 1970s), or meet only a few of the conditions and become a total disaster. It’s not only possible for the US, a rich, developed nation, to share certain similarities with an impoverished, conflict-ridden country and yet not become one; it’s quite likely. For much of its history America has held together while being a violent, populist-driven society, seething with racial and religious antagonisms behind a veneer of civil discourse. This is not an argument for complacency; it is simply a reminder that theory is not destiny.
A more worrying aspect of the torrent of civil war predictions by experts and ordinary Americans alike is the readiness to demonise and assume the absolute worst of the other side. It’s a problem when millions of voters believe that the American polity is irredeemably tainted, whether by corruption, communism, elitism, racism or what have you. The social cost of this divide is enormous. According to the Armed Conflict Location and Event Data Project: “In this hyper-polarised environment, state forces are taking a more heavy-handed approach to dissent, non-state actors are becoming more active and assertive and counter-demonstrators are looking to resolve their political disputes in the street.”
Dissecting the roots of America’s lurch towards rebarbative populism requires a particular kind of micro-level human analysis, involving real-life interviews with perpetrators and protesters as well as trawls through huge sets of data. The results have shown that, more often than not, the attribution of white supremacist motives to the Capitol rioters, or anti-Americanism to Black Lives Matter protesters, says more about the politics of the accuser than the accused.
Social media is an amplifier for hire — polarisation lies at the heart of its business models and algorithms. Researchers looking at the “how” rather than just the “why” of America’s political Balkanisation have also found evidence of large-scale foreign manipulation of social media. A recent investigation by ProPublica and The Washington Post revealed that after November 3, 2020, there were more than 10,000 posts a day on Facebook attacking the legitimacy of the election.
In 2018 a Rasmussen poll asked American voters whether the US would experience a second civil war within the next five years. Almost a third said it would. In a similar poll conducted last year the proportion had risen to 46 per cent. Is it concerning? Yes. Does it make the prediction true? Well, polls also showed a win for Hillary Clinton and a landslide for Joe Biden. So, no.