Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare
May 6, 2021
The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.
How troops move into battle positions is hardly the stuff of poetry. Homer’s “Iliad” begins with the Greeks having already spent 10 years besieging Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.
The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.
The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Huns. Their steppe horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.
Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.
Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.
The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.
The Prussians referred to this kind of troop-maneuvering warfare as Bewegungskrieg, or war of movement, using it to crushing effect against the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.
The Germans started World War II with their production capacity lagging behind that of the Allies; they compensated by updating Bewegungskrieg into what became known as blitzkrieg, or lightning war, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.
In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)
Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.
Winston Churchill made the term famous, but ideological rivalries have driven geopolitics since Athens and Sparta.
February 25, 2021
It was an unseasonably springlike day on March 5, 1946, when Winston Churchill visited Fulton, Missouri. The former British Prime Minister was ostensibly there to receive an honorary degree from Westminster College. But Churchill’s real purpose in coming was to urge the U.S. to form an alliance with Britain to keep the Soviet Union from expanding any further. Speaking before an august audience that included President Harry S. Truman, Churchill declared: “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent.”
Churchill wasn’t the first person to employ the phrase “iron curtain” as a political metaphor. Originally a theatrical term for the safety barrier between the stage and the audience, by the early 20th century it was being used to mean a barrier between opposing powers. Nevertheless, “iron curtain” became indelibly associated with Churchill and with the defense of freedom and democracy.
This was a modern expression of an idea first articulated by the ancient Greeks: that political beliefs are worth defending. In the winter of 431-30 B.C., the Athenians were staggering under a devastating plague while simultaneously fighting Sparta in the Peloponnesian War. The stakes couldn’t have been higher when the great statesman Pericles used a speech commemorating the war dead to define the struggle in terms that every Athenian would understand.
As reported by the historian Thucydides, Pericles told his compatriots that the fight wasn’t for more land, trade or treasure; it was for democracy pure and simple. Athens was special because its government existed “for the many instead of the few,” guaranteeing “equal justice to all.” No other regime, and certainly not the Spartans, could make the same claim.
Pericles died the following year, and Athens eventually went down in defeat in 404 B.C. But the idea that fighting for one’s country meant defending a political ideal continued to be influential. According to the 2nd-century Roman historian Cassius Dio, the Empire had to make war on the “lawless and godless” tribes living outside its borders. Fortifications such as Hadrian’s Wall in northern England weren’t just defensive measures but political statements: Inside bloomed civilization, outside lurked savagery.
The Great Wall of China, begun in 220 B.C. by Emperor Qin Shi Huang, had a similar function. In addition to keeping out the nomadic populations in the Mongolian plains, the wall symbolized the unity of the country under imperial rule and the Confucian belief system that supported it. Successive dynasties continued to fortify the Great Wall until the mid-17th century.
During the Napoleonic Wars, the British considered themselves to be fighting for democracy against dictatorship, like the ancient Athenians. In 1806, Napoleon instituted the Continental System, an economic blockade intended to cut off Britain from trading with France’s European allies and conquests. But the attack on free trade only strengthened British determination.
A similar resolve among the NATO allies led to the collapse of the Iron Curtain in 1991, when the Soviet Union was dissolved and withdrew its armies from Eastern Europe. As Churchill had predicted, freedom and democracy are the ultimate shield against “war and tyranny.”
Roman emperors and modern dictators have feared the social and spiritual penalties of excommunication.
January 28, 2021
Nowadays, all it takes for a person to be condemned to internal exile is a Twitter stampede of outrage. The lack of any regulating authority or established criteria for what constitutes repentance gives “cancel culture,” as it is popularly known, a particularly modern edge over more old-fashioned expressions of public shaming such as tarring and feathering, boycotts and blacklists.
But the practice of turning nonconforming individuals into non-persons has been used with great effectiveness for centuries, none more so than the punishment of excommunication by the Roman Catholic Church. The penalties included social ostracism, refusal of communion and Christian burial, and eternal damnation of one’s soul.
The fear inspired by excommunication was great enough to make even kings fall in line. In 390, soldiers under the command of the Roman emperor Theodosius I massacred thousands in the Greek city of Thessalonica. In response, Bishop Ambrose of Milan excommunicated Theodosius, forcing him to don sackcloth and ashes as public penance. Ambrose’s victory established the Church’s authority over secular rulers.
Later church leaders relied on the threat of excommunication to maintain their power, but the method could backfire. In 1054, Pope Leo IX of Rome excommunicated Patriarch Michael Cerularius of Constantinople, the head of the eastern Church, who retaliated by excommunicating Leo and the western Church. Since this Great Schism, the two churches, Roman Catholic and Eastern Orthodox, have never reunited.
During the Middle Ages, the penalty of excommunication broadened to include the cancellation of all legal protections, including the right to collect debts. Neither kings nor cities were safe. After being excommunicated by Pope Gregory VII in 1076, Holy Roman Emperor Henry IV stood barefoot in the snow for three days before the pontiff grudgingly welcomed him inside to hear his repentance. The entire city of Venice was excommunicated over half a dozen times, and on each occasion the frightened Venetians capitulated to papal authority.
But the excommunication of Martin Luther, the founder of Protestantism, by Pope Leo X in January 1521, 500 years ago this month, didn’t work out as planned. Summoned to explain himself at the Diet of Worms, a meeting presided over by the Holy Roman Emperor Charles V, Luther refused to recant and ask forgiveness, allegedly saying: “Here I stand, I can do no other.” In response, the Emperor declared him a heretic and outlaw, putting his life in danger. Luther was only saved from assassination by his patron, Frederick, Elector of Saxony, who hid him in a castle. Luther used the time to begin translating the Bible into German.
Napoleon Bonaparte was equally unconcerned about the spiritual consequences when he was excommunicated by Pope Pius VII in 1809. Nevertheless, he was sufficiently frightened of public opinion to kidnap the pontiff and keep him out of sight for several years. In 1938, angry over Nazi Germany’s takeover of Austria, the Italian dictator Benito Mussolini tried to persuade Pope Pius XI to excommunicate the German dictator Adolf Hitler, a nonpracticing Catholic. Who knows what would have happened if he had been successful.
U.S. antiterror efforts have cost nearly $6 trillion since the 9/11 attacks. Earlier governments from the ancient Greeks to Napoleon have had to get creative to finance their fights.
The successful operation against Islamic State leader Abu Bakr al-Baghdadi is a bright spot in the war on terror that the U.S. declared in response to the attacks of 9/11. The financial costs of this long war have been enormous: nearly $6 trillion to date, according to a recent report by the Watson Institute of International and Public Affairs at Brown University, which took into account not just the defense budget but other major costs, like medical and disability care, homeland security and debt.
War financing has come a long way since the ancient Greeks formed the Delian League in 478 B.C., which required each member state to contribute an agreed amount of money each year, rather than troops. With the League’s financial backing, Athens became the Greek world’s first military superpower—at least until the Spartans, helped by the Persians, built up their naval fleet with tribute payments extracted from dependent states.
The Romans maintained their armies through tributes and taxes until the Punic Wars—three lengthy conflicts between 264 and 146 B.C.—proved so costly that the government turned to debasing the coinage in an attempt to increase the money supply. The result was runaway inflation and eventually a sovereign debt crisis during the Social War a half-century later between Rome and its breakaway Italian allies. The government ended up defaulting in 86 B.C., sealing the demise of the ailing Roman Republic.
After the fall of Rome in the late fifth century, wars in Europe were generally financed by plunder and other haphazard means. William the Conqueror financed the Norman invasion of England in 1066 the ancient Roman way, by debasing his currency. He learned his lesson and paid for all subsequent operations out of tax receipts, which stabilized the English monetary system and established a new model for financing war.
Taxation worked until European wars became too expensive for state treasuries to fund alone. Rulers then resorted to a number of different methods. During the 16th century, Philip II of Spain turned to the banking houses of Genoa to raise the money for his Armada invasion fleet against England. Seizing the opportunity, Sir Francis Walsingham, Elizabeth I’s chief spymaster, sent agents to Genoa with orders to use all legal means to sabotage and delay the payment of Philip’s bills of credit. The operation bought England a crucial extra year of preparation.
In his own financial preparations to fight England, Napoleon had better luck than Philip II: In 1803 he was able to raise a war chest of over $11 million in cash by selling the Louisiana Territory to the U.S.
Napoleon was unusual in having a valuable asset to offload. By the time the American Civil War broke out in 1861, governments had become reliant on some combination of taxation, printing money and borrowing to pay for war. But the U.S. lacked a regulated banking system since President Andrew Jackson’s dismantling of the Second Bank of the United States in the 1830s. The South resorted to printing paper money, which depreciated dramatically. The North could afford to be more innovative. In 1862 the financier Jay Cooke invented the war bond, which was marketed with great success to ordinary citizens. By the war’s end, the bonds had covered two-thirds of the North’s costs.
Incurring debt is still how the U.S. funds its wars. It has helped to shield the country from the full financial effects of its prolonged conflicts. But in the future it is worth remembering President Calvin Coolidge’s warning: “In any modern campaign the dollars are the shock troops…. A country loaded with debt is devoid of the first line of defense.”
Evidence shows people become happier in their fifties, but achieving that takes some soul-searching
I used not to believe in the “midlife crisis”. I am ashamed to say that I thought it was a convenient excuse for self-indulgent behaviour — such as splurging on a Lamborghini or getting buttock implants. So I wasn’t even aware that I was having one until earlier this year, when my family complained that I had become miserable to be around. I didn’t shout or take to my bed, but five minutes in my company was a real downer. The closer I got to my 50th birthday, the more I radiated dissatisfaction.
Can you be simultaneously contented and discontented? The answer is yes. Surveys of “national wellbeing” in several countries, including those conducted in the UK by the Office for National Statistics, have revealed a fascinating U-curve relating happiness to age. In Britain, feelings of stress and anxiety appear to peak at 49 and subsequently fade as the years increase. Interestingly, a 2012 study showed that chimpanzees and orang-utans exhibit a similar U-curve of happiness as they reach middle age.
On a rational level, I wasn’t the least bit disappointed with my life. The troika of family, work and friends made me very happy. And yet something was eating away at my peace of mind. I regarded myself as a failure — not in terms of work but as a human being. Learning that I wasn’t alone in my daily acid bath of gloom didn’t change anything.
One of F Scott Fitzgerald’s most memorable lines is: “There are no second acts in American lives.” It’s so often quoted that it’s achieved the status of a truism. It’s often taken to be an ironic commentary on how Americans, particularly men, are so frightened of failure that they cling to the fiction that life is a perpetual first act. As I thought about the line in relation to my own life, Fitzgerald’s meaning seemed clear. First acts are about actions and opportunities. There is hope, possibility and redemption. Second acts are about reactions and consequences.
Old habits die hard, however. I couldn’t help conducting a little research into Fitzgerald’s life. What was the author of The Great Gatsby really thinking when he wrote the line? Would it even matter?
The answer turned out to be complicated. As far as the quotation goes, Fitzgerald actually wrote the reverse. The line appears in a 1935 essay entitled My Lost City, about his relationship with New York: “I once thought that there were no second acts in American lives, but there was certainly to be a second act to New York’s boom days.”
It reappeared in the notes for his Hollywood novel, The Love of the Last Tycoon, which was half finished when he died in 1940, aged 44. Whatever he had planned for his characters, the book was certainly meant to have been Fitzgerald’s literary comeback — his second act — after a decade of drunken missteps, declining book sales and failed film projects.
Fitzgerald may not have subscribed to the “It’s never too late to be what you might have been” school of thought, but he wasn’t blind to reality. Of course he believed in second acts. The world is full of middle-aged people who successfully reinvented themselves a second or even third time. The meteoric rise of Emperor Claudius (10BC to AD54) is one of the earliest historical examples of the true “second act”.
According to Suetonius, Claudius’s physical infirmities had made him the butt of scorn among his powerful family. But his lowly status saved him after the assassination of his nephew, Caligula. The plotters found the 50-year-old Claudius cowering behind a curtain. On the spur of the moment, instead of killing him as they had Caligula’s wife and daughter, they decided the stumbling and stuttering scion of the Julio-Claudian dynasty could be turned into a puppet emperor. It was a grave miscalculation. Claudius seized on his changed circumstances. The bumbling persona was dropped and, although flawed, he became a forceful and innovative ruler.
Mostly, however, it isn’t a single event that shapes life after 50 but the willingness to stay the course long after the world has turned away. It’s extraordinary how the granting of extra time can turn tragedy into triumph. In his heyday, General Mikhail Kutuzov was hailed as Russia’s greatest military leader. But by 1800 the 55-year-old was prematurely aged. Stiff-limbed, bloated and blind in one eye, Kutuzov looked more suited to play the role of the buffoon than the great general. He was Alexander I’s last choice to lead the Russian forces at the Battle of Austerlitz in 1805, but was the first to be blamed for the army’s defeat.
Kutuzov was relegated to the sidelines after Austerlitz. He remained under official disfavour until Napoleon’s army was halfway to Moscow in 1812. Only then, with the army and the aristocracy begging for his recall, did the tsar agree to his reappointment. Thus, in Russia’s hour of need it ended up being Kutuzov, the disgraced general, who saved the country.
Winston Churchill had a similar apotheosis in the Second World War. For most of the 1930s he was considered a political has-been by friends and foes alike. His elevation to prime minister in 1940 at the age of 65 changed all that, of course. But had it not been for the extraordinary circumstances created by the war, Robert Rhodes James’s Churchill: A Study in Failure, 1900-1939 would have been the epitaph rather than the prelude to the greatest chapter in his life.
It isn’t just generals and politicians who can benefit from second acts. For writers and artists, particularly women, middle age can be extremely liberating. The Booker prize-winning novelist Penelope Fitzgerald published her first book at 59 after a lifetime of teaching while supporting her children and alcoholic husband. Thereafter she wrote at a furious pace, producing nine novels and three biographies before she died at 83.
I could stop right now and end with a celebratory quote from Morituri Salutamus by the American poet Henry Wadsworth Longfellow: “For age is opportunity no less/ than youth itself, though in another dress, / And as the evening twilight fades away / The sky is filled with stars, invisible by day.”
However, that isn’t — and wasn’t — what was troubling me in the first place. I don’t think the existential anxieties of middle age are caused or cured by our careers. Sure, I could distract myself with happy thoughts about a second act where I become someone who can write a book a year rather than one a decade. But that would still leave the problem of the flesh-and-blood person I had become in reality. What to think of her? It finally dawned on me that this had been my fear all along: it doesn’t matter which act I am in; I am still me.
My funk lifted once the big day rolled around. I suspect that joining a gym and going on a regular basis had a great deal to do with it. But I had also learnt something valuable during these past few months. Worrying about who you thought you would be or what you might have been fills a void but leaves little space for anything else. It’s coming to terms with who you are right now that really matters.
It’s not just the Nobel: Award-giving missteps have a long history
This spring, controversies have engulfed three big prizes.
The Swedish Academy isn’t awarding the Nobel Prize for Literature this year while it deals with the fallout from a scandal over allegations of sexual assault and financial impropriety.
In the U.S., the author Junot Díaz has stepped down as Pulitzer Prize chairman while the board investigates allegations of sexual misconduct. In a statement through his literary agent earlier this month, Mr. Díaz did not address individual accusations but said in part, “I take responsibility for my past.” Finally, the organizers of the Echo, Germany’s version of the Grammys, said they would no longer bestow the awards after one of this year’s prizes went to rappers who used anti-Semitic words and images in their lyrics and videos.
Prize-giving controversies—some more serious than others—go back millennia. I know something about prizes, having served as chairwoman of the literary Man Booker Prize jury.
Controversy surrounding a competition can be a revitalizing force—especially when the powers that be support the dissidents. By the 1860s, France’s Academy of Fine Arts, the defender of official taste, was growing increasingly out of touch with contemporary art. In 1863, the jury of the prestigious annual Salon exhibition, which the academy controlled, rejected artists such as Paul Cézanne, Camille Pissarro and Édouard Manet.
The furor caused Emperor Napoleon III to order a special exhibition, the Salon des Refusés or “Salon of Rejects”, to “let the public judge” who was right. The public was divided, but the artists felt emboldened, and many scholars regard 1863 as the birthdate of modern painting. The Academy ultimately relinquished its control of the Salon in 1881. Its time was over.
At other times, controversies over prizes are more flash than substance. As antigovernment student protests swept Paris and many other places in 1968, a group of filmmakers tried to show solidarity with the protesters by shutting down the venerable Cannes Film Festival. At one point, directors hung from a curtain to prevent a film from starting. The festival was canceled but returned in 1969 without the revolutionary changes some critics were hoping for.
In contrast, a recent dispute at the festival over its refusal to allow in its competition Netflix films that bypass French theaters for streaming was relatively quiet but reflects the serious power struggle between streaming services and theatrical movie distributors.
In 1956, Secretary of State John Foster Dulles, explaining how America could use the threat of nuclear war in diplomacy, told Life Magazine, “The ability to get to the verge without getting into the war is the necessary art…. If you try to run away from it, if you are scared to go to the brink, you are lost.” President Donald Trump recently seemed to embrace this idea with his warning that if North Korea made any more threats to the U.S., it “will be met with fire and fury like the world has never seen.”
Since 2014, Islamic State has been doing its best to destroy all traces of pre-Islamic culture in Iraq and Syria. Hammers and explosives aren’t its only tools. The antiquities trade is worth billions, and the self-styled caliphate is funding itself in part by looting and selling ancient treasures.
In late May, the Journal reported that U.S. and European Union authorities were scrutinizing a pair of art dealers as part of a wider investigation into who has been facilitating the market for ancient coins, statues and relics stolen by Islamic State. The dealers say they have done nothing wrong.
As we look forward to celebrating the bicentennial of the “Star-Spangled Banner” by Francis Scott Key, I have to admit, with deep shame and embarrassment, that until I left England and went to college in the U.S., I assumed the words referred to the War of Independence. In my defense, I suspect I’m not the only one to make this mistake.
For people like me, who have got their flags and wars mixed up, I think it should be pointed out that there may have been only one War of 1812, but there are four distinct versions of it—the American, the British, the Canadian and the Native American. Moreover, among Americans, the chief actors in the drama, there are multiple variations of the versions, leading to widespread disagreement about the causes, the meaning and even the outcome of the war.