Historically Speaking: Aspirin, a Pioneering Wonder Drug

The winding, millennia-long route from bark to Bayer.

The Wall Street Journal

February 1, 2024

For ages the most reliable medical advice was also the simplest: Take two aspirin and call me in the morning. This cheap pain reliever, which also thins blood and reduces inflammation, has been a medicine cabinet staple ever since it became available over the counter nearly 110 years ago.

Willow bark, a distant ancestor of aspirin, was a popular ingredient in ancient remedies to relieve pain and treat skin problems. Hippocrates, the father of medicine, was a firm believer in willow’s curative powers. In the fourth century B.C., he advised women with gynecological troubles to burn the leaves “until the steam enters the womb.”

That willow bark could reduce fevers wasn’t discovered until the 18th century. Edward Stone, an English clergyman, noticed its extremely bitter taste was similar to that of the cinchona tree, the source of the costly malaria drug quinine. Stone dried the bark and dosed himself to treat a fever. When he felt better, he tested the powder on others suffering from “ague,” or malaria. When their fevers disappeared, he reported triumphantly to the Royal Society in 1763 that he had found another malaria cure. In fact, he had identified a way to treat its symptoms.

Willows contain salicin, a bitter compound with anti-inflammatory, fever-reducing and pain-relieving properties. Experiments with salicin, and its derivative salicylic acid, began in earnest in Europe in the 1820s. In 1853 Charles Frédéric Gerhardt, a French chemist, discovered how to create acetylsalicylic acid, the active ingredient in aspirin, but then abandoned his research and died young.

There is some debate over how aspirin became a blockbuster drug for the German company Bayer. Its official history credits Felix Hoffmann, a Bayer chemist, with synthesizing acetylsalicylic acid in 1897 in the hopes of alleviating his father’s severe rheumatic pain. Bayer patented aspirin in 1899 and by 1918 it had become one of the most widely used drugs in the world.

ILLUSTRATION: THOMAS FUCHS

But did Hoffmann work alone? Shortly before his death in 1949, Arthur Eichengrün, a Jewish chemist who had spent World War II in a concentration camp, published a paper claiming that Bayer had erased his contribution. In 2000 the BMJ published a study supporting Eichengrün’s claim. Bayer, which became part of the Nazi-backing conglomerate I.G. Farben in 1925, has denied that Eichengrün had a role in the breakthrough.

Aspirin shed its associations with the Third Reich after I.G. Farben sold off Bayer in the early 1950s, but the drug’s pain-relieving hegemony was fleeting. In 1956 Bayer’s British affiliate brought acetaminophen to the market. Ibuprofen became available in 1962.

The drug’s fortunes recovered after the New England Journal of Medicine published a study in 1989 that found the pill reduced the risk of a heart attack by 44%. Some public-health officials promptly encouraged anyone over 50 to take a daily aspirin as a preventive measure.

But as was the case with Rev. Stone, it seems the science is more complicated. In 2022 the U.S. Preventive Services Task Force officially advised against taking the drug prophylactically, given the risk of internal bleeding and the availability of other therapies. Aspirin may work wonders, but it can’t work miracles.

Historically Speaking: Marriage as a Mirror of Human Nature

From sacred ritual to declining institution, wedlock has always reflected our ideas about liberty and commitment.

The Wall Street Journal

October 26, 2023

Marriage is in decline in almost every part of the world. In the U.S., the marriage rate is roughly six per 1,000 people, a fall of nearly 60% since the 1970s. But this is still high compared with most of the highly developed countries in the Organization for Economic Cooperation and Development, where the average marriage rate has dropped below four per 1,000. Modern views on marriage are sharply divided: In a recent poll, two in five young adult Americans said that the institution has outlived its usefulness.

ILLUSTRATION: THOMAS FUCHS

The earliest civilizations had no such thoughts. Marriage was an inseparable part of the religious and secular life of society. In Mesopotamian mythology, the first marriage was the heavenly union between Inanna/Ishtar, the goddess of war and love, and her human lover, the shepherd Dumuzi. Each year, the high point of the religious calendar was the symbolic re-enactment of the Sacred Marriage Rite by the king and the high priestess of the city.

Throughout the ancient world, marriage placed extra constraints on women while allowing polygamy for men. The first major change to the institution took place in ancient Greece. A marriage between one man and one woman, with no others involved, became the bedrock of democratic states. According to Athenian law, only the son of two married citizens could inherit the rights of citizenship. The change altered the definition of marriage to give it a civic purpose, although women’s subordination remained unchanged.

At the end of the 1st century B.C., Augustus Caesar, the founder of the Roman Empire, tried to use the law to reinvigorate “traditional” marriage values. But it was the Stoic philosophers who had the greatest impact on ideas about marriage, teaching that its purpose included personal fulfillment. The 1st-century philosopher Musonius Rufus argued that love and companionship weren’t just incidental benefits but major purposes of marriage.

The early Church’s general hostility toward sex did away with such views. Matrimony was considered less desirable than celibacy; priests didn’t start officiating at wedding ceremonies until the 800s. On the other hand, during the 12th century the Catholic Church made marriage one of its seven sacraments, rendering it indissoluble. In the 16th century, its intransigence on divorce led King Henry VIII to establish the Anglican Church so that he could leave Catherine of Aragon and marry Anne Boleyn.

In the U.S. after the Civil War, thousands of former slaves applied for marriage certificates from the Freedmen’s Bureau. Concurrently, between 1867 and 1886, there were 328,716 divorces among all Americans. The simultaneous moves by some to escape the bonds of matrimony, and by others to have the right to claim it, highlight the institution’s peculiar place in our ideas of individual liberty.

In 1920, female suffrage transformed the nature of marriage yet again, implicitly recognizing the right of wives to a separate legal identity. Still, the institution survived and even thrived. At the height of World War II in 1942, weddings were up 83% from the previous decade.

Though marriage symbolizes stability, its meaning is unstable. It doesn’t date or fall behind; for better or worse, it simply reflects who we are.

Historically Speaking: The Women Who Have Gone to War

There have been female soldiers since antiquity, but only in modern times have military forces accepted and integrated them

The Wall Street Journal

July 14, 2022

“War is men’s business,” Prince Hector of Troy declares in Homer’s Iliad, a sentiment shared by almost every culture since the beginning of history. But Hector was wrong. War is women’s business, too, even though their roles are frequently overlooked.

This month marks 75 years since the first American woman received a regular Army commission. Florence Aby Blanchfield, superintendent of the 59,000-strong Army Nurse Corps during World War II, was appointed lieutenant colonel by General Dwight Eisenhower in July 1947. Today, women make up approximately 19% of the officer corps of the Armed Forces.

The integration of women into the military is a fundamental difference between the ancient and modern worlds. In the past, a weapon-wielding woman was seen as symbolizing the shame and emasculation of men. Among the foundation myths depicted on the Parthenon are scenes of the Athenian army defeating the Amazons, a race of warrior women.

Such propaganda couldn’t hide, however, the fact that in real life the Greeks and Romans on occasion fought and even lost against female commanders. Artemisia I of Caria was one of the Persian king Xerxes’s most successful naval commanders. Hearing about her exploits against the Greeks during the Battle of Salamis in 480 B.C., he is alleged to have exclaimed: “My men have become women, and my women men.” The Romans crushed Queen Boudica’s revolt in what is now eastern England in 61 A.D., but not before she had destroyed the 9th Roman Legion and massacred 70,000 others.

The medieval church was similarly torn between ideology and reality in its attitude toward female Christian warriors. Yet women did take part in the Crusades. Most famously, Queen Eleanor of Aquitaine accompanied her husband King Louis VII of France on the Second Crusade (1147-1149) and was by far the better strategist of the two. However, Eleanor’s enemies cited her presence as proof that she was a gender-bending harlot.

Florence Aby Blanchfield was the first woman to receive a regular commission with the U.S. Army

For centuries, the easiest way for a woman to become a soldier was to pass as a boy. In 1782, Massachusetts-born Deborah Sampson became one of the first American women to fight for her country by enlisting as a youth named Robert Shurtleff. During the Civil War, anywhere between 400 and 750 women practiced similar deceptions.

A dire personnel shortage finally opened a legal route for women to enter the Armed Forces. Unable to meet its recruitment targets, in March 1917 the U.S. Navy announced that it would allow all qualified persons to enlist in the reserves. Loretta Perfectus Walsh, a secretary in the Philadelphia naval recruiting office, signed up almost immediately. The publicity surrounding her enlistment as the Navy’s first female chief yeoman encouraged thousands more to step forward.

World War II proved to be similarly transformative. In the U.S., more than 350,000 women served in uniform. In Britain, the future Queen Elizabeth II made history by becoming a military mechanic in the women’s branch of the British Army.

Although military women have made steady gains in terms of parity, the debate over their presence is by no means over. Yet the “firsts” keep coming. In June, Adm. Linda L. Fagan of the U.S. Coast Guard became the first woman to lead a branch of the U.S. Armed Forces. In the past as now, whatever the challenge, there’s always been a woman keen to accept it.

Historically Speaking: Anorexia’s Ancient Roots And Present Toll

The deadly affliction, once called self-starvation, has become much more common during the confinement of the pandemic.

The Wall Street Journal

February 18, 2022

Two years ago, when countries suspended the routines of daily life in an attempt to halt the spread of Covid-19, the mental health of children took a plunge. One worrying piece of evidence for this was an extraordinary spike in hospitalizations for anorexia and other eating disorders among adolescents, especially girls between the ages of 12 and 18, and not just in the U.S. but around the world. U.S. hospitalizations for eating disorders doubled between March and May 2020. England’s National Health Service recorded a 46% increase in eating disorder referrals by 2021 compared with 2019. Perth Children’s Hospital in Australia saw a 104% increase in hospitalizations, and in Canada the rate tripled.

Anorexia nervosa has a higher death rate than any other mental illness. According to the National Eating Disorders Association, 75% of its sufferers are female. And while the affliction might seem relatively new, it has ancient antecedents.

As early as the sixth century B.C., adherents of Jainism in India regarded “santhara,” fasting to death, as a purifying religious ritual, particularly for men. Emperor Chandragupta, founder of the Mauryan dynasty, died in this way in 297 B.C. St. Jerome, who lived in the fourth and fifth centuries A.D., portrayed extreme asceticism as an expression of Christian piety. In 384, one of his disciples, a young Roman woman named Blaesilla, died of starvation. Perhaps because she fits the contemporary stereotype of the middle-class, female anorexic, Blaesilla rather than Chandragupta is commonly cited as the first known case.

The label given to spiritual and ascetic self-starvation is anorexia mirabilis, or “holy anorexia,” to differentiate it from the modern diagnosis of anorexia nervosa. There were two major outbreaks in history. The first began around 1300 and was concentrated among nuns and deeply religious women, some of whom were later elevated to sainthood. The second took off during the 19th century. So-called “fasting girls” or “miraculous maids” in Europe and America won acclaim for appearing to survive without food. Some were exposed as fakes; others, tragically, were allowed to waste away.

But, confusingly, there are other historical examples of anorexic-like behavior that didn’t involve religion or women. The first medical description of anorexia, written by Dr. Richard Morton in 1689, concerned two patients—an adolescent boy and a young woman—who simply wouldn’t eat. Unable to find a physical cause, Morton called the condition “nervous consumption.”

A subject under study in the Minnesota Starvation Experiment, 1945. WALLACE KIRKLAND/THE LIFE PICTURE COLLECTION/SHUTTERSTOCK

Almost two centuries passed before French and English doctors accepted Morton’s suspicion that the malady had a psychological component. In 1873, Queen Victoria’s physician, Sir William Gull, coined the term “Anorexia Nervosa.”

Naming the disease was a huge step forward. But its treatment was guided by an ever-changing understanding of anorexia’s causes, which has run the gamut from the biological to the psychosexual, from bad parenting to societal misogyny.

The first breakthrough in anorexia treatment, however, came from an experiment involving men. The Minnesota Starvation Experiment, a World War II-era study on how to treat starving prisoners, found that the 36 male volunteers exhibited many of the same behaviors as anorexics, including food obsessions, excessive chewing, bingeing and purging. The study showed that the malnourished brain reacts in predictable ways regardless of race, class or gender.

Recent research now suggests that a genetic predisposition could account for as much as 60% of the risk of developing the disease. If this knowledge leads to new specialized treatments, it will do so at a desperate time: At the start of the year, the Lancet medical journal called on governments to take action before mass anorexia cases become mass deaths. The lockdown is over. Now save the children.

A shorter version appeared in The Wall Street Journal

Historically Speaking: How the Waistband Got Its Stretch

Once upon a time, human girth was bound by hooks and buttons, and corsets had metal stays. Along came rubber and a whole new technology of flexible cloth.

The Wall Street Journal

January 7, 2022

The New Year has arrived, and if you’re like me, you’ve promised yourself a slimmer, fitter and healthier you in 2022. But in the meantime there is the old you to deal with—the you who overindulged at Thanksgiving and didn’t stop for the next 37 days. No miracle diet or resolution can instantaneously eradicate five weeks of wild excess. Fortunately, modern science has provided the next best thing to a miracle: the elasticated waistband.

Before the invention of elastic, adjustable clothing was dependent on technology that had hardly changed since ancient times. The Indus Valley Civilization made buttons from seashells as early as 2000 B.C.

The first inkling that there might be an alternative to buttons, belts, hooks and other adjustable paraphernalia came in the late 18th century, with the discovery that rubber wasn’t only good for toys. It also had immensely practical applications for things such as pencil erasers and lid sealants. Rubber’s stretchable nature offered further possibilities in the clothing department. But there was no word for its special property until the poet William Cowper borrowed the 17th-century term “elastic,” used to describe the expansion and contraction of gases, for his translation of the Iliad in 1791: “At once he bent Against Tydides his elastic bow.”

PHOTO: GETTY IMAGES

By 1820, an enterprising English engineer named Thomas Hancock was making elastic straps and suspenders out of rubber. He also invented the “masticator,” a machine that rolled shredded rubber into sheets for industrial use. Elastic seemed poised to make a breakthrough: In the 1840s, Queen Victoria’s shoemaker, Joseph Sparkes Hall, popularized his invention of the elastic-gusset ankle boot, still known today as the Chelsea Boot.

But rubber had drawbacks. Not only was it a rare and expensive luxury that tended to wear out quickly, it was also sticky, sweaty and smelly. Elasticized textiles became popular only after World War I, helped by the demand for steel—and female workers—that led women to forego corsets with metal stays. Improved production techniques at last made elasticated girdles a viable alternative: In 1924, the Madame X rubber girdle promised to help women achieve a thinner form in “perfect comfort while you sit, work or play.”

The promise of comfort became real with the invention of Lastex, essentially rubber yarn, in 1930. Four years later, in 1934, Alexander Simpson, a London tailor, removed the need for belts or suspenders by introducing the adjustable rubber waistband in men’s trousers.

The constant threat of rubber shortages sparked a global race to devise synthetic alternatives. The winner was the DuPont Company, which invented neoprene in 1930. That research led to an even more exciting invention: the nylon stocking. Sales were halted during World War II, creating such pent-up demand that in 1946 there were “nylon riots” throughout the U.S., including in Pittsburgh, where 40,000 people tried to buy 13,000 pairs of stockings.

DuPont scored another win in 1958 with spandex, also known under the brand name Lycra, which is not only more durable than nylon but also stretchier. Spandex made dreams possible by making fabrics more flexible and forgiving: It helped the astronaut Neil Armstrong to walk on the moon and Simone Biles to become the most decorated female gymnast in history. And it will help me to breathe a little easier until I can fit into my jeans again.

Historically Speaking: How Malaria Brought Down Great Empires

A mosquito-borne parasite has impoverished nations and stopped armies in their tracks

The Wall Street Journal

October 15, 2021

Last week brought very welcome news from the World Health Organization, which approved the first-ever childhood vaccine for malaria, a disease that has been one of nature’s grim reapers for millennia.

Originating in Africa, the mosquito-borne parasitic infection left its mark on nearly every ancient society, contributing to the collapse of Bronze-Age civilizations in Greece, Mesopotamia and Egypt. The boy pharaoh Tutankhamen, who died around 1324 BC, suffered from a host of conditions including a club foot and cleft palate, but malaria was likely what killed him.

Malaria could stop an army in its tracks. In 413 BC, at the height of the disastrous Sicilian Expedition, malaria sucked the life out of the Athenian army as it laid siege to Syracuse. Athens never recovered from its losses and fell to the Spartans in 404 BC.

But while malaria helped to destroy the Athenians, it provided the Roman Republic with a natural barrier against invaders. The infested Pontine Marshes south of Rome enabled successive generations of Romans to conquer North Africa, the Middle East and Europe with some assurance they wouldn’t lose their own homeland. Thus, the spread of classical civilization was carried on the wings of the mosquito. In the 5th century, though, the blessing became a curse as the disease robbed the Roman Empire of its manpower.

Throughout the medieval era, malaria checked the territorial ambitions of kings and emperors. The greatest beneficiary was Africa, where endemic malaria was deadly to would-be colonizers. The conquistadors suffered no such handicap in the New World.

ILLUSTRATION: JAMES STEINBERG

The first medical breakthrough came in 1623 after malaria killed Pope Gregory XV and at least six of the cardinals who gathered to elect his successor. Urged on by this catastrophe to find a cure, Jesuit missionaries in Peru realized that the indigenous Quechua people successfully treated fevers with the bark of the cinchona tree. This led to the development of quinine, which kills malarial parasites.

For a time, quinine was as powerful as gunpowder. George Washington secured almost all the available supplies of it for his Continental Army during the War of Independence. When Lord Cornwallis surrendered at Yorktown in 1781, less than half his army was fit to fight: Malaria had incapacitated the rest.

During the 19th century, quinine helped to turn Africa, India and Southeast Asia into a constellation of European colonies. It also fueled the growth of global trade. Malaria had defeated all attempts to build the Panama Canal until a combination of quinine and better mosquito control methods led to its completion in 1914. But the drug had its limits, as both Allied and Axis forces discovered in the two World Wars. While fighting in the Pacific Theatre in 1943, General Douglas MacArthur reckoned that for every fighting division at his disposal, two were laid low by malaria.

A raging infection rate during the Vietnam War was malaria’s parting gift to the U.S. in the waning years of the 20th century. Between 1964 and 1973, the U.S. Army suffered an estimated 391,965 sick-days from malaria cases alone. The disease didn’t decide the war, but it stacked the odds.

Throughout history, malaria hasn’t had to wipe out entire populations to be devastating. It has left them poor and enfeebled instead. With the advent of the new vaccine, the hardest hit countries can envisage a future no longer shaped by the disease.

Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, from a Latin phrase meaning “he who sues for the king as well as for himself.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Winning Ways of Moving the Troops

Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare

The Wall Street Journal

May 6, 2021

The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.

How troops move into battle positions is hardly the stuff of poetry. Homer’s “The Iliad” begins with the Greeks having already spent 10 years besieging Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.

The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.

ILLUSTRATION: THOMAS FUCHS

The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Vandals. Their Central European horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.

Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.

Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.

The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.

The Prussians referred to this kind of troop-maneuvering warfare as bewegungskrieg, war of movement, using it to crushing effect over the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.

The Germans started World War II with their production capacity lagging that of the Allies; they compensated by updating bewegungskrieg to what became known as blitzkrieg, or lightning strike, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.

In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)

Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.

Historically Speaking: The Blessing and Curse of ‘Black Gold’

From the pharaohs to John D. Rockefeller, oil has been key to the growth of civilization—but it comes at a high cost.

The Wall Street Journal

January 10, 2020

This January marks the 150th anniversary of the Standard Oil Company, incorporated in 1870 by John D. Rockefeller and three partners. Such was their drive and ruthlessness that within a decade Standard Oil became a vast monopoly, controlling over 90% of America’s oil refineries.

Spindletop Hill in Beaumont, Texas, was the site of the first Texas oil gusher in 1901. PHOTO: TEXAS ENERGY MUSEUM/NEWSMAKERS/GETTY IMAGES

Standard Oil’s tentacle-like grip on U.S. commerce was finally pried loose in 1911, when the Supreme Court broke it up into 33 separate companies. But this victory didn’t put an end to the nefarious activities surrounding “black gold.” In the 1920s, tycoon Edward Doheny was drawn into the Teapot Dome scandal after he gave a $100,000 bribe to Secretary of the Interior Albert Fall. Doheny served as the inspiration for the corrupt and blood-soaked tycoon J. Arnold Ross in Upton Sinclair’s 1927 novel “Oil!” (which in turn inspired the Oscar-winning 2007 film “There Will Be Blood”).

Though clearly responsible for a great many evils, oil has also been key to the growth of human civilization. As the Bible attests, bitumen—a naturally forming liquid found in oil sands and pitch lakes—was used in ancient times for waterproofing boats and baskets. It also played an important role in Egyptian burial practices: The word “mummy” is derived from mumiyyah, the Arabic word for bitumen.

Over the centuries, oil proved useful in a variety of ways. As early as the fourth century, the Chinese were drilling for oil with bamboo pipes and burning it as heating fuel. In Central Eurasia it was a treatment for mange in camels. By the ninth century, Persian alchemists had discovered how to distill oil into kerosene to make light. The oil fields of medieval Baku, in today’s Azerbaijan, brought trade and culture to the region, rather than political oppression and underdevelopment, as is often the case in oil-rich countries today.

The drilling of the first commercial oil well in the U.S., in Pennsylvania in 1859, brought a raft of benefits. In the 19th century, an estimated 236,000 sperm whales were killed to make oil for lamps. The whaling industry died overnight once Standard Oil began marketing a clean-smelling version of kerosene. Plentiful oil also made the automobile industry possible. In 1901, when a massive oil gusher was discovered at Spindletop, Texas, there were 14,800 cars in the U.S.; two decades later, there were 8.5 million.

After World War II, the world’s oil supply was dominated not by private companies like Standard Oil but by cartels of oil-producing nations such as OPEC. When OPEC nations declared an embargo in 1973, the resulting crisis caused the price of oil to climb nearly 400%.

At the time, the U.S. depended on foreign suppliers for 36% of its oil supply. Last month, the U.S. Energy Information Administration announced that, thanks to new technologies such as hydraulic fracturing, the country had become a net exporter of oil for the first time in 75 years.

Though helpful geopolitically, America’s oil independence doesn’t solve the environmental problems caused by carbon emissions. Ironically, some of John D. Rockefeller’s own descendants, aided by the multibillion-dollar fortune he bequeathed, are now campaigning against Exxon Mobil—one of the 33 Standard Oil offshoots—over its record on climate change.

WSJ Historically Speaking: The Psychology and History of Snipers

PHOTO: THOMAS FUCHS

Sharpshooters helped turn the course of World War II 75 years ago at the Battle of Stalingrad

The Battle of Stalingrad during World War II cost more than a million lives, making it one of the bloodiest battles in human history. The killing began in earnest 75 years ago this week, after the Germans punched through Soviet defenses to reach the outskirts of the city. Once inside, however, they couldn’t get out.

With both sides dug in for the winter, the Russians unleashed one of their deadliest weapons: trained snipers. By the end of the war, Russia had trained more than 400,000 snipers, including thousands of women. At Stalingrad, they had a devastating impact on German morale and fighting capability.