WSJ Historically Speaking: With Big Prizes Often Comes Controversy

It’s not just the Nobel: Award-giving missteps have a long history

ILLUSTRATION: THOMAS FUCHS

This spring, controversies have engulfed three big prizes.

The Swedish Academy isn’t awarding the Nobel Prize for Literature this year while it deals with the fallout from a scandal over allegations of sexual assault and financial impropriety.

In the U.S., the author Junot Díaz has stepped down as Pulitzer Prize chairman while the board investigates allegations of sexual misconduct. In a statement through his literary agent earlier this month, Mr. Díaz did not address individual accusations but said in part, “I take responsibility for my past.” Finally, the organizers of the Echo, Germany’s version of the Grammys, said they would no longer bestow the awards after one of this year’s prizes went to rappers who used anti-Semitic words and images in their lyrics and videos.

Prize-giving controversies—some more serious than others—go back millennia. I know something about prizes, having served as chairwoman of the literary Man Booker Prize jury.

The ancient Greeks gave us the concept of the arts prize. To avoid jury corruption in their drama competitions during the Festival of Dionysus, the Athenians devised a complicated system of votes and lotteries that is still not entirely understood today. Looking back now, the quality of the judging seems questionable. Euripides, the greatest tragedian of classical Greece, habitually challenged his society’s assumptions in tragedies like “Medea,” which sympathetically portrayed female desperation in a society where men ruled absolutely. In a three-way competition, “Medea,” which still holds the stage today, placed last.

Controversy surrounding a competition can be a revitalizing force—especially when the powers that be support the dissidents. By the 1860s, France’s Academy of Fine Arts, the defender of official taste, was growing increasingly out of touch with contemporary art. In 1863, the jury of the prestigious annual Salon exhibition, which the academy controlled, rejected artists such as Paul Cézanne, Camille Pissarro and Édouard Manet.

The furor caused Emperor Napoleon III to order a special exhibition, the Salon des Refusés (“Salon of Rejects”), to “let the public judge” who was right. The public was divided, but the artists felt emboldened, and many scholars regard 1863 as the birthdate of modern painting. The Academy ultimately relinquished its control of the Salon in 1881. Its time was over.

At other times, controversies over prizes are more flash than substance. As antigovernment student protests swept Paris and many other places in 1968, a group of filmmakers tried to show solidarity with the protesters by shutting down the venerable Cannes Film Festival. At one point, directors hung from a curtain to prevent a film from starting. The festival was canceled but returned in 1969 without the revolutionary changes some critics were hoping for.

In contrast, a recent dispute at the festival over its refusal to admit to competition Netflix films that bypass French theaters for streaming was relatively quiet, but it reflects a serious power struggle between streaming services and theatrical movie distributors.

As the summer approaches and the beleaguered festivals around the world take a breather, here’s some advice from a survivor of the prize process: Use this time to reflect and revive.

WSJ Historically Speaking: Serendipity of Science is Often Born of Years of Labor

Over the centuries, lucky discoveries have depended on training and discernment

ILLUSTRATION: THOMAS FUCHS

One recent example comes from an international scientific team studying the bacterium Ideonella sakaiensis 201-F6, which makes an enzyme that breaks down the most commonly used form of plastic, thus allowing the bacterium to eat it. As reported last month in the Proceedings of the National Academy of Sciences, in the course of their research the scientists accidentally created an enzyme even better at dissolving the plastic. It’s still early days, but we may have moved a step closer to solving the man-made scourge of plastics pollution.

The development illustrates a truth about seemingly serendipitous discoveries: The “serendipity” part is usually the result of years of experimentation—and failure. A new book by two business professors at Wharton and a biology professor, “Managing Discovery in the Life Sciences,” argues that governments and pharmaceutical companies should adopt more flexible funding requirements—otherwise innovation and creativity could end up stifled by the drive for quick, concrete results. As one of the authors, Philip Rea, argues, serendipity means “getting answers to questions that were never posed.”

So much depends on who has observed the accident, too. As Joseph Henry, the first head of the Smithsonian Institution, said, “The seeds of great discoveries are constantly floating around us, but they only take root in minds well prepared to receive them.”

One famously lucky meeting of perception and conception happened in 1666, when Isaac Newton observed an apple fall from the tree. (The details remain hazy, but there’s no evidence that the fruit actually hit him, as legend has it.) Newton had seen apples fall before, of course, but this time the sight inspired him to ask questions about gravity’s relationship to the rules of motion that he was contemplating. Still, it took Newton another 20 years of work before he published his Law of Universal Gravitation.

Bad weather was the catalyst for another revelation, leading to physicist Henri Becquerel’s discovery of radioactivity in 1896. Unable to continue his photographic X-ray experiments on the effect of sunlight on uranium salt, Becquerel put the plates in a drawer. Incredibly, the plates developed images without any exposure to light. Realizing that he had been pursuing the wrong question, Becquerel started again, this time focusing on uranium itself as a radiation emitter.

As for inventions, accident and inadvertence played a role in the development of Post-it Notes and microwave heating. During the 1990s, Viagra failed miserably in trials as a treatment for angina, but alert researchers at Pfizer realized that one of the side effects could have global appeal.

The most famous accidental medical discovery is antibiotics. The biologist Alexander Fleming discovered penicillin in 1928 after he went on vacation, leaving a petri dish of bacteria out in the laboratory. On his return, he found that the dish had grown mold, with a clean area around it. Fleming realized that something in the mold must have killed off the bacteria.

That ability to ask the right questions can be more important than knowing the right answers. Funders of science should take note.

 

WSJ Historically Speaking: Undying Defeat: The Power of Failed Uprisings

From the Warsaw Ghetto to the Alamo, doomed rebels live on in culture

John Wayne said that he saw the Alamo as ‘a metaphor of America.’ PHOTO: ALAMY

Earlier this month, Israel commemorated the 75th anniversary of the Warsaw Ghetto Uprising of April 1943. The annual Holocaust and Heroism Remembrance Day, as it is called, reminds Israelis of the moral duty to fight to the last.

The Warsaw ghetto battle is one of many doomed uprisings across history that have cast their influence far beyond their failures, providing inspiration to a nation’s politics and culture.

Nearly 500,000 Polish Jews once lived in the ghetto. By January 1943, the Nazis had marked the surviving 55,000 for deportation. The Jewish Fighting Organization had just one machine gun and fewer than a hundred revolvers for a thousand or so sick and starving volunteer soldiers. The Jews started by blowing up some tanks and fought on until May 16. The Germans executed 7,000 survivors and deported the rest.

For many Jews, the rebellion offered a narrative of resistance, an alternative to the grim story of the fortress of Masada, where nearly 1,000 besieged fighters chose suicide over slavery during the First Jewish-Roman War (A.D. 66–73).

The story of the Warsaw ghetto uprising has also entered the wider culture. The title of Leon Uris’s 1961 novel “Mila 18” comes from the street address of the headquarters of the Jewish resistance in their hopeless fight. Four decades later, Roman Polanski made the uprising a crucial part of his 2002 Oscar-winning film, “The Pianist,” whose musician hero aids the effort.

Other doomed uprisings have also been preserved in art. The 48-hour Paris Uprising of 1832, fought by 3,000 insurrectionists against 30,000 regular troops, gained immortality through Victor Hugo, who made the revolt a major plot point in “Les Misérables” (1862). The novel was a hit on its debut and ever after—and gave its world-wide readership a set of martyrs to emulate.

Even a young country like the U.S. has its share of national myths, of desperate last stands serving as touchstones for American identity. One has been the Battle of the Alamo in 1836 during the War of Texas Independence. “Remember the Alamo” became the Texan war cry only weeks after roughly 200 ill-equipped rebels, among them the frontiersman Davy Crockett, were killed defending the Alamo mission in San Antonio against some 2,000 Mexican troops.

The Alamo’s imagery of patriotic sacrifice became popular in novels and paintings but really took off during the film era, beginning in 1915 with the D.W. Griffith production, “Martyrs of the Alamo.” Walt Disney got in on the act with his 1950s TV miniseries, “Davy Crockett: King of the Wild Frontier.” John Wayne’s 1960 “The Alamo,” starring Wayne as Crockett, immortalized the character for a generation.

Wayne said that he saw the Alamo as “a metaphor of America” and its will for freedom. Others did too, even in very different contexts. During the Vietnam War, President Lyndon Johnson, whose hometown wasn’t far from San Antonio, once told the National Security Council why he believed U.S. troops needed to be fighting in Southeast Asia: “Hell,” he said, “Vietnam is just like the Alamo.”

WSJ Historically Speaking: When Blossoms and Bullets Go Together: The Battles of Springtime

Generals have launched spring offensives from ancient times to the Taliban era

ILLUSTRATION: THOMAS FUCHS

“When birds do sing, hey ding a ding, ding; Sweet lovers love the spring,” wrote Shakespeare. But the season has a darker side as well. As we’re now reminded each year when the Taliban anticipate the warm weather by announcing their latest spring offensive in Afghanistan, military commanders and strategists have always loved the season, too.

The World War I poet Wilfred Owen highlighted the irony of this juxtaposition—the budding of new life alongside the massacre of those in life’s prime—in his famous “Spring Offensive”: “Marvelling they stood, and watched the long grass swirled / By the May breeze”—right before their deaths.

The pairing of rebirth with violent death has an ancient history. In the 19th century, the anthropologist James George Frazer identified the concept of the “dying and rising god” as one of the earliest cornerstones of religious belief. For new life to appear in springtime, there had to be a death or sacrifice in winter. Similar sacrifice-and-rejuvenation myths can be found among the Sumerians, Egyptians, Canaanites and Greeks.

Mediterranean and Near Eastern cultures saw spring in this dual perspective for practical reasons as well. The agricultural calendar revolved around wet winters, cool springs and very hot summers when almost nothing grew except olives and figs. Harvest time for essential cereal crops such as wheat and barley took place in the spring. The months of May and June, therefore, were perfect for armies to invade, because they could live off the land. The Bible says of King David, who lived around 1,000 B.C., that he sent Joab and the Israelite army to fight the Ammonites “in the spring of the year, when kings normally go out to war.”

It was no coincidence that the Romans named the month of March after Mars, the god of war but also the guardian of agriculture. As the saying goes, “An army marches on its stomach.” For ancient Greek historians, the rhythm of war rarely changed: Discussion took place in the winter, action began in spring. When they referred to a population “waiting for spring,” it was usually literary shorthand for a people living in fear of the next attack. The military campaigns of Alexander the Great (356-323 B.C.) into the Balkans, Persia and India began with a spring offensive.

In succeeding centuries, the seasonal rhythms of Europe, which were very different from those of warmer climes, brought about a new calendar of warfare. Europe’s reliance on the autumn harvest ended the ancient marriage of spring and warfare. Conscripts were unwilling to abandon their farms and fight in the months between planting and harvesting.

 This seasonal difficulty would not be addressed until Sweden’s King Gustavus Adolphus (1594-1632), a great military innovator, developed principles for the first modern army. According to the British historian Basil Liddell Hart, Gustavus made the crucial shift from short-term conscripts, drawn away from agricultural labor, to a standing force of professional, trained soldiers on duty all year round, regardless of the seasons.

Gustavus died before he could fully implement his ideas. This revolution in military affairs fell instead to Frederick the Great, king of Prussia (1712-1786), who turned military life into a respectable upper-class career choice and the Prussian army into a mobile, flexible and efficient machine.

Frederick believed that a successful army attacks first and hard, a lesson absorbed by Napoleon a half century later. This meant that the spring season, which had become the season for drilling and training in preparation for summer campaigning, became a fighting season again.

But the modern iteration of the spring offensive is different from its ancient forebear. Its purpose isn’t to feed an army but to incapacitate enemies before they have the chance to strike. The strategy is a risky gambler’s throw, relying on timing and psychology as much as on strength and numbers.

For Napoleon, the spring offensive played to his strength in being able to combine speed, troop concentration and offensive action in a single, decisive blow. Throughout his career he relied on the spring offensive, beginning with his first military campaign in Italy (1796-7), in which the French defeated the more-numerous and better-supplied Austrians. His final spring campaign was also his boldest. Despite severe shortages of money and troops, Napoleon came within a hair’s breadth of victory at the Battle of Waterloo on June 18, 1815.

The most famous spring campaign of the early 20th century—Germany’s 1918 offensive in World War I, originated by Gen. Erich Ludendorff—reveals its limitations as a strategy. If the knockout blow doesn’t happen, what next?

 At the end of 1917, the German high command had decided that the army needed a spring offensive to revive morale. Ludendorff thought that only an attack in the Napoleonic mode would work: “The army pined for the offensive…It alone is decisive,” he wrote. He was convinced that all he had to do was “blow a hole in the middle” of the enemy’s front and “the rest will follow of its own accord.” When Ludendorff’s first spring offensive stalled after 15 days, he quickly launched four more. Lacking any other objective than the attack itself, all failed, leaving Germany bankrupt and crippled by July.

In this century, the Taliban have found their own brutal way to renew the ancient tradition—with the blossoms come the bombs and the bloodshed.

WSJ Historically Speaking: How Mermaid-Merman Tales Got to This Year’s Oscars

ILLUSTRATION: DANIEL ZALKUS

‘The Shape of Water,’ the best-picture winner, extends a tradition of ancient tales of these water creatures and their dealings with humans

Popular culture is enamored with mermaids. This year’s Best Picture Oscar winner, Guillermo del Toro’s “The Shape of Water,” about a lonely mute woman and a captured amphibious man, is a new take on an old theme. “The Little Mermaid,” Disney’s enormously successful 1989 animated film, was based on the Hans Christian Andersen story of the same name, and it was turned into a Broadway musical that is still being staged across the country.

The fascination with mer-mythology began with the ancient Greeks. In the beginning, mermen were few and far between. As for mermaids, they were simply members of a large chorus of female sea creatures that included the benign Nereids, the sea-nymph daughters of the sea god Nereus, and the Sirens, whose singing led sailors to their doom—a fate Odysseus barely escapes in Homer’s epic “The Odyssey.”

Over the centuries, the innocuous mermaid became interchangeable with the deadly siren. Mermaids led Scottish sailors to their deaths in one of the variations of the anonymous poem “Sir Patrick Spens,” probably written in the 15th century: “Then up it raise the mermaiden, / Wi the comb an glass in her hand: / ‘Here’s a health to you, my merrie young men, / For you never will see dry land.’”

In pictures, mermaids endlessly combed their hair while sitting semi-naked on the rocks, lying in wait for seafarers. During the Elizabethan era, a “mermaid” was a euphemism for a prostitute. Poets and artists used them to link feminine sexuality with eternal damnation.

But in other tales, the original, more innocent idea of a mermaid persisted. Andersen’s 1837 story followed an old literary tradition of a “virtuous” mermaid hoping to redeem herself through human love.

Andersen purposely broke with the old tales. As he acknowledged to a friend, his fishy heroine would “follow a more natural, more divine path” that depended on her own actions rather than those of “an alien creature.” Egged on by her sisters to murder the prince whom she loves and return to her mermaid existence, she chooses death instead—a sacrifice that earns her the right to a soul, something that mermaids were said to lack.

Richard Wagner’s version of mermaids—the Rhine maidens who guard the treasure of “Das Rheingold”—also bucked the “temptress” cliché. While these maidens could be cruel, they gave valuable advice later in the “Ring” cycle.

The cultural rehabilitation of mermaids gained steam in the 20th century. In T.S. Eliot’s 1915 poem, “The Love Song of J. Alfred Prufrock,” their erotic power becomes a symbol of release from stifling respectability. The sad protagonist laments, “I have heard the mermaids singing, each to each. / I do not think that they will sing to me.” By 1984, when a gorgeous mermaid (Daryl Hannah) fell in love with a nerdy man (Tom Hanks) in the film comedy “Splash,” audiences were ready to accept that mermaids might offer a liberating alternative to society’s hang-ups, and that the obstacle to perfect happiness is not female sexuality but humans themselves.

What makes “The Shape of Water” unusual is that a scaly male, not a sexy mermaid, is the object of affection to be rescued. Andersen probably wouldn’t recognize his Little Mermaid in Mr. del Toro’s nameless, male amphibian, yet the two tales are mirror images of the same fantasy: Love conquers all.

WSJ Historically Speaking: In Epidemics, Leaders Play a Crucial Role

ILLUSTRATION: JON KRAUSE

Lessons in heroism and horror as a famed flu pandemic hits a milestone

A century ago this week, an army cook named Albert Gitchell at Fort Riley, Kansas, paid a visit to the camp infirmary, complaining of a severe cold. It’s now thought that he was America’s patient zero in the Spanish Flu pandemic of 1918.

The disease killed more than 40 million people world-wide, including 675,000 Americans. In this case, as in so many others throughout history, the pace of the pandemic’s deadly progress depended on the actions of public officials.

Spain had allowed unrestricted reporting about the flu, so people mistakenly believed it originated there. Other countries, including the U.S., squandered thousands of lives by suppressing news and delaying health measures. Chicago kept its schools open, citing a state commission that had declared the epidemic at a “standstill,” while the city’s public health commissioner said, “It is our duty to keep the people from fear. Worry kills more people than the epidemic.”

Worry had indeed sown chaos, misery and violence in many previous outbreaks, such as the Black Plague. The disease, probably caused by bacteria-infected fleas living on rodents, swept through Asia and Europe during the 1340s, killing up to a quarter of the world’s population. In Europe, where over 50 million died, a search for scapegoats led to widespread pogroms against Jews. In 1349, the city of Strasbourg, now in France, already somewhat affected by the plague, put to death hundreds of Jews and expelled the rest.

But not all authorities lost their heads at the first sign of contagion. Pope Clement VI (1291-1352), one of a series of popes who ruled from the southern French city of Avignon, declared that the Jews had not caused the plague and issued two papal bulls against their persecution.

In Italy, Venetian authorities took the practical approach: They didn’t allow ships from infected ports to dock and subjected all travelers to a period of isolation. The term quarantine comes from the Italian quaranta giorni, meaning “40 days”—the official length of time until the Venetians granted foreign ships the right of entry.

Less exalted rulers could also show prudence and compassion in the face of a pandemic. After the plague struck the village of Eyam in England in 1665, the vicar William Mompesson persuaded its several hundred inhabitants not to flee, to prevent the disease from spreading to other villages. The biggest landowner in the county, the earl of Devonshire, ensured a regular supply of food and necessities to the stricken community. Some 260 villagers died during their self-imposed quarantine, but their decision likely saved thousands of lives.

The response to more recent pandemics has not always met that same high standard. When the viral disease severe acute respiratory syndrome (SARS) emerged in China in November 2002, the government’s refusal to acknowledge the outbreak allowed the disease to spread to Hong Kong, a hub for the West and much of Asia, thus creating a world problem. On a more hopeful note, when Ebola was spreading uncontrollably through West Africa in 2014, the Ugandans leapt into action, saturating their media with warnings and enabling quick reporting of suspected cases, and successfully contained their outbreak.

Pandemics always create a sense of crisis. History shows that public leadership is the most powerful weapon in keeping them from becoming full-blown tragedies.

WSJ Historically Speaking: The Quest for Unconsciousness: A Brief History of Anesthesia

The ancient Greeks used alcohol and opium. Patients in the 12th century got a ‘soporific sponge.’ A look at anesthetics over the centuries

ILLUSTRATION: ELLEN WEINSTEIN

Every year, some 21 million Americans undergo a general anesthetic. During recent minor surgery, I became one of the roughly 26,000 Americans a year who experience “anesthetic awareness” during sedation: I woke up. I still can’t say what was more disturbing: being conscious or seeing the horrified faces of the doctors and nurses.

The best explanation my doctors could give was that not all brains react in the same way to a general anesthetic. Redheads, for example, seem to require higher dosages than brunettes. While not exactly reassuring, this explanation does highlight one of the many mysteries behind the science of anesthesia.

Although being asleep and being unconscious might look the same, they are very different states. Until the mid-19th century, a medically induced deep unconsciousness was beyond the reach of science. Healers had no reliable way to control, let alone eliminate, a patient’s awareness or pain during surgery, though not for lack of trying.

The ancient Greeks generally relied on alcohol, poppy opium or mandrake root to sedate patients. Evidence from the “Sushruta Samhita,” an ancient Sanskrit medical text, suggests that Indian healers used cannabis incense. The Chinese developed acupuncture at some point before 100 B.C., and in Central and South America, shamans used the spit from chewed coca leaves as a numbing balm.

Little changed over the centuries. In the 12th century, Nicholas of Salerno recorded in a treatise the recipe for a “soporific sponge” with ingredients that hadn’t advanced much beyond the medicines used by the Greeks: a mixture of opium, mulberry juice, lettuce seed, mandrake, ivy and hemlock.

Discoveries came but weren’t exploited. In 1540, the Swiss alchemist and astrologer Paracelsus (aka Theophrastus Bombastus von Hohenheim) noted that liquid ether could induce sleep in animals. In 1772, the English chemist Joseph Priestley discovered nitrous oxide gas (laughing gas). Using it became the thing to do at parties—in 1799, the poet Coleridge described trying the gas—but apparently no one tried using ether or nitrous oxide for medicinal purposes.

In 1811, the novelist Fanny Burney had no such recourse when she went under the knife for suspected breast cancer. She wrote later, “O Heaven!—I then felt the Knife rackling against the breast bone—scraping it!”

Despite the ordeal, Burney lived into her 80s, dying in 1840—just before everything changed. Ether, nitrous oxide and later chloroform soon became common in operating theaters. On Oct. 16, 1846, a young dentist from Boston named William Morton made history by performing surgery on a patient anesthetized with ether. It was such a success that, a few months later, Frances Appleton Longfellow, wife of Henry Wadsworth Longfellow, became the first American to receive anesthesia during childbirth.

But these wonder drugs were lethal if not administered properly. A German study compiled in 1934 estimated that the number of chloroform-related deaths was as high as 1 in 3,000 operations. The drive for safer drugs produced such breakthroughs as halothane in 1955, which could be inhaled by patients.

Yet for all the continuous advances in anesthesia, scientists still don’t entirely understand how it works. A study published in the December 2017 issue of Annals of Botany reveals that anesthetics can also stop motion in plants like the Venus flytrap—which, as far as we know, doesn’t have a brain. Clearly, we still have a lot to learn about consciousness in every form.

WSJ Historically Speaking: Life Beyond the Three-Ring Circus

Why ‘The Greatest Show on Earth’ foundered—and what’s next

ILLUSTRATION: THOMAS FUCHS

The modern circus, which celebrates its 250th anniversary this year, has attracted such famous fans as Queen Victoria, Charles Dickens and Ernest Hemingway, who wrote in 1953, “It’s the only spectacle I know that, while you watch it, gives the quality of a truly happy dream.”

Recently, however, the “happy dream” has struggled with lawsuits, high-profile bankruptcies and killer clown scares inspired in part by the evil Pennywise in Stephen King’s “It.” Even the new Hugh Jackman-led circus film, “The Greatest Showman,” comes with an ironic twist. The surprise hit—about the legendary impresario P.T. Barnum, co-founder of “The Greatest Show on Earth”—arrives on the heels of last year’s closing of the actual Ringling Bros. and Barnum & Bailey Circus, after 146 years in business.

The word circus is Roman, but Roman and modern circuses do not share the same roots. Rome’s giant Circus Maximus, which could hold some 150,000 people, was more of a sporting arena than a theatrical venue, built to hold races, athletic competitions and executions. The Roman satirist Juvenal was alluding to the popular appeal of such spectacles when he coined the phrase “bread and circuses,” assailing citizens’ lack of interest in politics.

In fact, the entertainments commonly linked with the modern circus—acrobatics, animal performances and pantomimes—belong to traditions long predating the Romans. Four-millennia-old Egyptian paintings show female jugglers; in China, archaeologists have found 2,000-year-old clay figurines of tumblers.

Circus-type entertainments could be hideously violent: In 17th-century Britain, dogs tore into bears and chimpanzees. A humane change of pace came in 1768, when Philip Astley, often called the father of the modern circus, put on his first show in London, in a simple horse-riding ring. He found that a circle 42 feet in diameter was ideal for using centrifugal force as an aid in balancing on a horse’s back while doing tricks. It’s a size still used today. Between the horse shows, he scheduled clowning and tumbling acts.

Circuses in fledgling America, with its long distances, shortage of venues and lack of large cities, found the European model too static and costly. In 1808, Hachaliah Bailey took the circus in a new direction by making animals the real stars, particularly an African elephant named Old Bet. The focus on animal spectacles became the American model, while Europeans still emphasized human performers.

When railroads spread across America, circuses could ship their menageries. Already famous for his museums and “freak shows,” P.T. Barnum and his partners joined forces with rivals and used special circus trains to create the largest circus in the country. Although Barnum played up the animal and human oddities in his “sideshow,” the marquee attraction was Jumbo the Elephant. In its final year, the Ringling Bros. animal contingent, according to a news report, included tigers, camels, horses, kangaroos and snakes. The elephants had already retired.

Once animal-rights protests and rising travel costs started eroding profitability in the late 20th century, the American circus became trapped by its own history. But the success of Canada’s Cirque du Soleil, which since its 1984 debut has conquered the globe with its astounding acrobatics and staging, shows that the older European tradition introduced by Astley still has the power to inspire wonder. The future may well lie in looking backward, to the era when the stars of the show were the people in the ring.

WSJ Historically Speaking: Remembering the Pueblo: Hostages as Propaganda Tools

The Pueblo incident, involving the North Korean takeover of a spy ship, turns 50

ILLUSTRATION: THOMAS FUCHS

Fifty years ago, on Jan. 23, 1968, North Korean forces captured the U.S. Navy spy ship Pueblo in international waters. North Korea took 82 crew members hostage (one was killed in the attack) and subjected them to 11 months of sporadic torture and starvation, humiliating appearances and forced confessions before an international radio and TV audience. Communications technology had given the ancient practice of hostage-taking a whole new purpose as a tool of propaganda.

Hostages have always been a part of warfare. By the second millennium B.C., Egyptians would take the young princes of conquered states and hold them as surety for good behavior, treating the young nobles well with the aim of turning them into future allies.

The Romans admired this tactic and imitated it. But others were simply interested in money. As a young man, Julius Caesar (100-44 B.C.) was held for ransom by pirates. The ancient biographer Plutarch writes that, while a hostage, Caesar amused himself by reading his poems and speeches to his captors. The pirates assumed he was mad, especially when he promised to come back and hang them all. Once the ransom had been paid, the future general fulfilled his vow, hunting down the pirates and executing all of them.

During the Middle Ages, a hostage was better than money in the bank. Negotiating parties used hostages to enforce peace treaties, trade deals and even safe passage. In 1412, for instance, a French political faction sealed an alliance with the English King Henry IV. As part of the guarantee, the 12-year-old John of Orléans, Count of Angoulême, was sent to England, where he remained a political hostage for the next 32 years.

If a deal fell apart, however, retribution could be devastating. During the Third Crusade (1189–1192), King Richard I of England, known as the Lionheart, ordered the massacre of nearly 3,000 Muslim hostages after the Sultan Saladin reneged on his promise to pay a ransom and return his Christian prisoners along with relics of the True Cross.

Brutality toward hostages has been a lamentably common feature of modern warfare. The Germans showed little compunction during the Franco-Prussian War of 1870-71, when they used civilians as human shields on military trains. During World War II, amid a range of other atrocities, the Nazis killed thousands of civilian hostages across Europe, often in reprisal for earlier attacks. During one massacre in German-occupied Serbia in 1941, 100 hostages were to be shot for each dead German soldier.

The idea of hostage-taking as an end in itself is largely a 20th-century development—a way to exploit the powerful reach of mass media. The North Koreans were hardly alone. Domestic extremists also saw the propaganda value of hostages, as in the 1974 kidnapping of Patty Hearst by the Symbionese Liberation Army.

Just five years later, students supporting Iran’s Islamic revolution stormed the U.S. Embassy in Tehran and took 66 American hostages. The students had various demands, among them the extradition of the deposed shah. But their real motivation seemed to be inflicting pain on the captive Americans—who were beaten, threatened with death and paraded in blindfolds before a mob—and on the U.S. itself. There were some early releases, but 52 hostages were held under appalling conditions for 444 days.

Today, memories of the Pueblo incident and the Iran hostage crisis have faded, but both hostage-takings have had a lasting influence on American attitudes. In certain ways, they still define U.S. relations with the regimes of North Korea and Iran.