Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from the Roman "Qui Tam" laws, so called from a Latin phrase meaning "he who sues in this matter for the king as well as for himself." The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrongdoers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant's estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into a tool for reporting corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase "to blow the whistle" only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil's monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that's now the subject of debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If the waiver goes through, it will be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. Their anomalous position came to a head in 1857 after a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, and while the owner was a citizen, he wasn’t the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker's attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.

Historically Speaking: The Winning Ways of Moving the Troops

Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare

The Wall Street Journal

May 6, 2021

The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.

How troops move into battle positions is hardly the stuff of poetry. Homer’s “The Iliad” begins with the Greeks having already spent 10 years besieging Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.

The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.

ILLUSTRATION: THOMAS FUCHS

The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Vandals. Their Central European horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.

Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.

Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.

The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.

The Prussians referred to this kind of troop-maneuvering warfare as Bewegungskrieg, or war of movement, using it to crushing effect against the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.

The Germans started World War II with their production capacity lagging that of the Allies; they compensated by updating Bewegungskrieg into what became known as blitzkrieg, or lightning war, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.

In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)

Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.

Historically Speaking: The Tragedy of Vandalizing the Past

The 20th anniversary of the destruction of the Bamiyan Buddhas in Afghanistan reminds us of the imperative of historical preservation

The Wall Street Journal

April 15, 2021

Twenty years ago this spring, the Taliban completed their obliteration of Afghanistan’s 1,500-year-old Buddhas of Bamiyan. The colossal stone sculptures had survived major assaults in the 17th and 18th centuries by the Mughal emperor Aurangzeb and the Persian king Nader Afshar. Lacking sufficient firepower, both gave up after partly defacing the monuments.

The Taliban’s methodical destruction recalled the calculated brutality of ancient days. By the time the Romans were finished with Carthage in 146 B.C., the entire city had been reduced to rubble. They were given a taste of their own medicine in 455 A.D. by Genseric, King of the Vandals, who stripped Rome bare in two weeks of systematic looting and destruction.

One of the Buddhas of Bamiyan in 1997, before their destruction.
PHOTO: ALAMY

Like those of other vanquished cities, Rome’s buildings became a source of free material. Emperor Constans II of Byzantium blithely stole the Pantheon’s copper roofing in the mid-7th century; a millennium later, Pope Urban VIII appropriated its bronze girders for Bernini’s baldacchino over the high altar in St. Peter’s Basilica.

When not dismantled, ancient buildings might be repurposed by new owners. Thus Hagia Sophia Cathedral became a mosque after the Ottomans captured Constantinople, and St. Radegund’s Priory was turned into Jesus College at Cambridge University on the orders of King Henry VIII.

The idea that a country’s ancient heritage forms part of its cultural identity took hold in the wake of the French Revolution. Incensed by the Jacobins’ pillaging of churches, Henri Gregoire, the Constitutional Bishop of Blois, coined the term vandalisme. His protest inspired the novelist Victor Hugo’s efforts to save Notre Dame. But the architect chosen for the restoration, Eugène Emmanuel Viollet-le-Duc, added his own touches to the building, including the central spire that fell when the cathedral’s roof burned in 2019, reigniting controversy over what to restore. In his own day, Viollet-le-Duc’s interpolations set off a fierce debate, led by the English art critic John Ruskin, about what constitutes proper historical preservation.

Ruskin inspired people to rethink society’s relationship with the past. There was uproar in England in 1883 when the London and South Western Railway tried to justify building a rail-track alongside Stonehenge, claiming the ancient site was unused.

Public opinion in the U.S., when aroused, could be equally determined. The first preservation society was started in the 1850s by Ann Pamela Cunningham of South Carolina. Despite being disabled by a riding accident, Cunningham initiated a successful campaign to save George Washington’s Mount Vernon from ruin.

But developers have a way of getting what they want. Not even modernist architect Philip Johnson protesting in front of New York’s Penn Station was able to save the McKim, Mead & White masterpiece in 1963. Two years later, fearing that the world’s architectural treasures were being squandered, retired army colonel James Gray founded the International Fund for Monuments (now the World Monuments Fund). Without the WMF’s campaign in 1996, the deteriorating south side of Ellis Island, gateway for 12 million immigrants, might have been lost to history.

The fight never ends. I still miss the magnificent beaux-arts interior of the old Rizzoli Bookstore on 57th Street in Manhattan. The 109-year-old building was torn down in 2014. Nothing like it will ever be seen again.

Historically Speaking: The Long Fight to Take the Weekend Off

Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.

The Wall Street Journal

April 1, 2021

Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so was the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.

In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.

A day of leisure in Newport Beach, Calif., 1928. PHOTO: DICK WHITTINGTON STUDIO/CORBIS/GETTY IMAGES

The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.

Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.

Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.

American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.

Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day workweek became law in the U.S. under the Fair Labor Standards Act. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, several hundred thousand candidates took it every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing it would limit the number of Jewish students.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.

Historically Speaking: Iron Curtains Are Older Than the Cold War

Winston Churchill made the term famous, but ideological rivalries have driven geopolitics since Athens and Sparta.

The Wall Street Journal

February 25, 2021

It was an unseasonably springlike day on March 5, 1946, when Winston Churchill visited Fulton, Missouri. The former British Prime Minister was ostensibly there to receive an honorary degree from Westminster College. But Churchill’s real purpose in coming was to urge the U.S. to form an alliance with Britain to keep the Soviet Union from expanding any further. Speaking before an august audience that included President Harry S. Truman, Churchill declared: “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent.”

ILLUSTRATION: THOMAS FUCHS

Churchill wasn’t the first person to employ the phrase “iron curtain” as a political metaphor. Originally a theatrical term for the safety barrier between the stage and the audience, by the early 20th century it was being used to mean a barrier between opposing powers. Nevertheless, “iron curtain” became indelibly associated with Churchill and with the defense of freedom and democracy.

This was a modern expression of an idea first articulated by the ancient Greeks: that political beliefs are worth defending. In the winter of 431-30 B.C., the Athenians were staggering under a devastating plague while simultaneously fighting Sparta in the Peloponnesian War. The stakes couldn’t have been higher when the great statesman Pericles used a speech commemorating the war dead to define the struggle in terms that every Athenian would understand.

As reported by the historian Thucydides, Pericles told his compatriots that the fight wasn’t for more land, trade or treasure; it was for democracy pure and simple. Athens was special because its government existed “for the many instead of the few,” guaranteeing “equal justice to all.” No other regime, and certainly not the Spartans, could make the same claim.

Pericles died the following year, and Athens eventually went down in defeat in 404 B.C. But the idea that fighting for one’s country meant defending a political ideal continued to be influential. According to the 2nd-century Roman historian Cassius Dio, the Empire had to make war on the “lawless and godless” tribes living outside its borders. Fortifications such as Hadrian’s Wall in northern England weren’t just defensive measures but political statements: Inside bloomed civilization, outside lurked savagery.

The Great Wall of China, begun in 220 B.C. by Emperor Qin Shi Huang, had a similar function. In addition to keeping out the nomadic populations in the Mongolian plains, the wall symbolized the unity of the country under imperial rule and the Confucian belief system that supported it. Successive dynasties continued to fortify the Great Wall until the mid-17th century.

During the Napoleonic Wars, the British considered themselves to be fighting for democracy against dictatorship, like the ancient Athenians. In 1806, Napoleon introduced the Continental System, an economic blockade intended to cut off Britain from trading with France’s European allies and conquests. But the attack on free trade only strengthened British determination.

A similar resolve among the NATO allies led to the collapse of the Iron Curtain in 1989 and, two years later, the dissolution of the Soviet Union, which withdrew its armies from Eastern Europe. As Churchill had predicted, freedom and democracy are the ultimate shield against “war and tyranny.”

Historically Speaking: How Roses Came to Mean True Love

Our favorite Valentine’s Day flower was already a symbol of passion in ancient Greek mythology

The Wall Street Journal

February 13, 2021

“My luve is like a red, red rose,/That’s newly sprung in June,” wrote the Scottish poet Robert Burns in 1794, creating an inexhaustible revenue stream for florists everywhere, especially around Valentine’s Day. But why a red rose, you might well ask.

According to Greek myth, the blood of Aphrodite turned roses red.
PHOTO: GETTY IMAGES

Longevity is one reason. The rose is an ancient and well-traveled flower: A 55 million-year-old rose fossil found in Colorado suggests that roses were already blooming when our earliest primate ancestors began populating the earth. If you want to see where it all began, at least in the New World, then a trip to the Florissant Fossil Beds National Monument, roughly two hours’ drive from Denver, should be on your list of things to do once the pandemic is over.

In Greek mythology the rose was associated with Aphrodite, goddess of love, who was said to have emerged from the sea in a shower of foam that transformed into white roses. Her son Cupid bribed Harpocrates, the god of silence, with a single rose in return for not revealing his mother’s love affairs, giving rise to the Latin phrase sub rosa, “under the rose,” as a term for secrecy. As for the red rose, it was said to be born of tragedy: Aphrodite became tangled in a rose bush when she ran to comfort her lover Adonis as he lay dying from a wild boar attack. Scratched and torn by its thorns, her feet bled onto the roses and turned them crimson.

For the ancient Romans, the rose’s symbolic connection to love and death made it useful for celebrations and funerals alike. A Roman banquet without a suffocating cascade of petals was no banquet at all, and roses were regularly woven into garlands or crushed for their perfume. The first time Mark Antony saw Cleopatra he had to wade through a carpet of rose petals to reach her, by which point he had completely lost his head.

Rose cultivation in Asia became increasingly sophisticated during the Middle Ages, but in Europe the early church looked askance at the flower, regarding it as yet another example of pagan decadence. Fortunately, the Frankish emperor Charlemagne, an avid horticulturalist, refused to be cowed by old pieties, and in 794 he decreed that all royal gardens should contain roses and lilies.

The imperial seal of approval hastened the rose’s acceptance into the ecclesiastical fold. The Virgin Mary was likened to a thornless white rose because she was free of original sin. In fact, a climbing rose planted in her honor in 815 by the monks of Germany’s Hildesheim Cathedral is the oldest surviving rose bush today. Red roses, by contrast, symbolized the Crucifixion and Christian martyrs like St. Valentine, a priest killed by the Romans in the 3rd century, whose feast day is celebrated on Feb. 14. In the 14th century, his emergence as the patron saint of romantic love tipped the scales in favor of the red over the white rose.

The symbolism attached to the rose has long made it irresistible to poets. Shakespeare’s audience would have known that when Juliet compares Romeo to the flower—“that which we call a rose,/By any other name would smell as sweet”—it meant tragedy awaited the lovers. Yet they would have felt comforted, too, since each red rose bears witness, as Burns wrote, to the promise of love unbound and eternal: “Till a’ the seas gang dry, my dear,/And the rocks melt wi’ the sun.”

Historically Speaking: Two Centuries of Exploring Antarctica

Charting the southern continent took generations of heroic sacrifice and international cooperation.

The Wall Street Journal

January 14, 2021

There is a place on Earth that remains untouched by war, slavery or riots. Its inhabitants coexist in peace, and all nationalities are welcomed. No, it’s not Neverland or Shangri-La—it’s Antarctica, home to the South Pole, roughly 20 million penguins and a transient population of about 4,000 scientists and support staff.

Antarctica’s existence was only confirmed 200 years ago. Following some initial sightings by British and Russian explorers in January 1821, Captain John Davis, a British-born American sealer and explorer, landed on the Antarctic Peninsula on Feb. 7, 1821. Davis was struck by its immense size, writing in his logbook, “I think this Southern Land to be a Continent.” It is, in fact, the fifth-largest of Earth’s seven continents.

Herbert Ponting is attacked by a penguin during the 1911 Scott expedition in Antarctica.
PHOTO: HERBERT PONTING/SCOTT POLAR RESEARCH INSTITUTE, UNIVERSITY OF CAMBRIDGE/GETTY IMAGES

People had long speculated that there had to be something down at the bottom of the globe—in cartographers’ terms, a Terra Australis Incognita (“unknown southern land”). The ancient Greeks referred to the putative landmass as “Ant-Arktos,” because it was on the opposite side of the globe from the constellation of Arktos, the Bear, which appears in the north. But the closest anyone came to penetrating the freezing wastes of the Antarctic Circle was Captain James Cook, the British explorer, who searched for a southern continent from 1772 to 1775. He got within 80 miles of the coast, but the harshness of the region convinced Cook that “no man will ever venture further than I have done.”

Davis proved him wrong half a century later, but explorers were unable to make further progress until the heroic age of Antarctic exploration in the early 20th century. In 1911, the British explorer Robert F. Scott led a research expedition to the South Pole, only to be beaten by the Norwegian Roald Amundsen, who misled his backers about his true intentions and jettisoned scientific research for the sake of getting there quickly.

Extraordinarily bad luck led to the deaths of Scott and his teammates on their return journey. In 1915, Ernest Shackleton led a British expedition that aimed to make the first crossing of Antarctica by land, but his ship Endurance was trapped in the polar ice. The crew’s 18-month odyssey to return to civilization became the stuff of legend.

Soon exploration gave way to international competition over Antarctica’s natural resources. Great Britain marked almost two-thirds of the continent’s landmass as part of the British Empire, but a half dozen other countries also staked claims. In 1947 the U.S. joined the fray with Operation High Jump, a U.S. Navy-led mission to establish a research base that involved 13 ships and 23 aircraft.

Antarctica’s freedom and neutrality were in question during the Cold War. But in 1957, a group of geophysicists managed to launch a year-long Antarctic research project involving 12 countries. It was such a success that two years later the countries, including the U.S., the U.K. and the USSR, signed the Antarctic Treaty, guaranteeing the continent’s protection from militarization and exploitation. This goodwill toward men took a further 20 years to extend to women, but in 1979 American engineer Irene C. Peden became the first woman to work at the South Pole for an entire winter.

Historically Speaking: Awed by the Meteor Shower of the New Year’s Sky

Human beings have always marveled at displays like this weekend’s Quadrantids, but now we can understand them as well.

The Wall Street Journal

January 1, 2021

If you wish upon a star this week, you probably won’t get your heart’s desire. But if you’re lucky, you’ll be treated to an outstanding display of the Quadrantids, the annual New Year’s meteor shower that rivals the Perseids in intensity and quality of fireballs. The Quadrantids are exceptionally brief, however: The peak lasts only a few hours on January 2, and a cloudy sky or full moon can ruin the entire show.

A long-exposure photograph of the Draconid meteor shower in October 2018.
PHOTO: SMITYUK YURI/TASS/ZUMA PRESS

Meteor showers happen when the Earth encounters dust and rock sloughed off by a comet as it orbits the sun. The streaks of light we see are produced by this debris burning up in the Earth’s atmosphere.

Human beings have been aware of the phenomenon since ancient times. Some Christian archaeologists have theorized that the biblical story of Sodom and Gomorrah was inspired by a massive meteor strike near the Dead Sea some 3,700 years ago, which wiped out the Bronze Age city of Tall el-Hammam in modern Jordan.

Aristotle believed that comets and meteors weren’t heavenly bodies but “exhalations” from the Earth that ignited in the sky. As a result, Western astronomers took little interest in them until the rise of modern science. By contrast, the Chinese began recording meteor events as early as 687 B.C. The Mayans were also fascinated by meteor showers: Studies of hieroglyphic records suggest that important occasions, such as royal coronations, were timed to coincide with the Eta Aquarid shower in the spring.

Even before telescopes were invented, it wasn’t hard to observe comets, meteors and meteor showers. The 11th-century Bayeux Tapestry contains a depiction of Halley’s comet, which appeared in 1066. But people couldn’t see meteors for what they really were. Medieval Christians referred to the annual Perseid shower as “the tears of St. Lawrence,” believing that the burning tears of the martyred saint lit up the sky on his feast day, August 10.

Things began to change in the 19th century, as astronomers noticed that some meteor showers recurred on a fixed cycle. In November 1799, the Leonid shower was recorded by Andrew Ellicott, an American surveyor on a mission to establish the boundary between the U.S. and the Spanish territory of Florida. Ellicott was on board a ship in the Florida Keys when he observed the Leonids, writing in his journal that “the whole heavens appeared as if illuminated with skyrockets, flying in an infinity of directions, and I was in constant expectation of some of them falling on the vessel.” When a similar spectacle lit up the skies in the eastern U.S. in 1833, astronomers realized that it was a recurrence of the same phenomenon and that the meteor storm must be linked to the orbit of a particular comet.

The origin of the Quadrantids was harder to locate. Astronomers kept looking for its parent comet until 2003, when NASA scientist Peter Jenniskens realized that they were on the wrong track: The shower is actually caused by a giant asteroid, designated 2003 EH1, which broke off from a comet 500 years ago. It is somehow fitting that a mystery of the New Year’s night sky yielded to the power of an open mind.