Castaways and Other Lonely Survivors

From prisoners to exiles to marooned sailors, human beings have faced the ordeal of enforced solitude.

April 2, 2020

Keeping one’s own company can be blissful, but not when it’s involuntary. According to John Donne, the 17th-century English poet and priest, “As sickness is the greatest misery, so the greatest misery of sickness is solitude.” Now that nearly two in three Americans are subject to shelter-in-place orders as a result of the coronavirus pandemic, how will we cope with prolonged isolation?

Tom Hanks in the 2000 movie ‘Cast Away.’
PHOTO: EVERETT COLLECTION

The Germans have a special word for feeling utterly alone and isolated: mutterseelenallein, a compound that literally means “mother’s souls alone.” According to one theory, the word entered the German language as a misinterpretation of the French phrase moi tout seul (“me all alone”), which was used by the Huguenots—French Protestants who fled to Berlin in the late 17th century to escape persecution at home.

Although the ancients didn’t have an equivalent word, they certainly knew the feeling. In 44 B.C., after the assassination of Julius Caesar, the Roman statesman Marcus Tullius Cicero made a mortal enemy of Mark Antony and withdrew from Rome in fear for his life. Despite his loneliness, or perhaps because of it, Cicero used the time to write his final work, “On Duties,” in only four weeks.

We can’t all be like Cicero, of course, and write a masterpiece while on lockdown. But we can certainly rise to the occasion and surprise ourselves. One of the least likely castaways in history was the wealthy French aristocrat Marguerite de La Rocque de Roberval, who in 1542 agreed to accompany her spendthrift cousin Jean-François de Roberval on a voyage to New France, modern-day Canada. The jolly adventure became a nightmare after Roberval accused Marguerite of sexual immorality while on board his ship. This was his flimsy excuse for marooning her, along with her maid and her lover, on the deserted Isle of Demons off Newfoundland.

Marguerite’s lover and their baby soon succumbed, as did her maid, leaving the hitherto pampered heiress to survive in the wilderness as best she could. During her two years as a castaway she killed a bear, ate its meat and turned its skin into clothing. After her rescue and return to France in 1544, Marguerite created a new life for herself as an educator. The treacherous Roberval escaped punishment but was murdered by a mob in 1560.

Such stories provided ample material to Daniel Defoe, who wrote the ultimate social isolation story, “Robinson Crusoe,” in 1719. The novel has been adapted many times since, including for the 2000 film “Cast Away,” starring Tom Hanks as the Crusoe figure. Defoe based his tale in part on the real-life castaway Alexander Selkirk, a Scottish privateer who in 1704, after quarreling with his captain, was put ashore on an uninhabited island in the South Pacific, where he managed to survive until his rescue more than four years later.

Defoe himself knew something about prolonged isolation: In 1703, he was imprisoned for several months after he published a pamphlet satirizing the Church of England. Defoe’s stint in prison made him a better advocate for the social outcasts he described in his novels, just as Crusoe’s 28-year sojourn on the island made him into a better person—a man of faith and purpose rather than the malcontent of his former life. “No man is an island entire of itself,” wrote Donne; being human makes us all connected, no matter where we are.

 

The Long Fight Against Unjust Taxes

From ancient Jerusalem to the American Revolution and beyond, rebels have risen up against the burden of taxation.

March 19, 2020

The Wall Street Journal

With the world in the grip of a major health crisis, historical milestones are passing by with little notice. But the Boston Massacre, whose 250th anniversary was this month, deserves to be remembered as a cautionary tale.

ILLUSTRATION: THOMAS FUCHS

The bloody encounter on March 5, 1770, began with the harassment of a British soldier by a crowd of Bostonians. Panicked soldiers responded by firing on the crowd, leaving five dead and six wounded. The colonists were irate about new taxes imposed by the British Parliament to pay for the expenses of the Seven Years’ War, which in North America pitted the British and Americans against the French and their Indian allies. Whether or not the tax increase was justified, the failure of British leaders to include the American colonies in the deliberative process was catastrophic. The slogan “No taxation without representation” became a rallying cry for the future nation.

The attitude of tax-collecting authorities had hardly changed since ancient times, when empires treated their subject populations with greed, brutality and arrogance. In 1st-century Judea, anger over the taxes imposed by Rome combined with religious grievances to provoke a full-scale Jewish revolt in 66-73 A.D. It was an unequal battle, as most tax rebellions are, and the resisters were made to pay dearly: Jerusalem was sacked, the Second Temple was destroyed, and all Jews in the Roman Empire were forced to pay a punitive tax.

Even when tax revolts met with initial success, there was no guarantee that the authorities would carry out their promises. In 1381, a humble English roof tiler named Wat Tyler led an uprising, dubbed the Peasants’ Revolt, against a new poll tax. King Richard II met with Tyler and agreed to his demands, but only as a delaying tactic. The ringleaders were then rounded up and executed, and Richard revoked his concessions, claiming they had been made under duress.

Nevertheless, as the historian David F. Burg notes in his book “A World History of Tax Rebellions,” tax revolts have been more frequent than we realize, mainly because governments tend not to advertise them. In Germany, 210 separate protests and uprisings were recorded from 1300 to 1550, and at least 1,000 in Japan from 1600 to 1868.

The 19th century saw the rise of a new kind of tax rebel, the conscientious objector. In 1846, the writer and abolitionist Henry David Thoreau spent a night in the Concord, Mass., jail after he refused to pay a poll tax as a protest against slavery. He was released the next morning when his aunt, against his wishes, paid the tax for him. But Thoreau continued to withhold his taxes in protest against the Mexican-American War, arguing in his 1849 essay “Civil Disobedience” that it was better to go to jail than to “enable the State to commit violence and shed innocent blood.”

Irwin Schiff, a colorful antitax advocate and failed libertarian presidential candidate, wouldn’t get off so easily. Arguing that the income tax violated the U.S. Constitution, he refused to pay it, despite being convicted of tax evasion three times. In 2015, he died at age 87 in a federal prison—an ironic confirmation of Benjamin Franklin’s adage that “nothing can be said to be certain, except death and taxes.”

Fortunately for Americans in this time of national distress, tax day this year has been postponed.

Cities Built by Royal Command

From ancient Egypt to modern Russia, rulers have tried to build new capitals from the ground up.

March 5, 2020

The Wall Street Journal

In 1420, the Yongle Emperor moved China’s capital from Nanjing to the city then known as Beiping. To house his government he built the Forbidden City, the largest wooden complex in the world, with more than 70 palace compounds spread across 178 acres. Incredibly, an army of 100,000 artisans and one million laborers finished the project in only three years. Shortly after moving in, the emperor renamed the city Beijing, “northern capital.”

The monument of Peter the Great in St. Petersburg, Russia.
PHOTO: GETTY IMAGES

In the 600 years since, countless visitors have marveled at Yongle’s creation. As its name suggests, the Forbidden City could only be entered by permission of the emperor. But the capital city built around it was an impressive symbol of imperial power and social order, a kind of 3-D model of the harmony of the universe.

Beijing’s success and longevity marked an important leap forward in the history of purpose-built cities. The earliest attempts at building a capital from scratch were usually hubristic affairs that vanished along with their founders. Nothing remains of the fabled Akkad, commissioned by Sargon the Great in the 24th century B.C. following his victory over the Sumerians.

But the most vainglorious and ultimately futile capital ever constructed must surely be Akhetaten, later known as Amarna, on the east bank of the Nile River in Egypt. It was built by the Pharaoh Akhenaten around 1346 B.C. to serve as a living temple to Aten, the god of the sun. The pharaoh hoped to make Aten the center of a new monotheistic cult, replacing the ancient pantheon of Egyptian deities. In his eyes, Amarna was a glorious act of personal worship. But to the Egyptians, this hastily erected city of mud and brick was an indoctrination camp run by a crazed fanatic. Neither Akhenaten’s religion nor his city long survived his death.

In 330 B.C., Alexander the Great was responsible for one of the worst acts of cultural vandalism in history when he allowed his army to burn down Persepolis, the magnificent Achaemenid capital founded by Darius I, in revenge for the Persian destruction of Athens 150 years earlier.

Ironically, the year before he destroyed a metropolis in Persia, the Macedonian conqueror had created one in Egypt. Legend has it that Alexander chose the site of Alexandria after being visited by the poet Homer in a dream. He may also have been influenced by the advantages of the location, near the island of Pharos on the Mediterranean coast, which boasted two harbors as well as a limitless supply of fresh water. Working with the famed Greek architect Dinocrates, Alexander himself laid out the walls, city quarters and street grid. Alexandria went on to become a center of Greek and Roman civilization, famous for its library, museum and lighthouse.

No European ruler would rival the urban ambitions of Alexander the Great, let alone Emperor Yongle, until Tsar Peter the Great. In 1703, he founded St. Petersburg on the marshy archipelago where the Neva River meets the Baltic Sea. His goal was to replace Moscow, the old Russian capital, with a new city built according to modern, Western ideas. Those ideas were unpopular with Russia’s nobility, and after Peter’s death his successor moved the capital back to Moscow. But in 1732 the Romanovs transferred their court permanently to St. Petersburg, where it remained until 1917. Sometimes, the biggest urban-planning dreams actually come true.

 

Women Who Popped the Question

Tradition holds that women only propose marriage on leap days, but queens have never been afraid to take the initiative.

February 20, 2020

The Wall Street Journal

ILLUSTRATION: THOMAS FUCHS

An old tradition holds that every leap year, on Feb. 29, women may propose marriage to men without censure or stigma. Sources disagree about the origin of this privilege. One attributes it to St. Brigid, who became concerned for all the unmarried women in 5th-century Ireland and persuaded St. Patrick to grant them this relief. Another gives the credit to Queen Margaret of Scotland, who supposedly had the custom written into Scottish law in 1288.

Neither story is likely to be true: St. Brigid, if she even existed, would have been a child at the time of St. Patrick’s death, and Margaret died at the age of 7 in 1290. But around the world, there have always been a few women who exercised the usually male privilege of proposing.

In the Bible, the widowed Ruth, future great-grandmother of King David, asks her kinsman Boaz to marry her—not with words but by lying down at the foot of his bed. On the Bissagos Islands off the coast of Guinea-Bissau, women propose by offering the man of their choice a ceremonial dish of fish marinated in palm oil.

Queen Ankhesenamun of Egypt, who lived in the 14th century B.C., is believed to have made the earliest recorded marriage proposal by a woman. Based on a surviving letter in the Hittite royal archives, scholars have theorized that Ankhesenamun, the widow of the boy king Tutankhamun, secretly asked the Hittite king Suppiluliuma to agree to a match with one of his sons, so that she could avoid a forced marriage to Ay, her late husband’s grand vizier. The surprised and suspicious king eventually sent her his son Zannanza. Unfortunately, the plan leaked out and the Hittite wedding party was massacred at the Egyptian border. Ankhesenamun disappears from the historical record shortly after.

The Roman princess Justa Grata Honoria, sister of Emperor Valentinian III, had marginally better luck. In 450 she appealed to Attila, King of the Huns, to marry her, in order to escape an arranged marriage to a Roman senator. When Valentinian learned of the plan, he refused to allow it and forced Honoria to wed the senator. In retaliation, Attila launched an attack against Rome on the pretext of claiming his bride, ravaging Gaul and advancing as far as the Po River in northern Italy.

The idea that marriage was a sentimental union between two individuals, rather than an economic or strategic pact between families, gained ground in Europe in the late 18th century. Jane Austen highlighted the clash of values between generations in her 1813 novel “Pride and Prejudice”: Lady Catherine de Bourgh insists that her daughter and Mr. Darcy have been engaged since birth, while the heroine Elizabeth Bennet declares that she will marry Darcy if she chooses.

A quarter-century later, the 20-year-old Queen Victoria came down on the side of love when she chose her cousin Albert to be her husband. As a ruling monarch, Victoria had both the right and the duty to make the proposal. As she recorded in her diary on October 15, 1839, “I said to him … that it would make me too happy if he would consent to what I wished.” The 21-year marriage was one of the most successful in royal history.

Although it’s still the custom in most countries for men to propose marriage, leap year or not, there’s more to courtship than getting down on one knee. As the Irving Berlin song goes, “A man chases a girl (until she catches him).”

Plagues From the Animal Kingdom

The coronavirus is just the latest of many deadly diseases to cross over to human beings from other species.

February 6, 2020

The Wall Street Journal

Earlier this week, the still-rising death toll in mainland China from the coronavirus surpassed the 349 fatalities recorded there during the 2003 SARS epidemic. Although both viruses are believed to have originated in bats, they don’t behave in the same way. SARS spread slowly, but its mortality rate was 9.6%, compared with about 2% for the swift-moving coronavirus.

A scientist examines Ötzi, a 5,300-year-old mummy discovered in the Tyrolean Alps in 1991.
PHOTO: EURAC/MARION LAFOGLER

Statistics tell only one part of the story, however. Advances in the genetic sequencing of diseases have revealed that a vast hinterland of growth and adaptation precedes the appearance of a new disease. Cancer, for example, predates human beings themselves: Last year scientists announced that they had discovered traces of bone cancer in the fossil of a 240-million-year-old shell-less turtle from the Triassic period. This easily surpasses the oldest example of human cancer, which was found in a 1.7 million-year-old toe bone in South Africa. The findings confirm that even though cancer has all kinds of modern triggers such as radiation poisoning, asbestos and smoking, the disease is deeply rooted in our evolutionary past.

Unlike cancer, the majority of human diseases are zoonotic, meaning that they are caused by viruses, fungi, parasites or bacteria that pass between animals and people. The rise of agriculture around 10,000 years ago, which forced humans into close contact with animals, was probably the single greatest factor behind the spread of infectious disease.

Rabies was one of the earliest diseases to be recognized as having an animal origin. The law code of Eshnunna, a Mesopotamian city that flourished around 2000 B.C., mandated harsh punishments against owners of mad dogs that bit people. Lyme disease was only identified by scientists in 1975, but it too was an ancient scourge. Ötzi, the 5,300-year-old mummy who was discovered in the Tyrolean Alps with cracked ribs and an arrow wound in his shoulder, was an unlucky fellow even before he was killed. DNA sequencing in 2010 revealed that while he was alive, Ötzi was lactose intolerant, had clogged arteries and suffered from Lyme disease.

Smallpox, which was eradicated by the World Health Organization in 1979, had been one of the most feared diseases for most of human history, with a mortality rate of 30%. Those who survived were often left with severe scarring; the telltale lesions of smallpox have been identified on the mummy of Pharaoh Ramesses V, who died in 1145 B.C. The disease is caused by the variola virus, which is thought to have crossed over to human beings from an animal, likely a rodent, in prehistoric times.

Although it can’t be proved for certain, it is likely that smallpox was behind the terrible plague that killed 20% of the Athenian population in 430 B.C. The historian Thucydides, who lived through it, described the agony of those infected with red pustules, the dead bodies piled high in the temples and the scars left on the survivors. He also noticed that those who did survive acquired immunity to the disease.

Thucydides’s observation turned out to be the key to one of humanity’s greatest weapons against infectious disease, vaccination. But apart from smallpox, the only eradication programs to have made some progress have been against viruses transmitted from human to human, such as polio and measles. Meanwhile, since the 1970s more than 40 new infectious diseases have emerged from the animal realm, including HIV, swine flu and Zika. And those are just the ones we know about.

Royal Treasures, Lost and Found

From Montezuma’s gold to the crown jewels of Scotland, some of the world’s most famous valuables have gone missing.

January 23, 2020

The Wall Street Journal

What sort of nitwit goes off in a snowstorm to feed leftovers to the chickens while still wearing her Christmas Day finery? In my defense, I was just trying to share the love. Alas, I ended up sharing an antique ring along with the Brussels sprouts. Only the chickens know where it is, and they aren’t talking.

The Honours of Scotland were recovered in 1818 after being lost for decades. PHOTO: ALAMY

One doesn’t need to read J.R.R. Tolkien’s “The Lord of the Rings” to be reminded that lost treasures are often the result of epic folly. In October 1216, King John of England lost the crown jewels while leading a campaign against rebellious barons. Against all advice, John—who is chiefly remembered for being forced to sign the Magna Carta, one of the cornerstones of civil liberty—took a shortcut via the Wash, a tidal estuary on England’s east coast. He got across the causeway just in time to see the waters come rushing up behind him. The wagon train with all his supplies and baggage—including, crucially, the king’s treasury—sank without a trace. The incident has given countless British schoolchildren the joy of being able to say, “Bad King John lost his crown in the wash.”

Folly also played a starring role in the disappearance of the treasure of the Aztec Emperor Montezuma II. In 1520, the inhabitants of the Aztec city of Tenochtitlan rose up against the occupying Spanish forces led by Hernán Cortés. By July 1, the situation was so critical that the outnumbered conquistadors attempted a midnight escape from the city. Hampered by their haul of plunder, however, the Spanish were too slow in crossing Lake Texcoco. Unable to run or fight, the desperate conquistadors abandoned their greed and tossed the treasures into the water. Despite losing half his men on what he called “La Noche Triste,” the Night of Sorrows, Cortés captured the Aztec capital a year later. But he never found the lost gold.

It was a case of absent-mindedness that led to the disappearance of the Scottish royal sword, scepter and crown, known collectively as the Honours of Scotland. Having been successfully hidden during the Interregnum, England’s brief experiment with republicanism in 1649-60, the Honours were returned to Edinburgh Castle for safekeeping. Too safe, it turned out: No one could remember where they were. But the story has a happy ending. In 1818, the Scottish novelist Sir Walter Scott received permission to conduct his own search of the castle. He found the Honours in a locked storeroom, inside a trunk packed with linens.

Occasionally, royal treasures have been lost on purpose. One of the last rulers to be buried with his jewels was Emperor Tu Duc of Vietnam (1829-83). To outwit potential grave robbers, he left orders that he should not be buried in his elaborate official tomb but in a secret location; to ensure it stayed secret, the laborers who interred him were executed. In 1913, Georges Mahé, the French colonial administrator of the Vietnamese city of Hue, provoked a national outcry after he dug up Tu Duc’s official tomb in the hope of finding the hidden jewels. The French swiftly apologized and Mahé was sacked.

Tu Duc’s treasure remains lost, but it may not stay that way forever. Earlier this month, scientists in Mexico City confirmed that a gold bar found on a construction site is one of the ingots discarded by Cortés and his fleeing conquistadors almost exactly 500 years ago.

 

The Blessing and Curse of ‘Black Gold’

From the pharaohs to John D. Rockefeller, oil has been key to the growth of civilization—but it comes at a high cost.

January 10, 2020

The Wall Street Journal

This January marks the 150th anniversary of the Standard Oil Company, incorporated in 1870 by John D. Rockefeller and three partners. Such was their drive and ruthlessness that within a decade Standard Oil became a vast monopoly, controlling over 90% of America’s oil refineries.

Spindletop Hill in Beaumont, Texas, was the site of the first Texas oil gusher in 1901. PHOTO: TEXAS ENERGY MUSEUM/NEWSMAKERS/GETTY IMAGES

Standard Oil’s tentacle-like grip on U.S. commerce was finally pried loose in 1911, when the Supreme Court broke it up into 33 separate companies. But this victory didn’t put an end to the nefarious activities surrounding “black gold.” In the 1920s, tycoon Edward Doheny was drawn into the Teapot Dome scandal after he gave a $100,000 bribe to Secretary of the Interior Albert Fall. Doheny served as the inspiration for the corrupt and blood-soaked tycoon J. Arnold Ross in Upton Sinclair’s 1927 novel “Oil!” (which in turn inspired the Oscar-winning 2007 film “There Will Be Blood”).

Though clearly responsible for a great many evils, oil has also been key to the growth of human civilization. As the Bible attests, bitumen—a naturally occurring liquid found in oil sands and pitch lakes—was used in ancient times for waterproofing boats and baskets. It also played an important role in Egyptian burial practices: The word “mummy” is derived from mumiyyah, the Arabic word for bitumen.

Over the centuries, oil proved useful in a variety of ways. As early as the fourth century, the Chinese were drilling for oil with bamboo pipes and burning it as heating fuel. In Central Eurasia it was a treatment for mange in camels. By the ninth century, Persian alchemists had discovered how to distill oil into kerosene to make light. The oil fields of medieval Baku, in today’s Azerbaijan, brought trade and culture to the region, rather than political oppression and underdevelopment, as is often the case in oil-rich countries today.

The drilling of the first commercial oil well in the U.S., in Pennsylvania in 1859, brought a raft of benefits. In the 19th century, an estimated 236,000 sperm whales were killed to make oil for lamps. The whaling industry died overnight once Standard Oil began marketing a clean-smelling version of kerosene. Plentiful oil also made the automobile industry possible. In 1901, when a massive oil gusher erupted at Spindletop, Texas, there were 14,800 cars in the U.S.; two decades later, there were 8.5 million.

After World War II, the world’s oil supply was dominated not by private companies like Standard Oil but by global alliances such as OPEC. When OPEC nations declared an embargo in 1973, the resulting crisis caused the price of oil to climb nearly 400%.

At the time, the U.S. depended on foreign suppliers for 36% of its oil supply. Last month, the U.S. Energy Information Administration announced that, thanks to new technologies such as hydraulic fracturing, the country had become a net exporter of oil for the first time in 75 years.

Though helpful geopolitically, America’s oil independence doesn’t solve the environmental problems caused by carbon emissions. Ironically, some of John D. Rockefeller’s own descendants, aided by the multibillion-dollar fortune he bequeathed, are now campaigning against Exxon Mobil—one of the 33 Standard Oil offshoots—over its record on climate change.

Whiskey Is the Original ‘Cup of Kindness’

The barley fields of Scotland and Ireland gave birth to a drink that became popular around the world.

The Wall Street Journal, December 27, 2019

On New Year’s Eve, the song “Auld Lang Syne” urges us to “take a cup of kindness.” It’s an old Scottish saying, meaning to share a friendly tipple—presumably of single malt whisky.

Nowadays there are whisky (Scottish, with no e), whiskey (Irish and North American), rye whiskey (North American) and bourbon (American), but no matter what the drink is called, the method for making it is essentially the same. Like beer, its ancient precursor, whiskey is made with water, grains and yeast. It’s the distillation process that gives whiskey its higher alcohol content. For that we must thank Maria Hebraea (Mary the Jewess), an alchemist who worked in Alexandria around the 2nd century and whose celebrated inventions include the distillation pot.

ILLUSTRATION: THOMAS FUCHS

In the boggy moors of Scotland and Ireland, where never a grape will grow but barley is plentiful, medieval monks learned how to make a whiskey fiery enough to take a man down if he wasn’t careful. The Annals of Clonmacnoise, a 15th-century chronicle of early Irish history, records that in 1405 the clan chieftain Richard MaGranell drank a “surfeit” of whiskey over Christmas and died.

From the 17th century onward, Scotch-Irish emigrants to the New World brought their distillation techniques with them. But it was an Englishman, the Jamestown colonist George Thorpe, who learned that whiskey could be made just as easily with Indian corn. Thorpe was killed in the 1622 Powhatan massacre, but his inventiveness helped to create the American love affair with bourbon, which is made from corn and aged in charred new oak barrels.

Back in the mother country, the punitive Malt Tax of 1725 drove the British whiskey industry underground. Scottish distilleries became secretive nighttime operations, which is how the nickname “moonshine” came to be. A similar whiskey tax levied by the U.S. government in 1791, to pay down the country’s Revolutionary War debt, met with fierce resistance, sparking the so-called Whiskey Rebellion among Pennsylvania farmers. Whiskey production was a lucrative business, so much so that in retirement George Washington turned his Mount Vernon plantation into the country’s largest distillery.

British visitors to the U.S. during the 19th century were delighted by the myriad ways that bourbon could be enjoyed. In 1842, Charles Dickens passed a night in Baltimore drinking mint juleps with Washington Irving; it was, he wrote, “among the most memorable of my life.” American tourists were equally impressed with British know-how around a single malt. In 1874, Mark Twain wrote to his wife Olivia, “Ever since I have been in London I have taken in a wine glass what is called a cock-tail.” The ingredients consisted of “a bottle of scotch whisky, a lemon, some crushed sugar, and a bottle of Angostura bitters.”

Whiskey’s reputation as the liquor of choice for Prohibition Era bootleggers and tortured geniuses like the novelist William Faulkner made it deeply unfashionable among the vodka-drinking MTV generation. But there’s a revival under way, aided by millennials’ love of cocktail culture. In October, a bottle of 1926 Macallan single malt whisky smashed all auction records for a wine or spirit, selling for $1.9 million. That’s $76,000 per ounce of kindness, my dear, or $158,333 a cocktail.

Electric Lights for Yuletide

In 1882, Thomas Edison’s business partner put up a Christmas tree decorated with 80 red, white and blue bulbs—and launched an American tradition.

The Wall Street Journal, December 5, 2019

As a quotation often attributed to Maya Angelou has it, “You can tell a lot about a person by the way (s)he handles these three things: a rainy day, lost luggage and tangled Christmas tree lights.” I’m not sure what it says about me that I actually look forward to getting my hands on the latter. My house is so brightly decorated with energy-saving LEDs that it could double as a landing beacon on a foggy night. It’s the one thing I really missed when I lived abroad—no other country does Christmas lights like America. More than 80 million households put up lighting displays each December, creating a seasonal spike in U.S. energy use that’s bigger than the annual consumption of some small countries.

Holiday lights in Brooklyn, 2015. PHOTO: GETTY IMAGES

Hanging festive lighting during the winter solstice is an ancient practice, but the modern version owes its origins to Thomas Edison and his business partner Edward H. Johnson. Edison perfected the first fully functional lightbulb in 1879. For Christmas the following year, he strung up lights outside his Menlo Park laboratory—partly to provide good cheer but mostly to advertise the benefits of electrification. Johnson went a step further in 1882, placing a Christmas tree decorated with 80 red, white and blue blinking lightbulbs on a revolving turntable in his parlor window in Manhattan.

Johnson repeated the display every year, much to the delight of New Yorkers, striving to make it bigger and better each time. He thus founded the other great American tradition: the competitive light display. The first person to take up Johnson’s challenge was President Grover Cleveland, who in 1894 erected an enormous multi-light Christmas tree in the White House, thereby starting a new presidential tradition.

The initial $300 price tag for an electrified Christmas tree (about $2,000 today) put it beyond the reach of the average consumer. The alternative was clip-on candles, but they were so hazardous that by 1910 most home insurance policies contained a nonpayment clause for house fires caused by candlelit Christmas trees. Although it was possible to rent electric Christmas lights for the season, and the General Electric Company was beginning to produce easy-to-assemble kits, the stark difference between lit and unlit homes threatened to become a powerful symbol of social inequality.

Fortunately, a New Yorker named Emilie D. Lee Herreshoff was on the case. She persuaded the city council to allow an electrified Christmas tree to be put up in Madison Square Park. The inaugural tree-lighting celebration, on December 24, 1912, generated so much public enthusiasm that within two years over 300 cities and towns were holding similar ceremonies.

Not content with just one festive tree, in 1920 the community of Altadena, Calif., agreed to light up the 150 mature evergreens lining Santa Rosa Avenue, leading to its nickname “Christmas Tree Lane.” This was quite a feat of electrical engineering, given that outdoor Christmas lights didn’t become commercially available until 1927.

To encourage buyers, GE began sponsoring local holiday lighting contests, unleashing a competitive spirit each Yuletide that only seems to have grown stronger with the passing decades. Since 2014, the Guinness World Records title for the most lights on a private residence has been held by the Gay family of LaGrangeville, N.Y., with strong competition from Australia. To which I say, “God bless them, every one.”

‘Sesame Street’ Wasn’t the First to Make Learning Fun

The show turns 50 this month, but the idea that education can be entertaining goes back to ancient Greece.

The Wall Street Journal, November 21, 2019

“Sesame Street,” which first went on the air 50 years ago this month, is one of the most successful and cost-effective tools ever created for preparing preschool tots for the classroom. Now showing in 70 languages in more than 150 countries around the world, “Sesame Street” is that rare thing in a child’s life: truly educational entertainment.

The Muppets of ‘Sesame Street’ in the 1993-94 season. PHOTO: EVERETT COLLECTION

Historically, those two words have rarely appeared together. In the 4th century B.C., Plato and Aristotle agreed that children can learn through play. In “The Republic,” Plato went so far as to advise, “Do not use compulsion, but let early education be a sort of amusement.” Unfortunately, his advice failed to catch on.

In Europe during the Middle Ages, play and learning were almost diametrically opposed. Monks were in charge of boys’ education, which largely consisted of Latin grammar and religious teaching. (Girls learned domestic skills at home.) The invention of the printing press around 1440 helped spread literacy among young readers, but the works written for them weren’t exactly entertaining. A book like “A token for children: Being an exact account of the conversion, holy and exemplary lives, and joyful deaths of several young children,” by the 17th-century English Puritan James Janeway, surely didn’t follow Plato’s injunction to be amusing as well as instructional.

Social attitudes toward children’s entertainment changed considerably, however, in the wake of the English philosopher John Locke’s 1693 treatise “Some Thoughts Concerning Education.” Locke followed Plato’s line on education, writing, “I always have had a fancy that learning might be made a play and recreation to children.” The publisher John Newbery heeded Locke’s advice; in 1744, he published “A Little Pretty Pocket-Book Intended for the Instruction and Amusement of Little Master Tommy and Pretty Miss Polly,” which was sold along with a ball for boys and a pincushion for girls. In the introduction, Newbery promised parents and guardians that the book would not only make their children “strong, healthy, virtuous, wise” but also “happy.”

When it came to early children’s television in the U.S., however, “play and recreation” usually squeezed out educational content. Many popular shows existed primarily to sell toys and products: “Howdy Doody,” the pioneering puppet show that ran on NBC from 1947 to 1960, was sponsored by RCA to pitch color televisions. Parents became so indignant over the exploitation of their children by the TV industry that, in 1968, grass-roots activists started the nonprofit Action for Children’s Television, which petitioned the Federal Communications Commission to ban advertising on children’s programming.

This cultural mood led to the birth of “Sesame Street.” The show’s co-creators, Joan Ganz Cooney and Lloyd Morrisett, were particularly devoted to using TV to combat educational inequality in minority communities. They spent three years working with teachers, child psychologists and Jim Henson’s Muppets to get the right mix of education and entertainment. The first episode, broadcast on public television stations on Nov. 10, 1969, introduced the world to Big Bird, Bert and Ernie, Oscar the Grouch, and their cast of multiethnic friends and neighbors. “You’re gonna love it,” says Gordon, one of the show’s human characters, welcoming a young newcomer named Sally in the opening lines. And we have.