Historically Speaking: Diabetes and the Miracle of Insulin

One hundred years ago, a team of Canadian researchers showed that an age-old disease didn’t have to mean a death sentence.

The Wall Street Journal

July 22, 2021

The human body runs on glucose, a type of sugar that travels through the bloodstream to the cells, where it is converted into energy. Some 34.2 million Americans are diabetic; their bodies cannot produce, or cannot properly use, the hormone insulin, which regulates how glucose is processed and stored in the cells. Without treatment the condition is terminal. The discovery of insulin a century ago this year was one of the great medical breakthroughs of the 20th century.

Diabetes was first recognized some 4,000 years ago. The Ebers Papyrus, an Egyptian medical text written around 1550 B.C., refers to patients suffering from thirst, frequent urination and weight loss. An ancient Indian text, the Sushruta Samhita, composed after the 7th century B.C., advised testing for diabetes by seeing whether ants were attracted to the sugar in the urine.

The copious urination experienced by sufferers is probably why the ancient Greeks called it “diabetes,” from their word for “siphon” or “to pass through.” They also made the link to lifestyle: The 5th-century B.C. physician Hippocrates, the “father of medicine,” advocated exercise as part of the treatment.

Early on, the Chinese recognized that an unbalanced diet of sweet, rich and fatty foods could play a role. Lady Dai, a minor aristocrat who died in the 2nd century B.C., was a textbook case. Her perfectly preserved mummy, discovered in southern China in 1971, revealed a life of dietary excess. She also suffered the consequences: osteoarthritis, high cholesterol, hypertension, liver disease, gall stones and, crucially, diabetes.

Over time, physicians became more expert at diagnosis. The role of sugar came into sharper focus in the 1770s after the English doctor Matthew Dobson discovered that it stayed in the blood as well as urine. But no further progress was made until the end of the 19th century. In 1889 Oskar Minkowski conducted experiments on dogs at the University of Strasbourg to prove that a nonfunctioning pancreas triggered diabetes.

ILLUSTRATION: THOMAS FUCHS

By the early 20th century, scientists knew that a pancreatic secretion was responsible for controlling glucose in the body, but they couldn’t isolate it. The Canadian researchers credited with finding the answer—John Macleod, Frederick Banting, Charles Best and James Collip—were an unlikely team. Banting was a surgeon, not a scientist. Yet he sufficiently impressed Macleod, a professor of physiology at the University of Toronto, that the latter lent him lab space and a research assistant, Best. The pairing almost ended in a fist fight, but Banting and Best got over their differences and in July 1921 successfully injected insulin into a dog.

Macleod then brought in Collip, a biochemist on a research sabbatical, to help make a human-compatible version. This led to more infighting and Collip threatened to leave. The dysfunctional group somehow held together long enough to try their insulin on a 14-year-old named Leonard Thompson. He lived another 13 years.

The research team sold their patent rights to the University of Toronto for $1, believing insulin was too vital to be exploited. Their idealism was betrayed: Today, manufacture of the drug is controlled by three companies, and according to a 2018 Yale study published in JAMA, its high cost forces 1 in 4 American diabetes patients to skimp on their medication. The next frontier for insulin is finding a way to make it affordable for all.

Historically Speaking: The Beacon of the Public Library

Building places for ordinary people to read and share books has been a passion project of knowledge-seekers since before Roman times.

The Wall Street Journal

July 8, 2021

“The libraries are closing forever, like tombs,” wrote the historian Ammianus Marcellinus in 378 A.D. The Goths had just defeated the Roman army in the Battle of Adrianople, marking what is generally thought to be the beginning of the end of Rome.

His words echoed in my head during the pandemic, when U.S. public libraries closed their doors one by one. By doing so they did more than just close off community spaces and free access to books: They dimmed one of the great lamps of civilization.

Kings and potentates had long held private libraries, but the first open-access version came about under the Ptolemies, the Macedonian rulers of Egypt from 305 to 30 B.C. The idea was the brainchild of Ptolemy I Soter, who inherited Egypt after the death of Alexander the Great, and the Athenian governor Demetrius Phalereus, who fled there following his ouster in 307 B.C. United by a shared passion for knowledge, they set out to build a place large enough to store a copy of every book in the world. The famed Library of Alexandria was the result.

ILLUSTRATION: THOMAS FUCHS

Popular myth holds that the library was accidentally destroyed when Julius Caesar’s army set fire to a nearby fleet of Egyptian boats in 48 B.C. In fact, the library eroded through many years of institutional neglect. Caesar was himself responsible for introducing the notion of public libraries to Rome. These repositories became so integral to the Roman way of life that even the public baths had libraries.

Private libraries endured the Dark Ages better than public ones. The Al-Qarawiyyin Library and University in Fez, Morocco, founded in 859 by the great heiress and scholar Fatima al-Fihri, survives to this day. But the celebrated Abbasid library, Bayt al-Hikmah (House of Wisdom), in Baghdad, which served the entire Muslim world, did not. In 1258 the Mongols sacked the city, slaughtering its inhabitants and dumping hundreds of thousands of the library’s books into the Tigris River. The mass of ink reportedly turned the water black.

Many centuries after Soter and Demetrius built the Library of Alexandria, Renaissance Florence benefited from a similar partnership between Cosimo de’ Medici and the scholar Niccolò de’ Niccoli. At Niccoli’s death in 1437, Cosimo carried out his friend’s wishes to bequeath his books to the people. Not only was the magnificent Biblioteca San Marco Italy’s first purpose-built public library, but its emphasis on reading spaces and natural light became the template for library architecture.

By the end of the 18th century, libraries could be found all over Europe and the Americas. But most weren’t places where the public could browse or borrow for free. Even Benjamin Franklin’s Library Company of Philadelphia, founded in 1731, required its members to subscribe.

The citizens of Peterborough, New Hampshire, started the first free public library in the U.S. in 1833, voting to tax themselves to pay for it, on the grounds that knowledge was a civic good. Many philanthropists, including George Peabody and John Jacob Astor, took up the cause of building free libraries.

But the greatest advocate of all was the steel magnate Andrew Carnegie. Determined to help others achieve an education through free libraries—just as he had done as a boy—Carnegie financed the construction of some 2,509 of them, with 1,679 spread across the U.S. He built the first in his hometown of Dunfermline, Scotland, in 1883. Carved over the entrance were the words “Let There Be Light.” It’s a motto to keep in mind as U.S. public libraries start to reopen.

The Sunday Times: Rumsfeld was the wrong man at the wrong time

Bush’s war supremo brought about his own worst fear: another Vietnam

July 4, 2021

On the whole, sacked US defence secretaries should avoid quoting Winston Churchill as they depart from the White House, in much the same way as disgraced preachers should leave off quoting Jesus as they are led away in handcuffs. Donald Rumsfeld, who died aged 88 last week, was the chief architect of the American invasions of Afghanistan and Iraq. The aftermath was considered a failure, however, and President George W Bush reversed course after Rumsfeld’s departure in December 2006, sending in extra troops to stop Iraq’s descent into civil war. Rumsfeld handled being fired with his customary mix of puckish humour and pugnacious bravado. In his final speech he paraphrased Churchill, saying: “I have benefited greatly from criticism and at no time have I suffered a lack thereof.”

The last bit was true then, and it continued to be until his death. Critics of Rumsfeld’s refusal to commit sufficient fighting troops in either Afghanistan or Iraq could fill a football pitch. (The first of many damning appraisals appeared in 2007, entitled Rumsfeld: An American Disaster.) But the claim that he benefited from criticism was so laughable to anyone who knew him that it only highlighted the catastrophic deficiencies of the man. Rumsfeld was incapable of listening to anyone who didn’t already agree with him. He was the wrong man for the job at the most inopportune time in America’s history.

As several obituaries of Rumsfeld pointed out, his first stint as defence secretary, under Gerald Ford in 1975-77, happened under arguably worse circumstances. A survivor from Richard Nixon’s administration, where he stood out for his unwavering commitment to civil rights, Rumsfeld was the White House chief of staff during the last days of Saigon in April 1975. Appointed defence secretary shortly afterwards, Rumsfeld regarded it as his mission to keep the military ready and competitive but essentially inactive. This wasn’t cowardice but good Cold War strategy.

Rumsfeld’s reputation as a strategic thinker was subsequently borne out by his wildly successful transition to the business world. He was also a clear, no-nonsense communicator, whose fondness for aphorisms and golden rules became the stuff of legend. When Rumsfeld left the White House for the first time, he bequeathed a memorandum of best practices, Rumsfeld’s Rules, 11 of which were specific to the secretary of defence. They began with the reminder: “The secretary of defence is not a super general or admiral. His task is to exercise civilian control over the department for the commander-in-chief [the president] and the country”, and included such important pieces of advice as: “Establish good relations between the departments of Defence, State, the National Security Council, CIA and the Office of Management and Budget.”

When Rumsfeld returned to the White House in 2001, aged 68, he broke almost every one of his own rules. Not only did he allow relations between the departments to become utterly dysfunctional but he so alienated the generals and joint chiefs of staff that it was widely assumed he “hated” the military. The serving chairman of the joint chiefs, General Hugh Shelton, complained that Rumsfeld operated “the worst style of leadership I witnessed in 38 years of service”. Rumsfeld treated all soldiers as boneheaded grunts, and the generals simply as boneheaded grunts with the medals to prove it.

His planned military transformations suffered from an overly technocratic mentality. He believed that the army was costly and lacked efficiency — what army doesn’t? — as though bottom lines apply equally in business and fighting. Rumsfeld wanted to remake the military as one that relied more on air power and technical advantages and less on soldiers with guns. The charge against him is that he eviscerated the military’s ground capabilities and reduced its fighting numbers at precisely the moment both were paramount to American success in Afghanistan and Iraq.

What was going through Rumsfeld’s mind? Millions of words have been spilt in the effort to explain why the defence secretary doggedly pursued a losing policy in the teeth of protests from the military. In his last year in the job six retired generals publicly rebuked him. Part of the answer lies in his character: he was a micromanager who failed to engage, a bureaucrat who despised professionals, an aggressor who disliked to fight. But the real driver of his actions can be traced back to 1975. More than anything else, even more than winning perhaps, he wanted to avoid a repeat of the Vietnam War, with its “limited war” rhetoric that was belied by the fact it was the first of what the US media have dubbed America’s “forever wars” — metastasising conflicts without a clear end in sight.

Rumsfeld emerged from his first period in the White House blaming the military for having misled successive administrations into committing themselves to an unwinnable and unpopular war. Hence, he believed that nothing the military said could be taken at face value. He was not going to allow himself to be taken prisoner by the top brass. Unlike Robert McNamara, the defence secretary most associated with US involvement in Vietnam, Rumsfeld was determined to stick to quick, in-and-out military objectives. There would be no quagmires, mass body bags, forever wars or attempts at nation-building on his watch.

Yet he was a prisoner all the same. Even though the causes and conditions were different, the Vietnam War remained the lens through which Americans judged the Iraq war. A few months after the coalition’s initial victory in Iraq in 2003, Senator John McCain, who spent five years as a PoW in Vietnam, warned the Bush administration that stopping the army doing its job, as interpreted by McCain, would risk “the most serious American defeat on the global stage since Vietnam”. By 2004, Democrats were saying: “Iraq is George Bush’s Vietnam”. Rumsfeld ended up being directly compared to McNamara, even though, by winding down, rather than ratcheting up, the US presence in the Middle East, he was doing the opposite of his predecessor. A 2005 column in the Washington Post intoned: “Just as Vietnam became McNamara’s war, Iraq has become Rumsfeld’s war”.

The successful troop “surge” in 2008 appeared to erase Rumsfeld’s Vietnam legacy. Only it didn’t. Barack Obama’s foreign policy — summed up as “leading from behind” — was Rumsfeldian in its horror of American military entanglement in foreign affairs. The Trump administration’s was even more so. More than six trillion war dollars later, with countless lives lost, if Rumsfeld leaves anything behind, it’s the lesson that America’s forever war syndrome is a state of mind, not just a reality.

Historically Speaking: How the Office Became a Place to Work

Employees are starting to return to their traditional desks in large shared spaces. But centuries ago, ‘office’ just meant work to be done, not where to do it.

The Wall Street Journal

June 24, 2021

Wall Street wants its workforce back in the office. Bank of America, Morgan Stanley and Goldman Sachs have all let employees know that the time is approaching to exchange pajamas and sweats for less comfortable work garb. Some employees are thrilled at the prospect, but others waved goodbye to the water cooler last year and have no wish to return.

Contrary to popular belief, office work is not a beastly invention of the capitalist system. As far back as 3000 B.C., the temple cities of Mesopotamia employed teams of scribes to keep records of official business. The word “office” comes from the Latin officium, meaning a position or duty, itself an amalgamation of ob and facere, literally “toward doing.” Geoffrey Chaucer was the first writer known to use “office” to mean an actual place, in “The Canterbury Tales” in 1395.

In the 16th century, the Medicis of Florence built the Uffizi, now famous as a museum, for conducting their commercial and political business (the name means “offices” in Italian). The idea didn’t catch on in Europe, however, until the British began to flex their muscles across the globe. When the Royal Navy outgrew its cramped headquarters, it commissioned a U-shaped building in central London originally known as Ripley Block and later as the Old Admiralty building. Completed in 1726, it is credited with being the U.K.’s first purpose-built office.

Three years later, the East India Company began administering its Indian possessions from gleaming new offices in Leadenhall Street. The essayist and critic Charles Lamb joined the East India Company there as a junior clerk in 1792 and stayed until his retirement, but he detested office life, calling it “daylight servitude.” “I always arrive late at the office,” he famously wrote, “but I make up for it by leaving early.”

A scene from “The Office,” which reflected the modern ambivalence toward deskbound work.
PHOTO: CHRIS HASTON/NBC/EVERETT COLLECTION

Not everyone regarded the office as a prison without bars. For women it could be liberating. An acute manpower shortage during the Civil War led Francis Elias Spinner, the U.S. Treasurer, to hire the government’s first women office clerks. Some Americans were scandalized by the development. In 1864, Rep. James H. Brooks told a spellbound House that the Treasury Department was being defiled by “orgies and bacchanals.”

In the late 19th century, the inventions of the light bulb and elevator were as transformative for the office as the telephone and typewriter: More employees could be crammed into larger spaces for longer hours. Then in 1911, Frederick Winslow Taylor published “The Principles of Scientific Management,” which advocated a factory-style approach to the workplace with rows of desks lined up in an open-plan room. “Taylorism” inspired an entire discipline devoted to squeezing more productivity from employees.

Sinclair Lewis’s 1917 novel, “The Job,” portrayed the office as a place of opportunity for his female protagonist, but he was an outlier among writers and social critics. Most fretted about the effects of office work on the souls of employees. In 1955, Sloan Wilson’s “The Man in the Gray Flannel Suit,” about a disillusioned war veteran trapped in a job that he hates, perfectly captured the deep-seated American ambivalence toward the office. Modern television satires like “The Office” show that the ambivalence has endured—as do our conflicted attitudes toward a post-pandemic return to office routines.

Historically Speaking: Whistleblowing’s Evolution, From Rome to the Pentagon Papers to Wikileaks

The exposure 50 years ago of government documents about the Vietnam War ushered in a modern era of leaks, built on a long tradition

The Wall Street Journal

June 12, 2021

The Pentagon Papers—a secret Defense Department review of America’s involvement in the Vietnam War—became public 50 years ago next week. The ensuing Supreme Court case guaranteed the freedom of the press to report government malfeasance, but the U.S. military analyst behind the revelation, Daniel Ellsberg, still ended up being prosecuted for espionage. Luckily for him, the charges were dropped after the trial became caught up in the Watergate scandal.

The twists and turns surrounding the Pentagon Papers have a uniquely American flavor to them. At the time, no other country regarded whistleblowing as a basic right.

The origins of whistleblowing are far less idealistic. The idea is descended from Roman “Qui Tam” laws, from a Latin phrase meaning “he who sues for himself does also for the king.” The Qui Tam laws served a policing function by giving informers a financial incentive to turn in wrong-doers. A citizen who successfully sued over malfeasance was rewarded with a portion of the defendant’s estate.

Daniel Ellsberg, left, testifying before members of Congress on July 28, 1971, several weeks after the publication of the Pentagon Papers.
PHOTO: BETTMANN ARCHIVE/GETTY IMAGES

Anglo-Saxon law retained a crude version of Qui Tam. At first primarily aimed at punishing Sabbath-breakers, it evolved into whistleblowing against corruption. In 1360, the English monarch Edward III resorted to Qui Tam-style laws to encourage the reporting of jurors and public officials who accepted bribes.

Whistleblowers could never be sure that those in power wouldn’t retaliate, however. The fate of two American sailors in the Revolutionary War, Richard Marven and Samuel Shaw, was a case in point. The men were imprisoned for libel after they reported the commander of the navy, Esek Hopkins, for a string of abuses, including the torture of British prisoners of war. In desperation they petitioned the Continental Congress for redress. Eager to assert its authority, the Congress not only paid the men’s legal bill but also passed what is generally held to be the first whistleblower-protection law in history. The law was strengthened during the Civil War via the False Claims Act, to deter the sale of shoddy military supplies.

These early laws framed such actions as an expression of patriotism. The phrase “to blow the whistle” only emerged in the 1920s, but by then U.S. whistleblowing culture had already reined in the corporate behemoth Standard Oil. In 1902, a clerk glanced over some documents that he had been ordered to burn, only to realize they contained evidence of wrongdoing. He passed them to a friend, and they reached the journalist Ida Tarbell, forming a vital part of her exposé of Standard Oil’s monopolistic abuses.

During World War II, the World Jewish Congress requested special permission from the government to ransom Jewish refugees in Romania and German-occupied France. A Treasury Department lawyer named Josiah E. DuBois Jr. discovered that State Department officials were surreptitiously preventing the money from going abroad. He threatened to go public with the evidence, forcing a reluctant President Franklin Roosevelt to establish the War Refugee Board.

Over the past half-century, the number of corporate and government whistleblowers has grown enormously. Nowadays, the Internet is awash with Wikileaks-style whistleblowers. But in contrast to the saga of the Pentagon Papers, which became a turning point in the Vietnam War and concluded with Mr. Ellsberg’s vindication, it’s not clear what the release of classified documents by Julian Assange, Chelsea Manning and Edward Snowden has achieved. To some, the three are heroes; to others, they are simply spies.

Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that’s now the subject of a debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If that happens, it would be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. Their anomalous position came to a head in 1857 after a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, and while the owner was a citizen, he wasn’t the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker’s attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.

Historically Speaking: The Winning Ways of Moving the Troops

Since the siege of Troy, getting armed forces into battle zones quickly and efficiently has made a decisive difference in warfare

The Wall Street Journal

May 6, 2021

The massing of more than 100,000 Russian soldiers at Ukraine’s border in April was an unambiguous message to the West: President Putin could dispatch them at any moment, if he chose.

How troops move into battle positions is hardly the stuff of poetry. Homer’s “The Iliad” begins with the Greeks having already spent 10 years besieging Troy. Yet the engine of war is, quite literally, the ability to move armies. Many scholars believe that the actual Trojan War may have been part of a larger conflict between the Bronze Age kingdoms of the Mediterranean and a maritime confederacy known as the Sea Peoples.

The identity of these seafaring raiders is still debated, but their means of transportation is well-attested. The Sea Peoples had the largest and best fleets, allowing them to roam the seas unchecked. The trade network of the Mediterranean collapsed beneath their relentless attacks. Civilization went backward in many regions; even the Greeks lost the art of writing for several centuries.

ILLUSTRATION: THOMAS FUCHS

The West recovered and flourished until the fifth century, when the Romans were overwhelmed by the superior horse-borne armies of the Vandals. Their Central European horses, bred for strength and stamina, transformed the art of warfare, making it faster and more mobile. The invention of the stirrup, the curb bit, and finally the war saddle made mobility an effective weapon in and of itself.

Genghis Khan understood this better than any of his adversaries. His mounted troops could cover up to 100 miles a day, helping to stretch the Mongol empire from the shores of eastern China to the Austrian border. But horses need pasture, and Europe’s climate between 1238 and 1242 was excessively wet. Previously fertile plains became boggy marshes. The first modern invasion was stopped by rain.

Bad weather continued to provide an effective defense against invaders. Napoleon entered Russia in 1812 with a force of over 500,000. An unseasonably hot summer followed by an unbearably cold winter killed off most of his horses, immobilizing the cavalry and the supply wagons that would have prevented his army from starving. He returned with fewer than 20,000 men.

The reliance on pack animals for transport meant that until the Industrial Revolution, armies were no faster than their Roman counterparts. The U.S. Civil War first showed how decisive railroads could be. In 1863 the Confederate siege of Chattanooga, Tenn., was broken by 23,000 Federal troops who traveled over 1,200 miles across seven states to relieve Union forces under General William Rosecrans.

The Prussians referred to this kind of troop-maneuvering warfare as Bewegungskrieg, or war of movement, using it to crushing effect against the less-mobile French in the Franco-Prussian War. In the early weeks of World War I, France still struggled to mobilize; Gen. Joseph S. Gallieni, the military governor of Paris, famously resorted to commandeering Renault taxicabs to ferry soldiers to the Battle of the Marne.

The Germans started World War II with their production capacity lagging that of the Allies; they compensated by updating Bewegungskrieg to what became known as Blitzkrieg, or lightning war, which combined speed with concentrated force. They overwhelmed French defenses in six weeks.

In the latter half of the 20th century, troop transport became even more inventive, if not decisive. Most of the 2.7 million U.S. soldiers sent into the Vietnam War were flown commercial. (Civilian air stewardesses flying over combat zones were given the military rank of Second Lieutenant.)

Although future conflicts may be fought in cyberspace, for now, modern warfare means mass deployment. Winning still requires moving.

Historically Speaking: The Tragedy of Vandalizing the Past

The 20th anniversary of the destruction of the Bamiyan Buddhas in Afghanistan reminds us of the imperative of historical preservation

April 15, 2021

Twenty years ago this spring, the Taliban completed their obliteration of Afghanistan’s 1,500-year-old Buddhas of Bamiyan. The colossal stone sculptures had survived major assaults in the 17th and 18th centuries by the Mughal emperor Aurangzeb and the Persian king Nader Afshar. Lacking sufficient firepower, both gave up after partly defacing the monuments.

The Taliban’s methodical destruction recalled the calculated brutality of ancient days. By the time the Romans were finished with Carthage in 146 B.C., the entire city had been reduced to rubble. They were given a taste of their own medicine in 455 A.D. by Genseric, King of the Vandals, who stripped Rome bare in two weeks of systematic looting and destruction.

One of the Buddhas of Bamiyan in 1997, before their destruction.
PHOTO: ALAMY

As in other vanquished cities, Rome’s buildings became a source of free material. Emperor Constans II of Byzantium blithely stole the Pantheon’s copper roofing in the mid-7th century; a millennium later, Pope Urban VIII appropriated its bronze girders for Bernini’s baldacchino over the high altar in St. Peter’s Basilica.

When not dismantled, ancient buildings might be repurposed by new owners. Thus Hagia Sophia Cathedral became a mosque after the Ottomans captured Constantinople, and St. Radegund’s Priory was turned into Jesus College at Cambridge University on the orders of King Henry VIII.

The idea that a country’s ancient heritage forms part of its cultural identity took hold in the wake of the French Revolution. Incensed by the Jacobins’ pillaging of churches, Henri Grégoire, the Constitutional Bishop of Blois, coined the term vandalisme. His protest inspired the novelist Victor Hugo’s efforts to save Notre Dame. But the architect chosen for the restoration, Eugène Emmanuel Viollet-le-Duc, added touches of his own, including the central spire that fell when the cathedral’s roof burned in 2019, spurring controversy over what to restore. Viollet-le-Duc’s interpolations set off a fierce debate, led by the English art critic John Ruskin, about what constitutes proper historical preservation.

Ruskin inspired people to rethink society’s relationship with the past. There was uproar in England in 1883 when the London and South Western Railway tried to justify building a rail-track alongside Stonehenge, claiming the ancient site was unused.

Public opinion in the U.S., when aroused, could be equally determined. The first preservation society was started in the 1850s by Ann Pamela Cunningham of South Carolina. Despite being disabled by a riding accident, Cunningham initiated a successful campaign to save George Washington’s Mount Vernon from ruin.

But developers have a way of getting what they want. Not even modernist architect Philip Johnson protesting in front of New York’s Penn Station was able to save the McKim, Mead & White masterpiece in 1963. Two years later, fearing that the world’s architectural treasures were being squandered, retired army colonel James Gray founded the International Fund for Monuments (now the World Monuments Fund). Without the WMF’s campaign in 1996, the deteriorating south side of Ellis Island, gateway for 12 million immigrants, might have been lost to history.

The fight never ends. I still miss the magnificent beaux-arts interior of the old Rizzoli Bookstore on 57th Street in Manhattan. The 109-year-old building was torn down in 2014. Nothing like it will ever be seen again.

Historically Speaking: The Long Fight to Take the Weekend Off

Ancient Jews and Christians observed a day of rest, but not until the 20th century did workers get two days a week to do as they pleased.

Wall Street Journal

April 1, 2021

Last month the Spanish government agreed to a pilot program for experimenting with a four-day working week. Before the pandemic, such a proposal would have seemed impossible—but then, so was the idea of working from home for months on end, with no clear downtime and no in-person schooling to keep the children occupied.

In ancient times, a week meant different things to different cultures. The Egyptians used sets of 10 days called decans; there were no official days off except for the craftsmen working on royal tombs and temples, who were allowed two days out of every 10. The Romans tried an eight-day cycle, with the eighth set aside as a market day. The Babylonians regarded the number seven as having divine properties and applied it whenever possible: There were seven celestial bodies, seven nights of each lunar phase and seven days of the week.

A day of leisure in Newport Beach, Calif., 1928. PHOTO: DICK WHITTINGTON STUDIO/CORBIS/GETTY IMAGES

The ancient Jews, who also used a seven-day week, were the first to mandate a Sabbath or rest day, on Saturday, for all people regardless of rank or occupation. In 321 A.D., the Roman emperor Constantine integrated the Judeo-Christian Sabbath into the Julian calendar, but mindful of pagan sensibilities, he chose Sunday, the day of the god Sol, for rest and worship.

Constantine’s tinkering was the last change to the Western workweek for more than a millennium. The authorities saw no reason to allow the lower orders more than one day off a week, but they couldn’t stop them from taking matters into their own hands. By the early 18th century, the custom of “keeping Saint Monday”—that is, taking the day to recover from the Sunday hangover—had become firmly entrenched among the working classes in America and Britain.

Partly out of desperation, British factory owners began offering workers a half-day off on Saturday in return for a full day’s work on Monday. Rail companies supported the campaign with cheap-rate Saturday excursions. By the late 1870s, the term “weekend” had become so popular that even the British aristocracy started using it. For them, however, the weekend began on Saturday and ended on Monday night.

American workers weren’t so fortunate. In 1908, a few New England mill owners granted workers Saturdays and Sundays off because of their large number of Jewish employees. Few other businesses followed suit until 1922, when Malcolm Gray, owner of the Rochester Can Company in upstate New York, decided to give a five-day week to his workers as a Christmas gift. The subsequent uptick in productivity was sufficiently impressive to convince Henry Ford to try the same experiment in 1926 at the Ford Motor Company. Ford’s success made the rest of the country take notice.

Meanwhile, the Soviet Union was moving in the other direction. In 1929, Joseph Stalin introduced the continuous week, which required 80% of the population to be working on any given day. It was so unpopular that the system was abandoned in 1940, the same year that the five-day workweek became law in the U.S. under the Fair Labor Standards Act. The battle for the weekend had been won at last. Now let the battle for the four-day week begin.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, several hundred thousand candidates took it every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge, became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing it would limit the number of Jewish students.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.