Historically Speaking: The Long Road to Protecting Inventions With Patents

Gunpowder was never protected. Neither were inventions by Southern slaves. Vaccines are—but that’s now the subject of a debate.

The Wall Street Journal

May 20, 2021

The U.S. and China don’t see eye to eye on much nowadays, but in a rare show of consensus, the two countries both support a waiver of patent rights for Covid-19 vaccines. If that happens, it will be the latest bump in a long, rocky road for intellectual property rights.

Elijah McCoy and a diagram from one of his patents for engine lubrication.
ILLUSTRATION: THOMAS FUCHS

There was no such thing as patent law in the ancient world. Indeed, until the invention of gunpowder, the true cost of failing to protect new ideas was never even considered. In the mid-11th century, the Chinese Song government realized too late that it had allowed the secret of gunpowder to escape. It tried to limit the damage by banning the sale of saltpeter to foreigners. But merchants found ways to smuggle it out, and by 1280 Western inventors were creating their own recipes for gunpowder.

Medieval Europeans understood that knowledge and expertise were valuable, but government attempts at control were crude in the extreme. The Italian Republic of Lucca protected its silk trade technology by prohibiting skilled workers from emigrating; Genoa offered bounties for fugitive artisans. Craft guilds were meant to protect against intellectual expropriation, but all too often they simply stifled innovation.

The architect Filippo Brunelleschi, designer of the famous dome of Florence’s Santa Maria del Fiore, was the first to rebel against the power of the guilds. In 1421 he demanded that the city grant him the exclusive right to build a new type of river boat. His deal with Florence is regarded as the first legal patent. Unfortunately, the boat sank on its first voyage, but other cities took note of Brunelleschi’s bold new business approach.

In 1474 the Venetians invited individuals “capable of devising and inventing all kinds of ingenious contrivances” to establish their workshops in Venice. In return for settling in the city, the Republic offered them the sole right to manufacture their inventions for 10 years. Countries that imitated Venice’s approach reaped great financial rewards. England’s Queen Elizabeth I granted over 50 individual patents, often with the proviso that the patent holder train English craftsmen to carry on the trade.

Taking their cue from British precedent, the framers of the U.S. Constitution gave Congress the power to legislate on intellectual property rights. Congress duly passed a patent law in 1790 but failed to address the legal position of enslaved inventors. The anomaly came to a head in 1857, when a Southern slave owner named Oscar Stuart tried to patent a new plow invented by his slave Ned. The request was denied on the grounds that the inventor was a slave and therefore not a citizen, while the owner, though a citizen, was not the inventor.

After the Civil War, the opening up of patent rights enabled African-American inventors to bypass racial barriers and amass significant fortunes. Elijah McCoy (1844-1929) transformed American rail travel with his engine lubrication system.

McCoy ultimately registered 57 U.S. patents, significantly more than Alexander Graham Bell’s 18, though far fewer than Thomas Edison’s 1,093. The American appetite for registering inventions remains unbounded. Last fiscal year alone, the U.S. Patent and Trademark Office issued 399,055 patents.

Is there anything that can’t be patented? The answer is yes. In 1999 Smucker’s attempted to patent its crustless peanut butter and jelly sandwich with crimped edges. Eight years and a billion homemade PB&J sandwiches later, a federal appeals court ruled there was nothing “novel” about forgoing the crusts.

Historically Speaking: The Ordeal of Standardized Testing

From the Incas to the College Board, exams have been a popular way for societies to select an elite.

The Wall Street Journal

March 11, 2021

Last month, the University of Texas at Austin joined the growing list of colleges that have made standardized test scores optional for another year due to the pandemic. Last year, applicants were quick to take up the offer: Only 44% of high-school students who applied to college using the Common Application submitted SAT or ACT scores in 2020-21, compared with 77% the previous year.

Nobody relishes taking exams, yet every culture expects some kind of proof of educational attainment from its young. To enter Plato’s Academy in ancient Athens, a prospective student had to solve mathematical problems. Would-be doctors at one of the many medical schools in Ephesus had to participate in a two-day competition that tested their knowledge as well as their surgical skills.

ILLUSTRATION: THOMAS FUCHS

On the other side of the world, the Incas of Peru were no less demanding. Entry into the nobility required four years of rigorous instruction in the Quechua language, religion and history. At the end of the course students underwent a harsh examination lasting several days that tested their physical and mental endurance.

It was the Chinese who invented the written examination, as a means of improving the quality of imperial civil servants. During the reign of Empress Wu Zetian, China’s only female ruler, in the 7th century, the exam became a national rite of passage for the intelligentsia. Despite its burdensome academic requirements, several hundred thousand candidates took it every year. A geographical quota system was eventually introduced to prevent the richer regions of China from dominating.

Over the centuries, all that cramming for one exam stifled innovation and encouraged conformity. Still, the meritocratic nature of the Chinese imperial exam greatly impressed educational reformers in the West. In 1702, Trinity College, Cambridge, became the first institution to require students to take exams in writing rather than orally. By the end of the 19th century, exams to enter a college or earn a degree had become a fixture in most European countries.

In the U.S., the reformer Horace Mann introduced standardized testing in Boston schools in the 1840s, hoping to raise the level of teaching and ensure that all citizens would have equal access to a good education. The College Board, a nonprofit organization founded by a group of colleges and high schools in 1899, established the first standardized test for university applicants.

Not every institution that adopted standardized testing had noble aims, however. The U.S. Army had experimented with multiple-choice intelligence tests during World War I and found them useless as a predictive tool. But in the early 1920s, the president of Columbia University, Nicholas M. Butler, adopted the Thorndike Tests for Mental Alertness as part of the admissions process, believing they would limit the number of Jewish students admitted.

The College Board adopted the SAT, a multiple-choice aptitude test, in 1926, as a fair and inclusive alternative to written exams, which were thought to be biased against poorer students. In the 1960s, civil rights activists began to argue that standardized tests like the SAT and ACT were biased against minority students, but despite the mounting criticisms, the tests seemed like a permanent part of American education—until now.

Historically Speaking: ‘Sesame Street’ Wasn’t the First to Make Learning Fun

The show turns 50 this month, but the idea that education can be entertaining goes back to ancient Greece.

The Wall Street Journal, November 21, 2019

“Sesame Street,” which first went on the air 50 years ago this month, is one of the most successful and cost-effective tools ever created for preparing preschool tots for the classroom. Now showing in 70 languages in more than 150 countries around the world, “Sesame Street” is that rare thing in a child’s life: truly educational entertainment.

The Muppets of ‘Sesame Street’ in the 1993-94 season. PHOTO: EVERETT COLLECTION

Historically, those two words have rarely appeared together. In the 4th century B.C., Plato and Aristotle both agreed that children can learn through play. In “The Republic,” Plato went so far as to advise, “Do not use compulsion, but let early education be a sort of amusement.” Unfortunately, his advice failed to catch on.

In Europe during the Middle Ages, play and learning were almost diametrically opposed. Monks were in charge of boys’ education, which largely consisted of Latin grammar and religious teaching. (Girls learned domestic skills at home.) The invention of the printing press in 1440 helped spread literacy among young readers, but the works written for them weren’t exactly entertaining. A book like “A token for children: Being an exact account of the conversion, holy and exemplary lives, and joyful deaths of several young children,” by the 17th-century English Puritan James Janeway, surely didn’t follow Plato’s injunction to be amusing as well as instructional.

Social attitudes toward children’s entertainment changed considerably, however, in the wake of the English philosopher John Locke’s 1693 treatise “Some Thoughts Concerning Education.” Locke followed Plato’s line on education, writing, “I always have had a fancy that learning might be made a play and recreation to children.” The publisher John Newbery heeded Locke’s advice; in 1744, he published “A Little Pretty Pocket-Book Intended for the Instruction and Amusement of Little Master Tommy and Pretty Miss Polly,” which was sold along with a ball for boys and a pincushion for girls. In the introduction, Newbery promised parents and guardians that the book would not only make their children “strong, healthy, virtuous, wise” but also “happy.”

When it came to early children’s television in the U.S., however, “play and recreation” usually squeezed out educational content. Many popular shows existed primarily to sell toys and products: “Howdy Doody,” the pioneering puppet show that ran on NBC from 1947 to 1960, was sponsored by RCA to pitch color televisions. Parents became so indignant over the exploitation of their children by the TV industry that, in 1968, grass-roots activists started the nonprofit Action for Children’s Television, which petitioned the Federal Communications Commission to ban advertising on children’s programming.

This cultural mood led to the birth of “Sesame Street.” The show’s co-creators, Joan Ganz Cooney and Lloyd Morrisett, were particularly devoted to using TV to combat educational inequality in minority communities. They spent three years working with teachers, child psychologists and Jim Henson’s Muppets to get the right mix of education and entertainment. The pilot episode, broadcast on public television stations on Nov. 10, 1969, introduced the world to Big Bird, Bert and Ernie, Oscar the Grouch, and their cast of multiethnic friends and neighbors. “You’re gonna love it,” says Gordon, one of the show’s human characters, to a young newcomer named Sally in the first show’s opening lines. And we have.

Historically Speaking: Funding Wars Through the Ages

U.S. antiterror efforts have cost nearly $6 trillion since the 9/11 attacks. Earlier governments from the ancient Greeks to Napoleon have had to get creative to finance their fights.

The Wall Street Journal, October 31, 2019

The successful operation against Islamic State leader Abu Bakr al-Baghdadi is a bright spot in the war on terror that the U.S. declared in response to the attacks of 9/11. The financial costs of this long war have been enormous: nearly $6 trillion to date, according to a recent report by the Watson Institute of International and Public Affairs at Brown University, which took into account not just the defense budget but other major costs, like medical and disability care, homeland security and debt.

ILLUSTRATION: THOMAS FUCHS

War financing has come a long way since the ancient Greeks formed the Delian League in 478 B.C., which required each member state to contribute an agreed amount of money each year, rather than troops. With the League’s financial backing, Athens became the Greek world’s first military superpower—at least until the Spartans, helped by the Persians, built up their naval fleet with tribute payments extracted from dependent states.

The Romans maintained their armies through tributes and taxes until the Punic Wars—three lengthy conflicts between 264 and 146 B.C.—proved so costly that the government turned to debasing the coinage in an attempt to increase the money supply. The result was runaway inflation and, a half-century later, a sovereign debt crisis during the Social War between Rome and several breakaway Italian cities. The government ended up defaulting in 86 B.C., sealing the demise of the ailing Roman Republic.

After the fall of Rome in the late fifth century, wars in Europe were generally financed by plunder and other haphazard means. William the Conqueror financed the Norman invasion of England in 1066 the ancient Roman way, by debasing his currency. He learned his lesson and paid for all subsequent operations out of tax receipts, which stabilized the English monetary system and established a new model for financing war.

Taxation worked until European wars became too expensive for state treasuries to fund alone. Rulers then resorted to a number of different methods. During the 16th century, Philip II of Spain turned to the banking houses of Genoa to raise the money for his Armada invasion fleet against England. Seizing the opportunity, Sir Francis Walsingham, Elizabeth I’s chief spymaster, sent agents to Genoa with orders to use all legal means to sabotage and delay the payment of Philip’s bills of credit. The operation bought England a crucial extra year of preparation.

In his own financial preparations to fight England, Napoleon had better luck than Philip II: In 1803 he was able to raise a war chest of over $11 million in cash by selling the Louisiana Territory to the U.S.

Napoleon was unusual in having a valuable asset to offload. By the time the American Civil War broke out in 1861, governments had become reliant on a combination of taxation, printing money and borrowing to pay for war. But the U.S. had lacked a regulated banking system since President Andrew Jackson’s dismantling of the Second Bank of the United States in the 1830s. The South resorted to printing paper money, which depreciated dramatically. The North could afford to be more innovative: In 1862 the financier Jay Cooke invented the war bond, which was marketed with great success to ordinary citizens. By the war’s end, the bonds had covered two-thirds of the North’s costs.

Incurring debt is still how the U.S. funds its wars. It has helped to shield the country from the full financial effects of its prolonged conflicts. But in the future it is worth remembering President Calvin Coolidge’s warning: “In any modern campaign the dollars are the shock troops…. A country loaded with debt is devoid of the first line of defense.”

Historically Speaking: The Long Road to Cleanliness

The ancient Babylonians and Egyptians knew about soap, but daily washing didn’t become popular until the 19th century.

As the mother of five teenagers, I have a keen appreciation of soap—especially when it’s actually used. Those little colored bars—or more frequently nowadays, dollops of gel—represent one of the triumphs of civilization.

Adolescents aside, human beings like to be clean, and any product made of fats or oils, alkaline salts and water will help them to stay that way. The Babylonians knew how to make soap as early as 2800 B.C., although it was probably too caustic for washing anything except hair and textiles. The Ebers Papyrus, an ancient Egyptian medical document from 1550 B.C., suggests that the Egyptians used soap only for treating skin ailments.

ILLUSTRATION: THOMAS FUCHS

The Greeks and Romans also avoided washing with harsh soaps, until Julius Caesar’s conquest of Gaul in 58 B.C. introduced them to a softer Celtic formula. Aretaeus of Cappadocia, a 1st century A.D. Greek physician, wrote that “those alkaline substances made into balls” are a “very excellent thing to cleanse the body in the bath.”

Following Rome’s collapse in the 5th century, the centers of soap-making moved to India, Africa and the Middle East. In Europe, soap suffered from being associated with ancient paganism. In the 14th century, Crusaders returning from the Middle East brought back with them a taste for washing with soap and water, but not in sufficient numbers to slow the spread of plague.

Soap began to achieve wider acceptance in Europe during the Renaissance, though geography still played a role: Southern countries had the advantage of making soap out of natural oils and perfumes, while the colder north had to make do with animal fats and whale blubber. Soap’s growing popularity also attracted the attention of revenue-hungry governments. In 1632, in one of the earliest documented cases of crony capitalism, King Charles I of England granted a group of London soapmakers a 14-year monopoly in exchange for annual payments of 4 pounds per ton sold.

Soap remained a luxury item, however, until scientific advances during the age of the Enlightenment made large-scale production possible: In 1790, the French chemist Nicolas Leblanc discovered how to make alkali from common salt. The saying “cleanliness is next to godliness”—credited to John Wesley, the founder of Methodism—was a great piece of free advertising, but it was soap’s role in modern warfare that had a bigger impact on society. During the Crimean War in Europe and the Civil War in the U.S., high death tolls from unsanitary conditions led to new requirements that soldiers use soap every day.

In the late 19th century, soap manufacturers helped to jump-start the advertising industry with their use of catchy poems and famous artworks as marketing tools. British and American soapmakers were ahead of their time in other ways, too: Lever (now Unilever) built housing for its workers, while Procter & Gamble pioneered the practice of profit-sharing.

And it was Procter & Gamble that made soap the basis for one of the most influential cultural institutions of the last century. Having read reports that women would like to be entertained while doing housework, the company decided to sponsor the production of daytime radio domestic dramas. Thus began the first soap opera, “Ma Perkins,” a 15-minute tear-laden serial that ran from 1933 until 1960—and created a new form of storytelling.

Historically Speaking: Fashion Shows: From Royal to Retail

The catwalk has always been a place for dazzling audiences as well as selling clothes.

The 2007 Fendi Fall Collection show at the Great Wall of China. PHOTO: GETTY IMAGES

As devotees know, the fashion calendar is divided between the September fashion shows, which display the designers’ upcoming spring collections, and the February shows, which preview the fall. New York Fashion Week, which wraps up this weekend, is the world’s oldest; it started in 1943, when it was called “press week,” and always goes first, followed by London, Milan, and Paris.

Although fashion week is an American invention, the twice-yearly fashion show can be traced back to the court of Louis XIV of France in the 17th century. The king insisted on a seasonal dress code at court as a way to boost the French textile industry: velvet and satin in the winter, silks in the summer. The French were also responsible for the rise of the dress designer: Charles Frederick Worth opened the first fashion house in Paris in 1858. Worth designed unique dresses for individual clients, but he made his fortune with seasonal dress collections, which he licensed to the new department stores that were springing up in the world’s big cities.

Worth’s other innovation was the use of live models instead of mannequins. By the late 1800s this had evolved into the “fashion parade,” a precursor to today’s catwalk, which took place at invitation-only luncheons and tea parties. In 1903, the Ehrich brothers transported the fashion parade idea to their department store in New York. The big difference was that the dresses on show could be bought and worn the same day. The idea caught on, and all the major department stores began holding fashion shows.

The French couture houses studiously ignored the consumer-friendly approach pioneered by American retailers. After World War II, however, they had to tout for business like anyone else. The first Paris fashion week took place in 1947. But unlike New York’s shows, which catered only to journalists and wholesale buyers, the Paris shows still put the emphasis on haute couture.

The two different types of fashion show—the selling kind, organized by department stores for the public, and the preview kind, held by designers for fashion insiders—coexisted until the 1960s. Suddenly, haute couture was out and buying off the rack was in. The retail fashion show became obsolete as the design houses turned to ready-to-wear collections and accessories such as handbags and perfume.

Untethered from its couture roots, the designer fashion show morphed into performance art—the more shocking the better. The late designer Alexander McQueen provocatively titled his 1995 Fall show “Highland Rape” and sent out models in bloodied and torn clothes. The laurels for the most insanely extravagant runway show still belong to Karl Lagerfeld, who staged his 2007 Fendi Fall Collection on the Great Wall of China at a cost of $10 million.

But today there’s trouble on the catwalk. Poor attendance has led to New York’s September Fashion Week shrinking to a mere five days. Critics have started to argue that the idea of seasonal collections makes little sense in today’s global economy, while the convenience of e-commerce has made customers unwilling to wait a week for a dress, let alone six months. Designers are putting on expensive fashion shows only to have their work copied and sold to the public at knockdown prices a few weeks later. The Ehrich brothers may have been right after all: don’t just tell, sell.

Historically Speaking: Before Weather Was a Science

Modern instruments made accurate forecasting possible, but humans have tried to predict the weather for thousands of years.

The Wall Street Journal, August 31, 2019

ILLUSTRATION: THOMAS FUCHS

Labor Day weekend places special demands on meteorologists, even when there’s not a hurricane like Dorian on the way. September weather is notoriously variable: In 1974, Labor Day in Iowa was a chilly 43 degrees, while the following year it was a baking 103.

Humanity has always sought ways to predict the weather. The invention of writing during the 4th millennium B.C. was an important turning point for forecasting: It allowed the ancient Egyptians to create the first weather records, using them as a guide to predict the annual flood level of the Nile. Too high meant crop failures, too low meant drought.

Some early cultures, such as the ancient Greeks and the Mayans, based their weather predictions on the movements of the stars. Others relied on atmospheric signs and natural phenomena. One of the oldest religious texts in Indian literature, the Chandogya Upanishad from the 8th century B.C., includes observations on various types of rain clouds. In China, artists during the Han Dynasty (206 B.C.-9 A.D.) painted “cloud charts” on silk for use as weather guides.

These early forecasting attempts weren’t simply products of magical thinking. The ancient adage “red sky at night, shepherd’s delight,” which Jesus mentions in the gospel of Matthew, is backed by hard science: The sky appears red when a high-pressure front moves in from the west, driving the clouds away.

In the 4th century B.C., Aristotle tried to provide rational explanations for weather phenomena in his treatise “Meteorologica.” His use of the scientific method laid the foundations for modern meteorology. The problem was that nothing could be built on Aristotle’s ideas until the invention of such tools as the thermometer (an early version was produced by Galileo in 1593) and the barometer (invented by his pupil Torricelli in 1643).

Such instruments couldn’t predict anything on their own, but they made possible accurate daily weather observations. Realizing this, Thomas Jefferson, a pioneer in modern weather forecasting, ordered Meriwether Lewis and William Clark to keep meticulous weather records during their 1804-06 expedition to the American West. He also made his own records wherever he resided, writing in his meteorological diary, “My method is to make two observations a day.”

Most governments, however, remained dismissive of weather forecasting until World War I. Suddenly, knowing which way the wind would blow tomorrow meant the difference between gassing your own side or the enemy’s.

To make accurate predictions, meteorologists needed a mathematical model that could combine different types of data into a single forecast. The first attempt, by the English mathematician Lewis Fry Richardson in 1917, took six weeks to calculate and turned out to be completely wrong.

There were still doubts about the accuracy of weather forecasting when the Allied meteorological team told Supreme Commander Dwight Eisenhower that there was only one window of opportunity for a Normandy landing: June 6, 1944. Despite his misgivings, Eisenhower acted on the information, surprising German meteorologists who had predicted that storms would continue in the English Channel until mid-June.

As we all know, meteorologists still occasionally make the wrong predictions. That’s when the old proverb comes into play: “There is no such thing as bad weather, only inappropriate clothes.”

Historically Speaking: Duels Among the Clouds

Aerial combat was born during World War I, giving the world a new kind of military hero: the fighter pilot.

“Top Gun” is back. The 1986 film about Navy fighter pilots is getting a sequel next year, with Tom Cruise reprising his role as Lt. Pete “Maverick” Mitchell, the sexy flyboy who can’t stay out of trouble. Judging by the trailer released by Paramount in July, the new movie, “Top Gun: Maverick,” will go straight to the heart of current debates about the future of aerial combat. An unseen voice tells Mr. Cruise, “Your kind is headed for extinction.”

The mystique of the fighter pilot began during World War I, when fighter planes first entered military service. The first aerial combat took place on Oct. 5, 1914, when French and German biplanes engaged in an epic contest in the sky, watched by soldiers on both sides of the trenches. At this early stage, neither plane had been designed for fighting, but the German pilot carried a rifle and the French pilot a machine gun; the latter won the day.

A furious arms race ensued. The Germans turned to the Dutch engineer Anthony Fokker, who devised a way to synchronize a plane’s machine gun with its propeller so that bullets passed between the spinning blades, creating a flying weapon of deadly accuracy. The Allies soon caught up, ushering in the era of the dogfight.

From the beginning, the fighter pilot seemed to belong to a special category of warrior—the dueling knight rather than the ordinary foot soldier. Flying aces of all nationalities gave each other a comradely respect. In 1916, the British marked the downing of the German fighter pilot Oswald Boelcke by dropping a wreath in his honor on his home airfield in Germany.

But not until World War II could air combat decide the outcome of an entire campaign. During the Battle of Britain in the summer of 1940, the German air force, the Luftwaffe, dispatched up to 1,000 aircraft in a single attack. The Royal Air Force’s successful defense of the skies led to British Prime Minister Winston Churchill’s famous declaration, “Never in the field of human conflict was so much owed by so many to so few.”

The U.S. air campaigns over Germany taught American military planners a different lesson. Rather than focusing on pilot skills, they concentrated on building planes with superior firepower. In the decades after World War II, the invention of air-to-air missiles was supposed to herald the end of the dogfight. But during the Vietnam War, steep American aircraft losses caused by the acrobatic, Soviet-built MiG fighter showed that one-on-one combat still mattered. The U.S. response to this threat was the highly maneuverable twin-engine F-15 and the formation of a new pilot training academy, the Navy Fighter Weapons School, which inspired the original “Top Gun.”

Since that film’s release, however, aerial combat between fighter planes has largely happened on screen, not in the real world. The last dogfight involving a U.S. aircraft took place in 1999, during the NATO air campaign in Kosovo. The F-14 Tomcats flown by Mr. Cruise’s character have been retired, and his aircraft carrier, the USS Enterprise, has been decommissioned.

Today, conventional wisdom again holds that aerial combat is obsolete. The new F-35 Joint Strike Fighter is meant to replace close-up dogfights with long-range weapons. But not everyone seems to have read the memo about the future of air warfare. Increasingly, U.S. and NATO pilots are having to scramble their planes to head off Russian incursions. The knights of the skies can’t retire just yet.

Historically Speaking: A Palace Open to the People

From the Pharaohs to Queen Victoria, royal dwellings have been symbols of how rulers think about power.

Every summer, Queen Elizabeth II opens the state rooms of Buckingham Palace to the public. This year’s opening features an exhibition that I curated, “Queen Victoria’s Palace,” the result of a three-year collaboration with Royal Collection Trust. The exhibition uses paintings, objects and even computer-generated imagery to show how Victoria transformed Buckingham Palace into both a family home and the headquarters of the monarchy. In the process, she modernized not only the building itself but also the relationship between the Royal Family and the British people.

Plenty of rulers before Victoria had built palaces, but it was always with a view to enhancing their power rather than sharing it. Consider Amarna in Egypt, the temple-palace complex created in the 14th century B.C. by Amenhotep IV, better known as Akhenaten. Supported by his beautiful wife Nefertiti, the heretical Akhenaten made himself the head of a new religion that revered the divine light of the sun’s disk, the Aten.

The Great Palace reflected Akhenaten’s megalomania: The complex featured vast open-air courtyards where the public was required to engage in mass worship of the Pharaoh and his family. Akhenaten’s palace was hated as much as his religion, and both were abandoned after his death.

Weiyang, the Endless Palace, built in 200 B.C. in western China by the Emperor Gaozu, was also designed to impart a religious message. Until its destruction in the 9th century by Tibetan invaders, Weiyang extended over two square miles, making it the largest imperial palace in history. Inside, the halls and courtyards were laid out along specific axial and symmetrical lines to ensure that the Emperor existed in harmony with the landscape and, by extension, with his people. Each chamber was ranked according to its proximity to the Emperor’s quarters; every person knew his place and obligations according to his location in the palace.

Western Europe had nothing comparable to Weiyang until King Louis XIV built the Palace of Versailles in 1682. With its unparalleled opulence—particularly the glittering Hall of Mirrors—and spectacular gardens, Versailles was a cult of personality masquerading as architecture. Louis, the self-styled Sun King at the center of this artificial universe, created a living stage where seeing and being seen was the highest form of social currency.

The offstage reality was grimmer. Except for the royal family’s quarters, Versailles lacked even such basic amenities as plumbing. The cost of upkeep swallowed a quarter of the government’s annual tax receipts. Louis XIV’s fantasy lasted a century before being swept away by the French Revolution in 1789.

Although Victoria would never have described herself as a social revolutionary, the many changes she made to Buckingham Palace were an extraordinary break with the past. From the famous balcony where the Royal Family gathers to share special occasions with the nation, to the spaces for entertaining that can welcome thousands of guests, the revitalized palace created a more inclusive form of royal architecture. It sidelined the old values of wealth, lineage, power and divine right to emphasize new ones based on family, duty, loyalty and patriotism. Victoria’s palace was perhaps less awe-inspiring than its predecessors, but it may prove to be more enduring.