Historically Speaking: The Ancient Elixir Made by Bees

Honey has always been a sweet treat, but it has also long served as a preservative, medicine and salve.

The Wall Street Journal

February 9, 2023

The U.S. Department of Agriculture made medical history last month when it approved the first vaccine for honey bees. Hives will be inoculated against American Foulbrood, a highly contagious bacterial disease that kills bee larvae. Our buzzy friends need all the help they can get. In 2021, a national survey of U.S. beekeepers reported that 45.5% of managed colonies died during the preceding year. Since more than one-third of the foods we eat depend on insect pollinators, a bee-less world would drastically alter everyday life.

The loss of bees would also cost us honey, a foodstuff that throughout human history has been much more than a pleasant sugar substitute. Energy-dense, nutritionally rich wild honey, ideal for brain development, may have helped our earliest human ancestors along the path of evolution. The importance of honey foraging can be inferred from its frequent appearance in Paleolithic art. The Araña Caves of Valencia, Spain, are notable for a particularly evocative line drawing of a honey harvester dangling precariously while thrusting an arm into a beehive.

Honey is easily fermented, and there is evidence that the ancient Chinese were making a mixed fruit, rice and honey alcoholic beverage as early as 7000 B.C. The Egyptians may have been the first to domesticate bees. A scene in the sun temple of Pharaoh Nyuserre Ini, built around 2400 B.C., depicts beekeepers blowing smoke into hives as they collect honey. They loved the taste, of course, but honey also played a fundamental role in Egyptian culture. It was used in religious rituals, as a preservative (for embalming) and, because of its anti-bacterial properties, as an ingredient in hundreds of concoctions from contraceptives to gastrointestinal medicines and salves for wounds.

The oldest known written reference to honey comes from a 4,000-year-old recipe for a skin ointment, noted on a cuneiform clay tablet found among the ruins of Nippur in the Iraqi desert.

The ancient Greeks judged honey like fine wine, rating its qualities by bouquet and region. The thyme-covered slopes of Mount Hymettus, near Athens, were thought to produce the best varieties, prompting sometimes violent competition between beekeepers. The Greeks also appreciated its preservative properties. In 323 B.C., the body of Alexander the Great was allegedly transported in a vat of honey to prevent it from spoiling.

Honey’s many uses were also recognized in medieval Europe. In fact, in 1403 honey helped to save the life of the 16-year-old Prince Henry, the future King Henry V of England. During the battle of Shrewsbury, an arrowhead became embedded in his cheekbone. The extraction process was long and painful, resulting in a gaping hole. Knowing the dangers of an open wound, the royal surgeon John Bradmore treated the cavity with a honey mixture that kept it safe from dirt and bacteria.

Despite Bradmore’s success, honey was relegated to folk-remedy status until World War I, when medical shortages encouraged Russian doctors to use honey in wound treatments. Honey was soon upstaged by the discovery of penicillin in 1928, but today it is enjoying a revival.

A 2021 study in the medical journal BMJ found honey to be a cheap and effective treatment for the symptoms of upper respiratory tract infections. Scientists are exploring its potential uses in fighting cancer, diabetes, asthma and cardiovascular disease.

To save ourselves, however, first we must save the bees.

Historically Speaking: The Long, Dark Shadow of the Real ‘Bedlam’

Today’s debate over compulsory treatment for the mentally ill has roots in a history of good intentions gone awry

The Wall Street Journal

January 12, 2023

This year, California and New York City will roll out plans to force the homeless mentally ill to receive hospital treatment. The initiatives face fierce legal challenges despite their backers’ good intentions and promised extra funds.

Opposition to compulsory hospitalization has its roots in the historic maltreatment of mental patients. For centuries, the biggest problem regarding the care of the mentally ill was the lack of it. Until the 18th century, Britain was typical in having only one public insane asylum, Bethlehem Royal Hospital. The conditions were so notorious, even by contemporary standards, that the hospital’s nickname, Bedlam, became synonymous with violent anarchy.

Plate 8 of William Hogarth’s ‘A Rake’s Progress,’ titled ‘In The Madhouse,’ was painted around 1735 and depicts the hospital known as ‘Bedlam.’
PHOTO: HERITAGE IMAGES VIA GETTY IMAGES

Treatment at Bedlam consisted of pacifying the patients through pain and terror, and its cost was offset by viewing fees: Anyone could pay to stare or laugh at the inmates, and thousands did. But social attitudes toward mental illness were changing. By the end of the 18th century, psychiatric reformers such as Benjamin Rush in America and Philippe Pinel in France had demonstrated the efficacy of more humane treatment.

In a burst of optimism, New York Hospital created a ward for the “curable” insane in 1792. The Quaker-run “Asylum for the Relief of Persons Deprived of the Use of Their Reason” in Pennsylvania became the first dedicated mental hospital in the U.S. in 1813. By the 1830s there were at least a dozen private mental hospitals in America.

The public authorities, however, were still shutting the mentally ill in prisons, as the social reformer Dorothea Dix was appalled to discover in 1841. Dix’s energetic campaigning bore fruit in New Jersey, which soon built its first public asylum. Designed by Thomas Kirkbride to provide state-of-the-art care amid pleasant surroundings, Trenton State Hospital served as a model for more than 70 purpose-built asylums that sprang up across the nation after Congress approved government funding for them in 1860.

Unfortunately, the philanthropic impetus driving the public mental hospital movement created as many problems as it solved. Abuse became rampant. It was so easy to have a person committed that in the 1870s, President Grover Cleveland, while still an aspiring politician, successfully silenced the mother of his illegitimate son by having her spirited away to an asylum.

In 1887, the journalist Nellie Bly went undercover as a patient in the Women’s Lunatic Asylum on Blackwell’s Island (now Roosevelt Island), New York. She exposed both the brutal practices of that institution and the general lack of legal safeguards against unwarranted incarceration.

The social reformer Dorothea Dix advocated for public mental health care.

During the first half of the 20th century, the best-run public mental hospitals lived up to the ideals that had inspired them. But the worst seemed to confirm fears that patients on the receiving end of state benevolence lost all basic rights. At Trenton State Hospital between 1907 and 1930, the director Henry Cotton performed thousands of invasive surgeries in the mistaken belief that removing patients’ teeth or organs would cure their mental illnesses. He ended up killing almost a third of those he treated and leaving the rest damaged and disfigured. The public uproar was immense. And yet, just a decade later, some mental hospitals were performing lobotomies on patients with or without consent.

In 1975 the ACLU persuaded the Supreme Court that the mentally ill had the right to refuse hospitalization, making public mental-health care mostly voluntary. But while legal principles are black and white, mental illness comes in shades of gray: A half century later, up to a third of people living on the streets are estimated to be mentally ill. As victories go, the Supreme Court decision was also a tragedy.

Historically Speaking: The Quest to Understand Skin Cancer

The 20th-century surgeon Frederic Mohs made a key breakthrough in treating a disease first described in ancient Greece.

The Wall Street Journal

June 30, 2022

July 1 marks the 20th anniversary of the death of Dr. Frederic Mohs, the Wisconsin surgeon who revolutionized the treatment of skin cancer, the most common form of cancer in the U.S. Before Mohs achieved his breakthrough in 1936, the best available treatment was drastic surgery without even the certainty of a cure.

Skin cancer is by no means a new illness or confined to one part of the world; paleopathologists have found evidence of it in the skeletons of 2,400-year-old Peruvian mummies. But it wasn’t recognized as a distinct cancer by ancient physicians. Hippocrates in the 5th century B.C. came the closest, noting the existence of deadly “black tumors (melas oma) with metastasis.” He was almost certainly describing malignant melanoma, a skin cancer that spreads quickly, as opposed to the other two main types, basal cell and squamous cell carcinoma.

ILLUSTRATION: THOMAS FUCHS

After Hippocrates, nearly 2,000 years elapsed before earnest discussions about black metastasizing tumors began to appear in medical writings. The first surgical removal of a melanoma took place in London in 1787. The surgeon involved, a Scotsman named John Hunter, was mystified by the large squishy thing he had removed from his patient’s jaw, calling it a “cancerous fungus excrescence.”

The “fungoid disease,” as some referred to skin cancer, yielded up its secrets by slow degrees. In 1806 René Laënnec, the inventor of the stethoscope, published a paper in France on the metastatic properties of “La Melanose.” Two decades later, Arthur Jacob in Ireland identified basal cell carcinoma, which was initially referred to as “rodent ulcer” because the ragged edges of the tumors looked as though they had been gnawed by a mouse.

By the beginning of the 20th century, doctors had become increasingly adept at identifying skin cancers in animals as well as humans, making the lack of treatment options all the more frustrating. In 1933, Mohs was a 23-year-old medical student assisting on cancer research in rats when he noticed the destructive effect of zinc chloride on malignant tissue. Excited by its potential, within three years he had developed a zinc chloride paste and a technique for using it on cancerous lesions.

He initially described it as “chemosurgery” since the cancer was removed layer by layer. The results for his patients, all of whom were either inmates of the local prison or the mental health hospital, were astounding. Even so, his method was so novel that the Dane County Medical Association in Wisconsin accused him of quackery and tried to revoke his medical license.

Mohs continued to encounter stiff resistance until the early 1940s, when the Quislings, a prominent Wisconsin family, turned to him out of sheer desperation. Their son, Abe, had a lemon-sized tumor on his neck which other doctors had declared to be inoperable and fatal. His recovery silenced Mohs’s critics, although the doubters remained an obstacle for several more decades. Nowadays, a modern version of “Mohs surgery,” using a scalpel instead of a paste, is the gold standard for treating many forms of skin cancer.

Historically Speaking: Serendipity of Science is Often Born of Years of Labor

Over the centuries, lucky discoveries depend on training and discernment

ILLUSTRATION: THOMAS FUCHS

One recent example comes from an international scientific team studying the bacterium Ideonella sakaiensis 201-F6, which makes an enzyme that breaks down the most commonly used form of plastic, thus allowing the bacterium to eat it. As reported last month in the Proceedings of the National Academy of Sciences, in the course of their research the scientists accidentally created an enzyme even better at dissolving the plastic. It’s still early days, but we may have moved a step closer to solving the man-made scourge of plastics pollution.

The development illustrates a truth about seemingly serendipitous discoveries: The “serendipity” part is usually the result of years of experimentation—and failure. A new book by two business professors at Wharton and a biology professor, “Managing Discovery in the Life Sciences,” argues that governments and pharmaceutical companies should adopt more flexible funding requirements—otherwise innovation and creativity could end up stifled by the drive for quick, concrete results. As one of the authors, Philip Rea, argues, serendipity means “getting answers to questions that were never posed.”

So much depends on who has observed the accident, too. As Joseph Henry, the first head of the Smithsonian Institution, said, “The seeds of great discoveries are constantly floating around us, but they only take root in minds well prepared to receive them.”

One famously lucky meeting of perception and conception happened in 1666, when Isaac Newton observed an apple fall from a tree. (The details remain hazy, but there’s no evidence that the fruit actually hit him, as legend has it.) Newton had seen apples fall before, of course, but this time the sight inspired him to ask questions about gravity’s relationship to the rules of motion that he was contemplating. Still, it took Newton another 20 years of work before he published his Law of Universal Gravitation.

Bad weather was the catalyst for another revelation, leading to physicist Henri Becquerel’s discovery of radioactivity in 1896. Unable to continue his photographic X-ray experiments on the effect of sunlight on uranium salt, Becquerel put the plates in a drawer. When developed, the plates bore images even though they had never been exposed to light. Realizing that he had been pursuing the wrong question, Becquerel started again, this time focusing on uranium itself as a radiation emitter.

As for inventions, accident and inadvertence played a role in the development of Post-it Notes and microwave heating. During the 1990s, Viagra failed miserably in trials as a treatment for angina, but alert researchers at Pfizer realized that one of the side effects could have global appeal.

The most famous accidental medical discovery is that of antibiotics. The biologist Alexander Fleming discovered penicillin in 1928 after he went on vacation, leaving a petri dish of bacteria out in the laboratory. On his return, the dish had developed mold, with a clean area around it. Fleming realized that something in the mold must have killed off the bacteria.

That ability to ask the right questions can be more important than knowing the right answers. Funders of science should take note.