Historically Speaking: Using Forensic Evidence to Solve Crimes

Today’s DNA techniques are just the latest addition to a toolkit used by detectives since ancient times.

The Wall Street Journal

May 5, 2023

In February, police in Burlington, Vt., announced they had solved the city’s oldest cold case, the 1971 murder of 24-year-old schoolteacher Rita Curran. Taking advantage of genetic genealogy using DNA databases—the latest addition to the forensic science toolbox—the police were able to prove that the killer was a neighbor in Curran’s apartment building, William DeRoos.

The practice of forensic science, the critical examination of crime scenes, existed long before it had a name. The 1st-century A.D. Roman jurist Quintilian argued that evidence was not the same as proof unless supported by sound method and reasoning. Nothing should be taken at face value: Even blood stains on a toga, he pointed out, could be the result of a nosebleed or a messy religious sacrifice rather than a murder.

In 6th-century Byzantium, the Justinian Law Code allowed doctors to serve as expert witnesses, recognizing that murder cases required specialized knowledge. In Song Dynasty China, coroners were guided by Song Ci, a 13th-century judge who wrote “The Washing Away of Wrongs,” a comprehensive handbook on criminology and forensic science. Using old case studies, Song provided step-by-step instructions on how to tell if a drowned person had been alive before hitting the water and whether blowflies could be attracted by traces of blood on a murder weapon.

As late as the 17th century, however, Western investigators were still prone to attributing unexplained deaths to supernatural causes. In 1691 the sudden death of Henry Sloughter, the colonial governor of New York, provoked public hysteria. It subsided after an autopsy performed by six physicians proved that blocked lungs, not spells or poison, were responsible. The case was a watershed in placing forensic pathology at the heart of the American judicial system.

Using forensic evidence to solve crimes.
ILLUSTRATION: THOMAS FUCHS

Growing confidence in scientific methods resulted in more systematic investigations, which increased the chances of a case being solved. In England in 1784, the conviction of John Toms for the murder of Edward Culshaw hinged on a paper scrap pulled from Culshaw’s bullet wound. Its jagged edge matched perfectly with a torn sheet of paper found in Toms’s pocket.

Still, the only way to determine whether a suspect was present at the scene of a crime was by visual identification. By the late 19th century, studies by Charles Darwin’s cousin, the anthropologist Sir Francis Galton, and others had established that every individual has unique fingerprints. Fingerprint evidence might have helped to identify Jack the Ripper in 1888, but official skepticism kept the police from pursuing it.

Four years later, in Argentina, fingerprints were used to solve a crime for the first time. Two police officers, Juan Vucetich and Eduardo Alvarez, ignored their superiors’ distrust of the method to prove that a woman had murdered her children so she could marry her lover.

The success of fingerprinting ushered in a golden age of forensic innovation, driven by ambition but guided by scientific principles. By the 1930s, dried blood stains could be analyzed for their blood type and bullets could be matched to the guns that fired them. Almost a century later, the first principle of forensic science still stands: Every contact leaves a trace.

Historically Speaking: How Fear of Sharks Became an American Obsession

Since colonial times, we’ve dreaded what one explorer called ‘the most ravenous fish known in the sea’

The Wall Street Journal

August 27, 2020

There had never been a fatal shark incident in Maine until last month’s shocking attack on a woman swimmer by a great white near Bailey Island in Casco Bay. Scientists suspect that the recent rise in seal numbers, rather than the presence of humans, was responsible for luring the shark inshore.

A great white shark on the attack.
PHOTO: CHRIS PERKINS / MEDIADRUMWORLD / ZUMA PRESS

It’s often said that sharks aren’t the bloodthirsty killing machines portrayed in the media. In 2019 there were only 64 shark attacks world-wide, with just two fatalities. Still, they are feared for good reason.

The ancient Greeks knew well the horror that could await anyone unfortunate enough to fall into the wine-dark sea. Herodotus recorded how, in 492 B.C., a Persian invasion fleet of 300 ships was heading toward Greece when a sudden storm blew up around Mt. Athos. The ships broke apart, tossing some 20,000 men into the water. Those who didn’t drown immediately were “devoured” by sharks.

The Age of Discovery introduced European explorers not just to new landmasses but also to new shark species far more dangerous than the ones they knew at home. In a narrative of his 1593 journey to the South Seas, the explorer and pirate Richard Hawkins described the shark as “the most ravenous fishe knowne in the sea.”

It’s believed that the first deadly shark attack in the U.S. took place in 1642 at Spuyten Duyvil, an inlet on the Hudson River north of Manhattan. Antony Van Corlaer was attempting to swim across to the Bronx when a giant fish was seen to drag him under the water.

But the first confirmed American survivor of a shark attack was Brook Watson, a 14-year-old sailor from Boston. In 1749, Watson was serving on board a merchant ship when he was attacked while swimming in Cuba’s Havana Harbor. Fortunately, his crewmates were able to launch a rowboat and pull him from the water, leaving Watson’s right foot in the shark’s mouth.

Despite having a wooden leg, Watson enjoyed a successful career at sea before returning to his British roots to enter politics. He ended up serving as Lord Mayor of London and becoming Sir Brook Watson. His miraculous escape was immortalized by his friend the American painter John Singleton Copley. “Watson and the Shark” was completely fanciful, however, since Copley had never seen a shark.

The American relationship with sharks was changed irrevocably during the summer of 1916. The East Coast was gripped by both a heat wave and a polio epidemic, leaving the beach as one of the few safe places for Americans to relax. On July 1, a man was killed by a shark on Long Beach Island off the New Jersey coast. Over the next 10 days, sharks in the area killed three more people and left one severely injured. In the ensuing national uproar, President Woodrow Wilson offered federal funds to help get rid of the sharks, an understandable but impossible wish.

The Jersey Shore attacks served as an inspiration for Peter Benchley’s bestselling 1974 novel “Jaws,” which was turned into a blockbuster film the next year by Steven Spielberg. Since then the shark population in U.S. waters has dropped by 60%, in part due to an increase in shark-fishing inspired by the movie. Appalled by what he had unleashed, Benchley spent the last decades of his life campaigning for shark conservation.

Historically Speaking: Serendipity of Science Is Often Born of Years of Labor

Over the centuries, lucky discoveries have depended on training and discernment

ILLUSTRATION: THOMAS FUCHS

One recent example comes from an international scientific team studying the bacterium Ideonella sakaiensis 201-F6, which makes an enzyme that breaks down the most commonly used form of plastic, thus allowing the bacterium to eat it. As reported last month in the Proceedings of the National Academy of Sciences, in the course of their research the scientists accidentally created an enzyme even better at dissolving the plastic. It’s still early days, but we may have moved a step closer to solving the man-made scourge of plastics pollution.

The development illustrates a truth about seemingly serendipitous discoveries: The “serendipity” part is usually the result of years of experimentation—and failure. A new book by two business professors at Wharton and a biology professor, “Managing Discovery in the Life Sciences,” argues that governments and pharmaceutical companies should adopt more flexible funding requirements—otherwise innovation and creativity could end up stifled by the drive for quick, concrete results. As one of the authors, Philip Rea, argues, serendipity means “getting answers to questions that were never posed.”

So much depends on who has observed the accident, too. As Joseph Henry, the first head of the Smithsonian Institution, said, “The seeds of great discoveries are constantly floating around us, but they only take root in minds well prepared to receive them.”

One famously lucky meeting of perception and conception happened in 1666, when Isaac Newton observed an apple fall from a tree. (The details remain hazy, but there’s no evidence that the fruit actually hit him, as legend has it.) Newton had seen apples fall before, of course, but this time the sight inspired him to ask questions about gravity’s relationship to the rules of motion that he was contemplating. Still, it took Newton another 20 years of work before he published his Law of Universal Gravitation.

Bad weather was the catalyst for another revelation, leading to physicist Henri Becquerel’s discovery of radioactivity in 1896. Unable to continue his photographic X-ray experiments on the effect of sunlight on uranium salts, Becquerel put the plates in a drawer. Incredibly, they developed images without any exposure to light. Realizing that he had been pursuing the wrong question, Becquerel started again, this time focusing on uranium itself as a radiation emitter.

As for inventions, accident and inadvertence played a role in the development of Post-it Notes and microwave heating. During the 1990s, Viagra failed miserably in trials as a treatment for angina, but alert researchers at Pfizer realized that one of the side effects could have global appeal.

The most famous accidental medical discovery is antibiotics. The biologist Alexander Fleming discovered penicillin in 1928 after he went on vacation, leaving a petri dish of bacteria out in the laboratory. On his return, he found that the dish had developed mold, with a clean area around it. Fleming realized that something in the mold must have killed off the bacteria.

That ability to ask the right questions can be more important than knowing the right answers. Funders of science should take note.