I’ve become interested in antivirulence therapy as an alternative to antibiotics. The Golden Age of Antibiotics ended around 1990, and although we are nowhere near suffering a post-antibiotic apocalypse, there is no doubt that new therapies are needed.
It’s not just that new antibiotics are increasingly difficult to discover and develop, but that the whole concept of antibiotic therapy suffers from serious limitations. Since the Heroic Age of Pasteur and Koch, we’ve tended to equate bacteria with germs: the fewer the better, so let’s just kill them all and be done with it. Antibiotics were adopted with such enthusiasm not just because of their enormous clinical benefit, but because they gave us a chance to crush an ancient foe. They satisfy a primitive urge.
Microbiomics teaches a different lesson: not only do bacteria benefit us, they ARE us. Antibiotics cure infections, sure. But they also cause collateral damage, killing the innocent civilian bacteria along with the barbarian invaders. Numerous studies associate antibiotic use, especially in childhood, with a host of disorders including obesity, diabetes, C. difficile-associated diarrhea, Crohn’s disease and arthritis. Even if antibiotic resistance were not a problem, we would still want alternative treatments for bacterial infectious diseases.
Antivirulence therapy is one such alternative. Rather than killing bad bugs, it disarms them. Many important pathogens are really just opportunistic invaders, more like teenagers who succumb to temptation than psychopathic axe murderers. Antivirulence therapy has the potential to treat disease without disrupting our microbiomes.
Antivirulence therapies have been around for a long time, much longer than antibiotics. In fact, they can claim to be among the first scientific medicines: therapies developed using the new tools and insights of the emergent disciplines of bacteriology and immunology in the late 19th century. The very first Nobel Prize in Physiology or Medicine, awarded in 1901, went to Emil von Behring for his development of diphtheria antitoxin serum.
Therapeutic antisera (Behring, with Kitasato, also developed a tetanus antitoxin serum) were produced by injecting sublethal doses of bacteria or their toxins into large animals. With increasing booster doses, the animals’ immune systems pumped out antitoxin antibodies. Antiserum injected into diphtheria patients neutralized the diphtheria toxin, which kills cells in the liver, heart, kidneys and respiratory system. Patients, children especially, died of strangulation when their airways became blocked by tough, leathery membranes (the name comes from diphthera, Greek for “leather”). Prior to Behring’s antiserum, tracheotomy was the only treatment, and it was not very effective. More than 4 in every 1000 Americans died of diphtheria in 1890. Case fatality rates were commonly above 50%; Behring’s antiserum cut this rate in half.
American public health officials traveling in Europe learned of the new treatment and scrambled to establish it in the US. New York City created a farm that used retired police horses to produce the antiserum. St. Louis, led by city bacteriologist Dr. Armand Ravold, followed suit.
The publicly funded production and distribution of pharmaceuticals strikes us as odd today, a practice with more than a whiff of socialism about it. Indeed, the officials who led the charge were explicit in their intention to provide the treatment free of charge, lest the working classes be denied its benefits. But these were no socialists. St. Louis and Missouri were merely another stronghold of the Progressive Republicans who rose to power around the turn of the century.
My great-grandfather Jacob Kauffman was one of these Progressive Republicans and was a state legislator in Missouri around this time. His three older siblings died from diphtheria, all within a span of six months. He probably survived because he was nursing and was protected by maternal antibodies. Tragedies like this were common to all classes of society, and little time was wasted arguing politics or ideology when an effective treatment became available.
The anti-diphtheria effort in New York was an immediate success: deaths dropped from 2870 in 1895 to fewer than 1400 in 1900. Similar results were seen in St. Louis, where there was constant pressure to keep the supply of antiserum equal to the demand. Jim, a retired ambulance horse, was a champion producer who yielded gallons of lifesaving antiserum. But he was just a horse, and subject to disease himself. Last bled by Dr. Ravold on September 30, 1901, he came down with tetanus and had to be killed. The September 30 serum batch was ordered destroyed.
But the order was not carried out, and the antiserum was distributed across the city with tragic results. In all, 13 children were killed by the tetanus-contaminated antiserum. Blame first fell on Henry Taylor, an African-American janitor at the facility, but Dr. Ravold eventually admitted his own culpability: he had allowed the batch to be used because existing stocks were depleted. Both he and Taylor were dismissed from the public health service by a board of inquiry, which further recommended that the city get out of the business of manufacturing antiserum.
Similar incidents occurred around the country, and Congress responded by passing the Biologics Control Act of 1902, also known as the Virus-Toxin Law. It mandated that producers of vaccines and antisera be licensed and inspected. It further tasked the Hygienic Laboratory (established within the Marine Hospital Service) with developing the regulations and standards governing the manufacture of biologics. Although limited to biological products, this was the first national regulation of drug production and one of the first national regulatory programs of any kind. The law also funded research in pathology, bacteriology, chemistry, pharmacology and zoology, thereby establishing a national research program. The Ransdell Act of 1930 renamed the lab the National Institute of Health, which continued to regulate the manufacture of biological products until 1972, when this responsibility was transferred to the FDA.
The political pressure required for major reforms is difficult to develop. The reforms embodied in the Biologics Control Act had undoubtedly been discussed and debated before the St. Louis disaster. But the immediacy of the disaster, the faces of now-dead children staring out from the front pages of newspapers, and the attribution of their deaths to the mismanagement of a single sick horse created the sort of public outcry that politicians could not ignore. Then as now, we are numbed by abstract reports of thousands of deaths but will respond to a tragedy with a face. As Mother Teresa said, “If I look at the mass I will never act. If I look at the one, I will.” Those 13 children and one sick horse did more to create the NIH than a thousand reasoned arguments ever could.
[Newspaper clipping from Chronicling America]
In writing this post I relied heavily on Ross DeHovitz’s fine article “The 1901 St. Louis Incident, the First Modern Medical Disaster.”