The Unspoken Victims of COVID-19

 


Jules Grandjouan, “A Dream of the Vivisector,” L’Assiette au Beurre, July 1, 1911. Photo: The Author.

The Challenge of Omicron

The appearance in November 2021 of the Omicron variant of SARS-CoV-2, the virus that causes COVID-19, set off a global chase. Scientists everywhere sought to discover the nature of the mutated virus and its likely impact on the course of the pandemic. Researchers in South Africa and Botswana, where Omicron initially appeared, were the first to identify its approximately 50 genetic mutations and describe its morbidity and mortality. They quickly determined that the variant was more contagious but less dangerous than previous versions. Case numbers skyrocketed but hospitalizations and deaths did not. They also found that the duration of the disease appeared to be short; patients generally recovered after three to four days.

South African scientists and their compatriots were rewarded for their perspicacity with an international travel blockade and shunning, some of it racist, recalling Trump’s attacks upon Chinese researchers despite their similarly rapid identification and communication of the genetic sequence of the original Covid strain. Rather than banning South Africans from traveling to the U.S., the Biden administration should instead have invited them and provided free tickets to Disneyland. The sooner the less-lethal Omicron replaced Delta, the better off everybody would be, at least everybody who had been vaccinated.

Nevertheless, despite encouraging reports from South Africa about the lesser mortality of Omicron, robust clinical evidence was still lacking by mid-December 2021. It remained possible that lower rates of hospitalization and death were the result of contingent, local conditions – for example, the age distribution of South African patients (mostly young), or their previous exposure to Covid (many had contracted Delta or other strains). Policymakers in the U.S. and abroad wanted to know what the likely consequence of widespread Omicron infection would be in their countries, and how long any new surge was likely to last. So, they pumped more money into biomedical research, and laboratory scientists across the globe immediately got to work doing what they do best: torturing small animals.

On December 31, 2021, the New York Times reported that more than a dozen research groups – in the U.S., England, Germany, Japan, Syria, Hong Kong, and South Africa – had sprayed the Omicron virus into the noses of rats, hamsters and other animals in order to determine the disease’s infectiousness and impact. Many got the disease, some didn’t; many survived, some didn’t. The unsurprising conclusion was that Omicron caused an “attenuated infection” (less severe disease) compared with Delta and previous variants. Scientists warned, however, that these were only preliminary results that would need to be validated by further animal and human studies.

On the same day, the Times published a second story: “People with Omicron less likely to need hospitalization, U.K. report finds.” The authors stated that fully vaccinated and boosted people infected with either the Delta or the Omicron variant were 81% less likely to be hospitalized than unvaccinated people. The reduction in hospital admissions was even greater, 88%, for those infected with Omicron. The study, conducted by the U.K. Health Security Agency, was based on research involving 528,176 Omicron cases and 572,012 Delta cases – a total sample size of 1,100,188. By comparison, the total number of Covid cases in the U.K. is more than 15 million.
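
To see the arithmetic behind such figures, consider how a relative reduction in hospitalization is computed from case and admission counts. The following is a minimal sketch with invented numbers, not the UK Health Security Agency’s actual data:

```python
# Sketch: relative risk reduction (RRR) from case and admission counts.
# All counts below are invented for illustration; they are NOT the
# figures from the UKHSA study cited above.

def relative_risk_reduction(cases_a, hosp_a, cases_b, hosp_b):
    """RRR = 1 - (risk in group A / risk in group B)."""
    risk_a = hosp_a / cases_a  # e.g., boosted people with Omicron
    risk_b = hosp_b / cases_b  # e.g., unvaccinated people
    return 1 - risk_a / risk_b

# Hypothetical: 100,000 boosted Omicron cases with 240 admissions,
# versus 100,000 unvaccinated cases with 2,000 admissions.
rrr = relative_risk_reduction(100_000, 240, 100_000, 2_000)
print(f"Relative reduction in hospitalization: {rrr:.0%}")  # 88%
```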

Many factors come into play in determining the proper sample size for an epidemiological analysis, for example, the size of the effect being sought and the level of confidence desired. Generally, the larger the sample size, the greater the confidence in the outcome. By any measure, a sample of over a million patients is robust, and fully adequate for determining likely rates of hospitalization for vaccinated Omicron patients. That magnitude is also sufficient to establish the probable degree of morbidity and likelihood of death from the disease. And every day, more evidence accumulates about the course and impact of the illness based upon the real-world experience of some 330 million past and present Covid patients around the world.
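
For readers who want a sense of what “adequate” means quantitatively, the standard formula for the sample size needed to estimate a proportion is n = z²p(1−p)/e², where z is the z-score for the desired confidence level, p the expected proportion, and e the margin of error. The sketch below uses illustrative inputs of my own choosing, not parameters from the studies discussed here:

```python
# Sketch: classic sample-size formula for estimating a proportion,
#   n = z^2 * p * (1 - p) / e^2
# Inputs are illustrative assumptions, not parameters from the UKHSA study.
import math

def sample_size_for_proportion(p, margin_of_error, z=1.96):
    """Cases needed to estimate proportion p to within +/- margin_of_error
    at the confidence level implied by z (1.96 corresponds to ~95%)."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# Estimating a ~2% hospitalization rate to within +/-0.1 percentage points
# at 95% confidence requires roughly 75,000 cases:
print(sample_size_for_proportion(p=0.02, margin_of_error=0.001))  # 75296

# A sample of over a million cases therefore supports far finer
# comparisons, e.g., between variants or vaccination statuses.
```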

So, one set of studies suggested that Omicron causes less severe disease than Delta, but because the research was conducted only on animals, the results were deemed preliminary and unreliable. Another study, published the same day and involving more than a million patients, demonstrated that Omicron, though more transmissible than Delta, is less dangerous to a vaccinated population. The conclusion is that the animal studies of Omicron transmission and morbidity were completely unnecessary. That raises a natural question: what have been the benefits (to humans) and the costs (to animals) of the thousands of previous animal-based Covid research projects?

The benefits to humans are uncertain because the necessity of animal experimentation in the development of vaccines and drug treatments is unclear; I’ll return to that later. The animal toll has certainly been high. It’s estimated that more than 110 million mice and rats are used (and killed) in U.S. labs every year, though it is not yet known how much Covid research has raised rodent deaths above those already elevated levels. Other animals too, though in vastly smaller numbers, have been pressed into service, chiefly ferrets, pigs, and monkeys. “An important disadvantage of animal models that develop the COVID-19 disease phenotype,” a team of European microbiologists coolly reported last year, “is that this condition is inevitably linked to pain and suffering.” In the U.S., that’s no disadvantage, since rodents – comprising some 99% of all lab animals – are not covered by the Animal Welfare Act. (Neither are fish and birds, also commonly used in labs.) Their pain and suffering doesn’t count. How this circumstance came about is a long story – I’ll keep it short.

A Short History of Vivisection

The practice of experimenting upon the bodies of living, non-human animals for the purpose of gaining knowledge or discovering cures for disease arose late in human history, about 2,500 years ago. Its existence depends upon an instrumental attitude toward nature and the non-human found only in large-scale, hierarchical societies. Vivisection is unknown, for example, among foragers and pastoralists. To kill a creature for some experimental purpose is for them unimaginable, because they understand animals and humans to be co-participants in a world imbued with significance and intention.

Vivisection was similarly unknown to the ancient Egyptians. Their reverence for the dead and their concern to preserve the bodies and protect the souls of high-status humans and animals precluded the practice. Ancient Greeks of the Classical period (beginning in the 5th Century BCE) had more ambivalent attitudes. Aristotle abhorred human dissection but believed that killing and then examining the bodies of animals was justified because – lacking both rationality and a moral sense – they ranked below humans in the scala naturae (scale of nature). They could therefore be killed and dissected, with the resulting knowledge benefitting humans. Thus there already existed in antiquity the paradox of analogy that still bedevils vivisection: animals are sufficiently like humans that research upon them yields useful insights, but sufficiently unlike them that the infliction of suffering and death can be morally justified.

Four hundred years after Aristotle, Galen of Pergamon – the most renowned physician of antiquity, who practiced in Rome – undertook to make medical science empirical. By means of dissection and vivisection (sometimes performed in public), he established a proto-inductive method that enabled him to order his anatomical observations by means of overarching theories. He proved, for example, that blood (not air) moved through the arteries, and maintained that every organ in the body had a discoverable purpose. He also argued, against his own empiricism but consistent with the scala naturae, that because animals were less conscious than humans, they experienced less pain, even when subjected to the most gruesome procedures. The same diminished capacity for suffering was also attributed to slaves, exemplifying the connection between the treatment of animals and that of lower-status humans in hierarchical societies with developed systems of private accumulation.

Dissection and vivisection were unusual in medieval Europe and the Middle East, in part because of proscriptions inherited from pagan antiquity and in part because empirical investigation itself declined as access to Greek texts became rare. However, the Scholastics’ rediscovery of Aristotle and Galen in the 13th Century, concomitant with a rise in trade and manufactures, led to a resurgence of rational inquiry. In addition, the Bible was used to sanction animal abuse. Except for the first books of Genesis, the Hebrew and Christian scriptures, as well as the Koran, are not kind to animals. The assertion that after the Flood God granted people dominion over animals was embedded in all three Abrahamic religions. Among Buddhists, the principle of Ahimsa (do no harm) would appear to proscribe vivisection, and there is no early history of the practice. But Buddhist researchers today have found ways to reconcile their faith with their professional norms.

Though Thomas Aquinas in the 13th Century urged Christians to treat animals with kindness (lest the custom of cruelty be turned against humans), he saw them as mere tools or resources made by God. In that context, and concomitant with the rise of empiricism, vivisection was revived in the 16th Century by the Flemish anatomist Andreas Vesalius and the English philosopher Francis Bacon, and in the 17th by William Harvey, Robert Hooke and René Descartes. It was the last of these, notoriously and influentially, who saw animals as mere automata lacking mental life or soul, whose outcries were nothing more than the creaking of a mechanism under strain. And while the philosopher Kant in the late 18th Century accepted that animals experience pain, he argued, in common with Descartes, that vivisection “is praiseworthy since animals must be regarded as man’s instruments.”

But opposition to vivisection also existed. Descartes was criticized by the fabulist Jean de La Fontaine and the philosophers Julien Offray de La Mettrie, Voltaire and Rousseau, and vivisection was condemned by Samuel Johnson, Alexander Pope and many others in the 18th Century. Johnson was familiar with the practices of the surgeons and virtuosi of the Royal Society and Royal College of Surgeons, “whose favorite amusement,” he wrote, “is to nail dogs to tables and open them alive.” These doctors were in fact celebrities. John and William Hunter established a popular anatomy theatre in London in 1748, though they were also pilloried for their combination of curiosity and inhumanity. The surgeon John Freke of St. Bartholomew’s Hospital was parodied by Fielding in Tom Jones both for his experiments with electricity and for his arrogance, and he may have been the inspiration for the seated, be-wigged figure presiding over the human dissection/vivisection in “The Reward of Cruelty,” the last of Hogarth’s set of engravings titled The Four Stages of Cruelty (1751).


William Hogarth, “The Reward of Cruelty,” 1751. Photo: The Author.

The consolidation of medical societies and increased state support and supervision of hospitals in the early-to-mid 19th Century led to an increase in dissection, human experimentation, and vivisection – all aspects of what the philosopher Michel Foucault called “the medical gaze.”

Patients were now seen as comprising sets of functions, symptoms, and illnesses separate from their individual identities, and animals were treated as mere instruments – like scalpels and forceps – in the performance of research. The doctor was thus a technician or detective whose skills and wisdom stood above those of ordinary people. His foremost responsibility was to a scientific method that obliged him to pursue every intuition and follow every clue – wherever it might lead and whatever the cost – to uncover a truth hidden from view.

The development was personified by Claude Bernard, professor at the Collège de France and the Sorbonne and widely considered the father of modern physiology. Bernard was relentless in his functionalism (discovering, among other things, the role of the pancreas in absorbing dietary fat), and remorseless in his vivisection. To obtain pancreatic secretions, he operated without anesthesia on 23 dogs, 22 of whom died from peritoneal infection. Strict adherence to the scientific method, he wrote, requires stoicism on the part of the researcher: “The physiologist is no ordinary man. He is a learned man, a man possessed and absorbed by a scientific idea. He does not hear the cries of animals, he does not see their flowing blood, he sees nothing but his idea, and is aware of nothing but an organism that conceals from him the problem he is seeking to resolve.”

The same forensic model prevailed in the late 19th and early 20th Centuries, and despite opposition, vivisection became ever more strongly entrenched in the fields of physiology and medicine. Indeed, Bernard was understood by many of his followers to be a moralist for insisting that risky surgeries or other medical experiments not be performed on humans before they were tried on animals: “If it is immoral, then, to make an experiment on man when it is dangerous to him, even though the result may be useful to others, it is essentially moral to do experiments on an animal, even though painful and dangerous, if they may be useful to man.” That rationale prevails to this day.

But even as the practice of vivisection was increasingly institutionalized and consequential in Europe, the U.S., and elsewhere, protests grew apace. The creation in London in 1875 of the Victoria Street Society for the Protection of Animals Liable to Vivisection was a landmark in the history of opposition, as was the passage of animal protection legislation in Parliament – in however vitiated a form – the following year. Antivivisection commanded the elite heights and plebeian plateaus of European and American society and culture until the second decade of the 20th Century. No less than Queen Victoria, Alfred Russel Wallace, Victor Hugo, Leo Tolstoy, Richard Wagner, Thomas Hardy, G.B. Shaw and Albert Einstein counted themselves anti-vivisectionists and were vocal in their opposition. Working-class opposition to vivisection was also strong, propelled both by sympathy for animals (the consequence of regular close contact with working horses and farm animals) and by the recognition that men and women from the lower ranks of society were themselves sometimes subject to unscrupulous medical experimentation.

The story of vivisection in the 20th Century can be summarized as follows: intense opposition followed by spectacular triumph. In the decade before the First World War, anti-vivisection was a mass movement that engaged men and women of all classes and ages. Leading British politicians, including Winston Churchill and Lloyd George, publicly condemned the practice, and aristocrats across Europe joined hands (figuratively) with working-class organizations in opposition. At about the same time, however, the vivisectors smartly changed their public rhetoric, abjuring the cold-bloodedness associated with continental physiologists and endorsing instead “biomedical research.” The latter was a new discipline ostensibly devoted to finding treatments or cures for disease, not simply amassing physiological knowledge. It required “animal experimentation” (the word vivisection was jettisoned by its adepts). In addition, vivisectors maintained that it was only suffering, not death, that should trouble the conscience, and that their experiments were essentially painless. They had embraced an “animal welfare” ethos developed by Jeremy Bentham, the late 18th Century theorist of “animal rights,” and his late 20th Century epigone, Peter Singer.

Bentham believed humans and animals alike had the right to be spared pain, but he nevertheless asserted that the latter had a lesser claim on life itself. Because animals, unlike humans, can’t imagine the future, they have less to lose from an early death, so long as little pain is involved. Indeed, to be quickly killed in a slaughterhouse was, on this view, a death much preferable to the drawn-out agonies that often attend human passing.

Bentham’s argument, essentially repeated by Singer in his celebrated book Animal Liberation (1975), is an odd one. Apart from the suggestion – absurd on its face – that animals are lucky to be eaten, it denies what everyone who lives with a dog or cat understands implicitly: that animals live in constant expectation of future joys. A dog who sits all day by the door waiting for a human companion to return, or who brings over a favorite toy for play, or who yips in excitement when brought in proximity to the dog park, is expressing hope for the future. Few humans are much more future-oriented than that. Nevertheless, animal welfare remains the principle underlying most contemporary animal charities, including The Humane Society, Farm Sanctuary, and People for the Ethical Treatment of Animals. And while these groups generally oppose vivisection, their emphasis upon its cruelty opens the door to a putatively humane vivisection, the very one through which vivisectors have walked for over a hundred years. But students of animal mind and behavior offer powerful evidence that this portal must be closed.


Sue Coe, Eden Biotechnologies Ltd., “Getting it Right the First Time,” Scene 21 from “Sue Coe: The Pit, The Tragical Tale of the Rise and Fall of a Vivisector,” 1999. Courtesy the artist and Galerie St. Etienne, New York.

It is now widely accepted among neuroscientists that non-human animals possess self-consciousness and intellect, though of varying depth and complexity, and that the only reason (apart from prejudice) most humans don’t recognize this is that animals are mute. Advances in the field of “affective neuroscience” have enabled the direct observation of mind and emotion in both humans and animals. By means of positron emission tomography, the neuroscientist Jaak Panksepp, among others, demonstrated the similarity of responses in the subcortical brain – the so-called “ancestral mind” – of humans and animals to events with strong emotional cues. For example, the perception of a physical threat may generate a flight response in both, but it will also stimulate in the ancestral brain the emotional feeling of fear. And since it is logically impossible to have a feeling without knowing it, animal consciousness must exist.

The 2012 Cambridge Declaration on Consciousness, signed by more than a dozen leading neuroscientists, concluded that “non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Nonhuman animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates.” Since the Declaration, understanding of the subjective experience and inner lives of animals has continued to advance. Scientists have asked questions about the degree, or more properly the multi-dimensionality, of consciousness in different animals – that is, its richness, multiplicity, temporality, and self-awareness. Nearly all serious research in this vein has been scrupulous to avoid the false dichotomy of human vs. animal. Every species has its unique character, and indeed differences in consciousness between individuals can be as great as differences between species.

The ethical and practical implications of these animal consciousness studies for biomedical and other research are significant: isolating animals from kin and their natural environment, and subjecting them to pathogens, repeated surgeries, painful recoveries, and premature death, causes them intense physical and mental pain and distress. For that reason, the field must embrace the many existing alternatives to animal experimentation or risk ethical opprobrium. Alternatives include computer modeling, imaging studies, in vitro testing, organs-on-a-chip, and microdosing. The discovery of effective cures for disease does not require the torture of non-humans. And yet it continues.

One reason it does is prejudice, sometimes called “speciesism”; another is money. Speciesism is the hard-to-pronounce term coined by the British psychologist Richard D. Ryder in 1971 (and popularized by Peter Singer) to describe the longstanding bigotry of humans toward animals and the consequent regime of domination. The bias has existed at least since the Sumerians domesticated sheep (mouflon) and cattle (aurochs). The first animal to wear the harness, suffer the whip, or have its young taken from it so that its milk could be used for human consumption suffered the world-historical defeat of animal rights. And from the pastures of the Fertile Crescent to the modern laboratories of the vivisectors, speciesism has flourished. When in 2002 the arch-segregationist U.S. Senator Jesse Helms (R-N.C.) carved out an exception to the Animal Welfare Act for rodents and birds, he was enacting a prejudice that governed his whole outlook. The exclusion, he told senators, would “deliver a richly deserved rebuke” to the “so-called ‘animal rights’ crowd,” who embrace animals fit only for “extermination.” If Helms’ gutting of the AWA gratified his prejudices, it also padded his pocketbook. He acted on behalf of lobbyists from the National Association for Biomedical Research and Big Pharma. Helms’ biggest campaign contributor was GlaxoSmithKline.

To state the obvious, biomedical research is big business. The global animal testing industry is valued at about $11 billion, roughly $5 billion of it in the U.S. Global pharmaceutical revenue in 2020 was $1.1 trillion, and in the U.S., $425 billion. In 2021, Pfizer, Moderna and BioNTech, the companies that produced the most successful Covid vaccines, made a combined pre-tax profit of $34 billion, after receiving over $8 billion in public subsidies. Johnson & Johnson earned an additional $4 billion in profit in 2021 compared to 2020 from its less effective Covid vaccine.

Public and private universities in the U.S. also make out big in the biomedical field. The gold rush began in 1980 with a change in law (the Bayh-Dole Act) permitting universities to hold patents on research supported by federal grants. Since then, the quest for intellectual property (IP) has guided university investment, and in a few cases produced bonanzas. Over about a decade, from 2002 to 2012, Northwestern University, my former employer, netted more than $1 billion from the patent for a single drug, Lyrica, used for nerve pain and fibromyalgia, a stress-related neuropsychiatric disorder.

And the quest for IP remains Northwestern’s lodestar; whatever advances that mission, such as a new and expanded Center for Comparative Medicine (aka animal experimentation lab), is supported as well. (See my harrowing report on a visit to the animal research labs at NU.) Nearly all drug development in its pre-clinical phase involves animal testing. The U.S. FDA requires animal-based pharmacology and toxicology studies before permitting Phase 0 trials, which involve the microdosing of human subjects. The reason these preliminary human trials are performed at all is that animal studies are generally inconclusive.

The usefulness of animal testing – a question apart from its ethics – has come under sustained criticism in recent years. Some 92% of drugs that succeed in animal trials go on to fail in human trials. The reasons are many, but three stand out: 1) the inability to control for the impact upon animals of different laboratory settings and research procedures; 2) the obvious lack of congruence between animal models and human disease; and 3) differences in genetics between test species, and even between individual animals of the same species. The consequence of these non-predictive animal experiments is investment in drug protocols that don’t work, and disinvestment in drugs that do. In both cases, human well-being suffers.

Nevertheless, the pipeline of money continues, and so there is little incentive to change course. The financing comes from government grants, private donors to hospitals and universities, and most of all, exorbitant drug prices. It is in nobody’s interest – except that of animals and the public – to do anything that might discomfit the drug designers, marketers, corporate boards, university presidents, and stockholders content with the existing animal experimentation system and pharmaceutical profits. Covid-19 has been a partial exception to this rule: the FDA and NIH both permitted human clinical trials of vaccines without first completing animal tests. Finding a vaccine was simply too important to leave to chance.

Mouse Tales and Covid-19

Wild mice – those not genetically modified to model human traits – are not susceptible to Covid because they lack the cell-surface enzyme that functions as the lock for the SARS-CoV-2 spike protein key. That hasn’t stopped tens and possibly hundreds of thousands of them from being used – pointlessly – for Covid-19 research. The search for a genetically modified mouse that can recapitulate the development of the disease in humans has, however, been intense. Several candidates have been proposed and a few utilized. One of them, announced with fanfare in an August 2020 issue of Nature, perhaps the world’s leading peer-reviewed, multidisciplinary science journal, was particularly significant. When it was infected with Covid, it developed disease in both the upper and lower respiratory systems, just like people. Even better for researchers, older mice suffered more severe illness than younger ones. And finally, the disease was effectively treated with interferon-λ1a, suggesting that a similar treatment might cure humans. Nevertheless, interferon has since proven to be a bust and is not currently recommended by either the NIH or the WHO for the treatment of Covid.

Wild mice are sacrificed because scientists somehow don’t realize that they can’t get Covid. Genetically modified mice are infected with severe Covid and killed in the search for treatments that fail. And both kinds of mice are sometimes killed for no reason at all. Amid the pandemic surge in May 2020, tens and perhaps hundreds of thousands of mice – supposedly essential in the search for a cure – were killed all over the world because of reduced laboratory staffing. In some places, mouse populations were reduced by half; in others, by 10%-20%; while in still others, no tally was taken. At the University of Illinois, Chicago, a spokesperson said: “[We] did not track how many rodents were euthanized by investigators as a result of the pandemic but the number is thought to be low.”

Mice are sensitive, intelligent, and empathetic creatures. Just like humans, whales, dolphins and other mammals, birds, octopuses and fish, mice and rats can re-experience the pain and fear of family members, friends, and neighbors. A recent study in the journal Science showed that when one rat observed the alleviation of pain in a neighbor, its own pain was relieved. Spending just an hour or two getting to know a second mouse was enough for the first mouse to share its emotional state. This shouldn’t be a surprise. A few years ago, researchers at the University of Chicago showed that empathy – “an other-oriented emotional response elicited by and congruent with the perceived welfare of an individual in distress” – was characteristic of laboratory rats. When confronted with a trapped rat, a free rat will do everything it can to liberate its cage mate, even when offered chocolate, one of its favorite foods, instead. (After liberating the caged rat, the free one shared its chocolate.)

Continuing to practice vivisection is a barbarism on a par with continuing to kill and eat animals simply because we can, because of habit, or because we like the taste. It contravenes every moral precept we claim to care about, everything we know about evolutionary biology and human kinship with other species, and what we have learned about successful laboratory science. The challenge posed by Covid-19 should not be used as an excuse to continue the practice of vivisection; it’s a reason to do things differently.

Stephen F. Eisenman is Professor Emeritus of Art History at Northwestern University and the author of Gauguin’s Skirt (Thames and Hudson, 1997), The Abu Ghraib Effect (Reaktion, 2007), The Cry of Nature: Art and the Making of Animal Rights (Reaktion, 2015) and other books. He is also co-founder of the environmental justice non-profit Anthropocene Alliance. He and the artist Sue Coe have just published American Fascism, Still for Rotland Press. His next book with Coe, The Young Person’s Illustrated Guide to American Fascism, will be published late this summer by OR Books. He can be reached at: s-eisenman@northwestern.edu