
Immunity and the Institution of Health Care

The history and culture of medical treatment

Eric Bond
11 min read · Nov 20, 2020


It is unlikely that mutation, the alteration of the genetic code by copying errors during DNA replication, has caused widespread change in the traits of the human immune system, at least since the origin of our species, unless there is some phenomenon of transmission between individuals we know nothing about. Beneficial mutations are too rare on their own to explain the distinctive immune function of modern Homo sapiens as opposed to humans of earlier eras. However, the selection pressures of virulent epidemics may be capable of transforming the overall immune system of a species, along with the proportions of additional traits, by the cataclysmic elimination of phenotypes. A percentage of human DNA is of viral origin, meaning that the immune system participates in regulating the genetic endowment of our species, and has conceivably had some of its own genes structured by viruses. There are probably some genotypic differences between individuals due to distinct ancestry: in the founder effect, a small social group migrates to a new area and grows as a population, magnifying any unique traits it possesses and touching off genetic drift towards independent trait development. This has happened many times during human history, and especially prehistory, producing slightly variant trait profiles across the world population; immune function is almost certain to have diverged in the process, though the details may be subtle, and most mutations prove deleterious, limiting their spread. Small-scale variation can also arise at any time from mutations in particular lineages, the appearance of a new trait in a single individual that may then be transmitted hereditarily, leading to idiosyncrasies in many families, similar to the vast variety of body types and facial features.
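
To make the founder-effect reasoning concrete, here is a minimal sketch, in Python with invented numbers purely for illustration, of genetic drift in the Wright-Fisher style: a trait at 10% frequency barely budges in a large source population, while in a small founding group it can drift to fixation or loss within a modest number of generations.

```python
import random

def drift(freq, pop_size, generations):
    """Wright-Fisher drift: each generation resamples the trait
    frequency binomially from the previous generation's carriers."""
    for _ in range(generations):
        carriers = sum(random.random() < freq for _ in range(pop_size))
        freq = carriers / pop_size
        if freq in (0.0, 1.0):  # trait lost or fixed in the group
            break
    return freq

random.seed(1)
# Hypothetical numbers: a trait at 10% frequency, 200 generations.
print("large population (N=10000):", drift(0.10, 10_000, 200))
print("small founder group (N=50):", drift(0.10, 50, 200))
```

Across repeated runs the small group's frequency swings widely and often hits 0% or 100%, while the large population's barely moves: the magnification the paragraph describes.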

Efficacy of the immune system is affected by lifestyle factors such as diet, hygiene and level of exposure to pathogens. Inflammation can be caused by a hyperactive first line of defense against foreign material in the bloodstream, intestinal tract, respiratory system and elsewhere in the body, occurring primarily as collateral damage from macrophage activity. Prolonged low-level activations of the immune system, as in food or inhalation allergies, lingering bacterial infections, chronic stress, and the inflammation caused by obesity, are a risk factor for heart disease and cancer, but the risk is usually minimal, and some regular stimulation of the immune system actually makes it stronger (doctors and nurses are familiar with this truism), so that when a major infection occurs the body is capable of an immediate, efficient response. Too much immune system activity is unhealthy, but so is too little. In general, modern civilization in developed countries has a satisfactory immunity profile, and the average lifespan is probably as long as it has ever been. However, pollution of the environment with synthetic chemicals keeps the huge institution of modern medicine crucial, accounting for statistics such as half the males of the U.S. population developing at least a minor cancer at some point during their lives.

It is possible that the human immune system partially vestigialized during prehistory once we began cooking food, due to reduced ingestion of live pathogens, and further still in the modern context of disinfecting our environments and the surfaces of our bodies with heat, soap, alcohol and bathing. This might be why systemic infections are more virulent for us than for many comparable species: a large proportion of the human population is knocked out of commission for a stretch by a wide variety of bugs each year. Human infants are less functionally developed at birth than the young of most species, their immune systems included, contributing to regular bouts of illness; before modern medicine, infant mortality rates were very high. Antibiotics and additional substances are useful for supplementing immune system function, but their full causal effects, and often even a theoretical understanding of their proven benefits, can exceed our knowledge, meaning it is in many cases good for medical practitioners to be wary of excessive usage so as to avoid negative consequences such as detrimental mutations, increases in inflammation, and changes to body chemistry.

Neglecting to complete an antibiotic regimen can leave the more resistant bacteria alive to reproduce and spread in the environment. With repeatedly inadequate dosing, natural selection for hardier strains takes place until the antibiotic is ineffective. Global medicine has caught on and attempted to educate the public, but many species of pathogen have become resistant to multiple treatments, and we must continually develop altered versions of antibiotics in the lab to restore efficacy. If individuals continue to misuse antibiotics on a large scale, we may one day entirely lose our ability to treat some common infections, so it is paramount that directions given by doctors and pharmacies be followed exactly unless severe or disturbing side effects occur (most medications have at least some mild side effects), at which point a professional should be consulted for guidance, perhaps switching the patient to an alternative medication for the same issue.
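
A toy simulation, in Python with made-up growth and kill rates rather than clinical values, can illustrate the selection dynamic just described: an inadequate dose kills most susceptible bacteria but spares a larger fraction of the partially resistant minority, and after a few interrupted courses the resistant strain dominates.

```python
# Toy model of selection for resistance under inadequate dosing.
# All rates are invented for illustration, not clinical values.
susceptible, resistant = 1_000_000.0, 10.0  # initial cell counts

for course in range(4):
    # An incomplete course kills 99.9% of susceptible cells
    # but only 50% of the partially resistant ones.
    susceptible *= 0.001
    resistant *= 0.5
    # Survivors regrow ~1000-fold before the next course.
    susceptible *= 1000
    resistant *= 1000
    share = resistant / (susceptible + resistant)
    print(f"after course {course + 1}: resistant share = {share:.1%}")
```

In this toy model each interrupted course shifts the balance roughly five-hundred-fold toward resistance, which is why finishing the full course, at a dose that clears the partially resistant cells as well, matters so much.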

Looking at the origins of medicine, it is clear that those living in the tropics had an advantage from the sheer variety of vegetal matter and its chemical compounds. Many of these substances (leaves, roots and juices of plants) resemble compounds in the human body closely enough that they can enhance tissue repair as well as provide pain relief and additional prophylactic effects when ingested or applied topically. Antibiotic and antifungal properties of many tropical plants and herbs also help the body fight infection. Many modern antibiotics have been derived from nonsynthetic sources: penicillin, the first antibiotic compound to be isolated in the lab, was a fungal extract, and there are numerous such cases. Those living in temperate, and especially polar and desert, climates were less fortunate, but even in these regions tens of thousands of years of experimentation yielded many natural remedies.

The tradition of using substances from all kinds of wild and domesticated sources for medical purposes persisted as humans entered into civilized living, but not all treatments were effectual, often being incidental to the body's own healing process or stimulating recovery by placebo effects, and the prevalence of illusion increased as humans lost touch with long-standing hunter-gatherer traditions. The transition to civilization's medicine is a vast topic that can only be thoroughly addressed with much meticulous research, but in essence humans of antiquity began to replace natural palliatives with alchemical compounds and rare medicinal commodities whose economic value was the main factor in their appeal, the more expensive and exotic the better; much of this was actually toxic. Initial forays into anatomy also led to some ineffective and sometimes harmful procedures based on crude models of bodily function. A chief offender was the practice of bleeding, intended to draw noxious humors out of the body, which actually weakened patients so that they were less likely to rebound. With a few notable exceptions, such as the ancient Greek physician Hippocrates, whose central motto was to at least do no harm, medical pretenders of antiquity and the Middle Ages went on poisoning and enervating those in their care for a fee until the alchemist Paracelsus arrived on the scene and introduced a more conservative approach, in many cases doing no more than declining to interfere with the body's own recuperation, so much so that he was occasionally accused of swindling. As the sciences of biology and chemistry matured, models of bodily function improved by leaps and bounds, making it possible to have some sense of why treatments work and how to develop better ones, an integration of laboratory experiment, theory and technology out of which sprang surgical, diagnostic and pharmaceutical methods that increased the average lifespan in many countries by as much as thirty years.

Perceptions of illness and contagion have changed considerably over the course of history. Early on, there was occasionally some sense of contributing causes, but the way diseases spread was a mystery. Humans had recognized the link between tetanus and wounds by antiquity at the latest, but the afflicted were left to sweat it out on a mat in isolation, given a wide berth by everyone. Ancient leprosy, whatever this disease actually was, inspired much fear in biblical times, and the infected could become homeless outcasts, beggars who were generally shunned.

When bubonic plague hit medieval Europe, the continent was unequipped to deal with a major epidemic. Citizens believed the disease was transmitted by air, so they took the largely futile precaution of covering their noses and mouths when in the presence of the sick; it was actually being spread by fleas from rats, but no one discerned this at the time. Posies, small bouquets of flowers, were ineffectually carried in the pocket to ward off illness; pesthouses were set up for quarantine but proved inadequate because of the flea vector; leeches were placed on the ill to draw out the humors of infirmity; everyone was dropping like flies. Outbreaks kept recurring, and within fifty years a third of the European population had died. Many thought the world was ending.

Smallpox was another virulent disease that could reach epidemic proportions in Europe and the United States. It was at times so widespread that citizens did not bother to rework their routines much to avert it. Even the likes of the immortalized composer Wolfgang Amadeus Mozart, recognized early in life as one of the supreme musical talents of Europe, came down with the disease as a child and bore the pockmarks of a survivor. Smallpox ravaged the American army as it wintered at Valley Forge during the Revolutionary War. Mortality rates for the untreated could run to 30% or higher, but soldiers had learned from African slaves that infecting a puncture in the skin with discharge from a smallpox sore produces a less severe infection, and that those who outlast the disease are thereafter immune. This practice, variolation, drastically reduced deaths and probably saved the American bid for independence. Good fortune did not teach the victors to play nice: Indian tribes were especially vulnerable to the new disease, and Americans occasionally traded possessions of smallpox victims, decimating some native populations so that Caucasians settled the West with less resistance.

The United States set up a national quarantine station at Ellis Island in New York Harbor to prevent the epidemics of Europe from entering the country via immigration. Crowds of potential citizens were required to stay in isolation for up to two weeks, ensuring that no one was infected. Pragmatic approaches such as this became more prevalent as science established that incubating microscopic pathogens were the cause of all these illnesses. The first epidemiological research was carried out on cholera: European doctors carefully recorded demographic data about its origin and spread in populations during 19th-century epidemics and were able to identify contaminated water supplies as the source.
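
The logic behind that first epidemiological success is simple enough to sketch in code: tabulate cases against a suspected exposure, here the water supply, and the contaminated source stands out. Below is a minimal illustration in Python, with invented case counts, loosely in the spirit of John Snow's 1854 Broad Street analysis.

```python
from collections import Counter

# Hypothetical per-case records: the water source each victim used.
# Source names and counts are invented for illustration.
cases = (
    ["Broad Street pump"] * 61
    + ["neighboring pump"] * 6
    + ["piped supply"] * 3
)

# Tally cases by exposure; the contaminated source dominates the table.
tally = Counter(cases)
for source, n in tally.most_common():
    print(f"{source:20s} {n:3d} cases")
```

Primitive as it looks, this exposure-versus-outcome tabulation is the core move of epidemiology, and it required no knowledge of the pathogen itself.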

Growing comprehension of disease and its causes led to more systematic concepts such as seasonal vulnerability, vector organisms, vaccination and fever control, greatly reducing the dangers posed by contagious disease. A massive inflow of capital to the medical profession, owing to its relevance to a universal need, is rapidly technologizing the treatment of illness. Unfortunately, like any indispensable economic sector, medicine is a magnet for abusive practices, and vigilance is necessary to keep populations from being divided even further along class and ethnic lines by prejudicial policies regarding insurance and otherwise unequal access. It has become apparent that the profession must be tightly regulated, and many governments have taken steps to put stringent legal directives in place, but mischief constantly probes the boundaries of the law, seeking loopholes by which to circumvent barriers to corruption.

One of the problems associated with medical treatment is the difficulty patients have with discussing their health. One's own pain can be taboo as a topic of conversation, and many illnesses require some technical competence to understand. Most find bodily functions and disorders unpleasant to think about, especially those of others, and an inability to broach the subject of sickness can keep knowledge of health care superficial. Tolerance for the pains and illnesses of others is improving, so that measures such as health insurance and sick leave are becoming standard throughout the world.

Mimetic disorganization in modern society can exacerbate health care shortcomings as disinformation spreads, sometimes with tragic consequences. In one noteworthy case in the United States, campaigners went door to door in a neighborhood of recent immigrants telling residents that vaccination endangers the health of children. When parents failed to vaccinate, all kinds of rare ailments started to crop up in the infant population; these pathogens are still in the environment, and immunization is key to preventing the reemergence of the diseases they cause. Online resources have huge potential to improve health care, as citizens can educate themselves about the possible causes of their symptoms, providing a check on professional incompetence and abuse. The ability to think critically about one's own health and to recognize whether a supposed authority is legitimate is key to making the system run as smoothly as possible. When patients know their condition, they report illnesses accurately, diagnosis is more successful, and the distribution, spread and causes of disease are easier to track, model and control.

Medical treatment is a good example of what cultural evolution looks like: how collective conventions filiate as a consequence of differing perceptions, observations, theoretical paradigms, technical practices and environments. In Norway, skilled researchers noticed very soon after antibiotics were introduced that bacterial resistance develops, and immediately designed medication regimens to counteract it. This can be contrasted with the United States, where it became usual to prescribe antibiotics wantonly, as if they were pain relievers, with no acknowledgement of long-term consequences. After decades of divergent practice, Norwegian hospitals have 1% as many antibiotic-resistant infections as those in the United States, all because of a simple observation probably attributable to only a few practitioners and some fortuitous events in a lab. In China, hospitals have sterilized and reused needles, a practice that probably leads to some contagion and would be unacceptable in the United States. Yet Chinese citizens wear masks in public during flu season to compensate for the effects of concentrated population, even while walking down the street, a practice that would probably get one stigmatized, and could even lead to confrontations, in the United States, though it is no doubt a sound countermeasure. (The American attitude towards protective masks seems to be growing more tolerant as we speak.)

Addiction medicine demonstrates how perceptions of illness can change and vary by culture. As the prevalence and variety of hard drugs increased, authorities almost everywhere noticed that addicted individuals are much more likely to engage in at-risk behaviors. Their judgment is often impaired, and they are frequently running low on cash from paying for a fix, or too incapacitated to work, contributing to theft and additional criminal activities. In the U.S., drug use has become stigmatized because of the link to crime and poverty, and addicts are treated like villains by a legal system that fills jails with those convicted of drug-related crime. Portugal is a stark contrast: its policies of decriminalization and extensive health care have greatly reduced the country's problems with drug abuse. In the U.S. as well, institutions are becoming somewhat more rational about drugs, getting many of the addicted into treatment facilities where they can kick their habits with medical support while remaining active and connected in the community.

Decriminalization like that instituted in Portugal is a difficult policy issue for governments to grapple with. When drugs are illegal, there is no middle ground for users: they must either avoid drugs altogether and remain on the legal side, or in many cases arrange their lifestyles around buying, selling and concealing illicit substances. But if hard drugs are legal, a whole country can suffer, as happened to 19th-century China when Great Britain shipped huge quantities of opium into the country, addicting the population and instigating the Opium Wars as Chinese leadership attempted militarily to save its citizenry from the destruction of their society. In general, medical understanding of addiction is now sophisticated enough that at least some decriminalization can succeed in many places, but the logistical and cultural challenges differ by country. Whether a nation of three hundred million like the U.S., with so much sprawling diversity, can decriminalize is a question requiring much careful analysis.


Eric Bond

Chapters from the book Standards for Behavioral Commitments: Philosophy of Humanism, and more!